Sample records for block cipher analysis

  1. Optical design of cipher block chaining (CBC) encryption mode by using digital holography

    NASA Astrophysics Data System (ADS)

    Gil, Sang Keun; Jeon, Seok Hee; Jung, Jong Rae; Kim, Nam

    2016-03-01

    We propose an optical design of cipher block chaining (CBC) encryption using a digital holographic technique, which offers higher security than the conventional electronic method because of its analog-type randomized cipher text with a 2-D array. In this paper, an optical design of the CBC encryption mode is implemented by a 2-step quadrature phase-shifting digital holographic encryption technique using orthogonal polarization. A block of plain text is encrypted with the encryption key by applying 2-step phase-shifting digital holography, and it is changed into cipher text blocks which are digital holograms. These ciphered digital holograms carrying the encrypted information are Fourier transform holograms and are recorded on CCDs with intensities quantized to 256 gray levels. The decryption is computed from these encrypted digital holograms of cipher texts, the same encryption key, and the previous cipher text. Results of computer simulations are presented to verify that the proposed method is feasible for a highly secure CBC encryption system.
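
The chaining rule the optical scheme implements is the standard CBC recurrence, C_i = E_k(P_i XOR C_{i-1}) with C_0 = IV. A minimal sketch, with the holographic encryption step replaced by a hypothetical keyed-pad "block cipher" (illustration only, not secure):

```python
# CBC mode sketch. The toy block function below is NOT a real cipher:
# it XORs each block with a key-derived pad, which makes it self-inverse
# and keeps the example short.
import hashlib

BLOCK = 16

def toy_encrypt_block(key: bytes, block: bytes) -> bytes:
    pad = hashlib.sha256(key).digest()[:BLOCK]
    return bytes(a ^ b for a, b in zip(block, pad))

def cbc_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    assert len(plaintext) % BLOCK == 0
    prev, out = iv, b""
    for i in range(0, len(plaintext), BLOCK):
        # Defining CBC step: XOR each plaintext block with the previous
        # ciphertext block (or the IV) before encryption.
        xored = bytes(a ^ b for a, b in zip(plaintext[i:i + BLOCK], prev))
        prev = toy_encrypt_block(key, xored)
        out += prev
    return out

def cbc_decrypt(key: bytes, iv: bytes, ciphertext: bytes) -> bytes:
    prev, out = iv, b""
    for i in range(0, len(ciphertext), BLOCK):
        block = ciphertext[i:i + BLOCK]
        out += bytes(a ^ b for a, b in zip(toy_encrypt_block(key, block), prev))
        prev = block
    return out
```

Note how the chaining makes identical plaintext blocks encrypt to different ciphertext blocks, which is the "randomized cipher text" property the abstract relies on.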

  2. Using the Hill Cipher to Teach Cryptographic Principles

    ERIC Educational Resources Information Center

    McAndrew, Alasdair

    2008-01-01

    The Hill cipher is the simplest example of a "block cipher," which takes a block of plaintext as input, and returns a block of ciphertext as output. Although it is insecure by modern standards, its simplicity means that it is well suited for the teaching of such concepts as encryption modes, and properties of cryptographic hash functions. Although…
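
As a concrete instance of the block idea, here is a 2x2 Hill cipher over Z_26, using the common textbook key [[3, 3], [2, 5]] (its determinant 9 is invertible mod 26, since 9 * 3 = 27 ≡ 1):

```python
# Hill cipher: each 2-letter block is a column vector multiplied by the
# key matrix mod 26. Decryption reuses the same routine with the inverse
# matrix, computed as det^{-1} * adjugate mod 26.
def hill_encrypt(key, text):
    nums = [ord(c) - 65 for c in text]  # A..Z -> 0..25
    out = []
    for i in range(0, len(nums), 2):
        x, y = nums[i], nums[i + 1]
        out.append((key[0][0] * x + key[0][1] * y) % 26)
        out.append((key[1][0] * x + key[1][1] * y) % 26)
    return "".join(chr(n + 65) for n in out)

KEY = [[3, 3], [2, 5]]
# det = 9, det^{-1} mod 26 = 3, so inverse = 3 * [[5,-3],[-2,3]] mod 26:
INV_KEY = [[15, 17], [20, 9]]
```

With this key, the classic example "HELP" encrypts to "HIAT", and applying `hill_encrypt` with `INV_KEY` recovers the plaintext.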

  3. Block cipher based on modular arithmetic and methods of information compression

    NASA Astrophysics Data System (ADS)

    Krendelev, S.; Zbitnev, N.; Shishlyannikov, D.; Gridin, D.

    2017-10-01

    The article focuses on the description of a new block cipher. Due to the heightened interest in Big Data, the described cipher is used to encrypt large volumes of data in cloud storage services. The main advantages of the given cipher are the ease of implementation and the possibility of probabilistic encryption. This means that the encryption of a text will differ even when the key and the data are the same, so the strength of the encryption is improved. Additionally, the size of the ciphered message can hardly be predicted.

  4. A fast image encryption algorithm based on only blocks in cipher text

    NASA Astrophysics Data System (ADS)

    Wang, Xing-Yuan; Wang, Qian

    2014-03-01

    In this paper, a fast image encryption algorithm is proposed, in which shuffling and diffusion are performed simultaneously. The cipher-text image is divided into blocks of k × k pixels each, while the pixels of the plain text are scanned one by one. Four logistic maps are used to generate the encryption key stream and the new position of each plain-image pixel in the cipher image, including the row and column of the block to which the pixel belongs and the place within the block where the pixel will be put. After each pixel is encrypted, the initial conditions of the logistic maps are changed according to the encrypted pixel's value; after each row of the plain image is encrypted, the initial conditions are also changed by the skew tent map. Finally, it is shown that this algorithm has high speed, a large key space, and good resistance to differential attacks, statistical analysis, known-plaintext attacks, and chosen-plaintext attacks.
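
A hedged sketch of the keystream idea only: one logistic map quantized to bytes, with illustrative parameters (r = 3.99, x0 standing in for a key-derived seed). The paper's four-map construction, block shuffling, and per-pixel state feedback are not reproduced here.

```python
# Logistic-map keystream: iterate x -> r*x*(1-x) and quantize each state
# to a byte. Chaotic sensitivity to x0 is what gives key dependence.
def logistic_keystream(x0: float, n: int, r: float = 3.99):
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)           # logistic map iteration
        out.append(int(x * 256) % 256)  # quantize state to a byte
    return out

ks = logistic_keystream(0.3141592, 8)
# Diffusion step only: XOR pixel values with the keystream.
pixels = [10, 200, 37, 54, 99, 128, 255, 0]
cipher = [p ^ k for p, k in zip(pixels, ks)]
```

XORing the cipher with the same keystream recovers the pixels, which is why both ends must derive identical initial conditions from the key.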

  5. A joint encryption/watermarking system for verifying the reliability of medical images.

    PubMed

    Bouslimi, Dalel; Coatrieux, Gouenou; Cozic, Michel; Roux, Christian

    2012-09-01

    In this paper, we propose a joint encryption/watermarking system for the purpose of protecting medical images. This system is based on an approach which combines a substitutive watermarking algorithm, the quantization index modulation, with an encryption algorithm: a stream cipher algorithm (e.g., RC4) or a block cipher algorithm (e.g., AES in cipher block chaining (CBC) mode of operation). Our objective is to give access to the outcomes of the image integrity check and of its origin even though the image is stored encrypted. If watermarking and encryption are conducted jointly at the protection stage, watermark extraction and decryption can be applied independently. The security analysis of our scheme and experimental results achieved on 8-bit depth ultrasound images as well as on 16-bit encoded positron emission tomography images demonstrate the capability of our system to securely make security attributes available in both spatial and encrypted domains while minimizing image distortion. Furthermore, by making use of the AES block cipher in CBC mode, the proposed system is compliant with or transparent to the DICOM standard.
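
The substitutive watermarking component, quantization index modulation (QIM), embeds one bit per sample by snapping the sample to one of two interleaved quantizer lattices. A minimal integer sketch with an assumed step size of 8 (the paper's actual quantizer parameters are not reproduced):

```python
# QIM: bit 0 snaps the value to multiples of DELTA, bit 1 to multiples of
# DELTA offset by DELTA/2. Extraction picks the nearer lattice.
DELTA = 8

def qim_embed(value: int, bit: int) -> int:
    offset = bit * DELTA // 2
    return round((value - offset) / DELTA) * DELTA + offset

def qim_extract(value: int) -> int:
    d0 = abs(value - qim_embed(value, 0))
    d1 = abs(value - qim_embed(value, 1))
    return 0 if d0 <= d1 else 1
```

The distortion per sample is at most DELTA/2, which is why QIM-watermarked medical images can stay diagnostically usable while still carrying integrity bits.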

  6. Cloud Computing Security Model with Combination of Data Encryption Standard Algorithm (DES) and Least Significant Bit (LSB)

    NASA Astrophysics Data System (ADS)

    Basri, M.; Mawengkang, H.; Zamzami, E. M.

    2018-03-01

    Limited local storage resources are one reason to switch to cloud storage. The confidentiality and security of data stored in the cloud are very important, and one way to maintain them is to use cryptographic techniques. The Data Encryption Standard (DES) is one of the block cipher algorithms used as a standard symmetric encryption algorithm. DES produces 8 cipher blocks that are combined into one ciphertext, but this ciphertext is weak against brute-force attacks. Therefore, the 8 final cipher blocks are converted into 8 random images using the Least Significant Bit (LSB) algorithm, which embeds the DES cipher output into the images before they are merged into one.
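
The LSB step can be sketched as follows, assuming grayscale pixels as a flat integer list; the DES encryption producing the embedded bytes is out of scope here, so any ciphertext byte string stands in:

```python
# LSB embedding: write each data bit into the lowest bit of one cover
# pixel, changing each pixel value by at most 1.
def lsb_embed(pixels, data: bytes):
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "cover too small for payload"
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # overwrite the lowest bit
    return stego

def lsb_extract(pixels, nbytes: int):
    out = []
    for b in range(nbytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[b * 8 + i] & 1)
        out.append(byte)
    return bytes(out)
```

Because only the lowest bit of each pixel changes, the stego image is visually indistinguishable from the cover, which is the property the scheme depends on.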

  7. Audio Steganography with Embedded Text

    NASA Astrophysics Data System (ADS)

    Teck Jian, Chua; Chai Wen, Chuah; Rahman, Nurul Hidayah Binti Ab.; Hamid, Isredza Rahmi Binti A.

    2017-08-01

    Audio steganography is about hiding a secret message inside audio. It is a technique used to secure the transmission of secret information or to hide its existence. It may also provide confidentiality for the secret message if the message is encrypted. To date, most steganography software such as Mp3Stego and DeepSound uses a block cipher such as the Advanced Encryption Standard or the Data Encryption Standard to encrypt the secret message. This is good security practice. However, the encrypted message may become too long to embed in the audio and may distort the cover audio if the secret message is long. Hence, there is a need to encrypt the message with a stream cipher before embedding it into the audio: a stream cipher provides bit-by-bit encryption, whereas a block cipher encrypts fixed-length blocks, which results in a longer output compared to a stream cipher. An audio steganography tool that embeds text encrypted with the Rivest Cipher 4 stream cipher is therefore designed, developed and tested in this project.
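
The Rivest Cipher 4 named above is short enough to show in full; it is included for illustration only, as RC4 is no longer recommended for new designs:

```python
# RC4: a byte-oriented stream cipher whose output length equals the input
# length exactly -- the property the abstract wants for embedding.
def rc4(key: bytes, data: bytes) -> bytes:
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA), XORed with the data
    i = j = 0
    out = bytearray()
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

# Encryption and decryption are the same XOR operation:
ct = rc4(b"Key", b"Plaintext")  # classic test vector: bbf316e8d940af0ad3
```

Note the ciphertext is exactly as long as the plaintext, unlike a padded block cipher, so it never inflates the payload to be embedded.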

  8. Affine Equivalence and Constructions of Cryptographically Strong Boolean Functions

    DTIC Science & Technology

    2013-09-01

    manner is crucial for today's global citizen. We want our financial transactions over the Internet to get processed without error. Cyber warfare between...encryption and decryption processes. An asymmetric cipher uses different keys to encrypt and decrypt a message, and the connection between the encryption and...Depending on how a symmetric cipher processes a message before encryption or decryption, a symmetric cipher can be further classified into a block or

  9. Quantum exhaustive key search with simplified-DES as a case study.

    PubMed

    Almazrooie, Mishal; Samsudin, Azman; Abdullah, Rosni; Mutter, Kussay N

    2016-01-01

    To evaluate the security of a symmetric cryptosystem against any quantum attack, the symmetric algorithm must first be implemented on a quantum platform. In this study, a quantum implementation of a classical block cipher is presented. A quantum circuit for a classical block cipher with a polynomial number of quantum gates is proposed. The entire work has been tested on a quantum mechanics simulator called libquantum. First, the functionality of the proposed quantum cipher is verified and the experimental results are compared with those of the original classical version. Then, quantum attacks are conducted by using Grover's algorithm to recover the secret key. The proposed quantum cipher is used as a black box for the quantum search. The quantum oracle is then queried over the produced ciphertext to mark the quantum state, which consists of plaintext and key qubits. The experimental results show that for a key of n-bit size and key space of N such that [Formula: see text], the key can be recovered in [Formula: see text] computational steps.
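
The [Formula: see text] placeholders express the usual quadratic speed-up: a key space of size N = 2^n needs on the order of sqrt(N) Grover iterations, about (pi/4) * sqrt(N) oracle queries, versus O(N) classical trials. A quick count (simplified-DES uses a 10-bit key):

```python
# Grover iteration count for an n-bit key: roughly (pi/4) * sqrt(2^n)
# oracle queries, versus up to 2^n classical key trials.
import math

def grover_iterations(n_bits: int) -> int:
    N = 2 ** n_bits
    return math.floor((math.pi / 4) * math.sqrt(N))
```

For the 10-bit key of simplified-DES this gives about 25 iterations against 1024 classical trials in the worst case; the gap widens exponentially with key size.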

  10. Chaos based video encryption using maps and Ikeda time delay system

    NASA Astrophysics Data System (ADS)

    Valli, D.; Ganesan, K.

    2017-12-01

    Chaos based cryptosystems are an efficient way to achieve improved speed and highly secure multimedia encryption because of their elegant features, such as randomness, mixing, ergodicity, and sensitivity to initial conditions and control parameters. In this paper, two chaos based cryptosystems are proposed: one based on a higher-dimensional 12D chaotic map and the other based on the Ikeda delay differential equation (DDE), both suitable for designing a real-time secure symmetric video encryption scheme. These encryption schemes employ a substitution box (S-box) to diffuse the relationship between pixels of the plain video and the cipher video, along with diffusion of the current input pixel with the previous cipher pixel, called cipher block chaining (CBC). The proposed method enhances the robustness against statistical, differential and chosen/known-plaintext attacks. A detailed analysis is carried out in this paper to demonstrate the security and uniqueness of the proposed scheme.

  11. Manticore and CS mode : parallelizable encryption with joint cipher-state authentication.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree

    2004-10-01

    We describe a new mode of encryption with inexpensive authentication, which uses information from the internal state of the cipher to provide the authentication. Our algorithms have a number of benefits: (1) the encryption has properties similar to CBC mode, yet the encipherment and authentication can be parallelized and/or pipelined, (2) the authentication overhead is minimal, and (3) the authentication process remains resistant against some IV reuse. We offer a Manticore class of authenticated encryption algorithms based on cryptographic hash functions, which support variable block sizes up to twice the hash output length and variable key lengths. A proof of security is presented for the MTC4 and Pepper algorithms. We then generalize the construction to create the Cipher-State (CS) mode of encryption that uses the internal state of any round-based block cipher as an authenticator. We provide hardware and software performance estimates for all of our constructions and give a concrete example of the CS mode of encryption that uses AES as the encryption primitive and adds a small speed overhead (10-15%) compared to AES alone.

  12. Possibilities and testing of CPRNG in block cipher mode of operation PM-DC-LM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zacek, Petr; Jasek, Roman; Malanik, David

    2016-06-08

    This paper discusses the chaotic pseudo-random number generator (CPRNG) used in the block cipher mode of operation called PM-DC-LM, one of the possible subversions of the general PM mode. The design of PM-DC-LM itself is not discussed here, only the CPRNG as a part of it, because the design is described in other papers. Possibilities for changing or improving the CPRNG are mentioned. The final part is devoted to some testing of the CPRNG, and some testing data are shown.

  13. Multicollision attack on CBC-MAC, EMAC, and XCBC-MAC of AES-128 algorithm

    NASA Astrophysics Data System (ADS)

    Brolin Sihite, Alfonso; Hayat Susanti, Bety

    2017-10-01

    A Message Authentication Code (MAC) can be constructed based on a block cipher algorithm. The CBC-MAC, EMAC, and XCBC-MAC constructions are MAC schemes of this hash-function-like kind. In this paper, we perform a multicollision attack on the CBC-MAC, EMAC, and XCBC-MAC constructions, using the AES-128 block cipher algorithm as the basic building block. The multicollision attack utilizes the concept of existential forgery on CBC-MAC. The results show that multicollisions can be obtained easily in the CBC-MAC, EMAC, and XCBC-MAC constructions.
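
The existential forgery that the attack builds on can be sketched directly. A keyed SHA-256 truncation stands in for AES-128 here (an assumption for illustration; the forgery only requires some fixed-length block function E_k):

```python
# Raw CBC-MAC: tag = E(... E(E(0 ^ m1) ^ m2) ...). For a known one-block
# message/tag pair (m1, t1), the message m1 || (m2 XOR t1) has the same
# tag as m2 -- an existential forgery requiring no key knowledge.
import hashlib

BLK = 16

def E(key: bytes, block: bytes) -> bytes:
    # stand-in keyed block function (NOT AES)
    return hashlib.sha256(key + block).digest()[:BLK]

def cbc_mac(key: bytes, msg: bytes) -> bytes:
    tag = bytes(BLK)
    for i in range(0, len(msg), BLK):
        block = msg[i:i + BLK]
        tag = E(key, bytes(a ^ b for a, b in zip(tag, block)))
    return tag

key = b"secret-key"
m1, m2 = b"A" * BLK, b"B" * BLK
t1 = cbc_mac(key, m1)
# Forged message: the t1 chaining value cancels inside the second block.
forged_msg = m1 + bytes(a ^ b for a, b in zip(m2, t1))
```

Since `cbc_mac(key, forged_msg)` equals `cbc_mac(key, m2)`, an attacker who has seen both tags can attribute m2's tag to a message of their choosing; the multicollision attack iterates this idea.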

  14. Design of an image encryption scheme based on a multiple chaotic map

    NASA Astrophysics Data System (ADS)

    Tong, Xiao-Jun

    2013-07-01

    In order to solve the problems that chaos degenerates under limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy whose chaotic characteristics are proved by the Devaney definition. In order to produce a large key space, a Cat map named the block Cat map is also designed for the permutation process, based on multi-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by different chaotic maps. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis depending on the key and plaintext are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with the AES, DES and Logistic encryption methods, we conclude that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and offers higher speed and higher security.

  15. Quantum enigma cipher as a generalization of the quantum stream cipher

    NASA Astrophysics Data System (ADS)

    Kato, Kentaro

    2016-09-01

    Various types of randomizations for the quantum stream cipher by Y00 protocol have been developed so far. In particular, it must be noted that the analysis of immunity against correlation attacks with a new type of randomization by Hirota and Kurosawa prompted a new look at the quantum stream cipher by Y00 protocol (Quant. Inform. Process. 6(2) 2007). From the preceding study on the quantum stream cipher, we recognized that the quantum stream cipher by Y00 protocol could be generalized to a new type of physical cipher that has the potential to exceed the Shannon limit by installing additional randomization mechanisms, in accordance with the laws of quantum mechanics. We call this new type of physical random cipher the quantum enigma cipher. In this article, we introduce the recent developments of the quantum stream cipher by Y00 protocol and future plans toward the quantum enigma cipher.

  16. A pipelined FPGA implementation of an encryption algorithm based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Thirer, Nonel

    2013-05-01

    With the evolution of digital data storage and exchange, it is essential to protect confidential information from any unauthorized access. High-performance encryption algorithms have been developed and implemented in software and hardware, and many methods of attacking cipher texts have been developed as well. In recent years, the genetic algorithm has gained much interest in the cryptanalysis of cipher texts and also in encryption ciphers. This paper analyses the possibility of using the genetic algorithm as a multiple key sequence generator for an AES (Advanced Encryption Standard) cryptographic system, and of using a three-stage pipeline (with four main blocks: input data, AES core, key generator, output data) to provide fast encryption and storage/transmission of a large amount of data.

  17. A joint watermarking/encryption algorithm for verifying medical image integrity and authenticity in both encrypted and spatial domains.

    PubMed

    Bouslimi, D; Coatrieux, G; Roux, Ch

    2011-01-01

    In this paper, we propose a new joint watermarking/encryption algorithm for the purpose of verifying the reliability of medical images in both encrypted and spatial domains. It combines a substitutive watermarking algorithm, the quantization index modulation (QIM), with a block cipher algorithm, the Advanced Encryption Standard (AES), in CBC mode of operation. The proposed solution gives access to the outcomes of the image integrity check and of its origin even though the image is stored encrypted. Experimental results achieved on 8-bit encoded ultrasound images illustrate the overall performance of the proposed scheme. By making use of the AES block cipher in CBC mode, the proposed solution is compliant with or transparent to the DICOM standard.

  18. Implementation analysis of RC5 algorithm on Preneel-Govaerts-Vandewalle (PGV) hashing schemes using length extension attack

    NASA Astrophysics Data System (ADS)

    Siswantyo, Sepha; Susanti, Bety Hayat

    2016-02-01

    Preneel-Govaerts-Vandewalle (PGV) schemes comprise 64 possible single-block-length schemes that can be used to build a hash function from a block cipher. Of those 64 schemes, Preneel claimed that 4 are secure. In this paper, we apply a length extension attack to those 4 secure PGV schemes, instantiated with the RC5 algorithm in their basic construction, to test their collision resistance. The attack results show that collisions occur in all 4 secure PGV schemes. Based on the analysis, we indicate that the Feistel structure and data-dependent rotation operation in the RC5 algorithm, the XOR operations in the scheme, and the selection of the additional message block value all contribute to the occurrence of collisions.

  19. Fast encryption of RGB color digital images using a tweakable cellular automaton based schema

    NASA Astrophysics Data System (ADS)

    Faraoun, Kamel Mohamed

    2014-12-01

    We propose a new tweakable construction of block ciphers using second-order reversible cellular automata, and we apply it to encipher RGB-colored images. The proposed construction permits parallel encryption of the image content by extending the standard definition of a block cipher to take into account a supplementary parameter used as a tweak (nonce) to control the behavior of the cipher from one region of the image to another, and hence avoids the need for slow sequential encryption operating modes. The proposed construction defines a flexible pseudorandom permutation that can be used effectively to solve the electronic code book problem without the need for a specific sequential mode. Results obtained from various experiments show that the proposed schema achieves high security and execution performance, and enables an interesting mode of selective-area decryption due to the parallel character of the approach.

  20. Attack to AN Image Encryption Based on Chaotic Logistic Map

    NASA Astrophysics Data System (ADS)

    Wang, Xing-Yuan; Chen, Feng; Wang, Tian; Xu, Dahai; Ma, Yutian

    2013-10-01

    This paper offers two different attacks on a recently proposed image encryption scheme based on the chaotic logistic map. The cryptosystem under study uses an 80-bit secret key and employs two chaotic logistic maps. The initial conditions of the logistic maps are derived from the secret key by giving different weights to all its bits. Additionally, eight different procedures are used to encrypt the pixels of an image in the proposed encryption process, one of which is selected for a given pixel by the output of the logistic map. The secret key is revised after encrypting each block of 16 pixels of the image. The encryption process has weaknesses, the worst of which is that every byte of plaintext is substituted independently, so the cipher text of a byte will not change even when the other bytes have changed. As a result of this weakness, a chosen-plaintext attack and a chosen-ciphertext attack can be carried out without any knowledge of the key value to recover the ciphered image.
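
The byte-independence weakness can be illustrated with a hypothetical toy cipher in which each ciphertext byte depends only on the same-position plaintext and key bytes; one chosen plaintext then recovers the whole substitution:

```python
# Toy stand-in for the flawed design (NOT the attacked scheme itself):
# position-wise, byte-independent substitution via XOR with a key stream.
def toy_encrypt(key: bytes, pt: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(pt, key))

secret_key = bytes([7, 99, 42, 250, 13, 0, 128, 55])

# Chosen-plaintext attack: encrypt all zeros -- because no byte influences
# any other, the ciphertext IS the key stream.
recovered = toy_encrypt(secret_key, bytes(8))
```

Any real cipher in which bytes are substituted independently falls to the same per-position codebook attack, regardless of how complex the key schedule is.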

  1. Cryptanalysis on classical cipher based on Indonesian language

    NASA Astrophysics Data System (ADS)

    Marwati, R.; Yulianti, K.

    2018-05-01

    Cryptanalysis is the process of breaking a cipher through unauthorized decryption. This paper discusses the encryption of some classical cryptosystems, the breaking of a substitution cipher and a stream cipher, and ways of increasing their security. Encryption and ciphering are based on Indonesian-language text. Microsoft Word and Microsoft Excel were chosen as the ciphering and breaking tools.
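
A minimal frequency-analysis step of the kind such a paper performs in a spreadsheet, sketched on an illustrative Indonesian phrase with a simple shift cipher (the assumption that "A" is the most frequent plaintext letter is ours, made true by the chosen sample):

```python
# Break a Caesar shift by frequency counting: assume the most frequent
# ciphertext letter corresponds to the language's most frequent letter.
from collections import Counter

def shift(text: str, k: int) -> str:
    return "".join(chr((ord(c) - 65 + k) % 26 + 65) for c in text)

plaintext = "SAYASUKAMAKANNASIAYAMBAKAR"  # sample Indonesian phrase
ciphertext = shift(plaintext, 3)

counts = Counter(ciphertext)
most_common = counts.most_common(1)[0][0]
# Guess the shift from the assumed most-frequent plaintext letter 'A'.
guessed_shift = (ord(most_common) - ord("A")) % 26
```

Undoing the guessed shift recovers the plaintext; a full substitution cipher needs the same counts matched against a language-wide frequency table rather than a single letter.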

  2. Dynamic video encryption algorithm for H.264/AVC based on a spatiotemporal chaos system.

    PubMed

    Xu, Hui; Tong, Xiao-Jun; Zhang, Miao; Wang, Zhu; Li, Ling-Hao

    2016-06-01

    Video encryption schemes mostly employ the selective encryption method to encrypt parts of important and sensitive video information, aiming to ensure the real-time performance and encryption efficiency. The classic block cipher is not applicable to video encryption due to the high computational overhead. In this paper, we propose the encryption selection control module to encrypt video syntax elements dynamically which is controlled by the chaotic pseudorandom sequence. A novel spatiotemporal chaos system and binarization method is used to generate a key stream for encrypting the chosen syntax elements. The proposed scheme enhances the resistance against attacks through the dynamic encryption process and high-security stream cipher. Experimental results show that the proposed method exhibits high security and high efficiency with little effect on the compression ratio and time cost.

  3. Protecting privacy in a clinical data warehouse.

    PubMed

    Kong, Guilan; Xiao, Zhichun

    2015-06-01

    Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative in Peking University. However, a big concern for clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any size of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep the confidential data from leaking to the outside world. © The Author(s) 2014.

  4. Secure chaotic map based block cryptosystem with application to camera sensor networks.

    PubMed

    Guo, Xianfeng; Zhang, Jiashu; Khan, Muhammad Khurram; Alghathbar, Khaled

    2011-01-01

    Recently, Wang et al. presented an efficient logistic map based block encryption system. The encryption system employs feedback ciphertext to achieve plaintext dependence of sub-keys. Unfortunately, we discovered that their scheme is unable to withstand key stream attack. To improve its security, this paper proposes a novel chaotic map based block cryptosystem. At the same time, a secure architecture for camera sensor network is constructed. The network comprises a set of inexpensive camera sensors to capture the images, a sink node equipped with sufficient computation and storage capabilities and a data processing server. The transmission security between the sink node and the server is gained by utilizing the improved cipher. Both theoretical analysis and simulation results indicate that the improved algorithm can overcome the flaws and maintain all the merits of the original cryptosystem. In addition, computational costs and efficiency of the proposed scheme are encouraging for the practical implementation in the real environment as well as camera sensor network.

  5. Secure Chaotic Map Based Block Cryptosystem with Application to Camera Sensor Networks

    PubMed Central

    Guo, Xianfeng; Zhang, Jiashu; Khan, Muhammad Khurram; Alghathbar, Khaled

    2011-01-01

    Recently, Wang et al. presented an efficient logistic map based block encryption system. The encryption system employs feedback ciphertext to achieve plaintext dependence of sub-keys. Unfortunately, we discovered that their scheme is unable to withstand key stream attack. To improve its security, this paper proposes a novel chaotic map based block cryptosystem. At the same time, a secure architecture for camera sensor network is constructed. The network comprises a set of inexpensive camera sensors to capture the images, a sink node equipped with sufficient computation and storage capabilities and a data processing server. The transmission security between the sink node and the server is gained by utilizing the improved cipher. Both theoretical analysis and simulation results indicate that the improved algorithm can overcome the flaws and maintain all the merits of the original cryptosystem. In addition, computational costs and efficiency of the proposed scheme are encouraging for the practical implementation in the real environment as well as camera sensor network. PMID:22319371

  6. Exploitation of Unintentional Information Leakage from Integrated Circuits

    DTIC Science & Technology

    2011-12-01

    U.S. Defense Science Board Task Force examined the effects and risks of outsourcing high performance microchip production to foreign countries [Off05...mapping methodology is developed and demonstrated to comprehensively assess the information leakage of arbitrary block cipher implementations. The...engineering poses a serious threat since it can enable competitors or adversaries to bypass years of research and development through counterfeiting or

  7. Impact of tailored blogs and content on usage of Web CIPHER - an online platform to help policymakers better engage with evidence from research.

    PubMed

    Makkar, Steve R; Howe, Megan; Williamson, Anna; Gilham, Frances

    2016-12-01

    There is a need to develop innovations that can help bridge the gap between research and policy. Web CIPHER is an online tool designed to help policymakers better engage with research in order to increase its use in health policymaking. The aim of the present study was to test interventions to increase policymakers' usage of Web CIPHER, namely the impact of posting articles and blogs on topics relevant to the missions and scope of selected policy agencies in the Web CIPHER community. Five policy agencies were targeted for the intervention. Web CIPHER usage data were gathered over a 30-month period using Google Analytics. Time series analysis was used to evaluate whether publication of tailored articles and blogs led to significant changes in usage for all Web CIPHER members from policy agencies, including those from the five target agencies. We further evaluated whether these users showed greater increases in usage following publication of articles and blogs directly targeted at their agency, and whether these effects were moderated by the blog author. Web CIPHER usage gradually increased over time and was significantly predicted by the number of articles, but not blogs, posted throughout the study period. Publication of articles on sexual and reproductive health was followed by sustained increases in usage among all users, including users from the policy agency that targets this area. This effect of topic relevance did not occur for the four remaining target agencies. Finally, page views were higher for articles targeted at one's own agency compared to other agencies. This effect also occurred for blogs, particularly when the author was internal to one's agency. The findings suggest that Web CIPHER usage in general was motivated by general interest, engagement and appeal, as opposed to the agency specificity of content and work relevance. Blogs in and of themselves may not be effective at promoting usage. Thus, in order to increase policymakers' engagement with research through similar online platforms, a potentially effective approach would be to post abundant, frequently updated, engaging, interesting and widely appealing content irrespective of form.

  8. Differential Fault Analysis on CLEFIA

    NASA Astrophysics Data System (ADS)

    Chen, Hua; Wu, Wenling; Feng, Dengguo

    CLEFIA is a 128-bit block cipher recently proposed by Sony Corporation. The fundamental structure of CLEFIA is a generalized Feistel structure consisting of 4 data lines. In this paper, the strength of CLEFIA against the differential fault attack is explored. Our attack adopts the byte-oriented model of random faults. By inducing a random one-byte fault in one round, four bytes of faults can be obtained simultaneously in the next round, which efficiently reduces the total number of fault inductions required by the attack. After attacking the encryptions of the last several rounds, the original secret key can be recovered based on some analysis of the key schedule. The data complexity analysis and experiments show that only about 18 faulty ciphertexts are needed to recover the entire 128-bit secret key, and about 54 faulty ciphertexts for 192/256-bit keys.

  9. DNA-based cryptographic methods for data hiding in DNA media.

    PubMed

    Marwan, Samiha; Shawish, Ahmed; Nagaty, Khaled

    2016-12-01

    Information security can be achieved using cryptography, steganography or a combination of them, where data is first encrypted using any of the available cryptography techniques and then hidden in a hiding medium. Recently, the famous genomic DNA has been introduced as a hiding medium, known as DNA steganography, due to its notable ability to hide huge data sets with a high level of randomness and hence security. Despite the numerous cryptography techniques, to our knowledge only the Vigenère cipher and the DNA-based Playfair cipher have been combined with DNA steganography, which leaves room for investigating other techniques and coming up with new improvements. This paper presents a comprehensive analysis of the DNA-based Playfair, Vigenère, RSA and AES ciphers, each combined with a DNA hiding technique. The conducted analysis reports the performance diversity of each combined technique in terms of security, speed, and hiding capacity, in addition to both key size and data size. Moreover, this paper proposes a modification of the current combined DNA-based Playfair cipher technique, which makes it not only simple and fast but also provides a significantly higher hiding capacity and security. The conducted extensive experimental studies confirm such outstanding performance in comparison with all the discussed combined techniques. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Proving Chaotic Behavior of CBC Mode of Operation

    NASA Astrophysics Data System (ADS)

    Abidi, Abdessalem; Wang, Qianxue; Bouallegue, Belgacem; Machhout, Mohsen; Guyeux, Christophe

    2016-06-01

    The cipher block chaining (CBC) mode of operation was invented by IBM (International Business Machines) in 1976. It is a very popular way of encrypting that is used in various applications. In this paper, we mathematically prove that, under some conditions, the CBC mode of operation can admit chaotic behavior in the sense of Devaney. Some cases are studied in detail in order to provide evidence for this claim.

  11. Developing of Library for Proofs of Data Possession in Charm

    DTIC Science & Technology

    2013-06-01

    API Application Programmer Interface; DTP Datatype-Preserving Encryption; FedRAMP U.S...proposed block-cipher mode for Datatype-Preserving Encryption (DTP) uses the Knuth Shuffle in one of its steps [19]. It may be advantageous to...http://www.clustal.org/omega/clustalo-api/util_8c.html. [19] U. T. Mattsson, “Format-controlling encryption using datatype-preserving encryption

  12. A Study on the Stream Cipher Embedded Magic Square of Random Access Files

    NASA Astrophysics Data System (ADS)

    Liu, Chenglian; Zhao, Jian-Ming; Rafsanjani, Marjan Kuchaki; Shen, Yijuan

    2011-09-01

    Magic squares and stream ciphers are both interesting and well-studied topics. In this paper, we propose a new scheme that applies a stream cipher to random access files based on the magic square method. Two thresholds are required to secure the data: decrypting with the stream cipher alone is not enough to recover the original source. In addition, we improve the stream cipher model to strengthen its defenses efficiently while retaining high speed in computing most parts of the key stream generator.

  13. The Combination of RSA And Block Chiper Algorithms To Maintain Message Authentication

    NASA Astrophysics Data System (ADS)

    Yanti Tarigan, Sepri; Sartika Ginting, Dewi; Lumban Gaol, Melva; Lorensi Sitompul, Kristin

    2017-12-01

    The RSA algorithm is a public key algorithm based on prime numbers and is still used today. Its strength lies in the exponentiation process and in factoring the modulus into its 2 prime factors, which remains difficult to do. The RSA scheme itself adopts the block cipher scheme: prior to encryption, the plaintext is divided into several blocks of the same length, where the plaintext and ciphertext are integers between 1 and n, n is typically 1024 bits, and the block length itself is smaller than or equal to log2(n)+1. With the combination of the RSA algorithm and a block cipher, it is expected that the authentication of the plaintext is secure. The message is encrypted with the RSA algorithm first and then encrypted again using the block cipher. Conversely, the ciphertext is decrypted with the block cipher first and then decrypted again with the RSA algorithm. This paper suggests a combination of the RSA algorithm and a block cipher to secure data.
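    The RSA stage described above can be sketched with textbook-sized parameters. The tiny primes below are the classic toy example (a real modulus would be 1024 bits or more), and no padding scheme is shown; this is not the paper's full combined construction.

```python
# Textbook RSA sketch with toy primes (illustrative values only).
p, q = 61, 53
n = p * q              # modulus, 3233
phi = (p - 1) * (q - 1)
e = 17                 # public exponent, gcd(e, phi) == 1
d = pow(e, -1, phi)    # private exponent (Python 3.8+ modular inverse)

def rsa_encrypt(m: int) -> int:
    """Encrypt an integer block m, 1 <= m < n."""
    return pow(m, e, n)

def rsa_decrypt(c: int) -> int:
    return pow(c, d, n)
```

For longer messages, each fixed-length block is interpreted as an integer smaller than n and encrypted separately, matching the block-splitting described in the abstract.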

  14. A new feedback image encryption scheme based on perturbation with dynamical compound chaotic sequence cipher generator

    NASA Astrophysics Data System (ADS)

    Tong, Xiaojun; Cui, Minggen; Wang, Zhu

    2009-07-01

    The design of a new compound two-dimensional chaotic function is presented by exploiting two one-dimensional chaotic functions which switch randomly; the design is used as a chaotic sequence generator, which is proven chaotic by Devaney's definition. The properties of compound chaotic functions are also proved rigorously. In order to improve robustness against differential cryptanalysis and produce an avalanche effect, a new feedback image encryption scheme is proposed using the new compound chaos by selecting one of the two one-dimensional chaotic functions randomly, and a new method of image pixel permutation and substitution is designed in detail by random row and column control of the array based on the compound chaos. The results from entropy analysis, difference analysis, statistical analysis, sequence randomness analysis, and cipher sensitivity analysis depending on key and plaintext prove that the compound chaotic sequence cipher can resist cryptanalytic, statistical and brute-force attacks; moreover, it accelerates encryption speed and achieves a higher level of security. By means of the dynamical compound chaos and perturbation technology, the paper solves the problem of the low computational precision of one-dimensional chaotic functions.

  15. Proof of cipher text ownership based on convergence encryption

    NASA Astrophysics Data System (ADS)

    Zhong, Weiwei; Liu, Zhusong

    2017-08-01

    Cloud storage systems save disk space and bandwidth through deduplication technology, but this technology has been the target of security attacks: an attacker can obtain the original file by using only its hash value to deceive the server into granting file ownership. To solve these security problems and meet the different security requirements of files in cloud storage systems, an efficient information-theoretically secure proof-of-ownership scheme is proposed. This scheme protects the data through the convergent encryption method and uses an improved block-level proof-of-ownership scheme, enabling block-level client-side deduplication to achieve an efficient and secure cloud storage deduplication scheme.

  16. Symmetric Stream Cipher using Triple Transposition Key Method and Base64 Algorithm for Security Improvement

    NASA Astrophysics Data System (ADS)

    Nurdiyanto, Heri; Rahim, Robbi; Wulan, Nur

    2017-12-01

    Symmetric cryptographic algorithms are known to have many weaknesses in the encryption process compared with asymmetric algorithms. A symmetric stream cipher is an algorithm that works by XORing the plaintext with the key. To improve the security of the symmetric stream cipher algorithm, we improvise by using a Triple Transposition Key, developed from the Transposition Cipher, and also apply the Base64 algorithm as the final step of encryption. Experiments show that the resulting ciphertext is sufficiently random.
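    The baseline XOR-then-Base64 pipeline described above can be sketched as follows. The Triple Transposition Key stage is omitted, and the repeating-key XOR shown is only the simple stream stage the paper starts from, so this is a sketch of the pipeline shape, not the proposed scheme.

```python
import base64
from itertools import cycle

def xor_stream(data: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with a repeating key stream."""
    return bytes(d ^ k for d, k in zip(data, cycle(key)))

def xor_b64_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR stage followed by Base64 as the final encoding step.
    return base64.b64encode(xor_stream(plaintext, key))

def xor_b64_decrypt(token: bytes, key: bytes) -> bytes:
    return xor_stream(base64.b64decode(token), key)
```

Note that Base64 is an encoding, not encryption; it only gives the ciphertext a printable form at the end of the pipeline.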

  17. Ganzua: A Cryptanalysis Tool for Monoalphabetic and Polyalphabetic Ciphers

    ERIC Educational Resources Information Center

    Garcia-Pasquel, Jesus Adolfo; Galaviz, Jose

    2006-01-01

    Many introductory courses to cryptology and computer security start with or include a discussion of classical ciphers that usually contemplates some cryptanalysis techniques used to break them. Ganzua (picklock in Spanish) is an application designed to assist the cryptanalysis of ciphertext obtained with monoalphabetic or polyalphabetic ciphers.…

  18. A Scheme for Obtaining Secure S-Boxes Based on Chaotic Baker's Map

    NASA Astrophysics Data System (ADS)

    Gondal, Muhammad Asif; Abdul Raheem; Hussain, Iqtadar

    2014-09-01

    In this paper, a method for obtaining cryptographically strong 8 × 8 substitution boxes (S-boxes) is presented. The method is based on chaotic baker's map and a "mini version" of a new block cipher with block size 8 bits and can be easily and efficiently performed on a computer. The cryptographic strength of some 8 × 8 S-boxes randomly produced by the method is analyzed. The results show (1) all of them are bijective; (2) the nonlinearity of each output bit of them is usually about 100; (3) all of them approximately satisfy the strict avalanche criterion and output bits independence criterion; (4) they all have an almost equiprobable input/output XOR distribution.

  19. Evaluation of Information Leakage from Cryptographic Hardware via Common-Mode Current

    NASA Astrophysics Data System (ADS)

    Hayashi, Yu-Ichi; Homma, Naofumi; Mizuki, Takaaki; Sugawara, Takeshi; Kayano, Yoshiki; Aoki, Takafumi; Minegishi, Shigeki; Satoh, Akashi; Sone, Hideaki; Inoue, Hiroshi

    This paper presents a possibility of Electromagnetic (EM) analysis against cryptographic modules outside their security boundaries. The mechanism behind the information leakage is explained from the view point of Electromagnetic Compatibility: electric fluctuation released from cryptographic modules can conduct to peripheral circuits based on ground bounce, resulting in radiation. We demonstrate the consequence of the mechanism through experiments where the ISO/IEC standard block cipher AES (Advanced Encryption Standard) is implemented on an FPGA board and EM radiations from power and communication cables are measured. Correlation Electromagnetic Analysis (CEMA) is conducted in order to evaluate the information leakage. The experimental results show that secret keys are revealed even though there are various disturbing factors such as voltage regulators and AC/DC converters between the target module and the measurement points. We also discuss information-suppression techniques as electrical-level countermeasures against such CEMAs.

  20. Experiments of 10 Gbit/sec quantum stream cipher applicable to optical Ethernet and optical satellite link

    NASA Astrophysics Data System (ADS)

    Hirota, Osamu; Ohhata, Kenichi; Honda, Makoto; Akutsu, Shigeto; Doi, Yoshifumi; Harasawa, Katsuyoshi; Yamashita, Kiichi

    2009-08-01

    The security issue for the next generation optical network, which realizes Cloud Computing System Service with data centers, is an urgent problem. In such a network, encryption at the physical layer, which provides super security and small delay, should be employed. It must, however, provide very high speed encryption because the basic link is operated at 2.5 Gbit/sec or 10 Gbit/sec. The quantum stream cipher by the Yuen-2000 protocol (Y-00) is a completely new type of random cipher, the so-called Gauss-Yuen random cipher, which can break the Shannon limit for the symmetric key cipher. We develop such a cipher with a good balance of security, speed and cost performance. In the SPIE conference on quantum communication and quantum imaging V, we reported a demonstration of a 2.5 Gbit/sec system for the commercial link and proposed how to improve it to 10 Gbit/sec. This paper reports a demonstration of the Y-00 cipher system working at 10 Gbit/sec. A transmission test in a laboratory is carried out to obtain basic data on which parameters are important for operation in real commercial networks. In addition, we give some theoretical results on the security. It is clarified that the necessary condition to break the Shannon limit indeed requires quantum phenomena, and that a fully information-theoretically secure system is available in the satellite link application.

  1. A novel chaotic stream cipher and its application to palmprint template protection

    NASA Astrophysics Data System (ADS)

    Li, Heng-Jian; Zhang, Jia-Shu

    2010-04-01

    Based on a coupled nonlinear dynamic filter (NDF), a novel chaotic stream cipher is presented in this paper and employed to protect palmprint templates. The chaotic pseudorandom bit generator (PRBG) based on a coupled NDF, which is constructed in an inverse flow, can generate multiple bits per iteration and satisfies the security requirements of cipher design. The stream cipher is then employed to generate cancelable competitive-code palmprint biometrics for template protection. The proposed cancelable palmprint authentication system depends on two factors: the palmprint biometric and the password/token. Therefore, the system provides high confidence and also protects the user's privacy. The experimental results of verification on the Hong Kong PolyU Palmprint Database show that the proposed approach has a large template re-issuance ability and that the equal error rate can reach 0.02%. The performance of the palmprint template protection scheme proves the good practicability and security of the proposed stream cipher.

  2. An Unequal Secure Encryption Scheme for H.264/AVC Video Compression Standard

    NASA Astrophysics Data System (ADS)

    Fan, Yibo; Wang, Jidong; Ikenaga, Takeshi; Tsunoo, Yukiyasu; Goto, Satoshi

    H.264/AVC is the newest video coding standard. It has many new features which can easily be used for video encryption. In this paper, we propose a new scheme for video encryption for the H.264/AVC video compression standard. We define Unequal Secure Encryption (USE) as an approach that applies different encryption schemes (with different security strengths) to different parts of the compressed video data. The USE scheme includes two parts: video data classification and unequal secure video data encryption. First, we classify the video data into two partitions: an important data partition and an unimportant data partition. The important data partition is small and receives strong protection, while the unimportant data partition is large and receives lighter protection. Second, we use AES as a block cipher to encrypt the important data partition and LEX as a stream cipher to encrypt the unimportant data partition. AES is the most widely used symmetric cipher and ensures high security. LEX is a new stream cipher based on AES whose computational cost is much lower. In this way, our scheme achieves both high security and low computational cost. Besides the USE scheme, we propose a low-cost design of a hybrid AES/LEX encryption module. Our experimental results show that the computational cost of the USE scheme is low (about 25% of naive encryption at Level 0 with VEA used). The hardware cost of the hybrid AES/LEX module is 4678 gates and the AES encryption throughput is about 50 Mbps.

  3. An Implementation of RC4+ Algorithm and Zig-zag Algorithm in a Super Encryption Scheme for Text Security

    NASA Astrophysics Data System (ADS)

    Budiman, M. A.; Amalia; Chayanie, N. I.

    2018-03-01

    Cryptography is the art and science of using mathematical methods to preserve message security. There are two types of cryptography, namely classical and modern cryptography. Nowadays, most people would rather use modern cryptography than classical cryptography, because it is harder to break. One classical algorithm is the Zig-zag algorithm, which uses the transposition technique: the original message is unreadable unless a person has the key to decrypt the message. To improve security, the Zig-zag Cipher is combined with the RC4+ Cipher, one of the symmetric key algorithms in the form of a stream cipher. The two algorithms are combined to form a super-encryption. By combining these two algorithms, the message becomes harder for a cryptanalyst to break. The results show that the complexity of the combined algorithm is Θ(n²), while the complexities of the Zig-zag Cipher and RC4+ Cipher are Θ(n²) and Θ(n), respectively.
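    The Zig-zag transposition stage can be sketched as the generic textbook rail fence cipher: characters are written diagonally down and up across a number of "rails" and read off row by row. This is the standard construction and may differ in detail from the paper's variant.

```python
def rail_fence_encrypt(text: str, rails: int) -> str:
    """Zig-zag (rail fence) transposition: write diagonally, read by row.

    Assumes rails >= 2.
    """
    rows = [[] for _ in range(rails)]
    r, step = 0, 1
    for ch in text:
        rows[r].append(ch)
        if r == 0:
            step = 1          # bounce off the top rail
        elif r == rails - 1:
            step = -1         # bounce off the bottom rail
        r += step
    return "".join("".join(row) for row in rows)
```

In the super-encryption, the output of this transposition stage would then be fed through the RC4+ stream cipher.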

  4. Defense in Depth Added to Malicious Activities Simulation Tools (MAST)

    DTIC Science & Technology

    2015-09-01

    cipher suites. The TLS Handshake is a combination of three components: handshake, change cipher spec, and alert. (1) The Handshake (Hello) The...TLS Handshake, specifically the “Hello” portion, is designed to negotiate session parameters (cipher suite). The client informs the server of the...protocols and standards that it supports and then the server selects the highest common protocols and standards. Specifically, the Client Hello message

  5. An implementation of super-encryption using RC4A and MDTM cipher algorithms for securing PDF Files on android

    NASA Astrophysics Data System (ADS)

    Budiman, M. A.; Rachmawati, D.; Parlindungan, M. R.

    2018-03-01

    MDTM is a classical symmetric cryptographic algorithm. As with other classical algorithms, the MDTM Cipher is easy to implement, but it is less secure than modern symmetric algorithms. To make it more secure, the stream cipher RC4A is added, and the cryptosystem thus becomes a super encryption. In this process, plaintexts derived from PDFs are first encrypted with the MDTM Cipher algorithm and then encrypted once more with the RC4A algorithm. The test results show that the complexity is Θ(n²) and that the running time is linearly proportional to the length of the plaintext and the keys entered.

  6. Hybrid cryptosystem implementation using fast data encipherment algorithm (FEAL) and goldwasser-micali algorithm for file security

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Budiman, M. A.; Siburian, W. S. E.

    2018-05-01

    In the process of exchanging files, security is indispensable to avoid data theft. Cryptography is one of the sciences used to secure data by encoding it. The Fast Data Encipherment Algorithm (FEAL) is a symmetric block cipher algorithm, so the file to be protected is encrypted and decrypted using FEAL. To optimize the security of the data, the session key used in FEAL is encoded with the Goldwasser-Micali algorithm, an asymmetric cryptographic algorithm based on a probabilistic concept. In the encryption process, the key is converted into binary form, and the random selection of the values of x causes the resulting cipher key to differ for each binary value. The merger of symmetric and asymmetric algorithms is called a hybrid cryptosystem. The combination of the FEAL and Goldwasser-Micali algorithms can restore the message to its original form; the time FEAL requires for encryption and decryption is directly proportional to the length of the message, whereas for the Goldwasser-Micali algorithm the message length is not directly proportional to the encryption and decryption time.
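    Goldwasser-Micali encrypts one bit at a time using quadratic residuosity, which is why its running time is driven by randomized per-bit work rather than message length alone. A hedged sketch with toy primes follows (real keys use large primes, both chosen ≡ 3 mod 4 here so that -1 serves as the required non-residue; the paper's FEAL session-key handling is not reproduced).

```python
import random

# Toy Goldwasser-Micali parameters (illustrative only).
p, q = 499, 547          # small primes, both ≡ 3 (mod 4)
N = p * q
y = N - 1                # -1 mod N: quadratic non-residue with Jacobi symbol +1

def gm_encrypt_bit(b: int) -> int:
    """Encrypt one bit: c = y^b * x^2 mod N for random x coprime to N."""
    x = random.randrange(1, N)
    while x % p == 0 or x % q == 0:
        x = random.randrange(1, N)
    return (pow(y, b, N) * pow(x, 2, N)) % N

def gm_decrypt_bit(c: int) -> int:
    # c is a quadratic residue mod p exactly when the bit was 0
    # (Euler's criterion, using the private factor p).
    return 0 if pow(c, (p - 1) // 2, p) == 1 else 1
```

Because a fresh random x is drawn per bit, encrypting the same bit twice yields different ciphertexts, which is the probabilistic property the abstract refers to.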

  7. Meet-in-the-Middle Preimage Attacks on Hash Modes of Generalized Feistel and Misty Schemes with SP Round Function

    NASA Astrophysics Data System (ADS)

    Moon, Dukjae; Hong, Deukjo; Kwon, Daesung; Hong, Seokhie

    We assume that the domain extender is the Merkle-Damgård (MD) scheme and that the message is padded with a ‘1’, a minimum number of ‘0’s, and a fixed-size length field, so that the length of the padded message is a multiple of the block length. Under this assumption, we analyze the security of the hash mode when the compression function follows the Davies-Meyer (DM) scheme and the underlying block cipher is one of the plain Feistel or Misty schemes, or the generalized Feistel or Misty schemes with a Substitution-Permutation (SP) round function. We base this work on Meet-in-the-Middle (MitM) preimage attack techniques and develop several useful initial structures.
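    The padding rule assumed above (a '1' bit, minimal '0's, then a fixed-size length field) can be sketched for byte-aligned messages. The 64-byte block and 64-bit length field below are illustrative choices (they match MD5/SHA-1-style padding), not necessarily the paper's exact parameters.

```python
def md_pad(msg: bytes, block_len: int = 64) -> bytes:
    """MD-style padding: 0x80 (a '1' bit plus seven '0' bits),
    minimal zero bytes, then the message bit length in 8 bytes."""
    bit_len = 8 * len(msg)
    padded = msg + b"\x80"
    while (len(padded) + 8) % block_len != 0:   # reserve 8 bytes for the length
        padded += b"\x00"
    return padded + bit_len.to_bytes(8, "big")
```

The result is always a whole number of blocks, which is what lets the MD domain extender iterate the compression function cleanly.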

  8. From Greeks to Today: Cipher Trees and Computer Cryptography.

    ERIC Educational Resources Information Center

    Grady, M. Tim; Brumbaugh, Doug

    1988-01-01

    Explores the use of computers for teaching mathematical models of transposition ciphers. Illustrates the ideas, includes activities and extensions, provides a mathematical model and includes computer programs to implement these topics. (MVL)

  9. Implementation of Rivest Shamir Adleman Algorithm (RSA) and Vigenere Cipher In Web Based Information System

    NASA Astrophysics Data System (ADS)

    Aryanti, Aryanti; Mekongga, Ikhthison

    2018-02-01

    Data security and confidentiality are among the most important aspects of information systems today. One way to secure data is cryptography. In this study, we developed a data security system implementing the Rivest-Shamir-Adleman (RSA) and Vigenère Cipher cryptographic algorithms. The research combines the RSA and Vigenère Cipher algorithms on document files in Word, Excel, and PDF formats. The application includes encryption and decryption of data and is built with PHP and MySQL. Data encryption is done on the transmitting side through RSA cryptographic calculations using the public key, followed by the Vigenère Cipher algorithm, which also uses a public key. On the receiving side, decryption uses the Vigenère Cipher algorithm, still with the public key, and then the RSA algorithm with the private key. Test results show that the system can encrypt, decrypt, and transmit files. Tests performed on encryption and decryption with different file sizes show that file size affects the process: the larger the file, the longer the encryption and decryption take.
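    The Vigenère stage of such a system can be sketched as the classic letter-shift cipher (the RSA stage and the paper's specific key handling are omitted; this is the textbook algorithm only):

```python
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Classic Vigenère over A-Z; non-letters pass through unchanged.

    Assumes the key consists of letters.
    """
    out, j = [], 0
    for ch in text:
        if ch.isalpha():
            shift = ord(key[j % len(key)].upper()) - 65
            if decrypt:
                shift = -shift
            base = 65 if ch.isupper() else 97
            out.append(chr((ord(ch) - base + shift) % 26 + base))
            j += 1
        else:
            out.append(ch)
    return "".join(out)
```

In the combined scheme, the RSA-encrypted output would be passed through this shift stage (and the inverse order applied on the receiving side).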

  10. Exploring Hill Ciphers with Graphing Calculators.

    ERIC Educational Resources Information Center

    St. John, Dennis

    1998-01-01

    Explains how to code and decode messages using Hill ciphers which combine matrix multiplication and modular arithmetic. Discusses how a graphing calculator can facilitate the matrix and modular arithmetic used in the coding and decoding procedures. (ASK)
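    The matrix-and-modular arithmetic the article teaches can be sketched for 2×2 digraphs over Z26. The key matrix below is the standard textbook example; a key is valid only when its determinant is invertible mod 26.

```python
# Standard 2x2 Hill cipher example over Z26 (illustrative key).
K = [[3, 3], [2, 5]]          # det = 9, gcd(9, 26) == 1, so K is invertible

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def inv2_mod26(m):
    d = pow(det2(m) % 26, -1, 26)      # modular inverse of the determinant
    return [[( m[1][1] * d) % 26, (-m[0][1] * d) % 26],
            [(-m[1][0] * d) % 26, ( m[0][0] * d) % 26]]

def hill(text, m):
    """Encrypt (or decrypt, with the inverse matrix) even-length A-Z text."""
    nums = [ord(c) - 65 for c in text]
    out = []
    for i in range(0, len(nums), 2):
        x, y = nums[i], nums[i + 1]
        out.append(chr((m[0][0] * x + m[0][1] * y) % 26 + 65))
        out.append(chr((m[1][0] * x + m[1][1] * y) % 26 + 65))
    return "".join(out)
```

Decryption applies the same routine with the inverse matrix, which is exactly the matrix-inverse-mod-26 computation a graphing calculator can assist with.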

  11. William Friedman, Geneticist Turned Cryptographer

    PubMed Central

    Goldman, Irwin L.

    2017-01-01

    William Friedman (1891–1969), trained as a plant geneticist at Cornell University, was employed at Riverbank Laboratories by the eccentric millionaire George Fabyan to work on wheat breeding. Friedman, however, soon became intrigued by and started working on a pet project of Fabyan’s involving the conjecture that Francis Bacon, a polymath known for the study of ciphers, was the real author of Shakespeare’s plays. Thus, beginning in ∼1916, Friedman turned his attention to the so called “Baconian cipher,” and developed decryption techniques that bore similarity to approaches for solving problems in population genetics. His most significant, indeed pathbreaking, work used ideas from genetics and statistics, focusing on analysis of the frequencies of letters in language use. Although he had transitioned from being a geneticist to a cryptographer, his earlier work had resonance in his later pursuits. He soon began working directly for the United States government and produced solutions used to solve complex military ciphers, in particular to break the Japanese Purple code during World War II. Another important legacy of his work was the establishment of the Signal Intelligence Service and eventually the National Security Agency. PMID:28476859

  12. William Friedman, Geneticist Turned Cryptographer.

    PubMed

    Goldman, Irwin L

    2017-05-01

    William Friedman (1891-1969), trained as a plant geneticist at Cornell University, was employed at Riverbank Laboratories by the eccentric millionaire George Fabyan to work on wheat breeding. Friedman, however, soon became intrigued by and started working on a pet project of Fabyan's involving the conjecture that Francis Bacon, a polymath known for the study of ciphers, was the real author of Shakespeare's plays. Thus, beginning in ∼1916, Friedman turned his attention to the so called "Baconian cipher," and developed decryption techniques that bore similarity to approaches for solving problems in population genetics. His most significant, indeed pathbreaking, work used ideas from genetics and statistics, focusing on analysis of the frequencies of letters in language use. Although he had transitioned from being a geneticist to a cryptographer, his earlier work had resonance in his later pursuits. He soon began working directly for the United States government and produced solutions used to solve complex military ciphers, in particular to break the Japanese Purple code during World War II. Another important legacy of his work was the establishment of the Signal Intelligence Service and eventually the National Security Agency. Copyright © 2017 by the Genetics Society of America.

  13. Infrared spectrometry studies: Spectral digital data acquisition system (1971 version)

    NASA Technical Reports Server (NTRS)

    Lu, L.; Lyon, R. J. P.

    1971-01-01

    The construction of the Stanford Spectral Digital Data Acquisition System is described. The objective of the system is to record both the spectral distribution of incoming radiation from rock samples measured by the spectroradiometer (Exotech Model 10-34 Circular Variable Filter Infrared Spectroradiometer) and other weather information. This system is designed for both laboratory and field measurement programs. The multichannel inputs (8 channels) of the system are as follows: Ch 1 the spectroradiometer, Ch 2 the radiometer (PRT-5), and Ch 3 to Ch 8 the weather information. The system records data from channel 1 and channel 2 alternately 48 times before a fast sweep across the six weather channels, forming a single scan in the scan counter. The operation is illustrated in a block diagram, and the theory of operation is described. The outputs are written on a 7-track magnetic tape in IBM-compatible form. The format of the tape and the playback computer programs are included. The micro-pac digital modules and a CIPHER model 70 tape recorder (Cipher Data Products) are used. One of the major characteristics of this system is that it is externally clocked by the spectroradiometer instead of taking data at intervals of various wavelengths using internal clocking.

  14. Coherent pulse position modulation quantum cipher

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sohma, Masaki; Hirota, Osamu

    2014-12-04

    On the basis of fundamental idea of Yuen, we present a new type of quantum random cipher, where pulse position modulated signals are encrypted in the picture of quantum Gaussian wave form. We discuss the security of our proposed system with a phase mask encryption.

  15. An Image Encryption Algorithm Based on Information Hiding

    NASA Astrophysics Data System (ADS)

    Ge, Xin; Lu, Bin; Liu, Fenlin; Gong, Daofu

    Aiming to resolve the conflict between security and efficiency in the design of chaotic image encryption algorithms, an image encryption algorithm based on information hiding and the “one-time pad” idea is proposed. A random parameter is introduced to ensure a different keystream for each encryption, which gives the scheme the character of a “one-time pad” and improves the security of the algorithm substantially without a significant increase in algorithm complexity. The random parameter is embedded into the ciphered image with information hiding technology, which avoids negotiation for its transport and makes the application of the algorithm easier. Algorithm analysis and experiments show that the algorithm is secure against chosen-plaintext attack, differential attack and divide-and-conquer attack, and has good statistical properties in ciphered images.

  16. Security Analysis of Some Diffusion Mechanisms Used in Chaotic Ciphers

    NASA Astrophysics Data System (ADS)

    Zhang, Leo Yu; Zhang, Yushu; Liu, Yuansheng; Yang, Anjia; Chen, Guanrong

    As a variant of the substitution-permutation network, the permutation-diffusion structure has received extensive attention in the field of chaotic cryptography over the last three decades. Because of the high implementation speed and nonlinearity over GF(2), the Galois field of two elements, mixing modulo addition/multiplication and Exclusive OR becomes very popular in various designs to achieve the desired diffusion effect. This paper reports that some diffusion mechanisms based on modulo addition/multiplication and Exclusive OR are not resistant to plaintext attacks as claimed. By cracking several recently proposed chaotic ciphers as examples, it is demonstrated that a good understanding of the strength and weakness of these crypto-primitives is crucial for designing more practical chaotic encryption algorithms in the future.

  17. Implementation of digital image encryption algorithm using logistic function and DNA encoding

    NASA Astrophysics Data System (ADS)

    Suryadi, MT; Satria, Yudi; Fauzi, Muhammad

    2018-03-01

    Cryptography is a method to secure information that might be in the form of a digital image. Based on past research, in order to increase the security level of chaos-based and DNA-based encryption algorithms, an encryption algorithm using a logistic function and DNA encoding was proposed. The digital image encryption algorithm uses DNA encoding to map the pixel values to DNA bases and scrambles them with DNA addition, DNA complement, and XOR operations. The logistic function in this algorithm is used as the random number generator needed in the DNA complement and XOR operations. The test results show that the PSNR values of the cipher images are 7.98-7.99 dB, the entropy values are close to 8, the histograms of the cipher images are uniformly distributed, and the correlation coefficients of the cipher images are near 0. Thus, the cipher image can be decrypted perfectly and the encryption algorithm has good resistance to entropy and statistical attacks.
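    A hedged sketch of using the logistic map as a keystream generator for byte-wise XOR follows. The parameters r, x0 and the burn-in count are illustrative assumptions, and the paper's DNA addition/complement stages are not reproduced; only the logistic-map-as-RNG idea is shown.

```python
def logistic_keystream(x0: float, n: int, r: float = 3.99, burn: int = 100) -> bytes:
    """Iterate x -> r*x*(1-x) and quantize each state to a byte."""
    x = x0
    for _ in range(burn):           # discard transient iterations
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_encrypt(data: bytes, x0: float) -> bytes:
    """XOR data with the chaotic keystream; applying twice decrypts."""
    ks = logistic_keystream(x0, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))
```

Because XOR with the same keystream is its own inverse, decryption is just a second application with the same initial condition x0 (the key).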

  18. Cryptology

    ERIC Educational Resources Information Center

    Tech Directions, 2011

    2011-01-01

    Cryptology, or cryptography, is the study of writing and deciphering hidden messages in codes, ciphers, and writings. It is almost as old as writing itself. Ciphers are messages in which letters are rearranged or substituted for other letters or numbers. Codes are messages in which letters are replaced by letter groups, syllables, or sentences.…

  19. Escaping Embarrassment: Face-Work in the Rap Cipher

    ERIC Educational Resources Information Center

    Lee, Jooyoung

    2009-01-01

    How do individuals escape embarrassing moments in interaction? Drawing from ethnographic fieldwork, in-depth interviews, and video recordings of weekly street corner ciphers (impromptu rap sessions), this paper expands Goffman's theory of defensive and protective face-work. The findings reveal formulaic and indirect dimensions of face-work. First,…

  20. Investigating the structure preserving encryption of high efficiency video coding (HEVC)

    NASA Astrophysics Data System (ADS)

    Shahid, Zafar; Puech, William

    2013-02-01

    This paper presents a novel method for the real-time protection of the new emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which is significantly different from CABAC entropy coding in H.264/AVC. In CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context-aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture and objects.

  1. Implementation of Super-Encryption with Trithemius Algorithm and Double Transposition Cipher in Securing PDF Files on Android Platform

    NASA Astrophysics Data System (ADS)

    Budiman, M. A.; Rachmawati, D.; Jessica

    2018-03-01

    This study aims to combine the Trithemius algorithm and the Double Transposition Cipher for file security, implemented as an Android-based application. The parameters examined are the real running time and the complexity value. The type of file used is a file in PDF format. The overall result shows that the complexity of the two algorithms with the super encryption method is Θ(n²). However, the encryption process using the Trithemius algorithm is much faster than using the Double Transposition Cipher, and the processing time is linearly proportional to the length of the plaintext and password.

  2. Joint Schemes for Physical Layer Security and Error Correction

    ERIC Educational Resources Information Center

    Adamo, Oluwayomi

    2011-01-01

    The major challenges facing resource constraint wireless devices are error resilience, security and speed. Three joint schemes are presented in this research which could be broadly divided into error correction based and cipher based. The error correction based ciphers take advantage of the properties of LDPC codes and Nordstrom Robinson code. A…

  3. A Novel Image Encryption Scheme Based on Intertwining Chaotic Maps and RC4 Stream Cipher

    NASA Astrophysics Data System (ADS)

    Kumari, Manju; Gupta, Shailender

    2018-03-01

    As systems are enabling us to transmit large chunks of data, both in the form of text and images, there is a need to explore algorithms which can provide higher security without increasing the time complexity significantly. This paper proposes an image encryption scheme which uses intertwining chaotic maps and the RC4 stream cipher to encrypt/decrypt images. The scheme employs a chaotic map for the confusion stage and for generation of the key for the RC4 cipher. The RC4 cipher uses this key to generate random sequences which are used to implement an efficient diffusion process. The algorithm is implemented in MATLAB-2016b and various performance metrics are used to evaluate its efficacy. The proposed scheme produces highly scrambled encrypted images and can resist statistical, differential and brute-force search attacks. The peak signal-to-noise ratio values are quite similar to those of other schemes, and the entropy values are close to ideal. In addition, the scheme is very practical, since it has the lowest time complexity among its counterparts.
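    The RC4 keystream used in such diffusion stages follows the standard key-scheduling (KSA) and pseudo-random generation (PRGA) algorithms; a minimal sketch is below. The intertwining chaotic maps that derive the key in the paper are not reproduced — an ordinary byte key stands in.

```python
def rc4_keystream(key: bytes, n: int) -> bytes:
    """Standard RC4: KSA then PRGA, yielding n keystream bytes."""
    S = list(range(256))
    j = 0
    for i in range(256):                      # key-scheduling algorithm
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    out = []
    for _ in range(n):                        # pseudo-random generation
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def rc4_xor(data: bytes, key: bytes) -> bytes:
    """Encrypt/decrypt by XORing data with the RC4 keystream."""
    return bytes(d ^ k for d, k in zip(data, rc4_keystream(key, len(data))))
```

In the image scheme, the chaotically derived key seeds this generator and the keystream is XORed across the pixel bytes to diffuse them.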

  4. Cipher image damage and decisions in real time

    NASA Astrophysics Data System (ADS)

    Silva-García, Victor Manuel; Flores-Carapia, Rolando; Rentería-Márquez, Carlos; Luna-Benoso, Benjamín; Jiménez-Vázquez, Cesar Antonio; González-Ramírez, Marlon David

    2015-01-01

    This paper proposes a method for constructing permutations on m-position arrangements. Our objective is to encrypt color images using the Advanced Encryption Standard (AES) with variable permutations, meaning a different permutation for each 128-bit block, applied in the first round after the XOR operation. Furthermore, this research offers the possibility of recovering the original image when the encrypted figure has suffered damage, whether from an attack or not. This is achieved by permuting the original image pixel positions before they are encrypted with AES variable permutations, which requires building pseudorandom permutations of arrays of 250,000 positions or more. To this end, an algorithm that defines a bijective function between the set of nonnegative integers and the set of permutations is built. From this algorithm, the way to build permutations on the array 0,1,…,m-1, knowing m-1 constants, is presented. Transcendental numbers are used to select these m-1 constants in a pseudorandom way. The quality of the proposed encryption is evaluated according to the following criteria: the correlation coefficient, the entropy, and the discrete Fourier transform. A goodness-of-fit test for each basic color image is proposed to measure the degree of bit randomness of the encrypted figure. On the other hand, cipher images are obtained in a lossless way, i.e., no JPEG file formats are used.
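
    A standard way to realize a bijection between nonnegative integers and permutations, as the abstract describes, is the factorial number system (Lehmer code). The sketch below shows this textbook construction; the paper's own algorithm may differ in detail.

```python
def int_to_permutation(k, m):
    """Map a nonnegative integer k < m! to a unique permutation of
    0..m-1 via the factorial number system (Lehmer code)."""
    # Compute the mixed-radix (factorial) digits of k
    digits = []
    for radix in range(1, m + 1):
        digits.append(k % radix)
        k //= radix
    digits.reverse()  # most-significant digit first
    # Each digit selects (and removes) one remaining element
    items = list(range(m))
    return [items.pop(d) for d in digits]

# k = 0 gives the identity; k = m! - 1 gives the reversal
assert int_to_permutation(0, 4) == [0, 1, 2, 3]
assert int_to_permutation(23, 4) == [3, 2, 1, 0]
```

    Feeding this map integers derived pseudorandomly (e.g., from digits of transcendental numbers, as the paper suggests) yields a different permutation per block.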

  5. Hill Ciphers over Near-Fields

    ERIC Educational Resources Information Center

    Farag, Mark

    2007-01-01

    Hill ciphers are linear codes that take as input a "plaintext" vector p of size n, which is encrypted with an invertible n x n matrix E to produce a "ciphertext" vector c = E · p. Informally, a near-field is a triple ⟨N; +, *⟩ that…
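
    The Hill cipher described above reduces to a single matrix-vector product modulo the alphabet size; a minimal sketch over the integers mod 26, with an example key matrix chosen to be invertible:

```python
def hill_encrypt(block, E, m=26):
    """Encrypt a plaintext vector p as c = E · p (mod m)."""
    return [sum(E[i][j] * block[j] for j in range(len(block))) % m
            for i in range(len(E))]

E = [[3, 3], [2, 5]]          # invertible mod 26 (det = 9, gcd(9, 26) = 1)
p = [7, 8]                    # "hi" -> [7, 8]
c = hill_encrypt(p, E)
D = [[15, 17], [20, 9]]       # E^{-1} mod 26; decryption is the same op
assert hill_encrypt(c, D) == p
```

    Decryption works only because E is invertible mod 26, i.e., gcd(det E, 26) = 1, which is exactly the condition the cipher's key generation must enforce.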

  6. Chaos and Cryptography: A new dimension in secure communications

    NASA Astrophysics Data System (ADS)

    Banerjee, Santo; Kurths, J.

    2014-06-01

    This issue is a collection of contributions on recent developments and achievements in cryptography and communications using chaos. The various contributions report important and promising results on topics such as synchronization of networks and data transmissions, image ciphers, optical and TDMA communications, and quantum keys. Various experiments and applications, such as FPGA implementations, smartphone ciphers, and semiconductor lasers, are also included.

  7. Test and Verification of AES Used for Image Encryption

    NASA Astrophysics Data System (ADS)

    Zhang, Yong

    2018-03-01

    In this paper, an image encryption program based on AES in cipher block chaining mode was designed in the C language. The encryption/decryption speed and security performance of the AES-based image cryptosystem were tested and used to compare the proposed cryptosystem with some existing image cryptosystems based on chaos. Simulation results show that AES can be applied to image encryption, which refutes the widely accepted view that AES is not suitable for image encryption. This paper also suggests taking the speed of AES-based image encryption as the speed benchmark for image encryption algorithms; image encryption algorithms slower than this benchmark should be discarded in practical communications.
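
    The cipher block chaining structure the paper relies on can be sketched independently of AES itself. Below, a toy XOR "block cipher" stands in for AES purely to make the chaining visible; it is not secure and is not the paper's C implementation.

```python
def toy_block_encrypt(block, key):
    # Stand-in for AES's block transform: XOR with the key (NOT
    # secure, used only to expose the chaining structure)
    return bytes(b ^ k for b, k in zip(block, key))

def cbc_encrypt(blocks, key, iv):
    """CBC mode: each plaintext block is XORed with the previous
    ciphertext block (the IV for the first) before block encryption."""
    prev, out = iv, []
    for block in blocks:
        mixed = bytes(b ^ p for b, p in zip(block, prev))
        prev = toy_block_encrypt(mixed, key)
        out.append(prev)
    return out

key, iv = bytes(16), bytes(range(16))
blocks = [b"0123456789abcdef", b"0123456789abcdef"]
c = cbc_encrypt(blocks, key, iv)
# Identical plaintext blocks encrypt differently because of chaining,
# which is what hides the pixel patterns of an image
assert c[0] != c[1]
```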

  8. PDF file encryption on mobile phone using super-encryption of Variably Modified Permutation Composition (VMPC) and two square cipher algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Budiman, M. A.; Atika, F.

    2018-03-01

    Data security is becoming one of the most significant challenges in the digital world. Retrieval of data by unauthorized parties harms the owner of the data, and PDF files are equally susceptible to such breaches. To solve this security problem, a method is needed to maintain the protection of the data, such as cryptography. In cryptography, several algorithms can encode data; one of them is the Two Square Cipher, a symmetric algorithm. In this research, the Two Square Cipher has been extended to a 16 x 16 key square so that it can accept a wider range of plaintexts. For further security, it is combined with the VMPC algorithm, also a symmetric algorithm; the combination of the two algorithms is called super-encryption. The data can then be stored on a mobile phone, allowing users to secure data flexibly and access it anywhere. The PDF document security application in this research is built on the Android platform. The study also measures algorithmic complexity and processing time. Based on the test results, the complexity is θ(n) for the Two Square Cipher and θ(n) for the VMPC algorithm, so the complexity of the super-encryption is also θ(n). The VMPC algorithm processes faster than the Two Square Cipher, and the processing time is directly proportional to the length of the plaintext and passwords.
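
    The classical Two Square cipher that the paper extends to a 16 x 16 key square works on digraphs across two Polybius squares. Below is a 5 x 5 sketch of the standard vertical variant (with I/J merged), not the paper's 16 x 16 modification; the keywords are illustrative.

```python
def make_square(key):
    """Build a 5x5 Polybius square from a keyword (I/J merged)."""
    seen = []
    for ch in (key.upper() + "ABCDEFGHIKLMNOPQRSTUVWXYZ"):
        if ch not in seen and ch != 'J':
            seen.append(ch)
    return [seen[i * 5:(i + 1) * 5] for i in range(5)]

def find(square, ch):
    for r, row in enumerate(square):
        if ch in row:
            return r, row.index(ch)

def two_square_encrypt(text, key1, key2):
    """Vertical Two Square: for each digraph (a, b), locate a in the
    top square and b in the bottom; the ciphertext letters sit at the
    opposite corners of the rectangle they span."""
    top, bottom = make_square(key1), make_square(key2)
    text = text.upper().replace('J', 'I')
    if len(text) % 2:
        text += 'X'              # illustrative padding
    out = []
    for a, b in zip(text[::2], text[1::2]):
        r1, c1 = find(top, a)
        r2, c2 = find(bottom, b)
        if c1 == c2:             # same column: digraph is transparent
            out += [a, b]
        else:
            out += [top[r1][c2], bottom[r2][c1]]
    return ''.join(out)

print(two_square_encrypt("HELLO", "EXAMPLE", "KEYWORD"))
```

    A notable property of this variant is that encryption is an involution: applying the same function to the ciphertext with the same keys returns the (padded) plaintext.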

  9. Nonquadratic Variation of the Blum Blum Shub Pseudorandom Number Generator

    DTIC Science & Technology

    2016-09-01

    Cryptography is essential for secure online communications. Many different types of ciphers are implemented in modern-day cryptography, but they all have one common factor: all ciphers require a source of randomness, which makes them unpredictable. One such source of this…

  10. HEaDS-UP Phase IV Assessment: Headgear Effects on Auditory Perception

    DTIC Science & Technology

    2015-02-01

    Fig. 6: Average attenuation measured for the CIPHER and INTERCPT helmets as a function of noise level, mandible/eyewear… impulsive noise consistent with the US Occupational Safety and Health Administration (OSHA 1981), the National Institute for Occupational Safety and… eyewear, or HPDs) (Fig. 5) show that the CIPHER and INTERCPT compared favorably with the currently fielded advanced combat helmet (ACH). Figure 6

  11. Progress in Y-00 physical cipher for Giga bit/sec optical data communications (intensity modulation method)

    NASA Astrophysics Data System (ADS)

    Hirota, Osamu; Futami, Fumio

    2014-10-01

    Guaranteeing the security of cloud computing systems is an urgent problem. Although a security program faces several threats, the most serious is cyber attack against optical fiber transmission among data centers. In such a network, an encryption scheme on Layer 1 (the physical layer) with ultimately strong security, small delay, and very high speed should be employed, because a basic optical link operates at 10 Gbit/s per wavelength. Over the past decade we have developed a quantum-noise randomized stream cipher, the so-called Yuen-2000 encryption scheme (Y-00). This is a completely new type of random cipher in which the ciphertexts seen by a legitimate receiver and by an eavesdropper are different; this is a condition for breaking the Shannon limit in the theory of cryptography. In addition, the scheme has a good balance of security, speed, and cost performance. To realize such an encryption, several modulation methods are candidates, such as phase modulation, intensity modulation, quadrature amplitude modulation, and so on. A Northwestern University group demonstrated a phase modulation system (α=η) in 2003. In 2005, we reported a demonstration of a 1 Gbit/s system based on an intensity modulation scheme (ISK-Y00), and we gave design methods for quadrature amplitude modulation (QAM-Y00) in 2005 and 2010. The intensity modulation scheme promises real application to secure fiber communication in current data centers. This paper presents progress in the quantum-noise randomized stream cipher based on ISK-Y00, integrating our past theoretical and experimental achievements and a recent 100 Gbit/s (10 Gbit/s × 10 wavelengths) experiment.

  12. Field performance of timber bridges. 17, Ciphers stress-laminated deck bridge

    Treesearch

    James P. Wacker; James A. Kainz; Michael A. Ritter

    In September 1989, the Ciphers bridge was constructed within the Beltrami Island State Forest in Roseau County, Minnesota. The bridge superstructure is a two-span continuous stress-laminated deck that is approximately 12.19 m long, 5.49 m wide, and 305 mm deep (40 ft long, 18 ft wide, and 12 in. deep). The bridge is one of the first to utilize red pine sawn lumber for...

  13. Running key mapping in a quantum stream cipher by the Yuen 2000 protocol

    NASA Astrophysics Data System (ADS)

    Shimizu, Tetsuya; Hirota, Osamu; Nagasako, Yuki

    2008-03-01

    A quantum stream cipher by the Yuen 2000 protocol (the so-called Y00 protocol or αη scheme), built around a linear feedback shift register with a short key, is very attractive for implementing secure 40 Gbit/s optical data transmission, which is expected in next-generation networks. However, a basic model of the Y00 protocol with a very short key needs a careful design against fast correlation attacks, as pointed out by Donnet. This Brief Report clarifies the effectiveness of irregular mapping between the running key and physical signals in the driver for selection of the M-ary basis in the transmitter, and gives a design method. Consequently, the quantum stream cipher by the Y00 protocol with our mapping has immunity against the proposed fast correlation attacks on a basic model of the Y00 protocol even if the key is very short.

  14. Studies and simulations of the DigiCipher system

    NASA Technical Reports Server (NTRS)

    Sayood, K.; Chen, Y. C.; Kipp, G.

    1993-01-01

    During this period the development of simulators for the various high definition television (HDTV) systems proposed to the FCC was continued. The FCC has indicated that it wants the various proposers to collaborate on a single system. Based on all available information this system will look very much like the advanced digital television (ADTV) system with major contributions only from the DigiCipher system. The results of our simulations of the DigiCipher system are described. This simulator was tested using test sequences from the MPEG committee. The results are extrapolated to HDTV video sequences. Once again, some caveats are in order. The sequences used for testing the simulator and generating the results are those used for testing the MPEG algorithm. The sequences are of much lower resolution than the HDTV sequences would be, and therefore the extrapolations are not totally accurate. One would expect to get significantly higher compression in terms of bits per pixel with sequences that are of higher resolution. However, the simulator itself is a valid one, and should HDTV sequences become available, they could be used directly with the simulator. A brief overview of the DigiCipher system is given. Some coding results obtained using the simulator are looked at. These results are compared to those obtained using the ADTV system. These results are evaluated in the context of the CCSDS specifications and make some suggestions as to how the DigiCipher system could be implemented in the NASA network. Simulations such as the ones reported can be biased depending on the particular source sequence used. In order to get more complete information about the system one needs to obtain a reasonable set of models which mirror the various kinds of sources encountered during video coding. A set of models which can be used to effectively model the various possible scenarios is provided. 
As this is somewhat tangential to the other work reported, the results are included as an appendix.

  15. Mobile Assisted Security in Wireless Sensor Networks

    DTIC Science & Technology

    2015-08-03

    server from Google’s DNS; Chromecast and the content server do the 3-way TCP handshake, which is followed by Client Hello and Server Hello TLS messages… utilized TLS v1.2, except the NTP servers and Google’s DNS server. In TLS v1.2, the client and server begin the handshake by sending Client Hello and Server Hello messages in order. In the Client Hello message, the client offers a list of Cipher Suites that it supports. Each Cipher Suite defines the key exchange algorithm

  16. Efficient Hardware Implementation of the Lightweight Block Encryption Algorithm LEA

    PubMed Central

    Lee, Donggeon; Kim, Dong-Chan; Kwon, Daesung; Kim, Howon

    2014-01-01

    Recently, due to the advent of resource-constrained trends, such as smartphones and smart devices, the computing environment is changing. Because our daily life is deeply intertwined with ubiquitous networks, the importance of security is growing. A lightweight encryption algorithm is essential for secure communication between these kinds of resource-constrained devices, and many researchers have been investigating this field. Recently, a lightweight block cipher called LEA was proposed. LEA was originally targeted for efficient implementation on microprocessors, as it is fast when implemented in software and, furthermore, has a small memory footprint. To reflect recent technology, all required calculations utilize 32-bit wide operations. In addition, the algorithm is composed not of complex S-Box-like structures but of simple Addition, Rotation, and XOR operations. To the best of our knowledge, this paper is the first report on a comprehensive hardware implementation of LEA. We present various hardware structures and their implementation results according to key sizes. Even though LEA was originally targeted at software efficiency, it also shows high efficiency when implemented as hardware. PMID:24406859
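
    LEA's ARX structure can be sketched as below: the round function uses only 32-bit addition, rotation, and XOR, which is what makes both software and hardware implementations cheap. The rotation amounts (9, 5, 3) follow our reading of the published LEA specification, and the round-key handling here is simplified for illustration; treat this as a sketch, not a reference implementation.

```python
MASK = 0xFFFFFFFF  # all arithmetic is on 32-bit words

def rol(x, n):
    return ((x << n) | (x >> (32 - n))) & MASK

def ror(x, n):
    return rol(x, 32 - n)

def lea_round(x, rk):
    """One LEA-style round on four 32-bit words with six round-key
    words: XOR in the keys, add pairs of words, rotate, and shift the
    word positions."""
    x0, x1, x2, x3 = x
    return [
        rol(((x0 ^ rk[0]) + (x1 ^ rk[1])) & MASK, 9),
        ror(((x1 ^ rk[2]) + (x2 ^ rk[3])) & MASK, 5),
        ror(((x2 ^ rk[4]) + (x3 ^ rk[5])) & MASK, 3),
        x0,
    ]

state = lea_round([1, 2, 3, 4], [0] * 6)
```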

  17. Known plaintext attack on double random phase encoding using fingerprint as key and a method for avoiding the attack.

    PubMed

    Tashima, Hideaki; Takeda, Masafumi; Suzuki, Hiroyuki; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2010-06-21

    We have shown that the application of double random phase encoding (DRPE) to biometrics enables the use of biometrics as cipher keys for binary data encryption. However, DRPE is reported to be vulnerable to known-plaintext attacks (KPAs) using a phase recovery algorithm. In this study, we investigated the vulnerability of DRPE using fingerprints as cipher keys to the KPAs. By means of computational experiments, we estimated the encryption key and restored the fingerprint image using the estimated key. Further, we propose a method for avoiding the KPA on the DRPE that employs the phase retrieval algorithm. The proposed method makes the amplitude component of the encrypted image constant in order to prevent the amplitude component of the encrypted image from being used as a clue for phase retrieval. Computational experiments showed that the proposed method not only avoids revealing the cipher key and the fingerprint but also serves as a sufficiently accurate verification system.

  18. A Security Analysis of the 802.11s Wireless Mesh Network Routing Protocol and Its Secure Routing Protocols

    PubMed Central

    Tan, Whye Kit; Lee, Sang-Gon; Lam, Jun Huy; Yoo, Seong-Moo

    2013-01-01

    Wireless mesh networks (WMNs) can act as a scalable backbone by connecting separate sensor networks and even by connecting WMNs to a wired network. The Hybrid Wireless Mesh Protocol (HWMP) is the default routing protocol for the 802.11s WMN. The routing protocol is one of the most important parts of the network, and it requires protection, especially in the wireless environment. The existing security protocols, such as the Broadcast Integrity Protocol (BIP), Counter with cipher block chaining message authentication code protocol (CCMP), Secure Hybrid Wireless Mesh Protocol (SHWMP), Identity Based Cryptography HWMP (IBC-HWMP), Elliptic Curve Digital Signature Algorithm HWMP (ECDSA-HWMP), and Watchdog-HWMP aim to protect the HWMP frames. In this paper, we have analyzed the vulnerabilities of the HWMP and developed security requirements to protect these identified vulnerabilities. We applied the security requirements to analyze the existing secure schemes for HWMP. The results of our analysis indicate that none of these protocols is able to satisfy all of the security requirements. We also present a quantitative complexity comparison among the protocols and an example of a security scheme for HWMP to demonstrate how the result of our research can be utilized. Our research results thus provide a tool for designing secure schemes for the HWMP. PMID:24002231

  19. A security analysis of the 802.11s wireless mesh network routing protocol and its secure routing protocols.

    PubMed

    Tan, Whye Kit; Lee, Sang-Gon; Lam, Jun Huy; Yoo, Seong-Moo

    2013-09-02

    Wireless mesh networks (WMNs) can act as a scalable backbone by connecting separate sensor networks and even by connecting WMNs to a wired network. The Hybrid Wireless Mesh Protocol (HWMP) is the default routing protocol for the 802.11s WMN. The routing protocol is one of the most important parts of the network, and it requires protection, especially in the wireless environment. The existing security protocols, such as the Broadcast Integrity Protocol (BIP), Counter with cipher block chaining message authentication code protocol (CCMP), Secure Hybrid Wireless Mesh Protocol (SHWMP), Identity Based Cryptography HWMP (IBC-HWMP), Elliptic Curve Digital Signature Algorithm HWMP (ECDSA-HWMP), and Watchdog-HWMP aim to protect the HWMP frames. In this paper, we have analyzed the vulnerabilities of the HWMP and developed security requirements to protect these identified vulnerabilities. We applied the security requirements to analyze the existing secure schemes for HWMP. The results of our analysis indicate that none of these protocols is able to satisfy all of the security requirements. We also present a quantitative complexity comparison among the protocols and an example of a security scheme for HWMP to demonstrate how the result of our research can be utilized. Our research results thus provide a tool for designing secure schemes for the HWMP.

  20. Power Consumption and Calculation Requirement Analysis of AES for WSN IoT.

    PubMed

    Hung, Chung-Wen; Hsu, Wen-Ting

    2018-05-23

    Because of the ubiquity of Internet of Things (IoT) devices, the power consumption and security of IoT systems have become very important issues. The Advanced Encryption Standard (AES) is a block cipher algorithm commonly used in IoT devices. In this paper, the power consumption and cryptographic calculation requirement for different payload lengths and AES encryption types are analyzed. These types include software-based AES-CB, hardware-based AES-ECB (Electronic Codebook Mode), and hardware-based AES-CCM (Counter with CBC-MAC Mode). The calculation requirement and power consumption for these AES encryption types are measured on the Texas Instruments LAUNCHXL-CC1310 platform. The experimental results show that hardware-based AES performs better than software-based AES in terms of power consumption and calculation cycle requirements. In addition, in terms of AES mode selection, the AES-CCM-MIC64 mode may be a better choice if the IoT device must consider security, encryption calculation requirements, and low power consumption at the same time. However, if the IoT device is pursuing lower power and the payload length is generally less than 16 bytes, then AES-ECB could be considered.

  1. Three-pass protocol scheme for bitmap image security by using vernam cipher algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Budiman, M. A.; Aulya, L.

    2018-02-01

    Confidentiality, integrity, and efficiency are crucial aspects of data security. Among digital data, image data is especially prone to abuses such as duplication and modification. One of the available data security techniques is cryptography. The security of the Vernam Cipher cryptographic algorithm depends heavily on the key exchange process: if the key is leaked, the security of this algorithm collapses. Therefore, a method that minimizes key leakage during the exchange of messages is required. The method used here is known as the Three-Pass Protocol. This protocol enables message delivery without any key exchange, so messages can reach the receiver safely without fear of key leakage. The system is built using the Java programming language. The materials used for system testing are images of size 200×200, 300×300, 500×500, 800×800 and 1000×1000 pixels. The experiments showed that the Vernam Cipher algorithm in the Three-Pass Protocol scheme could restore the original image.
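
    The Three-Pass Protocol mechanics can be sketched with a Vernam (XOR) cipher: because XOR encryption commutes, each party can strip its own key independently, so no key ever travels between them. The message and key sizes below are illustrative.

```python
import secrets

def xor(data, key):
    return bytes(d ^ k for d, k in zip(data, key))

# Three passes, no key exchange: commutativity of XOR means the
# order in which the two keys are removed does not matter.
msg = b"secret image bytes"
ka = secrets.token_bytes(len(msg))   # Alice's private key
kb = secrets.token_bytes(len(msg))   # Bob's private key

pass1 = xor(msg, ka)          # Alice -> Bob:   E_ka(m)
pass2 = xor(pass1, kb)        # Bob -> Alice:   E_kb(E_ka(m))
pass3 = xor(pass2, ka)        # Alice -> Bob:   ka removed, E_kb(m) left
assert xor(pass3, kb) == msg  # Bob removes kb and recovers m
```

    Note that with plain XOR, an eavesdropper who records all three passes can XOR them together to recover the message (pass1 ⊕ pass2 ⊕ pass3 = m), so this bare instantiation illustrates the protocol flow only, not a secure system.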

  2. A novel image encryption algorithm based on the chaotic system and DNA computing

    NASA Astrophysics Data System (ADS)

    Chai, Xiuli; Gan, Zhihua; Lu, Yang; Chen, Yiran; Han, Daojun

    A novel image encryption algorithm using the chaotic system and deoxyribonucleic acid (DNA) computing is presented. Different from traditional encryption methods, the permutation and diffusion of our method operate on a 3D DNA matrix. Firstly, a 3D DNA matrix is obtained through bit plane splitting, bit plane recombination, and DNA encoding of the plain image. Secondly, 3D DNA level permutation based on position sequence group (3DDNALPBPSG) is introduced, and chaotic sequences generated by the chaotic system are employed to permute the positions of the elements of the 3D DNA matrix. Thirdly, 3D DNA level diffusion (3DDNALD) is given: the confused 3D DNA matrix is split into sub-blocks, and a block-wise XOR operation is applied to the sub-DNA matrix and the key DNA matrix from the chaotic system. At last, by decoding the diffused DNA matrix, we get the cipher image. The SHA-256 hash of the plain image is employed to calculate the initial values of the chaotic system to resist chosen-plaintext attack. Experimental results and security analyses show that our scheme is secure against several known attacks, and it can effectively protect the security of the images.

  3. Architecture of security management unit for safe hosting of multiple agents

    NASA Astrophysics Data System (ADS)

    Gilmont, Tanguy; Legat, Jean-Didier; Quisquater, Jean-Jacques

    1999-04-01

    In such growing areas as remote applications in large public networks, electronic commerce, digital signatures, intellectual property and copyright protection, and even operating system extensibility, the hardware security level offered by existing processors is insufficient. They lack protection mechanisms that prevent the user from tampering with critical data owned by those applications. Some devices are exceptions, but they have neither enough processing power nor enough memory to stand up to such applications (e.g., smart cards). This paper proposes an architecture for a secure processor in which the classical memory management unit is extended into a new security management unit. It allows ciphered code execution and ciphered data processing. An internal permanent memory can store cipher keys and critical data for several client agents simultaneously. The ordinary supervisor privilege scheme is replaced by a privilege inheritance mechanism that is better suited to operating system extensibility. The result is a secure processor that has hardware support for extensible multitask operating systems and can be used for both general applications and critical applications needing strong protection. The security management unit and the internal permanent memory can be added to an existing CPU core without loss of performance, and they do not require the core to be modified.

  4. A Novel Bit-level Image Encryption Method Based on Chaotic Map and Dynamic Grouping

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Ji; Shen, Yan

    2012-10-01

    In this paper, a novel bit-level image encryption method based on dynamic grouping is proposed. In the proposed method, the plain-image is divided into several groups randomly, then permutation-diffusion process on bit level is carried out. The keystream generated by logistic map is related to the plain-image, which confuses the relationship between the plain-image and the cipher-image. The computer simulation results of statistical analysis, information entropy analysis and sensitivity analysis show that the proposed encryption method is secure and reliable enough to be used for communication application.
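
    A bit-level, plaintext-dependent keystream of the kind described can be sketched with the logistic map. Mixing the plain image into the initial state ties the keystream to the plaintext, confusing the plain-image/cipher-image relationship; the mixing rule and parameters below are illustrative assumptions, not the paper's exact scheme.

```python
def keystream_bits(image_bytes, x0=0.41, r=3.99):
    """Bit-level keystream from the logistic map x -> r*x*(1-x); the
    initial state is perturbed by the plain image so the keystream is
    plaintext-dependent (parameters are illustrative)."""
    x = (x0 + sum(image_bytes) / (256 * len(image_bytes))) % 1.0
    bits = []
    for _ in range(len(image_bytes) * 8):
        x = r * x * (1 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

def xor_bits(data, bits):
    """XOR each byte with its 8 keystream bits packed into a byte."""
    out = bytearray()
    for i, byte in enumerate(data):
        k = 0
        for b in bits[i * 8:(i + 1) * 8]:
            k = (k << 1) | b
        out.append(byte ^ k)
    return bytes(out)

img = bytes(range(32))           # stand-in "plain image"
ks = keystream_bits(img)
enc = xor_bits(img, ks)
assert xor_bits(enc, ks) == img  # XOR diffusion is invertible
```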

  5. A Lightweight Encryption Scheme Combined with Trust Management for Privacy-Preserving in Body Sensor Networks.

    PubMed

    Guo, Ping; Wang, Jin; Ji, Sai; Geng, Xue Hua; Xiong, Neal N

    2015-12-01

    With the pervasiveness of smart phones and the advance of wireless body sensor networks (BSN), mobile Healthcare (m-Healthcare), which extends the operation of healthcare providers into a pervasive environment for better health monitoring, has attracted considerable interest recently. However, the flourishing of m-Healthcare still faces many challenges, including information security and privacy preservation. In this paper, we propose a secure and privacy-preserving framework combined with multilevel trust management. In our scheme, smart phone resources, including computing power and energy, can be opportunistically gathered to process the computing-intensive PHI (personal health information) during an m-Healthcare emergency with minimal privacy disclosure. Specifically, to balance PHI privacy disclosure against the high reliability of PHI processing and transmission in an m-Healthcare emergency, we introduce an efficient lightweight encryption for users whose trust level is low, based on mixed cipher algorithms and plaintext/ciphertext pairs, and allow a medical user to decide who can participate in the opportunistic computing to assist in processing his overwhelming PHI data. Detailed security analysis and simulations show that the proposed framework can efficiently achieve user-centric privacy protection in the m-Healthcare system.

  6. Physical-layer security analysis of a quantum-noise randomized cipher based on the wire-tap channel model.

    PubMed

    Jiao, Haisong; Pu, Tao; Zheng, Jilin; Xiang, Peng; Fang, Tao

    2017-05-15

    The physical-layer security of a quantum-noise randomized cipher (QNRC) system is, for the first time, quantitatively evaluated with secrecy capacity employed as the performance metric. Considering quantum noise as a channel advantage for legitimate parties over eavesdroppers, the specific wire-tap models for both channels of the key and data are built with channel outputs yielded by quantum heterodyne measurement; the general expressions of secrecy capacities for both channels are derived, where the matching codes are proved to be uniformly distributed. The maximal achievable secrecy rate of the system is proposed, under which secrecy of both the key and data is guaranteed. The influences of various system parameters on secrecy capacities are assessed in detail. The results indicate that QNRC combined with proper channel codes is a promising framework of secure communication for long distance with high speed, which can be orders of magnitude higher than the perfect secrecy rates of other encryption systems. Even if the eavesdropper intercepts more signal power than the legitimate receiver, secure communication (up to Gb/s) can still be achievable. Moreover, the secrecy of running key is found to be the main constraint to the systemic maximal secrecy rate.

  7. Physical-layer security analysis of PSK quantum-noise randomized cipher in optically amplified links

    NASA Astrophysics Data System (ADS)

    Jiao, Haisong; Pu, Tao; Xiang, Peng; Zheng, Jilin; Fang, Tao; Zhu, Huatao

    2017-08-01

    The quantitative security of quantum-noise randomized cipher (QNRC) in optically amplified links is analyzed from the perspective of physical-layer advantage. Establishing the wire-tap channel models for both key and data, we derive the general expressions of secrecy capacities for the key against ciphertext-only attack and known-plaintext attack, and that for the data, which serve as the basic performance metrics. Further, the maximal achievable secrecy rate of the system is proposed, under which secrecy of both the key and data is guaranteed. Based on the same framework, the secrecy capacities of various cases can be assessed and compared. The results indicate perfect secrecy is potentially achievable for data transmission, and an elementary principle of setting proper number of photons and bases is given to ensure the maximal data secrecy capacity. But the key security is asymptotically perfect, which tends to be the main constraint of systemic maximal secrecy rate. Moreover, by adopting cascaded optical amplification, QNRC can realize long-haul transmission with secure rate up to Gb/s, which is orders of magnitude higher than the perfect secrecy rates of other encryption systems.

  8. Strong Password-Based Authentication in TLS Using the Three-PartyGroup Diffie-Hellman Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdalla, Michel; Bresson, Emmanuel; Chevassut, Olivier

    2006-08-26

    The Internet has evolved into a very hostile ecosystem where "phishing" attacks are common practice. This paper shows that the three-party group Diffie-Hellman key exchange can help protect against these attacks. We have developed a suite of password-based cipher suites for the Transport Layer Security (TLS) protocol that are not only provably secure but also assumed to be free from patent and licensing restrictions based on an analysis of relevant patents in the area.

  9. Encryption and decryption using FPGA

    NASA Astrophysics Data System (ADS)

    Nayak, Nikhilesh; Chandak, Akshay; Shah, Nisarg; Karthikeyan, B.

    2017-11-01

    In this paper, we perform multiple cryptographic methods on a set of data and compare their outputs. The AES algorithm and the RSA algorithm are used. Using the AES algorithm, an 8-bit input (plain text) is encrypted using a cipher key and the result is displayed on Tera Term (serially). For simulation, a 128-bit input is used and operated on with a 128-bit cipher key to generate the encrypted text; the reverse operations are then performed to get the decrypted text. In the RSA algorithm, file handling is used to input the plain text. This text is then operated on to get the encrypted and decrypted data, which are stored in a file. Finally, the results of the two algorithms are compared.

  10. File text security using Hybrid Cryptosystem with Playfair Cipher Algorithm and Knapsack Naccache-Stern Algorithm

    NASA Astrophysics Data System (ADS)

    Amalia; Budiman, M. A.; Sitepu, R.

    2018-03-01

    Cryptography is one of the best methods to keep information safe from security attacks by unauthorized people. Many studies have been done by previous researchers to create more robust cryptographic algorithms that provide high security for data communication. One way to strengthen data security is the hybrid cryptosystem method, which combines a symmetric and an asymmetric algorithm. In this study, we examined a hybrid cryptosystem combining a modified Playfair Cipher 16x16 as the symmetric algorithm and Knapsack Naccache-Stern as the asymmetric algorithm. We measured the running time of this hybrid algorithm in several experiments, testing message lengths of 10, 100, 1000, 10000 and 100000 characters and key lengths of 10, 20, 30 and 40. The results show that the processing time for encryption and decryption in each algorithm is linearly proportional to the message length: the longer the message, the more time is needed to encrypt and decrypt it. The encryption of the Knapsack Naccache-Stern algorithm takes longer than its decryption, while the encryption of the modified Playfair Cipher 16x16 takes less time than its decryption.
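
    The hybrid pattern described (a symmetric cipher for the message, an asymmetric cipher for only the short key) can be sketched with stand-ins: a repeating-key XOR replaces the modified Playfair and a textbook-RSA toy replaces Knapsack Naccache-Stern, purely to show the structure. All parameters are illustrative.

```python
import secrets

def sym_encrypt(data, key):
    # Stand-in symmetric cipher (repeating-key XOR), NOT the paper's
    # modified 16x16 Playfair; encryption and decryption coincide
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

def asym_encrypt(key_bytes, e, n):
    # Toy RSA-style key encapsulation, NOT Knapsack Naccache-Stern
    return [pow(b, e, n) for b in key_bytes]

def asym_decrypt(enc_key, d, n):
    return bytes(pow(c, d, n) for c in enc_key)

n, e, d = 3233, 17, 2753             # textbook RSA toy: n = 61 * 53
session_key = secrets.token_bytes(8)

# Hybrid flow: fast symmetric cipher for the bulk message, asymmetric
# cipher only for the short session key
ciphertext = sym_encrypt(b"file text to protect", session_key)
wrapped = asym_encrypt(session_key, e, n)

recovered = asym_decrypt(wrapped, d, n)
assert sym_encrypt(ciphertext, recovered) == b"file text to protect"
```

    The design choice the paper exploits is visible here: the linear-time symmetric pass handles arbitrarily long text, while the expensive asymmetric operations touch only a fixed-size key.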

  11. Complex Conjugated certificateless-based signcryption with differential integrated factor for secured message communication in mobile network

    PubMed Central

    Rajagopalan, S. P.

    2017-01-01

    Certificateless-based signcryption overcomes inherent shortcomings of traditional Public Key Infrastructure (PKI) and the key escrow problem. It provides efficient methods for designing PKIs with public verifiability and cipher text authenticity with minimal dependency. As a classic primitive in public key cryptography, signcryption verifies the validity of cipher text without decryption, combining authentication, confidentiality, public verifiability, and cipher text authenticity much more efficiently than the traditional approach. In this paper, we first define a security model for certificateless-based signcryption, called the Complex Conjugate Differential Integrated Factor (CC-DIF) scheme, which introduces complex conjugates through the security parameter and improves the secured message distribution rate. However, both the partial private key and the secret value change with respect to time. To overcome this weakness, a new certificateless-based signcryption scheme is proposed that sets the private key through a differential equation using an integration factor (DiffEIF), minimizing computational cost and communication overhead. The scheme is shown to be secure (i.e., to improve the secured message distribution rate) against certificateless access control and signcryption-based attacks. In addition, compared with three existing schemes, the CC-DIF scheme has the least computational cost and communication overhead for secured message communication in mobile networks. PMID:29040290

  12. Complex Conjugated certificateless-based signcryption with differential integrated factor for secured message communication in mobile network.

    PubMed

    Alagarsamy, Sumithra; Rajagopalan, S P

    2017-01-01

    Certificateless-based signcryption overcomes inherent shortcomings of traditional Public Key Infrastructure (PKI) and the key escrow problem. It provides efficient methods for designing PKIs with public verifiability and cipher text authenticity with minimal dependency. As a classic primitive in public key cryptography, signcryption verifies the validity of cipher text without decryption, combining authentication, confidentiality, public verifiability, and cipher text authenticity much more efficiently than the traditional approach. In this paper, we first define a security model for certificateless-based signcryption, called the Complex Conjugate Differential Integrated Factor (CC-DIF) scheme, which introduces complex conjugates through the security parameter and improves the secured message distribution rate. However, both the partial private key and the secret value change with respect to time. To overcome this weakness, a new certificateless-based signcryption scheme is proposed that sets the private key through a differential equation using an integration factor (DiffEIF), minimizing computational cost and communication overhead. The scheme is shown to be secure (i.e., to improve the secured message distribution rate) against certificateless access control and signcryption-based attacks. In addition, compared with three existing schemes, the CC-DIF scheme has the least computational cost and communication overhead for secured message communication in mobile networks.

  13. Public Key Cryptography.

    ERIC Educational Resources Information Center

    Tapson, Frank

    1996-01-01

    Describes public key cryptography, exemplified by the RSA system, which uses two keys: one to put a message into cipher and another to decipher it. Presents examples using small prime numbers. (MKR)
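
    The article's small-prime approach can be reproduced in a few lines; the primes, exponent, and message below are illustrative choices, not taken from the article.

```python
# A worked RSA example with small primes, in the classroom spirit of the
# article (numbers here are illustrative).
p, q = 11, 13
n = p * q                      # modulus: 143
phi = (p - 1) * (q - 1)        # 120
e = 7                          # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: 103, since 7*103 = 721 = 6*120 + 1
message = 42                   # message must be smaller than n
cipher = pow(message, e, n)    # enciphering: m^e mod n  -> 81
plain = pow(cipher, d, n)      # deciphering: c^d mod n
assert plain == message
```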

  14. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    NASA Astrophysics Data System (ADS)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats today. One effort to reduce cybercrime is to find new techniques for securing data, such as combinations of cryptography, steganography, and watermarking. Cryptography and steganography form a growing field of data security science, and their combination is one way to improve data integrity. New techniques are created by combining several algorithms; one such combination incorporates the Hill cipher method and Morse code. Morse code, one of the communication codes used in Scouting, consists of dots and dashes, making this a blend of modern and classic concepts for maintaining data integrity. The combination of these three methods is expected to yield new algorithms that improve data security, especially for images.
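
    A minimal 2x2 Hill cipher over the 26-letter alphabet sketches the matrix step the paper combines with Morse code and LSB embedding; the key matrix below is an example chosen to be invertible mod 26, not the paper's key.

```python
# Minimal 2x2 Hill cipher: encrypt letter pairs as vectors multiplied by a
# key matrix mod 26; decrypt with the modular inverse of that matrix.
K = [[3, 3], [2, 5]]                 # key matrix; det = 9, gcd(9, 26) = 1

def hill_encrypt(text: str) -> str:
    nums = [ord(c) - 65 for c in text.upper()]
    out = []
    for i in range(0, len(nums), 2):
        x, y = nums[i], nums[i + 1]
        out.append((K[0][0] * x + K[0][1] * y) % 26)
        out.append((K[1][0] * x + K[1][1] * y) % 26)
    return "".join(chr(v + 65) for v in out)

def hill_decrypt(text: str) -> str:
    det_inv = pow(K[0][0] * K[1][1] - K[0][1] * K[1][0], -1, 26)
    Kinv = [[(K[1][1] * det_inv) % 26, (-K[0][1] * det_inv) % 26],
            [(-K[1][0] * det_inv) % 26, (K[0][0] * det_inv) % 26]]
    nums = [ord(c) - 65 for c in text]
    out = []
    for i in range(0, len(nums), 2):
        x, y = nums[i], nums[i + 1]
        out.append((Kinv[0][0] * x + Kinv[0][1] * y) % 26)
        out.append((Kinv[1][0] * x + Kinv[1][1] * y) % 26)
    return "".join(chr(v + 65) for v in out)

assert hill_decrypt(hill_encrypt("HELP")) == "HELP"
```

    The input length must be even; real implementations pad the plaintext before blocking it into pairs.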

  15. Dragon Stream Cipher for Secure Blackbox Cockpit Voice Recorder

    NASA Astrophysics Data System (ADS)

    Akmal, Fadira; Michrandi Nasution, Surya; Azmi, Fairuz

    2017-11-01

    An aircraft black box is a device used to record all aircraft information; it consists of the Flight Data Recorder (FDR) and the Cockpit Voice Recorder (CVR). The Cockpit Voice Recorder contains the conversations in the aircraft during the flight. Investigations of aircraft crashes usually take a long time because it is difficult to find the black box, so the black box should be able to send its information elsewhere. It must also have a data security system, since data security is a vital part of any information exchange. The system in this research encrypts and decrypts Cockpit Voice Recorder data for authorized users with the Dragon stream cipher algorithm. The tests performed measure encryption and decryption time and the avalanche effect. The results show encryption and decryption times of 0.85 and 1.84 seconds, respectively, for 30 seconds of Cockpit Voice Recorder data, with an avalanche effect of 48.67%.
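
    The avalanche-effect metric used in the evaluation (flip one input bit and count how many output bits change) can be sketched as below. SHA-256 stands in for the Dragon keystream generator purely to illustrate the measurement; a strong primitive changes close to 50% of the output bits, and the paper reports 48.67%.

```python
# Avalanche-effect measurement: percentage of output bits that change when
# a single input bit is flipped. SHA-256 is a stand-in primitive here.
import hashlib

def avalanche(data: bytes, bit: int) -> float:
    flipped = bytearray(data)
    flipped[bit // 8] ^= 1 << (bit % 8)          # flip a single input bit
    h1 = hashlib.sha256(data).digest()
    h2 = hashlib.sha256(bytes(flipped)).digest()
    diff = sum(bin(a ^ b).count("1") for a, b in zip(h1, h2))
    return 100.0 * diff / (len(h1) * 8)          # percent of output bits changed

pct = avalanche(b"cockpit voice recorder sample", 0)
```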

  16. Searchable attribute-based encryption scheme with attribute revocation in cloud storage.

    PubMed

    Wang, Shangping; Zhao, Duqiao; Zhang, Yaling

    2017-01-01

    Attribute-based encryption (ABE) is a good way to achieve flexible and secure access control to data; attribute revocation is an extension of attribute-based encryption, and keyword search is an indispensable part of cloud storage. The combination of the two has important applications in the cloud. In this paper, we construct a searchable attribute-based encryption scheme with attribute revocation in cloud storage. The keyword search in our scheme is attribute based with access control: when the search succeeds, the cloud server returns the corresponding cipher text to the user, who can then decrypt it. Our scheme also supports multi-keyword search, which makes it more practical. Under the decisional bilinear Diffie-Hellman exponent (q-BDHE) and decisional Diffie-Hellman (DDH) assumptions in the selective security model, we prove that our scheme is secure.

  17. Toward privacy-preserving JPEG image retrieval

    NASA Astrophysics Data System (ADS)

    Cheng, Hang; Wang, Jingyue; Wang, Meiqing; Zhong, Shangping

    2017-07-01

    This paper proposes a privacy-preserving retrieval scheme for JPEG images based on local variance. Three parties are involved in the scheme: the content owner, the server, and the authorized user. The content owner encrypts JPEG images for privacy protection by jointly using permutation cipher and stream cipher, and then, the encrypted versions are uploaded to the server. With an encrypted query image provided by an authorized user, the server may extract blockwise local variances in different directions without knowing the plaintext content. After that, it can calculate the similarity between the encrypted query image and each encrypted database image by a local variance-based feature comparison mechanism. The authorized user with the encryption key can decrypt the returned encrypted images with plaintext content similar to the query image. The experimental results show that the proposed scheme not only provides effective privacy-preserving retrieval service but also ensures both format compliance and file size preservation for encrypted JPEG images.

  18. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  19. Medical Image Encryption: An Application for Improved Padding Based GGH Encryption Algorithm

    PubMed Central

    Sokouti, Massoud; Zakerolhosseini, Ali; Sokouti, Babak

    2016-01-01

    Medical images are regarded as important and sensitive data in medical informatics systems. For transferring medical images over an insecure network, developing a secure encryption algorithm is necessary. Among the three main properties of security services (i.e., confidentiality, integrity, and availability), confidentiality is the most essential feature for exchanging medical images among physicians. The Goldreich Goldwasser Halevi (GGH) algorithm can be a good choice for encrypting medical images, as both the algorithm and the sensitive data are represented by numeric matrices. Additionally, the GGH algorithm does not increase the size of the image, and hence its complexity remains as simple as O(n^2). However, one disadvantage of the GGH algorithm is its vulnerability to the chosen cipher text attack. In our strategy, this shortcoming of the GGH algorithm has been taken into consideration and improved by applying padding (i.e., snail tour XORing) before the GGH encryption process. For evaluating performance, three measurement criteria are considered: (i) Number of Pixels Change Rate (NPCR), (ii) Unified Average Changing Intensity (UACI), and (iii) avalanche effect. The results on three different sizes of images showed that the padding GGH approach improved UACI, NPCR, and avalanche by almost 100%, 35%, and 45%, respectively, in comparison to the standard GGH algorithm. The outcomes also make padding GGH resistant to cipher text, chosen cipher text, and statistical attacks. Furthermore, increasing the avalanche effect to more than 50% is a promising achievement relative to the increased complexity of the proposed method in terms of encryption and decryption processes. PMID:27857824

  20. An approach to the language discrimination in different scripts using adjacent local binary pattern

    NASA Astrophysics Data System (ADS)

    Brodić, D.; Amelio, A.; Milivojević, Z. N.

    2017-09-01

    The paper proposes a language discrimination method for documents. First, each letter is encoded with a certain script type according to its status in the baseline area. The resulting cipher text is subjected to a feature extraction process, in which the local binary pattern and its expanded version, the adjacent local binary pattern, are extracted. Because of differences in language characteristics, this analysis shows significant diversity, and this diversity is the key element in differentiating the languages. The proposed method is tested on example documents, and the experiments give encouraging results.

  1. A secure image encryption method based on dynamic harmony search (DHS) combined with chaotic map

    NASA Astrophysics Data System (ADS)

    Mirzaei Talarposhti, Khadijeh; Khaki Jamei, Mehrzad

    2016-06-01

    In recent years, there has been increasing interest in the security of digital images. This study focuses on grayscale image encryption using dynamic harmony search (DHS). First, a chaotic map is used to create cipher images; then the maximum entropy and minimum correlation coefficient are obtained by applying a harmony search algorithm to them. The process is divided into two steps. In the first step, a plain image is diffused using DHS with entropy maximization as the fitness function. In the second step, horizontal and vertical permutations are applied to the best cipher image obtained in the previous step, with DHS minimizing the correlation coefficient as the fitness function. The simulation results show that the proposed method attains a maximum entropy of approximately 7.9998 and a minimum correlation coefficient of approximately 0.0001.
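
    The two fitness functions named in the abstract, entropy and adjacent-pixel correlation, can be sketched for an 8-bit grayscale image given as a flat list of pixel values (illustrative code, not the authors' implementation):

```python
# Shannon entropy (max 8.0 for 256 gray levels) and Pearson correlation
# between each pixel and its right neighbor, on a flat pixel list.
import math

def shannon_entropy(pixels):
    n = len(pixels)
    counts = {}
    for p in pixels:
        counts[p] = counts.get(p, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def adjacent_correlation(pixels):
    x, y = pixels[:-1], pixels[1:]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

uniform = list(range(256)) * 4          # perfectly uniform histogram
print(shannon_entropy(uniform))         # → 8.0, the theoretical maximum
```

    A good cipher image drives the entropy toward 8.0 and the adjacent-pixel correlation toward 0, which is what the paper's reported 7.9998 and 0.0001 indicate.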

  2. A new simple technique for improving the random properties of chaos-based cryptosystems

    NASA Astrophysics Data System (ADS)

    Garcia-Bosque, M.; Pérez-Resa, A.; Sánchez-Azqueta, C.; Celma, S.

    2018-03-01

    A new technique for improving the security of chaos-based stream ciphers has been proposed and tested experimentally. This technique improves the randomness properties of the generated keystream by preventing the system from falling into short period cycles due to digitization. In order to test this technique, a stream cipher based on a skew tent map algorithm has been implemented on a Virtex 7 FPGA. The randomness of the keystream generated by this system has been compared to that of the keystream generated by the same system with the proposed randomness-enhancement technique. By subjecting both keystreams to the National Institute of Standards and Technology (NIST) tests, we have shown that our method can considerably improve the randomness of the generated keystreams. Incorporating the randomness-enhancement technique required only 41 extra slices, proving that, besides being effective, this method is also efficient in terms of area and hardware resources.
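
    A skew tent map keystream in the style of the cipher under test might look like the following sketch; the parameters and one-bit thresholding are assumptions for illustration, not the FPGA design. Finite-precision digitization makes such generators prone to short cycles, which is exactly the weakness the proposed technique targets.

```python
# Skew tent map keystream sketch: iterate the chaotic map and threshold the
# state to one bit per step. Illustrative parameters only.
def skew_tent(x: float, a: float) -> float:
    # Piecewise-linear chaotic map on (0, 1) with skew parameter a.
    return x / a if x < a else (1.0 - x) / (1.0 - a)

def keystream_bits(seed: float, a: float, n: int):
    bits, x = [], seed
    for _ in range(n):
        x = skew_tent(x, a)
        bits.append(1 if x >= 0.5 else 0)   # threshold the state to one bit
    return bits

ks = keystream_bits(seed=0.141592, a=0.6, n=64)
```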

  3. Secure biometric image sensor and authentication scheme based on compressed sensing.

    PubMed

    Suzuki, Hiroyuki; Suzuki, Masamichi; Urabe, Takuya; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2013-11-20

    It is important to ensure the security of biometric authentication information, because its leakage causes serious risks, such as replay attacks using the stolen biometric data, and also because it is almost impossible to replace raw biometric information. In this paper, we propose a secure biometric authentication scheme that protects such information by employing an optical data ciphering technique based on compressed sensing. The proposed scheme is based on two-factor authentication, the biometric information being supplemented by secret information that is used as a random seed for a cipher key. In this scheme, a biometric image is optically encrypted at the time of image capture, and a pair of restored biometric images for enrollment and verification are verified in the authentication server. If any of the biometric information is exposed to risk, it can be reenrolled by changing the secret information. Through numerical experiments, we confirm that finger vein images can be restored from the compressed sensing measurement data. We also present results that verify the accuracy of the scheme.

  4. Analysis Resistant Cipher Method and Apparatus

    NASA Technical Reports Server (NTRS)

    Oakley, Ernest C. (Inventor)

    2009-01-01

    A system for encoding and decoding data words including an anti-analysis encoder unit for receiving an original plaintext and producing a recoded data, a data compression unit for receiving the recoded data and producing a compressed recoded data, and an encryption unit for receiving the compressed recoded data and producing an encrypted data. The recoded data has an increased non-correlatable data redundancy compared with the original plaintext in order to mask the statistical distribution of characters in the plaintext data. The system of the present invention further includes a decryption unit for receiving the encrypted data and producing a decrypted data, a data decompression unit for receiving the decrypted data and producing an uncompressed recoded data, and an anti-analysis decoder unit for receiving the uncompressed recoded data and producing a recovered plaintext that corresponds with the original plaintext.

  5. The Vigenere Cipher with the TI-83

    ERIC Educational Resources Information Center

    Hamilton, Michael; Yankosky, Bill

    2004-01-01

    Cryptology, the science of secret writing, is a great way to introduce students to different areas of mathematics such as number theory, linear algebra, probability and statistics. Cryptology consists of two branches: cryptography and cryptanalysis. Cryptography is the science of designing techniques for encrypting and decrypting a message.…

  6. FBIS report. Science and technology: Japan, November 6, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-06

    Some articles are: R&D on Microfactory Technologies; MHI Develops Low Cost, Low Noise Mid-size Helicopters; Kumamoto University to Apply for Approval to Conduct Clinical Experiment for Gene Therapy; MITI To Support Private Sector to Develop Cipher Technology; and Hitachi Electronics Develops Digital Broadcasting Camera System.

  7. Jinneography: Post-Soviet passages of traumatic exemplarity.

    PubMed

    Beigi, Khashayar

    2016-04-01

    While Russia has historically and geographically close ties with Islam, the second most-practiced religion in its vast territories, the collapse of the USSR changed the terms of this relationship in significant ways. One key shift is the emergence of new immigration patterns between Russia and former Soviet states. Traversing distant lands from the peripheries of the Caucasus and Central Asia to mainland Russia in search of work, migrants have come to recognize each other as fellow Muslims dispersed in a theological geography on the ruins of the universal comradeship dreamed by the Soviet utopia. I propose to study the Islamic pedagogical practice of ibra in the context of sociohistorical dynamics of education and migration between Russia and Central Asia to further locate and analyze this shift in relation to current debates on post-Soviet subjectivity. By discussing the case of a spirit possession of a Tajik national performed in Russia, I argue that the collective participation in the session pedagogically invokes, ciphers, and extends the post-Soviet terrains of history as ibra, or exemplary passage of worldly events. To do so, I first locate the Quranic concept of ibra as a pedagogical paradigm in Islamic traditions as well as an ethnographic lens in the context of educational campaigns for the Muslims of Eurasia and then apply the concept to my analysis of the possession session in order to show that in the ritualistic incarnations of ghosts, or jinns, the civil war of Tajikistan and its continuing cycle of terror is ciphered into a desire for learning, as well as a focus on approximation to the divine. © The Author(s) 2015.

  8. A joint FED watermarking system using spatial fusion for verifying the security issues of teleradiology.

    PubMed

    Viswanathan, P; Krishna, P Venkata

    2014-05-01

    Teleradiology allows transmission of medical images for clinical data interpretation to provide improved e-health care access, delivery, and standards. Remote transmission raises various ethical and legal issues such as image retention, fraud, privacy, and malpractice liability. A joint FED (fingerprint/encryption/dual) watermarking system is proposed to address these issues. The system combines a region-based substitution dual watermarking algorithm using spatial fusion, a stream cipher algorithm using a symmetric key, and a fingerprint verification algorithm using invariants. This paper aims to provide access to the outcomes of medical images with confidentiality, availability, integrity, and provenance. The watermarking, encryption, and fingerprint enrollment are conducted jointly in the protection stage so that extraction, decryption, and verification can be applied independently. The dual watermarking system, which introduces two different embedding schemes, one for patient data and the other for fingerprint features, reduces the difficulty of maintaining multiple documents such as authentication data, personnel and diagnosis data, and medical images. The spatial fusion algorithm, which uses a threshold from the image to determine the embedding region for the encrypted patient data, follows the exact rules of fusion and yields better quality than other fusion techniques. The four-step stream cipher algorithm with a symmetric key for encrypting patient data, together with the fingerprint verification system using algebraic invariants, improves the robustness of the medical information. The proposed scheme is evaluated for security and quality in DICOM medical images and performs well in terms of attacks, quality index, and imperceptibility.

  9. Modulation for terrestrial broadcasting of digital HDTV

    NASA Technical Reports Server (NTRS)

    Kohn, Elliott S.

    1991-01-01

    The digital modulation methods used by the DigiCipher, DSC-HDTV, ADTV, and ATVA-P digital high-definition television (HDTV) systems are discussed. Three of the systems use a quadrature amplitude modulation method, and the fourth uses a vestigial sideband modulation method. The channel equalization and spectrum sharing of the digital HDTV systems is discussed.

  10. College and Industry: Partners in the Handicapped Role (Cipher III).

    ERIC Educational Resources Information Center

    Katz, David; And Others

    A project was designed and instituted to furnish a structure that would bring together three groups--potential employers, college personnel, and disabled people--to increase employment opportunities for the handicapped. During the third and final project year, representatives of all three groups met in workshops to discuss issues and concerns.…

  11. Signatures and Popular Literacy in Early Seventeenth-Century Japan

    ERIC Educational Resources Information Center

    Rubinger, Richard

    2006-01-01

    My paper looks at "signatures" in the form of "ciphers" (kao) and other personal marks made on population registers, town rules, and apostasy oaths in the early seventeenth century to provide some empirical evidence of very high literacy among village leaders. The essay also argues, using the same data, that literacy had…

  12. Website-based PNG image steganography using the modified Vigenere Cipher, least significant bit, and dictionary based compression methods

    NASA Astrophysics Data System (ADS)

    Rojali, Salman, Afan Galih; George

    2017-08-01

    Along with the development of information technology, various adverse actions that are difficult to avoid are emerging; one such action is data theft. Therefore, this study discusses cryptography and steganography, which aim to overcome these problems, using the modified Vigenere cipher, least significant bit, and dictionary-based compression methods. To determine the performance of the study, the Peak Signal to Noise Ratio (PSNR) is used as an objective measure and the Mean Opinion Score (MOS) as a subjective one; the performance is also compared to other methods such as spread spectrum and pixel value differencing. After this comparison, it can be concluded that this study provides better performance than the other methods, with a range of MSE values (0.0191622-0.05275) and PSNR (60.909 to 65.306) for a hidden file size of 18 kb, and a MOS range (4.214 to 4.722), i.e., image quality approaching very good.
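
    The least-significant-bit step, one of the three methods the study combines, can be sketched over a flat byte array standing in for decoded PNG pixel data (the modified Vigenere cipher and dictionary-based compression stages are omitted here):

```python
# LSB steganography sketch: hide each payload bit in the lowest bit of one
# cover byte, then read the bits back to recover the payload.
def lsb_embed(pixels: bytearray, payload: bytes) -> bytearray:
    out = bytearray(pixels)
    for i, byte in enumerate(payload):
        for bit in range(8):
            idx = i * 8 + bit
            b = (byte >> (7 - bit)) & 1
            out[idx] = (out[idx] & 0xFE) | b   # overwrite the lowest bit only
    return out

def lsb_extract(pixels, n_bytes: int) -> bytes:
    data = bytearray()
    for i in range(n_bytes):
        byte = 0
        for bit in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit] & 1)
        data.append(byte)
    return bytes(data)

cover = bytearray(range(200))                 # stand-in "pixel" bytes
stego = lsb_embed(cover, b"secret")           # 6 payload bytes use 48 cover bytes
assert lsb_extract(stego, 6) == b"secret"
```

    Because only the lowest bit of each byte changes, each modified pixel value shifts by at most 1, which is why LSB embedding keeps PSNR high.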

  13. Ciphers and Currencies: Literacy Dilemmas and Shifting Knowledges.

    ERIC Educational Resources Information Center

    Kell, Catherine

    2001-01-01

    Provides an overview of issues in the literacy policy field from a social practices perspective. Outlines a central dilemma in both theory and practice in adult literacy work: that practice theory has not impacted on literacy policy in large parts of the world. Suggests there is an ever-widening gap between literacies of everyday life and the…

  14. Secret Writing. Keys to the Mysteries of Reading and Writing.

    ERIC Educational Resources Information Center

    Sears, Peter

    With a central theme of how people create a means to communicate reliably, and based on language-making exercises that touch students' imaginations, this book aims to interest students in language and how language is made. Since students like codes and ciphers, the book begins with secret writing, which is then used to reveal the foundation of…

  15. Codes, Ciphers, and Cryptography--An Honors Colloquium

    ERIC Educational Resources Information Center

    Karls, Michael A.

    2010-01-01

    At the suggestion of a colleague, I read "The Code Book", [32], by Simon Singh to get a basic introduction to the RSA encryption scheme. Inspired by Singh's book, I designed a Ball State University Honors Colloquium in Mathematics for both majors and non-majors, with material coming from "The Code Book" and many other sources. This course became…

  16. From Candy Girls to Cyber Sista-Cipher: Narrating Black Females' Color-Consciousness and Counterstories in "and" out "of School"

    ERIC Educational Resources Information Center

    Kynard, Carmen

    2010-01-01

    In this article, Carmen Kynard provides a window into a present-day "hush harbor," a site where a group of black women build generative virtual spaces for counterstories that fight institutional racism. Hidden in plain view, these intentional communities have historically allowed African American participants to share and create knowledge and find…

  17. Hybrid Cryptosystem Using Tiny Encryption Algorithm and LUC Algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, Dian; Sharif, Amer; Jaysilen; Andri Budiman, Mohammad

    2018-01-01

    Security is a very important issue in data transmission, and there are many methods to make files more secure. One of those methods is cryptography, which secures a file by writing hidden code to cover the original file; people who do not hold the cryptographic keys cannot decrypt the hidden code to read the original file. Among the many methods used in cryptography is the hybrid cryptosystem, which uses a symmetric algorithm to secure the file and an asymmetric algorithm to secure the symmetric algorithm's key. In this research, the TEA algorithm is used as the symmetric algorithm and the LUC algorithm as the asymmetric algorithm. The system is tested by encrypting and decrypting a file with the TEA algorithm and using the LUC algorithm to encrypt and decrypt the TEA key. The result of this research is that when the TEA algorithm encrypts the file, the cipher text consists of ASCII (American Standard Code for Information Interchange) characters in the form of hexadecimal numbers, and the cipher text size increases by sixteen bytes for every eight characters added to the plaintext length.
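
    The TEA algorithm referred to above is small enough to sketch in full; this is the standard published TEA (64-bit block, 128-bit key, 32 cycles) shown in Python with 32-bit masking, not the paper's exact implementation.

```python
# Tiny Encryption Algorithm (TEA): 64-bit block as two 32-bit halves,
# 128-bit key as four 32-bit words, 32 cycles with the golden-ratio delta.
MASK = 0xFFFFFFFF
DELTA = 0x9E3779B9

def tea_encrypt(v0, v1, k):
    s = 0
    for _ in range(32):
        s = (s + DELTA) & MASK
        v0 = (v0 + ((((v1 << 4) & MASK) + k[0]) ^ ((v1 + s) & MASK) ^ ((v1 >> 5) + k[1]))) & MASK
        v1 = (v1 + ((((v0 << 4) & MASK) + k[2]) ^ ((v0 + s) & MASK) ^ ((v0 >> 5) + k[3]))) & MASK
    return v0, v1

def tea_decrypt(v0, v1, k):
    s = (DELTA * 32) & MASK                    # 0xC6EF3720
    for _ in range(32):
        v1 = (v1 - ((((v0 << 4) & MASK) + k[2]) ^ ((v0 + s) & MASK) ^ ((v0 >> 5) + k[3]))) & MASK
        v0 = (v0 - ((((v1 << 4) & MASK) + k[0]) ^ ((v1 + s) & MASK) ^ ((v1 >> 5) + k[1]))) & MASK
        s = (s - DELTA) & MASK
    return v0, v1

key = [0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210]
ct = tea_encrypt(0xDEADBEEF, 0xCAFEBABE, key)
assert tea_decrypt(*ct, key) == (0xDEADBEEF, 0xCAFEBABE)
```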

  18. Single-channel 40 Gbit/s digital coherent QAM quantum noise stream cipher transmission over 480 km.

    PubMed

    Yoshida, Masato; Hirooka, Toshihiko; Kasai, Keisuke; Nakazawa, Masataka

    2016-01-11

    We demonstrate the first 40 Gbit/s single-channel polarization-multiplexed, 5 Gsymbol/s, 16 QAM quantum noise stream cipher (QNSC) transmission over 480 km by incorporating ASE quantum noise from EDFAs as well as the quantum shot noise of the coherent state with multiple photons for the random masking of data. By using a multi-bit encoded scheme and digital coherent transmission techniques, secure optical communication with a record data capacity and transmission distance has been successfully realized. In this system, the signal level received by Eve is hidden by both amplitude and phase noise. The highest number of masked signals, 7.5 × 10^4, was achieved by using a QAM scheme with FEC, which makes it possible to reduce the output power from the transmitter while maintaining an error-free condition for Bob. We have newly measured the noise distribution around the I and Q encrypted data and shown experimentally, with a data size as large as 2^25, that the noise has a Gaussian distribution with no correlations. This distribution is suitable for the random masking of data.

  19. [Side Effects of Modernity : Dam Building, Health Care, and the Construction of Power in the Context of the Control of Schistosomiasis in Egypt in the 1960s and early 1970s].

    PubMed

    Brendel, Benjamin

    2017-09-01

    This article analyzes the modernization campaigns in Egypt in the 1960s and early 1970s. The regulation of the Nile by the Aswan High Dam and the resulting irrigation projects caused the rate of schistosomiasis infestation in the population to rise. The result was a discourse between experts from the global north and Egyptian elites about modernization, development aid, dam building and health care. The fight against schistosomiasis was like a cipher, which combined different power-laden concepts and arguments. This article will decode the cipher and allow a deeper look into the contemporary dimensions of power bound to this subject. The text is conceived around three thematic axes. The first deals with the discursive interplay of modernization, health and development aid in and for Egypt. The second focuses on far-reaching and long-standing arguments within an international expert discourse about these concepts. Finally, the third presents an exemplary case study of West German health and development aid for fighting schistosomiasis in the Egyptian Fayoum oasis.

  20. Enhanced K-means clustering with encryption on cloud

    NASA Astrophysics Data System (ADS)

    Singh, Iqjot; Dwivedi, Prerna; Gupta, Taru; Shynu, P. G.

    2017-11-01

    This paper addresses the problem of storing and managing big files in the cloud by implementing hashing on Hadoop for big data and ensuring security while uploading and downloading files. Cloud computing is a term that emphasizes sharing data and facilitates sharing infrastructure and resources.[10] Hadoop is open-source software that lets us store and manage big files according to our needs in the cloud. The K-means clustering algorithm calculates the distance between the centroid of a cluster and the data points. Hashing stores and retrieves data with hash keys; the hashing algorithm, called a hash function, is used to portray the original data and later to fetch the data stored at a specific key. [17] Encryption is a process that transforms electronic data into a non-readable form known as cipher text; decryption is the opposite process, transforming the cipher text into plain text that the end user can read and understand. For encryption and decryption we use a symmetric key cryptographic algorithm: the DES algorithm, for secure storage of the files. [3
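
    The K-means distance step described in the abstract (assign each data point to its nearest centroid) can be sketched independently of the Hadoop/DES pipeline; the sample points and centroids below are made up for illustration.

```python
# One K-means assignment step: Euclidean distance from each point to each
# centroid, keeping the index of the nearest centroid.
import math

def assign_clusters(points, centroids):
    assignments = []
    for p in points:
        dists = [math.dist(p, c) for c in centroids]
        assignments.append(dists.index(min(dists)))   # index of nearest centroid
    return assignments

points = [(1.0, 1.0), (1.5, 2.0), (8.0, 8.0), (9.0, 11.0)]
centroids = [(1.0, 1.5), (9.0, 9.0)]
print(assign_clusters(points, centroids))   # → [0, 0, 1, 1]
```

    A full K-means run alternates this assignment step with recomputing each centroid as the mean of its assigned points until the assignments stop changing.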

  1. Cryptanalysis of the Sodark Family of Cipher Algorithms

    DTIC Science & Technology

    2017-09-01

    A software project for building three-bit LUT circuit representations of S-boxes is available as a GitHub repository [40]. It contains several improvements... Approved for public release; distribution is unlimited. ...The second- and third-generation automatic link establishment (ALE) systems for high frequency radios. Radios utilizing ALE technology are in use by a

  2. Federation for a Secure Enterprise

    DTIC Science & Technology

    2016-09-10

    12 October 2005; e. RFC Internet X.509 Public Key Infrastructure: Certification Path Building, 2005; f. PKCS #1 v2.2: RSA Cryptography Standard, RSA Laboratories, October 27, 2012; g. PKCS #12 v1.0: Personal Information Exchange Syntax Standard, RSA... ClientHello padding extension, 2015-02-17; f. Elliptic Curve Cryptography (ECC) Cipher Suites for Transport Layer Security (TLS) Versions 1.2 and Earlier

  3. The Role of Counterintelligence in the European Theater of Operations During World War II

    DTIC Science & Technology

    1993-06-04

    revolvers, Minox cameras, portable typewriters, 48 fingerprint cameras, latent fingerprint kits, handcuffs, and listening and recording devices. [13] This... Comments from the detachments indicated that the fingerprint equipment and the listening and recording devices were of little use. However, the revolvers... 40-49. 138 Moulage 2; Fingerprinting 2; Latent Fingerprinting 3; System of Identification 1; Codes and Ciphers 1; Handwriting Comparison 2; Documentary

  4. Physical-layer encryption on the public internet: A stochastic approach to the Kish-Sethuraman cipher

    NASA Astrophysics Data System (ADS)

    Gunn, Lachlan J.; Chappell, James M.; Allison, Andrew; Abbott, Derek

    2014-09-01

    While information-theoretic security is often associated with the one-time pad and quantum key distribution, noisy transport media leave room for classical techniques and even covert operation. Transit times across the public internet exhibit a degree of randomness, and cannot be determined noiselessly by an eavesdropper. We demonstrate the use of these measurements for information-theoretically secure communication over the public internet.

  5. An Architecture for Enabling Migration of Tactical Networks to Future Flexible Ad Hoc WBWF

    DTIC Science & Technology

    2010-09-01

    Requirements: several multiple access schemes (TDMA, OFDMA, SC-OFDMA, FH-CDMA, DS-CDMA), hybrid access schemes, transitions between them. Dynamic... parameters and algorithms depend on the multiple access scheme. If DS-CDMA: handling of macro-diversity (linked to cooperative routing). TDMA and/or OFDMA... Transport format. Ciphering @ MAC/RLC level: SCM. Physical layer (PHY): signal processing (mod, FEC, etc.). CDMA: macro-diversity. CDMA, OFDMA

  6. Pre-Mrna Introns as a Model for Cryptographic Algorithm:. Theory and Experiments

    NASA Astrophysics Data System (ADS)

    Regoli, Massimo

    2010-01-01

    The RNA-Crypto System (RCS for short) is a symmetric key algorithm to cipher data. The idea for this new algorithm starts from the observation of nature, in particular of RNA behavior and some of its properties. RNA sequences have some sections called introns. Introns, a term derived from "intragenic regions", are non-coding sections of precursor mRNA (pre-mRNA) or other RNAs that are removed (spliced out of the RNA) before the mature RNA is formed. Once the introns have been spliced out of a pre-mRNA, the resulting mRNA sequence is ready to be translated into a protein. The corresponding parts of a gene are known as introns as well. The nature and role of introns in the pre-mRNA are not yet clear and are under intensive research by biologists, but in our case we use the presence of introns in the RNA-Crypto System output as a strong method to add chaotic non-coding information and a misleading behaviour in the access to the secret key used to code the messages. In the RNA-Crypto System algorithm, the introns are sections of the ciphered message carrying non-coding information, just as in the precursor mRNA.
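
    The intron idea can be illustrated with a minimal sketch: non-coding "intron" bytes are inserted into the cipher text at key-derived pseudo-random positions, and the receiver splices them out by replaying the same keyed sequence. The function names and the use of a seeded PRNG are assumptions for illustration; the paper does not publish its actual splicing rule.

```python
import random

def add_introns(cipher: bytes, key: int, n: int) -> bytes:
    """Insert n pseudo-random 'intron' bytes at key-derived positions."""
    rng = random.Random(key)
    out = bytearray(cipher)
    for _ in range(n):
        pos = rng.randrange(len(out) + 1)
        out.insert(pos, rng.randrange(256))
    return bytes(out)

def splice_introns(text: bytes, key: int, n: int) -> bytes:
    """Replay the keyed insertion sequence, then delete the introns in reverse."""
    rng = random.Random(key)
    positions, length = [], len(text) - n
    for _ in range(n):
        positions.append(rng.randrange(length + 1))
        rng.randrange(256)  # consume the intron-byte draw to stay in sync
        length += 1
    out = bytearray(text)
    for pos in reversed(positions):
        del out[pos]
    return bytes(out)
```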

  7. a Simple Symmetric Algorithm Using a Likeness with Introns Behavior in RNA Sequences

    NASA Astrophysics Data System (ADS)

    Regoli, Massimo

    2009-02-01

    The RNA-Crypto System (RCS for short) is a symmetric key algorithm to cipher data. The idea for this new algorithm starts from the observation of nature, in particular of RNA behavior and some of its properties. RNA sequences have some sections called introns. Introns, a term derived from "intragenic regions", are non-coding sections of precursor mRNA (pre-mRNA) or other RNAs that are removed (spliced out of the RNA) before the mature RNA is formed. Once the introns have been spliced out of a pre-mRNA, the resulting mRNA sequence is ready to be translated into a protein. The corresponding parts of a gene are known as introns as well. The nature and role of introns in the pre-mRNA are not yet clear and are under intensive research by biologists, but in our case we use the presence of introns in the RNA-Crypto System output as a strong method to add chaotic non-coding information and a misleading behaviour in the access to the secret key used to code the messages. In the RNA-Crypto System algorithm, the introns are sections of the ciphered message carrying non-coding information, just as in the precursor mRNA.

  8. Bio—Cryptography: A Possible Coding Role for RNA Redundancy

    NASA Astrophysics Data System (ADS)

    Regoli, M.

    2009-03-01

    The RNA-Crypto System (RCS for short) is a symmetric key algorithm to cipher data. The idea for this new algorithm starts from the observation of nature, in particular of RNA behavior and some of its properties. RNA sequences have some sections called introns. Introns, a term derived from "intragenic regions," are non-coding sections of precursor mRNA (pre-mRNA) or other RNAs that are removed (spliced out of the RNA) before the mature RNA is formed. Once the introns have been spliced out of a pre-mRNA, the resulting mRNA sequence is ready to be translated into a protein. The corresponding parts of a gene are known as introns as well. The nature and role of introns in the pre-mRNA are not yet clear and are under intensive research by biologists, but in our case we use the presence of introns in the RNA-Crypto System output as a strong method to add chaotic non-coding information and a misleading behavior in the access to the secret key used to code the messages. In the RNA-Crypto System algorithm, the introns are sections of the ciphered message carrying non-coding information, just as in the precursor mRNA.

  9. A Secure Key Distribution System of Quantum Cryptography Based on the Coherent State

    NASA Technical Reports Server (NTRS)

    Guo, Guang-Can; Zhang, Xiao-Yu

    1996-01-01

    Cryptographic communication has many important applications, particularly in the magnificent prospects of private communication. As one knows, the security of a cryptographic channel depends crucially on the secrecy of the key. The Vernam cipher is the only cipher system with guaranteed security; in that system the key must be as long as the message and must be used only once. Quantum cryptography is a method whereby key secrecy can be guaranteed by a physical law, so it is impossible, even in principle, to eavesdrop on such channels. Quantum cryptography has been developed in recent years, and many schemes have been proposed. One of the main problems in this field is how to increase the transmission distance. In order to exploit the quantum nature of light, the schemes proposed so far all use very dim light pulses, with an average photon number of about 0.1. Because of the loss in optical fiber, it is difficult for quantum cryptography based on single-photon levels or dim light to realize quantum key distribution over long distances. A quantum key distribution scheme based on the coherent state is introduced in this paper, and we discuss its feasibility and security.
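
    The Vernam cipher mentioned above reduces to a byte-wise XOR with a single-use key; a minimal sketch:

```python
import secrets

def vernam(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte.

    The same function decrypts, since (m ^ k) ^ k == m. Information-theoretic
    security requires the key to be truly random, at least as long as the
    message, and never reused."""
    if len(key) < len(message):
        raise ValueError("key must be at least as long as the message")
    return bytes(m ^ k for m, k in zip(message, key))
```

    In practice the key would come from a source such as `secrets.token_bytes(len(message))` and be shared in advance, which is exactly the distribution problem quantum key distribution aims to solve.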

  10. Images Encryption Method using Steganographic LSB Method, AES and RSA algorithm

    NASA Astrophysics Data System (ADS)

    Moumen, Abdelkader; Sissaoui, Hocine

    2017-03-01

    Vulnerability of communication of digital images is an extremely important issue nowadays, particularly when the images are communicated through insecure channels. To improve communication security, many cryptosystems have been presented in the image encryption literature. This paper proposes a novel image encryption technique based on an algorithm that is faster than current methods. The proposed algorithm eliminates the step in which the secret key is shared during the encryption process. It is formulated based on symmetric encryption, asymmetric encryption and steganography theories. The image is encrypted using a symmetric algorithm; then the secret key is encrypted by means of an asymmetric algorithm and hidden in the ciphered image using a least-significant-bit steganographic scheme. The analysis results show that, while enjoying faster computation, our method performs close to optimal in terms of accuracy.
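
    The least-significant-bit embedding step can be sketched as follows; a flat list of grayscale pixel values stands in for the image, and the function names are illustrative, not from the paper.

```python
def embed_lsb(pixels, payload: bytes):
    """Hide payload bits (LSB-first per byte) in the least significant bit of each pixel."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for payload")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then set the payload bit
    return out

def extract_lsb(pixels, n_bytes: int) -> bytes:
    """Recover n_bytes of payload from the pixel LSBs."""
    bits = [p & 1 for p in pixels[: n_bytes * 8]]
    return bytes(sum(bits[i * 8 + j] << j for j in range(8)) for i in range(n_bytes))
```

    Because only the lowest bit of each pixel changes, the stego image differs from the original by at most one intensity level per pixel.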

  11. Small Private Key PKS on an Embedded Microprocessor

    PubMed Central

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-01-01

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011) a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but using a random number generator is much more costly than computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and boosts signature generation and verification by 5.78% and 12.19%, respectively, compared with the previous results in CHES2012. PMID:24651722

  12. Small private key MQPKS on an embedded microprocessor.

    PubMed

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-03-19

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011) a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but using a random number generator is much more costly than computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and boosts signature generation and verification by 5.78% and 12.19%, respectively, compared with the previous results in CHES2012.

  13. An image encryption algorithm based on 3D cellular automata and chaotic maps

    NASA Astrophysics Data System (ADS)

    Del Rey, A. Martín; Sánchez, G. Rodríguez

    2015-05-01

    A novel encryption algorithm to cipher digital images is presented in this work. The digital image is rendered into a three-dimensional (3D) lattice, and the protocol consists of two phases: the confusion phase, where 24 chaotic cat maps are applied, and the diffusion phase, where a 3D cellular automaton is evolved. The encryption method is shown to be secure against the most important cryptanalytic attacks.
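
    The confusion phase relies on chaotic cat maps; a minimal 2D sketch of one cat-map permutation round, using the common (x, y) -> (x + y, x + 2y) mod N form (the paper's 24 keyed maps and 3D lattice are not reproduced here):

```python
def cat_map_permute(image, iterations=1):
    """Apply the cat map (x, y) -> (x + y, x + 2y) mod N to an N x N image.

    The map matrix [[1, 1], [1, 2]] has determinant 1, so each round is a
    bijective shuffle of pixel positions; iterating enough times returns the
    original image (the map is periodic modulo N)."""
    n = len(image)
    for _ in range(iterations):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(x + y) % n][(x + 2 * y) % n] = image[x][y]
        image = out
    return image
```

    For a 2 x 2 image the map has period 3, so three rounds restore the original; real schemes stop far from the period and combine the shuffle with a diffusion stage.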

  14. Environmental Requirements for Authentication Protocols

    DTIC Science & Technology

    2002-01-01

    Engineering for Information Security, March 2001. 10. D. Chaum. Blind signatures for untraceable payments. In Advances in Cryptology - Proceedings of... the connection, the idea relies on a concept similar to blinding in the sense of Chaum [10], who used it effectively in the design of anonymous payment... digital signature on the key and a nonce provided by the server, in which the client's challenge response was independent of the type of cipher

  15. Color encryption scheme based on adapted quantum logistic map

    NASA Astrophysics Data System (ADS)

    Zaghloul, Alaa; Zhang, Tiejun; Amin, Mohamed; Abd El-Latif, Ahmed A.

    2014-04-01

    This paper presents a new color image encryption scheme based on a quantum chaotic system. Encryption is accomplished by generating an intermediate chaotic key stream with the help of a quantum chaotic logistic map. Each pixel is then encrypted using the cipher value of the previous pixel and the adapted quantum logistic map. The results show that the proposed scheme provides adequate security for the confidentiality of color images.
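
    The chaining idea, in which each pixel's cipher value depends on the previous cipher value and a chaotic keystream, can be sketched with a classical logistic map standing in for the quantum logistic map; the parameters x0 and r play the role of illustrative key material and are not the authors' values.

```python
def logistic_keystream(x0: float, r: float, n: int):
    """Iterate the logistic map x -> r*x*(1-x) and quantize each state to a byte."""
    ks, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        ks.append(int(x * 256) % 256)
    return ks

def encrypt_pixels(pixels, x0=0.3, r=3.99):
    """XOR each pixel with a keystream byte and the previous cipher byte,
    so every cipher value depends on all preceding pixels."""
    ks = logistic_keystream(x0, r, len(pixels))
    out, prev = [], 0
    for p, k in zip(pixels, ks):
        c = p ^ k ^ prev
        out.append(c)
        prev = c
    return out

def decrypt_pixels(cipher, x0=0.3, r=3.99):
    """Invert the chained XOR by regenerating the same keystream."""
    ks = logistic_keystream(x0, r, len(cipher))
    out, prev = [], 0
    for c, k in zip(cipher, ks):
        out.append(c ^ k ^ prev)
        prev = c
    return out
```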

  16. Git as an Encrypted Distributed Version Control System

    DTIC Science & Technology

    2015-03-01

    options. The algorithm uses AES-256 counter mode with an IV derived from an SHA-1 HMAC hash (this is nearly identical to the GCM mode discussed earlier... built into the internal structure of Git. Every file in a Git repository is checksummed with a SHA-1 hash, a one-way function with arbitrarily long... implementation. Git-encrypt calls OpenSSL cryptography library command-line functions. The default cipher used is AES-256 Electronic Code Book (ECB), which is
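
    The Git checksumming mentioned above hashes each file as a "blob" object: SHA-1 over a short header plus the content, which is why identical content always yields the same object ID.

```python
import hashlib

def git_blob_sha1(content: bytes) -> str:
    """Compute the SHA-1 Git assigns to a blob: sha1(b'blob <size>\\0' + content)."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()
```

    For empty content this yields e69de29bb2d1d6434b8b29ae775ad8c2e48c5391, the well-known empty-blob ID produced by `git hash-object`.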

  17. Random multispace quantization as an analytic mechanism for BioHashing of biometric and random identity inputs.

    PubMed

    Teoh, Andrew B J; Goh, Alwyn; Ngo, David C L

    2006-12-01

    Biometric analysis for identity verification is becoming a widespread reality. Such implementations necessitate large-scale capture and storage of biometric data, which raises serious issues in terms of data privacy and (if such data is compromised) identity theft. These problems stem from the essential permanence of biometric data, which (unlike secret passwords or physical tokens) cannot be refreshed or reissued if compromised. Our previously presented biometric-hash framework prescribes the integration of external (password or token-derived) randomness with user-specific biometrics, resulting in bitstring outputs with security characteristics (i.e., noninvertibility) comparable to cryptographic ciphers or hashes. The resultant BioHashes are hence cancellable, i.e., straightforwardly revoked and reissued (via refreshed password or reissued token) if compromised. BioHashing furthermore enhances recognition effectiveness, which is explained in this paper as arising from the Random Multispace Quantization (RMQ) of biometric and external random inputs.
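
    The core of BioHashing, projecting a biometric feature vector onto token-seeded random directions and thresholding, can be sketched as below. This is a simplification: the actual RMQ procedure orthonormalizes the token-derived basis, a step this illustration omits, and the function name is hypothetical.

```python
import random

def biohash(features, token_seed: int, n_bits: int = 16):
    """Project a biometric feature vector onto token-derived random directions
    and threshold each inner product at zero, yielding a revocable bitstring."""
    rng = random.Random(token_seed)  # the external token/password seeds the basis
    bits = []
    for _ in range(n_bits):
        direction = [rng.gauss(0, 1) for _ in features]
        dot = sum(f * d for f, d in zip(features, direction))
        bits.append(1 if dot >= 0 else 0)
    return bits
```

    Reissuing the token (a new seed) yields a fresh, unrelated BioHash from the same biometric, which is what makes the template cancellable.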

  18. Public-key quantum digital signature scheme with one-time pad private-key

    NASA Astrophysics Data System (ADS)

    Chen, Feng-Lin; Liu, Wan-Fang; Chen, Su-Gen; Wang, Zhi-Hua

    2018-01-01

    A quantum digital signature scheme is firstly proposed based on public-key quantum cryptosystem. In the scheme, the verification public-key is derived from the signer's identity information (such as e-mail) on the foundation of identity-based encryption, and the signature private-key is generated by one-time pad (OTP) protocol. The public-key and private-key pair belongs to classical bits, but the signature cipher belongs to quantum qubits. After the signer announces the public-key and generates the final quantum signature, each verifier can verify publicly whether the signature is valid or not with the public-key and quantum digital digest. Analysis results show that the proposed scheme satisfies non-repudiation and unforgeability. Information-theoretic security of the scheme is ensured by quantum indistinguishability mechanics and OTP protocol. Based on the public-key cryptosystem, the proposed scheme is easier to be realized compared with other quantum signature schemes under current technical conditions.

  19. The cosmic microwave background radiation power spectrum as a random bit generator for symmetric- and asymmetric-key cryptography.

    PubMed

    Lee, Jeffrey S; Cleaver, Gerald B

    2017-10-01

    In this note, the Cosmic Microwave Background (CMB) Radiation is shown to be capable of functioning as a Random Bit Generator, and constitutes an effectively infinite supply of truly random one-time pad values of arbitrary length. It is further argued that the CMB power spectrum potentially conforms to the FIPS 140-2 standard. Additionally, its applicability to the generation of a (n × n) random key matrix for a Vernam cipher is established.

  20. Cryptographic Properties of Monotone Boolean Functions

    DTIC Science & Technology

    2016-01-01

    Algebraic attacks on stream ciphers with linear feedback, in: Advances in Cryptology (Eurocrypt 2003), Lecture Notes in Comput. Sci. 2656, Springer, Berlin...spectrum, algebraic immu- nity MSC 2010: 06E30, 94C10, 94A60, 11T71, 05E99 || Communicated by: Carlo Blundo 1 Introduction Let F 2 be the prime eld of...7]. For the reader’s convenience, we recall some basic notions below. Any f ∈ Bn can be expressed in algebraic normal form (ANF) as f(x 1 , x 2

  1. Pseudo-Random Number Generator Based on Coupled Map Lattices

    NASA Astrophysics Data System (ADS)

    Lü, Huaping; Wang, Shihong; Hu, Gang

    A one-way coupled chaotic map lattice is used for generating pseudo-random numbers. It is shown that, with suitable cooperative application of both chaotic and conventional approaches, the output of the spatiotemporally chaotic system can easily meet the practical requirements of random numbers, i.e., excellent random statistical properties, long periodicity of computer realizations, and fast random number generation. This pseudo-random number generator can be used as an ideal synchronous and self-synchronizing stream cipher system for secure communications.
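
    A one-way coupled logistic-map lattice of the kind described, with update x_i(t+1) = (1 - eps) * f(x_i(t)) + eps * f(x_{i-1}(t)), can be sketched as follows. The site count, coupling strength, warm-up length and byte quantization are illustrative choices, not the authors' parameters.

```python
def cml_bytes(seed: float, n: int, sites: int = 8, eps: float = 0.95, r: float = 3.99):
    """Generate n pseudo-random bytes from a one-way coupled logistic-map lattice."""
    f = lambda v: r * v * (1.0 - v)
    x = [(seed + 0.1 * i) % 1.0 for i in range(sites)]
    out, warmup = [], 100
    for step in range(warmup + n):
        new = [f(x[0])]  # the first site is an uncoupled chaotic driver
        for i in range(1, sites):
            new.append((1 - eps) * f(x[i]) + eps * f(x[i - 1]))
        x = new
        if step >= warmup:  # discard transients before emitting output
            out.append(int(x[-1] * 256) % 256)
    return out
```

    Output is read from the last site only, so an observer of the byte stream sees the chaotic driver only through the coupled chain.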

  2. Medical data sheet in safe havens - A tri-layer cryptic solution.

    PubMed

    Praveenkumar, Padmapriya; Amirtharajan, Rengarajan; Thenmozhi, K; Balaguru Rayappan, John Bosco

    2015-07-01

    Secured sharing of the diagnostic reports and scan images of patients among doctors with complementary expertise for collaborative treatment will help to provide maximum care through faster and decisive decisions. In this context, a tri-layer cryptic solution has been proposed and implemented on Digital Imaging and Communications in Medicine (DICOM) images to establish secured communication for effective referrals among peers without compromising the privacy of patients. In this approach, a blend of three cryptic schemes, namely the Latin square image cipher (LSIC), the discrete Gould transform (DGT) and Rubik's encryption, has been adopted. Among them, LSIC provides better substitution, confusion and shuffling of the image blocks; DGT incorporates tamper proofing with authentication; and Rubik's renders a permutation of the DICOM image pixels. The developed algorithm has been successfully implemented and tested in both software (MATLAB 7) and hardware Universal Software Radio Peripheral (USRP) environments. Specifically, the encrypted data were tested by transmitting them through an additive white Gaussian noise (AWGN) channel model. Furthermore, the robustness of the implemented algorithm was validated by employing standard metrics such as the unified average changing intensity (UACI), number of pixels change rate (NPCR), correlation values and histograms. The estimated metrics have also been compared with those of existing methods and are superior in terms of a large key space to defy brute-force attack, resistance to cropping attack, strong key sensitivity and uniform pixel-value distribution on encryption. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Fundamental quantitative security in quantum key generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuen, Horace P.

    2010-12-15

    We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation, including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K, when the attacker just gets at K before it is used in a cryptographic context, and its composition security, when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.

  4. A joint asymmetric watermarking and image encryption scheme

    NASA Astrophysics Data System (ADS)

    Boato, G.; Conotter, V.; De Natale, F. G. B.; Fontanari, C.

    2008-02-01

    Here we introduce a novel watermarking paradigm designed to be both asymmetric, i.e., involving a private key for embedding and a public key for detection, and commutative with a suitable encryption scheme, allowing both to cipher watermarked data and to mark encrypted data without interfering with the detection process. In order to demonstrate the effectiveness of the above principles, we present an explicit example where the watermarking part, based on elementary linear algebra, and the encryption part, exploiting a secret random permutation, are integrated in a commutative scheme.

  5. A novel chaotic image encryption scheme using DNA sequence operations

    NASA Astrophysics Data System (ADS)

    Wang, Xing-Yuan; Zhang, Ying-Qian; Bao, Xue-Mei

    2015-10-01

    In this paper, we propose a novel image encryption scheme based on DNA (Deoxyribonucleic acid) sequence operations and chaotic system. Firstly, we perform bitwise exclusive OR operation on the pixels of the plain image using the pseudorandom sequences produced by the spatiotemporal chaos system, i.e., CML (coupled map lattice). Secondly, a DNA matrix is obtained by encoding the confused image using a kind of DNA encoding rule. Then we generate the new initial conditions of the CML according to this DNA matrix and the previous initial conditions, which can make the encryption result closely depend on every pixel of the plain image. Thirdly, the rows and columns of the DNA matrix are permuted. Then, the permuted DNA matrix is confused once again. At last, after decoding the confused DNA matrix using a kind of DNA decoding rule, we obtain the ciphered image. Experimental results and theoretical analysis show that the scheme is able to resist various attacks, so it has extraordinarily high security.

  6. A novel encryption scheme for high-contrast image data in the Fresnelet domain

    PubMed Central

    Bibi, Nargis; Farwa, Shabieh; Jahngir, Adnan; Usman, Muhammad

    2018-01-01

    In this paper, a unique and more distinctive encryption algorithm is proposed, based on the complexity of a highly nonlinear S-box in the Fresnelet domain. The nonlinear pattern is further transformed to enhance the confusion in the dummy data using the Fresnelet technique. The security level of the encrypted image is boosted using the algebra of the Galois field in the Fresnelet domain. At the first level, the Fresnelet transform is used to propagate the given information with the desired wavelength at a specified distance. It decomposes the given secret data into four complex subbands. These complex subbands are separated into two components: real subband data and imaginary subband data. At the second level, the net subband data produced at the first level is deteriorated into a non-linear diffused pattern using a unique S-box defined on the Galois field F2^8. In the diffusion process, the permuted image is substituted via dynamic algebraic S-box substitution. We prove through various analysis techniques that the proposed scheme extensively enhances the cipher security level. PMID:29608609

  7. Performance analysis of AES-Blowfish hybrid algorithm for security of patient medical record data

    NASA Astrophysics Data System (ADS)

    Mahmud H, Amir; Angga W, Bayu; Tommy; Marwan E, Andi; Siregar, Rosyidah

    2018-04-01

    File security is one method to protect data confidentiality, integrity and information security. Cryptography is one of the techniques used to secure and guarantee data confidentiality by converting plaintext (the original message) to cipher text (the hidden message) with two important processes: encryption and decryption. Some researchers have proposed hybrid methods to improve data security. In this research we propose a hybrid AES-Blowfish (BF) method to secure a patient's medical report data, in the form of a PDF file sourced from a database. The private and public keys are generated using two approaches: RSA and ECC. We analyze the impact of these two approaches on the AES-Blowfish hybrid method in terms of time and throughput. Based on the test results, the BF method is faster than AES and the AES-BF hybrid, while the AES-BF hybrid achieves better throughput than AES and BF.

  8. Vulnerabilities in GSM technology and feasibility of selected attacks

    NASA Astrophysics Data System (ADS)

    Voznak, M.; Prokes, M.; Sevcik, L.; Frnda, J.; Toral-Cruz, Homer; Jakovlev, Sergej; Fazio, Peppino; Mehic, M.; Mikulec, M.

    2015-05-01

    Global System for Mobile communication (GSM) is the most widespread technology for mobile communications in the world, serving over 7 billion users. Since the first publication of the system documentation, potential security problems have been pointed out. Selected types of attacks, chosen based on an analysis of their technical feasibility and the degree of risk of these weaknesses, were implemented and demonstrated in the laboratory of the VSB-Technical University of Ostrava, Czech Republic. These vulnerabilities were analyzed and possible attacks were described. The attacks were implemented using open-source tools, a software-programmable radio USRP (Universal Software Radio Peripheral) and a DVB-T (Digital Video Broadcasting - Terrestrial) receiver. GSM security architecture has been scrutinized since the first public releases of its specification, mainly pointing out weaknesses in the authentication and ciphering mechanisms. This contribution also summarizes practically proven scenarios performed using open-source software tools and a variety of scripts, mostly written in Python. The main goal of this paper is to analyze security issues in the GSM network and to practically demonstrate selected attacks.

  9. [Resilience and Spirituality Considered from Viewpoint of Existential Philosophy of Karl Jaspers].

    PubMed

    Kato, Satoshi

    2015-01-01

    After publishing "General Psychopathology" in 1913, Jaspers turned his attention to serious philosophical contemplation. Using the term Grenzsituation (limit situation) as a key concept, he first presented a framework to shed light on the pathology of both individuals and groups, and this came to include the perspective of resilience. He then used three more key concepts, Transzendenz (transcendence), Chiffer (cipher), and das Unverständliche (the unintelligible), to offer a framework focused on the possibilities of human existence. In the field of medicine, this is useful in supporting the spiritual approach discussed in palliative treatment. The philosophy developed by Jaspers can be considered as offering practical guidance for people to find self-support in a limit situation where they have lost their own support and, finally, to come to a degree of mutual acceptance. Mutual acceptance is made possible at the level of ciphers, in which specific meaning remains undefined, by directing both the self and the other toward a state of "transcendence". Nowadays there is a trend for chaplains involved in spiritual care from a specialist point of view to be trained to effectively transcend any difference in religious belief. As a basic premise, the author considers there is a need to once again return to a state before the start of individual religions, and to stand on a cross-sectional ground level, an area which could be regarded as common to all religions. When conducting such a task, in the author's view, the restrained spirituality that Jaspers expounded is thought-provoking.

  10. AMS Prototyping Activities

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott

    2008-01-01

    This slide presentation reviews activity around the Asynchronous Message Service (AMS) prototype. An AMS reference implementation has been available since late 2005, aimed at supporting message exchange both in on-board environments and over space links. The implementation incorporates all mandatory elements of the draft recommendation from July 2007: (1) the MAMS, AMS, and RAMS protocols; (2) failover, heartbeats, and resync; (3) "hooks" for security, but no cipher suites included in the distribution. The performance is reviewed, and a benchmark latency test over VxWorks message queues is shown as a histogram of count versus microseconds per 1000-byte message.

  11. Deducing trapdoor primitives in public key encryption schemes

    NASA Astrophysics Data System (ADS)

    Pandey, Chandra

    2005-03-01

    Semantic security of public key encryption schemes is often interchangeable with the art of building trapdoors. In the frame of reference of the Random Oracle methodology, "Key Privacy" and "Anonymity" have often been discussed. However, to a certain degree the security of most public key encryption schemes must be analyzed with formal proofs using one-way functions. This paper evaluates the design of El Gamal and RSA based schemes and attempts to parallelize the trapdoor primitives used in the computation of the cipher text, thereby magnifying the decryption error δp in the above schemes.

  12. Another Look at the "Dismal Science" and Jenner's Experiment.

    PubMed

    Ellis, John A

    2018-03-01

    The follow-up to Jenner's experiment, routine vaccination, has reduced more disease and saved more vertebrate lives than any other iatrogenic procedure by orders of magnitude. The unassailability of that potentially provocative cliché has been ciphered in human medicine, even if it is more difficult in our profession. Most public relations headaches concerning vaccines are a failure to communicate, often resulting in overly great expectations. Even in the throes of a tight appointment schedule, remembering and synopsizing for clients some details of the dismal science can make practice great again. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Information Security Scheme Based on Computational Temporal Ghost Imaging.

    PubMed

    Jiang, Shan; Wang, Yurong; Long, Tao; Meng, Xiangfeng; Yang, Xiulun; Shu, Rong; Sun, Baoqing

    2017-08-09

    An information security scheme based on computational temporal ghost imaging is proposed. A sequence of independent 2D random binary patterns is used as the encryption key and multiplied with the 1D data stream. The cipher text is obtained by summing the weighted encryption key. The decryption process is realized by a correlation measurement between the encrypted information and the encryption key. Due to the intrinsic high-level randomness of the key, the security of this method is greatly enhanced. The feasibility of this method and its robustness against both occlusion and additive noise attacks are discussed with simulations.
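
    The weighted-sum encryption and correlation decryption described above can be sketched as a toy model. The data values, pattern size, and the variance normalization below are illustrative assumptions, not the authors' parameters:

```python
import random

random.seed(42)

N, P = 4, 5000                 # data length, pixels per pattern
data = [3.0, 1.0, 4.0, 2.0]    # hypothetical 1D data stream

# One independent 2D random binary pattern (flattened here) per data sample.
keys = [[random.randint(0, 1) for _ in range(P)] for _ in range(N)]

# Encryption: the cipher text is the key patterns weighted by the data, summed.
cipher = [sum(data[i] * keys[i][p] for i in range(N)) for p in range(P)]

def correlate(c, k):
    """Decryption: covariance between the cipher text and one key pattern."""
    mc = sum(c) / len(c)
    mk = sum(k) / len(k)
    cov = sum((ci - mc) * (ki - mk) for ci, ki in zip(c, k)) / len(c)
    return cov / 0.25          # variance of a fair binary pattern is 1/4

recovered = [correlate(cipher, k) for k in keys]
```

Because each pattern is statistically independent of the others, correlating the summed cipher text against one key isolates that key's weight, i.e. one data sample, up to noise that shrinks as the pattern size grows.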

  14. Pseudo-random bit generator based on lag time series

    NASA Astrophysics Data System (ADS)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map, using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we use a delay in the generation of the time series. When these new series are plotted as xn against xn+1, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
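
    A minimal sketch of a lag-based PRBG in this spirit follows. The single positive bifurcation parameter, the comparison rule for bit extraction, and the seeds are illustrative assumptions; the paper's construction also uses negative parameter values:

```python
def logistic(x, r=3.99):
    # Logistic map in its chaotic regime.
    return r * x * (1.0 - x)

def prbg(n_bits, x0=0.3, y0=0.61, lag=5, burn_in=200):
    """Toy lag PRBG: run two logistic orbits and publish only every
    `lag`-th comparison, so consecutive outputs are not x_n vs x_{n+1}."""
    x, y = x0, y0
    for _ in range(burn_in):           # discard transients
        x, y = logistic(x), logistic(y)
    bits = []
    while len(bits) < n_bits:
        for _ in range(lag):           # the delay hides the underlying map
            x, y = logistic(x), logistic(y)
        bits.append(1 if x > y else 0)
    return bits

stream = prbg(1000)
```

Skipping `lag` iterates between outputs means a plot of successive published values no longer traces the map's parabola, which is the hiding effect the abstract describes.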

  15. Mental ability and psychological work performance in Chinese workers.

    PubMed

    Zhong, Fei; Yano, Eiji; Lan, Yajia; Wang, Mianzhen; Wang, Zhiming; Wang, Xiaorong

    2006-10-01

    This study aimed to explore the relationship among mental ability, occupational stress, and psychological work performance in Chinese workers, and to identify relevant modifiers of mental ability and psychological work performance. Psychological Stress Intensity (PSI), psychological work performance, and mental ability (Mental Function Index, MFI) were determined among 485 Chinese workers (aged 33 to 62 yr, 65% men) in varied occupations. The Occupational Stress Questionnaire (OSQ) and three mental ability tests (immediate memory, digit span, and cipher decoding) were used. The relationship between mental ability and psychological work performance was analyzed with a multiple linear regression approach. PSI, MFI, and psychological work performance differed significantly among work type and educational level groups (p<0.01). Multiple linear regression analysis showed that MFI was significantly related to gender, age, educational level, and work type. Higher MFI and lower PSI predicted better psychological work performance, even after adjusting for gender, age, educational level, and work type. The study suggests that occupational stress and low mental ability are important predictors of poor psychological work performance, modified by both gender and educational level.

  16. Home and Clinical Cardiovascular Care Center (H4C): a Framework for Integrating Body Sensor Networks and QTRU Cryptography System.

    PubMed

    Zakerolhosseini, Ali; Sokouti, Massoud; Pezeshkian, Massoud

    2013-01-01

    A quick response to heart attack patients before they arrive at the hospital is a very important factor. In this paper, a combined model of a Body Sensor Network and Personal Digital Access using the QTRU cipher algorithm in WiFi networks is presented to efficiently overcome these life-threatening attacks. An algorithm for optimizing the routing paths between sensor nodes and an algorithm for reducing power consumption are also applied to achieve the best performance of this model. The system consumes low power, performs encryption and decryption, and provides efficient, fast routing paths.

  17. Trusted Storage: Putting Security and Data Together

    NASA Astrophysics Data System (ADS)

    Willett, Michael; Anderson, Dave

    State and Federal breach notification legislation mandates that the affected parties be notified in case of a breach of sensitive personal data, unless the data was provably encrypted. Self-encrypting hard drives provide the superior solution for encrypting data-at-rest when compared to software-based solutions. Self-encrypting hard drives, from the laptop to the data center, have been standardized across the hard drive industry by the Trusted Computing Group. Advantages include: simplified management (including keys), no performance impact, quick data erasure and drive re-purposing, no interference with end-to-end data integrity metrics, always encrypting, no cipher-text exposure, and scalability in large data centers.

  18. Design, Assembly, and Characterization of TALE-Based Transcriptional Activators and Repressors.

    PubMed

    Thakore, Pratiksha I; Gersbach, Charles A

    2016-01-01

    Transcription activator-like effectors (TALEs) are modular DNA-binding proteins that can be fused to a variety of effector domains to regulate the epigenome. Nucleotide recognition by TALE monomers follows a simple cipher, making this a powerful and versatile method for activating or repressing gene expression. Described here are methods to design, assemble, and test TALE transcription factors (TALE-TFs) for control of endogenous gene expression. In this protocol, TALE arrays are constructed by Golden Gate cloning and tested for activity by transfection and quantitative RT-PCR. These methods for engineering TALE-TFs are useful for studies in reverse genetics and genomics, synthetic biology, and gene therapy.
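
    The "simple cipher" mentioned above is the repeat-variable diresidue (RVD) code. A minimal lookup sketch, assuming the four commonly cited RVDs (NI→A, HD→C, NG→T, NN→G; NN in fact also tolerates A):

```python
# Commonly cited one-RVD-per-base cipher (simplified assumption).
RVD_CODE = {"NI": "A", "HD": "C", "NG": "T", "NN": "G"}
BASE_TO_RVD = {base: rvd for rvd, base in RVD_CODE.items()}

def design_rvds(target):
    """Return the repeat-variable diresidues for a target DNA sequence."""
    return [BASE_TO_RVD[b] for b in target.upper()]

def read_target(rvds):
    """Inverse: predict the sequence bound by a TALE repeat array."""
    return "".join(RVD_CODE[r] for r in rvds)

rvds = design_rvds("GATTC")   # -> ['NN', 'NI', 'NG', 'NG', 'HD']
```

The one-repeat-per-nucleotide correspondence is what makes TALE arrays straightforward to design against arbitrary targets.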

  19. Home and Clinical Cardiovascular Care Center (H4C): a Framework for Integrating Body Sensor Networks and QTRU Cryptography System

    PubMed Central

    Zakerolhosseini, Ali; Sokouti, Massoud; Pezeshkian, Massoud

    2013-01-01

    A quick response to heart attack patients before they arrive at the hospital is a very important factor. In this paper, a combined model of a Body Sensor Network and Personal Digital Access using the QTRU cipher algorithm in WiFi networks is presented to efficiently overcome these life-threatening attacks. An algorithm for optimizing the routing paths between sensor nodes and an algorithm for reducing power consumption are also applied to achieve the best performance of this model. The system consumes low power, performs encryption and decryption, and provides efficient, fast routing paths. PMID:24252988

  20. Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.

    PubMed

    Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M

    2011-10-01

    Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.

  1. Hybrid cryptosystem for image file using elgamal and double playfair cipher algorithm

    NASA Astrophysics Data System (ADS)

    Hardi, S. M.; Tarigan, J. T.; Safrina, N.

    2018-03-01

    In this paper, we present an implementation of image file encryption using hybrid cryptography. We chose the ElGamal algorithm to perform asymmetric encryption and Double Playfair for the symmetric encryption. Our objective is to show that these algorithms are capable of encrypting an image file with an acceptable running time and encrypted file size while maintaining the level of security. The application was built using the C# programming language and runs as a stand-alone desktop application under the Windows operating system. Our test shows that the system is capable of encrypting an image with a resolution of 500×500 to a size of 976 kilobytes with an acceptable running time.

  2. Information hiding based on double random-phase encoding and public-key cryptography.

    PubMed

    Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li

    2009-03-02

    A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to make up for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the cipher text and the two phase masks, which is not possible under the DRPE technique alone. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience to efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.

  3. Design, Assembly, and Characterization of TALE-Based Transcriptional Activators and Repressors

    PubMed Central

    Thakore, Pratiksha I.; Gersbach, Charles A.

    2016-01-01

    Transcription activator-like effectors (TALEs) are modular DNA-binding proteins that can be fused to a variety of effector domains to regulate the epigenome. Nucleotide recognition by TALE monomers follows a simple cipher, making this a powerful and versatile method to activate or repress gene expression. Described here are methods to design, assemble, and test TALE transcription factors (TALE-TFs) for control of endogenous gene expression. In this protocol, TALE arrays are constructed by Golden Gate cloning and tested for activity by transfection and quantitative RT-PCR. These methods for engineering TALE-TFs are useful for studies in reverse genetics and genomics, synthetic biology, and gene therapy. PMID:26443215

  4. [THE TECHNOLOGY "CELL BLOCK" IN CYTOLOGICAL PRACTICE].

    PubMed

    Volchenko, N N; Borisova, O V; Baranova, I B

    2015-08-01

    The article presents summary information concerning the application of "cell block" technology in cytological practice. The possibilities of implementing various modern techniques (immunocytochemical analysis, FISH, CISH, polymerase chain reaction) with the "cell block" method are demonstrated. Original results of a study of "cell block" technology performed with gelatin, AgarCyto and the Shandon Cytoblock set are presented. The diagnostic effectiveness of "cell block" technology was compared with the conventional cytological smear, and immunocytochemical analysis on "cell block" samples was compared with liquid-based cytology. In practice, application of "cell block" technology is necessary to ensure preservation of cell elements for subsequent immunocytochemical and molecular genetic analysis.

  5. Image Encryption Algorithm Based on Hyperchaotic Maps and Nucleotide Sequences Database

    PubMed Central

    2017-01-01

    Image encryption technology is one of the main means to ensure the safety of image information. Using the characteristics of chaos, such as randomness, regularity, ergodicity, and sensitivity to initial values, combined with the unique space conformation of DNA molecules and their unique information storage and processing ability, an efficient method for image encryption based on chaos theory and a DNA sequence database is proposed. In this paper, digital image encryption transforms the image pixel gray values by using a chaotic sequence to scramble pixel locations and by establishing a hyperchaotic mapping between quaternary sequences and DNA sequences, combined with the logic of transformations between DNA sequences. The bases are replaced under displacement rules by using DNA coding in a certain number of iterations based on the enhanced quaternary hyperchaotic sequence, which is generated by Chen's chaotic system. The cipher feedback mode and chaos iteration are employed in the encryption process to enhance the confusion and diffusion properties of the algorithm. Theoretical analysis and experimental results show that the proposed scheme not only demonstrates excellent encryption performance but also effectively resists chosen-plaintext, statistical, and differential attacks. PMID:28392799
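
    One ingredient of the above, DNA coding of pixel values plus a chaos-driven base substitution, can be sketched as follows. The coding rule, the logistic map standing in for the Chen system, and the complement-on-bit substitution are illustrative assumptions, not the paper's exact construction:

```python
# One of the eight standard DNA coding rules (assumed): A=00, C=01, G=10, T=11.
ENC = {0: "A", 1: "C", 2: "G", 3: "T"}
DEC = {b: v for v, b in ENC.items()}
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def byte_to_dna(byte):
    # Four bases encode one 8-bit pixel, two bits per base.
    return "".join(ENC[(byte >> s) & 3] for s in (6, 4, 2, 0))

def dna_to_byte(dna):
    v = 0
    for b in dna:
        v = (v << 2) | DEC[b]
    return v

def chaotic_bits(n, x=0.6, r=3.99):
    # Logistic map used here as a stand-in for the Chen hyperchaotic sequence.
    bits = []
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

def encrypt(pixels):
    """Complement each base when the chaotic bit is 1 (toy diffusion step)."""
    dna = "".join(byte_to_dna(p) for p in pixels)
    key = chaotic_bits(len(dna))
    return "".join(COMPLEMENT[b] if k else b for b, k in zip(dna, key))

def decrypt(dna_cipher):
    # Complementation is self-inverse, so the same keystream decrypts.
    key = chaotic_bits(len(dna_cipher))
    dna = "".join(COMPLEMENT[b] if k else b for b, k in zip(dna_cipher, key))
    return [dna_to_byte(dna[i:i + 4]) for i in range(0, len(dna), 4)]

pixels = [17, 255, 0, 128]
cipher = encrypt(pixels)
```

A real scheme would add the pixel-position scrambling and cipher-feedback stages the abstract describes; this shows only the coding/substitution layer.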

  6. A novel algorithm for thermal image encryption.

    PubMed

    Hussain, Iqtadar; Anees, Amir; Algarni, Abdulmohsen

    2018-04-16

    Thermal images play a vital role at nuclear plants, power stations, forensic labs, in biological research, and in petroleum product extraction, so the safety of thermal images is very important. Image data has unique features such as intensity, contrast, homogeneity, entropy and correlation among pixels, which is why image encryption is trickier compared to other encryption tasks; conventional image encryption schemes normally find these features hard to handle. Therefore, cryptographers have paid attention to attractive properties of chaotic maps, such as randomness and sensitivity, to build novel cryptosystems, and recently proposed image encryption techniques increasingly depend on the application of chaotic maps. This paper proposes an image encryption algorithm based on the Chebyshev chaotic map and substitution boxes derived from the S8 symmetric group of permutations. First, the parameters of the chaotic Chebyshev map are chosen as a secret key to confuse the original image. Then, the plaintext image is encrypted by the method generated from the substitution boxes and the Chebyshev map. By this process, we obtain a cipher-text image that is thoroughly confused and diffused. The outcomes of well-known experiments, key sensitivity tests and statistical analysis confirm that the proposed algorithm offers a safe and efficient approach for real-time image encryption.
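
    A keystream built from the Chebyshev map and XORed with pixel bytes can be sketched as below. This omits the S8 substitution boxes and uses assumed key parameters (x0, k), so it illustrates only the chaotic-map ingredient, not the full algorithm:

```python
import math

def chebyshev_keystream(n, x0=0.31, k=4.0):
    """Toy keystream from the Chebyshev map x_{n+1} = cos(k * arccos(x_n)).
    The initial value x0 in (-1, 1) and the degree k act as the secret key
    (an assumption for this sketch)."""
    x, out = x0, []
    for _ in range(n):
        x = math.cos(k * math.acos(x))
        out.append(int((x + 1.0) * 127.5) & 0xFF)  # map [-1, 1] to a byte
    return out

def xor_cipher(data, x0=0.31, k=4.0):
    # XOR with the keystream; the same call decrypts because XOR is an involution.
    ks = chebyshev_keystream(len(data), x0, k)
    return bytes(d ^ s for d, s in zip(data, ks))

plain = b"thermal image row"
cipher = xor_cipher(plain)
```

Key sensitivity comes from the map's chaotic dependence on x0 and k: a tiny change in either produces an unrelated keystream.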

  7. Development of surface enhanced Raman scattering (SERS) spectroscopy monitoring of fuel markers to prevent fraud

    NASA Astrophysics Data System (ADS)

    Wilkinson, Timothy; Clarkson, John; White, Peter C.; Meakin, Nicholas; McDonald, Ken

    2013-05-01

    Governments often tax fuel products to generate revenues to support and stimulate their economies. They also subsidize the cost of essential fuel products. Fuel taxation and subsidization practices are both subject to fraud. Oil marketing companies also suffer from fuel fraud with loss of legitimate sales and additional quality and liability issues. The use of an advanced marking system to identify and control fraud has been shown to be effective in controlling illegal activity. DeCipher has developed surface enhanced Raman scattering (SERS) spectroscopy as its lead technology for measuring markers in fuel to identify and control malpractice. SERS has many advantages that make it highly suitable for this purpose. The SERS instruments are portable and can be used to monitor fuel at any point in the supply chain. SERS shows high specificity for the marker, with no false positives. Multiple markers can also be detected in a single SERS analysis allowing, for example, specific regional monitoring of fuel. The SERS analysis from fuel is also quick, clear and decisive, with a measurement time of less than 5 minutes. We will present results highlighting our development of the use of a highly stable silver colloid as a SERS substrate to measure the markers at ppb levels. Preliminary results from the use of a solid state SERS substrate to measure fuel markers will also be presented.

  8. Cryptosystem based on two-step phase-shifting interferometry and the RSA public-key encryption algorithm

    NASA Astrophysics Data System (ADS)

    Meng, X. F.; Peng, X.; Cai, L. Z.; Li, A. M.; Gao, Z.; Wang, Y. R.

    2009-08-01

    A hybrid cryptosystem is proposed, in which one image is encrypted to two interferograms with the aid of double random-phase encoding (DRPE) and two-step phase-shifting interferometry (2-PSI), then three pairs of public-private keys are utilized to encode and decode the session keys (geometrical parameters, the second random-phase mask) and interferograms. In the stage of decryption, the ciphered image can be decrypted by wavefront reconstruction, inverse Fresnel diffraction, and real amplitude normalization. This approach can successfully solve the problem of key management and dispatch, resulting in increased security strength. The feasibility of the proposed cryptosystem and its robustness against some types of attack are verified and analyzed by computer simulations.

  9. Quantum Color Image Encryption Algorithm Based on A Hyper-Chaotic System and Quantum Fourier Transform

    NASA Astrophysics Data System (ADS)

    Tan, Ru-Chao; Lei, Tong; Zhao, Qing-Min; Gong, Li-Hua; Zhou, Zhi-Hong

    2016-12-01

    To improve the slow processing speed of the classical image encryption algorithms and enhance the security of the private color images, a new quantum color image encryption algorithm based on a hyper-chaotic system is proposed, in which the sequences generated by the Chen's hyper-chaotic system are scrambled and diffused with three components of the original color image. Sequentially, the quantum Fourier transform is exploited to fulfill the encryption. Numerical simulations show that the presented quantum color image encryption algorithm possesses large key space to resist illegal attacks, sensitive dependence on initial keys, uniform distribution of gray values for the encrypted image and weak correlation between two adjacent pixels in the cipher-image.

  10. Design and Test of Pseudorandom Number Generator Using a Star Network of Lorenz Oscillators

    NASA Astrophysics Data System (ADS)

    Cho, Kenichiro; Miyano, Takaya

    We have recently developed a chaos-based stream cipher based on augmented Lorenz equations as a star network of Lorenz subsystems. In our method, the augmented Lorenz equations are used as a pseudorandom number generator. In this study, we propose a new method based on the augmented Lorenz equations for generating binary pseudorandom numbers and evaluate its security using the statistical tests of SP800-22 published by the National Institute for Standards and Technology in comparison with the performances of other chaotic dynamical models used as binary pseudorandom number generators. We further propose a faster version of the proposed method and evaluate its security using the statistical tests of TestU01 published by L’Ecuyer and Simard.

  11. Exponential H ∞ Synchronization of Chaotic Cryptosystems Using an Improved Genetic Algorithm

    PubMed Central

    Hsiao, Feng-Hsiag

    2015-01-01

    This paper presents a systematic design methodology for neural-network- (NN-) based secure communications in multiple time-delay chaotic (MTDC) systems with optimal H ∞ performance and cryptography. On the basis of the Improved Genetic Algorithm (IGA), which is demonstrated to have better performance than that of a traditional GA, a model-based fuzzy controller is then synthesized to stabilize the MTDC systems. A fuzzy controller is synthesized to not only realize the exponential synchronization, but also achieve optimal H ∞ performance by minimizing the disturbance attenuation level. Furthermore, the error of the recovered message is stated by using the n-shift cipher and key. Finally, a numerical example with simulations is given to demonstrate the effectiveness of our approach. PMID:26366432

  12. An Analysis of Research on Block Scheduling

    ERIC Educational Resources Information Center

    Zepeda, Sally J.; Mayers, R. Stewart

    2006-01-01

    In this analysis of 58 empirical studies of high school block scheduling, the authors report findings in and across five groupings. Within groups, data were inconsistent regarding whether teachers' practices changed, but teachers believed that staff development was necessary to teach in a block schedule. Block scheduling appeared to increase…

  13. [Transcription activator-like effectors(TALEs)based genome engineering].

    PubMed

    Zhao, Mei-Wei; Duan, Cheng-Li; Liu, Jiang

    2013-10-01

    Systematic reverse-engineering of functional genome architecture requires precise modifications of gene sequences and transcription levels. The development and application of transcription activator-like effectors(TALEs) has created a wealth of genome engineering possibilities. TALEs are a class of naturally occurring DNA-binding proteins found in the plant pathogen Xanthomonas species. The DNA-binding domain of each TALE typically consists of tandem 34-amino acid repeat modules rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Such "genome engineering" has now been established in human cells and a number of model organisms, thus opening the door to better understanding gene function in model organisms, improving traits in crop plants and treating human genetic disorders.

  14. Privacy authentication using key attribute-based encryption in mobile cloud computing

    NASA Astrophysics Data System (ADS)

    Mohan Kumar, M.; Vijayan, R.

    2017-11-01

    Mobile cloud computing is becoming more popular nowadays as the number of smartphone users increases, so the security level of cloud computing has to be increased. Privacy authentication using key-attribute-based encryption helps users in business development, where data are shared with the organization through the cloud in a secured manner. In privacy authentication, the sender of the data has permission to add the receivers to whom data access is granted; for all others, access is denied. In the sender application, the user chooses the file to be sent to the receivers, and that data is encrypted using key-attribute-based encryption with the AES algorithm. The resulting cipher text is stored in the Amazon cloud along with the key value and the receiver list.

  15. An algorithm for encryption of secret images into meaningful images

    NASA Astrophysics Data System (ADS)

    Kanso, A.; Ghebleh, M.

    2017-03-01

    Image encryption algorithms typically transform a plain image into a noise-like cipher image, whose appearance is an indication of encrypted content. Bao and Zhou [Image encryption: Generating visually meaningful encrypted images, Information Sciences 324, 2015] propose encrypting the plain image into a visually meaningful cover image. This improves security by masking existence of encrypted content. Following their approach, we propose a lossless visually meaningful image encryption scheme which improves Bao and Zhou's algorithm by making the encrypted content, i.e. distortions to the cover image, more difficult to detect. Empirical results are presented to show high quality of the resulting images and high security of the proposed algorithm. Competence of the proposed scheme is further demonstrated by means of comparison with Bao and Zhou's scheme.
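
    To illustrate the general idea of hiding encrypted content inside an innocuous cover image, here is a toy least-significant-bit (LSB) sketch. This is neither Bao and Zhou's wavelet-based embedding nor the authors' improvement; it only shows why distortions to a cover can carry cipher data while remaining visually meaningful:

```python
def embed(cover, cipher_bits):
    """Hide one cipher bit in the LSB of each 8-bit cover pixel
    (toy illustration; real schemes embed in transform coefficients)."""
    assert len(cipher_bits) <= len(cover)
    out = list(cover)
    for i, bit in enumerate(cipher_bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return out

def extract(stego, n_bits):
    # Recover the hidden bits from the LSBs.
    return [p & 1 for p in stego[:n_bits]]

cover = [52, 55, 61, 66, 70, 61, 64, 73]   # hypothetical 8-bit pixel values
bits = [1, 0, 1, 1, 0, 0, 1, 0]            # cipher bits to hide
stego = embed(cover, bits)
```

Each pixel changes by at most 1 gray level, so the stego image looks like the cover; detectability of such distortions is exactly what the proposed scheme aims to reduce further.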

  16. Encryption and watermark-treated medical image against hacking disease-An immune convention in spatial and frequency domains.

    PubMed

    Lakshmi, C; Thenmozhi, K; Rayappan, John Bosco Balaguru; Amirtharajan, Rengarajan

    2018-06-01

    Digital Imaging and Communications in Medicine (DICOM) is one among the significant formats used worldwide for the representation of medical images. Undoubtedly, medical-image security plays a crucial role in telemedicine applications. Merging encryption and watermarking in medical-image protection paves the way for enhancing the authentication and safer transmission over open channels. In this context, the present work on DICOM image encryption has employed a fuzzy chaotic map for encryption and the Discrete Wavelet Transform (DWT) for watermarking. The proposed approach overcomes the limitation of the Arnold transform-one of the most utilised confusion mechanisms in image ciphering. Various metrics have substantiated the effectiveness of the proposed medical-image encryption algorithm. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Protect sensitive data with lightweight memory encryption

    NASA Astrophysics Data System (ADS)

    Zhou, Hongwei; Yuan, Jinhui; Xiao, Rui; Zhang, Kai; Sun, Jingyao

    2018-04-01

    Since current commercial processors are not able to operate on data in cipher text, sensitive data have to be exposed in memory. This leaves a window for the adversary. To protect sensitive data, a direct idea is to encrypt the data whenever the processor is not accessing them. Based on this observation, we have developed a lightweight memory encryption scheme, called LeMe, to protect the sensitive data in an application. LeMe marks the sensitive data in memory with page table entries and encrypts the data in their free time. LeMe is built on Linux with a 3.17.6 kernel and provides four user interfaces as a dynamic link library. Our evaluations show that LeMe effectively protects sensitive data and incurs an acceptable performance overhead.

  18. Multi-focus image fusion and robust encryption algorithm based on compressive sensing

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Wang, Lan; Xiang, Tao; Wang, Yong

    2017-06-01

    Multi-focus image fusion schemes have been studied in recent years. However, little work has been done in multi-focus image transmission security. This paper proposes a scheme that can reduce data transmission volume and resist various attacks. First, multi-focus image fusion based on wavelet decomposition can generate complete scene images and optimize the perception of the human eye. The fused images are sparsely represented with DCT and sampled with structurally random matrix (SRM), which reduces the data volume and realizes the initial encryption. Then the obtained measurements are further encrypted to resist noise and crop attack through combining permutation and diffusion stages. At the receiver, the cipher images can be jointly decrypted and reconstructed. Simulation results demonstrate the security and robustness of the proposed scheme.

  19. Encoding plaintext by Fourier transform hologram in double random phase encoding using fingerprint keys

    NASA Astrophysics Data System (ADS)

    Takeda, Masafumi; Nakano, Kazuya; Suzuki, Hiroyuki; Yamaguchi, Masahiro

    2012-09-01

    It has been shown that biometric information can be used as a cipher key for binary data encryption by applying double random phase encoding. In such methods, binary data are encoded in a bit pattern image, and the decrypted image becomes a plain image when the key is genuine; otherwise, decrypted images become random images. In some cases, images decrypted by imposters may not be fully random, such that the blurred bit pattern can be partially observed. In this paper, we propose a novel bit coding method based on a Fourier transform hologram, which makes images decrypted by imposters more random. Computer experiments confirm that the method increases the randomness of images decrypted by imposters while keeping the false rejection rate as low as in the conventional method.

  20. DNA based random key generation and management for OTP encryption.

    PubMed

    Zhang, Yunpeng; Liu, Xin; Sun, Manhui

    2017-09-01

    One-time pad (OTP) is a principle of key generation applied to stream ciphering methods which offers total privacy. The OTP encryption scheme has proved to be unbreakable in theory, but difficult to realize in practical applications. Because OTP encryption specifically requires absolute randomness of the key, its development has suffered from dense constraints. DNA cryptography is a new and promising technology in the field of information security. The storage capabilities of DNA chromosomes can be used as one-time-pad structures, with pseudo-random number generation and indexing, to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, using restriction enzymes known only to the sender and receiver to combine the secure key, represented as a DNA sequence, with the T vector, we generate a DNA bio-hidden secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio-experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
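
    The OTP principle itself is easy to sketch. The DNA representation below (2 bits per base) is purely illustrative and stands in for the molecular-level key handling the paper actually performs:

```python
import secrets

BASES = "ACGT"   # two key bits per nucleotide (illustrative encoding)

def generate_key(n_bytes):
    """A fresh, never-reused random key; this absolute-randomness
    requirement is what makes OTP hard to deploy in practice."""
    return secrets.token_bytes(n_bytes)

def otp(data, key):
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(d ^ k for d, k in zip(data, key))

def key_to_dna(key):
    """Represent the key as a DNA sequence, two bits per base."""
    return "".join(BASES[(b >> s) & 3] for b in key for s in (6, 4, 2, 0))

msg = b"patient record"
key = generate_key(len(msg))
cipher = otp(msg, key)
```

With a truly random key as long as the message and used only once, the cipher text is information-theoretically independent of the plaintext, which is the "total privacy" the abstract refers to.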

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Dinesh; Thapliyal, Himanshu; Mohammad, Azhar

    Differential Power Analysis (DPA) attack is considered to be a main threat while designing cryptographic processors. In cryptographic algorithms like DES and AES, the S-Box is used to obscure the relationship between the keys and the cipher texts. However, the S-Box is prone to DPA attack due to its high power consumption. In this paper, we implement an energy-efficient 8-bit S-Box circuit using our proposed Symmetric Pass Gate Adiabatic Logic (SPGAL). SPGAL is energy-efficient compared to the existing DPA-resistant adiabatic and non-adiabatic logic families, owing to the reduction of non-adiabatic loss during the evaluate phase of the outputs. Further, the S-Box circuit implemented using SPGAL is resistant to DPA attacks. The results are verified through SPICE simulations in 180 nm technology. SPICE simulations show that the SPGAL-based S-Box circuit saves up to 92% and 67% of energy as compared to the conventional CMOS and Secured Quasi-Adiabatic Logic (SQAL) based S-Box circuits, respectively. From the simulation results, it is evident that SPGAL-based circuits are energy-efficient as compared to the existing DPA-resistant adiabatic and non-adiabatic logic families. In a nutshell, SPGAL-based gates can be used to build secure hardware for low-power portable electronic devices and Internet-of-Things (IoT) based electronic devices.

  2. Evaluation of transversus abdominis plane block for renal transplant recipients - A meta-analysis and trial sequential analysis of published studies.

    PubMed

    Singh, Preet Mohinder; Borle, Anuradha; Makkar, Jeetinder Kaur; Trisha, Aanjan; Sinha, Aashish

    2018-01-01

    Patients undergoing renal transplant (RT) have altered drug/opioid pharmacokinetics. Transversus abdominis plane (TAP) block in renal transplant recipients has recently been evaluated for analgesic and opioid-sparing potential by many trials. Studies comparing TAP block to conventional analgesic regimens for RT were searched. Comparisons were made for total opioids consumed (as morphine-equivalents) during the first postoperative 24 h (primary objective), the intraoperative period, and the immediate-postoperative period. Pain scores and postoperative nausea-vomiting (PONV) were also evaluated. Trial sequential analysis (TSA) was used to quantify the strength of the analysis. Ten trials with 258 and 237 patients in the control and TAP-block groups, respectively, were included. TAP block decreased the 24-h opioid consumption (reported in 9 trials) by 14.61 ± 4.34 mg (a reduction of 42.7%; random-effects, P < 0.001, I² = 97.82%). The sample size of the present analysis (472) was well past the required information size (396) per TSA for a power of 85%. Intraoperative opioid consumption also decreased, by 2.06 ± 0.63 mg (a reduction of 27.8%; random-effects, P < 0.001, I² = 98.84%). Pain scores with TAP block were significantly lower in both the early and the delayed postoperative phase. The odds ratio for PONV without TAP block was 1.99 ± 1.05 (fixed-effects, P = 0.04, I² = 0%). Publication bias was likely (Egger's test, X-intercept = 7.89, P < 0.05). TAP block significantly lowers the intraoperative and cumulative postoperative 24-h opioid consumption in RT recipients. Persistent and better pain control is achieved when TAP block is used. The benefits of TAP block extend beyond analgesia alone, as it also decreases the 24-h incidence of postoperative nausea and vomiting. The technique of the block needs standardization for RT recipients.
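
    The random-effects pooling this review relies on can be sketched with a standard DerSimonian-Laird implementation; the per-study mean differences and standard errors below are hypothetical placeholders, not the review's data.

```python
import math

def dersimonian_laird(effects, ses):
    # Inverse-variance fixed-effect step.
    w = [1 / se ** 2 for se in ses]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the DerSimonian-Laird between-study variance tau^2.
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Random-effects pooling: tau^2 is added to each study's variance.
    w_star = [1 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # heterogeneity
    return pooled, se_pooled, i2

# Hypothetical mean differences in 24-h morphine use (mg) and their SEs.
effects = [-12.0, -20.5, -9.8, -16.3, -14.1]
ses = [2.1, 3.4, 1.8, 2.9, 2.5]
pooled, se, i2 = dersimonian_laird(effects, ses)
ci95 = (pooled - 1.96 * se, pooled + 1.96 * se)
```

    A high I², as in this review, signals that the pooled estimate averages over substantial between-study heterogeneity.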

  3. A Design Selection Procedure.

    ERIC Educational Resources Information Center

    Kroeker, Leonard P.

    The problem of blocking on a status variable was investigated. The one-way fixed-effects analysis of variance, analysis of covariance, and generalized randomized block designs each treat the blocking problem in a different way. In order to compare these designs, it is necessary to restrict attention to experimental situations in which observations…

  4. A Study on the Saving Method of Plate Jigs in Hull Block Butt Welding

    NASA Astrophysics Data System (ADS)

    Ko, Dae-Eun

    2017-11-01

    A large number of plate jigs is used for alignment of the welding line and control of welding deformations in the hull block assembly stage. Besides material cost, the many man-hours required to work with plate jigs are one of the obstacles to productivity growth in a shipyard. In this study, an analysis method was proposed to simulate the welding deformations of a block butt joint with plate jigs in place. Using the proposed analysis method, an example simulation was performed for an actual panel block joint to investigate ways of saving plate jigs. Results show that it is possible to achieve the two objectives of hull block accuracy and reduced plate jig usage at the same time by deploying the plate jigs in the right places. The proposed analysis method can be used to establish guidelines for the proper use of plate jigs in the block assembly stage.

  5. Can You Party Your Way to Better Health? A Propensity Score Analysis of Block Parties and Health

    PubMed Central

    Dean, Lorraine T.; Hillier, Amy; Chau-Glendinning, Hang; Subramanian, SV; Williams, David R.; Kawachi, Ichiro

    2015-01-01

    While other indicators of social capital have been linked to health, the role of block parties on health in Black neighborhoods and on Black residents is understudied. Block parties exhibit several features of bonding social capital and are present in nearly 90% of Philadelphia’s predominantly Black neighborhoods. This analysis investigated: (1) whether or not block parties are an indicator of bonding social capital in Black neighborhoods; (2) the degree to which block parties might be related to self-rated health in the ways that other bonding social indicators are related to health; and (3) whether or not block parties are associated with average self-rated health for Black residents particularly. Using census tract-level indicators of bonding social capital and records of block parties from 2003 to 2008 for 381 Philadelphia neighborhoods (defined by census tracts), an ecological-level propensity score was generated to assess the propensity for a block party, adjusting for population demographics, neighborhood characteristics, neighborhood resources and violent crime. Results indicate that in multivariable regression, block parties were associated with increased bonding social capital in Black neighborhoods; however, the calculation of the average effect of the treatment on the treated (ATT) within each propensity score strata showed no effect of block parties on average self-rated health for Black residents. Block parties may be an indicator of bonding social capital in Philadelphia’s predominantly Black neighborhoods, but this analysis did not show a direct association between block parties and self-rated health for Black residents. Further research should consider what other health outcomes or behaviors block parties may be related to and how interventionists can leverage block parties for health promotion. PMID:26117555

  6. Metabolic control analysis of developing oilseed rape (Brassica napus cv Westar) embryos shows that lipid assembly exerts significant control over oil accumulation.

    PubMed

    Tang, Mingguo; Guschina, Irina A; O'Hara, Paul; Slabas, Antoni R; Quant, Patti A; Fawcett, Tony; Harwood, John L

    2012-10-01

    Metabolic control analysis allows the study of metabolic regulation. We applied both single- and double-manipulation top-down control analysis to examine the control of lipid accumulation in developing oilseed rape (Brassica napus) embryos. The biosynthetic pathway was conceptually divided into two blocks of reactions, fatty acid biosynthesis (Block A) and lipid assembly (Block B), connected by a single system intermediate, the acyl-coenzyme A (acyl-CoA) pool. Single manipulation used exogenous oleate. Triclosan was used to specifically inhibit Block A, whereas diazepam selectively manipulated flux through Block B. Exogenous oleate inhibited the radiolabelling of fatty acids from [1-(14)C]acetate, but stimulated that from [U-(14)C]glycerol into acyl lipids. The calculation of group flux control coefficients showed that c. 70% of the metabolic control lay in the lipid assembly block of reactions. Monte Carlo simulations estimated the group flux control coefficients, with errors, as 0.27±0.06 for Block A and 0.73±0.06 for Block B. The two methods of control analysis gave very similar results and showed that the Block B reactions were more important under our conditions. This contrasts notably with data from oil palm or olive fruit cultures and is important for efforts to increase oilseed rape lipid yields. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.
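
    In top-down (supply-demand) control analysis with a single linking intermediate, the group flux control coefficients follow directly from the two blocks' elasticities toward that intermediate. The sketch below uses illustrative elasticity values (chosen so the split comes out near the paper's 0.27/0.73; the actual measured elasticities are not given here) and checks the summation theorem.

```python
def group_flux_control(eps_supply: float, eps_demand: float):
    # Standard top-down result for one linking metabolite: the flux control
    # of each block depends on both blocks' elasticities toward it.
    denom = eps_demand - eps_supply
    c_supply = eps_demand / denom
    c_demand = -eps_supply / denom
    return c_supply, c_demand

# Illustrative values: supply (fatty acid synthesis, Block A) is inhibited
# by its product (negative elasticity), demand (lipid assembly, Block B)
# responds positively to the acyl-CoA pool.
c_a, c_b = group_flux_control(eps_supply=-0.8, eps_demand=0.3)
# Summation theorem: the group flux control coefficients sum to 1.
assert abs(c_a + c_b - 1.0) < 1e-12
```

    With these illustrative elasticities, c_a ≈ 0.27 and c_b ≈ 0.73: a demand block that is relatively insensitive to the intermediate holds most of the flux control.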

  7. Using cluster analysis to organize and explore regional GPS velocities

    USGS Publications Warehouse

    Simpson, Robert W.; Thatcher, Wayne; Savage, James C.

    2012-01-01

    Cluster analysis offers a simple visual exploratory tool for the initial investigation of regional Global Positioning System (GPS) velocity observations, which are providing increasingly precise mappings of actively deforming continental lithosphere. The deformation fields from dense regional GPS networks can often be concisely described in terms of relatively coherent blocks bounded by active faults, although the choice of blocks, their number and size, can be subjective and is often guided by the distribution of known faults. To illustrate our method, we apply cluster analysis to GPS velocities from the San Francisco Bay Region, California, to search for spatially coherent patterns of deformation, including evidence of block-like behavior. The clustering process identifies four robust groupings of velocities that we identify with four crustal blocks. Although the analysis uses no prior geologic information other than the GPS velocities, the cluster/block boundaries track three major faults, both locked and creeping.
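
    The clustering step can be illustrated with a plain k-means sketch on synthetic two-component velocity vectors; k-means is an assumption for illustration here, not necessarily the authors' exact algorithm.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    # Plain k-means: assign each velocity vector to its nearest centroid,
    # then move each centroid to the mean of its members.
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [(p[0] - cx) ** 2 + (p[1] - cy) ** 2 for cx, cy in centroids]
            clusters[d.index(min(d))].append(p)
        centroids = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Synthetic GPS velocities (mm/yr east, north): two coherent "blocks" of
# stations moving together, plus observational scatter.
rng = random.Random(42)
block1 = [(20 + rng.gauss(0, 1), 5 + rng.gauss(0, 1)) for _ in range(30)]
block2 = [(-3 + rng.gauss(0, 1), 12 + rng.gauss(0, 1)) for _ in range(30)]
centroids, clusters = kmeans(block1 + block2, k=2)
```

    Each recovered centroid is the mean velocity of one "block"; in the real analysis, membership patterns across stations are what suggest block boundaries.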

  8. Bone remodeling in onlay beta-tricalcium phosphate and coral grafts to rat calvaria: microcomputerized tomography analysis.

    PubMed

    Anavi, Yakir; Avishai, Gal; Calderon, Shlomo; Allon, Dror M

    2011-08-01

    This study was conducted to establish the efficiency of microcomputerized tomography (micro-CT) in detecting trabecular bone remodeling of onlay grafts in a rodent calvaria model, and to compare bone remodeling after onlay grafting with beta-tricalcium phosphate (TCP) or coral calcium carbonate. Ten rats received calvarial onlay blocks: 5 with TCP and 5 with coral calcium carbonate. The grafts were fixed with a titanium miniplate screw and covered with a resorbable collagen membrane. Three months after surgery, the calvaria were segmented, and a serial 3-dimensional micro-CT scan of the calvarium and grafted bone block was performed at 16-micrometer resolution. Image analysis software was used to calculate the percentage of newly formed bone relative to the total block size. Newly formed bone was present adjacent to the calvarium and screw in all specimens. The mean area of newly formed bone as a fraction of the total block size ranged from 34.67%-38.34% in the TCP blocks and from 32.41%-34.72% in the coral blocks; bone remodeling was slightly higher in the TCP blocks than in the coral blocks. Micro-CT appears to be a precise, reproducible, and specimen-nondestructive method for analyzing bone formation in onlay block grafts to rat calvaria.

  9. Design and analysis of lifting tool assemblies to lift different engine block

    NASA Astrophysics Data System (ADS)

    Sawant, Arpana; Deshmukh, Nilaj N.; Chauhan, Santosh; Dabhadkar, Mandar; Deore, Rupali

    2017-07-01

    Engine blocks are required to be lifted from one place to another while they are being processed. The human effort required for this purpose is considerable, and the engine block may be damaged if it is not handled properly. There is thus a need to design a proper lifting tool that can conveniently lift an engine block and place it at the desired position without accident or damage to the block. In the present study, lifting tool assemblies are designed and analyzed in such a way that they may lift different categories of engine blocks. The lifting tool assembly consists of a lifting plate, lifting ring, cap screws, and washers. A parametric model and assembly of the lifting tool are created in the 3D modelling software CREO 2.0, and analysis is carried out in ANSYS Workbench 16.0. A test block of weight equivalent to that of an engine block is considered for the analysis. In a preliminary study without washers, the stresses on the lifting tool exceeded the safety margin. In the present design, washers of appropriate dimensions are used, which brings the stresses on the lifting tool within the safety margin. Analysis is carried out to verify that the tool design meets the safety margin required by ASME BTH-1.

  10. Regularized Generalized Canonical Correlation Analysis

    ERIC Educational Resources Information Center

    Tenenhaus, Arthur; Tenenhaus, Michel

    2011-01-01

    Regularized generalized canonical correlation analysis (RGCCA) is a generalization of regularized canonical correlation analysis to three or more sets of variables. It constitutes a general framework for many multi-block data analysis methods. It combines the power of multi-block data analysis methods (maximization of well identified criteria) and…

  11. Block-Parallel Data Analysis with DIY2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Peterka, Tom

    DIY2 is a programming model and runtime for block-parallel analytics on distributed-memory machines. Its main abstraction is block-structured data parallelism: data are decomposed into blocks; blocks are assigned to processing elements (processes or threads); computation is described as iterations over these blocks, and communication between blocks is defined by reusable patterns. By expressing computation in this general form, the DIY2 runtime is free to optimize the movement of blocks between slow and fast memories (disk and flash vs. DRAM) and to concurrently execute blocks residing in memory with multiple threads. This enables the same program to execute in-core, out-of-core, serial, parallel, single-threaded, multithreaded, or combinations thereof. This paper describes the implementation of the main features of the DIY2 programming model and optimizations to improve performance. DIY2 is evaluated on benchmark test cases to establish baseline performance for several common patterns and on larger complete analysis codes running on large-scale HPC machines.
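
    DIY2 itself is a C++ library; the Python sketch below only illustrates the block-structured model the abstract describes (decompose data into blocks, express computation as iteration over blocks, and communication as a reusable neighbour-exchange pattern). All names and details here are invented for illustration, not the DIY2 API.

```python
# A reusable nearest-neighbour exchange pattern plus a local computation,
# iterated over the blocks of a decomposed 1-D domain.
def decompose(data, nblocks):
    size = len(data) // nblocks
    return [data[i * size:(i + 1) * size] for i in range(nblocks)]

def exchange(blocks):
    # Each block receives the boundary value of its left/right neighbour
    # (domain edges are clamped to the block's own boundary value).
    halos = []
    for i, b in enumerate(blocks):
        left = blocks[i - 1][-1] if i > 0 else b[0]
        right = blocks[i + 1][0] if i < len(blocks) - 1 else b[-1]
        halos.append((left, right))
    return halos

def step(block, halo):
    # Local computation on one block: a 3-point moving average that uses
    # the halo cells at the block boundaries.
    left, right = halo
    padded = [left] + block + [right]
    return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3
            for i in range(1, len(padded) - 1)]

blocks = decompose(list(range(16)), nblocks=4)
for _ in range(2):            # each iteration: exchange halos, then compute
    halos = exchange(blocks)
    blocks = [step(b, h) for b, h in zip(blocks, halos)]
flat = [x for b in blocks for x in b]
```

    Because all state lives in the blocks and communication goes through the exchange pattern, a runtime is free to schedule blocks across threads, processes, or out-of-core storage, which is the point of the abstraction.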

  12. Failures to replicate blocking are surprising and informative-Reply to Soto (2018).

    PubMed

    Maes, Elisa; Krypotos, Angelos-Miltiadis; Boddez, Yannick; Alfei Palloni, Joaquín Matías; D'Hooge, Rudi; De Houwer, Jan; Beckers, Tom

    2018-04-01

    The blocking effect has inspired numerous associative learning theories and is widely cited in the literature. We recently reported a series of 15 experiments that failed to obtain a blocking effect in rodents. On the basis of those consistent failures, we claimed that there is a lack of insight into the boundary conditions for blocking. In his commentary, Soto (2018) argued that contemporary associative learning theory does provide a specific boundary condition for the occurrence of blocking, namely the use of same- versus different-modality stimuli. Given that in 10 of our 15 experiments same-modality stimuli were used, he claims that our failure to observe a blocking effect is unsurprising. We disagree with that claim, because of theoretical, empirical, and statistical problems with his analysis. We also address 2 other possible reasons for a lack of blocking that are referred to in Soto's (2018) analysis, related to generalization and salience, and dissect the potential importance of both. Although Soto's (2018) analyses raise a number of interesting points, we see more merit in an empirically guided analysis and call for empirical testing of boundary conditions on blocking. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. An Image Encryption Algorithm Utilizing Julia Sets and Hilbert Curves

    PubMed Central

    Sun, Yuanyuan; Chen, Lina; Xu, Rudan; Kong, Ruiqing

    2014-01-01

    Image encryption is an important and effective technique to protect image security. In this paper, a novel image encryption algorithm combining Julia sets and Hilbert curves is proposed. The algorithm utilizes the parameters of Julia sets to generate a random sequence as the initial keys and obtains the final encryption keys by scrambling the initial keys through the Hilbert curve. The final cipher image is obtained by modulo arithmetic and a diffusion operation. The method needs only a few parameters for key generation, which greatly reduces the storage space. Moreover, because of the Julia sets' properties, such as infiniteness and chaotic characteristics, the keys are highly sensitive even to a tiny perturbation. The experimental results indicate that the algorithm has a large key space, good statistical properties, high key sensitivity, and effective resistance to the chosen-plaintext attack. PMID:24404181
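
    A much-simplified sketch of the keystream idea follows; it omits the Hilbert-curve scrambling and does not use the authors' actual parameterisation. It iterates the Julia map z → z² + c to derive key bytes, then applies modulo-arithmetic diffusion so that each cipher byte depends on all earlier pixels.

```python
def julia_keystream(c: complex, z0: complex, n: int):
    # Iterate z -> z^2 + c and derive one key byte per step from the
    # fractional part of |z|. Chaotic sensitivity to (c, z0) is what gives
    # the scheme its key sensitivity.
    z, stream = z0, []
    for _ in range(n):
        z = z * z + c
        if abs(z) > 2:            # re-inject escaped orbits to stay bounded
            z = z / abs(z) ** 2
        frac = abs(z) * 1000 % 1
        stream.append(int(frac * 256) % 256)
    return stream

def encrypt(pixels, key):
    # Modulo-arithmetic diffusion: add the keystream byte and the previous
    # cipher byte, so each output depends on all earlier pixels.
    out, prev = [], 0
    for p, k in zip(pixels, key):
        prev = (p + k + prev) % 256
        out.append(prev)
    return out

def decrypt(cipher, key):
    out, prev = [], 0
    for c_byte, k in zip(cipher, key):
        out.append((c_byte - k - prev) % 256)
        prev = c_byte
    return out

pixels = [17, 250, 3, 99, 180, 42, 8, 64]      # a toy "image" row
key = julia_keystream(c=-0.8 + 0.156j, z0=0.3 + 0.5j, n=len(pixels))
cipher = encrypt(pixels, key)
assert decrypt(cipher, key) == pixels
```

    In the paper's full scheme, the Hilbert-curve traversal additionally scrambles the key order before this diffusion step.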

  14. Suprascapular and Interscalene Nerve Block for Shoulder Surgery: A Systematic Review and Meta-analysis.

    PubMed

    Hussain, Nasir; Goldar, Ghazaleh; Ragina, Neli; Banfield, Laura; Laffey, John G; Abdallah, Faraj W

    2017-12-01

    Interscalene block provides optimal shoulder surgery analgesia, but concerns over its associated risks have prompted the search for alternatives. Suprascapular block was recently proposed as an interscalene block alternative, but evidence of its comparative analgesic effect is conflicting. This meta-analysis compares the analgesic effect and safety of suprascapular block versus interscalene block for shoulder surgery. Databases were searched for randomized trials comparing interscalene block with suprascapular block for shoulder surgery. Postoperative 24-h cumulative oral morphine consumption and the difference in the area under the curve for pooled rest pain scores were designated as primary outcomes. Analgesic and safety outcomes, particularly block-related and respiratory complications, were evaluated as secondary outcomes. Results were pooled using random-effects modeling. Data from 16 studies (1,152 patients) were analyzed. Interscalene block and suprascapular block did not differ in 24-h morphine consumption. The difference in the area under the curve of pain scores for the 24-h interval favored interscalene block by 1.1 cm/h, but this difference was not clinically important. Compared with suprascapular block, interscalene block reduced postoperative pain, but not opioid consumption, during recovery room stay by a weighted mean difference (95% CI) of 1.5 cm (0.6 to 2.5 cm; P < 0.0001). Pain scores were not different at any other time. In contrast, suprascapular block reduced the odds of block-related and respiratory complications. This review suggests that there are no clinically meaningful analgesic differences between suprascapular block and interscalene block except for interscalene block providing better pain control during recovery room stay; however, suprascapular block has fewer side effects. These findings suggest that suprascapular block may be considered an effective and safe interscalene block alternative for shoulder surgery.

  15. An outline of graphical Markov models in dentistry.

    PubMed

    Helfenstein, U; Steiner, M; Menghini, G

    1999-12-01

    In the usual multiple regression model there is one response variable and one block of several explanatory variables. In contrast, in reality there may be a block of several possibly interacting response variables one would like to explain. In addition, the explanatory variables may split into a sequence of several blocks, each block containing several interacting variables. The variables in the second block are explained by those in the first block; the variables in the third block by those in the first and second blocks, and so on. During recent years, methods have been developed that allow analysis of problems where the data set has the above complex structure. The models involved are called graphical models or graphical Markov models. The main result of an analysis is a picture, a conditional independence graph with precise statistical meaning, consisting of circles representing variables and lines or arrows representing significant conditional associations. The absence of a line between two circles signifies that the corresponding two variables are independent conditional on the other variables in the model. An example from epidemiology is presented in order to demonstrate the application and use of the models. The data set in the example has a complex structure consisting of successive blocks: the variable in the first block is year of investigation; the variables in the second block are age and gender; the variables in the third block are indices of calculus, gingivitis and mutans streptococci; and the final response variables in the fourth block are different indices of caries. Since the statistical methods may not be easily accessible to dentists, this article presents them in an introductory form. Graphical models may be of great value to dentists in allowing analysis and visualisation of complex structured multivariate data sets consisting of a sequence of blocks of interacting variables and, in particular, several possibly interacting responses in the final block.

  16. Effect of tramadol as an adjuvant to local anesthetics for brachial plexus block: A systematic review and meta-analysis

    PubMed Central

    Ju, Bum Jun; Jang, Yoo Kyung; You, Hae Seun; Kang, Hyun; Park, Ji Yong

    2017-01-01

    Background: Tramadol, a 4-phenyl-piperidine analog of codeine, has a unique action in that it has a central opioidergic, noradrenergic, and serotonergic analgesic effect as well as a peripheral local anesthetic (LA) effect. Many studies have reported contradictory findings regarding the peripheral analgesic effect of tramadol as an adjuvant to LA in brachial plexus block (BPB). This meta-analysis aimed to evaluate the effects of tramadol as an adjunct to LA in BPB during shoulder or upper extremity surgery. Methods: We searched the PubMed, EMBASE, Cochrane, and KoreaMed databases and Google Scholar for eligible randomized controlled trials (RCTs) that compared BPB with LA alone and BPB with LA and tramadol. Primary outcomes were the effects of tramadol as an adjuvant on the duration of sensory block, motor block, and analgesia. Secondary outcomes were the effects of tramadol as an adjuvant on the time to onset of sensory block and motor block and on adverse effects. We performed the meta-analysis using Review Manager 5.3 software. Results: We identified 16 RCTs with 751 patients. BPB with tramadol prolonged the duration of sensory block (mean difference [MD], -61.5 min; 95% CI, -95.5 to -27.6; P = 0.0004), motor block (MD, -65.6 min; 95% CI, -101.5 to -29.7; P = 0.0003), and analgesia (MD, -125.5 min; 95% CI, -175.8 to -75.3; P < 0.0001) compared with BPB without tramadol. Tramadol also shortened the time to onset of sensory block (MD, 2.1 min; 95% CI, 1.1 to 3.1; P < 0.0001) and motor block (MD, 1.2 min; 95% CI, 0.2 to 2.1; P = 0.010). In subgroup analysis, the duration of sensory block, motor block, and analgesia was prolonged for BPB with tramadol 100 mg (P < 0.05) but not for BPB with tramadol 50 mg. The quality of evidence was high for duration of analgesia according to the GRADE system. Adverse effects were comparable between the studies. Conclusions: In upper extremity surgery performed under BPB, use of tramadol 100 mg as an adjuvant to LA appears to prolong the duration of sensory block, motor block, and analgesia, and to shorten the time to onset of sensory and motor blocks without altering adverse effects. PMID:28953949

  17. Effect of tramadol as an adjuvant to local anesthetics for brachial plexus block: A systematic review and meta-analysis.

    PubMed

    Shin, Hye Won; Ju, Bum Jun; Jang, Yoo Kyung; You, Hae Seun; Kang, Hyun; Park, Ji Yong

    2017-01-01

    Tramadol, a 4-phenyl-piperidine analog of codeine, has a unique action in that it has a central opioidergic, noradrenergic, and serotonergic analgesic effect as well as a peripheral local anesthetic (LA) effect. Many studies have reported contradictory findings regarding the peripheral analgesic effect of tramadol as an adjuvant to LA in brachial plexus block (BPB). This meta-analysis aimed to evaluate the effects of tramadol as an adjunct to LA in BPB during shoulder or upper extremity surgery. We searched the PubMed, EMBASE, Cochrane, and KoreaMed databases and Google Scholar for eligible randomized controlled trials (RCTs) that compared BPB with LA alone and BPB with LA and tramadol. Primary outcomes were the effects of tramadol as an adjuvant on the duration of sensory block, motor block, and analgesia. Secondary outcomes were the effects of tramadol as an adjuvant on the time to onset of sensory block and motor block and on adverse effects. We performed the meta-analysis using Review Manager 5.3 software. We identified 16 RCTs with 751 patients. BPB with tramadol prolonged the duration of sensory block (mean difference [MD], -61.5 min; 95% CI, -95.5 to -27.6; P = 0.0004), motor block (MD, -65.6 min; 95% CI, -101.5 to -29.7; P = 0.0003), and analgesia (MD, -125.5 min; 95% CI, -175.8 to -75.3; P < 0.0001) compared with BPB without tramadol. Tramadol also shortened the time to onset of sensory block (MD, 2.1 min; 95% CI, 1.1 to 3.1; P < 0.0001) and motor block (MD, 1.2 min; 95% CI, 0.2 to 2.1; P = 0.010). In subgroup analysis, the duration of sensory block, motor block, and analgesia was prolonged for BPB with tramadol 100 mg (P < 0.05) but not for BPB with tramadol 50 mg. The quality of evidence was high for duration of analgesia according to the GRADE system. Adverse effects were comparable between the studies. In upper extremity surgery performed under BPB, use of tramadol 100 mg as an adjuvant to LA appears to prolong the duration of sensory block, motor block, and analgesia, and to shorten the time to onset of sensory and motor blocks without altering adverse effects.

  18. A Novel Method for Block Size Forensics Based on Morphological Operations

    NASA Astrophysics Data System (ADS)

    Luo, Weiqi; Huang, Jiwu; Qiu, Guoping

    Passive forensic analysis aims to find out how multimedia data were acquired and processed without relying on pre-embedded or pre-registered information. Since most existing compression schemes for digital images are based on block processing, one of the fundamental steps for subsequent forensic analysis is to detect the presence of block artifacts and estimate the block size for a given image. In this paper, we propose a novel method for blind block size estimation. A 2×2 cross-differential filter is first applied to detect all possible block artifact boundaries, morphological operations are then used to remove the boundary effects caused by the edges of the actual image contents, and finally maximum-likelihood estimation (MLE) is employed to estimate the block size. Experimental results on over 1300 natural images show the effectiveness of the proposed method: compared with an existing gradient-based detection method, it achieves over 39% accuracy improvement on average.
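
    On a clean synthetic image, the core of the scheme can be sketched as follows: a 2×2 cross-differential filter, whose response concentrates at block corners, followed by a periodicity test on the resulting column profile. The morphological cleanup and MLE step of the actual method are omitted, and the selection rule here is a simplification that works only on artifact-free synthetic data.

```python
import random

def make_blocky_image(h, w, bsize, seed=0):
    # Synthetic stand-in for a heavily block-compressed image:
    # constant-valued bsize x bsize tiles.
    rng = random.Random(seed)
    tiles = [[rng.randrange(256) for _ in range(w // bsize + 1)]
             for _ in range(h // bsize + 1)]
    return [[tiles[y // bsize][x // bsize] for x in range(w)] for y in range(h)]

def cross_difference(img):
    # 2x2 cross-differential filter: responds only where horizontal and
    # vertical intensity steps meet, i.e. at block corners.
    h, w = len(img), len(img[0])
    return [[abs(img[y][x] - img[y][x + 1] - img[y + 1][x] + img[y + 1][x + 1])
             for x in range(w - 1)] for y in range(h - 1)]

def estimate_block_size(img, candidates=range(2, 17)):
    d = cross_difference(img)
    profile = [sum(row[x] for row in d) for x in range(len(d[0]))]
    best = None
    for b in candidates:
        on = [profile[x] for x in range(len(profile)) if x % b == b - 1]
        off = [profile[x] for x in range(len(profile)) if x % b != b - 1]
        # Largest period whose off-grid positions carry (near) zero energy.
        if sum(on) > 0 and sum(off) / len(off) < 0.01 * sum(on) / len(on):
            best = b
    return best

img = make_blocky_image(64, 64, bsize=8)
estimated = estimate_block_size(img)
```

    Real images put energy everywhere in the profile, which is why the paper needs morphological filtering of content edges and an MLE-based decision instead of this clean-profile shortcut.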

  19. Picture of All Solutions of Successive 2-Block Maxbet Problems

    ERIC Educational Resources Information Center

    Choulakian, Vartan

    2011-01-01

    The Maxbet method is a generalized principal components analysis of a data set, where the group structure of the variables is taken into account. Similarly, 3-block[12,13] partial Maxdiff method is a generalization of covariance analysis, where only the covariances between blocks (1, 2) and (1, 3) are taken into account. The aim of this paper is…

  20. Transversus abdominis plane (TAP) block in laparoscopic colorectal surgery improves postoperative pain management: a meta-analysis.

    PubMed

    Hain, E; Maggiori, L; Prost À la Denise, J; Panis, Y

    2018-04-01

    Transversus abdominis plane (TAP) block is a locoregional anaesthesia technique of growing interest in abdominal surgery. However, its efficacy following laparoscopic colorectal surgery is still debated. This meta-analysis aimed to assess the efficacy of TAP block after laparoscopic colorectal surgery. All comparative studies focusing on TAP block after laparoscopic colorectal surgery have been systematically identified through the MEDLINE database, reviewed and included. Meta-analysis was performed according to the Mantel-Haenszel method for random effects. End-points included postoperative opioid consumption, morbidity, time to first bowel movement and length of hospital stay. A total of 13 studies, including 7 randomized controlled trials, were included, comprising a total of 600 patients who underwent laparoscopic colorectal surgery with TAP block, compared with 762 patients without TAP block. Meta-analysis of these studies showed that TAP block was associated with a significantly reduced postoperative opioid consumption on the first day after surgery [weighted mean difference (WMD) -14.54 (-25.14; -3.94); P = 0.007] and a significantly shorter time to first bowel movement [WMD -0.53 (-0.61; -0.44); P < 0.001] but failed to show any impact on length of hospital stay [WMD -0.32 (-0.83; 0.20); P = 0.23] although no study considered length of stay as its primary outcome. Finally, TAP block was not associated with a significant increase in the postoperative overall complication rate [OR = 0.84 (0.62-1.14); P = 0.27]. Transversus abdominis plane (TAP) block in laparoscopic colorectal surgery improves postoperative opioid consumption and recovery of postoperative digestive function without any significant drawback. Colorectal Disease © 2018 The Association of Coloproctology of Great Britain and Ireland.

  1. Performance analysis of a generalized upset detection procedure

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Masson, Gerald M.

    1987-01-01

    A general procedure for upset detection in complex systems, called the data block capture and analysis upset monitoring process is described and analyzed. The process consists of repeatedly recording a fixed amount of data from a set of predetermined observation lines of the system being monitored (i.e., capturing a block of data), and then analyzing the captured block in an attempt to determine whether the system is functioning correctly. The algorithm which analyzes the data blocks can be characterized in terms of the amount of time it requires to examine a given length data block to ascertain the existence of features/conditions that have been predetermined to characterize the upset-free behavior of the system. The performance of linear, quadratic, and logarithmic data analysis algorithms is rigorously characterized in terms of three performance measures: (1) the probability of correctly detecting an upset; (2) the expected number of false alarms; and (3) the expected latency in detecting upsets.

  2. An Analysis of Instruction-Cached SIMD Computer Architecture

    DTIC Science & Technology

    1993-12-01

    [Abstract unavailable; the record text is OCR residue of the report's table of contents, including: "Use Cache to Place Blocks"; "Step 4: Schedule Cache Blocks"; "Step 5: Store Cache Blocks"; "Scheduler"; "Basic Block Definition".]

  3. Statistical Analysis of the Links between Blocking and Nor'easters

    NASA Astrophysics Data System (ADS)

    Booth, J. F.; Pfahl, S.

    2015-12-01

    Nor'easters can be loosely defined as extratropical cyclones that develop as they progress northward along the eastern coast of North America. This path makes it possible for these storms to generate storm surge along the coastline and/or heavy precipitation or snow inland. In the present analysis, the paths of the storms are investigated relative to the behavior of upstream blocking events over the North Atlantic Ocean. Two separate Lagrangian tracking methods are used to identify the extratropical cyclone paths and the blocking events. Using the cyclone paths, Nor'easters are identified and blocking statistics are calculated for the days prior to, during, and following the occurrence of the Nor'easters. The path, strength, and intensification rates of the cyclones are compared with the strength and location of the blocks. When a Nor'easter occurs, the likelihood that a block is present at the southeast tip of Greenland is statistically significantly increased; that is, a block coincides with a Nor'easter more often than would be expected by random chance. However, no significant link between the strength of the storms and the strength of the block is identified. These results suggest that the presence of the block mainly affects the path of the Nor'easters. On the other hand, in the event of blocking at the southeast tip of Greenland, the likelihood of a Nor'easter, as opposed to a different type of storm, is no greater than what one might expect from randomly sampling cyclone tracks. The results confirm a long-held understanding in forecast meteorology that upstream blocking is a necessary but not sufficient condition for generating a Nor'easter.

  4. Tracing regulatory routes in metabolism using generalised supply-demand analysis.

    PubMed

    Christensen, Carl D; Hofmeyr, Jan-Hendrik S; Rohwer, Johann M

    2015-12-03

    Generalised supply-demand analysis is a conceptual framework that views metabolism as a molecular economy. Metabolic pathways are partitioned into so-called supply and demand blocks that produce and consume a particular intermediate metabolite. By studying the response of these reaction blocks to perturbations in the concentration of the linking metabolite, different regulatory routes of interaction between the metabolite and its supply and demand blocks can be identified and their contribution quantified. These responses are mediated not only through direct substrate/product interactions, but also through allosteric effects. Here we subject previously published kinetic models of pyruvate metabolism in Lactococcus lactis and aspartate-derived amino acid synthesis in Arabidopsis thaliana to generalised supply-demand analysis. Multiple routes of regulation are brought about by different mechanisms in each model, leading to behavioural and regulatory patterns that are generally difficult to predict from simple inspection of the reaction networks depicting the models. In the pyruvate model the moiety-conserved cycles of ATP/ADP and NADH/NAD(+) allow otherwise independent metabolic branches to communicate. This causes the flux of one ATP-producing reaction block to increase in response to an increasing ATP/ADP ratio, while an NADH-consuming block flux decreases in response to an increasing NADH/NAD(+) ratio for certain ratio value ranges. In the aspartate model, aspartate semialdehyde can inhibit its supply block directly or by increasing the concentration of two amino acids (Lys and Thr) that occur as intermediates in demand blocks and act as allosteric inhibitors of isoenzymes in the supply block. These different routes of interaction from aspartate semialdehyde are each seen to contribute differently to the regulation of the aspartate semialdehyde supply block. 
Indirect routes of regulation between a metabolic intermediate and a reaction block that either produces or consumes this intermediate can play a much larger regulatory role than routes mediated through direct interactions. These indirect routes of regulation can also result in counter-intuitive metabolic behaviour. Performing generalised supply-demand analysis on two previously published models demonstrated the utility of this method as an entry point in the analysis of metabolic behaviour and the potential for obtaining novel results from previously analysed models by using new approaches.
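Generalised supply-demand analysis rests on comparing how supply and demand block rates respond to perturbations in the linking metabolite. A minimal stdlib sketch with hypothetical toy kinetics (a product-inhibited supply block and a Michaelis-Menten demand block) standing in for the published L. lactis and A. thaliana models:

```python
import math

def v_supply(p, Vs=10.0, Ki=1.0, h=4):
    # supply-block rate, allosterically inhibited by its product P
    return Vs / (1.0 + (p / Ki) ** h)

def v_demand(p, Vd=8.0, Km=5.0):
    # demand-block rate, Michaelis-Menten in its substrate P
    return Vd * p / (Km + p)

def log_log_slope(f, p, dlnp=1e-4):
    """Numerical elasticity d ln v / d ln p: the local response of a
    reaction block's rate to a perturbation in the linking metabolite."""
    hi, lo = p * math.exp(dlnp), p * math.exp(-dlnp)
    return (math.log(f(hi)) - math.log(f(lo))) / (2 * dlnp)
```

Around p = 1 the supply elasticity is about -2 (feedback inhibition) and the demand elasticity about +0.83; scanning p and plotting both rate characteristics on log-log axes yields the supply-demand diagram the framework is built on.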

  5. Liposomal bupivacaine versus interscalene nerve block for pain control after shoulder arthroplasty: A meta-analysis.

    PubMed

    Yan, Zeng; Chen, Zong; Ma, Chuangen

    2017-07-01

    Postoperative pain control after total shoulder arthroplasty (TSA) can be challenging. Liposomal bupivacaine and interscalene nerve block are 2 common pain control protocols for TSA patients. However, whether liposomal bupivacaine is superior to interscalene nerve block was unknown. This meta-analysis aimed to illustrate the efficacy of liposomal bupivacaine versus interscalene nerve block for pain control in patients undergoing TSA. In May 2017, a systematic computer-based search was conducted in PubMed, EMBASE, Web of Science, the Cochrane Database of Systematic Reviews, and the Google database. Data on patients prepared for TSA in studies that compared liposomal bupivacaine versus interscalene nerve block were retrieved. The endpoints were the visual analogue scale (VAS) score at 4 hours, 8 hours, 12 hours, 24 hours, and 2 weeks; total morphine consumption at 24 hours; and the length of hospital stay. Stata 12.0 software was used for pooling the final outcomes. Five clinical studies with 573 patients (liposomal bupivacaine group = 239, interscalene nerve block group = 334) were ultimately included in the meta-analysis. There was no significant difference in the VAS score at 4 hours, 8 hours, and 2 weeks between the liposomal bupivacaine group and the interscalene nerve block group (P > .05). Compared with the interscalene nerve block group, liposomal bupivacaine was associated with a reduction of the VAS score at 12 hours and 24 hours by approximately 3.31 points and 6.42 points, respectively, on a 100-point VAS. Furthermore, liposomal bupivacaine was associated with a significant reduction of the length of hospital stay, by approximately 0.16 days, compared with the interscalene nerve block group. The current meta-analysis indicates that, compared with interscalene nerve block, liposomal bupivacaine offers at least comparable effectiveness in reducing both pain scores and the length of hospital stay. 
However, studies with more patients and better-designed methods are needed to establish the optimal regimen and the safety of liposomal bupivacaine in TSA patients.
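The pooling step described above (the authors used Stata 12.0) is, in the standard fixed-effect case, inverse-variance weighting of per-study mean differences. A minimal stdlib sketch of that generic method, not the authors' exact analysis:

```python
import math

def pool_fixed_effect(estimates):
    """Fixed-effect (inverse-variance) pooling. Each entry is a tuple
    (mean_difference, standard_error) for one study; returns the pooled
    mean difference, its standard error, and a 95% confidence interval."""
    weights = [1.0 / se ** 2 for _, se in estimates]       # w_i = 1 / SE_i^2
    pooled = sum(w * md for (md, _), w in zip(estimates, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, se_pooled, ci
```

With per-study VAS mean differences and standard errors extracted from the trials, this reproduces the kind of pooled point estimate and interval reported above; a random-effects model would additionally inflate the weights by a between-study variance term.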

  6. Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains

    NASA Astrophysics Data System (ADS)

    Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao

    2017-11-01

    We present a novel optical image encryption method using quick response (QR) codes and multilevel fingerprint keys in gyrator transform (GT) domains. In this method, an original image is first transformed into a QR code, which is placed in the input plane of cascaded GTs. Subsequently, the QR code is encrypted into the cipher-text by using multilevel fingerprint keys. The original image can be recovered easily by reading the high-quality retrieved QR code with hand-held devices. The main parameters used as private keys are the GTs' rotation angles and the multilevel fingerprints. Biometrics and cryptography are integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. The method of applying QR codes and fingerprints in GT domains possesses much potential for information security.

  7. Coupling Functions Enable Secure Communications

    NASA Astrophysics Data System (ADS)

    Stankovski, Tomislav; McClintock, Peter V. E.; Stefanovska, Aneta

    2014-01-01

    Secure encryption is an essential feature of modern communications, but rapid progress in illicit decryption brings a continuing need for new schemes that are harder and harder to break. Inspired by the time-varying nature of the cardiorespiratory interaction, here we introduce a new class of secure communications that is highly resistant to conventional attacks. Unlike all earlier encryption procedures, this cipher makes use of the coupling functions between interacting dynamical systems. It results in an unbounded number of encryption key possibilities, allows the transmission or reception of more than one signal simultaneously, and is robust against external noise. Thus, the information signals are encrypted as the time variations of linearly independent coupling functions. Using predetermined forms of coupling function, we apply Bayesian inference on the receiver side to detect and separate the information signals while simultaneously eliminating the effect of external noise. The scheme is highly modular and is readily extendable to support different communications applications within the same general framework.

  8. Simultaneous transmission for an encrypted image and a double random-phase encryption key

    NASA Astrophysics Data System (ADS)

    Yuan, Sheng; Zhou, Xin; Li, Da-Hai; Zhou, Ding-Fu

    2007-06-01

    We propose a method to simultaneously transmit double random-phase encryption key and an encrypted image by making use of the fact that an acceptable decryption result can be obtained when only partial data of the encrypted image have been taken in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, a double random-phase encryption key is encoded as an encoded key by the Rivest-Shamir-Adelman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image that carries both the encrypted image and the encoded key is delivered to the receiver. Based on such a method, the receiver can have an acceptable result and secure transmission can be guaranteed by the RSA cipher system.
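The key-encoding step relies on standard RSA, which is easy to sketch with Python's built-in modular arithmetic. The toy below uses textbook RSA with tiny primes and a purely hypothetical stand-in for the amplitude modulation (the paper's modulation acts on an analog optical field, not a list of floats):

```python
# Textbook-RSA toy (small primes, no padding) mirroring the scheme's
# structure: RSA-encode the session key, then use the encoded key to
# modulate the amplitude of the encrypted image.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)              # private exponent (Python 3.8+ modular inverse)

def rsa_encrypt(m):
    return pow(m, e, n)          # public operation: c = m^e mod n

def rsa_decrypt(c):
    return pow(c, d, n)          # private operation: m = c^d mod n

def modulate(amplitudes, encoded_key):
    # hypothetical digital analogue of the optical amplitude modulation:
    # scale each amplitude by a factor derived from the RSA-encoded key
    factor = 1.0 + (encoded_key % 100) / 1000.0
    return [a * factor for a in amplitudes]
```

Only the RSA round-trip here is standard mathematics; the scaling rule and pixel list are illustrative. In practice the receiver first recovers the key with the RSA private exponent, then undoes the modulation before decrypting the double random-phase image.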

  9. Simultaneous transmission for an encrypted image and a double random-phase encryption key.

    PubMed

    Yuan, Sheng; Zhou, Xin; Li, Da-hai; Zhou, Ding-fu

    2007-06-20

    We propose a method to simultaneously transmit double random-phase encryption key and an encrypted image by making use of the fact that an acceptable decryption result can be obtained when only partial data of the encrypted image have been taken in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, a double random-phase encryption key is encoded as an encoded key by the Rivest-Shamir-Adelman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image that carries both the encrypted image and the encoded key is delivered to the receiver. Based on such a method, the receiver can have an acceptable result and secure transmission can be guaranteed by the RSA cipher system.

  10. The New Physics

    NASA Astrophysics Data System (ADS)

    Fraser, Gordon

    2006-04-01

    Introduction Gordon Fraser; Part I. Matter and the Universe: 1. Cosmology Wendy Freedman and Rocky Kolb; 2. Gravity Ronald Adler; 3. Astrophysics Arnon Dar; 4. Particles and the standard model Chris Quigg; 5. Superstrings Michael Green; Part II. Quantum Matter: 6. Atoms and photons Claude Cohen-Tannoudji and Jean Dalibard; 7. The quantum world of ultra-cold atoms Christopher Foot and William Phillips; 8. Superfluidity Henry Hall; 9. Quantum phase transitions Subir Sachdev; Part III. Quanta in Action: 10. Quantum entanglement Anton Zeilinger; 11. Quanta, ciphers and computers Artur Ekert; 12. Small-scale structure and nanoscience Yoseph Imry; Part IV. Calculation and Computation: 13. Nonlinearity Henry Abarbanel; 14. Complexity Antonio Politi; 15. Collaborative physics, e-science and the grid Tony Hey and Anne Trefethen; Part V. Science in Action: 16. Biophysics Cyrus Safinya; 17. Medical physics Nicolaj Pavel; 18. Physics and materials Robert Cahn; 19. Physics and society Ugo Amaldi.

  11. The New Physics

    NASA Astrophysics Data System (ADS)

    Fraser, Gordon

    2009-08-01

    Introduction Gordon Fraser; Part I. Matter and the Universe: 1. Cosmology Wendy Freedman and Rocky Kolb; 2. Gravity Ronald Adler; 3. Astrophysics Arnon Dar; 4. Particles and the standard model Chris Quigg; 5. Superstrings Michael Green; Part II. Quantum Matter: 6. Atoms and photons Claude Cohen-Tannoudji and Jean Dalibard; 7. The quantum world of ultra-cold atoms Christopher Foot and William Phillips; 8. Superfluidity Henry Hall; 9. Quantum phase transitions Subir Sachdev; Part III. Quanta in Action: 10. Quantum entanglement Anton Zeilinger; 11. Quanta, ciphers and computers Artur Ekert; 12. Small-scale structure and nanoscience Yoseph Imry; Part IV. Calculation and Computation: 13. Nonlinearity Henry Abarbanel; 14. Complexity Antonio Politi; 15. Collaborative physics, e-science and the grid Tony Hey and Anne Trefethen; Part V. Science in Action: 16. Biophysics Cyrus Safinya; 17. Medical physics Nicolaj Pavel; 18. Physics and materials Robert Cahn; 19. Physics and society Ugo Amaldi.

  12. Single Channel Quantum Color Image Encryption Algorithm Based on HSI Model and Quantum Fourier Transform

    NASA Astrophysics Data System (ADS)

    Gong, Li-Hua; He, Xiang-Tao; Tan, Ru-Chao; Zhou, Zhi-Hong

    2018-01-01

    In order to obtain high-quality color images, it is important to keep the hue component unchanged while emphasizing the intensity or saturation component. As a widely used color model, the Hue-Saturation-Intensity (HSI) model is common in image processing. A new single-channel quantum color image encryption algorithm based on the HSI model and the quantum Fourier transform (QFT) is investigated, in which the color components of the original color image are converted to HSI and the logistic map is employed to diffuse the relationships among pixels in the color components. Subsequently, the quantum Fourier transform is exploited to fulfill the encryption. The cipher-text is a combination of a gray image and a phase matrix. Simulations and theoretical analyses demonstrate that the proposed single-channel quantum color image encryption scheme based on the HSI model and the quantum Fourier transform is secure and effective.
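The logistic map is a common source of keystreams in image ciphers. The abstract does not specify how the map diffuses the pixel relationships, so the sketch below assumes a minimal XOR-keystream variant; the parameters x0 and r are illustrative, not the paper's:

```python
def logistic_sequence(x0, r, n, burn_in=100):
    """Iterate the logistic map x <- r*x*(1-x), discard a transient,
    then quantize each iterate to one keystream byte."""
    x = x0
    for _ in range(burn_in):
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

def diffuse(pixels, x0=0.3567, r=3.99):
    """Hypothetical diffusion step: XOR each 8-bit pixel with the
    chaotic keystream (x0 and r act as the secret key)."""
    ks = logistic_sequence(x0, r, len(pixels))
    return [p ^ k for p, k in zip(pixels, ks)]
```

Because XOR with a deterministic keystream is an involution, applying `diffuse` twice with the same (x0, r) recovers the plain pixels, which is what makes this usable as one component of a cipher.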

  13. How Properties of Kenaf Fibers from Burkina Faso Contribute to the Reinforcement of Earth Blocks

    PubMed Central

    Millogo, Younoussa; Aubert, Jean-Emmanuel; Hamard, Erwan; Morel, Jean-Claude

    2015-01-01

    Physicochemical characteristics of Hibiscus cannabinus (kenaf) fibers from Burkina Faso were studied using X-ray diffraction (XRD), infrared spectroscopy, thermal gravimetric analysis (TGA), chemical analysis and video microscopy. Kenaf fibers (3 cm long) were used to reinforce earth blocks, and the mechanical properties of reinforced blocks, with fiber contents ranging from 0.2 to 0.8 wt%, were investigated. The fibers were mainly composed of cellulose type I (70.4 wt%), hemicelluloses (18.9 wt%) and lignin (3 wt%) and were characterized by high tensile strength (1 ± 0.25 GPa) and Young’s modulus (136 ± 25 GPa), linked to their high cellulose content. The incorporation of short fibers of kenaf reduced the propagation of cracks in the blocks, through the good adherence of fibers to the clay matrix, and therefore improved their mechanical properties. Fiber incorporation was particularly beneficial for the bending strength of earth blocks because it reinforces these blocks after the failure of soil matrix observed for unreinforced blocks. Blocks reinforced with such fibers had a ductile tensile behavior that made them better building materials for masonry structures than unreinforced blocks.

  14. BlockLogo: visualization of peptide and sequence motif conservation

    PubMed Central

    Olsen, Lars Rønn; Kudahl, Ulrich Johan; Simon, Christian; Sun, Jing; Schönbach, Christian; Reinherz, Ellis L.; Zhang, Guang Lan; Brusic, Vladimir

    2013-01-01

    BlockLogo is a web-server application for visualization of protein and nucleotide fragments, continuous protein sequence motifs, and discontinuous sequence motifs using calculation of block entropy from multiple sequence alignments. The user input consists of a multiple sequence alignment, selection of motif positions, type of sequence, and output format definition. The output has BlockLogo along with the sequence logo, and a table of motif frequencies. We deployed BlockLogo as an online application and have demonstrated its utility through examples that show visualization of T-cell epitopes and B-cell epitopes (both continuous and discontinuous). Our additional example shows a visualization and analysis of structural motifs that determine specificity of peptide binding to HLA-DR molecules. The BlockLogo server also employs selected experimentally validated prediction algorithms to enable on-the-fly prediction of MHC binding affinity to 15 common HLA class I and class II alleles as well as visual analysis of discontinuous epitopes from multiple sequence alignments. It enables the visualization and analysis of structural and functional motifs that are usually described as regular expressions. It provides a compact view of discontinuous motifs composed of distant positions within biological sequences. BlockLogo is available at: http://research4.dfci.harvard.edu/cvc/blocklogo/ and http://methilab.bu.edu/blocklogo/ PMID:24001880
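The block-entropy calculation that drives BlockLogo can be sketched directly: join the user-selected (possibly discontinuous) motif positions of each aligned sequence into one block string and compute the Shannon entropy over those strings. A minimal stdlib version, assuming equal-length, pre-aligned sequences:

```python
import math
from collections import Counter

def column_entropy(symbols):
    """Shannon entropy (bits) over a collection of symbols."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def block_entropy(msa, positions):
    """Entropy of the (possibly discontinuous) motif formed by joining
    the selected alignment positions of each sequence into one block."""
    blocks = ["".join(seq[i] for i in positions) for seq in msa]
    return column_entropy(blocks)
```

A fully conserved block gives entropy 0; treating the joined positions as a single unit (rather than averaging per-column entropies) is what lets discontinuous motifs with correlated positions be visualized compactly.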

  15. Ion blocking dip shape analysis around a LaAlO3/SrTiO3 interface

    NASA Astrophysics Data System (ADS)

    Jalabert, D.; Zaid, H.; Berger, M. H.; Fongkaew, I.; Lambrecht, W. R. L.; Sehirlioglu, A.

    2018-05-01

    We present an analysis of the widths of the blocking dips obtained in MEIS ion blocking experiments of two LaAlO3/SrTiO3 heterostructures differing in their LaAlO3 layer thicknesses. In the LaAlO3 layers, the observed blocking dips are larger than expected. This enlargement is the result of the superposition of individual dips at slightly different angular positions revealing a local disorder in the atomic alignment, i.e., layer buckling. By contrast, in the SrTiO3 substrate, just below the interface, the obtained blocking dips are thinner than expected. This thinning indicates that the blocking atoms stand at a larger distance from the scattering center than expected. This is attributed to an accumulation of Sr vacancies at the layer/substrate interface which induces lattice distortions shifting the atoms off the scattering plane.

  16. Effects of applying nerve blocks to prevent postherpetic neuralgia in patients with acute herpes zoster: a systematic review and meta-analysis

    PubMed Central

    Kim, Hyun Jung; Ahn, Hyeong Sik; Lee, Jae Young; Choi, Seong Soo; Cheong, Yu Seon; Kwon, Koo; Yoon, Syn Hae

    2017-01-01

    Background Postherpetic neuralgia (PHN) is a common and painful complication of acute herpes zoster. In some cases, it is refractory to medical treatment, so preventing its occurrence is an important issue. We hypothesized that applying nerve blocks during the acute phase of herpes zoster could reduce PHN incidence by attenuating central sensitization and minimizing nerve damage, together with the anti-inflammatory effects of local anesthetics and steroids. Methods This systematic review and meta-analysis evaluates the efficacy of using nerve blocks to prevent PHN. We searched the MEDLINE, EMBASE, Cochrane Library, ClinicalTrials.gov and KoreaMed databases without language restrictions on April 30, 2014. We included all randomized controlled trials performed within 3 weeks after the onset of herpes zoster in order to compare nerve blocks vs active placebo and standard therapy. Results Nine trials were included in this systematic review and meta-analysis. Nerve blocks reduced the duration of herpes zoster-related pain and the incidence of PHN at 3, 6, and 12 months after the final intervention. Stellate ganglion block and single epidural injection did not achieve positive outcomes, but administering paravertebral blocks and continuous/repeated epidural blocks reduced PHN incidence at 3 months. None of the included trials reported clinically meaningful serious adverse events. Conclusions Applying nerve blocks during the acute phase of herpes zoster shortens the duration of zoster-related pain, and somatic blocks (including paravertebral and repeated/continuous epidural blocks) are recommended to prevent PHN. In future studies, consensus-based PHN definitions, clinical cutoff points that define successful treatment outcomes and standardized outcome-assessment tools will be needed. PMID:28119767

  17. Mechanics of distributed fault and block rotation

    NASA Technical Reports Server (NTRS)

    Nur, A.; Scotti, O.; Ron, H.

    1989-01-01

    Paleomagnetic data, structural geology, and rock mechanics are used to explore the validity and significance of the block rotation concept. The analysis is based on data from Northern Israel, where fault slip and spacing are used to predict block rotation; the Mojave Desert, with well-documented strike-slip sets; the Lake Mead, Nevada fault system, with well-defined sets of strike-slip faults; and the San Gabriel Mountains domain, with a multiple set of strike-slip faults. The results of the analysis indicate that block rotations can have a profound influence on the interpretation of geodetic measurements and the inversion of geodetic data. Furthermore, the block rotations and domain boundaries may be involved in creating the heterogeneities along active fault systems which may be responsible for the initiation and termination of earthquake rupture.

  18. Textual blocks rectification method based on fast Hough transform analysis in identity documents recognition

    NASA Astrophysics Data System (ADS)

    Bezmaternykh, P. V.; Nikolaev, D. P.; Arlazarov, V. L.

    2018-04-01

    Textual blocks rectification or slant correction is an important stage of document image processing in OCR systems. This paper considers existing methods and introduces an approach for the construction of such algorithms based on Fast Hough Transform analysis. A quality measurement technique is proposed and obtained results are shown for both printed and handwritten textual blocks processing as a part of an industrial system of identity documents recognition on mobile devices.
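A slant estimate of the kind used for textual block rectification can be sketched with plain Hough voting (the paper uses the Fast Hough Transform; the brute-force accumulator below trades speed for brevity, and the angle grid is an assumption):

```python
import math
from collections import Counter

def dominant_angle(points, angles_deg=range(-45, 46)):
    """Brute-force Hough voting: each foreground pixel (x, y) votes for
    the line parameters (theta, rho = x*cos(theta) + y*sin(theta)); the
    angle whose single best line collects the most votes is returned as
    the estimated slant."""
    best_votes, best_angle = -1, 0
    for a in angles_deg:
        t = math.radians(a)
        acc = Counter(round(x * math.cos(t) + y * math.sin(t))
                      for x, y in points)
        votes = max(acc.values())
        if votes > best_votes:
            best_votes, best_angle = votes, a
    return best_angle
```

Pixels sampled along strokes slanted from the vertical concentrate their votes in one (theta, rho) cell at the slant angle; shearing or rotating the block by the negative of that angle then rectifies the text before OCR.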

  19. Newer regional analgesia interventions (fascial plane blocks) for breast surgeries: Review of literature.

    PubMed

    Garg, Rakesh; Bhan, Swati; Vig, Saurabh

    2018-04-01

    Surgical resection of the primary tumour with axillary dissection is one of the main modalities of breast cancer treatment. Regional blocks have been considered as one of the modalities for effective perioperative pain control. With the advent of ultrasound, newer interventions such as fascial plane blocks have been reported for perioperative analgesia in breast surgeries. Our aim is to review the literature on fascial plane blocks for analgesia in breast surgeries. The research question for initiating the review was 'What are the reported newer regional anaesthesia techniques (fascial plane blocks) for female patients undergoing breast surgery, and what is their analgesic efficacy?' The participants, intervention, comparison, outcomes and study design (PICOS) framework was followed. Owing to the paucity of similar studies and their heterogeneity, assessment of bias, systematic review or pooled analysis/meta-analysis was not feasible. Of the 989 manuscripts, the present review included 28 manuscripts, inclusive of all types of published manuscripts. Fifteen manuscripts directly related to the administration of fascial plane blocks for breast surgery, across all types of study designs and cases, were reviewed for the utility of fascial plane blocks in breast surgeries. Interfascial blocks score over regional anaesthetic techniques such as the paravertebral block because they carry no risk of sympathetic blockade or of intrathecal or epidural spread, which may lead to haemodynamic instability and prolonged hospital stay. This review observed that no single block effectively covers the whole of the breast and axilla; thus, a combination of blocks should be used depending on the site of incision and the extent of surgical resection.

  20. Mosquito larvicidal effectiveness of EcoBio-Block S: a novel integrated water-purifying concrete block formulation containing insect growth regulator pyriproxyfen.

    PubMed

    Kawada, Hitoshi; Saita, Susumu; Shimabukuro, Kozue; Hirano, Masachika; Koga, Masayuki; Iwashita, Toshiaki; Takagi, Masahiro

    2006-09-01

    EcoBio-Block S, a novel controlled-release system (CRS) for the insect growth regulator pyriproxyfen, uses a water-purifying concrete block system (EcoBio-Block) composed of a porous volcanic rock and cement, and it incorporates the aerobic bacterial groups of Bacillus subtilis natto. EcoBio-Block S showed high inhibitory activity against mosquito emergence as well as a water-purifying effect. Chemical analysis and bioassay showed that EcoBio-Block S provides a high-performance CRS that controls the release of pyriproxyfen at low levels according to zero-order kinetics.
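Zero-order kinetics means the cumulative amount released grows linearly with time, at a rate independent of the remaining load, until the reservoir is exhausted. A one-line model of that claim (the rate constant and total load below are illustrative, not measured values from the study):

```python
def cumulative_release(k, t, m_total):
    """Zero-order release: dM/dt = k (constant), so M(t) = k*t,
    capped at the total pyriproxyfen load of the block."""
    return min(k * t, m_total)
```

The constant slope is what distinguishes a zero-order CRS from first-order (exponentially decaying) release, and is why such a block can hold larvicide concentrations at a steady low level over a long deployment.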

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The final design, performance analysis, and economic analysis of a solar hot water system for curing concrete blocks at the new Rotoclave block fabricating plant being built by the York Building Products Co. Inc. at Harrisburg, Pa. are presented. The system will use AAI Corporation's 24/1 concentrating collectors. (WHK)

  2. The epidemiology of adolescents living with perinatally acquired HIV: A cross-region global cohort analysis

    PubMed Central

    Davies, Mary-Ann; Williams, Paige; Balkan, Suna; Ben-Farhat, Jihane; Calles, Nancy; Chokephaibulkit, Kulkanya; Duff, Charlotte; Eboua, Tanoh François; Kekitiinwa-Rukyalekere, Adeodata; Maxwell, Nicola; Pinto, Jorge; Seage, George; Wanless, Sebastian; Warszawski, Josiane; Wools-Kaloustian, Kara; Collins, Intira J.; Smith, Colette; Patel, Kunjal; Paul, Mary; Abrams, Elaine J.; Hazra, Rohan; Van Dyke, Russell; Bekker, Linda-Gail; Vicari, Marissa; Essajee, Shaffiq; Penazzato, Martina; Anabwani, Gabriel; Q. Mohapi, Edith; N. Kazembe, Peter; Hlatshwayo, Makhosazana; Lumumba, Mwita; Thorne, Claire; Galli, Luisa; Giaquinto, Carlo; Marczynska, Magdalena; Marques, Laura; Prata, Filipa; Ene, Luminita; Rojo, Pablo; Fortuny, Claudia; Rudin, Christoph; Le Coeur, Sophie; Volokha, Alla; Succi, Regina; Sohn, Annette; Kariminia, Azar; Edmonds, Andrew; Lelo, Patricia; Ayaya, Samuel; Ongwen, Patricia; Jefferys, Laura F.; Phiri, Sam; Mubiana-Mbewe, Mwangelwa; Renner, Lorna; Sylla, Mariam; Abzug, Mark J.; Levin, Myron; Oleske, James; Chernoff, Miriam; Traite, Shirley; Chadwick, Ellen G.; Leroy, Valériane

    2018-01-01

    Background Globally, the population of adolescents living with perinatally acquired HIV (APHs) continues to expand. In this study, we pooled data from observational pediatric HIV cohorts and cohort networks, allowing comparisons of adolescents with perinatally acquired HIV in “real-life” settings across multiple regions. We describe the geographic and temporal characteristics and mortality outcomes of APHs across multiple regions, including South America and the Caribbean, North America, Europe, sub-Saharan Africa, and South and Southeast Asia. Methods and findings Through the Collaborative Initiative for Paediatric HIV Education and Research (CIPHER), individual retrospective longitudinal data from 12 cohort networks were pooled. All children infected with HIV who entered care before age 10 years, were not known to have horizontally acquired HIV, and were followed up beyond age 10 years were included in this analysis conducted from May 2016 to January 2017. Our primary analysis describes patient and treatment characteristics of APHs at key time points, including first HIV-associated clinic visit, antiretroviral therapy (ART) start, age 10 years, and last visit, and compares these characteristics by geographic region, country income group (CIG), and birth period. Our secondary analysis describes mortality, transfer out, and lost to follow-up (LTFU) as outcomes at age 15 years, using competing risk analysis. Among the 38,187 APHs included, 51% were female, 79% were from sub-Saharan Africa and 65% lived in low-income countries. APHs from 51 countries were included (Europe: 14 countries and 3,054 APHs; North America: 1 country and 1,032 APHs; South America and the Caribbean: 4 countries and 903 APHs; South and Southeast Asia: 7 countries and 2,902 APHs; sub-Saharan Africa, 25 countries and 30,296 APHs). Observation started as early as 1982 in Europe and 1996 in sub-Saharan Africa, and continued until at least 2014 in all regions. 
The median (interquartile range [IQR]) duration of adolescent follow-up was 3.1 (1.5–5.2) years for the total cohort and 6.4 (3.6–8.0) years in Europe, 3.7 (2.0–5.4) years in North America, 2.5 (1.2–4.4) years in South and Southeast Asia, 5.0 (2.7–7.5) years in South America and the Caribbean, and 2.1 (0.9–3.8) years in sub-Saharan Africa. Median (IQR) age at first visit differed substantially by region, ranging from 0.7 (0.3–2.1) years in North America to 7.1 (5.3–8.6) years in sub-Saharan Africa. The median age at ART start varied from 0.9 (0.4–2.6) years in North America to 7.9 (6.0–9.3) years in sub-Saharan Africa. The cumulative incidence estimates (95% confidence interval [CI]) at age 15 years for mortality, transfers out, and LTFU for all APHs were 2.6% (2.4%–2.8%), 15.6% (15.1%–16.0%), and 11.3% (10.9%–11.8%), respectively. Mortality was lowest in Europe (0.8% [0.5%–1.1%]) and highest in South America and the Caribbean (4.4% [3.1%–6.1%]). However, LTFU was lowest in South America and the Caribbean (4.8% [3.4%–6.7%]) and highest in sub-Saharan Africa (13.2% [12.6%–13.7%]). Study limitations include the high LTFU rate in sub-Saharan Africa, which could have affected the comparison of mortality across regions; inclusion of data only for APHs receiving ART from some countries; and unavailability of data from high-burden countries such as Nigeria. Conclusion To our knowledge, our study represents the largest multiregional epidemiological analysis of APHs. Despite probable under-ascertained mortality, mortality in APHs remains substantially higher in sub-Saharan Africa, South and Southeast Asia, and South America and the Caribbean than in Europe. Collaborations such as CIPHER enable us to monitor current global temporal trends in outcomes over time to inform appropriate policy responses. PMID:29494593

  3. Feasibility of blocking detection in observations from radio occultation

    NASA Astrophysics Data System (ADS)

    Brunner, Lukas; Steiner, Andrea Karin; Scherllin-Pirscher, Barbara; Jury, Martin

    2015-04-01

    Blocking describes an atmospheric situation in which the climatological westerly flow at mid-latitudes is weakened or reversed. It is caused by a persistent high-pressure system which can remain stationary for several days to weeks. In the Northern Hemisphere, blocking preferentially occurs over the Atlantic/European and Pacific regions. In recent years blocking has come under close scientific investigation due to its effect on weather extremes, triggering heat waves in summer and cold spells in winter. So far, the scientific literature has mainly focused on the investigation of blocking in reanalysis and global climate model data sets. However, blocking is underestimated in most climate models because of the small-scale processes involved in its evolution. The most commonly applied blocking-detection methods are based on the computation of meridional geopotential height gradients at the 500 hPa level, which requires measurements with adequate vertical, horizontal, and temporal resolution and coverage. We use an observational data set based on Global Positioning System (GPS) Radio Occultation (RO) measurements that fulfills these requirements. RO is a relatively new, satellite-based remote sensing technique delivering profiles of atmospheric parameters such as geopotential height, pressure, and temperature. It is characterized by favorable properties such as long-term stability, global coverage, and high vertical resolution. Our data set is based on the most recent WEGC RO retrieval. Here we report on a feasibility study for blocking detection and analysis in RO data for two exemplary blocking events: the blocking over Russia in summer 2010 and the blocking over Greenland in late winter 2013. For these two events, about 700 RO measurements per day are available in the Northern Hemisphere. We show that the measurement density and quality of RO observations are favorable for blocking analysis and can therefore contribute to blocking research.
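The meridional geopotential height gradient criterion mentioned above is usually formulated as the Tibaldi-Molteni index. The sketch below implements that standard form as a plausible stand-in; the abstract does not give the paper's exact reference latitudes or thresholds, so the values here are the commonly used ones:

```python
def is_blocked(z40, z60, z80):
    """Tibaldi-Molteni-style test at one longitude, using 500 hPa
    geopotential heights (gpm) at 40N, 60N, 80N: blocked if the
    southern gradient reverses (easterly flow) while the northern
    gradient stays strongly negative."""
    ghgs = (z60 - z40) / 20.0   # southern gradient, gpm per deg latitude
    ghgn = (z80 - z60) / 20.0   # northern gradient, gpm per deg latitude
    return ghgs > 0.0 and ghgn < -10.0
```

Applied longitude by longitude to geopotential height profiles retrieved from RO, and requiring persistence over several days, this kind of test yields the blocking statistics discussed in the study.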

  4. Sequential and simultaneous SLAR block adjustment. [spline function analysis for mapping

    NASA Technical Reports Server (NTRS)

    Leberl, F.

    1975-01-01

    Two sequential methods of planimetric SLAR (Side Looking Airborne Radar) block adjustment, with and without splines, and three simultaneous methods based on the principles of least squares are evaluated. A limited experiment with simulated SLAR images indicates that sequential block formation with splines followed by external interpolative adjustment is superior to the simultaneous methods such as planimetric block adjustment with similarity transformations. The use of the sequential block formation is recommended, since it represents an inexpensive tool for satisfactory point determination from SLAR images.

  5. A simulation for gravity fine structure recovery from low-low GRAVSAT SST data

    NASA Technical Reports Server (NTRS)

    Estes, R. H.; Lancaster, E. R.

    1976-01-01

    Covariance error analysis techniques were applied to investigate estimation strategies for the low-low SST mission for accurate local recovery of gravitational fine structure, considering the aliasing effects of unsolved for parameters. A 5 degree by 5 degree surface density block representation of the high order geopotential was utilized with the drag-free low-low GRAVSAT configuration in a circular polar orbit at 250 km altitude. Recovery of local sets of density blocks from long data arcs was found not to be feasible due to strong aliasing effects. The error analysis for the recovery of local sets of density blocks using independent short data arcs demonstrated that the estimation strategy of simultaneously estimating a local set of blocks covered by data and two "buffer layers" of blocks not covered by data greatly reduced aliasing errors.

  6. Two-Dimensional Liquid Chromatography Analysis of Polystyrene/Polybutadiene Block Copolymers.

    PubMed

    Lee, Sanghoon; Choi, Heejae; Chang, Taihyun; Staal, Bastiaan

    2018-05-15

    A detailed characterization of a commercial polystyrene/polybutadiene block copolymer material (Styrolux) was carried out using two-dimensional liquid chromatography (2D-LC). Styrolux is prepared by a statistical linking reaction of two different polystyrene-block-polybutadienyl anion precursors with a multivalent linking agent. Therefore, it is a mixture of a number of branched block copolymers differing in molecular weight, composition, and chain architecture. While no individual LC analysis, including size exclusion chromatography, interaction chromatography, or liquid chromatography at the critical condition, can resolve all the polymer species, 2D-LC separations coupling two chromatography methods were able to resolve all polymer species present in the sample: at least 13 block copolymer species and a blended homopolystyrene. Four different 2D-LC analyses, each combining a different pair of LC methods, provided characteristic separation results. The separation characteristics of the 2D-LC separations are compared to elucidate the elution characteristics of the block copolymer species.

  7. Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP

    NASA Astrophysics Data System (ADS)

    Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio

    1988-09-01

    This report presents the verification results of FLOWNET/TRUMP, a combined thermal-hydraulic and heat conduction analysis code. The code has been utilized in the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), especially for the analysis of flow distribution among fuel block coolant channels, the determination of thermal boundary conditions for fuel block stress analysis, and the estimation of fuel temperature in the case of a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR in order to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for the promotion of innovative high-temperature frontier technologies. The code was verified through comparison of analytical results with experimental results from the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T(sub 1-M)) with simulated fuel rods and fuel blocks.

  8. The Relationship Between Extratropical Cyclone Steering and Blocking Along the North American East Coast

    NASA Astrophysics Data System (ADS)

    Booth, James F.; Dunn-Sigouin, Etienne; Pfahl, Stephan

    2017-12-01

    The path and speed of extratropical cyclones along the east coast of North America influence their societal impact. This work characterizes the climatological relationship between cyclone track path and speed, and blocking and the North Atlantic Oscillation (NAO). An analysis of Lagrangian cyclone track propagation speed and angle shows that the percentage of cyclones with blocks is larger for cyclones that propagate northward or southeastward, as is the size of the blocked region near the cyclone. Cyclone-centered composites show that propagation of cyclones relative to blocks is consistent with steering by the block: northward tracks more often have a block east/northeast of the cyclone; slow tracks tend to have blocks due north of the cyclone. Comparison with the NAO shows that to first-order blocking and the NAO steer cyclones in a similar manner. However, blocked cyclones are more likely to propagate northward, increasing the likelihood of cyclone related impacts.

  9. Liposomal bupivacaine versus interscalene nerve block for pain control after shoulder arthroplasty

    PubMed Central

    Yan, Zeng; Chen, Zong; Ma, Chuangen

    2017-01-01

    Abstract Background: Postoperative pain control after total shoulder arthroplasty (TSA) can be challenging. Liposomal bupivacaine and interscalene nerve block are 2 common pain control protocols for TSA patients. However, whether liposomal bupivacaine is superior to interscalene nerve block was unknown. This meta-analysis aimed to compare the efficacy of liposomal bupivacaine versus interscalene nerve block for pain control in patients undergoing TSA. Methods: In May 2017, a systematic computer-based search was conducted in PubMed, EMBASE, Web of Science, Cochrane Database of Systematic Reviews, and Google database. Data on patients prepared for TSA in studies that compared liposomal bupivacaine versus interscalene nerve block were retrieved. The endpoints were the visual analogue scale (VAS) score at 4 hours, 8 hours, 12 hours, 24 hours, and 2 weeks; total morphine consumption at 24 hours; and the length of hospital stay. Stata 12.0 software was used for pooling the final outcomes. Results: Five clinical studies with 573 patients (liposomal bupivacaine group = 239, interscalene nerve block group = 334) were ultimately included in the meta-analysis. There was no significant difference in the VAS score at 4 hours, 8 hours, and 2 weeks between the liposomal bupivacaine group and the interscalene nerve block group (P > .05). Compared with the interscalene nerve block group, liposomal bupivacaine was associated with a reduction of the VAS score at 12 hours and 24 hours by approximately 3.31 points and 6.42 points, respectively, on a 100-point VAS. Furthermore, liposomal bupivacaine was associated with a significant reduction of the length of hospital stay, by approximately 0.16 days, compared with the interscalene nerve block group. Conclusion: The current meta-analysis indicates that, compared with interscalene nerve block, liposomal bupivacaine is at least comparably effective in reducing both pain scores and the length of hospital stay. However, studies with more patients and better-designed methods are needed to establish the optimal regimen and the safety of liposomal bupivacaine in TSA patients. PMID:28682872

  10. Rectus sheath and transversus abdominis plane blocks in children: a systematic review and meta-analysis of randomized trials.

    PubMed

    Hamill, James K; Rahiri, Jamie-Lee; Liley, Andrew; Hill, Andrew G

    2016-04-01

    The role of rectus sheath blocks (RSB) and transversus abdominis plane (TAP) blocks in pediatric surgery has not been well established. We aimed to determine if RSB and TAP blocks decrease postoperative pain and improve recovery in children. Duplicate searching of MEDLINE, EMBASE, Cochrane, Web of Science, and trial registries databases by two reviewers. Included were randomized trials in children on RSB or TAP block in abdominal operations, excluding inguinal procedures. Independent duplicate data extraction and quality assessment using a standardized form. Ten trials met inclusion criteria (n = 599), RSB in five and TAP block in five. A linear mixed effects model on patient level data from three trials showed nerve blocks lowered morphine requirements 6-8 h after surgery, -0.03 mg · kg(-1) (95% CI -0.05, -0.002). Pooled analysis of summary data showed nerve blocks lowered 0-10 scale pain scores immediately after the operation, -0.7 (95% CI -1.3, -0.1); lowered 4-16 scale pain scores, -2.0 (95% CI -2.3, -1.7); and delayed the time to first rescue analgesia, 17 min (95% CI 1.3, 33). Quality assessment showed some studies at moderate to high risk of bias. Abdominal wall blocks reduce pain and opiate use in children. We advise cautious interpretation of the results given the heterogeneity of studies. © 2016 John Wiley & Sons Ltd.

  11. Enhancing Post-Traumatic Pain Relief with Alternative Perineural Drugs

    DTIC Science & Technology

    2013-11-01

    producing long-duration, sensory-specific nerve block with minimal toxicity. We assessed the efficacy of the adjuvants clonidine (C), buprenorphine...influence on either LA- or M-induced nerve block. Further analysis of M effects indicated that peripheral nerve block and toxicity were due to a...hoping to identify a means to produce a nerve block in the absence of toxicity. We also hypothesized that potassium channel openers might directly

  12. An interactive multi-block grid generation system

    NASA Technical Reports Server (NTRS)

    Kao, T. J.; Su, T. Y.; Appleby, Ruth

    1992-01-01

    A grid generation procedure combining interactive and batch grid generation programs was put together to generate multi-block grids for complex aircraft configurations. The interactive section provides the tools for 3D geometry manipulation, surface grid extraction, boundary domain construction for 3D volume grid generation, and block-block relationships and boundary conditions for flow solvers. The procedure improves the flexibility and quality of grid generation to meet the design/analysis requirements.

  13. The impact of temperature loading on massive concrete block resistance

    NASA Astrophysics Data System (ADS)

    Beran, Pavel; Kočí, Jan

    2017-07-01

    Very large and massive concrete blocks with thicknesses of 3.5-6 meters are often designed in the cement industry. These massive blocks have high thermal inertia, and thus thermal stress due to a nonlinear temperature gradient in the concrete block may occur. A coupled thermo-mechanical analysis of concrete blocks in Prague, Czech Republic, and Sterlitamak, Russia, was performed. A numerical model of the concrete block was used to analyze a typical year (called the reference year) in each locality. The results show that thermal stresses exceeding the tensile strength of the concrete arise in the block. Therefore, the concrete block should be reinforced with steel rods. The stress values are markedly affected by climate: significantly higher thermal stresses were detected in Sterlitamak than in Prague.

  14. A general multiblock Euler code for propulsion integration. Volume 1: Theory document

    NASA Technical Reports Server (NTRS)

    Chen, H. C.; Su, T. Y.; Kao, T. J.

    1991-01-01

    A general multiblock Euler solver was developed for the analysis of flow fields over geometrically complex configurations either in free air or in a wind tunnel. In this approach, the external space around a complex configuration was divided into a number of topologically simple blocks, so that surface-fitted grids and an efficient flow solution algorithm could be easily applied in each block. The computational grid in each block is generated using a combination of algebraic and elliptic methods. A grid generation/flow solver interface program was developed to facilitate the establishment of block-to-block relations and the boundary conditions for each block. The flow solver utilizes a finite volume formulation and an explicit time stepping scheme to solve the Euler equations. A multiblock version of the multigrid method was developed to accelerate the convergence of the calculations. The generality of the method was demonstrated through the analysis of two complex configurations at various flow conditions. Results were compared to available test data. Two accompanying volumes, user manuals for the preparation of multi-block grids (vol. 2) and for the Euler flow solver (vol. 3), provide information on input data format and program execution.

  15. Comparative Study on Different Slot Forms of Prestressed Anchor Blocks

    NASA Astrophysics Data System (ADS)

    Fan, Rong; Si, Jianhui; Jian, Zheng

    2018-03-01

    In this paper, two models of a prestressed pier are established: one with a rectangular-cavity anchor block and one with an arched hollow anchor block. The ABAQUS software was used to calculate the stresses on the surface of the pier neck and in the anchor block cavity, which were then compared. The results show that, compared with the rectangular-cavity anchor block, the arched hole effectively reduces the stresses on the pier and in the cavity and reduces the amount of prestressed anchorage required, yielding clear economic benefits.

  16. Transversus abdominal plane block for postoperative analgesia: a systematic review and meta-analysis of randomized-controlled trials.

    PubMed

    Brogi, Etrusca; Kazan, Roy; Cyr, Shantale; Giunta, Francesco; Hemmerling, Thomas M

    2016-10-01

    The transversus abdominal plane (TAP) block has been described as an effective pain control technique after abdominal surgery. We performed a systematic review and meta-analysis of randomized-controlled trials (RCTs) to account for the increasing number of TAP block studies appearing in the literature. The primary outcome we examined was the effect of TAP block on the postoperative pain score at six, 12, and 24 hr. The secondary outcome was 24-hr morphine consumption. We searched the United States National Library of Medicine database, the Excerpta Medica database, and the Cochrane Central Register of Controlled Clinical Studies and identified RCTs focusing on the analgesic efficacy of TAP block compared with a control group [i.e., placebo, epidural analgesia, intrathecal morphine (ITM), and ilioinguinal nerve block after abdominal surgery]. Meta-analyses were performed on postoperative pain scores at rest at six, 12, and 24 hr (visual analogue scale, 0-10) and on 24-hr opioid consumption. In the 51 trials identified, compared with placebo, TAP block reduced the VAS for pain at six hours by 1.4 (95% confidence interval [CI], -1.9 to -0.8; P < 0.001), at 12 hr by 2.0 (95% CI, -2.7 to -1.4; P < 0.001), and at 24 hr by 1.2 (95% CI, -1.6 to -0.8; P < 0.001). Similarly, compared with placebo, TAP block reduced morphine consumption at 24 hr after surgery (mean difference, -14.7 mg; 95% CI, -18.4 to -11.0; P < 0.001). We observed this reduction in pain scores and morphine consumption in the TAP block group after gynecological surgery, appendectomy, inguinal surgery, bariatric surgery, and urological surgery. Nevertheless, separate analysis of the studies comparing ITM with TAP block revealed that ITM seemed to have a greater analgesic efficacy. The TAP block can play an important role in the management of pain after abdominal surgery by reducing both pain scores and 24-hr morphine consumption. 
It may have particular utility when neuraxial techniques or opioids are contraindicated.
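
    The pooled mean differences and 95% confidence intervals quoted above come from standard meta-analytic weighting. As a rough, hypothetical illustration (the review's actual computation pooled 51 trials with a more elaborate model), a fixed-effect inverse-variance pooling of invented per-study mean differences might look like:

```python
import numpy as np

# Hypothetical fixed-effect inverse-variance pooling of per-study mean
# differences (MDs). The numbers are invented for illustration only.
def pool_mean_differences(md, se):
    """md: per-study mean differences; se: their standard errors."""
    w = 1.0 / np.square(se)                      # inverse-variance weights
    pooled = np.sum(w * md) / np.sum(w)          # weighted mean difference
    se_pooled = 1.0 / np.sqrt(np.sum(w))
    half = 1.96 * se_pooled                      # 95% normal-approximation CI
    return pooled, (pooled - half, pooled + half)

# Three invented studies, each reporting an MD on a 0-10 pain scale:
pooled, (lo, hi) = pool_mean_differences(
    md=np.array([-1.2, -1.6, -0.9]), se=np.array([0.4, 0.5, 0.3]))
```

    The pooled estimate is pulled toward the most precise (smallest standard error) study, which is the weighting behavior behind the quoted intervals.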

  17. On the security of compressed encryption with partial unitary sensing matrices embedding a secret keystream

    NASA Astrophysics Data System (ADS)

    Yu, Nam Yul

    2017-12-01

    The principle of compressed sensing (CS) can be applied in a cryptosystem by providing the notion of security. In this paper, we study the computational security of a CS-based cryptosystem that encrypts a plaintext with a partial unitary sensing matrix embedding a secret keystream. The keystream is obtained by a keystream generator of stream ciphers, where the initial seed becomes the secret key of the CS-based cryptosystem. For security analysis, the total variation distance, bounded by the relative entropy and the Hellinger distance, is examined as a security measure for the indistinguishability. By developing upper bounds on the distance measures, we show that the CS-based cryptosystem can be computationally secure in terms of the indistinguishability, as long as the keystream length for each encryption is sufficiently large with low compression and sparsity ratios. In addition, we consider a potential chosen plaintext attack (CPA) from an adversary, which attempts to recover the key of the CS-based cryptosystem. Associated with the key recovery attack, we show that the computational security of our CS-based cryptosystem is brought by the mathematical intractability of a constrained integer least-squares (ILS) problem. For a sub-optimal, but feasible key recovery attack, we consider a successive approximate maximum-likelihood detection (SAMD) and investigate the performance by developing an upper bound on the success probability. Through theoretical and numerical analyses, we demonstrate that our CS-based cryptosystem can be secure against the key recovery attack through the SAMD.
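
    The encryption step described above can be sketched schematically. The structure below (a secret +/-1 diagonal keystream applied before a unitary DFT, followed by a keyed row subselection) is one illustrative reading of a partial unitary sensing matrix embedding a keystream; the specific matrix form, seeding, and parameters are assumptions of this sketch, not the paper's construction.

```python
import numpy as np

# Illustrative compressed encryption with a partial unitary sensing matrix
# embedding a secret keystream: the seeded RNG stands in for a stream-cipher
# keystream generator whose initial seed is the secret key.
def sense(plaintext, key, m):
    n = len(plaintext)
    rng = np.random.default_rng(key)             # stand-in keystream generator
    d = rng.choice([-1.0, 1.0], size=n)          # secret diagonal keystream D
    rows = rng.choice(n, size=m, replace=False)  # keyed row subselection R
    F = np.fft.fft(np.eye(n)) / np.sqrt(n)       # unitary DFT matrix
    Phi = F[rows] * d                            # Phi = R F D (partial unitary)
    return Phi @ plaintext                       # ciphertext = CS measurement

x = np.zeros(64)
x[[3, 17, 40]] = [1.0, -2.0, 0.5]                # sparse plaintext block
y = sense(x, key=2024, m=24)                     # 24 measurements of 64 samples
```

    Decryption then amounts to sparse recovery with the keyed Phi; without the key, an adversary faces the indistinguishability and integer least-squares hardness arguments discussed in the abstract.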

  18. Microbial Communities in Long-Term, Water-Flooded Petroleum Reservoirs with Different in situ Temperatures in the Huabei Oilfield, China

    PubMed Central

    Tang, Yue-Qin; Li, Yan; Zhao, Jie-Yu; Chi, Chang-Qiao; Huang, Li-Xin; Dong, Han-Ping; Wu, Xiao-Lei

    2012-01-01

    The distribution of microbial communities in the Menggulin (MGL) and Ba19 blocks in the Huabei Oilfield, China, were studied based on 16S rRNA gene analysis. The dominant microbes showed obvious block-specific characteristics, and the two blocks had substantially different bacterial and archaeal communities. In the moderate-temperature MGL block, the bacteria were mainly Epsilonproteobacteria and Alphaproteobacteria, and the archaea were methanogens belonging to Methanolinea, Methanothermobacter, Methanosaeta, and Methanocella. However, in the high-temperature Ba19 block, the predominant bacteria were Gammaproteobacteria, and the predominant archaea were Methanothermobacter and Methanosaeta. In spite of shared taxa in the blocks, differences among wells in the same block were obvious, especially for bacterial communities in the MGL block. Compared to the bacterial communities, the archaeal communities were much more conserved within blocks and were not affected by the variation in the bacterial communities. PMID:22432032

  19. Efficacy of ultrasound and nerve stimulation guidance in peripheral nerve block: A systematic review and meta-analysis.

    PubMed

    Wang, Zhi-Xue; Zhang, De-Li; Liu, Xin-Wei; Li, Yan; Zhang, Xiao-Xia; Li, Ru-Hong

    2017-09-01

    Evidence has been conflicting as to whether nerve stimulation (NS) can optimize ultrasound (US)-guided nerve blockade for peripheral nerve block. This review aims to explore the effects of the two combined techniques. We searched EMBASE (from 1974 to March 2015), PubMed (from 1966 to March 2015), Medline (from 1966 to March 2015), the Cochrane Central Register of Controlled Trials and clinicaltrials.gov. Finally, 15 randomized trials were included in the analysis, involving 1,019 lower limb and 696 upper limb surgery cases. Meta-analysis indicated that, compared with US alone, USNS combination had favorable effects on overall block success rate (risk ratio [RR] 1.17; confidence interval [CI] 1.05 to 1.30, P = 0.004), sensory block success rate (RR 1.56; CI 1.29 to 1.89, P < 0.00001), and block onset time (mean difference [MD] -3.84; CI -5.59 to -2.08, P < 0.0001). USNS guidance had a longer procedure time in both upper and lower limb nerve block (MD 1.67; CI 1.32 to 2.02, P < 0.00001; MD 1.17; CI 0.95 to 1.39, P < 0.00001) and more patients with anesthesia supplementation (RR 2.5; CI 1.02 to 6.13, P = 0.05). USNS guidance tends to result in a shorter block onset time than US alone as well as a higher block success rate, but no statistical difference was demonstrated, as more data are required. © 2017 IUBMB Life, 69(9):720-734, 2017. © 2017 International Union of Biochemistry and Molecular Biology.

  20. Morphological and physical characterization of poly(styrene-isobutylene-styrene) block copolymers and ionomers thereof

    NASA Astrophysics Data System (ADS)

    Baugh, Daniel Webster, III

    Poly(styrene-isobutylene-styrene) block copolymers made by living cationic polymerization using a difunctional initiator and the sequential monomer addition technique were analyzed using curve-resolution software in conjunction with high-resolution GPC. Fractional precipitation and selective solvent extraction were applied to a representative sample in order to confirm the identity of contaminating species. The latter were found to be low molecular weight polystyrene homopolymer, diblock copolymer, and higher molecular weight segmented block copolymers formed by intermolecular electrophilic aromatic substitution linking reactions occurring late in the polymerization of the styrene outer blocks. Solvent-cast films of poly(styrene-isobutylene-styrene) (PS-PIB-PS) block copolymers and block ionomers were analyzed using small-angle X-ray scattering (SAXS) and transmission electron microscopy (TEM). Four block copolymer samples with center block molecular weights of 52,000 g/mol and PS volume fractions (φPS) ranging from 0.17 to 0.31 were studied. All samples exhibited hexagonally packed cylinders of PS within the PIB matrix. Cylinder spacing was in the range 32 to 36 nm for most samples, while cylinder diameters varied from 14 to 21 nm. Porod analysis of the scattering data indicated the presence of isolated phase mixing and sharp phase boundaries. PS-PIB-PS block copolymers and ionomers therefrom were analyzed using dynamic mechanical analysis (DMA) and tensile testing. The study encompassed five block copolymer samples with similar PIB center blocks with molecular weights of approximately 52,000 g/mol and PS weight fractions ranging from 0.127 to 0.337. Ionomers were prepared from two of these materials by lightly sulfonating the PS outer blocks. Sulfonation levels varied from 1.7 to 4.7 mol % and the sodium and potassium neutralized forms were compared to the parent block copolymers. 
Dynamic mechanical analysis (DMA) of the block copolymer films indicated the existence of a third phase attributed to PIB chains near the PS domain interface which experience reduced mobility due to their firm attachment to the hard PS domain. The relative amount of this phase decreased in samples with larger PS blocks, while the temperature of the associated transition increased. Tensile testing showed increased tensile strength but decreased elongation at break with larger PS blocks. DMA of the ionomers indicated improved dynamic modulus at temperatures above 100 °C. Tensile testing of the ionomers indicated slight improvements in tensile strength with little loss in elongation at break. PS-PIB-PS block copolymer ionomer (BCP01, center block molecular weight = 53,000 g/mol; 25.5 wt % polystyrene, 4.7% sulfonation of phenyl units, 100% neutralized with KOH) was compounded with various organic and inorganic acid salts of 2-ethylhexyl-p-dimethyl aminobenzoate (ODAB) to explore the efficacy of these compounds as ionic plasticizers. (Abstract shortened by UMI.)

  1. SLAMMER: Seismic LAndslide Movement Modeled using Earthquake Records

    USGS Publications Warehouse

    Jibson, Randall W.; Rathje, Ellen M.; Jibson, Matthew W.; Lee, Yong W.

    2013-01-01

    This program is designed to facilitate conducting sliding-block analysis (also called permanent-deformation analysis) of slopes in order to estimate slope behavior during earthquakes. The program allows selection from among more than 2,100 strong-motion records from 28 earthquakes and allows users to add their own records to the collection. Any number of earthquake records can be selected using a search interface that selects records based on desired properties. Sliding-block analyses, using any combination of rigid-block (Newmark), decoupled, and fully coupled methods, are then conducted on the selected group of records, and results are compiled in both graphical and tabular form. Simplified methods for conducting each type of analysis are also included.
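
    The rigid-block (Newmark) method named above can be sketched in a few lines: ground accelerations exceeding the yield (critical) acceleration are integrated twice to obtain the permanent displacement. The explicit integration scheme and the one-way sliding model below are simplifying assumptions of this sketch, not SLAMMER's actual numerics.

```python
# Minimal rigid-block (Newmark) sliding analysis, the first of the three
# methods listed above. Ground accelerations above the yield acceleration
# a_c drive sliding; the block decelerates at a_c once sliding has begun.
def newmark_displacement(accel, dt, a_c):
    """accel: ground acceleration series (m/s^2); a_c: yield accel (m/s^2)."""
    vel = 0.0   # sliding velocity of the block relative to the ground
    disp = 0.0  # accumulated permanent displacement
    for a in accel:
        if vel > 0.0 or a > a_c:
            vel += (a - a_c) * dt   # net acceleration while above yield
            vel = max(vel, 0.0)     # sliding stops, never reverses
            disp += vel * dt
    return disp

# Toy record: a 1 s, 3 m/s^2 pulse against a yield acceleration of 1 m/s^2.
# Analytically this gives 1 m during the pulse plus 2 m of run-out.
pulse = [3.0] * 100 + [0.0] * 200
d = newmark_displacement(pulse, dt=0.01, a_c=1.0)  # close to 3.0 m
```

    The decoupled and fully coupled methods the program also offers add the dynamic response of the sliding mass itself, which this rigid-block sketch deliberately omits.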

  2. Girlhood, Sexual Violence, and Agency in Francesca Lia Block's "Wolf"

    ERIC Educational Resources Information Center

    Marshall, Elizabeth

    2009-01-01

    This essay examines the representation of adolescent girlhood, sexual violence and agency in Francesca Lia Block's contemporary fairy tale collection "The Rose and The Beast." Focusing specifically on the tale "Wolf," the author provides a literary analysis of how Block draws on and reworks traditional Western fairy tale variants to reintroduce…

  3. The Impact of the Built Environment on Children's School Conduct Grades: The Role of Diversity of Use in a Hispanic Neighborhood

    PubMed Central

    Szapocznik, José; Lombard, Joanna; Martinez, Frank; Mason, Craig A.; Gorman-Smith, Deborah; Plater-Zyberk, Elizabeth; Brown, Scott C.; Spokane, Arnold

    2013-01-01

    A population-based study examined the relationship between diversity of use of the built environment and teacher reports of children's grades. Diversity of use of the built environment (i.e., proportion of a block that is residential, institutional, commercial and vacant) was assessed for all 403 city blocks in East Little Havana, Miami—a Hispanic neighborhood. Cluster analysis identified three block-types, based on diversity of use: Residential, Mixed-Use, and Commercial. Cross-classified hierarchical linear modeling was used to examine the impact of diversity of use, school, gender, and year-in-school on academic and conduct grades for 2857 public school children who lived in these blocks. Contrary to popular belief, mixed-use blocks were associated with optimal outcomes. Specifically, follow-up analyses found that a youth living on a residential block had a 74% greater odds of being in the lowest 10% of conduct grades (conduct GPA <2.17) than a youth living on a mixed-use block. In fact, an analysis of the population attributable fraction suggests that if the risk associated with residential blocks could be reduced to the level of risk associated with mixed-use blocks, a 38% reduction in Conduct GPAs <2.17 could be achieved in the total population. These findings suggest that public policy targeting the built environment may be a mechanism for community-based interventions to enhance children's classroom conduct, and potentially related sequelae. PMID:16967342

  4. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... within the populated area (A) Carbon Dioxide Information Analysis Center (CDIAC) Oak Ridge National... including 100 nm from the launch point are required at the U.S. census block group level. Population data... populated area (N) Within 100 nm of the launch point: U.S. census data at the census block-group level...

  5. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... within the populated area (A) Carbon Dioxide Information Analysis Center (CDIAC) Oak Ridge National... including 100 nm from the launch point are required at the U.S. census block group level. Population data... populated area (N) Within 100 nm of the launch point: U.S. census data at the census block-group level...

  6. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... within the populated area (A) Carbon Dioxide Information Analysis Center (CDIAC) Oak Ridge National... including 100 nm from the launch point are required at the U.S. census block group level. Population data... populated area (N) Within 100 nm of the launch point: U.S. census data at the census block-group level...

  7. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... within the populated area (A) Carbon Dioxide Information Analysis Center (CDIAC) Oak Ridge National... including 100 nm from the launch point are required at the U.S. census block group level. Population data... populated area (N) Within 100 nm of the launch point: U.S. census data at the census block-group level...

  8. Is Local Infiltration Analgesia Superior to Peripheral Nerve Blockade for Pain Management After THA: A Network Meta-analysis.

    PubMed

    Jiménez-Almonte, José H; Wyles, Cody C; Wyles, Saranya P; Norambuena-Morales, German A; Báez, Pedro J; Murad, Mohammad H; Sierra, Rafael J

    2016-02-01

    Local infiltration analgesia and peripheral nerve blocks are common methods for pain management in patients after THA but direct head-to-head, randomized controlled trials (RCTs) have not been performed. A network meta-analysis allows indirect comparison of individual treatments relative to a common comparator; in this case placebo (or no intervention), epidural analgesia, and intrathecal morphine, yielding an estimate of comparative efficacy. We asked, when compared with a placebo, (1) does use of local infiltration analgesia reduce patient pain scores and opioid consumption, (2) does use of peripheral nerve blocks reduce patient pain scores and opioid consumption, and (3) is local infiltration analgesia favored over peripheral nerve blocks for postoperative pain management after THA? We searched six databases, from inception through June 30, 2014, to identify RCTs comparing local infiltration analgesia or peripheral nerve block use in patients after THA. A total of 35 RCTs at low risk of bias based on the recommended Cochrane Collaboration risk assessment tool were included in the network meta-analysis (2296 patients). Primary outcomes for this review were patient pain scores at rest and cumulative opioid consumption, both assessed at 24 hours after THA. Because of substantial heterogeneity (variation of outcomes between studies) across included trials, a random effect model for meta-analysis was used to estimate the weighted mean difference (WMD) and 95% CI. The gray literature was searched with the same inclusion criteria as published trials. Only one unpublished trial (published abstract) fulfilled our criteria and was included in this review. All other studies included in this systematic review were full published articles. Bayesian network meta-analysis included all RCTs that compared local infiltration analgesia or peripheral nerve blocks with placebo (or no intervention), epidural analgesia, and intrathecal morphine. 
Compared with placebo, local infiltration analgesia reduced patient pain scores (WMD, -0.61; 95% CI, -0.97 to -0.24; p = 0.001) and opioid consumption (WMD, -7.16 mg; 95% CI, -11.98 to -2.35; p = 0.004). Peripheral nerve blocks did not result in lower pain scores or reduced opioid consumption compared with placebo (WMD, -0.43; 95% CI, -0.99 to 0.12; p = 0.12 and WMD, -3.14 mg, 95% CI, -11.30 to 5.02; p = 0.45). However, network meta-analysis comparing local infiltration analgesia with peripheral nerve blocks through common comparators showed no differences between postoperative pain scores (WMD, -0.36; 95% CI, -1.06 to 0.31) and opioid consumption (WMD, -4.59 mg; 95% CI, -9.35 to 0.17), although rank-order analysis found local infiltration analgesia to be ranked first in more simulations than peripheral nerve blocks, suggesting that it may be more effective. Using the novel statistical network meta-analysis approach, we found no differences between local infiltration analgesia and peripheral nerve blocks in terms of analgesia or opioid consumption 24 hours after THA; there was a suggestion of a slight advantage to local infiltration analgesia based on rank-order analysis, but the effect size in question is likely not large. Given the slight difference between interventions, clinicians may choose to focus on other factors such as cost and intervention-related complications when debating which analgesic treatment to use after THA. Level I, therapeutic study.

  9. Extracranial glioblastoma diagnosed by examination of pleural effusion using the cell block technique: case report.

    PubMed

    Hori, Yusuke S; Fukuhara, Toru; Aoi, Mizuho; Oda, Kazunori; Shinno, Yoko

    2018-06-01

    Metastatic glioblastoma is a rare condition, and several studies have reported the involvement of multiple organs including the lymph nodes, liver, and lung. The lung and pleura are reportedly the most frequent sites of metastasis, and diagnosis using less invasive tools such as cytological analysis with fine needle aspiration biopsy is challenging. Cytological analysis of fluid specimens tends to be negative because of the small number of cells obtained, whereas the cell block technique reportedly has higher sensitivity because of a decrease in cellular dispersion. Herein, the authors describe a patient with a history of diffuse astrocytoma who developed intractable, progressive accumulation of pleural fluid. Initial cytological analysis of the pleural effusion obtained by thoracocentesis was negative, but reanalysis using the cell block technique revealed the presence of glioblastoma cells. This is the first report to suggest the effectiveness of the cell block technique in the diagnosis of extracranial glioblastoma using pleural effusion. In patients with a history of glioma, the presence of extremely intractable pleural effusion warrants cytological analysis of the fluid using this technique in order to initiate appropriate chemotherapy.

  10. INVITED PAPER: Low power cryptography

    NASA Astrophysics Data System (ADS)

    Kitsos, P.; Koufopavlou, O.; Selimis, G.; Sklavos, N.

    2005-01-01

    Today, more and more sensitive data are stored digitally. Bank accounts, medical records, and personal emails are some categories of data that must be kept secure. The science of cryptography addresses this need for security. Data confidentiality, authentication, non-repudiation, and data integrity are some of the main services cryptography provides. The evolution of cryptography has led to very complex cryptographic models that could not have been implemented until a few years ago. The use of systems of increasing complexity, which are usually more secure, results in lower throughput rates and higher energy consumption. However, the evolution of a cipher has no practical impact if it rests only on theoretical grounds. Every encryption algorithm should exploit as much as possible the conditions of the specific system without ignoring the physical, area, and timing limitations. This fact requires new approaches to the design of architectures for secure and reliable cryptosystems. A main issue in the design of cryptosystems is the reduction of power consumption, especially for portable systems such as smart cards.

  11. Deciphering the language of nature: cryptography, secrecy, and alterity in Francis Bacon.

    PubMed

    Clody, Michael C

    2011-01-01

    The essay argues that Francis Bacon's considerations of parables and cryptography reflect larger interpretative concerns of his natural philosophic project. Bacon describes nature as having a language distinct from those of God and man, and, in so doing, establishes a central problem of his natural philosophy—namely, how can the language of nature be accessed through scientific representation? Ultimately, Bacon's solution relies on a theory of differential and duplicitous signs that conceal within them the hidden voice of nature, which is best recognized in the natural forms of efficient causality. The "alphabet of nature"—those tables of natural occurrences—consequently plays a central role in his program, as it renders nature's language susceptible to a process of decryption that mirrors the model of the bilateral cipher. It is argued that while the writing of Bacon's natural philosophy strives for literality, its investigative process preserves a space for alterity within scientific representation that is made accessible to those with the interpretative key.

  12. Securing your Site in Development and Beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akopov, Mikhail S.

    Why wait until production deployment, or even staging and testing deployment, to identify security vulnerabilities? Using tools like Burp Suite, you can find security vulnerabilities before they creep up on you. Prevent cross-site scripting attacks, and establish a firmer trust between your website and your client. Verify that Apache/Nginx have the correct SSL ciphers set. We explore using these tools and more to validate proper Apache/Nginx configurations, and to be compliant with modern configuration standards as part of the development cycle. Your clients can use tools like https://securityheaders.io and https://ssllabs.com to get a graded report on your level of compliance with OWASP Secure Headers Project and SSLLabs recommendations. Likewise, you should always use the same sites to validate your configurations. Burp Suite will find common misconfigurations and will also perform more thorough security testing of your applications. In this session you will see examples of vulnerabilities that were detected early on, as well as how to integrate these practices into your daily workflow.

  13. Simultaneous compression and encryption for secure real-time transmission of sensitive video

    NASA Astrophysics Data System (ADS)

    Al-Hayani, Nazar; Al-Jawad, Naseer; Jassim, Sabah A.

    2014-05-01

    Video compression and encryption have become essential for secure real-time video transmission. Applying both techniques simultaneously is a challenge when both size and quality matter in multimedia transmission. In this paper we propose a new technique for video compression and encryption. Both encryption and compression are based on edges extracted from the high-frequency sub-bands of a wavelet decomposition. The compression algorithm is based on a hybrid of discrete wavelet transforms, the discrete cosine transform, vector quantization, wavelet-based edge detection, and phase sensing. The compression encoding algorithm treats reference and non-reference video frames in two different ways. The encryption algorithm uses the A5 cipher combined with a chaotic logistic map to encrypt the significant parameters and wavelet coefficients. Both algorithms can be applied simultaneously after applying the discrete wavelet transform to each individual frame. Experimental results show that the proposed algorithms offer high compression, acceptable quality, and resistance to statistical and brute-force attacks with low computational processing.

  14. One-Time Pad as a nonlinear dynamical system

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin

    2012-11-01

    The One-Time Pad (OTP) is the only known unbreakable cipher, proved mathematically by Shannon in 1949. In spite of several practical drawbacks of using the OTP, it continues to be used in quantum cryptography, DNA cryptography and even in classical cryptography when the highest form of security is desired (other popular algorithms like RSA, ECC, AES are not even proven to be computationally secure). In this work, we prove that the OTP encryption and decryption is equivalent to finding the initial condition on a pair of binary maps (Bernoulli shift). The binary map belongs to a family of 1D nonlinear chaotic and ergodic dynamical systems known as Generalized Luröth Series (GLS). Having established these interesting connections, we construct other perfect secrecy systems on the GLS that are equivalent to the One-Time Pad, generalizing for larger alphabets. We further show that OTP encryption is related to Randomized Arithmetic Coding - a scheme for joint compression and encryption.
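    As a concrete anchor for the discussion above, here is a minimal sketch of the classical OTP in its XOR form over bytes (the GLS/Bernoulli-shift formulation the paper develops is not shown); perfect secrecy holds only under the stated key conditions.

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with the corresponding key byte.

    The key must be truly random, as long as the message, and never
    reused; under those conditions Shannon's 1949 proof gives perfect
    secrecy.
    """
    assert len(key) == len(plaintext), "key must match message length"
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption is the identical XOR operation with the same key.
otp_decrypt = otp_encrypt

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # one-time, uniformly random key
ciphertext = otp_encrypt(message, key)
assert otp_decrypt(ciphertext, key) == message
```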

  15. A Transcription Activator-Like Effector (TALE) Toolbox for Genome Engineering

    PubMed Central

    Sanjana, Neville E.; Cong, Le; Zhou, Yang; Cunniff, Margaret M.; Feng, Guoping; Zhang, Feng

    2013-01-01

    Transcription activator-like effectors (TALEs) are a class of naturally occurring DNA binding proteins found in the plant pathogen Xanthomonas sp. The DNA binding domain of each TALE consists of tandem 34-amino acid repeat modules that can be rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Here we describe a toolbox for rapid construction of custom TALE transcription factors (TALE-TFs) and nucleases (TALENs) using a hierarchical ligation procedure. This toolbox facilitates affordable and rapid construction of custom TALE-TFs and TALENs within one week and can be easily scaled up to construct TALEs for multiple targets in parallel. We also provide details for testing the activity in mammalian cells of custom TALE-TFs and TALENs using, respectively, qRT-PCR and Surveyor nuclease. The TALE toolbox described here will enable a broad range of biological applications. PMID:22222791

  16. SSL/TLS Vulnerability Detection Using Black Box Approach

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Sitorus, E. H.; Rahmat, R. F.; Hizriadi, A.

    2018-03-01

    Secure Sockets Layer (SSL) and Transport Layer Security (TLS) are cryptographic protocols that provide data encryption to secure communication over a network. However, in some cases vulnerabilities are found in implementations of SSL/TLS because of weak cipher keys, certificate validation errors, or session handling errors. One of the most serious SSL/TLS bugs is Heartbleed. As security is essential in data communication, this research aims to build a scanner that detects SSL/TLS vulnerabilities using a black box approach, focusing on the Heartbleed case. In addition, this research gathers information about the existing SSL configuration on the server. The black box approach tests the output of a system without knowing the processes inside the system itself. For testing purposes, this research scanned websites and found that some of them still have SSL/TLS vulnerabilities. Thus, the black box approach can be used to detect the vulnerability without considering the source code or the internal workings of the application.

  17. An Efficient Crankshaft Dynamic Analysis Using Substructuring with Ritz Vectors

    NASA Astrophysics Data System (ADS)

    MOURELATOS, Z. P.

    2000-11-01

    A structural analysis using dynamic substructuring with Ritz vectors is presented for predicting the dynamic response of an engine crankshaft, based on the finite-element method. A two-level dynamic substructuring is performed using a set of load-dependent Ritz vectors. The rotating crankshaft is properly coupled with the non-rotating, compliant engine block. The block compliance is represented by a distributed linear elastic foundation at each main bearing location. The stiffness of the elastic foundation can be different in the vertical and horizontal planes, thereby considering the anisotropy of the engine block compliance with respect to the crankshaft rotation. The analysis accounts for the kinematic non-linearity resulting from the crankangle-dependent circumferential contact location between each journal and the corresponding bore of the engine block. Crankshaft “bent” and block “misboring” effects due to manufacturing imperfections are considered in the analysis. The superior accuracy and reduced computational effort of the present method, as compared with the equivalent superelement analysis in MSC/NASTRAN, are demonstrated using the free and forced vibrations of a slender cylindrical beam and free vibrations of a four-cylinder engine crankshaft. Subsequently, the accuracy of the present method in calculating the dynamic response of engine crankshafts is shown through comparisons between the analytical predictions and experimental results for the torsional vibrations of an in-line five-cylinder engine and the bending vibrations of the crankshaft-flywheel assembly of a V6 engine.

  18. How to push a block along a wall

    NASA Technical Reports Server (NTRS)

    Mason, Matthew T.

    1989-01-01

    Some robot tasks require manipulation of objects that may be touching other fixed objects. The effects of friction and kinematic constraint must be anticipated, and may even be exploited to accomplish the task. An example task, a dynamic analysis, and appropriate effector motions are presented. The goal is to move a rectangular block along a wall, so that one side of the block maintains contact with the wall. Two solutions that push the block along the wall are discussed.

  19. Multivariate analysis of subsurface radiometric data in Rongsohkham area, East Khasi Hills district, Meghalaya (India): implication on uranium exploration.

    PubMed

    Kukreti, B M; Pandey, Pradeep; Singh, R V

    2012-08-01

    Non-coring exploratory drilling was undertaken in the sedimentary environment of the Rongsohkham block, East Khasi Hills district, to examine the eastern extension of the existing uranium resources located at Domiasiat and Wakhyn in the Mahadek basin of Meghalaya (India). Although radiometric surveys and radiometric analysis of surface grab/channel samples in the block indicated high uranium content, gamma-ray logging of the exploratory boreholes in the block did not yield the expected results. To understand this abrupt discontinuity between the two data sets (surface and subsurface), multivariate statistical analysis of the primordial radioactive elements (K(40), U(238) and Th(232)) was performed using the concept of representative subsurface samples drawn from 11 randomly selected boreholes of this block. The study was performed to a high confidence level (99%), and the results are discussed with respect to U and Th behavior in the block. The results confirm not only the continuation of three distinct geological formations in the area but also the uranium-bearing potential of the Mahadek sandstone in the eastern part of the Mahadek Basin. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Frequency stability of on-orbit GPS Block-I and Block-II Navstar clocks

    NASA Astrophysics Data System (ADS)

    McCaskill, Thomas B.; Reid, Wilson G.; Buisson, James A.

    On-orbit analysis of the Global Positioning System (GPS) Block-I and Block-II Navstar clocks has been performed by the Naval Research Laboratory using a multi-year database. The Navstar clock phase-offset measurements were computed from pseudorange measurements made by the five GPS monitor sites and from the U.S. Naval Observatory precise-time site using single or dual frequency GPS receivers. Orbital data was obtained from the Navstar broadcast ephemeris and from the best-fit, postprocessed orbital ephemerides supplied by the Naval Surface Weapons Center or by the Defense Mapping Agency. Clock performance in the time domain is characterized using frequency-stability profiles with sample times that vary from 1 to 100 days. Composite plots of Navstar frequency stability and time-prediction uncertainty are included as a summary of clock analysis results. The analysis includes plots of the clock phase offset and frequency offset histories with the eclipse seasons superimposed on selected plots to demonstrate the temperature sensitivity of one of the Block-I Navstar rubidium clocks. The potential impact on navigation and on transferring precise time of the degradation in the long-term frequency stability of the rubidium clocks is discussed.

  1. The Changeable Block Distance System Analysis

    NASA Astrophysics Data System (ADS)

    Lewiński, Andrzej; Toruń, Andrzej

    The paper presents an efficiency analysis of the Changeable Block Distance (CBD) system, which relies on wireless positioning and control of trains. The analysis is based on modeling of a typical ERTMS line and comparison with present and future traffic. The calculations are related to assumed railway traffic parameters corresponding to the real timetable of the Psary - Góra Włodowska section of the CMK line equipped with classic signalling, ETCS Level 1, and ETCS with the CBD system.

  2. MR Image Reconstruction Using Block Matching and Adaptive Kernel Methods.

    PubMed

    Schmidt, Johannes F M; Santelli, Claudio; Kozerke, Sebastian

    2016-01-01

    An approach to Magnetic Resonance (MR) image reconstruction from undersampled data is proposed. Undersampling artifacts are removed using an iterative thresholding algorithm applied to nonlinearly transformed image block arrays. Each block array is transformed using kernel principal component analysis where the contribution of each image block to the transform depends in a nonlinear fashion on the distance to other image blocks. Elimination of undersampling artifacts is achieved by conventional principal component analysis in the nonlinear transform domain, projection onto the main components and back-mapping into the image domain. Iterative image reconstruction is performed by interleaving the proposed undersampling artifact removal step and gradient updates enforcing consistency with acquired k-space data. The algorithm is evaluated using retrospectively undersampled MR cardiac cine data and compared to k-t SPARSE-SENSE, block matching with spatial Fourier filtering and k-t ℓ1-SPIRiT reconstruction. Evaluation of image quality and root-mean-squared-error (RMSE) reveal improved image reconstruction for up to 8-fold undersampled data with the proposed approach relative to k-t SPARSE-SENSE, block matching with spatial Fourier filtering and k-t ℓ1-SPIRiT. In conclusion, block matching and kernel methods can be used for effective removal of undersampling artifacts in MR image reconstruction and outperform methods using standard compressed sensing and ℓ1-regularized parallel imaging methods.

  3. Blocking performance approximation in flexi-grid networks

    NASA Astrophysics Data System (ADS)

    Ge, Fei; Tan, Liansheng

    2016-12-01

    The blocking probability of path requests is an important issue in flexible-bandwidth optical communications. In this paper, we propose a method for approximating the blocking probability of path requests in flexi-grid networks. It models the bundled allocation of neighboring carriers with a group of birth-death processes and provides a theoretical analysis of the blocking probability under variable-bandwidth traffic. The numerical results show the effect of traffic parameters on the blocking probability of path requests. In simulations, we use the first-fit algorithm in network nodes to allocate neighboring carriers to path requests, and verify the approximation results.
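    The abstract models carrier allocation with birth-death processes. As an illustrative special case (not the authors' multi-class carrier model), the classic Erlang B formula gives the blocking probability of a single-class M/M/c/c birth-death chain, the simplest such model of a link with a fixed number of carriers:

```python
def erlang_b(offered_load: float, carriers: int) -> float:
    """Blocking probability of an M/M/c/c birth-death chain (Erlang B).

    `offered_load` is arrival rate / service rate in Erlangs; `carriers`
    is the number of spectrum slots available. Uses the numerically
    stable recursion B(0) = 1, B(c) = a*B(c-1) / (c + a*B(c-1)).
    """
    b = 1.0
    for c in range(1, carriers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# More offered traffic, or fewer carriers, raises blocking:
assert erlang_b(8.0, 10) > erlang_b(5.0, 10)
assert erlang_b(5.0, 20) < erlang_b(5.0, 10)
```

    With one carrier the recursion reduces to a/(1+a), the stationary probability that the single carrier is busy.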

  4. Discrimination of high-Z materials in concrete-filled containers using muon scattering tomography

    NASA Astrophysics Data System (ADS)

    Frazão, L.; Velthuis, J.; Thomay, C.; Steer, C.

    2016-07-01

    An analysis method for identifying materials using muon scattering tomography is presented, which uses prior knowledge of the position of high-Z objects inside a container and distinguishes them from similar materials. In particular, simulations were performed to distinguish a block of uranium from blocks of lead and tungsten of the same size inside a concrete-filled drum. The results show that, knowing the shape and position from previous analysis, it is possible to distinguish 5 × 5 × 5 cm3 blocks of these materials with about 4 h of muon exposure, down to 2 × 2 × 2 cm3 blocks with 70 h of data, using multivariate analysis (MVA). MVA uses several variables, but it does not improve the discrimination over a simpler method using only the scatter angles. This indicates that the majority of the discrimination is provided by the angular information. Momentum information is shown to provide no benefit in material discrimination.

  5. Seismic motion in urban sites consisting of blocks in welded contact with a soft layer overlying a hard half-space

    NASA Astrophysics Data System (ADS)

    Groby, Jean-Philippe; Wirgin, Armand

    2008-02-01

    We address the problem of the response to a seismic wave of an urban site consisting of Nb blocks overlying a soft layer underlain by a hard substratum. The results of a theoretical analysis, appealing to a space-frequency mode-matching (MM) technique, are compared to those obtained by a space-time finite-element (FE) technique. The two methods are shown to give rise to the same prediction of the seismic response for Nb = 1, 2 and 40 blocks. The mechanism of the interaction between blocks and the ground, as well as that of the mutual interaction between blocks, are studied. It is shown, in the first part of this paper, that the presence of a small number of blocks modifies the seismic disturbance in a manner which evokes qualitatively, but not quantitatively, what was observed during the 1985 Michoacan earthquake in Mexico City. Anomalous earthquake response at a much greater level, in terms of duration, peak and cumulative amplitude of motion, is shown, by a theoretical and numerical analysis in the second part of this paper, to be induced by the presence of a large (>=10) number of identical equi-spaced blocks that are present in certain districts of many cities.

  6. Analysis of genome rearrangement by block-interchanges.

    PubMed

    Lu, Chin Lung; Lin, Ying Chih; Huang, Yen Lin; Tang, Chuan Yi

    2007-01-01

    Block-interchanges are a new kind of genome rearrangement that affects the gene order in a chromosome by swapping two nonintersecting blocks of genes of any length. Recently, the study of such rearrangements has become increasingly important because of its applications in molecular evolution. Usually, this kind of study requires solving a combinatorial problem, called the block-interchange distance problem, which is to find a minimum number of block-interchanges between two given gene orders of linear/circular chromosomes to transform one gene order into the other. In this chapter, we introduce the basics of block-interchange rearrangements and of the permutation groups in algebra that are useful in analyses of genome rearrangements. In addition, we present a simple algorithm on the basis of permutation groups to efficiently solve the block-interchange distance problem, as well as ROBIN, a web server for online analyses of block-interchange rearrangements.
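    The rearrangement operation itself is easy to state in code. A small sketch of a single block-interchange on a gene order (illustrative only; not the authors' permutation-group distance algorithm):

```python
def block_interchange(order, i, j, k, l):
    """Swap the nonintersecting blocks order[i:j] and order[k:l], with
    0 <= i < j <= k < l <= len(order).

    This is the rearrangement operation itself; the block-interchange
    distance problem asks for the fewest such swaps transforming one
    gene order into another.
    """
    assert 0 <= i < j <= k <= l <= len(order) and j <= k
    return order[:i] + order[k:l] + order[j:k] + order[i:j] + order[l:]

# One block-interchange swapping genes [2, 3] with [5]:
assert block_interchange([1, 2, 3, 4, 5], 1, 3, 4, 5) == [1, 5, 4, 2, 3]
```

    Applying the same swap again restores the original order, so each block-interchange is its own inverse.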

  7. Expansion and improvements of the FORMA system for response and load analysis. Volume 1: Programming manual

    NASA Technical Reports Server (NTRS)

    Wohlen, R. L.

    1976-01-01

    Techniques are presented for the solution of structural dynamic systems on an electronic digital computer using FORMA (FORTRAN Matrix Analysis). FORMA is a library of subroutines coded in FORTRAN 4 for the efficient solution of structural dynamics problems. These subroutines are in the form of building blocks that can be put together to solve a large variety of structural dynamics problems. The obvious advantage of the building block approach is that programming and checkout time are limited to that required for putting the blocks together in the proper order.

  8. Interactive-predictive detection of handwritten text blocks

    NASA Astrophysics Data System (ADS)

    Ramos Terrades, O.; Serrano, N.; Gordó, A.; Valveny, E.; Juan, A.

    2010-01-01

    A method for text block detection is introduced for old handwritten documents. The proposed method takes advantage of sequential book structure, taking into account layout information from pages previously transcribed. This glance at the past is used to predict the position of text blocks in the current page with the help of conventional layout analysis methods. The method is integrated into the GIDOC prototype: a first attempt to provide integrated support for interactive-predictive page layout analysis, text line detection and handwritten text transcription. Results are given in a transcription task on a 764-page Spanish manuscript from 1891.

  9. Visuo-Spatial Performance in Autism: A Meta-Analysis

    ERIC Educational Resources Information Center

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…

  10. Enhancements of Bayesian Blocks; Application to Large Light Curve Databases

    NASA Technical Reports Server (NTRS)

    Scargle, Jeff

    2015-01-01

    Bayesian Blocks are optimal piecewise linear representations (step function fits) of light-curves. The simple algorithm implementing this idea, using dynamic programming, has been extended to include more data modes and fitness metrics, multivariate analysis, and data on the circle (Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations, Scargle, Norris, Jackson and Chiang 2013, ApJ, 764, 167), as well as new results on background subtraction and refinement of the procedure for precise timing of transient events in sparse data. Example demonstrations will include exploratory analysis of the Kepler light curve archive in a search for "star-tickling" signals from extraterrestrial civilizations. (The Cepheid Galactic Internet, Learned, Kudritzki, Pakvasa, and Zee, 2008, arXiv: 0809.0339; Walkowicz et al., in progress).
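    A minimal sketch of the dynamic program at the heart of Bayesian Blocks for event (arrival-time) data, using the event fitness N(log N - log T) per block and a constant per-block prior penalty; the full algorithm adds the further data modes, fitness metrics, and refinements described above.

```python
import math

def bayesian_blocks(t, ncp_prior=4.0):
    """Minimal Bayesian Blocks change-point finder for event data.

    O(n^2) dynamic programming over candidate change points (in the
    spirit of Scargle et al. 2013). `ncp_prior` is the constant penalty
    per additional block; assumes at least two distinct event times.
    Returns the optimal block edges.
    """
    t = sorted(t)
    n = len(t)
    # Candidate cell edges: midpoints between events, padded by the range.
    edges = [t[0]] + [(t[i] + t[i + 1]) / 2 for i in range(n - 1)] + [t[-1]]
    best, last = [], []
    for r in range(n):
        fit_max, i_max = -math.inf, 0
        for i in range(r + 1):           # try block spanning events i..r
            width = edges[r + 1] - edges[i]
            if width <= 0:
                continue
            cnt = r - i + 1
            fitness = cnt * (math.log(cnt) - math.log(width)) - ncp_prior
            if i > 0:
                fitness += best[i - 1]   # best partition of the prefix
            if fitness > fit_max:
                fit_max, i_max = fitness, i
        best.append(fit_max)
        last.append(i_max)
    # Walk back through the optimal change points.
    change_points, r = [], n
    while r > 0:
        change_points.append(last[r - 1])
        r = last[r - 1]
    return [edges[i] for i in reversed(change_points)] + [edges[-1]]
```

    On a synthetic light curve with a rate jump (sparse events followed by dense events), the sketch recovers two blocks with an edge near the jump.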

  11. Communications Effects Server (CES) Model for Systems Engineering Research

    DTIC Science & Technology

    2012-01-31

    [Figure residue from the original report: the CES architecture diagram shows blocks for third-party visualization, analysis, and text-editor tools; logical interfaces to HLA, DIS, and STK tools; execution kernels; and an Architect GUI component representing the main network design and visualization function, with CES able to interoperate with STK when running simulations.]

  12. Combined Sciatic and Lumbar Plexus Nerve Blocks for the Analgesic Management of Hip Arthroscopy Procedures: A Retrospective Review.

    PubMed

    Jaffe, J Douglas; Morgan, Theodore Ross; Russell, Gregory B

    2017-06-01

    Hip arthroscopy is a minimally invasive alternative to open hip surgery. Despite its minimally invasive nature, there can still be significant reported pain following these procedures. The impact of combined sciatic and lumbar plexus nerve blocks on postoperative pain scores and opioid consumption in patients undergoing hip arthroscopy was investigated. A retrospective analysis of 176 patients revealed that compared with patients with no preoperative peripheral nerve block, significant reductions in pain scores to 24 hours were reported and decreased opioid consumption during the post anesthesia care unit (PACU) stay was recorded; no significant differences in opioid consumption out to 24 hours were discovered. A subgroup analysis comparing two approaches to the sciatic nerve block in patients receiving the additional lumbar plexus nerve block failed to reveal a significant difference for this patient population. We conclude that peripheral nerve blockade can be a useful analgesic modality for patients undergoing hip arthroscopy.

  13. Identification of methylation haplotype blocks aids in deconvolution of heterogeneous tissue samples and tumor tissue-of-origin mapping from plasma DNA.

    PubMed

    Guo, Shicheng; Diep, Dinh; Plongthongkum, Nongluk; Fung, Ho-Lim; Zhang, Kang; Zhang, Kun

    2017-04-01

    Adjacent CpG sites in mammalian genomes can be co-methylated owing to the processivity of methyltransferases or demethylases, yet discordant methylation patterns have also been observed, which are related to stochastic or uncoordinated molecular processes. We focused on a systematic search and investigation of regions in the full human genome that show highly coordinated methylation. We defined 147,888 blocks of tightly coupled CpG sites, called methylation haplotype blocks, after analysis of 61 whole-genome bisulfite sequencing data sets and validation with 101 reduced-representation bisulfite sequencing data sets and 637 methylation array data sets. Using a metric called methylation haplotype load, we performed tissue-specific methylation analysis at the block level. Subsets of informative blocks were further identified for deconvolution of heterogeneous samples. Finally, using methylation haplotypes we demonstrated quantitative estimation of tumor load and tissue-of-origin mapping in the circulating cell-free DNA of 59 patients with lung or colorectal cancer.
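    A sketch of how a methylation-haplotype-load style score can be computed over reads covering one CpG block. This is an illustrative reading of the length-weighted definition (weight w_l = l on fully methylated substrings of length l); details of the published metric may differ.

```python
def methylation_haplotype_load(haplotypes):
    """Length-weighted coordinated-methylation score for one CpG block.

    Each haplotype is a string of '1' (methylated) / '0' (unmethylated)
    calls from one read. For each substring length l, take the fraction
    of all length-l substrings that are fully methylated, weight it by
    w_l = l, and normalize; long coordinated runs score highest.
    """
    max_len = max(len(h) for h in haplotypes)
    num = den = 0.0
    for l in range(1, max_len + 1):
        subs = [h[i:i + l] for h in haplotypes for i in range(len(h) - l + 1)]
        if not subs:
            continue
        fully_methylated = sum(s == "1" * l for s in subs)
        num += l * fully_methylated / len(subs)
        den += l
    return num / den

# Fully methylated reads score 1, fully unmethylated reads score 0:
assert methylation_haplotype_load(["1111", "1111"]) == 1.0
assert methylation_haplotype_load(["0000"]) == 0.0
```

    Discordant reads fall strictly between the two extremes, which is what makes the score useful for picking out tightly coupled blocks.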

  14. Transversus Abdominis Plane Block for Post Hysterectomy Pain: A Systematic Review and Meta-Analysis.

    PubMed

    Bacal, Vanessa; Rana, Urvi; McIsaac, Daniel I; Chen, Innie

    2018-04-30

    The objective of this study was to assess the efficacy of transversus abdominis plane (TAP) blocks in pain management among women who undergo elective hysterectomy for benign pathology in both open and minimally invasive surgeries. We performed a systematic review by searching for bibliographic citations from Medline, Embase, and the Cochrane Library. MeSH headings for TAP blocks and hysterectomy were combined and restricted to the English language. We included RCTs comparing TAP blocks to placebo or no block in patients who underwent elective hysterectomy. Pain was measured using a visual analog score (VAS) on a scale of 0-100. We calculated pooled mean differences in VAS and total morphine consumption at 2 and 24 hours by performing a random-effects meta-analysis. Fourteen studies met the inclusion criteria, comprising 855 participants. At 2 hours, mean VAS scores for patients who underwent TAP blocks were significantly lower after both total abdominal hysterectomy (TAH) (mean difference -14.97 [95% CI, -20.35 to -9.59]) and total laparoscopic hysterectomy (TLH) (-18.16 [95% CI, -34.78 to -1.53]) compared to placebo or no block. Pain scores at 24 hours for patients who underwent TAP blocks were significantly lower after both TAH (-10.09 [95% CI, -17.35 to -2.83]) and TLH (-9.12 [95% CI, -18.12 to -0.13]) compared to placebo or no block. The mean difference in morphine consumption was -9.53 mg (95% CI, -15.43 to -3.63) for TAH and -3.15 mg (95% CI, -8.41 to 2.12) for TLH. In conclusion, TAP blocks provide significant early and delayed postoperative pain control compared to placebo or no block among women who undergo hysterectomy. There was reduced morphine consumption among patients who underwent TAH, but not after TLH. Copyright © 2018. Published by Elsevier Inc.
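    Pooled mean differences like those above come from a random-effects meta-analysis. A generic DerSimonian-Laird sketch follows; the study effects and variances below are made-up illustrative numbers, not the review's data.

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled mean difference (DerSimonian-Laird method).

    `effects` are per-study mean differences (e.g. VAS deltas) and
    `variances` their squared standard errors. Returns the pooled
    estimate and its 95% CI.
    """
    w = [1.0 / v for v in variances]                 # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies of a VAS mean difference:
pooled, (lo, hi) = dersimonian_laird([-15.0, -10.0, -20.0], [9.0, 16.0, 25.0])
assert lo < pooled < hi
```

    When the heterogeneity statistic Q does not exceed its degrees of freedom, tau-squared is clamped to zero and the estimate collapses to the fixed-effect (inverse-variance) pool.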

  15. Improving the Curie depth estimation through optimizing the spectral block dimensions of the aeromagnetic data in the Sabalan geothermal field

    NASA Astrophysics Data System (ADS)

    Akbar, Somaieh; Fathianpour, Nader

    2016-12-01

    The Curie point depth is of great importance in characterizing geothermal resources. In this study, a Curie iso-depth map was produced using the well-known method of dividing the aeromagnetic dataset into overlapping blocks and analyzing the power spectral density of each block separately. Determining the optimum block dimension is vital to improving the resolution and accuracy of the Curie point depth estimates. To investigate the relation between the optimal block size and the power spectral density, forward magnetic modeling was implemented on an artificial prismatic body with specified characteristics. The top, centroid, and bottom depths of the body were estimated by the spectral analysis method for different block dimensions. The results showed that the optimal block size can be taken as the smallest block size whose corresponding power spectrum exhibits an absolute maximum at small wavenumbers. The Curie depth map of the Sabalan geothermal field and its surrounding areas in northwestern Iran was produced using a grid of 37 blocks with dimensions ranging from 10 × 10 to 50 × 50 km2 and at least 50% overlap with adjacent blocks. The Curie point depth was estimated in the range of 5 to 21 km. The promising areas, with Curie point depths of less than 8.5 km, are located around Mount Sabalan and encompass more than 90% of the known geothermal resources in the study area. Moreover, the Curie point depth estimated by the improved spectral analysis is in good agreement with the depth calculated from the thermal gradient data measured in one of the exploratory wells in the region.
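    The per-block spectral step behind such estimates can be sketched with the standard slope method on a radially averaged power spectrum: the top depth comes from the slope of ln(P(k)^(1/2)) versus k, the centroid depth from the slope of ln(P(k)^(1/2)/k) at low wavenumbers, and the Curie (bottom) depth from Zb = 2*Z0 - Zt. The synthetic spectra below are illustrative; this is not the authors' block-size optimization.

```python
import math

def depth_from_spectrum(wavenumbers, power, centroid=False):
    """Estimate a magnetic source depth from a radially averaged spectrum.

    With k in cycles/km, the top depth Zt is -slope/(2*pi) of
    ln(sqrt(P)) vs k; with centroid=True the same fit on ln(sqrt(P)/k)
    gives the centroid depth Z0. Ordinary least-squares slope.
    """
    y = [math.log(math.sqrt(p) / (k if centroid else 1.0))
         for k, p in zip(wavenumbers, power)]
    n = len(wavenumbers)
    kbar = sum(wavenumbers) / n
    ybar = sum(y) / n
    slope = (sum((k - kbar) * (yi - ybar) for k, yi in zip(wavenumbers, y))
             / sum((k - kbar) ** 2 for k in wavenumbers))
    return -slope / (2 * math.pi)

# Synthetic spectrum for a source top at 5 km: P(k) = exp(-4*pi*5*k)
ks = [0.05 * i for i in range(1, 10)]
ps = [math.exp(-4 * math.pi * 5.0 * k) for k in ks]
zt = depth_from_spectrum(ks, ps)   # recovers 5 km
```

    Given a top estimate Zt and a centroid estimate Z0 from the same block, the bottom (Curie) depth follows as Zb = 2*Z0 - Zt.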

  16. Convolutional neural network for road extraction

    NASA Astrophysics Data System (ADS)

    Li, Junping; Ding, Yazhou; Feng, Fajie; Xiong, Baoyu; Cui, Weihong

    2017-11-01

    In this paper, a convolutional neural network with large input blocks and small output blocks was used to extract roads. To capture the complex road characteristics of the study area, the deep convolutional neural network VGG19 was applied to road extraction. Based on an analysis of the effects of different input and output block sizes on extraction quality, the votes of several deep convolutional neural networks were combined into the final road prediction. The study image was a fused GF-2 panchromatic and multispectral image of Yinchuan. The precision of road extraction was 91%. The experiments showed that model averaging can improve accuracy to some extent. The paper also offers advice on the choice of input and output block sizes.

  17. Association of cardiac implantable electronic devices with survival in bifascicular block and prolonged PR interval on electrocardiogram.

    PubMed

    Moulki, Naeem; Kealhofer, Jessica V; Benditt, David G; Gravely, Amy; Vakil, Kairav; Garcia, Santiago; Adabag, Selcuk

    2018-06-16

    Bifascicular block and prolonged PR interval on the electrocardiogram (ECG) have been associated with complete heart block and sudden cardiac death. We sought to determine whether cardiac implantable electronic devices (CIED) improve survival in these patients. We assessed survival in relation to CIED status among 636 consecutive patients with bifascicular block and prolonged PR interval on the ECG. In survival analyses, CIED was considered as a time-varying covariate. Average age was 76 ± 9 years, and 99% of the patients were men. A total of 167 (26%) underwent CIED (127 pacemaker only) implantation at baseline (n = 23) or during follow-up (n = 144). During 5.4 ± 3.8 years of follow-up, 83 (13%) patients developed complete or high-degree atrioventricular block and 375 (59%) died. Patients with a CIED had longer survival compared to those without a CIED in the traditional, static analysis (log-rank p < 0.0001) but not when CIED was considered as a time-varying covariate (log-rank p = 0.76). In the multivariable model, patients with a CIED had a 34% lower risk of death (hazard ratio 0.66, 95% confidence interval 0.52-0.83; p = 0.001) than those without a CIED in the traditional analysis but not in the time-varying covariate analysis (hazard ratio 1.05, 95% confidence interval 0.79-1.38; p = 0.76). Results did not change in the subgroup with a pacemaker only. Bifascicular block and prolonged PR interval on ECG are associated with a high incidence of complete atrioventricular block and mortality. However, CIED implantation does not have a significant influence on survival when the time-varying nature of CIED implantation is considered.

  18. Detecting most influencing courses on students grades using block PCA

    NASA Astrophysics Data System (ADS)

    Othman, Osama H.; Gebril, Rami Salah

    2014-12-01

    One of the modern solutions for dealing with a large number of variables in statistical analyses is Block Principal Component Analysis (Block PCA). This modified technique can be used to reduce the vertical dimension (variables) of the data matrix Xn×p by selecting a smaller number of variables (say m) containing most of the statistical information. These selected variables can then be employed in further investigations and analyses. Block PCA is an adapted multistage version of the original PCA. It involves the application of Cluster Analysis (CA) and variable selection through sub-principal component scores (PCs). The application of Block PCA in this paper is a modified version of the original work of Liu et al (2002). The main objective was to apply PCA to each group of variables (established using cluster analysis) instead of to the whole large set of variables, which has proved unreliable. In this work, Block PCA is used to reduce the size of a large data matrix ((n = 41) × (p = 251)) consisting of the Grade Point Averages (GPA) of students in 251 courses (variables) in the Faculty of Science at Benghazi University. In other words, we construct a smaller analytical data matrix of the students' GPAs with fewer variables containing most of the variation (statistical information) in the original database. By applying Block PCA, 12 courses were found to `absorb' most of the variation or influence in the original data matrix, and hence are worth keeping for future statistical exploration and analytical studies. In addition, the course Independent Study (Math.) was found to be the most influential course on students' GPA among the 12 selected courses.
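
    The selection step of the procedure, running PCA within each variable cluster and keeping the variable with the largest loading on the block's first principal component, can be sketched as follows. The matrix and the three-course blocks are synthetic stand-ins for the 41 × 251 GPA matrix, not the actual data.

```python
import math
import random

random.seed(0)

def first_pc(rows):
    """First principal component of the standardized columns of `rows`,
    found by power iteration on the covariance matrix."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    sds = [math.sqrt(sum((r[j] - means[j]) ** 2 for r in rows) / n)
           for j in range(p)]
    z = [[(r[j] - means[j]) / sds[j] for j in range(p)] for r in rows]
    cov = [[sum(zi[a] * zi[b] for zi in z) / n for b in range(p)]
           for a in range(p)]
    v = [1.0] * p
    for _ in range(200):                  # power iteration
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Synthetic "GPA" matrix: 41 students x 9 courses in 3 correlated clusters
n, blocks = 41, [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
latent = [[random.gauss(0, 1) for _ in range(3)] for _ in range(n)]
X = [[latent[i][j // 3] + 0.3 * random.gauss(0, 1) for j in range(9)]
     for i in range(n)]

selected = []
for cols in blocks:               # PCA per cluster, not on all columns at once
    pc1 = first_pc([[row[j] for j in cols] for row in X])
    selected.append(cols[max(range(len(cols)), key=lambda j: abs(pc1[j]))])
print(selected)                   # one representative course per block
```

    Running PCA per block keeps each eigen-decomposition small and avoids the instability of a single PCA on a 41 × 251 matrix where variables far outnumber observations.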

  19. Effect of education on listening comprehension of sentences on healthy elderly: analysis of number of correct responses and task execution time.

    PubMed

    Silagi, Marcela Lima; Rabelo, Camila Maia; Schochat, Eliane; Mansur, Letícia Lessa

    2017-11-13

    To analyze the effect of education on sentence listening comprehension in cognitively healthy elderly adults. A total of 111 healthy elderly adults, aged 60-80 years and of both genders, were divided into two groups according to educational level: low education (0-8 years of formal education) and high education (≥9 years of formal education). The participants were assessed using the Revised Token Test, an instrument that supports the evaluation of auditory comprehension of commands with different working memory and syntactic complexity demands. The indicators used for performance analysis were the number of correct responses (accuracy analysis) and the task execution time (temporal analysis) in the different blocks. The low-education group had fewer correct responses than the high-education group on all blocks of the test. In the temporal analysis, participants with low education had longer execution times for commands in the first four blocks, which are related to working memory. However, the two groups had similar execution times for the blocks more closely related to syntactic comprehension. Education influenced sentence listening comprehension in the elderly. The temporal analysis allowed us to infer the relationship between comprehension and other cognitive abilities, and to observe that the low-education elderly did not use effective compensation strategies to improve their performance on the task. Therefore, low educational level, associated with aging, may increase the risk of language decline.

  20. Methods for assessing the stability of slopes during earthquakes-A retrospective

    USGS Publications Warehouse

    Jibson, R.W.

    2011-01-01

    During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis. © 2010.

  1. Development of the Functional Flow Block Diagram for the J-2X Rocket Engine System

    NASA Technical Reports Server (NTRS)

    White, Thomas; Stoller, Sandra L.; Greene, William D.; Christenson, Rick L.; Bowen, Barry C.

    2007-01-01

    The J-2X program calls for the upgrade of the Apollo-era Rocketdyne J-2 engine to higher power levels, using new materials and manufacturing techniques, and with more restrictive safety and reliability requirements than prior human-rated engines in NASA history. Such requirements demand a comprehensive systems engineering effort to ensure success. Pratt & Whitney Rocketdyne system engineers performed a functional analysis of the engine to establish the functional architecture. J-2X functions were captured in six major operational blocks. Each block was divided into sub-blocks or states. In each sub-block, the functions necessary to perform each state were determined. A functional engine schematic consistent with the fidelity of the system model was defined for this analysis. The blocks, sub-blocks, and functions were sequentially numbered to differentiate the states in which the functions were performed and to indicate the sequence of events. The Engine System was functionally partitioned to provide separate and unique functional operators. Establishing unique functional operators as work output of the System Architecture process is novel in liquid propulsion engine design. Each functional operator was described such that its unique functionality was identified. The decomposed functions were then allocated to the functional operators, both of which were inputs to the subsystem and component performance specifications. PWR also used a novel approach to identify and map the engine functional requirements to customer-specified functions. The final result was a comprehensive Functional Flow Block Diagram (FFBD) for the J-2X Engine System, decomposed to the component level and mapped to all functional requirements. This FFBD greatly facilitates component specification development, providing a well-defined trade space for functional trades at the subsystem and component level. It also provides a framework for function-based failure modes and effects analysis (FMEA) and a rigorous baseline for the functional architecture.

  2. Polybenzimidazole block copolymers for fuel cell: synthesis and studies of block length effects on nanophase separation, mechanical properties, and proton conductivity of PEM.

    PubMed

    Maity, Sudhangshu; Jana, Tushar

    2014-05-14

    A series of meta-polybenzimidazole-block-para-polybenzimidazole (m-PBI-b-p-PBI) segmented block copolymers of PBI were synthesized with various structural motifs and block lengths by condensing diamine-terminated meta-PBI (m-PBI-Am) and acid-terminated para-PBI (p-PBI-Ac) oligomers. NMR studies and the existence of two distinct glass transition temperatures (Tg), obtained from dynamic mechanical analysis (DMA), unequivocally confirmed the formation of a block copolymer structure through the present polymerization methodology. Appropriate and careful selection of oligomer chain lengths enabled us to tailor the block length of the copolymers and to produce a variety of structural motifs. Increasingly distinct Tg peaks at higher block lengths of the segmented block structure indicated decreased phase mixing between the meta-PBI and para-PBI blocks, which in turn resulted in nanophase-segregated domains. The proton conductivities of proton exchange membranes (PEM) developed from phosphoric acid (PA)-doped block copolymer membranes were found to increase substantially with increasing block length, even though the PA loading of these membranes did not change appreciably with block length. For example, when the molecular weight (Mn) of the blocks was increased from 1000 to 5500, the proton conductivity at 160 °C of the resulting copolymers increased from 0.05 to 0.11 S/cm. Higher block length induced nanophase separation between the blocks by creating less of a morphological barrier within each block, which facilitated proton movement and hence a higher proton conductivity of the PEM. The structural variety also influenced the phase separation and proton conductivity. In comparison to the meta-para random copolymers reported earlier, the present meta-para segmented block copolymers were found to be more suitable for PBI-based PEMs.

  3. Variation block-based genomics method for crop plants.

    PubMed

    Kim, Yul Ho; Park, Hyang Mi; Hwang, Tae-Young; Lee, Seuk Ki; Choi, Man Soo; Jho, Sungwoong; Hwang, Seungwoo; Kim, Hak-Min; Lee, Dongwoo; Kim, Byoung-Chul; Hong, Chang Pyo; Cho, Yun Sung; Kim, Hyunmin; Jeong, Kwang Ho; Seo, Min Jung; Yun, Hong Tai; Kim, Sun Lim; Kwon, Young-Up; Kim, Wook Han; Chun, Hye Kyung; Lim, Sang Jong; Shin, Young-Ah; Choi, Ik-Young; Kim, Young Sun; Yoon, Ho-Sung; Lee, Suk-Ha; Lee, Sunghoon

    2014-06-15

    In contrast with wild species, cultivated crop genomes consist of reshuffled recombination blocks, which occurred by crossing and selection processes. Accordingly, recombination block-based genomics analysis can be an effective approach for the screening of target loci for agricultural traits. We propose the variation block method, which is a three-step process for recombination block detection and comparison. The first step is to detect variations by comparing the short-read DNA sequences of the cultivar to the reference genome of the target crop. Next, sequence blocks with variation patterns are examined and defined. The boundaries between the variation-containing sequence blocks are regarded as recombination sites. All the assumed recombination sites in the cultivar set are used to split the genomes, and the resulting sequence regions are termed variation blocks. Finally, the genomes are compared using the variation blocks. The variation block method identified recurring recombination blocks accurately and successfully represented block-level diversities in the publicly available genomes of 31 soybean and 23 rice accessions. The practicality of this approach was demonstrated by the identification of a putative locus determining soybean hilum color. We suggest that the variation block method is an efficient genomics method for the recombination block-level comparison of crop genomes. We expect that this method will facilitate the development of crop genomics by bringing genomics technologies to the field of crop breeding.
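
    The block-splitting step of the method, merging consecutive sites that share the same cross-cultivar variant pattern and treating each change of pattern as an assumed recombination boundary, can be sketched as follows; the site patterns below are made up for illustration.

```python
def variation_blocks(patterns):
    """Split a genome into variation blocks: consecutive sites whose
    cross-cultivar variant pattern is identical are merged, and a change
    of pattern marks an assumed recombination boundary. `patterns` is a
    list of tuples, one per site, giving each cultivar's call
    (0 = matches reference, 1 = variant)."""
    blocks, start = [], 0
    for i in range(1, len(patterns)):
        if patterns[i] != patterns[i - 1]:   # pattern change => new block
            blocks.append((start, i - 1))
            start = i
    blocks.append((start, len(patterns) - 1))
    return blocks

# two cultivars, six sites: three blocks emerge
sites = [(0, 1), (0, 1), (0, 1), (1, 1), (1, 1), (0, 0)]
print(variation_blocks(sites))  # → [(0, 2), (3, 4), (5, 5)]
```

    Comparing genomes block-by-block rather than site-by-site is what lets the method represent block-level diversity across the 31 soybean and 23 rice accessions.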

  4. Increased prevalence of third-degree atrioventricular block in patients with type II diabetes mellitus.

    PubMed

    Movahed, Mohammad-Reza; Hashemzadeh, Mehrtash; Jamal, M Mazen

    2005-10-01

    Diabetes mellitus (DM) is a major risk factor for cardiovascular disease and mortality. There is some evidence that third-degree atrioventricular (AV) block occurs more commonly in patients with DM. In this study, we evaluated any possible association between DM and third-degree AV block using International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes in a very large inpatient database. We used patient treatment files containing discharge diagnoses, coded with ICD-9 codes, for inpatient treatment at all Veterans Health Administration hospitals. The cohort was stratified using the ICD-9-CM code for DM (n = 293,124), a control group with hypertension but no DM (n = 552,623), and the ICD-9 codes for third-degree AV block (426.0) and smoking (305.1, V15.82). We performed multivariate analysis adjusting for coronary artery disease, congestive heart failure, smoking, and hyperlipidemia. Continuous and binary variables were analyzed using chi-square and Fisher exact tests. A third-degree AV block diagnosis was present in 3,240 DM patients (1.1%) vs 3,367 patients (0.6%) in the control group. In multivariate analysis, DM remained strongly associated with third-degree AV block (odds ratio, 3.1; 95% confidence interval, 3.0 to 3.3; p < 0.0001). Third-degree AV block occurs significantly more often in patients with DM. This finding may, in part, explain the high cardiovascular mortality in DM patients.

  5. Application and research of block caving in Pulang copper mine

    NASA Astrophysics Data System (ADS)

    Ge, Qifa; Fan, Wenlu; Zhu, Weigen; Chen, Xiaowei

    2018-01-01

    The application of block caving in mines shows significant advantages in scale, cost, and efficiency; block caving is therefore worth promoting in mines that meet the requirements for natural caving. Because of the large scale of production and low ore grade of the Pulang copper mine in China, comprehensive analysis and research were conducted on rock mechanics, mining sequence, undercutting, and the stability of the bottom structure, with the aims of raising mine benefit and maximizing the recovery of mineral resources. This study concludes that block caving is well suited to the Pulang copper mine.

  6. Harmony of spinning conformal blocks

    NASA Astrophysics Data System (ADS)

    Schomerus, Volker; Sobko, Evgeny; Isachenkov, Mikhail

    2017-03-01

    Conformal blocks for correlation functions of tensor operators play an increasingly important role for the conformal bootstrap programme. We develop a universal approach to such spinning blocks through the harmonic analysis of certain bundles over a coset of the conformal group. The resulting Casimir equations are given by a matrix version of the Calogero-Sutherland Hamiltonian that describes the scattering of interacting spinning particles in a 1-dimensional external potential. The approach is illustrated in several examples including fermionic seed blocks in 3D CFT where they take a very simple form.

  7. The "SIMCLAS" Model: Simultaneous Analysis of Coupled Binary Data Matrices with Noise Heterogeneity between and within Data Blocks

    ERIC Educational Resources Information Center

    Wilderjans, Tom F.; Ceulemans, E.; Van Mechelen, I.

    2012-01-01

    In many research domains different pieces of information are collected regarding the same set of objects. Each piece of information constitutes a data block, and all these (coupled) blocks have the object mode in common. When analyzing such data, an important aim is to obtain an overall picture of the structure underlying the whole set of coupled…

  8. Role of Polyalanine Domains in β-Sheet Formation in Spider Silk Block Copolymers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabotyagova, O.; Cebe, P; Kaplan, D

    2010-01-01

    Genetically engineered spider silk-like block copolymers were studied to determine the influence of polyalanine domain size on secondary structure. The role of polyalanine block distribution in β-sheet formation was explored using FT-IR and WAXS. The number of polyalanine blocks had a direct effect on the formation of crystalline β-sheets, reflected in the change in crystallinity index as the number of polyalanine blocks increased. WAXS analysis confirmed the crystalline nature of the sample with the largest number of polyalanine blocks. This approach provides a platform for further exploration of the role of specific amino acid chemistries in regulating the assembly of β-sheet secondary structures, leading to options to regulate material properties through manipulation of this key component in spider silks.

  9. Study of mathematical modeling of communication systems transponders and receivers

    NASA Technical Reports Server (NTRS)

    Walsh, J. R.

    1972-01-01

    The modeling of communication receivers is described at both the circuit detail level and at the block level. The largest effort was devoted to developing new models at the block modeling level. The available effort did not permit full development of all of the block modeling concepts envisioned, but idealized blocks were developed for signal sources, a variety of filters, limiters, amplifiers, mixers, and demodulators. These blocks were organized into an operational computer simulation of communications receiver circuits identified as the frequency and time circuit analysis technique (FATCAT). The simulation operates in both the time and frequency domains, and permits output plots or listings of either frequency spectra or time waveforms from any model block. Transfer between domains is handled with a fast Fourier transform algorithm.
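
    The time/frequency domain transfer the simulation relies on can be sketched with a radix-2 FFT: a two-tone source block is transformed to the frequency domain, passed through an ideal low-pass filter block, and transformed back. FATCAT itself is not available here; this is a from-scratch illustration of the same round trip.

```python
import cmath
import math

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + tw[k] for k in range(n // 2)] + \
           [even[k] - tw[k] for k in range(n // 2)]

def ifft(X):
    """Inverse FFT via the conjugation trick."""
    n = len(X)
    y = fft([v.conjugate() for v in X])
    return [v.conjugate() / n for v in y]

fs, n = 256.0, 256
t = [i / fs for i in range(n)]
# signal-source block: a 50 Hz tone plus an unwanted 100 Hz tone
x = [math.sin(2 * math.pi * 50 * ti) + 0.5 * math.sin(2 * math.pi * 100 * ti)
     for ti in t]

X = fft([complex(v) for v in x])                 # time -> frequency domain
freqs = [k * fs / n for k in range(n)]
# ideal low-pass filter block: zero every bin at or above 75 Hz
Y = [Xk if min(f, fs - f) < 75.0 else 0.0 for Xk, f in zip(X, freqs)]
y = [v.real for v in ifft(Y)]                    # frequency -> time domain
# y is now (numerically) the pure 50 Hz tone
```

    Filtering as a pointwise multiply in the frequency domain is exactly why a simulator like the one described keeps both domains available and swaps between them with the FFT.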

  10. The blocking reagent optimization for the magnetoelastic biosensor

    NASA Astrophysics Data System (ADS)

    Hu, Jiajia; Chai, Yating; Horikawa, Shin; Wikle, Howard C.; Wang, Feng'en; Du, Songtao; Chin, Bryan A.; Hu, Jing

    2015-06-01

    The wireless phage-based magnetoelastic (ME) biosensor has proven promising for real-time detection of pathogenic bacteria on fresh produce. The ME biosensor consists of a freestanding ME resonator as the signal transducer and filamentous phage as the biomolecular-recognition element, which can specifically bind to a pathogen of interest. Owing to the Joule magnetostriction effect, the biosensor can be driven into mechanical resonance when subjected to a time-varying magnetic field alternating at the sensor's resonant frequency. Upon attachment of the target pathogen, the mass of the biosensor increases, thereby decreasing its resonant frequency. This paper presents an investigation of blocking-reagent immobilization for detecting Salmonella Typhimurium on fresh food surfaces. Three different blocking reagents (BSA, SuperBlock blocking buffer, and blocker BLOTTO) were used and compared. An optical microscope was used to observe bacterial cell binding. A Student's t-test was used for statistical analysis of the experimental results. The results show that SuperBlock blocking buffer and blocker BLOTTO have much better blocking performance than the commonly used BSA.

  11. Phase response curves for models of earthquake fault dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franović, Igor, E-mail: franovic@ipb.ac.rs; Kostić, Srdjan; Perc, Matjaž

    We systematically study effects of external perturbations on models describing earthquake fault dynamics. The latter are based on the framework of the Burridge-Knopoff spring-block system, including the cases of a simple mono-block fault, as well as the paradigmatic complex faults made up of two identical or distinct blocks. The blocks exhibit relaxation oscillations, which are representative for the stick-slip behavior typical for earthquake dynamics. Our analysis is carried out by determining the phase response curves of first and second order. For a mono-block fault, we consider the impact of a single and two successive pulse perturbations, further demonstrating how the profile of phase response curves depends on the fault parameters. For a homogeneous two-block fault, our focus is on the scenario where each of the blocks is influenced by a single pulse, whereas for heterogeneous faults, we analyze how the response of the system depends on whether the stimulus is applied to the block having a shorter or a longer oscillation period.

  12. Duration of motor block with intrathecal ropivacaine versus bupivacaine for caesarean section: a meta-analysis.

    PubMed

    Malhotra, R; Johnstone, C; Halpern, S; Hunter, J; Banerjee, A

    2016-08-01

    Bupivacaine is a commonly used local anaesthetic for spinal anaesthesia for caesarean section, but may produce prolonged motor block, delaying discharge from the post-anaesthesia care unit. Ropivacaine may have a shorter time to recovery of motor function compared with bupivacaine. We performed a meta-analysis to assess the time difference in duration of motor block with intrathecal ropivacaine compared with bupivacaine for caesarean section. We searched MEDLINE, EMBASE and Cochrane Central Register of Controlled Trials databases for randomised controlled trials comparing ropivacaine with bupivacaine in parturients undergoing elective caesarean section under spinal anaesthesia. The primary outcome was the duration of motor block. Secondary outcomes included the time to onset of sensory block, need for conversion to general anaesthesia and the incidence of hypotension. Thirteen trials comprising 743 spinal anaesthetics were included. Intrathecal ropivacaine resulted in a reduced duration of motor block, regressing 35.7 min earlier compared with intrathecal bupivacaine (P<0.00001). There was no difference in the time to onset of sensory block (P=0.25) or the incidence of hypotension (P=0.10). Limited data suggested no difference in the rate of conversion to general anaesthesia, but an earlier request for postoperative analgesia with ropivacaine. Compared with bupivacaine, intrathecal ropivacaine is associated with more rapid recovery of motor block despite similar sensory properties and no increased rate of conversion to general anaesthesia. This may be useful in centres in which recovery of motor block is a criterion for discharge from the post-anaesthesia care unit. However, small numbers of trials and significant heterogeneity limit the interpretation of our results. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Opioid-sparing effects of the thoracic interfascial plane blocks: A meta-analysis of randomized controlled trials.

    PubMed

    Singh, Preet Mohinder; Borle, Anuradha; Kaur, Manpreet; Trikha, Anjan; Sinha, Ashish

    2018-01-01

    Thoracic interfascial plane blocks and modifications (PECS) have recently gained popularity for their analgesic potential during breast surgery. We evaluate and consolidate the evidence on the opioid-sparing effect of PECS blocks in comparison with conventional intravenous analgesia (IVA) and paravertebral block (PVB). Prospective, randomized controlled trials comparing PECS block to conventional IVA or PVB in patients undergoing breast surgery, published up to June 2017, were searched in the medical databases. Comparisons were made for 24-h postoperative morphine consumption and intraoperative fentanyl-equivalent consumption. The final analysis included nine trials (PECS vs. IVA, 4 trials; PECS vs. PVB, 5 trials). PECS block decreased intraoperative fentanyl consumption relative to IVA by 49.20 mcg (95% confidence interval [CI] = 42.67-55.74) (I2 = 98.47%, P < 0.001) and relative to PVB by 15.88 mcg (95% CI = 12.95-18.81) (I2 = 95.51%, P < 0.001). Postoperative 24-h morphine consumption with PECS block was lower than with IVA by 7.66 mg (95% CI = 6.23-9.10) (I2 = 63.15%, P < 0.001) but higher than in the PVB group by 1.26 mg (95% CI = 0.91-1.62) (I2 = 99.53%, P < 0.001). Two cases of pneumothorax were reported with PVB, and no complication was reported in any other group. The use of PECS block and its modifications with general anesthesia for breast surgery has a significant opioid-sparing effect intraoperatively and during the first 24 h after surgery. It also has a higher intraoperative opioid-sparing effect than PVB. During the first postoperative day, PVB has slightly more morphine-sparing potential, which may, however, be associated with higher complication rates. The present PECS block techniques show marked interstudy variation and need standardization.

  14. A new method for detection and discrimination of Pepino mosaic virus isolates using high resolution melting analysis of the triple gene block 3.

    PubMed

    Hasiów-Jaroszewska, Beata; Komorowska, Beata

    2013-10-01

    Existing diagnostic methods distinguish different Pepino mosaic virus (PepMV) genotypes but do not detect sequence variation in particular gene segments. The necrotic and non-necrotic isolates (pathotypes) of PepMV share 99% sequence similarity and differ from each other at a single nucleotide site in triple gene block 3. In this study, a combination of real-time reverse transcription polymerase chain reaction and high resolution melting curve analysis of triple gene block 3 was developed for simultaneous detection and differentiation of PepMV pathotypes. The triple gene block 3 region carrying an A → G transition was amplified using two primer pairs from twelve virus isolates and subjected to high resolution melting curve analysis. The results showed two distinct melting curve profiles, one for each pathotype, and indicated that the high resolution melting method can readily differentiate between necrotic and non-necrotic PepMV pathotypes. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Factor levels for density comparisons in the split-block spacing design

    Treesearch

    Kurt H. Riitters; Brian J. Stanton; Robbert H. Walkup

    1989-01-01

    The split-block spacing design is a compact test of the effects of within-row and between-row spacings. But the sometimes awkward analysis of density (i.e., trees/ha) effects may deter use of the design. The analysis is simpler if the row spacings are chosen to obtain a balanced set of equally spaced density and rectangularity treatments. A spacing study in poplar (...

  16. Analysis of blocking probability for OFDM-based variable bandwidth optical network

    NASA Astrophysics Data System (ADS)

    Gong, Lei; Zhang, Jie; Zhao, Yongli; Lin, Xuefeng; Wu, Yuyao; Gu, Wanyi

    2011-12-01

    Orthogonal Frequency Division Multiplexing (OFDM) has recently been proposed as a modulation technique for optical networks. Owing to its good spectral efficiency, flexibility, and tolerance to impairments, optical OFDM is much more flexible than traditional WDM systems and enables elastic bandwidth transmission, making it a likely direction for future optical networking. In OFDM-based optical networks, analysis of the blocking probability is of great significance for network assessment. Current research on WDM networks is largely based on fixed bandwidth; to accommodate future traffic and the fast-changing development of optical networks, our study addresses variable-bandwidth OFDM-based optical networks. Using mathematical analysis and theoretical derivation, and building on existing theory and algorithms, we study the blocking probability of variable-bandwidth optical networks and then build a model for it.
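
    The abstract does not state its blocking model, but the classical Erlang B formula is the usual fixed-bandwidth baseline that such analyses extend. A minimal sketch using its numerically stable recurrence; the offered load and slot count are illustrative, not from the paper.

```python
def erlang_b(offered_load, servers):
    """Erlang B blocking probability via the stable recurrence
    B(0) = 1, B(n) = a*B(n-1) / (n + a*B(n-1)),
    where a is the offered load in erlangs and n counts servers
    (here, wavelength/subcarrier slots)."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# e.g. 80 erlangs of traffic offered to 100 slots
print(round(erlang_b(80.0, 100), 4))
```

    Variable-bandwidth OFDM requests occupy several contiguous subcarriers at once, so the single-rate formula above serves only as the baseline case that a variable-bandwidth blocking model must reduce to.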

  17. Detecting dominant motion patterns in crowds of pedestrians

    NASA Astrophysics Data System (ADS)

    Saqib, Muhammad; Khan, Sultan Daud; Blumenstein, Michael

    2017-02-01

    As the population of the world increases, urbanization generates crowding situations that pose challenges to public safety and security. Manual analysis of crowded scenes is tedious and usually prone to errors. In this paper, we propose a novel crowd-analysis technique whose aim is to detect the dominant motion patterns in real-time videos. A motion field is generated by computing dense optical flow. The motion field is then divided into blocks. For each block, we adopt an intra-clustering algorithm to detect the different flows within the block. We then employ inter-clustering to cluster the flow vectors among different blocks. We evaluate the performance of our approach on several real-time videos. The experimental results show that our proposed method is capable of detecting distinct motion patterns in crowded videos and outperforms state-of-the-art methods.

  18. [Peribulbar block combined with general anesthesia in babies undergoing laser treatment for retinopathy of prematurity: a retrospective analysis].

    PubMed

    Pinho, Daniela Filipa Rodrigues; Real, Cátia; Ferreira, Leónia; Pina, Pedro

    2018-03-12

    Currently there is no agreement regarding the most adequate anesthetic technique for the treatment of retinopathy of prematurity. Peribulbar block may reduce the incidence of oculocardiac reflex and postoperative apnea. The goal of this study was to report the outcomes of peribulbar block combined with general anesthesia for laser treatment of retinopathy of prematurity in premature babies. A retrospective analysis was performed of the anesthetic records of all babies who underwent laser treatment for retinopathy of prematurity from January 2008 through December 2015 in a tertiary hospital. During that period, a total of six babies underwent laser treatment for retinopathy of prematurity, all under peribulbar block combined with general anesthesia. A single infratemporal injection of 0.15 mL.kg-1 per eye of ropivacaine 1% or 0.75% was performed. At the end of the procedure, all babies resumed spontaneous ventilation. No perioperative complications were reported. Peribulbar block was a safe anesthetic technique in the sample considered. Copyright © 2018 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.

  19. Electrical-power-system data base for consumables analysis. Volume 1: Electrical equipment list, activity blocks, and time lines

    NASA Technical Reports Server (NTRS)

    Pipher, M. D.; Green, P. A.; Wolfgram, D. F.

    1975-01-01

    A standardized data base is described which consists of a space shuttle electrical equipment list, activity blocks defining electrical equipment utilization, and activity-block time lines for specific mission analyses. Information is presented to facilitate utilization of the data base, to provide the basis for the electrical equipment utilization, and to enable interpretation of analyses based on the data contained herein.

  20. On Alternative Formulations for Linearised Miss Distance Analysis

    DTIC Science & Technology

    2013-05-01

    is traditionally employed by analysts as part of the solution process. To gain further insight into the nature of the missile-target engagement...a constant. Thus, following this process, the revised block diagram model for the linearised equations is presented in Figure 13. This model is... process is known as reducing the block to its fundamental closed loop form and has been achieved here using standard block diagram algebra. This

  1. Control analysis of lipid biosynthesis in tissue cultures from oil crops shows that flux control is shared between fatty acid synthesis and lipid assembly.

    PubMed Central

    Ramli, Umi S; Baker, Darren S; Quant, Patti A; Harwood, John L

    2002-01-01

    Top-Down (Metabolic) Control Analysis (TDCA) was used to examine, quantitatively, lipid biosynthesis in tissue cultures from two commercially important oil crops, olive (Olea europaea L.) and oil palm (Elaeis guineensis Jacq.). A conceptually simplified system was defined comprising two blocks of reactions: fatty acid synthesis (Block A) and lipid assembly (Block B), which produced and consumed, respectively, a common and unique system intermediate, cytosolic acyl-CoA. We manipulated the steady-state levels of the system intermediate by adding exogenous oleic acid and, using two independent assays, measured the effect of the addition on the system fluxes (J(A) and J(B)). These were the rates of incorporation of radioactivity (i) through Block A from [1-(14)C]acetate into fatty acids and (ii) via Block B from [U-(14)C]glycerol into complex lipids, respectively. The data showed that fatty acid formation (Block A) exerted higher control than lipid assembly (Block B) in both tissues, with the following group flux control coefficients: (i) oil palm: C(J(TL))(BlkA) = 0.64 +/- 0.05 and C(J(TL))(BlkB) = 0.36 +/- 0.05; (ii) olive: C(J(TL))(BlkA) = 0.57 +/- 0.10 and C(J(TL))(BlkB) = 0.43 +/- 0.10, where C(J(TL)) denotes the group flux control coefficient over the lipid biosynthesis flux (J(TL)) and the subscripts BlkA and BlkB refer to the defined blocks of the system, Block A and Block B. Nevertheless, because both parts of the lipid biosynthetic pathway exert significant flux control, we strongly suggest that manipulation of single enzyme steps will not affect product yield appreciably. The present study represents the first use of TDCA to examine the overall lipid biosynthetic pathway in any tissue, and its findings are of immediate academic and economic relevance to the yield and nutritional quality of oil crops. PMID:12023882

  2. Trial-dependent psychometric functions accounting for perceptual learning in 2-AFC discrimination tasks.

    PubMed

    Kattner, Florian; Cochrane, Aaron; Green, C Shawn

    2017-09-01

    The majority of theoretical models of learning consider learning to be a continuous function of experience. However, most perceptual learning studies use thresholds estimated by fitting psychometric functions to independent blocks, sometimes then fitting a parametric function to these block-wise estimated thresholds. Critically, such approaches tend to violate the basic principle that learning is continuous through time (e.g., by aggregating trials into large "blocks" for analysis that each assume stationarity, then fitting learning functions to these aggregated blocks). To address this discrepancy between base theory and analysis practice, here we instead propose fitting a parametric function to thresholds from each individual trial. In particular, we implemented a dynamic psychometric function whose parameters were allowed to change continuously with each trial, thus parameterizing nonstationarity. We fit the resulting continuous time parametric model to data from two different perceptual learning tasks. In nearly every case, the quality of the fits derived from the continuous time parametric model outperformed the fits derived from a nonparametric approach wherein separate psychometric functions were fit to blocks of trials. Because such a continuous trial-dependent model of perceptual learning also offers a number of additional advantages (e.g., the ability to extrapolate beyond the observed data; the ability to estimate performance on individual critical trials), we suggest that this technique would be a useful addition to each psychophysicist's analysis toolkit.
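
    A minimal sketch of the trial-dependent approach described above, assuming an exponentially decaying threshold inside a logistic 2-AFC psychometric function, fit by a coarse grid-search maximum likelihood (the paper's exact parameterization and fitting procedure may differ):

```python
import numpy as np

rng = np.random.default_rng(1)

def p_correct(x, trial, th0, th_inf, tau, slope=2.0, gamma=0.5):
    """2-AFC psychometric function whose threshold changes continuously with
    trial number: theta(t) = th_inf + (th0 - th_inf) * exp(-t / tau).
    gamma = 0.5 is the 2-AFC guessing rate; the decay form is illustrative."""
    theta = th_inf + (th0 - th_inf) * np.exp(-trial / tau)
    return gamma + (1 - gamma) / (1 + np.exp(-slope * (x - theta)))

# Simulate a learner: threshold drops from 5 to 2 with time constant 100 trials.
n_trials = 2000
trials = np.arange(n_trials)
stim = rng.uniform(0, 8, n_trials)             # stimulus intensity per trial
resp = rng.random(n_trials) < p_correct(stim, trials, 5.0, 2.0, 100.0)

# Fit (th0, th_inf, tau) by maximum likelihood over a coarse parameter grid,
# using every individual trial rather than block-wise threshold estimates.
best, best_ll = None, -np.inf
for th0 in np.linspace(3, 7, 9):
    for th_inf in np.linspace(1, 4, 7):
        for tau in (25, 50, 100, 200, 400):
            p = p_correct(stim, trials, th0, th_inf, tau)
            ll = np.sum(np.where(resp, np.log(p), np.log(1 - p)))
            if ll > best_ll:
                best, best_ll = (th0, th_inf, tau), ll

print(best)   # fitted (initial threshold, final threshold, time constant)
```

Because the fitted function is defined at every trial, it can also extrapolate performance beyond the observed data or evaluate individual critical trials, as the abstract notes.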

  3. Retrospective analysis of risk factors and predictors of intraoperative complications in neuraxial blocks at Faculdade de Medicina de Botucatu-UNESP.

    PubMed

    Pereira, Ivan Dias Fernandes; Grando, Marcela Miguel; Vianna, Pedro Thadeu Galvão; Braz, José Reinaldo Cerqueira; Castiglia, Yara Marcondes Machado; Vane, Luís Antônio; Módolo, Norma Sueli Pinheiro; do Nascimento, Paulo; Amorim, Rosa Beatriz; Rodrigues, Geraldo Rolim; Braz, Leandro Gobbo; Ganem, Eliana Marisa

    2011-01-01

    Cardiovascular changes associated with neuraxial blocks are a cause of concern due to their frequency and because some of them can be considered physiological effects triggered by the sympathetic nervous system blockade. The objective of this study was to evaluate intraoperative cardiovascular complications and predictive factors associated with neuraxial blocks in patients ≥ 18 years of age undergoing non-obstetric procedures over an 18-year period in a tertiary university hospital--HCFMB-UNESP. A retrospective analysis of the following complications was undertaken: hypertension, hypotension, sinus bradycardia, and sinus tachycardia. These complications were correlated with anesthetic technique, physical status (ASA), age, gender, and preoperative co-morbidities. The Tukey test for comparisons among proportions and logistic regression were used for statistical analysis. A total of 32,554 patients underwent neuraxial blocks. Intraoperative complications included hypotension (n=4,109), sinus bradycardia (n=1,107), sinus tachycardia (n=601), and hypertension (n=466). Hypotension was seen more often in patients undergoing continuous subarachnoid anesthesia (29.4%, OR=2.39), in those ≥ 61 years of age, and in females (OR=1.27). Intraoperative hypotension and bradycardia were the complications observed most often. Hypotension was related to anesthetic technique (CSA), increased age, and female gender. Tachycardia and hypertension may not have been directly related to the neuraxial blocks. Copyright © 2011 Elsevier Editora Ltda. All rights reserved.

  4. Multi-scale analysis of the effect of nano-filler particle diameter on the physical properties of CAD/CAM composite resin blocks.

    PubMed

    Yamaguchi, Satoshi; Inoue, Sayuri; Sakai, Takahiko; Abe, Tomohiro; Kitagawa, Haruaki; Imazato, Satoshi

    2017-05-01

    The objective of this study was to assess, in silico and at multiple scales, the effect of silica nano-filler particle diameter in a computer-aided design/manufacturing (CAD/CAM) composite resin (CR) block on its physical properties. CAD/CAM CR blocks were modeled, consisting of silica nano-filler particles (20, 40, 60, 80, and 100 nm) and matrix (Bis-GMA/TEGDMA), with a filler volume content of 55.161%. Young's moduli and Poisson's ratios for the block at the macro-scale were calculated by homogenization analysis. Macro-scale CAD/CAM CR blocks (3 × 3 × 3 mm) were modeled, and compressive strengths were defined as the point at which fracture loads exceeded 6075 N. MPS values of the nano-scale models were compared by localization analysis. As the filler size decreased, Young's moduli and compressive strength increased, while Poisson's ratios and MPS decreased. All parameters were significantly correlated with the diameters of the filler particles (Pearson's correlation test, r = -0.949, 0.943, -0.951, 0.976; p < 0.05). The in silico multi-scale model established in this study demonstrates that the Young's moduli, Poisson's ratios, and compressive strengths of CAD/CAM CR blocks can be enhanced by loading silica nano-filler particles of smaller diameter. CAD/CAM CR blocks using smaller silica nano-filler particles thus have the potential for increased fracture resistance.

  5. Predictive and postdictive analysis of forage yield trials

    USDA-ARS?s Scientific Manuscript database

    Classical experimental design theory, the predominant treatment in most textbooks, promotes the use of blocking designs for control of spatial variability in field studies and other situations in which there is significant heterogeneity among experimental units. Many blocking design...

  6. Block Distribution Analysis of Impact Craters in the Tharsis and Elysium Planitia Regions on Mars

    NASA Astrophysics Data System (ADS)

    Button, N.; Karunatillake, S.; Diaz, C.; Zadei, S.; Rajora, V.; Barbato, A.; Piorkowski, M.

    2017-12-01

    The block distribution pattern of ejecta surrounding impact craters reveals clues about their formation. Using images from the High Resolution Imaging Science Experiment (HiRISE) camera onboard the Mars Reconnaissance Orbiter (MRO), we identified two rayed impact craters on Mars with measurable ejecta fields for quantitative investigation in this study. Impact Crater 1 (HiRISE image PSP_008011_1975) is located in the Tharsis region at 17.41°N, 248.75°E and is 175 m in diameter. Impact Crater 2 (HiRISE image ESP_018352_1805) is located in Elysium Planitia at 0.51°N, 163.14°E and is 320 m in diameter. Our block measurements, used to determine block areas, were conducted using HiView. Employing methods similar to Krishna and Kumar (2016), we compared block size and axis ratio to block distance from the center of the crater, impact angle, and direction. Preliminary analysis of sixteen radial sectors around Impact Crater 1 revealed that in sectors containing mostly small blocks (less than 10 m2), the small blocks were ejected up to three times the crater diameter from the center of the crater. These small-block-dominated sectors lacked blocks larger than 10 m2. By contrast, in large-block-dominated sectors (blocks larger than 30 m2), blocks rarely traveled farther than 200 m from the center of the crater. We also seek to determine the impact angle and direction. Krishna and Kumar (2016) calculate the b-value (N(a) = Ca^(-b), where "N(a) equals the number of fragments or craters with a size greater than a, C is a constant, and -b is a power index") as a method to determine the impact direction. Our preliminary results for Impact Crater 1 did not clearly indicate the impact angle. With improved measurements and the assessment of Impact Crater 2, we will compare the two craters and assess the impact angle and direction in order to determine whether the craters are secondary craters. Hood, D. and Karunatillake, S. (2017), LPSC, Abstract #2640; Krishna, N., and P. S. Kumar (2016), Icarus, 264, 274-299
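
    The quoted power law N(a) = Ca^(-b) can be estimated from a sample of measured block sizes by a log-log least-squares fit of the empirical cumulative counts. A minimal sketch with synthetic Pareto-distributed sizes (an illustration of the b-value computation, not the authors' measurement pipeline):

```python
import numpy as np

def b_value(sizes):
    """Estimate the power-law index b of N(a) = C * a**(-b), where N(a) is
    the number of blocks with size greater than a, via a log-log
    least-squares fit of the empirical cumulative size distribution."""
    a = np.sort(np.asarray(sizes, dtype=float))
    n = np.arange(len(a), 0, -1)     # N(a): count of blocks >= each size
    slope, _ = np.polyfit(np.log(a), np.log(n), 1)
    return -slope

# Synthetic check: block sizes drawn from a Pareto law with true b = 2.
rng = np.random.default_rng(0)
samples = (1.0 - rng.random(5000)) ** (-1.0 / 2.0)   # inverse-CDF sampling
print(b_value(samples))   # close to 2
```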

  7. Peculiarities of the atmospheric blocking events over the Siberia and Russian Far East region during summertime

    NASA Astrophysics Data System (ADS)

    Antokhina, Olga Yu.; Devjatova, Elena V.; Mordvinov, Vladimir I.

    2017-11-01

    We study peculiarities of the evolution of atmospheric blocking events over the Siberia and Far Eastern region (Russia) during summertime. We compare two methods of identifying blockings: from the 500 hPa (Z500) isobaric surface height distribution, and from the potential temperature at the dynamic tropopause (PV-θ), for every July from 1979 through 2016. We identified situations where a blocking is detected in only one of the two characteristics. Blocking identification from the PV-θ characteristics is complicated when the cyclonic part of the blocking is filled with air masses of southern origin, so that there is no meridional gradient reversal in the PV-θ field. In the Z500 field, difficulties in identifying blocking events may arise when the baric field fails to adapt to rapid changes in the temperature field associated with air mass advection; such events often occur over the ocean surface, for example. We performed a synoptic analysis of several blocking events from data on the velocity field dynamics at 850 hPa and PV-θ, supplemented by analysis of observational rainfall data at stations during those events. We distinguished several stages of blocking evolution over the Siberia and Far Eastern region that involved air masses from the East Asian summer monsoon region: 1. The formation of a blocking over Western Siberia; 2. Cold inflow on the blocking's eastern periphery, activation of the East Asian summer monsoon front, and formation of a cyclone (east of Lake Baikal) in whose system the monsoon air was actively involved. Such monsoon cyclones, as a rule, are deep long-lived formations, and they bring abnormal precipitation; 3. The formation of a ridge or anticyclone east of the monsoon cyclone, caused by advection of the same monsoon flow, part of which is involved in the cyclone system. In general, the East Asian summer monsoon influence amounts to the regeneration and intensification of the blocking circulation systems. Those effects are often accompanied by strong droughts in some regions and floods in others.

  8. Revisiting Parallel Cyclic Reduction and Parallel Prefix-Based Algorithms for Block Tridiagonal System of Equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seal, Sudip K; Perumalla, Kalyan S; Hirshman, Steven Paul

    2013-01-01

    Simulations that require solutions of block tridiagonal systems of equations rely on fast parallel solvers for runtime efficiency. Leading parallel solvers that are highly effective for general systems of equations, dense or sparse, are limited in scalability when applied to block tridiagonal systems. This paper presents scalability results as well as detailed analyses of two parallel solvers that exploit the special structure of block tridiagonal matrices to deliver superior performance, often by orders of magnitude. A rigorous analysis of their relative parallel runtimes is shown to reveal the existence of a critical block size that separates the parameter space spanned by the number of block rows, the block size and the processor count, into distinct regions that favor one or the other of the two solvers. Dependence of this critical block size on the above parameters as well as on machine-specific constants is established. These formal insights are supported by empirical results on up to 2,048 cores of a Cray XT4 system. To the best of our knowledge, this is the highest reported scalability for parallel block tridiagonal solvers to date.
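
    For context, the serial baseline that parallel block tridiagonal solvers compete with is the O(n) block-Thomas elimination; a minimal numpy sketch of that baseline (not the cyclic reduction or parallel prefix algorithms analyzed in the paper) illustrates the block structure being exploited:

```python
import numpy as np

def block_thomas(A, B, C, d):
    """Solve a block tridiagonal system by serial block forward elimination
    and back substitution.

    A[i], B[i], C[i] are the sub-, main-, and super-diagonal m x m blocks of
    block row i (A[0] and C[-1] are ignored); d is the right-hand side,
    shaped (n, m)."""
    n, m = d.shape
    Bp, dp = [B[0].copy()], [d[0].copy()]
    for i in range(1, n):                        # forward elimination
        w = A[i] @ np.linalg.inv(Bp[i - 1])
        Bp.append(B[i] - w @ C[i - 1])
        dp.append(d[i] - w @ dp[i - 1])
    x = np.empty_like(d)
    x[-1] = np.linalg.solve(Bp[-1], dp[-1])
    for i in range(n - 2, -1, -1):               # back substitution
        x[i] = np.linalg.solve(Bp[i], dp[i] - C[i] @ x[i + 1])
    return x

# Random diagonally dominant test system: n = 6 block rows of size m = 3.
rng = np.random.default_rng(0)
n, m = 6, 3
A = rng.standard_normal((n, m, m))
C = rng.standard_normal((n, m, m))
B = rng.standard_normal((n, m, m)) + 10 * np.eye(m)  # keep blocks invertible
d = rng.standard_normal((n, m))

x = block_thomas(A, B, C, d)

# Verify against the assembled dense system.
M = np.zeros((n * m, n * m))
for i in range(n):
    M[i*m:(i+1)*m, i*m:(i+1)*m] = B[i]
    if i > 0:
        M[i*m:(i+1)*m, (i-1)*m:i*m] = A[i]
    if i < n - 1:
        M[i*m:(i+1)*m, (i+1)*m:(i+2)*m] = C[i]
print(np.allclose(M @ x.ravel(), d.ravel()))   # True
```

The sequential dependence between block rows in the forward sweep is exactly what cyclic reduction and parallel prefix reformulate to expose parallelism.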

  9. Automation of block assignment planning using a diagram-based scenario modeling method

    NASA Astrophysics Data System (ADS)

    Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye

    2014-03-01

    Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  10. Cross-Site Reliability of Human Induced Pluripotent Stem-Cell Derived Cardiomyocyte Based Safety Assays using Microelectrode Arrays: Results from a Blinded CiPA Pilot Study.

    PubMed

    Millard, Daniel; Dang, Qianyu; Shi, Hong; Zhang, Xiaou; Strock, Chris; Kraushaar, Udo; Zeng, Haoyu; Levesque, Paul; Lu, Hua-Rong; Guillon, Jean-Michel; Wu, Joseph C; Li, Yingxin; Luerman, Greg; Anson, Blake; Guo, Liang; Clements, Mike; Abassi, Yama A; Ross, James; Pierson, Jennifer; Gintant, Gary

    2018-04-27

    Recent in vitro cardiac safety studies demonstrate the ability of human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CMs) to detect electrophysiologic effects of drugs. However, variability contributed by unique approaches, procedures, cell lines and reagents across laboratories makes comparisons of results difficult, leading to uncertainty about the role of hiPSC-CMs in defining proarrhythmic risk in drug discovery and regulatory submissions. A blinded pilot study was conducted to evaluate the electrophysiologic effects of eight well-characterized drugs on four cardiomyocyte lines using a standardized protocol across three microelectrode array (MEA) platforms (18 individual studies). Drugs were selected to define assay sensitivity to prominent repolarizing currents (E-4031 for IKr, JNJ303 for IKs) and depolarizing currents (nifedipine for ICaL, mexiletine for INa), along with drugs producing multi-channel block (flecainide, moxifloxacin, quinidine, and ranolazine). Inclusion criteria for the final analysis were based on demonstrated sensitivity to IKr block (20% prolongation with E-4031) and L-type calcium current block (20% shortening with nifedipine). Despite differences in baseline characteristics across cardiomyocyte lines, multiple sites and instrument platforms, 10 of 18 studies demonstrated adequate sensitivity to IKr block with E-4031 and ICaL block with nifedipine for inclusion in the final analysis. Concentration-dependent effects on repolarization were observed with this qualified dataset, consistent with the known ionic mechanisms of single and multi-channel blocking drugs. hiPSC-CMs can detect repolarization effects elicited by single and multi-channel blocking drugs after defining pharmacologic sensitivity to IKr and ICaL block, supporting further validation efforts using hiPSC-CMs for cardiac safety studies.

  11. Diagnostic Utility of Pleural Fluid Cell Block versus Pleural Biopsy Collected by Flex-Rigid Pleuroscopy for Malignant Pleural Disease: A Single Center Retrospective Analysis

    PubMed Central

    Sasada, Shinji; Izumo, Takehiro; Matsumoto, Yuji; Tsuchida, Takaaki

    2016-01-01

    Background Some trials recently demonstrated the benefit of targeted treatment for malignant disease; therefore, adequate tissues are needed to detect the targeted gene. Pleural biopsy using flex-rigid pleuroscopy and pleural effusion cell block analysis are both useful for diagnosing malignancy and obtaining adequate samples. The purpose of our study was to compare the diagnostic utility of the two methods among patients with malignant pleural disease with effusion. Methods Data from patients who underwent flex-rigid pleuroscopy for diagnosis of pleural effusion suspicious for malignancy at the National Cancer Center Hospital, Japan between April 2011 and June 2014 were retrospectively reviewed. All procedures were performed under local anesthesia. At least 150 mL of pleural fluid was collected by pleuroscopy, followed by pleural biopsies from the abnormal site. Results Thirty-five patients with a final diagnosis of malignant pleural disease were included in this study. Final diagnoses of malignancy were 24 adenocarcinoma, 1 combined adeno-small cell carcinoma, 7 malignant pleural mesothelioma (MPM), and 3 metastatic breast cancer. The diagnostic yield was significantly higher by pleural biopsy than by cell block [94.2% (33/35) vs. 71.4% (25/35); p = 0.008]. All patients with positive results on cell block also had positive results on pleural biopsy. Eight patients with negative results on cell block had positive results on pleural biopsy (lung adenocarcinoma in 4, sarcomatoid MPM in 3, and metastatic breast cancer in 1). Two patients with negative results on both cell block and pleural biopsy were diagnosed with sarcomatoid MPM by computed tomography-guided needle biopsy and with epithelioid MPM by autopsy. Conclusion Pleural biopsy using flex-rigid pleuroscopy was efficient in the diagnosis of malignant pleural diseases. Flex-rigid pleuroscopy with pleural biopsy and pleural effusion cell block analysis should be considered as the initial diagnostic approach for malignant pleural diseases presenting with effusion. PMID:27880851

  12. Association of μ-Calpain and Calpastatin Polymorphisms with Meat Tenderness in a Brahman–Angus Population

    PubMed Central

    Leal-Gutiérrez, Joel D.; Elzo, Mauricio A.; Johnson, Dwain D.; Scheffler, Tracy L.; Scheffler, Jason M.; Mateescu, Raluca G.

    2018-01-01

    Autogenous proteolytic enzymes of the calpain family are implicated in myofibrillar protein degradation. As a result, the μ-calpain gene and its specific inhibitor, calpastatin, have been repeatedly investigated for their association with meat quality traits in cattle; however, no functional mutation has been identified for these two genes. The objectives of this study were: (1) to assess the effect of breed composition on tenderness; (2) to perform a linkage disequilibrium (LD) analysis in the μ-calpain and calpastatin genes as well as association analyses with tenderness; and (3) to analyze putative functional SNPs inside the significant LD block for an effect on tenderness. Tenderness measurements and genotypes for 16 SNPs in the μ-calpain gene and 28 SNPs in the calpastatin gene from 673 steers were analyzed. A bioinformatic analysis identified "putative functional SNPs" inside the associated LD block, i.e., polymorphisms able to produce a physical and/or chemical change in the DNA, mRNA, or translated protein in silico. Breed composition had a significant (P < 0.0001) effect on tenderness, with animals of more than 80% Angus composition having the most tender meat. One 11-kb LD block and three LD blocks of 37, 17, and 14 kb in length were identified in the μ-calpain and calpastatin genes, respectively. Out of these, LD-block 3 in calpastatin, tagged by SNPs located at 7-98566391 and 7-98581038, had a significant effect on tenderness, with the TG-CG diplotype being approximately 1 kg more tender than the toughest diplotype, TG-CG. A total of 768 SNPs in LD-block 3 of calpastatin were included in the bioinformatic analysis, and 28 markers were selected as putative functional SNPs inside LD-block 3 of calpastatin; however, none of them were polymorphic in this population. Out of 15 initial polymorphisms segregating inside LD-block 3 of calpastatin in this population, markers ARSUSMARC116, Cast5, rs730723459, and rs210861835 were found to be significantly associated with tenderness. PMID:29520298

  13. Transversus abdominis plane block using a short-acting local anesthetic for postoperative pain after laparoscopic colorectal surgery: a systematic review and meta-analysis.

    PubMed

    Oh, Tak Kyu; Lee, Se-Jun; Do, Sang-Hwan; Song, In-Ae

    2018-02-01

    Transversus abdominis plane (TAP) block using a short-acting local anesthetic as part of multimodal analgesia is efficient in various abdominal surgeries, including laparoscopic surgery. However, information regarding its use in laparoscopic colorectal surgery is still limited and sometimes controversial. Therefore, we conducted a systematic review and meta-analysis to determine whether TAP block using a short-acting anesthetic has a positive postoperative analgesic outcome in patients who have undergone laparoscopic colorectal surgery. We searched for studies comparing the postoperative pain outcome after laparoscopic colorectal surgery between patients who received TAP block and a control group (placebo or no treatment). Outcome measures were early pain at rest (numeric rating scale [NRS] score at 0-2 h postoperatively), late pain at movement (NRS score at 24 h postoperatively), late pain at rest (NRS score at 24 h postoperatively), and postoperative opioid consumption (up to 24 h postoperatively). We used a random-effects model for the meta-analysis and Egger's regression test to detect publication bias. We included six studies involving 452 patients (224 in the TAP block group, 228 in the control group). Early and late pain scores at movement were significantly different between the TAP block and control groups (standardized mean difference: - 0.695, P < 0.0001 for early pain and - 0.242, P = 0.029 for late pain). There was no significant difference between the TAP block and control groups in early pain at rest (P = 0.475), late pain at rest (P = 0.826), and postoperative opioid consumption (P = 0.257). The TAP block using a short-acting anesthetic had a significant effect on the postoperative pain outcome in the early (0-2 h) and late (24 h) period at movement. However, it did not have a significant effect on the postoperative pain outcome in the early (0-2 h) and late (24 h) periods at rest after laparoscopic surgery.

  14. Neural Coding Mechanisms in Gustation.

    DTIC Science & Technology

    1980-09-15

    world is composed of four primary tastes (sweet, sour, salty, and bitter), and that each of these is carried by a separate and private neural line, thus...ted sweet-sour-salty-bitter types. The mathematical method of analysis was hierarchical cluster analysis based on the responses of many neurons (20 to...block number) Taste Neural coding Neural organization Stimulus organization Olfaction ABSTRACT (Continue on reverse side if necessary and identify by block number)

  15. Vision-Based UAV Flight Control and Obstacle Avoidance

    DTIC Science & Technology

    2006-01-01

    denoted it by Vb = (Vb1, Vb2, Vb3). Fig. 2 shows the block diagram of the proposed vision-based motion analysis and obstacle avoidance system. We denote...structure analysis often involve computation-intensive computer vision tasks, such as feature extraction and geometric modeling. Computation-intensive...First, we extract a set of features from each block. 2) Second, we compute the distance between these two sets of features. In conventional motion

  16. Geometric Theory of Moving Grid Wavefront Sensor

    DTIC Science & Technology

    1977-06-30

    identify by block number) Adaptive Optics Wavefront Sensor Geometric Optics Analysis Moving Ronchi Grid ABSTRACT (Continue on reverse side if necessary...and identify by block number) A geometric optics analysis is made for a wavefront sensor that uses a moving Ronchi grid. It is shown that by simple data...optical systems being considered or being developed for imaging an object through a turbulent atmosphere. Some of these use a wavefront sensor to

  17. From spinning conformal blocks to matrix Calogero-Sutherland models

    NASA Astrophysics Data System (ADS)

    Schomerus, Volker; Sobko, Evgeny

    2018-04-01

    In this paper we develop further the relation between conformal four-point blocks involving external spinning fields and Calogero-Sutherland quantum mechanics with matrix-valued potentials. To this end, the analysis of [1] is extended to arbitrary dimensions and to the case of boundary two-point functions. In particular, we construct the potential for any set of external tensor fields. Some of the resulting Schrödinger equations are mapped explicitly to the known Casimir equations for 4-dimensional seed conformal blocks. Our approach furnishes solutions of Casimir equations for external fields of arbitrary spin and dimension in terms of functions on the conformal group. This allows us to reinterpret standard operations on conformal blocks in terms of group-theoretic objects. In particular, we shall discuss the relation between the construction of spinning blocks in any dimension through differential operators acting on seed blocks and the action of left/right invariant vector fields on the conformal group.

  18. Poly(cyclohexylethylene)- block-poly(ethylene oxide) block polymers for metal oxide templating

    DOE PAGES

    Schulze, Morgan W.; Sinturel, Christophe; Hillmyer, Marc A.

    2015-09-01

    A series of poly(cyclohexylethylene)-block-poly(ethylene oxide) (CEO) diblock copolymers were synthesized through tandem anionic polymerizations and heterogeneous catalytic hydrogenation. Solvent-annealed CEO diblock films were used to template dense arrays of inorganic oxide nanodots via simple spin coating of an inorganic precursor solution atop the ordered film. The substantial chemical dissimilarity of the two blocks enables (i) selective inclusion of the inorganic precursor within the PEO domain and (ii) the formation of exceptionally small feature sizes due to a relatively large interaction parameter estimated from mean-field analysis of the order–disorder transition temperatures of compositionally symmetric samples. UV/ozone treatment following incorporation produces an ordered arrangement of oxide nanodots and simultaneously removes the block polymer template. However, we report the smallest particles (6 ± 1 nm) templated from a selective precursor insertion method to date using a block polymer scaffold.

  19. Context adaptive binary arithmetic coding-based data hiding in partially encrypted H.264/AVC videos

    NASA Astrophysics Data System (ADS)

    Xu, Dawen; Wang, Rangding

    2015-05-01

    A scheme of data hiding directly in a partially encrypted version of H.264/AVC videos is proposed which includes three parts, i.e., selective encryption, data embedding and data extraction. Selective encryption is performed on context adaptive binary arithmetic coding (CABAC) bin-strings via stream ciphers. By careful selection of CABAC entropy coder syntax elements for selective encryption, the encrypted bitstream is format-compliant and has exactly the same bit rate. Then a data-hider embeds the additional data into partially encrypted H.264/AVC videos using a CABAC bin-string substitution technique without accessing the plaintext of the video content. Since bin-string substitution is carried out on those residual coefficients with approximately the same magnitude, the quality of the decrypted video is satisfactory. Video file size is strictly preserved even after data embedding. In order to adapt to different application scenarios, data extraction can be done either in the encrypted domain or in the decrypted domain. Experimental results have demonstrated the feasibility and efficiency of the proposed scheme.

  20. Super-Encryption Implementation Using Monoalphabetic Algorithm and XOR Algorithm for Data Security

    NASA Astrophysics Data System (ADS)

    Rachmawati, Dian; Andri Budiman, Mohammad; Aulia, Indra

    2018-03-01

    The exchange of data, whether offline or online, is very vulnerable to the threat of data theft. In general, cryptography is the science and art of maintaining data secrecy. Encryption is a cryptographic operation in which data is transformed into cipher text: unreadable, meaningless text that cannot be read or understood by other parties. In super-encryption, two or more encryption algorithms are combined to make the result more secure. In this work, the Monoalphabetic algorithm and the XOR algorithm are combined to form a super-encryption. The Monoalphabetic algorithm works by changing each letter into a new letter based on existing keywords, while the XOR algorithm works by applying the logic operation XOR. Since the Monoalphabetic algorithm is a classical cryptographic algorithm and the XOR algorithm is a modern one, this scheme is expected to be both easy to implement and more secure. The combination of the two algorithms is capable of securing the data and restoring it to its original form (plaintext), so data integrity is still ensured.
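
    A minimal sketch of such a super-encryption, layering a monoalphabetic substitution under a repeating-key XOR (the substitution alphabet and XOR key below are illustrative, not the authors'):

```python
import string

def mono_encrypt(text, key_alphabet):
    """Monoalphabetic substitution: each lowercase letter is replaced by the
    letter at the same position in the 26-letter key alphabet."""
    return text.translate(str.maketrans(string.ascii_lowercase, key_alphabet))

def mono_decrypt(text, key_alphabet):
    return text.translate(str.maketrans(key_alphabet, string.ascii_lowercase))

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR layer: self-inverse, so the same call encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Super-encryption: substitution first, then XOR (keys are illustrative).
sub_key = "qwertyuiopasdfghjklzxcvbnm"   # a permutation of a-z
xor_key = b"secret"

plain = "super encryption demo"
cipher = xor_crypt(mono_encrypt(plain, sub_key).encode(), xor_key)
restored = mono_decrypt(xor_crypt(cipher, xor_key).decode(), sub_key)
print(restored == plain)   # True: both layers invert cleanly
```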

  1. Enhanced rearrangement technique for secure data transmission: case study credit card process

    NASA Astrophysics Data System (ADS)

    Vyavahare, Tushar; Tekade, Darshana; Nayak, Saurabh; kumar, N. Suresh; Blessy Trencia Lincy, S. S.

    2017-11-01

    Encryption of data is very important in order to keep data secure during transactions and transmission, such as online shopping: whenever we give our card details, there is a possibility of the data being hacked or intercepted. To secure it, we need to encrypt the data, and the decryption strategy should be known only to the particular bank. The RSA algorithm can be used to achieve this objective, so that only the intended sender and receiver know how the data is encrypted and decrypted. To make the RSA technique more secure, in this paper we propose a technique we call Modified RSA, for which a transposition module is designed that uses the row transposition method to encrypt the data. Before the card details are given to RSA, the input is passed to this transposition module, which scrambles and rearranges the data. The output of the transposition is then provided to the modified RSA, which produces the ciphertext to send over the network. The use of RSA together with the transposition module provides dual security to the whole system.
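    A minimal sketch of the transpose-then-encrypt idea, assuming a columnar row-transposition layout and textbook RSA with toy primes. The abstract does not publish the exact transposition module, so the grid layout, padding character, and key handling here are hypothetical, and a real deployment would use padded RSA on large keys.

```python
def row_transpose(text, key):
    # Assumed row transposition: write the text row by row into a grid
    # len(key) columns wide, then read whole columns in key order.
    cols = len(key)
    text = text.ljust(-(-len(text) // cols) * cols, "X")  # pad final row
    rows = [text[i:i + cols] for i in range(0, len(text), cols)]
    order = sorted(range(cols), key=lambda i: key[i])
    return "".join(row[i] for i in order for row in rows)

def rsa_keys():
    # Textbook RSA with toy primes, for illustration only (insecure sizes).
    p, q = 61, 53
    n, phi = p * q, (p - 1) * (q - 1)
    e = 17
    d = pow(e, -1, phi)  # modular inverse (Python 3.8+)
    return (e, n), (d, n)

def rsa_apply(m, key):
    exp, n = key
    return pow(m, exp, n)

def modified_rsa_encrypt(card_details, transpose_key, pub):
    # Scramble first, then encrypt character by character with RSA.
    scrambled = row_transpose(card_details, transpose_key)
    return [rsa_apply(ord(c), pub) for c in scrambled]
```

    The receiving bank would apply RSA decryption and then invert the transposition to recover the card details.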

  2. Quantum key based burst confidentiality in optical burst switched networks.

    PubMed

    Balamurugan, A M; Sivasubramanian, A

    2014-01-01

    Optical burst switching (OBS) is an emerging technology that could achieve a feasible network in the future. OBS networks are able to meet the bandwidth requirements of applications that demand intensive bandwidth, and more domains are opening up in OBS that show its advantages and its capability to handle future network traffic. However, the concept of OBS is still far from perfect, facing issues with respect to security threats. The transfer of the optical switching paradigm to optical burst switching faces serious shortfalls in the areas of burst aggregation, routing, authentication, dispute resolution, and quality of service (QoS). This paper deals with employing RC4 (a stream cipher) to encrypt and decrypt bursts, thereby ensuring burst confidentiality. Although the use of the AES algorithm has already been proposed for the same issue, a comparison of the two algorithms on burst encryption and decryption time and end-to-end delay found that RC4 provided better results. This paper thus offers a better solution for burst confidentiality in OBS networks.
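    RC4 is short enough to sketch in full, which is part of why it is attractive for latency-sensitive burst encryption. Note that RC4 is considered cryptographically broken today and appears here only to illustrate the cipher the paper benchmarks.

```python
def rc4(key, data):
    # Key-scheduling algorithm (KSA): permute a 256-entry state array
    # under the influence of the key bytes.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): emit a keystream byte per
    # input byte and XOR it in. Because the keystream is XORed with the
    # data, the same call both encrypts and decrypts.
    out, i, j = bytearray(), 0, 0
    for b in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(b ^ S[(S[i] + S[j]) % 256])
    return bytes(out)
```

    Calling `rc4` again with the same key on the ciphertext restores the plaintext, which is what makes burst decryption at the egress node symmetric with encryption at the ingress node.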

  3. Fast, Parallel and Secure Cryptography Algorithm Using Lorenz's Attractor

    NASA Astrophysics Data System (ADS)

    Marco, Anderson Gonçalves; Martinez, Alexandre Souto; Bruno, Odemir Martinez

    A novel cryptography method based on the Lorenz attractor chaotic system is presented. The proposed algorithm is secure and fast, making it practical for general use. We introduce the chaotic operation mode, which provides an interaction among the password, the message and a chaotic system. It ensures that the algorithm yields a secure codification, even if the nature of the chaotic system is known. The algorithm has been implemented in two versions: one sequential and slow, the other parallel and fast. Our algorithm assures the integrity of the ciphertext (we know if it has been altered, which is not assured by traditional algorithms) and consequently its authenticity. Numerical experiments are presented and discussed, showing the behavior of the method in terms of security and performance. The fast version of the algorithm has performance comparable to AES, a cryptographic algorithm in widespread commercial use, but it is more secure, which makes it immediately suitable for general-purpose cryptography applications. An internet page has been set up, which enables readers to test the algorithm and also to try to break the cipher.
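    The abstract does not specify the "chaotic operation mode", so as a hedged illustration of the general idea, the sketch below uses a forward-Euler integration of the Lorenz system as a keystream generator keyed by the initial state. This is a toy construction for intuition, not a secure cipher and not the authors' algorithm.

```python
def lorenz_keystream(x, y, z, n, sigma=10.0, rho=28.0, beta=8.0 / 3.0,
                     dt=0.01, skip=1000):
    # Integrate the Lorenz equations with forward Euler; the initial
    # state (x, y, z) plays the role of the key. Discard a transient of
    # `skip` steps, then quantize the x-coordinate into key bytes.
    out = []
    for i in range(skip + n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        if i >= skip:
            out.append(int(abs(x) * 1e6) % 256)
    return bytes(out)

def chaos_xor(data, key_state):
    # XOR the chaotic keystream with the data; the same call decrypts.
    ks = lorenz_keystream(*key_state, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

    Any party holding the same initial state regenerates the identical trajectory and keystream, which is the basic property chaos-based ciphers exploit.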

  4. Quantum Key Based Burst Confidentiality in Optical Burst Switched Networks

    PubMed Central

    Balamurugan, A. M.; Sivasubramanian, A.

    2014-01-01

    Optical burst switching (OBS) is an emerging technology that could achieve a feasible network in the future. OBS networks are able to meet the bandwidth requirements of applications that demand intensive bandwidth, and more domains are opening up in OBS that show its advantages and its capability to handle future network traffic. However, the concept of OBS is still far from perfect, facing issues with respect to security threats. The transfer of the optical switching paradigm to optical burst switching faces serious shortfalls in the areas of burst aggregation, routing, authentication, dispute resolution, and quality of service (QoS). This paper deals with employing RC4 (a stream cipher) to encrypt and decrypt bursts, thereby ensuring burst confidentiality. Although the use of the AES algorithm has already been proposed for the same issue, a comparison of the two algorithms on burst encryption and decryption time and end-to-end delay found that RC4 provided better results. This paper thus offers a better solution for burst confidentiality in OBS networks. PMID:24578663

  5. Pseudo-random generator based on Chinese Remainder Theorem

    NASA Astrophysics Data System (ADS)

    Bajard, Jean Claude; Hördegen, Heinrich

    2009-08-01

    Pseudo-Random Generators (PRG) are fundamental in cryptography. Their use occurs at different levels in cipher protocols, and they need to satisfy certain properties to qualify as robust. NIST proposes criteria and a test suite that give information on the behavior of a PRG. In this work, we present a PRG constructed from conversions between different residue systems representing the elements of GF(2)[X]. In this approach, we use pairs of co-prime polynomials of degree k and a state vector of 2k bits. The algebraic properties are broken by using different independent pairs during the process. Since this method is reversible, it can also be used as a symmetric cryptosystem. We evaluate the cost of such a system, taking into account that some operations are commonly implemented on crypto-processors. We give the results of the different NIST tests and explain our choices compared to others found in the literature. We describe the behavior of this PRG and explain how the different rounds are chained to ensure secure randomness.
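    The paper's construction works with residue systems of polynomials over GF(2); the integer analogue below illustrates the forward and inverse residue-number-system conversion (Chinese Remainder Theorem reconstruction) that such generators rely on. The polynomial arithmetic itself is not reproduced here.

```python
def to_residues(x, moduli):
    # Forward conversion: represent x by its residues (an RNS encoding).
    return [x % m for m in moduli]

def crt(residues, moduli):
    # Chinese Remainder Theorem: recombine residues modulo pairwise
    # co-prime moduli into the unique value modulo their product.
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x = (x + r * Mi * pow(Mi, -1, m)) % M  # pow(., -1, m): Python 3.8+
    return x
```

    Because the conversion is exactly invertible, mixing several independent modulus sets (as the paper does with polynomial pairs) scrambles the state while remaining reversible, which is what allows the same machinery to double as a symmetric cryptosystem.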

  6. Compression of Encrypted Images Using Set Partitioning In Hierarchical Trees Algorithm

    NASA Astrophysics Data System (ADS)

    Sarika, G.; Unnithan, Harikuttan; Peter, Smitha

    2011-10-01

    When it is desired to transmit redundant data over an insecure channel, it is customary to encrypt the data. For encrypted real-world sources such as images, the use of Markov properties in the Slepian-Wolf decoder does not work well for grayscale images. In this paper we propose a method for compressing an encrypted image. In the encoder section, the image is first encrypted and then undergoes compression in resolution. The cipher function scrambles only the pixel values but does not shuffle the pixel locations. After downsampling, each sub-image is encoded independently and the resulting syndrome bits are transmitted. The received image undergoes joint decryption and decompression in the decoder section and is recovered using the local statistics of the image. Here the decoder receives only a lower-resolution version of the image. In addition, this method provides partial access to the current source at the decoder side, which improves the decoder's learning of the source statistics. The source dependency is exploited to improve compression efficiency. This scheme provides better coding efficiency and lower computational complexity.

  7. Efficiently Multi-User Searchable Encryption Scheme with Attribute Revocation and Grant for Cloud Storage

    PubMed Central

    Wang, Shangping; Zhang, Xiaoxue; Zhang, Yaling

    2016-01-01

    Ciphertext-policy attribute-based encryption (CP-ABE) focuses on the problem of access control, while keyword-based searchable encryption schemes focus on quickly finding the files a user is interested in within cloud storage. Designing an encryption scheme that is both searchable and attribute-based is a new challenge. In this paper, we propose an efficient multi-user searchable attribute-based encryption scheme with attribute revocation and grant for cloud storage. In the new scheme, the attribute revocation and grant processes are delegated to a proxy server, and multiple attributes can be revoked and granted simultaneously. Moreover, the proposed scheme provides keyword search functionality. The security of our scheme is reduced to the bilinear Diffie-Hellman (BDH) assumption. Furthermore, the scheme is proven secure under the security model of indistinguishability against selective ciphertext-policy and chosen-plaintext attack (IND-sCP-CPA), and it is also semantically secure under indistinguishability against chosen-keyword attack (IND-CKA) in the random oracle model. PMID:27898703

  8. Information security: where computer science, economics and psychology meet.

    PubMed

    Anderson, Ross; Moore, Tyler

    2009-07-13

    Until ca. 2000, information security was seen as a technological discipline, based on computer science but with mathematics helping in the design of ciphers and protocols. That perspective started to change as researchers and practitioners realized the importance of economics. As distributed systems are increasingly composed of machines that belong to principals with divergent interests, incentives are becoming as important to dependability as technical design. A thriving new field of information security economics provides valuable insights not just into 'security' topics such as privacy, bugs, spam and phishing, but into more general areas of system dependability and policy. This research programme has recently started to interact with psychology. One thread is in response to phishing, the most rapidly growing form of online crime, in which fraudsters trick people into giving their credentials to bogus websites; a second is through the increasing importance of security usability; and a third comes through the psychology-and-economics tradition. The promise of this multidisciplinary research programme is a novel framework for analysing information security problems, one that is both principled and effective.

  9. Efficiently Multi-User Searchable Encryption Scheme with Attribute Revocation and Grant for Cloud Storage.

    PubMed

    Wang, Shangping; Zhang, Xiaoxue; Zhang, Yaling

    2016-01-01

    Ciphertext-policy attribute-based encryption (CP-ABE) focuses on the problem of access control, while keyword-based searchable encryption schemes focus on quickly finding the files a user is interested in within cloud storage. Designing an encryption scheme that is both searchable and attribute-based is a new challenge. In this paper, we propose an efficient multi-user searchable attribute-based encryption scheme with attribute revocation and grant for cloud storage. In the new scheme, the attribute revocation and grant processes are delegated to a proxy server, and multiple attributes can be revoked and granted simultaneously. Moreover, the proposed scheme provides keyword search functionality. The security of our scheme is reduced to the bilinear Diffie-Hellman (BDH) assumption. Furthermore, the scheme is proven secure under the security model of indistinguishability against selective ciphertext-policy and chosen-plaintext attack (IND-sCP-CPA), and it is also semantically secure under indistinguishability against chosen-keyword attack (IND-CKA) in the random oracle model.

  10. EBSD Analysis of Relationship Between Microstructural Features and Toughness of a Medium-Carbon Quenching and Partitioning Bainitic Steel

    NASA Astrophysics Data System (ADS)

    Li, Qiangguo; Huang, Xuefei; Huang, Weigang

    2017-12-01

    A multiphase microstructure of bainite, martensite and retained austenite in a 0.3C bainitic steel was obtained by a novel bainite isothermal transformation plus quenching and partitioning (B-QP) process. The correlations between microstructural features and toughness were investigated by electron backscatter diffraction (EBSD), and the results showed that the multiphase microstructure containing approximately 50% bainite exhibits higher strength (1617 MPa), greater elongation (18.6%) and greater impact toughness (103 J) than the full martensite. The EBSD analysis indicated that the multiphase microstructure with a smaller average local misorientation (1.22°) has a lower inner stress concentration possibility and that the first formed bainitic ferrite plates in the multiphase microstructure can refine subsequently generated packets and blocks. The corresponding packet and block average size decrease from 11.9 and 2.3 to 8.4 and 1.6 μm, respectively. A boundary misorientation analysis indicated that the multiphase microstructure has a higher percentage of high-angle boundaries (67.1%) than the full martensite (57.9%) because of the larger numbers and smaller sizes of packets and blocks. The packet boundary obstructs crack propagation more effectively than the block boundary.

  11. Damage sensitivity investigations of EMI technique on different materials through coupled field analysis

    NASA Astrophysics Data System (ADS)

    Joshi, Bhrigu; Adhikari, Sailesh; Bhalla, Suresh

    2016-04-01

    This paper presents a comparative study through the piezoelectric coupled field analysis mode of finite element method (FEM) on detection of damages of varying magnitude, encompassing three different types of structural materials, using piezo impedance transducers. An aluminum block, a concrete block and a steel block of dimensions 48×48×10 mm were modelled in finite element software ANSYS. A PZT patch of 10×10×0.3 mm was also included in the model as surface bonded on the block. Coupled field analysis (CFA) was performed to obtain the admittance signatures of the piezo sensor in the frequency range of 0-250 kHz. The root mean square deviation (RMSD) index was employed to quantify the degree of variation of the signatures. It was found that concrete exhibited deviation in the signatures only with the change of damping values. However, the other two materials showed variation in the signatures even with changes in density and elasticity values in a small portion of the specimen. The comparative study shows that the PZT patches are more sensitive to damage detection in materials with low damping and the sensitivity typically decreases with increase in the damping.
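    The RMSD index used above to quantify signature deviation is conventionally defined in the EMI literature as the root of the summed squared differences between the post-damage and baseline signatures, normalized by the baseline; the exact normalization used in the paper is an assumption here.

```python
def rmsd_index(baseline, signature):
    # RMSD damage index (in percent) between a baseline admittance
    # signature and a post-damage signature sampled at the same
    # frequency points.
    num = sum((s - b) ** 2 for b, s in zip(baseline, signature))
    den = sum(b ** 2 for b in baseline)
    return 100.0 * (num / den) ** 0.5
```

    A larger index means a larger deviation of the admittance signature, and hence greater sensitivity of the PZT patch to the simulated damage.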

  12. Tremor analysis separates Parkinson's disease and dopamine receptor blockers induced parkinsonism.

    PubMed

    Shaikh, Aasef G

    2017-05-01

    Parkinson's disease, the most common cause of parkinsonism, is often difficult to distinguish from its second most common etiology, exposure to dopamine receptor blocking agents such as antiemetics and neuroleptics. Dual-axis accelerometry was used to quantify tremor in 158 patients with parkinsonism; 62 had Parkinson's disease and 96 were clinically diagnosed with dopamine receptor blocking agent-induced parkinsonism. Tremor was measured while subjects rested their arms (resting tremor), outstretched their arms in front (postural tremor), and reached for a target (kinetic tremor). Cycle-by-cycle analysis was performed to measure cycle duration, oscillation amplitude, and inter-cycle variations in frequency. Patients with dopamine receptor blocker-induced parkinsonism had lower resting and postural tremor amplitude. There was a substantial increase of kinetic tremor amplitude in both disorders. Postural and resting tremor in subjects with dopamine receptor blocking agent-induced parkinsonism was prominent in the abduction-adduction plane. In contrast, the Parkinson's disease tremor had equal amplitude in all three planes of motion. Tremor frequency was comparable in both groups. Remarkable variability in the width of the oscillatory cycles suggested irregularity in the oscillatory waveforms in both subtypes of parkinsonism. Quantitative tremor analysis can distinguish Parkinson's disease from dopamine receptor blocking agent-induced parkinsonism.

  13. Aspects of the evolution of the West Antarctic margin of Gondwanaland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grunow, A.M.

    1989-01-01

    A combination of paleomagnetism, structural field mapping, microprobe analysis, microfabric analysis and 40Ar/39Ar geochronology was used to elucidate the history of the West Antarctic crustal block and the evolution of subduction complexes along the Scotia Ridge. West Antarctica is composed of four crustal blocks whose relationship to East Antarctica and to each other throughout the Phanerozoic is not well known. These blocks are: the Ellsworth-Whitmore Mountains (EWM); the Antarctic Peninsula (AP); Thurston Island (TI); Marie Byrd Land (MBL). Paleomagnetic sampling and analysis were conducted on rocks from the EWM and TI blocks in the hope of constraining the motion of these blocks and the opening history of the Weddell Sea. The paleomagnetic results suggest that the AP, EWM, and TI blocks have moved relative to East Antarctica prior to the mid-Cretaceous and that the main opening of the Weddell Sea was between the Early and mid-Cretaceous. Detailed field mapping was conducted on the subduction complexes of the Scotia Metamorphic Complex (SMC) on Smith Island and Elephant Island (Antarctica). Polyphase ductile deformation characterizes the Smith Island and Elephant Island tectonites. Microprobe analyses indicate that the blue amphiboles from both areas are primary crossite. Pressure-temperature estimates for Smith Island blueschist metamorphism are ~350°C at 6-7 kbars. The 40Ar/39Ar geochronology indicates a complex thermal evolution for the SMC. The north to south increase in intensity of deformation and metamorphism on Elephant Island corresponds to a decrease in 40Ar/39Ar age. Uplift of the Smith Island blueschists occurred since 47 Ma, while most of the uplift on Elephant Island occurred since ~102 Ma.

  14. A New Cell Block Method for Multiple Immunohistochemical Analysis of Circulating Tumor Cells in Patients with Liver Cancer.

    PubMed

    Nam, Soo Jeong; Yeo, Hyun Yang; Chang, Hee Jin; Kim, Bo Hyun; Hong, Eun Kyung; Park, Joong-Won

    2016-10-01

    We developed a new method of detecting circulating tumor cells (CTCs) in liver cancer patients by constructing cell blocks from peripheral blood cells, including CTCs, followed by multiple immunohistochemical analysis. Cell blocks were constructed from the nucleated cell pellets of peripheral blood after removal of red blood cells. The blood cell blocks were obtained from 29 patients with liver cancer and from healthy donor blood spiked with seven cell lines. The cell blocks and corresponding tumor tissues were immunostained with antibodies to seven markers: cytokeratin (CK), epithelial cell adhesion molecule (EpCAM), epithelial membrane antigen (EMA), CK18, α-fetoprotein (AFP), Glypican 3, and HepPar1. The average recovery rate of spiked SW620 cells from blood cell blocks was 91%. CTCs were detected in 14 out of 29 patients (48.3%); 11/23 hepatocellular carcinomas (HCC), 1/2 cholangiocarcinomas (CC), 1/1 combined HCC-CC, and 1/3 metastatic cancers. CTCs from 14 patients were positive for EpCAM (57.1%), EMA (42.9%), AFP (21.4%), CK18 (14.3%), Glypican3 and CK (7.1% each), and HepPar1 (0%). Patients with HCC expressed EpCAM, EMA, CK18, and AFP in tissue and/or CTCs, whereas CK, HepPar1, and Glypican3 were expressed only in tissue. Only EMA was significantly associated with the expressions in CTC and tissue. CTC detection was associated with higher T stage and portal vein invasion in HCC patients. This cell block method allows cytologic detection and multiple immunohistochemical analysis of CTCs. Our results show that tissue biomarkers of HCC may not be useful for the detection of CTC. EpCAM could be a candidate marker for CTCs in patients with HCC.

  15. Analgesia for total knee arthroplasty: a meta-analysis comparing local infiltration and femoral nerve block.

    PubMed

    Mei, ShuYa; Jin, ShuQing; Chen, ZhiXia; Ding, XiBing; Zhao, Xiang; Li, Quan

    2015-09-01

    Patients frequently experience postoperative pain after a total knee arthroplasty; such pain is always challenging to treat and may delay the patient's recovery. It is unclear whether local infiltration or a femoral nerve block offers a better analgesic effect after total knee arthroplasty. We performed a systematic review and meta-analysis of randomized controlled trials to compare local infiltration with a femoral nerve block in patients who underwent a primary unilateral total knee arthroplasty. We searched Pubmed, EMBASE, and the Cochrane Library through December 2014. Two reviewers scanned abstracts and extracted data. The data collected included numeric rating scale values for pain at rest and pain upon movement and opioid consumption in the first 24 hours. Mean differences with 95% confidence intervals were calculated for each end point. A sensitivity analysis was conducted to evaluate potential sources of heterogeneity. While the numeric rating scale values for pain upon movement (MD -0.62; 95% CI: -1.13 to -0.12; p=0.02) in the first 24 hours differed significantly between the patients who received local infiltration and those who received a femoral nerve block, there were no differences in the numeric rating scale results for pain at rest (MD -0.42; 95% CI: -1.32 to 0.47; p=0.35) or opioid consumption (MD 2.92; 95% CI: -1.32 to 7.16; p=0.18) in the first 24 hours. Local infiltration and femoral nerve block showed no significant differences in pain intensity at rest or opioid consumption after total knee arthroplasty, but the femoral nerve block was associated with reduced pain upon movement.
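    The pooled mean differences with 95% confidence intervals reported above follow the standard inverse-variance approach. The sketch below shows only the basic fixed-effect computation; whether the authors used a fixed- or random-effects model is not stated in the abstract, so this is an assumed, simplified version.

```python
import math

def pooled_mean_difference(mds, ses):
    # Fixed-effect inverse-variance pooling: weight each study's mean
    # difference (MD) by 1/SE^2, then derive the pooled 95% CI.
    weights = [1.0 / se ** 2 for se in ses]
    md = sum(w * m for w, m in zip(weights, mds)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return md, (md - 1.96 * se, md + 1.96 * se)
```

    Studies with smaller standard errors (usually larger trials) thus dominate the pooled estimate.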

  16. Analgesia for total knee arthroplasty: a meta-analysis comparing local infiltration and femoral nerve block

    PubMed Central

    Mei, ShuYa; Jin, ShuQing; Chen, ZhiXia; Ding, XiBing; Zhao, Xiang; Li, Quan

    2015-01-01

    Patients frequently experience postoperative pain after a total knee arthroplasty; such pain is always challenging to treat and may delay the patient's recovery. It is unclear whether local infiltration or a femoral nerve block offers a better analgesic effect after total knee arthroplasty. We performed a systematic review and meta-analysis of randomized controlled trials to compare local infiltration with a femoral nerve block in patients who underwent a primary unilateral total knee arthroplasty. We searched Pubmed, EMBASE, and the Cochrane Library through December 2014. Two reviewers scanned abstracts and extracted data. The data collected included numeric rating scale values for pain at rest and pain upon movement and opioid consumption in the first 24 hours. Mean differences with 95% confidence intervals were calculated for each end point. A sensitivity analysis was conducted to evaluate potential sources of heterogeneity. While the numeric rating scale values for pain upon movement (MD-0.62; 95%CI: -1.13 to -0.12; p=0.02) in the first 24 hours differed significantly between the patients who received local infiltration and those who received a femoral nerve block, there were no differences in the numeric rating scale results for pain at rest (MD-0.42; 95%CI:-1.32 to 0.47; p=0.35) or opioid consumption (MD 2.92; 95%CI:-1.32 to 7.16; p=0.18) in the first 24 hours. Local infiltration and femoral nerve block showed no significant differences in pain intensity at rest or opioid consumption after total knee arthroplasty, but the femoral nerve block was associated with reduced pain upon movement. PMID:26375568

  17. Analysis of the impact of crude oil price fluctuations on China's stock market in different periods-Based on time series network model

    NASA Astrophysics Data System (ADS)

    An, Yang; Sun, Mei; Gao, Cuixia; Han, Dun; Li, Xiuming

    2018-02-01

    This paper studies the influence of Brent oil price fluctuations on the stock prices of two distinct Chinese blocks, namely the petrochemical block and the electric equipment and new energy block, applying the Shannon entropy of information theory. The co-movement trend of the crude oil price and stock prices is divided into different fluctuation patterns with a coarse-graining method. Then, a bivariate time series network model is established for the two stock blocks in five different periods. By joint analysis of network-oriented metrics, the key modes and underlying evolutionary mechanisms are identified. The results show that both networks have different fluctuation characteristics in different periods, and their co-movement patterns are clustered in some key modes and conversion intermediaries. The study not only reveals the lag effect of crude oil price fluctuations on the stocks of Chinese industry blocks but also verifies the necessity of research on specific periods, and it suggests that the government should use different energy policies to stabilize market volatility in different periods. A new way is provided to study the unidirectional influence between multiple variables or complex time series.
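    The coarse-graining step can be sketched as follows. The symbol alphabet (up/down/equal), the window length, and the entropy measure below are plausible assumptions, since the abstract does not give the exact pattern definition used in the paper.

```python
import math
from collections import Counter

def coarse_grain(series, eps=0.0):
    # Map each step change to a symbol: U(p), D(own), or E(qual within eps).
    symbols = []
    for prev, cur in zip(series, series[1:]):
        d = cur - prev
        symbols.append("U" if d > eps else "D" if d < -eps else "E")
    return symbols

def comovement_modes(series_a, series_b, window=3):
    # Pair the two symbol streams step by step, then slide a window to
    # form co-movement modes (the nodes of the pattern network).
    pairs = ["".join(p) for p in zip(coarse_grain(series_a),
                                     coarse_grain(series_b))]
    return ["|".join(pairs[i:i + window]) for i in range(len(pairs) - window + 1)]

def shannon_entropy(modes):
    # Shannon entropy (bits) of the empirical mode distribution.
    counts = Counter(modes)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

    Low entropy indicates that the oil-stock co-movement concentrates in a few key modes; tracking how modes convert into one another yields the directed network analyzed in the paper.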

  18. PERTS: A Prototyping Environment for Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.

    1991-01-01

    We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the supporting system software, (3) basic building blocks of distributed and intelligent real-time applications, and (4) an execution environment. PERTS will make recent and future theoretical advances in real-time system design and engineering readily usable to practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks, and for the analysis and performance profiling of prototype real-time systems.

  19. Reversible Nerve Conduction Block Using Kilohertz Frequency Alternating Current

    PubMed Central

    Kilgore, Kevin L.; Bhadra, Niloy

    2013-01-01

    Objectives The features and clinical applications of balanced-charge kilohertz frequency alternating currents (KHFAC) are reviewed. Preclinical studies of KHFAC block have demonstrated that it can produce an extremely rapid and reversible block of nerve conduction. Recent systematic analysis and experimentation utilizing KHFAC block has resulted in a significant increase in interest in KHFAC block, both scientifically and clinically. Materials and Methods We review the history and characteristics of KHFAC block, the methods used to investigate this type of block, the experimental evaluation of block, and the electrical parameters and electrode designs needed to achieve successful block. We then analyze the existing clinical applications of high frequency currents, comparing the early results with the known features of KHFAC block. Results Although many features of KHFAC block have been characterized, there is still much that is unknown regarding the response of neural structures to rapidly fluctuating electrical fields. The clinical reports to date do not provide sufficient information to properly evaluate the mechanisms that result in successful or unsuccessful treatment. Conclusions KHFAC nerve block has significant potential as a means of controlling nerve activity for the purpose of treating disease. However, early clinical studies in the use of high frequency currents for the treatment of pain have not been designed to elucidate mechanisms or allow direct comparisons to preclinical data. We strongly encourage the careful reporting of the parameters utilized in these clinical studies, as well as the development of outcome measures that could illuminate the mechanisms of this modality. PMID:23924075

  20. Viking Orbiter 1975 articulation control subsystem design analysis

    NASA Technical Reports Server (NTRS)

    Horiuchi, H. H.; Vallas, L. J.

    1973-01-01

    The articulation control subsystem, developed for the Viking Orbiter 1975 spacecraft, is a digital, multiplexed, closed-loop servo system used to control the pointing and positioning of the science scan platform and the high-gain communication antenna, and to position the solar-energy controller louver blades for the thermal control of the propellant tanks. The development, design, and analysis of the subsystem described here are preliminary. The subsystem consists of block-redundant control electronics multiplexed among eight control actuators. Each electronics block is capable of operating either individually or simultaneously with the second block. This gives the subsystem the capability of simultaneous two-actuator control, or of single-actuator control with the second block in a standby redundant mode. The results of the preliminary design and analysis indicate that the subsystem will perform satisfactorily in the Viking Orbiter 1975 mission. Some of the parameter values used, particularly those in the subsystem dynamics and the error estimates, are preliminary, and the results will be updated as more accurate parameter values become available.

  1. Comprehensive two-dimensional liquid chromatographic analysis of poloxamers.

    PubMed

    Malik, Muhammad Imran; Lee, Sanghoon; Chang, Taihyun

    2016-04-15

    Poloxamers are low molar mass triblock copolymers of poly(ethylene oxide) (PEO) and poly(propylene oxide) (PPO) that have a number of applications as non-ionic surfactants. Comprehensive one- and two-dimensional liquid chromatographic (LC) analysis of these materials is proposed in this study. The separation of oligomers of both types (PEO and PPO) is demonstrated for several commercial poloxamers. This is accomplished at the critical conditions for one of the blocks while the other block interacts with the stationary phase. Reversed-phase LC at the CAP of PEO allows oligomeric separation of the triblock copolymers with regard to the PPO block, whereas normal-phase LC at the CAP of PPO yields oligomeric separation with respect to the PEO block. The two oligomeric separations are coupled online (comprehensive 2D-LC) to produce two-dimensional contour plots through an unconventional 2D IC×IC (interaction chromatography) coupling. The study provides chemical composition mapping of both PEO and PPO, equivalent to combined molar mass and chemical composition mapping, for several commercial poloxamers. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Blind image quality assessment via probabilistic latent semantic analysis.

    PubMed

    Yang, Xichen; Sun, Quansen; Wang, Tianshu

    2016-01-01

    We propose a blind image quality assessment that is highly unsupervised and training free. The new method is based on the hypothesis that the effect caused by distortion can be expressed by certain latent characteristics. Combined with probabilistic latent semantic analysis, the latent characteristics can be discovered by applying a topic model over a visual word dictionary. Four distortion-affected features are extracted to form the visual words in the dictionary: (1) the block-based local histogram; (2) the block-based local mean value; (3) the mean value of contrast within a block; (4) the variance of contrast within a block. Based on the dictionary, the latent topics in the images can be discovered. The discrepancy between the frequency of the topics in an unfamiliar image and a large number of pristine images is applied to measure the image quality. Experimental results for four open databases show that the newly proposed method correlates well with human subjective judgments of diversely distorted images.
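    The four block-level visual-word features listed above can be sketched directly. The block size, histogram binning, and the contrast definition below are assumptions, and the pLSA topic model built on top of the visual-word dictionary is omitted.

```python
def block_features(img, bs=8):
    # Extract, per non-overlapping bs x bs block, the four features from
    # the abstract: local histogram, local mean, mean contrast, and
    # contrast variance. `img` is a 2-D list of 8-bit gray levels.
    h, w = len(img), len(img[0])
    feats = []
    for r in range(0, h - bs + 1, bs):
        for c in range(0, w - bs + 1, bs):
            block = [img[r + i][c + j] for i in range(bs) for j in range(bs)]
            hist = [0] * 8                       # coarse 8-bin histogram
            for v in block:
                hist[min(v // 32, 7)] += 1
            mean = sum(block) / len(block)
            # "Contrast" assumed as absolute deviation from the block mean.
            contrast = [abs(v - mean) for v in block]
            cmean = sum(contrast) / len(contrast)
            cvar = sum((x - cmean) ** 2 for x in contrast) / len(contrast)
            feats.append((hist, mean, cmean, cvar))
    return feats
```

    Quantizing these per-block features produces the visual words over which the latent topics are discovered; distorted images shift the topic frequencies away from those of pristine images.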

  3. Java Programs for Using Newmark's Method and Simplified Decoupled Analysis to Model Slope Performance During Earthquakes

    USGS Publications Warehouse

    Jibson, Randall W.; Jibson, Matthew W.

    2003-01-01

    Landslides typically cause a large proportion of earthquake damage, and the ability to predict slope performance during earthquakes is important for many types of seismic-hazard analysis and for the design of engineered slopes. Newmark's method for modeling a landslide as a rigid-plastic block sliding on an inclined plane provides a useful method for predicting approximate landslide displacements. Newmark's method estimates the displacement of a potential landslide block as it is subjected to earthquake shaking from a specific strong-motion record (earthquake acceleration-time history). A modification of Newmark's method, decoupled analysis, allows modeling landslides that are not assumed to be rigid blocks. This open-file report is available on CD-ROM and contains Java programs intended to facilitate both rigorous and simplified Newmark sliding-block analysis and a simplified model of decoupled analysis. For rigorous analysis, 2160 strong-motion records from 29 earthquakes are included along with a search interface for selecting records based on a wide variety of record properties. Utilities are available that allow users to add their own records to the program and use them for conducting Newmark analyses. Also included is a document containing detailed information about how to use Newmark's method to model dynamic slope performance. This program will run on any platform that supports the Java Runtime Environment (JRE) version 1.3, including Windows, Mac OS X, Linux, Solaris, etc. A minimum of 64 MB of available RAM is needed, and the fully installed program requires 400 MB of disk space.
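    The rigorous rigid-block analysis described above amounts to double-integrating the portion of the ground acceleration that exceeds the block's critical (yield) acceleration. A minimal sketch of that integration, not taken from the report's Java programs, with a purely hypothetical input record:

    ```python
    G = 9.81  # gravitational acceleration, m/s^2

    def newmark_displacement(accel, dt, a_crit):
        """Rigid-block Newmark analysis: the block accelerates while the
        ground acceleration exceeds the critical acceleration a_crit, then
        decelerates at a_crit until it re-locks; the cumulative displacement
        is the double integral of the excess acceleration."""
        v = 0.0  # relative velocity of the block (m/s)
        d = 0.0  # cumulative downslope displacement (m)
        for a in accel:
            if v > 0.0 or a > a_crit:
                v = max(v + (a - a_crit) * dt, 0.0)  # no upslope sliding
                d += v * dt
        return d

    # Synthetic record (hypothetical, not one of the report's 2160 records):
    # a 1 s pulse at 0.3 g against a 0.1 g critical acceleration.
    accel = [0.3 * G] * 100 + [0.0] * 200   # sampled at dt = 0.01 s
    disp = newmark_displacement(accel, dt=0.01, a_crit=0.1 * G)  # ~2.9 m
    ```

    Real analyses apply this to digitized strong-motion records and usually average the displacements from the record taken in both polarities.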

  4. Opioid-sparing effects of the thoracic interfascial plane blocks: A meta-analysis of randomized controlled trials

    PubMed Central

    Singh, Preet Mohinder; Borle, Anuradha; Kaur, Manpreet; Trikha, Anjan; Sinha, Ashish

    2018-01-01

    Background: Thoracic interfascial plane blocks and their modifications (PECS) have recently gained popularity for their analgesic potential during breast surgery. We evaluate and consolidate the evidence on the opioid-sparing effect of PECS blocks in comparison with conventional intravenous analgesia (IVA) and paravertebral block (PVB). Materials and Methods: Prospective, randomized controlled trials comparing PECS block to conventional IVA or PVB in patients undergoing breast surgery, published till June 2017, were searched in the medical databases. Comparisons were made for 24-h postoperative morphine consumption and intraoperative fentanyl-equivalent consumption. Results: The final analysis included nine trials (PECS vs. IVA: 4 trials; PECS vs. PVB: 5 trials). PECS block showed a decreased intraoperative fentanyl consumption over IVA by 49.20 mcg (95% confidence interval [CI] = 42.67–55.74) (I2 = 98.47%, P < 0.001) and over PVB by 15.88 mcg (95% CI = 12.95–18.81) (I2 = 95.51%, P < 0.001). Postoperative 24-h morphine consumption with PECS block was lower than with IVA by 7.66 mg (95% CI = 6.23–9.10) (I2 = 63.15, P < 0.001) but was higher than in the PVB group by 1.26 mg (95% CI = 0.91–1.62) (I2 = 99.53%, P < 0.001). Two cases of pneumothorax were reported with PVB, and no complication was reported in any other group. Conclusions: Use of PECS block and its modifications with general anesthesia for breast surgery has a significant opioid-sparing effect intraoperatively and during the first 24 h after surgery. It also has a higher intraoperative opioid-sparing effect when compared to PVB. During the first postoperative day, PVB has slightly more morphine-sparing potential, which may, however, be associated with higher complication rates. The present PECS block techniques show marked interstudy variations and need standardization. PMID:29416465

  5. Soft-decision decoding techniques for linear block codes and their error performance analysis

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1996-01-01

    The first paper presents a new minimum-weight trellis-based soft-decision iterative decoding algorithm for binary linear block codes. The second paper derives an upper bound on the probability of block error for multilevel concatenated codes (MLCC). The bound evaluates the difference in performance for different decompositions of some codes. The third paper investigates the bit error probability for maximum likelihood decoding of binary linear codes. The fourth and final paper included in this report concerns itself with the construction of multilevel concatenated block modulation codes using a multilevel concatenation scheme for the frequency non-selective Rayleigh fading channel.

  6. Influence of additional heat exchanger block on directional solidification system for growing multi-crystalline silicon ingot - A simulation investigation

    NASA Astrophysics Data System (ADS)

    Nagarajan, S. G.; Srinivasan, M.; Aravinth, K.; Ramasamy, P.

    2018-04-01

    Transient simulations have been carried out to analyze the heat transfer properties of a Directional Solidification (DS) furnace. The simulation results revealed that an additional heat exchanger block under the bottom insulation of the DS furnace enhances control of the solidification of the silicon melt. A controlled heat extraction rate during solidification is a prerequisite for growing good-quality ingots, and this has been achieved by the additional heat exchanger block. As the additional heat exchanger block, a water-circulating plate has been placed under the bottom insulation. The heat flux analysis of the DS system and the temperature distribution studies of the grown ingot confirm that the additional heat exchanger block on the DS system gives an additional benefit to the mc-Si ingot.

  7. Implicit Block ACK Scheme for IEEE 802.11 WLANs

    PubMed Central

    Sthapit, Pranesh; Pyun, Jae-Young

    2016-01-01

    The throughput of the IEEE 802.11 standard is significantly bounded by the associated Medium Access Control (MAC) overhead. Because of this overhead, an upper limit on throughput exists even when data rates are extremely high. Therefore, overhead reduction is necessary to achieve higher throughput. The IEEE 802.11e amendment introduced the block ACK mechanism to reduce the number of control messages in the MAC. Although the block ACK scheme greatly reduces overhead, further improvements are possible. In this letter, we propose an implicit block ACK method that further reduces the overhead associated with IEEE 802.11e's block ACK scheme. Mathematical analysis results are presented for both the original protocol and the proposed scheme. A performance improvement of greater than 10% was achieved with the proposed implementation.

  8. Improved 3-D turbomachinery CFD algorithm

    NASA Technical Reports Server (NTRS)

    Janus, J. Mark; Whitfield, David L.

    1988-01-01

    The building blocks of a computer algorithm developed for the time-accurate flow analysis of rotating machines are described. The flow model is a finite volume method utilizing a high-resolution approximate Riemann solver for interface flux definitions. This block LU implicit numerical scheme possesses apparent unconditional stability. Multi-block composite gridding is used to partition the field in an orderly manner into a specified arrangement. Block interfaces, including dynamic interfaces, are treated so as to mimic interior block communication. Special attention is given to the reduction of in-core memory requirements by placing the burden on secondary storage media. Broad applicability is implied, although the results presented are restricted to an even blade count configuration. Several other configurations are presently under investigation, the results of which will appear in subsequent publications.

  9. The Tsaoling 1941 Landslide, New Insight of Numerical Simulation of Discrete Element Model

    NASA Astrophysics Data System (ADS)

    Tang, C.-L.; Hu, J.-C.; Lin, M.-L.

    2009-04-01

    Large earthquakes in southeastern Taiwan are not rare in the historical catalogue. At Tsaoling, located southeast of Taiwan, the last five large landslides occurred in the 19th and 20th centuries. From the literature about the Tsaoling landslide, we identified four characteristics: (1) repeated occurrence, (2) multiple sliding surfaces, (3) huge landslide blocks, and (4) some people survived after sliding a long distance (>2 km). This is why we want to understand the causes and mechanisms of the repeated landslides at Tsaoling. However, there is no record of the landslide in 1862, and most of the landslide evidence has disappeared. Hence, this study focuses on the landslide dynamics of the 1941 event. The Tsaoling area is located in a large monocline dipping toward the south-southwest. The dip of the strata toward the SSW is similar on both sides of the Chinshui River valley. The bedrock of the Tsaoling area is Pliocene in age and belongs to the upper Chinshui Shale and the lower Cholan Formation. Plane failure analysis and the Newmark displacement method have been common tools for slope stability in recent years. However, plane failure analysis can only provide a safety factor: when the safety factor (FS) is less than 1, it can only indicate that the slope is unstable. The result of the Newmark displacement method is a value of displacement length. Both analyses assume a rigid body. For a large landslide like the Tsaoling landslide, where the volume of the landslide masses exceeds 10⁸ m³, the landslide block cannot be considered a rigid body. We therefore treated the block as a quasi-rigid body, because the blocks are deformable and jointed. The original version of the Distinct Element Method (DEM) was devoted to the modeling of rock-block systems and was later applied to the modeling of granular material.
The calculation cycle in PFC2D is a time-stepping algorithm that consists of the repeated application of the law of motion to each particle, a force-displacement law to each contact, and a constant updating of wall positions. The physical properties of the particles in the model can be traced in the time domain (i.e., velocity, displacement, force, and stress). During the simulation we can track the variation of these physical properties, so the inter-block changes of displacement, force, and stress can be monitored. After the seismic shaking, the result of the PFC model can be divided into three portions: upper (thick), middle (transitional) and lower (thin). The shear displacements of the three parts on the sliding plane are not in agreement: the displacement of the lower part is larger than that of the upper and middle parts, and the shear displacement of the middle part lies between the two. During the earthquake shaking, the different parts of the block collided with each other; the upper part was knocked back and stayed in its original position or slid a short distance, while the lower part was knocked downslope by the upper block. The collisions pushed the lower part of the block down a certain length. This shear displacement was enough for the sliding plane to lose its strength and induce the landslide during the 1941 earthquake. The upper part of the block stayed on the slope but remained unstable. Eight months later, the upper part of the block slid down, triggered by a 700 mm downpour over three days.

  10. Tectonic geomorphology of the New Madrid seismic zone based on imaging of digital topographic data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, L.

    1993-03-01

    Topographic analysis using digital elevation data of the New Madrid region focuses on topographic features that occur at several spatial scales and can be used to delineate distinct anomalies. In this region, topographic anomalies occur as domal or elongate uplifts and bowl-shaped depressions approximately 1--10 km in size, topographic lineaments, and differences in topographic blocking across 50 km long boundaries. In order to fully explain these topographic anomalies, tectonic processes may be required. Imaging is based on digital topographic data from USGS 30 arc-second, 3 arc-second, and 30 m resolutions. Imaging of these data uses standard image processing techniques to examine topography within the context of geomorphological hypothesis testing. A good example is the use of thresholding to highlight areas of unusually high elevation given the hypothesis of fluvial landscape architecture. Thresholding delineates topographic features such as the Tiptonville dome, which is strongly believed to be tectonic in origin. To determine the pattern of topographic blocking, defined as the pattern that topography assumes when constrained by active forces other than erosion alone, low-frequency-passing spatial convolutions are used as filters and the resulting data are sliced into blocks according to pseudoelevations that produce a stable block pattern. The resultant blocks are analyzed according to their structural pattern of block size and block orientation. This analysis suggests that a topographic boundary cuts across the Mississippi embayment from near the Newport pluton on the west to the area south of Memphis on the east.

  11. Pattern-based IP block detection, verification, and variability analysis

    NASA Astrophysics Data System (ADS)

    Ahmad Ibrahim, Muhamad Asraf Bin; Muhsain, Mohamad Fahmi Bin; Kamal Baharin, Ezni Aznida Binti; Sweis, Jason; Lai, Ya-Chieh; Hurat, Philippe

    2018-03-01

    The goal of a foundry partner is to deliver high-quality silicon product to its customers on time. There is an assumed trust that the silicon will yield, function and perform as expected when the design fits all the sign-off criteria. The use of Intellectual Property (IP) blocks is very common today and provides the customer with pre-qualified and optimized functions for their design, thus shortening the design cycle. There are many methods by which an IP Block can be generated and placed within a layout. Even with the most careful methods and following of guidelines comes the responsibility of sign-off checking. A foundry needs to detect where these IP Blocks have been placed and look for any violations. This includes DRC-clean modifications to the IP Block, which may or may not be intentional. Using a pattern-based approach to detect all IP Blocks used provides the foundry advanced capabilities to analyze them further for any kind of change that could void the OPC and process window optimizations. Any change in an IP Block could cause functionality changes or even failures. This also opens the foundry to legal and cost issues while forcing re-spins of the design. In this publication, we discuss the methodology we have employed to avoid process issues and tape-out errors while at the same time reducing our manual work and improving turnaround time. We are also able to use our pattern analysis to improve our OPC optimizations when previously unseen modifications are encountered.

  12. Experimental investigation and CFD analysis on cross flow in the core of PMR200

    DOE PAGES

    Lee, Jeong -Hun; Yoon, Su -Jong; Cho, Hyoung -Kyu; ...

    2015-04-16

    The Prismatic Modular Reactor (PMR) is one of the major Very High Temperature Reactor (VHTR) concepts, which consists of hexagonal prismatic fuel blocks and reflector blocks made of nuclear-grade graphite. However, the shape of the graphite blocks can easily be changed by neutron damage during reactor operation, and the shape change can create gaps between the blocks, inducing bypass flow. In the VHTR core, two types of gaps can be formed: a vertical gap and a horizontal gap, called the bypass gap and the cross gap, respectively. The cross gap complicates the flow field in the reactor core by connecting the coolant channel to the bypass gap, and it can lead to a loss of effective coolant flow in the fuel blocks. Thus, a cross flow experimental facility was constructed to investigate cross flow phenomena in the core of the VHTR, and a series of experiments was carried out under varying flow rates and gap sizes. The results of the experiments were compared with CFD (Computational Fluid Dynamics) analysis results in order to verify its prediction capability for the cross flow phenomena. Fairly good agreement was seen between experimental results and CFD predictions, and the local characteristics of the cross flow are discussed in detail. Based on the calculation results, the pressure loss coefficient across the cross gap was evaluated, which is necessary for the thermo-fluid analysis of the VHTR core using a lumped parameter code.

  13. PERTS: A Prototyping Environment for Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.

    1993-01-01

    PERTS is a prototyping environment for real-time systems. It is being built incrementally and will contain basic building blocks of operating systems for time-critical applications, tools, and performance models for the analysis, evaluation and measurement of real-time systems and a simulation/emulation environment. It is designed to support the use and evaluation of new design approaches, experimentations with alternative system building blocks, and the analysis and performance profiling of prototype real-time systems.

  14. M-X Environmental Technical Report. Alternative Potential Operating Base Locations, Coyote Spring Valley.

    DTIC Science & Technology

    1980-12-22

    Keywords: MX; Coyote Spring, Nevada; Siting Analysis; Nevada Environmental Report. Abstract: The area of analysis (AO) for the Coyote Spring Valley operating base option includes both Clark and Lincoln counties, and is located in the southern portion of the designated region of influence. Las Vegas and the surrounding suburbs are the major settlements and

  15. Book Analysis: Challenger: A Major Malfunction.

    DTIC Science & Technology

    1988-04-01

    Report number: 88-113S. Title: Book Analysis: Challenger: A Major Malfunction. Author: Major Thomas M. Hall, USAF. Faculty advisor: Lt Col John R. Grellman. Abstract: This report analyzes Challenger

  16. Analysis of High Precision GPS Time Series and Strain Rates for the Geothermal Play Fairway Analysis of Washington State Prospects Project

    DOE Data Explorer

    Michael Swyer

    2015-02-22

    Global Positioning System (GPS) time series from the National Science Foundation (NSF) EarthScope Plate Boundary Observatory (PBO) and Central Washington University's Pacific Northwest Geodetic Array (PANGA). GPS station velocities were used to infer strain rates using the 'splines in tension' method. Strain rates were derived separately for subduction zone locking at depth and for block rotation near the surface within crustal block boundaries.

  17. The Effects of Presession Manipulations on Automatically Maintained Challenging Behavior and Task Responding

    ERIC Educational Resources Information Center

    Chung, Yi-Chieh; Cannella-Malone, Helen I.

    2010-01-01

    This study examined the effects of presession exposure to attention, response blocking, attention with response blocking, and noninteraction conditions on subsequent engagement in automatically maintained challenging behavior and correct responding in four individuals with significant intellectual disabilities. Following a functional analysis, the…

  18. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research.

    PubMed

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D; Cui, Licong

    2015-11-10

    A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (validatability), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. This paper introduces a novel method called "Randomized N-gram Hashing (NHash)" for generating unique study identifiers in a distributed and validatable fashion in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk of identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. NHash consists of 2 phases. First, an intermediate string is generated using randomized N-gram hashing. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output of Phase 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as a shift cipher.
The result of the encryption, concatenated with the random number r, is the final NHash study identifier. We performed experiments using a large synthesized dataset comparing NHash with random strings, and demonstrated a negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute of Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash.
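The two phases above can be sketched as follows. This is only an illustrative reading of the abstract: the wrap-around behavior at the end of m, the cipher alphabet, and the way the random numbers are appended to the final identifier are assumptions the abstract does not fully specify.

```python
def ngram(r, n, m):
    """f(r, n, m): the n-gram of m starting at s = (r mod |m|).
    Wrapping past the end of m is an assumption; the abstract does not
    specify end-of-string handling."""
    s = r % len(m)
    return (m + m)[s:s + n]

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

def shift_cipher(text, k):
    """Caesar-style shift over an illustrative alphanumeric alphabet."""
    return "".join(
        ALPHABET[(ALPHABET.index(c) + k) % len(ALPHABET)]
        if c in ALPHABET else c
        for c in text)

def nhash(fields, rs, ns, shift=7):
    """Phase 1: concatenate f_i(r_i, n_i, m_i) over the input fields.
    Phase 2: shift-cipher the intermediate string and append the random
    numbers (format of the appended randomness is an assumption)."""
    intermediate = "".join(ngram(r, n, m) for r, n, m in zip(rs, ns, fields))
    return shift_cipher(intermediate, shift) + "-" + "".join(map(str, rs))
```

Because the random numbers travel with the identifier, any site can regenerate the cipher text from the same inputs and validate a received identifier, which is the error-tolerance property described above.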

  19. Fabrication and study of properties of magnetite nanoparticles in hybrid micelles of polystyrene-block-polyethylene oxide and sodium dodecyl sulfate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loginova, T. P., E-mail: tlg@ineos.ac.ru; Timofeeva, G. I.; Lependina, O. L.

    2016-01-15

    Magnetite nanoparticles have been formed for the first time in hybrid micelles of polystyrene-block-polyethylene oxide and sodium dodecyl sulfate in water by ultrasonic treatment at room temperature. Analysis by small-angle X-ray scattering and transmission electron microscopy (TEM) showed that the magnetite nanoparticles in the hybrid micelles of block copolymer and sodium dodecyl sulfate are polydisperse (with sizes from 0.5 to 20 nm). The specific magnetization of solid samples has been measured.

  20. Identification and interpretation of tectonic features from Skylab imagery. [Mojave Desert block of Texas, Arizona, and Chihuahua, Mexico

    NASA Technical Reports Server (NTRS)

    Abdel-Gawad, M. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Two alternate models for the extension of the Texas zone through the Mojave Desert block have been developed: (1) along the Pisgah Line, and (2) along the eastern Transverse Ranges; this model suggests a counterclockwise rotation of the Mojave block. Analysis of S190B photographs of the western Mojave Desert provides strong evidence for the feasibility of identifying recent fault breaks.

  1. A deblocking algorithm based on color psychology for display quality enhancement

    NASA Astrophysics Data System (ADS)

    Yeh, Chia-Hung; Tseng, Wen-Yu; Huang, Kai-Lin

    2012-12-01

    This article proposes a post-processing deblocking filter to reduce blocking effects. The proposed algorithm detects blocking effects by fusing the results of Sobel edge detector and wavelet-based edge detector. The filtering stage provides four filter modes to eliminate blocking effects at different color regions according to human color vision and color psychology analysis. Experimental results show that the proposed algorithm has better subjective and objective qualities for H.264/AVC reconstructed videos when compared to several existing methods.
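    As a rough illustration of the detection stage, the following sketch scores blocking artifacts by comparing the Sobel gradient magnitude on 8-pixel block boundaries with the response elsewhere. This is a simplified stand-in for the paper's detector, which fuses Sobel with a wavelet-based edge detector and applies color-region-dependent filter modes; those parts are omitted here.

    ```python
    def sobel_mag(img, y, x):
        """Sobel gradient magnitude at interior pixel (y, x) of a
        grayscale image given as nested lists."""
        gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
              - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
        gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
              - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
        return (gx * gx + gy * gy) ** 0.5

    def blockiness(img, bsize=8):
        """Ratio of the mean Sobel response on block-boundary pixels to
        the mean response elsewhere; values well above 1 indicate visible
        blocking artifacts, while a flat image scores ~1."""
        h, w = len(img), len(img[0])
        boundary, interior = [], []
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                m = sobel_mag(img, y, x)
                (boundary if x % bsize == 0 or y % bsize == 0
                 else interior).append(m)
        eps = 1e-9  # avoid division by zero on perfectly flat images
        return ((sum(boundary) / len(boundary) + eps)
                / (sum(interior) / len(interior) + eps))
    ```

    A deblocking filter would then smooth only the boundary pixels whose score exceeds a threshold, so that genuine image edges are preserved.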

  2. Clinical safety and effectiveness of transversus abdominis plane (TAP) block in post-operative analgesia: a systematic review and meta-analysis.

    PubMed

    Ma, Ning; Duncan, Joanna K; Scarfe, Anje J; Schuhmann, Susanne; Cameron, Alun L

    2017-06-01

    Transversus abdominis plane (TAP) blocks can provide postoperative analgesia for a range of surgeries. Numerous clinical trials have assessed the TAP block and shown positive analgesic effects. This systematic review assesses the safety and effectiveness outcomes of the TAP block in all clinical settings, compared with both active (standard care) and inactive (placebo) comparators. PubMed, EMBASE, The Cochrane Library and the University of York CRD databases were searched. RCTs were screened for eligibility and assessed for risk of bias. Meta-analyses were performed on the available data. TAP block showed a safety profile equivalent to all comparators in the incidence of nausea (OR = 1.07) and vomiting (OR = 0.81). TAP block was more effective in reducing morphine consumption [MD = 13.05, 95% CI (8.33, 51.23)] and in delaying time to first analgesic request [MD = 123.49, 95% CI (48.59, 198.39)]. Postoperative pain within 24 h was reduced, or at least equivalent, with TAP block compared to its comparators. Therefore, the TAP block is a safe and effective procedure compared to standard care, placebo and other analgesic techniques. Further research is warranted to investigate whether the TAP block technique can be improved by optimizing dose- and technique-related factors.

  3. New self-assembly strategies for next generation lithography

    NASA Astrophysics Data System (ADS)

    Schwartz, Evan L.; Bosworth, Joan K.; Paik, Marvin Y.; Ober, Christopher K.

    2010-04-01

    Future demands of the semiconductor industry call for robust patterning strategies for critical dimensions below twenty nanometers. The self-assembly of block copolymers stands out as a promising, potentially lower-cost alternative to other technologies such as e-beam or nanoimprint lithography. One approach is to use block copolymers that can be lithographically patterned by incorporating a negative-tone photoresist as the majority (matrix) phase of the block copolymer, paired with a photoacid generator and a crosslinker moiety. In this system, poly(α-methylstyrene-block-hydroxystyrene) (PαMS-b-PHOST), the block copolymer is spin-coated as a thin film, processed to a desired microdomain orientation with long-range order, and then photopatterned. Therefore, self-assembly of the block copolymer only occurs in select areas due to the crosslinking of the matrix phase, and the minority-phase polymer can be removed to produce a nanoporous template. Using bulk TEM analysis, we demonstrate how the critical dimension of this block copolymer scales with polymer molecular weight according to a simple power-law relation. Enthalpic interactions such as hydrogen bonding are used to blend in inorganic additives in order to enhance the etch resistance of the PHOST block. We demonstrate how lithographically patternable block copolymers might fit into future processing strategies to produce etch-resistant self-assembled features at length scales impossible with conventional lithography.

  4. Polyacrylonitrile block copolymers for the preparation of a thin carbon coating around TiO2 nanorods for advanced lithium-ion batteries.

    PubMed

    Oschmann, Bernd; Bresser, Dominic; Tahir, Muhammad Nawaz; Fischer, Karl; Tremel, Wolfgang; Passerini, Stefano; Zentel, Rudolf

    2013-11-01

    Herein, a new method for the realization of a thin and homogenous carbonaceous particle coating, made by carbonizing RAFT polymerization derived block copolymers anchored on anatase TiO2 nanorods, is presented. These block copolymers consist of a short anchor block (based on dopamine) and a long, easily graphitizable block of polyacrylonitrile. The grafting of such block copolymers to TiO2 nanorods creates a polymer shell, which can be visualized by atomic force microscopy (AFM). Thermal treatment at 700 °C converts the polyacrylonitrile block to partially graphitic structures (as determined by Raman spectroscopy), establishing a thin carbon coating (as determined by transmission electron microscopy, TEM, analysis). The carbon-coated TiO2 nanorods show improved electrochemical performance in terms of achievable specific capacity and, particularly, long-term cycling stability by reducing the average capacity fading per cycle from 0.252 mAh g(-1) to only 0.075 mAh g(-1) . © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. New thiol-responsive mono-cleavable block copolymer micelles labeled with single disulfides.

    PubMed

    Sourkohi, Behnoush Khorsand; Schmidt, Rolf; Oh, Jung Kwon

    2011-10-18

    Thiol-responsive symmetric triblock copolymers having single disulfide linkages in the middle blocks (called mono-cleavable block copolymers, ss-ABP(2)) were synthesized by atom transfer radical polymerization in the presence of a disulfide-labeled difunctional Br-initiator. These brush-like triblock copolymers consist of a hydrophobic polyacrylate block having pendent oligo(propylene oxide) and a hydrophilic polymethacrylate block having pendent oligo(ethylene oxide). Gel permeation chromatography and (1)H NMR results confirmed the synthesis of well-defined mono-cleavable block copolymers and revealed that the polymerizations were well controlled. Because of their amphiphilic nature, these copolymers self-assembled to form colloidally stable micelles above the critical micellar concentration of 0.032 mg·mL(-1). In response to reductive reactions, the disulfides in the thiol-responsive micelles were cleaved. Atomic force microscopy and dynamic light scattering analysis suggested that the cleavage of the disulfides caused dissociation of the micelles into smaller assembled structures in water. Moreover, from a biomedical perspective, the mono-cleavable block copolymer micelles are not cytotoxic and thus biocompatible. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Rockfall trajectory modelling by the integration of Digital Terrestrial Photogrammetry, Laser Scanning and GIS

    NASA Astrophysics Data System (ADS)

    Francioni, Mirko; Salvini, Riccardo; Riccucci, Silvia; Guastaldi, Enrico; Ortolano, Fabrizio; Bonciani, Filippo; Callegari, Ivan; Fantozzi, Pierlorenzo

    2010-05-01

    The present paper describes the runout analysis of rocky unstable blocks on the slope, 500 m wide and 600 m high, overhanging the railroad line Domodossola - Iselle, Italy. In addition to the traditional geological, geomorphological and engineering-geological surveys, DTP (Digital Terrestrial Photogrammetry) by means of an helicopter was used to perform a detailed analysis of rocky blocks sited in inaccessible areas. In order to accomplish the analysis, DTP was combined with LS (Laser Scanning) to build the DDSM (Digital Dense Surface Model) of the slope. Aim of the work is the assessment of the rockfalls potentially dangerous for the railroad line, the assessment of the efficiency of existing protection measures and the prompt of mitigation strategies and monitoring. In order to collect the exact position and size of blocks and wedges, a digital interpretation of stereopairs coming from DTP has been carried out. The photointerpretation has been used to realize the land cover map (ex. outcropping rock, soil covered by vegetation) and to recognize the mitigation and protection measures already installed. Starting from blocks position the DDSM has allowed to determine the probable trajectories of rockfall along the slope. These have been calculated by means of a GIS procedure by the use of the ArcHydro module of EsriTM ArcMap assuming a correspondence between probable trajectories and flowdirection. The morphologic profile of rock falling paths has been obtained by the interpolation of 3D points coming from a properly procedure developed inside EsriTM Arcinfo Workstation environment integrated with the Easy Profiler tool of EsriTM ArcMap. The physical-mechanical characteristics of blocks, the morphologic profile, the land cover and the location of the protection barriers (classified according to the height - from 2 to 4 m - and to the preservation status), have been used as input data in RocFall2D (RoscienceTM) software to calculate the runout analysis. 
Local slope land cover was treated with a statistical approach utilizing the coefficients of normal and tangential restitution; in this way, probabilistic results for rockfall end points and for kinetic energy along the falling paths and at the barriers were obtained. Given the proximity of the railroad line, the analysis showed a high probability that some unstable blocks reach the train track. Others end their fall mainly in vegetated and less steep areas, while the remaining blocks are stopped by the existing protection measures. The results of this work allowed hazard zoning with respect to the railway; moreover, by comparing them with the results of the rock slope stability analysis, it was possible to suggest appropriate protection methods for the different areas.
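The assumed correspondence between rockfall trajectories and flow direction rests on the D8 rule used by GIS flow-direction tools: each cell drains toward its steepest-descent neighbour. A minimal illustrative sketch on a toy elevation grid, not the ArcHydro implementation:

```python
# D8 flow direction: each cell points to the lowest of its 8 neighbours.
# Illustrative stand-in for the ArcHydro flow-direction step described above.

def d8_flow_direction(dem, row, col):
    """Return the (dr, dc) offset of the steepest-descent neighbour, or None at a pit."""
    nrows, ncols = len(dem), len(dem[0])
    best, best_off = None, None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            r, c = row + dr, col + dc
            if 0 <= r < nrows and 0 <= c < ncols:
                dist = (dr * dr + dc * dc) ** 0.5  # diagonal neighbours are farther
                drop = (dem[row][col] - dem[r][c]) / dist
                if best is None or drop > best:
                    best, best_off = drop, (dr, dc)
    return best_off if best is not None and best > 0 else None

# Toy slope dipping toward the lower-right corner; the "trajectory" follows D8.
dem = [[9, 8, 7],
       [8, 6, 5],
       [7, 5, 3]]
path = [(0, 0)]
while True:
    off = d8_flow_direction(dem, *path[-1])
    if off is None:
        break
    path.append((path[-1][0] + off[0], path[-1][1] + off[1]))
```

On this grid the path runs down the diagonal, mimicking a trajectory that follows the steepest descent of the DDSM.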

  7. Cost Utility Analysis of Cervical Therapeutic Medial Branch Blocks in Managing Chronic Neck Pain

    PubMed Central

    Manchikanti, Laxmaiah; Pampati, Vidyasagar; Kaye, Alan D.; Hirsch, Joshua A.

    2017-01-01

    Background: Controlled diagnostic studies have established the prevalence of cervical facet joint pain to range from 36% to 67% based on the criterion standard of ≥ 80% pain relief. Treatment of cervical facet joint pain has been described with Level II evidence of effectiveness for therapeutic facet joint nerve blocks and radiofrequency neurotomy and with no significant evidence for intraarticular injections. However, there have not been any cost effectiveness or cost utility analysis studies performed in managing chronic neck pain with or without headaches with cervical facet joint interventions. Study Design: Cost utility analysis based on the results of a double-blind, randomized, controlled trial of cervical therapeutic medial branch blocks in managing chronic neck pain. Objectives: To assess the cost utility of therapeutic cervical medial branch blocks in managing chronic neck pain. Methods: A randomized trial was conducted in a specialty referral private practice interventional pain management center in the United States. This trial assessed the clinical effectiveness of therapeutic cervical medial branch blocks with or without steroids for an established diagnosis of cervical facet joint pain by means of controlled diagnostic blocks. Cost utility analysis was performed with direct payment data for the procedures for a total of 120 patients over a period of 2 years from this trial, based on reimbursement rates of 2016. The payment data provided direct procedural costs without inclusion of drug treatments. An additional 40% was added to procedural costs, through multiplication by a factor of 1.67, to provide estimated total costs including direct and indirect costs, based on highly regarded surgical literature. Outcome measures included significant improvement, defined as at least a 50% improvement with reduction in pain and disability status, with a combined 50% or more reduction in pain and Neck Disability Index (NDI) scores. 
Results: The results showed direct procedural costs per one-year improvement in quality adjusted life year (QALY) of United States Dollars (USD) $2,552, and overall costs of USD $4,261. Overall, each patient on average received 5.7 ± 2.2 procedures over a period of 2 years. Average significant improvement per procedure was 15.6 ± 12.3 weeks, and average significant improvement over 2 years per patient was 86.0 ± 24.6 weeks. Limitations: The limitations of this cost utility analysis are that the data are based on a single-center evaluation. Only costs of therapeutic interventional procedures and physician visits were included, with extrapolation of indirect costs. Conclusion: The cost utility analysis of therapeutic cervical medial branch blocks in the treatment of chronic neck pain non-responsive to conservative management demonstrated clinical effectiveness and cost utility at USD $4,261 per one year of QALY. PMID:29200944
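The conversion from direct to total costs described in the abstract is simple arithmetic: if indirect costs are assumed to be 40% of the total, the direct cost is scaled by 1/(1 - 0.40) ≈ 1.67. A minimal sketch of that step (the small gap to the published USD $4,261 comes from rounding the multiplier to 1.67 in the original):

```python
# Reproduces the cost arithmetic in the abstract: indirect costs assumed to be
# 40% of total cost, so total = direct / (1 - 0.40) ≈ direct × 1.67.
# Dollar figures are the published values; everything else is arithmetic.

direct_per_qaly = 2552                    # USD, direct procedural cost per QALY
indirect_share = 0.40                     # assumed indirect share of total cost
multiplier = 1 / (1 - indirect_share)     # ≈ 1.67
total_per_qaly = round(direct_per_qaly * multiplier)
```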

  8. Cost Utility Analysis of Cervical Therapeutic Medial Branch Blocks in Managing Chronic Neck Pain.

    PubMed

    Manchikanti, Laxmaiah; Pampati, Vidyasagar; Kaye, Alan D; Hirsch, Joshua A

    2017-01-01

    Background: Controlled diagnostic studies have established the prevalence of cervical facet joint pain to range from 36% to 67% based on the criterion standard of ≥ 80% pain relief. Treatment of cervical facet joint pain has been described with Level II evidence of effectiveness for therapeutic facet joint nerve blocks and radiofrequency neurotomy and with no significant evidence for intraarticular injections. However, there have not been any cost effectiveness or cost utility analysis studies performed in managing chronic neck pain with or without headaches with cervical facet joint interventions. Study Design: Cost utility analysis based on the results of a double-blind, randomized, controlled trial of cervical therapeutic medial branch blocks in managing chronic neck pain. Objectives: To assess the cost utility of therapeutic cervical medial branch blocks in managing chronic neck pain. Methods: A randomized trial was conducted in a specialty referral private practice interventional pain management center in the United States. This trial assessed the clinical effectiveness of therapeutic cervical medial branch blocks with or without steroids for an established diagnosis of cervical facet joint pain by means of controlled diagnostic blocks. Cost utility analysis was performed with direct payment data for the procedures for a total of 120 patients over a period of 2 years from this trial, based on reimbursement rates of 2016. The payment data provided direct procedural costs without inclusion of drug treatments. An additional 40% was added to procedural costs, through multiplication by a factor of 1.67, to provide estimated total costs including direct and indirect costs, based on highly regarded surgical literature. Outcome measures included significant improvement, defined as at least a 50% improvement with reduction in pain and disability status, with a combined 50% or more reduction in pain and Neck Disability Index (NDI) scores. 
Results: The results showed direct procedural costs per one-year improvement in quality adjusted life year (QALY) of United States Dollar (USD) of $2,552, and overall costs of USD $4,261. Overall, each patient on average received 5.7 ± 2.2 procedures over a period of 2 years. Average significant improvement per procedure was 15.6 ± 12.3 weeks and average significant improvement in 2 years per patient was 86.0 ± 24.6 weeks. Limitations: The limitations of this cost utility analysis are that data are based on a single center evaluation. Only costs of therapeutic interventional procedures and physician visits were included, with extrapolation of indirect costs. Conclusion: The cost utility analysis of therapeutic cervical medial branch blocks in the treatment of chronic neck pain non-responsive to conservative management demonstrated clinical effectiveness and cost utility at USD $4,261 per one year of QALY.

  9. Safety and operational analysis of lane widths in mid-block segments and intersection approaches in the urban environment in Nebraska.

    DOT National Transportation Integrated Search

    2015-04-01

    This research examined the safety and operational effects of roadway lane width on mid-block segments between signalized intersections as well as on signalized intersection approaches in the urban environments of Lincoln and Omaha, Nebraska. In t...

  10. Module performance and failure analysis area: Flat-plate solar array project

    NASA Technical Reports Server (NTRS)

    Tornstrom, E.

    1984-01-01

    A redesign of the initial (Group I) Mobile Solar Block V module was done and documented. Manufacturing experience and accelerated test data from Group I formed the basis for the redesign. Ten Block V Group II modules were submitted for evaluation and the results are presented.

  11. Observing atmospheric blocking with GPS radio occultation - one decade of measurements

    NASA Astrophysics Data System (ADS)

    Brunner, Lukas; Steiner, Andrea

    2017-04-01

    Atmospheric blocking has received a lot of attention in recent years due to its impact on mid-latitude circulation and subsequently on weather extremes such as cold and warm spells. So far blocking studies have been based mainly on re-analysis data or model output. However, it has been shown that blocking frequency exhibits considerable inter-model spread in current climate models. Here we use one decade (2006 to 2016) of satellite-based observations from GPS radio occultation (RO) to analyze blocking in RO data building on work by Brunner et al. (2016). Daily fields on a 2.5°×2.5° longitude-latitude grid are calculated by applying an adequate gridding strategy to the RO measurements. For blocking detection we use a standard blocking detection algorithm based on 500 hPa geopotential height (GPH) gradients. We investigate vertically resolved atmospheric variables such as GPH, temperature, and water vapor before, during, and after blocking events to increase process understanding. Moreover, utilizing the coverage of the RO data set, we investigate global blocking frequencies. The main blocking regions in the northern and southern hemisphere are identified and the (vertical) atmospheric structure linked to blocking events is compared. Finally, an inter-comparison of results from RO data to different re-analyses, such as ERA-Interim, MERRA 2, and JRA-55, is presented. Brunner, L., A. K. Steiner, B. Scherllin-Pirscher, and M. W. Jury (2016): Exploring atmospheric blocking with GPS radio occultation observations. Atmos. Chem. Phys., 16, 4593-4604, doi:10.5194/acp-16-4593-2016.
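A widely used GPH-gradient blocking detection of the kind mentioned above is the Tibaldi-Molteni index; the sketch below assumes its commonly used central latitudes and thresholds, which the abstract does not specify:

```python
# Sketch of a Tibaldi-Molteni-type blocking test on 500 hPa geopotential
# height (GPH). A longitude is flagged as blocked when the meridional GPH
# gradient south of the central latitude reverses (GHGS > 0) while the
# gradient to the north stays strongly negative (GHGN < -10 m/deg lat).
# Latitudes and thresholds are the commonly used values, given here as
# assumptions rather than those of the study above.

def is_blocked(z_south, z_central, z_north,
               lat_south=40.0, lat_central=60.0, lat_north=80.0):
    ghgs = (z_central - z_south) / (lat_central - lat_south)
    ghgn = (z_north - z_central) / (lat_north - lat_central)
    return ghgs > 0.0 and ghgn < -10.0

# Blocked case: a ridge at the central latitude reverses the southern gradient.
blocked = is_blocked(z_south=5500.0, z_central=5700.0, z_north=5300.0)
# Zonal case: GPH decreasing monotonically poleward.
zonal = is_blocked(z_south=5700.0, z_central=5500.0, z_north=5300.0)
```

Applied at every longitude of a daily 500 hPa GPH field, this yields the one-dimensional blocking frequency maps the abstract refers to.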

  12. Analysis of the Feasibility of Using Soil from the Municipality of Goytacazes/RJ for Production of Soil-Cement Brick

    NASA Astrophysics Data System (ADS)

    Alexandre, J.; Azevedo, A. R. G.; Theophilo, M. M. D.; Xavier, C. G.; Paes, A. L. C.; Monteiro, S. N.; Margem, F. M.; Azeredo, N. G.

    The use of soil-cement bricks is proving to be an important construction methodology: their production process has a low environmental impact compared with conventional bricks, which are fired, and they are easy to produce. However, because these bricks are produced by compression, knowledge of the properties of the soil used is critical to the quality and durability of the blocks. The objective of this work is to evaluate the feasibility of using soil from the municipality of Goytacazes for the production of soil-cement bricks. Compaction, liquid limit, plastic limit, particle size, EDX and X-ray diffraction analyses were performed; blocks were then pressed and analyzed for compressive strength and water absorption.

  13. Inspection of baked carbon anodes using a combination of multi-spectral acousto-ultrasonic techniques and principal component analysis.

    PubMed

    Boubaker, Moez Ben; Picard, Donald; Duchesne, Carl; Tessier, Jayson; Alamdari, Houshang; Fafard, Mario

    2018-05-17

    This paper reports on the application of an acousto-ultrasonic (AU) scheme for the inspection of industrial-size carbon anode blocks used in the production of primary aluminium by the Hall-Héroult process. A frequency-modulated wave is used to excite the anode blocks at multiple points. The collected attenuated AU signals are decomposed using the Discrete Wavelet Transform (DWT), after which vectors of features are calculated. Principal Component Analysis (PCA) is utilized to cluster the AU responses of the anodes. The approach makes it possible to locate cracks in the blocks, and the AU features were found to be sensitive to crack severity. The results are validated using images collected after cutting some anodes.

  14. Top Value Added Chemicals From Biomass: I. Results of Screening for Potential Candidates from Sugars and Synthesis Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werpy, Todd A.; Holladay, John E.; White, James F.

    2004-11-01

    This report identifies twelve building block chemicals that can be produced from sugars via biological or chemical conversions. The twelve building blocks can be subsequently converted to a number of high-value bio-based chemicals or materials. Building block chemicals, as considered for this analysis, are molecules with multiple functional groups that possess the potential to be transformed into new families of useful molecules. The twelve sugar-based building blocks are 1,4-diacids (succinic, fumaric and malic), 2,5-furan dicarboxylic acid, 3-hydroxy propionic acid, aspartic acid, glucaric acid, glutamic acid, itaconic acid, levulinic acid, 3-hydroxybutyrolactone, glycerol, sorbitol, and xylitol/arabinitol. In addition to building blocks, the report outlines the central technical barriers that are preventing the widespread use of biomass for products and chemicals.

  15. What is the evidence behind the evidence-base? The premature death of block-replace antithyroid drug regimens for Graves' disease.

    PubMed

    Razvi, Salman; Vaidya, Bijay; Perros, Petros; Pearce, Simon H S

    2006-06-01

    Block-replace and titration antithyroid drug regimens both give similar rates of medium- to long-term remission of hyperthyroid Graves' disease. Recent meta-analysis, however, has suggested that titration regimens may be preferable owing to a higher rate of adverse events seen in the block-replace arms of published comparative studies. This article critically re-evaluates the evidence upon which these meta-analyses were based. We suggest that there is little objective evidence that is pertinent to current clinical practice to separate block-replace from titration antithyroid drug regimens and that both remain satisfactory approaches to the medical management of hyperthyroid Graves' disease.

  16. Top Value Added Chemicals from Biomass - Volume I, Results of Screening for Potential Candidates from Sugars and Synthesis Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2004-08-01

    This report identifies twelve building block chemicals that can be produced from sugars via biological or chemical conversions. The twelve building blocks can be subsequently converted to a number of high-value bio-based chemicals or materials. Building block chemicals, as considered for this analysis, are molecules with multiple functional groups that possess the potential to be transformed into new families of useful molecules. The twelve sugar-based building blocks are 1,4-diacids (succinic, fumaric and malic), 2,5-furan dicarboxylic acid, 3-hydroxy propionic acid, aspartic acid, glucaric acid, glutamic acid, itaconic acid, levulinic acid, 3-hydroxybutyrolactone, glycerol, sorbitol, and xylitol/arabinitol.

  17. Top Value Added Chemicals from Biomass: Volume I -- Results of Screening for Potential Candidates from Sugars and Synthesis Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werpy, T.; Petersen, G.

    2004-08-01

    This report identifies twelve building block chemicals that can be produced from sugars via biological or chemical conversions. The twelve building blocks can be subsequently converted to a number of high-value bio-based chemicals or materials. Building block chemicals, as considered for this analysis, are molecules with multiple functional groups that possess the potential to be transformed into new families of useful molecules. The twelve sugar-based building blocks are 1,4-diacids (succinic, fumaric and malic), 2,5-furan dicarboxylic acid, 3-hydroxy propionic acid, aspartic acid, glucaric acid, glutamic acid, itaconic acid, levulinic acid, 3-hydroxybutyrolactone, glycerol, sorbitol, and xylitol/arabinitol.

  18. Evaluation of Isoprene Chain Extension from PEO Macromolecular Chain Transfer Agents for the Preparation of Dual, Invertible Block Copolymer Nanoassemblies.

    PubMed

    Bartels, Jeremy W; Cauët, Solène I; Billings, Peter L; Lin, Lily Yun; Zhu, Jiahua; Fidge, Christopher; Pochan, Darrin J; Wooley, Karen L

    2010-09-14

    Two RAFT-capable PEO macro-CTAs, 2 and 5 kDa, were prepared and used for the polymerization of isoprene, which yielded well-defined block copolymers of varied lengths and compositions. GPC analysis of the PEO macro-CTAs and block copolymers showed remaining unreacted PEO macro-CTA. Mathematical deconvolution of the GPC chromatograms allowed for the estimation of the blocking efficiency: about 50% for the 5 kDa PEO macro-CTA and 64% for the 2 kDa CTA. Self-assembly of the block copolymers in both water and decane was investigated, and the resulting regular and inverse assemblies, respectively, were analyzed with DLS, AFM, and TEM to ascertain their dimensions and properties. Assembly of PEO-b-PIp block copolymers in aqueous solution resulted in well-defined micelles of varying sizes, while assembly in a hydrophobic organic solvent resulted in the formation of different morphologies, including large aggregates and well-defined cylindrical and spherical structures.
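Once the chromatogram has been deconvolved, the blocking efficiency is simply the area fraction of chains that grew a second block; a minimal sketch with hypothetical peak areas (the fitting itself is not shown, and detector-response differences between the two species are ignored):

```python
# Blocking efficiency from deconvolved GPC peak areas: the fraction of
# macro-CTA chains that successfully grew a polyisoprene block. The areas
# below are hypothetical stand-ins for the deconvolution described above.

def blocking_efficiency(area_block_copolymer, area_unreacted_cta):
    return area_block_copolymer / (area_block_copolymer + area_unreacted_cta)

# Hypothetical areas chosen to mirror the reported efficiencies.
eff_5k = blocking_efficiency(50.0, 50.0)   # ~50% for the 5 kDa macro-CTA
eff_2k = blocking_efficiency(64.0, 36.0)   # ~64% for the 2 kDa macro-CTA
```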

  19. Improved characterization, monitoring and instability assessment of high rock faces by integrating TLS and GB-InSAR

    NASA Astrophysics Data System (ADS)

    Bianchetti, Matteo; Agliardi, Federico; Villa, Alberto; Battista Crosta, Giovanni; Rivolta, Carlo

    2015-04-01

    Rockfall risk analysis requires quantifying rockfall onset susceptibility and magnitude scenarios at source areas, as well as the expected rockfall trajectories and related dynamic quantities. Analysis efforts usually focus on the rockfall runout component, whereas rock mass characterization and block size distribution quantification, monitoring and analysis of unstable rock volumes are usually performed using simplified approaches, due to technological and site-specific issues. Nevertheless, proper quantification of rock slope stability and rockfall magnitude scenarios is key when dealing with high rock walls, where widespread rockfall sources and high variability of release mechanisms and block volumes can result in excessive modelling uncertainties and poorly constrained mitigation measures. We explored the potential of integrating field, remote sensing, structural analysis and stability modelling techniques to improve hazard assessment at the Gallivaggio sanctuary site, a 16th-century heritage site located along State Road 36 in the Spluga Valley (Italian Central Alps). The site is overhung by a subvertical cliff up to 600 m high, made of granitic orthogneiss of the Truzzo granitic complex (Tambo Nappe, upper Penninic domain). The rock mass is cut by NNW- and NW-trending slope-scale structural lineaments and by 5-6 fracture sets with variable spatial distribution, spacing and persistence, which bound blocks up to tens of cubic meters and control the 3D slope morphology. The area is characterised by widespread rock slope instability, from rockfalls to massive failures. Although a 180 m long embankment was built to protect the site from rockfalls, concerns remain about potentially large unstable rock volumes and about flyrock projected by the widely observed impact fragmentation of stiff rock blocks. 
Thus, the authority in charge started a series of periodic GB-InSAR monitoring surveys using LiSALabTM technology (12 surveys in 2011-2014), which outlined the occurrence of unstable spots spread over the cliff, with cm-scale cumulative displacements in the observation period. To support the interpretation and analysis of these data, we carried out multitemporal TLS surveys (5 sessions between September 2012 and October 2014) using a Riegl VZ-1000 long-range laser scanner. We performed rock mass structural analyses on dense TLS point clouds using two different approaches: 1) manual discontinuity orientation and intensity measurement from digital outcrops; 2) automatic feature extraction and intensity evaluation through the development of an original Matlab tool, suited for multi-scale applications and optimized for parallel computing. Results were validated against field discontinuity measurements and compared to evaluate the advantages and limitations of the different approaches, and allowed: 1) outlining the precise location, geometry and kinematics of unstable blocks and block clusters corresponding to radar moving spots; 2) performing stability analyses; 3) quantifying rockwall changes over the observation period. Our analysis provided a robust spatial characterization of rockfall sources, block size distribution and onset susceptibility as input for 3D runout modelling and quantitative risk analysis.

  20. Rework and workarounds in nurse medication administration process: implications for work processes and patient safety.

    PubMed

    Halbesleben, Jonathon R B; Savage, Grant T; Wakefield, Douglas S; Wakefield, Bonnie J

    2010-01-01

    Health care organizations have redesigned existing and implemented new work processes intended to improve patient safety. As a consequence of these process changes, there are now intentionally designed "blocks" or barriers that limit how specific work actions, such as ordering and administering medication, are to be carried out. Health care professionals encountering these designed barriers can choose to either follow the new process, engage in workarounds to get past the block, or potentially repeat work (rework). Unfortunately, these workarounds and rework may lead to other safety concerns. The aim of this study was to examine rework and workarounds in hospital medication administration processes. Observations and semistructured interviews were conducted with 58 nurses from four hospital intensive care units focusing on the medication administration process. Using the constant comparative method, we analyzed the observation and interview data to develop themes regarding rework and workarounds. From this analysis, we developed an integrated process map of the medication administration process depicting blocks. A total of 12 blocks were reported by the participants. Based on the analysis, we categorized them as related to information exchange, information entry, and internal supply chain issues. Whereas information exchange and entry blocks tended to lead to rework, internal supply chain issues were more likely to lead to workarounds. A decentralized pharmacist on the unit may reduce work flow blocks (and, thus, workarounds and rework). Work process redesign may further address the problems of workarounds and rework.

  1. Reliability and validity of the faculty evaluation instrument used at King Saud bin Abdulaziz University for Health Sciences: Results from the Haematology Course.

    PubMed

    Al-Eidan, Fahad; Baig, Lubna Ansari; Magzoub, Mohi-Eldin; Omair, Aamir

    2016-04-01

    To assess the reliability and validity of an evaluation tool, using the Haematology course as an example. The cross-sectional study was conducted at King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia, in 2012, while data analysis was completed in 2013. The 27-item block evaluation instrument was developed by a multidisciplinary faculty after a comprehensive literature review. Validity of the questionnaire was confirmed using principal component analysis with varimax rotation and Kaiser normalisation. Identified factors were combined to obtain the internal consistency reliability of each factor. Student's t-test was used to compare mean ratings between male and female students for the faculty and block evaluation. Of the 116 subjects in the study, 80 (69%) were males and 36 (31%) were females. The reliability of the questionnaire was high (Cronbach's alpha = 0.91). Factor analysis yielded a logically coherent 7-factor solution that explained 75% of the variation in the data. The factors were group dynamics in problem-based learning (alpha = 0.92), block administration (alpha = 0.89), quality of objective structured clinical examination (alpha = 0.86), block coordination (alpha = 0.81), structure of problem-based learning (alpha = 0.84), quality of written exam (alpha = 0.91), and difficulty of exams (alpha = 0.41). Female students' opinion on depth of analysis and critical thinking was significantly higher than that of the males (p = 0.03). The faculty evaluation tool used was found to be reliable, but its validity, as assessed through factor analysis, has to be interpreted with caution as the number of responders was less than the minimum required for factor analysis.
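The internal-consistency figures reported above are Cronbach's alpha values, computed as k/(k-1) · (1 - sum of item variances / variance of totals); a self-contained sketch on toy ratings, not the study's data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of totals).
# Toy ratings for illustration, not the study's data.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three perfectly correlated items give alpha = 1 (maximal internal consistency).
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 3, 4, 5], [3, 4, 5, 6]])
```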

  2. The Ålesund slide, March 2008: Lidar analysis of an urban human-induced rockslide

    NASA Astrophysics Data System (ADS)

    Derron, M.-H.; Melchiorre, C.; Blikra, L.; Loye, A.; Pedrazzini, A.

    2009-04-01

    On March 26, 2008, a 2,400 m3 rock block slid from an excavated slope onto a six-floor house in the city of Ålesund (western coast of Norway). The whole house was displaced horizontally on its basement by about 6 meters, and the two lower floors of the building collapsed. Five persons died during the event. Almost 500 inhabitants living in the surroundings had to be evacuated for 6 to 8 days because of a gas leak. The site was scanned three times with a terrestrial lidar. In the first dataset, acquired a couple of days after the event, the block and the house are present. In the second dataset, acquired about 3 months later, the house had been removed but most of the sliding block was still in place. The last scan, made six months after the event, shows the sliding surface very well, as the block had been removed too. These three scans, completed by field measurements and observations, have been used: 1) to characterize the geometry and volume of the block, 2) to determine the displacement vector, 3) to perform an analysis of discontinuities and some kinematic tests, and 4) to estimate the roughness and waviness of the sliding surface.

  3. TAD-free analysis of architectural proteins and insulators.

    PubMed

    Mourad, Raphaël; Cuvier, Olivier

    2018-03-16

    The three-dimensional (3D) organization of the genome is intimately related to numerous key biological functions including gene expression and DNA replication regulations. The mechanisms by which molecular drivers functionally organize the 3D genome, such as topologically associating domains (TADs), remain to be explored. Current approaches consist in assessing the enrichments or influences of proteins at TAD borders. Here, we propose a TAD-free model to directly estimate the blocking effects of architectural proteins, insulators and DNA motifs on long-range contacts, making the model intuitive and biologically meaningful. In addition, the model allows analyzing the whole Hi-C information content (2D information) instead of only focusing on TAD borders (1D information). The model outperforms multiple logistic regression at TAD borders in terms of parameter estimation accuracy and is validated by enhancer-blocking assays. In Drosophila, the results support the insulating role of simple sequence repeats and suggest that the blocking effects depend on the number of repeats. Motif analysis uncovered the roles of the transcriptional factors pannier and tramtrack in blocking long-range contacts. In human, the results suggest that the blocking effects of the well-known architectural proteins CTCF, cohesin and ZNF143 depend on the distance between loci, where each protein may participate at different scales of the 3D chromatin organization.

  4. Change detection of medical images using dictionary learning techniques and principal component analysis.

    PubMed

    Nika, Varvara; Babyn, Paul; Zhu, Hongmei

    2014-07-01

    Automatic change detection methods for identifying the changes of serial MR images taken at different times are of great interest to radiologists. The majority of existing change detection methods in medical imaging, and those of brain images in particular, include many preprocessing steps and rely mostly on statistical analysis of magnetic resonance imaging (MRI) scans. Although most methods utilize registration software, tissue classification remains a difficult and overwhelming task. Recently, dictionary learning techniques are being used in many areas of image processing, such as image surveillance, face recognition, remote sensing, and medical imaging. We present an improved version of the EigenBlockCD algorithm, named the EigenBlockCD-2. The EigenBlockCD-2 algorithm performs an initial global registration and identifies the changes between serial MR images of the brain. Blocks of pixels from a baseline scan are used to train local dictionaries to detect changes in the follow-up scan. We use PCA to reduce the dimensionality of the local dictionaries and the redundancy of data. Choosing the appropriate distance measure significantly affects the performance of our algorithm. We examine the differences between [Formula: see text] and [Formula: see text] norms as two possible similarity measures in the improved EigenBlockCD-2 algorithm. We show the advantages of the [Formula: see text] norm over the [Formula: see text] norm both theoretically and numerically. We also demonstrate the performance of the new EigenBlockCD-2 algorithm for detecting changes of MR images and compare our results with those provided in the recent literature. Experimental results with both simulated and real MRI scans show that our improved EigenBlockCD-2 algorithm outperforms the previous methods. It detects clinical changes while ignoring the changes due to the patient's position and other acquisition artifacts.
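The practical difference between the two norms as block-similarity measures can be shown directly: the L2 norm penalizes a change concentrated in one pixel more heavily than the same total change spread across the block. A minimal sketch with hypothetical pixel blocks (the registration, dictionary, and PCA stages of the actual pipeline are omitted):

```python
# L1 vs L2 distance between pixel blocks, the two similarity measures compared
# in the abstract. The L2 norm is dominated by a single large outlier
# difference; the L1 norm weights concentrated and diffuse changes equally.

import math

def l1_distance(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def l2_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

baseline = [10, 10, 10, 10]
followup = [10, 10, 10, 50]   # one strongly changed pixel (e.g. an artifact)
diffuse  = [20, 20, 20, 20]   # the same total change spread over all pixels

d_outlier_l1 = l1_distance(baseline, followup)   # 40
d_diffuse_l1 = l1_distance(baseline, diffuse)    # 40
d_outlier_l2 = l2_distance(baseline, followup)   # 40.0
d_diffuse_l2 = l2_distance(baseline, diffuse)    # 20.0
```

Under L1 the two changes look identical, while under L2 the single-pixel outlier counts twice as much as the diffuse change, which is why the choice of norm affects which blocks are flagged as changed.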

  5. A Retrospective Study Evaluating the Effect of Low Doses of Perineural Dexamethasone on Ropivacaine Brachial Plexus Peripheral Nerve Block Analgesic Duration.

    PubMed

    Schnepper, Gregory D; Kightlinger, Benjamin I; Jiang, Yunyun; Wolf, Bethany J; Bolin, Eric D; Wilson, Sylvia H

    2017-09-23

    This study examined the effectiveness of perineural dexamethasone, administered in very low and low doses, on ropivacaine brachial plexus block duration. It is a retrospective evaluation of brachial plexus block duration in a large cohort of patients receiving peripheral nerve blocks with and without perineural dexamethasone, drawn from a prospectively collected quality assurance database at a single academic medical center. A total of 1,942 brachial plexus blocks placed over a 16-month period were reviewed. Demographics, nerve block location, and perineural dexamethasone utilization and dose were examined in relation to block duration. Perineural dexamethasone was classified as none (0 mg), very low dose (2 mg or less), or low dose (greater than 2 mg to 4 mg). Continuous catheter techniques, local anesthetics other than ropivacaine, and block locations with fewer than 15 subjects were excluded. Associations between block duration and predictors of interest were examined using multivariable regression models. A subgroup analysis of the impact of receiving dexamethasone on block duration within each block type was also conducted using a univariate linear regression approach. A total of 1,027 subjects were evaluated. More than 90% of brachial plexus blocks contained perineural dexamethasone (≤4 mg), with a median dose of 2 mg. Increased block duration was associated with receiving any dose of perineural dexamethasone (P < 0.0001), female gender (P = 0.022), increased age (P = 0.048), and increased local anesthetic dose (P = 0.01). In a multivariable model, block duration did not differ with very low- or low-dose perineural dexamethasone after controlling for other factors (P = 0.420). Perineural dexamethasone prolonged block duration compared with ropivacaine alone; however, duration was not greater with low-dose compared with very low-dose perineural dexamethasone.

  6. Near-exact distributions for the block equicorrelation and equivariance likelihood ratio test statistic

    NASA Astrophysics Data System (ADS)

    Coelho, Carlos A.; Marques, Filipe J.

    2013-09-01

    In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test, or its single-block version, may find applications in many areas, such as psychology, education, medicine, and genetics, and is important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypotheses of independence of groups of variables and the hypothesis of equicorrelation and equivariance, we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.
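    The decomposition strategy described in this abstract can be written schematically; the symbols below are illustrative rather than the authors' own notation, and the factorizations hold under the null because the component statistics are independent there:

```latex
% Schematic sketch of the decomposition (notation illustrative).
% The overall l.r.t. statistic factors into the independence
% statistic and one equicorrelation/equivariance statistic per group:
\[
  \Lambda \;=\; \Lambda_{\mathrm{ind}} \times \prod_{k=1}^{m} \Lambda_{\mathrm{eq},k}.
\]
% Under the null the component statistics are independent, so the
% moments and the c.f. of W = -\log\Lambda factor accordingly:
\[
  E\!\left[\Lambda^{h}\right]
  \;=\; E\!\left[\Lambda_{\mathrm{ind}}^{h}\right]
        \prod_{k=1}^{m} E\!\left[\Lambda_{\mathrm{eq},k}^{h}\right],
  \qquad
  \Phi_{W}(t)
  \;=\; \Phi_{W_{\mathrm{ind}}}(t)\,
        \prod_{k=1}^{m}\Phi_{W_{\mathrm{eq},k}}(t).
\]
```

    It is this product form of the c.f. that makes near-exact approximations tractable: each factor can be approximated separately and the pieces recombined.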

  7. A simple approach to characterizing block copolymer assemblies: graphene oxide supports for high contrast multi-technique imaging†

    PubMed Central

    Patterson, Joseph P.; Sanchez, Ana M.; Petzetakis, Nikos; Smart, Thomas P.; Epps, Thomas H.; Portman, Ian

    2013-01-01

    Block copolymers are well-known to self-assemble into a range of 3-dimensional morphologies. However, due to their nanoscale dimensions, resolving their exact structure can be a challenge. Transmission electron microscopy (TEM) is a powerful technique for achieving this, but for polymeric assemblies chemical fixing/staining techniques are usually required to increase image contrast and protect specimens from electron beam damage. Graphene oxide (GO) is a robust, water-dispersible, and nearly electron transparent membrane: an ideal support for TEM. We show that when using GO supports no stains are required to acquire high contrast TEM images and that the specimens remain stable under the electron beam for long periods, allowing sample analysis by a range of electron microscopy techniques. GO supports are also used for further characterization of assemblies by atomic force microscopy. The simplicity of sample preparation and analysis, as well as the potential for significantly increased contrast, make GO supports an attractive alternative for the analysis of block copolymer assemblies. PMID:24049544

  8. HRV analysis in local anesthesia using Continuous Wavelet Transform (CWT).

    PubMed

    Shafqat, K; Pal, S K; Kumari, S; Kyriacou, P A

    2011-01-01

    Spectral analysis of Heart Rate Variability (HRV) is used for the assessment of cardiovascular autonomic control. In this study, Continuous Wavelet Transform (CWT) has been used to evaluate the effect of local anesthesia on HRV parameters in a group of fourteen patients undergoing axillary brachial plexus block. A new method which takes signal characteristics into account has been presented for the estimation of the variable boundaries associated with the low and the high frequency band of the HRV signal. The variable boundary method might be useful in cases when the power related to the respiration component extends beyond the traditionally accepted range of the high frequency band (0.15-0.4 Hz). The statistical analysis (non-parametric Wilcoxon signed rank test) showed that the LF/HF ratio decreased within an hour of the application of the brachial plexus block compared to the values fifteen minutes prior to the application of the block. These changes were observed in thirteen of the fourteen patients included in this study.
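    The fixed-band version of the LF/HF ratio tracked in this study can be sketched as follows. The PSD input, function names, and toy numbers are illustrative; the paper's contribution is replacing the fixed band edges below with signal-adaptive ones.

```python
def band_power(psd, lo, hi):
    """Sum PSD samples (frequency, power) falling in [lo, hi)."""
    return sum(p for f, p in psd if lo <= f < hi)

def lf_hf_ratio(psd, lf=(0.04, 0.15), hf=(0.15, 0.40)):
    """LF/HF sympathovagal balance index. The band edges are the
    conventional fixed boundaries; a variable-boundary method would
    adapt `hf` to follow the respiration peak."""
    return band_power(psd, *lf) / band_power(psd, *hf)

# Toy PSD: (frequency in Hz, power)
psd = [(0.05, 2.0), (0.10, 3.0), (0.20, 4.0), (0.30, 1.0)]
print(lf_hf_ratio(psd))  # 1.0  (LF power 5.0 / HF power 5.0)
```

    A decrease of this ratio after the block, as reported above, is read as a shift of autonomic balance away from sympathetic dominance.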

  9. Relationships between chemical structure and rat repellency. II. Compounds screened between 1950 and 1960

    USGS Publications Warehouse

    Bowles, W.A.; Adomaitis, V.A.; DeWitt, J.B.; Pratt, J.J.

    1974-01-01

    Over 4,600 compounds, chiefly organic types, were evaluated using both a food acceptance test (Part A) and a barrier penetration bioassay (Part B), to correlate relationships between chemical structure and rodent repellency. These chemicals are indexed and classified according to the functional groups present and to the degree of substitution within their molecular structures. The results of reduction in food consumption for each compound appraised are calculated and their K values listed in Table I. The repellent activities of the functional groups represented, alone or in combinations, are expressed in Table II by a Functional Group Repellency Index. A ranking of these indices suggests that acyclic and heterocyclic compounds containing tri- or pentavalent nitrogen would be a parent compound of choice for synthesizing novel repellents. Other molecular arrangements, spatial configurations and combinations of functional groups are compared. There were 123 active, interesting or promising compounds included in the 699 having K values of 85 or greater, which were selected for the barrier appraisal study. These chemicals were formulated in selective solvents at several concentrations and applied to burlap. Small food bags were fashioned using the fabric impregnated with the candidate formulation, and exposed to rodent attack following storage periods of varying intervals. The results of these tests are listed in Table III. Again, those compounds containing nitrogen in the functional groupings indicated a high order of effectiveness. Several commercial patents covering rodent repellents were issued using the data from the food acceptance and barrier studies. Organizations and cooperators which supplied samples for the program are listed in Appendix I. The Wiswesser cipher for compounds in Table I is used in Appendix II to facilitate location of chemicals by sample code number as they appear under the index headings, and for computer storage and analysis.


  11. [Retrospective computation of the ISS in multiple trauma patients: Potential pitfalls and limitations of findings in full body CT scans].

    PubMed

    Bogner, V; Brumann, M; Kusmenkov, T; Kanz, K G; Wierer, M; Berger, F; Mutschler, W

    2016-03-01

    The Injury Severity Score (ISS) is a well-established anatomical scoring system for polytraumatized patients. However, any inaccuracy in the Abbreviated Injury Score (AIS) directly increases the imprecision of the ISS. Using the full body computed tomography (CT) scan report, ISS computation can be associated with certain pitfalls. This study evaluates interpretation variations depending on radiological reports and indicates requirements to reliably determine the ISS. The ISS of 81 polytraumatized patients was calculated based on the full body CT scan report. If an injury could not be attributed to a precise AIS cipher, the minimal and maximal ISS was computed. The real ISS included all conducted investigations, intraoperative findings, and final medical reports. The differences in ISS min, ISS max, and ISS real were evaluated using the Kruskal-Wallis test (p<0.05) and plotted in a linear regression analysis. Mean ISS min was 24.0 (±0.7 SEM) points, mean ISS real 38.6 (±1.3 SEM) and mean ISS max was 48.3 (±1.4 SEM) points. All means were significantly different compared to one another (p<0.001). The difference between possible and real ISS showed a distinctive variation. Mean deviation was 9.7 (±0.9 SEM) points downward and 14.5 (±1.1 SEM) points upward. The difference between deviation to ISS min and ISS max was highly significant (p<0.001). Objectification of injury severity in polytraumatized patients using the ISS is an internationally well-established method in clinical and scientific settings. The full body CT scan report must meet distinct criteria and has to be written in accordance with the AIS scale if it is intended to be used for correct ISS computation.
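    For reference, the ISS itself is the sum of squares of the three highest AIS severities, each taken from a different body region, capped at 75 (an AIS of 6 in any region is conventionally scored as ISS 75). A minimal sketch, with the function name and the dictionary input shape as illustrative choices:

```python
def iss(region_ais):
    """Injury Severity Score from per-region AIS severities (1-6).
    Sum of squares of the three highest region scores, max 75;
    any unsurvivable injury (AIS 6) sets the score to 75."""
    if any(a == 6 for a in region_ais.values()):
        return 75
    top3 = sorted(region_ais.values(), reverse=True)[:3]
    return sum(a * a for a in top3)

# e.g. head AIS 4, thorax AIS 3, extremities AIS 2 -> 16 + 9 + 4
print(iss({"head": 4, "thorax": 3, "extremities": 2}))  # 29
```

    The pitfall described above is visible here: an ambiguous CT report that leaves an AIS value uncertain by even one point changes the score quadratically, which is why the authors compute ISS min and ISS max bounds.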

  12. Identifying Effective Design Approaches to Allocate Genotypes in Two-Phase Designs: A Case Study in Pelargonium zonale.

    PubMed

    Molenaar, Heike; Boehm, Robert; Piepho, Hans-Peter

    2017-01-01

    Robust phenotypic data allow adequate statistical analysis and are crucial for any breeding purpose. Such data are obtained from experiments laid out to best control local variation. Additionally, experiments frequently involve two phases, each contributing environmental sources of variation. For example, in a former experiment we conducted to evaluate production-related traits in Pelargonium zonale, there were two consecutive phases, each performed in a different greenhouse. Phase one involved the propagation of the breeding strains to obtain the stem cutting count, and phase two involved the assessment of root formation. The evaluation of the former study raised questions regarding options for improving the experimental layout: (i) Is there a disadvantage to using exactly the same design in both phases? (ii) Instead of generating a separate layout for each phase, can the design be optimized across both phases, such that the mean variance of a pair-wise treatment difference (MVD) can be decreased? To answer these questions, alternative approaches were explored to generate two-phase designs either in phase-wise order (Option 1) or across phases (Option 2). In Option 1 we considered the scenarios (i) using the same experimental design in both phases and (ii) randomizing each phase separately. In Option 2, we considered the scenarios (iii) generating a single design with eight replicates and splitting these among the two phases, (iv) separating the block structure across phases by dummy coding, and (v) design generation with optimal alignment of block units in the two phases. In both options, we considered the same or different block structures in each phase. The designs were evaluated by the MVD obtained by the intra-block analysis and the joint inter-block-intra-block analysis.
The smallest MVD was most frequently obtained for designs generated across phases rather than for each phase separately, in particular when both phases of the design were separated with a single pseudo-level. The joint optimization ensured that treatment concurrences were equally balanced across pairs, one of the prerequisites for an efficient design. The proposed alternative approaches can be implemented with any model-based design packages with facilities to formulate linear models for treatment and block structures.

  13. A 3D Analysis of Rock Block Deformation and Failure Mechanics Using Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Rowe, Emily; Hutchinson, D. Jean; Kromer, Ryan A.; Edwards, Tom

    2017-04-01

    Many natural geological hazards are present along the Thompson River corridor in British Columbia, Canada, including one particularly hazardous rocky slope known as the White Canyon. Railway tracks used by Canadian National (CN) and Canadian Pacific (CP) Railway companies pass through this area at the base of the Canyon slope. The geologically complex and weathered rock face exposed at White Canyon is prone to rockfalls. With a limited ditch capacity, these falling rocks have the potential to land on the tracks and therefore increase the risk of train derailment. Since 2012, terrestrial laser scanning (TLS) data has been collected at this site on a regular basis to enable researchers at Queen's University to study these rockfalls in greater detail. In this paper, the authors present a summary of an analysis of these TLS datasets including an examination of the pre-failure deformation patterns exhibited by failed rock blocks as well as an investigation into the influence of structural constraints on the pre-failure behavior of these blocks. Aligning rockfall source zones in an early point cloud dataset to a later dataset generates a transformation matrix describing the movement of the block from one scan to the next. This process was repeated such that the motion of the block over the entire TLS data coverage period was measured. A 3D roto-translation algorithm was then used to resolve the motion into translation and rotation components (Oppikofer et al. 2009; Kromer et al. 2015). Structural information was plotted on a stereonet for further analysis. A total of 111 rockfall events exceeding a volume of 1 m3 were analyzed using this approach. The study reveals that although some rockfall source blocks do not exhibit detectable levels of deformation prior to failure, others do experience cm-level translation and rotation on the order of 1 to 6 degrees before detaching from the slope.
Moreover, these movements may, in some cases, be related to the discontinuity planes on the slope that were confining the block. It is concluded that rock blocks in White Canyon may be classified as one of five main failure mechanisms based on their pre-failure deformation and structure: planar slide, topple, rotation, wedge, and overhang, with overhang failures representing a large portion of rockfalls in this area. Overhang rockfalls in the White Canyon are characterized by blocks that (a) are not supported by an underlying discontinuity plane, and (b) generally do not exhibit pre-failure deformation. Though overhanging rock blocks are a structural subset of toppling failure, their behavior suggests a different mechanism of detachment. Future work will further populate the present database of rockfalls in White Canyon and will expand the study to include other sites along this corridor. The ultimate goal of this research is to establish warning thresholds based on deformation magnitudes for rockfalls in White Canyon to assist Canadian railways in better understanding and managing these slopes.
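    The decomposition step, resolving a block's alignment transform into translation and net rotation in the spirit of the cited roto-translation approach, can be sketched as follows. The 4x4 homogeneous-matrix input and the function name are illustrative, not the authors' implementation:

```python
import math

def decompose_rigid_transform(T):
    """Split a 4x4 homogeneous rigid-body transform into its
    translation vector and net rotation angle (degrees), using
    the rotation-matrix trace: theta = acos((trace(R) - 1) / 2)."""
    t = [T[0][3], T[1][3], T[2][3]]
    trace = T[0][0] + T[1][1] + T[2][2]
    # Clamp for numerical safety before acos
    c = max(-1.0, min(1.0, (trace - 1.0) / 2.0))
    return t, math.degrees(math.acos(c))

# A 5-degree rotation about z plus a small (metre-scale) translation
a = math.radians(5)
T = [[math.cos(a), -math.sin(a), 0, 0.02],
     [math.sin(a),  math.cos(a), 0, 0.00],
     [0,            0,           1, 0.01],
     [0,            0,           0, 1   ]]
t, theta = decompose_rigid_transform(T)
print(t, round(theta, 3))  # [0.02, 0.0, 0.01] 5.0
```

    Applied scan-to-scan, outputs like these give exactly the cm-level translations and 1-6 degree rotations reported above; a full treatment would also extract the rotation axis, not just the angle.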

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockwood, Jr., Neil; McLellan, Jason G; Crossley, Brian

    The Resident Fish Stock Status above Chief Joseph and Grand Coulee Dams Project, commonly known as the Joint Stock Assessment Project (JSAP) is a management tool using ecosystem principles to manage artificial fish assemblages and native fish in altered environments existing in the Columbia River System above Chief Joseph and Grand Coulee Dams (blocked area). The three-phase approach of this project will enhance the fisheries resources of the blocked area by identifying data gaps, filling data gaps with research, and implementing management recommendations based on research results. The Blocked Area fisheries information housed in a central location will allow managers to view the entire system while making decisions, rather than basing management decisions on isolated portions of the system. The JSAP (NWPPC program measure 10.8B.26) is designed and guided jointly by fisheries managers in the blocked area and the Columbia Basin blocked area management plan (1998). The initial year of the project (1997) identified the need for a central data storage and analysis facility, coordination with the StreamNet project, compilation of blocked area fisheries information, and a report on the ecological condition of the Spokane River System. These needs were addressed in 1998 by acquiring a central location with a data storage and analysis system, coordinating a pilot project with StreamNet, compiling fisheries distribution data throughout the blocked area, identifying data gaps based on compiled information, and researching the ecological condition of the Spokane River. In order to ensure that any additional information collected throughout the life of this project will be easily stored and manipulated by the central storage facility, it was necessary to develop standardized methodologies between the JSAP fisheries managers. The use of common collection and analytical tools is essential to the process of streamlining joint management decisions.
    In 1999 and 2000 the project began to address some of the identified data gaps throughout the blocked area with a variety of newly developed sampling projects, as well as continuing with ongoing data collection of established projects.

  15. Supraclavicular block versus interscalene brachial plexus block for shoulder surgery: A meta-analysis of clinical control trials.

    PubMed

    Guo, C W; Ma, J X; Ma, X L; Lu, B; Wang, Y; Tian, A X; Sun, L; Wang, Y; Dong, B C; Teng, Y B

    2017-09-01

    The ultrasound-guided interscalene block (ISB) has been considered a standard technique in managing pain after shoulder surgery. However, this method was associated with the incidence of hemi-diaphragmatic paresis. In contrast to ISB, supraclavicular block (SCB) was suggested to provide effective anaesthesia for shoulder surgery with a low rate of side-effects. Thus, we performed a meta-analysis of randomised controlled trials (RCTs) to compare SCB with ISB for evaluating the efficacy and safety. The literature was searched from PubMed, Wiley Online Library, EMBASE, and the Cochrane Library by two reviewers up to April 2017. All available RCTs written in English that met the criteria were included. Two authors pulled data from relevant articles and assessed the quality with the Cochrane Handbook. Review Manager 5.3 software was used to analyse the data. Five RCTs and one prospective clinical study met the eligibility criteria and were included in the meta-analysis. We considered that there were no statistically significant differences between supraclavicular and interscalene groups in procedural time (P = 0.81), rescue analgesia (P = 0.53), and dyspnoea (P = 0.6). The incidence of hoarseness and Horner syndrome was statistically lower in the SCB group than in the ISB group (P = 0.0002 and P < 0.00001, respectively). The meta-analysis showed that ultrasound-guided SCB could become a feasible alternative technique to the ISB in shoulder surgery. Copyright © 2017. Published by Elsevier Ltd.

  16. Hydrophilic interaction chromatography-multiple reaction monitoring mass spectrometry method for basic building block analysis of low molecular weight heparins prepared through nitrous acid depolymerization.

    PubMed

    Sun, Xiaojun; Guo, Zhimou; Yu, Mengqi; Lin, Chao; Sheng, Anran; Wang, Zhiyu; Linhardt, Robert J; Chi, Lianli

    2017-01-06

    Low molecular weight heparins (LMWHs) are important anticoagulant drugs that are prepared through depolymerization of unfractionated heparin. Based on the types of processing reactions and the structures of the products, LMWHs can be divided into different classifications. Enoxaparin is prepared by benzyl esterification and alkaline depolymerization, while dalteparin and nadroparin are prepared through nitrous acid depolymerization followed by borohydride reduction. Compositional analysis of their basic building blocks is an effective way to provide structural information on heparin and LMWHs. However, most current compositional analysis methods have been limited to heparin and enoxaparin. A sensitive and comprehensive approach is needed for detailed investigation of the structure of LMWHs prepared through nitrous acid depolymerization, especially their characteristic saturated non-reducing end (NRE) and 2,5-anhydro-d-mannitol reducing end (RE). A maltose modified hydrophilic interaction column offers improved separation of complicated mixtures of acidic disaccharides and oligosaccharides. A total of 36 basic building blocks were unambiguously identified by high-resolution tandem mass spectrometry (MS). Multiple reaction monitoring (MRM) MS/MS quantification was developed and validated in the analysis of dalteparin and nadroparin samples. Each group of building blocks revealed different aspects of the properties of LMWHs, such as functional motifs required for anticoagulant activity, the structure of heparin starting materials, cleavage sites in the depolymerization reaction, and undesired structural modifications resulting from side reactions. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Analysis of KC-46 Live-Fire Risk Mitigation Program Testing

    DTIC Science & Technology

    2012-03-01

    the use of real hardware such as electrohydraulic actuators, electrical units, and converter regulators (Andrus, 2010). The only feasible method for...worked with the MQ-9 as a test engineer and analyst for the program's IOT&E, RQ-4 as lead engineer and program lead for the block 3 and the block 4

  18. Refrigeration and Air Conditioning Equipment, 11-9. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This military-developed text consists of three blocks of instructional materials for use by those studying to become refrigeration and air conditioning specialists. Covered in the individual course blocks are the following topics: refrigeration and trouble analysis, thermodynamics, and principles of refrigeration; major components and domestic and…

  19. Laparoscopic-guided abdominal wall nerve blocks in the pediatric population: a novel technique with comparison to ultrasound-guided blocks and local wound infiltration alone.

    PubMed

    Landmann, Alessandra; Visoiu, Mihaela; Malek, Marcus M

    2018-03-01

    Abdominal wall nerve blocks have been gaining popularity for the treatment of perioperative pain in children. Our aim was to compare a technique of surgeon-performed, laparoscopic abdominal wall nerve blocks to anesthesia-placed, ultrasound-guided abdominal wall nerve blocks and the current standard of local wound infiltration. After institutional review board approval was obtained, a retrospective chart review was performed of pediatric patients treated at a single institution during a 2-year period. Statistics were calculated using analysis of variance with post-hoc Bonferroni t tests for pair-wise comparisons. Included in this study were 380 patients who received ultrasound-guided abdominal wall nerve blocks (n = 125), laparoscopic-guided abdominal wall nerve blocks (n = 88), and local wound infiltration (n = 117). Groups were well matched for age, sex, and weight. There was no significant difference in pain scores within the first 8 hours or narcotic usage between groups. Local wound infiltration demonstrated the shortest overall time required to perform (P < .0001). Patients who received a surgeon-performed abdominal wall nerve block demonstrated a shorter duration of hospital stay when compared to the other groups (P = .02). Our study has demonstrated that laparoscopic-guided abdominal wall nerve blocks show similar efficacy to ultrasound-guided nerve blocks performed by pain management physicians without increasing time in the operating room. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Performance Analysis of a Hybrid Overset Multi-Block Application on Multiple Architectures

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

    This paper presents a detailed performance analysis of a multi-block overset grid computational fluid dynamics application on multiple state-of-the-art computer architectures. The application is implemented using a hybrid MPI+OpenMP programming paradigm that exploits both coarse and fine-grain parallelism; the former via MPI message passing and the latter via OpenMP directives. The hybrid model also extends the applicability of multi-block programs to large clusters of SMP nodes by overcoming the restriction that the number of processors be less than the number of grid blocks. A key kernel of the application, namely the LU-SGS linear solver, had to be modified to enhance the performance of the hybrid approach on the target machines. Investigations were conducted on cacheless Cray SX6 vector processors, cache-based IBM Power3 and Power4 architectures, and single system image SGI Origin3000 platforms. Overall results for complex vortex dynamics simulations demonstrate that the SX6 achieves the highest performance and outperforms the RISC-based architectures; however, the best scaling performance was achieved on the Power3.

  1. Evaluation of the antifungal effects of bio-oil prepared with lignocellulosic biomass using fast pyrolysis technology.

    PubMed

    Kim, Kwang Ho; Jeong, Han Seob; Kim, Jae-Young; Han, Gyu Seong; Choi, In-Gyu; Choi, Joon Weon

    2012-10-01

    This study was performed to investigate the utility of bio-oil, produced via a fast pyrolysis process, as an antifungal agent against wood-rot fungi. Bio-oil solutions (25-100 wt.%) were prepared by diluting the bio-oil with EtOH. Wood block samples (yellow poplar and pitch pine) were treated with diluted bio-oil solutions and then subjected to a leaching process under hot water (70°C) for 72 h. After the wood block samples were thoroughly dried, they were subjected to a soil block test using Tyromyces palustris and Trametes versicolor. The antifungal effect of the 75% and 100% bio-oil solutions was the highest for both wood blocks. Scanning electron microscopy analysis indicated that some chemical components in the bio-oil solution could agglomerate together to form clusters in the inner part of the wood during the drying process, which could act as a wood preservative against fungal growth. According to GC/MS analysis, the components of the agglomerate were mainly phenolic compounds derived from lignin polymers. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. How do blockings relate to heavy precipitation events in Europe?

    NASA Astrophysics Data System (ADS)

    Lenggenhager, Sina; Romppainen, Olivia; Brönnimann, Stefan; Croci-Maspoli, Mischa

    2017-04-01

    Atmospheric blockings are quasi-stationary high pressure systems that persist for several days. Due to their longevity, blockings can be key features for extreme weather events. While several studies have shown their relevant role for temperature extremes, the link between blockings and extreme precipitation and floods is still poorly understood. A case study of a Swiss lake flood event in the year 2000 reveals how different processes connected to blockings can favour the development of a flood. First, upstream blocks helped to form strongly elongated troughs that are known to be associated with heavy precipitation events south of the Alps. Second, recurrent precipitation events upstream of a block led to a moistening of the catchment and an increase of the lake level. Third, the progression of the upstream weather systems was slowed, thereby prolonging the precipitation period over the catchment. Additionally, cloud diabatic processes in the flood region contributed to the establishment and maintenance of blocking anticyclones. Based on this case study we extend our analysis to all of Europe. Focusing on flood-relevant precipitation events, i.e. extreme precipitation events that last for several days and affect larger areas, we show that different regions in Europe have very distinct seasonal precipitation patterns. Hence there is a strong seasonality in the occurrence of extreme events, depending on the geographical region. We further suggest that the preferred location of blockings varies strongly between precipitation regimes. Heavy precipitation events in southern France, for example, are often observed during Scandinavian blockings, while heavy precipitation events in south-eastern Europe coincide more often with eastern North-Atlantic blockings.

  3. Origami-inspired building block and parametric design for mechanical metamaterials

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Ma, Hua; Feng, Mingde; Yan, Leilei; Wang, Jiafu; Wang, Jun; Qu, Shaobo

    2016-08-01

    An origami-based building block of mechanical metamaterials is proposed and explained by introducing a mechanism model based on its geometry. According to our model, the origami mechanism's response to uniaxial tension depends on its structure parameters. Hence, its mechanical properties can be tuned by adjusting the structure parameters. Experiments on polylactic acid (PLA) samples were carried out, and the results are in good agreement with those of finite element analysis (FEA). This work may be useful for designing building blocks of mechanical metamaterials or other complex mechanical structures.

  4. Response Surface Analysis of Experiments with Random Blocks

    DTIC Science & Technology

    1988-09-01

    partitioned into a lack of fit sum of squares, SSLOF, and a pure error sum of squares, SSPE. The latter is obtained by pooling the pure error sums of squares...from the blocks. Tests concerning the polynomial effects can then proceed using SSPE as the error term in the denominators of the F test statistics. 3.2...the center point in each of the three blocks is equal to SSPE = 2.0127 with 5 degrees of freedom. Hence, the lack of fit sum of squares is SSLOF

  5. Continuous recording and interobserver agreement algorithms reported in the Journal of Applied Behavior Analysis (1995-2005).

    PubMed

    Mudford, Oliver C; Taylor, Sarah Ann; Martin, Neil T

    2009-01-01

    We reviewed all research articles in 10 recent volumes of the Journal of Applied Behavior Analysis (JABA): Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement, block-by-block agreement, and time-window analysis) were employed in more than 10 of the articles that reported continuous recording. Having identified these currently popular agreement computation algorithms, we explain them to assist researchers, software writers, and other consumers of JABA articles.
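    The block-by-block algorithm named in this review can be sketched as follows. The session length, block size, and timestamp lists are illustrative; exact agreement differs only in scoring each block 1 when the two counts match exactly and 0 otherwise.

```python
def block_counts(events, session_len, block_size):
    """Bin event timestamps (seconds) into consecutive blocks."""
    n = -(-session_len // block_size)  # ceiling division
    counts = [0] * n
    for t in events:
        counts[int(t // block_size)] += 1
    return counts

def block_by_block_agreement(obs1, obs2, session_len, block_size=10):
    """Mean per-block agreement, as a percentage: in each block,
    smaller count / larger count (scored 1.0 when both observers
    recorded nothing)."""
    pairs = zip(block_counts(obs1, session_len, block_size),
                block_counts(obs2, session_len, block_size))
    scores = [1.0 if a == b == 0 else min(a, b) / max(a, b)
              for a, b in pairs]
    return 100.0 * sum(scores) / len(scores)

# Two observers' response times (s) over a 40-s session
obs1 = [2, 5, 14, 21, 37]
obs2 = [3, 6, 15, 38]
print(block_by_block_agreement(obs1, obs2, 40))  # 75.0
```

    Time-window analysis, the third method mentioned, instead asks whether each scored second (or event) has a match within a tolerance window, so it is more forgiving of small timing disagreements than either block-based method.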

  6. Resolving writer's block.

    PubMed Central

    Huston, P.

    1998-01-01

    PROBLEM BEING ADDRESSED: Writer's block, or a distinctly uncomfortable inability to write, can interfere with professional productivity. OBJECTIVE OF PROGRAM: To identify writer's block and to outline suggestions for its early diagnosis, treatment, and prevention. MAIN COMPONENTS OF PROGRAM: Once the diagnosis has been established, a stepwise approach to care is recommended. Mild blockage can be resolved by evaluating and revising expectations, conducting a task analysis, and giving oneself positive feedback. Moderate blockage can be addressed by creative exercises, such as brainstorming and role-playing. Recalcitrant blockage can be resolved with therapy. Writer's block can be prevented by taking opportunities to write at the beginning of projects, working with a supportive group of people, and cultivating an ongoing interest in writing. CONCLUSIONS: Writer's block is a highly treatable condition. A systematic approach can help to alleviate anxiety, build confidence, and give people the information they need to work productively. PMID:9481467

  7. Comparison of saddle, lumbar epidural and caudal blocks on anal sphincter tone: A prospective, randomized study.

    PubMed

    Shon, Yoon-Jung; Huh, Jin; Kang, Sung-Sik; Bae, Seung-Kil; Kang, Ryeong-Ah; Kim, Duk-Kyung

    2016-10-01

    Objective To compare the effects of saddle, lumbar epidural and caudal blocks on anal sphincter tone using anorectal manometry. Methods Patients undergoing elective anorectal surgery with regional anaesthesia were divided randomly into three groups and received a saddle (SD), lumbar epidural (LE), or caudal (CD) block. Anorectal manometry was performed before and 30 min after each regional block. The degree of motor blockade of the anal sphincter was compared using the maximal resting pressure (MRP) and the maximal squeezing pressure (MSP). Results The study analysis population consisted of 49 patients (SD group, n = 18; LE group, n = 16; CD group, n = 15). No significant differences were observed in the percentage inhibition of the MRP among the three regional anaesthetic groups. However, percentage inhibition of the MSP was significantly greater in the SD group (83.6 ± 13.7%) compared with the LE group (58.4 ± 19.8%) and the CD group (47.8 ± 16.9%). In all groups, MSP was reduced significantly more than MRP after each regional block. Conclusions Saddle block was more effective than lumbar epidural or caudal block for depressing anal sphincter tone. No differences were detected between lumbar epidural and caudal blocks.
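
    The degree of motor blockade above is expressed as a percentage inhibition of a manometric pressure, which is simple arithmetic on the pre- and post-block readings; a minimal sketch (the pressures are illustrative, chosen so the output matches the SD group's 83.6% MSP figure):

```python
def percent_inhibition(pre_block, post_block):
    """Percentage fall in anorectal pressure after the regional block."""
    return 100.0 * (pre_block - post_block) / pre_block

# illustrative maximal squeezing pressures (mmHg): 100 before, 16.4 after
print(percent_inhibition(100.0, 16.4))   # 83.6
```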

  8. In-hospital outcome in patients with ST elevation myocardial infarction and right bundle branch block. A sub-study from RENASICA II, a national multicenter registry.

    PubMed

    Juárez-Herrera, Ursulo; Jerjes Sánchez, Carlos; González-Pacheco, Héctor; Martínez-Sánchez, Carlos

    2010-01-01

    To compare in-hospital outcomes in patients with ST-elevation myocardial infarction with right versus left bundle branch block. RENASICA II, a national Mexican registry, enrolled 8098 patients with a final diagnosis of acute coronary syndrome secondary to ischemic heart disease. Among 4555 STEMI patients, 545 had bundle branch block: 318 (58.3%) right and 225 (41.6%) left. The two groups were compared in terms of in-hospital outcome through major adverse cardiovascular events (cardiovascular death, recurrent ischemia and reinfarction). Multivariable analysis was performed to identify in-hospital mortality risk among right and left bundle branch block patients. There were no statistically significant differences between the groups regarding baseline characteristics, time of ischemia, myocardial infarction location, ventricular dysfunction or reperfusion strategies. In-hospital outcome in the bundle branch block group was characterized by a high incidence of major adverse cardiovascular events, with a trend to higher mortality in patients with right bundle branch block (OR 1.70, CI 1.19-2.42, p < 0.003) compared with left bundle branch block patients. In this sub-study, right bundle branch block accompanying ST-elevation myocardial infarction of any location at emergency room presentation was an independent predictor of high in-hospital mortality.

  9. Design and analysis of multihypothesis motion-compensated prediction (MHMCP) codec for error-resilient visual communications

    NASA Astrophysics Data System (ADS)

    Kung, Wei-Ying; Kim, Chang-Su; Kuo, C.-C. Jay

    2004-10-01

    A multi-hypothesis motion-compensated prediction (MHMCP) scheme, which predicts a block from a weighted superposition of multiple reference blocks in the frame buffer, is proposed and analyzed for error-resilient visual communication in this research. By combining these reference blocks effectively, MHMCP can enhance the error resilience of compressed video as well as achieve a coding gain. In particular, we investigate the error propagation effect in the MHMCP coder and analyze the rate-distortion performance in terms of the hypothesis number and hypothesis coefficients. It is shown that MHMCP suppresses the short-term effect of error propagation more effectively than the intra-refreshing scheme. Simulation results are given to confirm the analysis. Finally, several design principles for the MHMCP coder are derived based on the analytical and experimental results.
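
    The core MHMCP prediction step, forming a block as a weighted superposition of several reference blocks, can be sketched as follows (block sizes and hypothesis coefficients here are illustrative; choosing them well is what the paper analyzes):

```python
import numpy as np

def mhmcp_predict(reference_blocks, coefficients):
    """Multi-hypothesis prediction: weighted superposition of reference
    blocks taken from the frame buffer."""
    assert abs(sum(coefficients) - 1.0) < 1e-9, "hypothesis weights should sum to 1"
    pred = np.zeros_like(np.asarray(reference_blocks[0], dtype=float))
    for block, w in zip(reference_blocks, coefficients):
        pred += w * np.asarray(block, dtype=float)
    return pred

# two 2x2 reference blocks from previously decoded frames
refs = [np.full((2, 2), 10.0), np.full((2, 2), 20.0)]
print(mhmcp_predict(refs, [0.5, 0.5]))   # every pixel predicted as 15.0
```

    Averaging over several hypotheses is what dampens error propagation: a corrupted reference contributes only its coefficient's share of the prediction.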

  10. TIGGERC: Turbomachinery Interactive Grid Generator for 2-D Grid Applications and Users Guide

    NASA Technical Reports Server (NTRS)

    Miller, David P.

    1994-01-01

    A two-dimensional multi-block grid generator has been developed for a new design and analysis system for studying multiple blade-row turbomachinery problems. TIGGERC is a mouse-driven, interactive grid generation program which can be used to modify boundary coordinates and grid packing, and generates surface grids using a hyperbolic tangent or algebraic distribution of grid points on the block boundaries. The interior points of each block grid are distributed using a transfinite interpolation approach. TIGGERC can generate a blocked axisymmetric H-grid, C-grid, I-grid or O-grid for studying turbomachinery flow problems. TIGGERC was developed for operation on Silicon Graphics workstations. A detailed discussion of the grid generation methodology, menu options, operational features and sample grid geometries is presented.
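
    The hyperbolic-tangent boundary distribution mentioned above clusters grid points toward one end of an edge. The abstract does not give TIGGERC's exact stretching function, so the sketch below uses one common one-sided tanh clustering law as an assumed stand-in:

```python
import math

def tanh_distribution(n, beta=2.0):
    """One-sided hyperbolic-tangent point clustering on [0, 1]: points
    crowd toward x = 0 as beta grows; beta -> 0 approaches uniform spacing."""
    return [1.0 + math.tanh(beta * (i / n - 1.0)) / math.tanh(beta)
            for i in range(n + 1)]

pts = tanh_distribution(10, beta=3.0)
print(pts[0], pts[-1])                        # endpoints are exactly 0.0 and 1.0
print(pts[1] - pts[0] < pts[-1] - pts[-2])    # True: finer spacing near x = 0
```

    Larger beta packs more points near the x = 0 boundary, the usual requirement for resolving boundary layers on blade surfaces.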

  11. Rapid ordering of block copolymer thin films

    NASA Astrophysics Data System (ADS)

    Majewski, Pawel W.; Yager, Kevin G.

    2016-10-01

    Block-copolymers self-assemble into diverse morphologies, where nanoscale order can be finely tuned via block architecture and processing conditions. However, the ultimate usage of these materials in real-world applications may be hampered by the extremely long thermal annealing times—hours or days—required to achieve good order. Here, we provide an overview of the fundamentals of block-copolymer self-assembly kinetics, and review the techniques that have been demonstrated to influence, and enhance, these ordering kinetics. We discuss the inherent tradeoffs between oven annealing, solvent annealing, microwave annealing, zone annealing, and other directed self-assembly methods; including an assessment of spatial and temporal characteristics. We also review both real-space and reciprocal-space analysis techniques for quantifying order in these systems.

  12. Radio-tracer techniques for the study of flow in saturated porous materials

    USGS Publications Warehouse

    Skibitzke, H.E.; Chapman, H.T.; Robinson, G.M.; McCullough, Richard A.

    1961-01-01

    An experiment was conducted by the U.S. Geological Survey to determine the feasibility of using a radioactive substance as a tracer in the study of microscopic flow in a saturated porous solid. A radioactive tracer was chosen in preference to dye or other chemical in order to eliminate effects of the tracer itself on the flow system such as those relating to density, viscosity and surface tension. The porous solid was artificial "sandstone" composed of uniform fine grains of sand bonded together with an epoxy adhesive. The sides of the block thus made were sealed with an epoxy coating compound to insure water-tightness. Because of the chemical inertness of the block it was possible to use radioactive phosphorus (P32). Ion-exchange equilibrium was created between the block and nonradioactive phosphoric acid. Then a tracer tagged with P32 was injected into the block in the desired geometric configuration, in this case, a line source. After equilibrium in isotopic exchange was reached between the block and the line source, the block was rinsed, drained and sawn into slices. It was found that a quantitative analysis of the flow system may be made by assaying the dissected block. ?? 1961.

  13. Estimating the intensity of ward admission and its effect on emergency department access block.

    PubMed

    Luo, Wei; Cao, Jiguo; Gallagher, Marcus; Wiles, Janet

    2013-07-10

    Emergency department access block is an urgent problem faced by many public hospitals today. When access block occurs, patients in need of acute care cannot access inpatient wards within an optimal time frame. A widely held belief is that access block is the end product of a long causal chain, which involves poor discharge planning, insufficient bed capacity, and inadequate admission intensity to the wards. This paper studies the last link of the causal chain-the effect of admission intensity on access block, using data from a metropolitan hospital in Australia. We applied several modern statistical methods to analyze the data. First, we modeled the admission events as a nonhomogeneous Poisson process and estimated time-varying admission intensity with penalized regression splines. Next, we established a functional linear model to investigate the effect of the time-varying admission intensity on emergency department access block. Finally, we used functional principal component analysis to explore the variation in the daily time-varying admission intensities. The analyses suggest that improving admission practice during off-peak hours may have most impact on reducing the number of ED access blocks. Copyright © 2012 John Wiley & Sons, Ltd.
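
    The first modeling step, estimating a time-varying admission intensity from time-stamped admission events, can be illustrated with a simple Gaussian-kernel rate estimate standing in for the paper's penalized regression splines (all data below are synthetic):

```python
import numpy as np

def admission_intensity(event_hours, bandwidth=1.0, grid=None):
    """Kernel estimate of a nonhomogeneous Poisson intensity (admissions
    per hour) from event times, evaluated on a grid over the day."""
    if grid is None:
        grid = np.linspace(0.0, 24.0, 97)        # every 15 minutes
    t = np.asarray(event_hours, dtype=float)
    diffs = grid[:, None] - t[None, :]
    kernel = np.exp(-0.5 * (diffs / bandwidth) ** 2) / (bandwidth * np.sqrt(2.0 * np.pi))
    return grid, kernel.sum(axis=1)

# synthetic day of ward admissions clustered around 10:00 and 15:00
rng = np.random.default_rng(0)
events = np.concatenate([rng.normal(10, 1, 40), rng.normal(15, 1, 20)])
grid, lam = admission_intensity(events)
print(round(float(grid[np.argmax(lam)]), 2))     # estimated peak lies near hour 10
```

    Integrating the estimated intensity over the day recovers the total event count, which is the sanity check that distinguishes a rate estimate from a density estimate.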

  14. Three-dimensional Reconstruction of Block Shape Irregularity and its Effects on Block Impacts Using an Energy-Based Approach

    NASA Astrophysics Data System (ADS)

    Zhang, Yulong; Liu, Zaobao; Shi, Chong; Shao, Jianfu

    2018-04-01

    This study is devoted to three-dimensional modeling of small falling rocks in block impact analysis, from an energy perspective, using the particle flow method. The restitution coefficient of rockfall collision is introduced from the energy consumption mechanism to describe rockfall-impacting properties. Three-dimensional reconstruction of the falling block is conducted with the help of spherical harmonic functions, which have satisfactory mathematical properties such as orthogonality and rotation invariance. Numerical modeling of the block impact on the bedrock is analyzed with both the sphere-simplified model and the 3D reconstructed model. Comparisons of the obtained results suggest that the 3D reconstructed model is advantageous in considering the combined effects of rockfall velocity and rotation during the colliding process. Verification of the modeling is carried out against results obtained from other experiments. In addition, the effects of rockfall morphology, surface characteristics, velocity, volume, colliding damping and relative angle are investigated. A three-dimensional reconstruction module for falling blocks is to be developed and incorporated into rockfall simulation tools in order to extend the modeling results from block scale to slope scale.
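
    An energy-based restitution coefficient is commonly defined as the square root of the ratio of kinetic energy after the impact to kinetic energy before it; the sketch below uses that assumed textbook definition, which may differ in detail from the paper's formulation:

```python
import math

def energy_restitution(e_kin_before, e_kin_after):
    """Energy-based restitution coefficient of a block impact."""
    return math.sqrt(e_kin_after / e_kin_before)

# a block arrives with 500 J of kinetic energy and rebounds with 125 J
print(energy_restitution(500.0, 125.0))   # 0.5 (75% of the energy dissipated)
```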

  15. A comparison of plasma levobupivacaine concentrations following transversus abdominis plane block and rectus sheath block.

    PubMed

    Yasumura, R; Kobayashi, Y; Ochiai, R

    2016-05-01

    Levobupivacaine is commonly used as the local anaesthetic of choice in peripheral nerve blocks, but its pharmacokinetics have not been fully investigated. We compared the changes in plasma concentrations of levobupivacaine following transversus abdominis plane block and rectus sheath block. Fifty women undergoing laparoscopy were randomly allocated to receive either a transversus abdominis plane block or a rectus sheath block. In both groups, 2.5 mg.kg(-1) levobupivacaine was administered, and blood samples were obtained 15 min, 30 min, 60 min and 120 min after injection. The mean maximum plasma concentration (Cmax) and mean time to reach Cmax (Tmax), as determined by non-linear regression analysis, were 1.05 μg.ml(-1) and 32.4 min in the transversus abdominis plane group and 0.95 μg.ml(-1) and 60.9 min in the rectus sheath group, respectively. The plasma concentration of levobupivacaine peaked earlier in the transversus abdominis plane group than in the rectus sheath group, and the maximum plasma concentration depended on the dose administered but not on the procedure. © 2016 The Association of Anaesthetists of Great Britain and Ireland.

  16. Analysis of backward error recovery for concurrent processes with recovery blocks

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Lee, Y. H.

    1982-01-01

    Three different methods of implementing recovery blocks (RBs) are considered: the asynchronous, synchronous, and pseudo recovery point implementations. Pseudo recovery points are proposed so that unbounded rollback may be avoided while maintaining process autonomy. Probabilistic models for analyzing these three methods were developed under standard assumptions in computer performance analysis, i.e., exponential distributions for the related random variables. The interval between two successive recovery lines for asynchronous RBs, the mean loss in computation power for the synchronized method, and the additional overhead and rollback distance when pseudo recovery points are used were estimated.
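
    The exponential-distribution assumption invites a quick Monte-Carlo check. The toy model below is a simplification with hypothetical rates, not the paper's full probabilistic models: it estimates the mean rollback distance when recovery points are placed at a fixed interval and failures arrive as a Poisson process:

```python
import random

def mean_rollback(failure_rate, checkpoint_interval, trials=100_000, seed=1):
    """Average computation lost to rollback: the time elapsed since the most
    recent recovery point when an exponentially distributed failure hits."""
    rng = random.Random(seed)
    lost = 0.0
    for _ in range(trials):
        t_fail = rng.expovariate(failure_rate)       # exponential failure time
        lost += t_fail % checkpoint_interval         # work since last recovery point
    return lost / trials

# frequent recovery points keep the expected rollback below half an interval
print(mean_rollback(failure_rate=0.1, checkpoint_interval=1.0))
```

    With failures far apart relative to the checkpoint interval, the failure instant is nearly uniform within an interval, so the mean rollback approaches half an interval from below.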

  17. Navy Tactical Applications Guide. Volume 8. Weather Analysis and Forecast Applications. Part 1. Arctic: Greenland/Norwegian/Barents Seas. Meteorological Satellite Systems

    DTIC Science & Technology

    1989-09-01

    Blocks develop over the North Atlantic with the maximum frequency between December and May, and with the typical block lasting 5 to 15 days (Fett et al...combination of these processes and others. In the "others" category is a third potential, described by Shapiro et al. (1987), in which essentially unmodified...distinguish the polar low in the mean." On the other hand Shapiro et al. (1987), in their analysis "of the first research aircraft observations of the

  18. Pulse power applications of silicon diodes in EML capacitive pulsers

    NASA Astrophysics Data System (ADS)

    Dethlefsen, Rolf; McNab, Ian; Dobbie, Clyde; Bernhardt, Tom; Puterbaugh, Robert; Levine, Frank; Coradeschi, Tom; Rinaldi, Vito

    1993-01-01

    Crowbar diodes are used for increasing the energy transfer from capacitive pulse forming networks. They also prevent voltage reversal on the energy storage capacitors. 52 mm diameter diodes with a 5 kV reverse blocking voltage, rated at 40 kA, were successfully used for the 32 MJ SSG rail gun. An uprated diode with increased current capability and a 15 kV reverse blocking voltage has been developed. Transient thermal analysis has predicted the current ratings for different pulse lengths. Verification of the analysis is obtained from destructive testing.

  19. The Building Blocks of Digital Media Literacy: Socio-Material Participation and the Production of Media Knowledge

    ERIC Educational Resources Information Center

    Dezuanni, Michael

    2015-01-01

    This article outlines the knowledge and skills students develop when they engage in digital media production and analysis in school settings. The metaphor of "digital building blocks" is used to describe the material practices, conceptual understandings and production of knowledge that lead to the development of digital media literacy.…

  20. Toward an Aspirational Learning Model Gleaned from Large-Scale Assessment

    ERIC Educational Resources Information Center

    Diket, Read M.; Xu, Lihua; Brewer, Thomas M.

    2014-01-01

    The aspirational model resulted from the authors' secondary analysis of the Mother/Child (M/C) test block from the 2008 National Assessment of Educational Progress restricted data that examined the responses of the national sample of 8th-grade students (n = 1648). This test block presented no artmaking task and consisted of the same 13 questions…

  1. Building Blocks of Contemporary HRD Research: A Citation Analysis on Human Resource Development Quarterly between 2007 and 2013

    ERIC Educational Resources Information Center

    Mehdiabadi, Amir Hedayati; Seo, Gaeun; Huang, Wenhao David; Han, Seung-hyun Caleb

    2017-01-01

    Human resource development is known to encapsulate a collection of social science disciplines including communications, psychology, and economics. Since these and other similar areas are the cornerstones of HRD, the changing nature of HRD demands constant reflections on the value and building blocks of contemporary HRD inquiries. This article…

  2. Multi-blocking strategies for the INS3D incompressible Navier-Stokes code

    NASA Technical Reports Server (NTRS)

    Gatlin, Boyd

    1990-01-01

    With the continuing development of bigger and faster supercomputers, computational fluid dynamics (CFD) has become a useful tool for real-world engineering design and analysis. However, the number of grid points necessary to resolve realistic flow fields numerically can easily exceed the memory capacity of available computers. In addition, geometric shapes of flow fields, such as those in the Space Shuttle Main Engine (SSME) power head, may be impossible to fill with continuous grids upon which to obtain numerical solutions to the equations of fluid motion. The solution to this dilemma is simply to decompose the computational domain into subblocks of manageable size. Computer codes that are single-block by construction can be modified to handle multiple blocks, but ad-hoc changes in the FORTRAN have to be made for each geometry treated. For engineering design and analysis, what is needed is generalization so that the blocking arrangement can be specified by the user. INS3D is a computer program for the solution of steady, incompressible flow problems. It is used frequently to solve engineering problems in the CFD Branch at Marshall Space Flight Center. INS3D uses an implicit solution algorithm and the concept of artificial compressibility to provide the necessary coupling between the pressure field and the velocity field. The development of generalized multi-block capability in INS3D is described.
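
    The user-specified blocking the text calls for can be illustrated in one dimension: split a grid index range into contiguous sub-blocks that share an interface point with their neighbours (a generic sketch, not INS3D's actual data structures):

```python
def split_into_blocks(n_points, n_blocks):
    """Partition grid indices 0..n_points-1 into contiguous sub-blocks;
    adjacent blocks share their interface index, as in multi-block grids."""
    base, extra = divmod(n_points - 1, n_blocks)
    blocks, start = [], 0
    for b in range(n_blocks):
        size = base + (1 if b < extra else 0)
        blocks.append((start, start + size))   # inclusive index range
        start += size
    return blocks

print(split_into_blocks(101, 4))   # [(0, 25), (25, 50), (50, 75), (75, 100)]
```

    The shared interface indices are where a multi-block solver exchanges boundary data between neighbouring blocks each iteration.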

  3. Tailor-made polyfluoroacrylate and its block copolymer by RAFT polymerization in miniemulsion; improved hydrophobicity in the core-shell block copolymer.

    PubMed

    Chakrabarty, Arindam; Singha, Nikhil K

    2013-10-15

    Controlled/living radical polymerization (CRP) of a fluoroacrylate was successfully carried out in miniemulsion by Reversible Addition Fragmentation chain Transfer (RAFT) process. In this case, 2,2,3,3,4,4,4-heptafluorobutyl acrylate (HFBA) was polymerized using 2-cyanopropyl dodecyl trithiocarbonate (CPDTC) as RAFT agent, Triton X-405 and sodium dodecyl sulfonate (SDS) as surfactant, and potassium persulphate (KPS) or 2,2'-azobis isobutyronitrile (AIBN) as initiator. Being compatible with hydrophobic fluoroacrylate, this RAFT agent offered very high conversion and good control over the molecular weight of the polymer. The miniemulsion was stable without any costabilizer. The long chain dodecyl group (-C12H25) (Z-group in the RAFT agent) had beneficial effect in stabilizing the miniemulsion. When 2-cyano 2-propyl benzodithioate (CPBD) (Z=-C6H5) was used as RAFT agent, the conversion was less and particle size distribution was very broad. Block copolymerization with butyl acrylate (BA) using PHFBA as macro-RAFT agent showed core-shell morphology with the aggregation of PHFBA segment in the shell. GPC as well as DSC analysis confirmed the formation of block copolymer. The core-shell morphology was confirmed by TEM analysis. The block copolymers (PHFBA-b-PBA) showed significantly higher water contact angle (WCA) showing much better hydrophobicity compared to PHFBA alone. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Proteomic analysis of formalin-fixed paraffin-embedded renal tissue samples by label-free MS: assessment of overall technical variability and the impact of block age.

    PubMed

    Craven, Rachel A; Cairns, David A; Zougman, Alexandre; Harnden, Patricia; Selby, Peter J; Banks, Rosamonde E

    2013-04-01

    Protein profiling of formalin-fixed paraffin-embedded (FFPE) tissues has enormous potential for the discovery and validation of disease biomarkers. The aim of this study was to systematically characterize the effect of length of time of storage of such tissue blocks in pathology archives on the quality of data produced using label-free MS. Normal kidney and clear cell renal cell carcinoma tissues routinely collected up to 10 years prior to analysis were profiled using LC-MS/MS and the data analyzed using MaxQuant. Protein identities and quantification data were analyzed to examine differences between tissue blocks of different ages and assess the impact of technical and biological variability. An average of over 2000 proteins was seen in each sample with good reproducibility in terms of proteins identified and quantification for normal kidney tissue, with no significant effect of block age. Greater biological variability was apparent in the renal cell carcinoma tissue, possibly reflecting disease heterogeneity, but again there was good correlation between technical replicates and no significant effect of block age. These results indicate that archival storage time does not have a detrimental effect on protein profiling of FFPE tissues, supporting the use of such tissues in biomarker discovery studies. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Building blocks for the development of an interface for high-throughput thin layer chromatography/ambient mass spectrometric analysis: a green methodology.

    PubMed

    Cheng, Sy-Chyi; Huang, Min-Zong; Wu, Li-Chieh; Chou, Chih-Chiang; Cheng, Chu-Nian; Jhang, Siou-Sian; Shiea, Jentaie

    2012-07-17

    Interfacing thin layer chromatography (TLC) with ambient mass spectrometry (AMS) has been an important area of analytical chemistry because of its capability to rapidly separate and characterize the chemical compounds. In this study, we have developed a high-throughput TLC-AMS system using building blocks to deal, deliver, and collect the TLC plate through an electrospray-assisted laser desorption ionization (ELDI) source. This is the first demonstration of the use of building blocks to construct and test the TLC-MS interfacing system. With the advantages of being readily available, cheap, reusable, and extremely easy to modify without consuming any material or reagent, the use of building blocks to develop the TLC-AMS interface is undoubtedly a green methodology. The TLC plate delivery system consists of a storage box, plate dealing component, conveyer, light sensor, and plate collecting box. During a TLC-AMS analysis, the TLC plate was sent to the conveyer from a stack of TLC plates placed in the storage box. As the TLC plate passed through the ELDI source, the chemical compounds separated on the plate would be desorbed by laser desorption and subsequently postionized by electrospray ionization. The samples, including a mixture of synthetic dyes and extracts of pharmaceutical drugs, were analyzed to demonstrate the capability of this TLC-ELDI/MS system for high-throughput analysis.

  6. The Iterative Reweighted Mixed-Norm Estimate for Spatio-Temporal MEG/EEG Source Reconstruction.

    PubMed

    Strohmeier, Daniel; Bekhti, Yousra; Haueisen, Jens; Gramfort, Alexandre

    2016-10-01

    Source imaging based on magnetoencephalography (MEG) and electroencephalography (EEG) allows for the non-invasive analysis of brain activity with high temporal and good spatial resolution. As the bioelectromagnetic inverse problem is ill-posed, constraints are required. For the analysis of evoked brain activity, spatial sparsity of the neuronal activation is a common assumption. It is often taken into account using convex constraints based on the l1-norm. The resulting source estimates are however biased in amplitude and often suboptimal in terms of source selection due to high correlations in the forward model. In this work, we demonstrate that an inverse solver based on a block-separable penalty with a Frobenius norm per block and an l0.5-quasinorm over blocks addresses both of these issues. For solving the resulting non-convex optimization problem, we propose the iterative reweighted Mixed Norm Estimate (irMxNE), an optimization scheme based on iterative reweighted convex surrogate optimization problems, which are solved efficiently using a block coordinate descent scheme and an active set strategy. We compare the proposed sparse imaging method to the dSPM and the RAP-MUSIC approach based on two MEG data sets. We provide empirical evidence based on simulations and analysis of MEG data that the proposed method improves on the standard Mixed Norm Estimate (MxNE) in terms of amplitude bias, support recovery, and stability.
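
    The block-separable penalty can be handled with block-wise soft-thresholding inside a proximal-gradient loop, and the l0.5-quasinorm over blocks via iterative reweighting of a convex group penalty. The toy sketch below illustrates that scheme on synthetic data; it is a simplified stand-in (ISTA inner solver, hypothetical regularization value) for the authors' irMxNE with block coordinate descent and active sets:

```python
import numpy as np

def block_soft_threshold(X, thresholds):
    """Prox of a weighted sum of per-block Frobenius norms: shrink each
    block (row) of X toward zero by its own threshold."""
    out = np.zeros_like(X)
    for k, row in enumerate(X):
        norm = np.linalg.norm(row)
        if norm > thresholds[k]:
            out[k] = (1.0 - thresholds[k] / norm) * row
    return out

def ir_mixed_norm(G, M, lam=0.1, n_reweight=5, n_inner=200, eps=1e-8):
    """Iteratively reweighted group-sparse estimate: each outer pass solves
    a weighted convex surrogate by proximal gradient, then re-derives the
    block weights from a majorization of the l0.5 quasinorm."""
    X = np.zeros((G.shape[1], M.shape[1]))
    weights = np.ones(G.shape[1])
    step = 1.0 / np.linalg.norm(G, 2) ** 2
    for _ in range(n_reweight):
        for _ in range(n_inner):
            grad = G.T @ (G @ X - M)
            X = block_soft_threshold(X - step * grad, step * lam * weights)
        norms = np.linalg.norm(X, axis=1)
        weights = 1.0 / (2.0 * np.sqrt(norms + eps))   # reweighting step
    return X

rng = np.random.default_rng(0)
G = rng.standard_normal((20, 30))            # toy forward operator
X_true = np.zeros((30, 5))
X_true[3], X_true[17] = 2.0, -1.5            # two active source blocks
M = G @ X_true                               # noiseless measurements
X_hat = ir_mixed_norm(G, M)
norms = np.linalg.norm(X_hat, axis=1)
print(sorted(int(i) for i in np.argsort(norms)[-2:]))   # the two dominant blocks
```

    The reweighting is what counters the amplitude bias of a plain convex group penalty: blocks that survive a pass receive smaller weights, and hence less shrinkage, on the next pass.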

  7. Effects of block copolymer properties on nanocarrier protection from in vivo clearance

    PubMed Central

    D’Addio, Suzanne M.; Saad, Walid; Ansell, Steven M.; Squiers, John J.; Adamson, Douglas; Herrera-Alonso, Margarita; Wohl, Adam R.; Hoye, Thomas R.; Macosko, Christopher W.; Mayer, Lawrence D.; Vauthier, Christine; Prud’homme, Robert K.

    2012-01-01

    Drug nanocarrier clearance by the immune system must be minimized to achieve targeted delivery to pathological tissues. There is considerable interest in finding in vitro tests that can predict in vivo clearance outcomes. In this work, we produce nanocarriers with dense PEG layers resulting from block copolymer-directed assembly during rapid precipitation. Nanocarriers are formed using block copolymers with hydrophobic blocks of polystyrene (PS), poly-ε-caprolactone (PCL), poly-D,L-lactide (PLA), or poly-lactide-co-glycolide (PLGA), and hydrophilic blocks of polyethylene glycol (PEG) with molecular weights from 1.5 kg/mol to 9 kg/mol. Nanocarriers with paclitaxel prodrugs are evaluated in vivo in Foxn1nu mice to determine relative rates of clearance. The amount of nanocarrier in circulation after 4 h varies from 10% to 85% of initial dose, depending on the block copolymer. In vitro complement activation assays are conducted in an effort to correlate the protection of the nanocarrier surface from complement binding and activation and in vivo circulation. Guidelines for optimizing block copolymer structure to maximize circulation of nanocarriers formed by rapid precipitation and directed assembly are proposed, relating to the relative size of the hydrophilic and hydrophobic block, the hydrophobicity of the anchoring block, the absolute size of the PEG block, and polymer crystallinity. The in vitro results distinguish between the poorly circulating PEG5k-PCL9k and the better circulating nanocarriers, but could not rank the better circulating nanocarriers in order of circulation time. Analysis of PEG surface packing on monodisperse 200 nm latex spheres indicates that the sizes of the hydrophobic PCL, PS, and PLA blocks are correlated with the PEG blob size, and possibly the clearance from circulation. Suggestions for next step in vitro measurements are made. PMID:22732478

  8. Wrong-site nerve blocks: A systematic literature review to guide principles for prevention.

    PubMed

    Deutsch, Ellen S; Yonash, Robert A; Martin, Donald E; Atkins, Joshua H; Arnold, Theresa V; Hunt, Christina M

    2018-05-01

    Wrong-site nerve blocks (WSBs) are a significant, though rare, source of perioperative morbidity. WSBs constitute the most common type of perioperative wrong-site procedure reported to the Pennsylvania Patient Safety Authority. This systematic literature review aggregates information about the incidence, patient consequences, and conditions that contribute to WSBs, as well as evidence-based methods to prevent them. A systematic search of English-language publications was performed, using the PRISMA process. Seventy English-language publications were identified. Analysis of four publications reporting on at least 10,000 blocks provides a rate of 0.52 to 5.07 WSB per 10,000 blocks, unilateral blocks, or "at risk" procedures. The most commonly mentioned potential consequence was local anesthetic toxicity. The most commonly mentioned contributory factors were time pressure, personnel factors, and lack of site-mark visibility (including no site mark placed). Components of the block process that were addressed include preoperative nerve-block verification, nerve-block site marking, time-outs, and the healthcare facility's structure and culture of safety. A lack of uniform reporting criteria and divergence in the data and theories presented may reflect the variety of circumstances affecting when and how nerve blocks are performed, as well as the infrequency of a WSB. However, multiple authors suggest three procedural steps that may help to prevent WSBs: (1) verify the nerve-block procedure using multiple sources of information, including the patient; (2) identify the nerve-block site with a visible mark; and (3) perform time-outs immediately prior to injection or instillation of the anesthetic. Hospitals, ambulatory surgical centers, and anesthesiology practices should consider creating site-verification processes with clinician input and support to develop sustainable WSB-prevention practices. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. "Transversus Abdominis Plane Blocks in Microsurgical Breast Reconstruction: Analysis of Pain, Narcotic Consumption, Length of Stay and Cost."

    PubMed

    Salibian, Ara A; Frey, Jordan D; Thanik, Vishal D; Karp, Nolan S; Choi, Mihye

    2018-06-02

    Transversus abdominis plane (TAP) blocks are increasingly being utilized in microvascular breast reconstruction. The implications of TAP blocks on specific reconstructive, patient and institutional outcomes remain to be fully elucidated. Patients undergoing abdominally-based microvascular breast reconstruction from 2015-2017 were reviewed. Length of stay, complications, narcotic consumption, donor-site pain and hospital expenses were compared between patients that did and those that did not receive TAP blocks with liposomal bupivacaine. Outcomes were subsequently compared in patients with elevated body mass index (BMI). Fifty patients (43.9%) received TAP blocks (27 [54.0%] under ultrasound guidance) and 64 patients (56.1%) did not. Patients with TAP blocks had significantly decreased oral and total narcotic consumption (p=0.0001 and p<0.0001, respectively) as well as significantly less donor-site pain (3.3 versus 4.3, p<0.0001). There was no significant difference in hospital expenses between the two cohorts ($21,531.53 versus $22,050.15 per patient, p=0.5659). Patients with BMI≥25 who received TAP blocks had a significantly decreased length of stay (3.8 versus 4.4 days, p=0.0345) as well as decreased narcotic consumption and postoperative pain compared to patients without TAP blocks. Patients with BMI<25 did not have a significant difference in postoperative pain, narcotic consumption or length of stay between the TAP versus no TAP block groups. TAP blocks with liposomal bupivacaine significantly reduce oral and total postoperative narcotic consumption as well as donor-site pain in all patients after abdominally-based microvascular breast reconstruction without increasing hospital expenses. TAP blocks additionally significantly decrease length of stay in patients with BMI≥25.

  10. Cell block samples from malignant pleural effusion might be valid alternative samples for anaplastic lymphoma kinase detection in patients with advanced non-small-cell lung cancer.

    PubMed

    Zhou, Jianya; Yao, Hongtian; Zhao, Jing; Zhang, Shumeng; You, Qihan; Sun, Ke; Zou, Yinying; Zhou, Caicun; Zhou, Jianying

    2015-06-01

    To evaluate the clinical value of cell block samples from malignant pleural effusion (MPE) as alternative samples to tumour tissue for anaplastic lymphoma kinase (ALK) detection in patients with advanced non-small-cell lung cancer (NSCLC). Fifty-two matched samples were eligible for analysis. ALK status was detected by Ventana immunohistochemistry (IHC) (with the D5F3 clone), reverse transcription polymerase chain reaction (RT-PCR) and fluorescence in-situ hybridization (FISH) in MPE cell block samples, and by FISH in tumour tissue block samples. In total, ALK FISH results were obtained for 52 tumour tissue samples and 41 MPE cell block samples. Eight cases (15.4%) were ALK-positive in tumour tissue samples by FISH, and among matched MPE cell block samples, five were ALK-positive by FISH, seven were ALK-positive by RT-PCR, and eight were ALK-positive by Ventana IHC. The ALK status concordance rates between tumour tissue and MPE cell block samples were 78.9% by FISH, 98.1% by RT-PCR, and 100% by Ventana IHC. In MPE cell block samples, the sensitivity and specificity of Ventana IHC (100% and 100%) and RT-PCR (87.5% and 100%) were higher than those of FISH (62.5% and 100%). Malignant pleural effusion cell block samples had a diagnostic performance for ALK detection in advanced NSCLC that was comparable to that of tumour tissue samples. MPE cell block samples might be valid alternative samples for ALK detection when tissue is not available. Ventana IHC could be the most suitable method for ALK detection in MPE cell block samples. © 2014 John Wiley & Sons Ltd.

  11. Two-sided block of a dual-topology F- channel.

    PubMed

    Turman, Daniel L; Nathanson, Jacob T; Stockbridge, Randy B; Street, Timothy O; Miller, Christopher

    2015-05-05

    The Fluc family is a set of small membrane proteins forming F(-)-specific electrodiffusive ion channels that rescue microorganisms from F(-) toxicity during exposure to weakly acidic environments. The functional channel is built as a dual-topology homodimer with twofold symmetry parallel to the membrane plane. Fluc channels are blocked by nanomolar-affinity fibronectin-domain monobodies originally selected from phage-display libraries. The unusual symmetrical antiparallel dimeric architecture of Flucs demands that the two chemically equivalent monobody-binding epitopes reside on opposite ends of the channel, a double-sided blocking situation that has never before presented itself in ion channel biophysics. However, it is not known whether both sites can be simultaneously occupied, and if so, whether monobodies bind independently or cooperatively to their transmembrane epitopes. Here, we use direct monobody-binding assays and single-channel recordings of a Fluc channel homolog to reveal a novel trimolecular blocking behavior involving a doubly occupied blocked state. Kinetic analysis of single-channel recordings made with monobody present on both sides of the membrane shows substantial negative cooperativity between the two blocking sites.
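
    The two-site blocking scheme in this abstract can be illustrated with a standard binding polynomial for two equivalent sites, where a cooperativity factor below one encodes negative cooperativity. All parameter values below (association constant, cooperativity factor, monobody concentration) are hypothetical placeholders, not values from the study:

```python
# occupancy of two symmetric blocking sites with an interaction factor
# (hypothetical parameters; c < 1 encodes negative cooperativity)
K = 1e8          # 1/M, intrinsic association constant per site
c = 0.05         # cooperativity factor for the second binding event
M = 20e-9        # 20 nM free monobody

# binding polynomial for two equivalent sites: Z = 1 + 2*K*M + c*(K*M)**2
x = K * M
Z = 1 + 2 * x + c * x * x
p_unblocked = 1 / Z             # no monobody bound
p_single = 2 * x / Z            # one side blocked (two equivalent ways)
p_double = c * x * x / Z        # doubly occupied blocked state
```

With these placeholder numbers the doubly occupied state is rare relative to single-sided block, which is the qualitative signature of negative cooperativity.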

  12. A Cytological Analysis of the Antimetabolite Activity of 5-Hydroxyuracil in Vicia faba Roots

    PubMed Central

    Schreiber, Richard W.; Duncan, Robert E.

    1958-01-01

    The effects of 5-hydroxyuracil (5-HU) (isobarbituric acid) upon cell elongation, mitosis, and DNA synthesis were studied in Vicia faba roots. 5-HU had no consistent effect upon root elongation. It blocked DNA synthesis (analyzed by photometric measurements of Feulgen dye in nuclei) during the first 6 hours of treatment; the block spontaneously disappeared by the 12th hour of treatment. Uracil and thymine had no effect upon this block of synthesis. Both thymidine and uridine reversed the block in 6 and 9 hours respectively. In all cases blockage of DNA synthesis was followed by inhibition of mitosis (determined by changes in the percentage of cells in mitosis) and resumption of DNA synthesis was followed by resumption of mitosis. Inhibition indices calculated from the mitotic data indicated a competitive relationship between 5-HU and thymidine and 5-HU and uridine. 5-HU is considered to block DNA synthesis by competing with thymidine for sites on enzymes involved in the synthesis. It is suggested that uridine reverses the block in synthesis by undergoing a conversion to thymidine. PMID:13610946

  13. Strategic retrieval processing and the impact of knowing when a memory was first created.

    PubMed

    Roberts, J S; Tsivilis, D; Mayes, A R

    2014-04-01

    Retrieval orientation refers to a process whereby participants strategically alter how a memory cue is processed in response to different task demands. In the present study we explored whether retrieval orientation is influenced by knowing when an old stimulus was first encoded. Participants completed separate remember/know test blocks for old items from a recent delay (40 min) and old items from a remote delay (48 h). Manipulations at encoding ensured that performance levels were matched between these two blocks, thus avoiding confounds with differences in retrieval effort. Importantly, a third test block comprised old items from both delays randomly intermixed. As the nature of the old items varies unpredictably in the mixed block, it should not be possible to adopt a specific retrieval orientation, and the mixed block therefore acts as a control condition. Participants saw the words "mixed," "recent" or "remote" prior to each test block. Comparing ERPs from the recent and remote blocks permitted an investigation of whether participants alter their retrieval orientation in response to the specific length of the retention interval. Comparing ERPs from the pure (recent and remote) test blocks to ERPs from the mixed block permitted an investigation of whether delay information per se led to differences in retrieval strategy. Analysis of the ERP data found no differences between the recent and remote blocks. However, ERPs from these pure blocks were significantly less positive than ERPs from the mixed block from 200 ms towards the end of the epoch. The findings suggest that the delay information was useful in a general sense and encouraged retrieval strategies distinct from those engaged in the mixed block. We speculate that such strategies might relate to whether or not the retrieval search is specific and constrained and/or whether processes that serve to reinstate the original encoding context are engaged. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Development and Validation of a New Air Carrier Block Time Prediction Model and Methodology

    NASA Astrophysics Data System (ADS)

    Litvay, Robyn Olson

    Commercial airline operations rely on predicted block times as the foundation for critical, successive decisions that include fuel purchasing, crew scheduling, and airport facility usage planning. Small inaccuracies in the predicted block times can result in huge financial losses and, with profit margins for airline operations currently almost nonexistent, potentially negate any possible profit. Although optimization techniques have produced many models targeting airline operations, the challenge of accurately predicting and quantifying variables months in advance remains elusive. The objective of this work is the development of an airline block time prediction model and methodology that is practical, easily implemented, and easily updated. Actual U.S. domestic flight data from a major airline were used to develop a model that predicts airline block times with increased accuracy and smaller variance of the actual times from the predicted times. This reduction in variance represents tens of millions of dollars (U.S.) per year in operational cost savings for an individual airline. A new methodology for block time prediction is constructed using a regression model as the base, as it has both deterministic and probabilistic components, together with historic block time distributions. The estimation of block times for commercial domestic airline operations requires a probabilistic, general model that can be easily customized for a specific airline's network. As individual block times vary by season, by day, and by time of day, the challenge is to make general, long-term estimations representing the average actual block times while minimizing the variation. Predictions of block times for the third-quarter months of July and August 2011 were calculated using this new model. The resulting actual block times were obtained from the Research and Innovative Technology Administration, Bureau of Transportation Statistics (Airline On-time Performance Data, 2008-2011) for comparison and analysis. Future block times are shown to be predicted with greater accuracy, without exception and network-wide, for a major U.S. domestic airline.
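
    The methodology described, a regression base with deterministic and probabilistic components plus historic block time distributions, can be sketched roughly as follows. The flight data, the least-squares fit and the percentile pad are all invented for illustration and are not the paper's actual model:

```python
# hypothetical historical data: (great-circle distance in nm, actual block time in min)
history = [(300, 75), (300, 82), (500, 104), (500, 111),
           (800, 148), (800, 155), (1100, 196), (1100, 205)]

xs = [d for d, _ in history]
ys = [t for _, t in history]

# deterministic component: ordinary least-squares line  t = a + b*d
n = len(history)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in history) / sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

# probabilistic component: pad the estimate with an upper percentile of the
# historical residual distribution so most flights arrive within block time
residuals = sorted(y - (a + b * x) for x, y in history)
pad = residuals[int(0.8 * (len(residuals) - 1))]   # crude 80th-percentile pad

def predict_block_time(distance_nm: float) -> float:
    return a + b * distance_nm + pad

est = predict_block_time(650)
```

The percentile pad is the design choice that trades fuel and crew cost against on-time performance; an airline would tune it per market rather than use one global value.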

  15. Young Investigator Challenge: A novel, simple method for cell block preparation, implementation, and use over 2 years.

    PubMed

    Lindsey, Kathryn G; Houser, Patricia M; Shotsberger-Gray, Wanda; Chajewski, Olga S; Yang, Jack

    2016-12-01

    The cell block is an essential adjunct to conventional cytopreparatory techniques. The growing demand for molecular analysis and immunostains will increase the need for successful cell block preparation. Even with this need, to the authors' knowledge very little has changed regarding the way in which cell blocks are produced. The authors developed A Formalin-Fixed, paraffin Embedded Cytology cell block Technique (AFFECT) that uses a cytospin centrifuge and funnel to deposit a cell pellet into a well on a piece of open-cell, absorbent foam. The foam and the pellet are then sent through normal processing. Herein, the authors present the implementation of this method and some of their experience with its performance over the course of 2 years. Although a comparison of the methods indicated good correlation for the production of a cell block between AFFECT and the agarose method, the AFFECT blocks demonstrated markedly improved cellular morphology. Over the first 6 months of use, AFFECT produced a successful cell block in 74% of cases overall, and in 65% of cases with a cell pellet measuring ≤0.1 mL. The year preceding the implementation of AFFECT and its first year of use were compared for endoscopic and bronchoscopic ultrasound-guided fine-needle aspiration specimens, and demonstrated an improved success rate. The authors developed a novel method of cell block preparation that demonstrates improved histology and has increased the success rate of cell block production compared with the agarose method. Cancer Cytopathol 2016;124:885-892. © 2016 American Cancer Society.

  16. Comparison of the effect of land-sea thermal contrast on interdecadal variations in winter and summer blockings

    NASA Astrophysics Data System (ADS)

    He, Yongli; Huang, Jianping; Li, Dongdong; Xie, Yongkun; Zhang, Guolong; Qi, Yulei; Wang, Shanshan; Totz, Sonja

    2017-11-01

    The influence of the winter and summer land-sea surface thermal contrast on blocking for 1948-2013 is investigated using observations and Coupled Model Intercomparison Project outputs. A land-sea index (LSI) is defined to measure changes in zonally asymmetric thermal forcing under global warming. The summer LSI shows a slower increasing trend than the winter LSI over this period. In the positive phase of the summer LSI, the EP flux convergence induced by land-sea thermal forcing at high latitudes becomes weaker than normal, which induces a positive anomaly of the zonal-mean westerly and a double-jet structure. Based on the quasi-resonance amplification mechanism, the narrow and reduced westerly tunnel between the two jet centers provides a favorable environment for more frequent blocking. Composite analysis demonstrates that summer blocking shows an increasing trend in event numbers and a decreasing trend in duration. The number of short-lived blocking events persisting for 5-9 days increases significantly, while the number of long-lived events persisting for longer than 10 days increases only weakly relative to the negative phase of the summer LSI. The increasing transient wave activity induced by the summer LSI is responsible for the decreasing duration of blocking. The increased blocking due to the summer LSI can further strengthen continental warming and raise the summer LSI, forming a positive feedback. The opposite dynamical effects of the LSI on summer and winter blocking are also discussed; the LSI-blocking negative feedback partially offsets the positive feedback above and induces a weak summer warming rate.

  17. Unpacking the Blockers: Understanding Perceptions and Social Constraints of Health Communication in Hereditary Breast Ovarian Cancer (HBOC) Susceptibility Families

    PubMed Central

    Kenen, Regina; Hoskins, Lindsey M.; Koehly, Laura M.; Graubard, Barry; Loud, Jennifer T.; Greene, Mark H.

    2012-01-01

    Family communication is essential for accurate cancer risk assessment and counseling; family blockers play a role in this communication process. This qualitative analysis of social exchanges is an extension of earlier work characterizing those who are perceived by study participants as health information gatherers, disseminators, and blockers within families with Hereditary Breast and Ovarian Cancer (HBOC) susceptibility. Eighty-nine women, ages 23–56 years, enrolled in a Breast Imaging Study (BIS) and participated in a sub-study utilizing a social assessment tool known as the Colored Ecological Genetic Relational Map (CEGRM). Purposive sampling ensured that participants varied according to numbers of participating family members e.g., ranging from 1 to 6. Eighty-nine women from 42 families (1–8 relatives/family) participated. They collectively designated 65 blockers, both male and female. Situational factors, beliefs, attitudes and cultural traditions, privacy and protectiveness comprised perceived reasons for blocking intra-family health communications. Longitudinal data collected over 4 years showed families where blocking behavior was universally recognized and stable over time, as well as other families where blocking was less consistent. Self-blocking was observed among a significant minority of participating women. Blocking of health communications among family members with HBOC was variable, complex, and multifaceted. The reasons for blocking were heterogeneous; duration of the blocking appeared to depend on the reasons for blocking. Blocking often seemed to involve bi-directional feedback loops, in keeping with Lepore’s Social Constraints and Modulation Theory. Privacy and protectiveness predominated as explanations for long-term blocking. PMID:21547418

  18. Synthesis of Diblock copolymer poly-3-hydroxybutyrate -block-poly-3-hydroxyhexanoate [PHB-b-PHHx] by a β-oxidation weakened Pseudomonas putida KT2442.

    PubMed

    Tripathi, Lakshmi; Wu, Lin-Ping; Chen, Jinchun; Chen, Guo-Qiang

    2012-04-05

    Block polyhydroxyalkanoates (PHA) have been reported to resist the polymer aging that negatively affects polymer properties. Recently, more and more attempts have been directed at making PHA block copolymers. Diblock copolymers PHB-b-PHHx, consisting of a poly-3-hydroxybutyrate (PHB) block covalently bonded to a poly-3-hydroxyhexanoate (PHHx) block, were produced successfully for the first time by a recombinant Pseudomonas putida KT2442 with its β-oxidation cycle deleted to the maximum extent. The chloroform-extracted polymers were characterized by nuclear magnetic resonance (NMR) and thermal and mechanical analysis. NMR confirmed the existence of diblock copolymers consisting of 58 mol% PHB as the short-chain-length block and 42 mol% PHHx as the medium-chain-length block. Differential scanning calorimetry (DSC) revealed that the block copolymers had two glass transition temperatures (Tg) at 2.7°C and -16.4°C, one melting temperature (Tm) at 172.1°C and one cold crystallization temperature (Tc) at 69.1°C. This is the first microbial short-chain-length (scl) and medium-chain-length (mcl) PHA block copolymer reported. It is possible to produce PHA block copolymers of various kinds using the recombinant Pseudomonas putida KT2442 with its β-oxidation cycle deleted to the maximum extent. In comparison with a random copolymer poly-3-hydroxybutyrate-co-3-hydroxyhexanoate (P(HB-co-HHx)) and a blend of PHB and PHHx, PHB-b-PHHx showed improved structure-related mechanical properties.

  19. Functional Activation during the Rapid Visual Information Processing Task in a Middle Aged Cohort: An fMRI Study.

    PubMed

    Neale, Chris; Johnston, Patrick; Hughes, Matthew; Scholey, Andrew

    2015-01-01

    The Rapid Visual Information Processing (RVIP) task, a serial discrimination task in which performance is believed to reflect sustained attention capabilities, is widely used in behavioural research and increasingly in neuroimaging studies. To date, functional neuroimaging research into the RVIP has been undertaken using block analyses, reflecting the sustained processing involved in the task, but not necessarily the transient processes associated with individual trial performance. Furthermore, this research has been limited to young cohorts. This study assessed the behavioural and functional magnetic resonance imaging (fMRI) outcomes of the RVIP task using both block and event-related analyses in a healthy middle-aged cohort (mean age = 53.56 years, n = 16). The results show that the version of the RVIP used here is sensitive to changes in attentional demand, with participants achieving a 43% accuracy hit rate in the experimental task compared with 96% accuracy in the control task. As shown by previous research, the block analysis revealed an increase in activation in a network of frontal, parietal, occipital and cerebellar regions. The event-related analysis showed a similar network of activation, seemingly omitting regions involved in the sustained processing of the task (as shown in the block analysis), such as occipital areas and the thalamus, providing an indication of a network of regions involved in correct trial performance. Frontal (superior and inferior frontal gyri), parietal (precuneus, inferior parietal lobe) and cerebellar regions were shown to be active in both the block and event-related analyses, suggesting their importance in sustained attention/vigilance. These networks and the differences between them are discussed in detail, as well as implications for future research in middle-aged cohorts.
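
    The distinction between block and event-related fMRI analyses comes down to the regressors entered into the model: a sustained boxcar versus isolated trial onsets, each convolved with a hemodynamic response function. A minimal sketch, using a crude gamma-shaped HRF rather than the canonical double-gamma used by analysis packages, and invented onset times:

```python
import math

def hrf(t):
    # crude gamma-like hemodynamic response (an assumption, not SPM's double gamma)
    return (t ** 5) * math.exp(-t) if t >= 0 else 0.0

TR = 1.0        # repetition time in seconds (hypothetical)
n_scans = 60

# block design: 15 s task "on" / 15 s "off"; event-related: isolated trial onsets
block_stim = [1.0 if (i % 30) < 15 else 0.0 for i in range(n_scans)]
event_stim = [1.0 if i in (5, 22, 41) else 0.0 for i in range(n_scans)]

def convolve_with_hrf(stim):
    # causal convolution of the stimulus train with a 16-sample HRF kernel
    kernel = [hrf(t * TR) for t in range(16)]
    out = []
    for i in range(len(stim)):
        out.append(sum(stim[i - j] * kernel[j] for j in range(min(i + 1, len(kernel)))))
    return out

block_reg = convolve_with_hrf(block_stim)
event_reg = convolve_with_hrf(event_stim)
```

The block regressor plateaus during sustained processing while the event regressor produces transient peaks per trial, which is why the two analyses highlight different regions.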

  20. [Fetal bradycardia: a retrospective study in 9 Spanish centers].

    PubMed

    Perin, F; Rodríguez Vázquez del Rey, M M; Deiros Bronte, L; Ferrer Menduiña, Q; Rueda Nuñez, F; Zabala Arguelles, J I; García de la Calzada, D; Teodoro Marin, S; Centeno Malfaz, F; Galindo Izquierdo, A

    2014-11-01

    The aim of this study is to review the current management and outcomes of fetal bradycardia in 9 Spanish centers. Retrospective multicenter study: analysis of all fetuses with bradycardia diagnosed between January 2008 and September 2010. The underlying mechanisms of the fetal bradyarrhythmias were studied with echocardiography. A total of 37 cases were registered: 3 sinus bradycardia, 15 blocked atrial bigeminy, and 19 high-grade atrioventricular blocks. Sinus bradycardia: all 3 cases (100%) were associated with serious diseases. Blocked atrial bigeminy had an excellent outcome, except for one case with postnatal tachyarrhythmia. Of the atrioventricular blocks, 16% were related to congenital heart defects with isomerism, 63% to the presence of maternal SSA/Ro antibodies, and 21% had unclear etiology. Overall mortality was 20% (37% if terminations of pregnancy are taken into account). Risk factors for mortality were congenital heart disease, hydrops and/or ventricular dysfunction. Management strategies differed among centers. Steroids were administered in 73% of immune-mediated atrioventricular blocks, including the only immune-mediated second-degree block. More than half (58%) of atrioventricular blocks had a pacemaker implanted within a follow-up of 18 months. Sustained fetal bradycardia requires a comprehensive study in all cases, including those with sinus bradycardia. Blocked atrial bigeminy has a good prognosis, but tachyarrhythmias may develop. Heart block has significant mortality and morbidity rates, and its management is still highly controversial. Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.

  1. Modeling study of mecamylamine block of muscle type acetylcholine receptors.

    PubMed

    Ostroumov, Konstantin; Shaikhutdinova, Asya; Skorinkin, Andrey

    2008-04-01

    The blocking action of mecamylamine on different types of nicotinic acetylcholine receptors (nAChRs) has been extensively studied and used as a tool to characterize nAChRs from different synapses. However, the mechanism of mecamylamine action has not been fully explored for all types of nAChRs. In the present study, we provide a brief description of mecamylamine action on muscle nAChRs expressed at the frog neuromuscular junction. In this preparation, mecamylamine block of nAChRs was accompanied by a use-dependent block relief induced by membrane depolarization combined with activation of nAChRs by the endogenous agonist acetylcholine (ACh). Three kinetic models of possible mecamylamine interaction with nAChRs were then analyzed: simple open-channel block, symmetrical trapping block and asymmetrical trapping block. This analysis suggested that mecamylamine action can be described by a trapping mechanism, in which the antagonist remains inside the channel even in the absence of bound agonist. Receptors with mecamylamine trapped inside were predicted to have a closing rate constant about three times faster than the resting one and a fast voltage-dependent unblocking rate constant. Specific experimental conditions and the morphological organization of the neuromuscular synapse were considered in simulating the time course of mecamylamine block development. Thus, as for neuronal nAChRs, a trapping mechanism determines the action of mecamylamine on synaptic neuromuscular currents evoked by the endogenous agonist ACh; the specific morphological organization of synaptic transmission, however, delays the time course of block development.
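
    Of the three kinetic models mentioned, the simplest, open-channel block, can be sketched as a two-state scheme (Open + Drug <-> Blocked) whose blocked fraction relaxes toward [D]/([D]+Kd). The rate constants and drug concentration below are hypothetical placeholders, not fitted values from the study:

```python
# minimal open-channel block scheme: Open + Drug <-> Blocked
# (hypothetical rate constants and concentration, illustration only)
k_on = 1e7      # 1/(M*s), blocking rate constant
k_off = 50.0    # 1/s, unblocking rate constant
drug = 10e-6    # 10 uM blocker

kd = k_off / k_on                     # equilibrium dissociation constant
block_eq = drug / (drug + kd)         # steady-state blocked fraction

# forward-Euler time course of the blocked fraction approaching equilibrium
dt, b = 1e-5, 0.0
trace = []
for _ in range(2000):                 # 20 ms of simulated time
    b += (k_on * drug * (1.0 - b) - k_off * b) * dt
    trace.append(b)
```

Trapping models extend this scheme with closed-blocked states that retain the drug after the channel shuts, which is what produces the use- and voltage-dependent relief described in the abstract.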

  2. A comparative analysis of image features between weave embroidered Thangka and piles embroidered Thangka

    NASA Astrophysics Data System (ADS)

    Li, Zhenjiang; Wang, Weilan

    2018-04-01

    Thangka is a treasure of Tibetan culture. In its digital preservation, most current research focuses on the content of Thangka images rather than the fabrication process. For the silk embroidered Thangka of "Guo Tang", there are two craft methods: weave embroidery and piles embroidery. The local texture of weave embroidered Thangka is rough, while that of piles embroidered Thangka is smoother. In order to distinguish these two fabrication processes from images, an effective color-block segmentation algorithm is designed first, and the obtained color blocks contain the local texture patterns of the Thangka image. Secondly, the local texture features of the color blocks are extracted and screened. Finally, the selected features are analyzed experimentally. The experimental analysis shows that the proposed features effectively reflect the difference between the weave embroidered and piles embroidered methods.

  3. Development blocks in innovation networks: The Swedish manufacturing industry, 1970-2007.

    PubMed

    Taalbi, Josef

    2017-01-01

    The notion of development blocks (Dahmén, 1950, 1991) suggests the co-evolution of technologies and industries through complementarities and the overcoming of imbalances. This study proposes and applies a methodology to analyse development blocks empirically. To assess the extent and character of innovational interdependencies between industries the study combines analysis of innovation biographies and statistical network analysis. This is made possible by using data from a newly constructed innovation output database for Sweden. The study finds ten communities of closely related industries in which innovation activity has been prompted by the emergence of technological imbalances or by the exploitation of new technological opportunities. The communities found in the Swedish network of innovation are shown to be stable over time and often characterized by strong user-supplier interdependencies. These findings serve to stress how historical imbalances and opportunities are key to understanding the dynamics of the long-run development of industries and new technologies.

  4. SCA with rotation to distinguish common and distinctive information in linked data.

    PubMed

    Schouteden, Martijn; Van Deun, Katrijn; Pattyn, Sven; Van Mechelen, Iven

    2013-09-01

    Often data are collected that consist of different blocks that all contain information about the same entities (e.g., items, persons, or situations). In order to unveil both information that is common to all data blocks and information that is distinctive for one or a few of them, an integrated analysis of the whole of all data blocks may be most useful. Interesting classes of methods for such an approach are simultaneous-component and multigroup factor analysis methods. These methods yield dimensions underlying the data at hand. Unfortunately, however, in the results from such analyses, common and distinctive types of information are mixed up. This article proposes a novel method to disentangle the two kinds of information, by making use of the rotational freedom of component and factor models. We illustrate this method with data from a cross-cultural study of emotions.

  5. Tectonic analysis of folds in the Colorado plateau of Arizona

    NASA Technical Reports Server (NTRS)

    Davis, G. H.

    1975-01-01

    Structural mapping and analysis of folds in Phanerozoic rocks in northern Arizona, using LANDSAT-1 imagery, yielded information for a tectonic model useful in identifying regional fracture zones within the Colorado Plateau tectonic province. Since the monoclines within the province developed as a response to differential movements of basement blocks along high-angle faults, the monoclinal fold pattern records the position and trend of many elements of the regional fracture system. The Plateau is divided into a mosaic of complex, polyhedral crustal blocks whose steeply dipping faces correspond to major fracture zones. Zones of convergence and changes in the trend of the monoclinal traces reveal the corners of the blocks. Igneous (and salt) diapirs have been emplaced into many of the designated zones of crustal weakness. As loci of major fracturing, folding, and probably facies changes, the fractures exert control on the entrapment of oil and gas.

  6. Dynamic Sliding Analysis of a Gravity Dam with Fluid-Structure-Foundation Interaction Using Finite Elements and Newmark's Sliding Block Analysis

    NASA Astrophysics Data System (ADS)

    Goldgruber, Markus; Shahriari, Shervin; Zenz, Gerald

    2015-11-01

    To reduce natural hazard risks due to, e.g., earthquake excitation, seismic safety assessments are carried out. Especially under severe loading, due to the maximum credible or so-called safety evaluation earthquake, critical infrastructure such as high dams must not fail. However, under high loading local failure may be allowed as long as the entire structure does not collapse. Hence, for a dam, the loss of sliding stability during a short time period might be acceptable if the cumulative displacements after an event remain below an acceptable value. This consideration applies not only to gravity dams but also to rock blocks, as sliding is even more imminent in zones with higher seismic activity. Sliding modes can occur not only at the dam-foundation contact, but also in sliding planes formed by geological conditions. This work compares the qualitatively possible and critical displacements obtained with two methods: the well-known Newmark sliding block analysis and a fluid-foundation-structure interaction simulation with the finite element method. A comparison of the maximum displacements at the end of the seismic event shows that the two methods agree fairly closely for high friction angles, whereas the results differ more for low friction angles. The conclusion is that the commonly used Newmark sliding block analysis and the finite element simulation are only comparable for high friction angles, where this factor dominates the behaviour of the structure. It is worth mentioning that the proposed simulation methods are also applicable to dynamic rock wedge problems, not only to dams.
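
    Newmark's sliding block analysis referenced above reduces to a simple numerical integration: the block accumulates relative velocity whenever ground acceleration exceeds the yield acceleration, and keeps sliding until that velocity decays to zero. A minimal sketch with a synthetic acceleration pulse (the yield acceleration and pulse are illustrative, not values from the study):

```python
import math

def newmark_sliding_displacement(accel, dt, a_yield):
    """Cumulative sliding displacement of a rigid block on a plane
    (classic Newmark rigid-block integration).

    accel   : ground acceleration time history [m/s^2]
    dt      : time step [s]
    a_yield : yield acceleration of the block [m/s^2]
    """
    v = 0.0  # relative sliding velocity
    d = 0.0  # accumulated sliding displacement
    for a in accel:
        if v > 0.0 or a > a_yield:
            # block slides: relative acceleration = driving minus resisting
            v += (a - a_yield) * dt
            if v < 0.0:          # sliding stops; velocity cannot reverse
                v = 0.0
            d += v * dt
    return d

# synthetic half-sine acceleration pulse exceeding the yield acceleration
dt = 0.001
pulse = [3.0 * math.sin(math.pi * i * dt / 0.5) for i in range(500)]
disp = newmark_sliding_displacement(pulse, dt, a_yield=1.0)
```

If the yield acceleration is never exceeded the block accumulates no displacement, which is the regime where the rigid-block method and a full finite element model are expected to agree best.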

  7. Lung evolution as a cipher for physiology

    PubMed Central

    Torday, J. S.; Rehan, V. K.

    2009-01-01

    In the postgenomic era, we need an algorithm to readily translate genes into physiologic principles. The failure to advance biomedicine is due to the false hope raised in the wake of the Human Genome Project (HGP) by the promise of systems biology as a ready means of reconstructing physiology from genes. Like the atom in physics, the cell, not the gene, is the smallest completely functional unit of biology. Trying to reassemble gene regulatory networks without accounting for this fundamental feature of evolution will result in a genomic atlas, but not an algorithm for functional genomics. For example, the evolution of the lung can be “deconvoluted” by applying cell-cell communication mechanisms to all aspects of lung biology: development, homeostasis, and regeneration/repair. Gene regulatory networks common to these processes predict ontogeny, phylogeny, and the disease-related consequences of failed signaling. This algorithm elucidates characteristics of vertebrate physiology as a cascade of emergent and contingent cellular adaptational responses. By reducing complex physiological traits to gene regulatory networks and arranging them hierarchically in a self-organizing map, like the periodic table of elements in physics, the first principles of physiology will emerge. PMID:19366785

  8. The Development of a Portable Hard Disk Encryption/Decryption System with a MEMS Coded Lock.

    PubMed

    Zhang, Weiping; Chen, Wenyuan; Tang, Jian; Xu, Peng; Li, Yibin; Li, Shengyong

    2009-01-01

    In this paper, a novel portable hard-disk encryption/decryption system with a MEMS coded lock is presented, which can authenticate the user and provide the key for the AES encryption/decryption module. The portable hard-disk encryption/decryption system is composed of the authentication module, the USB portable hard-disk interface card, the ATA protocol command decoder module, the data encryption/decryption module, the cipher key management module, the MEMS coded lock controlling circuit module, the MEMS coded lock and the hard disk. The ATA protocol circuit, the MEMS control circuit and the AES encryption/decryption circuit are designed and realized on an FPGA (Field Programmable Gate Array). The MEMS coded lock, with two couplers and two groups of counter-meshing gears (CMGs), is fabricated by a LIGA-like process and precision engineering methods. The whole prototype was fabricated and tested. The test results show that the user's password could be correctly discriminated by the MEMS coded lock, and the AES encryption module could get the key from the MEMS coded lock. Moreover, the data in the hard disk could be encrypted or decrypted, and the read-write speed of the dataflow could reach 17 MB/s in Ultra DMA mode.
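
    The abstract does not describe how the cipher key management module derives the AES key from the coded-lock authentication; one common approach, shown here purely as an assumption, is to stretch a short passcode (such as a coded-lock combination) into a 256-bit key with PBKDF2:

```python
import hashlib

def derive_aes_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches a short passcode into a 256-bit AES key;
    # the salt would be stored alongside the encrypted disk header
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations, dklen=32)

salt = b"\x01" * 16            # hypothetical per-disk salt
key = derive_aes_key("4-7-2-9", salt)
```

The iteration count slows brute-force guessing of short combinations; the actual paper's key path (a key released by the MEMS lock hardware) may differ entirely from this software sketch.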

  9. A Lightweight Protocol for Secure Video Streaming

    PubMed Central

    Morkevicius, Nerijus; Bagdonas, Kazimieras

    2018-01-01

    The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point-to-point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing “Fog Node-End Device” layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy-efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard. PMID:29757988
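
    The central mechanism, embedding an HMAC tag in each datagram so receivers can verify source authenticity and integrity, can be sketched with Python's standard library. The packet layout, key, and field sizes below are illustrative assumptions, not the protocol's actual wire format:

```python
import hashlib
import hmac
import struct

KEY = b"shared-session-key"  # hypothetical pre-shared symmetric key

def make_packet(seq: int, payload: bytes) -> bytes:
    # header: 4-byte sequence number; the tag authenticates header + payload
    header = struct.pack("!I", seq)
    tag = hmac.new(KEY, header + payload, hashlib.sha256).digest()
    return header + tag + payload

def verify_packet(packet: bytes):
    # split: 4-byte header, 32-byte SHA-256 HMAC tag, rest is payload
    header, tag, payload = packet[:4], packet[4:36], packet[36:]
    expected = hmac.new(KEY, header + payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None  # drop forged or corrupted datagram
    (seq,) = struct.unpack("!I", header)
    return seq, payload
```

    Because verification needs only one HMAC computation per datagram, this style of check suits energy-constrained end devices better than a full handshake per connection.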

  10. Steganographic optical image encryption system based on reversible data hiding and double random phase encoding

    NASA Astrophysics Data System (ADS)

    Chuang, Cheng-Hung; Chen, Yen-Lin

    2013-02-01

    This study presents a steganographic optical image encryption system based on reversible data hiding and double random phase encoding (DRPE) techniques. Conventional optical image encryption systems can securely transmit valuable images using an encryption method for possible application in optical transmission systems. The steganographic optical image encryption system based on the DRPE technique has been investigated to hide secret data in encrypted images. However, the DRPE technique is vulnerable to attacks, and many of the data hiding methods in the DRPE system can distort the decrypted images. The proposed system, based on reversible data hiding, uses a JBIG2 compression scheme to achieve lossless decrypted image quality and perform a prior encryption process. Thus, the DRPE technique enables a more secure optical encryption process. The proposed method extracts and compresses the bit planes of the original image using the lossless JBIG2 technique. The secret data are embedded in the remaining storage space. The RSA algorithm can cipher the compressed binary bits and secret data for advanced security. Experimental results show that the proposed system achieves a high data embedding capacity and lossless reconstruction of the original images.
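
    The bit-plane extraction step is concrete: each plane of an 8-bit image is a binary image, which is exactly the kind of input JBIG2 compresses losslessly. A NumPy sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def bit_planes(img: np.ndarray):
    """Split an 8-bit grayscale image into its 8 binary bit planes;
    plane 0 is the least significant bit."""
    return [(img >> b) & 1 for b in range(8)]

def reassemble(planes):
    # lossless inverse: add each plane back at its bit position
    return sum((p.astype(np.uint8) << b) for b, p in enumerate(planes))
```

    Reassembling all eight planes reproduces the original image exactly, which is what makes the decrypted image quality lossless in this scheme.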

  11. A Lightweight Protocol for Secure Video Streaming.

    PubMed

    Venčkauskas, Algimantas; Morkevicius, Nerijus; Bagdonas, Kazimieras; Damaševičius, Robertas; Maskeliūnas, Rytis

    2018-05-14

    The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point-to-point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing "Fog Node-End Device" layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy-efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard.

  12. DICOM relay over the cloud.

    PubMed

    Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis

    2013-05-01

    Healthcare institutions worldwide have adopted picture archiving and communication system (PACS) for enterprise access to images, relying on Digital Imaging Communication in Medicine (DICOM) standards for data exchange. However, communication over a wider domain of independent medical institutions is not well standardized. A DICOM-compliant bridge was developed for extending and sharing DICOM services across healthcare institutions without requiring complex network setups or dedicated communication channels. A set of DICOM routers interconnected through a public cloud infrastructure was implemented to support medical image exchange among institutions. Despite the advantages of cloud computing, new challenges were encountered regarding data privacy, particularly when medical data are transmitted over different domains. To address this issue, a solution was introduced by creating a ciphered data channel between the entities sharing DICOM services. Two main DICOM services were implemented in the bridge: Storage and Query/Retrieve. The performance measures demonstrated that it is quite simple to exchange information and processes between several institutions. The solution can be integrated with any currently installed PACS-DICOM infrastructure. This method works transparently with well-known cloud service providers. Cloud computing was introduced to augment enterprise PACS by providing standard medical imaging services across different institutions, offering communication privacy and enabling creation of wider PACS scenarios with suitable technical solutions.

  13. A novel chaos-based image encryption algorithm using DNA sequence operations

    NASA Astrophysics Data System (ADS)

    Chai, Xiuli; Chen, Yiran; Broyde, Lucie

    2017-01-01

    An image encryption algorithm based on chaotic system and deoxyribonucleic acid (DNA) sequence operations is proposed in this paper. First, the plain image is encoded into a DNA matrix, and then a new wave-based permutation scheme is performed on it. The chaotic sequences produced by the 2D Logistic chaotic map are employed for row circular permutation (RCP) and column circular permutation (CCP). Initial values and parameters of the chaotic system are calculated from the SHA-256 hash of the plain image and the given values. Then, a row-by-row image diffusion method at the DNA level is applied. A key matrix generated from the chaotic map is used to fuse the confused DNA matrix; also the initial values and system parameters of the chaotic system are renewed by the Hamming distance of the plain image. Finally, after decoding the diffused DNA matrix, we obtain the cipher image. The DNA encoding/decoding rules of the plain image and the key matrix are determined by the plain image. Experimental results and security analyses both confirm that the proposed algorithm not only achieves an excellent encryption result but also resists various typical attacks.
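
    Two building blocks of such schemes, seeding a chaotic map from the SHA-256 hash of the plain image and using the resulting sequence for row circular permutation, can be sketched as follows. A 1D logistic map stands in here for the paper's 2D map, and the seed/shift derivations are illustrative assumptions:

```python
import hashlib
import numpy as np

def chaotic_row_shifts(img_bytes: bytes, n_rows: int):
    # seed the map from the SHA-256 of the plain image (plaintext-dependent key)
    h = hashlib.sha256(img_bytes).digest()
    x = (int.from_bytes(h[:8], "big") % 10**8) / 10**8 * 0.9 + 0.05  # x0 in (0, 1)
    shifts = []
    for _ in range(n_rows):
        x = 3.99 * x * (1 - x)          # logistic map in its chaotic regime
        shifts.append(int(x * 10**6) % 256)
    return shifts

def permute_rows(img: np.ndarray, shifts):
    # row circular permutation (RCP): roll each row by its chaotic shift
    return np.stack([np.roll(row, s) for row, s in zip(img, shifts)])
```

    Decryption inverts the step by rolling each row back by the same shift, which the receiver can regenerate from the shared seed.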

  14. [Present status and trend of heart fluid mechanics research based on medical image analysis].

    PubMed

    Gan, Jianhong; Yin, Lixue; Xie, Shenghua; Li, Wenhua; Lu, Jing; Luo, Anguo

    2014-06-01

    After introducing the current main methods of heart fluid mechanics research, we studied the characteristics and weaknesses of three primary analysis methods based on magnetic resonance imaging, color Doppler ultrasound and grayscale ultrasound images, respectively. It is pointed out that particle image velocimetry (PIV), speckle tracking and block matching have the same nature, and all three algorithms adopt block correlation. The further analysis shows that, with the development of information technology and sensors, research on cardiac function and fluid mechanics will focus on the energy transfer process of heart fluid, characteristics of the chamber wall related to blood flow, and fluid-structure interaction in future heart fluid mechanics research.

  15. SYSTID - A flexible tool for the analysis of communication systems.

    NASA Technical Reports Server (NTRS)

    Dawson, C. T.; Tranter, W. H.

    1972-01-01

    Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.
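
    The one-statement-per-block modelling style can be illustrated with a toy pipeline in Python. SYSTID itself used English-language statements and a library of functional models, so this is only an analogy, with hypothetical block names:

```python
# Each block in the diagram becomes one function; a signal flows from
# block to block, loosely mimicking one-statement-per-block modelling.
import math

def source(n):                      # sinusoid source block
    return [math.sin(2 * math.pi * k / 16) for k in range(n)]

def gain(signal, g):                # amplifier block
    return [g * s for s in signal]

def limiter(signal, lo, hi):        # saturating nonlinearity block
    return [min(max(s, lo), hi) for s in signal]

# "block diagram": source -> gain -> limiter
out = limiter(gain(source(32), 3.0), -1.0, 1.0)
```

    Adding a block to the system means adding one line to the chain, which is the ease-of-modelling property the abstract emphasizes.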

  16. Decision Analysis Model for Passenger-Aircraft Fire Safety with Application to Fire-Blocking of Seats

    DTIC Science & Technology

    1984-04-01

    [Reference 50] Joint accident report on the collision of the KLM Boeing 747 PH-BUF and the Pan Am Boeing N737PA at Los Rodeos (Tenerife), 27 March 1977. Contents excerpt: 4. Sample Application: Fire-Blocking of Seats; 4.1 Expected Losses and Savings with Seat Blocking. The results were analyzed for their sensitivity to the existing materials.

  17. Reduced order feedback control equations for linear time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1981-01-01

    An algorithm was developed that can be used to obtain the reduced-order feedback control equations. In a more general context, the algorithm computes a real nonsingular similarity transformation matrix which reduces a real nonsymmetric matrix to block diagonal form, each block of which is a real quasi upper triangular matrix. The algorithm works with both defective and derogatory matrices; when it fails, the resultant output can be used as a guide for reformulating the mathematical equations that lead to the ill-conditioned matrix which could not be block diagonalized.
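
    For a diagonalizable real matrix with a complex-conjugate eigenvalue pair, such a real similarity transformation can be built from the real and imaginary parts of a complex eigenvector, producing one real 2x2 block per conjugate pair. A NumPy sketch of the idea on a fixed example, not the NASA algorithm itself:

```python
import numpy as np

# A has the complex pair ±i and the real eigenvalue 2; a real similarity
# built from Re(u) and Im(u) of a complex eigenvector u yields a real
# block-diagonal form with one 2x2 block [[a, b], [-b, a]].
A = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 2.0]])

w, V = np.linalg.eig(A)
i = np.argmax(w.imag)            # eigenvalue with positive imaginary part
u = V[:, i]                      # its complex eigenvector
j = np.argmin(np.abs(w.imag))    # the real eigenvalue (here 2)
T = np.column_stack([u.real, u.imag, V[:, j].real])  # real, nonsingular

B = np.linalg.inv(T) @ A @ T     # real block diagonal form
```

    The identity behind it: if A u = (a + bi) u with u = p + iq, then A p = a p - b q and A q = b p + a q, so [p q] spans a real invariant plane carrying the 2x2 block.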

  18. The therapeutic efficacy of sacroiliac joint blocks with triamcinolone acetonide in the treatment of sacroiliac joint dysfunction without spondyloarthropathy.

    PubMed

    Liliang, Po-Chou; Lu, Kang; Weng, Hui-Ching; Liang, Cheng-Loong; Tsai, Yu-Duan; Chen, Han-Jung

    2009-04-20

    Prospective case series. The study aimed to investigate the therapeutic efficacy of sacroiliac joint (SIJ) blocks with triamcinolone acetonide in patients with SIJ pain without spondyloarthropathy. Numerous studies have demonstrated that SIJ blocks with corticosteroid/anesthetic provide long-term pain relief in seronegative spondyloarthropathy. However, only one report on SIJ dysfunction patients without spondyloarthropathy shows promising results. We conducted a prospective observational study of patients at a University Spine Center from March 2005 to May 2006. The above-mentioned SIJ blocks were performed in 150 patients, and dual SIJ blocks confirmed SIJ pain in 39 patients (26%). Twenty-six patients (66.7%) experienced significant pain reduction for more than 6 weeks; the overall mean duration of pain reduction in these responders was 36.8 +/- 9.9 weeks. SIJ blocks were ineffective in 13 patients (33.3%); the mean duration of pain reduction in these patients was 4.4 +/- 1.8 weeks. Univariate analysis revealed that treatment failure was significantly associated with a history of lumbar/lumbosacral fusion (P = 0.03). SIJ blocks with triamcinolone acetonide are beneficial for some patients with SIJ pain without spondyloarthropathy. The SIJ blocks showed a long-lasting efficacy in two-thirds of the patients; however, the duration of its efficacy was shorter in patients with a history of lumbar/lumbosacral fusion. These findings suggest the need for further studies.

  19. Clustering of GPS velocities in the Mojave Block, southeastern California

    USGS Publications Warehouse

    Savage, James C.; Simpson, Robert W.

    2013-01-01

    We find subdivisions within the Mojave Block using cluster analysis to identify groupings in the velocities observed at GPS stations there. The clusters are represented on a fault map by symbols located at the positions of the GPS stations, each symbol representing the cluster to which the velocity of that GPS station belongs. Fault systems that separate the clusters are readily identified on such a map. The most significant representation as judged by the gap test involves 4 clusters within the Mojave Block. The fault systems bounding the clusters from east to west are 1) the faults defining the eastern boundary of the Northeast Mojave Domain extended southward to connect to the Hector Mine rupture, 2) the Calico-Paradise fault system, 3) the Landers-Blackwater fault system, and 4) the Helendale-Lockhart fault system. This division of the Mojave Block is very similar to that proposed by Meade and Hager. However, no cluster boundary coincides with the Garlock Fault, the northern boundary of the Mojave Block. Rather, the clusters appear to continue without interruption from the Mojave Block north into the southern Walker Lane Belt, similar to the continuity across the Garlock Fault of the shear zone along the Blackwater-Little Lake fault system observed by Peltzer et al. Mapped traces of individual faults in the Mojave Block terminate within the block and do not continue across the Garlock Fault [Dokka and Travis].
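
    Grouping station velocities by similarity can be illustrated with a minimal k-means loop over 2-D (east, north) velocity vectors. The paper's cluster analysis and gap test are more involved; everything below is a hypothetical sketch:

```python
import numpy as np

def kmeans(velocities, k, iters=100, seed=0):
    """Minimal k-means on 2-D station velocities: assign each station to the
    nearest cluster center, then move centers to cluster means."""
    rng = np.random.default_rng(seed)
    centers = velocities[rng.choice(len(velocities), k, replace=False)]
    for _ in range(iters):
        # distance from every station to every center
        d = np.linalg.norm(velocities[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([velocities[labels == c].mean(axis=0)
                        if np.any(labels == c) else centers[c]
                        for c in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

    Plotting each station's label at its map position would then reveal the fault systems separating the clusters, as described above.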

  20. Rapid ordering of block copolymer thin films

    DOE PAGES

    Majewski, Pawel W.; Yager, Kevin G.

    2016-08-18

    Block-copolymers self-assemble into diverse morphologies, where nanoscale order can be finely tuned via block architecture and processing conditions. However, the ultimate usage of these materials in real-world applications may be hampered by the extremely long thermal annealing times—hours or days—required to achieve good order. Here, we provide an overview of the fundamentals of block-copolymer self-assembly kinetics, and review the techniques that have been demonstrated to influence, and enhance, these ordering kinetics. We discuss the inherent tradeoffs between oven annealing, solvent annealing, microwave annealing, zone annealing, and other directed self-assembly methods, including an assessment of spatial and temporal characteristics. We also review both real-space and reciprocal-space analysis techniques for quantifying order in these systems.

  1. Contextual Compression of Large-Scale Wind Turbine Array Simulations: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while providing the user control over where data loss, and thus reduction in accuracy, in the analysis occurs. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.
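
    The saliency-dependent retention of wavelet coefficients can be sketched with a one-level Haar transform: detail coefficients are kept in salient blocks (lossless reconstruction there) and zeroed elsewhere (low-fidelity context). The mask, transform level, and function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def haar_1level(x):
    # one-level 1-D Haar transform: pairwise averages (approximation)
    # and pairwise differences (detail)
    a = (x[0::2] + x[1::2]) / 2.0
    d = (x[0::2] - x[1::2]) / 2.0
    return a, d

def inverse_haar(a, d):
    out = np.empty(a.size * 2)
    out[0::2] = a + d
    out[1::2] = a - d
    return out

def contextual_compress(x, salient):
    """Keep detail coefficients only where the saliency mask is True,
    mimicking block-wise high-fidelity vs low-fidelity storage."""
    a, d = haar_1level(x)
    d = np.where(salient, d, 0.0)   # discard detail outside salient regions
    return a, d
```

    Salient regions reconstruct exactly; non-salient regions keep only the smoothed approximation, which is the high/low-fidelity trade-off the model exposes per block.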

  2. Contextual Compression of Large-Scale Wind Turbine Array Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while providing the user control over where data loss, and thus reduction in accuracy, in the analysis occurs. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.

  3. A blocking primer increases specificity in environmental DNA detection of bull trout (Salvelinus confluentus)

    Treesearch

    Taylor M. Wilcox; Michael K. Schwartz; Kevin S. McKelvey; Michael K. Young; Winsor H. Lowe

    2014-01-01

    Environmental DNA (eDNA) is increasingly applied as a highly sensitive way to detect aquatic animals non-invasively. However, distinguishing closely related taxa can be particularly challenging. Previous studies of ancient DNA and genetic diet analysis have used blocking primers to enrich target template in the presence of abundant, non-target DNA. Here we apply a...

  4. Soil-contact decay tests using small blocks : a procedural analysis

    Treesearch

    Rodney C. De Groot; James W. Evans; Paul G. Forsyth; Camille M. Freitag; Jeffrey J. Morrell

    Much discussion has been held regarding the merits of laboratory decay tests compared with field tests to evaluate wood preservatives. In this study, procedural aspects of soil jar decay tests with 1-cm³ blocks were critically examined. Differences among individual bottles were a major source of variation in this method. The reproducibility and sensitivity of the soil...

  5. The Impact of Block Scheduling on Various Indicators of School Success.

    ERIC Educational Resources Information Center

    Nichols, Joe D.

    This project focused on the collection and analysis of longitudinal student data generated by six high schools from a large urban school system in the Midwest. Two of the schools recently converted to a 4 × 4 scheduling structure, while three additional schools have used a block-8 scheduling structure for a number of years. One school maintains a…

  6. Analysis of Pre-treatment Woody Vegetation and Environmental Data for the Missouri Ozark Forest Ecosystem Project

    Treesearch

    John M. Kabrick; David R. Larsen; Stephen R. Shifley

    1997-01-01

    We conducted a study to identify pre-treatment trends in woody species density, diameter, and basal area among MOFEP sites, blocks, and treatment areas; relate woody species differences among sites, blocks, and treatment areas to differences in environmental conditions; and identify potential treatment response differences based upon our findings. Sites 2 through 5 had...

  7. Dimensions of driving anger and their relationships with aberrant driving.

    PubMed

    Zhang, Tingru; Chan, Alan H S; Zhang, Wei

    2015-08-01

    The purpose of this study was to investigate the relationship between driving anger and aberrant driving behaviours. An internet-based questionnaire survey was administered to a sample of Chinese drivers, with driving anger measured by a 14-item short Driving Anger Scale (DAS) and the aberrant driving behaviours measured by a 23-item Driver Behaviour Questionnaire (DBQ). The results of Confirmatory Factor Analysis demonstrated that the three-factor model (hostile gesture, arrival-blocking and safety-blocking) of the DAS fitted the driving anger data well. The Exploratory Factor Analysis on DBQ data differentiated four types of aberrant driving, viz. emotional violation, error, deliberate violation and maintaining progress violation. For the anger-aberration relation, it was found that only "arrival-blocking" anger was a significant positive predictor for all four types of aberrant driving behaviours. The "safety-blocking" anger revealed a negative impact on deliberate violations, a finding different from previously established positive anger-aberration relation. These results suggest that drivers with different patterns of driving anger would show different behavioural tendencies and as a result intervention strategies may be differentially effective for drivers of different profiles. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Density-cluster NMA: A new protein decomposition technique for coarse-grained normal mode analysis.

    PubMed

    Demerdash, Omar N A; Mitchell, Julie C

    2012-07-01

    Normal mode analysis has emerged as a useful technique for investigating protein motions on long time scales. This is largely due to the advent of coarse-graining techniques, particularly Hooke's Law-based potentials and the rotational-translational blocking (RTB) method for reducing the size of the force-constant matrix, the Hessian. Here we present a new method for domain decomposition for use in RTB that is based on hierarchical clustering of atomic density gradients, which we call Density-Cluster RTB (DCRTB). The method reduces the number of degrees of freedom by 85-90% compared with the standard blocking approaches. We compared the normal modes from DCRTB against standard RTB using 1-4 residues in sequence in a single block, with good agreement between the two methods. We also show that Density-Cluster RTB and standard RTB perform well in capturing the experimentally determined direction of conformational change. Significantly, we report superior correlation of DCRTB with B-factors compared with 1-4 residue per block RTB. Finally, we show significant reduction in computational cost for Density-Cluster RTB that is nearly 100-fold for many examples. Copyright © 2012 Wiley Periodicals, Inc.
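
    The decomposition idea, hierarchically clustering atoms and treating each cluster as one rigid block, can be sketched with naive single-linkage agglomeration on atom coordinates. DCRTB clusters atomic density gradients, a different and richer criterion, so this is a simplified stand-in:

```python
import numpy as np

def cluster_blocks(coords, n_blocks):
    """Single-linkage agglomerative clustering of atom coordinates into
    rigid blocks: repeatedly merge the two closest clusters until only
    n_blocks remain. O(n^3) toy version for small systems."""
    clusters = [[i] for i in range(len(coords))]
    while len(clusters) > n_blocks:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between closest member atoms
                d = min(np.linalg.norm(coords[a] - coords[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters
```

    Each resulting cluster would then contribute six rigid-body degrees of freedom (three rotations, three translations) to the reduced Hessian, which is where the 85-90% reduction comes from.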

  9. Concept analysis and the building blocks of theory: misconceptions regarding theory development.

    PubMed

    Bergdahl, Elisabeth; Berterö, Carina M

    2016-10-01

    The purpose of this article is to discuss the attempts to justify concept analysis as a way to construct theory - a notion often advocated in nursing. The notion that concepts are the building blocks or threads from which theory is constructed is often repeated. It can be found in many articles and well-known textbooks. However, this notion is seldom explained or defended. The notion of concepts as building blocks has also been questioned by several authors. However, most of these authors seem to agree to some degree that concepts are essential components from which theory is built. Discussion paper. Literature was reviewed to synthesize and debate current knowledge. Our point is that theory is not built by concept analysis or clarification, and we will show that this notion has its basis in some serious misunderstandings. We argue that concept analysis is not a part of sound scientific method and should be abandoned. The current methods of concept analysis in nursing have no foundation in philosophy of science or in language philosophy. The type of concept analysis performed in nursing is not a way to 'construct' theory. Rather, theories are formed by creative endeavour to propose a solution to a scientific and/or practical problem. The bottom line is that the current style and form of concept analysis in nursing should be abandoned in favour of methods in line with modern theory of science. © 2016 John Wiley & Sons Ltd.

  10. The iSelect 9 K SNP analysis revealed polyploidization induced revolutionary changes and intense human selection causing strong haplotype blocks in wheat.

    PubMed

    Hao, Chenyang; Wang, Yuquan; Chao, Shiaoman; Li, Tian; Liu, Hongxia; Wang, Lanfen; Zhang, Xueyong

    2017-01-30

    A Chinese wheat mini core collection was genotyped using the wheat 9 K iSelect SNP array. In total, 2420 and 2396 polymorphic SNPs were detected on the A and the B genome chromosomes, which formed 878 haplotype blocks. There were more blocks in the B genome, but the average block size was significantly (P < 0.05) smaller than that in the A genome. Intense selection (domestication and breeding) had a stronger effect on the A than on the B genome chromosomes. Based on the genetic pedigrees, many blocks can be traced back to a well-known Strampelli cross, which was made one century ago. Furthermore, polyploidization of wheat (both tetraploidization and hexaploidization) induced revolutionary changes in both the A and the B genomes, with a greater increase of gene diversity compared to their diploid ancestors. Modern breeding has dramatically increased diversity in the gene coding regions, though obvious blocks were formed on most of the chromosomes in both tetraploid and hexaploid wheats. Tag-SNP markers identified in this study can be used for marker assisted selection using haplotype blocks as a wheat breeding strategy. This strategy can also be employed to facilitate genome selection in other self-pollinating crop species.
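
    Haplotype-block calling from SNP genotypes can be sketched with a greedy rule: extend a block while linkage disequilibrium (here, squared correlation r² between adjacent SNPs) stays above a threshold. The threshold and the adjacent-only r² rule are toy assumptions, not the method used in the study:

```python
import numpy as np

def adjacent_r2(genotypes):
    # squared Pearson correlation between neighbouring SNP columns (0/1 alleles)
    g = genotypes.astype(float)
    r = [np.corrcoef(g[:, i], g[:, i + 1])[0, 1] for i in range(g.shape[1] - 1)]
    return np.array(r) ** 2

def greedy_blocks(genotypes, r2_min=0.8):
    """Group consecutive SNPs into one block while adjacent LD stays high;
    start a new block whenever r^2 drops below the threshold."""
    r2 = adjacent_r2(genotypes)
    blocks, start = [], 0
    for i, v in enumerate(r2):
        if v < r2_min:
            blocks.append(list(range(start, i + 1)))
            start = i + 1
    blocks.append(list(range(start, genotypes.shape[1])))
    return blocks
```

    A tag SNP would then be one representative marker chosen per block, which is what makes block-based marker-assisted selection cheaper than typing every SNP.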

  11. Wrinkle surface instability of an inhomogeneous elastic block with graded stiffness

    NASA Astrophysics Data System (ADS)

    Yang, Shengyou; Chen, Yi-chao

    2017-04-01

    Surface instabilities have been studied extensively for both homogeneous materials and film/substrate structures but relatively less for materials with continuously varying properties. This paper studies wrinkle surface instability of a graded neo-Hookean block with exponentially varying modulus under plane strain by using the linear bifurcation analysis. We derive the first variation condition for minimizing the potential energy functional and solve the linearized equations of equilibrium to find the necessary conditions for surface instability. It is found that for a homogeneous block or an inhomogeneous block with increasing modulus from the surface, the critical stretch for surface instability is 0.544 (0.456 strain), which is independent of the geometry and the elastic modulus on the surface of the block. This critical stretch coincides with that reported by Biot (1963 Appl. Sci. Res. 12, 168-182. (doi:10.1007/BF03184638)) 53 years ago for the onset of wrinkle instabilities in a half-space of homogeneous neo-Hookean materials. On the other hand, for an inhomogeneous block with decreasing modulus from the surface, the critical stretch for surface instability ranges from 0.544 to 1 (0-0.456 strain), depending on the modulus gradient, and the length and height of the block. This sheds light on the effects of the material inhomogeneity and structural geometry on surface instability.

  12. Teaching strategies and student achievement in high school block scheduled biology classes

    NASA Astrophysics Data System (ADS)

    Louden, Cynthia Knapp

    The objectives of this study included determining whether teachers in block or traditionally scheduled biology classes (1) implement inquiry-based instruction more often or with different methods, (2) understand the concept of inquiry-based instruction as it is described in the National Science Standards, (3) have classes with significantly different student achievement, and (4) believe that their school schedule facilitates their use of inquiry-based instruction in the classroom. Biology teachers in block and non-block scheduled classes were interviewed, surveyed, and observed to determine the degree to which they implement inquiry-based instructional practices in their classrooms. State biology exams were used to indicate student achievement. Teachers in block scheduled and traditional classes used inquiry-based instruction with nearly the same frequency. Approximately 30% of all teachers do not understand the concept of inquiry-based instruction as described by the National Science Standards. No significant achievement differences between block and traditionally scheduled biology classes were found using ANCOVA analyses and a nonequivalent control-group quasi-experimental design. Using the same analysis techniques, significant achievement differences were found between biology classes with teachers who used inquiry-based instruction frequently and infrequently. Teachers in block schedules believed that their schedules facilitated inquiry-based instruction more than teachers in traditional schedules.

  13. Comparison of success rates, learning curves, and inter-subject performance variability of robot-assisted and manual ultrasound-guided nerve block needle guidance in simulation.

    PubMed

    Morse, J; Terrasini, N; Wehbe, M; Philippona, C; Zaouter, C; Cyr, S; Hemmerling, T M

    2014-06-01

    This study focuses on a recently developed robotic nerve block system and its impact on learning regional anaesthesia skills. We compared success rates, learning curves, performance times, and inter-subject performance variability of robot-assisted vs manual ultrasound (US)-guided nerve block needle guidance. The hypothesis of this study is that robot assistance will result in faster skill acquisition than manual needle guidance. Five co-authors with different experience with nerve blocks and the robotic system performed both manual and robot-assisted, US-guided nerve blocks on two different nerves of a nerve phantom. Ten trials were performed for each of the four procedures. Time taken to move from a shared starting position till the needle was inserted into the target nerve was defined as the performance time. A successful block was defined as the insertion of the needle into the target nerve. Average performance times were compared using analysis of variance. P<0.05 was considered significant. Data presented as mean (standard deviation). All blocks were successful. There were significant differences in performance times between co-authors to perform the manual blocks, either superficial (P=0.001) or profound (P=0.0001); no statistical difference between co-authors was noted for the robot-assisted blocks. Linear regression indicated that the average decrease in time between consecutive trials for robot-assisted blocks of 1.8 (1.6) s was significantly (P=0.007) greater than the decrease for manual blocks of 0.3 (0.3) s. Robot assistance of nerve blocks allows for faster learning of needle guidance over manual positioning and reduces inter-subject performance variability. © The Author [2014]. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Analysis of the association between isokinetic knee strength with offensive and defensive jumping capacity in high-level female volleyball athletes.

    PubMed

    Sattler, Tine; Sekulic, Damir; Esco, Michael R; Mahmutovic, Ifet; Hadzic, Vedran

    2015-09-01

Isokinetic knee strength was hypothesized to be an important factor related to jumping performance. However, studies examining this relation among elite female athletes and sport-specific jumps are lacking. This investigation determined the influence of isokinetic knee flexor/extensor strength measures on spike-jump (offensive) and block-jump (defensive) performance among high-level female volleyball players. Cross-sectional laboratory study. Eighty-two female volleyball athletes (age = 21.3 ± 3.8 years, height = 175.4 ± 6.76 cm, and weight = 68.29 ± 8.53 kg) volunteered to participate in this study. The studied variables included spike-jump and block-jump performance and a set of isokinetic tests to evaluate the eccentric and concentric strength capacities of the knee extensors (quadriceps - Q) and flexors (hamstring - H) for both legs. Both jumping tests showed high intra-session reliability (ICC of 0.87 and 0.95 for spike-jump and block-jump, respectively). The athletes were clustered into three achievement groups based on their spike-jump and block-jump performances. For the block-jump, ANOVA identified significant differences between achievement groups for all isokinetic variables except the Right-Q-Eccentric-Strength. When observed for the spike-jump, achievement groups differed significantly in all tests but Right-H-Concentric-Strength. Discriminant canonical analysis showed that the isokinetic-strength variables were more associated with block-jump than spike-jump performance. The eccentric isokinetic measures were relatively less important determinants of the block-jump than of the spike-jump performance. Data support the hypothesis of the importance of isokinetic strength measures for the expression of rapid muscular performance in volleyball. The results point to the necessity of a differential approach in sport training for defensive and offensive duties. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  15. Medical Student–Reported Outcomes of a Radiation Oncologist–Led Preclinical Course in Oncology: A Five-Year Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Ankit; Koottappillil, Brian; Shah, Bhartesh

Purpose: There is a recognized need for more robust training in oncology for medical students. At our institution, we have offered a core dedicated oncology block, led by a radiation oncologist course director, during the second year of the medical school curriculum since the 2008-2009 academic year. Herein, we report the outcomes of the oncology block over the past 5 years through an analysis of student perceptions of the course, both immediately after completion of the block and in the third year. Methods and Materials: We analyzed 2 separate surveys. The first assessed student impressions of how well the course met each of its learning objectives through a survey that was administered to students immediately after the oncology block in 2012. The second was administered after students completed the oncology block during the required radiology clerkship in the third year. All questions used a 5-level Likert scale and were analyzed by use of a Wilcoxon signed-rank test. Results: Of the 169 students who took the oncology course in 2012, 127 (75.1%) completed the course feedback survey. Over 73% of students agreed or strongly agreed that the course met its 3 learning objectives. Of the 699 medical students who took the required radiology clerkship between 2010 and 2013, 538 participated in the second survey, for a total response rate of 77%. Of these students, 368 (68.4%) agreed or strongly agreed that the course was effective in contributing to their overall medical education. Conclusion: Student perceptions of the oncology block are favorable and have improved across multiple categories since the inception of the course. Students self-reported that a dedicated preclinical oncology block was effective in helping identify the basics of cancer therapy and laying the foundation for clinical electives in oncology, including radiation oncology.

  16. Neuromuscular blocking agent administration for emergent tracheal intubation is associated with decreased prevalence of procedure-related complications.

    PubMed

    Wilcox, Susan R; Bittner, Edward A; Elmer, Jonathan; Seigel, Todd A; Nguyen, Nicole Thuy P; Dhillon, Anahat; Eikermann, Matthias; Schmidt, Ulrich

    2012-06-01

    Emergent intubation is associated with a high rate of complications. Neuromuscular blocking agents are routinely used in the operating room and emergency department to facilitate intubation. However, use of neuromuscular blocking agents during emergent airway management outside of the operating room and emergency department is controversial. We hypothesized that the use of neuromuscular blocking agents is associated with a decreased prevalence of hypoxemia and reduced rate of procedure-related complications. Five hundred sixty-six patients undergoing emergent intubations in two tertiary care centers, Massachusetts General Hospital, Boston, MA, and the University of California Los Angeles, Ronald Reagan Medical Center, Los Angeles, CA, were enrolled in a prospective, observational study. The 112 patients intubated during cardiopulmonary resuscitation were excluded, leaving 454 patients for analysis. All intubations were supervised by attendings trained in Critical Care Medicine. We measured intubating conditions, oxygen saturation during and 5 mins following intubation. We assessed the prevalence of procedure-related complications defined as esophageal intubation, traumatic intubation, aspiration, dental injury, and endobronchial intubation. The use of neuromuscular blocking agents was associated with a lower prevalence of hypoxemia (10.1% vs. 17.4%, p = .022) and a lower prevalence of procedure-related complications (3.1% vs. 8.3%, p = .012). This association persisted in a multivariate analysis, which controlled for airway grade, sedation, and institution. Use of neuromuscular blocking agents was associated with significantly improved intubating conditions (laryngeal view, p = .014; number of intubation attempts, p = .049). 
After controlling for the number of intubation attempts and laryngoscopic view, muscle relaxant use is an independent predictor of complications associated with emergency intubation (p = .037), and there is a trend towards improvement of oxygenation (p = .07). The use of neuromuscular blocking agents, when used by intensivists with a high level of training and experience, is associated with a decrease in procedure-related complications.

  17. High-dose versus low-dose local anaesthetic for transversus abdominis plane block post-Caesarean delivery analgesia: a meta-analysis.

    PubMed

    Ng, S C; Habib, A S; Sodha, S; Carvalho, B; Sultan, P

    2018-02-01

The optimal local-anaesthetic (LA) dose for transversus-abdominis-plane (TAP) block is unclear. In this meta-analysis, we aimed to determine whether TAP blocks for Caesarean delivery (CD) with low-dose (LD) LA demonstrated non-inferiority in terms of analgesic efficacy, compared with high-dose (HD) LA. A literature search was performed for randomised controlled trials examining the analgesic efficacy of TAP blocks vs control after CD. The different dosing used in these studies was classified as HD or LD (bupivacaine equivalents >50 or ≤50 mg per block side, respectively). The pooled results of each dose group vs control were indirectly compared using the Q test. The primary outcome was 24 h opioid consumption. Secondary outcomes included 6 and 24 h postoperative pain scores, time to first analgesia, 6 h opioid consumption, opioid-related side-effects, and maternal satisfaction. Fourteen studies consisting of 770 women (389 TAP and 381 control) were included. Compared with controls, the 24 h opioid consumption (milligram morphine equivalents) was lower in HD [mean difference (MD) -22.41, 95% confidence interval (CI) -38.56 to -6.26; P=0.007; I² = 93%] and LD [MD -16.29, 95% CI -29.74 to -2.84; P=0.02; I² = 98%] TAP groups. However, no differences were demonstrated between the HD and LD groups (P=0.57). There were also no differences between the HD and LD groups for the 6 h opioid consumption, time to first analgesia, 6 and 24 h pain scores, postoperative nausea and vomiting, pruritus, and maternal satisfaction. Low-dose TAP blocks for Caesarean delivery provide analgesia and opioid-sparing effects comparable with the high-dose blocks. This suggests that lower doses can be used to reduce local anaesthetic toxicity risk without compromising the analgesic efficacy. Copyright © 2017 British Journal of Anaesthesia. Published by Elsevier Ltd. All rights reserved.
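
    The pooling step behind the mean differences reported above can be sketched with a fixed-effect inverse-variance model. The study-level values below are hypothetical, not the review's data, and given the high I² reported the published analysis would have used a random-effects model; this is only the core weighting idea.

```python
import math

def pooled_md(mds, ses):
    """Fixed-effect inverse-variance pooled mean difference and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]          # weight = inverse variance
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical per-study MDs (mg morphine equivalents, TAP vs control) and SEs
mds = [-25.0, -18.0, -30.0]
ses = [5.0, 8.0, 6.0]

md, se = pooled_md(mds, ses)
ci = (md - 1.96 * se, md + 1.96 * se)  # 95% confidence interval
```

    A pooled MD whose entire confidence interval lies below zero, as in the abstract's HD and LD comparisons, indicates an opioid-sparing effect relative to control.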

  18. The efficacy of adding dexamethasone, midazolam, or epinephrine to 0.5% bupivacaine in supraclavicular brachial plexus block.

    PubMed

    El-Baradey, Ghada F; Elshmaa, Nagat S

    2014-11-01

The aim was to assess the effectiveness of adding either dexamethasone or midazolam in comparison with epinephrine addition to 0.5% bupivacaine in supraclavicular brachial plexus block. This is a prospective randomized controlled observer-blinded study. This study was carried out in Tanta University Hospital on 60 patients of both sexes, American Society of Anesthesiologists physical status I and II, aged 18 to 45 years, undergoing elective upper limb surgery. All patients were anesthetized with ultrasound guided supraclavicular brachial plexus block and randomly divided into three groups (20 patients each): Group E (epinephrine): 30 mL bupivacaine 0.5% with 1:200,000 epinephrine (5 μg/mL). Group D (dexamethasone): 30 mL bupivacaine 0.5% and dexamethasone 8 mg. Group M (midazolam): 30 mL bupivacaine 0.5% and midazolam 50 μg/kg. The primary outcome measures were onset and duration of sensory and motor block and time to first analgesic request. The Windows version of SPSS 11.0.1 (SPSS Inc., Chicago, IL, USA) was used for statistical analysis. Data were presented as mean ± standard deviation; analysis of variance (ANOVA) was used to compare the three groups, and the Scheffe test was used after ANOVA. P < 0.05 was considered to be statistically significant. Onset of sensory and motor block was significantly rapid (P < 0.05) in Groups D and M in comparison with Group E. Time of administration of rescue analgesic and duration of sensory and motor block showed significant increase (P < 0.05) in Group D in comparison with Group M, which showed significant increase (P < 0.05) in comparison with Group E. In comparison with epinephrine and midazolam, the addition of dexamethasone to bupivacaine gave a more rapid onset of block and a longer time to first analgesic request, with fewer side-effects.

  19. Small RNA populations revealed by blocking rRNA fragments in Drosophila melanogaster reproductive tissues

    PubMed Central

    Dalmay, Tamas

    2018-01-01

RNA interference (RNAi) is a complex and highly conserved regulatory mechanism mediated via small RNAs (sRNAs). Recent technical advances in high throughput sequencing have enabled an increasingly detailed analysis of sRNA abundances and profiles in specific body parts and tissues. This enables investigations of the localized roles of microRNAs (miRNAs) and small interfering RNAs (siRNAs). However, variation in the proportions of non-coding RNAs in the samples being compared can hinder these analyses. Specific tissues may vary significantly in the proportions of fragments of longer non-coding RNAs (such as ribosomal RNA or transfer RNA) present, potentially reflecting tissue-specific differences in biological functions. For example, in Drosophila, some tissues contain a highly abundant 30 nt rRNA fragment (the 2S rRNA) as well as abundant 5’ and 3’ terminal rRNA fragments. These can pose difficulties for the construction of sRNA libraries as they can swamp the sequencing space and obscure sRNA abundances. Here we address this problem and present a modified “rRNA blocking” protocol for the construction of high-definition (HD) adapter sRNA libraries in D. melanogaster reproductive tissues. The results showed that 2S rRNAs targeted by blocking oligos were reduced from >80% to <0.01% of total reads. In addition, the use of multiple rRNA blocking oligos to bind the most abundant rRNA fragments allowed us to reveal the underlying sRNA populations at increased resolution. Side-by-side comparisons of sequencing libraries of blocked and non-blocked samples revealed that rRNA blocking did not change the miRNA populations present, but instead enhanced their abundances. We suggest that this rRNA blocking procedure offers the potential to improve the in-depth analysis of differentially expressed sRNAs within and across different tissues. PMID:29474379

  20. MHD natural convection and entropy generation in an open cavity having different horizontal porous blocks saturated with a ferrofluid

    NASA Astrophysics Data System (ADS)

    Gibanov, Nikita S.; Sheremet, Mikhail A.; Oztop, Hakan F.; Al-Salem, Khaled

    2018-04-01

In this study, natural convection combined with entropy generation of Fe3O4-water nanofluid within a square open cavity filled with two different porous blocks under the influence of a uniform horizontal magnetic field is numerically studied. Porous blocks of different thermal properties, permeability and porosity are located on the bottom wall. The bottom wall of the cavity is kept at a hot temperature Th, while the upper open boundary is at a constant cold temperature Tc and the other walls of the cavity are adiabatic. The governing equations with corresponding boundary conditions, formulated in terms of the dimensionless stream function and vorticity using the Brinkman-extended Darcy model for the porous blocks, have been solved numerically using the finite difference method. Numerical analysis has been carried out for wide ranges of Hartmann number, nanoparticle volume fraction and length of the porous blocks. It has been found that an addition of spherical ferric oxide nanoparticles can order the flow structures inside the cavity.

  1. Ensemble methods with simple features for document zone classification

    NASA Astrophysics Data System (ADS)

    Obafemi-Ajayi, Tayo; Agam, Gady; Xie, Bingqing

    2012-01-01

    Document layout analysis is of fundamental importance for document image understanding and information retrieval. It requires the identification of blocks extracted from a document image via features extraction and block classification. In this paper, we focus on the classification of the extracted blocks into five classes: text (machine printed), handwriting, graphics, images, and noise. We propose a new set of features for efficient classifications of these blocks. We present a comparative evaluation of three ensemble based classification algorithms (boosting, bagging, and combined model trees) in addition to other known learning algorithms. Experimental results are demonstrated for a set of 36503 zones extracted from 416 document images which were randomly selected from the tobacco legacy document collection. The results obtained verify the robustness and effectiveness of the proposed set of features in comparison to the commonly used Ocropus recognition features. When used in conjunction with the Ocropus feature set, we further improve the performance of the block classification system to obtain a classification accuracy of 99.21%.
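
    The ensemble idea behind the bagging classifier evaluated above can be sketched in a few lines: fit many weak classifiers on bootstrap resamples of the training zones and combine them by majority vote. The toy zone features and labels below are hypothetical and much simpler than the paper's feature set; 1-nearest-neighbour stands in for the base learner.

```python
import random

def nn_classify(train, x):
    """Predict the label of x by the 1-nearest-neighbour rule (squared Euclidean distance)."""
    nearest = min(train, key=lambda pair: sum((a - b) ** 2 for a, b in zip(pair[0], x)))
    return nearest[1]

def bagged_predict(train, x, n_models=15, seed=0):
    """Bagging: majority vote of base classifiers fit on bootstrap resamples of the data."""
    rng = random.Random(seed)
    votes = []
    for _ in range(n_models):
        bootstrap = [rng.choice(train) for _ in train]  # sample with replacement
        votes.append(nn_classify(bootstrap, x))
    return max(set(votes), key=votes.count)             # majority vote

# Toy zone features: (ink density, edge straightness) -- hypothetical values
train = [((0.10, 0.90), "text"), ((0.12, 0.85), "text"),
         ((0.55, 0.20), "image"), ((0.60, 0.25), "image"),
         ((0.15, 0.30), "handwriting"), ((0.18, 0.35), "handwriting")]

label = bagged_predict(train, (0.11, 0.88))  # a zone near the "text" cluster
```

    Boosting differs only in how the resamples are formed: instead of uniform bootstraps, later base learners up-weight the zones that earlier ones misclassified.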

  2. The importance of non-quasigeostrophic forcing during the development of a blocking anticyclone

    NASA Technical Reports Server (NTRS)

    Tsou, Chih-Hua; Smith, Phillip J.

    1990-01-01

    This study examines the impact of non-quasigeostrophic (NQG) processes during the development of a blocking anticyclone (January 21, 1979 over the southern tip of Greenland) and a precursor, upstream intense cyclone (January 18, 1979). Energy quantities and height tendencies determined from quasigeostrophic estimates are compared with the same quantities obtained from more general formulations. GLA FGGE Level III-b analysis on a 4 deg lat by 5 deg long grid was used to obtain energetics results. It is concluded that NQG processes strengthened the intensity of the block and a precursor explosive cyclone and that a portion of this increase resulted from enhanced baroclinic conversion of eddy potential to eddy kinetic energy and reduced barotropic energy conversion from eddy to zonal flow. It is suggested that NQG vorticity advection, instead of moderating wave developments, enhanced the block development, and it is also suggested that QG forcing might not have been adequate to produce the observed block development.

  3. Kinematics and mechanics of tectonic block rotations

    NASA Technical Reports Server (NTRS)

    Nur, Amos; Scotti, Oona; Ron, Hagai

    1989-01-01

    Paleomagnetic, structural geology, and rock mechanics data are combined to explore the validity of the block rotation concept and its significance. The analysis is based on data from (1) Northern Israel, where fault slip and spacing are used to predict block rotation; (2) the Mojave Desert, with well-documented strike-slip fault sets, organized in at least three major domains; (3) the Lake Mead, Nevada, fault system with well-defined sets of strike-slip faults, which, in contrast to the Mojave region, are surrounded with domains of normal faults; and (4) the San Gabriel Mountains domain with a multiple set of strike-slip faults. It is found that block rotations can have a profound influence on the interpretation of geodetic measurements and the inversion of geodetic data, especially the type collected in GPS surveys. Furthermore, block rotations and domain boundaries may be involved in creating the heterogeneities along active fault systems which are responsible for the initiation and termination of earthquake rupture.

  4. Large Deformation Analysis of a High Steep Slope Relating to the Laxiwa Reservoir, China

    NASA Astrophysics Data System (ADS)

    Lin, Peng; Liu, Xiaoli; Hu, Senying; Li, Pujian

    2016-06-01

The unstable rock slope in the Laxiwa reservoir area of the Yellow River upstream, China, shows the signs of gravitational and water-impounding induced large deformations over an area of 1.15 × 10⁵ m². Slope movements have been measured daily at more than 560 observation points since 2009, when the reservoir was first impounded. At two of these points, an average daily movement of around 60-80 mm has been observed since the beginning of impounding. Based on the observed deformations and the geology of the site, a fluid-solid coupling model was then adopted to investigate the existing rockslide activity to better understand the mechanism underlying the large deformations. The results from the field observation, kinematic analysis and numerical modeling indicate that the slope instability is dominated by the strong structurally controlled unstable rock mass. Based on an integrated overview of these analyses, a new toppling mode, i.e. the so-called 'conjugate block' mode, is proposed to explain the large deformation mechanism of the slope. The conjugate block is formed by a 'dumping block' and toppling blocks. The large deformation of the slope is dominated by (1) a toppling component and (2) a subsiding bilinear wedge induced by planar sliding along the deep-seated faults. Following a thorough numerical analysis, it is concluded that small collapses of rock blocks along the slope will be more frequent with the impounding process continuing and the water level fluctuating during the subsequent operation period. Based on a shear strength reduction method and field monitoring, four controlling faults are identified and the instability of the loose structure in the surface layer is analyzed and discussed. The factor of safety against sliding failure along the deep-seated fractures in the slope is 1.72, which reveals that (1) the collapse of the free-standing fractured blocks cannot be ruled out and the volume of the unstable blocks may be greater than 100,000 m³; (2) the collapse of the whole slope, i.e. with the volume being greater than 92 million m³, or a very large collapse involving several million m³, is considered to be of very low likelihood, unless there are extreme conditions, such as earthquakes and exceptionally heavy rain.

  5. Programmer's manual for the Mission Analysis Evaluation and Space Trajectory Operations program (MAESTRO)

    NASA Technical Reports Server (NTRS)

    Lutzky, D.; Bjorkman, W. S.

    1973-01-01

The Mission Analysis Evaluation and Space Trajectory Operations program known as MAESTRO is described. MAESTRO is an all-FORTRAN, block-style computer program designed to perform various mission control tasks. This manual is a guide to MAESTRO, providing individuals with the capability of modifying the program to suit their needs. Descriptions are presented for each of the subroutines; each description consists of an input/output description, theory, a subroutine description, and a flow chart where applicable. The programmer's manual also contains a detailed description of the common blocks, a subroutine cross reference map, and a general description of the program structure.

  6. Basic performance evaluation of a Si-PM array-based LGSO phoswich DOI block detector for a high-resolution small animal PET system.

    PubMed

    Yamamoto, Seiichi

    2013-07-01

The silicon photomultiplier (Si-PM) is a promising photodetector for PET. However, it remains unclear whether Si-PMs can be used for a depth-of-interaction (DOI) detector based on the decay time differences of the scintillator where pulse shape analysis is used. For clarification, we tested the Hamamatsu 4 × 4 Si-PM array (S11065-025P) combined with scintillators that used different decay times to develop DOI block detectors using pulse shape analysis. First, Ce-doped Gd(2)SiO(5) (GSO) scintillators of 0.5 mol% Ce were arranged in a 4 × 4 matrix and were optically coupled to the center of each pixel of the Si-PM array for measurement of the energy resolution as well as its gain variations according to the temperature. Then two types of Ce-doped Lu(1.9)Gd(0.1)SiO(5) (LGSO) scintillators, 0.025 mol% Ce (decay time: ~31 ns) and 0.75 mol% Ce (decay time: ~46 ns), were optically coupled in the DOI direction, arranged in an 11 × 7 matrix, and optically coupled to a Si-PM array for testing of the possibility of a high-resolution DOI detector. The energy resolution of the Si-PM array-based GSO block detector was 18 ± 4.4% FWHM for a Cs-137 gamma source (662 keV). Less than 1 mm crystals were clearly resolved in the position map of the LGSO DOI block detector. The peak-to-valley ratio (P/V) derived from the pulse shape spectra of the LGSO DOI block detector was 2.2. These results confirmed that Si-PM array-based DOI block detectors are promising for high-resolution small animal PET systems.
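
    The pulse-shape analysis that separates the two LGSO layers can be sketched by modelling each scintillation pulse as a single exponential and comparing the fraction of charge arriving after a fixed gate: the slower 46 ns crystal leaves more charge in the tail than the 31 ns crystal. The gate, integration window, and decision threshold below are illustrative assumptions, not the paper's values.

```python
import math

def tail_fraction(tau_ns, gate_ns=40.0, total_ns=300.0, dt=0.5):
    """Fraction of a single-exponential pulse's charge arriving after gate_ns."""
    samples = [math.exp(-i * dt / tau_ns) for i in range(int(total_ns / dt))]
    tail = sum(v for i, v in enumerate(samples) if i * dt >= gate_ns)
    return tail / sum(samples)

def doi_layer(frac, threshold=0.35):
    """Assign the depth-of-interaction layer from a pulse's tail fraction."""
    return "slow LGSO (0.75 mol% Ce)" if frac > threshold else "fast LGSO (0.025 mol% Ce)"

fast = tail_fraction(31)  # ~exp(-40/31), well below the threshold
slow = tail_fraction(46)  # ~exp(-40/46), well above the threshold
```

    In a real detector the two tail-fraction populations overlap, and the valley between their peaks (the P/V ratio quoted above) measures how cleanly the threshold separates the layers.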

  7. Continuity Clinic Model and Diabetic Outcomes in Internal Medicine Residencies: Findings of the Educational Innovations Project Ambulatory Collaborative

    PubMed Central

    Francis, Maureen D.; Julian, Katherine A.; Wininger, David A.; Drake, Sean; Bollman, KeriLyn; Nabors, Christopher; Pereira, Anne; Rosenblum, Michael; Zelenski, Amy B.; Sweet, David; Thomas, Kris; Varney, Andrew; Warm, Eric; Francis, Mark L.

    2016-01-01

    Background Efforts to improve diabetes care in residency programs are ongoing and in the midst of continuity clinic redesign at many institutions. While there appears to be a link between resident continuity and improvement in glycemic control for diabetic patients, it is uncertain whether clinic structure affects quality measures and patient outcomes. Methods This multi-institutional, cross-sectional study included 12 internal medicine programs. Three outcomes (glycemic control, blood pressure control, and achievement of target low-density lipoprotein [LDL]) and 2 process measures (A1C and LDL measurement) were reported for diabetic patients. Traditional, block, and combination clinic models were compared using analysis of covariance (ANCOVA). Analysis was adjusted for continuity, utilization, workload, and panel size. Results No significant differences were found in glycemic control across clinic models (P = .06). The percentage of diabetic patients with LDL < 100 mg/dL was 60% in block, compared to 54.9% and 55% in traditional and combination models (P = .006). The percentage of diabetic patients with blood pressure < 130/80 mmHg was 48.4% in block, compared to 36.7% and 36.9% in other models (P < .001). The percentage of diabetic patients with HbA1C measured was 92.1% in block compared to 75.2% and 82.1% in other models (P < .001). Also, the percentage of diabetic patients with LDL measured was significantly different across all groups, with 91.2% in traditional, 70.4% in combination, and 83.3% in block model programs (P < .001). Conclusions While high scores on diabetic quality measures are achievable in any clinic model, the block model design was associated with better performance. PMID:26913099

  8. Water Polo Game-Related Statistics in Women’s International Championships: Differences and Discriminatory Power

    PubMed Central

    Escalante, Yolanda; Saavedra, Jose M.; Tella, Victor; Mansilla, Mirella; García-Hermoso, Antonio; Dominguez, Ana M.

    2012-01-01

The aims of this study were (i) to compare women’s water polo game-related statistics by match outcome (winning and losing teams) and phase (preliminary, classificatory, and semi-final/bronze medal/gold medal), and (ii) identify characteristics that discriminate performances for each phase. The game-related statistics of the 124 women’s matches played in five International Championships (World and European Championships) were analyzed. Differences between winning and losing teams in each phase were determined using the chi-squared test. A discriminant analysis was then performed according to context in each of the three phases. It was found that the game-related statistics differentiate the winning from the losing teams in each phase of an international championship. The differentiating variables were both offensive (centre goals, power-play goals, counterattack goal, assists, offensive fouls, steals, blocked shots, and won sprints) and defensive (goalkeeper-blocked shots, goalkeeper-blocked inferiority shots, and goalkeeper-blocked 5-m shots). The discriminant analysis showed the game-related statistics to discriminate performance in all phases: preliminary, classificatory, and final phases (92%, 90%, and 83%, respectively). Two variables were discriminatory by match outcome (winning or losing teams) in all three phases: goals and goalkeeper-blocked shots. Key points: It was the preliminary phase in which more than one variable was involved in this differentiation, including both offensive and defensive aspects of the game. The game-related statistics were found to have a high discriminatory power in predicting the result of matches, with shots and goalkeeper-blocked shots being discriminatory variables in all three phases. Knowledge of the characteristics of women’s water polo game-related statistics of the winning teams and their power to predict match outcomes will allow coaches to take these characteristics into account when planning training and match preparation.
PMID:24149356

  9. Investigating the impact of atmospheric blocking on temperature extremes across Europe using an objective index

    NASA Astrophysics Data System (ADS)

    Brunner, Lukas; Steiner, Andrea; Sillmann, Jana

    2017-04-01

Atmospheric blocking is a key contributor to European temperature extremes. It leads to stable, long-lasting weather patterns, which favor the development of cold and warm spells. The link between blocking and such temperature extremes differs significantly across Europe. In northern Europe a majority of warm spells are connected to blocking, while cold spells are suppressed during blocked conditions. In southern Europe the opposite picture arises, with most cold spells occurring during blocking and warm spells suppressed. Building on earlier work by Brunner et al. (2017), this study aims at a better understanding of the connection between blocking and temperature extremes in Europe. We investigate cold and warm spells with and without blocking in observations from the European daily high-resolution gridded dataset (E-OBS) from 1979 to 2015. We use an objective extreme index (Russo et al. 2015) to identify and compare cold and warm spells across Europe. Our work is led by the main question: Are cold/warm spells coinciding with blocking different from cold/warm spells during unblocked conditions in regard to duration, extent, or amplitude? Here we present our research question and the study setup, and show the first results of our analysis of European temperature extremes. Brunner, L., G. Hegerl, and A. Steiner (2017): Connecting Atmospheric Blocking to European Temperature Extremes in Spring. J. Climate, 30, 585-594, doi: 10.1175/JCLI-D-16-0518.1. Russo, S., J. Sillmann, and E. M. Fischer (2015): Top ten European heatwaves since 1950 and their occurrence in the coming decades. Environ. Res. Lett., 10(12), 124003, doi: 10.1088/1748-9326/10/12/124003.
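
    The spell identification underlying such analyses can be sketched as run detection: flag every run of at least a minimum number of consecutive days exceeding a temperature threshold. The objective index of Russo et al. (2015) uses calendar-day percentile thresholds; the fixed threshold and daily values below are simplifying assumptions for illustration.

```python
def find_spells(temps, threshold, min_len=3):
    """Return (start_index, length) for each run of >= min_len consecutive
    days with temperature above threshold (a minimal warm-spell detector)."""
    spells, run = [], 0
    for i, t in enumerate(temps):
        if t > threshold:
            run += 1
        else:
            if run >= min_len:
                spells.append((i - run, run))
            run = 0
    if run >= min_len:                      # handle a spell ending at the series end
        spells.append((len(temps) - run, run))
    return spells

daily = [18, 19, 25, 26, 27, 20, 19, 24, 25, 26, 27, 18]  # °C, hypothetical
print(find_spells(daily, 23))  # [(2, 3), (7, 4)]
```

    Comparing the duration, extent, and amplitude of spells detected on blocked versus unblocked days is then a matter of partitioning these (start, length) runs by the concurrent blocking state.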

  10. Comparison of the post-operative analgesic effect of paravertebral block, pectoral nerve block and local infiltration in patients undergoing modified radical mastectomy: A randomised double-blind trial

    PubMed Central

    Syal, Kartik; Chandel, Ankita

    2017-01-01

Background and Aims: Paravertebral block, pectoral nerve (Pecs) block and wound infiltration are three modalities for post-operative analgesia following breast surgery. This study compares the analgesic efficacy of these techniques for post-operative analgesia. Methods: Sixty-five patients with American Society of Anesthesiologists’ physical status 1 or 2 undergoing modified radical mastectomy with axillary dissection were recruited for the study. All patients received 21 mL 0.5% bupivacaine with adrenaline in the technique which was performed at the end of the surgery prior to extubation. Patients in Group 1 (local anaesthetic [LA], n = 22) received infiltration at the incision site after surgery, Group 2 patients (paravertebral block [PVB], n = 22) received ultrasound-guided ipsilateral paravertebral block while Group 3 patients [PECT] (n = 21) received ultrasound-guided ipsilateral Pecs blocks I and II. Patients were evaluated for pain scores at 0, 2, 4, 6, 12 and 24 h, duration of post-operative analgesia and rescue analgesic doses required. Non-normally distributed data were analysed using the Kruskal-Wallis test, and analysis of variance was used for normally distributed data. Results: The post-operative visual analogue scale scores were lower in the PVB group compared with the others at 0, 2, 4, 12 and 24 h (P < 0.05). Mean duration of analgesia was significantly prolonged in the PVB group (P < 0.001) with lesser rescue analgesic consumption up to 24 h. Conclusion: Ultrasound-guided paravertebral block reduces post-operative pain scores, prolongs the duration of analgesia and decreases demands for rescue analgesics in the first 24 h of the post-operative period compared to ultrasound-guided Pecs block and local infiltration block. PMID:28890559

  11. Comparison of the post-operative analgesic effect of paravertebral block, pectoral nerve block and local infiltration in patients undergoing modified radical mastectomy: A randomised double-blind trial.

    PubMed

    Syal, Kartik; Chandel, Ankita

    2017-08-01

Paravertebral block, pectoral nerve (Pecs) block and wound infiltration are three modalities for post-operative analgesia following breast surgery. This study compares the analgesic efficacy of these techniques for post-operative analgesia. Sixty-five patients with American Society of Anesthesiologists' physical status 1 or 2 undergoing modified radical mastectomy with axillary dissection were recruited for the study. All patients received 21 mL 0.5% bupivacaine with adrenaline in the technique which was performed at the end of the surgery prior to extubation. Patients in Group 1 (local anaesthetic [LA], n = 22) received infiltration at the incision site after surgery, Group 2 patients (paravertebral block [PVB], n = 22) received ultrasound-guided ipsilateral paravertebral block while Group 3 patients [PECT] (n = 21) received ultrasound-guided ipsilateral Pecs blocks I and II. Patients were evaluated for pain scores at 0, 2, 4, 6, 12 and 24 h, duration of post-operative analgesia and rescue analgesic doses required. Non-normally distributed data were analysed using the Kruskal-Wallis test, and analysis of variance was used for normally distributed data. The post-operative visual analogue scale scores were lower in the PVB group compared with the others at 0, 2, 4, 12 and 24 h (P < 0.05). Mean duration of analgesia was significantly prolonged in the PVB group (P < 0.001) with lesser rescue analgesic consumption up to 24 h. Ultrasound-guided paravertebral block reduces post-operative pain scores, prolongs the duration of analgesia and decreases demands for rescue analgesics in the first 24 h of the post-operative period compared to ultrasound-guided Pecs block and local infiltration block.

  12. Molecular analysis of the Na+ channel blocking actions of the novel class I anti-arrhythmic agent RSD 921

    PubMed Central

    Pugsley, Michael K; Goldin, Alan L

    1999-01-01

    RSD 921 is a novel, structurally unique, class I Na+ channel blocking drug under development as a local anaesthetic agent and possibly for the treatment of cardiac arrhythmias. The effects of RSD 921 on wild-type heart, skeletal muscle, neuronal and non-inactivating IFMQ3 mutant neuronal Na+ channels expressed in Xenopus laevis oocytes were examined using a two-electrode voltage clamp. RSD 921 produced similarly potent tonic block of all three wild-type channel isoforms, with EC50 values between 35 and 47 μM, whereas the EC50 for block of the IFMQ3 mutant channel was 110±5.5 μM. Block of Na+ channels by RSD 921 was concentration and use-dependent, with marked frequency-dependent block of heart channels and mild frequency-dependent block of skeletal muscle, wild-type neuronal and IFMQ3 mutant channels. RSD 921 produced a minimal hyperpolarizing shift in the steady-state voltage-dependence of inactivation of all three wild-type channel isoforms. Open channel block of the IFMQ3 mutant channel was best fit with a first order blocking scheme with kon equal to 0.11±0.012×10⁶ M⁻¹ s⁻¹ and koff equal to 12.5±2.5 s⁻¹, resulting in a KD of 117±31 μM. Recovery from open channel block occurred with a time constant of 14±2.7 s⁻¹. These results suggest that RSD 921 preferentially interacts with the open state of the Na+ channel, and that the drug may produce potent local anaesthetic or anti-arrhythmic action under conditions of shortened action potentials, such as during anoxia or ischaemia. PMID:10369450
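    As a sanity check on the kinetics reported above: for a first-order open-channel blocking scheme the dissociation constant is KD = koff/kon, so the fitted rate constants should reproduce the reported KD. Recomputing from the abstract's central values:

    ```python
    # First-order open-channel block: drug + open channel <-> blocked channel
    # At equilibrium, KD = koff / kon.
    kon = 0.11e6   # association rate constant, M^-1 s^-1 (abstract value)
    koff = 12.5    # dissociation rate constant, s^-1 (abstract value)

    kd_um = koff / kon * 1e6   # dissociation constant, converted to micromolar
    print(round(kd_um, 1))     # within the reported 117 +/- 31 uM
    ```

    The central estimates give roughly 114 μM, consistent with the fitted KD of 117±31 μM reported in the abstract.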

  13. Molecular analysis of the Na+ channel blocking actions of the novel class I anti-arrhythmic agent RSD 921.

    PubMed

    Pugsley, M K; Goldin, A L

    1999-05-01

    RSD 921 is a novel, structurally unique, class I Na+ channel blocking drug under development as a local anaesthetic agent and possibly for the treatment of cardiac arrhythmias. The effects of RSD 921 on wild-type heart, skeletal muscle, neuronal and non-inactivating IFMQ3 mutant neuronal Na+ channels expressed in Xenopus laevis oocytes were examined using a two-electrode voltage clamp. RSD 921 produced similarly potent tonic block of all three wild-type channel isoforms, with EC50 values between 35 and 47 microM, whereas the EC50 for block of the IFMQ3 mutant channel was 110+/-5.5 microM. Block of Na+ channels by RSD 921 was concentration and use-dependent, with marked frequency-dependent block of heart channels and mild frequency-dependent block of skeletal muscle, wild-type neuronal and IFMQ3 mutant channels. RSD 921 produced a minimal hyperpolarizing shift in the steady-state voltage-dependence of inactivation of all three wild-type channel isoforms. Open channel block of the IFMQ3 mutant channel was best fit with a first order blocking scheme with k(on) equal to 0.11+/-0.012x10(6) M(-1) s(-1) and k(off) equal to 12.5+/-2.5 s(-1), resulting in a KD of 117+/-31 microM. Recovery from open channel block occurred with a time constant of 14+/-2.7 s(-1). These results suggest that RSD 921 preferentially interacts with the open state of the Na+ channel, and that the drug may produce potent local anaesthetic or anti-arrhythmic action under conditions of shortened action potentials, such as during anoxia or ischaemia.

  14. Liposomal bupivacaine peripheral nerve block for the management of postoperative pain.

    PubMed

    Hamilton, Thomas W; Athanassoglou, Vassilis; Trivella, Marialena; Strickland, Louise H; Mellon, Stephen; Murray, David; Pandit, Hemant G

    2016-08-25

    Postoperative pain remains a significant issue, with poor perioperative pain management associated with an increased risk of morbidity and mortality. Liposomal bupivacaine is an analgesic consisting of bupivacaine hydrochloride encapsulated within multiple, non-concentric lipid bi-layers, offering a novel method of sustained release. To assess the analgesic efficacy and adverse effects of liposomal bupivacaine infiltration peripheral nerve block for the management of postoperative pain, we identified randomised trials of liposomal bupivacaine peripheral nerve block for the management of postoperative pain. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (2016, Issue 1), Ovid MEDLINE (1946 to January Week 1 2016), Ovid MEDLINE In-Process (14 January 2016), EMBASE (1974 to 13 January 2016), ISI Web of Science (1945 to 14 January 2016), and reference lists of retrieved articles. We sought unpublished studies from Internet sources, and searched clinical trials databases for ongoing trials. The date of the most recent search was 15 January 2016. We included randomised, double-blind, placebo- or active-controlled clinical trials of a single dose of liposomal bupivacaine administered as a peripheral nerve block in adults aged 18 years or over undergoing elective surgery at any surgical site, provided they had at least two comparison groups for liposomal bupivacaine peripheral nerve block compared with placebo or other types of analgesia. Two review authors independently considered trials for inclusion in the review, assessed risk of bias, and extracted data. We performed analyses using standard statistical techniques as described in the Cochrane Handbook for Systematic Reviews of Interventions, using Review Manager 5. We planned to perform a meta-analysis; however, there were insufficient data to ensure a clinically meaningful answer, so we have produced a 'Summary of findings' table in a narrative format and, where possible, assessed the evidence using GRADE (Grading of Recommendations Assessment, Development and Evaluation). We identified seven studies that met the inclusion criteria for this review. Three were recorded as completed (or terminated) but no results were published. Of the remaining four studies (299 participants), two investigated liposomal bupivacaine transversus abdominis plane (TAP) block, one liposomal bupivacaine dorsal penile nerve block, and one ankle block. The study investigating liposomal bupivacaine ankle block was a Phase II dose-escalating/de-escalating trial presenting pooled data that we could not use in our analysis. The studies did not report our primary outcome, cumulative pain score between 0 and 72 hours, or our secondary outcomes, mean pain score at 12, 24, 48, 72, or 96 hours. One study reported no difference in mean pain score during the first, second, and third postoperative 24-hour periods in participants receiving liposomal bupivacaine TAP block compared to no TAP block. Two studies, both in people undergoing laparoscopic surgery under TAP block, investigated cumulative postoperative opioid dose and reported opposing findings: one found a lower cumulative opioid consumption between 0 and 72 hours compared to bupivacaine hydrochloride TAP block, and one found no difference during the first, second, and third postoperative 24-hour periods compared to no TAP block. No studies reported time to first postoperative opioid or the percentage of participants not requiring opioids over the initial 72 hours. No studies reported a health economic analysis or patient-reported outcome measures (other than pain). The review authors sought data regarding adverse events but none were available; however, no withdrawals were reported to be due to adverse events. Using GRADE, we considered the quality of evidence to be very low, with any estimate of effect very uncertain and further research very likely to have an important impact on our confidence in the estimate of effect. All studies were at high risk of bias due to their small sample size (fewer than 50 participants per arm), leading to uncertainty around effect estimates. Additionally, inconsistency of results and sparseness of data resulted in further downgrading of the quality of the data. A lack of evidence has prevented an assessment of the efficacy of liposomal bupivacaine administered as a peripheral nerve block. At present there is a lack of data to support or refute the use of liposomal bupivacaine administered as a peripheral nerve block for the management of postoperative pain. Further research is very likely to have an important impact on our confidence in the estimate of effect and is likely to change the estimate.

  15. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research

    PubMed Central

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D

    2015-01-01

    Background A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (validatability): inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. Objective This paper introduces a novel method called "Randomized N-gram Hashing (NHash)" for generating unique study identifiers in a distributed and validatable fashion in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk of identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, combined with additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. Methods NHash consists of two phases. First, an intermediate string is generated using randomized N-gram hashing. This string is the concatenation of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has three components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with starting position s, computed as (r mod |m|), where |m| is the length of m. The output of Phase 1 is the concatenation f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as a shift cipher. The result of the encryption, concatenated with the random number r, is the final NHash study identifier. Results We performed experiments on a large synthesized dataset comparing NHash with random strings, and demonstrated a negligible probability of collision. We implemented NHash for the Center for SUDEP Research (CSR), a National Institute of Neurological Disorders and Stroke-funded Center Without Walls for Collaborative Research in the Epilepsies. This multicenter collaboration involves 14 institutions across the United States and Europe, bringing together extensive and diverse expertise to understand sudden unexpected death in epilepsy patients (SUDEP). Conclusions The CSR Data Repository has successfully used NHash to link deidentified multimodal clinical data collected in participating CSR institutions, meeting all desired objectives of NHash. PMID:26554419
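    The two-phase construction described above can be sketched in a few lines of Python. This is an illustrative reading of the abstract, not the authors' implementation: the alphabet, the shift amount, the wrap-around behaviour when an n-gram runs past the end of the input, and the reuse of a single r and n across fields (the paper allows per-field r_i and n_i) are all assumptions made here for brevity.

    ```python
    def ngram(r, n, m):
        """Phase 1 building block: the n-gram of m starting at s = (r mod |m|).
        Wraps around the end of the string so the n-gram always has n characters."""
        s = r % len(m)
        return (m + m)[s:s + n]

    def shift_cipher(text, k):
        """Phase 2: shift cipher over lowercase letters and digits (others unchanged)."""
        alphabet = "abcdefghijklmnopqrstuvwxyz0123456789"
        table = {c: alphabet[(i + k) % len(alphabet)] for i, c in enumerate(alphabet)}
        return "".join(table.get(c, c) for c in text)

    def nhash(fields, r, n=3, shift=7):
        """Concatenate an n-gram of each field, encrypt, then append r."""
        intermediate = "".join(ngram(r, n, m) for m in fields)
        return shift_cipher(intermediate, shift) + str(r)

    # illustrative inputs only -- not real study data
    fields = ["site01", "participant", "epilepsy"]
    print(nhash(fields, r=5))
    ```

    Appending r makes the identifier self-describing enough to recompute and validate: an edit error in the n-gram portion will almost certainly fail to match a regeneration from the source fields, which is the error-tolerance property the paper emphasises.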

  16. A METHOD FOR USING BLOCKED AND EVENT-RELATED FMRI DATA TO STUDY “RESTING STATE” FUNCTIONAL CONNECTIVITY

    PubMed Central

    Fair, Damien A.; Schlaggar, Bradley L.; Cohen, Alexander L.; Miezin, Francis M.; Dosenbach, Nico U.F.; Wenger, Kristin K.; Fox, Michael D.; Snyder, Abraham Z.; Raichle, Marcus E.; Petersen, Steven E.

    2007-01-01

    Resting state functional connectivity MRI (fcMRI) has become a particularly useful tool for studying regional relationships in typical and atypical populations. Because many investigators have already obtained large datasets of task related fMRI, the ability to use this existing task data for resting state fcMRI is of considerable interest. Two classes of datasets could potentially be modified to emulate resting state data. These datasets include: 1) “interleaved” resting blocks from blocked or mixed blocked/event-related sets, and 2) residual timecourses from event-related sets that lack rest blocks. Using correlation analysis, we compared the functional connectivity of resting epochs taken from a mixed blocked/event-related design fMRI data set and the residuals derived from event-related data with standard continuous resting state data to determine which class of data can best emulate resting state data. We show that despite some differences, the functional connectivity for the interleaved resting periods taken from blocked designs is both qualitatively and quantitatively very similar to that of “continuous” resting state data. In contrast, despite being qualitatively similar to “continuous” resting state data, residuals derived from event-related design data had several distinct quantitative differences. These results suggest that the interleaved resting state data such as those taken from blocked or mixed blocked/event-related fMRI designs are well-suited for resting state functional connectivity analyses. Although using event-related data residuals for resting state functional connectivity may still be useful, results should be interpreted with care. PMID:17239622
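    The correlation analysis at the core of these fcMRI comparisons reduces to the Pearson correlation between two regional timecourses. A minimal stdlib-only sketch, using short synthetic timecourses (assumed data, not from the study):

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation between two equal-length timecourses."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # two synthetic "regional" timecourses; y loosely tracks x with small noise
    x = [0.0, 1.0, 0.5, -0.5, -1.0, 0.0, 1.0, 0.5]
    y = [0.1, 0.9, 0.6, -0.4, -0.9, 0.1, 1.1, 0.4]
    print(round(pearson_r(x, y), 3))
    ```

    In practice, correlation maps computed this way (from interleaved rest blocks, event-related residuals, or continuous rest) are what the study compares qualitatively and quantitatively across the three data classes.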

  17. US-Guided Femoral and Sciatic Nerve Blocks for Analgesia During Endovenous Laser Ablation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Saim, E-mail: ysaim@akdeniz.edu.tr; Ceken, Kagan; Alimoglu, Emel

    2013-02-15

    Endovenous laser ablation may be associated with significant pain when performed under standard local tumescent anesthesia. The purpose of this study was to investigate the efficacy of femoral and sciatic nerve blocks for analgesia during endovenous ablation in patients with lower extremity venous insufficiency. During a 28-month period, ultrasound-guided femoral or sciatic nerve blocks were performed to provide analgesia during endovenous laser ablation in 506 legs and 307 patients. The femoral block (n = 402) was performed at the level of the inguinal ligament, and the sciatic block at the posterior midthigh (n = 124), by injecting a diluted lidocaine solution under ultrasound guidance. After the blocks, endovenous laser ablations and other treatments (phlebectomy or foam sclerotherapy) were performed in the standard fashion. After the procedures, a visual analogue pain scale (1-10) was used for pain assessment. After the blocks, pain scores were 0 or 1 (no pain) in 240 legs, 2 or 3 (uncomfortable) in 225 legs, and 4 or 5 (annoying) in 41 legs. Patients never experienced any pain higher than score 5. The statistical analysis revealed no significant difference between the pain scores of the right leg versus the left leg (p = 0.321) and between the pain scores after the femoral versus sciatic block (p = 0.7). Ultrasound-guided femoral and sciatic nerve blocks may provide considerable reduction of pain during endovenous laser and other treatments, such as ambulatory phlebectomy and foam sclerotherapy. They may make these procedures more comfortable for the patient and easier for the operator.

  18. High quality tissue miniarray technique using a conventional TV/radio telescopic antenna.

    PubMed

    Elkablawy, Mohamed A; Albasri, Abdulkader M

    2015-01-01

    The tissue microarray (TMA) is widely accepted as a fast and cost-effective research tool for in situ tissue analysis in modern pathology. However, the current automated and manual TMA techniques have some drawbacks restricting their productivity. Our study aimed to introduce an improved manual tissue miniarray (TmA) technique that is simple and readily applicable to a broad range of tissue samples. In this study, a conventional TV/radio telescopic antenna was used to manually punch tissue cores from donor paraffin-embedded tissue blocks which were pre-incubated at 40°C. The cores were manually transferred, organized and attached to a standard block mould, which was then filled with liquid paraffin to construct TmA blocks without any use of recipient paraffin blocks. By using a conventional TV/radio antenna, it was possible to construct TmA paraffin blocks with variable formats of array size and number (2-mm x 42, 2.5-mm x 30, 3-mm x 24, 4-mm x 20 and 5-mm x 12 cores). Up to 2-mm x 84 cores could be mounted and stained on a standard microscopic slide by cutting two sections from two different blocks and mounting them beside each other. The technique was simple and caused minimal damage to the donor blocks. H and E and immunostained slides showed well-defined tissue morphology and array configuration. This technique is easy to reproduce, quick, inexpensive and creates uniform blocks with abundant tissue without specialized equipment. It was found to improve the stability of the cores within the paraffin block and to prevent core loss during cutting and immunostaining.

  19. Geological, geomechanical and geostatistical assessment of rockfall hazard in San Quirico Village (Abruzzo, Italy)

    NASA Astrophysics Data System (ADS)

    Chiessi, Vittorio; D'Orefice, Maurizio; Scarascia Mugnozza, Gabriele; Vitale, Valerio; Cannese, Christian

    2010-07-01

    This paper describes the results of a rockfall hazard assessment for the village of San Quirico (Abruzzo region, Italy) based on an engineering-geological model. After the collection of geological, geomechanical, and geomorphological data, the rockfall hazard assessment was performed based on two separate approaches: i) simulation of detachment of rock blocks and their downhill movement using a GIS; and ii) application of geostatistical techniques to the analysis of georeferenced observations of previously fallen blocks, in order to assess the probability of arrival of blocks due to potential future collapses. The results show that the trajectographic analysis is significantly influenced by the input parameters, with particular reference to the coefficients of restitution values. In order to solve this problem, the model was calibrated based on repeated field observations. The geostatistical approach is useful because it gives the best estimation of point-source phenomena such as rockfalls; however, the sensitivity of results to basic assumptions, e.g. assessment of variograms and choice of a threshold value, may be problematic. Consequently, interpolations derived from different variograms were computed and compared, and those showing the lowest errors were adopted. The data sets which were statistically analysed are relevant to both kinetic energy and surveyed rock blocks in the accumulation area. The obtained maps highlight areas susceptible to rock block arrivals, and show that the area accommodating the new settlement of S. Quirico Village has the highest level of hazard according to both probabilistic and deterministic methods.

  20. Diagnosis and treatment of posterior sacroiliac complex pain: a systematic review with comprehensive analysis of the published data.

    PubMed

    King, Wade; Ahmed, Shihab U; Baisden, Jamie; Patel, Nileshkumar; Kennedy, David J; Duszynski, Belinda; MacVicar, John

    2015-02-01

    To assess the evidence on the validity of sacral lateral branch blocks and the effectiveness of sacral lateral branch thermal radiofrequency neurotomy in managing sacroiliac complex pain. Systematic review with comprehensive analysis of all published data. Six reviewers searched the literature on sacral lateral branch interventions. Each assessed the methodologies of studies found and the quality of the evidence presented. The outcomes assessed were diagnostic validity and effectiveness of treatment for sacroiliac complex pain. The evidence found was appraised in accordance with the Grades of Recommendation, Assessment, Development, and Evaluation (GRADE) system of evaluating scientific evidence. The searches yielded two primary publications on sacral lateral branch blocks and 15 studies of the effectiveness of sacral lateral branch thermal radiofrequency neurotomy. One study showed multisite, multidepth sacral lateral branch blocks can anesthetize the posterior sacroiliac ligaments. Therapeutic studies show sacral lateral branch thermal radiofrequency neurotomy can relieve sacroiliac complex pain to some extent. The evidence of the validity of these blocks and the effectiveness of this treatment were rated as moderate in accordance with the GRADE system. The literature on sacral lateral branch interventions is sparse. One study demonstrates the face validity of multisite, multidepth sacral lateral branch blocks for diagnosis of posterior sacroiliac complex pain. Some evidence of moderate quality exists on therapeutic procedures, but it is insufficient to determine the indications and effectiveness of sacral lateral branch thermal radiofrequency neurotomy, and more research is required. Wiley Periodicals, Inc.
