Sample records for random keys applied

  1. Randomness determines practical security of BB84 quantum key distribution.

    PubMed

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-10

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked through imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that exploit imperfect devices, but no general security analysis model covering all of the practical attacking schemes has been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attacking models, and the results can be applied to guarantee the security of practical quantum key distribution systems.
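    As a concrete picture of the protocol these records analyze, the sifting step of BB84 can be simulated in a few lines of Python. This is a toy sketch under ideal assumptions (no eavesdropper, no channel noise); the names and parameters are illustrative, not from the paper:

```python
import random

def bb84_sift(n, seed=0):
    """Toy BB84 sifting: Alice sends n qubits with random bits and bases,
    Bob measures in random bases.  Where the bases match, Bob's outcome
    equals Alice's bit; where they differ, his outcome is uniformly random
    and the position is discarded during sifting."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]   # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

alice_key, bob_key = bb84_sift(1000)
```

    With uniformly random bases, roughly half the positions survive sifting; a weak-randomness attack in the paper's sense corresponds to these basis bits being partially predictable to the eavesdropper.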

  2. Randomness determines practical security of BB84 quantum key distribution

    PubMed Central

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-01-01

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked through imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that exploit imperfect devices, but no general security analysis model covering all of the practical attacking schemes has been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attacking models, and the results can be applied to guarantee the security of practical quantum key distribution systems. PMID:26552359

  3. Randomness determines practical security of BB84 quantum key distribution

    NASA Astrophysics Data System (ADS)

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-01

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked through imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that exploit imperfect devices, but no general security analysis model covering all of the practical attacking schemes has been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attacking models, and the results can be applied to guarantee the security of practical quantum key distribution systems.

  4. Encoding plaintext by Fourier transform hologram in double random phase encoding using fingerprint keys

    NASA Astrophysics Data System (ADS)

    Takeda, Masafumi; Nakano, Kazuya; Suzuki, Hiroyuki; Yamaguchi, Masahiro

    2012-09-01

    It has been shown that biometric information can be used as a cipher key for binary data encryption by applying double random phase encoding. In such methods, binary data are encoded in a bit pattern image, and the decrypted image becomes a plain image when the key is genuine; otherwise, decrypted images become random images. In some cases, images decrypted by imposters may not be fully random, such that the blurred bit pattern can be partially observed. In this paper, we propose a novel bit coding method based on a Fourier transform hologram, which makes images decrypted by imposters more random. Computer experiments confirm that the method increases the randomness of images decrypted by imposters while keeping the false rejection rate as low as in the conventional method.
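    The double random phase encoding operation the proposal builds on can be sketched with NumPy. This is the classic two-mask DRPE on a random test array, a minimal sketch rather than the paper's Fourier-transform-hologram variant; all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (32, 32)
img = rng.random(shape)                          # stand-in for the bit-pattern image
m1 = np.exp(2j * np.pi * rng.random(shape))      # input-plane random phase mask
m2 = np.exp(2j * np.pi * rng.random(shape))      # Fourier-plane mask: the secret key

def drpe_encrypt(f, p1, p2):
    # multiply by the first mask, go to the Fourier plane, apply the
    # second mask, and return to the output plane
    return np.fft.ifft2(np.fft.fft2(f * p1) * p2)

def drpe_decrypt(c, p1, p2):
    # undo the Fourier-plane mask with its conjugate, then the input mask
    return np.fft.ifft2(np.fft.fft2(c) * np.conj(p2)) * np.conj(p1)

cipher = drpe_encrypt(img, m1, m2)
recovered = drpe_decrypt(cipher, m1, m2).real
wrong_key = np.exp(2j * np.pi * rng.random(shape))
imposter = drpe_decrypt(cipher, m1, wrong_key).real   # noise-like image
```

    An imposter's decryption with the wrong Fourier-plane key yields a noise-like field, which is exactly the property the paper strengthens for its bit-pattern images.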

  5. DNA based random key generation and management for OTP encryption.

    PubMed

    Zhang, Yunpeng; Liu, Xin; Sun, Manhui

    2017-09-01

    The one-time pad (OTP) is a key-generation principle applied to stream ciphering that offers total privacy. The OTP encryption scheme has been proved unbreakable in theory but is difficult to realize in practical applications. Because OTP encryption requires absolute randomness of the key, its development has suffered from severe constraints. DNA cryptography is a new and promising technology in the field of information security. The storage capability of DNA chromosomes can be used to build one-time-pad structures, with pseudo-random number generation and indexing used to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, using restriction enzymes known only to the sender and receiver to combine the secure key, represented as a DNA sequence, with the T vector, we generate a DNA bio-hiding secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis demonstrates that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
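    The OTP cipher itself reduces to XOR with a never-reused random key; a minimal sketch (the paper's contribution is the DNA-based generation and transport of that key, not the cipher):

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with a key byte.  OTP is information-
    theoretically secure only if the key is truly random, at least as
    long as the message, kept secret, and never reused."""
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

otp_decrypt = otp_encrypt   # XOR is its own inverse

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))
ciphertext = otp_encrypt(msg, key)
```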

  6. Enhancing superconducting critical current by randomness

    NASA Astrophysics Data System (ADS)

    Wang, Y. L.; Thoutam, L. R.; Xiao, Z. L.; Shen, B.; Pearson, J. E.; Divan, R.; Ocola, L. E.; Crabtree, G. W.; Kwok, W. K.

    2016-01-01

    The key ingredient of high critical currents in a type-II superconductor is defect sites that pin vortices. Contrary to earlier understanding on nanopatterned artificial pinning, here we show unequivocally the advantages of a random pinscape over an ordered array in a wide magnetic field range. We reveal that the better performance of a random pinscape is due to the variation of its local density of pinning sites (LDOPS), which mitigates the motion of vortices. This is confirmed by achieving even higher enhancement of the critical current through a conformally mapped random pinscape, where the distribution of the LDOPS is further enlarged. The demonstrated key role of LDOPS in enhancing superconducting critical currents gets at the heart of random versus commensurate pinning. Our findings highlight the importance of random pinscapes in enhancing the superconducting critical currents of applied superconductors.

  7. Optical detection of random features for high security applications

    NASA Astrophysics Data System (ADS)

    Haist, T.; Tiziani, H. J.

    1998-02-01

    Optical detection of random features, in combination with digital signatures based on public-key codes, for recognizing counterfeit objects is discussed. Objects are protected against counterfeiting without applying expensive production techniques. Verification is done off-line by optical means, without a central authority. The method is applied to protecting banknotes, and experimental results for this application are presented. The method is also applicable to identity verification of a credit- or chip-card holder.

  8. Towards an Analysis of Review Article in Applied Linguistics: Its Classes, Purposes and Characteristics

    ERIC Educational Resources Information Center

    Azar, Ali Sorayyaei; Hashim, Azirah

    2014-01-01

    The classes, purposes and characteristics associated with the review article in the field of applied linguistics were analyzed. The data were collected from a randomly selected corpus of thirty two review articles from a discipline-related key journal in applied linguistics. The findings revealed that different sub-genres can be identified within…

  9. Necessary detection efficiencies for secure quantum key distribution and bound randomness

    NASA Astrophysics Data System (ADS)

    Acín, Antonio; Cavalcanti, Daniel; Passaro, Elsa; Pironio, Stefano; Skrzypczyk, Paul

    2016-01-01

    In recent years, several hacking attacks have broken the security of quantum cryptography implementations by exploiting the presence of losses and the ability of the eavesdropper to tune detection efficiencies. We present a simple attack of this form that applies to any protocol in which the key is constructed from the results of untrusted measurements performed on particles coming from an insecure source or channel. Because of its generality, the attack applies to a large class of protocols, from standard prepare-and-measure to device-independent schemes. Our attack gives bounds on the critical detection efficiencies necessary for secure quantum key distribution, which show that the implementation of most partly device-independent solutions is, from the point of view of detection efficiency, almost as demanding as fully device-independent ones. We also show how our attack implies the existence of a form of bound randomness, namely nonlocal correlations in which a nonsignalling eavesdropper can find out a posteriori the result of any implemented measurement.

  10. Logic Encryption

    DTIC Science & Technology

    2014-02-01

    Recoverable figure captions from this report: logic encryption and IC testing – (a) fault excitation, (b) propagation, and (c) masking; Figure 6: distance between the outputs of designs on applying the correct key and a random wrong key – (a) random insertion of XORs in ISCAS designs [6,7,11].
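    Although the record survives only in fragments, the XOR key-gate construction its figure captions refer to is easy to sketch: key gates are inserted so that the netlist computes the original function only under the correct key. The toy circuit and key below are invented for illustration:

```python
def original(a, b, c):
    """A toy two-output combinational circuit."""
    return (a & b, b ^ c)

def encrypted(a, b, c, k0, k1):
    """The same circuit with an XOR key gate on each output.  With the
    correct key (k0 = 0, k1 = 1 here) it matches the original; any other
    key corrupts the outputs."""
    return ((a & b) ^ k0, (b ^ c) ^ k1 ^ 1)

inputs = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
matches_with_correct_key = all(
    encrypted(a, b, c, 0, 1) == original(a, b, c) for a, b, c in inputs)
# total Hamming distance between outputs under one wrong key
wrong_key_distance = sum(
    x != y
    for a, b, c in inputs
    for x, y in zip(encrypted(a, b, c, 1, 0), original(a, b, c)))
```

    The Hamming distance between correct-key and wrong-key outputs, summed over all input patterns, is the metric the report's Figure 6 plots.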

  11. BARI+: A Biometric Based Distributed Key Management Approach for Wireless Body Area Networks

    PubMed Central

    Muhammad, Khaliq-ur-Rahman Raazi Syed; Lee, Heejo; Lee, Sungyoung; Lee, Young-Koo

    2010-01-01

    Wireless body area networks (WBAN) consist of resource-constrained sensing devices, just like other wireless sensor networks (WSN). However, they differ from WSN in topology, scale and security requirements. Due to these differences, key management schemes designed for WSN are inefficient and unnecessarily complex when applied to WBAN. Considering the key management issue, WBAN also differ from WPAN because WBAN can use random biometric measurements as keys. We highlight the differences between WSN and WBAN and propose an efficient key management scheme, which makes use of biometrics and is specifically designed for the WBAN domain. PMID:22319333

  12. BARI+: a biometric based distributed key management approach for wireless body area networks.

    PubMed

    Muhammad, Khaliq-ur-Rahman Raazi Syed; Lee, Heejo; Lee, Sungyoung; Lee, Young-Koo

    2010-01-01

    Wireless body area networks (WBAN) consist of resource-constrained sensing devices, just like other wireless sensor networks (WSN). However, they differ from WSN in topology, scale and security requirements. Due to these differences, key management schemes designed for WSN are inefficient and unnecessarily complex when applied to WBAN. Considering the key management issue, WBAN also differ from WPAN because WBAN can use random biometric measurements as keys. We highlight the differences between WSN and WBAN and propose an efficient key management scheme, which makes use of biometrics and is specifically designed for the WBAN domain.

  13. Small Private Key PKS on an Embedded Microprocessor

    PubMed Central

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-01-01

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011), a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but using a random number generator is much more costly than computing MQ on modern microprocessors. Therefore, no feasible results had been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and boosts signature generation and verification by 5.78% and 12.19%, respectively, compared with the previous results in CHES2012. PMID:24651722

  14. Small private key MQPKS on an embedded microprocessor.

    PubMed

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-03-19

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011), a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but using a random number generator is much more costly than computing MQ on modern microprocessors. Therefore, no feasible results had been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors for a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and boosts signature generation and verification by 5.78% and 12.19%, respectively, compared with the previous results in CHES2012.

  15. HyDEn: A Hybrid Steganocryptographic Approach for Data Encryption Using Randomized Error-Correcting DNA Codes

    PubMed Central

    Regoui, Chaouki; Durand, Guillaume; Belliveau, Luc; Léger, Serge

    2013-01-01

    This paper presents a novel hybrid DNA encryption (HyDEn) approach that uses randomized assignments of unique error-correcting DNA Hamming code words for single characters in the extended ASCII set. HyDEn relies on custom-built quaternary codes and a private key used in the randomized assignment of code words and the cyclic permutations applied on the encoded message. Along with its ability to detect and correct errors, HyDEn equals or outperforms existing cryptographic methods and represents a promising in silico DNA steganographic approach. PMID:23984392
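    The keyed randomized assignment of DNA code words can be caricatured in Python. This toy version maps each byte to one of the 256 length-4 DNA words under a key-seeded shuffle; unlike HyDEn it has no error-correcting Hamming structure and no cyclic permutations, and all names are illustrative:

```python
import itertools
import random

BASES = "ACGT"

def make_codebook(key):
    """Key-seeded randomized assignment of the 256 length-4 DNA words to
    the 256 byte values (a toy stand-in for keyed code-word assignment)."""
    words = ["".join(w) for w in itertools.product(BASES, repeat=4)]
    random.Random(key).shuffle(words)
    return dict(enumerate(words))

def encode(data: bytes, key) -> str:
    book = make_codebook(key)
    return "".join(book[byte] for byte in data)

def decode(dna: str, key) -> bytes:
    inverse = {word: byte for byte, word in make_codebook(key).items()}
    return bytes(inverse[dna[i:i + 4]] for i in range(0, len(dna), 4))

msg = b"secret"
dna = encode(msg, key="private-key")
```

    Only a holder of the private key can rebuild the codebook and invert the DNA string back to bytes.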

  16. Server-Controlled Identity-Based Authenticated Key Exchange

    NASA Astrophysics Data System (ADS)

    Guo, Hua; Mu, Yi; Zhang, Xiyong; Li, Zhoujun

    We present a threshold identity-based authenticated key exchange protocol that can be applied to an authenticated server-controlled gateway-user key exchange. The objective is to allow a user and a gateway to establish a shared session key with the permission of the back-end servers, while the back-end servers cannot obtain any information about the established session key. Our protocol has potential applications in strong access control of confidential resources. In particular, our protocol possesses the semantic security and demonstrates several highly-desirable security properties such as key privacy and transparency. We prove the security of the protocol based on the Bilinear Diffie-Hellman assumption in the random oracle model.

  17. Review of Random Phase Encoding in Volume Holographic Storage

    PubMed Central

    Su, Wei-Chia; Sun, Ching-Cherng

    2012-01-01

    Random phase encoding is a unique technique for volume holograms that can be applied to various applications, such as holographic multiplexing storage, image encryption, and optical sensing. In this review article, we first review and discuss the diffraction selectivity of random phase encoding in volume holograms, the most important parameter related to the multiplexing capacity of volume holographic storage. We then review an image encryption system based on random phase encoding. The alignment of the phase key for decryption of the encoded image stored in holographic memory is analyzed and discussed. In the latter part of the review, an all-optical sensing system implemented by random phase encoding and holographic interconnection is presented.

  18. Counterfactual Quantum Deterministic Key Distribution

    NASA Astrophysics Data System (ADS)

    Zhang, Sheng; Wang, Jian; Tang, Chao-Jing

    2013-01-01

    We propose a new counterfactual quantum cryptography protocol for distributing a deterministic key. By adding a controlled blocking operation module to the original protocol [T.G. Noh, Phys. Rev. Lett. 103 (2009) 230501], the correlation between the polarizations of the two parties, Alice and Bob, is extended; therefore, one can distribute both deterministic and random keys using our protocol. We also give a simple proof of the security of our protocol using the technique we previously applied to the original protocol. Most importantly, our analysis produces a bound tighter than the existing ones.

  19. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    NASA Astrophysics Data System (ADS)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
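    The distinction between the two sampling models can be seen in a toy simulation: under Bernoulli sampling each detected round enters the test set independently with probability p, so the test-set size is binomial rather than fixed. The numbers below are illustrative:

```python
import random

rng = random.Random(7)
n, p_test, qber = 100_000, 0.1, 0.03
# 1 marks a bit error in a detected round
errors = [1 if rng.random() < qber else 0 for _ in range(n)]

# Bernoulli sampling: each round goes to the test set independently with
# probability p_test, so the test-set size is itself a binomial variable
test_set, key_set = [], []
for e in errors:
    (test_set if rng.random() < p_test else key_set).append(e)

estimated_rate = sum(test_set) / len(test_set)   # observed on the test rounds
actual_rate = sum(key_set) / len(key_set)        # what the security proof must bound
```

    The finite-key analysis in the paper is precisely about how tightly the statistical fluctuation between these two rates can be bounded.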

  20. Study on the key technology of optical encryption based on compressive ghost imaging with double random-phase encoding

    NASA Astrophysics Data System (ADS)

    Zhang, Leihong; Pan, Zilan; Liang, Dong; Ma, Xiuhua; Zhang, Dawei

    2015-12-01

    An optical encryption method based on compressive ghost imaging (CGI) with double random-phase encoding (DRPE), named DRPE-CGI, is proposed. The information is first encrypted by the sender with DRPE, and the DRPE-coded image is then encrypted by the computational ghost imaging system with a secret key. The key of N random-phase vectors is generated by the sender and shared with the receiver, who is the authorized user. The receiver decrypts the DRPE-coded image with the key, with the aid of CGI and a compressive sensing technique, and then reconstructs the original information by DRPE decoding. The experiments suggest that cryptanalysts cannot obtain any useful information about the original image even if they eavesdrop on 60% of the key at a given time, so the security of DRPE-CGI is higher than that of conventional ghost imaging. Furthermore, this method can reduce the information quantity by 40% compared with ghost imaging while the quality of the reconstructed information stays the same, and it can improve the quality of the reconstructed plaintext information compared with DRPE-GI at the same number of sampling times. This technique can be immediately applied to encryption and data storage, with the advantages of high security, fast transmission, and high quality of reconstructed information.

  1. Diversity of Poissonian populations.

    PubMed

    Eliazar, Iddo I; Sokolov, Igor M

    2010-01-01

    Populations represented by collections of points scattered randomly on the real line are ubiquitous in science and engineering. The statistical modeling of such populations leads naturally to Poissonian populations: Poisson processes on the real line with a distinguished maximal point. Poissonian populations are infinite objects underlying key issues in statistical physics, probability theory, and random fractals. Due to their infiniteness, measuring the diversity of Poissonian populations depends on the lower-bound cut-off applied. This research characterizes the classes of Poissonian populations whose diversities are invariant with respect to the cut-off level applied and establishes an elemental connection between these classes and extreme-value theory. The measures of diversity considered are variance and dispersion, Simpson's index and inverse participation ratio, Shannon's entropy and Rényi's entropy, and Gini's index.
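    The diversity measures listed are standard and compact in code. The sketch below evaluates a few of them for a finite probability vector; it does not capture the paper's point about infinite Poissonian populations and cut-off invariance:

```python
import math

def simpson(p):
    """Simpson's index: probability that two independent draws coincide.
    Its reciprocal is the inverse participation ratio."""
    return sum(x * x for x in p)

def shannon(p):
    """Shannon entropy (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha != 1; tends to Shannon as alpha -> 1."""
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximally diverse over 4 species
skewed = [0.7, 0.1, 0.1, 0.1]        # one dominant species
```

    A dominant species raises Simpson's index (less diversity) and lowers the entropies, as the assertions on the two example vectors show.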

  2. Three-dimensional information hierarchical encryption based on computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Shen, Xueju; Cao, Liangcai; Zhang, Hao; Zong, Song; Jin, Guofan

    2016-12-01

    A novel approach for encrypting three-dimensional (3-D) scene information hierarchically based on computer-generated holograms (CGHs) is proposed. The CGHs of the layer-oriented 3-D scene information are produced by angular-spectrum propagation algorithm at different depths. All the CGHs are then modulated by different chaotic random phase masks generated by the logistic map. Hierarchical encryption encoding is applied when all the CGHs are accumulated one by one, and the reconstructed volume of the 3-D scene information depends on permissions of different users. The chaotic random phase masks could be encoded into several parameters of the chaotic sequences to simplify the transmission and preservation of the keys. Optical experiments verify the proposed method and numerical simulations show the high key sensitivity, high security, and application flexibility of the method.
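    The logistic-map generation of chaotic random sequences mentioned here is a one-liner to iterate; the parameter and seeds below are illustrative:

```python
def logistic_sequence(x0, n, r=4.0):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k).  At r = 4
    the map is fully chaotic, so a tiny change in the seed (the key)
    yields an entirely different sequence -- the key sensitivity the
    paper relies on for its chaotic random phase masks."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

# two keys differing by 1e-12 diverge into unrelated phase sequences
a = logistic_sequence(0.3, 100)
b = logistic_sequence(0.3 + 1e-12, 100)
```

    This is why only the few chaotic-map parameters, rather than full masks, need to be transmitted and preserved as keys.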

  3. Applying Chaos Theory to Lesson Planning and Delivery

    ERIC Educational Resources Information Center

    Cvetek, Slavko

    2008-01-01

    In this article, some of the ways in which thinking about chaos theory can help teachers and student-teachers to accept uncertainty and randomness as natural conditions in the classroom are considered. Building on some key features of complex systems commonly attributed to chaos theory (e.g. complexity, nonlinearity, sensitivity to initial…

  4. Analysis on pseudo excitation of random vibration for structure of time flight counter

    NASA Astrophysics Data System (ADS)

    Wu, Qiong; Li, Dapeng

    2015-03-01

    Traditional computing methods are inefficient for obtaining the key dynamical parameters of complicated structures. The pseudo-excitation method (PEM) is an effective method for the calculation of random vibration. Because of the complicated and coupled random vibration in rocket or shuttle launching, a new staging white-noise mathematical model is deduced according to the practical launch environment. This model is applied with PEM to calculate the specific structure of a Time-of-Flight Counter (ToFC). The power spectral density responses and the relevant dynamic characteristic parameters of the ToFC are obtained at the flight acceptance test level. Considering the stiffness of the fixture structure, random vibration experiments are conducted in three directions for comparison with the revised PEM. The experimental results show that the structure can bear the random vibration caused by launch without any damage, and the key dynamical parameters of the ToFC are obtained. The revised PEM agrees with the random vibration experiments in dynamical parameters and responses, as the comparative results prove; the maximum error is within 9%. The reasons for the errors are analyzed to improve the reliability of the calculation. This research provides an effective method for computing the dynamical characteristic parameters of complicated structures in the process of rocket or shuttle launching.
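    The identity at the heart of the pseudo-excitation method, that driving the frequency response function with the deterministic pseudo-excitation sqrt(S_xx) e^{iwt} reproduces the response PSD |H(w)|^2 S_xx, can be checked on a single-degree-of-freedom oscillator. The parameters below are illustrative, not the ToFC model:

```python
import numpy as np

# single-degree-of-freedom oscillator  m*y'' + c*y' + k*y = x(t)
m, c, k = 1.0, 0.4, 100.0
w = np.linspace(0.1, 30.0, 500)            # angular frequency grid
Sxx = np.full_like(w, 2.0)                 # flat (white-noise) input PSD

H = 1.0 / (k - m * w**2 + 1j * c * w)      # frequency response function

# pseudo-excitation method: drive with the deterministic pseudo-excitation
# sqrt(Sxx) * exp(i*w*t); the response PSD is the squared response amplitude
pseudo_response = H * np.sqrt(Sxx)
Syy_pem = np.abs(pseudo_response) ** 2

# textbook input-output PSD relation for comparison
Syy_direct = np.abs(H) ** 2 * Sxx
```

    The response PSD peaks at the oscillator's resonance near sqrt(k/m) = 10 rad/s, which is the kind of key dynamical parameter PEM extracts cheaply for large coupled models.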

  5. Experimental Demonstration of Polarization Encoding Measurement-Device-Independent Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Tang, Zhiyuan; Liao, Zhongfa; Xu, Feihu; Qi, Bing; Qian, Li; Lo, Hoi-Kwong

    2014-05-01

    We demonstrate the first implementation of polarization encoding measurement-device-independent quantum key distribution (MDI-QKD), which is immune to all detector side-channel attacks. Active phase randomization of each individual pulse is implemented to protect against attacks on imperfect sources. By optimizing the parameters in the decoy state protocol, we show that it is feasible to implement polarization encoding MDI-QKD with commercial off-the-shelf devices. A rigorous finite key analysis is applied to estimate the secure key rate. Our work paves the way for the realization of a MDI-QKD network, in which the users only need compact and low-cost state-preparation devices and can share complicated and expensive detectors provided by an untrusted network server.

  6. Experimental demonstration of polarization encoding measurement-device-independent quantum key distribution.

    PubMed

    Tang, Zhiyuan; Liao, Zhongfa; Xu, Feihu; Qi, Bing; Qian, Li; Lo, Hoi-Kwong

    2014-05-16

    We demonstrate the first implementation of polarization encoding measurement-device-independent quantum key distribution (MDI-QKD), which is immune to all detector side-channel attacks. Active phase randomization of each individual pulse is implemented to protect against attacks on imperfect sources. By optimizing the parameters in the decoy state protocol, we show that it is feasible to implement polarization encoding MDI-QKD with commercial off-the-shelf devices. A rigorous finite key analysis is applied to estimate the secure key rate. Our work paves the way for the realization of a MDI-QKD network, in which the users only need compact and low-cost state-preparation devices and can share complicated and expensive detectors provided by an untrusted network server.

  7. Random ambience using high fidelity images

    NASA Astrophysics Data System (ADS)

    Abu, Nur Azman; Sahib, Shahrin

    2011-06-01

    Most secure communication nowadays mandates truly random keys as an input. These operations are mostly designed and taken care of by the developers of the cryptosystem. Given the nature of confidential crypto development today, pseudorandom keys are typically designed and still preferred by the developers of the cryptosystem. However, pseudorandom keys are predictable, periodic and repeatable, and hence carry minimal entropy. True random keys are believed to be generated only via hardware random number generators. Careful statistical analysis is still required to have any confidence that the process and apparatus generate numbers sufficiently random for cryptographic use. In this underlying research, each moment in life is considered unique in itself; the random key is unique for the given moment generated by the user whenever he or she needs random keys in practical secure communication. An ambience of high-fidelity digital images is tested for randomness according to the NIST Statistical Test Suite, and a recommendation for generating random cryptographic keys live at 4 megabits per second is reported.
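    The first test in the NIST Statistical Test Suite, the frequency (monobit) test, is compact enough to sketch; the biased stream below is synthetic, for contrast:

```python
import math
import random

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency (monobit) test: map bits to +/-1, sum,
    and return the two-sided p-value under the i.i.d.-uniform null."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2.0 * n))

rng = random.Random(2024)
fair = [rng.randint(0, 1) for _ in range(10_000)]
biased = [1 if rng.random() < 0.57 else 0 for _ in range(10_000)]   # 57% ones
```

    A p-value below the chosen significance level (NIST suggests 0.01) rejects the stream; passing monobit is necessary but far from sufficient, which is why the full suite runs many further tests.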

  8. Random patterns and biometrics for counterfeit deterrence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tolk, K.M.

    1993-12-31

    Sandia National Laboratories (SNL) has been working on non-counterfeitable seals, tags, and documents for over fifteen years. During that time, several technologies have been developed that can be applied to deter counterfeiting of identification documents such as ID cards, passports, and possibly credit cards. Two technologies are presented in some detail. The first is reflective particle tagging technology that was developed to help verify treaties limiting the numbers of nuclear weapons that participating parties may possess. This approach uses the random locations and orientations of reflective particles applied to the surface of an item to uniquely identify the item. The resulting tags are secure against even the most determined adversaries. The second technology uses biometric information printed on the document and public key cryptography to ensure that an adversary cannot issue identification documents to unauthorized individuals.

  9. Policy entrepreneurship in UK central government: The behavioural insights team and the use of randomized controlled trials

    PubMed Central

    2014-01-01

    What factors explain the success of the UK Cabinet Office’s Behavioural Insights Team? To answer this question, this article applies insights from organizational theory, particularly accounts of change agents. Change agents are able—with senior sponsorship—to foster innovation by determination and skill: they win allies and circumvent more traditional bureaucratic procedures. Although the Behavioural Insights Team is a change agent—maybe even a skunkworks unit—not all the facilitating factors identified in the literature apply in this central government context. Key factors are its willingness to work in a non-hierarchical way, skills at forming alliances, and the ability to form good relationships with expert audiences. It has been able to promote a more entrepreneurial approach to government by using randomized controlled trials as a robust method of policy evaluation. PMID:28596638

  10. Anomalous transport in fluid field with random waiting time depending on the preceding jump length

    NASA Astrophysics Data System (ADS)

    Zhang, Hong; Li, Guo-Hua

    2016-11-01

    Anomalous (or non-Fickian) transport behaviors of particles have been widely observed in complex porous media. To capture the energy-dependent characteristics of non-Fickian transport of a particle in flow fields, in the present paper a generalized continuous time random walk model whose waiting time probability distribution depends on the preceding jump length is introduced, and the corresponding master equation in Fourier-Laplace space for the distribution of particles is derived. As examples, two generalized advection-dispersion equations for the Gaussian distribution and the Lévy flight, with the probability density function of the waiting time quadratically dependent on the preceding jump length, are obtained by applying the derived master equation. Project supported by the Foundation for Young Key Teachers of Chengdu University of Technology, China (Grant No. KYGG201414) and the Opening Foundation of Geomathematics Key Laboratory of Sichuan Province, China (Grant No. scsxdz2013009).
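    A caricature of such a coupled walk can be simulated directly: draw each waiting time with mean set by the magnitude of the preceding jump. The exponential waiting law below is a simplification chosen for the sketch; the paper considers a quadratic dependence of the waiting-time density on jump length:

```python
import random

def ctrw_trajectory(steps, seed=0):
    """Toy coupled continuous-time random walk: Gaussian jump lengths,
    each waiting time drawn exponentially with mean proportional to the
    magnitude of the PRECEDING jump (longer jump -> longer rest)."""
    rng = random.Random(seed)
    t, x, prev_jump = 0.0, 0.0, 1.0
    path = [(t, x)]
    for _ in range(steps):
        mean_wait = max(abs(prev_jump), 1e-9)
        t += rng.expovariate(1.0 / mean_wait)
        prev_jump = rng.gauss(0.0, 1.0)
        x += prev_jump
        path.append((t, x))
    return path

path = ctrw_trajectory(1000)
```

    Because resting and jumping are coupled, the time process is no longer independent of the space process, which is exactly what forces the generalized master equation in the paper.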

  11. A new compound control method for sine-on-random mixed vibration test

    NASA Astrophysics Data System (ADS)

    Zhang, Buyun; Wang, Ruochen; Zeng, Falin

    2017-09-01

    Vibration environmental test (VET) is one of the important and effective methods of supporting the strength design, reliability and durability testing of mechanical products. A new separation control strategy is proposed for the multiple-input multiple-output (MIMO) sine-on-random (SOR) mixed-mode vibration test, an advanced and demanding type of VET. As the key step of the strategy, the correlation integral method is applied to separate the mixed signals into their random and sinusoidal components. The feedback control formula of the MIMO linear random vibration system is systematically deduced in the frequency domain, and a Jacobi control algorithm is proposed in view of elements such as the self-spectrum, coherence, and phase of the power spectral density (PSD) matrix. Because the excitation can be over-corrected in the sine vibration test, a compression factor is introduced to reduce the excitation correction and avoid damage to the vibration table or other devices. The two methods are combined in the MIMO SOR vibration test system. Finally, a verification test system with the vibration of a cantilever beam as the control object was established to verify the reliability and effectiveness of the proposed methods. The test results show that the exceedance values can be accurately controlled within the tolerance range of the references, and the method provides theoretical and practical support for mechanical engineering.

  12. Regular three-dimensional presentations improve in the identification of surgical liver anatomy - a randomized study.

    PubMed

    Müller-Stich, Beat P; Löb, Nicole; Wald, Diana; Bruckner, Thomas; Meinzer, Hans-Peter; Kadmon, Martina; Büchler, Markus W; Fischer, Lars

    2013-09-25

    Three-dimensional (3D) presentations enhance the understanding of complex anatomical structures. However, it has been shown that two-dimensional (2D) "key views" of anatomical structures may suffice to improve spatial understanding. The impact of real 3D images (3Dr), visible only with 3D glasses, has not yet been examined. In contrast to 3Dr, regular 3D images apply techniques such as shadows and different grades of transparency to create the impression of 3D. This randomized study aimed to define the impact of both the addition of key views to CT images (2D+) and the use of 3Dr on the identification of liver anatomy in comparison with regular 3D presentations (3D). A computer-based teaching module (TM) was used. Medical students were randomized to three groups (2D+, 3Dr, or 3D) and asked to answer 11 anatomical questions and 4 evaluative questions. Both 3D groups had animated models of the human liver available to them which could be moved in all directions. 156 medical students (57.7% female) participated in this randomized trial. Students exposed to 3Dr and 3D performed significantly better than those exposed to 2D+ (p < 0.01, ANOVA). There were no significant differences between 3D and 3Dr and no significant gender differences (p > 0.1, t-test). Students randomized to 3D and 3Dr not only had significantly better results, but were also significantly faster in answering the 11 anatomical questions than students randomized to 2D+ (p < 0.03, ANOVA). Whether or not "key views" were used had no significant impact on the number of correct answers (p > 0.3, t-test). This randomized trial confirms that regular 3D visualization improves the identification of liver anatomy.

  13. Decoy-state quantum key distribution with biased basis choice

    PubMed Central

    Wei, Zhengchao; Wang, Weilong; Zhang, Zhen; Gao, Ming; Ma, Zhi; Ma, Xiongfeng

    2013-01-01

    We propose a quantum key distribution scheme that combines a biased basis choice with the decoy-state method. In this scheme, Alice sends all signal states in the Z basis and decoy states in the X and Z bases with certain probabilities, and Bob measures received pulses with an optimal basis choice. This scheme simplifies the system and reduces the random number consumption. From a simulation that takes statistical fluctuations into account, we find that in a typical experimental setup the proposed scheme can increase the key rate by at least 45% compared to the standard decoy-state scheme. In the postprocessing, we also apply a rigorous method to upper bound the phase error rate of the single-photon components of signal states. PMID:23948999

  14. Decoy-state quantum key distribution with biased basis choice.

    PubMed

    Wei, Zhengchao; Wang, Weilong; Zhang, Zhen; Gao, Ming; Ma, Zhi; Ma, Xiongfeng

    2013-01-01

    We propose a quantum key distribution scheme that combines a biased basis choice with the decoy-state method. In this scheme, Alice sends all signal states in the Z basis and decoy states in the X and Z bases with certain probabilities, and Bob measures received pulses with an optimal basis choice. This scheme simplifies the system and reduces the random number consumption. From a simulation that takes statistical fluctuations into account, we find that in a typical experimental setup the proposed scheme can increase the key rate by at least 45% compared to the standard decoy-state scheme. In the postprocessing, we also apply a rigorous method to upper bound the phase error rate of the single-photon components of signal states.

  15. Prefixed-threshold real-time selection method in free-space quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Wenyuan; Xu, Feihu; Lo, Hoi-Kwong

    2018-03-01

    Free-space quantum key distribution allows two parties to share a random key with unconditional security between ground stations, between mobile platforms, and even in satellite-ground quantum communications. Atmospheric turbulence causes fluctuations in transmittance, which in turn affect the quantum bit error rate and the secure key rate. Previous postselection methods to combat atmospheric turbulence require a threshold value determined only after all quantum transmission. In contrast, here we propose a method in which the optimal threshold value is predetermined even before quantum transmission. Therefore, the receiver can discard useless data immediately, greatly reducing data storage requirements and computing resources. Furthermore, our method can be applied to a variety of protocols, including not only single-photon BB84 but also asymptotic and finite-size decoy-state BB84, which greatly increases its practicality.

  16. Robustness and fragility in coupled oscillator networks under targeted attacks.

    PubMed

    Yuan, Tianyu; Aihara, Kazuyuki; Tanaka, Gouhei

    2017-01-01

    The dynamical tolerance of coupled oscillator networks against local failures is studied. As the fraction of failed oscillator nodes gradually increases, the mean oscillation amplitude in the entire network decreases and then suddenly vanishes at a critical fraction, as a phase transition. This critical fraction, widely used as a measure of network robustness, had previously been derived analytically for random failures but not for targeted attacks. Here we derive a general formula for the critical fraction that can be applied to both random failures and targeted attacks. We consider the effects of targeting oscillator nodes based on their degrees. First we deal with coupled identical oscillators with homogeneous edge weights. Then our theory is applied to networks with heterogeneous edge weights and to those with nonidentical oscillators. The analytical results are validated by numerical experiments. Our results reveal the key factors governing the robustness and fragility of oscillator networks.

  17. Enhancing superconducting critical current by randomness

    DOE PAGES

    Wang, Y. L.; Thoutam, L. R.; Xiao, Z. L.; ...

    2016-01-11

    The key ingredient of high critical currents in a type-II superconductor is defect sites that pin vortices. Here, we demonstrate that a random pinscape, an overlooked pinning system in nanopatterned superconductors, can lead to a substantially larger critical current enhancement at high magnetic fields than an ordered array of vortex pin sites. We reveal that the better performance of a random pinscape is due to the variation of the local density of its pinning sites, which mitigates the motion of vortices. This is confirmed by achieving even higher enhancement of the critical current through a conformally mapped random pinscape, where the distribution of the local density of pinning sites is further enlarged. Our findings highlight the potential of random pinscapes in enhancing the superconducting critical currents of applied superconductors in which random pin sites of nanoscale defects emerging in the materials synthesis process or through ex-situ irradiation are the only practical choice for large-scale production. Our results may also stimulate research on effects of a random pinscape in other complementary systems such as colloidal crystals, Bose-Einstein condensates, and Luttinger liquids.

  18. University Students’ Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics

    PubMed Central

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution—the central, unifying, and overarching theme in biology. Aspects strongly related to abstract “threshold” concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument for assessing students’ conceptual knowledge of randomness and probability in the context of evolution. To address this problem, we have developed two instruments, Randomness and Probability Test in the Context of Evolution (RaProEvo) and Randomness and Probability Test in the Context of Mathematics (RaProMath), that include both multiple-choice and free-response items. The instruments were administered to 140 university students in Germany, then the Rasch partial-credit model was applied to assess them. The results indicate that the instruments generate reliable and valid inferences about students’ conceptual knowledge of randomness and probability in the two contexts (which are separable competencies). Furthermore, RaProEvo detected significant differences in knowledge of randomness and probability, as well as evolutionary theory, between biology majors and preservice biology teachers. PMID:28572180

  19. Registration algorithm of point clouds based on multiscale normal features

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Peng, Zhongtao; Su, Hang; Xia, GuiHua

    2015-01-01

    The point cloud registration technology for obtaining a three-dimensional digital model is widely applied in many areas. To improve the accuracy and speed of point cloud registration, a registration method based on multiscale normal vectors is proposed. The proposed registration method mainly includes three parts: the selection of key points, the calculation of feature descriptors, and the determination and optimization of correspondences. First, key points are selected from the point cloud based on the changes in magnitude of multiscale curvatures obtained by principal components analysis. Then a feature descriptor for each key point is proposed, which consists of 21 elements based on multiscale normal vectors and curvatures. The correspondences between a pair of point clouds are determined according to the similarity of the descriptors of key points in the source and target point clouds. Correspondences are optimized by using a random sampling consistency algorithm and clustering technology. Finally, singular value decomposition is applied to the optimized correspondences so that the rigid transformation matrix between the two point clouds is obtained. Experimental results show that the proposed point cloud registration algorithm has a faster calculation speed, higher registration accuracy, and better antinoise performance.
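    The final SVD step, recovering the rigid transformation from a set of correspondences, can be sketched as a standard Kabsch-style estimate. This is a generic illustration of that step, not the paper's code; variable names are ours.

    ```python
    import numpy as np

    def rigid_transform(src, dst):
        """Estimate rotation R and translation t minimizing ||R @ p + t - q||
        over corresponding points (p in src, q in dst), via SVD."""
        src_c = src - src.mean(axis=0)
        dst_c = dst - dst.mean(axis=0)
        H = src_c.T @ dst_c                      # 3x3 cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = dst.mean(axis=0) - R @ src.mean(axis=0)
        return R, t

    # sanity check: recover a known rotation about z and a translation
    theta = 0.3
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([1.0, -2.0, 0.5])
    src = np.random.default_rng(0).normal(size=(50, 3))
    dst = src @ R_true.T + t_true
    R, t = rigid_transform(src, dst)
    ```

    In a full pipeline, `src` and `dst` would be the matched key points that survived the RANSAC and clustering filters.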

  20. [Krigle estimation and its simulated sampling of Chilo suppressalis population density].

    PubMed

    Yuan, Zheming; Bai, Lianyang; Wang, Kuiwu; Hu, Xiangyue

    2004-07-01

    In order to draw up a rational sampling plan for the larval population of Chilo suppressalis, an original population and its two derivative populations, a random population and a sequence population, were sampled and compared using random sampling, gap-range-random sampling, and a new systematic sampling scheme that integrates Krigle interpolation with a random origin position. For the original population, whose distribution was aggregated with a dependence range of 115 cm (6.9 units) in the line direction, gap-range-random sampling in the line direction was more precise than random sampling. Distinguishing the population pattern correctly is the key to better precision. Gap-range-random sampling and random sampling are suited to aggregated populations and random populations, respectively, but both are difficult to apply in practice. Therefore, a new systematic sampling scheme, named the Krigle sample (n = 441), was developed to estimate the density of a partial sample (partial estimation, n = 441) and of the population (overall estimation, N = 1500). For the original population, the estimation precision of the Krigle sample for both the partial sample and the population was better than that of the investigation sample. As the aggregation intensity of the population increased, the Krigle sample was more effective than the investigation sample in both partial and overall estimation at the appropriate sampling gap.

  1. Simultaneous transmission for an encrypted image and a double random-phase encryption key

    NASA Astrophysics Data System (ADS)

    Yuan, Sheng; Zhou, Xin; Li, Da-Hai; Zhou, Ding-Fu

    2007-06-01

    We propose a method to simultaneously transmit a double random-phase encryption key and an encrypted image, by making use of the fact that an acceptable decryption result can be obtained when only partial data of the encrypted image are used in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, the double random-phase encryption key is encoded by the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image, which carries both the encrypted image and the encoded key, is delivered to the receiver. Based on this method, the receiver can obtain an acceptable result and secure transmission can be guaranteed by the RSA cipher system.
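    The double random-phase encryption step itself can be sketched numerically with FFTs. This is a minimal textbook illustration of that building block only; the RSA encoding of the key and the amplitude modulation from the paper are omitted, and the image size and seeds are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    img = rng.random((32, 32))                 # stand-in for the original image

    # two independent random phase masks: one in the spatial domain,
    # one in the Fourier domain (these are the "keys")
    p1 = np.exp(2j * np.pi * rng.random(img.shape))
    p2 = np.exp(2j * np.pi * rng.random(img.shape))

    # encryption: mask, transform, mask again, inverse-transform
    enc = np.fft.ifft2(np.fft.fft2(img * p1) * p2)

    # decryption with the conjugate masks reverses the process exactly
    dec = np.fft.ifft2(np.fft.fft2(enc) * np.conj(p2)) * np.conj(p1)
    recovered = np.real(dec)
    ```

    Without the correct pair of masks, `enc` looks like stationary complex noise, which is what makes the two phase masks the critical key material to protect.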

  2. Simultaneous transmission for an encrypted image and a double random-phase encryption key.

    PubMed

    Yuan, Sheng; Zhou, Xin; Li, Da-hai; Zhou, Ding-fu

    2007-06-20

    We propose a method to simultaneously transmit a double random-phase encryption key and an encrypted image, by making use of the fact that an acceptable decryption result can be obtained when only partial data of the encrypted image are used in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, the double random-phase encryption key is encoded by the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image, which carries both the encrypted image and the encoded key, is delivered to the receiver. Based on this method, the receiver can obtain an acceptable result and secure transmission can be guaranteed by the RSA cipher system.

  3. Taxi-Out Time Prediction for Departures at Charlotte Airport Using Machine Learning Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong; Malik, Waqar; Jung, Yoon C.

    2016-01-01

    Predicting the taxi-out times of departures accurately is important for improving airport efficiency and takeoff time predictability. In this paper, we apply machine learning techniques to actual traffic data at Charlotte Douglas International Airport for taxi-out time prediction. To find the key factors affecting aircraft taxi times, surface surveillance data is first analyzed. From this data analysis, several variables, including terminal concourse, spot, runway, departure fix and weight class, are selected for taxi time prediction. Then, various machine learning methods such as linear regression, support vector machines, k-nearest neighbors, random forest, and neural network models are applied to actual flight data. Different traffic flow and weather conditions at Charlotte airport are also taken into account for more accurate prediction. The taxi-out time prediction results show that linear regression and random forest techniques provide the most accurate predictions in terms of root-mean-square error. We also discuss the operational complexity and uncertainties that make it difficult to predict taxi times accurately.
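    The prediction-and-scoring workflow can be sketched with one of the methods named above, linear regression, evaluated by root-mean-square error. The surveillance data are not public, so the features (queue length, taxi distance, a weather flag) and coefficients below are synthetic stand-ins invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 500
    # hypothetical predictors of taxi-out time (all synthetic):
    queue = rng.integers(0, 15, n)        # aircraft queued ahead at the runway
    dist = rng.uniform(5.0, 20.0, n)      # unimpeded taxi time, minutes
    weather = rng.integers(0, 2, n)       # adverse-weather indicator
    # synthetic "true" taxi-out time with noise
    taxi_out = 4.0 + 0.9 * queue + 1.0 * dist + 6.0 * weather + rng.normal(0.0, 1.5, n)

    # ordinary least squares fit on a training split, scored on a test split
    X = np.column_stack([np.ones(n), queue, dist, weather])
    train, test = slice(0, 400), slice(400, n)
    beta, *_ = np.linalg.lstsq(X[train], taxi_out[train], rcond=None)
    pred = X[test] @ beta
    rmse = np.sqrt(np.mean((pred - taxi_out[test]) ** 2))
    ```

    Swapping the least-squares fit for a random forest or k-nearest-neighbors model changes only the estimator; the RMSE comparison stays the same.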

  4. The Method of Randomization for Cluster-Randomized Trials: Challenges of Including Patients with Multiple Chronic Conditions

    PubMed Central

    Esserman, Denise; Allore, Heather G.; Travison, Thomas G.

    2016-01-01

    Cluster-randomized clinical trials (CRT) are trials in which the unit of randomization is not a participant but a group (e.g. healthcare systems or community centers). They are suitable when the intervention applies naturally to the cluster (e.g. healthcare policy); when lack of independence among participants may occur (e.g. nursing home hygiene); or when it is most ethical to apply an intervention to all within a group (e.g. school-level immunization). Because participants in the same cluster receive the same intervention, CRT may approximate clinical practice and may produce generalizable findings. However, when not properly designed or interpreted, CRT may produce biased results. CRT designs have features that add complexity to statistical estimation and inference. Chief among these is the cluster-level correlation in response measurements induced by the randomization. A critical consideration is the experimental unit of inference; often it is desirable to consider intervention effects at the level of the individual rather than the cluster. Finally, given that the number of clusters available may be limited, simple forms of randomization may not achieve balance between intervention and control arms at either the cluster or participant level. In non-clustered clinical trials, balance of key factors may be easier to achieve because the sample can be made homogeneous by excluding participants with multiple chronic conditions (MCC). CRTs, which are often pragmatic, may eschew such restrictions. Failure to account for imbalance may induce bias and reduce validity. This article focuses on the complexities of randomization in the design of CRTs, such as the inclusion of patients with MCC and imbalances in covariate factors across clusters. PMID:27478520

  5. Key management of the double random-phase-encoding method using public-key encryption

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2010-03-01

    Public-key encryption has been used to encode the key of the encryption process. In the proposed technique, an input image is encrypted by the double random-phase-encoding method using the extended fractional Fourier transform. The key of the encryption process has been encoded by using the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. The encoded key is then transmitted to the receiver along with the encrypted image. In the decryption process, the encoded key is first decrypted using the secret key, and then the encrypted image is decrypted by using the retrieved key parameters. The proposed technique has an advantage over the double random-phase-encoding method because the problem associated with the transmission of the key is eliminated by using public-key encryption. Computer simulation has been carried out to validate the proposed technique.
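    The RSA step used to encode a key parameter can be illustrated with textbook-sized numbers. These tiny primes are for demonstration only and offer no security; the "key parameter" value is an invented stand-in for, say, a quantized fractional order.

    ```python
    # toy RSA key pair from tiny primes
    p, q = 61, 53
    n = p * q                   # public modulus, 3233
    phi = (p - 1) * (q - 1)     # Euler totient, 3120
    e = 17                      # public exponent, coprime with phi
    d = pow(e, -1, phi)         # private exponent via modular inverse (Python 3.8+)

    key_param = 42              # hypothetical encryption-key parameter to protect
    cipher = pow(key_param, e, n)    # sender encrypts with the public key (e, n)
    recovered = pow(cipher, d, n)    # receiver decrypts with the private key (d, n)
    ```

    Only the public pair (e, n) travels with the encrypted image; the receiver's private exponent d never leaves the receiver, which is exactly what removes the key-transmission problem.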

  6. Survey of rural, private wells. Statistical design

    USGS Publications Warehouse

    Mehnert, Edward; Schock, Susan C.; ,

    1991-01-01

    Half of Illinois' 38 million acres were planted in corn and soybeans in 1988. On the 19 million acres planted in corn and soybeans, approximately 1 million tons of nitrogen fertilizer and 50 million pounds of pesticides were applied. Because groundwater is the water supply for over 90 percent of rural Illinois, the occurrence of agricultural chemicals in groundwater in Illinois is of interest to the agricultural community, the public, and regulatory agencies. The occurrence of agricultural chemicals in groundwater is well documented. However, the extent of this contamination still needs to be defined. This can be done by randomly sampling wells across a geographic area. Key elements of a random, water-well sampling program for regional groundwater quality include the overall statistical design of the program, definition of the sample population, selection of wells to be sampled, and analysis of survey results. These elements must be consistent with the purpose for conducting the program; otherwise, the program will not provide the desired information. The need to carefully design and conduct a sampling program becomes readily apparent when one considers the high cost of collecting and analyzing a sample. For a random sampling program conducted in Illinois, the key elements, as well as the limitations imposed by available information, are described.

  7. A simplification of the fractional Hartley transform applied to image security system in phase

    NASA Astrophysics Data System (ADS)

    Jimenez, Carlos J.; Vilardy, Juan M.; Perez, Ronal

    2017-01-01

    In this work we develop a new encryption system for images encoded in phase, using the fractional Hartley transform (FrHT), truncation operations and random phase masks (RPMs). We introduce a simplification of the FrHT for computing this transform in an efficient and fast way. The security of the encryption system is increased by using nonlinear operations, such as the phase encoding and the truncation operations. The image to encrypt (original image) is encoded in phase, and the truncation operations applied in the encryption-decryption system are amplitude and phase truncations. The encrypted image is protected by six keys: the two fractional orders of the FrHTs, the two RPMs and the two pseudorandom code images generated by the amplitude and phase truncation operations. All these keys have to be correct for proper recovery of the original image in the decryption system. We present digital results that confirm our approach.

  8. A hybrid quantum-inspired genetic algorithm for multiobjective flow shop scheduling.

    PubMed

    Li, Bin-Bin; Wang, Ling

    2007-06-01

    This paper proposes a hybrid quantum-inspired genetic algorithm (HQGA) for the multiobjective flow shop scheduling problem (FSSP), which is a typical NP-hard combinatorial optimization problem with strong engineering backgrounds. On the one hand, a quantum-inspired GA (QGA) based on Q-bit representation is applied for exploration in the discrete 0-1 hyperspace by using the updating operator of quantum gate and genetic operators of Q-bit. Moreover, random-key representation is used to convert the Q-bit representation to job permutation for evaluating the objective values of the schedule solution. On the other hand, permutation-based GA (PGA) is applied for both performing exploration in permutation-based scheduling space and stressing exploitation for good schedule solutions. To evaluate solutions in multiobjective sense, a randomly weighted linear-sum function is used in QGA, and a nondominated sorting technique including classification of Pareto fronts and fitness assignment is applied in PGA with regard to both proximity and diversity of solutions. To maintain the diversity of the population, two trimming techniques for population are proposed. The proposed HQGA is tested based on some multiobjective FSSPs. Simulation results and comparisons based on several performance metrics demonstrate the effectiveness of the proposed HQGA.
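    The random-key representation mentioned above converts a vector of real values into a job permutation by ranking the keys. A minimal sketch of that decoding step (the key values are arbitrary examples):

    ```python
    def decode_random_key(keys):
        """Decode a random-key chromosome into a job permutation:
        jobs are ordered by the rank of their key values."""
        return sorted(range(len(keys)), key=lambda j: keys[j])

    # four jobs with example keys; the smallest key is scheduled first
    perm = decode_random_key([0.42, 0.07, 0.91, 0.33])
    ```

    Because any real-valued vector decodes to a valid permutation, genetic operators can act on the keys without ever producing an infeasible schedule, which is the main appeal of the representation.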

  9. A Comparison of One Time Pad Random Key Generation using Linear Congruential Generator and Quadratic Congruential Generator

    NASA Astrophysics Data System (ADS)

    Apdilah, D.; Harahap, M. K.; Khairina, N.; Husein, A. M.; Harahap, M.

    2018-04-01

    The One Time Pad algorithm always requires pairing the key with the plaintext. If the key is shorter than the plaintext, the key is repeated until its length matches that of the plaintext. In this research, we use a Linear Congruential Generator and a Quadratic Congruential Generator to generate random numbers. The One Time Pad uses a random number as the key for the encryption and decryption processes, with the keystream generated starting from the first letter of the plaintext. We compare these two algorithms in terms of encryption speed, and the result is that the combination of OTP with LCG is faster than the combination of OTP with QCG.
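    The two generators and the pad itself are easy to sketch. The recurrences are the standard LCG and QCG forms; the specific coefficients and seeds below are illustrative choices, not the paper's parameters.

    ```python
    def lcg(seed, a=1103515245, c=12345, m=2**31):
        """Linear Congruential Generator: x_{n+1} = (a*x_n + c) mod m."""
        x = seed
        while True:
            x = (a * x + c) % m
            yield x

    def qcg(seed, a=1, b=7, c=3, m=2**31):
        """Quadratic Congruential Generator: x_{n+1} = (a*x_n^2 + b*x_n + c) mod m.
        Coefficients here are arbitrary illustrative choices."""
        x = seed
        while True:
            x = (a * x * x + b * x + c) % m
            yield x

    def otp(data, keystream):
        """One Time Pad over bytes: XOR each byte with one keystream byte."""
        return bytes(b ^ (k & 0xFF) for b, k in zip(data, keystream))

    msg = b"ATTACK AT DAWN"
    cipher = otp(msg, lcg(seed=2024))
    plain = otp(cipher, lcg(seed=2024))   # same seed regenerates the same pad
    ```

    Since XOR is its own inverse, decryption is just encryption with a keystream regenerated from the shared seed; the speed comparison in the paper then reduces to how fast each recurrence produces keystream values.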

  10. Applying health economics for policy decision making: do devices differ from drugs?

    PubMed

    Sorenson, Corinna; Tarricone, Rosanna; Siebert, Markus; Drummond, Michael

    2011-05-01

    Medical devices pose unique challenges for economic evaluation and associated decision-making processes that differ from pharmaceuticals. We highlight and discuss these challenges in the context of cardiac device therapy, based on a systematic review of relevant economic evaluations. Key challenges include practical difficulties in conducting randomized clinical trials, allowing for a 'learning curve' and user characteristics, accounting for the wider organizational impacts of introducing new devices, and allowing for variations in product characteristics and prices over time.

  11. Extinction vulnerability of coral reef fishes.

    PubMed

    Graham, Nicholas A J; Chabanet, Pascale; Evans, Richard D; Jennings, Simon; Letourneur, Yves; Aaron Macneil, M; McClanahan, Tim R; Ohman, Marcus C; Polunin, Nicholas V C; Wilson, Shaun K

    2011-04-01

    With rapidly increasing rates of contemporary extinction, predicting extinction vulnerability and identifying how multiple stressors drive non-random species loss have become key challenges in ecology. These assessments are crucial for avoiding the loss of key functional groups that sustain ecosystem processes and services. We developed a novel predictive framework of species extinction vulnerability and applied it to coral reef fishes. Although relatively few coral reef fishes are at risk of global extinction from climate disturbances, a negative convex relationship between fish species locally vulnerable to climate change vs. fisheries exploitation indicates that the entire community is vulnerable on the many reefs where both stressors co-occur. Fishes involved in maintaining key ecosystem functions are more at risk from fishing than climate disturbances. This finding is encouraging as local and regional commitment to fisheries management action can maintain reef ecosystem functions pending progress towards the more complex global problem of stabilizing the climate. © 2011 Blackwell Publishing Ltd/CNRS.

  12. Feasibility of Extracting Key Elements from ClinicalTrials.gov to Support Clinicians' Patient Care Decisions.

    PubMed

    Kim, Heejun; Bian, Jiantao; Mostafa, Javed; Jonnalagadda, Siddhartha; Del Fiol, Guilherme

    2016-01-01

    Motivation: Clinicians need up-to-date evidence from high quality clinical trials to support clinical decisions. However, applying evidence from the primary literature requires significant effort. Objective: To examine the feasibility of automatically extracting key clinical trial information from ClinicalTrials.gov. Methods: We assessed the coverage of ClinicalTrials.gov for high quality clinical studies that are indexed in PubMed. Using 140 random ClinicalTrials.gov records, we developed and tested rules for the automatic extraction of key information. Results: The rate of high quality clinical trial registration in ClinicalTrials.gov increased from 0.2% in 2005 to 17% in 2015. Trials reporting results increased from 3% in 2005 to 19% in 2015. The accuracy of the automatic extraction algorithm for 10 trial attributes was 90% on average. Future research is needed to improve the algorithm accuracy and to design information displays to optimally present trial information to clinicians.
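    Rule-based extraction of trial attributes can be sketched with simple patterns over field-labeled text. The snippet below is synthetic and does not reproduce the actual ClinicalTrials.gov record format or the study's rules; field names are invented for illustration.

    ```python
    import re

    # hypothetical registry-style snippet (synthetic, for illustration only)
    record = """Enrollment: 250
    Phase: Phase 3
    Allocation: Randomized"""

    def extract(field, text):
        """Return the value after 'Field:' on its own line, or None if absent."""
        m = re.search(rf"^\s*{field}:\s*(.+)$", text, flags=re.MULTILINE)
        return m.group(1).strip() if m else None

    enrollment = extract("Enrollment", record)
    phase = extract("Phase", record)
    ```

    Accuracy in practice hinges on how regular the source fields are, which is why the reported per-attribute accuracy averaged 90% rather than 100%.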

  13. Multi-party Semi-quantum Key Agreement with Delegating Quantum Computation

    NASA Astrophysics Data System (ADS)

    Liu, Wen-Jie; Chen, Zhen-Yu; Ji, Sai; Wang, Hai-Bin; Zhang, Jun

    2017-10-01

    A multi-party semi-quantum key agreement (SQKA) protocol based on the delegating quantum computation (DQC) model is proposed, taking Bell states as quantum resources. In the proposed protocol, the participants only need the ability to access the quantum channel and prepare single photons {|0〉, |1〉, |+〉, |-〉}, while complicated quantum operations, such as unitary operations and Bell measurement, are delegated to a remote quantum center. Compared with previous quantum key agreement protocols, this client-server model is more feasible in the early days of the emergence of quantum computers. In order to prevent attacks from outside eavesdroppers, inner participants and the quantum center, two single-photon sequences are randomly inserted into the Bell states: the first sequence is used to perform quantum channel detection, while the second is applied to disorder the positions of the message qubits, which guarantees the security of the protocol.

  14. Extinction vulnerability of coral reef fishes

    PubMed Central

    Graham, Nicholas A J; Chabanet, Pascale; Evans, Richard D; Jennings, Simon; Letourneur, Yves; Aaron MacNeil, M; McClanahan, Tim R; Öhman, Marcus C; Polunin, Nicholas V C; Wilson, Shaun K

    2011-01-01

    With rapidly increasing rates of contemporary extinction, predicting extinction vulnerability and identifying how multiple stressors drive non-random species loss have become key challenges in ecology. These assessments are crucial for avoiding the loss of key functional groups that sustain ecosystem processes and services. We developed a novel predictive framework of species extinction vulnerability and applied it to coral reef fishes. Although relatively few coral reef fishes are at risk of global extinction from climate disturbances, a negative convex relationship between fish species locally vulnerable to climate change vs. fisheries exploitation indicates that the entire community is vulnerable on the many reefs where both stressors co-occur. Fishes involved in maintaining key ecosystem functions are more at risk from fishing than climate disturbances. This finding is encouraging as local and regional commitment to fisheries management action can maintain reef ecosystem functions pending progress towards the more complex global problem of stabilizing the climate. PMID:21320260

  15. Biased random key genetic algorithm with insertion and gender selection for capacitated vehicle routing problem with time windows

    NASA Astrophysics Data System (ADS)

    Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri

    2017-06-01

    The Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their products to customers or outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor. This type of VRP is also known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve a CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and defining chromosome gender for parents undergoing the crossover operation. The performance of the established algorithms was then compared to a heuristic procedure for the soft drink distribution. Two findings are revealed: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) yields a cost saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) can further improve on the heuristic method, although it tends to yield worse results than those obtained from the standard BRKGA.
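    The "biased" element of a BRKGA is its parameterized uniform crossover, which favors the elite parent gene by gene. A minimal sketch of that operator (the elitism probability rho = 0.7, the seed, and the example chromosomes are illustrative choices):

    ```python
    import random

    def biased_crossover(elite, non_elite, rho=0.7, rng=None):
        """BRKGA parameterized uniform crossover: each gene (random key) is
        inherited from the elite parent with probability rho, otherwise
        from the non-elite parent."""
        rng = rng or random.Random(7)
        return [e if rng.random() < rho else n for e, n in zip(elite, non_elite)]

    # example random-key chromosomes, one per parent
    elite = [0.12, 0.55, 0.91, 0.30]
    non_elite = [0.80, 0.21, 0.40, 0.66]
    child = biased_crossover(elite, non_elite)
    ```

    Because the child is again a vector of keys in [0, 1), it decodes to a feasible route ordering by the usual random-key ranking, so no repair step is needed after crossover.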

  16. Practicing Field Hockey Skills Along the Contextual Interference Continuum: A Comparison of Five Practice Schedules

    PubMed Central

    Cheong, Jadeera Phaik Geok; Lay, Brendan; Grove, J. Robert; Medic, Nikola; Razman, Rizal

    2012-01-01

    To overcome the weakness of the contextual interference (CI) effect within applied settings, Brady (2008) recommended that the amount of interference be manipulated. This study investigated the effect of five practice schedules on the learning of three field hockey skills. Fifty-five pre-university students performed a total of 90 trials for each skill under blocked, mixed, or random practice orders. Results showed a significant time effect, with all five practice conditions leading to improvements in acquisition and learning of the skills. No significant differences were found between the groups. The findings of the present study did not support the CI effect and suggest that blocked, mixed, or random practice schedules can all be used effectively when structuring practice for beginners. Key points: The contextual interference effect did not surface when using sport skills. There appears to be no difference between blocked and random practice schedules in the learning of field hockey skills. Low (blocked), moderate (mixed), or high (random) interference practice schedules can be used effectively when conducting a multiple-skill practice session for beginners. PMID:24149204

  17. Practicing field hockey skills along the contextual interference continuum: a comparison of five practice schedules.

    PubMed

    Cheong, Jadeera Phaik Geok; Lay, Brendan; Grove, J Robert; Medic, Nikola; Razman, Rizal

    2012-01-01

    To overcome the weakness of the contextual interference (CI) effect within applied settings, Brady (2008) recommended that the amount of interference be manipulated. This study investigated the effect of five practice schedules on the learning of three field hockey skills. Fifty-five pre-university students performed a total of 90 trials for each skill under blocked, mixed, or random practice orders. Results showed a significant time effect, with all five practice conditions leading to improvements in acquisition and learning of the skills. No significant differences were found between the groups. The findings of the present study did not support the CI effect and suggest that blocked, mixed, or random practice schedules can all be used effectively when structuring practice for beginners. Key points: The contextual interference effect did not surface when using sport skills. There appears to be no difference between blocked and random practice schedules in the learning of field hockey skills. Low (blocked), moderate (mixed), or high (random) interference practice schedules can be used effectively when conducting a multiple-skill practice session for beginners.

  18. Effects of topology on network evolution

    NASA Astrophysics Data System (ADS)

    Oikonomou, Panos; Cluzel, Philippe

    2006-08-01

    The ubiquity of scale-free topology in nature raises the question of whether this particular network design confers an evolutionary advantage. A series of studies has identified key principles controlling the growth and the dynamics of scale-free networks. Here, we use neuron-based networks of boolean components as a framework for modelling a large class of dynamical behaviours in both natural and artificial systems. Applying a training algorithm, we characterize how networks with distinct topologies evolve towards a pre-established target function through a process of random mutations and selection. We find that homogeneous random networks and scale-free networks exhibit drastically different evolutionary paths. Whereas homogeneous random networks accumulate neutral mutations and evolve by sparse punctuated steps, scale-free networks evolve rapidly and continuously. Remarkably, this latter property is robust to variations of the degree exponent. In contrast, homogeneous random networks require a specific tuning of their connectivity to optimize their ability to evolve. These results highlight an organizing principle that governs the evolution of complex networks and that can improve the design of engineered systems.

  19. Large-Scale Cubic-Scaling Random Phase Approximation Correlation Energy Calculations Using a Gaussian Basis.

    PubMed

    Wilhelm, Jan; Seewald, Patrick; Del Ben, Mauro; Hutter, Jürg

    2016-12-13

    We present an algorithm for computing the correlation energy in the random phase approximation (RPA) in a Gaussian basis requiring O(N³) operations and O(N²) memory. The method is based on the resolution of the identity (RI) with the overlap metric, a reformulation of RI-RPA in the Gaussian basis, imaginary time and imaginary frequency integration techniques, and the use of sparse linear algebra. Additional memory reduction without extra computations can be achieved by an iterative scheme that overcomes the memory bottleneck of canonical RPA implementations. We report a massively parallel implementation that is key to the application to large systems. Finally, cubic-scaling RPA is applied to a thousand water molecules using a correlation-consistent triple-ζ quality basis.

  20. Experimental demonstration of an active phase randomization and monitor module for quantum key distribution

    NASA Astrophysics Data System (ADS)

    Sun, Shi-Hai; Liang, Lin-Mei

    2012-08-01

    Phase randomization is a very important assumption in the BB84 quantum key distribution (QKD) system with a weak coherent source; otherwise, an eavesdropper may spy on the final key. In this Letter, a stable and monitored active phase randomization scheme for one-way and two-way QKD systems is proposed and demonstrated in experiments. Furthermore, our scheme gives Alice an easy way to monitor the degree of randomization in experiments. We therefore expect our scheme to become a standard part of future QKD systems due to its security significance and feasibility.

  1. Improved Neural Networks with Random Weights for Short-Term Load Forecasting

    PubMed Central

    Lang, Kun; Zhang, Mingyuan; Yuan, Yongbo

    2015-01-01

    An effective forecasting model for short-term load plays a significant role in promoting the management efficiency of an electric power system. This paper proposes a new forecasting model based on the improved neural networks with random weights (INNRW). The key is to introduce a weighting technique to the inputs of the model and use a novel neural network to forecast the daily maximum load. Eight factors are selected as the inputs. A mutual information weighting algorithm is then used to allocate different weights to the inputs. The neural network with random weights and kernels (KNNRW) is applied to approximate the nonlinear function between the selected inputs and the daily maximum load, owing to its fast learning speed and good generalization performance. In an application to the daily load in Dalian, the result of the proposed INNRW is compared with several previously developed forecasting models. The simulation experiment shows that the proposed model performs the best overall in short-term load forecasting. PMID:26629825

  2. Improved Neural Networks with Random Weights for Short-Term Load Forecasting.

    PubMed

    Lang, Kun; Zhang, Mingyuan; Yuan, Yongbo

    2015-01-01

    An effective forecasting model for short-term load plays a significant role in promoting the management efficiency of an electric power system. This paper proposes a new forecasting model based on the improved neural networks with random weights (INNRW). The key is to introduce a weighting technique to the inputs of the model and use a novel neural network to forecast the daily maximum load. Eight factors are selected as the inputs. A mutual information weighting algorithm is then used to allocate different weights to the inputs. The neural network with random weights and kernels (KNNRW) is applied to approximate the nonlinear function between the selected inputs and the daily maximum load, owing to its fast learning speed and good generalization performance. In an application to the daily load in Dalian, the result of the proposed INNRW is compared with several previously developed forecasting models. The simulation experiment shows that the proposed model performs the best overall in short-term load forecasting.
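
    The KNNRW mentioned above belongs to the broader class of networks with random, untrained hidden weights, where only the output layer is fitted by least squares. The following is a minimal sketch of that general idea, not the authors' exact INNRW/KNNRW model; the toy data and layer sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: 8 input factors, one target (e.g. daily max load)
X = rng.normal(size=(200, 8))
y = np.sin(X @ rng.normal(size=8))

# Random-weights network: hidden weights are fixed at random and never
# trained; only the output weights are fitted, by linear least squares.
W = rng.normal(size=(8, 50))   # random input-to-hidden weights
b = rng.normal(size=50)        # random hidden biases
H = np.tanh(X @ W + b)         # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # fitted output weights

pred = H @ beta
mse = float(np.mean((pred - y) ** 2))
```

    Because training reduces to one least-squares solve, fitting is very fast, which is the "fast learning speed" the abstract refers to.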

  3. Random motor generation in a finger tapping task: influence of spatial contingency and of cortical and subcortical hemispheric brain lesions

    PubMed Central

    Annoni, J.; Pegna, A.

    1997-01-01

    OBJECTIVE—To test the hypothesis that, during random motor generation, the spatial contingencies inherent to the task would induce additional preferences in normal subjects, shifting their performances farther from randomness. By contrast, perceptual or executive dysfunction could alter these task related biases in patients with brain damage.
METHODS—Two groups of patients, with right and left focal brain lesions, as well as 25 right handed subjects matched for age and handedness were asked to execute a random choice motor task—namely, to generate a random series of 180 button presses from a set of 10 keys placed vertically in front of them.
RESULTS—In the control group, as in the left brain lesion group, motor generation was subject to deviations from theoretical expected randomness, similar to those when numbers are generated mentally, as immediate repetitions (successive presses on the same key) are avoided. However, the distribution of button presses was also contingent on the topographic disposition of the keys: the central keys were chosen more often than those placed at extreme positions. Small distances were favoured, particularly with the left hand. These patterns were influenced by implicit strategies and task related contingencies.
 By contrast, right brain lesion patients with frontal involvement tended to show a more square distribution of key presses—that is, the number of key presses tended to be more equally distributed. The strategies were also altered by brain lesions: the number of immediate repetitions was more frequent when the lesion involved the right frontal areas yielding a random generation nearer to expected theoretical randomness. The frequency of adjacent key presses was increased by right anterior and left posterior cortical as well as by right subcortical lesions, but decreased by left subcortical lesions.
CONCLUSIONS—Depending on the side of the lesion and the degree of cortical-subcortical involvement, the deficits take on a different aspect, and direct repetitions and adjacent key presses have different patterns of alterations. Motor random generation is therefore a complex task which seems to necessitate the participation of numerous cerebral structures, among which those situated in the right frontal, left posterior, and subcortical regions have a predominant role.

 PMID:9408109

  4. Key Items to Get Right When Conducting a Randomized Controlled Trial in Education

    ERIC Educational Resources Information Center

    Coalition for Evidence-Based Policy, 2005

    2005-01-01

    This is a checklist of key items to get right when conducting a randomized controlled trial to evaluate an educational program or practice ("intervention"). It is intended as a practical resource for researchers and sponsors of research, describing items that are often critical to the success of a randomized controlled trial. A significant…

  5. Feasibility of Extracting Key Elements from ClinicalTrials.gov to Support Clinicians’ Patient Care Decisions

    PubMed Central

    Kim, Heejun; Bian, Jiantao; Mostafa, Javed; Jonnalagadda, Siddhartha; Del Fiol, Guilherme

    2016-01-01

    Motivation: Clinicians need up-to-date evidence from high quality clinical trials to support clinical decisions. However, applying evidence from the primary literature requires significant effort. Objective: To examine the feasibility of automatically extracting key clinical trial information from ClinicalTrials.gov. Methods: We assessed the coverage of ClinicalTrials.gov for high quality clinical studies that are indexed in PubMed. Using 140 random ClinicalTrials.gov records, we developed and tested rules for the automatic extraction of key information. Results: The rate of high quality clinical trial registration in ClinicalTrials.gov increased from 0.2% in 2005 to 17% in 2015. Trials reporting results increased from 3% in 2005 to 19% in 2015. The accuracy of the automatic extraction algorithm for 10 trial attributes was 90% on average. Future research is needed to improve the algorithm accuracy and to design information displays to optimally present trial information to clinicians. PMID:28269867

  6. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002; the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.

  7. Novel image compression-encryption hybrid algorithm based on key-controlled measurement matrix in compressive sensing

    NASA Astrophysics Data System (ADS)

    Zhou, Nanrun; Zhang, Aidi; Zheng, Fen; Gong, Lihua

    2014-10-01

    Existing ways to encrypt images based on compressive sensing usually treat the whole measurement matrix as the key, which renders the key too large to distribute, memorize, or store. To solve this problem, a new image compression-encryption hybrid algorithm is proposed to realize compression and encryption simultaneously, with a key that is easily distributed, stored, or memorized. The input image is divided into 4 blocks to compress and encrypt; the pixels of two adjacent blocks are then exchanged randomly by random matrices. The measurement matrices in compressive sensing are constructed by utilizing circulant matrices and controlling their original row vectors with the logistic map. The random matrices used in random pixel exchanging are bound to the measurement matrices. Simulation results verify the effectiveness and security of the proposed algorithm and its acceptable compression performance.
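
    The central trick, deriving a circulant measurement matrix from a short key via the logistic map, can be sketched as follows. This is a generic illustration of a logistic-map-seeded partial circulant matrix, not the authors' exact construction; the key values and dimensions are arbitrary.

```python
import numpy as np

def logistic_sequence(x0, mu, n, burn_in=100):
    """Iterate the logistic map x <- mu*x*(1-x). The pair (x0, mu)
    acts as the compact secret key; a burn-in discards transients."""
    x = x0
    for _ in range(burn_in):
        x = mu * x * (1 - x)
    out = np.empty(n)
    for i in range(n):
        x = mu * x * (1 - x)
        out[i] = x
    return out

def circulant_measurement_matrix(key, m, n):
    """Build an m x n partial circulant matrix whose generating row
    is derived from the logistic-map sequence."""
    row = 2 * logistic_sequence(key[0], key[1], n) - 1  # map to [-1, 1]
    C = np.empty((n, n))
    for i in range(n):
        C[i] = np.roll(row, i)   # each row is a cyclic shift
    return C[:m] / np.sqrt(m)    # keep m < n rows for compression

Phi = circulant_measurement_matrix((0.37, 3.99), m=32, n=64)
```

    Only the two scalars of the key need to be shared: both parties can regenerate the full measurement matrix deterministically from them.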

  8. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects.

    PubMed

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called "cluster randomization"). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.

  9. The cosmic microwave background radiation power spectrum as a random bit generator for symmetric- and asymmetric-key cryptography.

    PubMed

    Lee, Jeffrey S; Cleaver, Gerald B

    2017-10-01

    In this note, the Cosmic Microwave Background (CMB) Radiation is shown to be capable of functioning as a Random Bit Generator, and constitutes an effectively infinite supply of truly random one-time pad values of arbitrary length. It is further argued that the CMB power spectrum potentially conforms to the FIPS 140-2 standard. Additionally, its applicability to the generation of a (n × n) random key matrix for a Vernam cipher is established.
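
    A Vernam cipher keyed by an (n × n) random matrix reduces to an element-wise XOR of plaintext and key. The sketch below uses the operating system's CSPRNG in place of CMB-derived bits, which is the only non-generic assumption here.

```python
import secrets
import numpy as np

def random_key_matrix(n):
    """n x n matrix of random bytes. In the note's proposal these bits
    would come from the CMB power spectrum rather than an OS CSPRNG."""
    raw = secrets.token_bytes(n * n)
    return np.frombuffer(raw, dtype=np.uint8).reshape(n, n)

def vernam(data, key):
    """One-time pad: XOR is its own inverse, so the same call both
    encrypts and decrypts."""
    return np.bitwise_xor(data, key)

n = 4
key = random_key_matrix(n)
plain = np.arange(n * n, dtype=np.uint8).reshape(n, n)
cipher = vernam(plain, key)
recovered = vernam(cipher, key)  # decryption with the same key
```

    The one-time pad is information-theoretically secure only if the key is truly random, never reused, and as long as the message, which is why the quality of the random bit source is the whole question.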

  10. Hacking on decoy-state quantum key distribution system with partial phase randomization

    NASA Astrophysics Data System (ADS)

    Sun, Shi-Hai; Jiang, Mu-Sheng; Ma, Xiang-Chun; Li, Chun-Yan; Liang, Lin-Mei

    2014-04-01

    Quantum key distribution (QKD) provides a means for unconditionally secure key transmission between two distant parties. However, in practical implementations, it suffers from quantum hacking due to device imperfections. Here we propose a hybrid measurement attack, using only linear optics, homodyne detection, and single-photon detection, on the widely used vacuum + weak decoy state QKD system when the phase of the source is partially randomized. Our analysis shows that, in some parameter regimes, the proposed attack would result in an entanglement-breaking channel but still be able to trick the legitimate users into believing they have transmitted secure keys. That is, the eavesdropper is able to steal all the key information without being discovered by the users. Thus, our proposal reveals that partial phase randomization is not sufficient to guarantee the security of phase-encoding QKD systems with weak coherent states.

  11. All about Eve: Secret Sharing using Quantum Effects

    NASA Technical Reports Server (NTRS)

    Jackson, Deborah J.

    2005-01-01

    This document discusses the nature of light (including classical light and photons), encryption, quantum key distribution (QKD), light polarization and beamsplitters and their application to information communication. A quantum of light represents the smallest possible subdivision of radiant energy (light) and is called a photon. The QKD key generation sequence is outlined including the receiver broadcasting the initial signal indicating reception availability, timing pulses from the sender to provide reference for gated detection of photons, the sender generating photons through random polarization while the receiver detects photons with random polarization and communicating via data link to mutually establish random keys. The QKD network vision includes inter-SATCOM, point-to-point Gnd Fiber and SATCOM-fiber nodes. QKD offers an unconditionally secure method of exchanging encryption keys. Ongoing research will focus on how to increase the key generation rate.

  12. Hacking on decoy-state quantum key distribution system with partial phase randomization.

    PubMed

    Sun, Shi-Hai; Jiang, Mu-Sheng; Ma, Xiang-Chun; Li, Chun-Yan; Liang, Lin-Mei

    2014-04-23

    Quantum key distribution (QKD) provides a means for unconditionally secure key transmission between two distant parties. However, in practical implementations, it suffers from quantum hacking due to device imperfections. Here we propose a hybrid measurement attack, using only linear optics, homodyne detection, and single-photon detection, on the widely used vacuum + weak decoy state QKD system when the phase of the source is partially randomized. Our analysis shows that, in some parameter regimes, the proposed attack would result in an entanglement-breaking channel but still be able to trick the legitimate users into believing they have transmitted secure keys. That is, the eavesdropper is able to steal all the key information without being discovered by the users. Thus, our proposal reveals that partial phase randomization is not sufficient to guarantee the security of phase-encoding QKD systems with weak coherent states.

  13. A sensitivity analysis for missing outcomes due to truncation by death under the matched-pairs design.

    PubMed

    Imai, Kosuke; Jiang, Zhichao

    2018-04-29

    The matched-pairs design enables researchers to efficiently infer causal effects from randomized experiments. In this paper, we exploit the key feature of the matched-pairs design and develop a sensitivity analysis for missing outcomes due to truncation by death, in which the outcomes of interest (e.g., quality of life measures) are not even well defined for some units (e.g., deceased patients). Our key idea is that if 2 nearly identical observations are paired prior to the randomization of the treatment, the missingness of one unit's outcome is informative about the potential missingness of the other unit's outcome under an alternative treatment condition. We consider the average treatment effect among always-observed pairs (ATOP) whose units exhibit no missing outcome regardless of their treatment status. The naive estimator based on available pairs is unbiased for the ATOP if 2 units of the same pair are identical in terms of their missingness patterns. The proposed sensitivity analysis characterizes how the bounds of the ATOP widen as the degree of the within-pair similarity decreases. We further extend the methodology to the matched-pairs design in observational studies. Our simulation studies show that informative bounds can be obtained under some scenarios when the proportion of missing data is not too large. The proposed methodology is also applied to the randomized evaluation of the Mexican universal health insurance program. An open-source software package is available for implementing the proposed research. Copyright © 2018 John Wiley & Sons, Ltd.

  14. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects

    PubMed Central

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called “cluster randomization”). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies. PMID:28584874

  15. New Quantum Key Distribution Scheme Based on Random Hybrid Quantum Channel with EPR Pairs and GHZ States

    NASA Astrophysics Data System (ADS)

    Yan, Xing-Yu; Gong, Li-Hua; Chen, Hua-Ying; Zhou, Nan-Run

    2018-05-01

    A theoretical quantum key distribution scheme based on random hybrid quantum channel with EPR pairs and GHZ states is devised. In this scheme, EPR pairs and tripartite GHZ states are exploited to set up random hybrid quantum channel. Only one photon in each entangled state is necessary to run forth and back in the channel. The security of the quantum key distribution scheme is guaranteed by more than one round of eavesdropping check procedures. It is of high capacity since one particle could carry more than two bits of information via quantum dense coding.

  16. Physical key-protected one-time pad

    PubMed Central

    Horstmeyer, Roarke; Judkewitz, Benjamin; Vellekoop, Ivo M.; Assawaworrarit, Sid; Yang, Changhuei

    2013-01-01

    We describe an encrypted communication principle that forms a secure link between two parties without electronically saving either of their keys. Instead, random cryptographic bits are kept safe within the unique mesoscopic randomness of two volumetric scattering materials. We demonstrate how a shared set of patterned optical probes can generate 10 gigabits of statistically verified randomness between a pair of unique 2 mm3 scattering objects. This shared randomness is used to facilitate information-theoretically secure communication following a modified one-time pad protocol. Benefits of volumetric physical storage over electronic memory include the inability to probe, duplicate or selectively reset any bits without fundamentally altering the entire key space. Our ability to securely couple the randomness contained within two unique physical objects can extend to strengthen hardware required by a variety of cryptographic protocols, which is currently a critically weak link in the security pipeline of our increasingly mobile communication culture. PMID:24345925

  17. Assessment of groundwater exploitation in an aquifer using the random walk on grid method: a case study at Ordos, China

    NASA Astrophysics Data System (ADS)

    Nan, Tongchao; Li, Kaixuan; Wu, Jichun; Yin, Lihe

    2018-04-01

    Sustainability has been one of the key criteria of effective water exploitation. Groundwater exploitation and water-table decline at Haolebaoji water source site in the Ordos basin in NW China has drawn public attention due to concerns about potential threats to ecosystems and grazing land in the area. To better investigate the impact of production wells at Haolebaoji on the water table, an adapted algorithm called the random walk on grid method (WOG) is applied to simulate the hydraulic head in the unconfined and confined aquifers. This is the first attempt to apply WOG to a real groundwater problem. The method can not only evaluate the head values but also the contributions made by each source/sink term. One is allowed to analyze the impact of source/sink terms just as if one had an analytical solution. The head values evaluated by WOG match the values derived from the software Groundwater Modeling System (GMS). It suggests that WOG is effective and applicable in a heterogeneous aquifer with respect to practical problems, and the resultant information is useful for groundwater management.

  18. Random Number Generation and Executive Functions in Parkinson's Disease: An Event-Related Brain Potential Study.

    PubMed

    Münte, Thomas F; Joppich, Gregor; Däuper, Jan; Schrader, Christoph; Dengler, Reinhard; Heldmann, Marcus

    2015-01-01

    The generation of random sequences is considered to tax executive functions and has previously been reported to be impaired in Parkinson's disease (PD). The aim of this study was to assess the neurophysiological markers of random number generation in PD. Event-related potentials (ERP) were recorded in 12 PD patients and 12 age-matched normal controls (NC) while they engaged either in random number generation (RNG), pressing the number keys on a computer keyboard in a random sequence, or in ordered number generation (ONG), which required key presses in the canonical order. Key presses were paced by an external auditory stimulus at a rate of 1 tone every 1800 ms. As a secondary task, subjects had to monitor the tone sequence for a particular target tone, to which the number "0" key had to be pressed. This target tone occurred randomly and infrequently, thus creating a secondary oddball task. Behaviorally, PD patients showed an increased tendency to count in steps of one as well as a tendency towards repetition avoidance. Electrophysiologically, the amplitude of the P3 component of the ERP to the target tone of the secondary task was reduced during RNG in PD but not in NC. The behavioral findings indicate less random behavior in PD, while the ERP findings suggest that this impairment comes about because attentional resources are depleted in PD.
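
    The two behavioral measures discussed above, immediate repetitions and adjacent key presses, are simple to compute from a generated sequence. A minimal sketch follows; the example sequence is invented, and note that for a truly random sequence over 10 keys the expected immediate-repetition rate is 0.1.

```python
def repetition_and_adjacency(seq):
    """Two deviation-from-randomness measures used in random
    generation tasks: the rate of immediate repetitions (same key
    pressed twice in a row) and the rate of presses on a key
    spatially adjacent to the previous one."""
    pairs = list(zip(seq, seq[1:]))
    repeats = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) == 1 for a, b in pairs) / len(pairs)
    return repeats, adjacent

# Hypothetical short sequence of presses on keys numbered 0..9
seq = [3, 4, 4, 2, 9, 0, 1, 1, 5]
rep, adj = repetition_and_adjacency(seq)  # rep = 0.25, adj = 0.25
```

    Healthy subjects typically score below the chance repetition rate (repetition avoidance), whereas right-frontal lesions shifted repetition rates back toward chance in the study above.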

  19. From Discovery to Justification: Outline of an Ideal Research Program in Empirical Psychology

    PubMed Central

    Witte, Erich H.; Zenker, Frank

    2017-01-01

    The gold standard for an empirical science is the replicability of its research results. But the estimated average replicability rate of key-effects that top-tier psychology journals report falls between 36 and 39% (objective vs. subjective rate; Open Science Collaboration, 2015). So the standard mode of applying null-hypothesis significance testing (NHST) fails to adequately separate stable from random effects. Therefore, NHST does not fully convince as a statistical inference strategy. We argue that the replicability crisis is “home-made” because more sophisticated strategies can deliver results the successful replication of which is sufficiently probable. Thus, we can overcome the replicability crisis by integrating empirical results into genuine research programs. Instead of continuing to narrowly evaluate only the stability of data against random fluctuations (discovery context), such programs evaluate rival hypotheses against stable data (justification context). PMID:29163256

  20. The trajectory of scientific discovery: concept co-occurrence and converging semantic distance.

    PubMed

    Cohen, Trevor; Schvaneveldt, Roger W

    2010-01-01

    The paradigm of literature-based knowledge discovery originated by Swanson involves finding meaningful associations between terms or concepts that have not occurred together in any previously published document. While several automated approaches have been applied to this problem, these generally evaluate the literature at a point in time, and do not evaluate the role of change over time in distributional statistics as an indicator of meaningful implicit associations. To address this issue, we develop and evaluate Symmetric Random Indexing (SRI), a novel variant of the Random Indexing (RI) approach that is able to measure implicit association over time. SRI is found to compare favorably to existing RI variants in the prediction of future direct co-occurrence. Summary statistics over several experiments suggest a trend of converging semantic distance prior to the co-occurrence of key terms for two seminal historical literature-based discoveries.
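
    The Random Indexing family assigns each term a sparse ternary index vector and accumulates co-occurring terms' index vectors into context vectors; a symmetric variant accumulates in both directions. The sketch below illustrates that general idea only; it is not the authors' SRI formulation, and the dimensions and toy corpus are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
DIM, NNZ = 300, 10  # vector dimensionality and number of nonzeros

def index_vector():
    """Sparse ternary random index vector: a few +1s and -1s in an
    otherwise zero vector, the building block of Random Indexing."""
    v = np.zeros(DIM)
    pos = rng.choice(DIM, size=NNZ, replace=False)
    v[pos[: NNZ // 2]] = 1.0
    v[pos[NNZ // 2:]] = -1.0
    return v

docs = [["quantum", "key", "distribution"],
        ["random", "key", "generation"],
        ["quantum", "random", "bits"]]

terms = {t for d in docs for t in d}
index = {t: index_vector() for t in terms}
context = {t: np.zeros(DIM) for t in terms}

# Symmetric accumulation: every pair of co-occurring terms adds each
# other's index vectors to their context vectors.
for d in docs:
    for a in d:
        for b in d:
            if a != b:
                context[a] += index[b]

def sim(a, b):
    """Cosine similarity between two terms' context vectors."""
    va, vb = context[a], context[b]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-12))

s = sim("quantum", "random")
```

    Tracking such similarities across time slices of a literature is the kind of "converging semantic distance" signal the abstract describes.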

  1. Asymmetric multiple information cryptosystem based on chaotic spiral phase mask and random spectrum decomposition

    NASA Astrophysics Data System (ADS)

    Rafiq Abuturab, Muhammad

    2018-01-01

    A new asymmetric multiple-information cryptosystem based on a chaotic spiral phase mask (CSPM) and random spectrum decomposition is put forward. In the proposed system, each channel of the secret color image is first modulated with a CSPM and then gyrator transformed. The gyrator spectrum is randomly divided into two complex-valued masks. The same procedure is applied to multiple secret images to obtain their corresponding first and second complex-valued masks. Finally, the first and second masks of each channel are independently added to produce the first and second complex ciphertexts, respectively. The main feature of the proposed method is that the different secret images are encrypted with different CSPMs, whose parameters serve as sensitive decryption/private keys completely unknown to unauthorized users. Consequently, the proposed system is resistant to potential attacks. Moreover, the CSPMs are easier to position in the decoding process owing to their own centering mark on the axial focal ring. The retrieved secret images are free from cross-talk noise. The decryption process can be implemented by optical experiment. Numerical simulation results demonstrate the viability and security of the proposed method.
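
    The "random spectrum decomposition" step, splitting a transform-domain spectrum into two random-looking complex masks whose sum restores it, can be sketched as follows. This is an illustrative assumption-laden toy: a plain 2D Fourier transform stands in for the gyrator transform, and a generic random phase mask stands in for the CSPM, whose exact construction the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(7)

# One channel of a toy secret image (8x8 gray values).
channel = rng.random((8, 8))

# CSPM stand-in: a unit-modulus random phase mask (assumption).
cspm = np.exp(1j * 2 * np.pi * rng.random((8, 8)))

# Gyrator-transform stand-in: a plain 2D Fourier transform (assumption).
spectrum = np.fft.fft2(channel * cspm)

# Random spectrum decomposition: split the spectrum into two
# complex-valued masks whose sum reproduces it exactly.
mask1 = (rng.random((8, 8)) + 1j * rng.random((8, 8))) * np.abs(spectrum).max()
mask2 = spectrum - mask1

# Each mask alone looks random; together they restore the spectrum,
# and the conjugate CSPM undoes the input-plane modulation.
assert np.allclose(mask1 + mask2, spectrum)
recovered = np.fft.ifft2(mask1 + mask2) * np.conj(cspm)
assert np.allclose(recovered.real, channel)
```

    Because mask2 depends on mask1, neither mask reveals the spectrum on its own, which is the point of distributing them as separate ciphertexts.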

  2. Patent citation network in nanotechnology (1976-2004)

    NASA Astrophysics Data System (ADS)

    Li, Xin; Chen, Hsinchun; Huang, Zan; Roco, Mihail C.

    2007-06-01

    The patent citation networks are described using critical-node, core-network, and network topological analysis. The main objective is to understand the knowledge transfer processes between technical fields, institutions, and countries. This includes identifying key influential players and subfields, the knowledge transfer patterns among them, and the overall knowledge transfer efficiency. The proposed framework is applied to the field of nanoscale science and engineering (NSE), including the citation networks of patent documents, submitting institutions, technology fields, and countries. The NSE patents were identified by keyword "full-text" searching of patents at the United States Patent and Trademark Office (USPTO). The analysis shows that the United States is the most important citation center in NSE research. The institution citation network illustrates a more efficient knowledge transfer between institutions than a random network. The country citation network displays a knowledge transfer capability as efficient as a random network. The technology field citation network and the patent document citation network exhibit a less efficient knowledge diffusion capability than a random network. All four citation networks show a tendency to form local citation clusters.

  3. Anonymous authenticated communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Cheryl L; Schroeppel, Richard C; Snyder, Lillian A

    2007-06-19

    A method of performing electronic communications between members of a group wherein the communications are authenticated as being from a member of the group and have not been altered, comprising: generating a plurality of random numbers; distributing in a digital medium the plurality of random numbers to the members of the group; publishing a hash value of contents of the digital medium; distributing to the members of the group public-key-encrypted messages each containing a same token comprising a random number; and encrypting a message with a key generated from the token and the plurality of random numbers.
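
    The claim above describes a key-setup sequence: distribute a pool of random numbers, publish a hash of the distributed medium, send a common token under public-key encryption, and derive a message key from the token plus the pool. The sketch below follows those steps with illustrative choices; the SHA-256 KDF and all names are assumptions, not the patented construction, and the public-key step is omitted.

```python
import hashlib
import secrets

# Step 1: generate a pool of random numbers, distributed to the group.
pool = [secrets.token_bytes(32) for _ in range(8)]

# Step 2: publish a hash of the digital medium's contents so members
# can verify they all hold the same pool.
medium_hash = hashlib.sha256(b"".join(pool)).hexdigest()

# Step 3: the same token (itself a random number) is sent to each member,
# in the patent via public-key-encrypted messages (omitted here).
token = secrets.token_bytes(32)

# Step 4: derive the shared message key from the token and the pool.
def derive_key(token: bytes, pool: list) -> bytes:
    h = hashlib.sha256(token)
    for r in pool:
        h.update(r)
    return h.digest()

key = derive_key(token, pool)
assert derive_key(token, pool) == key                     # members agree
assert derive_key(secrets.token_bytes(32), pool) != key   # outsiders fail
```

    Anyone holding the pool and the token derives the same key, while the published `medium_hash` lets members detect a tampered pool.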

  4. A New Quantum Gray-Scale Image Encoding Scheme

    NASA Astrophysics Data System (ADS)

    Naseri, Mosayeb; Abdolmaleky, Mona; Parandin, Fariborz; Fatahi, Negin; Farouk, Ahmed; Nazari, Reza

    2018-02-01

    In this paper, a new quantum images encoding scheme is proposed. The proposed scheme mainly consists of four different encoding algorithms. The idea behind of the scheme is a binary key generated randomly for each pixel of the original image. Afterwards, the employed encoding algorithm is selected corresponding to the qubit pair of the generated randomized binary key. The security analysis of the proposed scheme proved its enhancement through both randomization of the generated binary image key and altering the gray-scale value of the image pixels using the qubits of randomized binary key. The simulation of the proposed scheme assures that the final encoded image could not be recognized visually. Moreover, the histogram diagram of encoded image is flatter than the original one. The Shannon entropies of the final encoded images are significantly higher than the original one, which indicates that the attacker can not gain any information about the encoded images. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, IRAN

  5. Trackside acoustic diagnosis of axle box bearing based on kurtosis-optimization wavelet denoising

    NASA Astrophysics Data System (ADS)

    Peng, Chaoyong; Gao, Xiaorong; Peng, Jianping; Wang, Ai

    2018-04-01

    As one of the key components of railway vehicles, the axle box bearing has an operating condition with a significant effect on traffic safety. Acoustic diagnosis is more suitable than vibration diagnosis for trackside monitoring. The acoustic signal generated by the train axle box bearing is an amplitude- and frequency-modulated signal embedded in complex train running noise. Although empirical mode decomposition (EMD) and some improved time-frequency algorithms have proved useful in bearing vibration signal processing, it is hard to extract the bearing fault signal from severe trackside acoustic background noise using those algorithms. Therefore, a kurtosis-optimization-based wavelet packet (KWP) denoising algorithm is proposed, as kurtosis is the key time-domain indicator of a bearing fault signal. Firstly, geometry-based Doppler correction is applied to the signal of each sensor, and through the superposition of multiple sensor signals, random noise and impulsive noise, which interfere with the kurtosis indicator, are suppressed. Then, KWP denoising is conducted. Finally, EMD and the Hilbert transform are applied to extract the fault feature. Experimental results indicate that the proposed method combining KWP and EMD is superior to EMD alone.
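
    The reason kurtosis works as the selection criterion can be shown in a few lines: excess kurtosis is near zero for Gaussian background noise but large for the impulsive content a bearing fault produces. The signal parameters below (sampling rate, 90 Hz impulse repetition, amplitudes) are illustrative assumptions, not values from the paper, and the wavelet-packet machinery itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 10_000, 20_000

# Trackside-style background: broadband Gaussian noise.
noise = rng.normal(0.0, 1.0, n)

# Bearing-fault-style component: periodic impulses (illustrative 90 Hz).
impulses = np.zeros(n)
impulses[::fs // 90] = 8.0

def excess_kurtosis(x):
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

# Near 0 for Gaussian noise, large for impulsive content: this is why
# kurtosis can rank wavelet-packet nodes by fault evidence.
assert abs(excess_kurtosis(noise)) < 0.5
assert excess_kurtosis(noise + impulses) > 3.0
```

    A KWP scheme would compute this statistic per wavelet-packet node and retain the nodes where it peaks.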

  6. Differential equation models for sharp threshold dynamics.

    PubMed

    Schramm, Harrison C; Dimitrov, Nedialko B

    2014-01-01

    We develop an extension to differential equation models of dynamical systems to allow us to analyze probabilistic threshold dynamics that fundamentally and globally change system behavior. We apply our novel modeling approach to two cases of interest: a model of infectious disease modified for malware where a detection event drastically changes dynamics by introducing a new class in competition with the original infection; and the Lanchester model of armed conflict, where the loss of a key capability drastically changes the effectiveness of one of the sides. We derive and demonstrate a step-by-step, repeatable method for applying our novel modeling approach to an arbitrary system, and we compare the resulting differential equations to simulations of the system's random progression. Our work leads to a simple and easily implemented method for analyzing probabilistic threshold dynamics using differential equations. Published by Elsevier Inc.
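
    The malware case above can be caricatured with a forward-Euler integration in which a detection event at a threshold time switches on a new "patched" class that competes with the infection. The compartment structure, parameters, and the fixed detection time are all illustrative assumptions, not the paper's model (which treats the threshold probabilistically).

```python
# Toy Euler integration of an SIR-style malware model where detection at
# time t_d introduces a patched class that removes infections.
beta, gamma, patch_rate, t_d = 0.4, 0.05, 0.3, 30.0   # illustrative
dt, T = 0.01, 200.0
S, I, P = 0.99, 0.01, 0.0   # susceptible, infected, patched fractions

peak_I = I
for step in range(int(T / dt)):
    t = step * dt
    detected = t >= t_d   # threshold event: detection changes the dynamics
    dS = -beta * S * I - (patch_rate * S if detected else 0.0)
    dI = beta * S * I - gamma * I - (patch_rate * I if detected else 0.0)
    dP = patch_rate * (S + I) if detected else 0.0
    S, I, P = S + dt * dS, I + dt * dI, P + dt * dP
    peak_I = max(peak_I, I)

# Sanity checks on the trajectory: the epidemic peaks and then collapses
# once patching competes with the infection.
assert 0.0 <= I < peak_I
assert S >= 0.0 and P >= 0.0
```

    Replacing the fixed `t_d` with a random detection time, and averaging, is the step the paper's differential-equation treatment makes rigorous.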

  7. Bayesian dynamic modeling of time series of dengue disease case counts.

    PubMed

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulation for parameter estimation and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results requires a complete analysis of the dengue time series with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors for one- and two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.
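
    The generative structure of such a model, a Poisson log link with a first-order random-walk time-varying coefficient on a meteorological covariate, can be simulated directly. Every number below (baseline, innovation variance, the sinusoidal temperature proxy) is made up for illustration; the paper's models are fitted by MCMC rather than simulated forward like this.

```python
import numpy as np

rng = np.random.default_rng(42)

weeks = 104
temp = 25 + 3 * np.sin(2 * np.pi * np.arange(weeks) / 52)  # toy temperature
temp_c = (temp - temp.mean()) / temp.std()                 # standardised

alpha = 2.0                    # baseline log-rate (illustrative)
beta = np.empty(weeks)         # RW(1): beta_t = beta_{t-1} + noise
beta[0] = 0.3
for t in range(1, weeks):
    beta[t] = beta[t - 1] + rng.normal(0.0, 0.05)

lam = np.exp(alpha + beta * temp_c)   # Poisson log link
counts = rng.poisson(lam)             # weekly dengue-style case counts

assert counts.shape == (weeks,)
assert (counts >= 0).all()
```

    The random walk on `beta` is exactly what lets the meteorological effect drift over the study period instead of staying constant.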

  8. Quantum cryptography for secure free-space communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, R.J.; Buttler, W.T.; Kwiat, P.G.

    1999-03-01

    The secure distribution of the secret random bit sequences known as key material is an essential precursor to their use for the encryption and decryption of confidential communications. Quantum cryptography is a new technique for secure key distribution with single-photon transmissions: Heisenberg's uncertainty principle ensures that an adversary can neither successfully tap the key transmissions, nor evade detection (eavesdropping raises the key error rate above a threshold value). The authors have developed experimental quantum cryptography systems based on the transmission of non-orthogonal photon polarization states to generate shared key material over line-of-sight optical links. Key material is built up using the transmission of a single photon per bit of an initial secret random sequence. A quantum-mechanically random subset of this sequence is identified, becoming the key material after a data reconciliation stage with the sender. The authors have developed and tested a free-space quantum key distribution (QKD) system over an outdoor optical path of approximately 1 km at Los Alamos National Laboratory under nighttime conditions. Results show that free-space QKD can provide secure real-time key distribution between parties who have a need to communicate secretly. Finally, they examine the feasibility of surface-to-satellite QKD.
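
    The "quantum-mechanically random subset" that becomes key material corresponds, in BB84-style polarization protocols, to the positions where sender and receiver happened to choose the same measurement basis. The toy below shows only that classical sifting step, assuming an ideal channel with no eavesdropper and no reconciliation; it is not a model of the optical system.

```python
import random

rng = random.Random(1)
n = 1000

alice_bits  = [rng.randint(0, 1) for _ in range(n)]
alice_bases = [rng.randint(0, 1) for _ in range(n)]  # 0 rectilinear, 1 diagonal
bob_bases   = [rng.randint(0, 1) for _ in range(n)]

# Where bases match, Bob reads Alice's bit; where they differ, his outcome
# would be random, so those positions are discarded during sifting.
keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
alice_key = [alice_bits[i] for i in keep]
bob_key   = [alice_bits[i] for i in keep]  # ideal channel: identical

assert alice_key == bob_key
assert 0.4 < len(keep) / n < 0.6   # roughly half the bases agree
```

    Since basis choices are independent, about half the transmitted bits survive sifting, which is why the raw transmission must be longer than the final key.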

  9. Secure communications using quantum cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, R.J.; Buttler, W.T.; Kwiat, P.G.

    1997-08-01

    The secure distribution of the secret random bit sequences known as "key" material is an essential precursor to their use for the encryption and decryption of confidential communications. Quantum cryptography is an emerging technology for secure key distribution with single-photon transmissions: Heisenberg's uncertainty principle ensures that an adversary can neither successfully tap the key transmissions, nor evade detection (eavesdropping raises the key error rate above a threshold value). We have developed experimental quantum cryptography systems based on the transmission of non-orthogonal single-photon states to generate shared key material over multi-kilometer optical fiber paths and over line-of-sight links. In both cases, key material is built up using the transmission of a single photon per bit of an initial secret random sequence. A quantum-mechanically random subset of this sequence is identified, becoming the key material after a data reconciliation stage with the sender. In our optical fiber experiment we have performed quantum key distribution over 24 km of underground optical fiber using single-photon interference states, demonstrating that secure, real-time key generation over "open" multi-km node-to-node optical fiber communications links is possible. We have also constructed a quantum key distribution system for free-space, line-of-sight transmission using single-photon polarization states, which is currently undergoing laboratory testing. 7 figs.

  10. Encrypted optical storage with wavelength-key and random phase codes.

    PubMed

    Matoba, O; Javidi, B

    1999-11-10

    An encrypted optical memory system that uses a wavelength code as well as input and Fourier-plane random phase codes is proposed. Original data are illuminated by a coherent light source with a specified wavelength and are then encrypted with two random phase codes before being stored holographically in a photorefractive material. Successful decryption requires the use of a readout beam with the same wavelength as that used in the recording, in addition to the correct phase key in the Fourier plane. The wavelength selectivity of the proposed system is evaluated numerically. We show that the number of available wavelength keys depends on the correlation length of the phase key in the Fourier plane. Preliminary experiments of encryption and decryption of optical memory in a LiNbO(3):Fe photorefractive crystal are demonstrated.
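
    The two random phase codes described above form the classic double-random-phase round trip, which can be checked numerically. This sketch models only the input-plane and Fourier-plane phase keys; the wavelength key and the photorefractive holographic storage are not modelled, and the sizes and seeds are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

img  = rng.random((16, 16))                         # original data
p_in = np.exp(2j * np.pi * rng.random((16, 16)))    # input-plane phase code
p_f  = np.exp(2j * np.pi * rng.random((16, 16)))    # Fourier-plane phase key

# Encrypt: phase-modulate, Fourier transform, apply Fourier-plane key.
encrypted = np.fft.ifft2(np.fft.fft2(img * p_in) * p_f)

# Decrypt with the conjugate keys: exact recovery.
decrypted = np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(p_f)) * np.conj(p_in)
assert np.allclose(decrypted.real, img)

# A wrong Fourier-plane key leaves noise-like output.
wrong = np.exp(2j * np.pi * rng.random((16, 16)))
bad = np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(wrong)) * np.conj(p_in)
assert not np.allclose(bad.real, img, atol=0.1)
```

    The wavelength selectivity the paper studies adds a third key on top of this: reading the hologram with the wrong wavelength fails even when the phase key is correct.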

  11. Metacognitive therapy versus cognitive behavioural therapy for depression: a randomized pilot study.

    PubMed

    Jordan, Jennifer; Carter, Janet D; McIntosh, Virginia V W; Fernando, Kumari; Frampton, Christopher M A; Porter, Richard J; Mulder, Roger T; Lacey, Cameron; Joyce, Peter R

    2014-10-01

    Metacognitive therapy (MCT) is one of the newer developments within cognitive therapy. This randomized controlled pilot study compared independently applied MCT with cognitive behavioural therapy (CBT) in outpatients with depression to explore the relative speed and efficacy of MCT, ahead of a planned randomized controlled trial. A total of 48 participants referred for outpatient therapy were randomized to up to 12 weeks of MCT or CBT. Key outcomes were reduction in depressive symptoms at week 4 and week 12, measured using the independent-clinician-rated 16-item Quick Inventory of Depressive Symptomatology. Intention-to-treat and completer analyses as well as additional methods of reporting outcome of depression are presented. Both therapies were effective in producing clinically significant change in depressive symptoms, with moderate-to-large effect sizes obtained. No differences were detected between therapies in overall outcome or early change on clinician-rated or self-reported measures. Post-hoc analyses suggest that MCT may have been adversely affected by greater comorbidity. In this large pilot study conducted independently of MCT's developers, MCT was an effective treatment for outpatients with depression, with similar results overall to CBT. Insufficient power and imbalanced comorbidity limit conclusions regarding comparative efficacy so further studies of MCT and CBT are required. © The Royal Australian and New Zealand College of Psychiatrists 2014.

  12. Information Security Scheme Based on Computational Temporal Ghost Imaging.

    PubMed

    Jiang, Shan; Wang, Yurong; Long, Tao; Meng, Xiangfeng; Yang, Xiulun; Shu, Rong; Sun, Baoqing

    2017-08-09

    An information security scheme based on computational temporal ghost imaging is proposed. A sequence of independent 2D random binary patterns is used as the encryption key and multiplied with the 1D data stream. The ciphertext is obtained by summing the weighted encryption key. The decryption process can be realized by a correlation measurement between the encrypted information and the encryption key. Owing to the intrinsic high-level randomness of the key, the security of this method is greatly enhanced. The feasibility of this method and its robustness against both occlusion and additive noise attacks are demonstrated in simulations.
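
    The multiply-sum-correlate loop described above can be reproduced in a few lines. The sizes, the flattening of the 2D patterns to 1D vectors, and the plain covariance estimator used for decryption are simplifying assumptions; the point is that correlating ciphertext values against the key patterns recovers the data stream up to scale and offset.

```python
import numpy as np

rng = np.random.default_rng(5)

n, m = 16, 20000                          # data length, number of patterns
data = rng.random(n)                      # 1D data stream
keys = rng.integers(0, 2, size=(m, n))    # independent random binary patterns

# Each ciphertext value is the key-weighted sum ("bucket" measurement).
cipher = keys @ data

# Decryption: correlate the ciphertext with the key patterns.
recovered = (cipher - cipher.mean()) @ (keys - keys.mean(axis=0)) / m

# Up to scale/offset, the correlation reconstructs the data stream.
corr = np.corrcoef(recovered, data)[0, 1]
assert corr > 0.9
```

    The reconstruction noise shrinks like 1/sqrt(m), so many more patterns than data samples are used here; an eavesdropper without the key patterns sees only the scalar sums.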

  13. Secure content objects

    DOEpatents

    Evans, William D [Cupertino, CA

    2009-02-24

    A secure content object protects electronic documents from unauthorized use. The secure content object includes an encrypted electronic document, a multi-key encryption table having at least one multi-key component, an encrypted header and a user interface device. The encrypted document is encrypted using a document encryption key associated with a multi-key encryption method. The encrypted header includes an encryption marker formed by a random number followed by a derivable variation of the same random number. The user interface device enables a user to input a user authorization. The user authorization is combined with each of the multi-key components in the multi-key encryption key table and used to try to decrypt the encrypted header. If the encryption marker is successfully decrypted, the electronic document may be decrypted. Multiple electronic documents or a document and annotations may be protected by the secure content object.
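
    The encryption-marker idea, a random number followed by a derivable variation of it, lets trial decryption of the header tell a valid key from an invalid one. The sketch below is hypothetical throughout: the SHA-256-based keystream, the bitwise-complement "derivable variation", and all names are illustrative assumptions, not the patented method.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy counter-mode keystream from SHA-256 (illustrative, not the patent's)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Header marker: a random number followed by a derivable variation of it
# (here: its bitwise complement).
r = bytes.fromhex("8f3a1c9b2d4e6f01")
marker = r + bytes(b ^ 0xFF for b in r)

multi_keys = [b"component-A", b"component-B"]      # multi-key table entries
user_auth = b"correct horse battery staple"

# Encrypt the header under (user authorization + one table component).
header = xor(marker, keystream(user_auth + multi_keys[1], len(marker)))

def try_decrypt(auth: bytes) -> bool:
    """Trial-decrypt with each component; accept if the marker checks out."""
    for comp in multi_keys:
        m = xor(header, keystream(auth + comp, len(header)))
        half = len(m) // 2
        if m[half:] == bytes(b ^ 0xFF for b in m[:half]):
            return True   # marker valid -> document key may be released
    return False

assert try_decrypt(user_auth)
assert not try_decrypt(b"wrong password")
```

    The marker's internal redundancy is what makes the check work: a wrong key yields pseudo-random bytes, which satisfy the complement relation only with negligible probability.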

  14. A short history of randomized experiments in criminology. A meager feast.

    PubMed

    Farrington, David P

    2003-06-01

    This article discusses advantages of randomized experiments and key issues raised in the following articles. The main concern is the growth and decrease in the use of randomized experiments by the California Youth Authority, the U.S. National Institute of Justice, and the British Home Office, although other experiments are also discussed. It is concluded that feast and famine periods are influenced by key individuals. It is recommended that policy makers, practitioners, funders, the mass media, and the general public need better education in research quality so that they can tell the difference between good and poor evaluation studies. They might then demand better evaluations using randomized experiments.

  15. Posing the research question: not so simple.

    PubMed

    Thabane, Lehana; Thomas, Tara; Ye, Chenglin; Paul, James

    2009-01-01

    The success of any research process relies, in part, on how well investigators are able to translate a clinical problem into a research question-a task that is not so simple for novice investigators. The PICOT approach requires that the framing of the research question specify the target Population, the Intervention of interest, the Comparator intervention, key Outcomes, and the Time frame over which the outcomes are assessed. This paper describes the use of the PICOT structure in framing research questions and examines PICOT criteria as applied to the anesthesia literature. We also provide a roadmap for applying the PICOT format in identifying and framing clear research questions. In addition to searching MEDLINE for the literature on framing research questions, we performed a systematic review of articles published in four key anesthesia journals in 2006, including Anesthesiology, Anesthesia & Analgesia, the British Journal of Anaesthesia, and the Canadian Journal of Anesthesia. Three hundred thirteen articles (n = 313) were included in this review, with the following distribution by study design: 139 (44%) randomized controlled trials, 129 (41%) cohort studies, and 45 (15%) case-controlled, cross-sectional studies or systematic reviews. Overall, 96% (95% confidence interval: 91 to 100) of articles did not apply the PICOT approach in reporting the research question. The PICOT approach may be helpful in defining and clearly stating the research question. It remains to be determined whether or not compliance with the PICOT style, or any other format for framing research questions, is associated with a higher quality of research reporting.

  16. Random walk hierarchy measure: What is more hierarchical, a chain, a tree or a star?

    PubMed Central

    Czégel, Dániel; Palla, Gergely

    2015-01-01

    Signs of hierarchy are prevalent in a wide range of systems in nature and society. One of the key problems is quantifying the importance of hierarchical organisation in the structure of the network representing the interactions or connections between the fundamental units of the studied system. Although a number of notable methods are already available, the vast majority of them treat all directed acyclic graphs as already maximally hierarchical. Here we propose a hierarchy measure based on random walks on the network. The novelty of our approach is that directed trees corresponding to multi-level pyramidal structures obtain higher hierarchy scores than directed chains and directed stars. Furthermore, in the thermodynamic limit the hierarchy measure of regular trees converges to a well-defined limit depending only on the branching number. When applied to real networks, our method is computationally very efficient, as the result can be evaluated with arbitrary precision by subsequent multiplications of the transition matrix describing the random walk process. In addition, tests on real-world networks provided very intuitive results; e.g., the trophic levels obtained from our approach on a food web were highly consistent with former results from ecology. PMID:26657012

  17. Random walk hierarchy measure: What is more hierarchical, a chain, a tree or a star?

    NASA Astrophysics Data System (ADS)

    Czégel, Dániel; Palla, Gergely

    2015-12-01

    Signs of hierarchy are prevalent in a wide range of systems in nature and society. One of the key problems is quantifying the importance of hierarchical organisation in the structure of the network representing the interactions or connections between the fundamental units of the studied system. Although a number of notable methods are already available, the vast majority of them treat all directed acyclic graphs as already maximally hierarchical. Here we propose a hierarchy measure based on random walks on the network. The novelty of our approach is that directed trees corresponding to multi-level pyramidal structures obtain higher hierarchy scores than directed chains and directed stars. Furthermore, in the thermodynamic limit the hierarchy measure of regular trees converges to a well-defined limit depending only on the branching number. When applied to real networks, our method is computationally very efficient, as the result can be evaluated with arbitrary precision by subsequent multiplications of the transition matrix describing the random walk process. In addition, tests on real-world networks provided very intuitive results; e.g., the trophic levels obtained from our approach on a food web were highly consistent with former results from ecology.

  18. Random walk hierarchy measure: What is more hierarchical, a chain, a tree or a star?

    PubMed

    Czégel, Dániel; Palla, Gergely

    2015-12-10

    Signs of hierarchy are prevalent in a wide range of systems in nature and society. One of the key problems is quantifying the importance of hierarchical organisation in the structure of the network representing the interactions or connections between the fundamental units of the studied system. Although a number of notable methods are already available, the vast majority of them treat all directed acyclic graphs as already maximally hierarchical. Here we propose a hierarchy measure based on random walks on the network. The novelty of our approach is that directed trees corresponding to multi-level pyramidal structures obtain higher hierarchy scores than directed chains and directed stars. Furthermore, in the thermodynamic limit the hierarchy measure of regular trees converges to a well-defined limit depending only on the branching number. When applied to real networks, our method is computationally very efficient, as the result can be evaluated with arbitrary precision by subsequent multiplications of the transition matrix describing the random walk process. In addition, tests on real-world networks provided very intuitive results; e.g., the trophic levels obtained from our approach on a food web were highly consistent with former results from ecology.
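
    Evaluating a random walk "by subsequent multiplications of the transition matrix" looks like the following on two toy directed graphs, a chain and a star. This shows only the walk machinery the abstract refers to, not the full hierarchy measure; the graphs and absorbing conventions are illustrative.

```python
import numpy as np

# Directed chain 0 -> 1 -> 2, with node 2 absorbing (row-stochastic).
chain = np.array([[0., 1., 0.],
                  [0., 0., 1.],
                  [0., 0., 1.]])

# Directed star: leaves 1 and 2 point to hub 0, hub absorbing.
star = np.array([[1., 0., 0.],
                 [1., 0., 0.],
                 [1., 0., 0.]])

def walk(P, p0, steps):
    p = p0.copy()
    for _ in range(steps):
        p = p @ P        # one matrix multiplication = one refinement step
    return p

uniform = np.full(3, 1 / 3)
p_chain = walk(chain, uniform, 10)
p_star  = walk(star, uniform, 10)

assert np.allclose(p_chain.sum(), 1.0) and np.allclose(p_star.sum(), 1.0)
# The chain funnels probability through an intermediate level, while the
# star collapses onto its hub in a single step.
assert np.allclose(p_chain, [0, 0, 1])
assert np.allclose(walk(star, uniform, 1), [1, 0, 0])
```

    Because each step is a single matrix product, the distribution, and hence any functional of it, can be computed to whatever precision the application needs.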

  19. Combining deep residual neural network features with supervised machine learning algorithms to classify diverse food image datasets.

    PubMed

    McAllister, Patrick; Zheng, Huiru; Bond, Raymond; Moorhead, Anne

    2018-04-01

    Obesity is increasing worldwide and can cause many chronic conditions such as type-2 diabetes, heart disease, sleep apnea, and some cancers. Monitoring dietary intake through food logging is a key method of maintaining a healthy lifestyle to prevent and manage obesity. Computer vision methods have been applied to food logging to automate image classification for monitoring dietary intake. In this work we applied pretrained ResNet-152 and GoogleNet convolutional neural networks (CNNs), initially trained on the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) dataset with the MatConvNet package, to extract features from the food image datasets Food-5K, Food-11, RawFooT-DB, and Food-101. Deep features were extracted from the CNNs and used to train machine learning classifiers including an artificial neural network (ANN), support vector machine (SVM), Random Forest, and Naive Bayes. Results show that ResNet-152 deep features with an RBF-kernel SVM can detect food items with 99.4% accuracy on the Food-5K validation food image dataset, and with 98.8% accuracy on the Food-5K evaluation dataset using ANN, SVM-RBF, and Random Forest classifiers. Trained with ResNet-152 features, an ANN can achieve 91.34% and 99.28% accuracy when applied to the Food-11 and RawFooT-DB food image datasets respectively, and an RBF-kernel SVM can achieve 64.98% on the Food-101 image dataset. This research makes clear that deep CNN features can be used efficiently for diverse food item image classification. The work presented here shows that pretrained ResNet-152 features provide sufficient generalisation power when applied to a range of food image classification tasks. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Classification of epileptic EEG signals based on simple random sampling and sequential feature selection.

    PubMed

    Ghayab, Hadi Ratham Al; Li, Yan; Abdulla, Shahab; Diykh, Mohammed; Wan, Xiangkui

    2016-06-01

    Electroencephalogram (EEG) signals are used broadly in the medical field. The main applications of EEG signals are the diagnosis and treatment of diseases such as epilepsy, Alzheimer's disease, and sleep disorders. This paper presents a new method that extracts and selects features from multi-channel EEG signals. The research focuses on three main points. Firstly, the simple random sampling (SRS) technique is used to extract features from the time domain of the EEG signals. Secondly, the sequential feature selection (SFS) algorithm is applied to select the key features and to reduce the dimensionality of the data. Finally, the selected features, extracted by SRS and selected by SFS, are forwarded to a least squares support vector machine (LS_SVM) classifier to classify the EEG signals. The experimental results show that the method achieves 99.90, 99.80 and 100 % for classification accuracy, sensitivity and specificity, respectively.
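
    The SRS feature step, drawing random sample windows from each channel and summarising them with time-domain statistics, can be sketched as below. The window counts, window size, the particular statistics, and the synthetic signal are all illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(2)

channels, length = 4, 4096
eeg = rng.normal(size=(channels, length))    # stand-in EEG recording

n_windows, window = 10, 256
features = []
for ch in eeg:
    # Simple random sampling: draw random window start positions.
    starts = rng.integers(0, length - window, size=n_windows)
    wins = np.stack([ch[s:s + window] for s in starts])
    # Per-channel feature vector: mean, std, min, max of each window.
    features.append(np.concatenate([wins.mean(1), wins.std(1),
                                    wins.min(1), wins.max(1)]))
features = np.stack(features)

assert features.shape == (channels, 4 * n_windows)
```

    A sequential feature selector would then greedily keep the subset of these columns that best separates the classes before the classifier is trained.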

  1. Framework for cascade size calculations on random networks

    NASA Astrophysics Data System (ADS)

    Burkholz, Rebekka; Schweitzer, Frank

    2018-04-01

    We present a framework to calculate the cascade size evolution for a large class of cascade models on random network ensembles in the limit of infinite network size. Our method is exact and applies to network ensembles with almost arbitrary degree distribution, degree-degree correlations, and, in case of threshold models, for arbitrary threshold distribution. With our approach, we shift the perspective from the known branching process approximations to the iterative update of suitable probability distributions. Such distributions are key to capture cascade dynamics that involve possibly continuous quantities and that depend on the cascade history, e.g., if load is accumulated over time. As a proof of concept, we provide two examples: (a) constant load models that cover many of the analytically tractable cascade models, and, as a highlight, (b) a fiber bundle model that was not tractable by branching process approximations before. Our derivations cover the whole cascade dynamics, not only their steady state. This allows us to include interventions in time or further model complexity in the analysis.
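
    For contrast with the exact distribution-iteration approach described above, here is the Monte-Carlo baseline it replaces: a constant-load threshold cascade simulated on a single Erdős–Rényi graph. The network size, threshold, and seed set are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Erdős–Rényi graph: n nodes, mean degree ~8 (illustrative).
n, p = 500, 8 / 500
upper = np.triu(rng.random((n, n)) < p, 1)
A = (upper | upper.T).astype(float)       # undirected adjacency, no loops

theta = 0.3                               # failure threshold: fraction of
                                          # failed neighbours (illustrative)
failed = np.zeros(n, dtype=bool)
failed[:5] = True                         # initial seed failures

deg = np.maximum(A.sum(axis=1), 1.0)
while True:
    frac = (A @ failed) / deg             # failed-neighbour fraction
    new = failed | (frac >= theta)        # failure is irreversible
    if (new == failed).all():
        break                             # fixed point reached
    failed = new

cascade_size = int(failed.sum())
assert 5 <= cascade_size <= n
```

    A simulation like this must be averaged over many graph samples and converges slowly; iterating the corresponding probability distributions, as the paper does, gives the infinite-size answer exactly.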

  2. University Students' Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics.

    PubMed

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution, the central, unifying, and overarching theme in biology. Aspects strongly related to abstract "threshold" concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument for assessing students' conceptual knowledge of randomness and probability in the context of evolution. To address this problem, we have developed two instruments, the Randomness and Probability Test in the Context of Evolution (RaProEvo) and the Randomness and Probability Test in the Context of Mathematics (RaProMath), which include both multiple-choice and free-response items. The instruments were administered to 140 university students in Germany; the Rasch partial-credit model was then applied to assess them. The results indicate that the instruments generate reliable and valid inferences about students' conceptual knowledge of randomness and probability in the two contexts (which are separable competencies). Furthermore, RaProEvo detected significant differences in knowledge of randomness and probability, as well as evolutionary theory, between biology majors and preservice biology teachers. © 2017 D. Fiedler et al. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  3. Optical image encryption by random shifting in fractional Fourier domains

    NASA Astrophysics Data System (ADS)

    Hennelly, B.; Sheridan, J. T.

    2003-02-01

    A number of methods have recently been proposed in the literature for the encryption of two-dimensional information by use of optical systems based on the fractional Fourier transform. Typically, these methods require random phase screen keys for decrypting the data, which must be stored at the receiver and must be carefully aligned with the received encrypted data. A new technique based on a random shifting, or jigsaw, algorithm is proposed. This method does not require the use of phase keys. The image is encrypted by juxtaposition of sections of the image in fractional Fourier domains. The new method has been compared with existing methods and shows comparable or superior robustness to blind decryption. Optical implementation is discussed, and the sensitivity of the various encryption keys to blind decryption is examined.
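
    The jigsaw idea, the key is the seed of a block permutation rather than a phase screen, can be shown in a single (image) domain. The optical scheme applies such shuffles in several fractional Fourier domains; this sketch uses one spatial domain only, with illustrative block and image sizes.

```python
import numpy as np

img = np.random.default_rng(0).random((8, 8))

def tile_coords(shape, block):
    h, w = shape
    return [(i, j) for i in range(0, h, block) for j in range(0, w, block)]

def jigsaw(x, seed, block=2):
    """Shuffle blocks; the permutation seed acts as the encryption key."""
    coords = tile_coords(x.shape, block)
    perm = np.random.default_rng(seed).permutation(len(coords))
    out = np.empty_like(x)
    for k, (i, j) in enumerate(coords):
        si, sj = coords[perm[k]]
        out[i:i + block, j:j + block] = x[si:si + block, sj:sj + block]
    return out

def unjigsaw(y, seed, block=2):
    """Invert the shuffle using the same seed."""
    coords = tile_coords(y.shape, block)
    perm = np.random.default_rng(seed).permutation(len(coords))
    out = np.empty_like(y)
    for k, (i, j) in enumerate(coords):
        si, sj = coords[perm[k]]
        out[si:si + block, sj:sj + block] = y[i:i + block, j:j + block]
    return out

enc = jigsaw(img, seed=1234)
assert np.allclose(unjigsaw(enc, seed=1234), img)      # correct key
assert not np.allclose(unjigsaw(enc, seed=9999), img)  # wrong key fails
```

    Chaining such permutations across successive fractional Fourier domains, as the paper proposes, makes the shuffle act on both space and spatial frequency, which is what removes the need for stored phase keys.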

  4. Pseudo-random number generator for the Sigma 5 computer

    NASA Technical Reports Server (NTRS)

    Carroll, S. N.

    1983-01-01

    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
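The generator form described, a multiplicative linear congruential generator with a prime modulus and a primitive-root multiplier, can be sketched as follows. The constants shown (m = 2^31 - 1 with primitive root 16807) are the classic MINSTD choice used here for illustration, not the Sigma 5 word-length values from the report:

```python
def lehmer(seed, a=16807, m=2**31 - 1):
    """Multiplicative LCG: x_{n+1} = (a * x_n) mod m. With m prime and a a
    primitive root mod m, the state cycles through all of 1..m-1."""
    x = seed % m
    while True:
        x = (a * x) % m
        yield x / m  # uniform variate in (0, 1)

gen = lehmer(seed=1)
first = next(gen)  # = 16807 / (2**31 - 1)
```

Choosing a different acceptable primitive root for the same prime, as the table in the report suggests, yields an independent stream of the same period.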

  5. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a well-known quantum computation model. We find that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems, with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation owing to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  6. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    PubMed Central

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a well-known quantum computation model. We find that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems, with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation owing to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information. PMID:26823196

  7. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption.

    PubMed

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-29

    Quantum information and quantum computation have achieved huge success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a well-known quantum computation model. We find that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems, with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation owing to its inherent chaotic dynamics. Further, we discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  8. Spinal Cord Stimulation Modulates Gene Expression in the Spinal Cord of an Animal Model of Peripheral Nerve Injury.

    PubMed

    Tilley, Dana M; Cedeño, David L; Kelley, Courtney A; Benyamin, Ramsin; Vallejo, Ricardo

    Previously, we found that application of pulsed radiofrequency to a peripheral nerve injury induces changes in key genes regulating nociception, concurrent with alleviation of paw sensitivity, in an animal model. In the current study, we evaluated such genes after applying spinal cord stimulation (SCS) therapy. Male Sprague-Dawley rats (n = 6 per group) were randomized into test and control groups. The spared nerve injury model was used to simulate a neuropathic pain state. A 4-contact microelectrode was implanted at the L1 vertebral level and SCS was applied continuously for 72 hours. Mechanical hyperalgesia was tested. Spinal cord tissues were collected and analyzed using real-time polymerase chain reaction to quantify levels of IL1β, GABAbr1, subP, Na/K ATPase, cFos, 5HT3ra, TNFα, Gal, VIP, NpY, IL6, GFAP, ITGAM, and BDNF. Paw withdrawal thresholds significantly decreased in spared nerve injury animals, and stimulation attenuated sensitivity within 24 hours (P = 0.049), remaining significant through 72 hours (P = 0.003). Nerve injury caused up-regulation of TNFα, GFAP, ITGAM, and cFos as well as down-regulation of Na/K ATPase. Spinal cord stimulation therapy modulated the expression of 5HT3ra, cFos, and GABAbr1. Strong inverse relationships between gene expression and the amount of applied current were observed for GABAbr1 (R = -0.65) and Na/K ATPase (R = -0.58), and positive linear correlations were observed for 5HT3ra (R = 0.80) and VIP (R = 0.50). Continuously applied SCS modulates expression of key genes involved in the regulation of neuronal membrane potential.

  9. Source-Independent Quantum Random Number Generation

    NASA Astrophysics Data System (ADS)

    Cao, Zhu; Zhou, Hongyi; Yuan, Xiao; Ma, Xiongfeng

    2016-01-01

    Quantum random number generators can provide genuine randomness by appealing to the fundamental principles of quantum mechanics. In general, a physical generator contains two parts—a randomness source and its readout. The source is essential to the quality of the resulting random numbers; hence, it needs to be carefully calibrated and modeled to achieve information-theoretical provable randomness. However, in practice, the source is a complicated physical system, such as a light source or an atomic ensemble, and any deviations in the real-life implementation from the theoretical model may affect the randomness of the output. To close this gap, we propose a source-independent scheme for quantum random number generation in which output randomness can be certified, even when the source is uncharacterized and untrusted. In our randomness analysis, we make no assumptions about the dimension of the source. For instance, multiphoton emissions are allowed in optical implementations. Our analysis takes into account the finite-key effect with the composable security definition. In the limit of large data size, the length of the input random seed is exponentially small compared to that of the output random bits. In addition, by modifying a quantum key distribution system, we experimentally demonstrate our scheme and achieve a randomness generation rate of over 5 × 10^3 bit/s.
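The role of a short input seed certifying a long output is played by a seeded randomness extractor in the post-processing stage. A toy sketch of Toeplitz hashing, a standard extractor construction in QRNG post-processing (illustrative only; the paper's finite-key security analysis is not reproduced here):

```python
import random

def toeplitz_extract(raw_bits, m, seed):
    """Compress n raw bits into m output bits with a seeded binary Toeplitz
    matrix T, where T[i][j] = diag[i - j + n - 1] (constant along diagonals),
    so only n + m - 1 seed bits are needed to define the whole matrix."""
    n = len(raw_bits)
    rng = random.Random(seed)
    diag = [rng.randrange(2) for _ in range(n + m - 1)]
    out = []
    for i in range(m):
        acc = 0
        for j in range(n):
            acc ^= diag[i - j + n - 1] & raw_bits[j]  # GF(2) matrix-vector product
        out.append(acc)
    return out

raw = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]  # imperfect raw bits
key = toeplitz_extract(raw, m=4, seed=7)                 # 4 extracted bits
```

The compression ratio m/n would in practice be set by the certified min-entropy of the source, which is what the source-independent analysis bounds.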

  10. Sensitivity of Polar Stratospheric Ozone Loss to Uncertainties in Chemical Reaction Kinetics

    NASA Technical Reports Server (NTRS)

    Kawa, S. Randolph; Stolarski, Richard S.; Douglass, Anne R.; Newman, Paul A.

    2008-01-01

    Several recent observational and laboratory studies of processes involved in polar stratospheric ozone loss have prompted a reexamination of aspects of our understanding of this key indicator of global change. To a large extent, our confidence in understanding and projecting changes in polar and global ozone is based on our ability to simulate these processes in numerical models of chemistry and transport. These models depend on laboratory-measured kinetic reaction rates and photolysis cross sections to simulate molecular interactions. In this study we use a simple box-model scenario for Antarctic ozone to estimate the uncertainty in loss attributable to known reaction-kinetic uncertainties. Following the method of earlier work, rates and uncertainties from the latest laboratory evaluation are applied in random combinations. We determine the key reactions and rates contributing the largest potential errors and compare the results to observations to evaluate which combinations are consistent with atmospheric data. Implications for our theoretical and practical understanding of polar ozone loss are assessed.
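The random-combination approach can be sketched as a Monte Carlo propagation of rate uncertainties through a model. All reaction names, rates, and uncertainty factors below are placeholders for illustration, not values from the laboratory evaluation:

```python
import random

# Placeholder reaction rates (cm^3 s^-1) with 1-sigma uncertainty factors
rates = {
    "ClO+ClO": (1.0e-12, 1.3),
    "ClO+BrO": (5.0e-12, 1.2),
}

def sample_rate(k, f, rng):
    # Perturb k by its uncertainty factor f: k' = k * f**z with z ~ N(0, 1)
    return k * f ** rng.gauss(0.0, 1.0)

def loss_proxy(rng):
    # Toy proxy: ozone loss scales with the summed perturbed cycle rates;
    # a real study would run the box model with each random combination
    return sum(sample_rate(k, f, rng) for k, f in rates.values())

rng = random.Random(42)
samples = [loss_proxy(rng) for _ in range(1000)]
spread = max(samples) / min(samples)  # loss range across random combinations
```

Comparing each sampled combination's output against observations is then what screens out kinetically allowed but atmospherically inconsistent combinations.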

  11. Optical design of cipher block chaining (CBC) encryption mode by using digital holography

    NASA Astrophysics Data System (ADS)

    Gil, Sang Keun; Jeon, Seok Hee; Jung, Jong Rae; Kim, Nam

    2016-03-01

    We propose an optical design of cipher block chaining (CBC) encryption using the digital holographic technique, which has higher security than the conventional electronic method because of the analog-type randomized cipher text with a 2-D array. In this paper, an optical design of the CBC encryption mode is implemented by a 2-step quadrature phase-shifting digital holographic encryption technique using orthogonal polarization. A block of plain text is encrypted with the encryption key by applying 2-step phase-shifting digital holography, and it is changed into cipher text blocks which are digital holograms. These ciphered digital holograms with the encrypted information are Fourier transform holograms and are recorded on CCDs with intensities quantized to 256 gray levels. The decryption is computed from these encrypted digital holograms of the cipher texts, the same encryption key and the previous cipher text. Results of computer simulations are presented to verify that the proposed method is feasible for a highly secure CBC encryption system.
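For reference, the chaining rule that the optical system implements is the standard CBC construction: each plaintext block is XORed with the previous ciphertext block before encryption. A sketch with a toy (insecure) XOR block cipher standing in for the 2-step phase-shifting holographic encryption:

```python
def toy_block_cipher(block, key):
    # Stand-in for the optical per-block encryption (plain XOR; NOT secure)
    return bytes(b ^ k for b, k in zip(block, key))

def cbc_encrypt(blocks, key, iv):
    prev, out = iv, []
    for block in blocks:
        # Chain: XOR the plaintext block with the previous cipher block
        mixed = bytes(b ^ p for b, p in zip(block, prev))
        prev = toy_block_cipher(mixed, key)
        out.append(prev)
    return out

def cbc_decrypt(cipher_blocks, key, iv):
    prev, out = iv, []
    for c in cipher_blocks:
        mixed = toy_block_cipher(c, key)  # the XOR cipher is its own inverse
        out.append(bytes(m ^ p for m, p in zip(mixed, prev)))
        prev = c
    return out

key, iv = b"secretky", b"initvect"
blocks = [b"block_01", b"block_02"]
cipher = cbc_encrypt(blocks, key, iv)
```

As the abstract notes, decryption of each block needs the key plus the previous cipher text, which is exactly the `prev` variable threaded through above.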

  12. Simulating of the measurement-device independent quantum key distribution with phase randomized general sources

    PubMed Central

    Wang, Qin; Wang, Xiang-Bin

    2014-01-01

    We present a model for simulating measurement-device-independent quantum key distribution (MDI-QKD) with phase-randomized general sources. It can be used to predict experimental observations of an MDI-QKD with linear channel loss, simulating corresponding values for the gains, the error rates in different bases, and also the final key rates. Our model is applicable to MDI-QKD with an arbitrary probabilistic mixture of different photon states or using any coding scheme. Therefore, it is useful in characterizing and evaluating the performance of the MDI-QKD protocol, making it a valuable tool in studying quantum key distribution. PMID:24728000

  13. Stretchable Random Lasers with Tunable Coherent Loops.

    PubMed

    Sun, Tzu-Min; Wang, Cih-Su; Liao, Chi-Shiun; Lin, Shih-Yao; Perumal, Packiyaraj; Chiang, Chia-Wei; Chen, Yang-Fang

    2015-12-22

    Stretchability represents a key feature for the emerging world of realistic applications in areas including wearable gadgets, health monitors, and robotic skins. Many optical and electronic technologies that can respond to large strain deformations have been developed. The laser has played a very important role in daily life since its discovery, and laser action is highly desirable for the development of stretchable devices. Herein, stretchable random lasers with tunable coherent loops are designed, fabricated, and demonstrated. To illustrate our working principle, the stretchable random laser is made possible by transferring unique ZnO nanobrushes onto a polydimethylsiloxane (PDMS) elastomer substrate. Unlike the traditional gain material of ZnO nanorods, ZnO nanobrushes were used as optical gain materials because they can also serve as scattering centers and provide the Fabry-Perot cavity needed to enhance laser action. The stretchable PDMS substrate gives the degree of freedom to mechanically tune the coherent loops of the random laser action by changing the density of ZnO nanobrushes. It is found that the number of laser modes increases with increasing external strain applied to the PDMS substrate, due to the enhanced possibility of forming coherent loops. The device can be stretched by up to 30% strain and subjected to more than 100 cycles without loss of laser action. The result represents a major advance toward the further development of man-made smart stretchable devices.

  14. Noise in two-color electronic distance meter measurements revisited

    USGS Publications Warehouse

    Langbein, J.

    2004-01-01

    Frequent, high-precision geodetic data have temporally correlated errors. Temporal correlations directly affect both the estimate of rate and its standard error; the rate of deformation is a key product from geodetic measurements made in tectonically active areas. Various models of temporally correlated errors are developed and these provide relations between the power spectral density and the data covariance matrix. These relations are applied to two-color electronic distance meter (EDM) measurements made frequently in California over the past 15-20 years. Previous analysis indicated that these data have significant random walk error. Analysis using the noise models developed here indicates that the random walk model is valid for about 30% of the data. A second 30% of the data can be better modeled with power law noise with a spectral index between 1 and 2, while another 30% of the data can be modeled with a combination of band-pass-filtered plus random walk noise. The remaining 10% of the data can be best modeled as a combination of band-pass-filtered plus power law noise. This band-pass-filtered noise is a product of an annual cycle that leaks into adjacent frequency bands. For time spans of more than 1 year these more complex noise models indicate that the precision in rate estimates is better than that inferred by just the simpler, random walk model of noise.
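Why the noise model matters for rate estimates can be illustrated numerically: least-squares slopes fit to random-walk noise (spectral index 2) scatter far more than slopes fit to white noise of the same increment size. A self-contained sketch of this effect (illustrative only, not the paper's maximum-likelihood covariance analysis):

```python
import random

def white_noise(n, sigma, rng):
    return [rng.gauss(0.0, sigma) for _ in range(n)]

def random_walk(n, sigma, rng):
    # Random walk = cumulative sum of white-noise increments (spectral index 2)
    out, s = [], 0.0
    for _ in range(n):
        s += rng.gauss(0.0, sigma)
        out.append(s)
    return out

def slope(series):
    # Ordinary least-squares rate estimate against the sample index
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def rms(values):
    return (sum(v * v for v in values) / len(values)) ** 0.5

rng = random.Random(0)
n, trials = 500, 200
wn_scatter = rms([slope(white_noise(n, 1.0, rng)) for _ in range(trials)])
rw_scatter = rms([slope(random_walk(n, 1.0, rng)) for _ in range(trials)])
# rw_scatter is far larger: rate uncertainty is badly underestimated
# if temporally correlated noise is treated as white noise
```

The band-pass-filtered and general power-law models in the paper sit between these two extremes, which is why model selection changes the inferred rate precision.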

  15. A Multivariate Randomization Test of Association Applied to Cognitive Test Results

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert; Beard, Bettina

    2009-01-01

    Randomization tests provide a conceptually simple, distribution-free way to implement significance testing. We have applied this method to the problem of evaluating the significance of the association among a number (k) of variables. The randomization method was the random re-ordering of k-1 of the variables. The criterion variable was the value of the largest eigenvalue of the correlation matrix.
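The procedure described (randomly re-order k-1 of the variables, recompute the largest eigenvalue of the correlation matrix, and compare with the observed value) can be sketched as follows. The example data, permutation count, and power-iteration eigenvalue solver are illustrative choices, not details from the paper:

```python
import math
import random

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def largest_eig(mat, iters=200):
    # Power iteration; correlation matrices are symmetric and PSD
    k = len(mat)
    v = [1.0] * k
    for _ in range(iters):
        w = [sum(mat[i][j] * v[j] for j in range(k)) for i in range(k)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return sum(v[i] * sum(mat[i][j] * v[j] for j in range(k)) for i in range(k))

def randomization_test(data, n_perm=500, seed=0):
    # data: k equal-length variables; criterion = largest eigenvalue of corr matrix
    rng = random.Random(seed)
    stat = lambda vs: largest_eig([[corr(a, b) for b in vs] for a in vs])
    observed = stat(data)
    # Null distribution: randomly re-order k-1 of the variables
    exceed = sum(
        stat([data[0]] + [rng.sample(v, len(v)) for v in data[1:]]) >= observed
        for _ in range(n_perm)
    )
    return observed, (exceed + 1) / (n_perm + 1)

rng = random.Random(1)
x = [rng.gauss(0, 1) for _ in range(30)]
y = [a + rng.gauss(0, 0.3) for a in x]
z = [a + rng.gauss(0, 0.3) for a in x]
observed, p_value = randomization_test([x, y, z], n_perm=200, seed=2)
```

With strongly associated variables the observed eigenvalue approaches k, while permuted eigenvalues stay near 1, yielding a small p-value without any distributional assumptions.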

  16. Physical Unclonable Function Hardware Keys Utilizing Kirchhoff-Law Secure Key Exchange and Noise-Based Logic

    NASA Astrophysics Data System (ADS)

    Kish, Laszlo B.; Kwan, Chiman

    A weak physically unclonable function (PUF) encryption key is one that the manufacturer of the hardware can clone, but nobody else can. A strong PUF encryption key is one that even the manufacturer of the hardware is unable to clone. In this paper, we first introduce an "ultra" strong PUF with intrinsic dynamical randomness, which is not only unclonable but also gets renewed to an independent key (with fresh randomness) during each use via unconditionally secure key exchange. The solution utilizes the Kirchhoff-law-Johnson-noise (KLJN) method for dynamical key renewal and a one-time-pad secure key for the challenge/response process. The secure key is stored in a flash memory on the chip to provide tamper resistance and nonvolatile storage with zero power requirements in standby mode. Simplified PUF keys are also shown: a strong PUF utilizing the KLJN protocol during the first run and the noise-based logic (NBL) hyperspace vector string verification method for the challenge/response during the rest of its life or until it is re-initialized. Finally, the simplest PUF utilizes NBL without KLJN; thus it can be cloned by the manufacturer but not by anybody else.

  17. Analyzing Randomized Controlled Interventions: Three Notes for Applied Linguists

    ERIC Educational Resources Information Center

    Vanhove, Jan

    2015-01-01

    I discuss three common practices that obfuscate or invalidate the statistical analysis of randomized controlled interventions in applied linguistics. These are (a) checking whether randomization produced groups that are balanced on a number of possibly relevant covariates, (b) using repeated measures ANOVA to analyze pretest-posttest designs, and…

  18. Information verification cryptosystem using one-time keys based on double random phase encoding and public-key cryptography

    NASA Astrophysics Data System (ADS)

    Zhao, Tieyu; Ran, Qiwen; Yuan, Lin; Chi, Yingying; Ma, Jing

    2016-08-01

    A novel image encryption system based on double random phase encoding (DRPE) and the RSA public-key algorithm is proposed. The main characteristic of the system is that each encryption process produces a new decryption key (even for the same plaintext); thus the encryption system conforms to the feature of one-time-pad (OTP) cryptography. The other characteristic of the system is the use of a fingerprint key. Only with rightful authorization will the true decryption be obtained; otherwise the decryption will result in noisy images. So the proposed system can be used to determine whether the ciphertext has been falsified by attackers. In addition, the system conforms to the basic agreement of an asymmetric cryptosystem (ACS) due to the combination with the RSA public-key algorithm. The simulation results show that the encryption scheme has high robustness against existing attacks.

  19. A Short History of Randomized Experiments in Criminology: A Meager Feast.

    ERIC Educational Resources Information Center

    Farrington, David P.

    2003-01-01

    Discusses advantages of randomized experiments and key issues raised in this special issue. Focuses on growth and decrease in the use of randomized experiments by the California Youth Authority, the U.S. National Institute of Justice, and the British Home Office. Calls for increased recognition of the importance of randomized experiments. (SLD)

  20. Key-Generation Algorithms for Linear Piece In Hand Matrix Method

    NASA Astrophysics Data System (ADS)

    Tadaki, Kohtaro; Tsujii, Shigeo

    The linear Piece In Hand (PH, for short) matrix method with random variables was proposed in our former work. It is a general prescription applicable to any type of multivariate public-key cryptosystem (MPKC) for the purpose of enhancing its security. Actually, we showed, in an experimental manner, that the linear PH matrix method with random variables can certainly enhance the security of HFE against the Gröbner basis attack, where HFE is one of the major variants of MPKCs. In 1998, Patarin, Goubin, and Courtois introduced the plus method as a general prescription which aims to enhance the security of any given MPKC, just like the linear PH matrix method with random variables. In this paper we prove the equivalence between the plus method and the primitive linear PH matrix method, which was introduced in our previous work to explain the notion of the PH matrix method in general in an illustrative manner, and not for practical use in enhancing the security of any given MPKC. Based on this equivalence, we show that the linear PH matrix method with random variables has a substantial advantage over the plus method with respect to security enhancement. In the linear PH matrix method with random variables, three matrices, including the PH matrix, play a central role in the secret key and public key. In this paper, we clarify how to generate these matrices and thus present two probabilistic polynomial-time algorithms to generate them. In particular, the second one has a concise form and is obtained as a byproduct of the proof of the equivalence between the plus method and the primitive linear PH matrix method.

  1. Novel image encryption algorithm based on multiple-parameter discrete fractional random transform

    NASA Astrophysics Data System (ADS)

    Zhou, Nanrun; Dong, Taiji; Wu, Jianhua

    2010-08-01

    A new method of digital image encryption is presented by utilizing a new multiple-parameter discrete fractional random transform. Image encryption and decryption are performed based on the index additivity and multiple parameters of the multiple-parameter fractional random transform. The plaintext and ciphertext are respectively in the spatial domain and in the fractional domain determined by the encryption keys. The proposed algorithm can resist statistical analyses effectively. The computer simulation results show that the proposed encryption algorithm is sensitive to the multiple keys and that it has considerable robustness, noise immunity and security.

  2. Biometrics based key management of double random phase encoding scheme using error control codes

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2013-08-01

    In this paper, an optical security system is proposed in which the key of the double random phase encoding technique is linked to the biometrics of the user to make it user specific. The error in recognition due to biometric variation is corrected by encoding the key using a BCH code. A user-specific shuffling key is used to increase the separation between the genuine and impostor Hamming distance distributions. This shuffling key is then further secured using RSA public-key encryption to enhance the security of the system. An XOR operation is performed between the encoded key and the feature vector obtained from the biometrics. The RSA-encoded shuffling key and the data obtained from the XOR operation are stored in a token. The main advantage of the present technique is that key retrieval is possible only in the simultaneous presence of the token and the biometrics of the user, which not only authenticates the presence of the original input but also secures the key of the system. Computational experiments showed the effectiveness of the proposed technique for key retrieval in the decryption process by using the live biometrics of the user.
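The encode-XOR-store step is the classic fuzzy-commitment construction. A minimal sketch in which a simple repetition code stands in for the paper's BCH code, and the shuffling-key and RSA layers are omitted:

```python
def repeat_encode(bits, r=3):
    # Stand-in for the paper's BCH encoder: r-fold repetition code
    return [b for b in bits for _ in range(r)]

def repeat_decode(bits, r=3):
    # Majority vote per r-bit group corrects up to (r - 1) // 2 bit flips
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

def bind(key_bits, feature_bits):
    # XOR the encoded key with the enrollment biometric; result goes in the token
    return [c ^ f for c, f in zip(repeat_encode(key_bits), feature_bits)]

def retrieve(token, fresh_feature_bits):
    # XOR with a fresh biometric reading, then error-correct the noisy codeword
    noisy_code = [t ^ f for t, f in zip(token, fresh_feature_bits)]
    return repeat_decode(noisy_code)

key = [1, 0, 1, 1]
enroll = [1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1]   # enrollment feature vector
token = bind(key, enroll)
query = list(enroll)
query[4] ^= 1                                    # one bit of biometric variation
recovered = retrieve(token, query)
```

The token alone reveals neither the key nor the biometric; only a sufficiently close fresh reading lets the code correct the residual errors and release the key.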

  3. Bayesian dynamic modeling of time series of dengue disease case counts

    PubMed Central

    López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-01-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random-walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random-walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results showed that the best model included first-order random-walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results requires a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage error for one- or two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. 
The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health. PMID:28671941

  4. A cluster randomized trial to assess the effect of clinical pathways for patients with stroke: results of the clinical pathways for effective and appropriate care study

    PubMed Central

    2012-01-01

    Background Clinical pathways (CPs) are used to improve the outcomes of acute stroke, but their use in stroke care is questionable because the evidence on their effectiveness is still inconclusive. The objective of this study was to evaluate whether CPs improve the outcomes and the quality of care provided to patients after acute ischemic stroke. Methods This was a multicentre cluster-randomized trial, in which 14 hospitals were randomized to the CP arm or to the non-intervention/usual care (UC) arm. Healthcare workers in the CP arm received 3 days of training in quality improvement of CPs and in use of a standardized package including information on evidence-based key interventions and indicators. Healthcare workers in the usual-care arm followed their standard procedures. The teams in the CP arm developed their CPs over a 6-month period. The primary end point was mortality. Secondary end points were: use of diagnostic and therapeutic procedures, implementation of organized care, length of stay, re-admission and institutionalization rates after discharge, dependency levels, and complication rates. Results Compared with the patients in the UC arm, the patients in the CP arm had a significantly lower risk of mortality at 7 days (OR = 0.10; 95% CI 0.01 to 0.95) and significantly lower rates of adverse functional outcomes, expressed as the odds of not returning to pre-stroke functioning in daily life (OR = 0.42; 95% CI 0.18 to 0.98). There was no significant effect on 30-day mortality. Compared with the UC arm, hospital diagnostic and therapeutic procedures were performed more appropriately in the CP arm, and the evidence-based key interventions and organized care were applied more consistently in the CP arm. Conclusions CPs can significantly improve the outcomes of patients with ischemic stroke, indicating better application of evidence-based key interventions and of diagnostic and therapeutic procedures. 
This study tested a new hypothesis and provided evidence on how CPs can work. Trial registration ClinicalTrials.gov ID: [NCT00673491]. PMID:22781160

  5. Cryptosystem based on two-step phase-shifting interferometry and the RSA public-key encryption algorithm

    NASA Astrophysics Data System (ADS)

    Meng, X. F.; Peng, X.; Cai, L. Z.; Li, A. M.; Gao, Z.; Wang, Y. R.

    2009-08-01

    A hybrid cryptosystem is proposed, in which one image is encrypted to two interferograms with the aid of double random-phase encoding (DRPE) and two-step phase-shifting interferometry (2-PSI), then three pairs of public-private keys are utilized to encode and decode the session keys (geometrical parameters, the second random-phase mask) and interferograms. In the stage of decryption, the ciphered image can be decrypted by wavefront reconstruction, inverse Fresnel diffraction, and real amplitude normalization. This approach can successfully solve the problem of key management and dispatch, resulting in increased security strength. The feasibility of the proposed cryptosystem and its robustness against some types of attack are verified and analyzed by computer simulations.

  6. On the design of henon and logistic map-based random number generator

    NASA Astrophysics Data System (ADS)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements in a cryptosystem. The True Random Number Generator (TRNG) is one approach to generating the key sequence. The randomness sources of TRNGs are divided into three main groups: electrical-noise based, jitter based and chaos based. The chaos-based group utilizes a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed and then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as the harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated a random bit sequence with a high entropy value that passed all NIST SP 800-22 statistical tests.
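The structure described, a 2D and a 1D discrete-time chaotic system combined through a comparator harvester, can be sketched as below. This is a deterministic software illustration of the architecture only (a genuine TRNG needs a physical entropy source), and the specific maps, parameters, and initial conditions are illustrative assumptions, with the Henon map standing in for the 2D system:

```python
def henon_step(x, y, a=1.4, b=0.3):
    # 2D discrete-time chaotic system (Henon map, classic parameters)
    return 1.0 - a * x * x + y, b * x

def logistic_step(z, r=3.99):
    # 1D discrete-time chaotic system (logistic map)
    return r * z * (1.0 - z)

def chaotic_bits(n, x=0.1, y=0.1, z=0.4):
    """Comparator harvester: emit 1 whenever the 2D system's x-state exceeds
    the 1D system's state, 0 otherwise."""
    bits = []
    for _ in range(n):
        x, y = henon_step(x, y)
        z = logistic_step(z)
        bits.append(1 if x > z else 0)
    return bits

bits = chaotic_bits(1000)
```

The comparator converts the two real-valued chaotic trajectories into a bit stream, which is the raw sequence that the NIST SP 800-22 suite would then be run against.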

  7. Exploring diversity in ensemble classification: Applications in large area land cover mapping

    NASA Astrophysics Data System (ADS)

    Mellor, Andrew; Boukir, Samia

    2017-07-01

    Ensemble classifiers, such as random forests, are now commonly applied in the field of remote sensing, and have been shown to perform better than single classifier systems, resulting in reduced generalisation error. Diversity across the members of ensemble classifiers is known to have a strong influence on classification performance - whereby classifier errors are uncorrelated and more uniformly distributed across ensemble members. The relationship between ensemble diversity and classification performance has not yet been fully explored in the fields of information science and machine learning and has never been examined in the field of remote sensing. This study is a novel exploration of ensemble diversity and its link to classification performance, applied to a multi-class canopy cover classification problem using random forests and multisource remote sensing and ancillary GIS data, across seven million hectares of diverse dry-sclerophyll dominated public forests in Victoria Australia. A particular emphasis is placed on analysing the relationship between ensemble diversity and ensemble margin - two key concepts in ensemble learning. The main novelty of our work is on boosting diversity by emphasizing the contribution of lower margin instances used in the learning process. Exploring the influence of tree pruning on diversity is also a new empirical analysis that contributes to a better understanding of ensemble performance. Results reveal insights into the trade-off between ensemble classification accuracy and diversity, and through the ensemble margin, demonstrate how inducing diversity by targeting lower margin training samples is a means of achieving better classifier performance for more difficult or rarer classes and reducing information redundancy in classification problems. Our findings inform strategies for collecting training data and designing and parameterising ensemble classifiers, such as random forests. 
This is particularly important in large area remote sensing applications, for which training data is costly and resource intensive to collect.
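The ensemble margin at the heart of this analysis can be sketched from random forest votes. Below is a minimal, hypothetical illustration on synthetic data (not the paper's Victorian forest dataset), assuming the common supervised margin definition: votes for the true class minus the largest vote count for any other class, normalized by ensemble size.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy multi-class data standing in for multisource remote-sensing features.
X, y = make_classification(n_samples=300, n_features=8, n_classes=3,
                           n_informative=5, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Per-tree votes for each sample: shape (n_samples, n_trees).
votes = np.stack([tree.predict(X) for tree in forest.estimators_], axis=1)

def ensemble_margin(votes_row, true_label, n_trees):
    """Supervised margin: (votes for true class - max votes for any other
    class) / total votes, in [-1, 1]. Low-margin samples are the 'difficult'
    instances the study targets to boost diversity."""
    counts = np.bincount(votes_row.astype(int), minlength=3)
    v_true = counts[true_label]
    counts[true_label] = -1          # exclude true class from the max
    return (v_true - counts.max()) / n_trees

margins = np.array([ensemble_margin(votes[i], y[i], 100)
                    for i in range(len(y))])
hard = np.argsort(margins)[:30]      # lowest-margin training samples
print(margins.min(), margins.max())
```

Emphasizing samples like `hard` during learning is the kind of margin-targeted diversity induction the abstract describes.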

  8. The Effect of Acupressure at Spleen 6 Acupuncture Point on the Anxiety Level and Sedative and Analgesics Consumption of Women during Labor: A Randomized, Single-blind Clinical Trial

    PubMed Central

    Samadi, Parisa; Alipour, Zahra; Lamyian, Minoor

    2018-01-01

Background: Labor pain is among the most severe pains women experience, and it can lead to a loss of emotional control, which plays a key role in creating a traumatic delivery experience and psychological disorders. The goal of this study was to evaluate the effect of acupressure on the anxiety level and sedative and analgesic consumption of women during labor. Materials and Methods: This study was a randomized, single-blind clinical trial performed at Maryam Hospital in Tehran, Iran. One hundred and thirty-one pregnant women in the labor ward were selected by convenience sampling. Subjects were randomly allocated to three groups: an experimental group (pressure group), control group 1 (touch group), and control group 2 (routine care group). The study data were gathered using a demographic information form, and anxiety was assessed with the Faces Anxiety Scale (FAS). For participants in the experimental group, pressure was applied to the Spleen 6 acupoint for 30 min, and for those in control group 1, only light touch was applied to the same acupoint for 30 min. The collected data were analyzed using Statistical Package for the Social Sciences 16 and descriptive statistics. Results: There was a significant difference between the three groups in the mean anxiety level at 30 min after the start of the intervention and 30 min after its termination; anxiety in the experimental group was significantly decreased (p = 0.03). Sedative and analgesic consumption was significantly lower in the experimental group than in the other groups (p = 0.006). Conclusions: This study showed that compression of the Spleen 6 acupoint is an effective complementary method for decreasing maternal anxiety and analgesic consumption, especially of pethidine. PMID:29628954

  9. Authentication and Encryption Using Modified Elliptic Curve Cryptography with Particle Swarm Optimization and Cuckoo Search Algorithm

    NASA Astrophysics Data System (ADS)

    Kota, Sujatha; Padmanabhuni, Venkata Nageswara Rao; Budda, Kishor; K, Sruthi

    2018-05-01

Elliptic Curve Cryptography (ECC) is a public-key cryptographic algorithm that uses two keys, a private key and a public key, and serves both authentication of a person and confidentiality of data. One of the keys is used in encryption and the other in decryption, depending on the usage. For authentication, the user encrypts with the private key and the public key is used to identify the user. Similarly, for confidentiality, the sender encrypts with the private key and the public key is used to decrypt the message. Choosing the private key is always an issue in public-key cryptographic algorithms such as RSA and ECC: if tiny values are chosen at random, the security of the complete algorithm becomes an issue, and since the public key is computed from the private key, values that are not chosen optimally can generate points at infinity. The proposed Modified Elliptic Curve Cryptography offers two options for this selection: using Particle Swarm Optimization or using the Cuckoo Search Algorithm to choose the random values. The proposed algorithms were developed and tested on a sample database, and both were found to be secure and reliable. The test results show that the private key is chosen optimally, being neither repetitive nor tiny, and that the public-key computation does not reach infinity.
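The private/public key relationship and the rejection of degenerate keys can be sketched on a textbook toy curve. The rejection loop below merely stands in for the paper's PSO/Cuckoo selection step (both metaheuristics are omitted), and the curve parameters are a standard teaching example, far too small for real security.

```python
import random

# Toy Weierstrass curve y^2 = x^3 + 2x + 2 over F_17 with base point
# G = (5, 1) of order 19 (a classic textbook example, not a secure curve).
P_MOD, A, G, ORDER = 17, 2, (5, 1), 19

def ec_add(P, Q):
    """Affine point addition; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                              # P + (-P) = infinity
    if P == Q:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    m %= P_MOD
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    """Double-and-add: compute k*P."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def choose_private_key(min_k=5):
    # Stand-in for the optimized selection step: reject 'tiny' keys and
    # any key mapping G to the point at infinity.
    while True:
        k = random.randrange(2, ORDER)
        if k >= min_k and scalar_mult(k, G) is not None:
            return k

k = choose_private_key()
pub = scalar_mult(k, G)    # public key = k * G
print(k, pub)
```

Multiplying G by the group order gives the point at infinity, which is the degenerate case the key-selection step must avoid.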

  10. Predicting temperate forest stand types using only structural profiles from discrete return airborne lidar

    NASA Astrophysics Data System (ADS)

    Fedrigo, Melissa; Newnham, Glenn J.; Coops, Nicholas C.; Culvenor, Darius S.; Bolton, Douglas K.; Nitschke, Craig R.

    2018-02-01

Light detection and ranging (lidar) data have been increasingly used for forest classification due to their ability to penetrate the forest canopy and provide detail about the structure of the lower strata. In this study we demonstrate forest classification approaches using airborne lidar data as inputs to random forest and linear unmixing classification algorithms. Our results demonstrated that both random forest and linear unmixing models identified a distribution of rainforest and eucalypt stands that was comparable to existing ecological vegetation class (EVC) maps based primarily on manual interpretation of high resolution aerial imagery. Rainforest stands were also identified in the region that have not previously been identified in the EVC maps. The transition between stand types was better characterised by the random forest modelling approach. In contrast, the linear unmixing model placed greater emphasis on field plots selected as endmembers, which may not have captured the variability in stand structure within a single stand type. The random forest model had the highest overall accuracy (84%) and Cohen's kappa coefficient (0.62). However, the classification accuracy was only marginally better than linear unmixing. The random forest model was applied to a region in the Central Highlands of south-eastern Australia to produce maps of stand type probability, including areas of transition (the 'ecotone') between rainforest and eucalypt forest. The resulting map provided a detailed delineation of forest classes, which specifically recognised the coalescing of stand types at the landscape scale. This represents a key step towards mapping the structural and spatial complexity of these ecosystems, which is important for both their management and conservation.

  11. 10 CFR 26.67 - Random drug and alcohol testing of individuals who have applied for authorization.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Random drug and alcohol testing of individuals who have applied for authorization. 26.67 Section 26.67 Energy NUCLEAR REGULATORY COMMISSION FITNESS FOR DUTY PROGRAMS Granting and Maintaining Authorization § 26.67 Random drug and alcohol testing of individuals who...

  12. Low frequency cabin noise reduction based on the intrinsic structural tuning concept: The theory and the experimental results, phase 2. [jet aircraft noise

    NASA Technical Reports Server (NTRS)

    Sengupta, G.

    1978-01-01

    Low frequency cabin noise and sonically induced stresses in an aircraft fuselage may be reduced by intrinsic tuning of the various structural members such as the skin, stringers, and frames and then applying damping treatments on these members. The concept is also useful in identifying the key structural resonance mechanisms controlling the fuselage response to broadband random excitation and in developing suitable damping treatments for reducing the structural response in various frequency ranges. The mathematical proof of the concept and the results of some laboratory and field tests on a group of skin-stringer panels are described. In the so-called stiffness-controlled region, the noise transmission may actually be controlled by stiffener resonances, depending upon the relationship between the natural frequencies of the skin bay and the stiffeners. Therefore, cabin noise in the stiffness-controlled region may be effectively reduced by applying damping treatments on the stiffeners.

  13. Individual Differences Methods for Randomized Experiments

    ERIC Educational Resources Information Center

    Tucker-Drob, Elliot M.

    2011-01-01

    Experiments allow researchers to randomly vary the key manipulation, the instruments of measurement, and the sequences of the measurements and manipulations across participants. To date, however, the advantages of randomized experiments to manipulate both the aspects of interest and the aspects that threaten internal validity have been primarily…

  14. Unravelling changing interspecific interactions across environmental gradients using Markov random fields.

    PubMed

    Clark, Nicholas J; Wells, Konstans; Lindberg, Oscar

    2018-05-16

Inferring interactions between co-occurring species is key to identifying processes governing community assembly. Incorporating interspecific interactions in predictive models is common in ecology, yet most methods do not adequately account for indirect interactions (where an interaction between two species is masked by their shared interactions with a third) and assume interactions do not vary along environmental gradients. Markov random fields (MRF) overcome these limitations by estimating interspecific interactions, while controlling for indirect interactions, from multispecies occurrence data. We illustrate the utility of MRFs for ecologists interested in interspecific interactions, and demonstrate how covariates can be included (a set of models known as Conditional Random Fields, CRF) to infer how interactions vary along environmental gradients. We apply CRFs to two data sets of presence-absence data. The first illustrates how blood parasite (Haemoproteus, Plasmodium, and nematode microfilaria spp.) co-infection probabilities covary with relative abundance of their avian hosts. The second shows that co-occurrences between mosquito larvae and predatory insects vary along water temperature gradients. Other applications are discussed, including the potential to identify replacement or shifting impacts of highly connected species along climate or land-use gradients. We provide tools for building CRFs and plotting/interpreting results as an R package. © 2018 by the Ecological Society of America.
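The indirect-interaction problem that MRFs address can be illustrated with a minimal nodewise-regression sketch on simulated presence-absence data (hypothetical species and effect sizes; the paper's CRF machinery, provided in its R package, is considerably richer).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 4000

# Simulated communities: species C facilitates both A and B; A and B do
# not interact directly, so their co-occurrence is an indirect association.
c = rng.binomial(1, 0.5, n)
a = rng.binomial(1, np.where(c == 1, 0.7, 0.2))
b = rng.binomial(1, np.where(c == 1, 0.7, 0.2))

# Naive pairwise association between A and B (confounded by C):
naive = np.corrcoef(a, b)[0, 1]

# MRF-style nodewise regression: A's occurrence modelled on B *and* C,
# so the A-B coefficient is conditioned on the shared neighbour.
X = np.column_stack([b, c])
coef_ab = LogisticRegression().fit(X, a).coef_[0, 0]
print(naive, coef_ab)
```

The naive correlation is clearly positive, while the conditional A-B coefficient shrinks toward zero once the shared neighbour C is controlled for.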

  15. Metabolic Profiling of Adiponectin Levels in Adults: Mendelian Randomization Analysis.

    PubMed

    Borges, Maria Carolina; Barros, Aluísio J D; Ferreira, Diana L Santos; Casas, Juan Pablo; Horta, Bernardo Lessa; Kivimaki, Mika; Kumari, Meena; Menon, Usha; Gaunt, Tom R; Ben-Shlomo, Yoav; Freitas, Deise F; Oliveira, Isabel O; Gentry-Maharaj, Aleksandra; Fourkala, Evangelia; Lawlor, Debbie A; Hingorani, Aroon D

    2017-12-01

Adiponectin, a circulating adipocyte-derived protein, has insulin-sensitizing, anti-inflammatory, antiatherogenic, and cardiomyocyte-protective properties in animal models. However, the systemic effects of adiponectin in humans are unknown. Our aims were to define the metabolic profile associated with higher blood adiponectin concentration and investigate whether variation in adiponectin concentration affects the systemic metabolic profile. We applied multivariable regression in ≤5909 adults and Mendelian randomization (using cis-acting genetic variants in the vicinity of the adiponectin gene as instrumental variables) for analyzing the causal effect of adiponectin in the metabolic profile of ≤37 545 adults. Participants were largely European from 6 longitudinal studies and 1 genome-wide association consortium. In the multivariable regression analyses, higher circulating adiponectin was associated with higher high-density lipoprotein lipids and lower very-low-density lipoprotein lipids, glucose levels, branched-chain amino acids, and inflammatory markers. However, these findings were not supported by Mendelian randomization analyses for most metabolites. Findings were consistent between sexes and after excluding high-risk groups (defined by age and occurrence of previous cardiovascular event) and 1 study with admixed population. Our findings indicate that blood adiponectin concentration is more likely to be an epiphenomenon in the context of metabolic disease than a key determinant. © 2017 The Authors.
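The contrast between the multivariable-regression and Mendelian randomization estimates can be illustrated with a toy Wald-ratio calculation on simulated data (all effect sizes are hypothetical; the study's actual analysis uses multiple cis variants and many metabolites).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Simulated cohort: a cis-acting variant shifts the exposure; the exposure
# has NO causal effect on the metabolite, but a shared confounder induces
# an observational association (all numbers invented for illustration).
g = rng.binomial(2, 0.3, n)                    # genotype, 0/1/2 alleles
u = rng.normal(size=n)                         # unobserved confounder
exposure = 0.4 * g + u + rng.normal(size=n)    # 'adiponectin'
metabolite = 0.8 * u + rng.normal(size=n)      # no causal exposure effect

def slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x)

# Observational (multivariable-regression-style) estimate, biased by u:
beta_obs = slope(exposure, metabolite)

# Wald ratio IV estimate: effect of G on outcome / effect of G on exposure.
beta_iv = slope(g, metabolite) / slope(g, exposure)
print(beta_obs, beta_iv)
```

The observational slope is clearly nonzero while the instrumented estimate is near zero, mirroring the epiphenomenon interpretation in the abstract.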

  16. Estimation of 3D shape from image orientations.

    PubMed

    Fleming, Roland W; Holtmann-Rice, Daniel; Bülthoff, Heinrich H

    2011-12-20

    One of the main functions of vision is to estimate the 3D shape of objects in our environment. Many different visual cues, such as stereopsis, motion parallax, and shading, are thought to be involved. One important cue that remains poorly understood comes from surface texture markings. When a textured surface is slanted in 3D relative to the observer, the surface patterns appear compressed in the retinal image, providing potentially important information about 3D shape. What is not known, however, is how the brain actually measures this information from the retinal image. Here, we explain how the key information could be extracted by populations of cells tuned to different orientations and spatial frequencies, like those found in the primary visual cortex. To test this theory, we created stimuli that selectively stimulate such cell populations, by "smearing" (filtering) images of 2D random noise into specific oriented patterns. We find that the resulting patterns appear vividly 3D, and that increasing the strength of the orientation signals progressively increases the sense of 3D shape, even though the filtering we apply is physically inconsistent with what would occur with a real object. This finding suggests we have isolated key mechanisms used by the brain to estimate shape from texture. Crucially, we also find that adapting the visual system's orientation detectors to orthogonal patterns causes unoriented random noise to look like a specific 3D shape. Together these findings demonstrate a crucial role of orientation detectors in the perception of 3D shape.
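The "smearing" stimulus construction can be sketched by filtering 2D noise in the Fourier domain with an oriented wedge. This is a minimal stand-in for the paper's filtering; the wedge shape and bandwidth here are illustrative guesses.

```python
import numpy as np

rng = np.random.default_rng(1)
noise = rng.standard_normal((256, 256))

def orient_filter(img, theta, bandwidth=0.2):
    """Keep only Fourier energy in a soft wedge around orientation theta,
    turning isotropic noise into an oriented ('smeared') pattern."""
    f = np.fft.fftshift(np.fft.fft2(img))
    fy, fx = np.mgrid[-128:128, -128:128]
    ang = np.arctan2(fy, fx)
    # Angular distance to the pass orientation, wrapped modulo pi so the
    # mask is symmetric under k -> -k (required for a real output image).
    d = np.abs(((ang - theta) + np.pi / 2) % np.pi - np.pi / 2)
    mask = np.exp(-(d / bandwidth) ** 2)       # soft oriented wedge
    return np.fft.ifft2(np.fft.ifftshift(f * mask)).real

pattern = orient_filter(noise, np.pi / 4)
print(pattern.shape, pattern.std())
```

Increasing `bandwidth` weakens the orientation signal, which is the kind of strength manipulation the experiments vary.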

  17. Quantum-noise randomized data encryption for wavelength-division-multiplexed fiber-optic networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corndorf, Eric; Liang Chuang; Kanter, Gregory S.

    2005-06-15

We demonstrate high-rate randomized data-encryption through optical fibers using the inherent quantum-measurement noise of coherent states of light. Specifically, we demonstrate 650 Mbit/s data encryption through a 10 Gbit/s data-bearing, in-line amplified 200-km-long line. In our protocol, legitimate users (who share a short secret key) communicate using an M-ry signal set while an attacker (who does not share the secret key) is forced to contend with the fundamental and irreducible quantum-measurement noise of coherent states. Implementations of our protocol using both polarization-encoded signal sets as well as polarization-insensitive phase-keyed signal sets are experimentally and theoretically evaluated. Different from the performance criteria for the cryptographic objective of key generation (quantum key generation), one possible set of performance criteria for the cryptographic objective of data encryption is established and carefully considered.
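The M-ry signal-set idea can be sketched numerically: a shared key selects one of M phase bases, and measurement noise that is small relative to the bit separation but large relative to the basis spacing leaves an attacker at chance. This is a heavily simplified classical simulation in the spirit of Y-00-style phase-keyed encryption, not the authors' optical implementation; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
M = 64                         # number of bases; 2M phase levels in all
n = 2000

bits = rng.integers(0, 2, n)
k = rng.integers(0, M, n)      # shared secret key stream (basis choices)

# Interleaved mapping: adjacent phase levels carry opposite bits, so an
# attacker who cannot resolve single levels learns nothing about the bit.
level = k + M * (bits ^ (k % 2))
phase = np.pi * level / M

# Measurement noise (std in radians): large compared with the level
# spacing pi/M, small compared with the bit separation pi.
noisy = phase + rng.normal(0, 0.3, n)

# Bob, knowing the key, only needs to resolve two antipodal phases:
d = (noisy - np.pi * k / M) % (2 * np.pi)
bob_raw = ((d > np.pi / 2) & (d < 3 * np.pi / 2)).astype(int)
bob_bits = bob_raw ^ (k % 2)

# Eve must resolve individual levels; noise scrambles their parity:
lvl_hat = np.round(noisy * M / np.pi).astype(int) % (2 * M)
eve_bits = (lvl_hat % 2) ^ (lvl_hat >= M).astype(int)

print((bob_bits == bits).mean(), (eve_bits == bits).mean())
```

Bob decodes essentially without error while Eve's bit estimate sits near chance, which is the asymmetry the shared key buys.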

  18. Dynamic Loads Generation for Multi-Point Vibration Excitation Problems

    NASA Technical Reports Server (NTRS)

    Shen, Lawrence

    2011-01-01

    A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. This random-force method applies random forces to the model and creates expected dynamic response in a manner that simulates the way the operating engine applies self-generated random vibration forces (random pressure acting on an area) with the resulting responses that we measure with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. This methodology would be implemented to generate the random-force spectra at excitation nodes without requiring the use of artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This invention approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.
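The core step of synthesizing a random force time history from a target PSD can be sketched as follows. This is a generic inverse-FFT synthesis with random phases; the band and level are hypothetical, and it is not the Barrett scaling itself.

```python
import numpy as np

rng = np.random.default_rng(2)

# Target one-sided force PSD: a flat 20-2000 Hz band at 1 N^2/Hz
# (hypothetical stand-in for an accelerometer-scaled environment).
fs, n = 8192.0, 16384
freqs = np.fft.rfftfreq(n, 1.0 / fs)
psd = np.where((freqs >= 20) & (freqs <= 2000), 1.0, 0.0)

# Synthesize a time history with that PSD: random phases, amplitudes
# scaled so the one-sided spectrum matches the target.
df = fs / n
amp = np.sqrt(psd * df / 2) * n
phases = rng.uniform(0, 2 * np.pi, freqs.size)
spec = amp * np.exp(1j * phases)
spec[0] = 0.0                      # zero-mean force
force = np.fft.irfft(spec, n)

# Parseval check: signal variance should match the PSD integral.
target_var = psd.sum() * df
print(force.var(), target_var)
```

Such a time history, applied at each excitation node of the finite element model, plays the role the abstract describes for the generated random forces.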

  19. Known plaintext attack on double random phase encoding using fingerprint as key and a method for avoiding the attack.

    PubMed

    Tashima, Hideaki; Takeda, Masafumi; Suzuki, Hiroyuki; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2010-06-21

    We have shown that the application of double random phase encoding (DRPE) to biometrics enables the use of biometrics as cipher keys for binary data encryption. However, DRPE is reported to be vulnerable to known-plaintext attacks (KPAs) using a phase recovery algorithm. In this study, we investigated the vulnerability of DRPE using fingerprints as cipher keys to the KPAs. By means of computational experiments, we estimated the encryption key and restored the fingerprint image using the estimated key. Further, we propose a method for avoiding the KPA on the DRPE that employs the phase retrieval algorithm. The proposed method makes the amplitude component of the encrypted image constant in order to prevent the amplitude component of the encrypted image from being used as a clue for phase retrieval. Computational experiments showed that the proposed method not only avoids revealing the cipher key and the fingerprint but also serves as a sufficiently accurate verification system.
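The DRPE scheme and the proposed constant-amplitude countermeasure can be sketched in a few lines. Here random masks stand in for the fingerprint-derived keys, and the paper's verification pipeline is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 64

# Binary 'plaintext' image and two random phase masks (the cipher keys).
img = (rng.random((N, N)) > 0.5).astype(float)
m1 = np.exp(2j * np.pi * rng.random((N, N)))   # input-plane mask
m2 = np.exp(2j * np.pi * rng.random((N, N)))   # Fourier-plane mask

def drpe_encrypt(f):
    return np.fft.ifft2(np.fft.fft2(f * m1) * m2)

def drpe_decrypt(e):
    return np.abs(np.fft.ifft2(np.fft.fft2(e) * np.conj(m2)) * np.conj(m1))

enc = drpe_encrypt(img)

# Countermeasure (sketched): transmit only the phase of the encrypted
# field, so its amplitude offers no clue for a phase-retrieval KPA.
enc_const_amp = np.exp(1j * np.angle(enc))

recovered = drpe_decrypt(enc)
print(np.abs(recovered - img).max())
```

With the correct conjugate masks the plaintext is recovered to numerical precision; the constant-amplitude field is what the proposed method would transmit instead.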

  20. Do religion and religiosity have anything to do with alcohol consumption patterns? Evidence from two fish landing sites on Lake Victoria Uganda.

    PubMed

    Tumwesigye, Nazarius M; Atuyambe, Lynn; Kibira, Simon P S; Wabwire-Mangen, Fred; Tushemerirwe, Florence; Wagner, Glenn J

    2013-09-01

Fish landing sites have high levels of harmful use of alcohol. This paper examines the role of religion and religiosity on alcohol consumption at two fish landing sites on Lake Victoria in Uganda. Questionnaires were administered to randomly selected people at the sites. Dependent variables included alcohol consumption during the previous 30 days, whereas the key independent variables were religion and religiosity. Bivariate and multivariate analysis techniques were applied. People reporting low religiosity were five times more likely to have consumed alcohol (95% confidence interval: 2.45-10.04) compared with those reporting high/average religiosity. Religion and religiosity are potential channels for controlling alcohol use.

  1. Computing diffusivities from particle models out of equilibrium

    NASA Astrophysics Data System (ADS)

    Embacher, Peter; Dirr, Nicolas; Zimmer, Johannes; Reina, Celia

    2018-04-01

    A new method is proposed to numerically extract the diffusivity of a (typically nonlinear) diffusion equation from underlying stochastic particle systems. The proposed strategy requires the system to be in local equilibrium and have Gaussian fluctuations but it is otherwise allowed to undergo arbitrary out-of-equilibrium evolutions. This could be potentially relevant for particle data obtained from experimental applications. The key idea underlying the method is that finite, yet large, particle systems formally obey stochastic partial differential equations of gradient flow type satisfying a fluctuation-dissipation relation. The strategy is here applied to three classic particle models, namely independent random walkers, a zero-range process and a symmetric simple exclusion process in one space dimension, to allow the comparison with analytic solutions.
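For the simplest of the three models, independent random walkers, the diffusivity can be recovered from particle data by the textbook mean-squared-displacement route, shown below as a baseline; the paper's method instead exploits the fluctuation-dissipation structure and works out of equilibrium.

```python
import numpy as np

rng = np.random.default_rng(4)

# Independent +/-1 random walkers on a 1D lattice, one step per unit time.
n_walkers, n_steps = 5000, 400
steps = rng.choice([-1, 1], size=(n_walkers, n_steps))
paths = np.cumsum(steps, axis=1)

# Mean-squared displacement grows as 2*D*t; for this walk D = 1/2.
t = np.arange(1, n_steps + 1)
msd = (paths ** 2).mean(axis=0)
D_est = np.polyfit(t, msd, 1)[0] / 2.0
print(D_est)
```

For the zero-range and exclusion processes the diffusivity is density-dependent, which is exactly where the proposed fluctuation-based extraction earns its keep.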

  2. Simple Criteria to Determine the Set of Key Parameters of the DRPE Method by a Brute-force Attack

    NASA Astrophysics Data System (ADS)

    Nalegaev, S. S.; Petrov, N. V.

Known techniques for breaking Double Random Phase Encoding (DRPE) that bypass the resource-intensive brute-force method require at least two conditions: the attacker knows the encryption algorithm, and has access to pairs of source and encoded images. Our numerical results show that for accurate recovery by a numerical brute-force attack, one needs only some a priori information about the source images, which can be quite general. From the results of our numerical experiments on optical data encryption by DRPE with digital holography, we propose four simple criteria for guaranteed and accurate data recovery. These criteria can be applied if grayscale, binary (including QR codes), or color images are used as the source.

  3. Natural texture retrieval based on perceptual similarity measurement

    NASA Astrophysics Data System (ADS)

    Gao, Ying; Dong, Junyu; Lou, Jianwen; Qi, Lin; Liu, Jun

    2018-04-01

A typical texture retrieval system performs feature comparison and might not be able to make human-like judgments of image similarity. Meanwhile, it is commonly known that perceptual texture similarity is difficult to describe with traditional image features. In this paper, we propose a new texture retrieval scheme based on perceptual texture similarity. The key idea of the proposed scheme is that perceptual similarity is predicted by learning a non-linear mapping from image feature space to perceptual texture space using Random Forest. We test the method on a natural texture dataset and apply it to a new wallpaper dataset. Experimental results demonstrate that the proposed texture retrieval scheme with perceptual similarity improves retrieval performance over traditional image features.

  4. Single-random-phase holographic encryption of images

    NASA Astrophysics Data System (ADS)

    Tsang, P. W. M.

    2017-02-01

    In this paper, a method is proposed for encrypting an optical image onto a phase-only hologram, utilizing a single random phase mask as the private encryption key. The encryption process can be divided into 3 stages. First the source image to be encrypted is scaled in size, and pasted onto an arbitrary position in a larger global image. The remaining areas of the global image that are not occupied by the source image could be filled with randomly generated contents. As such, the global image as a whole is very different from the source image, but at the same time the visual quality of the source image is preserved. Second, a digital Fresnel hologram is generated from the new image, and converted into a phase-only hologram based on bi-directional error diffusion. In the final stage, a fixed random phase mask is added to the phase-only hologram as the private encryption key. In the decryption process, the global image together with the source image it contained, can be reconstructed from the phase-only hologram if it is overlaid with the correct decryption key. The proposed method is highly resistant to different forms of Plain-Text-Attacks, which are commonly used to deduce the encryption key in existing holographic encryption process. In addition, both the encryption and the decryption processes are simple and easy to implement.

  5. Physically Unclonable Cryptographic Primitives by Chemical Vapor Deposition of Layered MoS2.

    PubMed

    Alharbi, Abdullah; Armstrong, Darren; Alharbi, Somayah; Shahrjerdi, Davood

    2017-12-26

Physically unclonable cryptographic primitives are promising for securing the rapidly growing number of electronic devices. Here, we introduce physically unclonable primitives from layered molybdenum disulfide (MoS2) by leveraging the natural randomness of their island growth during chemical vapor deposition (CVD). We synthesize a MoS2 monolayer film covered with speckles of multilayer islands, where the growth process is engineered for an optimal speckle density. Using the Clark-Evans test, we confirm that the distribution of islands on the film exhibits complete spatial randomness, hence indicating the growth of multilayer speckles is a spatial Poisson process. Such a property is highly desirable for constructing unpredictable cryptographic primitives. The security primitive is an array of 2048 pixels fabricated from this film. The complex structure of the pixels makes the physical duplication of the array impossible (i.e., physically unclonable). A unique optical response is generated by applying an optical stimulus to the structure. The basis for this unique response is the dependence of the photoemission on the number of MoS2 layers, which by design is random throughout the film. Using a threshold value for the photoemission, we convert the optical response into binary cryptographic keys. We show that the proper selection of this threshold is crucial for maximizing combination randomness and that the optimal value of the threshold is linked directly to the growth process. This study reveals an opportunity for generating robust and versatile security primitives from layered transition metal dichalcogenides.
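The threshold-binarization step can be sketched with a hypothetical emission map; the layer-count statistics and noise level below are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical photoemission readout of a 2048-pixel array: intensity
# falls with the (random) local layer count, plus measurement noise.
layers = rng.poisson(1.5, 2048)                 # island thickness per pixel
emission = 1.0 / (1.0 + layers) + rng.normal(0, 0.01, 2048)

# Binarize against a threshold; its choice controls the 0/1 balance
# (the paper links the optimal threshold to the growth statistics).
threshold = np.median(emission)
key_bits = (emission > threshold).astype(int)

ones_fraction = key_bits.mean()
print(ones_fraction)
```

Choosing the median as the threshold balances the key bits; a poorly placed threshold would skew the 0/1 ratio and reduce combination randomness.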

  6. Effective conductivity of suspensions of hard spheres by Brownian motion simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan Kim, I.; Torquato, S.

    1991-02-15

A generalized Brownian motion simulation technique developed by Kim and Torquato (J. Appl. Phys. 68, 3892 (1990)) is applied to compute "exactly" the effective conductivity σ_e of heterogeneous media composed of regular and random distributions of hard spheres of conductivity σ_2 in a matrix of conductivity σ_1, for virtually the entire volume fraction range and for several values of the conductivity ratio α = σ_2/σ_1, including superconducting spheres (α = ∞) and perfectly insulating spheres (α = 0). A key feature of the procedure is the use of first-passage-time equations in the two homogeneous phases and at the two-phase interface. The method is shown to yield σ_e accurately with a comparatively fast execution time. The microstructure-sensitive analytical approximation of σ_e for dispersions derived by Torquato (J. Appl. Phys. 58, 3790 (1985)) is shown to be in excellent agreement with our data for random suspensions for the wide range of conditions reported here.

  7. Phase and amplitude modification of a laser beam by two deformable mirrors using conventional 4f image encryption techniques

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Rzasa, John Robertson; Davis, Christopher C.

    2017-08-01

    The image encryption and decryption technique using lens components and random phase screens has attracted a great deal of research interest in the past few years. In general, the optical encryption technique can translate a positive image into an image with nearly a white speckle pattern that is impossible to decrypt. However, with the right keys as conjugated random phase screens, the white noise speckle pattern can be decoded into the original image. We find that the fundamental ideas in image encryption can be borrowed and applied to carry out beam corrections through turbulent channels. Based on our detailed analysis, we show that by using two deformable mirrors arranged in similar fashions as in the image encryption technique, a large number of controllable phase and amplitude distribution patterns can be generated from a collimated Gaussian beam. Such a result can be further coupled with wavefront sensing techniques to achieve laser beam correction against turbulence distortions. In application, our approach leads to a new type of phase conjugation mirror that could be beneficial for directed energy systems.

  8. Enhancing decision-making and cognitive impulse control with transcranial direct current stimulation (tDCS) applied over the orbitofrontal cortex (OFC): A randomized and sham-controlled exploratory study.

    PubMed

    Ouellet, Julien; McGirr, Alexander; Van den Eynde, Frederique; Jollant, Fabrice; Lepage, Martin; Berlim, Marcelo T

    2015-10-01

    Decision-making and impulse control (both cognitive and motor) are complex interrelated processes which rely on a distributed neural network that includes multiple cortical and subcortical regions. Among them, the orbitofrontal cortex (OFC) seems to be particularly relevant as demonstrated by several neuropsychological and neuroimaging investigations. In the present study we assessed whether transcranial direct current stimulation (tDCS) applied bilaterally over the OFC is able to modulate decision-making and cognitive impulse control. More specifically, 45 healthy subjects were randomized to receive a single 30-min session of active or sham anodal tDCS (1.5 mA) applied over either the left or the right OFC (coupled with contralateral cathodal tDCS). They were also assessed pre- and post-tDCS with a battery of computerized tasks. Our results show that participants who received active anodal tDCS (irrespective of laterality), vs. those who received sham tDCS, displayed more advantageous decision-making (i.e., increased Iowa Gambling Task "net scores" [p = 0.04]), as well as improved cognitive impulse control (i.e., decreased "interference" in the Stroop Word-Colour Task [p = 0.007]). However, we did not observe tDCS-related effects on mood (assessed by visual analogue scales), attentional levels (assessed by the Continuous Performance Task) or motor impulse control (assessed by the Stop-Signal Task). Our study potentially serves as a key translational step towards the development of novel non-invasive neuromodulation-based therapeutic interventions directly targeting vulnerability factors for psychiatric conditions such as suicidal behaviour and addiction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Passive state preparation in the Gaussian-modulated coherent-states quantum key distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, Bing; Evans, Philip G.; Grice, Warren P.

In the Gaussian-modulated coherent-states (GMCS) quantum key distribution (QKD) protocol, Alice prepares quantum states actively: For each transmission, Alice generates a pair of Gaussian-distributed random numbers, encodes them on a weak coherent pulse using optical amplitude and phase modulators, and then transmits the Gaussian-modulated weak coherent pulse to Bob. Here we propose a passive state preparation scheme using a thermal source. In our scheme, Alice splits the output of a thermal source into two spatial modes using a beam splitter. She measures one mode locally using conjugate optical homodyne detectors, and transmits the other mode to Bob after applying appropriate optical attenuation. Under normal conditions, Alice's measurement results are correlated to Bob's, and they can work out a secure key, as in the active state preparation scheme. Given the initial thermal state generated by the source is strong enough, this scheme can tolerate high detector noise at Alice's side. Furthermore, the output of the source does not need to be single mode, since an optical homodyne detector can selectively measure a single mode determined by the local oscillator. Preliminary experimental results suggest that the proposed scheme could be implemented using an off-the-shelf amplified spontaneous emission source.

  10. Passive state preparation in the Gaussian-modulated coherent-states quantum key distribution

    DOE PAGES

    Qi, Bing; Evans, Philip G.; Grice, Warren P.

    2018-01-01

In the Gaussian-modulated coherent-states (GMCS) quantum key distribution (QKD) protocol, Alice prepares quantum states actively: For each transmission, Alice generates a pair of Gaussian-distributed random numbers, encodes them on a weak coherent pulse using optical amplitude and phase modulators, and then transmits the Gaussian-modulated weak coherent pulse to Bob. Here we propose a passive state preparation scheme using a thermal source. In our scheme, Alice splits the output of a thermal source into two spatial modes using a beam splitter. She measures one mode locally using conjugate optical homodyne detectors, and transmits the other mode to Bob after applying appropriate optical attenuation. Under normal conditions, Alice's measurement results are correlated to Bob's, and they can work out a secure key, as in the active state preparation scheme. Given the initial thermal state generated by the source is strong enough, this scheme can tolerate high detector noise at Alice's side. Furthermore, the output of the source does not need to be single mode, since an optical homodyne detector can selectively measure a single mode determined by the local oscillator. Preliminary experimental results suggest that the proposed scheme could be implemented using an off-the-shelf amplified spontaneous emission source.
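The passive-preparation correlation can be illustrated with a toy Gaussian simulation of one quadrature (shot-noise units; channel loss, detector noise, and the conjugate-homodyne penalty are omitted, and the thermal variance is an arbitrary illustrative value).

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100000

# Thermal source: zero-mean Gaussian quadrature with large variance
# (shot-noise units; the value is illustrative).
V_th = 40.0
x_in = rng.normal(0, np.sqrt(V_th), n)

# Vacuum noise entering the unused beam-splitter port.
v = rng.normal(0, 1.0, n)

# 50/50 beam splitter: Alice keeps one output and measures it locally;
# Bob receives the other (attenuation and channel effects omitted here).
x_alice = (x_in + v) / np.sqrt(2)
x_bob = (x_in - v) / np.sqrt(2)

# Alice's local measurement is strongly correlated with what Bob gets,
# which is what lets them distill a key without active modulation.
corr = np.corrcoef(x_alice, x_bob)[0, 1]
print(corr)
```

For this model the correlation is (V_th - 1)/(V_th + 1), so a strong thermal state drives it toward 1, matching the abstract's requirement that the source be strong enough.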

  11. Critical side channel effects in random bit generation with multiple semiconductor lasers in a polarization-based quantum key distribution system.

    PubMed

    Ko, Heasin; Choi, Byung-Seok; Choe, Joong-Seon; Kim, Kap-Joong; Kim, Jong-Hoi; Youn, Chun Ju

    2017-08-21

    Most polarization-based BB84 quantum key distribution (QKD) systems utilize multiple lasers to generate one of four polarization quantum states randomly. However, random bit generation with multiple lasers can potentially open critical side channels that significantly endanger the security of QKD systems. In this paper, we reveal previously unnoticed side channels of temporal disparity and intensity fluctuation, which can exist in the operation of multiple semiconductor laser diodes. Experimental results show that these side channels can severely degrade the security performance of QKD systems. An important system issue for improving the quantum bit error rate (QBER), related to the laser driving conditions, is further addressed with experimental results.

  12. An AES chip with DPA resistance using hardware-based random order execution

    NASA Astrophysics Data System (ADS)

    Bo, Yu; Xiangyu, Li; Cong, Chen; Yihe, Sun; Liji, Wu; Xiangmin, Zhang

    2012-06-01

    This paper presents an AES (Advanced Encryption Standard) chip that combats differential power analysis (DPA) side-channel attacks through hardware-based random order execution. Both the decryption and encryption procedures of AES are implemented on the chip. A fine-grained dataflow architecture is proposed, which dynamically exploits intrinsic byte-level independence in the algorithm. A novel circuit called an HMF (Hold-Match-Fetch) unit is proposed for random control, which randomly sets execution orders for concurrent operations. The AES chip was manufactured in SMIC 0.18 μm technology. The average energy for encrypting one group of plain texts (with a 128-bit secret key) is 19 nJ, and the core area is 0.43 mm². A sophisticated experimental setup was built to test the DPA resistance. Measurement-based experimental results show that not even one byte of the secret key could be disclosed from the chip in random mode after 64,000 power traces were used in the DPA attack. Compared with the corresponding fixed-order execution, hardware-based random order execution improves DPA resistance by a factor of at least 21.

  13. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    PubMed

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
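A randomization (permutation) test of the kind proposed above as an alternative to P-values based on random sampling can be sketched in a few lines: the labels are repeatedly reshuffled between the two groups, and the P-value is the fraction of shuffles producing a mean difference at least as extreme as the observed one. The two groups' scores below are invented purely for illustration.

```python
import random

random.seed(1)

# Hypothetical scores from two convenience-sampled groups (illustrative data).
group_a = [12, 15, 14, 16, 18, 13, 17]
group_b = [10, 11, 13, 9, 12, 14, 10]

observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

pooled = group_a + group_b
n_a = len(group_a)
n_perm = 10000
count = 0
for _ in range(n_perm):
    random.shuffle(pooled)  # reassign group labels at random
    diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a)
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / n_perm
print(p_value)
```

The justification for this P-value is the random reassignment itself, so it does not assume the participants were randomly sampled from a wider population.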

  14. Information hiding based on double random-phase encoding and public-key cryptography.

    PubMed

    Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li

    2009-03-02

    A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to make up for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the cipher text and the two phase masks, which is not possible under the DRPE technique alone. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience to efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.

  15. Phase-only asymmetric optical cryptosystem based on random modulus decomposition

    NASA Astrophysics Data System (ADS)

    Xu, Hongfeng; Xu, Wenhui; Wang, Shuaihua; Wu, Shaofan

    2018-06-01

    We propose a phase-only asymmetric optical cryptosystem based on random modulus decomposition (RMD). The cryptosystem is presented for effectively improving the capacity to resist various attacks, including the attack of iterative algorithms. On the one hand, RMD and phase encoding are combined to remove the constraints that can be used in the attacking process. On the other hand, the security keys (geometrical parameters) introduced by Fresnel transform can increase the key variety and enlarge the key space simultaneously. Numerical simulation results demonstrate the strong feasibility, security and robustness of the proposed cryptosystem. This cryptosystem will open up many new opportunities in the application fields of optical encryption and authentication.

  16. Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun

    2015-06-01

    Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary image processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, the 1-m digital aerial photograph was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of the outcomes from the three classifiers. The framework was tested for classifying group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved, with overall accuracies of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
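The ensemble-analysis step, combining the three classifiers' per-object labels by majority vote, can be sketched as follows. The class names and per-classifier predictions are hypothetical, and the tie-breaking rule (fall back to the first classifier) is an assumption, not necessarily the study's.

```python
from collections import Counter

def ensemble_vote(predictions):
    """Majority vote across classifiers; ties fall back to the first classifier."""
    votes = Counter(predictions)
    best_count = votes.most_common(1)[0][1]
    tied = [label for label, c in votes.items() if c == best_count]
    if len(tied) > 1:
        return predictions[0]  # assumed tie-breaking rule
    return tied[0]

# Per-object class labels from three hypothetical classifiers (RF, SVM, kNN).
rf  = ["coral", "seagrass", "sand", "coral"]
svm = ["coral", "sand",     "sand", "seagrass"]
knn = ["sand",  "seagrass", "sand", "coral"]

fused = [ensemble_vote([a, b, c]) for a, b, c in zip(rf, svm, knn)]
print(fused)  # -> ['coral', 'seagrass', 'sand', 'coral']
```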

  17. Photonic quantum simulator for unbiased phase covariant cloning

    NASA Astrophysics Data System (ADS)

    Knoll, Laura T.; López Grande, Ignacio H.; Larotonda, Miguel A.

    2018-01-01

    We present the results of a linear optics photonic implementation of a quantum circuit that simulates a phase covariant cloner, using two different degrees of freedom of a single photon. We experimentally simulate the action of two mirrored 1→ 2 cloners, each of them biasing the cloned states into opposite regions of the Bloch sphere. We show that by applying a random sequence of these two cloners, an eavesdropper can mitigate the amount of noise added to the original input state and therefore, prepare clones with no bias, but with the same individual fidelity, masking its presence in a quantum key distribution protocol. Input polarization qubit states are cloned into path qubit states of the same photon, which is identified as a potential eavesdropper in a quantum key distribution protocol. The device has the flexibility to produce mirrored versions that optimally clone states on either the northern or southern hemispheres of the Bloch sphere, as well as to simulate optimal and non-optimal cloning machines by tuning the asymmetry on each of the cloning machines.

  18. Polaron melting and ordering as key mechanisms for colossal resistance effects in manganites

    PubMed Central

    Jooss, Ch.; Wu, L.; Beetz, T.; Klie, R. F.; Beleggia, M.; Schofield, M. A.; Schramm, S.; Hoffmann, J.; Zhu, Y.

    2007-01-01

    Polarons, the combined motion of electrons in a cloud of their lattice distortions, are a key transport feature in doped manganites. To develop a profound understanding of the colossal resistance effects induced by external fields, the study of polaron correlations and the resulting collective polaron behavior, i.e., polaron ordering and the transition from polaronic to metallic transport, is essential. We show that static long-range ordering of Jahn–Teller polarons forms a polaron solid which represents a new type of charge- and orbital-ordered state. The related noncentrosymmetric lattice distortions establish a connection between colossal resistance effects and multiferroic properties, i.e., the coexistence of ferroelectric and antiferromagnetic ordering. Colossal resistance effects due to an electrically induced polaron solid–liquid transition are directly observed in a transmission electron microscope with a local electric stimulus applied in situ using a piezo-controlled tip. Our results shed light on the colossal resistance effects in magnetic fields and have a strong impact on the development of correlated-electron device applications such as resistive random access memory (RRAM). PMID:17699633

  19. Differential Fault Analysis on CLEFIA

    NASA Astrophysics Data System (ADS)

    Chen, Hua; Wu, Wenling; Feng, Dengguo

    CLEFIA is a 128-bit block cipher recently proposed by the Sony Corporation. The fundamental structure of CLEFIA is a generalized Feistel structure consisting of 4 data lines. In this paper, the strength of CLEFIA against the differential fault attack is explored. Our attack adopts the byte-oriented model of random faults. By randomly inducing a one-byte fault in one round, four bytes of faults can be obtained simultaneously in the next round, which efficiently reduces the total number of fault inductions required in the attack. After attacking the encryptions of the last several rounds, the original secret key can be recovered based on some analysis of the key schedule. The data complexity analysis and experiments show that only about 18 faulty ciphertexts are needed to recover the entire 128-bit secret key, and about 54 faulty ciphertexts for 192/256-bit keys.

  20. Public perceptions of key performance indicators of healthcare in Alberta, Canada.

    PubMed

    Northcott, Herbert C; Harvey, Michael D

    2012-06-01

    To examine the relationship between public perceptions of key performance indicators assessing various aspects of the health-care system. Cross-sequential survey research. Annual telephone surveys of random samples of adult Albertans selected by random digit dialing and stratified according to age, sex and region (n = 4000 for each survey year). The survey questionnaires included single-item measures of key performance indicators to assess public perceptions of availability, accessibility, quality, outcome and satisfaction with healthcare. Cronbach's α and factor analysis were used to assess the relationship between key performance indicators focusing on the health-care system overall and on a recent interaction with the health-care system. The province of Alberta, Canada during the years 1996-2004. Four thousand adults randomly selected each survey year. Survey questions measuring public perceptions of healthcare availability, accessibility, quality, outcome and satisfaction with healthcare. Factor analysis identified two principal components with key performance indicators focusing on the health system overall loading most strongly on the first component and key performance indicators focusing on the most recent health-care encounter loading most strongly on the second component. Assessments of the quality of care most recently received, accessibility of that care and perceived outcome of care tended to be higher than the more general assessments of overall health system quality and accessibility. Assessments of specific health-care encounters and more general assessments of the overall health-care system, while related, nevertheless comprise separate dimensions for health-care evaluation.
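Cronbach's α, used above to assess the relationship between the key performance indicators, can be computed directly from its definition: α = k/(k−1) · (1 − Σ var(item)/var(total)). The 1-5 ratings below are invented illustrative data, not the Alberta survey's.

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: list of k item-score lists, each of length n respondents."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]  # each respondent's total score
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Hypothetical 1-5 ratings of five indicators by six respondents (illustrative only).
items = [
    [4, 5, 3, 4, 2, 5],  # availability
    [4, 4, 3, 5, 2, 5],  # accessibility
    [5, 5, 2, 4, 3, 4],  # quality
    [4, 5, 3, 4, 2, 4],  # outcome
    [5, 4, 3, 5, 2, 5],  # satisfaction
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

High α indicates the indicators covary strongly, i.e. they measure a common underlying dimension, which is the kind of evidence the factor analysis above complements.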

  1. Randomization Procedures Applied to Analysis of Ballistic Data

    DTIC Science & Technology

    1991-06-01

    Technical Report BRL-TR-3245 (AD-A238 389), by Malcolm S. Taylor and Barry A. Bodt, June 1991. Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. From the report: "Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data."

  2. Identifying key hospital service quality factors in online health communities.

    PubMed

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain; Kim, Minki

    2015-04-07

    The volume of health-related user-created content, especially hospital-related questions and answers in online health communities, has rapidly increased. Patients and caregivers participate in online community activities to share their experiences, exchange information, and ask about recommended or discredited hospitals. However, there is little research on how to identify hospital service quality automatically from online communities. In the past, in-depth analysis of hospitals has used random sampling surveys. However, such surveys are becoming impractical owing to the rapidly increasing volume of online data and the diverse analysis requirements of related stakeholders. As a solution for utilizing large-scale health-related information, we propose a novel approach to identify hospital service quality factors and their trends over time automatically from online health communities, especially hospital-related questions and answers. We defined social media-based key quality factors for hospitals. In addition, we developed text mining techniques to detect such factors that frequently occur in online health communities. After detecting these factors that represent qualitative aspects of hospitals, we applied a sentiment analysis to recognize the types of recommendations in messages posted within online health communities. Korea's two biggest online portals were used to test the effectiveness of detection of social media-based key quality factors for hospitals. To evaluate the proposed text mining techniques, we performed manual evaluations on the extraction and classification results, such as hospital name, service quality factors, and recommendation types, using a random sample of messages (ie, 5.44% (9450/173,748) of the total messages). Service quality factor detection and hospital name extraction achieved average F1 scores of 91% and 78%, respectively. In terms of recommendation classification, performance (ie, precision) is 78% on average. Extraction and classification performance still has room for improvement, but the extraction results are applicable to more detailed analysis. Further analysis of the extracted information reveals that there are differences in the details of social media-based key quality factors for hospitals according to the regions in Korea, and the patterns of change seem to accurately reflect social events (eg, influenza epidemics). These findings could be used to provide timely information to caregivers, hospital officials, and medical officials for health care policies.

  3. An eCK-Secure Authenticated Key Exchange Protocol without Random Oracles

    NASA Astrophysics Data System (ADS)

    Moriyama, Daisuke; Okamoto, Tatsuaki

    This paper presents a (PKI-based) two-pass authenticated key exchange (AKE) protocol that is secure in the extended Canetti-Krawczyk (eCK) security model. The security of the proposed protocol is proven without random oracles (under three assumptions), and relies on no implementation techniques such as the trick by LaMacchia, Lauter and Mityagin (the so-called NAXOS trick). Since an AKE protocol that is eCK-secure under a NAXOS-like implementation trick is no longer eCK-secure if some realistic information leakage occurs through side-channel attacks, it has remained an important open problem to realize an eCK-secure AKE protocol without using the NAXOS trick (and without random oracles).

  4. Systolic array processing of the sequential decoding algorithm

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Yao, K.

    1989-01-01

    A systolic array processing technique is applied to implementing the stack algorithm form of the sequential decoding algorithm. It is shown that sorting, a key function in the stack algorithm, can be efficiently realized by a special type of systolic arrays known as systolic priority queues. Compared to the stack-bucket algorithm, this approach is shown to have the advantages that the decoding always moves along the optimal path, that it has a fast and constant decoding speed and that its simple and regular hardware architecture is suitable for VLSI implementation. Three types of systolic priority queues are discussed: random access scheme, shift register scheme and ripple register scheme. The property of the entries stored in the systolic priority queue is also investigated. The results are applicable to many other basic sorting type problems.
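A software analogue of the systolic priority queue's role in the stack algorithm, always popping and extending the path with the best metric, can be sketched with a binary heap. This is a sketch of the sorting behavior only: the branch metrics below are made up, whereas a real sequential decoder would derive them from the received sequence.

```python
import heapq

# heapq is a min-heap, so metrics are stored negated: popping the smallest
# tuple yields the path with the largest (best) metric.
stack = []  # entries: (-metric, path)

def push(metric, path):
    heapq.heappush(stack, (-metric, path))

def pop_best():
    neg_metric, path = heapq.heappop(stack)
    return -neg_metric, path

push(0.0, "")  # root of the code tree
order = []
for _ in range(3):
    metric, path = pop_best()   # decoding always moves along the best path
    order.append(path)
    # Extend the best path by one branch; illustrative branch metrics.
    push(metric + 1.0, path + "0")
    push(metric - 2.0, path + "1")

print(order)  # -> ['', '0', '00']
```

Because the best entry is always at the top, the decoder never has to search the stack, which is the property the systolic priority queue provides in constant time per step in hardware.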

  5. Fitting and Calibrating a Multilevel Mixed-Effects Stem Taper Model for Maritime Pine in NW Spain

    PubMed Central

    Arias-Rodil, Manuel; Castedo-Dorado, Fernando; Cámara-Obregón, Asunción; Diéguez-Aranda, Ulises

    2015-01-01

    Stem taper data are usually hierarchical (several measurements per tree, and several trees per plot), making application of a multilevel mixed-effects modelling approach essential. However, correlation between trees in the same plot/stand has often been ignored in previous studies. Fitting and calibration of a variable-exponent stem taper function were conducted using data from 420 trees felled in even-aged maritime pine (Pinus pinaster Ait.) stands in NW Spain. In the fitting step, the tree level explained much more variability than the plot level, and therefore calibration at plot level was omitted. Several stem heights were evaluated for measurement of the additional diameter needed for calibration at tree level. Calibration with an additional diameter measured at between 40 and 60% of total tree height showed the greatest improvement in volume and diameter predictions. If additional diameter measurement is not available, the fixed-effects model fitted by the ordinary least squares technique should be used. Finally, we also evaluated how the expansion of parameters with random effects affects the stem taper prediction, as we consider this a key question when applying the mixed-effects modelling approach to taper equations. The results showed that correlation between random effects should be taken into account when assessing the influence of random effects in stem taper prediction. PMID:26630156

  6. Flexible embedding of networks

    NASA Astrophysics Data System (ADS)

    Fernandez-Gracia, Juan; Buckee, Caroline; Onnela, Jukka-Pekka

    We introduce a model for embedding one network into another, focusing on the case where network A is much bigger than network B. Nodes from network A are assigned to the nodes in network B using an algorithm where we control the extent of localization of node placement in network B using a single parameter. Starting from an unassigned node in network A, called the source node, we first map this node to a randomly chosen node in network B, called the target node. We then assign the neighbors of the source node to the neighborhood of the target node using a random walk based approach. To assign each neighbor of the source node to one of the nodes in network B, we perform a random walk starting from the target node with stopping probability α. We repeat this process until all nodes in network A have been mapped to the nodes of network B. The simplicity of the model allows us to calculate key quantities of interest in closed form. By varying the parameter α, we are able to produce embeddings from very local (α = 1) to very global (α → 0). We show how our calculations match the simulated results, and we apply the model to study how social networks are embedded in geography and how the neurons of C. elegans are embedded in the surrounding volume.
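The placement rule above, a random walk from the target node that stops with probability α at each step, can be sketched directly. The 6-node ring used as network B is an arbitrary example, not from the paper.

```python
import random

random.seed(7)

# Small ring network B as adjacency lists (illustrative only).
B = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}

def place_neighbor(target, alpha):
    """Random walk from the target node; stop with probability alpha per step."""
    node = target
    while random.random() > alpha:
        node = random.choice(B[node])
    return node

# alpha = 1: every neighbor lands on the target itself (fully local embedding).
local = {place_neighbor(3, 1.0) for _ in range(100)}
print(local)  # -> {3}

# Small alpha: long walks spread placements over network B (more global).
spread = {place_neighbor(3, 0.05) for _ in range(500)}
print(len(spread) > 1)  # -> True
```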

  7. Efficient Graph-Based Resource Allocation Scheme Using Maximal Independent Set for Randomly- Deployed Small Star Networks

    PubMed Central

    Zhou, Jian; Wang, Lusheng; Wang, Weidong; Zhou, Qingfeng

    2017-01-01

    In future scenarios of heterogeneous and dense networks, randomly-deployed small star networks (SSNs) become a key paradigm, whose system performance is limited by inter-SSN interference and requires an efficient resource allocation scheme for interference coordination. Traditional resource allocation schemes do not specifically focus on this paradigm and are usually too time-consuming in dense networks. In this article, a very efficient graph-based scheme is proposed, which applies the maximal independent set (MIS) concept from graph theory to divide SSNs into almost interference-free groups. We first construct an interference graph for the system based on a derived distance threshold indicating, for any pair of SSNs, whether there is intolerable inter-SSN interference. Then, SSNs are divided into MISs, and the same resource can be reused by all the SSNs in each MIS. Empirical parameters and equations are set in the scheme to guarantee high performance. Finally, extensive scenarios, both dense and nondense, are randomly generated and simulated to demonstrate the performance of our scheme, indicating that it outperforms the classical max K-cut-based scheme in terms of system capacity, utility and especially time cost. Its achieved system capacity, utility and fairness can be close to those of the near-optimal strategy obtained by a time-consuming simulated annealing search. PMID:29113109
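The grouping idea, peeling maximal independent sets off the interference graph so that every SSN in a group can reuse the same resource, can be sketched with a simple greedy MIS. This is a minimal sketch under assumed inputs: the 5-SSN interference graph is hypothetical, and the paper's actual scheme additionally tunes empirical parameters.

```python
def maximal_independent_set(nodes, edges):
    """Greedy MIS: add a node if it conflicts with no node already chosen."""
    mis, blocked = [], set()
    for v in nodes:
        if v not in blocked:
            mis.append(v)
            # Block every neighbor of v in the interference graph.
            blocked.update(u for a, u in edges if a == v)
            blocked.update(a for a, u in edges if u == v)
    return mis

def group_ssns(nodes, edges):
    """Peel off MISs until every SSN belongs to one resource-sharing group."""
    remaining, groups = list(nodes), []
    while remaining:
        mis = maximal_independent_set(remaining, edges)
        groups.append(mis)
        remaining = [v for v in remaining if v not in mis]
    return groups

# Interference graph for 5 hypothetical SSNs (an edge = intolerable interference).
nodes = [0, 1, 2, 3, 4]
edges = [(0, 1), (1, 2), (3, 4)]
groups = group_ssns(nodes, edges)
print(groups)  # -> [[0, 2, 3], [1, 4]]
```

Each group is interference-free by construction, so one resource block per group suffices.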

  8. Intervention Fidelity: Aspects of Complementary and Alternative Medicine (CAM) Research

    PubMed Central

    Wyatt, Gwen; Sikorskii, Alla; Rahbar, Mohammad Hossein; Victorson, David; Adams, Lora

    2010-01-01

    Background The Treatment Fidelity Workgroup (TFW) established by the National Institutes of Health (NIH) provides a 5-point structure for intervention fidelity: dosing, interventionists’ consistency, intervention delivery, and receipt and enactment of the intervention. Using our reflexology trial, we apply the first three points. Objectives Study objectives are to: 1) evaluate key dosage dimensions associated with CAM research; 2) evaluate approaches to interventionists’ consistency of delivery of CAM protocols; and 3) evaluate and discuss data that reflect CAM intervention fidelity. Intervention Women with late-stage breast cancer (N=318) were randomly assigned to either 4 weeks of reflexology, placebo, or standard care. Results Dosing consisted of three dimensions: frequency (4 sessions), duration (30 minutes), and interval between sessions (5–9 days). Interventionist consistency revealed over a 90% accuracy rate in following the protocol; completion rates for the 4 sessions of 84% and 89% in the reflexology and placebo groups, respectively; and no difference in attrition after randomization between the reflexology and placebo groups (17% and 15%, respectively). Intervention delivery, examined through debriefing data, indicated a significantly higher rate of correct guesses on group assignment in the reflexology group as compared to the placebo group (82% versus 46%, p-value=.0002). Conclusions This study points out the relevance of dosing, interventionists’ consistency, and delivery data within a CAM clinical trial, as well as the challenges of blinding. Implications Monitoring intervention fidelity by using the key areas identified by the BCC ensures that findings from a clinical trial are meaningful and have the potential to be translated to clinical practice. PMID:20467309

  9. Color image encryption based on color blend and chaos permutation in the reality-preserving multiple-parameter fractional Fourier transform domain

    NASA Astrophysics Data System (ADS)

    Lang, Jun

    2015-03-01

    In this paper, we propose a novel color image encryption method using Color Blend (CB) and Chaos Permutation (CP) operations in the reality-preserving multiple-parameter fractional Fourier transform (RPMPFRFT) domain. The original color image is first exchanged and mixed randomly from the standard red-green-blue (RGB) color space to the R′G′B′ color space by rotating the color cube with a random angle matrix. Then the RPMPFRFT is employed to change the pixel values of the color image: the three components of the scrambled RGB color space are converted by the RPMPFRFT with three different transform pairs, respectively. Compared with complex-valued output transforms, the RPMPFRFT ensures that the output is real, which saves image storage space and is convenient for transmission in practical applications. To further enhance the security of the encryption system, the output of the former steps is scrambled by juxtaposition of sections of the image in the reality-preserving multiple-parameter fractional Fourier domains, with the alignment of sections determined by two coupled chaotic logistic maps. The parameters of the Color Blend, the Chaos Permutation and the RPMPFRFT transform are regarded as the key of the encryption algorithm. The proposed color image encryption can also be applied to encrypt three gray images by transforming them into the three RGB color components of a specially constructed color image. Numerical simulations are performed to demonstrate that the proposed algorithm is feasible, secure, sensitive to keys and robust to noise attack and data loss.
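The chaos-permutation step can be illustrated with a single logistic map ordering a list of indices; this is a simplified sketch (the paper uses two coupled maps acting on image sections), and the key values (x0, r) below are arbitrary.

```python
def logistic_sequence(x0, r, n):
    """Iterate the chaotic logistic map x -> r*x*(1-x)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def chaos_permutation(n, key=(0.3456, 3.99)):
    """Order indices by the chaotic sequence; the (x0, r) pair acts as the key."""
    seq = logistic_sequence(key[0], key[1], n)
    return sorted(range(n), key=lambda i: seq[i])

data = list(range(8))
perm = chaos_permutation(len(data))
scrambled = [data[i] for i in perm]

# Decryption inverts the permutation using the same key.
inverse = [0] * len(perm)
for pos, i in enumerate(perm):
    inverse[i] = pos
restored = [scrambled[inverse[i]] for i in range(len(data))]
print(restored == data)  # -> True
```

Because the map is chaotic, a tiny change in x0 or r yields a completely different permutation, which is what makes the scheme key-sensitive.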

  10. A new interpretation of the Keller-Segel model based on multiphase modelling.

    PubMed

    Byrne, Helen M; Owen, Markus R

    2004-12-01

    In this paper an alternative derivation and interpretation are presented of the classical Keller-Segel model of cell migration due to random motion and chemotaxis. A multiphase modelling approach is used to describe how a population of cells moves through a fluid containing a diffusible chemical to which the cells are attracted. The cells and fluid are viewed as distinct components of a two-phase mixture. The principles of mass and momentum balance are applied to each phase, and appropriate constitutive laws imposed to close the resulting equations. A key assumption here is that the stress in the cell phase is influenced by the concentration of the diffusible chemical. By restricting attention to one-dimensional cartesian geometry we show how the model reduces to a pair of nonlinear coupled partial differential equations for the cell density and the chemical concentration. These equations may be written in the form of the Patlak-Keller-Segel model, naturally including density-dependent nonlinearities in the cell motility coefficients. There is a direct relationship between the random motility and chemotaxis coefficients, both depending in an inter-related manner on the chemical concentration. We suggest that this may explain why many chemicals appear to stimulate both chemotactic and chemokinetic responses in cell populations. After specialising our model to describe slime mold we then show how the functional form of the chemical potential that drives cell locomotion influences the ability of the system to generate spatial patterns. The paper concludes with a summary of the key results and a discussion of avenues for future research.

  11. Randomization in clinical trials in orthodontics: its significance in research design and methods to achieve it.

    PubMed

    Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore

    2011-12-01

    Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
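The list-generation step can be illustrated with permuted-block randomization, one of the restricted schemes mentioned above: allocation is shuffled inside fixed-size blocks so the arms stay balanced throughout recruitment. The arm names and block size below are arbitrary examples.

```python
import random

random.seed(2024)

def permuted_block_list(n_participants, arms=("A", "B"), block_size=4):
    """Generate an allocation list in shuffled blocks, keeping arms balanced."""
    assert block_size % len(arms) == 0
    per_arm = block_size // len(arms)
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * per_arm
        random.shuffle(block)  # the randomness lives inside each block
        allocation.extend(block)
    return allocation[:n_participants]

alloc = permuted_block_list(12)
print(alloc.count("A"), alloc.count("B"))  # -> 6 6
```

In practice the list is generated by someone independent of recruitment and concealed (e.g. central telephone randomization), since a predictable block pattern can itself reintroduce selection bias.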

  12. Study of Randomness in AES Ciphertexts Produced by Randomly Generated S-Boxes and S-Boxes with Various Modulus and Additive Constant Polynomials

    NASA Astrophysics Data System (ADS)

    Das, Suman; Sadique Uz Zaman, J. K. M.; Ghosh, Ranjan

    2016-06-01

    In the Advanced Encryption Standard (AES), the standard S-Box is conventionally generated by using a particular irreducible polynomial, {11B} in GF(2^8), as the modulus and a particular additive constant polynomial, {63} in GF(2), though it can be generated with many other polynomials. In this paper, it is shown that secure AES S-Boxes can be generated by using other selected modulus and additive polynomials, and also randomly, using a PRNG like BBS. A comparative study has been made of the randomness of the corresponding AES ciphertexts generated using these S-Boxes, by means of the NIST Test Suite coded for this paper. It has been found that besides the standard one, other moduli and additive constants are also able to generate equally random or better ciphertexts; the same is true for random S-Boxes. As these new types of S-Boxes are user-defined, hence unknown to an attacker, they are able to prevent linear and differential cryptanalysis. Moreover, they act as additional key inputs to AES, thus increasing the key space.
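S-box generation from a chosen modulus and additive constant, inversion in GF(2^8) followed by the AES affine transform, can be sketched as follows. With the standard parameters {11B}/{63} this reproduces the conventional S-box; {11D} below is simply another irreducible polynomial standing in for a user-chosen key.

```python
def gf_mul(a, b, modulus=0x11B):
    """Multiply in GF(2^8) modulo the chosen irreducible polynomial."""
    p = 0
    while b:
        if b & 1:
            p ^= a
        a <<= 1
        if a & 0x100:
            a ^= modulus
        b >>= 1
    return p

def gf_inv(a, modulus=0x11B):
    """Multiplicative inverse by exhaustive search (0 maps to 0, as in AES)."""
    if a == 0:
        return 0
    for x in range(1, 256):
        if gf_mul(a, x, modulus) == 1:
            return x

def sbox(modulus=0x11B, constant=0x63):
    """Inversion in GF(2^8) followed by the AES affine transform."""
    table = []
    for a in range(256):
        x = gf_inv(a, modulus)
        y = constant
        for shift in (0, 1, 2, 3, 4):  # y = c ^ x ^ rotl(x,1) ^ ... ^ rotl(x,4)
            y ^= ((x << shift) | (x >> (8 - shift))) & 0xFF
        table.append(y)
    return table

std = sbox()                # standard {11B}/{63} parameters
alt = sbox(modulus=0x11D)   # another irreducible polynomial as a user key
print(hex(std[0x00]), hex(std[0x01]))   # -> 0x63 0x7c
print(sorted(alt) == list(range(256)))  # the alternative S-box is still a bijection
```

Any irreducible modulus keeps the inversion step a bijection, so the resulting user-defined S-box remains invertible and can serve as an extra secret parameter.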

  13. Some Effects of Procedural Variations on Choice Responding in Concurrent Chains

    ERIC Educational Resources Information Center

    Moore, J.

    2009-01-01

    The present research used pigeons in a three-key operant chamber and varied procedural features pertaining to both initial and terminal links of concurrent chains. The initial links randomly alternated on the side keys during a session, while the terminal links always appeared on the center key. Both equal and unequal initial-link schedules were…

  14. Application of Crossover Design for Conducting Rigorous Extension Evaluations

    ERIC Educational Resources Information Center

    Jayaratne, K. S. U.; Bird, Carolyn L.; McClelland, Jacquelyn W.

    2013-01-01

    With the increasing demand for accountability of Extension programming, Extension professionals need to apply rigorous evaluation designs. Randomized designs are useful to eliminate selection biases of program participants and to improve the accuracy of evaluation. However, randomized control designs are not practical to apply in Extension program…

  15. Optical asymmetric image encryption using gyrator wavelet transform

    NASA Astrophysics Data System (ADS)

    Mehra, Isha; Nishchal, Naveen K.

    2015-11-01

    In this paper, we propose a new optical information processing tool, termed the gyrator wavelet transform, to secure a fully phase image based on an amplitude- and phase-truncation approach. The gyrator wavelet transform comprises four basic parameters: the gyrator transform order, the type and level of the mother wavelet, and the position of the different frequency bands. These parameters are used as encryption keys in addition to the random phase codes of the optical cryptosystem. The tool has also been applied to simultaneous compression and encryption of an image. The system's performance, its sensitivity to encryption parameters such as the gyrator transform order, and its robustness have also been analyzed. It is expected that this tool will not only update current optical security systems, but may also shed some light on future developments. Computer simulation results demonstrate the abilities of the gyrator wavelet transform as an effective tool for various optical information processing applications, including image encryption and image compression. The tool can also be applied to secure color, multispectral, and three-dimensional images.

  16. SYMPOSIUM REPORT: An Evidence-Based Approach to IBS and CIC: Applying New Advances to Daily Practice

    PubMed Central

    Chey, William D.

    2017-01-01

    Many nonpharmacologic and pharmacologic therapies are available to manage irritable bowel syndrome (IBS) and chronic idiopathic constipation (CIC). The American College of Gastroenterology (ACG) regularly publishes reviews on IBS and CIC therapies. The most recent of these reviews was published by the ACG Task Force on the Management of Functional Bowel Disorders in 2014. The key objective of this review was to evaluate the efficacy of therapies for IBS or CIC compared with placebo or no treatment in randomized controlled trials. Evidence-based approaches to managing diarrhea-predominant IBS include dietary measures, such as a diet low in gluten and fermentable oligo-, di-, and monosaccharides and polyols (FODMAPs); loperamide; antispasmodics; peppermint oil; probiotics; tricyclic antidepressants; alosetron; eluxadoline; and rifaximin. Evidence-based approaches to managing constipation-predominant IBS and CIC include fiber, stimulant laxatives, polyethylene glycol, selective serotonin reuptake inhibitors, lubiprostone, and guanylate cyclase agonists. With the growing evidence base for IBS and CIC therapies, it has become increasingly important for clinicians to assess the quality of evidence and understand how to apply it to the care of individual patients. PMID:28729815

  17. Cascade phenomenon against subsequent failures in complex networks

    NASA Astrophysics Data System (ADS)

    Jiang, Zhong-Yuan; Liu, Zhi-Quan; He, Xuan; Ma, Jian-Feng

    2018-06-01

    Cascade phenomena may lead to catastrophic disasters that severely imperil network safety or security in various complex systems such as communication networks, power grids, and social networks. In some flow-based networks, the load of failed nodes can be redistributed locally to their neighboring nodes so as to absorb traffic fluctuations and stave off large-scale cascading failures. However, in such a local flow redistribution model, a small set of key nodes attacked in sequence can cause the network to collapse. It is therefore a critical problem to effectively find this set of key nodes in the network. To the best of our knowledge, this work is the first to study the problem comprehensively. We first introduce extra capacity for every node to tolerate flow fluctuations from neighbors, employing two extra capacity distributions: a degree-based distribution and an average distribution. Four heuristic key-node discovery methods are presented: High-Degree-First (HDF), Low-Degree-First (LDF), Random, and a Greedy Algorithm (GA). Extensive simulations are carried out on both scale-free and random networks. The results show that the greedy algorithm can efficiently find the set of key nodes in both scale-free and random networks. Our work studies network robustness against cascading failures from a novel perspective, and the methods and results are useful for network robustness evaluation and protection.
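
A toy version of the local redistribution model and the greedy key-node search can be sketched as follows. This is an illustration under simplifying assumptions (initial load equal to degree, a single uniform tolerance parameter standing in for the extra-capacity distribution), not the paper's exact model:

```python
def build_graph(edges):
    """Undirected adjacency map from an edge list."""
    g = {}
    for u, v in edges:
        g.setdefault(u, set()).add(v)
        g.setdefault(v, set()).add(u)
    return g

def cascade_size(g, seeds, alpha=0.25):
    """Fail the seed nodes; each failed node's load is split equally among its
    surviving neighbours, and a node fails once its load exceeds capacity."""
    load = {n: float(len(nb)) for n, nb in g.items()}   # initial load ~ degree
    cap = {n: (1 + alpha) * load[n] for n in g}         # capacity = load + tolerance
    failed, frontier = set(), list(seeds)
    while frontier:
        nxt = []
        for n in frontier:
            if n in failed:
                continue
            failed.add(n)
            alive = [m for m in g[n] if m not in failed]
            if not alive:
                continue
            share = load[n] / len(alive)
            for m in alive:
                load[m] += share
                if load[m] > cap[m]:
                    nxt.append(m)                        # overloaded -> fails next round
        frontier = nxt
    return len(failed)

def greedy_key_nodes(g, k, alpha=0.25):
    """Greedily pick the k seed nodes that maximise the resulting cascade (GA)."""
    chosen = []
    for _ in range(k):
        best = max((n for n in g if n not in chosen),
                   key=lambda n: cascade_size(g, chosen + [n], alpha))
        chosen.append(best)
    return chosen
```

On a star graph, for instance, the greedy search immediately identifies the hub, whose failure overloads every leaf, as the key node.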

  18. Study protocol: developing a decision system for inclusive housing: applying a systematic, mixed-method quasi-experimental design.

    PubMed

    Zeeman, Heidi; Kendall, Elizabeth; Whitty, Jennifer A; Wright, Courtney J; Townsend, Clare; Smith, Dianne; Lakhani, Ali; Kennerley, Samantha

    2016-03-15

    Identifying the housing preferences of people with complex disabilities is a much needed, but under-developed area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure resources are used effectively and the needs of consumers with complex disability are properly met. This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, Government housing services) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing the Analytic Hierarchy Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision-making processes for inclusive housing development (e.g., Risk, Opportunity, Cost, Benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested.
    It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and contribute to future housing policy. The research design has not been applied in the Australian research context or elsewhere, and will provide a much needed blueprint for market investment to develop viable, consumer-directed inclusive housing options for people with complex disability.
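
AHP, used with the experimental group above, derives priority weights for criteria from a matrix of pairwise comparison judgments. A minimal sketch of the standard weight calculation, using the geometric-mean (logarithmic least squares) approximation; the comparison matrix in the test is hypothetical, not from the study:

```python
import math

def ahp_weights(M):
    """AHP priority weights via the geometric-mean approximation:
    take each row's geometric mean of the pairwise comparison matrix M,
    then normalise so the weights sum to 1."""
    gm = [math.prod(row) ** (1 / len(row)) for row in M]
    total = sum(gm)
    return [g / total for g in gm]
```

For a 2x2 matrix in which criterion 1 is judged three times as important as criterion 2, the method returns weights of 0.75 and 0.25.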

  19. Video encryption using chaotic masks in joint transform correlator

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2015-03-01

    A real-time optical video encryption technique using a chaotic map has been reported. In the proposed technique, each frame of video is encrypted using two different chaotic random phase masks in the joint transform correlator architecture. The different chaotic random phase masks can be obtained either by using different iteration levels or by using different seed values of the chaotic map. The use of different chaotic random phase masks makes the decryption process very complex for an unauthorized person. Optical, as well as digital, methods can be used for video encryption but the decryption is possible only digitally. To further enhance the security of the system, the key parameters of the chaotic map are encoded using RSA (Rivest-Shamir-Adleman) public key encryption. Numerical simulations are carried out to validate the proposed technique.
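
A chaotic random phase mask of the kind described can be sketched with a logistic map. This is a generic illustration (the paper's particular map and parameters are not specified here), with the seed value and the discarded-iteration count acting as keys:

```python
import cmath, math

def chaotic_phase_mask(rows, cols, seed=0.3, r=3.99, skip=1000):
    """Logistic-map (x -> r*x*(1-x)) chaotic sequence mapped to unit-modulus
    phase factors exp(2*pi*i*x); seed and skip (iteration level) act as keys."""
    x = seed
    for _ in range(skip):                  # discard the transient; skip is a key too
        x = r * x * (1 - x)
    mask = []
    for _ in range(rows):
        row = []
        for _ in range(cols):
            x = r * x * (1 - x)
            row.append(cmath.exp(2j * math.pi * x))   # pure phase, |value| = 1
        mask.append(row)
    return mask
```

Regenerating the mask requires the exact seed, map parameter, and iteration level; a slightly different seed yields a completely different mask, which is what makes decryption by an unauthorized person so difficult.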

  20. Xinyinqin: a computer-based heart sound simulator.

    PubMed

    Zhan, X X; Pei, J H; Xiao, Y H

    1995-01-01

    "Xinyinqin" is the Chinese phoneticized name of the Heart Sound Simulator (HSS). The "qin" in "Xinyinqin" is the Chinese name of a category of musical instruments, which means that the operation of HSS is very convenient--like playing an electric piano with the keys. HSS is connected to the GAME I/O of an Apple microcomputer. The generation of sound is controlled by a program. Xinyinqin is used as a teaching aid of Diagnostics. It has been applied in teaching for three years. In this demonstration we will introduce the following functions of HSS: 1) The main program has two modules. The first one is the heart auscultation training module. HSS can output a heart sound selected by the student. Another program module is used to test the student's learning condition. The computer can randomly simulate a certain heart sound and ask the student to name it. The computer gives the student's answer an assessment: "correct" or "incorrect." When the answer is incorrect, the computer will output that heart sound again for the student to listen to; this process is repeated until she correctly identifies it. 2) The program is convenient to use and easy to control. By pressing the S key, it is able to output a slow heart rate until the student can clearly identify the rhythm. The heart rate, like the actual rate of a patient, can then be restored by hitting any key. By pressing the SPACE BAR, the heart sound output can be stopped to allow the teacher to explain something to the student. The teacher can resume playing the heart sound again by hitting any key; she can also change the content of the training by hitting RETURN key. In the future, we plan to simulate more heart sounds and incorporate relevant graphs.

  1. Duodenal-jejunal bypass surgery up-regulates the expression of the hepatic insulin signaling proteins and the key regulatory enzymes of intestinal gluconeogenesis in diabetic Goto-Kakizaki rats.

    PubMed

    Sun, Dong; Wang, Kexin; Yan, Zhibo; Zhang, Guangyong; Liu, Shaozhuang; Liu, Fengjun; Hu, Chunxiao; Hu, Sanyuan

    2013-11-01

    Duodenal-jejunal bypass (DJB), which is not routinely applied in metabolic surgery, is an effective surgical procedure for type 2 diabetes mellitus resolution. However, the underlying mechanisms are still undefined. Our aim was to investigate the diabetic improvement induced by DJB and to explore the changes in hepatic insulin signaling proteins and regulatory enzymes of gluconeogenesis after DJB in a non-obese diabetic rat model. Sixteen adult male Goto-Kakizaki rats were randomly divided into DJB and sham-operated groups. Body weight, food intake, hormone levels, and glucose metabolism were measured. The levels of protein expression and phosphorylation of insulin receptor-beta (IR-β) and insulin receptor substrate 2 (IRS-2) were evaluated in the liver. We also measured the expression of key regulatory enzymes of gluconeogenesis [phosphoenolpyruvate carboxykinase-1 (PCK1) and glucose-6-phosphatase-alpha (G6Pase-α)] in the small intestine and liver. DJB induced significant diabetic improvement, with higher postprandial glucagon-like peptide 1, peptide YY, and insulin levels, but without weight loss. The DJB group exhibited increased expression and phosphorylation of IR-β and IRS-2 in the liver, up-regulated expression of PCK1 and G6Pase-α in the small intestine, and down-regulated expression of these enzymes in the liver. DJB is thus effective in up-regulating the expression of the key proteins in the hepatic insulin signaling pathway and the key regulatory enzymes of intestinal gluconeogenesis, and in down-regulating the expression of the key regulatory enzymes of hepatic gluconeogenesis, without weight loss. Our study helps to reveal the potential role of the hepatic insulin signaling pathway and intestinal gluconeogenesis in ameliorating insulin resistance after metabolic surgery.

  2. An optical authentication system based on imaging of excitation-selected lanthanide luminescence.

    PubMed

    Carro-Temboury, Miguel R; Arppe, Riikka; Vosch, Tom; Sørensen, Thomas Just

    2018-01-01

    Secure data encryption relies heavily on one-way functions, and copy protection relies on features that are difficult to reproduce. We present an optical authentication system based on lanthanide luminescence from physical one-way functions or physical unclonable functions (PUFs). They cannot be reproduced and thus enable unbreakable encryption. Further, PUFs will prevent counterfeiting if tags with unique PUFs are grafted onto products. We have developed an authentication system that comprises a hardware reader, image analysis, and authentication software and physical keys that we demonstrate as an anticounterfeiting system. The physical keys are PUFs made from random patterns of taggants in polymer films on glass that can be imaged following selected excitation of particular lanthanide(III) ions doped into the individual taggants. This form of excitation-selected imaging ensures that by using at least two lanthanide(III) ion dopants, the random patterns cannot be copied, because the excitation selection will fail when using any other emitter. With the developed reader and software, the random patterns are read and digitized, which allows a digital pattern to be stored. This digital pattern or digital key can be used to authenticate the physical key in anticounterfeiting or to encrypt any message. The PUF key was produced with a staggering nominal encoding capacity of 7^3600. Although the encoding capacity of the realized authentication system reduces to 6 × 10^104, it is more than sufficient to completely preclude counterfeiting of products.

  3. Establishing security of quantum key distribution without monitoring disturbance

    NASA Astrophysics Data System (ADS)

    Koashi, Masato

    2015-10-01

    In conventional quantum key distribution (QKD) protocols, the information leaked to an eavesdropper is estimated through the basic principle of quantum mechanics expressed in the original version of Heisenberg's uncertainty principle. The amount of leaked information on a shared sifted key is bounded from above essentially by using information-disturbance trade-off relations, based on the amount of signal disturbance measured via randomly sampled or inserted probe signals. Here we discuss an entirely different avenue toward private communication, which does not rely on the information-disturbance trade-off relations and hence does not require monitoring of signal disturbance. The independence of the amount of privacy amplification from that of disturbance tends to give the protocol a high tolerance to channel noise. Lifting the burden of precise statistical estimation of disturbance also leads to a favorable finite-key-size effect. A protocol based on this novel principle can be implemented using only photon detectors and classical optics tools: a laser, a phase modulator, and an interferometer. The protocol resembles the differential-phase-shift QKD protocol in that both share a simple binary phase-shift keying on a coherent train of weak pulses from a laser. The difference lies in the use of a variable-delay interferometer in the new protocol, which randomly changes the combination of pulse pairs to be superposed. This extra randomness turns out to be enough to upper-bound the information extracted by the eavesdropper, regardless of how the quantum signal has been disturbed.

  4. Completely device-independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Aguilar, Edgar A.; Ramanathan, Ravishankar; Kofler, Johannes; Pawłowski, Marcin

    2016-08-01

    Quantum key distribution (QKD) is a provably secure way for two distant parties to establish a common secret key, which then can be used in a classical cryptographic scheme. Using quantum entanglement, one can reduce the necessary assumptions that the parties have to make about their devices, giving rise to device-independent QKD (DIQKD). However, in all existing protocols to date the parties need to have an initial (at least partially) random seed as a resource. In this work, we show that this requirement can be dropped. Using recent advances in the fields of randomness amplification and randomness expansion, we demonstrate that it is sufficient for the message the parties want to communicate to be (partially) unknown to the adversaries—an assumption without which any type of cryptography would be pointless to begin with. One party can use her secret message to locally generate a secret sequence of bits, which can then be openly used by herself and the other party in a DIQKD protocol. Hence our work reduces the requirements needed to perform secure DIQKD and establish safe communication.

  5. A biometric method to secure telemedicine systems.

    PubMed

    Zhang, G H; Poon, Carmen C Y; Li, Ye; Zhang, Y T

    2009-01-01

    Security and privacy are among the most crucial issues for data transmission in telemedicine systems. This paper proposes a solution for securing wireless data transmission in telemedicine systems, i.e. within a body sensor network (BSN), between the BSN and the server, as well as between the server and the professionals who have access to the server. A unique feature of this solution is the generation of random keys from physiological data (i.e. a biometric approach) for securing communication at all three levels. In the performance analysis, the inter-pulse interval of the photoplethysmogram is used as an example to generate these biometric keys to protect wireless data transmission. The results of statistical analysis and computational complexity suggest that this type of key is random enough to make telemedicine systems resistant to attacks.
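
The idea of deriving a key from inter-pulse intervals (IPIs) can be sketched as follows. This is a toy illustration, not the paper's exact extraction scheme, and the bit budget per interval is an arbitrary choice; the appeal of the approach is that sensors on the same body observe the same pulse train, so each node can derive the same bits locally without ever transmitting the key:

```python
def ipi_to_key(ipis_ms, bits_per_ipi=4):
    """Concatenate the low-order bits of each inter-pulse interval (in ms);
    the low bits carry most of the beat-to-beat physiological randomness."""
    return ''.join(format(int(ipi) % (1 << bits_per_ipi), '0{}b'.format(bits_per_ipi))
                   for ipi in ipis_ms)
```

A real system would add error correction (fuzzy commitment or similar) so that small measurement differences between sensors do not change the derived key.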

  6. Hormone-Balancing Effect of Pre-Gelatinized Organic Maca (Lepidium peruvianum Chacon): (III) Clinical responses of early-postmenopausal women to Maca in double blind, randomized, Placebo-controlled, crossover configuration, outpatient study

    PubMed Central

    Meissner, H. O.; Mscisz, A.; Reich-Bilinska, H.; Mrozikiewicz, P.; Bobkiewicz-Kozlowska, T.; Kedzia, B.; Lowicka, A.; Barchia, I.

    2006-01-01

    This is the second, conclusive part of the clinical study on the responses of early-postmenopausal women to standardized doses of pre-Gelatinized Organic Maca (Maca-GO). A total of 34 Caucasian women volunteers participated in a double-blind, randomized, four-month outpatient crossover trial. After fulfilling the criteria of being early-postmenopausal, blood Estrogen (E2<40 pg/ml) and Follicle Stimulating Hormone (FSH>30 IU/ml) at admission, they were randomly allocated to Placebo (P) and Maca-GO (M) treatments (2 groups of 11 participants each). Two 500 mg vegetable hard gel capsules with Maca-GO or Placebo powder were self-administered twice daily with meals (total 2 g/day). At admission and at monthly follow-up intervals, body mass index (BMI), blood pressure, levels of gonadal, pituitary, thyroid and adrenal hormones, lipids and key minerals were measured. Bone markers were determined after four months of M and P use in 12 participants. Menopausal symptoms were assessed according to Greene's Score (GMS) and Kupperman's Index (KMI). Data were analyzed using a multivariate technique on blocks of monthly results, and a canonical variate technique was applied to the GMS and KMI matrices. Two months of Maca-GO application stimulated (P<0.05) production of E2 and suppressed (P<0.05) blood FSH, Thyroid (T3) and Adrenocorticotropic hormones, Cortisol, and BMI, increased (P<0.05) low-density lipoproteins and blood Iron, and alleviated (P<0.001) menopausal symptoms. Maca-GO noticeably increased bone density markers. In conclusion, Maca-GO applied to early-postmenopausal women (i) acted as a toner of hormonal processes along the Hypothalamus-Pituitary-Ovarian axis, (ii) balanced hormone levels and (iii) relieved symptoms of menopausal discomfort (hot flushes and night sweating in particular), and thus (iv) exhibited a distinctive function peculiar to adaptogens, providing an alternative non-hormonal plant option to reduce dependence on hormone replacement therapy (HRT) programs.
PMID:23675006

  7. Hormone-Balancing Effect of Pre-Gelatinized Organic Maca (Lepidium peruvianum Chacon): (III) Clinical responses of early-postmenopausal women to Maca in double blind, randomized, Placebo-controlled, crossover configuration, outpatient study.

    PubMed

    Meissner, H O; Mscisz, A; Reich-Bilinska, H; Mrozikiewicz, P; Bobkiewicz-Kozlowska, T; Kedzia, B; Lowicka, A; Barchia, I

    2006-12-01

    This is the second, conclusive part of the clinical study on the responses of early-postmenopausal women to standardized doses of pre-Gelatinized Organic Maca (Maca-GO). A total of 34 Caucasian women volunteers participated in a double-blind, randomized, four-month outpatient crossover trial. After fulfilling the criteria of being early-postmenopausal, blood Estrogen (E2<40 pg/ml) and Follicle Stimulating Hormone (FSH>30 IU/ml) at admission, they were randomly allocated to Placebo (P) and Maca-GO (M) treatments (2 groups of 11 participants each). Two 500 mg vegetable hard gel capsules with Maca-GO or Placebo powder were self-administered twice daily with meals (total 2 g/day). At admission and at monthly follow-up intervals, body mass index (BMI), blood pressure, levels of gonadal, pituitary, thyroid and adrenal hormones, lipids and key minerals were measured. Bone markers were determined after four months of M and P use in 12 participants. Menopausal symptoms were assessed according to Greene's Score (GMS) and Kupperman's Index (KMI). Data were analyzed using a multivariate technique on blocks of monthly results, and a canonical variate technique was applied to the GMS and KMI matrices. Two months of Maca-GO application stimulated (P<0.05) production of E2 and suppressed (P<0.05) blood FSH, Thyroid (T3) and Adrenocorticotropic hormones, Cortisol, and BMI, increased (P<0.05) low-density lipoproteins and blood Iron, and alleviated (P<0.001) menopausal symptoms. Maca-GO noticeably increased bone density markers. In conclusion, Maca-GO applied to early-postmenopausal women (i) acted as a toner of hormonal processes along the Hypothalamus-Pituitary-Ovarian axis, (ii) balanced hormone levels and (iii) relieved symptoms of menopausal discomfort (hot flushes and night sweating in particular), and thus (iv) exhibited a distinctive function peculiar to adaptogens, providing an alternative non-hormonal plant option to reduce dependence on hormone replacement therapy (HRT) programs.

  8. Randomly auditing research labs could be an affordable way to improve research quality: A simulation study

    PubMed Central

    Barnett, Adrian G.; Zardo, Pauline; Graves, Nicholas

    2018-01-01

    The “publish or perish” incentive drives many researchers to increase the quantity of their papers at the cost of quality. Lowering quality increases the number of false positive errors which is a key cause of the reproducibility crisis. We adapted a previously published simulation of the research world where labs that produce many papers are more likely to have “child” labs that inherit their characteristics. This selection creates a competitive spiral that favours quantity over quality. To try to halt the competitive spiral we added random audits that could detect and remove labs with a high proportion of false positives, and also improved the behaviour of “child” and “parent” labs who increased their effort and so lowered their probability of making a false positive error. Without auditing, only 0.2% of simulations did not experience the competitive spiral, defined by a convergence to the highest possible false positive probability. Auditing 1.35% of papers avoided the competitive spiral in 71% of simulations, and auditing 1.94% of papers in 95% of simulations. Audits worked best when they were only applied to established labs with 50 or more papers compared with labs with 25 or more papers. Adding a ±20% random error to the number of false positives to simulate peer reviewer error did not reduce the audits’ efficacy. The main benefit of the audits was via the increase in effort in “child” and “parent” labs. Audits improved the literature by reducing the number of false positives from 30.2 per 100 papers to 12.3 per 100 papers. Auditing 1.94% of papers would cost an estimated $15.9 million per year if applied to papers produced by National Institutes of Health funding. Our simulation greatly simplifies the research world and there are many unanswered questions about if and how audits would work that can only be addressed by a trial of an audit. PMID:29649314

  9. Randomly auditing research labs could be an affordable way to improve research quality: A simulation study.

    PubMed

    Barnett, Adrian G; Zardo, Pauline; Graves, Nicholas

    2018-01-01

    The "publish or perish" incentive drives many researchers to increase the quantity of their papers at the cost of quality. Lowering quality increases the number of false positive errors which is a key cause of the reproducibility crisis. We adapted a previously published simulation of the research world where labs that produce many papers are more likely to have "child" labs that inherit their characteristics. This selection creates a competitive spiral that favours quantity over quality. To try to halt the competitive spiral we added random audits that could detect and remove labs with a high proportion of false positives, and also improved the behaviour of "child" and "parent" labs who increased their effort and so lowered their probability of making a false positive error. Without auditing, only 0.2% of simulations did not experience the competitive spiral, defined by a convergence to the highest possible false positive probability. Auditing 1.35% of papers avoided the competitive spiral in 71% of simulations, and auditing 1.94% of papers in 95% of simulations. Audits worked best when they were only applied to established labs with 50 or more papers compared with labs with 25 or more papers. Adding a ±20% random error to the number of false positives to simulate peer reviewer error did not reduce the audits' efficacy. The main benefit of the audits was via the increase in effort in "child" and "parent" labs. Audits improved the literature by reducing the number of false positives from 30.2 per 100 papers to 12.3 per 100 papers. Auditing 1.94% of papers would cost an estimated $15.9 million per year if applied to papers produced by National Institutes of Health funding. Our simulation greatly simplifies the research world and there are many unanswered questions about if and how audits would work that can only be addressed by a trial of an audit.

  10. Learning and innovative elements of strategy adoption rules expand cooperative network topologies.

    PubMed

    Wang, Shijun; Szalay, Máté S; Zhang, Changshui; Csermely, Peter

    2008-04-09

    Cooperation plays a key role in the evolution of complex systems. However, the level of cooperation varies extensively with the topology of agent networks in the widely used models of repeated games. Here we show that cooperation remains rather stable when applying the reinforcement-learning strategy adoption rule Q-learning on a variety of random, regular, small-world, scale-free and modular network models in repeated, multi-agent Prisoner's Dilemma and Hawk-Dove games. Furthermore, we found that, in the above model systems, other long-term learning strategy adoption rules also promote cooperation, while introducing a low level of noise (as a model of innovation) to the strategy adoption rules makes the level of cooperation less dependent on the actual network topology. Our results demonstrate that long-term learning and random elements in the strategy adoption rules, when acting together, extend the range of network topologies enabling the development of cooperation at a wider range of costs and temptations. These results suggest that a balanced duo of learning and innovation may help to preserve cooperation during the re-organization of real-world networks, and may play a prominent role in the evolution of self-organizing, complex systems.
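
The core setup, Q-learning agents in a repeated Prisoner's Dilemma, can be sketched for a single pair of agents. This is a minimal illustration with conventional payoffs (T=5, R=3, P=1, S=0) and hypothetical learning parameters, not the paper's full multi-agent network model; the epsilon-greedy exploration plays the role of the "innovation" noise discussed above:

```python
import random

# Prisoner's Dilemma payoffs for the row player: T=5 > R=3 > P=1 > S=0
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def qlearning_ipd(rounds=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Two epsilon-greedy Q-learners in the repeated game; the state both
    agents condition on is the previous joint action. Returns the fraction
    of cooperative moves over the run."""
    rng = random.Random(seed)
    actions = ('C', 'D')
    states = [(a, b) for a in actions for b in actions]
    Q = [{(s, a): 0.0 for s in states for a in actions} for _ in range(2)]

    def pick(q, s):
        if rng.random() < eps:                 # exploration ("innovation" noise)
            return rng.choice(actions)
        return max(actions, key=lambda a: q[(s, a)])

    s, coop = ('C', 'C'), 0
    for _ in range(rounds):
        a0, a1 = pick(Q[0], s), pick(Q[1], s)
        coop += (a0 == 'C') + (a1 == 'C')
        s2 = (a0, a1)
        for q, a, r in ((Q[0], a0, PAYOFF[(a0, a1)]), (Q[1], a1, PAYOFF[(a1, a0)])):
            best_next = max(q[(s2, b)] for b in actions)
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])   # Q-update
        s = s2
    return coop / (2 * rounds)
```

In the network versions studied in the paper, each agent would instead play against its topological neighbors and update one shared Q-table over those interactions.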

  11. Dynamics of Quantum Adiabatic Evolution Algorithm for Number Partitioning

    NASA Technical Reports Server (NTRS)

    Smelyanskiy, V. N.; Toussaint, U. V.; Timucin, D. A.

    2002-01-01

    We have developed a general technique to study the dynamics of the quantum adiabatic evolution algorithm applied to random combinatorial optimization problems in the asymptotic limit of large problem size n. We use as an example the NP-complete Number Partitioning problem and map the algorithm dynamics to that of an auxiliary quantum spin glass system with a slowly varying Hamiltonian. We use a Green function method to obtain the adiabatic eigenstates and the minimum excitation gap, g_min = O(n 2^(-n/2)), corresponding to the exponential complexity of the algorithm for Number Partitioning. The key element of the analysis is the conditional energy distribution computed for the set of all spin configurations generated from a given (ancestor) configuration by simultaneous flipping of a fixed number of spins. For the problem in question this distribution is shown to depend on the ancestor spin configuration only via a certain parameter related to the energy of the configuration. As a result, the algorithm dynamics can be described in terms of one-dimensional quantum diffusion in the energy space. This effect provides a general limitation of quantum adiabatic computation in random optimization problems. Analytical results are in agreement with the numerical simulation of the algorithm.

  12. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    PubMed

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model, based on estimating missing values followed by variable selection, to forecast a reservoir's water level. The study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015; the two datasets were concatenated by date into an integrated research dataset. The proposed time-series forecasting model has three foci. First, this study uses five imputation methods to estimate the missing values rather than simply deleting them. Second, we identify the key variables via factor analysis and then delete the unimportant variables sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, for comparison with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model combined with variable selection has better forecasting performance than the listed models with full variables. In addition, the experiments show that the proposed variable selection can help the five forecast methods used here improve their forecasting capability.

  13. Dynamics of Quantum Adiabatic Evolution Algorithm for Number Partitioning

    NASA Technical Reports Server (NTRS)

    Smelyanskiy, Vadim; von Toussaint, Udo V.; Timucin, Dogan A.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We have developed a general technique to study the dynamics of the quantum adiabatic evolution algorithm applied to random combinatorial optimization problems in the asymptotic limit of large problem size n. We use as an example the NP-complete Number Partitioning problem and map the algorithm dynamics to that of an auxiliary quantum spin glass system with a slowly varying Hamiltonian. We use a Green function method to obtain the adiabatic eigenstates and the minimum excitation gap, g_min = O(n 2^(-n/2)), corresponding to the exponential complexity of the algorithm for Number Partitioning. The key element of the analysis is the conditional energy distribution computed for the set of all spin configurations generated from a given (ancestor) configuration by simultaneous flipping of a fixed number of spins. For the problem in question this distribution is shown to depend on the ancestor spin configuration only via a certain parameter related to the energy of the configuration. As a result, the algorithm dynamics can be described in terms of one-dimensional quantum diffusion in the energy space. This effect provides a general limitation of quantum adiabatic computation in random optimization problems. Analytical results are in agreement with the numerical simulation of the algorithm.
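The underlying cost function is easy to state: in Number Partitioning, each spin configuration s ∈ {±1}^n assigns numbers to one of two subsets, and the energy is the partition residue. A brute-force sketch on a tiny hypothetical instance:

```python
import itertools

# Number Partitioning as a spin system: numbers a_i, spins s_i = +/-1,
# energy E(s) = |sum_i s_i * a_i| (the partition residue).
a = [8, 7, 6, 5, 4]  # hypothetical instance; note 8 + 7 = 6 + 5 + 4

def energy(spins):
    return abs(sum(s * x for s, x in zip(spins, a)))

configs = list(itertools.product((1, -1), repeat=len(a)))
energies = sorted(energy(c) for c in configs)
best = energies[0]  # 0 exactly when a perfect partition exists
```

The adiabatic algorithm must find the ground state of this landscape; the exponentially small gap quoted above is what makes that hard for large n.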

  14. [Using fractional polynomials to estimate the safety threshold of fluoride in drinking water].

    PubMed

    Pan, Shenling; An, Wei; Li, Hongyan; Yang, Min

    2014-01-01

    To study the dose-response relationship between fluoride content in drinking water and the prevalence of dental fluorosis on the national scale, and thereby determine the safety threshold of fluoride in drinking water. Meta-regression analysis was applied to the 2001-2002 national endemic fluorosis survey data of key wards. First, a fractional polynomial (FP) was adopted to establish a fixed-effect model and determine the best FP structure; then restricted maximum likelihood (REML) was adopted to estimate the between-study variance, and the best random-effect model was established. The best FP structure was a first-order logarithmic transformation. Based on the best random-effect model, the benchmark dose (BMD) of fluoride in drinking water and its lower limit (BMDL) were calculated as 0.98 mg/L and 0.78 mg/L, respectively. Fluoride in drinking water explained only 35.8% of the variability in prevalence; among the other influencing factors, ward type was significant, while temperature condition and altitude were not. The fractional polynomial-based meta-regression method is simple and practical and provides a good fit; based on it, the safety threshold of fluoride in drinking water in our country is determined to be 0.8 mg/L.
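The first-order logarithmic FP model has the form logit(p) = b0 + b1 ln(dose), which can be inverted in closed form to obtain a benchmark dose. A minimal sketch with hypothetical coefficients (not the fitted values from the survey data):

```python
import math

# First-order logarithmic FP dose-response with hypothetical coefficients:
#   logit(p) = b0 + b1 * ln(dose)
b0, b1 = -1.5, 2.0

def prevalence(dose):
    z = b0 + b1 * math.log(dose)
    return 1.0 / (1.0 + math.exp(-z))

# Benchmark dose: the dose at which prevalence reaches a benchmark response
# (an absolute 10% here, illustrative only). Inverting the model:
#   BMD = exp((logit(target) - b0) / b1)
target = 0.10
bmd = math.exp((math.log(target / (1.0 - target)) - b0) / b1)
```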

  15. Optimal production lot size and reorder point of a two-stage supply chain while random demand is sensitive with sales teams' initiatives

    NASA Astrophysics Data System (ADS)

    Sankar Sana, Shib

    2016-01-01

    The paper develops a production-inventory model of a two-stage supply chain consisting of one manufacturer and one retailer to study production lot size/order quantity, reorder point, and sales teams' initiatives, where demand of the end customers depends simultaneously on a random variable and on the sales teams' initiatives. The manufacturer produces the retailer's order quantity in one lot, in which the procurement cost per unit quantity follows a realistic convex function of production lot size. In the chain, the cost of the sales teams' initiatives/promotion efforts and the wholesale price of the manufacturer are negotiated at points such that their optimum profits come close to their target profits. This study helps the management of firms determine the optimal order quantity/production quantity, reorder point, and sales teams' initiatives/promotional effort in order to achieve their maximum profits. An analytical method is applied to determine the optimal values of the decision variables. Finally, numerical examples with graphical presentation and a sensitivity analysis of the key parameters are presented to give more insight into the model.
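The decision structure can be illustrated with a crude Monte Carlo grid search over order quantity and promotion effort when demand is random and effort-sensitive. All prices, costs, and demand parameters below are hypothetical; the paper itself derives its optima analytically, not by search.

```python
import random

random.seed(1)

# Hypothetical retailer economics: unit price, unit cost, cost per unit of
# promotion effort (none of these come from the paper).
price, cost, effort_cost = 10.0, 4.0, 50.0

def expected_profit(q, effort, n_sims=2000):
    """Monte Carlo expected profit for order quantity q and effort level."""
    total = 0.0
    for _ in range(n_sims):
        # Demand mean shifts upward with sales-team effort.
        demand = max(random.gauss(100.0 + 20.0 * effort, 15.0), 0.0)
        sold = min(q, demand)
        total += price * sold - cost * q - effort_cost * effort
    return total / n_sims

best = max(((q, e) for q in range(80, 181, 10) for e in (0, 1, 2)),
           key=lambda qe: expected_profit(*qe))
```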

  16. Background fluorescence estimation and vesicle segmentation in live cell imaging with conditional random fields.

    PubMed

    Pécot, Thierry; Bouthemy, Patrick; Boulanger, Jérôme; Chessel, Anatole; Bardin, Sabine; Salamero, Jean; Kervrann, Charles

    2015-02-01

    Image analysis applied to fluorescence live cell microscopy has become a key tool in molecular biology, since it enables characterization of biological processes in space and time at the subcellular level. In fluorescence microscopy imaging, the moving tagged structures of interest, such as vesicles, appear as bright spots over a static or nonstatic background. In this paper, we consider the problem of vesicle segmentation and time-varying background estimation at the cellular scale. The main idea is to formulate the joint segmentation-estimation problem in the general conditional random field framework. Segmentation of vesicles and background estimation are then performed alternately by energy minimization using a min-cut/max-flow algorithm. The proposed approach relies on a detection measure computed from intensity contrasts between neighboring blocks in fluorescence microscopy images. This approach permits analysis of either 2D + time or 3D + time data. We demonstrate the performance of the so-called C-CRAFT through an experimental comparison with state-of-the-art methods in fluorescence video-microscopy. We also use this method to characterize the spatial and temporal distribution of Rab6 transport carriers at the cell periphery for two different specific adhesion geometries.

  17. Password-only authenticated three-party key exchange with provable security in the standard model.

    PubMed

    Nam, Junghyun; Choo, Kim-Kwang Raymond; Kim, Junghwan; Kang, Hyun-Kyu; Kim, Jinsoo; Paik, Juryon; Won, Dongho

    2014-01-01

    Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual password shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrated that our protocol achieves not only the typical indistinguishability-based security of session keys but also the password security against undetectable online dictionary attacks.

  18. Single-electron random-number generator (RNG) for highly secure ubiquitous computing applications

    NASA Astrophysics Data System (ADS)

    Uchida, Ken; Tanamoto, Tetsufumi; Fujita, Shinobu

    2007-11-01

    Since the security of all modern cryptographic techniques relies on unpredictable and irreproducible digital keys generated by random-number generators (RNGs), the realization of high-quality RNG is essential for secure communications. In this report, a new RNG, which utilizes single-electron phenomena, is proposed. A room-temperature operating silicon single-electron transistor (SET) having nearby an electron pocket is used as a high-quality, ultra-small RNG. In the proposed RNG, stochastic single-electron capture/emission processes to/from the electron pocket are detected with high sensitivity by the SET, and result in giant random telegraphic signals (GRTS) on the SET current. It is experimentally demonstrated that the single-electron RNG generates extremely high-quality random digital sequences at room temperature, in spite of its simple configuration. Because of its small-size and low-power properties, the single-electron RNG is promising as a key nanoelectronic device for future ubiquitous computing systems with highly secure mobile communication capabilities.
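Output quality of any such RNG is typically validated with statistical test batteries. A minimal sketch of one such check, a NIST-style monobit frequency test (illustrative, not the authors' evaluation protocol):

```python
import math
import random

# NIST-style monobit frequency test: for unbiased bits, the normalized
# excess of ones s_obs = |#ones - #zeros| / sqrt(n) is ~half-normal, and
# p = erfc(s_obs / sqrt(2)) should be well away from 0.
def monobit_p_value(bits):
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(n) / math.sqrt(2.0))

random.seed(42)
good = [random.getrandbits(1) for _ in range(10_000)]
biased = [1] * 9_000 + [0] * 1_000

p_good = monobit_p_value(good)      # should not signal non-randomness
p_biased = monobit_p_value(biased)  # should be essentially zero
```

A real certification run would apply the full suite of such tests to long sequences of the device's output.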

  19. Optical information authentication using compressed double-random-phase-encoded images and quick-response codes.

    PubMed

    Wang, Xiaogang; Chen, Wen; Chen, Xudong

    2015-03-09

    In this paper, we develop a new optical information authentication system based on compressed double-random-phase-encoded images and quick-response (QR) codes, where the parameters of the optical lightwave are used as keys for optical decryption and the QR code is a key for verification. An input image attached with a QR code is first optically encoded in a simplified double random phase encoding (DRPE) scheme without using an interferometric setup. From the single encoded intensity pattern recorded by a CCD camera, a compressed double-random-phase-encoded image, i.e., the sparse phase distribution used for optical decryption, is generated by using an iterative phase retrieval technique with the QR code. We compare this technique to the other two methods proposed in the literature, i.e., Fresnel domain information authentication based on the classical DRPE with holographic technique and information authentication based on DRPE and a phase retrieval algorithm. Simulation results show that QR codes are effective in improving the security and data sparsity of an optical information encryption and authentication system.
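The classical DRPE step underlying this scheme can be sketched numerically with FFTs: two random phase masks, one applied in the spatial domain and one in the Fourier domain, act as the keys. This is a generic DRPE sketch, not the authors' lensless, QR-code-based pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((32, 32))  # stand-in for the input image

# Classical 4f-style DRPE: one random phase mask in the spatial domain,
# one in the Fourier domain; the two mask phases are the keys.
m1 = np.exp(2j * np.pi * rng.random((32, 32)))
m2 = np.exp(2j * np.pi * rng.random((32, 32)))

encoded = np.fft.ifft2(np.fft.fft2(img * m1) * m2)

# Decryption with the conjugate masks reverses the steps exactly.
decoded = np.fft.ifft2(np.fft.fft2(encoded) * np.conj(m2)) * np.conj(m1)
err = float(np.max(np.abs(np.abs(decoded) - img)))
```

Without both masks the encoded field is stationary white noise, which is what gives DRPE its encryption property.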

  20. Practical quantum key distribution protocol without monitoring signal disturbance.

    PubMed

    Sasaki, Toshihiko; Yamamoto, Yoshihisa; Koashi, Masato

    2014-05-22

    Quantum cryptography exploits the fundamental laws of quantum mechanics to provide a secure way to exchange private information. Such an exchange requires a common random bit sequence, called a key, to be shared secretly between the sender and the receiver. The basic idea behind quantum key distribution (QKD) has widely been understood as the property that any attempt to distinguish encoded quantum states causes a disturbance in the signal. As a result, implementation of a QKD protocol involves an estimation of the experimental parameters influenced by the eavesdropper's intervention, which is achieved by randomly sampling the signal. If the estimation of many parameters with high precision is required, the portion of the signal that is sacrificed increases, thus decreasing the efficiency of the protocol. Here we propose a QKD protocol based on an entirely different principle. The sender encodes a bit sequence onto non-orthogonal quantum states and the receiver randomly dictates how a single bit should be calculated from the sequence. The eavesdropper, who is unable to learn the whole of the sequence, cannot guess the bit value correctly. An achievable rate of secure key distribution is calculated by considering complementary choices between quantum measurements of two conjugate observables. We found that a practical implementation using a laser pulse train achieves a key rate comparable to a decoy-state QKD protocol, an often-used technique for lasers. It also has a better tolerance of bit errors and of finite-sized-key effects. We anticipate that this finding will give new insight into how the probabilistic nature of quantum mechanics can be related to secure communication, and will facilitate the simple and efficient use of conventional lasers for QKD.

  1. Effectiveness of 1% versus 0.2% chlorhexidine gels in reducing alveolar osteitis from mandibular third molar surgery: A randomized, double-blind clinical trial

    PubMed Central

    Bravo-Pérez, Manuel; Sánchez-López, José D.; Muñoz-Soto, Esther; Romero-Olid, María N.; Baca-García, Pilar

    2013-01-01

    Purpose: Alveolar osteitis (AO) is the most common postoperative complication of dental extractions. The purpose of this study was to compare the effectiveness of 1% versus 0.2% chlorhexidine (CHX) gel in reducing postoperative AO after surgical extraction of mandibular third molars, and to assess the impact of treatment on Oral Health-Related Quality of Life (OHRQoL). Material and Methods: This study was a randomized, double-blind clinical trial. Eighty-eight patients underwent surgical extraction of one retained mandibular third molar with the intra-alveolar application of 0.2% CHX gel. Afterwards, they were assigned to one of two groups: 1% CHX gel (n=42) or 0.2% CHX gel (n=46). The patients applied the gel twice a day to the wound for one week. All patients were evaluated for AO. Results: In the 0.2% CHX gel group, a 13% AO incidence was found, while in the 1% CHX gel group, AO incidence was 7%, a difference that was not statistically significant. Variables such as sensation of pain and inflammation at baseline and during one week, as well as OHRQoL of the patients at 24 hours and 7 days post-extraction, showed no statistically significant differences. Conclusions: There are no significant differences in AO after surgical extraction of mandibular third molars when comparing application of 1% CHX gel twice a day for 7 days with 0.2% CHX gel. Key words: Alveolar osteitis, chlorhexidine gel, third molar. PMID:23722126

  2. Practical considerations for estimating clinical trial accrual periods: application to a multi-center effectiveness study

    PubMed Central

    Carter, Rickey E; Sonne, Susan C; Brady, Kathleen T

    2005-01-01

    Background Adequate participant recruitment is vital to the conduct of a clinical trial. Projected recruitment rates are often over-estimated, and the time to recruit the target population (accrual period) is often under-estimated. Methods This report illustrates three approaches to estimating the accrual period and applies the methods to a multi-center, randomized, placebo controlled trial undergoing development. Results Incorporating known sources of accrual variation can yield a more justified estimate of the accrual period. Simulation studies can be incorporated into a clinical trial's planning phase to provide estimates for key accrual summaries including the mean and standard deviation of the accrual period. Conclusion The accrual period of a clinical trial should be carefully considered, and the allocation of sufficient time for participant recruitment is a fundamental aspect of planning a clinical trial. PMID:15796782
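The simulation approach described above can be sketched as a Monte Carlo model of enrollment as a Poisson process pooled across centers; the center count, rates, and recruitment target below are hypothetical, not the trial's actual design values.

```python
import random

random.seed(0)

# Monte Carlo accrual model: enrollments arrive as a Poisson process pooled
# across centers. Target, center count, and rates are hypothetical.
target, centers, rate_per_center = 120, 4, 2.0  # 2 participants/center/month

def accrual_months():
    total_rate = centers * rate_per_center
    t = 0.0
    for _ in range(target):
        t += random.expovariate(total_rate)  # inter-enrollment waiting time
    return t

sims = [accrual_months() for _ in range(2000)]
mean_months = sum(sims) / len(sims)
sd_months = (sum((x - mean_months) ** 2 for x in sims) / len(sims)) ** 0.5
```

The simulated mean (~15 months here) and standard deviation are exactly the "key accrual summaries" such a planning exercise would report.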

  3. The multi-queue model applied to random access protocol

    NASA Astrophysics Data System (ADS)

    Fan, Xinlong

    2013-03-01

    Connecting everything in a sensory and intelligent way is a goal of the smart environment. This paper introduces engineered cell-sensors into multi-agent systems to realize the smart environment. The seamless interface with the natural environment and the strong information-processing ability of cells, together with the achievements of synthetic biology, make the construction of engineered cell-sensors possible. However, engineered cell-sensors are only simple-functional and unreliable computational entities. Therefore, how to combine engineered cell-sensors with digital devices is a key problem in realizing the smart environment. We give the abstract structure and interaction modes of the engineered cell-sensors in order to introduce them into multi-agent systems. We believe that the introduction of engineered cell-sensors will push forward the development of the smart environment.

  4. Is Reducing Uncertain Control the Key to Successful Test Anxiety Intervention for Secondary School Students? Findings from a Randomized Control Trial

    ERIC Educational Resources Information Center

    Putwain, David W.; Pescod, Marc

    2018-01-01

    The aim of the study was to conduct a randomized control trial of a targeted, facilitated, test anxiety intervention for a group of adolescent students, and to examine the mediating role of uncertain control. Fifty-six participants (male = 19, white = 21, mean age = 14.7 years) were randomly allocated to an early intervention or wait-list control…

  5. Optical image cryptosystem using chaotic phase-amplitude masks encoding and least-data-driven decryption by compressive sensing

    NASA Astrophysics Data System (ADS)

    Lang, Jun; Zhang, Jing

    2015-03-01

    In our proposed optical image cryptosystem, two pairs of phase-amplitude masks are generated from the chaotic web map for image encryption in the 4f double random phase-amplitude encoding (DRPAE) system. Instead of transmitting the real keys and the enormous masks codes, only a few observed measurements intermittently chosen from the masks are delivered. Based on compressive sensing paradigm, we suitably refine the series expansions of web map equations to better reconstruct the underlying system. The parameters of the chaotic equations can be successfully calculated from observed measurements and then can be used to regenerate the correct random phase-amplitude masks for decrypting the encoded information. Numerical simulations have been performed to verify the proposed optical image cryptosystem. This cryptosystem can provide a new key management and distribution method. It has the advantages of sufficiently low occupation of the transmitted key codes and security improvement of information transmission without sending the real keys.
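The key-management idea, regenerating a full phase mask from a few chaotic-map parameters instead of transmitting the mask itself, can be sketched with a logistic map standing in for the paper's chaotic web map; the function name and parameter values here are illustrative.

```python
import numpy as np

# Regenerating a phase mask from chaotic-map parameters: only (r, x0) need
# to be shared, not the full mask. A logistic map stands in for the paper's
# chaotic web map; names and values are illustrative.
def chaotic_phase_mask(r, x0, shape, burn_in=100):
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1.0 - x)
    vals = np.empty(shape[0] * shape[1])
    for i in range(vals.size):
        x = r * x * (1.0 - x)
        vals[i] = x
    return np.exp(2j * np.pi * vals.reshape(shape))

mask_a = chaotic_phase_mask(3.99, 0.123, (16, 16))
mask_b = chaotic_phase_mask(3.99, 0.123, (16, 16))  # same keys, same mask
```

Because the map is deterministic, identical parameters regenerate the identical mask at the receiver, which is what keeps the transmitted key material small.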

  6. Novel asymmetric cryptosystem based on distorted wavefront beam illumination and double-random phase encoding.

    PubMed

    Yu, Honghao; Chang, Jun; Liu, Xin; Wu, Chuhan; He, Yifan; Zhang, Yongjian

    2017-04-17

    Herein, we propose a new security enhancing method that employs wavefront aberrations as optical keys to improve the resistance capabilities of conventional double-random phase encoding (DRPE) optical cryptosystems. This study has two main innovations. First, we exploit a special afocal-reflecting beam expander to produce different types of aberrations, and the wavefront distortion can be altered by changing the shape of the afocal-reflecting system using a deformable mirror. Then, we reconstruct the wavefront aberrations via surface fitting of Zernike polynomials and use the reconstructed aberrations as novel asymmetric vector keys. The ideal wavefront and the distorted wavefront obtained by wavefront sensing can be regarded as a pair of private and public keys. The wavelength and focal length of the Fourier lens can be used as additional keys to increase the number of degrees of freedom. This novel cryptosystem can enhance the resistance to various attacks aimed at DRPE systems. Finally, we conduct ZEMAX and MATLAB simulations to demonstrate the superiority of this method.

  7. Cognitive Therapy Versus Exposure and Applied Relaxation in Social Phobia: A Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Clark, David M.; Ehlers, Anke; Hackmann, Ann; McManus, Freda; Fennell, Melanie; Grey, Nick; Waddington, Louise; Wild, Jennifer

    2006-01-01

    A new cognitive therapy (CT) program was compared with an established behavioral treatment. Sixty-two patients meeting Diagnostic and Statistical Manual of Mental Disorders (4th ed.; American Psychiatric Association, 1994) criteria for social phobia were randomly assigned to CT, exposure plus applied relaxation (EXP + AR), or wait-list (WAIT). CT…

  8. New secure communication-layer standard for medical image management (ISCL)

    NASA Astrophysics Data System (ADS)

    Kita, Kouichi; Nohara, Takashi; Hosoba, Minoru; Yachida, Masuyoshi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    1999-07-01

    This paper introduces a summary of the draft standard of ISCL 1.00, which will be published officially by MEDIS-DC. ISCL is an abbreviation of Integrated Secure Communication Layer Protocols for Secure Medical Image Management Systems. ISCL is a security layer which manages security functions between the presentation layer and the TCP/IP layer. The ISCL mechanism depends on the basic functions of a smart IC card and a symmetric secret-key mechanism. A symmetric key for each session is made by the internal authentication function of a smart IC card with a random number. ISCL has three functions, which assure authentication, confidentiality, and integrity. The entity authentication process is done through a 3-path, 4-way method using the internal and external authentication functions of a smart IC card. The confidentiality algorithm and the MAC algorithm for integrity are selectable. ISCL protocols communicate through a Message Block, which consists of a Message Header and Message Data. The ISCL protocols are being evaluated by applying them to a regional collaboration system for image diagnosis and an on-line secure electronic storage system for medical images. These projects are supported by the Medical Information System Development Center and show that ISCL is useful for maintaining security.

  9. [The meaning of medical intervention and religious faith for the elderly cancer patient].

    PubMed

    Teixeira, Jorge Juarez Vieira; Lefèvre, Fernando

    2008-01-01

    This study aimed at identifying the meaning medical intervention and religious faith have for the elderly patient with cancer. A descriptive and qualitative investigation was developed between January 9 and March 28, 2001 in the Hospital do Servidor Público Estadual--Francisco Morato de Oliveira/IAMSPE (Hospital for State Public Servants). The studied sample was not randomized and consisted of 20 elderly men and women with cancer. The data were collected in semi-structured interviews and organized and analyzed using the Collective Subject Discourse method, applying three methodological illustrations: the Central Idea, Key Expressions and the Collective Subject Discourse (CSD). The main central ideas of the discourse material were: 1. Nothing to complain about. I think it is very good and they are on the right track; 2. No. For now, I'm doing everything the doctors say; 3. I've already participated, but not currently; 4. I don't participate in religious activity; 5. Invigoration, hope and balance. Religious faith is everything! 6. It remains the same; however, it changed the way to be. The CSD shows that the adopted medical intervention gave the elderly renewed hope and that religious faith is a key instrument for facing the disease.

  10. Semi-supervised and unsupervised extreme learning machines.

    PubMed

    Huang, Gao; Song, Shiji; Gupta, Jatinder N D; Wu, Cheng

    2014-12-01

    Extreme learning machines (ELMs) have proven to be efficient and effective learning mechanisms for pattern classification and regression. However, ELMs are primarily applied to supervised learning problems. Only a few existing research papers have used ELMs to explore unlabeled data. In this paper, we extend ELMs for both semi-supervised and unsupervised tasks based on manifold regularization, thus greatly expanding the applicability of ELMs. The key advantages of the proposed algorithms are as follows: 1) both the semi-supervised ELM (SS-ELM) and the unsupervised ELM (US-ELM) exhibit the learning capability and computational efficiency of ELMs; 2) both algorithms naturally handle multiclass classification or multicluster clustering; and 3) both algorithms are inductive and can handle unseen data at test time directly. Moreover, it is shown in this paper that all the supervised, semi-supervised, and unsupervised ELMs can actually be put into a unified framework. This provides new perspectives for understanding the mechanism of random feature mapping, which is the key concept in ELM theory. An empirical study on a wide range of data sets demonstrates that the proposed algorithms are competitive with the state-of-the-art semi-supervised or unsupervised learning algorithms in terms of accuracy and efficiency.
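The random feature mapping at the heart of ELM theory can be sketched in a few lines: a random, untrained hidden layer followed by a ridge-regression readout. This is a basic supervised ELM on a toy problem, not the SS-ELM/US-ELM variants of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR-like quadrant labels, which a linear readout alone
# cannot separate but random nonlinear features can.
X = rng.uniform(-1.0, 1.0, (200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

L = 100                                 # hidden units
W = rng.normal(0.0, 1.0, (2, L))        # random, never-trained input weights
b = rng.normal(0.0, 1.0, L)
H = np.tanh(X @ W + b)                  # random feature mapping

lam = 1e-3                              # ridge regularization
beta = np.linalg.solve(H.T @ H + lam * np.eye(L), H.T @ y)

accuracy = float((((H @ beta) > 0.5) == (y > 0.5)).mean())
```

Only the readout weights `beta` are learned, which is what makes ELM training a single linear solve.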

  11. SeaQuaKE: Sea-Optimized Quantum Key Exchange

    DTIC Science & Technology

    2014-08-01

    The Sea-Optimized Quantum Key Exchange (SeaQuaKE) project is led by Applied Communications Sciences under the ONR Free Space Optical Quantum Key Distribution Special Notice (13-SN-0004 under ONRBAA13...). The report addresses aerosol model scenarios. Subject terms: quantum communications, free-space optical communications.

  12. Acquisition of delayed matching in the pigeon.

    PubMed

    Berryman, R; Cumming, W W; Nevin, J A

    1963-01-01

    Pigeons were exposed to three successive matching-to-sample procedures. On a given trial, the sample (red, green or blue light) appeared on a center key; observing responses to this key produced the comparison stimuli on two side keys. Seven different experimental conditions could govern the temporal relations between the sample and comparison stimuli. In the "simultaneous" condition, the center key response was followed immediately by illumination of the side key comparison stimuli, with the center key remaining on. In "zero delay" the center key response simultaneously turned the side keys on and the center key off, while in the "variable delay" conditions, intervals of 1, 2, 4, 10, and 24 sec were interposed between the offset of the sample and the appearance of the comparison stimuli on the side keys. In all conditions, a response to the side key of matching hue produced reinforcement, while a response to the non-matching side key was followed by a blackout. In procedure I all seven experimental conditions were presented in randomly permutated order. After nine sessions of exposure (at 191 trials per session, for a total of 1719 trials) the birds gave no evidence of acquisition in any of the conditions. They were therefore transferred to Procedure II, which required them to match only in the "simultaneous" condition, with both the sample and comparison stimuli present at the same time. With the exception of one bird, all subjects acquired this performance to near 100% levels. Next, in Procedure III, they were once more exposed to presentation of all seven experimental conditions in random order. In contrast to Procedure I, they now acquired the delay performance, and were able to match effectively at delays of about 4 sec.

  13. At least some errors are randomly generated (Freud was wrong)

    NASA Technical Reports Server (NTRS)

    Sellen, A. J.; Senders, J. W.

    1986-01-01

    An experiment was carried out to expose something about human error generating mechanisms. In the context of the experiment, an error was made when a subject pressed the wrong key on a computer keyboard or pressed no key at all in the time allotted. These might be considered, respectively, errors of substitution and errors of omission. Each of seven subjects saw a sequence of three digital numbers, made an easily learned binary judgement about each, and was to press the appropriate one of two keys. Each session consisted of 1,000 presentations of randomly permuted, fixed numbers broken into 10 blocks of 100. One of two keys should have been pressed within one second of the onset of each stimulus. These data were subjected to statistical analyses in order to probe the nature of the error generating mechanisms. Goodness of fit tests for a Poisson distribution for the number of errors per 50 trial interval and for an exponential distribution of the length of the intervals between errors were carried out. There is evidence for an endogenous mechanism that may best be described as a random error generator. Furthermore, an item analysis of the number of errors produced per stimulus suggests the existence of a second mechanism operating on task driven factors producing exogenous errors. Some errors, at least, are the result of constant probability generating mechanisms with error rate idiosyncratically determined for each subject.
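The signature of a constant-probability (random) error generator is that error counts per fixed block of trials are approximately Poisson, i.e., their variance-to-mean ratio is near 1. A simulation sketch with a hypothetical 5% per-trial error rate, not the experiment's data:

```python
import random

random.seed(7)

# A constant-probability error generator yields per-block error counts that
# are approximately Poisson: variance/mean (the dispersion index) is near 1.
# Hypothetical rate: 5% errors per trial, 50-trial blocks.
p_err, block, n_blocks = 0.05, 50, 2000
counts = [sum(random.random() < p_err for _ in range(block))
          for _ in range(n_blocks)]
mean = sum(counts) / n_blocks
var = sum((c - mean) ** 2 for c in counts) / n_blocks
dispersion = var / mean
```

Task-driven (exogenous) errors would inflate the dispersion above 1, which is the kind of departure the goodness-of-fit tests in the study probe for.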

  14. Comparing Single Versus Double Screw-Rod Anterior Instrumentation for Treating Thoracolumbar Burst Fractures with Incomplete Neurological Deficit: A Prospective, Randomized Controlled Trial.

    PubMed

    Yu, Yu; Wang, Juan; Shao, Gaohai; Wang, Qunbo; Li, Bo

    2016-05-19

    BACKGROUND Following a thoracolumbar burst fracture (TCBF), anterior screw-rods apply pressure upon the graft site. However, there is limited evidence comparing single screw-rod anterior instrumentation (SSRAI) to double screw-rod anterior instrumentation (DSRAI) for TCBFs. Our objective was to compare SSRAI versus DSRAI for TCBFs with incomplete neurological deficit. MATERIAL AND METHODS A total of 51 participants with T11-L2 TCBFs (AO classification: A3) were randomly assigned to receive SSRAI or DSRAI. Key preoperative, perioperative, and postoperative data were collected. Statistical analysis was conducted to determine the independent factors associated with inferior clinical outcomes, as well as the comparative efficacy of SSRAI and DSRAI. RESULTS There were no significant differences in the key demographic and clinical characteristics between the two groups (all p>0.05). Smoking status was significantly associated with inferior three-month and six-month Denis pain scores (Wald statistic=4.246, p=0.039). Both SSRAI and DSRAI were significantly effective in improving three-month and six-month postoperative degree of kyphosis, three-month and six-month postoperative ASIA impairment scale scores, three-month and six-month postoperative Denis pain score, and three-month and six-month postoperative Denis work score (all p<0.001). Although there were no significant differences between DSRAI and SSRAI with respect to all outcomes (all p>0.05), DSRAI displayed significantly longer operating times, as well as significantly larger operative blood losses (both p<0.001). CONCLUSIONS SSRAI may be preferable over DSRAI for TCBFs with incomplete neurological deficit due to its lower operating time and amount of operative blood loss.

  15. Using Friends as Sensors to Detect Global-Scale Contagious Outbreaks

    PubMed Central

    Garcia-Herranz, Manuel; Moro, Esteban; Cebrian, Manuel; Christakis, Nicholas A.; Fowler, James H.

    2014-01-01

    Recent research has focused on the monitoring of global-scale online data for improved detection of epidemics, mood patterns, movements in the stock market, political revolutions, box-office revenues, consumer behaviour and many other important phenomena. However, privacy considerations and the sheer scale of data available online are quickly making global monitoring infeasible, and existing methods do not take full advantage of local network structure to identify key nodes for monitoring. Here, we develop a model of the contagious spread of information in a global-scale, publicly-articulated social network and show that a simple method can yield not just early detection, but advance warning of contagious outbreaks. In this method, we randomly choose a small fraction of nodes in the network and then we randomly choose a friend of each node to include in a group for local monitoring. Using six months of data from most of the full Twittersphere, we show that this friend group is more central in the network and it helps us to detect viral outbreaks of the use of novel hashtags about 7 days earlier than we could with an equal-sized randomly chosen group. Moreover, the method actually works better than expected due to network structure alone because highly central actors are both more active and exhibit increased diversity in the information they transmit to others. These results suggest that local monitoring is not just more efficient, but also more effective, and it may be applied to monitor contagious processes in global-scale networks. PMID:24718030

  16. Using friends as sensors to detect global-scale contagious outbreaks.

    PubMed

    Garcia-Herranz, Manuel; Moro, Esteban; Cebrian, Manuel; Christakis, Nicholas A; Fowler, James H

    2014-01-01

    Recent research has focused on the monitoring of global-scale online data for improved detection of epidemics, mood patterns, movements in the stock market, political revolutions, box-office revenues, consumer behaviour and many other important phenomena. However, privacy considerations and the sheer scale of data available online are quickly making global monitoring infeasible, and existing methods do not take full advantage of local network structure to identify key nodes for monitoring. Here, we develop a model of the contagious spread of information in a global-scale, publicly-articulated social network and show that a simple method can yield not just early detection, but advance warning of contagious outbreaks. In this method, we randomly choose a small fraction of nodes in the network and then we randomly choose a friend of each node to include in a group for local monitoring. Using six months of data from most of the full Twittersphere, we show that this friend group is more central in the network and it helps us to detect viral outbreaks of the use of novel hashtags about 7 days earlier than we could with an equal-sized randomly chosen group. Moreover, the method actually works better than expected due to network structure alone because highly central actors are both more active and exhibit increased diversity in the information they transmit to others. These results suggest that local monitoring is not just more efficient, but also more effective, and it may be applied to monitor contagious processes in global-scale networks.

  17. Improving early relationships: a randomized, controlled trial of an age-paced parenting newsletter.

    PubMed

    Waterston, Tony; Welsh, Brenda; Keane, Brigid; Cook, Margaret; Hammal, Donna; Parker, Louise; McConachie, Helen

    2009-01-01

    Parenting is recognized as a key mediator in both health and educational outcomes. Much is known on the value of support and group work in benefiting parenting, but little is known on the effect of written information. A randomized, controlled trial was conducted to evaluate the effect of a parenting newsletter, sent monthly to the parents' home from birth to 1 year, on maternal well-being and parenting style. We tested the hypothesis that mothers receiving the newsletter would show less stress and better parenting characteristics than controls. Parents of first infants born in a North East England District General Hospital between February and October 2003 who consented to take part in the study were randomly allocated to either the intervention or control arm. Those in the intervention arm were sent 12 monthly issues of an age-paced parenting newsletter containing information on emotional development, parent-child interaction, and play. Both the intervention and control group received normal parenting support. Mothers in both groups completed the Well-being Index, Parenting Daily Hassles Scale, and the Adult-Adolescent Parenting Inventory at birth and at 1 year. One hundred eighty-five mothers were recruited, with 94 randomly assigned to the intervention group, and 91 controls. Allowing for differences at recruitment, there were significant differences between the groups at 1 year: the intervention mothers had lower frequency and intensity of perceived hassles and fewer inappropriate expectations of the infant on the Adult-Adolescent Parenting Inventory than the control mothers. A monthly parenting newsletter sent directly to the home in the first year of life seems to help parents to understand their infant better and feel less hassled. This intervention is low cost and can be applied to all parents, so it is nonstigmatizing.

  18. Quantum random walks on congested lattices and the effect of dephasing.

    PubMed

    Motes, Keith R; Gilchrist, Alexei; Rohde, Peter P

    2016-01-27

    We consider quantum random walks on congested lattices and contrast them to classical random walks. Congestion is modelled on lattices that contain static defects which reverse the walker's direction. We implement a dephasing process after each step which allows us to smoothly interpolate between classical and quantum random walks as well as study the effect of dephasing on the quantum walk. Our key results show that a quantum walker escapes a finite boundary dramatically faster than a classical walker and that this advantage remains in the presence of heavily congested lattices.
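
    The quantum-versus-classical spreading contrast the authors exploit can be sketched with a pure (dephasing-free) Hadamard walk on the line. The code below is an illustrative simulation, not the authors' model: it omits the congested-lattice defects and the per-step dephasing channel, which would interpolate this ballistic spread toward the diffusive spread of a classical walk.

```python
import math

def hadamard_walk(steps):
    """Discrete-time quantum walk on the line with a Hadamard coin.

    Amplitudes are stored per position as (left, right) complex pairs.
    Returns the position probability distribution after `steps` steps.
    """
    s = 1 / math.sqrt(2)
    amp = {0: (s + 0j, 1j * s)}  # symmetric initial coin state
    for _ in range(steps):
        new = {}
        for x, (l, r) in amp.items():
            cl, cr = s * (l + r), s * (l - r)  # Hadamard coin flip
            pl = new.get(x - 1, (0j, 0j))      # left component steps left
            new[x - 1] = (pl[0] + cl, pl[1])
            pr = new.get(x + 1, (0j, 0j))      # right component steps right
            new[x + 1] = (pr[0], pr[1] + cr)
        amp = new
    return {x: abs(l) ** 2 + abs(r) ** 2 for x, (l, r) in amp.items()}

def std_dev(dist):
    """Standard deviation of a position distribution {x: p}."""
    mean = sum(x * p for x, p in dist.items())
    return math.sqrt(sum(p * (x - mean) ** 2 for x, p in dist.items()))
```

    After n steps the quantum walker's spread grows linearly in n (roughly 0.54 n for the Hadamard coin), whereas a classical random walk spreads only as sqrt(n), which is the escape advantage the abstract refers to.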

  19. Extracting random numbers from quantum tunnelling through a single diode.

    PubMed

    Bernardo-Gavito, Ramón; Bagci, Ibrahim Ethem; Roberts, Jonathan; Sexton, James; Astbury, Benjamin; Shokeir, Hamzah; McGrath, Thomas; Noori, Yasir J; Woodhead, Christopher S; Missous, Mohamed; Roedig, Utz; Young, Robert J

    2017-12-19

    Random number generation is crucial in many aspects of everyday life, as online security and privacy depend ultimately on the quality of random numbers. Many current implementations are based on pseudo-random number generators, but information security requires true random numbers for sensitive applications like key generation in banking, defence or even social media. True random number generators are systems whose outputs cannot be determined, even if their internal structure and response history are known. Sources of quantum noise are thus ideal for this application due to their intrinsic uncertainty. In this work, we propose using resonant tunnelling diodes as practical true random number generators based on a quantum mechanical effect. The output of the proposed devices can be directly used as a random stream of bits or can be further distilled using randomness extraction algorithms, depending on the application.
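
    One standard randomness-extraction step with which such a raw stream of digitized tunnelling noise could be distilled is the von Neumann debiaser, named here as a generic example; the paper does not commit to a specific extractor.

```python
def von_neumann_extract(bits):
    """Von Neumann debiasing: map raw pairs 01 -> 0, 10 -> 1, drop 00/11.

    Removes bias from independent raw bits at the cost of discarding
    at least half of them; `bits` is a string of '0'/'1' characters.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:       # unequal pairs are equally likely either way
            out.append(a)
    return "".join(out)
```

    For example, the raw string "01100010" splits into pairs 01, 10, 00, 10 and extracts to "011".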

  20. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is presented, revealing a power relationship. General models describing the soft error rates across scaled product generations are presented. The analysis methodology may be applied to other scaled microelectronic products and their key parameters.
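
    The two retention-time populations above are distinguished by their Weibull slope: beta = 1 gives the constant hazard of randomly distributed weak bits, while beta > 1 gives the increasing failure rate of a wear-out population. A minimal sketch of the Weibull hazard function (the scale parameter eta here is illustrative):

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1).

    beta = 1: constant (random-defect) failure rate of 1/eta.
    beta > 1: increasing failure rate, i.e. a wear-out population.
    """
    return (beta / eta) * (t / eta) ** (beta - 1)
```

    With beta = 1 the hazard is flat in time, which is how the randomly distributed weak-bit population shows up in the retention-time data.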

  1. Contextual interference effect on perceptual-cognitive skills training.

    PubMed

    Broadbent, David P; Causer, Joe; Ford, Paul R; Williams, A Mark

    2015-06-01

    The contextual interference (CI) effect predicts that a random order of practice for multiple skills is superior for learning compared to a blocked order. We report a novel attempt to examine the CI effect during the acquisition and transfer of anticipatory judgments from simulation training to an applied sport situation. Participants were required to anticipate tennis shots under either a random or a blocked practice schedule. Response accuracy was recorded for both groups at pretest, during acquisition, and on a 7-d retention test. Transfer of learning was assessed through a field-based tennis protocol that attempted to assess performance in an applied sport setting. The random practice group had significantly higher response accuracy scores on the 7-d laboratory retention test than the blocked group. Moreover, during the transfer of anticipatory judgments to an applied sport situation, the decision times of the random practice group were significantly lower than those of the blocked group. The CI effect extends to the training of anticipatory judgments through simulation techniques. Furthermore, we demonstrate for the first time that the CI effect increases transfer of learning from simulation training to an applied sport task, highlighting the importance of using appropriate practice schedules during simulation training.

  2. Six-day randomized safety trial of intravaginal lime juice.

    PubMed

    Mauck, Christine K; Ballagh, Susan A; Creinin, Mitchell D; Weiner, Debra H; Doncel, Gustavo F; Fichorova, Raina N; Schwartz, Jill L; Chandra, Neelima; Callahan, Marianne M

    2008-11-01

    Nigerian women reportedly apply lime juice intravaginally to protect themselves against HIV. In vitro data suggest that lime juice is virucidal, but only at cytotoxic concentrations. This is the first controlled, randomized safety trial of lime juice applied to the human vagina. Forty-seven women were randomized to apply water or lime juice (25%, 50%, or undiluted) intravaginally twice daily for two 6-day intervals, separated by a 3-week washout period. Product application also was randomized: during 1 interval, product was applied using a saturated tampon and in the other by douche. Vaginal pH, symptoms, signs of irritation observed via naked eye examination and colposcopy, microflora, and markers of inflammation in cervicovaginal lavages were evaluated after 1 hour and on days 3 and 7. The largest reduction in pH was about one-half a pH unit, seen 1 hour after douching with 100% lime juice. We observed a dose-dependent pattern of symptoms and clinical and laboratory findings that were consistent with a compromised vaginal barrier function. The brief reduction in pH after vaginal lime juice application is unlikely to be virucidal in the presence of semen. Lime juice is unlikely to protect against HIV and may actually be harmful.

  3. Modeling a space-based quantum link that includes an adaptive optics system

    NASA Astrophysics Data System (ADS)

    Duchane, Alexander W.; Hodson, Douglas D.; Mailloux, Logan O.

    2017-10-01

    Quantum Key Distribution uses optical pulses to generate shared random bit strings between two locations. If a high percentage of the optical pulses are comprised of single photons, then the statistical nature of light and information theory can be used to generate secure shared random bit strings which can then be converted to keys for encryption systems. When these keys are incorporated along with symmetric encryption techniques such as a one-time pad, then this method of key generation and encryption is resistant to future advances in quantum computing which will significantly degrade the effectiveness of current asymmetric key sharing techniques. This research first reviews the transition of Quantum Key Distribution free-space experiments from the laboratory environment to field experiments, and finally, ongoing space experiments. Next, a propagation model for an optical pulse from low-earth orbit to ground and the effects of turbulence on the transmitted optical pulse is described. An Adaptive Optics system is modeled to correct for the aberrations caused by the atmosphere. The long-term point spread function of the completed low-earth orbit to ground optical system is explored in the results section. Finally, the impact of this optical system and its point spread function on an overall quantum key distribution system as well as the future work necessary to show this impact is described.
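
    The one-time-pad step mentioned above is just a byte-wise XOR of the message with a key at least as long as the message; decryption is the same XOR. A minimal sketch (illustrative, not part of the modeled optical system; in practice the key would come from the QKD link rather than `secrets`):

```python
import secrets

def otp_xor(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with a key byte.

    With a truly random key, used once and as long as the message
    (e.g. one distilled from QKD), the ciphertext is information-
    theoretically secure. The same function decrypts.
    """
    assert len(key) >= len(message), "key must cover the whole message"
    return bytes(m ^ k for m, k in zip(message, key))

msg = b"quantum"
key = secrets.token_bytes(len(msg))   # stand-in for a QKD-derived key
ct = otp_xor(msg, key)
```

    Because XOR is its own inverse, `otp_xor(ct, key)` recovers `msg`; security rests entirely on the key being random, secret, and never reused.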

  4. Impact of early applied upper limb stimulation: the EXPLICIT-stroke programme design.

    PubMed

    Kwakkel, Gert; Meskers, Carel G M; van Wegen, Erwin E; Lankhorst, Guus J; Geurts, Alexander C H; van Kuijk, Annet A; Lindeman, Eline; Visser-Meily, Anne; de Vlugt, Erwin; Arendzen, J Hans

    2008-12-17

    Main claims of the literature are that functional recovery of the paretic upper limb is mainly defined within the first month post stroke and that rehabilitation services should preferably be applied intensively and in a task-oriented way within this particular time window. EXplaining PLastICITy after stroke (acronym EXPLICIT-stroke) aims to explore the underlying mechanisms of post stroke upper limb recovery. Two randomized single blinded trials form the core of the programme, investigating the effects of early modified Constraint-Induced Movement Therapy (modified CIMT) and EMG-triggered Neuro-Muscular Stimulation (EMG-NMS) in patients with respectively a favourable or poor probability for recovery of dexterity. 180 participants suffering from an acute, first-ever ischemic stroke will be recruited. Functional prognosis at the end of the first week post stroke is used to stratify patient into a poor prognosis group for upper limb recovery (N = 120, A2 project) and a group with a favourable prognosis (N = 60, A1 project). Both groups will be randomized to an experimental arm receiving respectively modified CIMT (favourable prognosis) or EMG-NMS (poor prognosis) for 3 weeks or to a control arm receiving usual care. Primary outcome variable will be the Action Research Arm Test (ARAT), assessed at 1,2,3,4,5, 8, 12 and 26 weeks post stroke. To study the impact of modified CIMT or EMG-NMS on stroke recovery mechanisms i.e. neuroplasticity, compensatory movements and upper limb neuromechanics, 60 patients randomly selected from projects A1 and A2 will undergo TMS, kinematical and haptic robotic measurements within a repeated measurement design. Additionally, 30 patients from the A1 project will undergo fMRI at baseline, 5 and 26 weeks post stroke. 
    EXPLICIT-stroke is a 5-year translational research programme whose main aim is to investigate the effects of early applied intensive intervention for regaining dexterity and to explore the underlying mechanisms that are involved in regaining upper limb function after stroke. EXPLICIT-stroke will provide an answer to the key question of whether therapy-induced improvements are due to a reduction of basic motor impairment by neural repair, i.e. restitution of function, and/or the use of behavioural compensation strategies, i.e. substitution of function.

  5. Real-time fast physical random number generator with a photonic integrated circuit.

    PubMed

    Ugajin, Kazusa; Terashima, Yuta; Iwakawa, Kento; Uchida, Atsushi; Harayama, Takahisa; Yoshimura, Kazuyuki; Inubushi, Masanobu

    2017-03-20

    Random number generators are essential for applications in information security and numerical simulations. Most optical-chaos-based random number generators produce random bit sequences by offline post-processing with large optical components. We demonstrate a real-time hardware implementation of a fast physical random number generator with a photonic integrated circuit and a field programmable gate array (FPGA) electronic board. We generate 1-Tbit random bit sequences and evaluate their statistical randomness using NIST Special Publication 800-22 and TestU01. All of the BigCrush tests in TestU01 are passed using 410-Gbit random bit sequences. A maximum real-time generation rate of 21.1 Gb/s is achieved for random bit sequences in binary format stored in a computer, which can be directly used for applications involving secret keys in cryptography and random seeds in large-scale numerical simulations.
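
    The simplest of the NIST SP 800-22 statistical tests applied to such sequences is the frequency (monobit) test; a minimal sketch, assuming the sequence is given as a string of '0'/'1' characters:

```python
import math

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test.

    s_obs = |#ones - #zeros| / sqrt(n); p-value = erfc(s_obs / sqrt(2)).
    The sequence fails at the 1% significance level when p < 0.01.
    """
    n = len(bits)
    s_obs = abs(sum(1 if b == "1" else -1 for b in bits)) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))
```

    A perfectly balanced sequence yields p = 1, while a grossly biased one (e.g. all ones) yields a p-value far below the 0.01 failure threshold; the full suite applies many further tests beyond this one.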

  6. Identifying Key Hospital Service Quality Factors in Online Health Communities

    PubMed Central

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain

    2015-01-01

    Background The volume of health-related user-created content, especially hospital-related questions and answers in online health communities, has rapidly increased. Patients and caregivers participate in online community activities to share their experiences, exchange information, and ask about recommended or discredited hospitals. However, there is little research on how to identify hospital service quality automatically from the online communities. In the past, in-depth analysis of hospitals has used random sampling surveys. However, such surveys are becoming impractical owing to the rapidly increasing volume of online data and the diverse analysis requirements of related stakeholders. Objective As a solution for utilizing large-scale health-related information, we propose a novel approach to identify hospital service quality factors and overtime trends automatically from online health communities, especially hospital-related questions and answers. Methods We defined social media-based key quality factors for hospitals. In addition, we developed text mining techniques to detect such factors that frequently occur in online health communities. After detecting these factors that represent qualitative aspects of hospitals, we applied a sentiment analysis to recognize the types of recommendations in messages posted within online health communities. Korea’s two biggest online portals were used to test the effectiveness of detection of social media-based key quality factors for hospitals. Results To evaluate the proposed text mining techniques, we performed manual evaluations on the extraction and classification results, such as hospital name, service quality factors, and recommendation types using a random sample of messages (ie, 5.44% (9450/173,748) of the total messages). Service quality factor detection and hospital name extraction achieved average F1 scores of 91% and 78%, respectively. In terms of recommendation classification, performance (ie, precision) is 78% on average. Extraction and classification performance still has room for improvement, but the extraction results are applicable to more detailed analysis. Further analysis of the extracted information reveals that there are differences in the details of social media-based key quality factors for hospitals according to the regions in Korea, and the patterns of change seem to accurately reflect social events (eg, influenza epidemics). Conclusions These findings could be used to provide timely information to caregivers, hospital officials, and medical officials for health care policies. PMID:25855612
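
    The reported F1 scores are the harmonic mean of the extraction precision and recall; a minimal sketch of the metric itself (the evaluation pipeline is the authors' own):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall, the metric used for the
    hospital-name and quality-factor extraction evaluation."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

    The harmonic mean punishes imbalance: a system with precision 1.0 but recall 0.5 scores only about 0.67, not 0.75.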

  7. [Efficacy on hemiplegic spasticity treated with plum blossom needle tapping therapy at the key points and Bobath therapy: a randomized controlled trial].

    PubMed

    Wang, Fei; Zhang, Lijuan; Wang, Jianhua; Shi, Yan; Zheng, Liya

    2015-08-01

    To evaluate the efficacy on hemiplegic spasticity after cerebral infarction treated with plum blossom needle tapping therapy at the key points and Bobath therapy. Eighty patients were collected, in compliance with the inclusive criteria of hemiplegic spasticity after cerebral infarction, and randomized into an observation group and a control group, 40 cases in each one. In the control group, Bobath manipulation therapy was adopted to relieve spasticity and the treatment of 8 weeks was required. In the observation group, on the basis of the treatment as the control group, the tapping therapy with plum blossom needle was applied to the key points, named Jianyu (LI 15), Jianliao (LI 14), Jianzhen (SI 9), Hegu (LI 4), Chengfu (BL 36), Zusanli (ST 36), Xiyangguan (GB 33), etc. The treatment was given for 15 min each time, once a day. Before treatment, and after 4 and 8 weeks of treatment, the Fugl-Meyer assessment (FMA) and Barthel index (BI) were adopted to evaluate the motor function of the extremity and the activity of daily life in the patients of the two groups separately. The modified Ashworth scale was used to evaluate the effect of anti-spasticity. In 4 and 8 weeks of treatment, FMA scores and BI scores were all significantly increased as compared with those before treatment in the two groups (both P<0.05). The results in 8 weeks of treatment in the observation group were significantly better than those in the control group (all P<0.05). In 4 and 8 weeks of treatment, the scores of spasticity state were improved as compared with those before treatment in the patients of the two groups (all P<0.05). The result in 8 weeks of treatment in the observation group was significantly better than that in the control group (P<0.05). In 8 weeks of treatment, the total effective rate of anti-spasticity was 90.0% (36/40) in the observation group, better than 75.0% (30/40) in the control group (P<0.05). The tapping therapy with plum blossom needle at the key points combined with Bobath therapy effectively relieves hemiplegic spasticity in patients with cerebral infarction and improves the motor function of the extremity and the activity of daily life.

  8. 75 FR 39656 - Availability of Seats for the Florida Keys National Marine Sanctuary Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-12

    ... Restoration (member), Tourism--Lower Keys (member), Tourism Lower Keys (alternate), and Tourism Upper Keys... seat for which they are applying; community and professional affiliations; philosophy regarding the...

  9. MIP models and hybrid algorithms for simultaneous job splitting and scheduling on unrelated parallel machines.

    PubMed

    Eroglu, Duygu Yilmaz; Ozmutlu, H Cenk

    2014-01-01

    We developed mixed integer programming (MIP) models and hybrid genetic-local search algorithms for the scheduling problem of unrelated parallel machines with job sequence- and machine-dependent setup times and with the job splitting property. The first contribution of this paper is to introduce novel algorithms which perform splitting and scheduling simultaneously with a variable number of subjobs. We proposed a simple chromosome structure constituted by random key numbers in the hybrid genetic-local search algorithm (GAspLA). Random key numbers are used frequently in genetic algorithms, but they create additional difficulty when hybrid factors in local search are implemented. We developed algorithms that adapt the results of local search into the genetic algorithm with a minimum of relocation operations on the genes' random key numbers. This is the second contribution of the paper. The third contribution of this paper is three new MIP models which perform splitting and scheduling simultaneously. The fourth contribution of this paper is the implementation of the GAspLAMIP. This implementation lets us verify the optimality of GAspLA for the studied combinations. The proposed methods are tested on a set of problems taken from the literature and the results validate the effectiveness of the proposed algorithms.
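
    A random-key chromosome decodes to a job sequence by sorting the keys, which is what makes any real-valued gene vector a feasible schedule. A minimal sketch of this standard decoding step (the authors' GAspLA operators are more involved than this):

```python
def decode_random_keys(keys):
    """Decode a random-key chromosome into a job permutation.

    Jobs are visited in ascending order of their key values, so every
    vector of reals maps to a valid sequence; this is the property that
    makes random keys attractive for scheduling genetic algorithms.
    """
    return sorted(range(len(keys)), key=keys.__getitem__)

chromosome = [0.7, 0.1, 0.9, 0.4]   # one gene per job
sequence = decode_random_keys(chromosome)   # -> [1, 3, 0, 2]
```

    Crossover and mutation can then operate freely on the real-valued genes without ever producing an infeasible permutation.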

  10. Analysis of using interpulse intervals to generate 128-bit biometric random binary sequences for securing wireless body sensor networks.

    PubMed

    Zhang, Guang-He; Poon, Carmen C Y; Zhang, Yuan-Ting

    2012-01-01

    A wireless body sensor network (WBSN), a key building block for m-Health, operates under extremely stringent resource constraints, and thus lightweight security methods are preferred. To minimize resource consumption, utilizing information already available to a WBSN, particularly information common to different sensor nodes of a WBSN, for security purposes becomes an attractive solution. In this paper, we tested the randomness and distinctiveness of the 128-bit biometric binary sequences (BSs) generated from interpulse intervals (IPIs) of 20 healthy subjects as well as 30 patients suffering from myocardial infarction and 34 subjects with other cardiovascular diseases. The encoding time of a biometric BS on a WBSN node is on average 23 ms and memory occupation is 204 bytes for any given IPI sequence. The results from five U.S. National Institute of Standards and Technology statistical tests suggest that random biometric BSs can be generated from both healthy subjects and cardiovascular patients and can potentially be used as authentication identifiers for securing WBSNs. Ultimately, it is preferred that these biometric BSs be used as encryption keys such that key distribution over the WBSN can be avoided.
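
    The abstract does not spell out the IPI-to-bit encoder. As a purely hypothetical illustration (not the paper's scheme), one simple approach keeps only the low-order bits of each interpulse interval, where unpredictable physiological timing jitter concentrates:

```python
def ipi_to_bits(ipis_ms, bits_per_ipi=4):
    """Toy illustration only: keep the `bits_per_ipi` least-significant
    bits of each interpulse interval in milliseconds. 32 IPIs at
    4 bits each would yield a 128-bit biometric binary sequence."""
    out = ""
    for ipi in ipis_ms:
        out += format(ipi % (1 << bits_per_ipi), f"0{bits_per_ipi}b")
    return out
```

    Since sensor nodes on the same body observe the same heartbeat timing, they can derive matching sequences without transmitting key material, which is the key-distribution saving the abstract points to.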

  11. Choice of optical system is critical for the security of double random phase encryption systems

    NASA Astrophysics Data System (ADS)

    Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Cassidy, Derek; Zhao, Liang; Ryle, James P.; Healy, John J.; Sheridan, John T.

    2017-06-01

    The linear canonical transform (LCT) is used in modeling a coherent light-field propagation through first-order optical systems. Recently, a generic optical system, known as the quadratic phase encoding system (QPES), for encrypting a two-dimensional image has been reported. In such systems, two random phase keys and the individual LCT parameters (α,β,γ) serve as secret keys of the cryptosystem. It is important that such encryption systems also satisfy some dynamic security properties. We, therefore, examine such systems using two cryptographic evaluation methods, the avalanche effect and bit independence criterion, which indicate the degree of security of the cryptographic algorithms using QPES. We compared our simulation results with the conventional Fourier and the Fresnel transform-based double random phase encryption (DRPE) systems. The results show that the LCT-based DRPE has an excellent avalanche and bit independence characteristics compared to the conventional Fourier and Fresnel-based encryption systems.
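
    The avalanche criterion used in this evaluation measures how many output bits flip when a single input bit flips; ideally about half. A minimal sketch of the measurement, using SHA-256 as a stand-in for the optical cryptosystem (the QPES/DRPE systems themselves are not reproduced here):

```python
import hashlib

def avalanche_fraction(data: bytes, bit: int) -> float:
    """Fraction of SHA-256 output bits that change when input `bit` flips.

    An ideal cryptographic map scores close to 0.5; the paper applies
    this measure (plus the bit independence criterion) to LCT-based
    double random phase encryption.
    """
    flipped = bytearray(data)
    flipped[bit // 8] ^= 1 << (bit % 8)        # flip one input bit
    h1 = int.from_bytes(hashlib.sha256(data).digest(), "big")
    h2 = int.from_bytes(hashlib.sha256(bytes(flipped)).digest(), "big")
    return bin(h1 ^ h2).count("1") / 256       # Hamming distance / bits

# average over 64 single-bit flips of a fixed input
mean = sum(avalanche_fraction(b"optical encryption", i) for i in range(64)) / 64
```

    An encryption system whose average sits well away from 0.5 leaks structure, which is exactly the weakness the avalanche test is designed to expose.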

  12. Misinterpretation of statistical distance in security of quantum key distribution shown by simulation

    NASA Astrophysics Data System (ADS)

    Iwakoshi, Takehisa; Hirota, Osamu

    2014-10-01

    This study tests an interpretation in quantum key distribution (QKD) under which the trace distance between the distributed quantum state and the ideal mixed state is the maximum failure probability of the protocol. Around 2004, this interpretation was proposed and standardized to satisfy both key uniformity in the context of universal composability and the operational meaning of the failure probability of key extraction. However, this proposal has not been verified concretely for many years, while H. P. Yuen and O. Hirota have cast doubt on the interpretation since 2009. To test it, a physical random number generator was employed to evaluate key uniformity in QKD. We calculated the statistical distance, which corresponds to the trace distance in quantum theory once a quantum measurement has been made, and compared it with the failure probability to determine whether universal composability was obtained. As a result, the statistical distance between the probability distribution of the physical random numbers and the ideal uniform distribution was very large. We also explain why trace distance is not suitable for guaranteeing security in QKD from the viewpoint of quantum binary decision theory.
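
    After measurement, trace distance reduces to the classical statistical (total variation) distance between the observed key distribution and the ideal uniform one; a minimal sketch of that quantity:

```python
def statistical_distance(p, q):
    """Total variation (statistical) distance between two distributions
    given as equal-length probability sequences:
        d(P, Q) = (1/2) * sum_x |P(x) - Q(x)|.
    This is the classical quantity that trace distance collapses to
    once a measurement is fixed."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))
```

    For instance, a slightly biased bit source [0.6, 0.4] sits at distance 0.1 from uniform [0.5, 0.5]; a large value for a key distribution is what the study reports against the uniformity claim.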

  13. Password-Only Authenticated Three-Party Key Exchange with Provable Security in the Standard Model

    PubMed Central

    Nam, Junghyun; Kim, Junghwan; Kang, Hyun-Kyu; Kim, Jinsoo; Paik, Juryon

    2014-01-01

    Protocols for password-only authenticated key exchange (PAKE) in the three-party setting allow two clients registered with the same authentication server to derive a common secret key from their individual password shared with the server. Existing three-party PAKE protocols were proven secure under the assumption of the existence of random oracles or in a model that does not consider insider attacks. Therefore, these protocols may turn out to be insecure when the random oracle is instantiated with a particular hash function or an insider attack is mounted against the partner client. The contribution of this paper is to present the first three-party PAKE protocol whose security is proven without any idealized assumptions in a model that captures insider attacks. The proof model we use is a variant of the indistinguishability-based model of Bellare, Pointcheval, and Rogaway (2000), which is one of the most widely accepted models for security analysis of password-based key exchange protocols. We demonstrated that our protocol achieves not only the typical indistinguishability-based security of session keys but also the password security against undetectable online dictionary attacks. PMID:24977229

  14. Prognostic Factors for Survival in Patients with Gastric Cancer using a Random Survival Forest

    PubMed

    Adham, Davoud; Abbasgholizadeh, Nategh; Abazari, Malek

    2017-01-01

    Background: Gastric cancer is the fifth most common cancer and the third leading cause of cancer-related death, with about 1 million new cases and 700,000 deaths in 2012. The aim of this investigation was to identify important factors for outcome using a random survival forest (RSF) approach. Materials and Methods: Data were collected from 128 gastric cancer patients through a historical cohort study in Hamedan, Iran from 2007 to 2013. The event under consideration was death due to gastric cancer. The random survival forest model in R software was applied to determine the key factors affecting survival. Four split criteria were used to determine the importance of the variables in the model: log-rank, conservation of events, log-rank score, and randomization. Efficiency of the model was confirmed in terms of Harrell’s concordance index. Results: The mean age at diagnosis was 63 ± 12.57 years, and the mean and median survival times were 15.2 (95%CI: 13.3, 17.0) and 12.3 (95%CI: 11.0, 13.4) months, respectively. The one-year, two-year, and three-year survival rates were 51%, 13%, and 5%, respectively. Each RSF approach showed a slightly different ranking order. Very important covariates in nearly all 4 RSF approaches were metastatic status, age at diagnosis and tumor size. The error rate of each RSF approach was in the range of 0.29-0.32; the best error rate was obtained by the log-rank splitting rule, with log-rank score, conservation of events, and the random splitting rule ranking second, third, and fourth, respectively. Conclusion: The low survival rate of gastric cancer patients is an indication of the absence of a screening program for early diagnosis of the disease. Timely diagnosis in early phases increases survival and decreases mortality.
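
    Harrell's concordance index, used above to rate each splitting rule, can be computed directly from (time, event, risk) triples; a minimal sketch (O(n^2), ignoring tied event times, and not the R package's implementation):

```python
def harrell_c_index(times, events, risks):
    """Harrell's concordance index for right-censored survival data.

    A pair (i, j) is usable when subject i has the shorter time and an
    observed event (events[i] == 1); it is concordant when the shorter
    survival received the higher predicted risk. Tied risks count 0.5.
    """
    concordant = usable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i]:
                usable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    concordant += 0.5
    return concordant / usable
```

    A value of 0.5 is chance-level discrimination and 1.0 is perfect ranking; the reported 0.29-0.32 figures are error rates, i.e. one minus concordance.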

  15. Encrypted holographic data storage based on orthogonal-phase-code multiplexing.

    PubMed

    Heanue, J F; Bashaw, M C; Hesselink, L

    1995-09-10

    We describe an encrypted holographic data-storage system that combines orthogonal-phase-code multiplexing with a random-phase key. The system offers the security advantages of random-phase coding but retains the low cross-talk performance and the minimum code storage requirements typical in an orthogonal-phase-code-multiplexing system.

  16. Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization

    ERIC Educational Resources Information Center

    Lock, Robin H.; Lock, Patti Frazer

    2008-01-01

    Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation-based procedures in an integrated curriculum…
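    The two simulation-based procedures named in this record can each be shown in a few lines of Python; the data below are invented for illustration:

```python
import random
random.seed(1)

def mean(xs):
    return sum(xs) / len(xs)

# Bootstrap: resample with replacement to approximate the sampling
# distribution of the sample mean, then read off a 95% percentile interval.
data = [4.8, 5.2, 6.1, 4.4, 5.9, 5.5, 4.9, 6.3, 5.0, 5.7]  # invented measurements
boot = sorted(mean([random.choice(data) for _ in data]) for _ in range(10000))
lo, hi = boot[249], boot[9749]          # 2.5th and 97.5th percentiles
print(f"95% bootstrap CI for the mean: ({lo:.2f}, {hi:.2f})")

# Randomization test: repeatedly shuffle the group labels to build the
# null distribution of the difference in group means.
a = [5.1, 5.9, 6.2, 5.6]                # invented treatment group
b = [4.6, 4.9, 5.2, 4.7]                # invented control group
observed = mean(a) - mean(b)
pooled = a + b
trials, count = 10000, 0
for _ in range(trials):
    random.shuffle(pooled)
    if mean(pooled[:len(a)]) - mean(pooled[len(a):]) >= observed:
        count += 1
print(f"one-sided randomization p-value: {count / trials:.4f}")
```

    Both procedures replace distributional assumptions with resampling, which is what makes them accessible to introductory students.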

  17. University Students' Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics

    ERIC Educational Resources Information Center

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution-- the central, unifying, and overarching theme in biology. Aspects strongly related to abstract "threshold" concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument…

  18. Mixing Methods in Randomized Controlled Trials (RCTs): Validation, Contextualization, Triangulation, and Control

    ERIC Educational Resources Information Center

    Spillane, James P.; Pareja, Amber Stitziel; Dorner, Lisa; Barnes, Carol; May, Henry; Huff, Jason; Camburn, Eric

    2010-01-01

    In this paper we describe how we mixed research approaches in a Randomized Controlled Trial (RCT) of a school principal professional development program. Using examples from our study, we illustrate how combining qualitative and quantitative data can address some key challenges, from validating instruments and measures of mediator variables to…

  19. The Drug-Testing Dilemma.

    ERIC Educational Resources Information Center

    Dowling-Sendor, Benjamin

    1999-01-01

    The recent decision of the 8th U.S. Circuit Court of Appeals in "Miller," based on the school district's interest in preventing possible abuse, gave legal support for random, suspicionless drug testing of students. Contends this is a "slippery slope" argument, and that the key factor in deciding whether to adopt a policy of random drug testing should…

  20. Detecting phase transitions in a neural network and its application to classification of syndromes in traditional Chinese medicine

    NASA Astrophysics Data System (ADS)

    Chen, J.; Xi, G.; Wang, W.

    2008-02-01

    Detecting phase transitions in neural networks (deterministic or random) presents a challenging subject, because phase transitions play a key role in human brain activity. In this paper, we numerically detect phase transitions in two types of random neural networks (RNNs) under proper parameters.

  1. 10 CFR 26.67 - Random drug and alcohol testing of individuals who have applied for authorization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... for one or more random tests after any applicable requirement for pre-access testing in §§ 26.65 or 26... individual for any pre-access testing that may be required under §§ 26.65 or 26.69, and thereafter, the... other entity relies on drug and alcohol tests that were conducted before the individual applied for...

  2. 10 CFR 26.67 - Random drug and alcohol testing of individuals who have applied for authorization.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... for one or more random tests after any applicable requirement for pre-access testing in § 26.65 or... individual for any pre-access testing that may be required under § 26.65 or § 26.69, and thereafter, the... other entity relies on drug and alcohol tests that were conducted before the individual applied for...

  3. 10 CFR 26.67 - Random drug and alcohol testing of individuals who have applied for authorization.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... for one or more random tests after any applicable requirement for pre-access testing in §§ 26.65 or 26... individual for any pre-access testing that may be required under §§ 26.65 or 26.69, and thereafter, the... other entity relies on drug and alcohol tests that were conducted before the individual applied for...

  4. 10 CFR 26.67 - Random drug and alcohol testing of individuals who have applied for authorization.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... for one or more random tests after any applicable requirement for pre-access testing in §§ 26.65 or 26... individual for any pre-access testing that may be required under §§ 26.65 or 26.69, and thereafter, the... other entity relies on drug and alcohol tests that were conducted before the individual applied for...

  5. Biometrics encryption combining palmprint with two-layer error correction codes

    NASA Astrophysics Data System (ADS)

    Li, Hengjian; Qiu, Jian; Dong, Jiwen; Feng, Guang

    2017-07-01

    To bridge the gap between the fuzziness of biometrics and the exactitude of cryptography, a novel biometrics encryption method based on combining palmprint features with two-layer error correction codes is proposed. Firstly, the randomly generated original keys are encoded by convolutional and cyclic two-layer coding: the first layer uses a convolutional code to correct burst errors, and the second layer uses a cyclic code to correct random errors. Then, the palmprint features are extracted from the palmprint images, and the two are fused together by an XOR operation; the resulting information is stored in a smart card. Finally, to extract the original keys, the information in the smart card is XORed with the user's palmprint features and then decoded with the convolutional and cyclic two-layer code. The experimental results and security analysis show that the method can recover the original keys completely. The proposed method is more secure than a single password factor, and has higher accuracy than a single biometric factor.
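    The bind/recover structure described here (codeword XOR features stored on the card, XOR with a fresh reading, then error-correct) can be sketched in a few lines. The repetition code below is a deliberately simple stand-in for the paper's convolutional + cyclic two-layer code, and the bit strings are invented:

```python
import random

R = 3  # repetition factor of the toy error-correcting code

def encode(bits):
    # toy stand-in for the paper's convolutional + cyclic two-layer code:
    # repeat each key bit R times
    return [b for b in bits for _ in range(R)]

def decode(bits):
    # majority vote inside each block of R bits corrects up to R//2 flips
    return [int(sum(bits[i:i + R]) > R // 2) for i in range(0, len(bits), R)]

def enroll(key_bits, features):
    # helper data stored on the smart card: codeword XOR biometric features
    return [c ^ f for c, f in zip(encode(key_bits), features)]

def recover(helper, noisy_features):
    # XOR with a fresh (noisy) feature reading, then error-correct
    return decode([h ^ f for h, f in zip(helper, noisy_features)])

random.seed(0)
key = [random.randint(0, 1) for _ in range(8)]
features = [random.randint(0, 1) for _ in range(8 * R)]
card = enroll(key, features)

# a fresh biometric reading with at most one flipped bit per 3-bit block
noisy = features[:]
for i in (0, 5, 10):                 # the flips land in different blocks
    noisy[i] ^= 1
assert recover(card, noisy) == key
print("key recovered despite a noisy biometric reading")
```

    The helper data alone reveals neither the key nor the features, which is the point of the XOR fusion; the error-correcting layer absorbs the natural variability between two readings of the same palmprint.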

  6. A Case Report on Red Ear Syndrome with Tinnitus Successfully Treated with Transcranial Random Noise Stimulation.

    PubMed

    Kreuzer, Peter M; Vielsmeier, Veronika; Poeppl, Timm B; Langguth, Berthold

    2017-01-01

    The red ear syndrome represents a rare symptom complex consisting of auricular erythema associated with painful and burning sensations. It has rarely been described in combination with tinnitus. It has been hypothesized to be etiologically related to altered trigeminal afferent input, temporomandibular disorders, and thalamic dysfunction. The initial objective of applying transcranial random noise stimulation (tRNS) in a case of red ear syndrome in combination with tinnitus was the alleviation of the phantom sounds. This is a case report on the successful treatment of red ear syndrome with tinnitus by means of tRNS, together with a short review of the published cases of this condition. We present the case of a 50-year-old woman reporting a simultaneous onset of constant left-sided tinnitus and feelings of warmth accompanied by an intermittent stabbing and/or oppressive pain stretching from the ipsilateral ear to the head/neck/shoulder region, occasionally accompanied by nausea/vomiting and dizziness. After failure of pharmacological treatment attempts, either because of lacking clinical effects (gabapentin, zolmitriptan, and indomethacin) or because of adverse reactions (pregabaline), the patient was offered an experimental neuromodulatory treatment with bitemporal tRNS primarily targeting her tinnitus complaints. tRNS was conducted in 2 - 3 day sessions (stimulation site: bilateral temporal cortex / 2.0 mA / 10 s on-and-off ramp / offset 0 mA / 20 min / random frequencies 101 - 640 Hz / NeuroConn Eldith DC-Stimulator plus). In 3 consecutive pain attacks, repeated sessions of tRNS resulted in substantial alleviation of pain intensity and a prolongation of the interval between attacks. This was an unexpected finding, as the proposed tRNS treatment was initially offered to the patient aiming at an alleviation of the tinnitus complaints (which remained unaffected by tRNS). The reported data derive from compassionate use treatment in one single patient. Application of a sham condition would have been desirable, but is not possible in the context of compassionate use treatment. Nevertheless, we would consider it rather unlikely that the reported effects are purely unspecific, as the patient exclusively reported symptom alleviation of pain-related parameters without any effect on the tinnitus. This case report demonstrates the feasibility and therapeutic potential of applying neuromodulatory treatment approaches in red ear syndrome, a rare form of trigemino-autonomic headache. Therefore, it deserves detailed observation in clinical routine applications as well as controlled trials further investigating its neurobiological effects. Key words: Red ear syndrome, pain, trigemino-autonomic headache, chronic tinnitus, transcranial electrical stimulation, random noise stimulation.

  7. A Structural Modeling Approach to a Multilevel Random Coefficients Model.

    ERIC Educational Resources Information Center

    Rovine, Michael J.; Molenaar, Peter C. M.

    2000-01-01

    Presents a method for estimating the random coefficients model using covariance structure modeling and allowing one to estimate both fixed and random effects. The method is applied to real and simulated data, including marriage data from J. Belsky and M. Rovine (1990). (SLD)

  8. Tensor Minkowski Functionals for random fields on the sphere

    NASA Astrophysics Data System (ADS)

    Chingangbam, Pravabati; Yogendran, K. P.; Joby, P. K.; Ganesan, Vidhya; Appleby, Stephen; Park, Changbom

    2017-12-01

    We generalize the translation invariant tensor-valued Minkowski Functionals which are defined on two-dimensional flat space to the unit sphere. We apply them to level sets of random fields. The contours enclosing boundaries of level sets of random fields give a spatial distribution of random smooth closed curves. We outline a method to compute the tensor-valued Minkowski Functionals numerically for any random field on the sphere. Then we obtain analytic expressions for the ensemble expectation values of the matrix elements for isotropic Gaussian and Rayleigh fields. The results hold on flat as well as any curved space with affine connection. We elucidate the way in which the matrix elements encode information about the Gaussian nature and statistical isotropy (or departure from isotropy) of the field. Finally, we apply the method to maps of the Galactic foreground emissions from the 2015 PLANCK data and demonstrate their high level of statistical anisotropy and departure from Gaussianity.
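    The tensor-valued functionals on the sphere are beyond a short sketch, but the scalar Minkowski functionals they generalize (area fraction and boundary length of an excursion set) can be estimated for a toy random field on a flat grid. A hedged Python illustration, with an invented plane-wave field standing in for a cosmological map:

```python
import math, random
random.seed(42)

N = 64
# toy Gaussian-like random field on a flat N x N grid:
# a normalized sum of plane waves with random phases and wavevectors
waves = [(random.uniform(0, 2 * math.pi),      # phase
          random.uniform(-0.6, 0.6),           # kx
          random.uniform(-0.6, 0.6))           # ky
         for _ in range(50)]
field = [[sum(math.cos(ph + kx * i + ky * j) for ph, kx, ky in waves) / math.sqrt(50)
          for j in range(N)] for i in range(N)]

def excursion_functionals(field, nu):
    """Scalar Minkowski functionals of the level set {field >= nu}:
    area fraction, and a lattice estimate of boundary length."""
    n = len(field)
    mask = [[f >= nu for f in row] for row in field]
    area = sum(sum(row) for row in mask) / (n * n)
    # boundary length ~ number of unlike horizontal/vertical neighbour pairs
    perim = sum(mask[i][j] != mask[i][j + 1] for i in range(n) for j in range(n - 1))
    perim += sum(mask[i][j] != mask[i + 1][j] for i in range(n - 1) for j in range(n))
    return area, perim / (n * n)

for nu in (-1.0, 0.0, 1.0):
    a, p = excursion_functionals(field, nu)
    print(f"threshold {nu:+.1f}: area fraction {a:.2f}, boundary density {p:.2f}")
```

    Scanning the threshold and comparing the measured curves against the Gaussian expectation is the basic non-Gaussianity test; the tensor generalization in the record additionally captures directional (anisotropy) information that these scalars discard.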

  9. Simultaneous and coordinated rotational switching of all molecular rotors in a network

    DOE PAGES

    Zhang, Y.; Kersell, H.; Stefak, R.; ...

    2016-05-09

    A range of artificial molecular systems have been created that can exhibit controlled linear and rotational motion. In the development of such systems, a key step is the addition of communication between molecules in a network. Here, we show that a two-dimensional array of dipolar molecular rotors can undergo simultaneous rotational switching by applying an electric field from the tip of a scanning tunnelling microscope. Several hundred rotors made from porphyrin-based double-decker complexes can be simultaneously rotated when in a hexagonal rotor network on a Cu(111) surface by applying biases above ±1 V at 80 K. The phenomenon is observed only in a hexagonal rotor network due to the degeneracy of the ground state dipole rotational energy barrier of the system. Defects are essential to increase electric torque on the rotor network and to stabilize the switched rotor domains. At low biases and low initial rotator angles, slight reorientations of individual rotors can occur, resulting in the rotator arms pointing in different directions. In conclusion, analysis reveals that the rotator arm directions here are not random, but are coordinated to minimize energy via cross talk among the rotors through dipolar interactions.

  10. Novel Strategy for Photopatterning Emissive Polymer Brushes for Organic Light Emitting Diode Applications

    PubMed Central

    2017-01-01

    A light-mediated methodology to grow patterned, emissive polymer brushes with micron feature resolution is reported and applied to organic light emitting diode (OLED) displays. Light is used for both initiator functionalization of indium tin oxide and subsequent atom transfer radical polymerization of methacrylate-based fluorescent and phosphorescent iridium monomers. The iridium centers play key roles in photocatalyzing and mediating polymer growth while also emitting light in the final OLED structure. The scope of the presented procedure enables the synthesis of a library of polymers with emissive colors spanning the visible spectrum where the dopant incorporation, position of brush growth, and brush thickness are readily controlled. The chain-ends of the polymer brushes remain intact, affording subsequent chain extension and formation of well-defined diblock architectures. This high level of structure and function control allows for the facile preparation of random ternary copolymers and red–green–blue arrays to yield white emission. PMID:28691078

  11. Regional management of farmland feeding geese using an ecological prioritization tool.

    PubMed

    Madsen, Jesper; Bjerrum, Morten; Tombre, Ingunn M

    2014-10-01

    Wild geese foraging on farmland cause increasing conflicts with agricultural interests, calling for a strategic approach to mitigation. In central Norway, conflicts between farmers and spring-staging pink-footed geese feeding on pastures have escalated. To alleviate the conflict, a scheme by which farmers are subsidized to allow geese to forage undisturbed was introduced. To guide allocation of subsidies, an ecological-based ranking of fields at a regional level was recommended and applied. Here we evaluate the scheme. On average, 40 % of subsidized fields were in the top 5 % of the ranking, and 80 % were within the top 20 %. Goose grazing pressure on subsidized pastures was 13 times higher than on a stratified random selection of non-subsidized pastures, capturing 67 % of the pasture-feeding geese even though subsidized fields comprised only 13 % of the grassland area. Close dialogue between scientists and managers is regarded as key to the success of the scheme.

  12. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations; a time response for a random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  13. Abstracting Sequences: Reasoning That Is a Key to Academic Achievement.

    PubMed

    Pasnak, Robert; Kidd, Julie K; Gadzichowski, K Marinka; Gallington, Debbie A; Schmerold, Katrina Lea; West, Heather

    2015-01-01

    The ability to understand sequences of items may be an important cognitive ability. To test this proposition, 8 first-grade children from each of 36 classes were randomly assigned to four conditions. Some were taught sequences that represented increasing or decreasing values, or were symmetrical, or were rotations of an object through 6 or 8 positions. Control children received equal numbers of sessions on mathematics, reading, or social studies. Instruction was conducted three times weekly in 15-min sessions for seven months. In May, the children taught sequences applied their understanding to novel sequences, and scored as well or better on three standardized reading tests as the control children. They outscored all children on tests of mathematics concepts, and scored better than control children on some mathematics scales. These findings indicate that developing an understanding of sequences is a form of abstraction, probably involving fluid reasoning, that provides a foundation for academic achievement in early education.

  14. Frustration in Condensed Matter and Protein Folding

    NASA Astrophysics Data System (ADS)

    Li, Z.; Tanner, S.; Conroy, B.; Owens, F.; Tran, M. M.; Boekema, C.

    2014-03-01

    By means of computer modeling, we are studying frustration in condensed matter and protein folding, including the influence of temperature and Thomson-figure formation. Frustration is due to competing interactions in a disordered state. The key issue is how the particles interact to reach the lowest frustration. The relaxation for frustration is mostly a power function (randomly assigned pattern) or an exponential function (regular patterns like Thomson figures). For the atomic Thomson model, frustration is predicted to decrease with the formation of Thomson figures at zero kelvin. We attempt to apply our frustration modeling to protein folding and dynamics. We investigate the homogeneous protein frustration that would cause the speed of the protein folding to increase. Increase of protein frustration (where frustration and hydrophobicity interplay with protein folding) may lead to a protein mutation. Research is supported by WiSE@SJSU and AFC San Jose.

  15. The effect of a dopamine antagonist on conditioning of sexual arousal in women.

    PubMed

    Brom, Mirte; Laan, Ellen; Everaerd, Walter; Spinhoven, Philip; Trimbos, Baptist; Both, Stephanie

    2016-04-01

    Dopamine (DA) plays a key role in reward-seeking behaviours. Accumulating evidence from animal and human studies suggests that human sexual reward learning may also depend on DA transmission. However, research on the role of DA in human sexual reward learning is completely lacking. To investigate whether DA antagonism attenuates classical conditioning of sexual response in humans. Healthy women were randomly allocated to one of two treatment conditions: haloperidol (n = 29) or placebo (n = 29). A differential conditioning paradigm was applied with genital vibrostimulation as unconditional stimulus (US) and neutral pictures as conditional stimuli (CSs). Genital arousal was assessed, and ratings of affective value and subjective sexual arousal were obtained. Haloperidol administration affected unconditional genital responding. However, no significant effects of medication were found for conditioned responding. No firm conclusions can be drawn about whether female sexual reward learning implicates DA transmission since the results do not lend themselves to unambiguous interpretation.

  16. Novel Strategy for Photopatterning Emissive Polymer Brushes for Organic Light Emitting Diode Applications.

    PubMed

    Page, Zachariah A; Narupai, Benjaporn; Pester, Christian W; Bou Zerdan, Raghida; Sokolov, Anatoliy; Laitar, David S; Mukhopadhyay, Sukrit; Sprague, Scott; McGrath, Alaina J; Kramer, John W; Trefonas, Peter; Hawker, Craig J

    2017-06-28

    A light-mediated methodology to grow patterned, emissive polymer brushes with micron feature resolution is reported and applied to organic light emitting diode (OLED) displays. Light is used for both initiator functionalization of indium tin oxide and subsequent atom transfer radical polymerization of methacrylate-based fluorescent and phosphorescent iridium monomers. The iridium centers play key roles in photocatalyzing and mediating polymer growth while also emitting light in the final OLED structure. The scope of the presented procedure enables the synthesis of a library of polymers with emissive colors spanning the visible spectrum where the dopant incorporation, position of brush growth, and brush thickness are readily controlled. The chain-ends of the polymer brushes remain intact, affording subsequent chain extension and formation of well-defined diblock architectures. This high level of structure and function control allows for the facile preparation of random ternary copolymers and red-green-blue arrays to yield white emission.

  17. WHO/INRUD patient care and facility-specific drug use indicators at primary health care centres in Eastern province, Saudi Arabia.

    PubMed

    El Mahalli, A A; Akl, O A M; Al-Dawood, S F; Al-Nehab, A A; Al-Kubaish, H A; Al-Saeed, S; Elkahky, A A A; Salem, A M A A

    2012-11-01

    This study aimed to measure the performance of primary health care centres in Eastern province, Saudi Arabia, using the WHO/International Network of Rational Use of Drugs patient care and facility-specific drug use indicators. In a cross-sectional study, 10 health centres were selected using systematic random sampling. A total of 300 patients were interviewed while visiting the centres from January to March 2011, and 10 pharmacists from the same centres were interviewed. The average consultation time was 7.3 min (optimal > or = 30 min), the percentage of drugs adequately labelled was 10% (optimal 100%), and patients' knowledge of correct dosage was 79.3% (optimal 100%). The percentage of key drugs in stock was only 59.2% (optimal 100%). An overall index of rational facility-specific drug use was calculated and applied to rank the health centres for benchmarking.

  18. Comparison of four machine learning methods for object-oriented change detection in high-resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Bai, Ting; Sun, Kaimin; Deng, Shiquan; Chen, Yan

    2018-03-01

    High-resolution image change detection is one of the key technologies of remote sensing applications, and is of great significance for resource survey, environmental monitoring, fine agriculture, military mapping, and battlefield environment detection. In this paper, for high-resolution satellite imagery, Random Forest (RF), Support Vector Machine (SVM), Deep Belief Network (DBN), and Adaboost models were established to verify the possibilities of different machine learning applications in change detection. In order to compare the detection accuracy of the four machine learning methods, we applied them to two high-resolution images. The results show that SVM has higher overall accuracy with small samples compared to RF, Adaboost, and DBN for binary and from-to change detection. With the increase in the number of samples, RF has higher overall accuracy compared to Adaboost, SVM, and DBN.

  19. Vemurafenib: a new treatment for BRAF-V600 mutated advanced melanoma

    PubMed Central

    Fisher, Rosalie; Larkin, James

    2012-01-01

    The BRAF inhibitor, vemurafenib, has demonstrated improved progression-free and overall survival compared with chemotherapy in a randomized trial, and represents a new standard of care in patients with advanced melanoma harboring a BRAF-V600 mutation. A BRAF-V600 mutation is identified in approximately half of patients with cutaneous melanoma, and is unequivocally a biomarker predictive of profound clinical benefit for these patients. However, acquired vemurafenib resistance is a major clinical challenge and therapy is not yet curative. A substantial body of translational research has been performed alongside clinical trials of vemurafenib, providing key insights into the molecular basis of response and resistance. This review summarizes the development of vemurafenib for the treatment of BRAF-V600 mutant melanoma and discusses how knowledge of critical signaling pathways will be applied for its optimal clinical use in future. PMID:22904646

  20. Analysis on flood generation processes by means of a continuous simulation model

    NASA Astrophysics Data System (ADS)

    Fiorentino, M.; Gioia, A.; Iacobellis, V.; Manfreda, S.

    2006-03-01

    In the present research, we exploited continuous hydrological simulation to investigate the key variables responsible for flood peak formation. For this purpose, a distributed hydrological model (DREAM) is used in cascade with a rainfall generator (IRP, Iterated Random Pulse) to simulate a large number of extreme events, providing insight into the main controls of flood generation mechanisms. The investigated variables are those used in theoretically derived probability distributions of floods based on the concept of partial contributing area (e.g. Iacobellis and Fiorentino, 2000). The continuous simulation model is used to investigate the hydrological losses occurring during extreme events, the variability of the source area contributing to the flood peak, and its lag-time. Results suggest interesting simplifications for the theoretical probability distribution of floods according to the different climatic and geomorphologic environments. The study is applied to two basins located in Southern Italy with different climatic characteristics.

  1. Quantum random walks on congested lattices and the effect of dephasing

    PubMed Central

    Motes, Keith R.; Gilchrist, Alexei; Rohde, Peter P.

    2016-01-01

    We consider quantum random walks on congested lattices and contrast them to classical random walks. Congestion is modelled on lattices that contain static defects which reverse the walker’s direction. We implement a dephasing process after each step which allows us to smoothly interpolate between classical and quantum random walks as well as study the effect of dephasing on the quantum walk. Our key results show that a quantum walker escapes a finite boundary dramatically faster than a classical walker and that this advantage remains in the presence of heavily congested lattices. PMID:26812924
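    The contrast this record draws between classical and quantum walkers can be illustrated with a minimal discrete-time coined (Hadamard) walk on a line. This sketch omits the paper's defects and dephasing channel, which require a density-matrix or Monte Carlo treatment, and only shows the ballistic spread that underlies the quantum walker's faster escape:

```python
import math
from collections import defaultdict

def hadamard_walk(steps):
    """Discrete-time coined quantum walk on a line (no defects, no dephasing)."""
    s = 1 / math.sqrt(2)
    # symmetric initial coin state (|0> + i|1>)/sqrt(2) at the origin
    state = {(0, 0): complex(s), (0, 1): s * 1j}
    for _ in range(steps):
        new = defaultdict(complex)
        for (x, c), a in state.items():
            if c == 0:                      # H|0> = (|0> + |1>)/sqrt(2)
                new[(x - 1, 0)] += s * a    # coin 0 then shift left
                new[(x + 1, 1)] += s * a    # coin 1 then shift right
            else:                           # H|1> = (|0> - |1>)/sqrt(2)
                new[(x - 1, 0)] += s * a
                new[(x + 1, 1)] -= s * a
        state = new
    probs = defaultdict(float)
    for (x, _), a in state.items():
        probs[x] += abs(a) ** 2             # measure position, trace out the coin
    return probs

def spread(probs):
    m = sum(x * p for x, p in probs.items())
    return math.sqrt(sum(p * (x - m) ** 2 for x, p in probs.items()))

n = 30
q = spread(hadamard_walk(n))
print(f"quantum spread after {n} steps: {q:.1f}  (classical: {math.sqrt(n):.1f})")
```

    The quantum spread grows linearly in the number of steps, versus the square-root growth of a classical walk; dephasing, as studied in the record, interpolates between these two regimes.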

  2. College and Career Readiness Assessment: Validation of the Key Cognitive Strategies Framework

    ERIC Educational Resources Information Center

    Lombardi, Allison R.; Conley, David T.; Seburn, Mary A.; Downs, Andrew M.

    2013-01-01

    In this study, the authors examined the psychometric properties of the key cognitive strategies (KCS) within the CollegeCareerReady[TM] School Diagnostic, a self-report measure of critical thinking skills intended for high school students. Using a cross-validation approach, an exploratory factor analysis was conducted with a randomly selected…

  3. Continuous operation of four-state continuous-variable quantum key distribution system

    NASA Astrophysics Data System (ADS)

    Matsubara, Takuto; Ono, Motoharu; Oguri, Yusuke; Ichikawa, Tsubasa; Hirano, Takuya; Kasai, Kenta; Matsumoto, Ryutaroh; Tsurumaru, Toyohiro

    2016-10-01

    We report on the development of a continuous-variable quantum key distribution (CV-QKD) system that is based on discrete quadrature amplitude modulation (QAM) and homodyne detection of coherent states of light. We use a pulsed light source whose wavelength is 1550 nm and whose repetition rate is 10 MHz. The CV-QKD system can continuously generate secret keys that are secure against the entangling cloner attack. The key generation rate is 50 kbps when the quantum channel is a 10 km optical fiber. The CV-QKD system we have developed utilizes the four-state and post-selection protocol [T. Hirano, et al., Phys. Rev. A 68, 042331 (2003)]: Alice randomly sends one of four states {|±α⟩, |±iα⟩}, and Bob randomly performs x- or p-measurement by homodyne detection. A commercially available balanced receiver is used to realize shot-noise-limited pulsed homodyne detection. GPU cards are used to accelerate the software-based post-processing. We use a non-binary LDPC code for error correction (reverse reconciliation) and Toeplitz matrix multiplication for privacy amplification.
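    As an illustration of the sifting and post-selection steps in a four-state protocol, below is a toy classical-noise simulation. All details are assumptions for illustration only: the homodyne outcome is modelled as a unit-variance Gaussian centred on ±2α for the matching quadrature (zero otherwise), the bit is encoded in the sign, and the threshold value is arbitrary. This is in no way a security analysis:

```python
import random
random.seed(7)

ALPHA, THRESH, ROUNDS = 1.5, 1.0, 4000
sifted = 0
agree = 0
for _ in range(ROUNDS):
    bit = random.randint(0, 1)           # Alice's raw bit
    basis_a = random.choice("xp")        # encode in |±α> (x) or |±iα> (p)
    basis_b = random.choice("xp")        # Bob's homodyne quadrature choice
    if basis_a == basis_b:
        mean = 2 * ALPHA * (1 if bit == 0 else -1)  # signal quadrature
    else:
        mean = 0.0                       # conjugate quadrature carries no signal
    outcome = random.gauss(mean, 1.0)    # shot-noise-limited homodyne (var = 1)
    # sifting + post-selection: keep matching bases with a clear-cut outcome
    if basis_a == basis_b and abs(outcome) > THRESH:
        sifted += 1
        agree += (outcome > 0) == (bit == 0)
print(f"sifted fraction: {sifted / ROUNDS:.2f}, agreement: {agree / sifted:.3f}")
```

    Post-selecting on |outcome| above a threshold trades raw rate for a lower bit error rate, which is the role the post-selection step plays before error correction and privacy amplification.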

  4. What Are the Key Elements of Educational Interventions for Lay Carers of Patients With Advanced Disease? A Systematic Literature Search and Narrative Review of Structural Components, Processes and Modes of Delivery.

    PubMed

    Farquhar, Morag; Penfold, Clarissa; Walter, Fiona M; Kuhn, Isla; Benson, John

    2016-07-01

    Educating carers about symptom management may help meet patient and carer needs in relation to distressing symptoms in advanced disease. Reviews of the effectiveness of carer interventions exist, but few have focused on educational interventions and none on the key elements that comprise them but which could inform evidence-based design. To identify the key elements (structural components, processes, and delivery modes) of educational interventions for carers of patients with advanced disease. We systematically searched seven databases, applied inclusion and exclusion criteria, conducted quality appraisal, extracted data, and performed a narrative analysis. We included 62 articles related to 49 interventions. Two main delivery modes were identified: personnel-delivered interventions and stand-alone resources. Personnel-delivered interventions targeted individuals or groups, the former conducted at single or multiple time points, and the latter delivered as series. Just more than half targeted carers rather than patient-carer dyads. Most were developed for cancer; few focused purely on symptom management. Stand-alone resources were rare. Methods to evaluate interventions ranged from postintervention evaluations to fully powered randomized controlled trials but of variable quality. Published evaluations of educational interventions for carers in advanced disease are limited, particularly for non-cancer conditions. Key elements for consideration in developing such interventions were identified; however, lack of reporting of reasons for nonparticipation or dropout from interventions limits understanding of the contribution of these elements to interventions' effectiveness. When developing personnel-delivered interventions for carers in advanced disease, consideration of the disease (and, therefore, caring) trajectory, intervention accessibility (timing, location, and transport), and respite provision may be helpful. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.

  5. Cognitive Therapy and Task Concentration Training Applied as Intensified Group Therapies for Social Anxiety Disorder with Fear of Blushing-A Randomized Controlled Trial.

    PubMed

    Härtling, Samia; Klotsche, Jens; Heinrich, Anke; Hoyer, Jürgen

    2016-11-01

    The current study examines the efficacy of intensified group therapy for social anxiety disorder (SAD) with fear of blushing. Task concentration training (TCT) and cognitive therapy (CT) were applied during one weekend and compared with a waiting-list condition in a randomized controlled trial including 82 patients. On a second weekend, another intervention was added (resulting in TCT-CT and CT-TCT sequences) to examine order effects. TCT and CT were both superior to the waiting list and equally effective after the first therapy weekend. Also, no differences were found between the sequences TCT-CT and CT-TCT at post-assessment. At 6- and 12-month follow-up, effects remained stable or further improved. At the 6-month follow-up, remission rates in completers, established by diagnostic status, were between 69% and 73%. Intensified group therapy is highly effective in treating social anxiety disorder with fear of blushing. Group formats for patients sharing a common primary concern may contribute to the dissemination of cognitive-behavioural therapy. Key Practitioner Message: This study focuses on blushing-fearful individuals within the SAD spectrum to improve evidence for treatment efficacy in those whose social fears are centred around observable bodily sensations. It integrates task concentration training into the SAD model of Clark and Wells to combine two evidence-based treatments for SAD under one treatment model. It uses an innovative format of brief, intensified group therapy, conducted in two full-day weekend group sessions, with strong observed effect sizes. Copyright © 2015 John Wiley & Sons, Ltd.

  6. What proportion of patients with chronic heart failure are eligible for sacubitril-valsartan?

    PubMed

    Pellicori, Pierpaolo; Urbinati, Alessia; Shah, Parin; MacNamara, Alexandra; Kazmi, Syed; Dierckx, Riet; Zhang, Jufen; Cleland, John G F; Clark, Andrew L

    2017-06-01

    The PARADIGM-HF trial showed that sacubitril-valsartan, an ARB-neprilysin inhibitor, is more effective than enalapril for some patients with heart failure (HF). It is uncertain what proportion of patients with HF would be eligible for sacubitril-valsartan in clinical practice. Between 2001 and 2014, 6131 patients consecutively referred to a community HF clinic with suspected HF were assessed. The criteria required to enter the randomized phase of PARADIGM-HF, including symptoms, NT-proBNP, and current treatment with or without target doses of ACE inhibitors or ARBs, were applied to identify the proportion of patients eligible for sacubitril-valsartan. Recognizing the diversity of clinical opinion and guideline recommendations concerning this issue, entry criteria were applied singly and in combination. Of 1396 patients with reduced left ventricular ejection fraction (≤40%, HFrEF) and contemporary measurement of NT-proBNP, 379 were on target doses of an ACE inhibitor or ARB at their initial visit and, of these, 172 (45%) fulfilled the key entry criteria for the PARADIGM-HF trial. Lack of symptoms (32%) and NT-proBNP <600 ng/L (49%) were common reasons for failure to fulfil criteria. A further 122 patients became eligible during follow-up (n = 294, 21%). However, if background medication and doses were ignored, then 701 (50%) were eligible initially and a further 137 became eligible during follow-up. Of patients with HFrEF referred to a clinic such as ours, only 21% fulfilled the PARADIGM-HF randomization criteria, on which the ESC Guidelines are based; this proportion rises to 60% if background medication is ignored. © 2017 The Authors. European Journal of Heart Failure © 2017 European Society of Cardiology.

  7. Identifying taxonomic and functional surrogates for spring biodiversity conservation.

    PubMed

    Jyväsjärvi, Jussi; Virtanen, Risto; Ilmonen, Jari; Paasivirta, Lauri; Muotka, Timo

    2018-02-27

    Surrogate approaches are widely used to estimate overall taxonomic diversity for conservation planning. Surrogate taxa are frequently selected based on rarity or charisma, whereas selection through statistical modeling has been applied rarely. We used boosted-regression-tree models (BRT) fitted to biological data from 165 springs to identify bryophyte and invertebrate surrogates for taxonomic and functional diversity of boreal springs. We focused on these 2 groups because they are well known and abundant in most boreal springs. The best indicators of taxonomic versus functional diversity differed. The bryophyte Bryum weigelii and the chironomid larva Paratrichocladius skirwithensis best indicated taxonomic diversity, whereas the isopod Asellus aquaticus and the chironomid Macropelopia spp. were the best surrogates of functional diversity. In a scoring algorithm for priority-site selection, taxonomic surrogates performed only slightly better than random selection for all spring-dwelling taxa, but they were very effective in representing spring specialists, providing a distinct improvement over random solutions. However, the surrogates for taxonomic diversity represented functional diversity poorly and vice versa. When combined with cross-taxon complementarity analyses, surrogate selection based on statistical modeling provides a promising approach for identifying groundwater-dependent ecosystems of special conservation value, a key requirement of the EU Water Framework Directive. © 2018 Society for Conservation Biology.

  8. Ensemble survival tree models to reveal pairwise interactions of variables with time-to-events outcomes in low-dimensional setting

    PubMed Central

    Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter

    2018-01-01

Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction effects between variables are uncovered, even when they are not accompanied by their corresponding main effects and may not be detected by the standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using an RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected, with predictive value, in observational studies with time-to-event outcomes. PMID:29453930
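The randomization-with-bootstrap idea in this abstract can be illustrated outside the RSF setting. The sketch below is a simplification, not the authors' estimator: it fits a hypothetical linear model with a pairwise interaction term and forms a percentile bootstrap confidence interval for the interaction coefficient. The simulated data, coefficients, and sample sizes are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
x1, x2 = rng.normal(size=(2, n))
# Hypothetical data-generating model with a true interaction effect of 2.0.
y = x1 + x2 + 2.0 * x1 * x2 + rng.normal(scale=0.5, size=n)

def interaction_coef(x1, x2, y):
    # Least-squares fit of y ~ 1 + x1 + x2 + x1*x2; return the interaction term.
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[3]

# Percentile bootstrap confidence interval for the interaction effect.
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, size=n)
    boot.append(interaction_coef(x1[idx], x2[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"interaction ~ {interaction_coef(x1, x2, y):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

An interaction is declared significant when the interval excludes zero; the paper's method applies the same logic to an RSF-based interaction statistic rather than a regression coefficient.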

  9. The Impact of Massage Therapy on Function in Pain Populations-A Systematic Review and Meta-Analysis of Randomized Controlled Trials: Part III, Surgical Pain Populations.

    PubMed

    Boyd, Courtney; Crawford, Cindy; Paat, Charmagne F; Price, Ashley; Xenakis, Lea; Zhang, Weimin

    2016-09-01

    Pain is multi-dimensional and may be better addressed through a holistic, biopsychosocial approach. Massage therapy is commonly practiced among patients seeking pain management; however, its efficacy is unclear. This systematic review and meta-analysis is the first to rigorously assess the quality of the evidence for massage therapy's efficacy in treating pain, function-related, and health-related quality of life outcomes in surgical pain populations. Key databases were searched from inception through February 2014. Eligible randomized controlled trials were assessed for methodological quality using SIGN 50 Checklist. Meta-analysis was applied at the outcome level. A professionally diverse steering committee interpreted the results to develop recommendations. Twelve high quality and four low quality studies were included in the review. Results indicate massage therapy is effective for treating pain [standardized mean difference (SMD) = -0.79] and anxiety (SMD = -0.57) compared to active comparators. Based on the available evidence, weak recommendations are suggested for massage therapy, compared to active comparators for reducing pain intensity/severity and anxiety in patients undergoing surgical procedures. This review also discusses massage therapy safety, challenges within this research field, how to address identified research gaps, and next steps for future research. © 2016 American Academy of Pain Medicine.

  10. Secondary Structure Prediction of Protein Constructs Using Random Incremental Truncation and Vacuum-Ultraviolet CD Spectroscopy

    PubMed Central

    Pukáncsik, Mária; Orbán, Ágnes; Nagy, Kinga; Matsuo, Koichi; Gekko, Kunihiko; Maurin, Damien; Hart, Darren; Kézsmárki, István; Vertessy, Beata G.

    2016-01-01

A novel uracil-DNA degrading protein factor (termed UDE) was identified in Drosophila melanogaster with no significant structural and functional homology to other uracil-DNA binding or processing factors. Determination of the 3D structure of UDE is expected to provide key information for describing the molecular mechanism of UDE catalysis, as well as uracil recognition and nuclease action in general. Towards this long-term aim, the random library ESPRIT technology was applied to the novel protein UDE to overcome problems in identifying soluble expressing constructs, given the absence of precise information on domain content and arrangement. Nine constructs of UDE were chosen to decipher structural and functional relationships. Vacuum-ultraviolet circular dichroism (VUVCD) spectroscopy was performed to define the secondary structure content and its location within UDE and its truncated variants. The quantitative analysis demonstrated exclusively α-helical content for the full-length protein, which is preserved in the truncated constructs. The arrangement of α-helical bundles within the truncated protein segments suggested new domain boundaries which differ from the conserved motifs determined by sequence-based alignment of UDE homologues. Here we demonstrate that the combination of ESPRIT and VUVCD spectroscopy provides a new structural description of UDE and confirms that the truncated constructs are useful for further detailed functional studies. PMID:27273007

  11. Cryptosystem for Securing Image Encryption Using Structured Phase Masks in Fresnel Wavelet Transform Domain

    NASA Astrophysics Data System (ADS)

    Singh, Hukum

    2016-12-01

A cryptosystem for securing image encryption is considered by using double random phase encoding in the Fresnel wavelet transform (FWT) domain. Random phase masks (RPMs) and structured phase masks (SPMs) based on a devil's vortex toroidal lens (DVTL) are used in the spatial as well as the Fourier planes. The images to be encrypted are first Fresnel transformed, and then a single-level discrete wavelet transform (DWT) is applied to decompose them into the LL, HL, LH and HH sub-band matrices. The resulting matrices from the DWT are multiplied by additional RPMs and the resultants are subjected to an inverse DWT to obtain the encrypted images. The scheme is more secure because of the many parameters used in the construction of the SPM. The original images are recovered by using the correct parameters of the FWT and SPM. The SPM based on a DVTL increases security by enlarging the key space for encryption and decryption. The proposed encryption scheme is a lens-less optical system and its digital implementation has been performed using MATLAB 7.6.0 (R2008a). The computed value of the mean-squared error between the retrieved and the input images shows the efficacy of the scheme. The sensitivity to encryption parameters, the robustness against occlusion and multiplicative Gaussian noise attacks, and the entropy have also been analysed.
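The single-level DWT decomposition step can be sketched with a hand-rolled Haar transform; this is an assumption for illustration, since the record does not state which wavelet the scheme uses, and sub-band naming conventions vary between references.

```python
import numpy as np

def haar_step(x, axis):
    # Split into even/odd samples along `axis`, then average/difference them
    # with 1/sqrt(2) scaling so the transform is orthonormal.
    even = x.take(np.arange(0, x.shape[axis], 2), axis=axis)
    odd = x.take(np.arange(1, x.shape[axis], 2), axis=axis)
    return (even + odd) / np.sqrt(2.0), (even - odd) / np.sqrt(2.0)

def haar_dwt2(img):
    # Single-level 2-D Haar DWT: rows first, then columns of each half,
    # yielding the four sub-band matrices (LL, LH, HL, HH).
    lo, hi = haar_step(img.astype(float), axis=1)
    LL, LH = haar_step(lo, axis=0)
    HL, HH = haar_step(hi, axis=0)
    return LL, LH, HL, HH

rng = np.random.default_rng(0)
img = rng.random((8, 8))            # stand-in for an image tile
LL, LH, HL, HH = haar_dwt2(img)
```

Because the Haar step is orthonormal, the total energy of the four sub-bands equals that of the input, which is a quick sanity check on any DWT implementation.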

  12. Occupation times and ergodicity breaking in biased continuous time random walks

    NASA Astrophysics Data System (ADS)

    Bel, Golan; Barkai, Eli

    2005-12-01

Continuous time random walk (CTRW) models are widely used to model diffusion in condensed matter. There are two classes of such models, distinguished by the convergence or divergence of the mean waiting time. Systems with finite average sojourn time are ergodic and thus Boltzmann-Gibbs statistics can be applied. We investigate the statistical properties of CTRW models with infinite average sojourn time; in particular, the occupation time probability density function is obtained. It is shown that in the non-ergodic phase the distribution of the occupation time of the particle on a given lattice point exhibits a bimodal U or trimodal W shape, related to the arcsine law. The key points are as follows. (a) In a CTRW with finite or infinite mean waiting time, the distribution of the number of visits on a lattice point is determined by the probability that a member of an ensemble of particles in equilibrium occupies the lattice point. (b) The asymmetry parameter of the probability distribution function of occupation times is related to the Boltzmann probability and to the partition function. (c) The ensemble average is given by Boltzmann-Gibbs statistics for either finite or infinite mean sojourn time, when detailed balance conditions hold. (d) A non-ergodic generalization of the Boltzmann-Gibbs statistical mechanics for systems with infinite mean sojourn time is found.
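The U-shaped occupation-time statistics described here can be illustrated with the classical arcsine law for a simple symmetric random walk. This is a minimal sketch of the finite-mean-waiting-time case only, not the paper's CTRW with diverging mean sojourn time; walk lengths and counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_walks, n_steps = 2000, 200

# Fraction of time each symmetric random walk spends on the positive half-line.
steps = rng.choice([-1, 1], size=(n_walks, n_steps))
paths = np.cumsum(steps, axis=1)
frac_positive = (paths > 0).mean(axis=1)

# Lévy's arcsine law: the density of this fraction is U-shaped (bimodal), so
# extreme occupation fractions are far more likely than balanced ones.
hist, _ = np.histogram(frac_positive, bins=10, range=(0.0, 1.0))
print(hist)
```

The histogram's outer bins dominate the central ones, the same bimodal signature the paper reports for occupation times in the non-ergodic phase.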

  13. Ensemble survival tree models to reveal pairwise interactions of variables with time-to-events outcomes in low-dimensional setting.

    PubMed

    Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter

    2018-02-17

Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction effects between variables are uncovered, even when they are not accompanied by their corresponding main effects and may not be detected by the standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using an RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected, with predictive value, in observational studies with time-to-event outcomes.

  14. Automatic Trading Agent. RMT Based Portfolio Theory and Portfolio Selection

    NASA Astrophysics Data System (ADS)

    Snarska, M.; Krzych, J.

    2006-11-01

Portfolio theory is a very powerful tool in modern investment theory. It is helpful in estimating the risk of an investor's portfolio, arising from lack of information, uncertainty and incomplete knowledge of reality, which forbids a perfect prediction of future price changes. Despite its many advantages, this tool is not well known and not widely used among investors on the Warsaw Stock Exchange. The main reason for abandoning this method is its high level of complexity and immense calculations. The aim of this paper is to introduce an automatic decision-making system, which allows a single investor to use the complex methods of Modern Portfolio Theory (MPT). The key tool in MPT is the analysis of an empirical covariance matrix. This matrix, obtained from historical data, is biased by such a high amount of statistical uncertainty that it can be seen as random. By bringing into practice the ideas of Random Matrix Theory (RMT), the noise is removed or significantly reduced, so the future risk and return are better estimated and controlled. These concepts are applied to the Warsaw Stock Exchange Simulator {http://gra.onet.pl}. The simulation yielded an 18% gain, compared with a 10% loss for the Warsaw Stock Exchange main index WIG.
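The RMT noise-cleaning step can be sketched as follows: for a correlation matrix estimated from T observations of N independent series, eigenvalues below the Marchenko-Pastur upper edge are treated as noise and flattened to their average. This is one common cleaning recipe, chosen here for illustration; the record does not specify which variant the system uses.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 500, 50                       # T observations of N assets
returns = rng.normal(size=(T, N))    # pure-noise returns for illustration

C = np.corrcoef(returns, rowvar=False)
eigval, eigvec = np.linalg.eigh(C)

# Marchenko-Pastur upper edge for a random correlation matrix with q = N/T.
q = N / T
lam_max = (1 + np.sqrt(q)) ** 2

# "Clean" the spectrum: eigenvalues below the noise edge are replaced by their
# average (preserving the trace); eigenvalues above it are kept as signal.
noise = eigval < lam_max
cleaned = eigval.copy()
cleaned[noise] = cleaned[noise].mean()
C_clean = eigvec @ np.diag(cleaned) @ eigvec.T

print(int(noise.sum()), "of", N, "eigenvalues fall in the noise band")
```

On real returns, the few eigenvalues above the edge carry market and sector structure; risk estimates built on `C_clean` are then less contaminated by sampling noise.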

  15. Ensemble of trees approaches to risk adjustment for evaluating a hospital's performance.

    PubMed

    Liu, Yang; Traskin, Mikhail; Lorch, Scott A; George, Edward I; Small, Dylan

    2015-03-01

A commonly used method for evaluating a hospital's performance on an outcome is to compare the hospital's observed outcome rate to the hospital's expected outcome rate given its patient (case) mix and service. The process of calculating the hospital's expected outcome rate given its patient mix and service is called risk adjustment (Iezzoni 1997). Risk adjustment is critical for accurately evaluating and comparing hospitals' performances, since we would not want to unfairly penalize a hospital just because it treats sicker patients. The key to risk adjustment is accurately estimating the probability of an outcome given patient characteristics. For cases with binary outcomes, the method commonly used in risk adjustment is logistic regression. In this paper, we consider ensemble-of-trees methods as alternatives for risk adjustment, including random forests and Bayesian additive regression trees (BART). Both random forests and BART are modern machine learning methods that have recently been shown to have excellent performance for prediction of outcomes in many settings. We apply these methods to carry out risk adjustment for the performance of neonatal intensive care units (NICUs). We show that these ensemble-of-trees methods outperform logistic regression in predicting mortality among babies treated in NICUs, and provide a superior method of risk adjustment compared to logistic regression.
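The observed-to-expected comparison at the heart of risk adjustment can be written in a few lines. The predicted probabilities below are hypothetical stand-ins for what a fitted logistic-regression, random-forest, or BART model would produce for one hospital's patients.

```python
# Minimal observed/expected (O/E) sketch with hypothetical numbers: the
# hospital's observed outcome count is compared with the count expected from a
# risk model's per-patient predicted probabilities.
observed_deaths = 6
predicted_prob = [0.02, 0.10, 0.45, 0.30, 0.08, 0.22, 0.50, 0.15,
                  0.05, 0.40, 0.12, 0.33, 0.27, 0.09, 0.18, 0.35]

expected_deaths = sum(predicted_prob)
oe_ratio = observed_deaths / expected_deaths   # >1: worse than case mix predicts
print(f"O/E = {oe_ratio:.2f}")
```

The paper's contribution is in how `predicted_prob` is estimated: the better the risk model calibrates to patient characteristics, the fairer this ratio is as a performance measure.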

  16. Evidence-Based Practice in Forensic Mental Health Nursing: A Critical Review.

    PubMed

    Byrt, Richard; Spencer-Stiles, Theresa A; Ismail, Ismail

    2018-06-15

    Literature searches of databases, particularly CINAHL, using key phrases were undertaken. Some authors argue that there is a lack of evidence in forensic mental health (FMH) nursing, with few randomized controlled trials and other methods providing definitive, generalizable evidence. However, literature searches revealed randomized controlled trials of relevance to FMH nursing, many qualitative studies by FMH nurses, and arguments for clinical experience and knowledge of service users, and the latter's views, as sources of evidence. Research findings can be applied to practice, both directly and indirectly. Examples are given of ways that evidence can be used to inform FMH nursing interventions related to therapeutic ward environments, including communication, therapeutic relationships, preventing retraumatization, and enabling physical health. The complex nature of "evidence" is considered in relation to risk assessment and management. FMH nursing can be based on a wide range of sources of evidence. The types of evidence used in practice depend on individual service users' needs and views. In evaluating evidence, it is necessary to be aware of its complex, diverse nature. A distinction can be made between definitive, widely generalizable research findings and evidence with limited generalizability, requiring FMH nurses' judgments about whether it is applicable to their own area of practice. Recommendations for related education and research are made.

  17. Comparison of Collaboration and Performance in Groups of Learners Assembled Randomly or Based on Learners' Topic Preferences

    ERIC Educational Resources Information Center

    Cela, Karina L.; Sicilia, Miguel Ángel; Sánchez, Salvador

    2015-01-01

    Teachers and instructional designers frequently incorporate collaborative learning approaches into their e-learning environments. A key factor of collaborative learning that may affect learner outcomes is whether the collaborative groups are assigned project topics randomly or based on a shared interest in the topic. This is a particularly…

  18. Validity of Random Short Forms: III. Wechsler's Intelligence Scales.

    ERIC Educational Resources Information Center

    Silverstein, A. B.

    1983-01-01

    Formulas for estimating the validity of random short forms were applied to the standardization data for the Wechsler Adult Intelligence Scale-Revised, the Minnesota Multiphasic Personality Inventory, and the Marlowe-Crowne Social Desirability Scale. These formulas demonstrated how much "better than random" the best short forms of these…

  19. Design and implementation of encrypted and decrypted file system based on USBKey and hardware code

    NASA Astrophysics Data System (ADS)

    Wu, Kehe; Zhang, Yakun; Cui, Wenchao; Jiang, Ting

    2017-05-01

To protect the privacy of sensitive data, an encrypted and decrypted file system based on a USBKey and hardware code is designed and implemented in this paper. The system uses the USBKey and hardware code to authenticate a user. We use a random key to encrypt the file with a symmetric encryption algorithm, and the USBKey to encrypt the random key with an asymmetric encryption algorithm. At the same time, we use the MD5 algorithm to calculate the hash of the file to verify its integrity. Experimental results show that large files can be encrypted and decrypted in a very short time. The system has high efficiency and ensures the security of documents.
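The key-plus-hash layout described above can be sketched with the standard library. The XOR keystream below is a toy stand-in for the real symmetric cipher, the asymmetric USBKey wrapping of the file key is omitted, and MD5 (used here because the paper names it) provides only an integrity checksum, not cryptographic authentication.

```python
import hashlib
import os

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher: SHA-256 in counter mode as a keystream, XORed with
    # the data. A real system would use an authenticated cipher such as AES-GCM.
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

plaintext = b"sensitive document contents"
file_key = os.urandom(32)                      # random per-file symmetric key
ciphertext = xor_stream(file_key, plaintext)
digest = hashlib.md5(plaintext).hexdigest()    # integrity check, as in the paper

# Decryption with the same key recovers the file; the MD5 digest verifies it.
recovered = xor_stream(file_key, ciphertext)
assert hashlib.md5(recovered).hexdigest() == digest
```

In the paper's design, `file_key` would itself be stored encrypted under the USBKey's asymmetric public key, so only the authenticated token holder can unwrap it.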

  20. Single-photon continuous-variable quantum key distribution based on the energy-time uncertainty relation.

    PubMed

    Qi, Bing

    2006-09-15

    We propose a new quantum key distribution protocol in which information is encoded on continuous variables of a single photon. In this protocol, Alice randomly encodes her information on either the central frequency of a narrowband single-photon pulse or the time delay of a broadband single-photon pulse, while Bob randomly chooses to do either frequency measurement or time measurement. The security of this protocol rests on the energy-time uncertainty relation, which prevents Eve from simultaneously determining both frequency and time information with arbitrarily high resolution. Since no interferometer is employed in this scheme, it is more robust against various channel noises, such as polarization and phase fluctuations.

  1. A novel image encryption algorithm based on chaos maps with Markov properties

    NASA Astrophysics Data System (ADS)

    Liu, Quan; Li, Pei-yue; Zhang, Ming-chao; Sui, Yong-xin; Yang, Huai-jiang

    2015-02-01

In order to construct a high-complexity, secure and low-cost image encryption algorithm, a class of chaos with Markov properties was researched and such an algorithm was proposed. This kind of chaos has higher complexity than the Logistic map and the Tent map, while keeping uniformity and low autocorrelation. An improved coupled map lattice based on the chaos with Markov properties is also employed to cover the phase space of the chaos and enlarge the key space; it has better performance than the original one. A novel image encryption algorithm is constructed on the new coupled map lattice, which is used as a key stream generator. A true random number is used to disturb the key, which dynamically changes the permutation matrix and the key stream. Experiments show that the key stream can pass the SP800-22 test. The novel image encryption algorithm can resist chosen-plaintext (CPA), chosen-ciphertext (CCA) and differential attacks. The algorithm is sensitive to the initial key and changes the distribution of the pixel values of the image. The correlation of adjacent pixels can also be eliminated. Compared with an algorithm based on the Logistic map, it has higher complexity and better uniformity, and is nearer to a true random number generator. It is also efficient to implement, which shows its value for common use.
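The key-stream role of a chaotic map can be sketched with the ordinary logistic map. The paper's Markov-property chaos is claimed to have better statistics; the logistic map is used here only as a familiar stand-in, with illustrative seed and parameter values.

```python
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    # Iterate the logistic map x -> r*x*(1-x) and quantize each state to a
    # byte. A transient is discarded so the stream starts on the attractor.
    x, out = x0, bytearray()
    for _ in range(100):
        x = r * x * (1 - x)
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

img = bytes(range(16))                              # stand-in "image" bytes
ks = logistic_keystream(x0=0.3141592, r=3.9999, n=len(img))
cipher = bytes(a ^ b for a, b in zip(img, ks))      # XOR diffusion step
plain = bytes(a ^ b for a, b in zip(cipher, ks))    # same keystream decrypts
```

A full scheme would combine such a stream with a key-dependent permutation of pixel positions, which is the part the paper's true-random disturbance dynamically changes.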

  2. Safety assessment of a shallow foundation using the random finite element method

    NASA Astrophysics Data System (ADS)

    Zaskórski, Łukasz; Puła, Wojciech

    2015-04-01

The complex structure of soil and its random character make soil modeling a cumbersome task. Heterogeneity of soil has to be considered even within a homogeneous soil layer. Therefore, estimating the shear strength parameters of soil for the purposes of a geotechnical analysis causes many problems. The applicable standard (Eurocode 7) presents no explicit method for evaluating characteristic values of soil parameters, only general guidelines on how these values should be estimated. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β. Several approaches to estimating characteristic values of soil properties were compared by evaluating the reliability index β achieved by each of them. The method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for fluctuation scales, and the method included in Eurocode 7 were examined. Design values of the bearing capacity based on these approaches were referred to the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of a foundation in conjunction with the design approaches (DA) defined in Eurocode. RFEM was presented by Griffiths and Fenton (1993). It combines the deterministic finite element method, random field theory and Monte Carlo simulations. Random field theory allows the random character of soil parameters to be considered within a homogeneous soil layer. For this purpose, a soil property is treated as a separate random variable in every element of the finite element mesh, with a proper correlation structure between points of the given area. RFEM was applied to estimate which theoretical probability distribution fits the empirical probability distribution of the bearing capacity, based on 3000 realizations. The assessed probability distribution was applied to compute design values of the bearing capacity and the related reliability indices β. The analyses were carried out for a cohesive soil. Hence the friction angle and the cohesion were defined as random parameters and characterized by two-dimensional random fields. The friction angle was described by a bounded distribution, as it varies within a limited range, while a lognormal distribution was applied to the cohesion. Other properties - Young's modulus, Poisson's ratio and unit weight - were assumed to be deterministic values because they have negligible influence on the stochastic bearing capacity. Griffiths D. V., & Fenton G. A. (1993). Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6), 577-587.
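The Monte Carlo portion of such a reliability assessment can be sketched with independent random variables in place of the paper's correlated random fields. All distributions, parameter values, and the toy capacity formula below are illustrative, not the study's.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3000                                   # realizations, as in the study

# Hypothetical illustration (not the paper's RFEM): bearing capacity R depends
# on a lognormal cohesion and a bounded friction angle, sampled independently.
cohesion = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=n)   # kPa
phi = np.radians(20 + 10 * rng.beta(2, 2, size=n))               # bounded angle
R = cohesion * (1 + np.tan(phi)) * 5.0     # toy capacity model, kN/m

S = 100.0                                  # deterministic applied load
beta = (R.mean() - S) / R.std(ddof=1)      # Cornell-type reliability index
pf = (R < S).mean()                        # Monte Carlo failure probability
print(f"beta = {beta:.2f}, Pf ~ {pf:.4f}")
```

The study's refinement is that fitting a theoretical distribution to the 3000 capacity realizations gives smoother tail estimates of β than the raw empirical failure fraction.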

  3. Facilitation of learning induced by both random and gradual visuomotor task variation

    PubMed Central

    Braun, Daniel A.; Wolpert, Daniel M.

    2012-01-01

    Motor task variation has been shown to be a key ingredient in skill transfer, retention, and structural learning. However, many studies only compare training of randomly varying tasks to either blocked or null training, and it is not clear how experiencing different nonrandom temporal orderings of tasks might affect the learning process. Here we study learning in human subjects who experience the same set of visuomotor rotations, evenly spaced between −60° and +60°, either in a random order or in an order in which the rotation angle changed gradually. We compared subsequent learning of three test blocks of +30°→−30°→+30° rotations. The groups that underwent either random or gradual training showed significant (P < 0.01) facilitation of learning in the test blocks compared with a control group who had not experienced any visuomotor rotations before. We also found that movement initiation times in the random group during the test blocks were significantly (P < 0.05) lower than for the gradual or the control group. When we fit a state-space model with fast and slow learning processes to our data, we found that the differences in performance in the test block were consistent with the gradual or random task variation changing the learning and retention rates of only the fast learning process. Such adaptation of learning rates may be a key feature of ongoing meta-learning processes. Our results therefore suggest that both gradual and random task variation can induce meta-learning and that random learning has an advantage in terms of shorter initiation times, suggesting less reliance on cognitive processes. PMID:22131385
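The fast/slow state-space model referred to in this abstract can be sketched as a standard two-rate learner, in which a fast process learns and forgets quickly while a slow process does the opposite. The retention and learning-rate parameters below are illustrative, not fitted values from the study.

```python
import numpy as np

# Two-rate state-space model of adaptation: the motor output is the sum of a
# fast and a slow state, each updated from the same trial error.
A_f, B_f = 0.60, 0.40     # fast process: low retention, high learning rate
A_s, B_s = 0.99, 0.05     # slow process: high retention, low learning rate

def simulate(perturbation, n_trials):
    xf = xs = 0.0
    out = []
    for _ in range(n_trials):
        error = perturbation - (xf + xs)
        xf = A_f * xf + B_f * error
        xs = A_s * xs + B_s * error
        out.append(xf + xs)
    return np.array(out)

adaptation = simulate(perturbation=30.0, n_trials=50)   # e.g. a +30° rotation
print(f"final adaptation: {adaptation[-1]:.1f} of 30.0")
```

In the study's fits, task variation changed the learning and retention rates of only the fast process, i.e. `A_f` and `B_f` in this parameterization.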

  4. MIP Models and Hybrid Algorithms for Simultaneous Job Splitting and Scheduling on Unrelated Parallel Machines

    PubMed Central

    Ozmutlu, H. Cenk

    2014-01-01

We developed mixed integer programming (MIP) models and hybrid genetic-local search algorithms for the scheduling problem of unrelated parallel machines with job-sequence- and machine-dependent setup times and with a job splitting property. The first contribution of this paper is to introduce novel algorithms which perform splitting and scheduling simultaneously with a variable number of subjobs. We proposed a simple chromosome structure constituted by random key numbers in the hybrid genetic-local search algorithm (GAspLA). Random key numbers are used frequently in genetic algorithms, but they create additional difficulty when hybrid factors in local search are implemented. We developed algorithms that adapt the results of local search back into the genetic algorithm with a minimum relocation operation on the genes' random key numbers. This is the second contribution of the paper. The third contribution is three new MIP models which perform splitting and scheduling simultaneously. The fourth contribution is the implementation of GAspLAMIP. This implementation lets us verify the optimality of GAspLA for the studied combinations. The proposed methods are tested on a set of problems taken from the literature and the results validate the effectiveness of the proposed algorithms. PMID:24977204
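The random-key encoding and the minimum-relocation idea can be sketched as follows; this is an interpretation of the abstract, not the authors' exact operators. Sorting the genes decodes a chromosome into a valid permutation, and a local-search swap of two jobs can be written back by exchanging just their two genes.

```python
import numpy as np

rng = np.random.default_rng(4)

# Random-key encoding: each job gets a real-valued gene; sorting the genes
# yields a job permutation, so any crossover still decodes to a valid schedule.
n_jobs = 8
chromosome = rng.random(n_jobs)
schedule = np.argsort(chromosome)       # job processing order

# A local-search move (swap the jobs at schedule positions i and j) is written
# back by swapping only their two genes -- the "minimum relocation" idea: the
# rest of the encoding is untouched.
i, j = 2, 5
chromosome[schedule[i]], chromosome[schedule[j]] = (
    chromosome[schedule[j]],
    chromosome[schedule[i]],
)
new_schedule = np.argsort(chromosome)
```

Because only two genes move, offspring produced by later crossovers stay close to the improved solution found by local search.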

  5. Iteration and superposition encryption scheme for image sequences based on multi-dimensional keys

    NASA Astrophysics Data System (ADS)

    Han, Chao; Shen, Yuzhen; Ma, Wenlin

    2017-12-01

An iteration and superposition encryption scheme for image sequences based on multi-dimensional keys is proposed for high-security, high-capacity and low-noise information transmission. The multiple images to be encrypted are transformed into phase-only images with an iterative algorithm and are then each encrypted by a different random phase. The encrypted phase-only images are inverse Fourier transformed, generating new object functions. The new functions are located in different blocks and zero-padded for a sparse distribution; they are then propagated to a specific region at different distances by angular spectrum diffraction and superposed to form a single image. The single image is multiplied by a random phase in the frequency domain, after which the phase part of the frequency spectrum is truncated and the amplitude information is retained. The random phase, the propagation distances and the truncated phase information in the frequency domain are employed as multi-dimensional keys. The iteration processing and sparse distribution greatly reduce the crosstalk among the multiple encrypted images. The superposition of image sequences greatly improves the capacity of the encrypted information. Several numerical experiments based on a designed optical system demonstrate that the proposed scheme can enhance the encrypted information capacity and enable image transmission at a highly desired security level.
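The phase-only encoding and random-phase masking steps can be sketched for a single image. This is a simplification: the diffraction distances, sparse block placement, superposition of multiple images, and the phase-truncation step of the full scheme are omitted.

```python
import numpy as np

rng = np.random.default_rng(5)

# The normalized image becomes a phase-only function, is masked by a random
# phase (one of the scheme's keys), and Fourier transformed.
img = rng.random((32, 32))                        # normalized image in [0, 1)
mask = np.exp(2j * np.pi * rng.random((32, 32)))  # random phase mask (key)

encrypted = np.fft.fft2(np.exp(2j * np.pi * img) * mask)

# Decryption: invert the transform, remove the mask, and read back the phase.
phase = np.angle(np.fft.ifft2(encrypted) * np.conj(mask))
recovered = (phase / (2 * np.pi)) % 1.0
```

Without the correct `mask`, the inverse transform yields a random-looking phase field, which is why the random phases act as keys in such schemes.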

6. 77 FR 29752 - Petition for Exemption From the Federal Motor Vehicle Theft Prevention Standard; Jaguar...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-18

    ... Smart Key has a discharged battery or is damaged, the emergency key blade can be used to unlock the door... cycling, high and low temperature cycling, mechanical shock, random vibration, thermal stress/shock tests... to the disposition of all Part 543 petitions. Advanced listing, including the release of future...

  7. Quantum cryptography without switching.

    PubMed

    Weedbrook, Christian; Lance, Andrew M; Bowen, Warwick P; Symul, Thomas; Ralph, Timothy C; Lam, Ping Koy

    2004-10-22

    We propose a new coherent state quantum key distribution protocol that eliminates the need to randomly switch between measurement bases. This protocol provides significantly higher secret key rates with increased bandwidths than previous schemes that only make single quadrature measurements. It also offers the further advantage of simplicity compared to all previous protocols which, to date, have relied on switching.

  8. A Comparison of Seventh Grade Thai Students' Reading Comprehension and Motivation to Read English through Applied Instruction Based on the Genre-Based Approach and the Teacher's Manual

    ERIC Educational Resources Information Center

    Sawangsamutchai, Yutthasak; Rattanavich, Saowalak

    2016-01-01

    The objective of this research is to compare the English reading comprehension and motivation to read of seventh grade Thai students taught with applied instruction through the genre-based approach and teachers' manual. A randomized pre-test post-test control group design was used through the cluster random sampling technique. The data were…

  9. Diameter-Growth and Epicormic Branching Response of an East Texas Bottomland Red Oak Stand 3 Years After Thinning and Fertilization

    Treesearch

    Alexander J. Michalek; Brian Roy Lockhart; Matthew W. Lowe; Richard A. Williams

    2004-01-01

    To determine the effects of intermediate silvicultural treatments on bottomland hardwoods, two types of thinning (crown thinning and low thinning) and one level of fertilizer (200 pounds per acre N + 50 pounds per acre P) were applied to a predominantly red oak stand in southeastern Texas. Treatments were applied in a 3 by 2 factorial arrangement as a randomized…

  10. Effect of expanding medicaid for parents on children's health insurance coverage: lessons from the Oregon experiment.

    PubMed

    DeVoe, Jennifer E; Marino, Miguel; Angier, Heather; O'Malley, Jean P; Crawford, Courtney; Nelson, Christine; Tillotson, Carrie J; Bailey, Steffani R; Gallia, Charles; Gold, Rachel

    2015-01-01

    In the United States, health insurance is not universal. Observational studies show an association between uninsured parents and children. This association persisted even after expansions in child-only public health insurance. Oregon's randomized Medicaid expansion for adults, known as the Oregon Experiment, created a rare opportunity to assess causality between parent and child coverage. To estimate the effect on a child's health insurance coverage status when (1) a parent randomly gains access to health insurance and (2) a parent obtains coverage. Oregon Experiment randomized natural experiment assessing the results of Oregon's 2008 Medicaid expansion. We used generalized estimating equation models to examine the longitudinal effect of a parent randomly selected to apply for Medicaid on their child's Medicaid or Children's Health Insurance Program (CHIP) coverage (intent-to-treat analyses). We used per-protocol analyses to understand the impact on children's coverage when a parent was randomly selected to apply for and obtained Medicaid. Participants included 14,409 children aged 2 to 18 years whose parents participated in the Oregon Experiment. For intent-to-treat analyses, the date a parent was selected to apply for Medicaid was considered the date the child was exposed to the intervention. In per-protocol analyses, exposure was defined as whether a selected parent obtained Medicaid. Children's Medicaid or CHIP coverage was assessed monthly and in 6-month intervals relative to their parent's selection date. In the immediate period after selection, the number of covered children whose parents were selected to apply increased significantly from 3830 (61.4%) to 4152 (66.6%), compared with a nonsignificant change from 5049 (61.8%) to 5044 (61.7%) among children whose parents were not selected to apply. 
Children whose parents were randomly selected to apply for Medicaid had 18% higher odds of being covered in the first 6 months after parent's selection compared with children whose parents were not selected (adjusted odds ratio [AOR]=1.18; 95% CI, 1.10-1.27). The effect remained significant during months 7 to 12 (AOR=1.11; 95% CI, 1.03-1.19); months 13 to 18 showed a positive but not significant effect (AOR=1.07; 95% CI, 0.99-1.14). Children whose parents were selected and obtained coverage had more than double the odds of having coverage compared with children whose parents were not selected and did not gain coverage (AOR=2.37; 95% CI, 2.14-2.64). Children's odds of having Medicaid or CHIP coverage increased when their parents were randomly selected to apply for Medicaid. Children whose parents were selected and subsequently obtained coverage benefited most. This study demonstrates a causal link between parents' access to Medicaid coverage and their children's coverage.

  11. Primordial and primary prevention programs for cardiovascular diseases: from risk assessment through risk communication to risk reduction. A review of the literature

    PubMed Central

    Lancarotte, Inês; Nobre, Moacyr Roberto

    2016-01-01

    The aim of this study was to identify and reflect on the methods employed by studies focusing on intervention programs for the primordial and primary prevention of cardiovascular diseases. The PubMed, EMBASE, SciVerse Hub-Scopus, and Cochrane Library electronic databases were searched using the terms ‘effectiveness AND primary prevention AND risk factors AND cardiovascular diseases’ for systematic reviews, meta-analyses, randomized clinical trials, and controlled clinical trials in the English language. A descriptive analysis of the employed strategies, theories, frameworks, applied activities, and measurement of the variables was conducted. Nineteen primary studies were analyzed. Heterogeneity was observed in the outcome evaluations, not only in the selected domains but also in the indicators used to measure the variables. There was also a predominance of repeated cross-sectional survey design, differences in community settings, and variability related to the randomization unit when randomization was implemented as part of the sample selection criteria; furthermore, particularities related to measures, limitations, and confounding factors were observed. The employed strategies, including their advantages and limitations, and the employed theories and frameworks are discussed, and risk communication, as the key element of the interventions, is emphasized. A methodological process of selecting and presenting the information to be communicated is recommended, and a systematic theoretical perspective to guide the communication of information is advised. The risk assessment concept, its essential elements, and the relevant role of risk perception are highlighted. It is fundamental for communication that statements targeting other people’s understanding be prepared using systematic data. PMID:27982169

  12. SymptomCare@Home: Developing an Integrated Symptom Monitoring and Management System for Outpatients Receiving Chemotherapy.

    PubMed

    Beck, Susan L; Eaton, Linda H; Echeverria, Christina; Mooney, Kathi H

    2017-10-01

    SymptomCare@Home, an integrated symptom monitoring and management system, was designed as part of randomized clinical trials to help patients with cancer who receive chemotherapy in ambulatory clinics and often experience significant symptoms at home. An iterative design process was informed by chronic disease management theory and features of assessment and clinical decision support systems used in other diseases. Key stakeholders participated in the design process: nurse scientists, clinical experts, bioinformatics experts, and computer programmers. Especially important was input from end users, patients, and nurse practitioners participating in a series of studies testing the system. The system includes both a patient and clinician interface and fully integrates two electronic subsystems: a telephone computer-linked interactive voice response system and a Web-based Decision Support-Symptom Management System. Key features include (1) daily symptom monitoring, (2) self-management coaching, (3) alerting, and (4) nurse practitioner follow-up. The nurse practitioner is distinctively positioned to provide assessment, education, support, and pharmacologic and nonpharmacologic interventions to intensify management of poorly controlled symptoms at home. SymptomCare@Home is a model for providing telehealth. The system facilitates using evidence-based guidelines as part of a comprehensive symptom management approach. The design process and system features can be applied to other diseases and conditions.

  13. Double image encryption in Fresnel domain using wavelet transform, gyrator transform and spiral phase masks

    NASA Astrophysics Data System (ADS)

    Kumar, Ravi; Bhaduri, Basanta

    2017-06-01

    In this paper, we propose a new technique for double image encryption in the Fresnel domain using the wavelet transform (WT), gyrator transform (GT) and spiral phase masks (SPMs). The two input images are first phase encoded, and each is then multiplied with an SPM and Fresnel propagated with distances d1 and d2, respectively. The single-level discrete WT is applied to the Fresnel-propagated complex images to decompose each into sub-band matrices, i.e. LL, HL, LH and HH. Further, the sub-band matrices of the two complex images are interchanged after modulation with random phase masks (RPMs) and subjected to the inverse discrete WT. The resulting images are then both added and subtracted to get intermediate images, which are further Fresnel propagated with distances d3 and d4, respectively. These outputs are finally gyrator transformed with the same angle α to get the encrypted images. The proposed technique provides enhanced security in terms of a large set of security keys. The sensitivity of security keys such as the SPM parameters, GT angle α and Fresnel propagation distances is investigated. The robustness of the proposed technique against noise and occlusion attacks is also analysed. Numerical simulation results are shown in support of the validity and effectiveness of the proposed technique.
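    As a minimal illustration of the Fresnel-propagation step such schemes rely on, here is an angular-spectrum propagator sketch in NumPy. The grid size, wavelength and sampling interval are arbitrary assumptions, and the WT/GT stages are omitted.

```python
import numpy as np

def fresnel_propagate(field, wavelength, z, dx):
    """Propagate a complex field a distance z using the angular-spectrum
    (Fresnel) transfer function on an n-by-n grid with pixel pitch dx."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

rng = np.random.default_rng(1)
field = rng.random((64, 64)) * np.exp(1j * rng.uniform(0, 2 * np.pi, (64, 64)))
prop = fresnel_propagate(field, 633e-9, 0.1, 10e-6)    # propagate forward
back = fresnel_propagate(prop, 633e-9, -0.1, 10e-6)    # propagate back
```

    Propagating by -z exactly inverts propagating by +z, which is why the propagation distances can serve as decryption keys.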

  14. A Data Management System Integrating Web-Based Training and Randomized Trials

    ERIC Educational Resources Information Center

    Muroff, Jordana; Amodeo, Maryann; Larson, Mary Jo; Carey, Margaret; Loftin, Ralph D.

    2011-01-01

    This article describes a data management system (DMS) developed to support a large-scale randomized study of an innovative web-course that was designed to improve substance abuse counselors' knowledge and skills in applying a substance abuse treatment method (i.e., cognitive behavioral therapy; CBT). The randomized trial compared the performance…

  15. Deducing trapdoor primitives in public key encryption schemes

    NASA Astrophysics Data System (ADS)

    Pandey, Chandra

    2005-03-01

    Semantic security of public key encryption schemes is often interchangeable with the art of building trapdoors. In the frame of reference of the Random Oracle methodology, "Key Privacy" and "Anonymity" have often been discussed. However, to a certain degree the security of most public key encryption schemes is required to be analyzed with formal proofs using one-way functions. This paper evaluates the design of El Gamal and RSA based schemes and attempts to parallelize the trapdoor primitives used in the computation of the ciphertext, thereby magnifying the decryption error δp in the above schemes.

  16. Efficacy of Continuing Education in Improving Pharmacists' Competencies for Providing Weight Management Service: Three-Arm Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Sarayani, Amir; Rashidian, Arash; Gholami, Kheirollah; Torkamandi, Hassan; Javadi, Mohammadreza

    2012-01-01

    Introduction: Weight management is a new public health role for community pharmacists in many countries. Lack of expertise is one of the key barriers to counseling obese patients. We evaluated the comparative efficacy of three alternative continuing education (CE) meetings on weight management. Methods: We designed a randomized controlled trial…

  17. Optimal Design in Three-Level Block Randomized Designs with Two Levels of Nesting: An ANOVA Framework with Random Effects

    ERIC Educational Resources Information Center

    Konstantopoulos, Spyros

    2013-01-01

    Large-scale experiments that involve nested structures may assign treatment conditions either to subgroups such as classrooms or to individuals such as students within subgroups. Key aspects of the design of such experiments include knowledge of the variance structure in higher levels and the sample sizes necessary to reach sufficient power to…

  18. Effects of the Caregiver Interaction Profile Training on Caregiver-Child Interactions in Dutch Child Care Centers: A Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Helmerhorst, Katrien O.; Riksen-Walraven, J. Marianne; Fukkink, Ruben G.; Tavecchio, Louis W. C.; Gevers Deynoot-Schaub, Mirjam J. J. M.

    2017-01-01

    Background: Previous studies underscore the need to improve caregiver-child interactions in early child care centers. Objective: In this study we used a randomized controlled trial to examine whether a 5-week video feedback training can improve six key interactive skills of caregivers in early child care centers: Sensitive responsiveness, respect…

  19. The Effects of Student Coaching: An Evaluation of a Randomized Experiment in Student Advising

    ERIC Educational Resources Information Center

    Bettinger, Eric P.; Baker, Rachel B.

    2014-01-01

    College graduation rates often lag behind college attendance rates. One theory as to why students do not complete college is that they lack key information about how to be successful or fail to act on the information that they have. We present evidence from a randomized experiment which tests the effectiveness of individualized student coaching.…

  20. Symmetric Stream Cipher using Triple Transposition Key Method and Base64 Algorithm for Security Improvement

    NASA Astrophysics Data System (ADS)

    Nurdiyanto, Heri; Rahim, Robbi; Wulan, Nur

    2017-12-01

    Symmetric cryptographic algorithms are known to have many weaknesses in the encryption process compared with asymmetric algorithms; a symmetric stream cipher works by XORing the plaintext with a key. To improve the security of the symmetric stream cipher, we apply a Triple Transposition Key, developed from the Transposition Cipher, and use the Base64 algorithm as the final encryption step. Experiments show that the resulting ciphertext is of good quality and very random.
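    A toy reading of that pipeline (XOR keystream, then three transpositions, then Base64) might look like the following. The paper does not specify the exact transposition schedule, so the columnar variant and the key values here are assumptions.

```python
import base64
from itertools import cycle

def transpose(data: bytes, key: list) -> bytes:
    """Columnar transposition: write row-wise, read columns in key order;
    pads with zero bytes to fill the last row."""
    cols = len(key)
    rows = -(-len(data) // cols)
    data = data.ljust(rows * cols, b"\x00")
    grid = [data[r * cols:(r + 1) * cols] for r in range(rows)]
    return bytes(grid[r][c] for c in key for r in range(rows))

def untranspose(data: bytes, key: list) -> bytes:
    """Inverse of transpose for data whose length is a multiple of len(key)."""
    cols = len(key)
    rows = len(data) // cols
    grid = [[0] * cols for _ in range(rows)]
    it = iter(data)
    for c in key:
        for r in range(rows):
            grid[r][c] = next(it)
    return bytes(b for row in grid for b in row)

def encrypt(plaintext: bytes, xor_key: bytes, tkeys) -> bytes:
    # stream-cipher step: XOR plaintext with the repeating key
    data = bytes(p ^ k for p, k in zip(plaintext, cycle(xor_key)))
    for tkey in tkeys:              # triple transposition
        data = transpose(data, tkey)
    return base64.b64encode(data)   # Base64 finishing step

def decrypt(ciphertext: bytes, xor_key: bytes, tkeys, length: int) -> bytes:
    data = base64.b64decode(ciphertext)
    lens = [length]                 # padded length after each stage
    for tkey in tkeys:
        cols = len(tkey)
        lens.append(-(-lens[-1] // cols) * cols)
    for tkey, keep in zip(reversed(tkeys), reversed(lens[:-1])):
        data = untranspose(data, tkey)[:keep]
    return bytes(c ^ k for c, k in zip(data, cycle(xor_key)))
```

    Note that the decrypt side needs the plaintext length in order to undo the zero-padding added at each transposition stage.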

  1. Quantum cryptography with entangled photons

    PubMed

    Jennewein; Simon; Weihs; Weinfurter; Zeilinger

    2000-05-15

    By realizing a quantum cryptography system based on polarization entangled photon pairs we establish highly secure keys, because a single photon source is approximated and the inherent randomness of quantum measurements is exploited. We implement a novel key distribution scheme using Wigner's inequality to test the security of the quantum channel, and, alternatively, realize a variant of the BB84 protocol. Our system has two completely independent users separated by 360 m, and generates raw keys at rates of 400-800 bits/s with bit error rates around 3%.
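    The basis-sifting idea behind both protocol variants can be illustrated with a toy simulation. A noiseless channel and uniformly random basis choices are assumed here; a real system adds error estimation and privacy amplification.

```python
import random

def sift_keys(n_rounds, seed=0):
    """Keep only the rounds where sender and receiver happened to choose
    the same measurement basis; about half the rounds survive sifting."""
    rng = random.Random(seed)
    key_a, key_b = [], []
    for _ in range(n_rounds):
        bit = rng.randrange(2)        # random raw key bit
        basis_a = rng.randrange(2)    # sender's random basis choice
        basis_b = rng.randrange(2)    # receiver's random basis choice
        if basis_a == basis_b:        # bases announced and compared publicly
            key_a.append(bit)
            key_b.append(bit)         # noiseless channel assumed
    return key_a, key_b

key_a, key_b = sift_keys(2000)
```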

  2. A Secure Authenticated Key Exchange Protocol for Credential Services

    NASA Astrophysics Data System (ADS)

    Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki

    In this paper, we propose a leakage-resilient and proactive authenticated key exchange (called LRP-AKE) protocol for credential services, which provides not only a higher level of security against leakage of stored secrets but also secrecy of the private key with respect to the involved server. We show that the LRP-AKE protocol is provably secure in the random oracle model with a reduction to the computational Diffie-Hellman problem. In addition, we discuss some possible applications of the LRP-AKE protocol.

  3. A spatial error model with continuous random effects and an application to growth convergence

    NASA Astrophysics Data System (ADS)

    Laurini, Márcio Poletti

    2017-10-01

    We propose a spatial error model with continuous random effects based on Matérn covariance functions and apply this model for the analysis of income convergence processes (β -convergence). The use of a model with continuous random effects permits a clearer visualization and interpretation of the spatial dependency patterns, avoids the problems of defining neighborhoods in spatial econometrics models, and allows projecting the spatial effects for every possible location in the continuous space, circumventing the existing aggregations in discrete lattice representations. We apply this model approach to analyze the economic growth of Brazilian municipalities between 1991 and 2010 using unconditional and conditional formulations and a spatiotemporal model of convergence. The results indicate that the estimated spatial random effects are consistent with the existence of income convergence clubs for Brazilian municipalities in this period.

  4. Parameter identification using a creeping-random-search algorithm

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.

    1971-01-01

    A creeping-random-search algorithm is applied to different types of problems in the field of parameter identification. The studies are intended to demonstrate that a random-search algorithm can be applied successfully to these various problems, which often cannot be handled by conventional deterministic methods, and, also, to introduce methods that speed convergence to an extremal of the problem under investigation. Six two-parameter identification problems with analytic solutions are solved, and two application problems are discussed in some detail. Results of the study show that a modified version of the basic creeping-random-search algorithm chosen does speed convergence in comparison with the unmodified version. The results also show that the algorithm can successfully solve problems that contain limits on state or control variables, inequality constraints (both independent and dependent, and linear and nonlinear), or stochastic models.
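    In outline, a creeping random search perturbs the current best estimate, keeps improvements, and shrinks the perturbation radius as progress stalls. The following is a minimal sketch under those assumptions; the shrink schedule and batch size are illustrative choices, not the report's exact algorithm.

```python
import random

def creeping_random_search(f, x0, step=1.0, shrink=0.5,
                           trials_per_batch=50, levels=6, seed=1):
    """Minimize f by trying random perturbations of the current best point;
    when a whole batch fails to improve, shrink the step size (the
    'creeping' modification that speeds convergence)."""
    rng = random.Random(seed)
    best, fbest = list(x0), f(x0)
    for _ in range(levels):
        improved = True
        while improved:                      # keep batching at this step size
            improved = False
            for _ in range(trials_per_batch):
                cand = [xi + rng.uniform(-step, step) for xi in best]
                fc = f(cand)
                if fc < fbest:
                    best, fbest, improved = cand, fc, True
        step *= shrink                       # creep: refine the search radius
    return best, fbest
```

    Because only function values are compared, the same loop works for problems with constraints or stochastic models where gradient-based methods fail; infeasible candidates can simply be assigned an infinite cost.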

  5. Quantum cryptography and applications in the optical fiber network

    NASA Astrophysics Data System (ADS)

    Luo, Yuhui

    2005-09-01

    Quantum cryptography, as part of quantum information and communications, can provide absolute security for information transmission because it is established on the fundamental laws of quantum theory, such as the principle of uncertainty, No-cloning theorem and quantum entanglement. In this thesis research, a novel scheme to implement quantum key distribution based on multiphoton entanglement with a new protocol is proposed. Its advantages are: a larger information capacity can be obtained with a longer transmission distance and the detection of multiple photons is easier than that of a single photon. The security and attacks pertaining to such a system are also studied. Next, a quantum key distribution over wavelength division multiplexed (WDM) optical fiber networks is realized. Quantum key distribution in networks is a long-standing problem for practical applications. Here we combine quantum cryptography and WDM to solve this problem because WDM technology is universally deployed in the current and next generation fiber networks. The ultimate target is to deploy quantum key distribution over commercial networks. The problems arising from the networks are also studied in this part. Then quantum key distribution in multi-access networks using wavelength routing technology is investigated in this research. For the first time, quantum cryptography for multiple individually targeted users has been successfully implemented in sharp contrast to that using the indiscriminating broadcasting structure. It overcomes the shortcoming that every user in the network can acquire the quantum key signals intended to be exchanged between only two users. Furthermore, a more efficient scheme of quantum key distribution is adopted, hence resulting in a higher key rate. Lastly, a quantum random number generator based on quantum optics has been experimentally demonstrated. 
This device is a key component for quantum key distribution as it can create truly random numbers, which is an essential requirement to perform quantum key distribution. This new generator is composed of a single optical fiber coupler with fiber pigtails, which can be easily used in optical fiber communications.

  6. Analysis of random drop for gateway congestion control. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Hashem, Emam Salaheddin

    1989-01-01

    Lately, the growing demand on the Internet has prompted the need for more effective congestion control policies. Currently, no gateway policy is used to relieve and signal congestion, which leads to unfair service to individual users and a degradation of overall network performance. Network simulation was used to illustrate the character of Internet congestion and its causes. A newly proposed gateway congestion control policy, called Random Drop, was considered as a promising solution to the pressing problem. Random Drop relieves resource congestion upon buffer overflow by choosing a random packet from the service queue to be dropped. The random choice should result in a drop distribution proportional to the bandwidth distribution among all contending TCP connections, thus applying the necessary fairness. Nonetheless, the simulation experiments demonstrate several shortcomings with this policy. Because Random Drop is a congestion control policy, which is not applied until congestion has already occurred, it usually results in a high drop rate that hurts too many connections, including well-behaved ones. Even though the number of packets dropped differs from one connection to another depending on buffer utilization upon overflow, the TCP recovery overhead is high enough to neutralize these differences, causing unfair congestion penalties. Besides, the drop distribution itself is an inaccurate representation of the average bandwidth distribution, missing much important information about bandwidth utilization between buffer overflow events. A modification of Random Drop to do congestion avoidance by applying the policy early was also proposed. Early Random Drop has the advantage of avoiding the high drop rate of buffer overflow. The early application of the policy removes the pressure of congestion relief and allows more accurate signaling of congestion. To be used effectively, algorithms for the dynamic adjustment of the parameters of Early Random Drop to suit the current network load must still be developed.
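    The two policies compared in the thesis can be captured in a small queue sketch; the capacity, threshold and drop probability below are illustrative assumptions.

```python
import random

class RandomDropGateway:
    """Sketch of the two policies: classic Random Drop evicts a random
    queued packet only on buffer overflow; Early Random Drop additionally
    drops arriving packets with small probability once the queue length
    passes a threshold, signalling congestion before the buffer fills."""

    def __init__(self, capacity, threshold, drop_prob, seed=0):
        self.q = []
        self.capacity = capacity
        self.threshold = threshold
        self.drop_prob = drop_prob
        self.rng = random.Random(seed)
        self.dropped = 0

    def enqueue(self, pkt):
        if len(self.q) >= self.capacity:
            # classic Random Drop: evict a random queued packet on overflow
            self.q.pop(self.rng.randrange(len(self.q)))
            self.dropped += 1
        elif len(self.q) > self.threshold and self.rng.random() < self.drop_prob:
            # Early Random Drop: probabilistically refuse the arrival
            self.dropped += 1
            return False
        self.q.append(pkt)
        return True
```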

  7. Conic Sampling: An Efficient Method for Solving Linear and Quadratic Programming by Randomly Linking Constraints within the Interior

    PubMed Central

    Serang, Oliver

    2012-01-01

    Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics. PMID:22952741

  8. A hybrid-type quantum random number generator

    NASA Astrophysics Data System (ADS)

    Hai-Qiang, Ma; Wu, Zhu; Ke-Jin, Wei; Rui-Xue, Li; Hong-Wei, Liu

    2016-05-01

    This paper proposes a well-performing hybrid-type truly quantum random number generator based on the time interval between two independent single-photon detection signals, which is practical and intuitive, and generates the initial random number sources from a combination of multiple existing random number sources. A time-to-amplitude converter and multichannel analyzer are used for qualitative analysis to demonstrate that each and every step is random. Furthermore, a carefully designed data acquisition system is used to obtain a high-quality random sequence. Our scheme is simple and proves that the random number bit rate can be dramatically increased to satisfy practical requirements. Project supported by the National Natural Science Foundation of China (Grant Nos. 61178010 and 11374042), the Fund of State Key Laboratory of Information Photonics and Optical Communications (Beijing University of Posts and Telecommunications), China, and the Fundamental Research Funds for the Central Universities of China (Grant No. bupt2014TS01).
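    The time-interval idea reduces to comparing successive detection gaps. A software stand-in might look like the following, with simulated exponential gaps in place of real single-photon detection signals.

```python
import random

def intervals_to_bits(intervals):
    """Compare successive detection-gap pairs (t1, t2): t1 < t2 gives 0,
    t1 > t2 gives 1, and ties are discarded. Comparing two i.i.d. gaps
    yields unbiased bits even if the detection rate drifts slowly."""
    bits = []
    for i in range(0, len(intervals) - 1, 2):
        t1, t2 = intervals[i], intervals[i + 1]
        if t1 != t2:
            bits.append(0 if t1 < t2 else 1)
    return bits

rng = random.Random(7)
# stand-in for single-photon detections: exponentially distributed gaps
gaps = [rng.expovariate(1e6) for _ in range(10000)]
bits = intervals_to_bits(gaps)
```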

  9. Assessing Key Competences across the Curriculum--And Europe

    ERIC Educational Resources Information Center

    Pepper, David

    2011-01-01

    The development of key competences for lifelong learning has been an important policy imperative for EU Member States. The European Reference Framework of key competences (2006) built on previous developments by the OECD, UNESCO and Member States themselves. It defined key competences as knowledge, skills and attitudes applied appropriately to…

  10. Continuous-variable quantum authentication of physical unclonable keys

    NASA Astrophysics Data System (ADS)

    Nikolopoulos, Georgios M.; Diamanti, Eleni

    2017-04-01

    We propose a scheme for authentication of physical keys that are materialized by optical multiple-scattering media. The authentication relies on the optical response of the key when probed by randomly selected coherent states of light, and the use of standard wavefront-shaping techniques that direct the scattered photons coherently to a specific target mode at the output. The quadratures of the electromagnetic field of the scattered light at the target mode are analysed using a homodyne detection scheme, and the acceptance or rejection of the key is decided upon the outcomes of the measurements. The proposed scheme can be implemented with current technology and offers collision resistance and robustness against key cloning.

  11. Field trial of differential-phase-shift quantum key distribution using polarization independent frequency up-conversion detectors.

    PubMed

    Honjo, T; Yamamoto, S; Yamamoto, T; Kamada, H; Nishida, Y; Tadanaga, O; Asobe, M; Inoue, K

    2007-11-26

    We report a field trial of differential phase shift quantum key distribution (QKD) using polarization independent frequency up-conversion detectors. A frequency up-conversion detector is a promising device for achieving a high key generation rate when combined with a high clock rate QKD system. However, its polarization dependence prevents it from being applied to practical QKD systems. In this paper, we employ a modified polarization diversity configuration to eliminate the polarization dependence. Applying this method, we performed a long-term stability test using a 17.6-km installed fiber. We successfully demonstrated stable operation for 6 hours and achieved a sifted key generation rate of 120 kbps and an average quantum bit error rate of 3.14 %. The sifted key generation rate was not the estimated value but the effective value, which means that the sifted key was continuously generated at a rate of 120 kbps for 6 hours.

  12. Continuous-time random-walk model for financial distributions

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume; Montero, Miquel; Weiss, George H.

    2003-02-01

    We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known. These are the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar-deutsche mark futures exchange, finding good agreement between theory and the observed data.
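    Once the two densities are specified, simulating the walk is direct. The sketch below uses exponential pauses and Gaussian jumps as stand-ins for the empirically fitted densities.

```python
import random

def ctrw_path(n_jumps, pause_rate, jump_sigma, seed=0):
    """Continuous-time random walk: exponential pausing times between
    jumps and Gaussian jump magnitudes; returns (times, positions)."""
    rng = random.Random(seed)
    t, x = 0.0, 0.0
    times, xs = [0.0], [0.0]
    for _ in range(n_jumps):
        t += rng.expovariate(pause_rate)   # draw from pausing-time density
        x += rng.gauss(0.0, jump_sigma)    # draw from jump-magnitude density
        times.append(t)
        xs.append(x)
    return times, xs
```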

  13. How random is the random forest? Random forest algorithm on the service of structural imaging biomarkers for Alzheimer's disease: from Alzheimer's disease neuroimaging initiative (ADNI) database.

    PubMed

    Dimitriadis, Stavros I; Liparas, Dimitris

    2018-06-01

    Neuroinformatics is a fascinating research field that applies computational models and analytical tools to high-dimensional experimental neuroscience data for a better understanding of how the brain functions or dysfunctions in brain diseases. Neuroinformaticians work at the intersection of neuroscience and informatics, supporting the integration of various sub-disciplines (behavioural neuroscience, genetics, cognitive psychology, etc.) working on brain research. Neuroinformaticians are the pathway of information exchange between informaticians and clinicians for a better understanding of the outcome of computational models and the clinical interpretation of the analysis. Machine learning is one of the most significant computational developments of the last decade, giving tools to neuroinformaticians and ultimately to radiologists and clinicians for automatic and early diagnosis and prognosis of brain disease. The random forest (RF) algorithm has been successfully applied to high-dimensional neuroimaging data for feature reduction, and also to classify the clinical label of a subject using single or multi-modal neuroimaging datasets. Our aim was to review the studies where RF was applied to correctly predict Alzheimer's disease (AD) and the conversion from mild cognitive impairment (MCI), and to assess its robustness to overfitting, outliers and handling of non-linear data. Finally, we describe our RF-based model that earned 1st position in an international challenge for automated prediction of MCI from MRI data.
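    The two RF ingredients the review highlights, bootstrap aggregation and random feature selection, can be illustrated with a deliberately tiny pure-Python forest of depth-1 trees. This is a didactic toy, not the ADNI pipeline; real work would use a library such as scikit-learn.

```python
import random

def stump_fit(X, y, feat):
    """Find the best threshold/direction on one feature (a depth-1 'tree')."""
    best = (-1.0, 0.0, 1)
    for t in sorted({row[feat] for row in X}):
        for sign in (1, -1):
            preds = [1 if sign * (row[feat] - t) > 0 else 0 for row in X]
            acc = sum(p == yi for p, yi in zip(preds, y)) / len(y)
            if acc > best[0]:
                best = (acc, t, sign)
    return feat, best[1], best[2]

def forest_fit(X, y, n_trees=25, seed=0):
    """Bagging plus random feature choice: each stump is trained on a
    bootstrap resample and a randomly chosen feature."""
    rng = random.Random(seed)
    n, trees = len(X), []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample
        feat = rng.randrange(len(X[0]))              # random feature
        trees.append(stump_fit([X[i] for i in idx], [y[i] for i in idx], feat))
    return trees

def forest_predict(trees, x):
    """Majority vote over all stumps."""
    votes = sum(1 if s * (x[f] - t) > 0 else 0 for f, t, s in trees)
    return 1 if 2 * votes >= len(trees) else 0
```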

  14. Digital Sound Encryption with Logistic Map and Number Theoretic Transform

    NASA Astrophysics Data System (ADS)

    Satria, Yudi; Gabe Rizky, P. H.; Suryadi, MT

    2018-03-01

    Digital sound encryption in the frequency domain has limitations; a Number Theoretic Transform based on the field GF(2^521 - 1) improves on and solves that problem. The algorithm for this sound encryption is based on a combination of a chaos function and the Number Theoretic Transform. The chaos function used in this paper is the logistic map. The trials and simulations were conducted using 5 different digital sound files in WAV format as test data, each simulated at least 100 times. The resulting key stream is random, as verified by 15 NIST randomness tests. The key space formed is very large, at more than 10^469. The processing speed of the encryption algorithm is slightly affected by the Number Theoretic Transform.
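    The chaotic-keystream half of the design can be sketched as follows. The byte quantisation and parameter values are assumptions, and the NTT stage is omitted.

```python
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Iterate the logistic map x -> r*x*(1-x) and quantise each iterate
    to one key byte."""
    x = x0
    out = bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR sound samples (e.g. raw PCM bytes) with the keystream."""
    return bytes(d ^ k for d, k in zip(data, key))
```

    A tiny change in the initial value x0 produces a completely different keystream, which is the key-sensitivity property chaos-based ciphers rely on.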

  15. Volume hologram with random encoded reference beam for secure data encryption

    NASA Astrophysics Data System (ADS)

    Markov, Vladimir B.; Weber, David C.; Trolinger, James D.

    2000-04-01

    A method is presented to store biometric and/or other important information on an ID card in the form of a Card Hologram that cannot be read or duplicated without the use of a special Key Hologram that is secured inside an automated reader. The Key Hologram produces the unique wavefront required to release the information contained in a complex, 3D diffraction pattern recorded in a volume hologram attached to the card. Experimental results are presented in which the image of an Air Force resolution target is recorded and reconstructed in a volume material using a random speckle wavefront; the image cannot be viewed using a simple wavefront such as a collimated or diverging laser beam.

  16. Random encoded reference beam for secure data storage in a holographic memory

    NASA Astrophysics Data System (ADS)

    Markov, Vladimir B.; Weber, David C.

    2000-11-01

    A method is presented to store biometric and/or other important information on an ID card in the form of a Card Hologram that cannot be read or duplicated without the use of a special Key Hologram that is secured inside of an automated reader. The Key Hologram produces the unique wavefront required to release the information contained in a complex, 3-D diffraction pattern recorded in a volume hologram attached to the card. Experimental results are presented in which the image of an Air Force resolution target is recorded and reconstructed in a volume material using a random speckle wavefront; the reconstructed image cannot be viewed using a simple wavefront such as a collimated or diverging laser beam.

  17. Minimum Winfree loop determines self-sustained oscillations in excitable Erdös-Rényi random networks.

    PubMed

    Qian, Yu; Cui, Xiaohua; Zheng, Zhigang

    2017-07-18

    The investigation of self-sustained oscillations in excitable complex networks is very important in understanding various activities in brain systems, among which the exploration of the key determinants of oscillations is a challenging task. In this paper, by investigating the influence of system parameters on self-sustained oscillations in excitable Erdös-Rényi random networks (EERRNs), the minimum Winfree loop (MWL) is revealed to be the key factor in determining the emergence of collective oscillations. Specifically, the one-to-one correspondence between the optimal connection probability (OCP) and the MWL length is exposed. Moreover, many important quantities such as the lower critical connection probability (LCCP), the OCP, and the upper critical connection probability (UCCP) are determined by the MWL. Most importantly, they can be approximately predicted by network structure analysis, which has been verified in numerical simulations. Our results will be of great importance in helping us understand the key factors determining persistent activities in biological systems.
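    For orientation, the EERRNs studied above are built on the standard Erdős-Rényi G(n, p) construction, which can be sketched as follows (graph generation only; the paper's excitable-node dynamics are not reproduced):

```python
import random

def erdos_renyi(n, p, seed=0):
    # include each of the n*(n-1)/2 possible undirected edges
    # independently with probability p
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p]
```

    Sweeping p in such a model is how quantities like the LCCP, OCP, and UCCP are located numerically.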

  18. Random Numbers and Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Scherer, Philipp O. J.

    Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
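    Plain Monte Carlo integration as described above can be sketched in a few lines (the integrand and sample count below are arbitrary illustrations):

```python
import random

def mc_integrate(f, a, b, n, seed=0):
    # estimate the integral of f over [a, b] as (b - a) times the mean of f
    # at n uniformly random points; the error shrinks like 1/sqrt(n)
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n
```

    Importance sampling, mentioned above, replaces the uniform draw with a distribution concentrated on the important configurations, reducing the variance of the same estimator.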

  19. Random walker in temporally deforming higher-order potential forces observed in a financial crisis.

    PubMed

    Watanabe, Kota; Takayasu, Hideki; Takayasu, Misako

    2009-11-01

    Basic peculiarities of market price fluctuations are known to be well described by a recently developed random-walk model in a temporally deforming quadratic potential force whose center is given by a moving average of past price traces [M. Takayasu, T. Mizuno, and H. Takayasu, Physica A 370, 91 (2006)]. By analyzing high-frequency financial time series of exceptional events, such as bubbles and crashes, we confirm the appearance of higher-order potential force in the markets. We show statistical significance of its existence by applying the information criterion. This time series analysis is expected to be applied widely for detecting a nonstationary symptom in random phenomena.

  20. Overcoming the rate-distance limit of quantum key distribution without quantum repeaters.

    PubMed

    Lucamarini, M; Yuan, Z L; Dynes, J F; Shields, A J

    2018-05-01

    Quantum key distribution (QKD)[1,2] allows two distant parties to share encryption keys with security based on physical laws. Experimentally, QKD has been implemented via optical means, achieving key rates of 1.26 megabits per second over 50 kilometres of standard optical fibre[3] and of 1.16 bits per hour over 404 kilometres of ultralow-loss fibre in a measurement-device-independent configuration[4]. Increasing the bit rate and range of QKD is a formidable, but important, challenge. A related target, which is currently considered to be unfeasible without quantum repeaters[5-7], is overcoming the fundamental rate-distance limit of QKD[8]. This limit defines the maximum possible secret key rate that two parties can distil at a given distance using QKD and is quantified by the secret-key capacity of the quantum channel[9] that connects the parties. Here we introduce an alternative scheme for QKD whereby pairs of phase-randomized optical fields are first generated at two distant locations and then combined at a central measuring station. Fields imparted with the same random phase are 'twins' and can be used to distil a quantum key. The key rate of this twin-field QKD exhibits the same dependence on distance as does a quantum repeater, scaling with the square-root of the channel transmittance, irrespective of who (malicious or otherwise) is in control of the measuring station. However, unlike schemes that involve quantum repeaters, ours is feasible with current technology and presents manageable levels of noise even on 550 kilometres of standard optical fibre. This scheme is a promising step towards overcoming the rate-distance limit of QKD and greatly extending the range of secure quantum communications.
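    To see why the square-root scaling matters, one can compare the repeaterless secret-key capacity bound of a lossy channel, −log2(1 − η) bits per use, with a rate proportional to √η. This is only a back-of-envelope sketch with constant prefactors omitted; 0.2 dB/km is the usual loss figure for standard fibre.

```python
import math

def transmittance(distance_km, loss_db_per_km=0.2):
    # channel transmittance eta of a fibre link
    return 10 ** (-loss_db_per_km * distance_km / 10)

def repeaterless_bound(eta):
    # secret-key capacity of the lossy channel: -log2(1 - eta),
    # which behaves like ~1.44 * eta for small eta
    return -math.log2(1 - eta)
```

    At 550 km, η is about 1e-11, so a √η-scaling rate exceeds the linear-in-η point-to-point bound by several orders of magnitude.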

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hymel, Ross

    The Public Key (PK) FPGA software performs asymmetric authentication using the 163-bit Elliptic Curve Digital Signature Algorithm (ECDSA) on an embedded FPGA platform. A digital signature is created on user-supplied data, and communication with a host system is performed via a Serial Peripheral Interface (SPI) bus. The software includes all components necessary for signing, including a custom random number generator for key creation and SHA-256 for data hashing.
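    For orientation, the ECDSA signing equation s = k⁻¹(z + r·d) mod n can be illustrated on a tiny textbook curve, y² = x³ + 2x + 2 over GF(17) with generator G = (5, 1) of order 19. This prime-field toy (private key d, nonce k, and hash z below are made-up values) is only a sketch and is nothing like the 163-bit binary-field curve the FPGA software implements.

```python
# toy curve parameters: y^2 = x^3 + A*x + 2 over GF(P_), subgroup order N_
P_, A_, N_ = 17, 2, 19
G_ = (5, 1)

def inv(a, m):
    return pow(a, -1, m)          # modular inverse (Python 3.8+)

def ec_add(P, Q):
    # point addition on the curve; None is the point at infinity
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_ == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + A_) * inv(2 * y1, P_) % P_
    else:
        lam = (y2 - y1) * inv(x2 - x1, P_) % P_
    x3 = (lam * lam - x1 - x2) % P_
    return (x3, (lam * (x1 - x3) - y1) % P_)

def ec_mul(k, P):
    # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

def sign(z, d, k):
    # z: message hash, d: private key, k: fresh random nonce (must never repeat)
    r = ec_mul(k, G_)[0] % N_
    s = inv(k, N_) * (z + r * d) % N_
    return r, s

def verify(z, sig, Q):
    r, s = sig
    w = inv(s, N_)
    X = ec_add(ec_mul(z * w % N_, G_), ec_mul(r * w % N_, Q))
    return X is not None and X[0] % N_ == r
```

    The nonce k is exactly what the record's custom random number generator must supply with high quality: a biased or reused k leaks the private key.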

  2. A Study of the Relationship between Key Factors of Academic Innovation and Faculties' Teaching Goals--The Mediatory Role of Knowledge

    ERIC Educational Resources Information Center

    Mohammadi, Mehdi; Marzooghi, Rahmatullah; Dehghani, Fatemeh

    2017-01-01

    This research studies the relationship between key factors of academic innovation and faculties' teaching goals, with the mediatory role of their pedagogical, technological and content knowledge. The statistical population included faculty members of Shiraz University. By simple random sampling, 127 faculty members…

  3. Color image encryption by using Yang-Gu mixture amplitude-phase retrieval algorithm in gyrator transform domain and two-dimensional Sine logistic modulation map

    NASA Astrophysics Data System (ADS)

    Sui, Liansheng; Liu, Benqing; Wang, Qiang; Li, Ye; Liang, Junli

    2015-12-01

    A color image encryption scheme is proposed based on the Yang-Gu mixture amplitude-phase retrieval algorithm and a two-coupled logistic map in the gyrator transform domain. First, the color plaintext image is decomposed into red, green and blue components, which are scrambled individually by three random sequences generated using the two-dimensional Sine logistic modulation map. Second, each scrambled component is encrypted into a real-valued function with a stationary white-noise distribution in the iterative amplitude-phase retrieval process in the gyrator transform domain, and the three resulting functions are taken as the red, green and blue channels of the color ciphertext image. The ciphertext image is thus a real-valued function and more convenient to store and transmit. In the encryption and decryption processes, a chaotic random phase mask generated from the logistic map is employed as the phase key, which means that only the initial values are used as the private key and key management is highly convenient. Meanwhile, the security of the cryptosystem is greatly enhanced because of the high sensitivity of the private keys. Simulation results are presented to demonstrate the security and robustness of the proposed scheme.
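    A common way such schemes turn a chaotic sequence into a scrambling permutation is rank ordering (argsort). This is a sketch of that generic step only; the paper's 2D Sine logistic modulation map and gyrator-domain retrieval are not reproduced here.

```python
def permutation_from_sequence(seq):
    # argsort of the chaotic sequence: the permutation that sorts it
    return sorted(range(len(seq)), key=seq.__getitem__)

def scramble(pixels, perm):
    return [pixels[i] for i in perm]

def unscramble(scrambled, perm):
    # invert the permutation to recover the original order
    out = [None] * len(perm)
    for pos, i in enumerate(perm):
        out[i] = scrambled[pos]
    return out
```

    Because the permutation is derived entirely from the chaotic sequence, only the map's initial values need to be kept as the key, which is the key-management convenience the abstract highlights.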

  4. Physical-layer security analysis of PSK quantum-noise randomized cipher in optically amplified links

    NASA Astrophysics Data System (ADS)

    Jiao, Haisong; Pu, Tao; Xiang, Peng; Zheng, Jilin; Fang, Tao; Zhu, Huatao

    2017-08-01

    The quantitative security of the quantum-noise randomized cipher (QNRC) in optically amplified links is analyzed from the perspective of physical-layer advantage. Establishing wire-tap channel models for both key and data, we derive general expressions for the secrecy capacity of the key against ciphertext-only and known-plaintext attacks, and for that of the data, which serve as the basic performance metrics. Further, the maximal achievable secrecy rate of the system is proposed, under which secrecy of both the key and the data is guaranteed. Within the same framework, the secrecy capacities of various cases can be assessed and compared. The results indicate that perfect secrecy is potentially achievable for data transmission, and an elementary principle for setting a proper number of photons and bases is given to ensure the maximal data secrecy capacity. The key security, however, is only asymptotically perfect, which tends to be the main constraint on the system's maximal secrecy rate. Moreover, by adopting cascaded optical amplification, QNRC can realize long-haul transmission at secure rates up to Gb/s, which is orders of magnitude higher than the perfect secrecy rates of other encryption systems.
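    The secrecy-capacity metrics above follow the wiretap-channel tradition; for the textbook Gaussian wiretap channel the capacity is the positive part of the gap between the legitimate and eavesdropper channels, Cs = [log2(1+SNR_B) − log2(1+SNR_E)]⁺. This generic formula is shown only for orientation, not as the paper's QNRC-specific derivation.

```python
import math

def gaussian_secrecy_capacity(snr_main, snr_eve):
    # positive part of the capacity gap between Bob's and Eve's channels,
    # in bits per channel use
    return max(0.0, math.log2(1 + snr_main) - math.log2(1 + snr_eve))
```

    Physical-layer advantage means keeping SNR_E below SNR_B, which in QNRC is enforced by quantum noise masking the ciphertext.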

  5. An Evaluation of a Behaviorally Based Social Skills Group for Individuals Diagnosed with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Jeremy A.; Milne, Christine; Taubman, Mitchell; Oppenheim-Leaf, Misty; Torres, Norma; Townley-Cochran, Donna; Leaf, Ronald; McEachin, John; Yoder, Paul

    2017-01-01

    In this study we evaluated a social skills group which employed a progressive applied behavior analysis model for individuals diagnosed with autism spectrum disorder. A randomized control trial was utilized; eight participants were randomly assigned to a treatment group and seven participants were randomly assigned to a waitlist control group. The…

  6. Under What Circumstances Does External Knowledge about the Correlation Structure Improve Power in Cluster Randomized Designs?

    ERIC Educational Resources Information Center

    Rhoads, Christopher

    2014-01-01

    Recent publications have drawn attention to the idea of utilizing prior information about the correlation structure to improve statistical power in cluster randomized experiments. Because power in cluster randomized designs is a function of many different parameters, it has been difficult for applied researchers to discern a simple rule explaining…

  7. Variations on a theme of Lander and Waterman

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speed, T.

    1997-12-01

    The original Lander and Waterman mathematical analysis was for fingerprinting random clones. Since that time, a number of variants of their theory have appeared, including ones which apply to mapping by anchoring random clones, and to non-random or directed clone mapping. The same theory is now widely used to devise random sequencing strategies. In this talk I will review these developments, and go on to discuss the theory required for directed sequencing strategies.
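    The core Lander-Waterman quantities have simple closed forms: with N clones of length L drawn from a genome of size G, the redundancy is c = NL/G, the expected covered fraction is 1 − e^(−c), and the expected number of apparent islands is N·e^(−c). The sketch below ignores the overlap-detection threshold θ of the full theory, and the numbers in the usage test are made up.

```python
import math

def lander_waterman(G, L, N):
    c = N * L / G  # coverage redundancy
    return {
        "coverage": c,
        "frac_covered": 1 - math.exp(-c),  # expected fraction of genome covered
        "islands": N * math.exp(-c),       # expected number of apparent islands
    }
```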

  8. 77 FR 38273 - Availability of Seats for the Florida Keys National Marine Sanctuary Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-27

    ...), Tourism--Upper Keys (member), and Tourism--Upper Keys (alternate). Applicants are chosen based upon their particular expertise and experience in relation to the seat for which they are applying; community and...

  9. 77 FR 5492 - Availability of Seat for the Florida Keys National Marine Sanctuary Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-03

    ...: Tourism--Lower Keys (member), and Tourism--Lower Keys (alternate). Applicants are chosen based upon their particular expertise and experience in relation to the seat for which they are applying; community and...

  10. Robust quantum data locking from phase modulation

    NASA Astrophysics Data System (ADS)

    Lupo, Cosmo; Wilde, Mark M.; Lloyd, Seth

    2014-08-01

    Quantum data locking is a uniquely quantum phenomenon that allows a relatively short key of constant size to (un)lock an arbitrarily long message encoded in a quantum state, in such a way that an eavesdropper who measures the state but does not know the key has essentially no information about the message. The application of quantum data locking in cryptography would allow one to overcome the limitations of the one-time pad encryption, which requires the key to have the same length as the message. However, it is known that the strength of quantum data locking is also its Achilles heel, as the leakage of a few bits of the key or the message may in principle allow the eavesdropper to unlock a disproportionate amount of information. In this paper we show that there exist quantum data locking schemes that can be made robust against information leakage by increasing the length of the key by a proportionate amount. This implies that a constant size key can still lock an arbitrarily long message as long as a fraction of it remains secret to the eavesdropper. Moreover, we greatly simplify the structure of the protocol by proving that phase modulation suffices to generate strong locking schemes, paving the way to optical experimental realizations. Also, we show that successful data locking protocols can be constructed using random code words, which very well could be helpful in discovering random codes for data locking over noisy quantum channels.

  11. Is Identification with School the Key Component in the "Black Box" of Education Outcomes? Evidence from a Randomized Experiment

    ERIC Educational Resources Information Center

    Fletcher, Jason M.

    2009-01-01

    In this paper, we follow up the important class size reduction randomized experiment in Tennessee in the mid 1980s (Project STAR) to attempt to further understand the long-lasting influences of early education interventions. While STAR led to large test score benefits during the intervention, these benefits quickly faded at its conclusion.…

  12. An Image Encryption Algorithm Utilizing Julia Sets and Hilbert Curves

    PubMed Central

    Sun, Yuanyuan; Chen, Lina; Xu, Rudan; Kong, Ruiqing

    2014-01-01

    Image encryption is an important and effective technique to protect image security. In this paper, a novel image encryption algorithm combining Julia sets and Hilbert curves is proposed. The algorithm utilizes the Julia sets’ parameters to generate a random sequence as the initial keys and obtains the final encryption keys by scrambling the initial keys through the Hilbert curve. The final cipher image is obtained by modulo arithmetic and a diffusion operation. The method needs only a few parameters for key generation, which greatly reduces the storage space. Moreover, because of the Julia sets’ properties, such as infiniteness and chaotic characteristics, the keys are highly sensitive even to a tiny perturbation. The experimental results indicate that the algorithm has a large key space, good statistical properties, high key sensitivity, and effective resistance to the chosen-plaintext attack. PMID:24404181
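    Hilbert-curve scrambling relies on the standard distance-to-coordinate mapping for a curve filling a 2^order × 2^order grid, sketched below with the usual iterative algorithm (the Julia-set key generation is omitted):

```python
def hilbert_d2xy(order, d):
    # convert distance d along a Hilbert curve of side 2**order to (x, y)
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:               # rotate/flip the quadrant as needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y
```

    Traversing pixels in Hilbert order preserves locality (consecutive indices are adjacent cells), which is why it is a popular scrambling path in image encryption.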

  13. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Crevillén-García, D.; Power, H.

    2017-08-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.
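    The multilevel idea rests on the telescoping identity E[P_L] = E[P_0] + Σ E[P_l − P_{l-1}], with each correction estimated from coupled fine/coarse samples. The sketch below uses a deliberately toy level hierarchy (truncated Taylor series of exp) in place of the paper's porous-media solver; all sample counts are illustrative.

```python
import math
import random

def mlmc_estimate(sampler, levels, n_per_level, seed=0):
    # telescoping sum: E[P_{L}] = E[P_0] + sum_{l>=1} E[P_l - P_{l-1}]
    rng = random.Random(seed)
    total = 0.0
    for l in range(levels):
        acc = 0.0
        for _ in range(n_per_level[l]):
            u = rng.random()                  # shared randomness couples the levels
            fine = sampler(l, u)
            coarse = sampler(l - 1, u) if l > 0 else 0.0
            acc += fine - coarse
        total += acc / n_per_level[l]
    return total

def taylor_exp(l, u):
    # toy level hierarchy: exp(u) truncated to 2**(l+1) Taylor terms
    return sum(u ** k / math.factorial(k) for k in range(2 ** (l + 1)))
```

    Because the correction variances shrink with level, most samples go to the cheap coarse levels, which is the source of the cost savings the study measures.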

  14. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media.

    PubMed

    Crevillén-García, D; Power, H

    2017-08-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen-Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error.

  15. Multilevel and quasi-Monte Carlo methods for uncertainty quantification in particle travel times through random heterogeneous porous media

    PubMed Central

    Power, H.

    2017-01-01

    In this study, we apply four Monte Carlo simulation methods, namely, Monte Carlo, quasi-Monte Carlo, multilevel Monte Carlo and multilevel quasi-Monte Carlo to the problem of uncertainty quantification in the estimation of the average travel time during the transport of particles through random heterogeneous porous media. We apply the four methodologies to a model problem where the only input parameter, the hydraulic conductivity, is modelled as a log-Gaussian random field by using direct Karhunen–Loève decompositions. The random terms in such expansions represent the coefficients in the equations. Numerical calculations demonstrating the effectiveness of each of the methods are presented. A comparison of the computational cost incurred by each of the methods for three different tolerances is provided. The accuracy of the approaches is quantified via the mean square error. PMID:28878974

  16. Document Set Differentiability Analyzer v. 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, Thor D.

    Software is a JMP Scripting Language (JSL) script designed to evaluate the differentiability of a set of documents that exhibit some conceptual commonalities but are expected to describe substantially different – thus differentiable – categories. The script imports the document set, a subset of which may be partitioned into an additions pool. The bulk of the documents form a basis pool. Text analysis is applied to the basis pool to extract a mathematical representation of its conceptual content, referred to as the document concept space. A bootstrapping approach is applied to that mathematical representation in order to generate a representation of a large population of randomly designed documents that could be written within the concept space, notably without actually writing the text of those documents. The Kolmogorov-Smirnov test is applied to determine whether the basis pool document set exhibits superior differentiation relative to the randomly designed virtual documents produced by bootstrapping. If an additions pool exists, the documents are incrementally added to the basis pool, choosing the best differentiated remaining document at each step. In this manner the impact of additional categories on overall document set differentiability may be assessed. The software was developed to assess the differentiability of job description document sets. Differentiability is key to meaningful categorization. Poor job differentiation may have economic, ethical, and/or legal implications for an organization. Job categories are used in the assignment of market-based salaries; consequently, poor differentiation of job duties may set the stage for legal challenges if very similar jobs pay differently depending on title, a circumstance that also invites economic waste. The software can be applied to ensure job description set differentiability, reducing legal, economic, and ethical risks to an organization and its people.
    The extraction of the conceptual space to a mathematical representation enables identification of exceedingly similar documents. In the event of redundancy, two jobs may be collapsed into one. If in the judgment of the subject matter experts the jobs are truly different, the conceptual similarities are highlighted, inviting inclusion of appropriate descriptive content to explicitly characterize those differences. When additional job categories may be needed as the organization changes, the software enables evaluation of proposed additions to ensure that the resulting document set remains adequately differentiated.
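    The Kolmogorov-Smirnov comparison at the heart of the script can be sketched as a two-sample ECDF gap (generic statistic only; the JSL text-analysis pipeline is not reproduced):

```python
import bisect

def ks_statistic(sample_a, sample_b):
    # two-sample KS statistic: the largest vertical gap between the two ECDFs
    sa, sb = sorted(sample_a), sorted(sample_b)
    def ecdf(s, x):
        return bisect.bisect_right(s, x) / len(s)
    return max(abs(ecdf(sa, x) - ecdf(sb, x)) for x in sa + sb)
```

    Here the samples would be pairwise document-similarity scores: one set from the real basis pool, one from the bootstrapped virtual documents.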

  17. Applied Technology Proficiency of High School Students in Applied and Traditional Courses

    ERIC Educational Resources Information Center

    Field, Dennis W.

    2003-01-01

    This investigation compares applied technology skill levels of high school students enrolled in various applied and comparable traditional courses, particularly Principles of Technology and physics courses respectively. Outcomes from ACT's Applied Technology Work Keys[R] assessment test were used as a measure of applied technology skill levels.…

  18. Compressive hyperspectral sensor for LWIR gas detection

    NASA Astrophysics Data System (ADS)

    Russell, Thomas A.; McMackin, Lenore; Bridge, Bob; Baraniuk, Richard

    2012-06-01

    Focal plane arrays with associated electronics and cooling are a substantial portion of the cost, complexity, size, weight, and power requirements of Long-Wave IR (LWIR) imagers. Hyperspectral LWIR imagers add significant data volume burden as they collect a high-resolution spectrum at each pixel. We report here on an LWIR Hyperspectral Sensor that applies Compressive Sensing (CS) in order to achieve benefits in these areas. The sensor applies single-pixel detection technology demonstrated by Rice University. The single-pixel approach uses a Digital Micro-mirror Device (DMD) to reflect and multiplex the light from a random assortment of pixels onto the detector. This is repeated for a number of measurements much less than the total number of scene pixels. We have extended this architecture to hyperspectral LWIR sensing by inserting a Fabry-Perot spectrometer in the optical path. This compressive hyperspectral imager collects all three dimensions on a single detection element, greatly reducing the size, weight and power requirements of the system relative to traditional approaches, while also reducing data volume. The CS architecture also supports innovative adaptive approaches to sensing, as the DMD device allows control over the selection of spatial scene pixels to be multiplexed on the detector. We are applying this advantage to the detection of plume gases, by adaptively locating and concentrating target energy. A key challenge in this system is the diffraction loss produced by the DMD in the LWIR. We report the results of testing DMD operation in the LWIR, as well as system spatial and spectral performance.
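    The single-pixel measurement model described above amounts to one inner product per random DMD mask, y_k = ⟨mask_k, scene⟩. The sketch below shows only this forward model with a toy scene; recovering the scene from fewer measurements than pixels requires a sparse-reconstruction solver, which is omitted.

```python
import random

def single_pixel_measure(scene, n_meas, seed=0):
    # each measurement: the detector integrates the scene pixels that a
    # random binary DMD mask reflects toward it
    rng = random.Random(seed)
    masks = [[rng.randint(0, 1) for _ in scene] for _ in range(n_meas)]
    y = [sum(m * s for m, s in zip(mask, scene)) for mask in masks]
    return masks, y
```

    Adaptive sensing, as used here for plume gases, corresponds to choosing the masks based on earlier measurements instead of purely at random.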

  19. Physical layer one-time-pad data encryption through synchronized semiconductor laser networks

    NASA Astrophysics Data System (ADS)

    Argyris, Apostolos; Pikasis, Evangelos; Syvridis, Dimitris

    2016-02-01

    Semiconductor lasers (SL) have been proven to be a key device in the generation of ultrafast true random bit streams. Their potential to emit chaotic signals with desirable statistics establishes them as a low-cost solution to cover various needs, from large-volume key generation to real-time encrypted communications. Usually, only undemanding post-processing is needed to convert the acquired analog timeseries to digital sequences that pass all established tests of randomness. A novel architecture that can generate and exploit these true random sequences is a fiber network in which the nodes are semiconductor lasers coupled and synchronized to a central hub laser. In this work we show experimentally that laser nodes in such a star network topology can synchronize with each other through complex broadband signals that are the seed to true random bit sequences (TRBS) generated at several Gb/s. The potential for each node to access random bit streams generated in real time and synchronized with the rest of the nodes through the fiber-optic network makes it possible to implement a one-time-pad encryption protocol that mixes the synchronized true random bit sequence with real data at Gb/s rates. Forward-error correction methods are used to reduce the errors in the TRBS and the final error rate at the data-decoding level. An appropriate selection of the sampling methodology and of the physical properties of the chaotic seed signal through which the network locks into synchronization allows error-free performance.
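    The mixing step of a one-time pad is a plain XOR of data with the shared random bit stream; decryption is the same operation with the same key. A minimal sketch (the laser-generated TRBS is replaced here by an arbitrary byte string):

```python
def one_time_pad(data: bytes, key: bytes) -> bytes:
    # XOR each data byte with the matching key byte; with a truly random,
    # never-reused key at least as long as the data, this is
    # information-theoretically secure
    if len(key) < len(data):
        raise ValueError("key must be at least as long as the data")
    return bytes(d ^ k for d, k in zip(data, key))
```

    The scheme above supplies exactly what this construction consumes: a key stream as fast as the data stream, identical at both nodes.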

  20. Lévy walks

    NASA Astrophysics Data System (ADS)

    Zaburdaev, V.; Denisov, S.; Klafter, J.

    2015-04-01

    Random walk is a fundamental concept with applications ranging from quantum physics to econometrics. Remarkably, one specific model of random walks appears to be ubiquitous across many fields as a tool to analyze transport phenomena in which the dispersal process is faster than dictated by Brownian diffusion. The Lévy-walk model combines two key features, the ability to generate anomalously fast diffusion and a finite velocity of a random walker. Recent results in optics, Hamiltonian chaos, cold atom dynamics, biophysics, and behavioral science demonstrate that this particular type of random walk provides significant insight into complex transport phenomena. This review gives a self-consistent introduction to Lévy walks, surveys their existing applications, including latest advances, and outlines further perspectives.
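    The two key features named above, heavy-tailed step lengths and finite walker velocity, can be sketched in one dimension as follows (the Pareto exponent and speed are illustrative choices):

```python
import random

def levy_walk(n_steps, alpha=1.5, speed=1.0, seed=0):
    # 1D Levy walk: step lengths from a Pareto(alpha) tail, P(L > l) ~ l**(-alpha),
    # each step traversed at finite speed in a random direction
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    path = [(0.0, 0.0)]
    for _ in range(n_steps):
        length = rng.paretovariate(alpha)
        direction = rng.choice((-1.0, 1.0))
        x += direction * length
        t += length / speed        # long steps take proportionally long: finite velocity
        path.append((t, x))
    return path
```

    The coupling of step length and duration is what distinguishes a Lévy walk from a Lévy flight, where arbitrarily long jumps are instantaneous.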

  1. Effect of Expanding Medicaid for Parents on Children’s Health Insurance Coverage

    PubMed Central

    DeVoe, Jennifer E.; Marino, Miguel; Angier, Heather; O’Malley, Jean P.; Crawford, Courtney; Nelson, Christine; Tillotson, Carrie J.; Bailey, Steffani R.; Gallia, Charles; Gold, Rachel

    2016-01-01

    IMPORTANCE In the United States, health insurance is not universal. Observational studies show an association between uninsured parents and children. This association persisted even after expansions in child-only public health insurance. Oregon’s randomized Medicaid expansion for adults, known as the Oregon Experiment, created a rare opportunity to assess causality between parent and child coverage. OBJECTIVE To estimate the effect on a child’s health insurance coverage status when (1) a parent randomly gains access to health insurance and (2) a parent obtains coverage. DESIGN, SETTING, AND PARTICIPANTS Oregon Experiment randomized natural experiment assessing the results of Oregon’s 2008 Medicaid expansion. We used generalized estimating equation models to examine the longitudinal effect of a parent randomly selected to apply for Medicaid on their child’s Medicaid or Children’s Health Insurance Program (CHIP) coverage (intent-to-treat analyses). We used per-protocol analyses to understand the impact on children’s coverage when a parent was randomly selected to apply for and obtained Medicaid. Participants included 14 409 children aged 2 to 18 years whose parents participated in the Oregon Experiment. EXPOSURES For intent-to-treat analyses, the date a parent was selected to apply for Medicaid was considered the date the child was exposed to the intervention. In per-protocol analyses, exposure was defined as whether a selected parent obtained Medicaid. MAIN OUTCOMES AND MEASURES Children’s Medicaid or CHIP coverage, assessed monthly and in 6-month intervals relative to their parent’s selection date. RESULTS In the immediate period after selection, the number of covered children whose parents were selected to apply increased significantly from 3830 (61.4%) to 4152 (66.6%), compared with a nonsignificant change from 5049 (61.8%) to 5044 (61.7%) for children whose parents were not selected to apply. 
Children whose parents were randomly selected to apply for Medicaid had 18% higher odds of being covered in the first 6 months after parent’s selection compared with children whose parents were not selected (adjusted odds ratio [AOR] = 1.18; 95% CI, 1.10–1.27). The effect remained significant during months 7 to 12 (AOR = 1.11; 95% CI, 1.03–1.19); months 13 to 18 showed a positive but not significant effect (AOR = 1.07; 95% CI, 0.99–1.14). Children whose parents were selected and obtained coverage had more than double the odds of having coverage compared with children whose parents were not selected and did not gain coverage (AOR = 2.37; 95% CI, 2.14–2.64). CONCLUSIONS AND RELEVANCE Children’s odds of having Medicaid or CHIP coverage increased when their parents were randomly selected to apply for Medicaid. Children whose parents were selected and subsequently obtained coverage benefited most. This study demonstrates a causal link between parents’ access to Medicaid coverage and their children’s coverage. PMID:25561041
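    For reference, the unadjusted form of the odds-ratio-with-CI estimates reported above has a simple closed form from a 2×2 table; the study's AORs come from adjusted GEE models, which this sketch does not reproduce, and the counts in the usage test are made up.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # 2x2 table: a = exposed with outcome, b = exposed without,
    # c = unexposed with outcome, d = unexposed without
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```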

  2. The effect of neighborhood-based community organizing: results from the Seattle Minority Youth Health Project.

    PubMed

    Cheadle, A; Wagner, E; Walls, M; Diehr, P; Bell, M; Anderman, C; McBride, C; Catalano, R F; Pettigrew, E; Simmons, R; Neckerman, H

    2001-08-01

    To evaluate the effect of a community mobilization and youth development strategy to prevent drug abuse, violence, and risky sexual activity. Primary surveys of youth, parents, and key neighborhood leaders were carried out at baseline (1994) and at the end of the intervention period (1997). The study took place in four intervention and six control neighborhoods in Seattle. The study was designed as a randomized controlled trial with neighborhood as the unit of randomization. The intervention consisted of a paid community organizer in each neighborhood who recruited a group of residents to serve as a community action board. Key variables included perceptions of neighborhood mobilization by youth, parents, and key neighborhood leaders. Youth surveys were self-administered during school hours. Parent and neighborhood leader surveys were conducted over the phone by trained interviewers. Survey results showed that mobilization increased to the same degree in both intervention and control neighborhoods with no evidence of an overall intervention effect. There did appear to be a relative increase in mobilization in the neighborhood with the highest level of intervention activity. This randomized study failed to demonstrate a measurable effect for a community mobilization intervention. It is uncertain whether the negative finding was because of a lack of strength of the interventions or problems detecting intervention effects using individual-level closed-end surveys.

  3. Data-Division-Specific Robustness and Power of Randomization Tests for ABAB Designs

    ERIC Educational Resources Information Center

    Manolov, Rumen; Solanas, Antonio; Bulte, Isis; Onghena, Patrick

    2010-01-01

    This study deals with the statistical properties of a randomization test applied to an ABAB design in cases where the desirable random assignment of the points of change in phase is not possible. To obtain information about each possible data division, the authors carried out a conditional Monte Carlo simulation with 100,000 samples for each…
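    The approach can be sketched concretely. Assuming the test statistic is the absolute difference between A-phase and B-phase means (one of several statistics such a study might use; the choice here is illustrative), a Monte Carlo randomization test over admissible phase-change points looks like:

```python
import random

def abab_randomization_test(data, change_points, n_min=3, n_rand=10000, seed=0):
    """Monte Carlo randomization test for an ABAB design.

    `change_points` are the indices where phases actually changed.
    The randomization distribution is built from admissible triples of
    change points (every phase keeps at least `n_min` observations).
    """
    rng = random.Random(seed)
    n = len(data)

    def statistic(c1, c2, c3):
        a = data[:c1] + data[c2:c3]            # both A phases
        b = data[c1:c2] + data[c3:]            # both B phases
        return abs(sum(a) / len(a) - sum(b) / len(b))

    # All admissible change-point triples.
    triples = [(c1, c2, c3)
               for c1 in range(n_min, n - 3 * n_min + 1)
               for c2 in range(c1 + n_min, n - 2 * n_min + 1)
               for c3 in range(c2 + n_min, n - n_min + 1)]

    observed = statistic(*change_points)
    draws = [statistic(*rng.choice(triples)) for _ in range(n_rand)]
    p_value = sum(d >= observed for d in draws) / n_rand
    return observed, p_value
```

    With 20 observations and `n_min = 3` there are only 165 admissible triples, so Monte Carlo sampling is purely illustrative; exhaustive enumeration of the randomization distribution would also be feasible.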

  4. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    PubMed

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently assign newly recruited patients to treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence of changing the prior distributions on the design. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when total sample size is fixed, the proposed design can obtain greater power and/or require a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
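    The abstract does not give the authors' algorithm, but the minimum-variance idea it builds on is classical Neyman allocation: assign patients to each arm in proportion to that arm's outcome standard deviation. A toy sequential sketch (all parameter values hypothetical):

```python
import random
import statistics

def neyman_rate(sd1, sd2):
    # Allocation probability for arm 1 that minimizes Var(mean1 - mean2)
    # at a fixed total sample size (Neyman allocation).
    return sd1 / (sd1 + sd2)

def toy_adaptive_trial(true_sd=(1.0, 3.0), n=600, burn_in=40, seed=1):
    """Toy sequential trial: randomize 1:1 during a burn-in, then send each
    new patient to arm 0 with the current Neyman rate from running SDs."""
    rng = random.Random(seed)
    outcomes = ([], [])
    for i in range(n):
        if i < burn_in or min(len(o) for o in outcomes) < 2:
            rate = 0.5                         # not enough data yet
        else:
            sd_hat = [statistics.stdev(o) for o in outcomes]
            rate = neyman_rate(sd_hat[0], sd_hat[1])
        arm = 0 if rng.random() < rate else 1
        outcomes[arm].append(rng.gauss(0.0, true_sd[arm]))
    return len(outcomes[0]) / n

frac_low_var_arm = toy_adaptive_trial()
```

    Here the true SDs are 1 and 3, so the long-run allocation to the low-variance arm drifts toward 1/(1+3) = 0.25 rather than 0.5, concentrating observations where they reduce the variance of the treatment contrast most.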

  5. Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.

    PubMed

    Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L

    2016-02-09

    Cluster randomized trials (CRTs) randomize participants in groups rather than as individuals, and are key tools for assessing interventions in health research when treatment contamination is likely or individual randomization is not feasible. Two major potential pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches to handling missing data and to statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and the method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) reported some missing outcome data. Of those reporting missing data, the median percentage of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling of missing data in practice remains suboptimal. Researchers and applied statisticians should use appropriate missing data methods that are valid under plausible assumptions in order to increase statistical power and reduce the possibility of bias. Sensitivity analyses with weakened assumptions about the missing data mechanism should be performed to explore the robustness of the results of the primary analysis.
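    A sketch of why ignoring clustering is such a pitfall: randomizing intact clusters inflates the variance of estimates by the design effect, 1 + (m - 1) × ICC. A minimal illustration (cluster size and ICC values hypothetical):

```python
def design_effect(mean_cluster_size, icc):
    """Variance inflation of a cluster randomized trial relative to
    individual randomization: 1 + (m - 1) * ICC."""
    return 1.0 + (mean_cluster_size - 1) * icc

def effective_sample_size(n_total, mean_cluster_size, icc):
    # Individually randomized sample size carrying the same precision.
    return n_total / design_effect(mean_cluster_size, icc)
```

    For example, 1,200 participants in clusters of 30 with ICC = 0.05 give a design effect of 2.45, i.e. roughly the information of 490 independently randomized participants; an analysis that ignores clustering overstates its precision by that factor.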

  6. Efficacy of the unified protocol for the treatment of emotional disorders in the Spanish public mental health system using a group format: study protocol for a multicenter, randomized, non-inferiority controlled trial.

    PubMed

    Osma, Jorge; Suso-Ribera, Carlos; García-Palacios, Azucena; Crespo-Delgado, Elena; Robert-Flor, Cristina; Sánchez-Guerrero, Ana; Ferreres-Galan, Vanesa; Pérez-Ayerra, Luisa; Malea-Fernández, Amparo; Torres-Alfosea, Mª Ángeles

    2018-03-12

    Emotional disorders, which include both anxiety and depressive disorders, are the most prevalent psychological disorders according to recent epidemiological studies. Consequently, public costs associated with their treatment have become a matter of concern for public health systems, which face long waiting lists. Because of their high prevalence in the population, finding an effective treatment for emotional disorders has become a key goal of today's clinical psychology. The Unified Protocol for the Transdiagnostic Treatment of Emotional Disorders might serve this purpose, as it can be applied to a variety of disorders simultaneously and can easily be delivered in a group format. The study is a multicenter, randomized, non-inferiority controlled clinical trial. Participants will be 220 individuals with emotional disorders, randomized either to treatment as usual (individual cognitive behavioral therapy) or to the Unified Protocol delivered in a group format. Depression, anxiety, and diagnostic criteria are the primary outcome measures. Secondary measures include the assessment of positive and negative affect, anxiety control, personality traits, overall adjustment, and quality of life. An analysis of treatment satisfaction is also conducted. Assessment points include baseline, post-treatment, and three follow-ups at 3, 6, and 12 months. To control for missing data and possible biases, intention-to-treat and per-protocol analyses will be performed. This is the first randomized, controlled clinical trial to test the effectiveness of a transdiagnostic intervention in a group format for the treatment of emotional disorders in public settings in Spain. Results obtained from this study may have important clinical, social, and economic implications for public mental health settings in Spain. Retrospectively registered at https://clinicaltrials.gov/ . Trial NCT03064477 (March 10, 2017). The trial is active and recruitment is ongoing; recruitment is expected to finish by January 2020.

  7. Unpredictability of escape trajectory explains predator evasion ability and microhabitat preference of desert rodents.

    PubMed

    Moore, Talia Y; Cooper, Kimberly L; Biewener, Andrew A; Vasudevan, Ramanarayan

    2017-09-05

    Mechanistically linking movement behaviors and ecology is key to understanding the adaptive evolution of locomotion. Predator evasion, a behavior that enhances fitness, may depend upon short bursts or complex patterns of locomotion. However, such movements are poorly characterized by existing biomechanical metrics. We present methods based on the entropy measure of randomness from Information Theory to quantitatively characterize the unpredictability of non-steady-state locomotion. We then apply the method by examining sympatric rodent species whose escape trajectories differ in dimensionality. Unlike the speed-regulated gait use of cursorial animals to enhance locomotor economy, bipedal jerboa (family Dipodidae) gait transitions likely enhance maneuverability. In field-based observations, jerboa trajectories are significantly less predictable than those of quadrupedal rodents, likely increasing predator evasion ability. Consistent with this hypothesis, jerboas exhibit lower anxiety in open fields than quadrupedal rodents, a behavior that varies inversely with predator evasion ability. Our unpredictability metric expands the scope of quantitative biomechanical studies to include non-steady-state locomotion in a variety of evolutionary and ecologically significant contexts. Biomechanical understanding of animal gait and maneuverability has primarily been limited to species with more predictable, steady-state movement patterns. Here, the authors develop a method to quantify movement predictability, and apply the method to study escape-related movement in several species of desert rodents.

  8. Compressed Sensing for Metrics Development

    NASA Astrophysics Data System (ADS)

    McGraw, R. L.; Giangrande, S. E.; Liu, Y.

    2012-12-01

    Models by their very nature tend to be sparse, in the sense that they are designed, with a few optimally selected key parameters, to provide simple yet faithful representations of a complex observational dataset or computer simulation output. This paper seeks to apply methods from compressed sensing (CS), a new area of applied mathematics undergoing very rapid development (see for example Candes et al., 2006), to FASTER needs for new approaches to model evaluation and metrics development. The CS approach will be illustrated for a time series generated using a few-parameter (i.e. sparse) model. A seemingly incomplete set of measurements, taken at just a few random sampling times, is then used to recover the hidden model parameters. Remarkably, there is a sharp transition in the number of required measurements, beyond which both the model parameters and the time series are recovered exactly. Applications to data compression, data sampling/collection strategies, and the development of metrics for model evaluation by comparison with observation (e.g. evaluation of model predictions of cloud fraction using cloud radar observations) are presented and discussed in the context of the CS approach. Cited reference: Candes, E. J., Romberg, J., and Tao, T. (2006), Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information, IEEE Transactions on Information Theory, 52, 489-509.
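    The sharp-transition behavior can be reproduced with any standard sparse-recovery solver; the sketch below uses orthogonal matching pursuit (an illustration, not the authors' code) to recover a 3-sparse parameter vector from a handful of random measurements:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from y = A @ x."""
    residual = y.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # atom most correlated with residual
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 128)) / np.sqrt(30)   # 30 random measurements of a length-128 signal
x_true = np.zeros(128)
x_true[[5, 40, 99]] = [1.5, -2.0, 0.7]             # 3 hidden "model parameters"
y = A @ x_true                                     # seemingly incomplete data
x_hat = omp(A, y, k=3)                             # exact recovery
```

    Shrinking the number of rows of `A` toward `k` exhibits the sharp transition: below a threshold number of measurements, recovery fails abruptly rather than degrading gracefully.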

  9. A novel research design can aid disinvestment from existing health technologies with uncertain effectiveness, cost-effectiveness, and/or safety.

    PubMed

    Haines, Terry; O'Brien, Lisa; McDermott, Fiona; Markham, Donna; Mitchell, Deb; Watterson, Dina; Skinner, Elizabeth

    2014-02-01

    Disinvestment is critical for ensuring the long-term sustainability of health-care services. Key barriers to disinvestment are heterogeneity between research and clinical settings, absence of evidence of effectiveness of some health technologies, and exposure of patients and organizations to risks and poor outcomes. We aimed to develop a feasible research design that can evaluate disinvestment in health technologies of uncertain effectiveness or cost-effectiveness. This article (1) establishes the need for disinvestment methodologies, (2) identifies the ethical concerns and feasibility constraints of conventional research designs for this issue, (3) describes the planning, implementation, and analytical framework for a novel disinvestment-specific study design, and (4) describes potential limitations in application of this design. The stepped-wedge, roll-in cluster randomized controlled trial can facilitate the disinvestment process while generating evidence to determine whether the decision to disinvest was sound in the clinical environment. A noninferiority research paradigm may be applied to this methodology to demonstrate that the removal of a health technology does not adversely affect outcomes. This research design can be applied across multiple fields and will assist determination of whether specific health technologies are clinically effective, cost-effective, and safe. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. SYMPOSIUM REPORT: An Evidence-Based Approach to IBS and CIC: Applying New Advances to Daily Practice: A Review of an Adjunct Clinical Symposium of the American College of Gastroenterology Meeting October 16, 2016 • Las Vegas, Nevada.

    PubMed

    Chey, William D

    2017-02-01

    Many nonpharmacologic and pharmacologic therapies are available to manage irritable bowel syndrome (IBS) and chronic idiopathic constipation (CIC). The American College of Gastroenterology (ACG) regularly publishes reviews on IBS and CIC therapies. The most recent of these reviews was published by the ACG Task Force on the Management of Functional Bowel Disorders in 2014. The key objective of this review was to evaluate the efficacy of therapies for IBS or CIC compared with placebo or no treatment in randomized controlled trials. Evidence-based approaches to managing diarrhea-predominant IBS include dietary measures, such as a diet low in gluten and fermentable oligo-, di-, and monosaccharides and polyols (FODMAPs); loperamide; antispasmodics; peppermint oil; probiotics; tricyclic antidepressants; alosetron; eluxadoline; and rifaximin. Evidence-based approaches to managing constipation-predominant IBS and CIC include fiber, stimulant laxatives, polyethylene glycol, selective serotonin reuptake inhibitors, lubiprostone, and guanylate cyclase agonists. With the growing evidence base for IBS and CIC therapies, it has become increasingly important for clinicians to assess the quality of evidence and understand how to apply it to the care of individual patients.

  11. IMBLMS phase B4, additional tasks 5.0. Microbial identification system

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A laboratory study was undertaken to provide simplified procedures leading to the presumptive identification (I/D) of defined microorganisms on board an orbiting spacecraft. Identifications were to be initiated by nonprofessional bacteriologists (crew members) on a contingency basis only. Key objectives/constraints for this investigation were as follows: (1) I/D procedures based on limited, defined diagnostic tests; (2) testing oriented around ten selected microorganisms; (3) provision of a definitive I/D key and procedures per selected organism; (4) definition of possible occurrences of false positives for the resulting I/D key by search of the appropriate literature; and (5) evaluation of the I/D key and procedure through a limited field trial on randomly selected subjects using the I/D key.

  12. Distinctions between fraud, bias, errors, misunderstanding, and incompetence.

    PubMed

    DeMets, D L

    1997-12-01

    Randomized clinical trials are challenging not only in their design and analysis, but in their conduct as well. Despite the best intentions and efforts, problems often arise in the conduct of trials, including errors, misunderstandings, and bias. In some instances, key players in a trial may discover that they are not able or competent to meet the requirements of the study. In a few cases, fraudulent activity occurs. While none of these problems is desirable, randomized clinical trials are usually sufficiently robust to many of these problems to produce valid results. Other problems are not tolerable. Confusion may arise among scientists, the scientific and lay press, and the public about the distinctions between these areas and their implications. We shall try to define these problems and illustrate their impact through a series of examples.

  13. Issues Relating to Selective Reporting When Including Non-Randomized Studies in Systematic Reviews on the Effects of Healthcare Interventions

    ERIC Educational Resources Information Center

    Norris, Susan L.; Moher, David; Reeves, Barnaby C.; Shea, Beverley; Loke, Yoon; Garner, Sarah; Anderson, Laurie; Tugwell, Peter; Wells, George

    2013-01-01

    Background: Selective outcome and analysis reporting (SOR and SAR) occur when only a subset of outcomes measured and analyzed in a study is fully reported, and are an important source of potential bias. Key methodological issues: We describe what is known about the prevalence and effects of SOR and SAR in both randomized controlled trials (RCTs)…

  14. The Effects of Student Coaching in College: An Evaluation of a Randomized Experiment in Student Mentoring. NBER Working Paper No. 16881

    ERIC Educational Resources Information Center

    Bettinger, Eric; Baker, Rachel

    2011-01-01

    College completion and college success often lag behind college attendance. One theory as to why students do not succeed in college is that they lack key information about how to be successful or fail to act on the information that they have. We present evidence from a randomized experiment which tests the effectiveness of individualized student…

  15. The Impact of Massage Therapy on Function in Pain Populations—A Systematic Review and Meta-Analysis of Randomized Controlled Trials: Part III, Surgical Pain Populations

    PubMed Central

    Boyd, Courtney; Crawford, Cindy; Paat, Charmagne F; Price, Ashley; Xenakis, Lea; Zhang, Weimin; Buckenmaier, Chester; Buckenmaier, Pamela; Cambron, Jerrilyn; Deery, Christopher; Schwartz, Jan; Werner, Ruth; Whitridge, Pete

    2016-01-01

    Objective: Pain is multi-dimensional and may be better addressed through a holistic, biopsychosocial approach. Massage therapy is commonly practiced among patients seeking pain management; however, its efficacy is unclear. This systematic review and meta-analysis is the first to rigorously assess the quality of the evidence for massage therapy's efficacy in treating pain, function-related, and health-related quality of life outcomes in surgical pain populations. Methods: Key databases were searched from inception through February 2014. Eligible randomized controlled trials were assessed for methodological quality using the SIGN 50 Checklist. Meta-analysis was applied at the outcome level. A professionally diverse steering committee interpreted the results to develop recommendations. Results: Twelve high-quality and four low-quality studies were included in the review. Results indicate massage therapy is effective for treating pain [standardized mean difference (SMD) = −0.79] and anxiety (SMD = −0.57) compared to active comparators. Conclusion: Based on the available evidence, weak recommendations are suggested for massage therapy, compared to active comparators, for reducing pain intensity/severity and anxiety in patients undergoing surgical procedures. This review also discusses massage therapy safety, challenges within this research field, how to address identified research gaps, and next steps for future research. PMID:27165970

  16. Building young women's knowledge and skills in female condom use: lessons learned from a South African intervention.

    PubMed

    Schuyler, A C; Masvawure, T B; Smit, J A; Beksinska, M; Mabude, Z; Ngoloyi, C; Mantell, J E

    2016-04-01

    Partner negotiation and insertion difficulties are key barriers to female condom (FC) use in sub-Saharan Africa. Few FC interventions have provided comprehensive training in both negotiation and insertion skills, or focused on university students. In this study we explored whether training in FC insertion and partner negotiation influenced young women's FC use. 296 female students at a South African university were randomized to a one-session didactic, information-only minimal intervention (n = 149) or a two-session cognitive-behavioral enhanced intervention (n = 147), which received additional information specific to partner negotiation and FC insertion. Both groups received FCs. We report the experiences of 39 randomly selected female students who participated in post-intervention qualitative interviews. Two-thirds of the women reported FC use. Most women (n = 30/39) applied information learned during the interventions to negotiate with partners. Women reported that FC insertion practice increased their confidence. Twelve women failed to convince male partners to use the FC, often due to its physical attributes or partners' lack of knowledge about insertion. FC educational and skills training can help facilitate use, improve attitudes toward the device, and help women successfully negotiate safer sex with partners. Innovative strategies and tailored interventions are needed to increase widespread FC adoption. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  17. Devil's vortex Fresnel lens phase masks on an asymmetric cryptosystem based on phase-truncation in gyrator wavelet transform domain

    NASA Astrophysics Data System (ADS)

    Singh, Hukum

    2016-06-01

    An asymmetric scheme has been proposed for optical double-image encryption in the gyrator wavelet transform (GWT) domain. Grayscale and binary images are encrypted separately using double random phase encoding (DRPE) in the GWT domain. Phase masks based on devil's vortex Fresnel lenses (DVFLs) and random phase masks (RPMs) are jointly used in the spatial as well as the Fourier plane. The images to be encrypted are first gyrator transformed and then single-level discrete wavelet transformed (DWT) to decompose them into the LL, HL, LH, and HH matrices of approximation, horizontal, vertical, and diagonal coefficients. The resulting DWT coefficients are multiplied by other RPMs and the results are applied to the inverse discrete wavelet transform (IDWT) to obtain the encrypted images. The images are recovered from their corresponding encrypted images by using the correct parameters of the GWT and DVFL; the digital implementation has been performed in MATLAB 7.6.0 (R2008a). The mother wavelet family, DVFL, and the gyrator transform orders associated with the GWT are extra keys that cause difficulty to an attacker. Thus, the scheme is more secure as compared to conventional techniques. The efficacy of the proposed scheme is verified by computing the mean squared error (MSE) between the recovered and the original images. The sensitivity of the proposed scheme to the encryption parameters and to noise attacks is also verified.
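    The classical DRPE core at the heart of such schemes (without the gyrator, wavelet, and vortex-lens stages, which are omitted here for brevity) can be sketched as follows; the image and mask values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))                        # stand-in grayscale image
m1 = np.exp(2j * np.pi * rng.random((64, 64)))    # random phase mask, input plane
m2 = np.exp(2j * np.pi * rng.random((64, 64)))    # random phase mask, Fourier plane

# Encrypt: modulate in the spatial plane, transform, modulate in the
# Fourier plane, transform back. The result is complex white-noise-like.
enc = np.fft.ifft2(np.fft.fft2(img * m1) * m2)

# Decrypt with the conjugate masks (the keys); wrong masks leave noise.
dec = np.fft.ifft2(np.fft.fft2(enc) * np.conj(m2)) * np.conj(m1)
img_rec = np.abs(dec)
```

    The extra transforms in the paper (gyrator orders, wavelet family, DVFL parameters) act as additional keys layered on top of this same modulate-transform-modulate structure.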

  18. Efficacy of platelet-rich plasma applied to post-extraction retained lower third molar alveoli. A systematic review.

    PubMed

    Barona-Dorado, C; González-Regueiro, I; Martín-Ares, M; Arias-Irimia, O; Martínez-González, J-M

    2014-03-01

    Dental retentions have a high prevalence in the general population and their removal can involve multiple complications. The use of platelet-rich plasma (PRP) has been proposed in an attempt to avoid these complications, as it contains a high concentration of growth factors and stimulates diverse biological functions that facilitate the healing of soft and hard tissues. To evaluate the available scientific evidence related to the application of platelet-rich plasma in the post-extraction alveoli of retained lower third molars, a systematic review was performed of published literature registered in the Medline, EMBASE, Cochrane, and NIH databases. Only human randomized clinical studies were included. Key search words were: platelet rich plasma; platelet rich plasma and oral surgery; platelet rich in growth factors and third molar. Of 101 potentially valid articles, seven were selected, of which four were rejected as they failed to meet quality criteria. Three studies fulfilled all selection and quality criteria: Ogundipe et al.; Rutkowski et al.; Haraji et al. The studies all measured osteoblast activity by means of scintigraphy, and also registered pain, bleeding, inflammation, temperature, numbness as perceived by the patients, radiological bone density, and the incidence of alveolar osteitis. Scientific evidence for the use of PRP in retained third molar surgery is poor. For this reason, randomized clinical trials are needed before recommendations for the clinical application of PRP can be made.

  19. Delay and cost performance analysis of the Diffie-Hellman key exchange protocol in opportunistic mobile networks

    NASA Astrophysics Data System (ADS)

    Soelistijanto, B.; Muliadi, V.

    2018-03-01

    Diffie-Hellman (DH) provides an efficient key exchange system by reducing the number of cryptographic keys distributed in the network. In this method, a node broadcasts a single public key to all nodes in the network, and in turn each peer uses this key to establish a shared secret key which then can be utilized to encrypt and decrypt traffic between the peer and the given node. In this paper, we evaluate the key transfer delay and cost performance of DH in opportunistic mobile networks, a specific scenario of MANETs where complete end-to-end paths rarely exist between sources and destinations; consequently, the end-to-end delays in these networks are much greater than typical MANETs. Simulation results, driven by a random node movement model and real human mobility traces, showed that DH outperforms a typical key distribution scheme based on the RSA algorithm in terms of key transfer delay, measured by average key convergence time; however, DH performs as well as the benchmark in terms of key transfer cost, evaluated by total key (copies) forwards.
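    The exchange being analyzed can be sketched in a few lines. The group parameters below are toy values for illustration only; a real deployment would use a standardized large prime group (e.g. from RFC 3526):

```python
import secrets

# Toy parameters -- illustration only, NOT a secure group.
P = 2**127 - 1          # a Mersenne prime used as the modulus
G = 3                   # generator

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1     # secret exponent, kept local
    return priv, pow(G, priv, P)            # (private, broadcastable public key)

a_priv, a_pub = dh_keypair()        # the node broadcasts a_pub once
b_priv, b_pub = dh_keypair()        # each peer broadcasts its own public key
shared_a = pow(b_pub, a_priv, P)    # node's view of the shared secret
shared_b = pow(a_pub, b_priv, P)    # peer's view -- identical
```

    Only the single public key needs to be flooded through the opportunistic network; the shared secret is never transmitted, which is what reduces the number of distributed keys relative to pairwise schemes such as per-peer RSA key exchange.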

  20. On fatigue crack growth under random loading

    NASA Astrophysics Data System (ADS)

    Zhu, W. Q.; Lin, Y. K.; Lei, Y.

    1992-09-01

    A probabilistic analysis of the fatigue crack growth, fatigue life and reliability of a structural or mechanical component is presented on the basis of fracture mechanics and theory of random processes. The material resistance to fatigue crack growth and the time-history of the stress are assumed to be random. Analytical expressions are obtained for the special case in which the random stress is a stationary narrow-band Gaussian random process, and a randomized Paris-Erdogan law is applicable. As an example, the analytical method is applied to a plate with a central crack, and the results are compared with those obtained from digital Monte Carlo simulations.
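    A minimal digital Monte Carlo version of this kind of analysis (all parameter values hypothetical) randomizes the Paris-Erdogan coefficient C and integrates da/dN = C(ΔK)^m with ΔK = Δσ·sqrt(π·a):

```python
import math
import random

def cycles_to_failure(a0, ac, dsigma, C, m, da=1e-5):
    """Integrate the Paris-Erdogan law da/dN = C * dK**m, with
    dK = dsigma * sqrt(pi * a), by forward stepping in crack length a
    (a in meters, dsigma in MPa)."""
    a, n = a0, 0.0
    while a < ac:
        dK = dsigma * math.sqrt(math.pi * a)
        n += da / (C * dK ** m)      # cycles spent growing the crack by da
        a += da
    return n

# Hypothetical case: initial/critical crack 1 mm / 20 mm, stress range
# 100 MPa, m = 3, and a lognormally randomized material constant C.
rng = random.Random(0)
lives = [cycles_to_failure(1e-3, 2e-2, 100.0,
                           C=math.exp(rng.gauss(math.log(1e-11), 0.3)), m=3.0)
         for _ in range(200)]
```

    The empirical distribution of `lives` is the Monte Carlo counterpart of the paper's analytical fatigue-life expressions; here only C is random, whereas the paper also treats the stress history as a stationary narrow-band Gaussian process.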

  1. Queuing Theory and Reference Transactions.

    ERIC Educational Resources Information Center

    Terbille, Charles

    1995-01-01

    Examines the implications of applying the queuing theory to three different reference situations: (1) random patron arrivals; (2) random durations of transactions; and (3) use of two librarians. Tables and figures represent results from spreadsheet calculations of queues for each reference situation. (JMV)
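    The three situations map onto the standard M/M/c queueing model: Poisson (random) patron arrivals, exponentially distributed (random) transaction durations, and c = 2 servers for the two-librarian case. A sketch of the spreadsheet-style calculation via the Erlang C formula (arrival and service rates hypothetical):

```python
import math

def mmc_mean_wait(lam, mu, c):
    """Mean time a patron waits in queue (Wq, in hours) for an M/M/c queue."""
    a = lam / mu                 # offered load in Erlangs
    rho = a / c                  # server utilization; must be < 1 for stability
    assert rho < 1, "unstable queue"
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                + a**c / (math.factorial(c) * (1 - rho)))
    p_wait = a**c / (math.factorial(c) * (1 - rho)) * p0   # Erlang C: P(wait > 0)
    return p_wait / (c * mu - lam)

# 10 patrons/hour, transactions averaging 5 minutes (mu = 12/hour):
one_librarian = mmc_mean_wait(10, 12, 1)   # about 25 minutes of queueing
two_librarians = mmc_mean_wait(10, 12, 2)  # about 1 minute
```

    The disproportionate drop from one librarian to two is the characteristic queueing-theory result such an analysis illustrates: pooled servers absorb random arrival bursts far better than utilization alone would suggest.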

  2. The Efficacy of Mindfulness-Based Interventions in Primary Care: A Meta-Analytic Review.

    PubMed

    Demarzo, Marcelo M P; Montero-Marin, Jesús; Cuijpers, Pim; Zabaleta-del-Olmo, Edurne; Mahtani, Kamal R; Vellinga, Akke; Vicens, Caterina; López-del-Hoyo, Yolanda; García-Campayo, Javier

    2015-11-01

    Positive effects have been reported after mindfulness-based interventions (MBIs) in diverse clinical and nonclinical populations. Primary care is a key health care setting for addressing common chronic conditions, and an effective MBI designed for this setting could benefit countless people worldwide. Meta-analyses of MBIs have become popular, but little is known about their efficacy in primary care. Our aim was to investigate the application and efficacy of MBIs that address primary care patients. We performed a meta-analytic review of randomized controlled trials addressing the effect of MBIs in adult patients recruited from primary care settings. The PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) and Cochrane guidelines were followed. Effect sizes were calculated with the Hedges g in random effects models. The meta-analyses were based on 6 trials having a total of 553 patients. The overall effect size of MBI compared with a control condition for improving general health was moderate (g = 0.48; P = .002), with moderate heterogeneity (I(2) = 59; P <.05). We found no indication of publication bias in the overall estimates. MBIs were efficacious for improving mental health (g = 0.56; P = .007), with a high heterogeneity (I(2) = 78; P <.01), and for improving quality of life (g = 0.29; P = .002), with a low heterogeneity (I(2) = 0; P >.05). Although the number of randomized controlled trials applying MBIs in primary care is still limited, our results suggest that these interventions are promising for the mental health and quality of life of primary care patients. We discuss innovative approaches for implementing MBIs, such as complex intervention and stepped care. © 2015 Annals of Family Medicine, Inc.
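    Pooled estimates of this kind come from a random-effects model. A minimal sketch of the standard DerSimonian-Laird computation (illustrative, not the authors' code), taking per-study effect sizes and variances:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled effect, standard error, tau^2, and I^2 (%)."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    mu_fe = sum(wi * y for wi, y in zip(w, effects)) / sw     # fixed-effect mean
    q = sum(wi * (y - mu_fe) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)                             # between-study variance
    i2 = max(0.0, 100.0 * (q - df) / q) if q > 0 else 0.0     # heterogeneity
    w_re = [1.0 / (v + tau2) for v in variances]
    mu = sum(wi * y for wi, y in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return mu, se, tau2, i2
```

    The I^2 values reported in the abstract (59%, 78%, 0%) are exactly this heterogeneity statistic; when tau^2 is 0 the random-effects and fixed-effect pooled estimates coincide.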

  3. Effect of self-administered foot reflexology for symptom management in healthy persons: a systematic review and meta-analysis.

    PubMed

    Song, Hyun Jin; Son, Heejeong; Seo, Hyun-Ju; Lee, Heeyoung; Choi, Sun Mi; Lee, Sanghun

    2015-02-01

    Self-administered foot reflexology is unrestricted by time and space, economical, and practical, because it is easy to learn and apply. This study estimated the effectiveness of self-administered foot reflexology for symptom management in healthy persons through a systematic review and meta-analysis. The participants were healthy persons not diagnosed with a specific disease. The intervention was foot reflexology administered by the participants themselves, not by practitioners or healthcare providers. Comparative studies, with either between-group or within-group comparisons, were included. Our search utilized core databases (MEDLINE, EMBASE, Cochrane, and CINAHL). We also searched Chinese (CNKI), Japanese (J-STAGE), and Korean databases (KoreaMed, KMbase, KISS, NDSL, KISTI, and OASIS). The search used MeSH terminology and key words (foot reflexology, foot massage, and self). Analysis of three non-randomized trials and three before-and-after studies showed that self-administered foot reflexology resulted in significant improvement in subjective outcomes such as perceived stress, fatigue, and depression. However, there was no significant improvement in objective outcomes such as cortisol levels, blood pressure, and pulse rate. We did not find any randomized controlled trial. This study presents the effectiveness of self-administered foot reflexology for healthy persons' psychological and physiological symptoms. While objective outcomes showed limited results, significant improvements were found in subjective outcomes. However, owing to the small number of studies and methodological flaws, there was insufficient evidence supporting the use of self-performed foot reflexology. Well-designed randomized controlled trials are needed to assess the effect of self-administered foot reflexology in healthy people. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. A quality assessment of randomized controlled trial reports in endodontics.

    PubMed

    Lucena, C; Souza, E M; Voinea, G C; Pulgar, R; Valderrama, M J; De-Deus, G

    2017-03-01

    To assess the quality of the randomized clinical trial (RCT) reports published in Endodontics between 1997 and 2012. Retrieval of RCTs in Endodontics was based on a search of the Thomson Reuters Web of Science (WoS) database (March 2013). Quality evaluation was performed using a checklist based on the Jadad criteria, CONSORT (Consolidated Standards of Reporting Trials) statement and SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials). Descriptive statistics were used for frequency distribution of data. Student's t-test and Welch test were used to identify the influence of certain trial characteristics upon report quality (α = 0.05). A total of 89 RCTs were evaluated, and several methodological flaws were found: only 45% had random sequence generation at low risk of bias, 75% did not provide information on allocation concealment, and 19% were nonblinded designs. Regarding statistics, only 55% of the RCTs performed adequate sample size estimations, only 16% presented confidence intervals, and 25% did not provide the exact P-value. Also, 2% of the articles used no statistical tests, and in 87% of the RCTs, the information provided was insufficient to determine whether the statistical methodology applied was appropriate or not. Significantly higher scores were observed for multicentre trials (P = 0.023), RCTs signed by more than 5 authors (P = 0.03), articles belonging to journals ranked above the JCR median (P = 0.03), and articles complying with the CONSORT guidelines (P = 0.000). The quality of RCT reports in key areas for internal validity of the study was poor. Several measures, such as compliance with the CONSORT guidelines, are important in order to raise the quality of RCTs in Endodontics. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  5. Assessing the effectiveness of a pharmacist-delivered smoking cessation program in the State of Qatar: study protocol for a randomized controlled trial.

    PubMed

    El Hajj, Maguy Saffouh; Kheir, Nadir; Al Mulla, Ahmad Mohd; Al-Badriyeh, Daoud; Al Kaddour, Ahmad; Mahfoud, Ziyad R; Salehi, Mohammad; Fanous, Nadia

    2015-02-26

It has been reported that up to 37% of the adult male population in Qatar smokes cigarettes. The Global Youth Tobacco Survey likewise found that 13.4% of male school students aged 13 to 15 years in Qatar smoke cigarettes. Smoking cessation is key to reducing smoking-related disease and death. Healthcare providers are in an ideal position to encourage smoking cessation, and pharmacists, as the most accessible healthcare providers, are uniquely situated to initiate behavior change among patients. Many studies have shown that pharmacists can successfully help patients quit smoking, but studies demonstrating the effectiveness of pharmacist-delivered smoking cessation programs are lacking in Qatar. This proposal aims to test the effect of a structured smoking cessation program delivered by trained ambulatory pharmacists in Qatar. A prospective, randomized, controlled trial is being conducted at eight ambulatory pharmacies in Qatar. Participants are randomly assigned to receive either a structured, patient-specific, face-to-face smoking cessation program of at least four sessions conducted by the pharmacist, or 5 to 10 min of unstructured brief smoking cessation advice (emulating current practice) given by the pharmacist. Both groups are offered nicotine replacement therapy when feasible. The primary outcome, smoking cessation, will be confirmed by an exhaled carbon monoxide test at 12 months. Secondary outcomes comprise quality-of-life adjustment as well as a cost analysis of program resources consumed, per case and per patient outcome. If proven effective, this smoking cessation program could serve as a model that Qatar and the region can apply to decrease the smoking burden. ClinicalTrials.gov NCT02123329.

  6. Therapeutic clowns in pediatrics: a systematic review and meta-analysis of randomized controlled trials.

    PubMed

    Sridharan, Kannan; Sivaramakrishnan, Gowri

    2016-10-01

Children and/or their parents often experience fear and anxiety when the children are admitted to hospital or undergo invasive surgery or investigations. Clown therapy has been shown to be an effective measure for reducing this hospital fear and anxiety. Hence, we carried out a systematic compilation of the existing evidence on the clinical utility of hospital clowns in the pediatric population. Electronic databases were searched with an appropriate search strategy, and only randomized controlled trials comparing the effect of clown therapy with standard care in children were included. The key outcome measures were the extent of anxiety and pain felt by children and the extent of state and trait parental anxiety. A random-effects model was applied when moderate to severe heterogeneity was observed. Forest plots, I2 statistics and risk of bias were evaluated using RevMan 5.3 software. A total of 19 studies were found eligible for inclusion in the systematic review and 16 for meta-analysis. The pooled SMD [95 % CI] for the child anxiety score was -0.83 [-1.16, -0.51], favoring clown therapy. Similarly, a statistically significant reduction (SMD [95 % CI] -0.46 [-0.70, -0.21]) in state anxiety was observed amongst parents. We found that hospital clowns play a significant role in reducing stress and anxiety levels in children admitted to hospitals, as well as in their parents. What is known: • Trials with clown doctors in the pediatric population have shown conflicting results in allaying anxiety amongst children undergoing either hospitalization or invasive procedures. What is new: • This is the first systematic review and meta-analysis on hospital clowns. • We found that hospital clowns reduce anxiety amongst children before either hospitalization or invasive procedures.
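The random-effects pooling behind such SMD estimates can be illustrated with a DerSimonian-Laird sketch; the effect sizes and variances below are invented for illustration and are not the review's data:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effect sizes (e.g., SMDs) under a random-effects model."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0    # I2 heterogeneity (%)
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2, i2

pooled, se, tau2, i2 = dersimonian_laird(
    [-0.9, -0.5, -1.1, -0.6], [0.04, 0.05, 0.06, 0.05])
ci = (pooled - 1.96 * se, pooled + 1.96 * se)              # 95 % CI
```

RevMan performs the same computation internally; the sketch makes explicit how the between-study variance tau2 widens the confidence interval relative to a fixed-effect analysis.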

  7. The Efficacy of Mindfulness-Based Interventions in Primary Care: A Meta-Analytic Review

    PubMed Central

    Demarzo, Marcelo M.P.; Montero-Marin, Jesús; Cuijpers, Pim; Zabaleta-del-Olmo, Edurne; Mahtani, Kamal R.; Vellinga, Akke; Vicens, Caterina; López-del-Hoyo, Yolanda; García-Campayo, Javier

    2015-01-01

    PURPOSE Positive effects have been reported after mindfulness-based interventions (MBIs) in diverse clinical and nonclinical populations. Primary care is a key health care setting for addressing common chronic conditions, and an effective MBI designed for this setting could benefit countless people worldwide. Meta-analyses of MBIs have become popular, but little is known about their efficacy in primary care. Our aim was to investigate the application and efficacy of MBIs that address primary care patients. METHODS We performed a meta-analytic review of randomized controlled trials addressing the effect of MBIs in adult patients recruited from primary care settings. The PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) and Cochrane guidelines were followed. Effect sizes were calculated with the Hedges g in random effects models. RESULTS The meta-analyses were based on 6 trials having a total of 553 patients. The overall effect size of MBI compared with a control condition for improving general health was moderate (g = 0.48; P = .002), with moderate heterogeneity (I2 = 59; P <.05). We found no indication of publication bias in the overall estimates. MBIs were efficacious for improving mental health (g = 0.56; P = .007), with a high heterogeneity (I2 = 78; P <.01), and for improving quality of life (g = 0.29; P = .002), with a low heterogeneity (I2 = 0; P >.05). CONCLUSIONS Although the number of randomized controlled trials applying MBIs in primary care is still limited, our results suggest that these interventions are promising for the mental health and quality of life of primary care patients. We discuss innovative approaches for implementing MBIs, such as complex intervention and stepped care. PMID:26553897
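The Hedges g effect size used in this meta-analysis can be computed directly from group summary statistics; the means, SDs, and sample sizes below are hypothetical:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: bias-corrected standardized mean difference between two groups."""
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                      # Cohen's d on the pooled SD
    j = 1 - 3 / (4 * (n1 + n2) - 9)         # small-sample bias correction
    return d * j

# hypothetical trial arms: MBI group vs control on a general-health scale
g = hedges_g(m1=52.0, sd1=10.0, n1=40, m2=47.0, sd2=10.0, n2=40)
```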

  8. Packet Randomized Experiments for Eliminating Classes of Confounders

    PubMed Central

    Pavela, Greg; Wiener, Howard; Fontaine, Kevin R.; Fields, David A.; Voss, Jameson D.; Allison, David B.

    2014-01-01

    Background Although randomization is considered essential for causal inference, it is often not possible to randomize in nutrition and obesity research. To address this, we develop a framework for an experimental design—packet randomized experiments (PREs), which improves causal inferences when randomization on a single treatment variable is not possible. This situation arises when subjects are randomly assigned to a condition (such as a new roommate) which varies in one characteristic of interest (such as weight), but also varies across many others. There has been no general discussion of this experimental design, including its strengths, limitations, and statistical properties. As such, researchers are left to develop and apply PREs on an ad hoc basis, limiting its potential to improve causal inferences among nutrition and obesity researchers. Methods We introduce PREs as an intermediary design between randomized controlled trials and observational studies. We review previous research that used the PRE design and describe its application in obesity-related research, including random roommate assignments, heterochronic parabiosis, and the quasi-random assignment of subjects to geographic areas. We then provide a statistical framework to control for potential packet-level confounders not accounted for by randomization. Results PREs have successfully been used to improve causal estimates of the effect of roommates, altitude, and breastfeeding on weight outcomes. When certain assumptions are met, PREs can asymptotically control for packet-level characteristics. This has the potential to statistically estimate the effect of a single treatment even when randomization to a single treatment did not occur. Conclusions Applying PREs to obesity-related research will improve decisions about clinical, public health, and policy actions insofar as it offers researchers new insight into cause and effect relationships among variables. PMID:25444088

  9. Four-dimensional key design in amplitude, phase, polarization and distance for optical encryption based on polarization digital holography and QR code.

    PubMed

    Lin, Chao; Shen, Xueju; Li, Baochen

    2014-08-25

We demonstrate that all parameters of an optical lightwave can be simultaneously designed as keys in a security system. This multi-dimensional key property can significantly enlarge the key space and further enhance the security level of the system. Single-shot off-axis digital holography with orthogonally polarized reference waves is employed to record the polarization state of the object wave. Two polarization holograms are calculated and fabricated, and arranged in the reference arms to generate random amplitude and phase distributions, respectively. Upon reconstruction, the original information, represented as a QR code, can be retrieved noise-free using Fresnel diffraction with the decryption keys. Numerical simulation results for this cryptosystem are presented, and an analysis of the key sensitivity and fault tolerance properties is also provided.

  10. Finite-key security analyses on passive decoy-state QKD protocols with different unstable sources.

    PubMed

    Song, Ting-Ting; Qin, Su-Juan; Wen, Qiao-Yan; Wang, Yu-Kun; Jia, Heng-Yue

    2015-10-16

In quantum communication, passive decoy-state QKD protocols can eliminate many side channels, but protocols without finite-key analyses are not suitable for practice. The finite-key security of passive decoy-state (PDS) QKD protocols with two different unstable sources, a type-II parametric down-conversion (PDC) source and phase-randomized weak coherent pulses (WCPs), is analyzed in our paper. For each PDS QKD protocol, we establish an optimization program and obtain a lower bound on the finite-key rate. Under reasonable values of the quantum setup parameters, the lower bounds of the finite-key rates are simulated. The simulation results show that, at different transmission distances, different fluctuations affect the key rates differently. Moreover, the PDS QKD protocol with an unstable PDC source can tolerate larger intensity fluctuations and larger statistical fluctuations.

  11. Disentangling giant component and finite cluster contributions in sparse random matrix spectra.

    PubMed

    Kühn, Reimer

    2016-04-01

    We describe a method for disentangling giant component and finite cluster contributions to sparse random matrix spectra, using sparse symmetric random matrices defined on Erdős-Rényi graphs as an example and test bed. Our methods apply to sparse matrices defined in terms of arbitrary graphs in the configuration model class, as long as they have finite mean degree.
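A minimal numerical illustration of the disentangling idea, assuming NumPy is available: after permuting nodes by connected component, the adjacency matrix is block-diagonal, so its spectrum splits exactly into a giant-component contribution and a finite-cluster contribution.

```python
import random
import numpy as np

def er_graph(n, c, seed=0):
    """Erdos-Renyi graph with mean degree c, as a symmetric adjacency matrix."""
    rng = random.Random(seed)
    p = c / (n - 1)
    a = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                a[i, j] = a[j, i] = 1.0
    return a

def components(a):
    """Connected components via iterative graph traversal."""
    n, seen, comps = len(a), set(), []
    for s in range(n):
        if s in seen:
            continue
        stack, comp = [s], []
        seen.add(s)
        while stack:
            u = stack.pop()
            comp.append(u)
            for v in np.nonzero(a[u])[0]:
                if v not in seen:
                    seen.add(v)
                    stack.append(int(v))
        comps.append(comp)
    return comps

a = er_graph(200, 2.0)
comps = sorted(components(a), key=len, reverse=True)
giant = comps[0]
# The spectrum decomposes as the union of per-component spectra,
# so giant-component and finite-cluster contributions separate exactly.
giant_spec = np.linalg.eigvalsh(a[np.ix_(giant, giant)])
finite = [np.linalg.eigvalsh(a[np.ix_(c, c)]) for c in comps[1:]]
finite_spec = np.concatenate(finite) if finite else np.array([])
```

At mean degree 2 the giant component covers roughly 80 % of the nodes, and the remaining eigenvalues come from small trees and isolated vertices.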

  12. Hurricane Katrina-linked environmental injustice: race, class, and place differentials in attitudes.

    PubMed

    Adeola, Francis O; Picou, J Steven

    2017-04-01

    Claims of environmental injustice, human neglect, and racism dominated the popular and academic literature after Hurricane Katrina struck the United States in August 2005. A systematic analysis of environmental injustice from the perspective of the survivors remains scanty or nonexistent. This paper presents, therefore, a systematic empirical analysis of the key determinants of Katrina-induced environmental injustice attitudes among survivors in severely affected parishes (counties) in Louisiana and Mississippi three years into the recovery process. Statistical models based on a random sample of survivors were estimated, with the results revealing significant predictors such as age, children in household under 18, education, homeownership, and race. The results further indicate that African-Americans were more likely to perceive environmental injustice following Katrina than their white counterparts. Indeed, the investigation reveals that there are substantial racial gaps in measures of environmental injustice. The theoretical, methodological, and applied policy implications of these findings are discussed. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.

  13. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    NASA Astrophysics Data System (ADS)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
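A toy non-intrusive UQ example in the spirit of the Monte Carlo integration discussed above: propagate a uniformly distributed decay rate through a simple ODE whose solution is known in closed form. The parameter values are arbitrary illustrative choices.

```python
import math
import random

def mc_mean_decay(a, b, t, n, seed=0):
    """Monte Carlo estimate of E[y(t)] for dy/dt = -k*y, y(0) = 1,
    with an uncertain decay rate k ~ Uniform(a, b)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        k = rng.uniform(a, b)
        total += math.exp(-k * t)        # exact deterministic solve for sampled k
    return total / n

a, b, t = 0.5, 1.5, 1.0
# closed-form reference: E[exp(-k*t)] for k ~ Uniform(a, b)
exact = (math.exp(-a * t) - math.exp(-b * t)) / ((b - a) * t)
estimate = mc_mean_decay(a, b, t, n=200_000)
```

The deterministic solver is treated as a black box, which is exactly what makes the approach non-intrusive; polynomial chaos replaces the random sampling with deterministic quadrature in the stochastic variable to converge faster for smooth responses.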

  14. Multiobjective robust design of the double wishbone suspension system based on particle swarm optimization.

    PubMed

    Cheng, Xianfu; Lin, Yuqun

    2014-01-01

The performance of the suspension system is one of the most important factors in vehicle design. For the double wishbone suspension system, conventional deterministic optimization does not consider any deviations of the design parameters, so design sensitivity analysis and robust design optimization are proposed. In this study, the design parameters of the robust optimization are the positions of the key points, and the random factors are the uncertainties in manufacturing. A simplified model of the double wishbone suspension is established in the software ADAMS. Sensitivity analysis is used to determine the main design variables. Then, a simulation experiment is arranged, and Latin hypercube design is adopted to find the initial points. A Kriging model is employed to fit the mean and variance of the quality characteristics according to the simulation results. Finally, a particle swarm optimization (PSO) method is applied, and a tradeoff between the mean and the deviation of performance is made to solve the robust optimization problem of the double wishbone suspension system.
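A bare-bones PSO sketch of the kind applied in such robust-design studies; the sphere function here stands in for the (much more expensive) Kriging-based suspension quality metric:

```python
import random

def pso(f, dim, n_particles=30, iters=200, lo=-5.0, hi=5.0, seed=0):
    """Minimal particle swarm optimizer (inertia + cognitive + social terms)."""
    rng = random.Random(seed)
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]                 # each particle's best position
    pval = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]          # swarm-wide best
    w, c1, c2 = 0.7, 1.5, 1.5                   # standard convergent parameters
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            fx = f(x[i])
            if fx < pval[i]:
                pval[i], pbest[i] = fx, x[i][:]
                if fx < gval:
                    gval, gbest = fx, x[i][:]
    return gbest, gval

# sphere function as a stand-in objective
best, val = pso(lambda p: sum(t * t for t in p), dim=3)
```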

  15. Critical factors for the success of orthodontic mini-implants: a systematic review.

    PubMed

    Chen, Yan; Kyung, Hee Moon; Zhao, Wen Ting; Yu, Won Jae

    2009-03-01

    This systematic review was undertaken to discuss factors that affect mini-implants as direct and indirect orthodontic anchorage. The data were collected from electronic databases (Medline [Entrez PubMed], Embase, Web of Science, Cochrane Library, and All Evidence Based Medicine Reviews). Randomized clinical trials, prospective and retrospective clinical studies, and clinical trials concerning the properties, affective factors, and requirements of mini-implants were considered. The titles and abstracts that appeared to fulfill the initial selection criteria were collected by consensus, and the original articles were retrieved and evaluated with a methodologic checklist. A hand search of key orthodontic journals was performed to identify recent unindexed literature. The search strategy resulted in 596 articles. By screening titles and abstracts, 126 articles were identified. After the exclusion criteria were applied, 16 articles remained. The analyzed results of the literature were divided into 2 topics: placement-related and loading-related factors. Mini-implants are effective as anchorage, and their success depends on proper initial mechanical stability and loading quality and quantity.

  16. Sensor-Based Optimization Model for Air Quality Improvement in Home IoT

    PubMed Central

    Kim, Jonghyuk

    2018-01-01

    We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market. PMID:29570684

  17. Sensor-Based Optimization Model for Air Quality Improvement in Home IoT.

    PubMed

    Kim, Jonghyuk; Hwangbo, Hyunwoo

    2018-03-23

    We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market.
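The natural cubic spline step mentioned in both of these records can be sketched as follows; the hourly CO2 readings are invented illustrative values, not data from the study:

```python
import bisect

def natural_cubic_spline(xs, ys):
    """Natural cubic spline interpolant through (xs, ys): second derivative
    is zero at both ends; interior coefficients come from a tridiagonal solve."""
    n = len(xs)
    h = [xs[i + 1] - xs[i] for i in range(n - 1)]
    alpha = [0.0] * n
    for i in range(1, n - 1):
        alpha[i] = (3 * (ys[i + 1] - ys[i]) / h[i]
                    - 3 * (ys[i] - ys[i - 1]) / h[i - 1])
    l, mu, z = [1.0] * n, [0.0] * n, [0.0] * n
    for i in range(1, n - 1):                   # forward elimination
        l[i] = 2 * (xs[i + 1] - xs[i - 1]) - h[i - 1] * mu[i - 1]
        mu[i] = h[i] / l[i]
        z[i] = (alpha[i] - h[i - 1] * z[i - 1]) / l[i]
    c = [0.0] * n                               # natural ends: c[0] = c[n-1] = 0
    b, d = [0.0] * (n - 1), [0.0] * (n - 1)
    for j in range(n - 2, -1, -1):              # back substitution
        c[j] = z[j] - mu[j] * c[j + 1]
        b[j] = (ys[j + 1] - ys[j]) / h[j] - h[j] * (c[j + 1] + 2 * c[j]) / 3
        d[j] = (c[j + 1] - c[j]) / (3 * h[j])
    def s(x):
        j = min(max(bisect.bisect_right(xs, x) - 1, 0), n - 2)
        dx = x - xs[j]
        return ys[j] + b[j] * dx + c[j] * dx ** 2 + d[j] * dx ** 3
    return s

# hypothetical hourly CO2 sensor readings (ppm), densified to a smooth curve
hours = [0, 1, 2, 3, 4]
ppm = [400.0, 415.0, 460.0, 430.0, 410.0]
f = natural_cubic_spline(hours, ppm)
```

Densifying sparse sensor samples this way is one plausible reading of the "natural cubic spline method" named in the abstract; the authors' exact pipeline may differ.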

  18. Combining Different Privacy-Preserving Record Linkage Methods for Hospital Admission Data.

    PubMed

    Stausberg, Jürgen; Waldenburger, Andreas; Borgs, Christian; Schnell, Rainer

    2017-01-01

Record linkage (RL) is the process of identifying pairs of records that correspond to the same entity, for example the same patient. The basic approach assigns each pair of records a similarity weight and then determines a threshold above which the two records are considered a match. Three different RL methods were applied under privacy-preserving conditions to hospital admission data: deterministic RL (DRL), probabilistic RL (PRL), and Bloom filters. Patient characteristics such as names were one-way encrypted (DRL, PRL) or transformed into a cryptographic long-term key (Bloom filters). Based on one year of hospital admissions, the data set was split randomly into 30 thousand new and 1.5 million known patients. With the combination of the three RL methods, a positive predictive value of 83 % (95 % confidence interval 65 %-94 %) was attained. Thus, the presented combination of RL methods seems to be suited for other applications of population-based research.
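A common Bloom-filter encoding for privacy-preserving record linkage can be sketched as follows. The filter size, number of hash functions, shared secret, and names are all illustrative assumptions; the paper's exact construction may differ.

```python
import hashlib

def bigrams(name):
    """Padded character 2-grams of a normalized name."""
    s = f"_{name.strip().lower()}_"
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bloom_encode(name, size=1000, k=10, secret=b"shared-secret"):
    """One-way encode a name's bigrams as a set of Bloom filter bit positions.
    Double hashing (SHA-1 + MD5) simulates k keyed hash functions."""
    bits = set()
    for gram in bigrams(name):
        h1 = int.from_bytes(hashlib.sha1(secret + gram.encode()).digest()[:8], "big")
        h2 = int.from_bytes(hashlib.md5(secret + gram.encode()).digest()[:8], "big")
        for i in range(k):
            bits.add((h1 + i * h2) % size)
    return bits

def dice(a, b):
    """Dice coefficient between two encoded records: 2|A&B| / (|A|+|B|)."""
    return 2 * len(a & b) / (len(a) + len(b))

sim_same = dice(bloom_encode("meier"), bloom_encode("meier"))
sim_close = dice(bloom_encode("meier"), bloom_encode("meyer"))
sim_far = dice(bloom_encode("meier"), bloom_encode("schmidt"))
```

Because similar names share bigrams and hence bit positions, the Dice score degrades gracefully with typographical variation, which is what lets a threshold rule classify matches without ever exchanging plaintext names.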

  19. Building a profile of subjective well-being for social media users.

    PubMed

    Chen, Lushi; Gong, Tao; Kosinski, Michal; Stillwell, David; Davidson, Robert L

    2017-01-01

    Subjective well-being includes 'affect' and 'satisfaction with life' (SWL). This study proposes a unified approach to construct a profile of subjective well-being based on social media language in Facebook status updates. We apply sentiment analysis to generate users' affect scores, and train a random forest model to predict SWL using affect scores and other language features of the status updates. Results show that: the computer-selected features resemble the key predictors of SWL as identified in early studies; the machine-predicted SWL is moderately correlated with the self-reported SWL (r = 0.36, p < 0.01), indicating that language-based assessment can constitute valid SWL measures; the machine-assessed affect scores resemble those reported in a previous experimental study; and the machine-predicted subjective well-being profile can also reflect other psychological traits like depression (r = 0.24, p < 0.01). This study provides important insights for psychological prediction using multiple, machine-assessed components and longitudinal or dense psychological assessment using social media language.

  20. Perpendicular magnetic tunnel junction with a strained Mn-based nanolayer

    PubMed Central

    Suzuki, K. Z.; Ranjbar, R.; Okabayashi, J.; Miura, Y.; Sugihara, A.; Tsuchiura, H.; Mizukami, S.

    2016-01-01

    A magnetic tunnel junction with a perpendicular magnetic easy-axis (p-MTJ) is a key device for spintronic non-volatile magnetoresistive random access memory (MRAM). Co-Fe-B alloy-based p-MTJs are being developed, although they have a large magnetisation and medium perpendicular magnetic anisotropy (PMA), which make it difficult to apply them to a future dense MRAM. Here, we demonstrate a p-MTJ with an epitaxially strained MnGa nanolayer grown on a unique CoGa buffer material, which exhibits a large PMA of more than 5 Merg/cm3 and magnetisation below 500 emu/cm3; these properties are sufficient for application to advanced MRAM. Although the experimental tunnel magnetoresistance (TMR) ratio is still low, first principles calculations confirm that the strain-induced crystal lattice distortion modifies the band dispersion along the tetragonal c-axis into the fully spin-polarised state; thus, a huge TMR effect can be generated in this p-MTJ. PMID:27457186

  1. Building a profile of subjective well-being for social media users

    PubMed Central

    Kosinski, Michal; Stillwell, David; Davidson, Robert L.

    2017-01-01

    Subjective well-being includes ‘affect’ and ‘satisfaction with life’ (SWL). This study proposes a unified approach to construct a profile of subjective well-being based on social media language in Facebook status updates. We apply sentiment analysis to generate users’ affect scores, and train a random forest model to predict SWL using affect scores and other language features of the status updates. Results show that: the computer-selected features resemble the key predictors of SWL as identified in early studies; the machine-predicted SWL is moderately correlated with the self-reported SWL (r = 0.36, p < 0.01), indicating that language-based assessment can constitute valid SWL measures; the machine-assessed affect scores resemble those reported in a previous experimental study; and the machine-predicted subjective well-being profile can also reflect other psychological traits like depression (r = 0.24, p < 0.01). This study provides important insights for psychological prediction using multiple, machine-assessed components and longitudinal or dense psychological assessment using social media language. PMID:29135991

  2. A Secure and Robust Object-Based Video Authentication System

    NASA Astrophysics Data System (ADS)

    He, Dajun; Sun, Qibin; Tian, Qi

    2004-12-01

    An object-based video authentication system, which combines watermarking, error correction coding (ECC), and digital signature techniques, is presented for protecting the authenticity between video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART) coefficients is selected as the feature to represent the video object and the background, respectively. ECC and cryptographic hashing are applied to those selected coefficients to generate the robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT) coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling while securely preventing malicious object modifications. The proposed solution can be further incorporated into public key infrastructure (PKI).
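The keyed selection of coefficient groups and the energy-relationship embedding can be illustrated on toy coefficient magnitudes. In the real system these would be DFT coefficients of video frames; the values and key below are arbitrary.

```python
import random

def embed(coeffs, bits, key):
    """Embed watermark bits by enforcing an energy (magnitude) relationship
    between key-selected coefficient pairs."""
    rng = random.Random(key)
    idx = list(range(len(coeffs)))
    rng.shuffle(idx)                     # keyed, secret pairing of coefficients
    out = coeffs[:]
    for n, bit in enumerate(bits):
        i, j = idx[2 * n], idx[2 * n + 1]
        hi, lo = max(out[i], out[j]), min(out[i], out[j])
        # bit 1 -> first of the pair carries more energy; bit 0 -> less
        out[i], out[j] = (hi, lo) if bit else (lo, hi)
    return out

def extract(coeffs, n_bits, key):
    """Recover the bits by regenerating the keyed pairing and comparing."""
    rng = random.Random(key)
    idx = list(range(len(coeffs)))
    rng.shuffle(idx)
    return [1 if coeffs[idx[2 * n]] > coeffs[idx[2 * n + 1]] else 0
            for n in range(n_bits)]

coeffs = [3.1, 7.4, 0.9, 5.5, 2.2, 6.0, 4.3, 1.8]
marked = embed(coeffs, [1, 0, 1, 1], key=42)
recovered = extract(marked, 4, key=42)
```

Only the relative ordering within each pair is altered, so the overall energy is preserved (the watermark is semifragile), and without the key an attacker cannot even locate the carrier pairs.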

  3. One-Shot Coherence Dilution.

    PubMed

    Zhao, Qi; Liu, Yunchao; Yuan, Xiao; Chitambar, Eric; Ma, Xiongfeng

    2018-02-16

Manipulation and quantification of quantum resources are fundamental problems in quantum physics. In the asymptotic limit, coherence distillation and dilution have been proposed by manipulating infinite identical copies of states. In the nonasymptotic setting, finite data-size effects emerge, and the practically relevant problem of coherence manipulation using finite resources has been left open. This Letter establishes the one-shot theory of coherence dilution, which involves converting maximally coherent states into an arbitrary quantum state using maximally incoherent operations, dephasing-covariant incoherent operations, incoherent operations, or strictly incoherent operations. We introduce several coherence monotones with concrete operational interpretations that estimate the one-shot coherence cost, the minimum amount of maximally coherent states needed for faithful coherence dilution. Furthermore, we derive the asymptotic coherence dilution results with maximally incoherent operations, incoherent operations, and strictly incoherent operations as special cases. Our result can be applied in the analyses of quantum information processing tasks that exploit coherence as resources, such as quantum key distribution and random number generation.

  4. One-Shot Coherence Dilution

    NASA Astrophysics Data System (ADS)

    Zhao, Qi; Liu, Yunchao; Yuan, Xiao; Chitambar, Eric; Ma, Xiongfeng

    2018-02-01

    Manipulation and quantification of quantum resources are fundamental problems in quantum physics. In the asymptotic limit, coherence distillation and dilution have been proposed by manipulating infinite identical copies of states. In the nonasymptotic setting, finite data-size effects emerge, and the practically relevant problem of coherence manipulation using finite resources has been left open. This Letter establishes the one-shot theory of coherence dilution, which involves converting maximally coherent states into an arbitrary quantum state using maximally incoherent operations, dephasing-covariant incoherent operations, incoherent operations, or strictly incoherent operations. We introduce several coherence monotones with concrete operational interpretations that estimate the one-shot coherence cost—the minimum amount of maximally coherent states needed for faithful coherence dilution. Furthermore, we derive the asymptotic coherence dilution results with maximally incoherent operations, incoherent operations, and strictly incoherent operations as special cases. Our result can be applied in the analyses of quantum information processing tasks that exploit coherence as resources, such as quantum key distribution and random number generation.

  5. Crossing the Threshold: Bringing Biological Variation to the Foreground

    PubMed Central

    Batzli, Janet M.; Knight, Jennifer K.; Hartley, Laurel M.; Maskiewicz, April Cordero; Desy, Elizabeth A.

    2016-01-01

    Threshold concepts have been referred to as “jewels in the curriculum”: concepts that are key to competency in a discipline but not taught explicitly. In biology, researchers have proposed the idea of threshold concepts that include such topics as variation, randomness, uncertainty, and scale. In this essay, we explore how the notion of threshold concepts can be used alongside other frameworks meant to guide instructional and curricular decisions, and we examine the proposed threshold concept of variation and how it might influence students’ understanding of core concepts in biology focused on genetics and evolution. Using dimensions of scientific inquiry, we outline a schema that may allow students to experience and apply the idea of variation in such a way that it transforms their future understanding and learning of genetics and evolution. We encourage others to consider the idea of threshold concepts alongside the Vision and Change core concepts to provide a lens for targeted instruction and as an integrative bridge between concepts and competencies. PMID:27856553

  6. Gain-of-function mutagenesis approaches in rice for functional genomics and improvement of crop productivity.

    PubMed

    Moin, Mazahar; Bakshi, Achala; Saha, Anusree; Dutta, Mouboni; Kirti, P B

    2017-07-01

The ultimate goal of genome research is to identify all the genes in a genome and investigate their roles. Various techniques have been applied to unveil gene functions, either by silencing or by over-expressing genes through targeted expression or random mutagenesis. Rice is the most appropriate model crop for generating a mutant resource for functional genomic studies because of the availability of a high-quality genome sequence and a relatively small genome size. Rice also has syntenic relationships with other cereals. Hence, characterization of functionally unknown genes in rice will likely provide key genetic insights and can lead to comparative genomics involving other cereals. The current review discusses the available gain-of-function mutagenesis techniques for functional genomics, emphasizing the contemporary approach of activation tagging and its modifications for the enhancement of the yield and productivity of rice. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  7. Modeling stimulus variation in three common implicit attitude tasks.

    PubMed

    Wolsiefer, Katie; Westfall, Jacob; Judd, Charles M

    2017-08-01

We explored the consequences of ignoring the sampling variation due to stimuli in the domain of implicit attitudes. A large literature in psycholinguistics has examined the statistical treatment of random stimulus materials, but the recommendations from this literature have not been applied to the social psychological literature on implicit attitudes. This is partly because of inherent complications in applying crossed random-effect models to some of the most common implicit attitude tasks, and partly because no work to date has demonstrated that random stimulus variation is in fact consequential in implicit attitude measurement. We addressed this problem by laying out statistically appropriate and practically feasible crossed random-effect models for three of the most commonly used implicit attitude measures (the Implicit Association Test, the affect misattribution procedure, and the evaluative priming task) and then applying these models to large datasets (average N = 3,206) that assess participants' implicit attitudes toward race, politics, and self-esteem. We showed that the test statistics from the traditional analyses are substantially (about 60 %) inflated relative to the more-appropriate analyses that incorporate stimulus variation. Because all three tasks used the same stimulus words and faces, we could also meaningfully compare the relative contributions of stimulus variation across the tasks. In an appendix, we give syntax in R, SAS, and SPSS for fitting the recommended crossed random-effects models to data from all three tasks, as well as instructions on how to structure the data file.

  8. Implementing traceability using particle randomness-based textile printed tags

    NASA Astrophysics Data System (ADS)

    Agrawal, T. K.; Koehl, L.; Campagne, C.

    2017-10-01

    This article introduces a random particle-based traceability tag for textiles. The proposed tag not only acts as a unique signature for the corresponding textile product but is also easy to manufacture and hard to copy. It seeks applications in brand authentication and traceability in the textile and clothing (T&C) supply chain. A prototype was developed by a screen printing process, in which micron-scale particles were mixed with the printing paste and printed on cotton fabrics to attain the required randomness. To encode the randomness, an image of the developed tag was taken and analyzed using image processing. The randomness of the particles acts as a product key or unique signature which is required to decode the tag. Finally, washing and abrasion resistance tests were conducted to check the durability of the printed tag.
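
    As a rough illustration of the encoding step described above, the sketch below (hypothetical, not the authors' implementation) thresholds a synthetic tag image, quantizes the particle positions to a coarse grid, and hashes them into a signature; the threshold, grid size, and SHA-256 digest are all illustrative assumptions.

```python
import hashlib
import numpy as np

def tag_signature(image: np.ndarray, threshold: int = 128, grid: int = 8) -> str:
    """Derive a signature from particle positions in a printed-tag image.

    Bright pixels above `threshold` are taken as particle material; their
    coordinates are quantized to a coarse grid so that small imaging
    misalignments do not change the code, then hashed into a hex digest.
    """
    ys, xs = np.nonzero(image > threshold)
    cells = sorted(set(zip(ys // grid, xs // grid)))   # occupied grid cells
    payload = ",".join(f"{y}:{x}" for y, x in cells).encode()
    return hashlib.sha256(payload).hexdigest()

# Synthetic tag: a 64x64 image with particles at random positions.
rng = np.random.default_rng(seed=7)
img = np.zeros((64, 64), dtype=np.uint8)
pts = rng.integers(0, 64, size=(30, 2))
img[pts[:, 0], pts[:, 1]] = 255

sig = tag_signature(img)
# The same particle layout always yields the same signature...
assert sig == tag_signature(img)
# ...while a different random layout yields a different one.
img2 = np.zeros_like(img)
pts2 = rng.integers(0, 64, size=(30, 2))
img2[pts2[:, 0], pts2[:, 1]] = 255
assert sig != tag_signature(img2)
```

    The quantization step stands in for the registration tolerance a real reader would need; a production system would also handle rotation and illumination.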

  9. Shaping the spectrum of random-phase radar waveforms

    DOEpatents

    Doerry, Armin W.; Marquette, Brandeis

    2017-05-09

    The various technologies presented herein relate to generation of a desired waveform profile in the form of a spectrum of apparently random noise (e.g., white noise or colored noise), but with precise spectral characteristics. Hence, a waveform profile that could otherwise be readily determined (e.g., by a spoofing system) is effectively obscured. Obscuration is achieved by dividing the waveform into a series of chips, each with an assigned frequency, wherein the sequence of chips is subsequently randomized. Randomization can be a function of the application of a key to the chip sequence. During processing of the echo pulse, a copy of the randomized transmitted pulse is recovered or regenerated against which the received echo is correlated. Hence, with the echo energy range-compressed in this manner, it is possible to generate a radar image with precise impulse response.
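
    A minimal numerical sketch of the chip-randomization idea, under assumed parameters (16 complex-tone chips, a NumPy generator seeded by the key): both ends regenerate the same shuffled chip sequence from the shared key, and correlating the echo against it recovers the range delay.

```python
import numpy as np

def make_pulse(key: int, n_chips: int = 16, chip_len: int = 32) -> np.ndarray:
    """Build a pulse of frequency chips whose order is randomized by `key`."""
    freqs = np.arange(1, n_chips + 1)        # one assigned frequency per chip
    rng = np.random.default_rng(key)         # keyed randomization of chip order
    rng.shuffle(freqs)
    t = np.arange(chip_len) / chip_len
    chips = [np.exp(2j * np.pi * f * t) for f in freqs]
    return np.concatenate(chips)

key = 0xC0FFEE
tx = make_pulse(key)

# Echo: the transmitted pulse delayed by 100 samples in a longer receive window.
echo = np.zeros(1024, dtype=complex)
delay = 100
echo[delay:delay + tx.size] = tx

# Receiver regenerates the pulse from the shared key and correlates (matched filter).
ref = make_pulse(key)
corr = np.abs(np.correlate(echo, ref, mode="valid"))
assert int(np.argmax(corr)) == delay
```

    Without the key, an eavesdropper cannot regenerate the chip order and therefore cannot range-compress the echo in the same way.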

  10. On the security of Y-00 under fast correlation and other attacks on the key

    NASA Astrophysics Data System (ADS)

    Yuen, Horace P.; Nair, Ranjith

    2007-04-01

    The security of the Y-00 direct encryption protocol under correlation attack is addressed. A Y-00 configuration that is more secure than AES under known-plaintext attack is presented. It is shown that under any ciphertext-only attack, full information-theoretic security on the Y-00 seed key is obtained for any encryption box ENC with proper deliberate signal randomization.

  11. Ensemble of Chaotic and Naive Approaches for Performance Enhancement in Video Encryption.

    PubMed

    Chandrasekaran, Jeyamala; Thiruvengadam, S J

    2015-01-01

    Owing to the growth of high performance network technologies, multimedia applications over the Internet are increasing exponentially. Applications like video conferencing, video-on-demand, and pay-per-view depend upon encryption algorithms for providing confidentiality. Video communication is characterized by distinct features such as large volume, high redundancy between adjacent frames, video codec compliance, syntax compliance, and application specific requirements. Naive approaches for video encryption encrypt the entire video stream with conventional text based cryptographic algorithms. Although naive approaches are the most secure for video encryption, the computational cost associated with them is very high. This research work aims at enhancing the speed of naive approaches through chaos based S-box design. Chaotic equations are popularly known for randomness, extreme sensitivity to initial conditions, and ergodicity. The proposed methodology employs two-dimensional discrete Henon map for (i) generation of dynamic and key-dependent S-box that could be integrated with symmetric algorithms like Blowfish and Data Encryption Standard (DES) and (ii) generation of one-time keys for simple substitution ciphers. The proposed design is tested for randomness, nonlinearity, avalanche effect, bit independence criterion, and key sensitivity. Experimental results confirm that chaos based S-box design and key generation significantly reduce the computational cost of video encryption with no compromise in security.
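
    The key-dependent S-box construction can be sketched as follows; the burn-in length, the ranking-based table construction, and the initial conditions (playing the role of the key) are illustrative assumptions rather than the authors' exact design.

```python
import numpy as np

def henon_sbox(x0: float, y0: float, a: float = 1.4, b: float = 0.3) -> np.ndarray:
    """Key-dependent 8-bit S-box from the 2-D Henon map.

    The initial condition (x0, y0) plays the role of the key. After a
    burn-in, 256 chaotic values are ranked; the ranking is a permutation
    of 0..255 and serves as the substitution table.
    """
    x, y = x0, y0
    for _ in range(1000):                    # burn-in to reach the attractor
        x, y = 1 - a * x * x + y, b * x
    vals = np.empty(256)
    for i in range(256):
        x, y = 1 - a * x * x + y, b * x
        vals[i] = x
    return np.argsort(vals)                  # permutation of 0..255

sbox = henon_sbox(0.1, 0.1)
assert sorted(sbox) == list(range(256))      # bijective, hence invertible
# A tiny key change gives a very different table (key sensitivity).
sbox2 = henon_sbox(0.1 + 1e-9, 0.1)
assert np.mean(sbox == sbox2) < 0.1
```

    The extreme sensitivity to initial conditions mentioned in the abstract is exactly what makes the two tables nearly unrelated despite keys differing by one part in a billion.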

  12. Ensemble of Chaotic and Naive Approaches for Performance Enhancement in Video Encryption

    PubMed Central

    Chandrasekaran, Jeyamala; Thiruvengadam, S. J.

    2015-01-01

    Owing to the growth of high performance network technologies, multimedia applications over the Internet are increasing exponentially. Applications like video conferencing, video-on-demand, and pay-per-view depend upon encryption algorithms for providing confidentiality. Video communication is characterized by distinct features such as large volume, high redundancy between adjacent frames, video codec compliance, syntax compliance, and application specific requirements. Naive approaches for video encryption encrypt the entire video stream with conventional text based cryptographic algorithms. Although naive approaches are the most secure for video encryption, the computational cost associated with them is very high. This research work aims at enhancing the speed of naive approaches through chaos based S-box design. Chaotic equations are popularly known for randomness, extreme sensitivity to initial conditions, and ergodicity. The proposed methodology employs two-dimensional discrete Henon map for (i) generation of dynamic and key-dependent S-box that could be integrated with symmetric algorithms like Blowfish and Data Encryption Standard (DES) and (ii) generation of one-time keys for simple substitution ciphers. The proposed design is tested for randomness, nonlinearity, avalanche effect, bit independence criterion, and key sensitivity. Experimental results confirm that chaos based S-box design and key generation significantly reduce the computational cost of video encryption with no compromise in security. PMID:26550603

  13. Properties of behavior under different random ratio and random interval schedules: A parametric study.

    PubMed

    Dembo, M; De Penfold, J B; Ruiz, R; Casalta, H

    1985-03-01

    Four pigeons were trained to peck a key under different values of a temporally defined independent variable (T) and different probabilities of reinforcement (p). Parameter T is a fixed repeating time cycle and p the probability of reinforcement for the first response of each cycle T. Two dependent variables were used: mean response rate and mean postreinforcement pause. For all values of p a critical value of the independent variable T was found (T=1 sec) at which marked changes took place in response rate and postreinforcement pauses. Behavior typical of random ratio schedules was obtained at T < 1 sec and behavior typical of random interval schedules at T > 1 sec. Copyright © 1985. Published by Elsevier B.V.
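
    The schedule can be simulated to show why T separates ratio-like from interval-like behavior; the Poisson response model and the parameter values below are illustrative assumptions, not the experimental procedure.

```python
import numpy as np

def reinforcers_per_min(resp_rate: float, T: float, p: float,
                        minutes: float = 1000.0, seed: int = 0) -> float:
    """Simulate the T-cycle schedule: the first response falling inside a
    repeating cycle of length T seconds is reinforced with probability p.
    Responding is modeled as a Poisson process at resp_rate per second."""
    rng = np.random.default_rng(seed)
    n_cycles = int(minutes * 60 / T)
    p_resp = 1.0 - np.exp(-resp_rate * T)   # P(at least one response in a cycle)
    responded = rng.random(n_cycles) < p_resp
    reinforced = responded & (rng.random(n_cycles) < p)
    return reinforced.sum() / minutes

# Short cycles: payoff grows with response rate, as on a random ratio schedule.
short_T = [reinforcers_per_min(r, T=0.5, p=0.1) for r in (0.5, 2.0)]
# Long cycles: payoff saturates near 60/T per minute, as on a random interval schedule.
long_T = [reinforcers_per_min(r, T=30.0, p=1.0) for r in (0.5, 2.0)]
assert short_T[1] > 2 * short_T[0]
assert long_T[1] < 1.2 * long_T[0]
```
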

  14. Sensitivity of Polar Stratospheric Ozone Loss to Uncertainties in Chemical Reaction Kinetics

    NASA Technical Reports Server (NTRS)

    Kawa, S. Randolph; Stolarksi, Richard S.; Douglass, Anne R.; Newman, Paul A.

    2008-01-01

    Several recent observational and laboratory studies of processes involved in polar stratospheric ozone loss have prompted a reexamination of aspects of our understanding of this key indicator of global change. To a large extent, our confidence in understanding and projecting changes in polar and global ozone is based on our ability to simulate these processes in numerical models of chemistry and transport. The fidelity of the models is assessed in comparison with a wide range of observations. These models depend on laboratory-measured kinetic reaction rates and photolysis cross sections to simulate molecular interactions. A typical stratospheric chemistry mechanism has on the order of 50-100 species undergoing over a hundred intermolecular reactions and several tens of photolysis reactions. The rates of all of these reactions are subject to uncertainty, some substantial. Given the complexity of the models, however, it is difficult to quantify uncertainties in many aspects of the system. In this study we use a simple box-model scenario for Antarctic ozone to estimate the uncertainty in loss attributable to known reaction kinetic uncertainties. Following the method of earlier work, rates and uncertainties from the latest laboratory evaluations are applied in random combinations. We determine the key reactions and rates contributing the largest potential errors and compare the results to observations to evaluate which combinations are consistent with atmospheric data. Implications for our theoretical and practical understanding of polar ozone loss will be assessed.
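
    The random-combination method can be sketched with a toy two-reaction loss rate; the nominal rates, uncertainty factors, and the rate-limiting combination below are hypothetical stand-ins for the full mechanism of ~100 reactions.

```python
import numpy as np

# Hypothetical toy: ozone loss proportional to k1*k2/(k1+k2) (a rate-limiting
# combination); f1, f2 are assumed laboratory uncertainty factors.
rng = np.random.default_rng(1)
k1_0, f1 = 1e-11, 1.3          # nominal rate and uncertainty factor (assumed)
k2_0, f2 = 5e-12, 2.0

n = 10_000
# Lognormal sampling: multiply each nominal rate by f**z with z ~ N(0, 1),
# the usual way to apply "factor of f" kinetic uncertainties in random combinations.
k1 = k1_0 * f1 ** rng.standard_normal(n)
k2 = k2_0 * f2 ** rng.standard_normal(n)
loss = k1 * k2 / (k1 + k2)

nominal = k1_0 * k2_0 / (k1_0 + k2_0)
spread = np.percentile(loss, 97.5) / np.percentile(loss, 2.5)
assert spread > 2   # the larger k2 uncertainty dominates the loss spread
```

    Comparing which sampled combinations remain consistent with observed loss then narrows the admissible rate sets, as the abstract describes.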

  15. Exponential gain of randomness certified by quantum contextuality

    NASA Astrophysics Data System (ADS)

    Um, Mark; Zhang, Junhua; Wang, Ye; Wang, Pengfei; Kim, Kihwan

    2017-04-01

    We demonstrate a protocol for exponential gain of randomness certified by quantum contextuality in a trapped-ion system. Genuine randomness can be produced by quantum principles and certified by quantum inequalities. Recently, randomness expansion protocols based on Bell-type inequalities and the Kochen-Specker (KS) theorem have been demonstrated. These schemes have been theoretically innovated to exponentially expand the randomness and to amplify the randomness from a weak initial random seed. Here, we report experimental evidence of such exponential expansion of randomness. In the experiment, we use three states of a 138Ba+ ion: a ground state and two quadrupole states. In the 138Ba+ ion system there is no detection loophole, and we apply a method to rule out certain hidden variable models that obey a kind of extended noncontextuality.

  16. Electromagnetic Scattering by Fully Ordered and Quasi-Random Rigid Particulate Samples

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Dlugach, Janna M.; Mackowski, Daniel W.

    2016-01-01

    In this paper we have analyzed circumstances under which a rigid particulate sample can behave optically as a true discrete random medium consisting of particles randomly moving relative to each other during measurement. To this end, we applied the numerically exact superposition T-matrix method to model far-field scattering characteristics of fully ordered and quasi-randomly arranged rigid multiparticle groups in fixed and random orientations. We have shown that, in and of itself, averaging optical observables over movements of a rigid sample as a whole is insufficient unless it is combined with a quasi-random arrangement of the constituent particles in the sample. Otherwise, certain scattering effects typical of discrete random media (including some manifestations of coherent backscattering) may not be accurately replicated.

  17. Semiquantum key distribution with secure delegated quantum computation

    PubMed Central

    Li, Qin; Chan, Wai Hong; Zhang, Shengyu

    2016-01-01

    Semiquantum key distribution allows a quantum party to share a random key with a “classical” party who can only prepare and measure qubits in the computational basis or reorder some qubits when he has access to a quantum channel. In this work, we present a protocol where a secret key can be established between a quantum user and an almost classical user who only needs the quantum ability to access quantum channels, by securely delegating quantum computation to a quantum server. We show the proposed protocol is robust even when the delegated quantum server is a powerful adversary, and is experimentally feasible with current technology. As one party of our protocol is the most quantum-resource efficient, the protocol can be more practical and significantly widen the applicability scope of quantum key distribution. PMID:26813384

  18. A Hybrid Key Management Scheme for WSNs Based on PPBR and a Tree-Based Path Key Establishment Method

    PubMed Central

    Zhang, Ying; Liang, Jixing; Zheng, Bingxin; Chen, Wei

    2016-01-01

    With the development of wireless sensor networks (WSNs), in most application scenarios traditional WSNs with static sink nodes will gradually be replaced by Mobile Sinks (MSs), and the corresponding applications require a secure communication environment. Current key management research pays little attention to the security of sensor networks with MSs. This paper proposes a hybrid key management scheme based on Polynomial Pool-based and Basic Random key pre-distribution (PPBR) for use in WSNs with MSs. The scheme takes full advantage of these two kinds of methods to increase the difficulty of cracking the key system. The storage effectiveness and the network resilience can be significantly enhanced as well. The tree-based path key establishment method is introduced to effectively solve the problem of communication link connectivity. Simulation clearly shows that the proposed scheme performs better in terms of network resilience, connectivity and storage effectiveness compared to other widely used schemes. PMID:27070624
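
    For a single polynomial, polynomial pool pre-distribution reduces to the classic symmetric bivariate polynomial scheme; the sketch below (field modulus, degree, and node ids are illustrative assumptions) shows how two nodes derive the same pairwise key from their stored shares without any exchange.

```python
import random

P = 2_147_483_647                     # prime field modulus (assumed)
t = 3                                 # degree: t colluding nodes learn nothing
rng = random.Random(42)

# Symmetric coefficient matrix C[i][j] == C[j][i] defines
# f(x, y) = sum_{i,j} C[i][j] * x**i * y**j  (mod P).
A = [[rng.randrange(P) for _ in range(t + 1)] for _ in range(t + 1)]
C = [[(A[i][j] + A[j][i]) % P for j in range(t + 1)] for i in range(t + 1)]

def share(node_id: int) -> list:
    """Coefficients of the univariate share g(y) = f(node_id, y) stored by a node."""
    return [sum(pow(node_id, i, P) * C[i][j] for i in range(t + 1)) % P
            for j in range(t + 1)]

def pair_key(my_share: list, other_id: int) -> int:
    """Evaluate my share at the other node's id to get the pairwise key."""
    return sum(c * pow(other_id, j, P) for j, c in enumerate(my_share)) % P

s17, s99 = share(17), share(99)
# Symmetry of f gives both ends the same key: f(17, 99) == f(99, 17).
assert pair_key(s17, 99) == pair_key(s99, 17)
```

    The "pool" in the scheme draws several such polynomials per node, which is what raises the collusion threshold beyond a single polynomial's degree.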

  19. Investigating Information Dynamics in Living Systems through the Structure and Function of Enzymes.

    PubMed

    Gatenby, Robert; Frieden, B Roy

    2016-01-01

    Enzymes are proteins that accelerate intracellular chemical reactions, often by factors of 10^5-10^12 s^-1. We propose that the structure and function of enzymes represent the thermodynamic expression of heritable information encoded in DNA, with post-translational modifications that reflect intra- and extra-cellular environmental inputs. The three-dimensional shape of the protein, determined by the genetically-specified amino acid sequence and post-translational modifications, permits geometric interactions with substrate molecules traditionally described by the key-lock best-fit model. Here we apply Kullback-Leibler (K-L) divergence as a metric of this geometric "fit" and the information content of the interactions. When the K-L 'distance' between interspersed substrate pn and enzyme rn positions is minimized, the information state, reaction probability, and reaction rate are maximized. The latter obeys the Arrhenius equation, which we show can be derived from the geometrical principle of minimum K-L distance. The derivation is first limited to optimum substrate positions for fixed sets of enzyme positions. However, maximally improving the key/lock fit, called 'induced fit,' requires both sets of positions to be varied optimally. We demonstrate this is permitted and maximally efficient if the key and lock particles pn, rn are quantum entangled, because the level of entanglement obeys the same minimized value of the Kullback-Leibler distance that occurs when all pn ≈ rn. This implies that interchanges pn ⇄ rn randomly taking place during a reaction successively improve key/lock fits, reducing the activation energy Ea and increasing the reaction rate k. Our results demonstrate that the summation of heritable and environmental information that determines the enzyme spatial configuration, by decreasing the K-L divergence, is converted to thermodynamic work by reducing Ea and increasing k of intracellular reactions.
Macroscopically, enzyme information increases the order in living systems, similar to the Maxwell demon gedanken, by selectively accelerating specific reactions, thus generating both spatial and temporal concentration gradients.
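
    The K-L 'distance' used here can be illustrated numerically; the four-site contact distributions below are hypothetical, chosen only to show that the divergence shrinks as the enzyme distribution approaches the substrate distribution (the 'induced fit' direction).

```python
import numpy as np

def kl(p: np.ndarray, q: np.ndarray) -> float:
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical substrate (pn) and enzyme (rn) contact-site distributions.
p = np.array([0.1, 0.2, 0.4, 0.3])
r_poor = np.array([0.4, 0.3, 0.2, 0.1])     # bad geometric fit
r_good = np.array([0.12, 0.2, 0.38, 0.3])   # induced fit: rn approaching pn
uniform = np.full(4, 0.25)

# Better fits give smaller divergence; a perfect fit gives zero.
assert kl(p, r_good) < kl(p, uniform) < kl(p, r_poor)
assert kl(p, p) == 0.0
```
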

  20. [Encryption technique for linkable anonymizing].

    PubMed

    Okamoto, Etsuji

    2004-06-01

    Linkage of different records such as health insurance claims or medical records for the purpose of cohort studies or cancer registration usually requires matching with personal names and other personally identifiable data. The present study was conducted to examine the possibility of performing such privacy-sensitive procedures in a "linkable anonymizing" manner using encryption. While bidirectional communication entails encryption and deciphering, necessitating both senders and receivers sharing a common secret "key", record linkage entails only encryption and not deciphering because researchers do not need to know the identity of the linked person. This unidirectional nature relieves researchers from the historical problem of "key sharing" and enables data holders such as municipal governments and insurers to encrypt personal names in a relatively easy manner. The author demonstrates an encryption technique using readily available spread-sheet software, Microsoft Excel in a step-by-step fashion. Encoding Chinese characters into the numeric JIS codes and replacing the codes with a randomly assigned case-sensitive alphabet, all names of Japanese nationals will be encrypted into gibberish strings of alphabet, which can not be deciphered without the secret key. Data holders are able to release personal data without sacrificing privacy, even when accidental leakage occurs and researchers are still able to link records of the same name because encrypted texts, although gibberish, are unique to each name. Such a technical assurance of privacy protection is expected to satisfy the Privacy Protection Act or the Ethical Guidelines for Epidemiological Research and enhance public health research. Traditional encryption techniques, however, cannot be applied to cancer or stroke registration, because the registrar receives reports from numerous unspecified senders. The new public key encryption technique will enable disease registry in a linkable anonymizing manner. 
However, various technical problems, such as complexity, difficulties with registrar inquiries, and the risk of code-breaking, make the encryption technique unsuitable for disease registry in the foreseeable future.
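
    The one-way, keyed property the author exploits is what a modern keyed hash (HMAC) provides directly; the sketch below is a present-day equivalent of the spreadsheet substitution method, not the paper's own procedure, and the key string is of course hypothetical.

```python
import hashlib
import hmac

SECRET_KEY = b"held-by-the-data-holder-only"   # never shared with researchers

def pseudonym(name: str) -> str:
    """Deterministic keyed pseudonym: the same name always yields the same
    token, so records link, but the token cannot be inverted (deciphered)
    without the secret key."""
    return hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256).hexdigest()

# Records of the same person link across data sets...
assert pseudonym("山田太郎") == pseudonym("山田太郎")
# ...while different names map to different, unlinkable tokens.
assert pseudonym("山田太郎") != pseudonym("田中花子")
```

    As in the paper, only encryption is ever performed: researchers can link tokens without the data holder's key, and no deciphering step exists.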

  1. Kevlar: Transitioning Helix from Research to Practice

    DTIC Science & Technology

    2015-04-01

    Protective transformations are applied to application binaries before they are deployed. Salient features of Kevlar include applying high-entropy randomization techniques, automated program repairs, and leveraging highly-optimized virtual machine technology. Kevlar uses novel, fine-grained, high-entropy diversification transformations to prevent an attacker from successfully exploiting ... variety of classes.

  2. Experimentally Generated Random Numbers Certified by the Impossibility of Superluminal Signaling

    NASA Astrophysics Data System (ADS)

    Bierhorst, Peter; Shalm, Lynden K.; Mink, Alan; Jordan, Stephen; Liu, Yi-Kai; Rommal, Andrea; Glancy, Scott; Christensen, Bradley; Nam, Sae Woo; Knill, Emanuel

    Random numbers are an important resource for applications such as numerical simulation and secure communication. However, it is difficult to certify whether a physical random number generator is truly unpredictable. Here, we exploit the phenomenon of quantum nonlocality in a loophole-free photonic Bell test experiment to obtain data containing randomness that cannot be predicted by any theory that does not also allow the sending of signals faster than the speed of light. To certify and quantify the randomness, we develop a new protocol that performs well in an experimental regime characterized by low violation of Bell inequalities. Applying an extractor function to our data, we obtain 256 new random bits, uniform to within 10^-3.
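
    A Toeplitz-hashing extractor is a standard choice for the final extraction step; the sketch below (block sizes are arbitrary, and the raw input here is only pseudo-random for demonstration) shows the GF(2) matrix multiplication that compresses raw bits into nearly uniform output bits.

```python
import numpy as np

def toeplitz_extract(raw_bits: np.ndarray, seed_bits: np.ndarray, m: int) -> np.ndarray:
    """Extract m near-uniform bits from n raw bits using a Toeplitz matrix
    built from a short uniform seed of n + m - 1 bits (arithmetic over GF(2))."""
    n = raw_bits.size
    assert seed_bits.size == n + m - 1
    # T[i, j] = seed[i + (n - 1 - j)]: constant along diagonals (Toeplitz).
    idx = np.arange(m)[:, None] + (n - 1 - np.arange(n))[None, :]
    T = seed_bits[idx]
    return (T @ raw_bits) % 2

rng = np.random.default_rng(0)
raw = rng.integers(0, 2, size=1024)             # stand-in for partially random data
seed = rng.integers(0, 2, size=1024 + 256 - 1)  # short uniform seed
out = toeplitz_extract(raw, seed, m=256)
assert out.size == 256
```

    The output length m is chosen below the certified min-entropy of the raw data, which is what the experiment's certification step quantifies.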

  3. Random pulse generator

    NASA Technical Reports Server (NTRS)

    Lindsey, R. S., Jr. (Inventor)

    1975-01-01

    An exemplary embodiment of the present invention provides a source of random width and random spaced rectangular voltage pulses whose mean or average frequency of operation is controllable within prescribed limits of about 10 hertz to 1 megahertz. A pair of thin-film metal resistors are used to provide a differential white noise voltage pulse source. Pulse shaping and amplification circuitry provide relatively short duration pulses of constant amplitude which are applied to anti-bounce logic circuitry to prevent ringing effects. The pulse outputs from the anti-bounce circuits are then used to control two one-shot multivibrators whose output comprises the random length and random spaced rectangular pulses. Means are provided for monitoring, calibrating and evaluating the relative randomness of the generator.

  4. Denoising forced-choice detection data.

    PubMed

    García-Pérez, Miguel A

    2010-02-01

    Observers in a two-alternative forced-choice (2AFC) detection task face the need to produce a response at random (a guess) on trials in which neither presentation appeared to display a stimulus. Observers could alternatively be instructed to use a 'guess' key on those trials, a key that would produce a random guess and would also record the resultant correct or wrong response as emanating from a computer-generated guess. A simulation study shows that 'denoising' 2AFC data with information regarding which responses are a result of guesses yields estimates of detection threshold and spread of the psychometric function that are far more precise than those obtained in the absence of this information, and parallel the precision of estimates obtained with yes-no tasks running for the same number of trials. Simulations also show that partial compliance with the instructions to use the 'guess' key reduces the quality of the estimates, which nevertheless continue to be more precise than those obtained from conventional 2AFC data if the observers are still moderately compliant. An empirical study testing the validity of simulation results showed that denoised 2AFC estimates of spread were clearly superior to conventional 2AFC estimates and similar to yes-no estimates, but variations in threshold across observers and across sessions hid the benefits of denoising for threshold estimation. The empirical study also proved the feasibility of using a 'guess' key in addition to the conventional response keys defined in 2AFC tasks.
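
    The benefit of the 'guess' key can be seen in a few lines of simulation; the detection probability, trial count, and full compliance below are illustrative assumptions.

```python
import random

random.seed(3)
d_true = 0.6            # probability the stimulus is genuinely detected (assumed)
n = 20_000

correct = guesses = 0
for _ in range(n):
    if random.random() < d_true:       # detected: the 2AFC response is correct
        correct += 1
    else:                              # nothing seen: observer presses the 'guess' key
        guesses += 1
        if random.random() < 0.5:      # computer-generated coin-flip guess
            correct += 1

# Conventional 2AFC analysis mixes the guesses into proportion correct:
p_conventional = correct / n           # approx. d + (1 - d)/2
# Denoised analysis discards flagged guesses and recovers the detection rate:
p_denoised = 1 - guesses / n

assert abs(p_denoised - d_true) < 0.02
assert abs(p_conventional - (d_true + (1 - d_true) / 2)) < 0.02
```

    Partial compliance would move some guesses back into the unflagged pool, shrinking (but, as the simulations show, not eliminating) the precision advantage.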

  5. Covert Network Analysis for Key Player Detection and Event Prediction Using a Hybrid Classifier

    PubMed Central

    Akram, M. Usman; Khan, Shoab A.; Javed, Muhammad Younus

    2014-01-01

    National security has gained vital importance due to the increasing number of suspicious and terrorist events across the globe. The use of different subfields of information technology has also attracted much attention from researchers and practitioners seeking to design systems which can detect the main members actually responsible for such events. In this paper, we present a novel method to predict key players in a covert network by applying a hybrid framework. The proposed system calculates certain centrality measures for each node in the network and then applies a novel hybrid classifier for detection of key players. Our system also applies anomaly detection to predict any terrorist activity in order to help law enforcement agencies destabilize the involved network. As a proof of concept, the proposed framework has been implemented and tested using different case studies including two publicly available datasets and one local network. PMID:25136674
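
    On a hypothetical toy network, two of the centrality measures such a system might compute (degree and closeness) already single out the bridging member; the real system combines several measures through a hybrid classifier rather than the simple maximum used in this sketch.

```python
from collections import deque

# Toy covert network as an adjacency list (hypothetical member ids).
G = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4, 5],
     4: [3, 5], 5: [3, 4, 6], 6: [5]}

def closeness(g: dict, s: int) -> float:
    """Closeness centrality of node s via breadth-first-search distances."""
    dist, q = {s: 0}, deque([s])
    while q:
        u = q.popleft()
        for v in g[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return (len(g) - 1) / sum(dist.values())

scores = {v: (len(G[v]), closeness(G, v)) for v in G}   # (degree, closeness)
key_player = max(scores, key=lambda v: scores[v])
assert key_player == 3      # the bridge node scores highest on both measures
```
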

  6. Evaluation of the Efficiency of the Nursing Care Plan Applied Using NANDA, NOC, and NIC Linkages to Elderly Women with Incontinence Living in a Nursing Home: A Randomized Controlled Study.

    PubMed

    Gencbas, Dercan; Bebis, Hatice; Cicek, Hatice

    2017-05-30

    Evaluate the efficiency of the nursing care plan, applied with the use of NANDA-I, NOC, and NIC (NNN) linkages, for elderly women with incontinence who live in nursing homes. A randomized controlled experimental design was applied. NNN linkages were prepared and applied for 12 weeks in an experimental group. NOC scales were then re-evaluated in both groups. The targeted change of 0.5 NOC points between pretest and posttest scores was achieved for all elderly women in the experimental group. The experimental group had higher quality of life and lower incontinence severity/symptoms than the control group. It is important that NNN linkages shown effective in solving these problems be used in different groups and with larger samples to create further evidence linking NNN. © 2017 NANDA International, Inc.

  7. Encryption key distribution via chaos synchronization

    NASA Astrophysics Data System (ADS)

    Keuninckx, Lars; Soriano, Miguel C.; Fischer, Ingo; Mirasso, Claudio R.; Nguimdo, Romain M.; van der Sande, Guy

    2017-02-01

    We present a novel encryption scheme, wherein an encryption key is generated by two distant complex nonlinear units, forced into synchronization by a chaotic driver. The concept is sufficiently generic to be implemented on either photonic, optoelectronic or electronic platforms. The method for generating the key bitstream from the chaotic signals is reconfigurable. Although derived from a deterministic process, the obtained bit series fulfill the randomness conditions as defined by the National Institute of Standards test suite. We demonstrate the feasibility of our concept on an electronic delay oscillator circuit and test the robustness against attacks using a state-of-the-art system identification method.
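
    The principle can be sketched with logistic maps: two units forced by a common chaotic drive converge to the same trajectory regardless of their private initial states, so thresholding yields identical key bits at both ends. The maps, coupling strength, and threshold below are illustrative stand-ins for the photonic or electronic oscillators.

```python
def logistic(u: float) -> float:
    return 4.0 * u * (1.0 - u)

def response(x0: float, drive: list, c: float = 0.9) -> list:
    """A nonlinear unit driven by the common chaotic signal. For strong
    coupling c the unit forgets its initial condition and follows a
    trajectory determined only by the drive (synchronization)."""
    x, out = x0, []
    for d in drive:
        x = (1.0 - c) * logistic(x) + c * d
        out.append(x)
    return out

# Common chaotic driver, broadcast to both distant units.
d, drive = 0.3, []
for _ in range(1200):
    d = logistic(d)
    drive.append(d)

a = response(0.111, drive)      # Alice's unit, private initial state
b = response(0.777, drive)      # Bob's unit, different initial state

# After a transient both trajectories agree, so thresholding gives the same bits.
key_a = [int(x > 0.5) for x in a[200:]]
key_b = [int(x > 0.5) for x in b[200:]]
assert key_a == key_b
assert 0.3 < sum(key_a) / len(key_a) < 0.7   # bits are roughly balanced
```

    Crucially, an eavesdropper seeing only the drive cannot reproduce the bits without a matching response unit, which is what the identification-attack tests in the paper probe.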

  8. Study on Multi-stage Logistics System Design Problem with Inventory Considering Demand Change by Hybrid Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Inoue, Hisaki; Gen, Mitsuo

    The logistics model used in this study is a 3-stage model employed by an automobile company, which aims to solve traffic problems at minimum total cost. Recently, research on metaheuristic methods has advanced as an approximate means of solving optimization problems like this model. Such problems can be solved using various methods, such as the genetic algorithm (GA), simulated annealing, and tabu search. GA is superior in robustness and adjustability toward changes in the structure of these problems. However, GA has a disadvantage in that its search performance is somewhat inefficient because it carries out a multi-point search. A hybrid GA that incorporates another method is attracting considerable attention, since it can compensate for the fault whereby early convergence to a partial solution degrades the final result. In this study, we propose a novel hybrid random key-based GA (h-rkGA) that combines local search with parameter tuning of the crossover and mutation rates; h-rkGA is an improved version of the random key-based GA (rk-GA). We carried out comparative experiments with spanning tree-based GA, priority-based GA and random key-based GA, as well as with “h-GA by only local search” and “h-GA by only parameter tuning”. We report the effectiveness of the proposed method on the basis of the results of these experiments.
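
    The random-key encoding at the heart of rk-GA (and hence h-rkGA) can be sketched briefly: chromosomes are vectors of real-valued keys, decoding sorts the keys into a permutation, and ordinary crossover therefore always yields feasible offspring. The uniform crossover and problem size below are illustrative.

```python
import random

def decode(chromosome: list) -> list:
    """Random-key decoding: sorting the keys turns any real vector into a
    valid permutation, so crossover and mutation can never produce an
    infeasible tour or assignment."""
    return sorted(range(len(chromosome)), key=lambda i: chromosome[i])

def crossover(p1: list, p2: list, rng: random.Random) -> list:
    """Uniform crossover applied directly to the keys."""
    return [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]

rng = random.Random(7)
parent1 = [rng.random() for _ in range(8)]
parent2 = [rng.random() for _ in range(8)]
child = crossover(parent1, parent2, rng)

# Every decoded individual is a permutation of 0..7: feasibility for free.
for chrom in (parent1, parent2, child):
    assert sorted(decode(chrom)) == list(range(8))
```

    This built-in feasibility is what makes the encoding robust to structural changes in the logistics model, and it combines cleanly with the local search and parameter tuning that h-rkGA adds.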

  9. Quantum cryptographic system with reduced data loss

    DOEpatents

    Lo, H.K.; Chau, H.F.

    1998-03-24

    A secure method for distributing a random cryptographic key with reduced data loss is disclosed. Traditional quantum key distribution systems employ similar probabilities for the different communication modes and thus reject at least half of the transmitted data. The invention substantially reduces the amount of discarded data (those that are encoded and decoded in different communication modes e.g. using different operators) in quantum key distribution without compromising security by using significantly different probabilities for the different communication modes. Data is separated into various sets according to the actual operators used in the encoding and decoding process and the error rate for each set is determined individually. The invention increases the key distribution rate of the BB84 key distribution scheme proposed by Bennett and Brassard in 1984. Using the invention, the key distribution rate increases with the number of quantum signals transmitted and can be doubled asymptotically. 23 figs.
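
    The efficiency gain from significantly different mode probabilities can be checked with a short simulation; the bias value below is illustrative (the invention's asymptotic doubling of the key rate corresponds to the bias approaching 1, with error rates then estimated separately for each matched-mode set).

```python
import random

def sift_fraction(p_rect: float, n: int = 200_000, seed: int = 5) -> float:
    """Fraction of signals surviving sifting when sender and receiver each
    pick the rectilinear basis with probability p_rect (else diagonal)."""
    rng = random.Random(seed)
    kept = 0
    for _ in range(n):
        alice = rng.random() < p_rect
        bob = rng.random() < p_rect
        kept += alice == bob          # kept only when the modes match
    return kept / n

# Balanced BB84 throws away about half the transmitted data...
assert abs(sift_fraction(0.5) - 0.5) < 0.01
# ...while strongly biased choices keep most of it (0.9**2 + 0.1**2 = 0.82).
assert sift_fraction(0.9) > 0.8
```
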

  10. Finite-key security analyses on passive decoy-state QKD protocols with different unstable sources

    PubMed Central

    Song, Ting-Ting; Qin, Su-Juan; Wen, Qiao-Yan; Wang, Yu-Kun; Jia, Heng-Yue

    2015-01-01

    In quantum communication, passive decoy-state QKD protocols can eliminate many side channels, but the protocols without any finite-key analyses are not suitable for in practice. The finite-key securities of passive decoy-state (PDS) QKD protocols with two different unstable sources, type-II parametric down-convention (PDC) and phase randomized weak coherent pulses (WCPs), are analyzed in our paper. According to the PDS QKD protocols, we establish an optimizing programming respectively and obtain the lower bounds of finite-key rates. Under some reasonable values of quantum setup parameters, the lower bounds of finite-key rates are simulated. The simulation results show that at different transmission distances, the affections of different fluctuations on key rates are different. Moreover, the PDS QKD protocol with an unstable PDC source can resist more intensity fluctuations and more statistical fluctuation. PMID:26471947

  11. A fast key generation method based on dynamic biometrics to secure wireless body sensor networks for p-health.

    PubMed

    Zhang, G H; Poon, Carmen C Y; Zhang, Y T

    2010-01-01

    Body sensor networks (BSNs) have emerged as a new technology for healthcare applications, but the security of communication in BSNs remains a formidable challenge yet to be resolved. The paper discusses the typical attacks faced by BSNs and proposes a fast biometric-based approach to generate keys for ensuring confidentiality and authentication in BSN communications. The approach was tested on 900 segments of electrocardiogram. Each segment was 4 seconds long and used to generate a 128-bit key. The study found that the entropy of 96% of the keys was above 0.95 and 99% of the Hamming distances calculated between any two keys were above 50 bits. Based on the randomness and distinctiveness of these keys, it is concluded that the fast biometric-based approach has great potential for securing communication in BSNs for health applications.
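
    The distinctiveness criterion (Hamming distance above 50 of 128 bits) can be illustrated with a toy quantizer; the Gaussian 'feature' segments and zero-threshold binarization below are hypothetical stand-ins for the paper's ECG-derived features.

```python
import random

def feature_key(features: list, n_bits: int = 128) -> int:
    """Quantize a physiological feature vector into an n-bit key by
    thresholding each feature (here at the assumed population median, 0.0)."""
    assert len(features) == n_bits
    key = 0
    for f in features:
        key = (key << 1) | (f > 0.0)
    return key

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

rng = random.Random(11)
# Two hypothetical 128-sample feature segments from different subjects.
seg1 = [rng.gauss(0, 1) for _ in range(128)]
seg2 = [rng.gauss(0, 1) for _ in range(128)]
k1, k2 = feature_key(seg1), feature_key(seg2)

# Independent segments disagree on roughly half the bits, mirroring the
# paper's ">50 bits of 128" distinctiveness result.
assert hamming(k1, k2) > 40
```
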

  12. Quantum cryptographic system with reduced data loss

    DOEpatents

    Lo, Hoi-Kwong; Chau, Hoi Fung

    1998-01-01

    A secure method for distributing a random cryptographic key with reduced data loss. Traditional quantum key distribution systems employ similar probabilities for the different communication modes and thus reject at least half of the transmitted data. The invention substantially reduces the amount of discarded data (those that are encoded and decoded in different communication modes e.g. using different operators) in quantum key distribution without compromising security by using significantly different probabilities for the different communication modes. Data is separated into various sets according to the actual operators used in the encoding and decoding process and the error rate for each set is determined individually. The invention increases the key distribution rate of the BB84 key distribution scheme proposed by Bennett and Brassard in 1984. Using the invention, the key distribution rate increases with the number of quantum signals transmitted and can be doubled asymptotically.

  13. Autoshaping and automaintenance of a key-press response in squirrel monkeys

    PubMed Central

    Gamzu, Elkan; Schwam, Elias

    1974-01-01

    Following exposure for a minimum of 500 to 600 trials, three of four naive squirrel monkeys eventually pressed a response key, illumination of which always preceded delivery of a food pellet. Three other naive monkeys did not press the key when the pellets were delivered randomly with respect to key illumination. Despite some similarities to autoshaping using pigeons, the data indicate many points of difference when squirrel monkeys are used as subjects. Although key-food pairings were shown to be important in the acquisition of the key-press response, they were ineffective in maintaining the response when either a negative response-reinforcer dependency was introduced, or when there was no scheduled response-reinforcer dependency (fixed trial). Not all demonstrations of autoshaping can be considered to be under the control of those processes that are primarily responsible for the phenomena obtained in pigeons. PMID:16811749

  14. Autoshaping and automaintenance of a key-press response in squirrel monkeys.

    PubMed

    Gamzu, E; Schwam, E

    1974-03-01

    Following exposure for a minimum of 500 to 600 trials, three of four naive squirrel monkeys eventually pressed a response key, illumination of which always preceded delivery of a food pellet. Three other naive monkeys did not press the key when the pellets were delivered randomly with respect to key illumination. Despite some similarities to autoshaping using pigeons, the data indicate many points of difference when squirrel monkeys are used as subjects. Although key-food pairings were shown to be important in the acquisition of the key-press response, they were ineffective in maintaining the response when either a negative response-reinforcer dependency was introduced, or when there was no scheduled response-reinforcer dependency (fixed trial). Not all demonstrations of autoshaping can be considered to be under the control of those processes that are primarily responsible for the phenomena obtained in pigeons.

  15. Robust portfolio selection based on asymmetric measures of variability of stock returns

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Tan, Shaohua

    2009-10-01

    This paper addresses a new uncertainty set--interval random uncertainty set for robust optimization. The form of interval random uncertainty set makes it suitable for capturing the downside and upside deviations of real-world data. These deviation measures capture distributional asymmetry and lead to better optimization results. We also apply our interval random chance-constrained programming to robust mean-variance portfolio selection under interval random uncertainty sets in the elements of mean vector and covariance matrix. Numerical experiments with real market data indicate that our approach results in better portfolio performance.

  16. Quantifying Uncertainties in N2O Emission Due to N Fertilizer Application in Cultivated Areas

    PubMed Central

    Philibert, Aurore; Loyce, Chantal; Makowski, David

    2012-01-01

    Nitrous oxide (N2O) is a greenhouse gas with a global warming potential approximately 298 times greater than that of CO2. In 2006, the Intergovernmental Panel on Climate Change (IPCC) estimated N2O emission due to synthetic and organic nitrogen (N) fertilization at 1% of applied N. We investigated the uncertainty on this estimated value, by fitting 13 different models to a published dataset including 985 N2O measurements. These models were characterized by (i) the presence or absence of the explanatory variable “applied N”, (ii) the function relating N2O emission to applied N (exponential or linear function), (iii) fixed or random background (i.e. in the absence of N application) N2O emission and (iv) fixed or random applied N effect. We calculated ranges of uncertainty on N2O emissions from a subset of these models, and compared them with the uncertainty ranges currently used in the IPCC-Tier 1 method. The exponential models outperformed the linear models, and models including one or two random effects outperformed those including fixed effects only. The use of an exponential function rather than a linear function has an important practical consequence: the emission factor is not constant and increases as a function of applied N. Emission factors estimated using the exponential function were lower than 1% when the amount of N applied was below 160 kg N ha−1. Our uncertainty analysis shows that the uncertainty range currently used by the IPCC-Tier 1 method could be reduced. PMID:23226430
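
    The practical consequence mentioned above (a non-constant emission factor under an exponential model) can be illustrated with a short sketch; the coefficients below are hypothetical, chosen for illustration, and are not the study's fitted values.

```python
import math

# Hypothetical coefficients for illustration only.
A, B = math.log(1.0), 0.006  # log background emission, N response slope

def emission(n_applied):
    """Exponential N2O emission model: E(N) = exp(A + B*N)."""
    return math.exp(A + B * n_applied)

def emission_factor(n_applied):
    """Fraction of applied N emitted as N2O above background;
    under the exponential model this grows with applied N."""
    return (emission(n_applied) - emission(0.0)) / n_applied

ef_50, ef_160, ef_300 = (emission_factor(n) for n in (50.0, 160.0, 300.0))
```

    With these illustrative coefficients the emission factor rises from roughly 0.7% of applied N at 50 kg N per ha to roughly 1.7% at 300 kg N per ha, so a single fixed 1% factor over- or under-estimates emissions depending on the application rate.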

  17. Hybrid computer optimization of systems with random parameters

    NASA Technical Reports Server (NTRS)

    White, R. C., Jr.

    1972-01-01

    A hybrid computer Monte Carlo technique for the simulation and optimization of systems with random parameters is presented. The method is applied to the simultaneous optimization of the means and variances of two parameters in the radar-homing missile problem treated by McGhee and Levine.

  18. Network meta-analysis, electrical networks and graph theory.

    PubMed

    Rücker, Gerta

    2012-12-01

    Network meta-analysis is an active field of research in clinical biostatistics. It aims to combine information from all randomized comparisons among a set of treatments for a given medical condition. We show how graph-theoretical methods can be applied to network meta-analysis. A meta-analytic graph consists of vertices (treatments) and edges (randomized comparisons). We illustrate the correspondence between meta-analytic networks and electrical networks, where variance corresponds to resistance, treatment effects to voltage, and weighted treatment effects to current flows. Based thereon, we then show that graph-theoretical methods that have been routinely applied to electrical networks also work well in network meta-analysis. In more detail, the resulting consistent treatment effects induced in the edges can be estimated via the Moore-Penrose pseudoinverse of the Laplacian matrix. Moreover, the variances of the treatment effects are estimated in analogy to electrical effective resistances. It is shown that this method, being computationally simple, leads to the usual fixed effect model estimate when applied to pairwise meta-analysis and is consistent with published results when applied to network meta-analysis examples from the literature. Moreover, problems of heterogeneity and inconsistency, random effects modeling and including multi-armed trials are addressed. Copyright © 2012 John Wiley & Sons, Ltd.
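
    The electrical analogy can be made concrete with a small sketch (plain Python, not the paper's Moore-Penrose pseudoinverse implementation): with edge weights equal to inverse variances, the variance of a consistent treatment effect corresponds to an effective resistance, computed here from the reduced Laplacian.

```python
def effective_resistance(n, edges, s, t):
    """Effective resistance between nodes s and t of a weighted network,
    via the reduced Laplacian (node n-1 grounded). edges is a list of
    (u, v, weight); in network meta-analysis, weight = 1/variance of a
    randomized comparison and the effective resistance is the variance
    of the consistent treatment-effect estimate between s and t."""
    L = [[0.0] * n for _ in range(n)]
    for u, v, w in edges:
        L[u][u] += w; L[v][v] += w
        L[u][v] -= w; L[v][u] -= w
    m = n - 1
    A = [row[:m] for row in L[:m]]          # reduced Laplacian
    b = [0.0] * m                           # unit current in at s, out at t
    if s < m: b[s] += 1.0
    if t < m: b[t] -= 1.0
    # Gaussian elimination with partial pivoting.
    for i in range(m):
        p = max(range(i, m), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]; b[i], b[p] = b[p], b[i]
        for r in range(i + 1, m):
            f = A[r][i] / A[i][i]
            for c in range(i, m):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    x = [0.0] * m
    for i in range(m - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][c] * x[c] for c in range(i + 1, m))) / A[i][i]
    # Voltage difference = effective resistance (grounded node is at 0).
    return (x[s] if s < m else 0.0) - (x[t] if t < m else 0.0)

# Triangle of unit-weight comparisons: R_eff between any pair is 2/3.
r = effective_resistance(3, [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0)], 0, 1)
```

    The triangle example shows why indirect evidence helps: the direct edge alone would give variance 1, but the indirect path through the third treatment lowers it to 2/3.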

  19. Two-key concurrent responding: response-reinforcement dependencies and blackouts1

    PubMed Central

    Herbert, Emily W.

    1970-01-01

    Two-key concurrent responding was maintained for three pigeons by a single variable-interval 1-minute schedule of reinforcement in conjunction with a random number generator that assigned feeder operations between keys with equal probability. The duration of blackouts was varied between keys when each response initiated a blackout, and grain arranged by the variable-interval schedule was automatically presented after a blackout (Exp. I). In Exp. II every key peck, except for those that produced grain, initiated a blackout, and grain was dependent upon a response following a blackout. For each pigeon in Exp. I and for one pigeon in Exp. II, the relative frequency of responding on a key approximated, i.e., matched, the relative reciprocal of the duration of the blackout interval on that key. In a third experiment, blackouts scheduled on a variable-interval were of equal duration on the two keys. For one key, grain automatically followed each blackout; for the other key, grain was dependent upon a response and never followed a blackout. The relative frequency of responding on the former key, i.e., the delay key, better approximated the negative exponential function obtained by Chung (1965) than the matching function predicted by Chung and Herrnstein (1967). PMID:16811458
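
    The matching relation reported above has a direct arithmetic form (a sketch of the predicted relation only, not the experimental analysis):

```python
def predicted_relative_rate(d1, d2):
    """Matching prediction: the relative response frequency on key 1
    equals the relative reciprocal of that key's blackout duration,
    (1/d1) / (1/d1 + 1/d2), for blackout durations d1 and d2 (s)."""
    return (1.0 / d1) / (1.0 / d1 + 1.0 / d2)

p_equal = predicted_relative_rate(3.0, 3.0)  # equal blackouts: indifference
p_short = predicted_relative_rate(3.0, 9.0)  # shorter blackout favored, 3/4
```

    Experiment III is where this prediction breaks down: the obtained preference for the delay key followed Chung's (1965) negative exponential function more closely than this matching function.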

  20. The response of rotating machinery to external random vibration

    NASA Technical Reports Server (NTRS)

    Tessarzik, J. M.; Chiang, T.; Badgley, R. H.

    1974-01-01

    A high-speed turbogenerator employing gas-lubricated hydrodynamic journal and thrust bearings was subjected to external random vibrations for the purpose of assessing bearing performance in a dynamic environment. The pivoted-pad type journal bearings and the step-sector thrust bearing supported a turbine-driven rotor weighing approximately twenty-one pounds at a nominal operating speed of 36,000 rpm. The response amplitudes of both the rigid-supported and flexible-supported bearing pads, the gimballed thrust bearing, and the rotor relative to the machine casing were measured with capacitance type displacement probes. Random vibrations were applied by means of a large electrodynamic shaker at input levels ranging between 0.5 g (rms) and 1.5 g (rms). Vibrations were applied both along and perpendicular to the rotor axis. Response measurements were analyzed for amplitude distribution and power spectral density. Experimental results compare well with calculations of amplitude power spectral density made for the case where the vibrations were applied along the rotor axis. In this case, the rotor-bearing system was treated as a linear, three-mass model.

  1. Resampling method for applying density-dependent habitat selection theory to wildlife surveys.

    PubMed

    Tardy, Olivia; Massé, Ariane; Pelletier, Fanie; Fortin, Daniel

    2015-01-01

    Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists in randomly placing blocks over the survey area and dividing those blocks in two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is done 100 times. Different functional forms of isodars can be investigated by relating animal abundance and differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection over large geographic extents.
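
    The resampling scheme lends itself to a short sketch (hypothetical data layout: a rectangular survey grid of animal counts; the published method works on wildlife-survey maps with habitat covariates):

```python
import random

def paired_subblock_counts(grid, block_h, block_w, n_samples=100, seed=1):
    """Place a block at random on a survey grid of counts, split it
    into two horizontally adjacent sub-blocks of equal size, and
    return the animal count in each half; repeated n_samples times,
    matching the 100 repetitions described in the abstract."""
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    pairs = []
    for _ in range(n_samples):
        r = rng.randrange(rows - block_h + 1)
        c = rng.randrange(cols - 2 * block_w + 1)
        left = sum(grid[i][j] for i in range(r, r + block_h)
                   for j in range(c, c + block_w))
        right = sum(grid[i][j] for i in range(r, r + block_h)
                    for j in range(c + block_w, c + 2 * block_w))
        pairs.append((left, right))
    return pairs

# Toy 6x8 grid of counts; each sampled block splits into two 2x2 halves.
grid = [[random.Random(j + 8 * i).randrange(4) for j in range(8)] for i in range(6)]
pairs = paired_subblock_counts(grid, block_h=2, block_w=2)
```

    Regressing the paired counts against between-sub-block differences in habitat covariates would then yield the isodar, without predefined habitat types.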

  2. Meta-analysis identifies gene-by-environment interactions as demonstrated in a study of 4,965 mice.

    PubMed

    Kang, Eun Yong; Han, Buhm; Furlotte, Nicholas; Joo, Jong Wha J; Shih, Diana; Davis, Richard C; Lusis, Aldons J; Eskin, Eleazar

    2014-01-01

    Identifying environmentally-specific genetic effects is a key challenge in understanding the structure of complex traits. Model organisms play a crucial role in the identification of such gene-by-environment interactions, as a result of the unique ability to observe genetically similar individuals across multiple distinct environments. Many model organism studies examine the same traits but under varying environmental conditions. For example, knock-out or diet-controlled studies are often used to examine cholesterol in mice. These studies, when examined in aggregate, provide an opportunity to identify genomic loci exhibiting environmentally-dependent effects. However, the straightforward application of traditional methodologies to aggregate separate studies suffers from several problems. First, environmental conditions are often variable and do not fit the standard univariate model for interactions. Additionally, applying a multivariate model results in increased degrees of freedom and low statistical power. In this paper, we jointly analyze multiple studies with varying environmental conditions using a meta-analytic approach based on a random effects model to identify loci involved in gene-by-environment interactions. Our approach is motivated by the observation that methods for discovering gene-by-environment interactions are closely related to random effects models for meta-analysis. We show that interactions can be interpreted as heterogeneity and can be detected without utilizing the traditional uni- or multi-variate approaches for discovery of gene-by-environment interactions. We apply our new method to combine 17 mouse studies containing in aggregate 4,965 distinct animals. We identify 26 significant loci involved in high-density lipoprotein (HDL) cholesterol, many of which are consistent with previous findings. Several of these loci show significant evidence of involvement in gene-by-environment interactions. An additional advantage of our meta-analysis approach is that our combined study has significantly higher power and improved resolution compared to any single study, thus explaining the large number of loci discovered in the combined study.
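
    The key observation, that interactions behave like between-study heterogeneity, can be illustrated with a standard random-effects computation (Cochran's Q and the DerSimonian-Laird tau-squared; a textbook sketch, not the authors' exact estimator):

```python
def cochran_q_and_tau2(effects, variances):
    """Inverse-variance weights, Cochran's Q, and the DerSimonian-Laird
    between-study variance tau^2. Under the interpretation above, a
    clearly positive tau^2 for a locus measured across environments
    signals a gene-by-environment interaction; tau^2 = 0 means the
    effect is homogeneous across environments."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    mu = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - mu) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)
    return q, tau2

# Same effect in every study -> no heterogeneity:
q0, t0 = cochran_q_and_tau2([0.5, 0.5, 0.5], [0.1, 0.1, 0.1])
# Environment-dependent effects -> positive tau^2:
q1, t1 = cochran_q_and_tau2([0.1, 0.9, -0.2], [0.1, 0.1, 0.1])
```

    This is the sense in which heterogeneity detection sidesteps the degrees-of-freedom cost of an explicit multivariate interaction model.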

  3. Meta-Analysis Identifies Gene-by-Environment Interactions as Demonstrated in a Study of 4,965 Mice

    PubMed Central

    Joo, Jong Wha J.; Shih, Diana; Davis, Richard C.; Lusis, Aldons J.; Eskin, Eleazar

    2014-01-01

    Identifying environmentally-specific genetic effects is a key challenge in understanding the structure of complex traits. Model organisms play a crucial role in the identification of such gene-by-environment interactions, as a result of the unique ability to observe genetically similar individuals across multiple distinct environments. Many model organism studies examine the same traits but under varying environmental conditions. For example, knock-out or diet-controlled studies are often used to examine cholesterol in mice. These studies, when examined in aggregate, provide an opportunity to identify genomic loci exhibiting environmentally-dependent effects. However, the straightforward application of traditional methodologies to aggregate separate studies suffers from several problems. First, environmental conditions are often variable and do not fit the standard univariate model for interactions. Additionally, applying a multivariate model results in increased degrees of freedom and low statistical power. In this paper, we jointly analyze multiple studies with varying environmental conditions using a meta-analytic approach based on a random effects model to identify loci involved in gene-by-environment interactions. Our approach is motivated by the observation that methods for discovering gene-by-environment interactions are closely related to random effects models for meta-analysis. We show that interactions can be interpreted as heterogeneity and can be detected without utilizing the traditional uni- or multi-variate approaches for discovery of gene-by-environment interactions. We apply our new method to combine 17 mouse studies containing in aggregate 4,965 distinct animals. We identify 26 significant loci involved in high-density lipoprotein (HDL) cholesterol, many of which are consistent with previous findings. Several of these loci show significant evidence of involvement in gene-by-environment interactions. An additional advantage of our meta-analysis approach is that our combined study has significantly higher power and improved resolution compared to any single study, thus explaining the large number of loci discovered in the combined study. PMID:24415945

  4. Secure self-calibrating quantum random-bit generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiorentino, M.; Santori, C.; Spillane, S. M.

    2007-03-15

    Random-bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require 'strong' RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random-bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the 'min-entropy' of the system, and we employ this value to distill a truly random-bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
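
    The role of the min-entropy bound can be sketched as follows (an empirical estimate from samples; the paper instead lower-bounds it via quantum tomography of the optical states):

```python
import math
from collections import Counter

def min_entropy_per_sample(samples):
    """Min-entropy H_min = -log2(max_x p(x)), estimated from the
    empirical distribution of the samples. H_min lower-bounds the
    number of nearly uniform bits per sample that a randomness
    extractor can distill from the source."""
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)

# A uniform 2-bit source has H_min = 2; a biased source has less.
h_uniform = min_entropy_per_sample([0, 1, 2, 3] * 1000)
h_biased = min_entropy_per_sample([0] * 3000 + [1] * 1000)
```

    An extractor seeded with the measured lower bound then distills about H_min nearly uniform bits per sample, which is why a tomographic bound on H_min suffices even if an attacker controls the source.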

  5. Device-independent randomness generation from several Bell estimators

    NASA Astrophysics Data System (ADS)

    Nieto-Silleras, Olmo; Bamps, Cédric; Silman, Jonathan; Pironio, Stefano

    2018-02-01

    Device-independent randomness generation and quantum key distribution protocols rely on a fundamental relation between the non-locality of quantum theory and its random character. This relation is usually expressed in terms of a trade-off between the probability of guessing correctly the outcomes of measurements performed on quantum systems and the amount of violation of a given Bell inequality. However, a more accurate assessment of the randomness produced in Bell experiments can be obtained if the value of several Bell expressions is simultaneously taken into account, or if the full set of probabilities characterizing the behavior of the device is considered. We introduce protocols for device-independent randomness generation secure against classical side information, that rely on the estimation of an arbitrary number of Bell expressions or even directly on the experimental frequencies of measurement outcomes. Asymptotically, this results in an optimal generation of randomness from experimental data (as measured by the min-entropy), without having to assume beforehand that the devices violate a specific Bell inequality.
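
    The trade-off the abstract refers to can be sketched with a commonly used single-inequality bound for the CHSH case (an assumption here, not taken from this paper, which generalizes beyond a single Bell expression): P_guess <= 1/2 + (1/2)sqrt(2 - S^2/4) for CHSH value S.

```python
import math

def min_entropy_from_chsh(s):
    """Min-entropy certified per output bit from a CHSH value S,
    2 <= S <= 2*sqrt(2), using the single-expression trade-off
    P_guess <= 1/2 + (1/2)*sqrt(2 - S^2/4) (assumed here).
    Protocols using several Bell estimators, or the full set of
    outcome frequencies, can only certify at least as much."""
    p_guess = 0.5 + 0.5 * math.sqrt(max(0.0, 2.0 - s * s / 4.0))
    return -math.log2(p_guess)

# No violation (S = 2) certifies nothing; the maximal quantum
# violation (S = 2*sqrt(2)) certifies one full random bit.
h_classical = min_entropy_from_chsh(2.0)
h_tsirelson = min_entropy_from_chsh(2.0 * math.sqrt(2.0))
```

    The paper's point is precisely that conditioning on a single number S like this discards information; several Bell estimators or the raw frequencies give a tighter min-entropy assessment.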

  6. Tetris and Word games lead to fewer intrusive memories when applied several days after analogue trauma.

    PubMed

    Hagenaars, Muriel A; Holmes, Emily A; Klaassen, Fayette; Elzinga, Bernet

    2017-01-01

    Background: Intrusive trauma memories are a key symptom of posttraumatic stress disorder (PTSD), so disrupting their recurrence is highly important. Intrusion development was hindered by visuospatial interventions administered up to 24 hours after analogue trauma. It is unknown whether interventions can be applied later, and whether modality or working-memory load are crucial factors. Objectives: This study tested: (1) whether a visuospatial task would lead to fewer intrusions compared to a reactivation-only group when applied after memory reactivation four days after analogue trauma exposure (extended replication), (2) whether both tasks (i.e. one aimed to be visuospatial, one more verbal) would lead to fewer intrusions than the reactivation-only group (intervention effect), and (3) whether supposed task modality (visuospatial or verbal) is a critical component (modality effect). Method: Fifty-four participants were randomly assigned to reactivation+Tetris (visuospatial), reactivation+Word games (verbal), or reactivation-only (no task). They watched an aversive film (day 0) and recorded intrusive memories of the film in diary A. On day 4, memory was reactivated, after which participants played Tetris, Word games, or had no task for 10 minutes. They then kept a second diary (B). Informative hypotheses were evaluated using Bayes factors. Results: Reactivation+Tetris and reactivation+Word games resulted in relatively fewer intrusions from the last day of diary A to the first day of diary B than reactivation-only (objectives 1 and 2). Thus, both tasks were effective even when applied days after analogue trauma. Reactivation-only was not effective. Reactivation+Word games appeared to result in fewer intrusions than reactivation+Tetris (objective 3; modality effect), but this evidence was weak. Explorative analyses showed that Word games were more difficult than Tetris. Conclusions: Applying a task four days after the trauma film (during memory reconsolidation) was effective. The modality versus working-memory load issue is inconclusive.

  7. What Is Applied Linguistics?

    ERIC Educational Resources Information Center

    James, Carl

    1993-01-01

    Ostensive and expository definitions of applied linguistics are assessed. It is suggested that the key to a meaningful definition lies in the dual articulation of applied linguistics: it is an interface between linguistics and practicality. Its role as an "expert system" is suggested. (45 references) (Author/LB)

  8. Hand rim wheelchair propulsion training using biomechanical real-time visual feedback based on motor learning theory principles.

    PubMed

    Rice, Ian; Gagnon, Dany; Gallagher, Jere; Boninger, Michael

    2010-01-01

    As considerable progress has been made in laboratory-based assessment of manual wheelchair propulsion biomechanics, the necessity to translate this knowledge into new clinical tools and treatment programs becomes imperative. The objective of this study was to describe the development of a manual wheelchair propulsion training program aimed to promote the development of an efficient propulsion technique among long-term manual wheelchair users. Motor learning theory principles were applied to the design of biomechanical feedback-based learning software, which allows for random discontinuous real-time visual presentation of key spatiotemporal and kinetic parameters. This software was used to train a long-term wheelchair user on a dynamometer during 3 low-intensity wheelchair propulsion training sessions over a 3-week period. Biomechanical measures were recorded with a SmartWheel during over ground propulsion on a 50-m level tile surface at baseline and 3 months after baseline. Training software was refined and administered to a participant who was able to improve his propulsion technique by increasing contact angle while simultaneously reducing stroke cadence, mean resultant force, peak and mean moment out of plane, and peak rate of rise of force applied to the pushrim after training. The proposed propulsion training protocol may lead to favorable changes in manual wheelchair propulsion technique. These changes could limit or prevent upper limb injuries among manual wheelchair users. In addition, many of the motor learning theory-based techniques examined in this study could be applied to training individuals in various stages of rehabilitation to optimize propulsion early on.

  9. Hand Rim Wheelchair Propulsion Training Using Biomechanical Real-Time Visual Feedback Based on Motor Learning Theory Principles

    PubMed Central

    Rice, Ian; Gagnon, Dany; Gallagher, Jere; Boninger, Michael

    2010-01-01

    Background/Objective: As considerable progress has been made in laboratory-based assessment of manual wheelchair propulsion biomechanics, the necessity to translate this knowledge into new clinical tools and treatment programs becomes imperative. The objective of this study was to describe the development of a manual wheelchair propulsion training program aimed to promote the development of an efficient propulsion technique among long-term manual wheelchair users. Methods: Motor learning theory principles were applied to the design of biomechanical feedback-based learning software, which allows for random discontinuous real-time visual presentation of key spatio-temporal and kinetic parameters. This software was used to train a long-term wheelchair user on a dynamometer during 3 low-intensity wheelchair propulsion training sessions over a 3-week period. Biomechanical measures were recorded with a SmartWheel during over ground propulsion on a 50-m level tile surface at baseline and 3 months after baseline. Results: Training software was refined and administered to a participant who was able to improve his propulsion technique by increasing contact angle while simultaneously reducing stroke cadence, mean resultant force, peak and mean moment out of plane, and peak rate of rise of force applied to the pushrim after training. Conclusions: The proposed propulsion training protocol may lead to favorable changes in manual wheelchair propulsion technique. These changes could limit or prevent upper limb injuries among manual wheelchair users. In addition, many of the motor learning theory–based techniques examined in this study could be applied to training individuals in various stages of rehabilitation to optimize propulsion early on. PMID:20397442

  10. Social patterns revealed through random matrix theory

    NASA Astrophysics Data System (ADS)

    Sarkar, Camellia; Jalan, Sarika

    2014-11-01

    Despite the tremendous advancements in the field of network theory, very few studies have taken weights in the interactions into consideration that emerge naturally in all real-world systems. Using random matrix analysis of a weighted social network, we demonstrate the profound impact of weights in interactions on emerging structural properties. The analysis reveals that randomness existing in particular time frame affects the decisions of individuals rendering them more freedom of choice in situations of financial security. While the structural organization of networks remains the same throughout all datasets, random matrix theory provides insight into the interaction pattern of individuals of the society in situations of crisis. It has also been contemplated that individual accountability in terms of weighted interactions remains as a key to success unless segregation of tasks comes into play.

  11. Incompleteness and limit of security theory of quantum key distribution

    NASA Astrophysics Data System (ADS)

    Hirota, Osamu; Murakami, Dan; Kato, Kentaro; Futami, Fumio

    2012-10-01

    Many papers claim that a trace distance d guarantees universal composition security in quantum key distribution (QKD) protocols such as BB84. This introductory paper first explains the main misconception in the claim of unconditional security for QKD theory. In general terms, the cause of the misunderstanding is the Lemma in the paper of Renner. It suggests that generation of a perfect random key is assured with probability (1-d), with failure probability d. It thus concludes that, whenever the protocol succeeds, the generated key is a perfectly random sequence, so that QKD provides perfect secrecy for the one-time pad; this is the basis of the composition claim. However, the trace distance (or variational distance) is not the probability of such an event. If d is not small enough, the generated key sequence is never uniform. The evaluation of the trace distance therefore needs to be reconstructed if one wants to use it. One should first go back to indistinguishability theory in the computational-complexity setting to clarify the meaning of the value of the variational distance; the same analysis is also necessary for the information-theoretic case. A recent series of papers by H. P. Yuen has answered these questions. In this paper, we give a more concise description of Yuen's theory and clarify that the upper-bound theories for the trace distance by Tomamichel et al. and Hayashi et al. are built on Renner's flawed reasoning and are unsuitable for security analysis. Finally, we introduce a new macroscopic quantum communication scheme to replace qubit QKD.
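
    The quantity at issue is easy to state in code; the sketch below merely computes the variational (trace) distance for classical distributions, the object whose operational meaning the paper disputes.

```python
def variational_distance(p, q):
    """Variational (trace) distance d = (1/2) * sum_x |p(x) - q(x)|
    between two distributions over the same alphabet. It bounds the
    advantage of any test at telling p from q; as argued above, that
    is not the same as 'p is perfectly uniform except with
    probability d'."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# A slightly biased 2-bit key distribution versus the uniform one:
uniform = [0.25] * 4
biased = [0.28, 0.24, 0.24, 0.24]
d = variational_distance(biased, uniform)
```

    Here d = 0.03 bounds a distinguisher's advantage at 3%, yet the biased distribution is never uniform, which is the distinction the paper presses against the failure-probability reading of d.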

  12. Effectiveness of a multi-level implementation strategy for ASD interventions: study protocol for two linked cluster randomized trials.

    PubMed

    Brookman-Frazee, Lauren; Stahmer, Aubyn C

    2018-05-09

    The Centers for Disease Control (2018) estimates that 1 in 59 children has autism spectrum disorder (ASD), and the annual cost of ASD in the U.S. is estimated to be $236 billion. Evidence-based interventions have been developed and demonstrate effectiveness in improving child outcomes. However, research on generalizable methods to scale up these practices in the multiple service systems caring for these children has been limited and is critical to meet this growing public health need. This project includes two coordinated studies testing the effectiveness of the Translating Evidence-based Interventions (EBI) for ASD: Multi-Level Implementation Strategy (TEAMS) model. TEAMS focuses on improving implementation leadership, organizational climate, and provider attitudes and motivation in order to improve two key implementation outcomes (provider training completion and intervention fidelity) and subsequent child outcomes. The TEAMS Leadership Institute applies implementation leadership strategies, and TEAMS Individualized Provider Strategies for training applies motivational interviewing strategies, to facilitate provider and organizational behavior change. A cluster randomized implementation/effectiveness hybrid (type 3) trial with a dismantling design will be used to understand the effectiveness of TEAMS and the mechanisms of change across settings and participants. Study #1 will test the TEAMS model with AIM HI (An Individualized Mental Health Intervention for ASD) in publicly funded mental health services. Study #2 will test TEAMS with CPRT (Classroom Pivotal Response Teaching) in education settings. Thirty-seven mental health programs and 37 school districts will be randomized, stratified by county and study, to one of four groups (Standard Provider Training Only, Standard Provider Training + Leader Training, Enhanced Provider Training, Enhanced Provider Training + Leader Training) to test the effectiveness of combining standard, EBI-specific training with the two TEAMS modules individually and together on multiple implementation outcomes. Implementation outcomes including provider training completion, fidelity (coded by observers blind to group assignment), and child behavior change will be examined for 295 mental health providers, 295 teachers, and 590 children. This implementation intervention has the potential to increase quality of care for ASD in publicly funded settings by improving the effectiveness of intervention implementation. The process and modules will be generalizable to multiple service systems, providers, and interventions, providing broad impact in community services. This study is registered with Clinicaltrials.gov ( NCT03380078 ). Registered 20 December 2017, retrospectively registered.

  13. On randomized algorithms for numerical solution of applied Fredholm integral equations of the second kind

    NASA Astrophysics Data System (ADS)

    Voytishek, Anton V.; Shipilov, Nikolay M.

    2017-11-01

    In this paper we systematize numerical (computer-implemented) randomized functional algorithms for approximating the solution of a Fredholm integral equation of the second kind. Three types of such algorithms are distinguished: the projection, the mesh and the projection-mesh methods. The possibilities of using these algorithms to solve practically important problems are investigated in detail. The disadvantages of the mesh algorithms, related to the necessity of calculating values of the kernels of integral equations at fixed points, are identified. In practice, these kernels have integrable singularities, and calculation of their values is impossible. Thus, for applied problems involving Fredholm integral equations of the second kind, it is expedient to use not the mesh algorithms but the projection and projection-mesh randomized algorithms.
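
    One member of the randomized-functional family discussed above is the Monte Carlo evaluation of the truncated Neumann series, which needs kernel values only at randomly sampled points rather than on a mesh. The sketch below is a generic textbook estimator of this kind, not one of the paper's specific algorithms; the toy kernel and right-hand side have a known exact solution.

```python
import random

def fredholm_mc(x, f, kernel, lam=1.0, depth=8, n_walks=20000, seed=2):
    """Monte Carlo estimate of phi(x) for the Fredholm equation of the
    second kind on [0, 1],
        phi(x) = f(x) + lam * integral_0^1 K(x, y) phi(y) dy,
    via the truncated Neumann series. Each walk samples y uniformly,
    so the importance weight of a step x -> y is lam * K(x, y)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        pos, weight, score = x, 1.0, f(x)
        for _ in range(depth):
            y = rng.random()                  # uniform proposal on [0, 1]
            weight *= lam * kernel(pos, y)    # accumulate importance weight
            score += weight * f(y)            # add the next Neumann term
            pos = y
        total += score
    return total / n_walks

# K(x, y) = x*y, f(x) = x, lam = 1 has exact solution phi(x) = 1.5*x,
# so the estimate at x = 0.5 should be close to 0.75.
phi_half = fredholm_mc(0.5, lambda x: x, lambda x, y: x * y)
```

    Nothing here requires the kernel on a fixed grid, which is the practical advantage over mesh methods when the kernel has integrable singularities.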

  14. Conservation of Shannon's redundancy for proteins. [information theory applied to amino acid sequences

    NASA Technical Reports Server (NTRS)

    Gatlin, L. L.

    1974-01-01

    Concepts of information theory are applied to examine various proteins in terms of their redundancy in natural organisms such as animals and plants. The Monte Carlo method is used to derive information parameters for random protein sequences. Real protein sequence parameters are compared with the standard parameters of protein sequences having a specific length. The tendency of a chain to contain some amino acids more frequently than others, and the tendency of a chain to contain certain amino acid pairs more frequently than other pairs, are used as randomness measures of individual protein sequences. Non-periodic proteins are generally found to have random Shannon redundancies except in cases of constraints due to short chain length and genetic codes. Redundant characteristics of highly periodic proteins are discussed. A degree-of-periodicity parameter is derived.
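
    The first-order redundancy measure underlying this kind of analysis can be sketched as follows (a toy illustration with a small alphabet; Gatlin's actual computation also uses amino acid pair frequencies):

```python
import math
from collections import Counter

def shannon_redundancy(seq, alphabet_size=20):
    """First-order Shannon redundancy R = 1 - H/Hmax, where H is the entropy
    of the single-letter composition and Hmax = log2(alphabet size); for the
    20 amino acids, Hmax = log2(20) ~ 4.32 bits."""
    counts = Counter(seq)
    n = len(seq)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 1.0 - h / math.log2(alphabet_size)

# toy 4-letter alphabet: uniform usage gives R = 0, single-letter usage R = 1
r_uniform = shannon_redundancy("ACGT" * 10, alphabet_size=4)
r_biased = shannon_redundancy("A" * 40, alphabet_size=4)
```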

  15. RANDOMIZATION PROCEDURES FOR THE ANALYSIS OF EDUCATIONAL EXPERIMENTS.

    ERIC Educational Resources Information Center

    COLLIER, RAYMOND O.

    Certain specific aspects of hypothesis tests used for analysis of results in randomized experiments were studied: (1) the development of the theoretical factor, that of providing information on statistical tests for certain experimental designs, and (2) the development of the applied element, that of supplying the experimenter with machinery for…

  16. Aircraft adaptive learning control

    NASA Technical Reports Server (NTRS)

    Lee, P. S. T.; Vanlandingham, H. F.

    1979-01-01

    The optimal control theory of stochastic linear systems is discussed in terms of the advantages of distributed-control systems and the control of randomly sampled systems. An optimal solution to longitudinal control is derived and applied to the F-8 DFBW aircraft. A randomly sampled linear process model with additive process and measurement noise is developed.

  17. Key concepts relevant to quality of complex and shared decision-making in health care: a literature review.

    PubMed

    Dy, Sydney M; Purnell, Tanjala S

    2012-02-01

    High-quality provider-patient decision-making is key to quality care for complex conditions. We performed an analysis of key elements relevant to quality and complex, shared medical decision-making. Based on a search of electronic databases, including Medline and the Cochrane Library, as well as relevant articles' reference lists, reviews of tools, and annotated bibliographies, we developed a list of key concepts and applied them to a decision-making example. Key concepts identified included provider competence, trustworthiness, and cultural competence; communication with patients and families; information quality; patient/surrogate competence; and roles and involvement. We applied this concept list to a case example, shared decision-making for live donor kidney transplantation, and identified the likely most important concepts as provider and cultural competence, information quality, and communication with patients and families. This concept list may be useful for conceptualizing the quality of complex shared decision-making and in guiding research in this area. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Use of Random and Site-Directed Mutagenesis to Probe Protein Structure-Function Relationships: Applied Techniques in the Study of Helicobacter pylori.

    PubMed

    Whitmire, Jeannette M; Merrell, D Scott

    2017-01-01

    Mutagenesis is a valuable tool to examine the structure-function relationships of bacterial proteins. As such, a wide variety of mutagenesis techniques and strategies have been developed. This chapter details a selection of random mutagenesis methods and site-directed mutagenesis procedures that can be applied to an array of bacterial species. Additionally, the direct application of the techniques to study the Helicobacter pylori Ferric Uptake Regulator (Fur) protein is described. The varied approaches illustrated herein allow the robust investigation of the structural-functional relationships within a protein of interest.

  19. Applying a weighted random forests method to extract karst sinkholes from LiDAR data

    NASA Astrophysics Data System (ADS)

    Zhu, Junfeng; Pierskalla, William P.

    2016-02-01

    Detailed mapping of sinkholes provides critical information for mitigating sinkhole hazards and understanding groundwater and surface water interactions in karst terrains. LiDAR (Light Detection and Ranging) measures the earth's surface at high resolution and high density and has shown great potential to drastically improve locating and delineating sinkholes. However, processing LiDAR data to extract sinkholes requires separating sinkholes from other depressions, which can be laborious because of the sheer number of depressions commonly generated from LiDAR data. In this study, we applied random forests, a machine learning method, to automatically separate sinkholes from other depressions in a karst region in central Kentucky. The sinkhole-extraction random forest was grown on a training dataset built from an area where LiDAR-derived depressions were manually classified through a visual inspection and field verification process. Based on the geometry of depressions, as well as natural and human factors related to sinkholes, 11 parameters were selected as predictive variables to form the dataset. Because the training dataset was imbalanced, with the majority of depressions being non-sinkholes, a weighted random forests method was used to improve the accuracy of predicting sinkholes. The weighted random forest achieved an average accuracy of 89.95% for the training dataset, demonstrating that the random forest can be an effective sinkhole classifier. Testing of the random forest in another area, however, resulted in moderate success, with an average accuracy of 73.96%. This study suggests that an automatic sinkhole extraction procedure like the random forest classifier can significantly reduce time and labor costs and makes it more tractable to map sinkholes from LiDAR data for large areas. However, the random forests method cannot totally replace manual procedures such as visual inspection and field verification.
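
    The core of the weighting idea, sketched here as a class-weighted bootstrap in plain Python (illustrative only; the study used a full weighted random forests implementation with 11 predictors), is to oversample the rare class when drawing each tree's training sample:

```python
import random

def weighted_bootstrap(labels, class_weight, n=None, seed=0):
    """Draw one bootstrap sample of indices in which each record's selection
    probability is proportional to its class weight -- the resampling view of
    class weighting for imbalanced data (rare sinkholes vs. abundant
    non-sinkhole depressions)."""
    rng = random.Random(seed)
    n = n or len(labels)
    weights = [class_weight[y] for y in labels]
    return rng.choices(range(len(labels)), weights=weights, k=n)

# hypothetical imbalanced depression data: 5% sinkholes
labels = ["sinkhole"] * 50 + ["other"] * 950
# upweight the minority class 19x so both classes are drawn about equally often
idx = weighted_bootstrap(labels, {"sinkhole": 19.0, "other": 1.0})
boot_labels = [labels[i] for i in idx]
```

    Each tree in a weighted forest would then be grown on such a rebalanced sample (or, equivalently, with class weights applied inside the split criterion).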

  20. A Secure and Efficient Communications Architecture for Global Information Grid Users Via Cooperating Space Assets

    DTIC Science & Technology

    2008-06-19

    ground troop component of a deployed contingency, and not a stationary infrastructure. With respect to fast-moving vehicles and aircraft, troops...the rapidly-moving user. In fact, the Control Group users could have been randomly assigned the Stationary, Sea, or Ground Mobility Category...additional re-keying on the non-stationary users, just as they induce no re-keying on the Stationary users (assuming those fast-moving aircraft have the

  1. First direct landscape-scale measurement of tropical rain forest Leaf Area Index, a key driver of global primary productivity

    Treesearch

    David B. Clark; Paulo C. Olivas; Steven F. Oberbauer; Deborah A. Clark; Michael G. Ryan

    2008-01-01

    Leaf Area Index (leaf area per unit ground area, LAI) is a key driver of forest productivity but has never previously been measured directly at the landscape scale in tropical rain forest (TRF). We used a modular tower and stratified random sampling to harvest all foliage from forest floor to canopy top in 55 vertical transects (4.6 m2) across 500 ha of old growth in...

  2. Nonlinear optical cryptosystem based on joint Fresnel transform correlator under vector wave illumination

    NASA Astrophysics Data System (ADS)

    Xueju, Shen; Chao, Lin; Xiao, Zou; Jianjun, Cai

    2015-05-01

    We present a nonlinear optical cryptosystem with multi-dimensional keys including phase, polarization and diffraction distance. To make full use of the degrees of freedom that optical processing offers, an elaborately designed vector wave with both a space-variant phase and locally linear polarization is generated with a common-path interferometer for illumination. The joint transform correlator in the Fresnel domain, implemented with a double optical wedge, is utilized as the encryption framework, which provides an additional key known as the Fresnel diffraction distance. Two nonlinear operations are imposed on the joint Fresnel power distribution (JFPD) recorded by a charge-coupled device (CCD). The first is division by the power distribution of the reference window random function, an operation previously proposed in the literature that can improve the quality of the decrypted image. The second is the recording of a hybrid JFPD using a micro-polarizer array, with orthogonal and random transmissive axes, attached to the CCD. The hybrid JFPD is then further scrambled by substituting random noise for part of the power distribution. The two nonlinear operations break the linearity of this cryptosystem and provide a high level of security. We verify our proposal using a quick response code for noise-free recovery.

  3. An Undergraduate Research Experience on Studying Variable Stars

    NASA Astrophysics Data System (ADS)

    Amaral, A.; Percy, J. R.

    2016-06-01

    We describe and evaluate a summer undergraduate research project and experience by one of us (AA), under the supervision of the other (JP). The aim of the project was to sample current approaches to analyzing variable star data, and topics related to the study of Mira variable stars and their astrophysical importance. This project was done through the Summer Undergraduate Research Program (SURP) in astronomy at the University of Toronto. SURP allowed undergraduate students to explore and learn about many topics within astronomy and astrophysics, from instrumentation to cosmology. SURP introduced students to key skills which are essential for students hoping to pursue graduate studies in any scientific field. Variable stars proved to be an excellent topic for a research project. For beginners to independent research, it introduces key concepts in research such as critical thinking and problem solving, while illuminating previously learned topics in stellar physics. The focus of this summer project was to compare observations with structural and evolutionary models, including modelling the random walk behavior exhibited in the (O-C) diagrams of most Mira stars. We found that the random walk could be modelled by using random fluctuations of the period. This explanation agreed well with observations.

  4. Simultaneous multiplexing and encoding of multiple images based on a double random phase encryption system

    NASA Astrophysics Data System (ADS)

    Alfalou, Ayman; Mansour, Ali

    2009-09-01

    Nowadays, protecting information is a major issue in any transmission system, as shown by an increasing number of research papers related to this topic. Optical encoding methods, such as the Double Random Phase (DRP) encryption system, are widely used and cited in the literature. The DRP system has a very simple principle and is easily applicable to most images (B&W, gray-level or color). Moreover, some applications require an enhanced encoding level based on a multi-encryption scheme including biometric keys (such as digital fingerprints). The enhancement should be achieved without increasing the amount of transmitted or stored information. To achieve that goal, a new approach for simultaneous multiplexing and encoding of several target images is developed in this manuscript. By introducing two additional security levels, our approach enhances the security level of a classic DRP system. Our first security level consists of using several image-keys that are independent both randomly and structurally, along with a new multiplexing algorithm; at this level, several target images (multi-encryption) are used, which can reduce the amount of encoding information needed. At the second level, a standard DRP system is included. Finally, our approach can detect whether any tampering attempt has been made on the transmitted encrypted images.
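
    A one-dimensional numerical sketch of the underlying DRP principle (illustrative only; the paper's scheme adds multiplexing and biometric keys on top of this) shows how two random phase masks, one in the input plane and one in the Fourier plane, encrypt an image and, applied in reverse, decrypt it:

```python
import cmath
import random

def dft(x, inverse=False):
    """Naive O(N^2) discrete Fourier transform (a stand-in for the optical
    Fourier transform performed by a lens)."""
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(s * 2j * cmath.pi * j * k / n) for k in range(n))
           for j in range(n)]
    return [v / n for v in out] if inverse else out

def drp_encrypt(img, phase1, phase2):
    """Double random phase encoding: apply a random phase mask in the input
    plane, transform, apply a second mask in the Fourier plane, transform back."""
    x = [a * cmath.exp(1j * p) for a, p in zip(img, phase1)]
    spec = [v * cmath.exp(1j * p) for v, p in zip(dft(x), phase2)]
    return dft(spec, inverse=True)

def drp_decrypt(enc, phase1, phase2):
    """Undo both phase masks in reverse order; both masks act as keys."""
    spec = [v * cmath.exp(-1j * p) for v, p in zip(dft(enc), phase2)]
    x = dft(spec, inverse=True)
    return [(v * cmath.exp(-1j * p)).real for v, p in zip(x, phase1)]

rng = random.Random(7)
img = [0.0, 0.2, 0.9, 1.0, 0.5, 0.1, 0.0, 0.3]
p1 = [rng.uniform(0, 2 * cmath.pi) for _ in img]
p2 = [rng.uniform(0, 2 * cmath.pi) for _ in img]
dec = drp_decrypt(drp_encrypt(img, p1, p2), p1, p2)
```

    Without both phase keys the encrypted field is stationary white noise; with them, the round trip recovers the input to numerical precision.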

  5. Physically unclonable cryptographic primitives using self-assembled carbon nanotubes.

    PubMed

    Hu, Zhaoying; Comeras, Jose Miguel M Lobez; Park, Hongsik; Tang, Jianshi; Afzali, Ali; Tulevski, George S; Hannon, James B; Liehr, Michael; Han, Shu-Jen

    2016-06-01

    Information security underpins many aspects of modern society. However, silicon chips are vulnerable to hazards such as counterfeiting, tampering and information leakage through side-channel attacks (for example, by measuring power consumption, timing or electromagnetic radiation). Single-walled carbon nanotubes are a potential replacement for silicon as the channel material of transistors due to their superb electrical properties and intrinsic ultrathin body, but problems such as limited semiconducting purity and non-ideal assembly still need to be addressed before they can deliver high-performance electronics. Here, we show that by using these inherent imperfections, an unclonable electronic random structure can be constructed at low cost from carbon nanotubes. The nanotubes are self-assembled into patterned HfO2 trenches using ion-exchange chemistry, and the width of the trench is optimized to maximize the randomness of the nanotube placement. With this approach, two-dimensional (2D) random bit arrays are created that can offer ternary-bit architecture by determining the connection yield and switching type of the nanotube devices. As a result, our cryptographic keys provide a significantly higher level of security than conventional binary-bit architecture with the same key size.
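
    The security gain of the ternary-bit architecture mentioned above follows from simple counting: a cell resolving three distinguishable states carries log2(3) ≈ 1.58 bits instead of 1. A sketch of this comparison (the cell count of 64 is chosen arbitrarily for illustration):

```python
import math

def key_entropy_bits(n_cells, n_states):
    """Maximum entropy (in bits) of a key read from n_cells physical cells,
    each resolving n_states equally likely, distinguishable states."""
    return n_cells * math.log2(n_states)

binary_bits = key_entropy_bits(64, 2)    # conventional binary cells
ternary_bits = key_entropy_bits(64, 3)   # ternary cells, as enabled by the
                                         # nanotube devices' three outcomes
```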

  6. Physically unclonable cryptographic primitives using self-assembled carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Hu, Zhaoying; Comeras, Jose Miguel M. Lobez; Park, Hongsik; Tang, Jianshi; Afzali, Ali; Tulevski, George S.; Hannon, James B.; Liehr, Michael; Han, Shu-Jen

    2016-06-01

    Information security underpins many aspects of modern society. However, silicon chips are vulnerable to hazards such as counterfeiting, tampering and information leakage through side-channel attacks (for example, by measuring power consumption, timing or electromagnetic radiation). Single-walled carbon nanotubes are a potential replacement for silicon as the channel material of transistors due to their superb electrical properties and intrinsic ultrathin body, but problems such as limited semiconducting purity and non-ideal assembly still need to be addressed before they can deliver high-performance electronics. Here, we show that by using these inherent imperfections, an unclonable electronic random structure can be constructed at low cost from carbon nanotubes. The nanotubes are self-assembled into patterned HfO2 trenches using ion-exchange chemistry, and the width of the trench is optimized to maximize the randomness of the nanotube placement. With this approach, two-dimensional (2D) random bit arrays are created that can offer ternary-bit architecture by determining the connection yield and switching type of the nanotube devices. As a result, our cryptographic keys provide a significantly higher level of security than conventional binary-bit architecture with the same key size.

  7. Asymmetric optical image encryption using Kolmogorov phase screens and equal modulus decomposition

    NASA Astrophysics Data System (ADS)

    Kumar, Ravi; Bhaduri, Basanta; Quan, Chenggen

    2017-11-01

    An asymmetric technique for optical image encryption is proposed using Kolmogorov phase screens (KPSs) and equal modulus decomposition (EMD). The KPSs are generated using the power spectral density of Kolmogorov turbulence. The input image is first randomized and then Fresnel propagated with distance d. Further, the output in the Fresnel domain is modulated with a random phase mask, and the gyrator transform (GT) of the modulated image is obtained with an angle α. The EMD is operated on the GT spectrum to get the complex images, Z1 and Z2. Among these, Z2 is reserved as a private key for decryption and Z1 is propagated through a medium consisting of four KPSs, located at specified distances, to get the final encrypted image. The proposed technique provides a large set of security keys and is robust against various potential attacks. Numerical simulation results validate the effectiveness and security of the proposed technique.

  8. Random sphere packing model of heterogeneous propellants

    NASA Astrophysics Data System (ADS)

    Kochevets, Sergei Victorovich

    It is well recognized that combustion of heterogeneous propellants is strongly dependent on the propellant morphology. Recent developments in computing systems make it possible to start three-dimensional modeling of heterogeneous propellant combustion. A key component of such large-scale computations is a realistic model of industrial propellants which retains the true morphology---a goal never achieved before. The research presented develops the Random Sphere Packing Model of heterogeneous propellants and generates numerical samples of actual industrial propellants. This is done by developing a sphere packing algorithm which randomly packs a large number of spheres with a polydisperse size distribution within a rectangular domain. First, the packing code is developed, optimized for performance, and parallelized using the OpenMP shared memory architecture. Second, the morphology and packing fraction of two simple cases of unimodal and bimodal packs are investigated computationally and analytically. It is shown that both the Loose Random Packing and Dense Random Packing limits are not well defined and the growth rate of the spheres is identified as the key parameter controlling the efficiency of the packing. For a properly chosen growth rate, computational results are found to be in excellent agreement with experimental data. Third, two strategies are developed to define numerical samples of polydisperse heterogeneous propellants: the Deterministic Strategy and the Random Selection Strategy. Using these strategies, numerical samples of industrial propellants are generated. The packing fraction is investigated and it is shown that the experimental values of the packing fraction can be achieved computationally. It is strongly believed that this Random Sphere Packing Model of propellants is a major step forward in the realistic computational modeling of heterogeneous propellant combustion.
    In addition, a method of analysis of the morphology of heterogeneous propellants is developed which uses the concept of multi-point correlation functions. A set of intrinsic length scales of local density fluctuations in random heterogeneous propellants is identified by performing a Monte-Carlo study of the correlation functions. This method of analysis shows great promise for understanding the origins of the combustion instability of heterogeneous propellants, and is believed to become a valuable tool for the development of safe and reliable rocket engines.
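
    The simplest variant of such a packing scheme, random sequential addition of equal spheres with overlap rejection, can be sketched as follows (a toy monodisperse version; the model described above packs polydisperse spheres with a growth-rate-controlled algorithm):

```python
import math
import random

def pack_spheres(n_target, radius, box=1.0, max_tries=20000, seed=3):
    """Random sequential addition: propose uniform random centers in a cube
    and accept a sphere only if it overlaps no previously accepted sphere."""
    rng = random.Random(seed)
    centers = []
    min_d2 = (2 * radius) ** 2   # minimum squared center-to-center distance
    for _ in range(max_tries):
        if len(centers) == n_target:
            break
        c = tuple(rng.uniform(radius, box - radius) for _ in range(3))
        if all(sum((a - b) ** 2 for a, b in zip(c, d)) >= min_d2 for d in centers):
            centers.append(c)
    return centers

centers = pack_spheres(n_target=100, radius=0.05)
# packing fraction = total sphere volume / box volume
phi = len(centers) * (4.0 / 3.0) * math.pi * 0.05 ** 3
```

    Plain rejection saturates well below the dense random packing limit, which is why industrial-density packs need the growth-rate-based approach described in the abstract.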

  9. 41 CFR Appendix A to Subpart B of... - 3-Key Points and Principles

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Principles A Appendix A to Subpart B of Part 102 Public Contracts and Property Management Federal Property.... B, App. A Appendix A to Subpart B of Part 102-3—Key Points and Principles This appendix provides... principles that may be applied to situations not covered elsewhere in this subpart. The guidance follows: Key...

  10. Getting to Grips with Education and Training for Industry. A Development of the Concept of Key Technologies.

    ERIC Educational Resources Information Center

    Clyde, Albert

    "Key technologies" is an umbrella term for appropriate technologies applied to give maximum economic benefit in particular circumstances that may cross traditional disciplinary boundaries. Development of the concept is necessitated by the rate of change of technological development. Key technologies may be classified in three groups related to…

  11. 41 CFR Appendix A to Subpart D of... - 3-Key Points and Principles

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Principles A Appendix A to Subpart D of Part 102 Public Contracts and Property Management Federal Property... Subpart D of Part 102-3—Key Points and Principles This appendix provides additional guidance in the form of answers to frequently asked questions and identifies key points and principles that may be applied...

  12. 41 CFR Appendix A to Subpart C of... - 3-Key Points and Principles

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Principles A Appendix A to Subpart C of Part 102 Public Contracts and Property Management Federal Property... 102-3—Key Points and Principles This appendix provides additional guidance in the form of answers to frequently asked questions and identifies key points and principles that may be applied to situations not...

  13. Hazardous Chemical Fluorometer Development.

    DTIC Science & Technology

    1981-02-01

    [Scanned report cover; the OCR text is largely illegible. Recoverable information: Hazardous Chemical Fluorometer Development, Gary S. Keys, The Johns Hopkins University Applied Physics Laboratory, Laurel, MD, February 1981; USCG report CG-D-79-81.]

  14. Unknown biological effects of L-glucose, ALA, and PUFA.

    PubMed

    Yamada, Katsuya; Sato, Daisuke; Nakamura, Takao; Amano, Hizuru; Morimoto, Yuji

    2017-09-01

    Key substrates including glucose, amino acids, and fatty acids play core roles in nutrient metabolism. In this review, we describe phenomena observed when key substrates are applied to cells. We focused on three promising substrates: L-glucose derivatives, 5-aminolevulinic acid, and polyunsaturated fatty acid. Since they are assumed to give a specific reaction when they are transported into cells or metabolized in cells, they are expected to be applied in a clinical setting. We provide the latest knowledge regarding their behaviors and effects on cells.

  15. Subsonic Transonic Applied Refinements By Using Key Strategies - STARBUKS In the NASA Langley Research Center National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Paryz, Roman W.

    2014-01-01

    Several upgrade projects have been completed at the NASA Langley Research Center National Transonic Facility over the last 1.5 years in an effort defined as STARBUKS - Subsonic Transonic Applied Refinements By Using Key Strategies. This multi-year effort was undertaken to improve NTF's overall capabilities by addressing Accuracy and Validation, Productivity, and Reliability areas at the NTF. This presentation will give a brief synopsis of each of these efforts.

  16. Equipoise among recanalization strategies.

    PubMed

    Tomsick, T A; Khatri, P; Jovin, T; Demaerschalk, B; Malisch, T; Demchuk, A; Hill, M D; Jauch, E; Spilker, J; Broderick, J P

    2010-03-30

    Modern acute ischemic stroke therapy is based on the premise that recanalization and subsequent reperfusion are essential for the preservation of brain tissue and favorable clinical outcomes. We outline key issues that we think underlie equipoise regarding the comparative clinical efficacy of IV recombinant tissue-type plasminogen activator (rt-PA) and intra-arterial (IA) reperfusion therapies for acute ischemic stroke. On the one hand, IV rt-PA therapy has the benefit of speed with presumed lower rates of recanalization of large artery occlusions as compared to IA methods. More recent reports of major arterial occlusions treated with IV rt-PA, as measured by transcranial Doppler and magnetic resonance angiography, demonstrate higher rates of recanalization. Conversely, IA therapies report higher recanalization rates, but are hampered by procedural delays and risks, even failing to be applied at all in occasional patients where time to reperfusion remains a critical factor. Higher rates of recanalization in IA trials using clot-removal devices have not translated into improved patient functional outcome as compared to trials of IV therapy. Combined IV-IA therapy promises to offer advantages of both, but perhaps only when applied in the timeliest of fashions, compared to IV therapy alone. Where equipoise exists, randomizing subjects to either IV rt-PA therapy or IV therapy followed by IA intervention, while incorporating new interventions into the study design, is a rational and appropriate research approach.

  17. Phase walk analysis of leptokurtic time series.

    PubMed

    Schreiber, Korbinian; Modest, Heike I; Räth, Christoph

    2018-06-01

    Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are here applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. We thereby investigate the properties of leptokurtic time series and their influence on the Fourier phases of time series. The phase walk analysis is applied to measured and simulated intermittent time series whose probability density distributions are approximated by power laws: the day-to-day returns of the Dow Jones industrial average, a synthetic time series with tailored nonlinearities mimicking the power-law behavior of the Dow Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significance for nonlinear behavior. Due to the drastically decreased computing time compared to embedding-space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can be derived and parameterized very accurately, which allows for much more precise tests of nonlinearities.
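
    A minimal sketch of the idea (illustrative only; the published method includes careful phase unwrapping and calibrated random-walk statistics) extracts Fourier phases and accumulates their wrapped increments into a walk:

```python
import cmath
import math
import random

def fourier_phases(x):
    """Phases of the positive-frequency DFT coefficients of a real series
    (naive O(N^2) transform, adequate for a short demonstration)."""
    n = len(x)
    return [cmath.phase(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                            for t in range(n)))
            for k in range(1, n // 2)]

def phase_walk(phases):
    """Accumulate successive phase increments, wrapped to [-pi, pi), into a
    walk; for data with uncorrelated Fourier phases this resembles an
    ordinary random walk, and systematic deviations flag phase correlations."""
    walk, pos = [0.0], 0.0
    for a, b in zip(phases, phases[1:]):
        step = (b - a + math.pi) % (2 * math.pi) - math.pi
        pos += step
        walk.append(pos)
    return walk

rng = random.Random(42)
noise = [rng.gauss(0, 1) for _ in range(128)]
walk = phase_walk(fourier_phases(noise))
```

    A test statistic would then compare the excursions of this walk against the random-walk expectation, typically via surrogate series.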

  18. N-mixture models for estimating population size from spatially replicated counts

    USGS Publications Warehouse

    Royle, J. Andrew

    2004-01-01

    Spatial replication is a common theme in count surveys of animals. Such surveys often generate sparse count data from which it is difficult to estimate population size while formally accounting for detection probability. In this article, I describe a class of models (N-mixture models) which allow for estimation of population size from such data. The key idea is to view site-specific population sizes, N, as independent random variables distributed according to some mixing distribution (e.g., Poisson). Prior parameters are estimated from the marginal likelihood of the data, having integrated over the prior distribution for N. Carroll and Lombard (1985, Journal of the American Statistical Association 80, 423-426) proposed a class of estimators based on mixing over a prior distribution for detection probability. Their estimator can be applied in limited settings, but is sensitive to prior parameter values that are fixed a priori. Spatial replication provides additional information regarding the parameters of the prior distribution on N that is exploited by the N-mixture models and which leads to reasonable estimates of abundance from sparse data. A simulation study demonstrates superior operating characteristics (bias, confidence interval coverage) of the N-mixture estimator compared to the Carroll and Lombard estimator. Both estimators are applied to point count data on six species of birds, illustrating the sensitivity to choice of prior on p and substantially different estimates of abundance as a consequence.
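
    The marginal likelihood at the heart of the N-mixture model, with the latent abundance N summed out, can be sketched for a single site as follows (a toy grid search with detection probability held known; real fits optimize over both lambda and p across many sites):

```python
import math

def nmix_loglik(counts, lam, p, n_max=120):
    """Log marginal likelihood of replicated counts at one site under an
    N-mixture model: latent abundance N ~ Poisson(lam), each replicate count
    is Binomial(N, p), and N is summed (integrated) out up to n_max."""
    lik = 0.0
    prior = math.exp(-lam)        # Poisson pmf at N = 0, updated recursively
    for n in range(n_max + 1):
        if n > 0:
            prior *= lam / n      # Poisson pmf at N = n
        if n < max(counts):
            continue              # counts above N have zero probability
        cond = 1.0
        for y in counts:
            cond *= math.comb(n, y) * p ** y * (1 - p) ** (n - y)
        lik += prior * cond
    return math.log(lik)

# toy fit: three replicate counts at one site, p known, lam on a coarse grid
counts = [3, 5, 4]
best_lam = max((l / 2 for l in range(2, 41)),
               key=lambda l: nmix_loglik(counts, l, p=0.5))
```

    As a sanity check, with a single count the marginal distribution reduces to a thinned Poisson(lam*p), and the grid maximum lands near the moment estimate ȳ/p.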

  19. Reduction of display artifacts by random sampling

    NASA Technical Reports Server (NTRS)

    Ahumada, A. J., Jr.; Nagel, D. C.; Watson, A. B.; Yellott, J. I., Jr.

    1983-01-01

    The application of random-sampling techniques to remove visible artifacts (such as flicker, moire patterns, and paradoxical motion) introduced in TV-type displays by discrete sequential scanning is discussed and demonstrated. Sequential-scanning artifacts are described; the window of visibility defined in spatiotemporal frequency space by Watson and Ahumada (1982 and 1983) and Watson et al. (1983) is explained; the basic principles of random sampling are reviewed and illustrated by the case of the human retina; and it is proposed that the sampling artifacts can be replaced by random noise, which can then be shifted to frequency-space regions outside the window of visibility. Vertical sequential, single-random-sequence, and continuously renewed random-sequence plotting displays generating 128 points at update rates up to 130 Hz are applied to images of stationary and moving lines, and best results are obtained with the single random sequence for the stationary lines and with the renewed random sequence for the moving lines.

  20. Generating random numbers by means of nonlinear dynamic systems

    NASA Astrophysics Data System (ADS)

    Zang, Jiaqi; Hu, Haojie; Zhong, Juhua; Luo, Duanbin; Fang, Yi

    2018-07-01

    To introduce the randomness of a physical process to students, a chaotic pendulum experiment was opened at East China University of Science and Technology (ECUST) at the undergraduate level in the physics department. It was shown that chaotic motion could be initiated by adjusting the operation of a chaotic pendulum. By using the data of the angular displacements of the chaotic motion, random binary numerical arrays can be generated. To check the randomness of the generated numerical arrays, the NIST Special Publication 800-20 method was adopted. As a result, it was found that all the random arrays generated by the chaotic motion could pass the validity criteria, and some of them were even better than pseudo-random numbers generated by a computer. Through the experiments, it is demonstrated that a chaotic pendulum can be used as an efficient mechanical facility for generating random numbers, and can be applied in teaching random motion to students.
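
    The frequency (monobit) test, the most basic check in the NIST randomness test suites, can be sketched as follows (illustrative; the suites apply many complementary tests, and passing this one only establishes balance of ones and zeros, not randomness):

```python
import math

def monobit_p_value(bits):
    """NIST-style frequency (monobit) test: p-value for the null hypothesis
    that ones and zeros are equally likely in the bit sequence."""
    n = len(bits)
    s_obs = abs(sum(2 * b - 1 for b in bits)) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

p_balanced = monobit_p_value([0, 1] * 5000)   # perfectly balanced sequence
p_constant = monobit_p_value([1] * 10000)     # all ones: grossly non-random
```

    A sequence passes when the p-value exceeds a chosen significance level (commonly 0.01); note that the deterministic alternating sequence above passes this particular test, which is exactly why a full suite of tests is needed.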

  1. FPGA and USB based control board for quantum random number generator

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Wan, Xu; Zhang, Hong-Fei; Gao, Yuan; Chen, Teng-Yun; Liang, Hao

    2009-09-01

    The design and implementation of an FPGA- and USB-based control board for quantum experiments are discussed. The usage of a quantum true random number generator, control logic in the FPGA, and communication with a computer through the USB protocol are proposed in this paper. Programmable controlled signal input and output ports are implemented. Error detection of the data frame header and frame length is designed. This board has been used successfully in our decoy-state based quantum key distribution (QKD) system.

  2. System and method for key generation in security tokens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Philip G.; Humble, Travis S.; Paul, Nathanael R.

    Functional randomness in security tokens (FRIST) may achieve improved security in two-factor authentication hardware tokens by improving on the algorithms used to securely generate random data. A system and method in one embodiment according to the present invention may allow for security of a token based on storage cost and computational security. This approach may enable communication where security is no longer based solely on one-time pads (OTPs) generated from a single cryptographic function (e.g., SHA-256).

  3. Condensation of helium in aerogel and athermal dynamics of the random-field Ising model.

    PubMed

    Aubry, Geoffroy J; Bonnet, Fabien; Melich, Mathieu; Guyon, Laurent; Spathis, Panayotis; Despetis, Florence; Wolf, Pierre-Etienne

    2014-08-22

    High resolution measurements reveal that condensation isotherms of ⁴He in high-porosity silica aerogel become discontinuous below a critical temperature. We show that this behavior does not correspond to an equilibrium phase transition modified by the disorder induced by the aerogel structure, but to the disorder-driven critical point predicted for the athermal out-of-equilibrium dynamics of the random-field Ising model. Our results evidence the key role of nonequilibrium effects in the phase transitions of disordered systems.

  4. MICROBIAL POPULATION CHANGES DURING BIOREMEDIATION OF AN EXPERIMENTAL OIL SPILL

    EPA Science Inventory

    Three crude oil bioremediation techniques were applied in a randomized block field experiment simulating a coastal oil-spill. Four treatments (no oil control, oil alone, oil + nutrients, and oil + nutrients + an indigenous inoculum) were applied. In-situ microbial community str...

  5. Quantum-key-distribution protocol with pseudorandom bases

    NASA Astrophysics Data System (ADS)

    Trushechkin, A. S.; Tregubov, P. A.; Kiktenko, E. O.; Kurochkin, Y. V.; Fedorov, A. K.

    2018-01-01

    Quantum key distribution (QKD) offers a way of establishing information-theoretically secure communications. An important part of QKD technology is a high-quality random number generator for quantum-state preparation and for post-processing procedures. In this work, we consider a class of prepare-and-measure QKD protocols that utilize additional pseudorandomness in the preparation of quantum states. We study one such protocol and analyze its security against the intercept-resend attack. We demonstrate that, for single-photon sources, the considered protocol gives better secret key rates than the BB84 and the asymmetric BB84 protocols. However, the protocol critically requires single-photon sources.
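The intercept-resend attack analyzed here can be illustrated on plain BB84: an eavesdropper who measures every qubit in a random basis and resends it randomizes roughly half of the sifted positions, producing the well-known ~25% error rate that the legitimate parties can detect. This toy simulation is a generic BB84 illustration, not the authors' pseudorandom-basis protocol.

```python
import random

def bb84_qber(n_bits=20000, eavesdrop=True, seed=1):
    """Estimate the quantum bit error rate of BB84 under an
    intercept-resend attack, via a classical Monte Carlo model."""
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n_bits):
        bit, a_basis = rng.randrange(2), rng.randrange(2)
        value, basis = bit, a_basis
        if eavesdrop:
            e_basis = rng.randrange(2)
            if e_basis != basis:
                value = rng.randrange(2)  # wrong-basis measurement randomizes
            basis = e_basis               # Eve resends in her own basis
        b_basis = rng.randrange(2)
        result = value if b_basis == basis else rng.randrange(2)
        if b_basis == a_basis:            # sifting: keep matching bases
            sifted += 1
            errors += (result != bit)
    return errors / sifted
```

Without the attacker the simulated QBER is zero; with the attacker it concentrates near 0.25.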

  6. High performance frame synchronization for continuous variable quantum key distribution systems.

    PubMed

    Lin, Dakai; Huang, Peng; Huang, Duan; Wang, Chao; Peng, Jinye; Zeng, Guihua

    2015-08-24

    Considering a practical continuous-variable quantum key distribution (CVQKD) system, synchronization is of significant importance, as it is hardly possible to extract secret keys from unsynchronized strings. In this paper, we propose a high-performance frame synchronization method for CVQKD systems which is capable of operating under low signal-to-noise ratios (SNRs) and is compatible with the random phase shift induced by the quantum channel. A practical low-complexity implementation of this method is presented and its performance analyzed. By adjusting the length of the synchronization frame, the method works well over a large range of SNR values, which paves the way for longer-distance CVQKD.
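A minimal sketch of phase-tolerant frame synchronization: slide the known sync sequence along the received complex samples and score each offset by the magnitude of the cross-correlation, which is invariant to a fixed random phase rotation of the channel. The sequence length, offset, and noise levels below are arbitrary illustrative choices, not the paper's parameters.

```python
import cmath
import random

def find_frame_offset(received, sync):
    """Return the offset maximizing |cross-correlation| with the known
    sync sequence; |.| makes the estimate phase-rotation invariant."""
    best, best_score = 0, -1.0
    for k in range(len(received) - len(sync) + 1):
        corr = sum(received[k + i] * sync[i].conjugate()
                   for i in range(len(sync)))
        if abs(corr) > best_score:
            best, best_score = k, abs(corr)
    return best

# Demo: embed a phase-rotated, noisy copy of the sync frame at offset 37.
rng = random.Random(0)
sync = [complex(rng.choice([-1, 1]), 0) for _ in range(64)]
phase = cmath.exp(1j * 1.3)  # unknown channel phase
stream = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(200)]
for i, s in enumerate(sync):
    stream[37 + i] = s * phase + complex(rng.gauss(0, 0.3), rng.gauss(0, 0.3))
```

Lengthening the sync sequence raises the correlation peak relative to the noise floor, which is how the method trades overhead for operation at lower SNR.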

  7. Encryption key distribution via chaos synchronization

    PubMed Central

    Keuninckx, Lars; Soriano, Miguel C.; Fischer, Ingo; Mirasso, Claudio R.; Nguimdo, Romain M.; Van der Sande, Guy

    2017-01-01

    We present a novel encryption scheme, wherein an encryption key is generated by two distant complex nonlinear units, forced into synchronization by a chaotic driver. The concept is sufficiently generic to be implemented on photonic, optoelectronic, or electronic platforms. The method for generating the key bitstream from the chaotic signals is reconfigurable. Although derived from a deterministic process, the obtained bit series fulfill the randomness conditions defined by the National Institute of Standards and Technology test suite. We demonstrate the feasibility of our concept on an electronic delay-oscillator circuit and test its robustness against attacks using a state-of-the-art system-identification method. PMID:28233876
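The core mechanism, two remote nonlinear units driven into synchrony by a common chaotic signal, can be sketched with logistic maps standing in for the delay oscillators (an assumption for illustration; the paper's units are delay circuits, and its bit-extraction rule is more elaborate than a simple threshold).

```python
def logistic(x):
    """Chaotic logistic map at parameter 3.99."""
    return 3.99 * x * (1.0 - x)

def response(drive, x0, eps=0.8):
    """A nonlinear unit strongly coupled (eps) to a common drive signal.
    With eps = 0.8 each update contracts the difference between two such
    units (since 0.2 * max|f'| < 1), so different initial states converge."""
    x, out = x0, []
    for d in drive:
        x = (1.0 - eps) * logistic(x) + eps * d
        out.append(x)
    return out

# Common chaotic driver, shared by both parties.
d, drive = 0.123, []
for _ in range(3000):
    d = logistic(d)
    drive.append(d)

alice = response(drive, x0=0.3)
bob = response(drive, x0=0.9)

# After a transient both orbits agree; threshold the tail into key bits.
key_a = [1 if x > 0.5 else 0 for x in alice[1000:]]
key_b = [1 if x > 0.5 else 0 for x in bob[1000:]]
```

The driver itself never carries the key bits; only the synchronized responses do, which is the security intuition behind the scheme.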

  8. Key rate for calibration robust entanglement based BB84 quantum key distribution protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gittsovich, O.; Moroder, T.

    2014-12-04

    We apply the approach of verifying entanglement based solely on knowledge of the dimension of the underlying physical system to the entanglement-based version of the BB84 quantum key distribution protocol. We show that the familiar one-way key rate formula already holds if one merely assumes that one of the parties is measuring a qubit; no further assumptions about the measurement are needed.
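For reference, the familiar one-way key rate formula for BB84 is, in its standard Shor-Preskill form, r = 1 − h(e_bit) − h(e_phase), with h the binary entropy; a small sketch:

```python
import math

def h2(e):
    """Binary entropy in bits."""
    if e in (0.0, 1.0):
        return 0.0
    return -e * math.log2(e) - (1 - e) * math.log2(1 - e)

def bb84_one_way_rate(e_bit, e_phase):
    """Shor-Preskill one-way key rate: positive only when both error
    rates are below the well-known ~11% threshold."""
    return 1.0 - h2(e_bit) - h2(e_phase)
```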

  9. A systematic review of the evidence base for telehospice.

    PubMed

    Oliver, Debra Parker; Demiris, George; Wittenberg-Lyles, Elaine; Washington, Karla; Day, Tami; Novak, Hannah

    2012-01-01

    Abstract The use of telehealth technologies to overcome geographic distances in the delivery of hospice care has been termed telehospice. Although telehospice research has been conducted over the last 10 years, little is known about the comprehensive findings within the field. The purpose of this systematic review was to focus on available research and answer the question, What is the state of the evidence related to telehospice services? The review was limited to studies published in the English language and indexed between January 1, 2000 and March 23, 2010. Indexed databases included PubMed and PsycINFO and were searched with specified key words. Only research published in peer-reviewed journals and reporting empirical data, rather than opinion or editorials, was included. A two-part scoring framework was modified and applied to assess the methodological rigor and pertinence of each study. Scoring criteria allowed the evaluation of both quantitative and qualitative methodologies. Twenty-six studies were identified with the search strategy. Although limited in number and in strength, studies have evaluated the use of a variety of technologies, attitudes toward use by providers and consumers, clinical outcomes, barriers, readiness, and cost. A small evidence base for telehospice has emerged over the last 10 years. Although the evidence is of medium strength, its pertinence is strong. The evidence base could be strengthened with randomized trials, additional clinical-outcome-focused research in larger randomized samples, and qualitative studies with better-described samples.

  10. Sticker charts: a method for improving adherence to treatment of chronic diseases in children.

    PubMed

    Luersen, Kara; Davis, Scott A; Kaplan, Sebastian G; Abel, Troy D; Winchester, Woodrow W; Feldman, Steven R

    2012-01-01

    Poor adherence is a common problem and may be an underlying cause of poor clinical outcomes. In pediatric populations, positive reinforcement techniques such as sticker charts may increase motivation to adhere to treatment regimens. To review the use of sticker charts to improve adherence in children with chronic disease, Medline and PsycINFO searches were conducted using the key words "positive reinforcement OR behavior therapy" and "adherence OR patient compliance" and "child." Randomized controlled retrospective cohort or single-subject-design studies were selected. Studies reporting adherence to the medical treatment of chronic disease in children using positive reinforcement techniques were included in the analysis. The systematic search was supplemented by identifying additional studies identified through the reference lists and authors of the initial articles found. Positive reinforcement techniques such as sticker charts increase adherence to medical treatment regimens. In several studies, this effect was maintained for months after the initial intervention. Better adherence correlated with better clinical outcomes in some, but not all, studies. Few studies examining the use of sticker charts were identified. Although single-subject-design studies are useful in establishing the effect of a behavioral intervention, larger randomized controlled trials would help determine the precise efficacy of sticker chart interventions. Adherence to medical treatments in children can be increased using sticker charts or other positive reinforcement techniques. This may be an effective means to encourage children with atopic dermatitis to apply their medications and improve clinical outcomes. © 2012 Wiley Periodicals, Inc.

  11. A model of tuberculosis transmission and intervention strategies in an urban residential area.

    PubMed

    Pienaar, Elsje; Fluitt, Aaron M; Whitney, Scott E; Freifeld, Alison G; Viljoen, Hendrik J

    2010-04-01

    The model herein aims to explore the dynamics of the spread of tuberculosis (TB) in an informal settlement or township. The population is divided into households of various sizes and also based on commuting status. The model dynamics distinguishes between three distinct social patterns: the exposure of commuters during travel, random diurnal interaction and familial exposure at night. Following the general SLIR models, the population is further segmented into susceptible (S), exposed/latently infected (L), active/infectious (I), and recovered (R) individuals. During the daytime, commuters travel on public transport, while non-commuters randomly interact in the community to mimic chance encounters with infectious persons. At night, each family interacts and sleeps together in the home. The risk of exposure to TB is based on the proximity, duration, and frequency of encounters with infectious persons. The model is applied to a hypothetical population to explore the effects of different intervention strategies including vaccination, wearing of masks during the commute, prophylactic treatment of latent infections and more effective case-finding and treatment. The most important findings of the model are: (1) members of larger families are responsible for more disease transmissions than those from smaller families, (2) daily commutes on public transport provide ideal conditions for transmission of the disease, (3) improved diagnosis and treatment has the greatest impact on the spread of the disease, and (4) detecting TB at the first clinic visit, when patients are still smear negative, is key. Copyright 2010 Elsevier Ltd. All rights reserved.
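A minimal SLIR update illustrating the compartment flow described above: susceptibles are exposed on contact with infectious individuals, latent infections activate, and active cases recover under treatment. The parameter values are illustrative only; the paper's model additionally structures the population by household size and commuting status.

```python
def slir_step(s, l, i, r, beta=0.3, sigma=0.01, gamma=0.05):
    """One day of a minimal SLIR model:
    S -> L on contact with I (rate beta, frequency-dependent),
    L -> I by activation (rate sigma),
    I -> R by recovery/treatment (rate gamma)."""
    n = s + l + i + r
    new_l = beta * s * i / n
    new_i = sigma * l
    new_r = gamma * i
    return s - new_l, l + new_l - new_i, i + new_i - new_r, r + new_r

# Simulate one year in a population of 10,000 with 100 initial cases.
state = (9900.0, 0.0, 100.0, 0.0)
for _ in range(365):
    state = slir_step(*state)
```

Interventions map naturally onto the parameters: masks on public transport lower beta, prophylactic treatment of latent infections lowers sigma, and earlier case-finding raises gamma.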

  12. Learning to experience side effects after antidepressant intake - Results from a randomized, controlled, double-blind study.

    PubMed

    Rheker, Julia; Winkler, Alexander; Doering, Bettina K; Rief, Winfried

    2017-02-01

    Side effects play a key role in patients' failure to take antidepressants. There is evidence that verbal suggestions and informed consent elicit expectations that can in turn trigger the occurrence of side effects. Prior experience or learning mechanisms are also assumed to contribute to the development of side effects, although their role has not been thoroughly investigated. In this study, we examined whether an antidepressant's side effects can be learned via Pavlovian conditioning. Participants (n = 39) were randomly allocated to one of two groups and were exposed to a classical conditioning procedure. During acquisition, 19 participants received amitriptyline and 20 participants received a placebo pill. Pills were taken for four nights together with a novel-tasting drink. After a washout phase, both groups received a placebo pill together with the novel-tasting drink (evocation). Side effects were assessed via the Generic Assessment of Side Effects Scale prior to acquisition (baseline), after acquisition, and after evocation. A score of antidepressant-specific side effects was calculated. Participants taking amitriptyline reported significantly more antidepressant-specific side effects after acquisition compared to both baseline and the placebo group. After evocation, participants who underwent the conditioning procedure with amitriptyline reported significantly more antidepressant-specific side effects than those who never received amitriptyline, even though both groups received a placebo. Our results indicate that antidepressant side effects can be learned using a conditioning paradigm and evoked via a placebo pill when applied with the same contextual factors as the verum.

  13. [Identification and sampling of people with migration background for epidemiological studies in Germany].

    PubMed

    Reiss, K; Makarova, N; Spallek, J; Zeeb, H; Razum, O

    2013-06-01

    In 2009, 19.6% of the population of Germany either had migrated themselves or were the offspring of people with migration experience. Migrants differ from the autochthonous German population in terms of health status, health awareness and health behaviour. To further investigate the health situation of migrants in Germany, epidemiological studies are needed. Such studies can employ existing databases which provide detailed information on migration status. Otherwise, onomastic or toponomastic procedures can be applied to identify people with migration background. If migrants have to be recruited into an epidemiological study, this can be done register-based (e. g., data from registration offices or telephone lists), based on residential location (random-route or random-walk procedure), via snowball sampling (e. g., through key persons) or via settings (e. g., school entry examination). An oversampling of people with migration background is not sufficient to avoid systematic bias in the sample due to non-participation. Additional measures have to be taken to increase access and raise participation rates. Personal contacting, multilingual instruments, multilingual interviewers and extensive public relations increase access and willingness to participate. Empirical evidence on 'successful' recruitment strategies for studies with migrants is still lacking in epidemiology and health sciences in Germany. The choice of the recruitment strategy as well as the measures to raise accessibility and willingness to participate depend on the available resources, the research question and the specific migrant target group. © Georg Thieme Verlag KG Stuttgart · New York.

  14. 75 FR 3444 - Availability of Seats for the Florida Keys National Marine Sanctuary Advisory Council

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-21

    ... Cultural Resources (alternate), and Tourism--Upper Keys (member). Applicants are chosen based upon their particular expertise and experience in relation to the seat for which they are applying; community and...

  15. Problems of Understanding English Ironic Expressions by M.A. Students of Applied Linguistics at Mu'tah University in Jordan

    ERIC Educational Resources Information Center

    Al Khawaldeh, Suhaib

    2015-01-01

    The present study attempts to investigate the problems of understanding English ironic expressions by M.A. students of Applied Linguistics at Mu'tah University in Jordan. This quantitative and qualitative study includes 15 M.A. students of Applied Linguistics at Mu'tah University. The participants were selected randomly. Two research instruments…

  16. Beliefs of Applied Studio Faculty on Desirable Traits of Prospective Music Education Majors: A Pilot Study

    ERIC Educational Resources Information Center

    Royston, Natalie Steele; Springer, D. Gregory

    2015-01-01

    The purpose of this pilot study was to examine the beliefs of applied music faculty on desirable traits of prospective music education majors. Researcher-designed surveys were sent electronically to applied music faculty at 12 National Association of Schools of Music-accredited institutions randomly selected from each of the four major divisions…

  17. Physical-layer security analysis of a quantum-noise randomized cipher based on the wire-tap channel model.

    PubMed

    Jiao, Haisong; Pu, Tao; Zheng, Jilin; Xiang, Peng; Fang, Tao

    2017-05-15

    The physical-layer security of a quantum-noise randomized cipher (QNRC) system is, for the first time, quantitatively evaluated with secrecy capacity employed as the performance metric. Treating quantum noise as a channel advantage for the legitimate parties over eavesdroppers, specific wire-tap models for both the key channel and the data channel are built, with channel outputs yielded by quantum heterodyne measurement; general expressions for the secrecy capacities of both channels are derived, and the matching codes are proved to be uniformly distributed. The maximal achievable secrecy rate of the system is proposed, under which secrecy of both the key and the data is guaranteed. The influences of various system parameters on the secrecy capacities are assessed in detail. The results indicate that QNRC combined with proper channel codes is a promising framework for secure long-distance, high-speed communication, with rates that can be orders of magnitude higher than the perfect-secrecy rates of other encryption systems. Even if the eavesdropper intercepts more signal power than the legitimate receiver, secure communication (up to Gb/s) can still be achievable. Moreover, the secrecy of the running key is found to be the main constraint on the system's maximal secrecy rate.
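The secrecy-capacity metric used here is, in the classical Gaussian wire-tap setting, the excess of the legitimate channel's capacity over the eavesdropper's, floored at zero. The sketch below is that generic textbook quantity, not the paper's heterodyne-measurement channel model; in QNRC the legitimate advantage arises because quantum measurement noise degrades the eavesdropper more than the keyed receiver.

```python
import math

def gaussian_secrecy_capacity(snr_main, snr_eve):
    """Secrecy capacity of a degraded Gaussian wire-tap channel, in bits
    per channel use: C_s = max(0, C_main - C_eve)."""
    c_main = 0.5 * math.log2(1.0 + snr_main)
    c_eve = 0.5 * math.log2(1.0 + snr_eve)
    return max(0.0, c_main - c_eve)
```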

  18. Multiple-image encryption based on double random phase encoding and compressive sensing by using a measurement array preprocessed with orthogonal-basis matrices

    NASA Astrophysics Data System (ADS)

    Zhang, Luozhi; Zhou, Yuanyuan; Huo, Dongming; Li, Jinxi; Zhou, Xin

    2018-09-01

    A method is presented for multiple-image encryption using a combination of orthogonal encoding and compressive sensing based on double random phase encoding. As an original idea in optical encryption, it is demonstrated theoretically and implemented by using orthogonal-basis matrices to build a modified measurement array that is projected onto the images. In this method, all the images can be compressed in parallel into a stochastic signal and diffused into stationary white noise. Meanwhile, each single image can be separately re-established with the proper decryption-key combination through block-wise rather than whole-image reconstruction, which greatly reduces the data and decryption-time costs; this may be promising both for multi-user multiplexing and for huge-image encryption/decryption. Besides, the security of the method is characterized by the bit length of the key, and its parallelism is investigated as well. Simulations and discussions also examine the decryption quality and the correlation coefficient under a series of sampling rates, occlusion attacks, keys with various error rates, etc.
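The double-random-phase-encoding core of the scheme can be sketched with the classic 4f-style DRPE operations: mask the image with one random phase key in the spatial domain, transform, mask with a second key in the Fourier domain, and transform back. The compressive-sensing measurement with orthogonal-basis matrices, which is the paper's contribution, is not reproduced here; the image is a random stand-in.

```python
import numpy as np

rng = np.random.default_rng(7)
img = rng.random((32, 32))  # stand-in for an input image

# Two statistically independent random phase masks act as the keys.
phi1 = np.exp(2j * np.pi * rng.random(img.shape))
phi2 = np.exp(2j * np.pi * rng.random(img.shape))

# Encryption: spatial-domain mask, then Fourier-domain mask.
cipher = np.fft.ifft2(np.fft.fft2(img * phi1) * phi2)

# Decryption inverts the steps with the conjugate keys.
recovered = np.abs(
    np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phi2)) * np.conj(phi1)
)
```

Without both keys the ciphertext is statistically indistinguishable from stationary white noise, which is the property the abstract's diffusion step relies on.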

  19. Predicting 30-day Hospital Readmission with Publicly Available Administrative Database. A Conditional Logistic Regression Modeling Approach.

    PubMed

    Zhu, K; Lou, Z; Zhou, J; Ballester, N; Kong, N; Parikh, P

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". Hospital readmissions raise healthcare costs and cause significant distress to providers and patients. It is therefore of great interest to healthcare organizations to predict which patients are at risk of being readmitted to their hospitals. However, current logistic-regression-based risk prediction models have limited predictive power when applied to hospital administrative data. Meanwhile, although decision trees and random forests have been applied, they tend to be too complex for hospital practitioners to interpret. Objective: to explore the use of conditional logistic regression to increase prediction accuracy. We analyzed an HCUP statewide inpatient discharge record dataset, which includes patient demographics, clinical data, and care-utilization data from California. We extracted records of heart-failure Medicare beneficiaries who had an inpatient stay during an 11-month period. We corrected the data-imbalance issue with under-sampling. We first applied standard logistic regression and a decision tree to obtain influential variables and derive practically meaningful decision rules. We then stratified the original data set accordingly and applied logistic regression to each data stratum. We further explored the effect of interacting variables in the logistic regression modeling. We conducted cross-validation to assess the overall prediction performance of conditional logistic regression (CLR) and compared it with standard classification models. The developed CLR models outperformed several standard classification models (e.g., straightforward logistic regression, stepwise logistic regression, random forest, support vector machine). For example, the best CLR model improved classification accuracy by nearly 20% over the straightforward logistic regression model. 
Furthermore, the developed CLR models achieved sensitivity more than 10% better than the standard classification models, which translates to correctly labeling an additional 400 - 500 readmissions of heart-failure patients in the state of California over a year. Finally, several key predictors identified from the HCUP data include the disposition location at discharge, the number of chronic conditions, and the number of acute procedures. It would be beneficial to apply simple decision rules obtained from the decision tree in an ad hoc manner to guide cohort stratification, and potentially to explore pairwise interactions between influential predictors when building the logistic regression models for different data strata. Judicious use of the ad hoc CLR models developed here offers insights for future development of readmission prediction models, which can lead to better intuition in identifying high-risk patients and developing effective post-discharge care strategies. Lastly, this paper is expected to raise awareness of the need to collect data on additional markers and to develop the database infrastructure required for larger-scale exploratory studies on readmission risk prediction.
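The stratify-then-fit idea behind the CLR models, split the data on a simple rule a decision tree might surface and fit one logistic regression per stratum, can be sketched on synthetic data. Everything below (the features, the splitting rule, the outcome) is fabricated for illustration and does not use the HCUP data.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain logistic regression via gradient descent (bias folded in)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
# Stratify on a rule a decision tree might surface (e.g. many vs few
# chronic conditions); here simply the sign of the first feature.
strata = (X[:, 0] > 0).astype(int)
# Simulated outcome whose feature effect flips between strata, so a
# single pooled model cannot capture it but per-stratum models can.
y = ((np.where(strata == 1, 1.0, -1.0) * X[:, 1]) > 0).astype(int)

models = {s: fit_logistic(X[strata == s], y[strata == s]) for s in (0, 1)}
pred = np.concatenate([predict(models[s], X[strata == s]) for s in (0, 1)])
truth = np.concatenate([y[strata == s] for s in (0, 1)])
acc = (pred == truth).mean()
```

On this construction a pooled logistic regression hovers near chance, while the stratified models recover the flipped effect, which mirrors why CLR can outperform a single global model.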

  20. Fast image interpolation via random forests.

    PubMed

    Huang, Jun-Jie; Siu, Wan-Chi; Liu, Tian-Rui

    2015-10-01

    This paper proposes a two-stage framework for fast image interpolation via random forests (FIRF). The proposed FIRF method gives high accuracy while requiring little computation. The underlying idea is to apply random forests to classify the natural-image patch space into numerous subspaces and to learn, for each subspace, a linear regression model that maps a low-resolution image patch to a high-resolution image patch. The FIRF framework consists of two stages: Stage 1 removes most of the ringing and aliasing artifacts in the initial bicubic-interpolated image, while Stage 2 further refines the Stage 1 result. By varying the number of decision trees in the random forests and the number of stages applied, the proposed FIRF method achieves computationally scalable image interpolation. Extensive experimental results show that the proposed FIRF(3, 2) method achieves more than 0.3 dB improvement in peak signal-to-noise ratio over the state-of-the-art nonlocal autoregressive modeling (NARM) method. Moreover, the proposed FIRF(1, 1) obtains similar or better results than NARM while taking only 0.3% of its computation time.
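The key FIRF ingredient, partitioning patch space and learning one linear regressor per subspace, can be illustrated on toy data. Here a hand-made split stands in for the random-forest partitioning, and the "patches" are synthetic vectors; the point is only that per-subspace linear maps fit what a single global linear map cannot.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy patch pairs: a low-res patch x (4 values) -> a high-res value y.
# Two latent patch classes (think edges vs flats) with different maps.
X = rng.normal(size=(600, 4))
cls = (X[:, 0] > X[:, 3]).astype(int)          # crude "patch type" feature
W = {0: np.array([0.1, 0.4, 0.4, 0.1]),
     1: np.array([0.6, -0.1, -0.1, 0.6])}
y = np.array([X[i] @ W[cls[i]] for i in range(len(X))])

# One global linear model over all patches...
w_global, *_ = np.linalg.lstsq(X, y, rcond=None)
err_global = np.abs(X @ w_global - y).mean()

# ...versus one linear model per subspace (the FIRF idea; the forest
# would learn this partition from data).
err_split = 0.0
for c in (0, 1):
    m = cls == c
    w_c, *_ = np.linalg.lstsq(X[m], y[m], rcond=None)
    err_split += np.abs(X[m] @ w_c - y[m]).sum()
err_split /= len(y)
```

The per-subspace models fit essentially exactly while the global model leaves a large residual, which is the accuracy-for-cheap-inference trade the paper exploits (a forest lookup plus one small matrix multiply per patch).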

  1. Random electric field instabilities of relaxor ferroelectrics

    NASA Astrophysics Data System (ADS)

    Arce-Gamboa, José R.; Guzmán-Verri, Gian G.

    2017-06-01

    Relaxor ferroelectrics are complex oxide materials that offer a rather unique setting in which to study the effects of compositional disorder on phase transitions. Here, we study the effects of quenched cubic random electric fields on the lattice instabilities that lead to a ferroelectric transition and show, within a microscopic model and a statistical-mechanical solution, that even weak compositional disorder can prohibit the development of long-range order, and that a random-field state with anisotropic, power-law correlations of polarization emerges from the combined effect of the characteristic dipole forces and the inherent charge disorder. We compare with and reproduce several key experimental observations in the well-studied relaxor PbMg1/3Nb2/3O3-PbTiO3.

  2. Measuring CAMD technique performance. 2. How "druglike" are drugs? Implications of Random test set selection exemplified using druglikeness classification models.

    PubMed

    Good, Andrew C; Hermsmeier, Mark A

    2007-01-01

    Research into the advancement of computer-aided molecular design (CAMD) has a tendency to focus on the discipline of algorithm development. Such efforts often come at the expense of the data-set selection and analysis used in the algorithm's validation. Here we highlight the potential problems this can cause in the context of druglikeness classification. More rigorous efforts are applied to the selection of decoy (nondruglike) molecules from the ACD. Comparisons are made between model performance using the standard technique of random test-set creation and performance on test sets derived from explicit ontological separation by drug class. The dangers of viewing druglike space as sufficiently coherent to permit simple classification are highlighted. In addition, the issues inherent in applying unfiltered data and random test-set selection to (Q)SAR models utilizing large and supposedly heterogeneous databases are discussed.

  3. Discrete gravity on random tensor network and holographic Rényi entropy

    NASA Astrophysics Data System (ADS)

    Han, Muxin; Huang, Shilin

    2017-11-01

    In this paper we apply discrete gravity and Regge calculus to tensor networks and the anti-de Sitter/conformal field theory (AdS/CFT) correspondence. We construct the boundary many-body quantum state |Ψ〉 using random tensor networks as the holographic mapping, applied to the Wheeler-DeWitt wave function of bulk Euclidean discrete gravity in 3 dimensions. The entanglement Rényi entropy of |Ψ〉 is shown to holographically relate to the on-shell action of Einstein gravity on a branched-cover bulk manifold. The resulting Rényi entropy S_n of |Ψ〉 approximates with high precision the Rényi entropy of the ground state in 2-dimensional conformal field theory (CFT); in particular, it reproduces the correct n dependence. Our results develop the framework for realizing the AdS3/CFT2 correspondence on random tensor networks and provide a new proposal for approximating the CFT ground state.

  4. In the eye of the beholder: Inhomogeneous distribution of high-resolution shapes within the random-walk ensemble

    NASA Astrophysics Data System (ADS)

    Müller, Christian L.; Sbalzarini, Ivo F.; van Gunsteren, Wilfred F.; Žagrović, Bojan; Hünenberger, Philippe H.

    2009-06-01

    The concept of high-resolution shapes (also referred to as folds or states, depending on the context) of a polymer chain plays a central role in polymer science, structural biology, bioinformatics, and biopolymer dynamics. However, although the idea of shape is intuitively very useful, there is no unambiguous mathematical definition for this concept. In the present work, the distributions of high-resolution shapes within the ideal random-walk ensembles with N = 3,…,6 beads (or up to N = 10 for some properties) are investigated using a systematic (grid-based) approach based on a simple working definition of shapes relying on the root-mean-square atomic positional deviation as a metric (i.e., to define the distance between pairs of structures) and a single cutoff criterion for the shape assignment. Although the random-walk ensemble appears to represent the paragon of homogeneity and randomness, this analysis reveals that the distribution of shapes within this ensemble, i.e., in the total absence of interatomic interactions characteristic of a specific polymer (beyond the generic connectivity constraint), is significantly inhomogeneous. In particular, a specific (densest) shape occurs with a local probability that is 1.28, 1.79, 2.94, and 10.05 times (N = 3,…,6) higher than the corresponding average over all possible shapes (these results can tentatively be extrapolated to a factor as large as about 10^28 for N = 100). The qualitative results of this analysis lead to a few rather counterintuitive suggestions, namely, that, e.g., (i) a fold classification analysis applied to the random-walk ensemble would lead to the identification of random-walk "folds;" (ii) a clustering analysis applied to the random-walk ensemble would also lead to the identification of random-walk "states" and associated relative free energies; and (iii) a random-walk ensemble of polymer chains could lead to well-defined diffraction patterns in hypothetical fiber or crystal diffraction experiments. 
The inhomogeneous nature of the shape probability distribution identified here for random walks may represent a significant underlying baseline effect in the analysis of real polymer chain ensembles (i.e., in the presence of specific interatomic interactions). As a consequence, a part of what is called a polymer shape may actually reside just "in the eye of the beholder" rather than in the nature of the interactions between the constituting atoms, and the corresponding observation-related bias should be taken into account when drawing conclusions from shape analyses as applied to real structural ensembles.
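The working definition used above, an RMSD metric plus a single cutoff criterion for shape assignment, can be sketched as leader-style clustering of ideal random walks. The RMSD below centers the structures but omits optimal rotational superposition, a simplification of the metric the paper uses; the walk length, cutoff, and clustering rule are illustrative choices.

```python
import math
import random

def random_walk(n, rng):
    """Ideal 3-D random walk: n beads joined by unit steps in
    uniformly random directions."""
    pos = [(0.0, 0.0, 0.0)]
    for _ in range(n - 1):
        z = rng.uniform(-1.0, 1.0)
        t = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(1.0 - z * z)
        x, y, zz = pos[-1]
        pos.append((x + r * math.cos(t), y + r * math.sin(t), zz + z))
    return pos

def rmsd(a, b):
    """Root-mean-square positional deviation after centering (no
    rotational superposition -- a simplified stand-in metric)."""
    ca = [sum(p[i] for p in a) / len(a) for i in range(3)]
    cb = [sum(p[i] for p in b) / len(b) for i in range(3)]
    return math.sqrt(sum(
        sum((p[i] - ca[i] - (q[i] - cb[i])) ** 2 for i in range(3))
        for p, q in zip(a, b)) / len(a))

def assign_shapes(walks, cutoff):
    """Single-cutoff leader clustering: each walk joins the first
    existing shape within `cutoff` RMSD, else founds a new shape."""
    leaders, labels = [], []
    for w in walks:
        for k, ref in enumerate(leaders):
            if rmsd(w, ref) < cutoff:
                labels.append(k)
                break
        else:
            leaders.append(w)
            labels.append(len(leaders) - 1)
    return labels
```

Counting how many walks land in each shape, relative to the total number of shapes found, is the kind of occupancy statistic whose inhomogeneity the paper quantifies.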

  5. In the eye of the beholder: Inhomogeneous distribution of high-resolution shapes within the random-walk ensemble.

    PubMed

    Müller, Christian L; Sbalzarini, Ivo F; van Gunsteren, Wilfred F; Zagrović, Bojan; Hünenberger, Philippe H

    2009-06-07

    The concept of high-resolution shapes (also referred to as folds or states, depending on the context) of a polymer chain plays a central role in polymer science, structural biology, bioinformatics, and biopolymer dynamics. However, although the idea of shape is intuitively very useful, there is no unambiguous mathematical definition for this concept. In the present work, the distributions of high-resolution shapes within the ideal random-walk ensembles with N=3,...,6 beads (or up to N=10 for some properties) are investigated using a systematic (grid-based) approach based on a simple working definition of shapes relying on the root-mean-square atomic positional deviation as a metric (i.e., to define the distance between pairs of structures) and a single cutoff criterion for the shape assignment. Although the random-walk ensemble appears to represent the paragon of homogeneity and randomness, this analysis reveals that the distribution of shapes within this ensemble, i.e., in the total absence of interatomic interactions characteristic of a specific polymer (beyond the generic connectivity constraint), is significantly inhomogeneous. In particular, a specific (densest) shape occurs with a local probability that is 1.28, 1.79, 2.94, and 10.05 times (N=3,...,6) higher than the corresponding average over all possible shapes (these results can tentatively be extrapolated to a factor as large as about 10^28 for N=100). 
The qualitative results of this analysis lead to a few rather counterintuitive suggestions, namely, that, e.g., (i) a fold classification analysis applied to the random-walk ensemble would lead to the identification of random-walk "folds;" (ii) a clustering analysis applied to the random-walk ensemble would also lead to the identification of random-walk "states" and associated relative free energies; and (iii) a random-walk ensemble of polymer chains could lead to well-defined diffraction patterns in hypothetical fiber or crystal diffraction experiments. The inhomogeneous nature of the shape probability distribution identified here for random walks may represent a significant underlying baseline effect in the analysis of real polymer chain ensembles (i.e., in the presence of specific interatomic interactions). As a consequence, a part of what is called a polymer shape may actually reside just "in the eye of the beholder" rather than in the nature of the interactions between the constituting atoms, and the corresponding observation-related bias should be taken into account when drawing conclusions from shape analyses as applied to real structural ensembles.

  6. Correcting oral contraceptive pharmacokinetic alterations due to obesity. A randomized controlled trial

    PubMed Central

    Edelman, Alison B; Cherala, Ganesh; Munar, Myrna Y.; McInnis, Martha; Stanczyk, Frank Z.; Jensen, Jeffrey T

    2014-01-01

    Objective To determine if increasing the hormone dose or eliminating the hormone-free interval improves key pharmacokinetic (PK) alterations caused by obesity during oral contraceptive (OC) use. Study design Obese (BMI ≥ 30 kg/m2), ovulatory, otherwise healthy women received an OC containing 20 mcg ethinyl estradiol (EE)/100 mcg levonorgestrel (LNG) dosed cyclically (21 days of active pills with a 7-day placebo week) for two cycles and were then randomized for two additional cycles to: Continuous Cycling [CC, a dose-neutral arm using the same OC with no hormone-free interval] or Increased Dose [ID, a dose-escalation arm using an OC containing 30 mcg EE/150 mcg LNG cyclically]. During Cycles 2, 3, and 4, outpatient visits were performed to assess maximum serum concentration (Cmax), area under the curve (AUC0-∞), and time to steady state, as well as pharmacodynamics. These key PK parameters were calculated and compared within groups between baseline and treatment cycles. Results A total of 31 women enrolled and completed the study (CC group n = 16; ID group n = 15). Demographics were similar between groups [mean BMI: CC 38 kg/m2 (SD 5.1), ID 41 kg/m2 (SD 7.6)]. At baseline, the key LNG PK parameters did not differ between groups; the average time to reach steady state was 12 days in both groups; Cmax was CC: 3.82 ± 1.28 ng/mL and ID: 3.13 ± 0.87 ng/mL; and AUC0-∞ was CC: 267 ± 115 hr*ng/mL and ID: 199 ± 75 hr*ng/mL. Following randomization, the CC group maintained steady-state serum levels whereas the ID group had a significantly higher Cmax (p < 0.001) but again required 12 days to achieve steady state. However, AUC was not significantly different between CC (412 ± 255 hr*ng/mL) and ID (283 ± 130 hr*ng/mL). Forty-five percent (14/31) of the study population had evidence of an active follicle-like structure prior to randomization; afterwards this decreased to 9% (3/31). 
Conclusion Both increasing the OC dose and continuous dosing appear to counteract the impact of obesity on key OC PK parameters. PMID:25070547
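
    The key PK parameters named above (Cmax, AUC0-∞) can be computed from a concentration-time profile; a sketch with hypothetical (non-study) LNG data, using the linear trapezoidal rule for the AUC and a terminal-slope extrapolation to infinity:

```python
import math

# Hypothetical LNG concentration-time data (hr, ng/mL); not study data.
times = [0, 1, 2, 4, 8, 12, 24]
conc  = [0.0, 2.1, 3.6, 3.1, 2.0, 1.4, 0.6]

cmax = max(conc)                  # maximum serum concentration
tmax = times[conc.index(cmax)]    # time at which Cmax occurs

# AUC(0 - t_last) by the linear trapezoidal rule.
auc_0_t = sum((t2 - t1) * (c1 + c2) / 2.0
              for t1, t2, c1, c2 in zip(times, times[1:], conc, conc[1:]))

# Extrapolate to infinity using a terminal rate constant lambda_z
# estimated from the last two samples.
lam_z = (math.log(conc[-2]) - math.log(conc[-1])) / (times[-1] - times[-2])
auc_0_inf = auc_0_t + conc[-1] / lam_z

print(cmax, tmax, round(auc_0_t, 1), round(auc_0_inf, 1))
```

In practice, lambda_z is fitted by log-linear regression over several terminal points rather than just two; the two-point form keeps the sketch short.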

  7. 17 CFR Appendix A to Part 160 - Model Privacy Form

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... rates and payments; retirement assets; checking account information; employment information; wire... additional or different information, such as a random opt-out number or a truncated account number, to... retirement earnings; apply for financing; apply for a lease; provide account information; give us your...

  8. Can an educational podcast improve the ability of parents of primary school children to assess the reliability of claims made about the benefits and harms of treatments: study protocol for a randomised controlled trial.

    PubMed

    Semakula, Daniel; Nsangi, Allen; Oxman, Matt; Austvoll-Dahlgren, Astrid; Rosenbaum, Sarah; Kaseje, Margaret; Nyirazinyoye, Laetitia; Fretheim, Atle; Chalmers, Iain; Oxman, Andrew D; Sewankambo, Nelson K

    2017-01-21

    Claims made about the effects of treatments are very common in the media and in the population more generally. The ability of individuals to understand and assess such claims can affect their decisions and health outcomes. Many people in both low- and high-income countries lack the skills needed to assess information about the effects of treatments. As part of the Informed Healthcare Choices project, we have prepared a series of podcast episodes to help improve people's ability to assess claims made about treatment effects. We will evaluate the effect of the Informed Healthcare Choices podcast on people's ability to assess claims made about the benefits and harms of treatments. Our study population will be parents of primary school children in schools with limited educational and financial resources in Uganda. This will be a two-arm, parallel-group, individual-randomised trial. We will randomly allocate consenting participants who meet the inclusion criteria for the trial to either listen to nine episodes of the Informed Healthcare Choices podcast (intervention) or to listen to nine typical public service announcements about health issues (control). Each podcast includes a story about a treatment claim, a message about one key concept that we believe is important for people to be able to understand to assess treatment claims, an explanation of how that concept applies to the claim, and a second example illustrating the concept. We designed the Claim Evaluation Tools to measure people's ability to apply key concepts related to assessing claims made about the effects of treatments and making informed health care choices. The Claim Evaluation Tools that we will use include multiple-choice questions addressing each of the nine concepts covered by the podcast. Using the Claim Evaluation Tools, we will measure two primary outcomes: (1) the proportion who 'pass', based on an absolute standard, and (2) the average score. 
As far as we are aware this is the first randomised trial to assess the use of mass media to promote understanding of the key concepts needed to judge claims made about the effects of treatments. Pan African Clinical Trials Registry, PACTR201606001676150. Registered on 12 June 2016. http://www.pactr.org/ATMWeb/appmanager/atm/atmregistry?dar=true&tNo=PACTR201606001676150 .
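
    The 1:1 individual randomisation described above can be sketched as a shuffled balanced list (a simplified scheme; the trial's actual allocation procedure, the participant IDs, and the seed are assumptions for illustration):

```python
import random

def allocate(participant_ids, seed=20160612):
    """1:1 individual randomisation via a shuffled balanced allocation list.
    A simplified sketch; real trials typically use concealed, possibly
    block-stratified allocation."""
    rng = random.Random(seed)
    n = len(participant_ids)
    arms = ["podcast"] * (n // 2) + ["control"] * (n - n // 2)
    rng.shuffle(arms)
    return dict(zip(participant_ids, arms))

alloc = allocate([f"P{i:03d}" for i in range(1, 21)])
print(sum(arm == "podcast" for arm in alloc.values()))  # balanced arms: 10
```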

  9. Secure detection in quantum key distribution by real-time calibration of receiver

    NASA Astrophysics Data System (ADS)

    Marøy, Øystein; Makarov, Vadim; Skaar, Johannes

    2017-12-01

    The single-photon detection efficiency of the detector unit is crucial for the security of common quantum key distribution protocols like Bennett-Brassard 1984 (BB84). A low value for the efficiency indicates a possible eavesdropping attack that exploits the photon receiver’s imperfections. We present a method for estimating the detection efficiency, and calculate the corresponding secure key generation rate. The estimation is done by testing gated detectors using a randomly activated photon source inside the receiver unit. This estimate gives a secure rate for any detector with non-unity single-photon detection efficiency, whether inherent or due to blinding. By adding extra optical components to the receiver, we make sure that the key is extracted from photon states for which our estimate is valid. The result is a quantum key distribution scheme that is secure against any attack that exploits detector imperfections.
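
    The counting step behind such a calibration can be illustrated with a simple sketch (the function name, the counts, and the normal-approximation confidence bound are illustrative assumptions, not the authors' finite-key analysis):

```python
import math

def efficiency_estimate(detections, pulses, z=4.0):
    """Point estimate and a conservative lower bound for the detection
    efficiency, from a calibration run in which an internal photon source
    is randomly activated for a known number of gated pulses."""
    eta = detections / pulses
    # Normal-approximation lower confidence bound (z standard deviations).
    margin = z * math.sqrt(eta * (1.0 - eta) / pulses)
    return eta, max(eta - margin, 0.0)

eta, eta_low = efficiency_estimate(detections=18200, pulses=100000)
print(round(eta, 3), round(eta_low, 3))
```

The lower bound, not the point estimate, is what a conservative key-rate calculation would feed into the secure-rate formula.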

  10. Continuous Variable Quantum Key Distribution Using Polarized Coherent States

    NASA Astrophysics Data System (ADS)

    Vidiella-Barranco, A.; Borelli, L. F. M.

    We discuss a continuous variables method of quantum key distribution employing strongly polarized coherent states of light. The key encoding is performed using the variables known as Stokes parameters, rather than the field quadratures. Their quantum counterpart, the Stokes operators Ŝi (i=1,2,3), constitute a set of non-commuting operators, with the precision of simultaneous measurements of any pair of them limited by an uncertainty-like relation. Alice transmits a conveniently modulated two-mode coherent state, and Bob randomly measures one of the Stokes parameters of the incoming beam. After performing reconciliation and privacy amplification procedures, it is possible to distill a secret common key. We also consider a non-ideal situation, in which coherent states with thermal noise, instead of pure coherent states, are used for encoding.
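
    The Stokes-parameter encoding can be illustrated in the classical (large-amplitude) limit, where the parameters follow directly from the two mode amplitudes (the specific amplitudes below are made-up values, chosen to mimic a strongly polarized beam with a small modulation):

```python
def stokes(ax, ay):
    """Classical Stokes parameters of a two-mode field with complex
    amplitudes ax, ay (the large-amplitude limit of the Stokes operators)."""
    s0 = abs(ax) ** 2 + abs(ay) ** 2
    s1 = abs(ax) ** 2 - abs(ay) ** 2
    s2 = 2.0 * (ax.conjugate() * ay).real
    s3 = 2.0 * (ax.conjugate() * ay).imag
    return s0, s1, s2, s3

# Strongly polarized state: large x-mode amplitude, small modulated y-mode.
s0, s1, s2, s3 = stokes(ax=100.0 + 0j, ay=0.3 + 0.4j)
print(s0, s1, s2, s3)  # fully polarized: s0**2 == s1**2 + s2**2 + s3**2
```

Bob's random choice between S1, S2, and S3 plays the role that the random quadrature choice plays in quadrature-based continuous-variable protocols.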

  11. Morphologies of Primary Silicon in Hypereutectic Al-Si Alloys: Phase-Field Simulation Supported by Key Experiments

    NASA Astrophysics Data System (ADS)

    Wang, Kai; Wei, Ming; Zhang, Lijun; Du, Yong

    2016-04-01

    We realized a three-dimensional visualization of the morphology evolution and the growth behavior of the octahedral primary silicon in hypereutectic Al-20 wt pct Si alloy during solidification at a real length scale by utilizing phase-field simulation coupled with CALPHAD databases, supported by key experiments. Moreover, through two-dimensional cuts of the octahedral primary silicon at random angles, the different morphologies observed in experiments, including triangular, square, trapezoidal, rhombic, pentagonal, and hexagonal shapes, were well reproduced.

  12. Noisy probability judgment, the conjunction fallacy, and rationality: Comment on Costello and Watts (2014).

    PubMed

    Crupi, Vincenzo; Tentori, Katya

    2016-01-01

    According to Costello and Watts (2014), probability theory can account for key findings in human judgment research provided that random noise is embedded in the model. We concur with a number of Costello and Watts's remarks, but challenge the empirical adequacy of their model in one of their key illustrations (the conjunction fallacy) on the basis of recent experimental findings. We also discuss how our argument bears on heuristic and rational thinking.

  13. How the World's Best Schools Stay on Top: Study's Key Findings Pinpoint Practices That Align with Learning Forward

    ERIC Educational Resources Information Center

    Killion, Joellen

    2016-01-01

    Key findings from a new study highlight how Learning Forward's long-standing position on professional learning correlates with practices in high-performing systems in Singapore, Shanghai, Hong Kong, and British Columbia. The purpose of this article is to share key findings from the study so that educators might apply them to strengthening…

  14. Mean dyadic Green's function for a two layer random medium

    NASA Technical Reports Server (NTRS)

    Zuniga, M. A.

    1981-01-01

    The mean dyadic Green's function for a two-layer random medium with arbitrary three-dimensional correlation functions has been obtained with the zeroth-order solution to the Dyson equation by applying the nonlinear approximation. The propagation of the coherent wave in the random medium is similar to that in an anisotropic medium with different propagation constants for the characteristic transverse electric and transverse magnetic polarizations. In the limit of a laminar structure, two propagation constants for each polarization are found to exist.

  15. A Gompertzian model with random effects to cervical cancer growth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mazlan, Mazma Syahidatul Ayuni; Rosli, Norhayati

    2015-05-15

    In this paper, a Gompertzian model with random effects is introduced to describe cervical cancer growth. The parameter values of the mathematical model are estimated via maximum likelihood estimation. We apply a 4-stage stochastic Runge-Kutta (SRK4) method to solve the stochastic model numerically. The efficiency of the mathematical model is measured by comparing the simulated results with the clinical data of cervical cancer growth. Low values of the root mean-square error (RMSE) of the Gompertzian model with random effects indicate good fits.
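
    A minimal sketch of a stochastic Gompertz model of the form dX = rX ln(K/X) dt + σX dW, integrated here with the simpler Euler-Maruyama scheme rather than SRK4; all parameter values are illustrative, not fitted clinical values:

```python
import math, random

def gompertz_em(x0, r, k, sigma, t_end, n_steps, seed=1):
    """Euler-Maruyama path of the stochastic Gompertz model
    dX = r*X*ln(K/X) dt + sigma*X dW  (a lower-order integrator than the
    4-stage stochastic Runge-Kutta used in the paper)."""
    rng = random.Random(seed)
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
        x += r * x * math.log(k / x) * dt + sigma * x * dw
        x = max(x, 1e-9)  # keep the state positive
    return x

# Illustrative parameters: growth saturates near the carrying capacity k.
x_final = gompertz_em(x0=0.5, r=0.3, k=10.0, sigma=0.05, t_end=30.0, n_steps=3000)
print(round(x_final, 2))
```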

  16. Time-correlated gust loads using matched filter theory and random process theory - A new way of looking at things

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Zeiler, Thomas A.; Perry, Boyd, III

    1989-01-01

    This paper describes and illustrates two ways of performing time-correlated gust-load calculations. The first is based on Matched Filter Theory; the second on Random Process Theory. Both approaches yield theoretically identical results, represent novel applications of the theories, are computationally fast, and may be applied to other dynamic-response problems. A theoretical development and example calculations using both the Matched Filter Theory and Random Process Theory approaches are presented.

  17. Transient Oscillations in Mechanical Systems of Automatic Control with Random Parameters

    NASA Astrophysics Data System (ADS)

    Royev, B.; Vinokur, A.; Kulikov, G.

    2018-04-01

    Transient oscillations in mechanical systems of automatic control with random parameters are a relevant but insufficiently studied issue. In this paper, a modified spectral method was applied to investigate the problem. The nature of the dynamic processes and the phase portraits are analyzed as functions of the amplitude and frequency of the external influence. The obtained results show that the dynamic phenomena occurring in systems with random parameters under external influence are complex and require further investigation.

  18. Time-correlated gust loads using Matched-Filter Theory and Random-Process Theory: A new way of looking at things

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Zeiler, Thomas A.; Perry, Boyd, III

    1989-01-01

    Two ways of performing time-correlated gust-load calculations are described and illustrated. The first is based on Matched Filter Theory; the second on Random Process Theory. Both approaches yield theoretically identical results, represent novel applications of the theories, are computationally fast, and may be applied to other dynamic-response problems. A theoretical development and example calculations using both the Matched Filter Theory and Random Process Theory approaches are presented.

  19. A Round-Efficient Authenticated Key Agreement Scheme Based on Extended Chaotic Maps for Group Cloud Meeting.

    PubMed

    Lin, Tsung-Hung; Tsung, Chen-Kun; Lee, Tian-Fu; Wang, Zeng-Bo

    2017-12-03

    Security is a critical issue for business applications. For example, a cloud meeting must provide strong security to maintain communication privacy. Considering the cloud-meeting scenario, we apply an extended chaotic map to present a passwordless group authenticated key agreement scheme, termed Passwordless Group Authentication Key Agreement (PL-GAKA). PL-GAKA improves the computational efficiency of the simple group password-based authenticated key agreement (SGPAKE) proposed by Lee et al. in computing the session key. Since the extended chaotic map has a security level equivalent to the Diffie-Hellman key exchange scheme applied by SGPAKE, the security of PL-GAKA is not sacrificed when improving the computational efficiency. Moreover, PL-GAKA is a passwordless scheme, so password maintenance is unnecessary. Short-term authentication is adopted, with a session key generated dynamically for each cloud meeting, making the communication security stronger than in other protocols. In our analysis, we first prove that each meeting member can get the correct information during the meeting. We then analyze common security issues for the proposed PL-GAKA in terms of session key security, mutual authentication, perfect forward security, and data integrity. Moreover, we demonstrate that communication in PL-GAKA remains secure under replay attacks, impersonation attacks, privileged insider attacks, and stolen-verifier attacks. Finally, an overall comparison shows the performance of PL-GAKA relative to SGPAKE and related solutions.
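
    The algebraic core of extended-chaotic-map key agreement is the semigroup property of Chebyshev polynomials, T_r(T_s(x)) = T_rs(x). A toy Diffie-Hellman-style sketch on [-1, 1] (deployed schemes, including PL-GAKA, work over extended domains or finite fields and add authentication on top; the seed and secret exponents below are made-up values):

```python
import math

def chebyshev(n, x):
    """Chebyshev polynomial T_n(x) on [-1, 1], via T_n(cos t) = cos(n*t)."""
    return math.cos(n * math.acos(x))

x = 0.53          # public seed
r, s = 7, 11      # Alice's and Bob's secret integers
alice_pub = chebyshev(r, x)          # Alice sends T_r(x)
bob_pub = chebyshev(s, x)            # Bob sends T_s(x)
key_alice = chebyshev(r, bob_pub)    # T_r(T_s(x))
key_bob = chebyshev(s, alice_pub)    # T_s(T_r(x))
print(abs(key_alice - key_bob) < 1e-9)  # True: both derive T_{r*s}(x)
```

The equality holds because cos(r*(2πk ± s·t)) = cos(r·s·t) for integer r, so the order of composition does not matter.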

  20. A complexity theory model in science education problem solving: random walks for working memory and mental capacity.

    PubMed

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2003-07-01

    The present study examines the role of limited human channel capacity from a science education perspective. A model of science problem solving has been previously validated by applying concepts and tools of complexity theory (the working-memory random-walk method). The method correlated the subjects' rank-order achievement scores in organic-synthesis chemistry problems with the subjects' working memory capacity. In this work, we apply the same nonlinear approach to a different data set, taken from chemical-equilibrium problem solving. In contrast to the organic-synthesis problems, these problems are algorithmic, require numerical calculations, and have a complex logical structure. As a result, these problems cause deviations from the model and affect the pattern observed with the nonlinear method. In addition to Baddeley's working memory capacity, Pascual-Leone's mental (M-) capacity is examined by the same random-walk method. As the complexity of the problem increases, the fractal dimension of the working memory random walk demonstrates a sudden drop, while the fractal dimension of the M-capacity random walk decreases in a linear fashion. A review of the basic features of the two capacities and their relation is included. The method and findings have consequences for problem solving not only in chemistry and science education, but also in other disciplines.
