Sample records for method key results

  1. Screening of the key volatile organic compounds of Tuber melanosporum fermentation by aroma sensory evaluation combined with principal component analysis

    PubMed Central

    Liu, Rui-Sang; Jin, Guang-Huai; Xiao, Deng-Rong; Li, Hong-Mei; Bai, Feng-Wu; Tang, Ya-Jie

    2015-01-01

    Aroma results from the interplay of volatile organic compounds (VOCs), and the attributes of microbially produced aromas are significantly affected by fermentation conditions. Among the VOCs, only a few contribute to the aroma, so screening and identifying the key VOCs is critical for microbial aroma production. The traditional method is based on gas chromatography-olfactometry (GC-O), which is time-consuming and laborious. Taking the Tuber melanosporum fermentation system as an example, this work developed a new method to screen and identify the key VOCs by combining aroma sensory evaluation with principal component analysis (PCA). First, an aroma sensory evaluation method was developed to screen 34 potential favorite aroma samples from 504 fermentation samples. Second, PCA was employed to screen nine common key VOCs from these 34 samples. Third, seven key VOCs were identified by the traditional method. Finally, all seven key VOCs identified by the traditional method were also identified, along with four others, by the new strategy. These results indicate the reliability of the new method and demonstrate it to be a viable alternative to the traditional method. PMID:26655663
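
A minimal numpy sketch of the PCA screening step described above (the data are synthetic, and the sample counts, the dominant column and the top-3 cut-off are illustrative assumptions, not the paper's values): VOCs with the largest loadings on the leading principal component are flagged as key-VOC candidates.

```python
import numpy as np

# Synthetic stand-in data: rows = fermentation samples, columns = VOC levels.
rng = np.random.default_rng(0)
n_samples, n_vocs = 34, 12
X = rng.normal(size=(n_samples, n_vocs))
X[:, 3] += 3 * rng.normal(size=n_samples)    # make VOC 3 dominate the variance

Xc = X - X.mean(axis=0)                      # mean-center each VOC
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
loadings = np.abs(Vt[0])                     # |loadings| on the first PC
key_vocs = np.argsort(loadings)[::-1][:3]    # top-3 candidate key VOCs
print(key_vocs[0])                           # VOC 3, the injected dominant one
```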

  2. A Routing Path Construction Method for Key Dissemination Messages in Sensor Networks

    PubMed Central

    Moon, Soo Young; Cho, Tae Ho

    2014-01-01

    Authentication is an important security mechanism for detecting forged messages in a sensor network. Each cluster head (CH) in dynamic key distribution schemes forwards a key dissemination message that contains encrypted authentication keys within its cluster to next-hop nodes for the purpose of authentication. The forwarding path of the key dissemination message strongly affects the number of nodes to which the authentication keys in the message are actually distributed. We propose a routing method for the key dissemination messages to increase the number of nodes that obtain the authentication keys. In the proposed method, each node selects next-hop nodes to which the key dissemination message will be forwarded based on secret key indexes, the distance to the sink node, and the energy consumption of its neighbor nodes. The experimental results show that the proposed method can increase by 50–70% the number of nodes to which authentication keys in each cluster are distributed compared to geographic and energy-aware routing (GEAR). In addition, the proposed method can detect false reports earlier by using the distributed authentication keys, and it consumes less energy than GEAR when the false traffic ratio (FTR) is ≥10%. PMID:25136649
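
The next-hop selection rule summarized above can be sketched as follows (the field names, the lexicographic ordering and the fan-out are assumptions for illustration, not the paper's exact scoring): neighbors are ranked by shared secret-key indexes, then by closeness to the sink, then by remaining energy.

```python
# Hypothetical neighbor records; a real node would learn these from its
# neighbors' broadcasts.
def select_next_hops(neighbors, my_key_indexes, fanout=2):
    def score(n):
        shared = len(set(n["key_indexes"]) & set(my_key_indexes))
        # favour key overlap, then closeness to the sink, then residual energy
        return (shared, -n["dist_to_sink"], n["energy"])
    return sorted(neighbors, key=score, reverse=True)[:fanout]

neighbors = [
    {"id": "A", "key_indexes": [1, 4], "dist_to_sink": 3, "energy": 0.9},
    {"id": "B", "key_indexes": [7],    "dist_to_sink": 2, "energy": 0.5},
    {"id": "C", "key_indexes": [1],    "dist_to_sink": 5, "energy": 0.7},
]
chosen = select_next_hops(neighbors, my_key_indexes=[1, 4])
print([n["id"] for n in chosen])   # ['A', 'C']: A shares two key indexes
```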

  3. Simultaneous transmission for an encrypted image and a double random-phase encryption key

    NASA Astrophysics Data System (ADS)

    Yuan, Sheng; Zhou, Xin; Li, Da-Hai; Zhou, Ding-Fu

    2007-06-01

    We propose a method to simultaneously transmit a double random-phase encryption key and an encrypted image, making use of the fact that an acceptable decryption result can be obtained when only part of the encrypted image data is used in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, the double random-phase encryption key is encoded as an encoded key by the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image, which carries both the encrypted image and the encoded key, is delivered to the receiver. With this method, the receiver can obtain an acceptable result, and secure transmission is guaranteed by the RSA cipher system.
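
A textbook-RSA sketch of the key-encoding step described above (tiny primes and integer-coded phase values, for illustration only; a real system would use a full-size modulus and proper padding):

```python
# Textbook RSA with toy parameters (p, q, e are classic demo values).
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17                                    # public exponent, coprime with phi
d = pow(e, -1, phi)                       # private exponent (Python >= 3.8)

phase_key = [12, 200, 33]                 # stand-in for sampled random-phase values
cipher = [pow(m, e, n) for m in phase_key]     # what the sender transmits
recovered = [pow(c, d, n) for c in cipher]     # what the receiver decodes
print(recovered == phase_key)                  # True
```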

  4. Simultaneous transmission for an encrypted image and a double random-phase encryption key.

    PubMed

    Yuan, Sheng; Zhou, Xin; Li, Da-hai; Zhou, Ding-fu

    2007-06-20

    We propose a method to simultaneously transmit a double random-phase encryption key and an encrypted image, making use of the fact that an acceptable decryption result can be obtained when only part of the encrypted image data is used in the decryption process. First, the original image data are encoded as an encrypted image by a double random-phase encryption technique. Second, the double random-phase encryption key is encoded as an encoded key by the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. Then the amplitude of the encrypted image is modulated by the encoded key to form what we call an encoded image. Finally, the encoded image, which carries both the encrypted image and the encoded key, is delivered to the receiver. With this method, the receiver can obtain an acceptable result, and secure transmission is guaranteed by the RSA cipher system.

  5. Improved key-rate bounds for practical decoy-state quantum-key-distribution systems

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng

    2017-01-01

    The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these rigorous techniques and those obtained under the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of the conventional Gaussian approximation.
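
The kind of gap discussed above can be illustrated numerically. This uses a generic Chernoff bound, P(X >= (1+delta)mu) <= exp(-delta^2 mu/3), against a Gaussian tail with variance mu; neither is the paper's improved bound, and the numbers are arbitrary. For the same failure probability, the Gaussian fluctuation interval comes out noticeably narrower.

```python
import math

def gaussian_halfwidth(mu, eps):
    # half-width z*sqrt(mu), with z solving 0.5*erfc(z/sqrt(2)) = eps (bisection)
    lo, hi = 0.0, 20.0
    for _ in range(200):
        z = (lo + hi) / 2
        if 0.5 * math.erfc(z / math.sqrt(2)) > eps:
            lo = z
        else:
            hi = z
    return z * math.sqrt(mu)

def chernoff_halfwidth(mu, eps):
    # solve exp(-delta^2 * mu / 3) = eps for delta, half-width delta*mu
    delta = math.sqrt(3 * math.log(1 / eps) / mu)
    return delta * mu

mu, eps = 1e4, 1e-10
print(gaussian_halfwidth(mu, eps) < chernoff_halfwidth(mu, eps))   # True
```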

  6. An Empirical Analysis of the Cascade Secret Key Reconciliation Protocol for Quantum Key Distribution

    DTIC Science & Technology

    2011-09-01

    …performance with the parity checks within each pass increasing and, as a result, the processing time is expected to increase as well. A conclusion is drawn… …timely manner has driven efforts to develop new key distribution methods. The most promising method is Quantum Key Distribution (QKD) and is… …thank the QKD Project Team for all of the insight and support they provided in such a short time period. Thanks are especially in order for my…

  7. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    NASA Astrophysics Data System (ADS)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
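
A toy version of parameter estimation from Bernoulli-sampled data (stdlib only; the counts and the failure probability are arbitrary, and this is a generic binomial tail bound rather than the paper's analysis): since each signal joins the test set independently, the observed error count is binomial, and bisection finds the largest error probability still consistent with the observation at failure probability eps.

```python
import math

def binom_cdf(k, n, q):
    # P(X <= k) for X ~ Binomial(n, q)
    return sum(math.comb(n, i) * q**i * (1 - q)**(n - i) for i in range(k + 1))

def upper_bound(k_obs, n, eps=1e-6):
    # largest q whose lower tail at k_obs still exceeds eps
    lo, hi = k_obs / n, 1.0
    for _ in range(60):                      # bisection on q
        mid = (lo + hi) / 2
        if binom_cdf(k_obs, n, mid) > eps:
            lo = mid
        else:
            hi = mid
    return hi

ub = upper_bound(k_obs=25, n=1000)           # observed 25 errors in 1000 samples
print(round(ub, 3))                          # roughly 0.06, well above 25/1000
```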

  8. Security Analysis and Improvements to the PsychoPass Method

    PubMed Central

    2013-01-01

    Background In a recent paper, Cipresso et al proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Objective To perform a security analysis of the PsychoPass method and outline its limitations and possible improvements. Methods We used brute-force analysis and dictionary-attack analysis of the PsychoPass method to outline its weaknesses. Results The first issue with the PsychoPass method is that it requires the password to be reproduced on the same keyboard layout as was used to generate it. The second issue is a security weakness: although the produced password is 24 characters long, it is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version requires, first, the use of the SHIFT and ALT-GR keys in combination with other keys and, second, that the keys be 1-2 key distances apart. Conclusions The proposed improved PsychoPass method yields passwords that could be broken only in hundreds of years with current computing power. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength. PMID:23942458

  9. Anomaly Monitoring Method for Key Components of Satellite

    PubMed Central

    Fan, Linjun; Xiao, Weidong; Tang, Jun

    2014-01-01

    This paper presents a fault diagnosis method for key components of a satellite, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of a failure analysis of lithium-ion batteries (LIBs), we divided LIB failures into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (R e) and the charge transfer resistance (R ct) as the key parameters for state estimation. Then, from actual in-orbit telemetry data for the key parameters of LIBs, we obtained the actual residual value (R X) and healthy residual value (R L) of the LIBs via MSET state estimation, and, using these residual values (R X and R L), we detected anomalous states via SPRT anomaly detection. Lastly, we applied AMM to an LIB example and, according to the results, validated the feasibility and effectiveness of AMM by comparing it with the results of the threshold detection method (TDM). PMID:24587703
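
The SPRT stage of AMM can be sketched as a standard Gaussian sequential test (the means, variances, error probabilities and residual values below are illustrative assumptions, not the satellite's figures): the log-likelihood ratio of the residuals is accumulated until it crosses an accept or reject threshold.

```python
import math

def sprt(residuals, m1=1.0, s=1.0, alpha=0.01, beta=0.01):
    # H0: residual ~ N(0, s^2) (healthy); H1: residual ~ N(m1, s^2) (anomaly)
    upper = math.log((1 - beta) / alpha)      # crossing this accepts H1
    lower = math.log(beta / (1 - alpha))      # crossing this accepts H0
    llr = 0.0
    for i, r in enumerate(residuals):
        llr += (m1 * r - m1 * m1 / 2) / (s * s)   # Gaussian LLR increment
        if llr >= upper:
            return "anomaly", i
        if llr <= lower:
            return "healthy", i
    return "undecided", len(residuals) - 1

print(sprt([1.6, 1.4, 1.7, 1.5, 1.8, 1.3]))   # ('anomaly', 4)
```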

  10. Small Private Key MQPKS on an Embedded Microprocessor

    PubMed Central

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-01-01

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011) a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but using a random number generator is much more expensive than computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors of a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and speeds up signature generation and verification by 5.78% and 12.19%, respectively, compared with the previous results of CHES2012. PMID:24651722

  11. Small private key MQPKS on an embedded microprocessor.

    PubMed

    Seo, Hwajeong; Kim, Jihyun; Choi, Jongseok; Park, Taehwan; Liu, Zhe; Kim, Howon

    2014-03-19

    Multivariate quadratic (MQ) cryptography requires the use of long public and private keys to ensure a sufficient security level, but this is not favorable to embedded systems, which have limited system resources. Recently, various approaches to MQ cryptography using reduced public keys have been studied. As a result, at CHES2011 (Cryptographic Hardware and Embedded Systems, 2011) a small public key MQ scheme was proposed, and its feasible implementation on an embedded microprocessor was reported at CHES2012. However, the implementation of a small private key MQ scheme was not reported. For efficient implementation, random number generators can contribute to reducing the key size, but using a random number generator is much more expensive than computing MQ on modern microprocessors. Therefore, no feasible results have been reported on embedded microprocessors. In this paper, we propose a feasible implementation on embedded microprocessors of a small private key MQ scheme using a pseudo-random number generator and a hash function based on a block cipher exploiting a hardware Advanced Encryption Standard (AES) accelerator. To speed up the performance, we apply various implementation methods, including parallel computation, on-the-fly computation, optimized logarithm representation, vinegar monomials and assembly programming. The proposed method reduces the private key size by about 99.9% and speeds up signature generation and verification by 5.78% and 12.19%, respectively, compared with the previous results of CHES2012.

  12. A knowledge-driven approach to biomedical document conceptualization.

    PubMed

    Zheng, Hai-Tao; Borchert, Charles; Jiang, Yong

    2010-06-01

    Biomedical document conceptualization is the process of clustering biomedical documents based on ontology-represented domain knowledge. The result of this process is the representation of the biomedical documents by a set of key concepts and their relationships. Most clustering methods cluster documents based on invariant domain knowledge. The objective of this work is to develop an effective method to cluster biomedical documents based on various user-specified ontologies, so that users can exploit the concept structures of documents more effectively. We develop a flexible framework that allows users to specify the knowledge bases in the form of ontologies. Based on the user-specified ontologies, we develop a key concept induction algorithm, which uses latent semantic analysis to identify key concepts and cluster documents. A corpus-related ontology generation algorithm is developed to generate the concept structures of documents. On two biomedical datasets, we evaluate the proposed method and five other clustering algorithms. The clustering results of the proposed method outperform the five other algorithms in terms of key concept identification. On the first biomedical dataset, our method achieves F-measure values of 0.7294 and 0.5294 based on the MeSH ontology and Gene Ontology (GO), respectively; on the second, it achieves F-measure values of 0.6751 and 0.6746 based on the MeSH ontology and GO, respectively. Both results outperform the five other algorithms in terms of F-measure. Based on the MeSH ontology and GO, the generated corpus-related ontologies show informative conceptual structures. The proposed method enables users to specify the domain knowledge needed to exploit the conceptual structures of biomedical document collections. In addition, the proposed method is able to extract the key concepts and cluster the documents with relatively high precision. Copyright 2010 Elsevier B.V. All rights reserved.
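
A latent-semantic-analysis toy in the spirit of the key concept induction step (the document-concept counts are invented; the real method works on ontology-annotated documents): an SVD of the document-concept matrix exposes the concept that dominates each latent topic.

```python
import numpy as np

# Rows = documents, columns = ontology concepts (e.g. MeSH terms); the
# counts are made up so that docs 0-1 share concept 0 and docs 2-3 concept 1.
docs = np.array([
    [3, 0, 1, 0],
    [2, 0, 1, 0],
    [0, 4, 0, 1],
    [0, 3, 0, 2],
], dtype=float)

U, S, Vt = np.linalg.svd(docs, full_matrices=False)
# the concept with the largest |weight| in each of the two leading topics
key_concept_per_topic = np.argmax(np.abs(Vt[:2]), axis=1)
print(key_concept_per_topic)     # [1 0]: topic 1 is about concept 1, topic 2 about concept 0
```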

  13. Practical and Secure Recovery of Disk Encryption Key Using Smart Cards

    NASA Astrophysics Data System (ADS)

    Omote, Kazumasa; Kato, Kazuhiko

    In key-recovery methods using smart cards, a user can recover the disk encryption key in cooperation with the system administrator, even if the user has lost the smart card containing the disk encryption key. However, in most key-recovery methods the disk encryption key is known to the system administrator in advance, so the user's disk data may be read by the system administrator. Furthermore, if the disk encryption key is not known to the system administrator in advance, it is difficult to achieve key authentication. In this paper, we propose a scheme that enables recovery of the disk encryption key when the user's smart card is lost. In our scheme, the disk encryption key is not preserved anywhere, so the system administrator cannot know the key before the key-recovery phase. Only someone who has the user's smart card and knows the user's password can decrypt that user's disk data. Furthermore, we measured the processing time required for user authentication in an experimental environment using a virtual machine monitor, and found that this processing time is short enough to be practical.
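
One common way to realize "the key is not preserved anywhere" is to re-derive it on demand from a card-held secret plus the user's password. The sketch below is a generic PBKDF2 illustration of that idea, not the authors' protocol; the secret, password and iteration count are placeholders.

```python
import hashlib
import secrets

card_secret = secrets.token_bytes(32)          # lives only on the smart card
password = b"correct horse battery staple"     # known only to the user

def derive_disk_key(card_secret, password):
    # the disk key exists only transiently, re-derived at each unlock
    return hashlib.pbkdf2_hmac("sha256", password, card_secret, 100_000)

k1 = derive_disk_key(card_secret, password)
k2 = derive_disk_key(card_secret, password)
print(k1 == k2)    # True: deterministic, so the key never needs to be stored
```

Neither the card alone nor the administrator can reproduce the key without the password, matching the access property described in the abstract.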

  14. A text zero-watermarking method based on keyword dense interval

    NASA Astrophysics Data System (ADS)

    Yang, Fan; Zhu, Yuesheng; Jiang, Yifeng; Qing, Yin

    2017-07-01

    Digital watermarking has been recognized as a useful technology for the copyright protection and authentication of digital information. However, few previous methods have focused on the key content of the digital carrier. The idea of protecting the key content is more targeted and can be applied to different kinds of digital information, including text, image and video. In this paper, we take text as the research object and propose a text zero-watermarking method that uses the keyword dense interval (KDI) as the key content. First, we construct the zero-watermarking model by introducing the concept of the KDI and giving a method for KDI extraction. Second, we design the detection model, which includes secondary generation of the zero-watermark and a similarity computation method for the keyword distribution. Experiments are carried out, and the results show that the proposed method performs better than other available methods, especially against sentence-transformation and synonym-substitution attacks.

  15. A Bayesian Framework for Human Body Pose Tracking from Depth Image Sequences

    PubMed Central

    Zhu, Youding; Fujimura, Kikuo

    2010-01-01

    This paper addresses the problem of accurate and robust tracking of 3D human body pose from depth image sequences. Recovering the large number of degrees of freedom in human body movements from a depth image sequence is challenging due to the need to resolve the depth ambiguity caused by self-occlusions and the difficulty of recovering from tracking failure. Human body poses can be estimated through model fitting using dense correspondences between depth data and an articulated human model (the local optimization method). Although this usually achieves high accuracy owing to the dense correspondences, it may fail to recover from tracking failure. Alternatively, human pose may be reconstructed by detecting and tracking human body anatomical landmarks (key points) based on low-level depth image analysis. While this key-point based method is robust and recovers from tracking failure, its pose estimation accuracy depends solely on the image-based localization accuracy of the key points. To address these limitations, we present a flexible Bayesian framework for integrating the pose estimation results obtained by the key-point based and local optimization methods. Experimental results and a performance comparison are presented to demonstrate the effectiveness of the proposed approach. PMID:22399933
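
The integration step can be illustrated with the standard precision-weighted Gaussian fusion rule (a 1-D toy stand-in for the paper's full Bayesian framework; the angles and variances are invented): the fused estimate is pulled toward whichever source reports lower variance.

```python
def fuse(mu_a, var_a, mu_b, var_b):
    # product of two Gaussian estimates: precision-weighted mean, combined variance
    w_a, w_b = 1 / var_a, 1 / var_b
    mu = (w_a * mu_a + w_b * mu_b) / (w_a + w_b)
    var = 1 / (w_a + w_b)
    return mu, var

# local-optimization estimate (accurate -> low variance) vs key-point
# estimate (robust but coarse -> higher variance), in degrees
joint_angle, var = fuse(30.0, 1.0, 38.0, 4.0)
print(joint_angle, var)    # 31.6 0.8: pulled toward the more certain estimate
```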

  16. KEY COMPARISON: Final report on CCQM-K69 key comparison: Testosterone glucuronide in human urine

    NASA Astrophysics Data System (ADS)

    Liu, Fong-Ha; Mackay, Lindsey; Murby, John

    2010-01-01

    The CCQM-K69 key comparison of testosterone glucuronide in human urine was organized under the auspices of the CCQM Organic Analysis Working Group (OAWG). The National Measurement Institute Australia (NMIA) acted as the coordinating laboratory for the comparison. The samples distributed for the key comparison were prepared at NMIA with funding from the World Anti-Doping Agency (WADA). WADA granted approval for this material to be used for the intercomparison provided the distribution and handling of the material were strictly controlled. Three national metrology institutes (NMIs)/designated institutes (DIs) developed reference methods and submitted data for the key comparison, along with two other laboratories who participated in the parallel pilot study. A good selection of analytical methods and sample workup procedures was displayed in the results submitted, considering the complexities of the matrix involved. The comparability of measurement results was successfully demonstrated by the participating NMIs. Only the key comparison data were used to estimate the key comparison reference value (KCRV), using the arithmetic mean approach. The reported expanded uncertainties for results ranged from 3.7% to 6.7% at the 95% level of confidence, and all results agreed within the expanded uncertainty of the KCRV. This text appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).

  17. Secure Multicast Tree Structure Generation Method for Directed Diffusion Using A* Algorithms

    NASA Astrophysics Data System (ADS)

    Kim, Jin Myoung; Lee, Hae Young; Cho, Tae Ho

    The application of wireless sensor networks to areas such as combat field surveillance, terrorist tracking, and highway traffic monitoring requires secure communication among the sensor nodes within the networks. Logical key hierarchy (LKH) is a tree-based key management model which provides secure group communication. When a sensor node is added to or evicted from the communication group, LKH updates the group key in order to ensure the security of the communications. In order to efficiently update the group key in directed diffusion, we propose a method for secure multicast tree structure generation, an extension to LKH that reduces the number of re-keying messages by considering the addition and eviction ratios of the history data. For the generation of the proposed key tree structure, the A* algorithm is applied, in which the branching factor at each level can take on different values. The experimental results demonstrate the efficiency of the proposed key tree structure against existing key tree structures with fixed branching factors.
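
Why the branching factor matters can be seen with a rough re-keying count (a generic LKH-style estimate with an assumed cost model, not the paper's A*-built tree): when a member leaves, every key on its root path is re-issued, and each re-issued key is encrypted separately for up to b children of that node.

```python
def tree_depth(n_members, b):
    # smallest d with b**d >= n_members (depth of a balanced b-ary key tree)
    d = 0
    while b ** d < n_members:
        d += 1
    return d

def rekey_messages(n_members, b):
    # one updated key per level on the leaving member's root path,
    # each encrypted for up to b sibling subtrees
    return tree_depth(n_members, b) * b

for b in (2, 4, 8):
    print(b, rekey_messages(1024, b))   # 2 -> 20, 4 -> 20, 8 -> 32
```

The trade-off between depth and fan-out is exactly what a per-level variable branching factor, as in the proposed structure, can exploit.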

  18. Application Mail Tracking Using RSA Algorithm As Security Data and HOT-Fit a Model for Evaluation System

    NASA Astrophysics Data System (ADS)

    Permadi, Ginanjar Setyo; Adi, Kusworo; Gernowo, Rahmad

    2018-02-01

    The RSA algorithm provides security in the process of sending messages or data by using two keys, a private key and a public key. In this research, the comprehensive HOT-Fit evaluation method is used to ensure and directly assess whether the system that was built meets its goals. The purpose of this research is to build a mail-sending information system that applies the RSA algorithm for security, and to evaluate it using the HOT-Fit method, in order to produce a suitable system for the physics faculty. The security of the RSA algorithm lies in the difficulty of factoring large numbers into their prime factors; these prime factors must be recovered to obtain the private key. HOT-Fit assesses three aspects: technology, judged from system status, system quality and service quality; human, judged from system use and user satisfaction; and organization, judged from structure and environment. The result is a message-sending tracking system, assessed with the evaluation acquired.

  19. A fast image matching algorithm based on key points

    NASA Astrophysics Data System (ADS)

    Wang, Huilin; Wang, Ying; An, Ru; Yan, Peng

    2014-05-01

    Image matching is a very important technique in image processing. It has been widely used for object recognition and tracking, image retrieval, three-dimensional vision, change detection, aircraft position estimation, and multi-image registration. Based on the requirements that craft navigation places on a matching algorithm, such as speed, accuracy and adaptability, a fast key point image matching method is investigated and developed. The main research tasks include: (1) Developing an improved fast key point detection approach using a self-adapting threshold for Features from Accelerated Segment Test (FAST). A method of calculating a self-adapting threshold was introduced for images with different contrast. The Hessian matrix was adopted to eliminate unstable edge points in order to obtain key points with higher stability. This approach to detecting key points features a small amount of computation, high positioning accuracy and strong anti-noise ability. (2) PCA-SIFT is utilized to describe the key points. A 128-dimensional vector is formed by the SIFT method for each extracted key point. A low-dimensional feature space was established from the eigenvectors of all the key points, and each descriptor was projected onto this feature space to form a low-dimensional vector. The key points were then re-described by these dimension-reduced vectors. After PCA, the descriptor was reduced from the original 128 dimensions to 20. This reduces the dimensionality of the approximate nearest-neighbour search, thereby increasing overall speed. (3) The distance ratio between the nearest neighbour and the second-nearest neighbour is used as the criterion for initial matching points, from which the original matched point pairs are obtained. Based on an analysis of the common methods for eliminating false matching point pairs (e.g. RANSAC (random sample consensus) and Hough-transform clustering), a heuristic local geometric restriction strategy is adopted to further discard falsely matched point pairs. (4) An affine transformation model is introduced to correct the coordinate difference between the real-time image and the reference image, resulting in the matching of the two images. SPOT5 remote sensing images captured on different dates and airborne images captured with different flight attitudes were used to test the method's matching accuracy, operation time and robustness to rotation. Results show the effectiveness of the approach.
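
The PCA dimension-reduction step (item 2 above) can be sketched with numpy (the descriptors here are random stand-ins; a real pipeline would use SIFT descriptors computed from images): a 20-vector basis is learned from all key points, and every 128-D descriptor is projected onto it.

```python
import numpy as np

rng = np.random.default_rng(1)
descriptors = rng.normal(size=(500, 128))   # one 128-D SIFT descriptor per key point

mean = descriptors.mean(axis=0)
_, _, Vt = np.linalg.svd(descriptors - mean, full_matrices=False)
basis = Vt[:20]                             # top-20 principal directions
reduced = (descriptors - mean) @ basis.T    # project: 500 x 20
print(reduced.shape)                        # (500, 20)
```

Nearest-neighbour search for matching then runs in the 20-D space, which is the speed-up the abstract describes.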

  20. Key frame extraction based on spatiotemporal motion trajectory

    NASA Astrophysics Data System (ADS)

    Zhang, Yunzuo; Tao, Ran; Zhang, Feng

    2015-05-01

    A spatiotemporal motion trajectory accurately reflects changes of motion state. Motivated by this observation, this letter proposes a method for key frame extraction based on the motion trajectory on the spatiotemporal slice. Unlike the well-known motion-related methods, the proposed method utilizes the inflexions of the motion trajectories of all moving objects on the spatiotemporal slice. Experimental results show that the proposed method achieves performance similar to the state-of-the-art methods based on motion energy or acceleration on single-object video, and better performance on multi-object video.
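
A one-dimensional toy of the inflexion idea (the trajectory values are invented; the paper works on spatiotemporal slices of real video): frames where the trajectory reverses direction, i.e. where the per-frame displacement changes sign, are selected as key frames.

```python
def inflexion_frames(trajectory):
    # pick frames where the motion direction flips between successive steps
    keys = []
    for i in range(1, len(trajectory) - 1):
        d1 = trajectory[i] - trajectory[i - 1]
        d2 = trajectory[i + 1] - trajectory[i]
        if d1 * d2 < 0:                 # velocity changes sign -> turning point
            keys.append(i)
    return keys

traj = [0, 1, 2, 3, 2, 1, 2, 3, 4]      # object moves right, back, right again
print(inflexion_frames(traj))            # [3, 5]: the two motion reversals
```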

  1. International Perspective on Teaching Human Sexuality

    ERIC Educational Resources Information Center

    Wylie, Kevan; Weerakoon, Patricia

    2010-01-01

    Objective: The authors outline international training programs in human sexuality. Methods: The authors reviewed the international literature and Internet resources to identify key training opportunities and curricula, with particular emphasis on training opportunities for psychiatrists. Results: The authors outline key resources and training…

  2. Comparing hair-morphology and molecular methods to identify fecal samples from Neotropical felids

    PubMed Central

    Alberts, Carlos C.; Saranholi, Bruno H.; Frei, Fernando; Galetti, Pedro M.

    2017-01-01

    To avoid certain problems encountered with more-traditional and invasive methods in behavioral-ecology studies of mammalian predators, such as felids, molecular approaches have been employed to identify feces found in the field. However, this method requires a complete molecular biology laboratory, and usually also requires very fresh fecal samples to avoid DNA degradation. Both conditions are normally absent in the field. To address these difficulties, identification based on morphological characters (length, color, banding, scales and medullar patterns) of hairs found in feces could be employed as an alternative. In this study we constructed a morphological identification key for guard hairs of eight Neotropical felids (jaguar, oncilla, Geoffroy's cat, margay, ocelot, Pampas cat, puma and jaguarundi) and compared its efficiency to that of a molecular identification method, using the ATP6 region as a marker. For this molecular approach, we simulated some field conditions by postponing sample-conservation procedures. A blind test of the identification key obtained a nearly 70% overall success rate, which we considered equivalent to or better than the results of some molecular methods (probably due to DNA degradation) found in other studies. The jaguar, puma and jaguarundi could be unequivocally discriminated from any other Neotropical felid. On a scale ranging from inadequate to excellent, the key proved poor only for the margay, with only 30% of its hairs successfully identified; success rates for the remaining species (the oncilla, Geoffroy's cat, ocelot and Pampas cat) were intermediate. Complementary information about the known distributions of felid populations may be necessary to substantially improve the results obtained with the key. Our own molecular results were even better, since all blind-tested samples were correctly identified. Some of these identifications were made from samples kept in suboptimal conditions, with some samples remaining outdoors for up to seven days, simulating conditions in the field. It appears that both methods can be used, depending on the available laboratory facilities and on the expected results. PMID:28880947

  3. Measurement-Device-Independent Quantum Key Distribution Over a 404 km Optical Fiber.

    PubMed

    Yin, Hua-Lei; Chen, Teng-Yun; Yu, Zong-Wen; Liu, Hui; You, Li-Xing; Zhou, Yi-Heng; Chen, Si-Jing; Mao, Yingqiu; Huang, Ming-Qi; Zhang, Wei-Jun; Chen, Hao; Li, Ming Jun; Nolan, Daniel; Zhou, Fei; Jiang, Xiao; Wang, Zhen; Zhang, Qiang; Wang, Xiang-Bin; Pan, Jian-Wei

    2016-11-04

    Measurement-device-independent quantum key distribution (MDIQKD) with the decoy-state method negates security threats of both the imperfect single-photon source and detection losses. Lengthening the distance and improving the key rate of quantum key distribution (QKD) are vital issues in practical applications of QKD. Herein, we report the results of MDIQKD over 404 km of ultralow-loss optical fiber and 311 km of a standard optical fiber while employing an optimized four-intensity decoy-state method. This record-breaking implementation of the MDIQKD method not only provides a new distance record for both MDIQKD and all types of QKD systems but also, more significantly, achieves a distance that the traditional Bennett-Brassard 1984 QKD would not be able to achieve with the same detection devices even with ideal single-photon sources. This work represents a significant step toward proving and developing feasible long-distance QKD.

  4. Round-robin differential-phase-shift quantum key distribution with heralded pair-coherent sources

    NASA Astrophysics Data System (ADS)

    Wang, Le; Zhao, Shengmei

    2017-04-01

    The round-robin differential-phase-shift (RRDPS) quantum key distribution (QKD) scheme provides an effective way to overcome signal disturbance from the transmission process. However, most RRDPS-QKD schemes use weak coherent pulses (WCPs) as a replacement for the perfect single-photon source. Since the heralded pair-coherent source (HPCS) can efficiently remove the shortcomings of WCPs, we propose an RRDPS-QKD scheme with an HPCS in this paper. Both the infinite-intensity decoy-state method and the practical three-intensity decoy-state method are adopted to discuss the tight bound on the key rate of the proposed scheme. The results show that the HPCS is a better candidate for replacing the perfect single-photon source: both the key rate and the transmission distance are greatly increased in comparison with WCPs when the length of the pulse trains is small. At the same time, the performance of the proposed scheme using three-intensity decoy states is close to that using infinite-intensity decoy states when the length of the pulse trains is small.

  5. Exploiting domain information for Word Sense Disambiguation of medical documents

    PubMed Central

    Agirre, Eneko; Soroa, Aitor

    2011-01-01

    Objective Current techniques for knowledge-based Word Sense Disambiguation (WSD) of ambiguous biomedical terms rely on relations in the Unified Medical Language System Metathesaurus but do not take into account the domain of the target documents. The authors' goal is to improve these methods by using information about the topic of the document in which the ambiguous term appears. Design The authors proposed and implemented several methods to extract lists of key terms associated with Medical Subject Heading terms. These key terms are used to represent the document topic in a knowledge-based WSD system. They are applied both alone and in combination with local context. Measurements A standard measure of accuracy was calculated over the set of target words in the widely used National Library of Medicine WSD dataset. Results and discussion The authors report a significant improvement when combining those key terms with local context, showing that domain information improves the results of a WSD system based on the Unified Medical Language System Metathesaurus alone. The best results were obtained using key terms obtained by relevance feedback and weighted by inverse document frequency. PMID:21900701
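    The idf weighting of key terms described above can be sketched in a few lines. This is an illustrative reconstruction only: the toy "documents" stand in for the key-term lists tied to Medical Subject Heading terms, and the function name is ours.

```python
import math
from collections import Counter

def idf_weighted_key_terms(docs, top_k=5):
    """Rank candidate key terms by inverse document frequency (sketch)."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc.lower().split()))      # document frequency per term
    idf = {t: math.log(n / df[t]) for t in df}   # rare terms score highest
    return sorted(idf, key=idf.get, reverse=True)[:top_k]
```

    Terms that occur in every document (e.g. a domain-wide word like "heart" in a cardiology corpus) receive an idf of zero and drop out, which is the point of the weighting.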

  6. Sarma-based key-group method for rock slope reliability analyses

    NASA Astrophysics Data System (ADS)

    Yarahmadi Bafghi, A. R.; Verdel, T.

    2005-08-01

    The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for the stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsible blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We discuss herein the hypothesis behind this new method, details regarding its implementation, and its validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods can be implemented within the KGM and SKGM framework in order to account for the uncertainty in physical and mechanical data (density, cohesion and angle of friction). We then show how such reliability analyses can be introduced into the SKGM to give rise to the probabilistic SKGM (PSKGM), and how it can be used for rock slope reliability analyses.
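    To make the FOSM step concrete, here is a minimal first-order second-moment sketch. The factor-of-safety function and the means and standard deviations of cohesion and friction angle below are invented for illustration and are not taken from the paper.

```python
import math

def fosm(fs, means, stds, eps=1e-6):
    """First-order second-moment propagation: returns the mean and std of the
    factor of safety FS, plus the reliability index beta = (mu - 1) / sigma."""
    mu = fs(*means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        shifted = list(means)
        shifted[i] = m + eps
        dfs_dxi = (fs(*shifted) - mu) / eps   # numerical partial derivative
        var += (dfs_dxi * s) ** 2             # first-order variance term
    sigma = math.sqrt(var)
    return mu, sigma, (mu - 1.0) / sigma      # failure is defined as FS < 1

# hypothetical FS model in cohesion c (kPa) and friction angle phi (degrees)
fs = lambda c, phi: 0.02 * c + 1.2 * math.tan(math.radians(phi)) / 0.9
mu, sigma, beta = fosm(fs, means=[30.0, 30.0], stds=[5.0, 3.0])
```

    The failure probability then follows as P_f = Phi(-beta), with Phi the standard normal cumulative distribution function.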

  7. Monitoring the Future: National Results on Adolescent Drug Use. Overview of Key Findings, 2006

    ERIC Educational Resources Information Center

    Johnston, Lloyd D.; O'Malley, Patrick M.; Bachman, Jerald G.; Schulenberg, John E.

    2007-01-01

    This report provides a summary of drug use trends from a survey of nearly 50,000 eighth-, tenth-, and twelfth- grade students nationwide. It also includes perceived risk, personal disapproval, and perceived availability of each drug by this group. A synopsis of the methods used in the study and an overview of the key results from the 2006 survey…

  8. Transferring of speech movements from video to 3D face space.

    PubMed

    Pei, Yuru; Zha, Hongbin

    2007-01-01

    We present a novel method for transferring speech animation recorded in low quality videos to high resolution 3D face models. The basic idea is to synthesize the animated faces by an interpolation based on a small set of 3D key face shapes which span a 3D face space. The 3D key shapes are extracted by an unsupervised learning process in 2D video space to form a set of 2D visemes which are then mapped to the 3D face space. The learning process consists of two main phases: 1) Isomap-based nonlinear dimensionality reduction to embed the video speech movements into a low-dimensional manifold and 2) K-means clustering in the low-dimensional space to extract 2D key viseme frames. Our main contribution is that we use the Isomap-based learning method to extract intrinsic geometry of the speech video space and thus to make it possible to define the 3D key viseme shapes. To do so, we need only to capture a limited number of 3D key face models by using a general 3D scanner. Moreover, we also develop a skull movement recovery method based on simple anatomical structures to enhance 3D realism in local mouth movements. Experimental results show that our method can achieve realistic 3D animation effects with a small number of 3D key face models.
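    The two learning phases map directly onto off-the-shelf components. In this sketch the random matrix stands in for per-frame mouth-region feature vectors, and all parameter values (neighbors, dimensions, number of visemes) are illustrative rather than the paper's.

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
frames = rng.normal(size=(200, 40))   # stand-in for tracked mouth-shape features

# phase 1: Isomap embeds the speech movements into a low-dimensional manifold
embedding = Isomap(n_neighbors=10, n_components=3).fit_transform(frames)

# phase 2: K-means in the manifold; the frame nearest each cluster centre
# serves as a key viseme frame
km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(embedding)
key_frame_ids = [int(np.argmin(np.linalg.norm(embedding - c, axis=1)))
                 for c in km.cluster_centers_]
```

    Only the selected key frames would then need to be captured as full 3D scans, which is the economy the method is after.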

  9. The Efficacy of Multidimensional Constraint Keys in Database Query Performance

    ERIC Educational Resources Information Center

    Cardwell, Leslie K.

    2012-01-01

    This work is intended to introduce a database design method to resolve the two-dimensional complexities inherent in the relational data model and its resulting performance challenges through abstract multidimensional constructs. A multidimensional constraint is derived and utilized to implement an indexed Multidimensional Key (MK) to abstract a…

  10. Dual-Level Security based Cyclic18 Steganographic Method and its Application for Secure Transmission of Keyframes during Wireless Capsule Endoscopy.

    PubMed

    Muhammad, Khan; Sajjad, Muhammad; Baik, Sung Wook

    2016-05-01

    In this paper, the problem of secure transmission of sensitive contents over the public network Internet is addressed by proposing a novel data hiding method in encrypted images with dual-level security. The secret information is divided into three blocks using a specific pattern, followed by an encryption mechanism based on the three-level encryption algorithm (TLEA). The input image is scrambled using a secret key, and the encrypted sub-message blocks are then embedded in the scrambled image by cyclic18 least significant bit (LSB) substitution method, utilizing LSBs and intermediate LSB planes. Furthermore, the cover image and its planes are rotated at different angles using a secret key prior to embedding, deceiving the attacker during data extraction. The usage of message blocks division, TLEA, image scrambling, and the cyclic18 LSB method results in an advanced security system, maintaining the visual transparency of resultant images and increasing the security of embedded data. In addition, employing various secret keys for image scrambling, data encryption, and data hiding using the cyclic18 LSB method makes the data recovery comparatively more challenging for attackers. Experimental results not only validate the effectiveness of the proposed framework in terms of visual quality and security compared to other state-of-the-art methods, but also suggest its feasibility for secure transmission of diagnostically important keyframes to healthcare centers and gastroenterologists during wireless capsule endoscopy.
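    The LSB-substitution core of such schemes is easy to sketch. The fragment below shows only key-driven scrambling plus 1-bit LSB embedding; the TLEA encryption, the intermediate LSB planes, and the cyclic18 rotation pattern of the actual method are omitted, and the function names are ours.

```python
import random

def embed(pixels, bits, key):
    """Hide bits in pixel LSBs, visiting pixels in a key-scrambled order."""
    order = list(range(len(pixels)))
    random.Random(key).shuffle(order)        # secret traversal order
    out = list(pixels)
    for bit, idx in zip(bits, order):
        out[idx] = (out[idx] & ~1) | bit     # overwrite least significant bit
    return out

def extract(pixels, n_bits, key):
    """Recover the hidden bits; requires the same key."""
    order = list(range(len(pixels)))
    random.Random(key).shuffle(order)
    return [pixels[idx] & 1 for idx in order[:n_bits]]
```

    Because each pixel changes by at most one gray level, visual transparency is preserved; without the key, the traversal order, and hence the bit order, is unknown to an attacker.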

  11. Music playing and memory trace: evidence from event-related potentials.

    PubMed

    Kamiyama, Keiko; Katahira, Kentaro; Abla, Dilshat; Hori, Koji; Okanoya, Kazuo

    2010-08-01

    We examined the relationship between motor practice and auditory memory for sound sequences to evaluate the hypothesis that practice involving physical performance might enhance auditory memory. Participants learned two unfamiliar sound sequences using different training methods. Under the key-press condition, they learned a melody while pressing a key during auditory input. Under the no-key-press condition, they listened to another melody without any key pressing. The two melodies were presented alternately, and all participants were trained in both methods. Participants were instructed to pay attention under both conditions. After training, they listened to the two melodies again without pressing keys, and ERPs were recorded. During the ERP recordings, 10% of the tones in these melodies deviated from the originals. The grand-average ERPs showed that the amplitude of mismatch negativity (MMN) elicited by deviant stimuli was larger under the key-press condition than under the no-key-press condition. This effect appeared only in the high absolute pitch group, which included those with a pronounced ability to identify a note without external reference. This result suggests that the effect of training with key pressing was mediated by individual musical skills. Copyright 2010 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.

  12. Security of Distributed-Phase-Reference Quantum Key Distribution

    NASA Astrophysics Data System (ADS)

    Moroder, Tobias; Curty, Marcos; Lim, Charles Ci Wen; Thinh, Le Phuc; Zbinden, Hugo; Gisin, Nicolas

    2012-12-01

    Distributed-phase-reference quantum key distribution stands out for its easy implementation with present day technology. For many years, a full security proof of these schemes in a realistic setting has been elusive. We solve this long-standing problem and present a generic method to prove the security of such protocols against general attacks. To illustrate our result, we provide lower bounds on the key generation rate of a variant of the coherent-one-way quantum key distribution protocol. In contrast to standard predictions, it appears to scale quadratically with the system transmittance.

  13. Performance of device-independent quantum key distribution

    NASA Astrophysics Data System (ADS)

    Cao, Zhu; Zhao, Qi; Ma, Xiongfeng

    2016-07-01

    Quantum key distribution provides information-theoretically secure communication. In practice, device imperfections may jeopardise the system security. Device-independent quantum key distribution solves this problem by providing secure keys even when the quantum devices are untrusted and uncharacterized. Following a recent security proof of device-independent quantum key distribution, we improve the key rate by tightening the parameter choice in the security proof. In practice, where the system is lossy, we further improve the key rate by taking into account the loss position information. From our numerical simulation, our method can outperform existing results. Meanwhile, we outline clear experimental requirements for implementing device-independent quantum key distribution: the maximal tolerable error rate is 1.6%, the minimal required transmittance is 97.3%, and the minimal required visibility is 96.8%.

  14. Teaching Cardiac Examination Skills

    PubMed Central

    Smith, Christopher A; Hart, Avery S; Sadowski, Laura S; Riddle, Janet; Evans, Arthur T; Clarke, Peter M; Ganschow, Pamela S; Mason, Ellen; Sequeira, Winston; Wang, Yue

    2006-01-01

    OBJECTIVE To determine if structured teaching of bedside cardiac examination skills improves medical residents' examination technique and their identification of key clinical findings. DESIGN Firm-based single-blinded controlled trial. SETTING Inpatient service at a university-affiliated public teaching hospital. PARTICIPANTS Eighty Internal Medicine residents. METHODS The study assessed 2 intervention groups that received 3-hour bedside teaching sessions during their 4-week rotation using either: (1) a traditional teaching method, “demonstration and practice” (DP) (n=26) or (2) an innovative method, “collaborative discovery” (CD) (n=24). The control group received their usual ward teaching sessions (n=25). The main outcome measures were scores on examination technique and correct identification of key clinical findings on an objective structured clinical examination (OSCE). RESULTS All 3 groups had similar scores for both their examination technique and identification of key findings in the preintervention OSCE. After teaching, both intervention groups significantly improved their technical examination skills compared with the control group. The increase was 10% (95% confidence interval [CI] 4% to 17%) for CD versus control and 12% (95% CI 6% to 19%) for DP versus control (both P<.005) equivalent to an additional 3 to 4 examination skills being correctly performed. Improvement in key findings was limited to a 5% (95% CI 2% to 9%) increase for the CD teaching method, CD versus control P=.046, equivalent to the identification of an additional 2 key clinical findings. CONCLUSIONS Both programs of bedside teaching increase the technical examination skills of residents but improvements in the identification of key clinical findings were modest and only demonstrated with a new method of teaching. PMID:16423116

  15. Key Relation Extraction from Biomedical Publications.

    PubMed

    Huang, Lan; Wang, Ye; Gong, Leiguang; Kulikowski, Casimir; Bai, Tian

    2017-01-01

    Within the large body of biomedical knowledge, recent findings and discoveries are most often presented as research articles. Their number has been increasing sharply since the turn of the century, presenting ever-growing challenges for search and discovery of knowledge and information related to specific topics of interest, even with the help of advanced online search tools. This is especially true when the goal of a search is to find or discover key relations between important concepts or topic words. We have developed an innovative method for extracting key relations between concepts from abstracts of articles. The method focuses on relations between keywords or topic words in the articles. Early experiments with the method on PubMed publications have shown promising results in searching and discovering keywords and their relationships that are strongly related to the main topic of an article.

  16. Staff Recommendations Concerning the Delivery of Hepatitis-Related Services in County Health Departments

    ERIC Educational Resources Information Center

    Rainey, Jacquie

    2007-01-01

    Background: This paper describes a portion of a larger evaluation project of a state hepatitis prevention program. Purpose: The study explored the suggestions of key informants related to the delivery of hepatitis services in the state. Methods: Researchers conducted key informant interviews lasting 30 to 45 minutes. Results: Important findings…

  17. Exploiting domain information for Word Sense Disambiguation of medical documents.

    PubMed

    Stevenson, Mark; Agirre, Eneko; Soroa, Aitor

    2012-01-01

    Current techniques for knowledge-based Word Sense Disambiguation (WSD) of ambiguous biomedical terms rely on relations in the Unified Medical Language System Metathesaurus but do not take into account the domain of the target documents. The authors' goal is to improve these methods by using information about the topic of the document in which the ambiguous term appears. The authors proposed and implemented several methods to extract lists of key terms associated with Medical Subject Heading terms. These key terms are used to represent the document topic in a knowledge-based WSD system. They are applied both alone and in combination with local context. A standard measure of accuracy was calculated over the set of target words in the widely used National Library of Medicine WSD dataset. The authors report a significant improvement when combining those key terms with local context, showing that domain information improves the results of a WSD system based on the Unified Medical Language System Metathesaurus alone. The best results were obtained using key terms obtained by relevance feedback and weighted by inverse document frequency.

  18. Multi-bit wavelength coding phase-shift-keying optical steganography based on amplified spontaneous emission noise

    NASA Astrophysics Data System (ADS)

    Wang, Cheng; Wang, Hongxiang; Ji, Yuefeng

    2018-01-01

    In this paper, a multi-bit wavelength coding phase-shift-keying (PSK) optical steganography method is proposed based on amplified spontaneous emission noise and a wavelength selection switch. In this scheme, the assignment codes and the delay length differences provide a large two-dimensional key space. A 2-bit wavelength coding PSK system is simulated to show the efficiency of the proposed method. The simulation results demonstrate that the stealth signal, after encoding and modulation, is well hidden in both the time and spectral domains, given the public channel and the noise existing in the system. Moreover, even if the principle of this scheme and the existence of the stealth channel are known to an eavesdropper, the probability of recovering the stealth data is less than 0.02 if the key is unknown; the scheme can therefore protect the security of the stealth channel more effectively. Furthermore, the stealth channel results in only a 0.48 dB power penalty to the public channel at a 1 × 10^-9 bit error rate, and the public channel has no influence on the reception of the stealth channel.

  19. Reliable Characterization for Pyrolysis Bio-Oils Leads to Enhanced

    Science.gov Websites

    Upgrading Methods | NREL. Science and Technology Highlights: Key Research Results.

  20. Extending key sharing: how to generate a key tightly coupled to a network security policy

    NASA Astrophysics Data System (ADS)

    Kazantzidis, Matheos

    2006-04-01

    Current state-of-the-art security policy technologies, besides their small-scale limitations and the largely manual nature of the accompanying management methods, suffer from a) a lack of real-timeliness in policy implementation and b) vulnerabilities and inflexibility stemming from centralized policy decision making; even if, for example, a policy description or access control database is distributed, the actual decision is often a centralized action and forms a single point of failure for the system. In this paper we present a new fundamental concept that makes it possible to implement a security policy through a systematic and efficient key distribution procedure. Specifically, we extend polynomial Shamir key splitting, in which a global key is split into n parts, any k of which can reconstruct the original key. We present a method in which, instead of any k parts being able to reconstruct the original key, the key can only be reconstructed if the parts are combined as the access control policy describes. This leads to an easily deployable key generation procedure that results in a single key per entity that "knows" its role in the specific access control policy from which it was derived. The system is considered efficient as it may be used to avoid expensive PKI operations or pairwise key distributions, and it provides superior security due to its distributed nature, the fact that the key is tightly coupled to the policy, and the fact that policy changes may be implemented more easily and quickly.
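    The polynomial splitting being extended here is standard Shamir (k, n) secret sharing. Below is a minimal sketch over a prime field; it implements only the plain "any k of n" threshold, not the policy-coupled reconstruction the paper derives, and the field prime is an arbitrary choice for the example.

```python
import random

P = 2**127 - 1  # Mersenne prime; all arithmetic is over GF(P)

def split(secret, n, k, rng=None):
    """Split secret into n shares; any k of them reconstruct it."""
    rng = rng or random.SystemRandom()
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # den^-1 mod P
    return secret
```

    The paper's contribution, on this view, is to replace the flat "any k shares" combination rule with combinations dictated by the access control policy.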

  1. Continuous-variable measurement-device-independent quantum key distribution with virtual photon subtraction

    NASA Astrophysics Data System (ADS)

    Zhao, Yijia; Zhang, Yichen; Xu, Bingjie; Yu, Song; Guo, Hong

    2018-04-01

    The method of improving the performance of continuous-variable quantum key distribution protocols by postselection has recently been proposed and verified. In continuous-variable measurement-device-independent quantum key distribution (CV-MDI QKD) protocols, the measurement results are obtained from an untrusted third party, Charlie. There is still no effective method of improving CV-MDI QKD by postselection with untrusted measurement. We propose a method to improve the performance of the coherent-state CV-MDI QKD protocol by virtual photon subtraction via non-Gaussian postselection. The non-Gaussian postselection of transmitted data is equivalent to an ideal photon subtraction on the two-mode squeezed vacuum state, which is favorable for enhancing the performance of CV-MDI QKD. In the CV-MDI QKD protocol with non-Gaussian postselection, the two users select their own data independently. We demonstrate that the optimal performance of the renovated CV-MDI QKD protocol is obtained with the transmitted data selected only by Alice. By setting appropriate parameters for the virtual photon subtraction, the secret key rate and tolerable excess noise are both improved at long transmission distances. The method provides an effective optimization scheme for the application of CV-MDI QKD protocols.

  2. Simple Web-based interactive key development software (WEBiKEY) and an example key for Kuruna (Poaceae: Bambusoideae)1

    PubMed Central

    Attigala, Lakshmi; De Silva, Nuwan I.; Clark, Lynn G.

    2016-01-01

    Premise of the study: Programs that are user-friendly and freely available for developing Web-based interactive keys are scarce and most of the well-structured applications are relatively expensive. WEBiKEY was developed to enable researchers to easily develop their own Web-based interactive keys with fewer resources. Methods and Results: A Web-based multiaccess identification tool (WEBiKEY) was developed that uses freely available Microsoft ASP.NET technologies and an SQL Server database for Windows-based hosting environments. WEBiKEY was tested for its usability with a sample data set, the temperate woody bamboo genus Kuruna (Poaceae). Conclusions: WEBiKEY is freely available to the public and can be used to develop Web-based interactive keys for any group of species. The interactive key we developed for Kuruna using WEBiKEY enables users to visually inspect characteristics of Kuruna and identify an unknown specimen as one of seven possible species in the genus. PMID:27144109

  3. New hematological key for bovine leukemia virus-infected Japanese Black cattle.

    PubMed

    Mekata, Hirohisa; Yamamoto, Mari; Kirino, Yumi; Sekiguchi, Satoshi; Konnai, Satoru; Horii, Yoichiro; Norimine, Junzo

    2018-02-20

    The European Community's (EC) Key, also called Bendixen's Key, is a well-established bovine leukemia virus (BLV) diagnostic method that classifies cattle according to the absolute lymphocyte count and age. The EC Key was originally designed for dairy cattle and is not necessarily suitable for Japanese Black (JB) beef cattle. This study revealed that the lymphocyte counts in the BLV-free and -infected JB cattle were significantly lower than those in the Holstein cattle. Therefore, applying the EC Key to JB cattle could result in a large number of undetected BLV-infected cattle. Our proposed hematological key, which was designed for JB cattle, improves the detection of BLV-infected cattle by approximately 20%. We believe that this study could help promote BLV control.

  4. Using mixed methods in health research

    PubMed Central

    Woodman, Jenny

    2013-01-01

    Summary Mixed methods research is the use of quantitative and qualitative methods in a single study or series of studies. It is an emergent methodology which is increasingly used by health researchers, especially within health services research. There is a growing literature on the theory, design and critical appraisal of mixed methods research. However, there are few papers that summarize this methodological approach for health practitioners who wish to conduct or critically engage with mixed methods studies. The objective of this paper is to provide an accessible introduction to mixed methods for clinicians and researchers unfamiliar with this approach. We present a synthesis of key methodological literature on mixed methods research, with examples from our own work and that of others, to illustrate the practical applications of this approach within health research. We summarize definitions of mixed methods research, the value of this approach, key aspects of study design and analysis, and discuss the potential challenges of combining quantitative and qualitative methods and data. One of the key challenges within mixed methods research is the successful integration of quantitative and qualitative data during analysis and interpretation. However, the integration of different types of data can generate insights into a research question, resulting in enriched understanding of complex health research problems. PMID:23885291

  5. Mechanics analysis of the multi-point-load process for the thin film solar cell

    NASA Astrophysics Data System (ADS)

    Wang, Zhiming; Wei, Guangpu; Gong, Zhengbang

    2008-02-01

    The main material of the thin film solar cell is silicon. Because of the special mechanical characteristics of silicon, the method of loading pressure on the thin film solar cell and the value of that pressure are key problems that must be solved during the manufacturing of thin film solar cells. This paper describes the special mechanical characteristics of silicon and discusses the overall test method, the value of the pressure on the thin film solar cell, and the elements and loading method analyzed with the ANSYS finite element package. From this theoretical analysis we obtained key conclusions for actual operation; these results are of great significance for industry.

  6. Compact 0-complete trees: A new method for searching large files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orlandic, R.; Pfaltz, J.L.

    1988-01-26

    In this report, a novel approach to ordered retrieval in very large files is developed. The method employs a B-tree-like search algorithm that is independent of key type or key length because all keys in index blocks are encoded by a 1-byte surrogate. The replacement of actual key sequences by the 1-byte surrogate ensures the maximal possible fan-out and greatly reduces the storage overhead of maintaining access indices. Initially, retrieval in a binary trie structure is developed. With the aid of a fairly complex recurrence relation, the rather scraggly binary trie is transformed into a compact multi-way search tree. The recurrence relation itself is then replaced by an unusually simple search algorithm. Implementation details and empirical performance results are then presented. Reduction of index size by 50%-75% opens up the possibility of replicating system-wide indices for parallel access in distributed databases. 23 figs.

  7. Long-distance measurement-device-independent quantum key distribution with coherent-state superpositions.

    PubMed

    Yin, H-L; Cao, W-F; Fu, Y; Tang, Y-L; Liu, Y; Chen, T-Y; Chen, Z-B

    2014-09-15

    Measurement-device-independent quantum key distribution (MDI-QKD) with the decoy-state method is believed to be secure against various hacking attacks in practical quantum key distribution systems. Recently, coherent-state superpositions (CSS) have emerged as an alternative to single-photon qubits for quantum information processing and metrology. In this Letter, CSS are exploited as the source in MDI-QKD. We present an analytical method that gives two tight formulas to estimate the lower bound of the yield and the upper bound of the bit error rate. We exploit standard statistical analysis and the Chernoff bound to perform the parameter estimation; the Chernoff bound can provide good bounds in long-distance MDI-QKD. Our results show that with CSS, both the secure transmission distance and the secure key rate are significantly improved compared with those of weak coherent states in the finite-data case.

  8. Pavement crack detection combining non-negative feature with fast LoG in complex scene

    NASA Astrophysics Data System (ADS)

    Wang, Wanli; Zhang, Xiuhua; Hong, Hanyu

    2015-12-01

    Pavement crack detection is affected by many sources of interference in realistic situations, such as shadows, road signs, oil stains, and salt-and-pepper noise. Due to these unfavorable factors, existing crack detection methods have difficulty distinguishing cracks from the background correctly. How to extract crack information effectively is the key problem for a road crack detection system. To solve this problem, a novel method for pavement crack detection based on combining a non-negative feature with a fast LoG filter is proposed. The two key novelties and benefits of this new approach are that 1) image pixel gray-value compensation is used to acquire a uniform image, and 2) the non-negative feature is combined with the fast LoG filter to extract crack information. The image preprocessing results demonstrate that the method is indeed able to homogenize the crack image more accurately than existing methods. A large number of experimental results demonstrate that the proposed approach can detect crack regions more correctly than traditional methods.
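    The LoG stage can be sketched with SciPy's Laplacian-of-Gaussian filter. The synthetic patch, sigma, and threshold below are illustrative; the gray-value compensation and the non-negative feature of the actual method are not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# synthetic pavement patch: bright background with one dark crack row
img = np.full((64, 64), 0.8)
img[32, :] = 0.1                                 # hypothetical crack

# LoG filtering: a dark ridge (intensity dip) yields a strong positive response
response = gaussian_laplace(img, sigma=1.5)
crack_mask = response > 0.5 * response.max()     # threshold the response
crack_rows = np.unique(np.nonzero(crack_mask)[0])
```

    Thresholding the positive LoG response localizes the dark, line-like structures that cracks produce, while flat regions respond near zero.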

  9. Decoy-state quantum key distribution with biased basis choice

    PubMed Central

    Wei, Zhengchao; Wang, Weilong; Zhang, Zhen; Gao, Ming; Ma, Zhi; Ma, Xiongfeng

    2013-01-01

    We propose a quantum key distribution scheme that combines a biased basis choice with the decoy-state method. In this scheme, Alice sends all signal states in the Z basis and decoy states in the X and Z bases with certain probabilities, and Bob measures received pulses with an optimal basis choice. This scheme simplifies the system and reduces random number consumption. From simulation results that take statistical fluctuations into account, we find that in a typical experimental setup the proposed scheme can increase the key rate by at least 45% compared to the standard decoy-state scheme. In the postprocessing, we also apply a rigorous method to upper bound the phase error rate of the single-photon components of the signal states. PMID:23948999
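    For context, analyses of this kind typically start from the standard decoy-state lower bound on the secret key rate (notation ours); the biased basis choice raises the sifting factor q toward 1, which is where the reported gain comes from:

```latex
R \ge q \left\{ -\, Q_\mu\, f(E_\mu)\, H_2(E_\mu) + Q_1 \left[ 1 - H_2(e_1) \right] \right\}
```

    Here Q_mu and E_mu are the overall gain and error rate of the signal states, Q_1 and e_1 those of the single-photon components, f the error-correction inefficiency, and H_2 the binary entropy function.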

  11. Cascade phenomenon against subsequent failures in complex networks

    NASA Astrophysics Data System (ADS)

    Jiang, Zhong-Yuan; Liu, Zhi-Quan; He, Xuan; Ma, Jian-Feng

    2018-06-01

    Cascade phenomena may lead to catastrophic disasters that severely imperil network safety or security in various complex systems such as communication networks, power grids, and social networks. In some flow-based networks, the load of failed nodes can be redistributed locally to their neighboring nodes so as to suppress traffic oscillations or large-scale cascading failures as much as possible. However, in such a local flow redistribution model, a small set of key nodes attacked in sequence can cause the network to collapse. Effectively finding this set of key nodes is therefore a critical problem. To the best of our knowledge, this work is the first to study the problem comprehensively. We first introduce extra capacity for every node to absorb flow fluctuations from neighbors, and two extra-capacity distributions, a degree-based distribution and an average distribution, are employed. Four heuristic key-node discovery methods are presented: High-Degree-First (HDF), Low-Degree-First (LDF), Random, and a Greedy Algorithm (GA). Extensive simulations are carried out on both scale-free and random networks. The results show that the greedy algorithm can efficiently find the set of key nodes in both types of networks. Our work studies network robustness against cascading failures from a novel perspective, and the methods and results are useful for network robustness evaluation and protection.
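
    The local flow-redistribution model and the greedy key-node search can be sketched as follows. This is a toy version under assumed rules (equal sharing among surviving neighbors, failure when load exceeds capacity), not the paper's exact model:

```python
def cascade_size(adj, load, cap, seeds):
    """Fail the seed nodes, redistribute each failed node's load equally
    among its surviving neighbors, and count every node that ends up
    failed (load above capacity)."""
    load = dict(load)
    failed = set(seeds)
    queue = list(seeds)
    while queue:
        v = queue.pop()
        alive = [u for u in adj[v] if u not in failed]
        if not alive:
            continue
        share = load[v] / len(alive)
        for u in alive:
            load[u] += share
            if load[u] > cap[u]:
                failed.add(u)
                queue.append(u)
    return len(failed)

def greedy_key_nodes(adj, load, cap, k):
    """Greedily pick k nodes whose subsequent failure maximizes the
    cascade, mirroring the GA heuristic of the abstract."""
    chosen = []
    for _ in range(k):
        best = max((v for v in adj if v not in chosen),
                   key=lambda v: cascade_size(adj, load, cap, chosen + [v]))
        chosen.append(best)
    return chosen

# toy star network: hub 0 carries most of the load
adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
load = {0: 3.0, 1: 1.0, 2: 1.0, 3: 1.0}
cap = {0: 3.5, 1: 1.5, 2: 1.5, 3: 1.5}
```

    On this toy star network, failing the heavily loaded hub overloads every leaf, so the greedy search selects it first.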

  12. Care initiation area yields dramatic results.

    PubMed

    2009-03-01

    The ED at Gaston Memorial Hospital in Gastonia, NC, has achieved dramatic results in key department metrics with a Care Initiation Area (CIA) and a physician in triage. Here's how the ED arrived at this winning solution: Leadership was trained in and implemented the Kaizen method, which eliminates redundant or inefficient process steps. Simulation software helped determine additional space needed by analyzing arrival patterns and other key data. After only two days of meetings, new ideas were implemented and tested.

  13. A novel key-frame extraction approach for both video summary and video index.

    PubMed

    Lei, Shaoshuai; Xie, Gang; Yan, Gaowei

    2014-01-01

    Existing key-frame extraction methods are mostly video-summary oriented; the task of indexing by key frames is ignored. This paper presents a novel key-frame extraction approach that serves both video summary and video index. First, a dynamic distance separability algorithm is proposed to divide a shot into sub-shots based on semantic structure, and then appropriate key frames are extracted in each sub-shot by singular value decomposition (SVD). Finally, three evaluation indicators are proposed to evaluate the performance of the new approach. Experimental results show that the proposed approach achieves good semantic structure for semantics-based video indexing while producing video summaries consistent with human perception.
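
    The SVD selection step can be illustrated with a small sketch. The per-frame descriptors and the strongest-projection rule are illustrative assumptions, not the paper's exact criterion:

```python
import numpy as np

def keyframes_by_svd(features, rank=1):
    """Pick one key frame per leading singular direction: the frame
    whose descriptor projects most strongly onto that direction."""
    X = np.asarray(features, float)
    X = X - X.mean(axis=0)             # center the frame descriptors
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return [int(np.argmax(np.abs(U[:, i]) * s[i]))
            for i in range(min(rank, len(s)))]

# five near-identical frame descriptors plus one very different frame
shot = [[1.0, 0.0, 0.0]] * 5 + [[0.0, 5.0, 0.0]]
```

    Here the one clearly different frame dominates the top singular direction and is returned as the key frame.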

  14. Towards a taxonomy for integrated care: a mixed-methods study

    PubMed Central

    Valentijn, Pim P.; Boesveld, Inge C.; van der Klauw, Denise M.; Ruwaard, Dirk; Struijs, Jeroen N.; Molema, Johanna J.W.; Bruijnzeels, Marc A.; Vrijhoef, Hubertus JM.

    2015-01-01

    Introduction Building integrated services in a primary care setting is considered an essential strategy for establishing a high-quality and affordable health care system. The theoretical foundations of such integrated service models are described by the Rainbow Model of Integrated Care, which distinguishes six integration dimensions (clinical, professional, organisational, system, functional and normative integration). The aim of the present study is to refine the Rainbow Model of Integrated Care by developing a taxonomy that specifies the underlying key features of the six dimensions. Methods First, a literature review was conducted to identify features for achieving integrated service delivery. Second, a thematic analysis method was used to develop a taxonomy of key features organised into the dimensions of the Rainbow Model of Integrated Care. Finally, the appropriateness of the key features was tested in a Delphi study among Dutch experts. Results The taxonomy consists of 59 key features distributed across the six integration dimensions of the Rainbow Model of Integrated Care. Key features associated with the clinical, professional, organisational and normative dimensions were considered appropriate by the experts. Key features linked to the functional and system dimensions were considered less appropriate. Discussion This study contributes to the ongoing debate of defining the concept and typology of integrated care. This taxonomy provides a development agenda for establishing an accepted scientific framework of integrated care from an end-user, professional, managerial and policy perspective. PMID:25759607

  15. Stable isotope dilution assay (SIDA) and HS-SPME-GCMS quantification of key aroma volatiles for fruit and sap of Australian mango cultivars.

    PubMed

    San, Anh T; Joyce, Daryl C; Hofman, Peter J; Macnish, Andrew J; Webb, Richard I; Matovic, Nicolas J; Williams, Craig M; De Voss, James J; Wong, Siew H; Smyth, Heather E

    2017-04-15

    Reported herein is a high throughput method to quantify in a single analysis the key volatiles that contribute to the aroma of commercially significant mango cultivars grown in Australia. The method constitutes stable isotope dilution analysis (SIDA) in conjunction with headspace (HS) solid-phase microextraction (SPME) coupled with gas-chromatography mass spectrometry (GCMS). Deuterium labelled analogues of the target analytes were either purchased commercially or synthesised for use as internal standards. Seven volatiles, hexanal, 3-carene, α-terpinene, p-cymene, limonene, α-terpinolene and ethyl octanoate, were targeted. The resulting calibration functions had determination coefficients (R²) ranging from 0.93775 to 0.99741. High recovery efficiencies for spiked mango samples were also achieved. The method was applied to identify the key aroma volatile compounds produced by 'Kensington Pride' and 'B74' mango fruit and by 'Honey Gold' mango sap. This method represents a marked improvement over current methods for detecting and measuring concentrations of mango fruit and sap volatiles. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. A Pressure Control Method for Emulsion Pump Station Based on Elman Neural Network

    PubMed Central

    Tan, Chao; Qi, Nan; Yao, Xingang; Wang, Zhongbin; Si, Lei

    2015-01-01

    In order to realize pressure control of the emulsion pump station, which is key equipment for safe production in coal mines, the control requirements were analyzed and a pressure control method based on an Elman neural network was proposed. Key techniques such as the system framework, the pressure prediction model, the pressure control model, and the flowchart of the proposed approach are presented. Finally, a simulation example was carried out, and the comparison results indicated that the proposed approach is feasible and efficient and outperforms the others. PMID:25861253
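
    What distinguishes an Elman network is its context layer, which copies the previous hidden state back as an extra input, letting a pressure model condition on past pump-station states. Below is a minimal forward-pass sketch with random, untrained weights and hypothetical layer sizes (the paper's trained model is not reproduced):

```python
import numpy as np

class ElmanNet:
    """Minimal Elman recurrent network (forward pass only)."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = rng.normal(0.0, 0.5, (n_hidden, n_in))      # input weights
        self.Wc = rng.normal(0.0, 0.5, (n_hidden, n_hidden))  # context weights
        self.Wo = rng.normal(0.0, 0.5, (n_out, n_hidden))     # output weights
        self.context = np.zeros(n_hidden)

    def step(self, x):
        # hidden state depends on the current input AND the context units
        h = np.tanh(self.Wx @ x + self.Wc @ self.context)
        self.context = h  # copy hidden state back into the context layer
        return self.Wo @ h

net = ElmanNet(n_in=2, n_hidden=4, n_out=1)
# e.g. x = (current pressure, valve setting); the context units carry the
# station's recent history from step to step
y = [net.step(np.array([p, v])) for p, v in [(0.5, 0.1), (0.6, 0.2)]]
```

    Feeding the same input twice yields different outputs because the context units remember the first step.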

  17. Differential Fault Analysis on CLEFIA with 128, 192, and 256-Bit Keys

    NASA Astrophysics Data System (ADS)

    Takahashi, Junko; Fukunaga, Toshinori

    This paper describes a differential fault analysis (DFA) attack against CLEFIA. The proposed attack can be applied to CLEFIA with all supported key lengths: 128, 192, and 256 bits. DFA is a type of side-channel attack that enables the recovery of secret keys by injecting faults into a secure device during its computation of the cryptographic algorithm and comparing the correct ciphertext with the faulty one. CLEFIA is a 128-bit block cipher with 128, 192, and 256-bit keys developed by the Sony Corporation in 2007. CLEFIA employs a generalized Feistel structure with four data lines, and we developed a new attack method that exploits this characteristic structure of the CLEFIA algorithm. On the basis of the proposed attack, only 2 pairs of correct and faulty ciphertexts are needed to retrieve the 128-bit key, and 10.78 pairs on average are needed to retrieve the 192- and 256-bit keys. The proposed attack is more efficient than any previously reported. To verify the proposed attack and estimate the calculation time needed to recover the secret key, we conducted an attack simulation on a PC. The simulation results show that each secret key can be obtained within three minutes on average, demonstrating that the entire key can be recovered within a feasible computational time.

  18. SU-E-J-237: Image Feature Based DRR and Portal Image Registration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, X; Chang, J

    Purpose: Two-dimensional (2D) matching of the kV X-ray and digitally reconstructed radiography (DRR) images is an important setup technique for image-guided radiotherapy (IGRT). In our clinics, mutual-information-based methods are used for this purpose on commercial linear accelerators, but manual corrections are often needed. This work demonstrates the feasibility of using feature-based image transforms to register kV and DRR images. Methods: The scale-invariant feature transform (SIFT) method was implemented to detect matching image details (key points) between the kV and DRR images. These key points represent high image-intensity gradients and thus scale-invariant features. Due to the poor contrast of our kV images, direct application of the SIFT method yielded many detection errors. To assist the finding of key points, the center coordinates of the kV and DRR images were read from the DICOM header, and the two groups of key points with similar relative positions to their corresponding centers were paired up. Using these points, a rigid transform (with scaling, horizontal and vertical shifts) was estimated. We also artificially introduced vertical and horizontal shifts to test the accuracy of our registration method on anterior-posterior (AP) and lateral pelvic images. Results: The results provided a satisfactory overlay of the transformed kV image onto the DRR image. The introduced versus detected shifts were fitted with a linear regression. In the AP image experiments, linear regression analysis showed slopes of 1.15 and 0.98 with R² of 0.89 and 0.99 for the horizontal and vertical shifts, respectively. The corresponding values were 1.2 and 1.3 with R² of 0.72 and 0.82 for the lateral image shifts. Conclusion: This work provides an alternative technique for kV-to-DRR alignment. Further improvements in estimation accuracy and image-contrast tolerance are underway.
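
    The rigid-transform estimation from paired key points reduces to a small least-squares problem. A hedged sketch follows (the parameterization, one isotropic scale plus two shifts, follows the abstract; the point pairs are invented):

```python
import numpy as np

def fit_scale_shift(kv_pts, drr_pts):
    """Least-squares fit of an isotropic scale s and shifts (tx, ty)
    mapping kV key points onto DRR key points: drr ~= s * kv + t."""
    kv = np.asarray(kv_pts, float)
    drr = np.asarray(drr_pts, float)
    A = np.zeros((2 * len(kv), 3))
    A[0::2, 0] = kv[:, 0]  # x-equations: s * kv_x + tx = drr_x
    A[0::2, 1] = 1.0
    A[1::2, 0] = kv[:, 1]  # y-equations: s * kv_y + ty = drr_y
    A[1::2, 2] = 1.0
    sol, *_ = np.linalg.lstsq(A, drr.reshape(-1), rcond=None)
    return sol  # (s, tx, ty)

# invented matched key points, shifted and scaled synthetically
pts = [(10.0, 20.0), (40.0, 25.0), (30.0, 60.0)]
moved = [(1.1 * x + 5.0, 1.1 * y - 3.0) for x, y in pts]
s, tx, ty = fit_scale_shift(pts, moved)
```

    For the synthetic pairs above the fit recovers s = 1.1, tx = 5, ty = -3 up to floating-point error.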

  19. An Image Encryption Algorithm Utilizing Julia Sets and Hilbert Curves

    PubMed Central

    Sun, Yuanyuan; Chen, Lina; Xu, Rudan; Kong, Ruiqing

    2014-01-01

    Image encryption is an important and effective technique for protecting image security. In this paper, a novel image encryption algorithm combining Julia sets and Hilbert curves is proposed. The algorithm uses the Julia sets' parameters to generate a random sequence as the initial keys and obtains the final encryption keys by scrambling the initial keys through the Hilbert curve. The final cipher image is obtained by modulo arithmetic and a diffusion operation. This method needs only a few parameters for key generation, which greatly reduces the storage space. Moreover, because of the Julia sets' properties, such as infiniteness and chaotic characteristics, the keys are highly sensitive even to a tiny perturbation. The experimental results indicate that the algorithm has a large key space, good statistical properties, high key sensitivity, and effective resistance to the chosen-plaintext attack. PMID:24404181
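
    The key-expansion idea, a few Julia-set parameters generating a long key sequence, can be sketched as follows. The normalization rule and byte extraction are assumptions of this sketch, and the paper's Hilbert-curve scrambling and diffusion steps are omitted:

```python
def julia_keystream(c, z0, n):
    """Expand two complex parameters (c, z0) into n key bytes by
    iterating the Julia-set map z -> z*z + c, renormalizing whenever
    the orbit escapes so the sequence never diverges."""
    z = z0
    out = []
    for _ in range(n):
        z = z * z + c
        if abs(z) > 2:
            z = z / abs(z)  # pull escaping orbits back onto the unit circle
        out.append(int(abs(z.real) * 1e6) % 256)
    return bytes(out)

def xor_cipher(data, key):
    """Stand-in for the modulo/diffusion stage: self-inverse XOR."""
    return bytes(d ^ k for d, k in zip(data, key))

key = julia_keystream(c=-0.8 + 0.156j, z0=0.1 + 0.1j, n=16)
cipher = xor_cipher(b"pavement section", key)
```

    Because the map is iterated many times, even tiny changes in c or z0 will generally produce a completely different keystream, which is the sensitivity the abstract describes.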

  20. Development of a new quantitative gas permeability method for dental implant-abutment connection tightness assessment

    PubMed Central

    2011-01-01

    Background Most dental implant systems are presently made of two pieces: the implant itself and the abutment. The connection tightness between these two pieces is a key point for preventing bacterial proliferation, tissue inflammation and bone loss. Leakage has previously been estimated by microbial, color tracer and endotoxin percolation. Methods A new nitrogen flow technique was developed for implant-abutment connection leakage measurement, adapted from a recent, sensitive, reproducible and quantitative method used to assess endodontic sealing. Results The results show very significant differences between various sealing and screwing conditions. The remaining flow was lower after key screwing compared to hand screwing (p = 0.03) and remained different from the negative test (p = 0.0004). The method reproducibility was very good, with a coefficient of variation of 1.29%. Conclusions The presented new gas flow method therefore appears to be a simple and robust way to compare different implant systems. It allows successive measurements without disconnecting the abutment from the implant and should in particular be used to assess the behavior of the connection before and after mechanical stress. PMID:21492459

  1. Recognized Leader in Electrochemical Purification

    ScienceCinema

    Hoppe, Eric

    2018-01-16

    PNNL scientists developed an electrochemical method for purifying copper, a key material that makes possible radiation detection systems of unprecedented sensitivity. The method begins with the purest copper materials available, and results in the lowest-background copper in the world. Chemist Eric Hoppe explains the process.

  2. Method for encryption and transmission of digital keying data

    DOEpatents

    Mniszewski, Susan M.; Springer, Edward A.; Brenner, David P.

    1988-01-01

    A method for the encryption, transmission, and subsequent decryption of digital keying data. The method utilizes the Data Encryption Standard and is implemented by means of a pair of apparatus, each of which is selectable to operate as either a master unit or a remote unit. Each unit contains a set of key encryption keys which are indexed by a common indexing system. The master unit operates upon command from the remote unit to generate a data encryption key and encrypt the data encryption key using a preselected key encryption key. The encrypted data encryption key and an index designator are then downloaded to the remote unit, where the data encryption key is decrypted for subsequent use in the encryption and transmission of data. Downloading of the encrypted data encryption key enables frequent key changes without requiring manual entry or storage of keys at the remote unit.
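
    The master/remote key-download flow amounts to what is now called envelope encryption. In this sketch all names are hypothetical and a simple XOR stands in for the DES wrapping of the actual method; the point is the indexed key-encryption-key lookup and the wrapped data-encryption key:

```python
import os
import hashlib

# Indexed set of key-encryption keys (KEKs) installed in both units;
# the names and derivation are stand-ins for the patent's key tables.
KEKS = {i: hashlib.sha256(b"shared-kek-%d" % i).digest() for i in range(4)}

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def master_issue(index):
    """Master unit: generate a fresh data-encryption key (DEK), wrap it
    under the KEK selected by `index`, and send (index, wrapped) down to
    the remote unit. XOR wrapping stands in for DES here."""
    dek = os.urandom(32)
    return dek, (index, xor_bytes(dek, KEKS[index]))

def remote_unwrap(message):
    """Remote unit: look up the KEK by the received index designator
    and recover the DEK for subsequent traffic encryption."""
    index, wrapped = message
    return xor_bytes(wrapped, KEKS[index])

dek, msg = master_issue(index=2)
```

    Because only the wrapped DEK and an index travel, the DEK can be changed frequently while the installed KEK set stays fixed, which is the operational benefit the patent claims.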

  3. Analytical analysis and implementation of a low-speed high-torque permanent magnet vernier in-wheel motor for electric vehicle

    NASA Astrophysics Data System (ADS)

    Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili

    2012-04-01

    In this paper, an analytical analysis of the permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified by using the time-stepping finite element method, and the performance of the PMV machine is quantitatively compared with the analytical results. The analytical results agree well with the finite element method results. Finally, experimental results are given to further show the validity of the analysis.

  4. Key comparison CCPR-K1.a as an interlaboratory comparison of correlated color temperature

    NASA Astrophysics Data System (ADS)

    Kärhä, P.; Vaskuri, A.; Pulli, T.; Ikonen, E.

    2018-02-01

    We analyze the results of the spectral irradiance key comparison CCPR-K1.a for correlated color temperature (CCT). For four participants out of 13, the uncertainties of CCT calculated using traditional methods, which do not account for correlations, would be too small. The reason for the failure of the traditional uncertainty calculation is spectral correlations, which produce systematic deviations of the same sign over certain wavelength regions. The results highlight the importance of accounting for such correlations when calculating uncertainties of spectrally integrated quantities.
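
    For context, CCT can be approximated from CIE 1931 chromaticity coordinates with McCamy's standard formula. The key comparison itself integrates full spectral irradiance, which is where the spectral correlations enter, so this sketch is background rather than the comparison's method:

```python
def mccamy_cct(x, y):
    """McCamy's cubic approximation of correlated color temperature (K)
    from CIE 1931 chromaticity coordinates (x, y)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# CIE illuminant D65 chromaticity
cct = mccamy_cct(0.3127, 0.3290)
```

    For the D65 white point the formula returns roughly 6500 K.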

  5. Estimating the size of key populations at higher risk of HIV infection: a summary of experiences and lessons presented during a technical meeting on size estimation among key populations in Asian countries

    PubMed Central

    Calleja, Jesus Maria Garcia; Zhao, Jinkou; Reddy, Amala; Seguy, Nicole

    2014-01-01

    Problem Size estimates of key populations at higher risk of HIV exposure are recognized as critical for understanding the trajectory of the HIV epidemic and planning and monitoring an effective response, especially for countries with concentrated and low epidemics such as those in Asia. Context To help countries estimate population sizes of key populations, global guidelines were updated in 2011 to reflect new technical developments and recent field experiences in applying these methods. Action In September 2013, a meeting of programme managers and experts experienced with population size estimates (PSE) for key populations was held for 13 Asian countries. This article summarizes the key results presented, shares practical lessons learnt and reviews the methodological approaches from implementing PSE in 13 countries. Lessons learnt It is important to build capacity to collect, analyse and use PSE data; establish a technical review group; and implement a transparent, well documented process. Countries should adapt global PSE guidelines and maintain operational definitions that are more relevant and useable for country programmes. Development of methods for non-venue-based key populations requires more investment and collaborative efforts between countries and among partners. PMID:25320676

  6. Delay and cost performance analysis of the diffie-hellman key exchange protocol in opportunistic mobile networks

    NASA Astrophysics Data System (ADS)

    Soelistijanto, B.; Muliadi, V.

    2018-03-01

    Diffie-Hellman (DH) provides an efficient key exchange system by reducing the number of cryptographic keys distributed in the network. In this method, a node broadcasts a single public key to all nodes in the network, and in turn each peer uses this key to establish a shared secret key which then can be utilized to encrypt and decrypt traffic between the peer and the given node. In this paper, we evaluate the key transfer delay and cost performance of DH in opportunistic mobile networks, a specific scenario of MANETs where complete end-to-end paths rarely exist between sources and destinations; consequently, the end-to-end delays in these networks are much greater than typical MANETs. Simulation results, driven by a random node movement model and real human mobility traces, showed that DH outperforms a typical key distribution scheme based on the RSA algorithm in terms of key transfer delay, measured by average key convergence time; however, DH performs as well as the benchmark in terms of key transfer cost, evaluated by total key (copies) forwards.
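
    The exchange being evaluated can be sketched with toy parameters (a Mersenne-prime modulus and generator chosen for illustration only; real deployments use standardized groups such as those in RFC 3526):

```python
import secrets

# toy-sized Mersenne prime modulus; illustration only
P = 2**127 - 1
G = 3

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

# one broadcast public key per node; each peer derives a pairwise secret
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
shared_a = pow(b_pub, a_priv, P)  # peer A combines B's public with its private
shared_b = pow(a_pub, b_priv, P)  # peer B does the symmetric computation
```

    Each node publishes a single public value, and every peer combines it with its own private value to reach the same shared secret, which is why DH needs fewer key copies in flight than per-pair RSA distribution.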

  7. Three-pass protocol scheme for bitmap image security by using vernam cipher algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Budiman, M. A.; Aulya, L.

    2018-02-01

    Confidentiality, integrity, and efficiency are crucial aspects of data security. Among digital data, image data is especially prone to misuse such as duplication and modification. There are several data security techniques; one of them is cryptography. The security of the Vernam cipher algorithm depends heavily on the key exchange process: if the key is leaked, the security of the algorithm collapses. Therefore, a method that minimizes key leakage during message exchange is required. The method used here is known as the Three-Pass Protocol, which enables message delivery without any key exchange, so that messages can reach the receiver safely with no fear of key leakage. The system is built using the Java programming language. The materials used for system testing are images of size 200×200, 300×300, 500×500, 800×800, and 1000×1000 pixels. The experiments showed that the Vernam cipher algorithm in the Three-Pass Protocol scheme could restore the original image.
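
    The three-pass message flow can be sketched with XOR, whose self-inverse and commutative properties are exactly what the protocol relies on. One caveat the abstract does not raise: with pure XOR, an eavesdropper who records all three passes can XOR them together to recover the message, so this sketch shows the mechanics only:

```python
import os

def vernam(data, key):
    """Vernam cipher: bytewise XOR, which is its own inverse."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"bitmap row bytes"
ka = os.urandom(len(message))  # Alice's private key, never transmitted
kb = os.urandom(len(message))  # Bob's private key, never transmitted

pass1 = vernam(message, ka)    # Alice -> Bob:  m ^ ka
pass2 = vernam(pass1, kb)      # Bob -> Alice:  m ^ ka ^ kb
pass3 = vernam(pass2, ka)      # Alice -> Bob:  m ^ kb  (Alice strips ka)
recovered = vernam(pass3, kb)  # Bob strips kb and recovers m
```

    No key ever travels; each side applies and then removes only its own key.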

  8. Key management of the double random-phase-encoding method using public-key encryption

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2010-03-01

    Public-key encryption has been used to encode the key of the encryption process. In the proposed technique, an input image is encrypted using the double random-phase-encoding method with an extended fractional Fourier transform. The keys of the encryption process are encoded using the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. The encoded key is then transmitted to the receiver along with the encrypted image. In the decryption process, the encoded key is first decrypted using the secret key, and then the encrypted image is decrypted using the retrieved key parameters. The proposed technique has an advantage over the double random-phase-encoding method because the problem associated with transmitting the key is eliminated by using public-key encryption. Computer simulations have been carried out to validate the proposed technique.
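
    The RSA wrapping of the key parameters can be illustrated with textbook-sized numbers (toy primes and no padding; a real system would use 2048-bit moduli and proper padding, and the DRPE key itself would encode the fractional-Fourier orders and phase-mask parameters):

```python
# textbook RSA with toy primes; illustration only
p, q, e = 61, 53, 17
n = p * q                  # modulus, 3233
phi = (p - 1) * (q - 1)    # 3120
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

phase_key = 1234           # hypothetical encoded DRPE key parameter (< n)
cipher_key = pow(phase_key, e, n)   # sender encrypts with public (e, n)
recovered = pow(cipher_key, d, n)   # receiver decrypts with private d
```

    Only the holder of d can recover the wrapped key parameter, which is what removes the key-transmission problem the abstract mentions.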

  9. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387

  10. The comparison and analysis of extracting video key frame

    NASA Astrophysics Data System (ADS)

    Ouyang, S. Z.; Zhong, L.; Luo, R. Q.

    2018-05-01

    Video key-frame extraction is an important part of large-scale video data processing. Building on previous work in key-frame extraction, we summarize four important key-frame extraction algorithms; these methods largely operate by comparing the differences between pairs of frames and, when the difference exceeds a threshold, treating the corresponding frames as distinct key frames. Following this review, key-frame extraction based on mutual information is examined: information entropy is introduced, appropriate threshold values are selected to form the initial classes, and frames with similar mean mutual information are finally taken as candidate key frames. In this paper, several algorithms are used to extract key frames from tunnel traffic videos. Through analysis of the experimental results and comparison of the pros and cons of these algorithms, a basis for practical applications is provided.
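
    The shared baseline of the surveyed methods, keeping a frame whenever it differs from the last key frame by more than a threshold, can be sketched as follows (the mean-absolute-difference descriptor and the threshold value are illustrative choices):

```python
import numpy as np

def keyframes_by_difference(frames, threshold):
    """Keep frame i as a key frame whenever its mean absolute pixel
    difference from the most recent key frame exceeds `threshold`."""
    keys = [0]
    for i in range(1, len(frames)):
        diff = np.abs(np.asarray(frames[i], float)
                      - np.asarray(frames[keys[-1]], float)).mean()
        if diff > threshold:
            keys.append(i)
    return keys

# toy video: a scene change at frame 3
frames = [np.zeros((8, 8))] * 3 + [np.full((8, 8), 100.0)] * 3
```

    On the toy sequence the scene change at frame 3 is the only additional key frame.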

  11. Microscale optical cryptography using a subdiffraction-limit optical key

    NASA Astrophysics Data System (ADS)

    Ogura, Yusuke; Aino, Masahiko; Tanida, Jun

    2018-04-01

    We present microscale optical cryptography using a subdiffraction-limit optical pattern, which is finer than the diffraction-limit size of the decrypting optical system, as a key and a substrate with a reflectance distribution as an encrypted image. Because of the subdiffraction-limit spatial coding, this method enables us to construct a secret image with the diffraction-limit resolution. Simulation and experimental results demonstrate, both qualitatively and quantitatively, that the secret image becomes recognizable when and only when the substrate is illuminated with the designed key pattern.

  12. Deficiencies of the cryptography based on multiple-parameter fractional Fourier transform.

    PubMed

    Ran, Qiwen; Zhang, Haiying; Zhang, Jin; Tan, Liying; Ma, Jing

    2009-06-01

    Methods of image encryption based on the fractional Fourier transform have an inherent flaw in security. We show that these schemes suffer from the deficiency that one group of encryption keys corresponds to many groups of keys that decrypt the encrypted image correctly. In some schemes, such as the encryption scheme based on the multiple-parameter fractional Fourier transform [Opt. Lett. 33, 581 (2008)], several factors give rise to this deficiency. A modified method is proposed to avoid all the deficiencies. Security and reliability are greatly improved without increasing the complexity of the encryption process. (c) 2009 Optical Society of America.

  13. Fast Simulation of the Impact Parameter Calculation of Electrons through Pair Production

    NASA Astrophysics Data System (ADS)

    Bang, Hyesun; Kweon, MinJung; Huh, Kyoung Bum; Pachmayer, Yvonne

    2018-05-01

    A fast simulation method is introduced that dramatically reduces the time required for the impact parameter calculation, a key observable in physics analyses of high-energy physics experiments and in detector optimization studies. The impact parameter of electrons produced through pair production was calculated considering the key related processes, using the Bethe-Heitler formula, the Tsai formula and a simple geometric model. The calculations were performed under various conditions and the results were compared with those from full GEANT4 simulations. The computation time of this fast simulation method is 10⁴ times shorter than that of the full GEANT4 simulation.

  14. Comparative study of key exchange and authentication methods in application, transport and network level security mechanisms

    NASA Astrophysics Data System (ADS)

    Fathirad, Iraj; Devlin, John; Jiang, Frank

    2012-09-01

    Key exchange and authentication are two crucial elements of any network security mechanism. IPsec, SSL/TLS, PGP and S/MIME are well-known approaches for providing security services at the network, transport and application layers; these protocols use different methods (based on their requirements) to establish keying material and to authenticate the key negotiation and the participating parties. This paper studies and compares the authenticated key negotiation methods of these protocols.

  15. Practical issues in quantum-key-distribution postprocessing

    NASA Astrophysics Data System (ADS)

    Fung, Chi-Hang Fred; Ma, Xiongfeng; Chau, H. F.

    2010-01-01

    Quantum key distribution (QKD) is a secure key generation method between two distant parties by wisely exploiting properties of quantum mechanics. In QKD, experimental measurement outcomes on quantum states are transformed by the two parties to a secret key. This transformation is composed of many logical steps (as guided by security proofs), which together will ultimately determine the length of the final secret key and its security. We detail the procedure for performing such classical postprocessing taking into account practical concerns (including the finite-size effect and authentication and encryption for classical communications). This procedure is directly applicable to realistic QKD experiments and thus serves as a recipe that specifies what postprocessing operations are needed and what the security level is for certain lengths of the keys. Our result is applicable to the BB84 protocol with a single or entangled photon source.

  16. Impacts on non-human biota from a generic geological disposal facility for radioactive waste: some key assessment issues.

    PubMed

    Robinson, C A; Smith, K L; Norris, S

    2010-06-01

    This paper provides an overview of key issues associated with the application of currently available biota dose assessment methods to consideration of potential environmental impacts from geological disposal facilities. It explores philosophical, methodological and practical assessment issues and reviews the implications of test assessment results in the context of recent and on-going challenges and debates.

  17. A Mapping Method of SLAM Based on a Look-Up Table

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Li, J.; Wang, A.; Wang, J.

    2017-09-01

    In recent years, several V-SLAM (Visual Simultaneous Localization and Mapping) approaches have appeared, showing impressive reconstructions of the world. However, these maps are built with far more than the required information, a limitation that comes from processing each key frame in full. In this paper we present, for the first time, a mapping method for visual SLAM based on a look-up table (LUT) that can improve mapping effectively. Because this method relies on extracting features in each cell into which the image is divided, it can obtain a camera pose that is more representative of the whole key frame. The tracking direction of key frames is obtained by counting the parallax directions of the feature points. The LUT stores, for each tracking direction, the cell numbers needed for mapping, which reduces redundant information in the key frame and makes mapping more efficient. The results show that a better map with less noise is built in less than one third of the time. We believe that the LUT's capacity for efficiently building maps makes it a good choice for the community to investigate in scene reconstruction problems.

  18. Dissemination and implementation of comparative effectiveness evidence: key informant interviews with Clinical and Translational Science Award institutions

    PubMed Central

    Morrato, Elaine H; Concannon, Thomas W; Meissner, Paul; Shah, Nilay D; Turner, Barbara J

    2014-01-01

    Aim: To identify ongoing practices and opportunities for improving national comparative effectiveness research (CER) translation through dissemination and implementation (D&I) via NIH-funded Clinical and Translational Science Award (CTSA) institutions. Materials & methods: Key informant interviews were conducted with 18 CTSA grantees sampled to represent a range of D&I efforts. Results & conclusions: The institutional representatives endorsed fostering CER translation nationally via the CTSA Consortium. However, five themes emerged from the interviews as barriers to CER D&I: lack of institutional awareness, insufficient capacity, lack of established D&I methods, confusion among stakeholders about what CER actually is and limited funding opportunities. Interviewees offered two key recommendations to improve CER translation: development of a centralized clearing house to facilitate the diffusion of CER D&I resources and methods across CTSA institutions; and formalization of the national CTSA network to leverage existing community engagement relationships and resources for the purpose of adapting and disseminating robust CER evidence locally with providers, patients and healthcare systems. PMID:24236560

  19. Research on common methods for evaluating the operation effect of integrated wastewater treatment facilities of iron and steel enterprises

    NASA Astrophysics Data System (ADS)

    Bingsheng, Xu

    2017-04-01

    Considering the large quantities of wastewater generated by iron and steel enterprises in China, this paper examines common methods for evaluating the integrated wastewater treatment effect of such enterprises. Based on survey results on environmental protection performance, technological economy, resource and energy consumption, services and management, an indicator system for evaluating the operation effect of integrated wastewater treatment facilities is set up. By discussing the standards and industrial policies in and outside of China, 27 key secondary indicators are further defined on the basis of an investigation of the main equipment and key processes for wastewater treatment, so as to determine the method for setting the key quantitative and qualitative indicators of the evaluation indicator system. The system is also expected to satisfy the basic requirements of reasonable resource allocation, environmental protection and sustainable economic development, further improve the integrated wastewater treatment effect of iron and steel enterprises, and reduce the emission of hazardous substances and environmental impact.

  20. Generalized Buneman Pruning for Inferring the Most Parsimonious Multi-state Phylogeny

    NASA Astrophysics Data System (ADS)

    Misra, Navodit; Blelloch, Guy; Ravi, R.; Schwartz, Russell

    Accurate reconstruction of phylogenies remains a key challenge in evolutionary biology. Most biologically plausible formulations of the problem are formally NP-hard, with no known efficient solution. The standard in practice is fast heuristic methods that are empirically known to work very well in general but can yield results arbitrarily far from optimal. Practical exact methods, which have exponential worst-case running times but generally much better times in practice, provide an important alternative. We report progress in this direction by introducing a provably optimal method for the weighted multi-state maximum parsimony phylogeny problem. The method is based on generalizing the notion of the Buneman graph, a construction key to efficient exact methods for binary sequences, so as to apply to sequences with arbitrary finite numbers of states and arbitrary state transition weights. We implement an integer linear programming (ILP) method for the multi-state problem using this generalized Buneman graph and demonstrate that the resulting method is able to solve data sets that are intractable by prior exact methods in run times comparable with popular heuristics. Our work provides the first method for provably optimal maximum parsimony phylogeny inference that is practical for multi-state data sets of more than a few characters.
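
    As background, the inner subproblem (the weighted parsimony score of one character on a fixed tree) is solvable exactly by Sankoff's dynamic program, which, like the formulation above, handles arbitrary finite state sets with arbitrary transition weights. This sketch illustrates only that scoring subproblem, not the authors' Buneman-graph ILP; the tree encoding is an assumed convention:

```python
# Sankoff dynamic programming: minimum weighted parsimony score of one
# character on a fixed rooted binary tree.
INF = float("inf")

def sankoff(tree, leaf_state, states, cost):
    """tree: dict node -> (left, right) for internal nodes (leaves absent).
    leaf_state: dict leaf -> observed state.
    cost[a][b]: weight of an a -> b state change."""
    def score(node):
        if node not in tree:   # leaf: zero cost for its observed state
            return {s: (0 if s == leaf_state[node] else INF) for s in states}
        left, right = tree[node]
        sl, sr = score(left), score(right)
        # best cost of each child subtree given this node's state s
        return {s: min(cost[s][t] + sl[t] for t in states) +
                   min(cost[s][t] + sr[t] for t in states)
                for s in states}
    # root = the internal node that is nobody's child
    root = next(n for n in tree if all(n not in kids for kids in tree.values()))
    return min(score(root).values())
```

    With a unit cost matrix this reduces to ordinary (Fitch-style) parsimony; a weighted cost matrix gives the multi-state weighted score that the ILP minimizes over all candidate trees.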

  1. Sequential weighted Wiener estimation for extraction of key tissue parameters in color imaging: a phantom study

    NASA Astrophysics Data System (ADS)

    Chen, Shuo; Lin, Xiaoqian; Zhu, Caigang; Liu, Quan

    2014-12-01

    Key tissue parameters, e.g., total hemoglobin concentration and tissue oxygenation, are important biomarkers in the clinical diagnosis of various diseases. Although point measurement techniques based on diffuse reflectance spectroscopy can accurately recover these tissue parameters, they are not suitable for the examination of a large tissue region due to slow data acquisition. Previous imaging studies have shown that hemoglobin concentration and oxygenation can be estimated from color measurements under the assumption of known scattering properties, which is impractical in clinical applications. To overcome this limitation and speed up image processing, we propose a sequential weighted Wiener estimation (WE) method to quickly extract key tissue parameters, including total hemoglobin concentration (CtHb), hemoglobin oxygenation (StO2), scatterer density (α), and scattering power (β), from wide-band color measurements. This method takes advantage of the fact that each parameter is sensitive to the color measurements in a different way and attempts to maximize the contribution of those color measurements likely to generate correct results in WE. The method was evaluated on skin phantoms with varying CtHb, StO2, and scattering properties. The results demonstrate excellent agreement between the estimated tissue parameters and the corresponding reference values. Compared with traditional WE, sequential weighted WE shows significant improvement in estimation accuracy. This method could be used to monitor tissue parameters in an imaging setup in real time.
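
    At its core, Wiener estimation is a linear estimator built from training covariances. The following minimal sketch uses a scalar parameter and two "color channels", with a small ridge term added for numerical stability; the data and variable names are illustrative assumptions, not the paper's calibration set. It implements W = C_xy (C_yy + lam*I)^(-1):

```python
# Minimal Wiener-estimation sketch with pure-Python 2x2 linear algebra.

def mean(v):
    return sum(v) / len(v)

def wiener_train(xs, ys, lam=1e-6):
    """xs: scalar parameter samples; ys: paired 2-channel measurements."""
    mx = mean(xs)
    my = [mean([y[k] for y in ys]) for k in (0, 1)]
    n = len(xs)
    # cross-covariance C_xy (1x2) and measurement covariance C_yy (2x2)
    c_xy = [sum((x - mx) * (y[k] - my[k]) for x, y in zip(xs, ys)) / n
            for k in (0, 1)]
    c_yy = [[sum((y[i] - my[i]) * (y[j] - my[j]) for y in ys) / n +
             (lam if i == j else 0.0) for j in (0, 1)] for i in (0, 1)]
    # invert the regularized 2x2 C_yy
    det = c_yy[0][0] * c_yy[1][1] - c_yy[0][1] * c_yy[1][0]
    inv = [[c_yy[1][1] / det, -c_yy[0][1] / det],
           [-c_yy[1][0] / det, c_yy[0][0] / det]]
    w = [c_xy[0] * inv[0][0] + c_xy[1] * inv[1][0],
         c_xy[0] * inv[0][1] + c_xy[1] * inv[1][1]]
    return w, mx, my

def wiener_estimate(w, mx, my, y):
    """Estimate the parameter from a new measurement y."""
    return mx + w[0] * (y[0] - my[0]) + w[1] * (y[1] - my[1])
```

    The sequential weighted variant described above would additionally reweight the channels per parameter; this sketch shows only the basic linear estimator they build upon.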

  2. Key comparison SIM.EM.RF-K5b.CL: scattering coefficients by broad-band methods, 2 GHz-18 GHz — type N connector

    NASA Astrophysics Data System (ADS)

    Silva, H.; Monasterios, G.

    2016-01-01

    The first key comparison in microwave frequencies within the SIM (Sistema Interamericano de Metrología) region has been carried out. The measurands were the S-parameters of 50 ohm coaxial devices with Type-N connectors, measured at 2 GHz, 9 GHz and 18 GHz. SIM.EM.RF-K5b.CL was the identification assigned, and it was based on a parent CCEM key comparison named CCEM.RF-K5b.CL. For this reason, the measurement standards and their nominal values were selected accordingly, i.e. two one-port devices (a matched and a mismatched load) to cover low and high reflection coefficients and two attenuators (3 dB and 20 dB) to cover low and high transmission coefficients. This key comparison has met the need for ensuring traceability in high-frequency measurements across America by linking SIM's results to the CCEM. Six NMIs participated in this comparison, which was piloted by the Instituto Nacional de Tecnología Industrial (Argentina). A linking method based on multivariate values was proposed and implemented in order to allow the linking of two-dimensional results. The final report has been peer-reviewed and approved for publication by the CCEM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  3. Secure password-based authenticated key exchange for web services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Fang; Meder, Samuel; Chevassut, Olivier

    This paper discusses an implementation of an authenticated key-exchange method rendered on message primitives defined in the WS-Trust and WS-SecureConversation specifications. This IEEE-specified cryptographic method (AuthA) is proven secure for password-based authentication and key exchange, while WS-Trust and WS-SecureConversation are emerging Web Services Security specifications that extend the WS-Security specification. A prototype of the presented protocol is integrated in the WSRF-compliant Globus Toolkit V4. Further hardening of the implementation is expected to result in a version that will be shipped with future Globus Toolkit releases. This could help to address the current unavailability of decent shared-secret-based authentication options in the Web Services and Grid world. Future work will be to integrate One-Time-Password (OTP) features in the authentication protocol.

  4. Identification of key ancestors of modern germplasm in a breeding program of maize.

    PubMed

    Technow, F; Schrag, T A; Schipprack, W; Melchinger, A E

    2014-12-01

    Probabilities of gene origin computed from the genomic kinship matrix can accurately identify key ancestors of modern germplasm. Identifying the key ancestors of modern plant breeding populations can provide valuable insights into the history of a breeding program and provide reference genomes for next-generation whole-genome sequencing. In an animal breeding context, a method was developed that employs probabilities of gene origin, computed from the pedigree-based additive kinship matrix, for identifying key ancestors. Because reliable and complete pedigree information is often not available in plant breeding, we replaced the additive kinship matrix with the genomic kinship matrix. As a proof of concept, we applied this approach to simulated data sets with known ancestries. The relative contribution of the ancestral lines to later generations could be determined with high accuracy, with and without selection. Our method was subsequently used for identifying the key ancestors of the modern Dent germplasm of the public maize breeding program of the University of Hohenheim. We found that the modern germplasm can be traced back to six or seven key ancestors, with one or two of them having a disproportionately large contribution. These results largely corroborated conjectures based on early records of the breeding program. We conclude that probabilities of gene origin computed from the genomic kinship matrix can be used for identifying key ancestors in breeding programs and estimating the proportion of genes contributed by them.
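
    The core intuition, ranking candidate ancestors by how strongly they are genomically related to the modern lines, can be sketched in a few lines. This toy ranking by mean genomic kinship is a simplified stand-in for the paper's probabilities of gene origin; the data layout is an assumed convention:

```python
# Toy sketch: rank candidate ancestors by their average genomic kinship
# with the modern lines (illustrative only, not the paper's estimator).

def rank_ancestors(kinship, ancestors, moderns):
    """kinship: dict (a, b) -> genomic kinship coefficient (symmetric)."""
    def k(a, b):
        return kinship.get((a, b), kinship.get((b, a), 0.0))
    mean_k = {a: sum(k(a, m) for m in moderns) / len(moderns)
              for a in ancestors}
    # highest mean kinship first = largest presumed genetic contribution
    return sorted(mean_k, key=mean_k.get, reverse=True), mean_k
```

    An ancestor whose genome is heavily represented in the modern pool (like the one or two dominant lines found above) would come out on top of such a ranking.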

  5. Selection of key financial indicators: a literature, panel and survey approach.

    PubMed

    Pink, George H; Daniel, Imtiaz; Hall, Linda McGillis; McKillop, Ian

    2007-01-01

    Since 1998, most hospitals in Ontario have voluntarily participated in one of the largest and most ambitious publicly available performance-reporting initiatives in the world. This article describes the method used to select key financial indicators for inclusion in the report including the literature review, panel and survey approaches that were used. The results for five years of recent data for Ontario hospitals are also presented.

  6. Key management schemes using routing information frames in secure wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Kamaev, V. A.; Finogeev, A. G.; Finogeev, A. A.; Parygin, D. S.

    2017-01-01

    The article considers the problems and objectives of key management for data encryption in wireless sensor networks (WSN) of SCADA systems. The structure of the key information in the ZigBee network and methods of obtaining keys are discussed. A hybrid key management scheme is most suitable for WSNs: a symmetric session key is used to encrypt the sensor data, while asymmetric keys are used to encrypt the session key transmitted in the routing information. Three algorithms of hybrid key management using routing information frames, determined by the routing methods and the WSN topology, are presented.
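
    The hybrid pattern (symmetric session key for the data, asymmetric wrap for the session key) can be sketched with toy primitives. Textbook RSA with tiny primes and a hash-derived XOR stream stand in for real ciphers here; this is an illustration of the layering only, never something to deploy:

```python
import hashlib
import secrets

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Symmetric layer: XOR the data with a hash-derived keystream."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# tiny textbook RSA key pair: n = p*q, public e, private d = e^-1 mod phi
N, E = 61 * 53, 17
D = pow(E, -1, 60 * 52)

session_key = secrets.token_bytes(16)
reading = b"temp=21.5C node=7"

ciphertext = xor_stream(session_key, reading)      # data layer (symmetric)
wrapped = [pow(b, E, N) for b in session_key]      # key layer (asymmetric)
unwrapped = bytes(pow(c, D, N) for c in wrapped)   # receiver recovers key
assert xor_stream(unwrapped, ciphertext) == reading
```

    The point of the layering is exactly what the abstract describes: the cheap symmetric cipher protects the bulk sensor data, while the expensive asymmetric operation is spent only on the short session key carried in routing frames.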

  7. Realizing Improvement through Team Empowerment (RITE): A Team-based, Project-based Multidisciplinary Improvement Program.

    PubMed

    Larson, David B; Mickelsen, L Jake; Garcia, Kandice

    2016-01-01

    Performance improvement in a complex health care environment depends on the cooperation of diverse individuals and groups, allocation of time and resources, and use of effective improvement methods. To address this challenge, we developed an 18-week multidisciplinary training program that would also provide a vehicle for effecting needed improvements, by using a team- and project-based model. The program began in the radiology department and subsequently expanded to include projects from throughout the medical center. Participants were taught a specific method for team-based problem solving, which included (a) articulating the problem, (b) observing the process, (c) analyzing possible causes of problems, (d) identifying key drivers, (e) testing and refining interventions, and (f) providing for sustainment of results. Progress was formally reviewed on a weekly basis. A total of 14 teams consisting of 78 participants completed the course in two cohorts; one project was discontinued. All completed projects resulted in at least modest improvement. Mean skill scores increased from 2.5/6 to 4.5/6 (P < .01), and the mean satisfaction score was 4.7/5. Identified keys to success include (a) engagement of frontline staff, (b) teams given authority to make process changes, (c) capable improvement coaches, (d) a physician-director with improvement expertise and organizational authority, (e) capable administrative direction, (f) supportive organizational leaders, (g) weekly progress reviews, (h) timely educational material, (i) structured problem-solving methods, and (j) multiple projects working simultaneously. The purpose of this article is to review the program, including the methods and results, and discuss perceived keys to program success. © RSNA, 2016.

  8. A fast key generation method based on dynamic biometrics to secure wireless body sensor networks for p-health.

    PubMed

    Zhang, G H; Poon, Carmen C Y; Zhang, Y T

    2010-01-01

    Body sensor networks (BSNs) have emerged as a new technology for healthcare applications, but the security of communication in BSNs remains a formidable challenge yet to be resolved. The paper discusses the typical attacks faced by BSNs and proposes a fast biometric-based approach to generate keys for ensuring confidentiality and authentication in BSN communications. The approach was tested on 900 segments of electrocardiogram. Each segment was 4 seconds long and was used to generate a 128-bit key. The study found that the entropy of 96% of the keys was above 0.95 and that 99% of the Hamming distances calculated between any two keys were above 50 bits. Based on the randomness and distinctiveness of these keys, it is concluded that the fast biometric-based approach has great potential for securing communication in BSNs for health applications.
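
    The distinctiveness check reported above (pairwise Hamming distances between 128-bit keys) is easy to reproduce on simulated data. Hashing inter-pulse-interval-like feature vectors into keys is an assumed stand-in for the paper's generation procedure; the feature values are made up:

```python
import hashlib

def feature_to_key(features):
    """Hash a feature vector (e.g., inter-pulse intervals in ms) to 128 bits."""
    digest = hashlib.sha256(",".join(map(str, features)).encode()).digest()
    return int.from_bytes(digest[:16], "big")   # keep 128 bits

def hamming(k1, k2):
    """Number of differing bits between two integer-coded keys."""
    return bin(k1 ^ k2).count("1")

# ten simulated 3-feature segments -> ten 128-bit keys
keys = [feature_to_key([800 + i, 790 + 2 * i, 810 - i]) for i in range(10)]
distances = [hamming(a, b) for i, a in enumerate(keys) for b in keys[i + 1:]]
```

    For independent 128-bit keys the expected pairwise distance is 64 bits, which is why the paper's observation that 99% of distances exceed 50 bits supports key distinctiveness.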

  9. Optical Double Image Hiding in the Fractional Hartley Transform Using Structured Phase Filter and Arnold Transform

    NASA Astrophysics Data System (ADS)

    Yadav, Poonam Lata; Singh, Hukum

    2018-06-01

    To maintain the security of image encryption and to protect images from intruders, a new asymmetric cryptosystem based on the fractional Hartley transform (FrHT) and the Arnold transform (AT) is proposed. The AT is an image scrambling method in which the pixels of the image are reorganized. In this cryptosystem, the AT is used to spread the information content of the two original images across the encrypted images and thereby increase the safety of the encoded images. A Structured Phase Mask (SPM) and a Hybrid Mask (HM) are used as encryption keys. The original image is first multiplied with the SPM and HM and then transformed with the direct and inverse fractional Hartley transforms to obtain the encrypted image. The fractional orders of the FrHT and the parameters of the AT serve as the keys of the encryption and decryption methods; only if both keys are used correctly can the original image be retrieved. The proposed method strengthens the security of DRPE by enlarging the key space and the number of parameters, and it is robust against various attacks. The strength of the proposed cryptosystem is evaluated using MATLAB 8.3.0.52 (R2014a), and a set of simulation results shows the power of the proposed asymmetric cryptosystem.
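
    The Arnold transform used as a scrambling key above is the classical cat map: pixel (x, y) of an N x N image moves to ((x + y) mod N, (x + 2y) mod N), and the iteration count acts as part of the key. A minimal sketch on a nested-list "image" (the round count and image size are illustrative):

```python
def arnold(img, rounds=1):
    """Iterate the Arnold cat map: (x, y) -> ((x + y) % n, (x + 2y) % n)."""
    n = len(img)
    for _ in range(rounds):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
        img = out
    return img

def arnold_inverse(img, rounds=1):
    """Undo the same number of rounds (the round count is the key)."""
    n = len(img)
    for _ in range(rounds):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[x][y] = img[(x + y) % n][(x + 2 * y) % n]
        img = out
    return img
```

    Because the map is a bijection on the pixel grid, decryption with the correct round count restores the image exactly, while a wrong count leaves it scrambled.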

  10. Images Encryption Method using Steganographic LSB Method, AES and RSA algorithm

    NASA Astrophysics Data System (ADS)

    Moumen, Abdelkader; Sissaoui, Hocine

    2017-03-01

    The vulnerability of digital image communication is an extremely important issue nowadays, particularly when images are communicated through insecure channels. To improve communication security, many cryptosystems have been presented in the image encryption literature. This paper proposes a novel image encryption technique based on an algorithm that is faster than current methods. The proposed algorithm eliminates the step in which the secret key is shared during the encryption process. It combines symmetric encryption, asymmetric encryption and steganography: the image is encrypted using a symmetric algorithm, then the secret key is encrypted by means of an asymmetric algorithm and hidden in the ciphered image using a least-significant-bit steganographic scheme. The analysis results show that, while enjoying faster computation, our method performs close to optimally in terms of accuracy.
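
    The LSB layer of such a scheme hides the (already encrypted) key bytes in the least significant bit of successive cover bytes. A minimal sketch with a flat byte list standing in for image pixels (the framing is an illustrative choice, not the paper's exact format):

```python
def embed_lsb(cover, payload):
    """Hide payload bytes in the LSBs of the cover bytes (MSB-first)."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover image too small"
    stego = list(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the LSB
    return stego

def extract_lsb(stego, n_bytes):
    """Read back n_bytes hidden bytes from the LSBs."""
    bits = [stego[i] & 1 for i in range(8 * n_bytes)]
    return bytes(int("".join(map(str, bits[i:i + 8])), 2)
                 for i in range(0, len(bits), 8))
```

    Since each pixel value changes by at most 1, the stego image is visually indistinguishable from the cover, which is what makes LSB embedding attractive for carrying the encrypted key.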

  11. Combination of Rivest-Shamir-Adleman Algorithm and End of File Method for Data Security

    NASA Astrophysics Data System (ADS)

    Rachmawati, Dian; Amalia, Amalia; Elviwani

    2018-03-01

    Data security is one of the crucial issues in the delivery of information. One of the ways to secure data is to encode it into something that is not comprehensible by human beings using cryptographic techniques. The Rivest-Shamir-Adleman (RSA) cryptographic algorithm has been proven robust for securing messages. Since this algorithm uses two different keys (i.e., a public key and a private key) for encryption and decryption, it is classified as an asymmetric cryptography algorithm. Steganography is a method used to secure a message by inserting the bits of the message into a larger medium such as an image; one of the known steganography methods is End of File (EoF). In this research, the ciphertext produced by the RSA algorithm is compiled into an array and appended to the end of the image. The result of the EoF step is an image with a line of black gradations beneath it; this line contains the secret message. This combination of cryptography and steganography is expected to increase the security of the message, since the message encryption technique (RSA) is combined with the data hiding technique (EoF).
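
    The EoF step itself is simply appending the ciphertext after the image data, so viewers still render the picture while an extractor can locate the payload. A minimal sketch on raw bytes; the marker and length prefix are illustrative framing choices, not the paper's exact format:

```python
MARKER = b"\x00EOFSTEGO"   # assumed sentinel separating image and payload

def eof_embed(image_bytes, cipher_bytes):
    """Append marker + 4-byte length + ciphertext after the image data."""
    return (image_bytes + MARKER +
            len(cipher_bytes).to_bytes(4, "big") + cipher_bytes)

def eof_extract(stego_bytes):
    """Find the last marker and read back the hidden ciphertext."""
    pos = stego_bytes.rfind(MARKER)
    if pos < 0:
        return None
    start = pos + len(MARKER) + 4
    n = int.from_bytes(stego_bytes[pos + len(MARKER):start], "big")
    return stego_bytes[start:start + n]
```

    Because the image bytes are untouched, the picture itself is unchanged; the appended bytes are what show up as the extra gradation line described above when the file is rendered as raw pixels.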

  12. Multiple spatial modes based QKD over marine free-space optical channels in the presence of atmospheric turbulence.

    PubMed

    Sun, Xiaole; Djordjevic, Ivan B; Neifeld, Mark A

    2016-11-28

    We investigate a multiple-spatial-mode quantum key distribution (QKD) scheme that employs multiple independent parallel beams through a marine free-space optical channel over the open ocean. This approach has the potential to increase the secret key rate (SKR) linearly with the number of channels. To improve the SKR performance, we describe a back-propagation mode (BPM) method to mitigate atmospheric turbulence effects. Our simulation results indicate that the secret key rate can be improved significantly by employing the proposed BPM-based multi-channel QKD scheme.

  13. On High-Order Upwind Methods for Advection

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    2017-01-01

    Schemes III (piecewise linear) and V (piecewise parabolic) of Van Leer are shown to yield identical solutions provided the initial conditions are chosen in an appropriate manner. This result is counterintuitive, since it is generally believed that piecewise linear and piecewise parabolic methods cannot produce the same solutions due to their different degrees of approximation. The result also reveals a key connection between the discontinuous and continuous representation approaches.

  14. A cluster-randomised controlled trial of values-based training to promote autonomously held recovery values in mental health workers.

    PubMed

    Williams, Virginia; Deane, Frank P; Oades, Lindsay G; Crowe, Trevor P; Ciarrochi, Joseph; Andresen, Retta

    2016-02-02

    The implementation and use of evidence-based practices is a key priority for recovery-oriented mental health service provision. Training and development programmes for employees continue to be a key method of knowledge and skill development, despite acknowledged difficulties with uptake and maintenance of behaviour change. Self-determination theory suggests that autonomy, or a sense that behaviour is self-generated, is a key motivator to sustained behaviour change, in this case practices in mental health services. This study examined the utility of values-focused staff intervention as a specific, reproducible method of autonomy support. Mental health workers (n = 146) were assigned via cluster randomisation to either a values clarification condition or an active problem-solving control condition. Results demonstrated that a structured values clarification exercise was useful in promoting integrated motivation for the changed practice and resulted in increased implementation planning. Structured values clarification intervention demonstrates utility as a reproducible means of autonomy support within the workplace. We discuss future directions for the study of autonomous motivation in the field of implementation science. ACTRN12613000353796.

  15. Survey and Method for Determination of Trajectory Predictor Requirements

    NASA Technical Reports Server (NTRS)

    Rentas, Tamika L.; Green, Steven M.; Cate, Karen Tung

    2009-01-01

    A survey of air-traffic-management researchers, representing a broad range of automation applications, was conducted to document trajectory-predictor requirements for future decision-support systems. Results indicated that the researchers were unable to articulate a basic set of trajectory-prediction requirements for their automation concepts. Survey responses showed the need to establish a process to help developers determine the trajectory-predictor performance requirements for their concepts. Two methods for determining trajectory-predictor requirements are introduced. A fast-time simulation method is discussed that captures the sensitivity of a concept to the performance of its trajectory-prediction capability. A characterization method is proposed to provide quicker, yet less precise, results based on analysis and simulation that characterize the trajectory-prediction errors associated with key modeling options for a specific concept. Concept developers can then identify the relative sizes of errors associated with key modeling options and qualitatively determine which options lead to significant errors. The characterization method is demonstrated for a case study involving future airport surface traffic management automation. Of the top four sources of error, results indicated that the error associated with accelerations to and from turn speeds was unacceptable, the error associated with the turn-path model was acceptable, and the error associated with taxi-speed estimation was of concern and needed a higher-fidelity concept simulation to obtain a more precise result.

  16. Generation of hazardous methyl azide and its application to the synthesis of a key intermediate of picarbutrazox, a new potent pesticide, in flow.

    PubMed

    Ichinari, Daisuke; Nagaki, Aiichiro; Yoshida, Jun-Ichi

    2017-12-01

    Generation and reactions of methyl azide (MeN3) were successfully performed by using a flow reactor system, demonstrating that the flow method serves as a safe means of handling hazardous, explosive methyl azide. The reaction of NaN3 and Me2SO4 in a flow reactor gave a MeN3 solution, which was used for a Huisgen reaction with benzoyl cyanide in a flow reactor after minimal washing. The resulting 1-methyl-5-benzoyltetrazole serves as a key intermediate of picarbutrazox (IX), a new potent pesticide. Copyright © 2017. Published by Elsevier Ltd.

  17. Multi-Gaussian fitting for pulse waveform using Weighted Least Squares and multi-criteria decision making method.

    PubMed

    Wang, Lu; Xu, Lisheng; Feng, Shuting; Meng, Max Q-H; Wang, Kuanquan

    2013-11-01

    Analysis of the pulse waveform is a low-cost, non-invasive method for obtaining vital information related to the condition of the cardiovascular system. In recent years, different Pulse Decomposition Analysis (PDA) methods have been applied to disclose the pathological mechanisms of the pulse waveform. All of these methods decompose a single-period pulse waveform into a constant number (such as 3, 4 or 5) of individual waves, and they do not pay much attention to the estimation error of the key points in the pulse waveform, even though the estimation of human vascular conditions depends on the positions of these key points. In this paper, we propose a Multi-Gaussian (MG) model to fit real pulse waveforms using an adaptive number (4 or 5 in our study) of Gaussian waves. The unknown parameters in the MG model are estimated by the Weighted Least Squares (WLS) method, and the optimized weight values corresponding to different sampling points are selected using the Multi-Criteria Decision Making (MCDM) method. The performance of the MG model and the WLS method has been evaluated by fitting 150 real pulse waveforms of five different types. The resulting Normalized Root Mean Square Error (NRMSE) was less than 2.0% and the estimation accuracy for the key points was satisfactory, demonstrating that our proposed method is effective in compressing, synthesizing and analyzing pulse waveforms. Copyright © 2013 Elsevier Ltd. All rights reserved.
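
    The WLS step is easiest to see with the Gaussian centers and widths held fixed, so the amplitudes become a linear problem A = (G^T W G)^(-1) G^T W y. The two-component setup below is a toy sketch (the paper fits 4 or 5 Gaussians and chooses the weights via MCDM; here the weights and components are hand-picked assumptions):

```python
import math

def gauss(t, mu, sigma):
    """Unit-amplitude Gaussian component."""
    return math.exp(-((t - mu) ** 2) / (2 * sigma ** 2))

def wls_amplitudes(ts, ys, ws, comps):
    """comps: [(mu1, s1), (mu2, s2)]; returns fitted amplitudes [a1, a2]
    solving the 2x2 weighted normal equations."""
    g = [[gauss(t, mu, s) for (mu, s) in comps] for t in ts]
    # assemble G^T W G (2x2) and G^T W y (2-vector)
    gtg = [[sum(w * gi[i] * gi[j] for w, gi in zip(ws, g)) for j in (0, 1)]
           for i in (0, 1)]
    gty = [sum(w * gi[i] * y for w, gi, y in zip(ws, g, ys)) for i in (0, 1)]
    det = gtg[0][0] * gtg[1][1] - gtg[0][1] * gtg[1][0]
    return [(gtg[1][1] * gty[0] - gtg[0][1] * gty[1]) / det,
            (gtg[0][0] * gty[1] - gtg[1][0] * gty[0]) / det]
```

    In the full method the centers and widths are also unknown (a nonlinear fit) and the per-sample weights emphasize the key points; this sketch isolates only the weighted normal-equation core.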

  18. Finite-key security analysis of quantum key distribution with imperfect light sources

    DOE PAGES

    Mizutani, Akihiro; Curty, Marcos; Lim, Charles Ci Wen; ...

    2015-09-09

    In recent years, the gap between theory and practice in quantum key distribution (QKD) has been significantly narrowed, particularly for QKD systems with arbitrarily flawed optical receivers. The status for QKD systems with imperfect light sources is however less satisfactory, in the sense that the resulting secure key rates are often overly dependent on the quality of state preparation. This is especially the case when the channel loss is high. Very recently, to overcome this limitation, Tamaki et al proposed a QKD protocol based on so-called 'rejected data analysis', and showed that its security in the limit of infinitely long keys is almost independent of any encoding flaw in the qubit space, with the protocol being compatible with the decoy state method. Here, as a step towards practical QKD, we show that a similar conclusion is reached in the finite-key regime, even when the intensity of the light source is unstable. More concretely, we derive security bounds for a wide class of realistic light sources and show that the bounds are also efficient in the presence of high channel loss. Our results strongly suggest the feasibility of long-distance provably secure communication with imperfect light sources.

  19. DEVELOPMENT AND EVALUATION OF AN ENZYME-LINKED IMMUNOASSAY (ELISA) METHOD FOR THE MEASUREMENT OF 2,4-DICHLOROPHENOXYACETIC ACID IN HUMAN URINE

    EPA Science Inventory

    This paper describes the development of a 96-microwell, high-sample-capacity ELISA method for measuring 2,4-D in urine; the analysis of 2,4-D in real-world urine samples by both ELISA and GC/MS methods; and a comparison of the ELISA and GC/MS results in several key areas: accuracy, precision...

  20. Biometrics encryption combining palmprint with two-layer error correction codes

    NASA Astrophysics Data System (ADS)

    Li, Hengjian; Qiu, Jian; Dong, Jiwen; Feng, Guang

    2017-07-01

    To bridge the gap between the fuzziness of biometrics and the exactitude of cryptography, a novel biometrics encryption method based on combining palmprints with two-layer error correction codes is proposed. First, the randomly generated original keys are encoded with two-layer convolutional and cyclic coding: the first layer uses a convolutional code to correct burst errors, and the second layer uses a cyclic code to correct random errors. Then, palmprint features are extracted from the palmprint images and fused with the encoded keys by an XOR operation, and the resulting information is stored in a smart card. Finally, the original keys are recovered by XORing the information in the smart card with the user's palmprint features and then decoding with the two-layer convolutional and cyclic code. The experimental results and security analysis show that the method can recover the original keys completely. The proposed method is more secure than a single password factor and has higher accuracy than a single biometric factor.
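
    The commit/recover structure above is the classical fuzzy-commitment pattern. In this toy sketch a 3x repetition code plays the role of the two-layer (convolutional + cyclic) ECC, and random bit vectors stand in for palmprint features; everything here is an illustrative simplification:

```python
import secrets

def encode(bits):
    """Repetition-3 encoding (stand-in for the two-layer ECC)."""
    return [b for b in bits for _ in range(3)]

def decode(bits):
    """Majority vote per 3-bit group corrects isolated bit errors."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

def commit(key_bits, feature_bits):
    """Helper data = encoded key XOR biometric features."""
    return [c ^ f for c, f in zip(encode(key_bits), feature_bits)]

def recover(helper, feature_bits):
    """XOR with a (possibly noisy) rescan, then decode."""
    return decode([h ^ f for h, f in zip(helper, feature_bits)])

key = [secrets.randbelow(2) for _ in range(8)]
feature = [secrets.randbelow(2) for _ in range(24)]
helper = commit(key, feature)
noisy = list(feature)
noisy[5] ^= 1          # one flipped bit from a fresh palm scan
assert recover(helper, noisy) == key
```

    The ECC absorbs the scan-to-scan noise of the biometric, which is exactly why the real scheme layers a burst-error code over a random-error code instead of this toy repetition code.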

  1. Parameterizations for ensemble Kalman inversion

    NASA Astrophysics Data System (ADS)

    Chada, Neil K.; Iglesias, Marco A.; Roininen, Lassi; Stuart, Andrew M.

    2018-05-01

    The use of ensemble methods to solve inverse problems is attractive because it is a derivative-free methodology which is also well-adapted to parallelization. In its basic iterative form the method produces an ensemble of solutions which lie in the linear span of the initial ensemble. Choice of the parameterization of the unknown field is thus a key component of the success of the method. We demonstrate how both geometric ideas and hierarchical ideas can be used to design effective parameterizations for a number of applied inverse problems arising in electrical impedance tomography, groundwater flow and source inversion. In particular we show how geometric ideas, including the level set method, can be used to reconstruct piecewise continuous fields, and we show how hierarchical methods can be used to learn key parameters in continuous fields, such as length-scales, resulting in improved reconstructions. Geometric and hierarchical ideas are combined in the level set method to find piecewise constant reconstructions with interfaces of unknown topology.
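
    The basic iteration behind ensemble Kalman inversion is compact enough to sketch for a scalar problem y = G(u) + noise: every member moves along the ensemble cross-covariance, u_j += C_uG / (C_GG + gamma) * (y - G(u_j)). A linear forward map keeps the toy example transparent; all numbers are illustrative, and the parameterization ideas discussed above would enter through how u is represented:

```python
import random

def eki(G, y, ensemble, gamma=1e-2, iters=20):
    """Scalar ensemble Kalman inversion (derivative-free, toy version)."""
    for _ in range(iters):
        gs = [G(u) for u in ensemble]
        mu_u = sum(ensemble) / len(ensemble)
        mu_g = sum(gs) / len(gs)
        # ensemble covariances C_uG and C_GG
        c_ug = sum((u - mu_u) * (g - mu_g)
                   for u, g in zip(ensemble, gs)) / len(gs)
        c_gg = sum((g - mu_g) ** 2 for g in gs) / len(gs)
        gain = c_ug / (c_gg + gamma)
        ensemble = [u + gain * (y - g) for u, g in zip(ensemble, gs)]
    return ensemble

random.seed(0)
start = [random.uniform(-1, 5) for _ in range(50)]
final = eki(lambda u: 2.0 * u, 4.0, start)   # true parameter u* = 2
```

    Note the derivative-free character: only forward evaluations G(u) are needed, which is what makes the method attractive for the PDE-constrained inverse problems (EIT, groundwater flow, source inversion) listed above.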

  2. Biometric Methods for Secure Communications in Body Sensor Networks: Resource-Efficient Key Management and Signal-Level Data Scrambling

    NASA Astrophysics Data System (ADS)

    Bui, Francis Minhthang; Hatzinakos, Dimitrios

    2007-12-01

    As electronic communications become more prevalent, mobile and universal, the threats of data compromise also loom correspondingly larger. In the context of a body sensor network (BSN), which permits pervasive monitoring of potentially sensitive medical data, security and privacy concerns are particularly important. It is a challenge to implement traditional security infrastructures in these types of lightweight networks since they are by design limited in both computational and communication resources. A key enabling technology for secure communications in BSNs has emerged to be biometrics. In this work, we present two complementary approaches which exploit physiological signals to address security issues: (1) a resource-efficient key management system for generating and distributing cryptographic keys to constituent sensors in a BSN; (2) a novel data scrambling method, based on interpolation and random sampling, that is envisioned as a potential alternative to conventional symmetric encryption algorithms for certain types of data. The former targets the resource constraints in BSNs, while the latter addresses the fuzzy variability of biometric signals, which has largely precluded the direct application of conventional encryption. Using electrocardiogram (ECG) signals as biometrics, the resulting computer simulations demonstrate the feasibility and efficacy of these methods for delivering secure communications in BSNs.

  3. Practical Quantum Cryptography for Secure Free-Space Communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buttler, W.T.; Hughes, R.J.; Kwiat, P.G.

    1999-02-01

    Quantum cryptography is an emerging technology in which two parties may simultaneously generate shared, secret cryptographic key material using the transmission of quantum states of light. The security of these transmissions is based on the inviolability of the laws of quantum mechanics and information-theoretically secure post-processing methods. An adversary can neither successfully tap the quantum transmissions, nor evade detection, owing to Heisenberg's uncertainty principle. In this paper we describe the theory of quantum cryptography, and the most recent results from our experimental free-space system with which we have demonstrated for the first time the feasibility of quantum key generation over a point-to-point outdoor atmospheric path in daylight. We achieved a transmission distance of 0.5 km, which was limited only by the length of the test range. Our results provide strong evidence that cryptographic key material could be generated on demand between a ground station and a satellite (or between two satellites), allowing a satellite to be securely re-keyed on orbit. We present a feasibility analysis of surface-to-satellite quantum key generation.

  4. FREE-SPACE QUANTUM CRYPTOGRAPHY IN DAYLIGHT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, R.J.; Buttler, W.T.

    2000-01-01

    Quantum cryptography is an emerging technology in which two parties may simultaneously generate shared, secret cryptographic key material using the transmission of quantum states of light. The security of these transmissions is based on the inviolability of the laws of quantum mechanics and information-theoretically secure post-processing methods. An adversary can neither successfully tap the quantum transmissions, nor evade detection, owing to Heisenberg's uncertainty principle. In this paper we describe the theory of quantum cryptography, and the most recent results from our experimental free-space system with which we have demonstrated for the first time the feasibility of quantum key generation over a point-to-point outdoor atmospheric path in daylight. We achieved a transmission distance of 0.5 km, which was limited only by the length of the test range. Our results provide strong evidence that cryptographic key material could be generated on demand between a ground station and a satellite (or between two satellites), allowing a satellite to be securely re-keyed on orbit. We present a feasibility analysis of surface-to-satellite quantum key generation.

  5. Coupling habitat suitability and ecosystem health with AEHRA to estimate E-flows under intensive human activities

    NASA Astrophysics Data System (ADS)

    Zhao, C. S.; Yang, S. T.; Zhang, H. T.; Liu, C. M.; Sun, Y.; Yang, Z. Y.; Zhang, Y.; Dong, B. E.; Lim, R. P.

    2017-08-01

    Sustaining adequate environmental flows (e-flows) is a key principle for maintaining river biodiversity and ecosystem health, and for supporting sustainable water resource management in basins under intensive human activities. However, few methods correctly relate river health to e-flows assessment at the catchment scale when applied to rivers highly impacted by human activities. An effective method is presented in this study to closely link river health to e-flows assessment for rivers at the catchment scale. Key fish species, as indicators of ecosystem health, were selected using a food-web model. A multi-species habitat suitability model (MHSI) was improved and coupled with the dominance of the key fish species as well as the Index of Biological Integrity (IBI) to enhance its accuracy in determining the fish-preferred key hydrologic habitat variables related to ecosystem health. Taking 5964 fish samples and concurrent hydrological habitat variables as the basis, the combination of the key variables of flow velocity and water depth was determined and used to drive the Adapted Ecological Hydraulic Radius Approach (AEHRA) to study e-flows in a Chinese urban river impacted by intensive human activities. Results showed that upstream urbanization resulted in abnormal river-course geomorphology and consequently abnormal e-flows. Selecting key species based on the food web and trophic levels of aquatic ecosystems reflects the e-flow requirements of the whole aquatic ecosystem, which greatly increases the method's potential as a guidance tool for rehabilitating degraded ecosystems at large spatial scales. These findings have significant ramifications for catchment e-flows assessment under intensive human activities and for river ecohealth restoration in such rivers globally.

  6. Tomographic methods in flow diagnostics

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J.

    1993-01-01

    This report presents a viewpoint of tomography that should be well adapted to currently available optical measurement technology as well as the needs of computational and experimental fluid dynamicists. The goals in mind are to record data with the fastest optical array sensors; to process the data with the fastest parallel-processing technology available for small computers; and to generate results for both experimental and theoretical data. An in-depth example treats interferometric data as it might be recorded in an aeronautics test facility, but the results are applicable whenever fluid properties are to be measured or applied from projections of those properties. The paper discusses both computed and neural-net calibration tomography. The report also contains an overview of key definitions and computational methods, key references, computational problems such as ill-posedness, artifacts, and missing data, and some possible and current research topics.

  7. Hierarchical Simulation to Assess Hardware and Software Dependability

    NASA Technical Reports Server (NTRS)

    Ries, Gregory Lawrence

    1997-01-01

    This thesis presents a method for conducting hierarchical simulations to assess system hardware and software dependability. The method is intended to model embedded microprocessor systems. A key contribution of the thesis is the idea of using fault dictionaries to propagate fault effects upward from the level of abstraction where a fault model is assumed to the system level where the ultimate impact of the fault is observed. A second important contribution is the analysis of the software behavior under faults as well as the hardware behavior. The simulation method is demonstrated and validated in four case studies analyzing Myrinet, a commercial, high-speed networking system. One key result from the case studies shows that the simulation method predicts the same fault impact 87.5% of the time as is obtained by similar fault injections into a real Myrinet system. Reasons for the remaining discrepancy are examined in the thesis. A second key result shows the reduction in the number of simulations needed due to the fault dictionary method. In one case study, 500 faults were injected at the chip level, but only 255 propagated to the system level. Of these 255 faults, 110 shared identical fault dictionary entries at the system level and so did not need to be resimulated. The necessary number of system-level simulations was therefore reduced from 500 to 145. Finally, the case studies show how the simulation method can be used to improve the dependability of the target system. The simulation analysis was used to add recovery to the target software for the most common fault propagation mechanisms that would cause the software to hang. After the modification, the number of hangs was reduced by 60% for fault injections into the real system.
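
    The pruning arithmetic in the case study (500 injected faults, 255 propagating to the system level, 145 unique system-level dictionary entries) can be sketched as follows; this is an illustrative reconstruction of the counting logic, not the thesis's simulation code, and the entry names are hypothetical:

```python
def simulations_needed(faults):
    """faults: list of (propagated, dictionary_entry) pairs.

    Faults that never propagate need no system-level simulation, and faults
    sharing an identical system-level dictionary entry are simulated only once.
    """
    entries = [entry for propagated, entry in faults if propagated]
    return len(set(entries))  # one system-level simulation per unique entry

# numbers from the case study: 500 faults, 255 propagate, and 110 of those
# duplicate dictionary entries already seen, leaving 145 distinct entries
faults = [(False, None)] * 245                               # 245 never propagate
faults += [(True, f"entry{i}") for i in range(145)]          # 145 unique entries
faults += [(True, f"entry{i % 110}") for i in range(110)]    # 110 duplicates
assert len(faults) == 500
assert sum(p for p, _ in faults) == 255
print(simulations_needed(faults))  # 145
```

    This is the reduction claimed above: deduplication by fault dictionary cuts the necessary system-level simulations from 500 to 145.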

  8. Mining key elements for severe convection prediction based on CNN

    NASA Astrophysics Data System (ADS)

    Liu, Ming; Pan, Ning; Zhang, Changan; Sha, Hongzhou; Zhang, Bolei; Liu, Liang; Zhang, Meng

    2017-04-01

    Severe convective weather is a class of weather disasters accompanied by heavy rainfall, gusty wind, hail, etc. Along with recent developments in remote sensing and numerical modeling, high-volume, long-term observational and modeling data have accumulated that capture massive numbers of severe convective events over particular areas and time periods. With these high-volume, high-variety weather data, most existing studies and methods pursue dynamical laws, cause analysis, potential-rule study, and prediction enhancement by utilizing the governing equations of fluid dynamics and thermodynamics. In this study, a key-element mining method is proposed for severe convection prediction based on convolutional neural networks (CNNs). It aims to identify the key areas and key elements from huge amounts of historical weather data, including conventional measurements, weather radar and satellite observations, as well as numerical modeling and/or reanalysis data. In this manner, the machine-learning-based method can support human forecasters' decision-making in operational forecasts of severe convective weather by extracting key information from real-time and historical weather big data. The method first utilizes computer vision techniques to preprocess the meteorological variables. It then uses information such as radar maps and expert knowledge to annotate all images automatically. Finally, using a CNN model, it can analyze and evaluate each weather element (e.g., particular variables, patterns, features, etc.) and identify the key areas of those critical weather elements, helping forecasters quickly screen out the key elements from huge amounts of observation data under current weather conditions.
Based on rich weather measurement and model data (up to 10 years) over Fujian province in China, where severe convective weather is very active during the summer months, experimental tests were conducted with the new machine-learning method via CNN models. Based on the analysis of these experimental results and case studies, the proposed method has the following benefits for severe convection prediction: (1) it helps forecasters narrow down the scope of analysis and saves lead time for high-impact severe convection; (2) it processes huge amounts of weather big data with machine-learning methods rather than relying on traditional theory and knowledge alone, providing a new way to explore and quantify severe convective weather; and (3) it provides machine-learning-based end-to-end analysis and processing with considerable scalability in data volume, accomplishing the analysis without human intervention.

  9. Study on Multi-stage Logistics System Design Problem with Inventory Considering Demand Change by Hybrid Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Inoue, Hisaki; Gen, Mitsuo

    The logistics model used in this study is a 3-stage model employed by an automobile company, which aims to solve traffic problems at minimum total cost. Recently, research on metaheuristic methods has advanced as an approximate means for solving optimization problems like this model. These problems can be solved using various methods such as the genetic algorithm (GA), simulated annealing, and tabu search. GA is superior in robustness and adjustability toward changes in the structure of these problems. However, GA has the disadvantage of slightly inefficient search performance because it carries out a multi-point search. A hybrid GA that combines GA with another method is attracting considerable attention, since it can compensate for the premature convergence to partial solutions that adversely affects results. In this study, we propose a novel hybrid random-key-based GA (h-rkGA) that combines local search with parameter tuning of the crossover rate and mutation rate; h-rkGA is an improved version of the random-key-based GA (rk-GA). We conducted comparative experiments with the spanning-tree-based GA, the priority-based GA and the random-key-based GA, as well as with "h-GA by only local search" and "h-GA by only parameter tuning". We report the effectiveness of the proposed method on the basis of the results of these experiments.
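
    The random-key encoding underlying rk-GA-style methods can be sketched as follows (a generic illustration with hypothetical helper names, not the paper's h-rkGA implementation): each chromosome is a vector of floats, and sorting the keys yields a permutation, so ordinary real-valued crossover and mutation always produce feasible orderings.

```python
import numpy as np

def decode(keys):
    """Map a random-key chromosome (floats in [0, 1)) to a visiting order."""
    return np.argsort(keys)  # rank of each key gives a valid permutation

def crossover(parent_a, parent_b, rate, rng):
    """Uniform crossover applied gene-wise with the given crossover rate."""
    mask = rng.random(parent_a.size) < rate
    return np.where(mask, parent_a, parent_b)

def mutate(keys, rate, rng):
    """Resample each key independently with the given mutation rate."""
    mask = rng.random(keys.size) < rate
    return np.where(mask, rng.random(keys.size), keys)

rng = np.random.default_rng(1)
a, b = rng.random(5), rng.random(5)          # two parent chromosomes
child = mutate(crossover(a, b, rate=0.7, rng=rng), rate=0.1, rng=rng)
order = decode(child)                        # always a valid permutation of 0..4
```

    This feasibility-by-construction property is what makes random keys robust under the crossover-rate and mutation-rate tuning described above.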

  10. Parallel Key Frame Extraction for Surveillance Video Service in a Smart City.

    PubMed

    Zheng, Ran; Yao, Chuanwei; Jin, Hai; Zhu, Lei; Zhang, Qin; Deng, Wei

    2015-01-01

    Surveillance video service (SVS) is one of the most important services provided in a smart city, and efficient surveillance video analysis techniques are essential to its utilization. Key frame extraction is a simple yet effective technique to achieve this goal. In surveillance video applications, key frames are typically used to summarize important video content, so it is essential to extract them accurately and efficiently. A novel approach is proposed to extract key frames from traffic surveillance videos on GPUs (graphics processing units) to ensure high efficiency and accuracy. For the determination of key frames, motion is the more salient feature in presenting actions or events, especially in surveillance videos. The motion feature is extracted on the GPU to reduce running time; it is then smoothed to reduce noise, and the frames with local maxima of motion information are selected as the final key frames. The experimental results show that this approach extracts key frames more accurately and efficiently than several other methods.
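
    The selection step described above (motion feature, smoothing, local maxima) can be sketched on the CPU as follows; the paper's GPU implementation and exact motion feature are not reproduced, and all names are illustrative:

```python
import numpy as np

def key_frames(frames, window=3):
    """frames: (T, H, W) grayscale video; returns indices of key frames.

    Uses mean absolute frame difference as a simple motion feature,
    smooths it with a moving average, and keeps local maxima.
    """
    motion = np.abs(np.diff(frames.astype(float), axis=0)).mean(axis=(1, 2))
    kernel = np.ones(window) / window
    smooth = np.convolve(motion, kernel, mode="same")   # reduce noise
    # local maxima of the smoothed motion curve mark candidate key frames
    idx = [i for i in range(1, len(smooth) - 1)
           if smooth[i] > smooth[i - 1] and smooth[i] >= smooth[i + 1]]
    return [i + 1 for i in idx]  # diff shifts indices by one frame

video = np.zeros((20, 8, 8))
video[10] = 1.0                  # a burst of motion around frame 10
picks = key_frames(video)        # the motion peak is selected as a key frame
```

    On a GPU, the per-frame difference and reduction would be the natural kernels to offload, which is where the paper's speedup comes from.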

  11. Gaseous Sulfate Solubility in Glass: Experimental Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bliss, Mary

    2013-11-30

    Sulfate solubility in glass is a key parameter in many commercial glasses and nuclear waste glasses. This report summarizes key publications specific to sulfate solubility experimental methods and the underlying physical chemistry calculations. The published methods and experimental data are used to verify the calculations in this report and are expanded to a range of current technical interest. The calculations and experimental methods described in this report will guide several experiments on sulfate solubility and saturation for the Hanford Waste Treatment Plant Enhanced Waste Glass Models effort. There are several tables of sulfate gas equilibrium values at high temperature to guide experimental gas mixing and to achieve desired SO3 levels. This report also describes the necessary equipment and best practices to perform sulfate saturation experiments for molten glasses. Results and findings will be published when experimental work is finished and this report is validated from the data obtained.

  12. Extremality of Gaussian quantum states.

    PubMed

    Wolf, Michael M; Giedke, Geza; Cirac, J Ignacio

    2006-03-03

    We investigate Gaussian quantum states in view of their exceptional role within the space of all continuous variables states. A general method for deriving extremality results is provided and applied to entanglement measures, secret key distillation and the classical capacity of bosonic quantum channels. We prove that for every given covariance matrix the distillable secret key rate and the entanglement, if measured appropriately, are minimized by Gaussian states. This result leads to a clearer picture of the validity of frequently made Gaussian approximations. Moreover, it implies that Gaussian encodings are optimal for the transmission of classical information through bosonic channels, if the capacity is additive.

  13. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    shunhe, Li; jianhua, Rao; lin, Gui; weimin, Zhang; degang, Liu

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy-duty machine tool can be remanufactured in the end-of-life (EOL) stage of the machine tool lifecycle. The objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy-duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which makes the evaluation results accord with conventional human thinking. The catastrophe progression method is used to establish a quantitative evaluation model for heavy-duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.

  14. Estimating genome-wide regulatory activity from multi-omics data sets using mathematical optimization.

    PubMed

    Trescher, Saskia; Münchmeyer, Jannes; Leser, Ulf

    2017-03-27

    Gene regulation is one of the most important cellular processes, indispensable for the adaptability of organisms and closely interlinked with several classes of pathogenesis and their progression. Elucidation of regulatory mechanisms can be approached by a multitude of experimental methods, yet integration of the resulting heterogeneous, large, and noisy data sets into comprehensive and tissue or disease-specific cellular models requires rigorous computational methods. Recently, several algorithms have been proposed which model genome-wide gene regulation as sets of (linear) equations over the activity and relationships of transcription factors, genes and other factors. Subsequent optimization finds those parameters that minimize the divergence of predicted and measured expression intensities. In various settings, these methods produced promising results in terms of estimating transcription factor activity and identifying key biomarkers for specific phenotypes. However, despite their common root in mathematical optimization, they vastly differ in the types of experimental data being integrated, the background knowledge necessary for their application, the granularity of their regulatory model, the concrete paradigm used for solving the optimization problem and the data sets used for evaluation. Here, we review five recent methods of this class in detail and compare them with respect to several key properties. Furthermore, we quantitatively compare the results of four of the presented methods based on publicly available data sets. The results show that all methods seem to find biologically relevant information. However, we also observe that the mutual result overlaps are very low, which contradicts biological intuition. Our aim is to raise further awareness of the power of these methods, yet also to identify common shortcomings and necessary extensions enabling focused research on the critical points.
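
    The common core of these methods, modeling expression as (linear) equations over transcription factor activities and minimizing the divergence between predicted and measured intensities, can be sketched as an ordinary least-squares problem (a generic illustration with made-up names; no specific reviewed method is reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_tfs = 200, 5
# known gene-TF connectivity (background knowledge, e.g. from a regulatory network)
C = rng.integers(0, 2, size=(n_genes, n_tfs)).astype(float)
a_true = rng.normal(size=n_tfs)                           # hidden TF activities
expression = C @ a_true + 0.01 * rng.normal(size=n_genes) # measured intensities

# optimization step: minimize || C a - expression ||_2 over the activities a,
# i.e. find the activities whose predicted expression best matches the data
a_hat, *_ = np.linalg.lstsq(C, expression, rcond=None)
```

    The reviewed methods differ in how they regularize this problem, what data they integrate, and how the regulatory model is parameterized, but the predicted-versus-measured objective above is the shared root.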

  15. GOClonto: an ontological clustering approach for conceptualizing PubMed abstracts.

    PubMed

    Zheng, Hai-Tao; Borchert, Charles; Kim, Hong-Gee

    2010-02-01

    Concurrent with progress in the biomedical sciences, an overwhelming amount of textual knowledge is accumulating in the biomedical literature. PubMed is the most comprehensive database collecting and managing biomedical literature. To help researchers easily understand collections of PubMed abstracts, numerous clustering methods have been proposed to group similar abstracts based on their shared features. However, most of these methods do not explore the semantic relationships among groupings of documents, which could help better illuminate the groupings of PubMed abstracts. To address this issue, we propose an ontological clustering method called GOClonto for conceptualizing PubMed abstracts. GOClonto uses latent semantic analysis (LSA) and the gene ontology (GO) to identify key gene-related concepts and their relationships, and allocates PubMed abstracts based on these key gene-related concepts. On two PubMed abstract collections, the experimental results show that GOClonto is able to identify key gene-related concepts and outperforms the STC (suffix tree clustering) algorithm, the Lingo algorithm, the Fuzzy Ants algorithm, and the clustering-based TRS (tolerance rough set) algorithm. Moreover, the two ontologies generated by GOClonto show significantly informative conceptual structures.

  16. RM-DEMATEL: a new methodology to identify the key factors in PM2.5.

    PubMed

    Chen, Yafeng; Liu, Jie; Li, Yunpeng; Sadiq, Rehan; Deng, Yong

    2015-04-01

    The weather system is a relatively complex dynamic system whose factors mutually influence the PM2.5 concentration. In this paper, a new method is proposed to quantify the influence of other factors in the weather system on PM2.5 and to identify, with limited resources, the most important factors for PM2.5. The relation map (RM) is used to derive the direct-relation matrix of 14 factors affecting PM2.5. The decision making trial and evaluation laboratory (DEMATEL) method is applied to calculate the causal relationships and the extent of mutual influence among the 14 factors. According to the ranking results of the proposed method, the most important key factors are sulfur dioxide (SO2) and nitrogen oxides (NO(X)). In addition, other factors, the ambient maximum temperature (T(max)), the concentration of PM10, and the wind direction (W(dir)), are important for PM2.5. The proposed method can also be applied to other environmental management systems to identify key factors.
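
    The DEMATEL step referred to above can be sketched in its standard generic form (a textbook formulation on a toy 3-factor matrix, not the paper's RM-derived 14-factor data): the direct-relation matrix D is normalized and the total-relation matrix T = N(I - N)^-1 accumulates direct plus indirect influence.

```python
import numpy as np

def dematel(D):
    """Standard DEMATEL: return (prominence, relation) for each factor."""
    N = D / D.sum(axis=1).max()                  # normalize by the largest row sum
    T = N @ np.linalg.inv(np.eye(len(D)) - N)    # total (direct + indirect) relations
    influence, influenced = T.sum(axis=1), T.sum(axis=0)
    prominence = influence + influenced          # overall importance of each factor
    relation = influence - influenced            # > 0: net cause; < 0: net effect
    return prominence, relation

# toy example: factor 0 drives factors 1 and 2, which makes it the net cause
D = np.array([[0.0, 3.0, 2.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
prominence, relation = dematel(D)
```

    Ranking factors by prominence, and reading the sign of relation, is how key causal factors such as SO2 and NO(X) are singled out in analyses of this kind.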

  17. Operating Reserves and Wind Power Integration: An International Comparison; Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milligan, M.; Donohoo, P.; Lew, D.

    2010-10-01

    This paper provides a high-level international comparison of methods and key results from both operating practice and integration analysis, based on an informal collaboration under International Energy Agency Task 25: Large-scale Wind Integration.

  18. Design of virtual simulation experiment based on key events

    NASA Astrophysics Data System (ADS)

    Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu

    2018-06-01

    To address the complex content of and lack of guidance in virtual simulation experiments, key-event technology from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging", based on biological morphology, was taken as an example; many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is completed effectively.

  19. Documenting Preservice Teacher Growth through Critical Assessment of Online Lesson Plans

    ERIC Educational Resources Information Center

    Cude, Michelle D.; Haraway, Dana L.

    2017-01-01

    This research explores the question of how students in a social studies methods course improve skills in analyzing and critiquing pre-existing lesson plans. It utilizes a pre-post authentic assessment tool to measure student growth in key skills of lesson plan critique over the course of one semester's methods instruction. The results support the…

  20. Decision modeling for fire incident analysis

    Treesearch

    Donald G. MacGregor; Armando González-Cabán

    2009-01-01

    This paper reports on methods for representing and modeling fire incidents based on concepts and models from the decision and risk sciences. A set of modeling techniques are used to characterize key fire management decision processes and provide a basis for incident analysis. The results of these methods can be used to provide insights into the structure of fire...

  1. Image security based on iterative random phase encoding in expanded fractional Fourier transform domains

    NASA Astrophysics Data System (ADS)

    Liu, Zhengjun; Chen, Hang; Blondel, Walter; Shen, Zhenmin; Liu, Shutian

    2018-06-01

    A novel image encryption method is proposed using the expanded fractional Fourier transform, which is implemented with a pair of lenses. Here the centers of the two lenses are offset from each other in the cross-section transverse to the axis of the optical system. The encryption system is modeled with Fresnel diffraction and phase modulation to calculate the information transmission. An iterative process with the transform unit is utilized to hide the secret image. The structural parameters of the battery of lenses can be used as additional keys. The performance of the encryption method is analyzed theoretically and digitally. The results show that the security of this algorithm is markedly enhanced by the added keys.

  2. Secure detection in quantum key distribution by real-time calibration of receiver

    NASA Astrophysics Data System (ADS)

    Marøy, Øystein; Makarov, Vadim; Skaar, Johannes

    2017-12-01

    The single-photon detection efficiency of the detector unit is crucial for the security of common quantum key distribution protocols like Bennett-Brassard 1984 (BB84). A low value of the efficiency indicates a possible eavesdropping attack that exploits the photon receiver's imperfections. We present a method for estimating the detection efficiency, and calculate the corresponding secure key generation rate. The estimation is done by testing gated detectors using a randomly activated photon source inside the receiver unit. This estimate gives a secure rate for any detector with non-unity single-photon detection efficiency, whether inherent or due to blinding. By adding extra optical components to the receiver, we make sure that the key is extracted from photon states for which our estimate is valid. The result is a quantum key distribution scheme that is secure against any attack that exploits detector imperfections.

  3. Key on demand (KoD) for software-defined optical networks secured by quantum key distribution (QKD).

    PubMed

    Cao, Yuan; Zhao, Yongli; Colman-Meixner, Carlos; Yu, Xiaosong; Zhang, Jie

    2017-10-30

    Software-defined optical networking (SDON) will become the next-generation optical network architecture; however, the optical layer and control layer of SDON are vulnerable to cyberattacks. While data encryption is an effective method to minimize the negative effects of cyberattacks, secure key exchange is its major challenge, which can be addressed by the quantum key distribution (QKD) technique. Hence, in this paper we discuss the integration of QKD with WDM optical networks to secure the SDON architecture by introducing a novel key-on-demand (KoD) scheme enabled by a novel routing, wavelength and key assignment (RWKA) algorithm. The QKD-over-SDON-with-KoD model follows two steps to provide security: i) quantum key pools (QKPs) are constructed to secure the control channels (CChs) and data channels (DChs); ii) the KoD scheme uses the RWKA algorithm to allocate and update secret keys for different security requirements. To test our model, we define a security probability index that measures the security gain in CChs and DChs. Simulation results indicate that the security performance of CChs and DChs can be enhanced by provisioning sufficient secret keys in QKPs and performing key updating with potential cyberattacks in mind. KoD also helps achieve a positive balance between security requirements and key resource usage.

  4. Studies on unsaturated flow in dual-scale fiber fabrics

    NASA Astrophysics Data System (ADS)

    Yan, Fei; Yan, Shilin; Li, Yongjing

    2018-03-01

    Fiber fabrics in liquid composite molding (LCM) can be regarded as a dual-scale structure. With the development of sink theory, this unsaturated flow behavior has been simulated successfully; however, most simulated results have been based on a unit cell under idealized conditions, so they do not agree with experiments. In this study, an experimental method to establish the sink function is proposed. Simulation results obtained with this sink function show high accuracy against the experimental data. Subsequently, the key factors influencing unsaturated flow were further investigated; the results show that the filling time for unsaturated flow is much longer than that for saturated flow, and that the injection pressure and permeability are the key factors leading to unsaturated flow.

  5. An unsupervised method for summarizing egocentric sport videos

    NASA Astrophysics Data System (ADS)

    Habibi Aghdam, Hamed; Jahani Heravi, Elnaz; Puig, Domenec

    2015-12-01

    People are increasingly interested in recording their sport activities using head-worn or hand-held cameras. This type of video, called egocentric sport video, has different motion and appearance patterns compared with life-logging videos. While a life-logging video can be described in terms of well-defined human-object interactions, it is not trivial to describe egocentric sport videos using well-defined activities. For this reason, summarizing egocentric sport videos based on human-object interaction might fail to produce meaningful results. In this paper, we propose an unsupervised method for summarizing egocentric videos by identifying the key-frames of the video. Our method utilizes both appearance and motion information, and it automatically finds the number of key-frames. Our blind user study on a new dataset collected from YouTube shows that in 93.5% of cases, users choose the proposed method as their first video summary choice. In addition, our method is within the top 2 choices of the users in 99% of the studies.

  6. Identification of key residues for protein conformational transition using elastic network model.

    PubMed

    Su, Ji Guo; Xu, Xian Jin; Li, Chun Hua; Chen, Wei Zu; Wang, Cun Xin

    2011-11-07

    Proteins usually undergo conformational transitions between structurally disparate states to fulfill their functions. The large-scale allosteric conformational transitions are believed to involve some key residues that mediate the conformational movements between different regions of the protein. In the present work, a thermodynamic method based on the elastic network model is proposed to predict the key residues involved in protein conformational transitions. In our method, the key functional sites are identified as the residues whose perturbations largely influence the free energy difference between the protein states before and after transition. Two proteins, nucleotide binding domain of the heat shock protein 70 and human/rat DNA polymerase β, are used as case studies to identify the critical residues responsible for their open-closed conformational transitions. The results show that the functionally important residues mainly locate at the following regions for these two proteins: (1) the bridging point at the interface between the subdomains that control the opening and closure of the binding cleft; (2) the hinge region between different subdomains, which mediates the cooperative motions between the corresponding subdomains; and (3) the substrate binding sites. The similarity in the positions of the key residues for these two proteins may indicate a common mechanism in their conformational transitions.

  7. Comparison of Several Methods for Determining the Internal Resistance of Lithium Ion Cells

    PubMed Central

    Schweiger, Hans-Georg; Obeidi, Ossama; Komesker, Oliver; Raschke, André; Schiemann, Michael; Zehner, Christian; Gehnen, Markus; Keller, Michael; Birke, Peter

    2010-01-01

    The internal resistance is the key parameter for determining power, energy efficiency and lost heat of a lithium ion cell. Precise knowledge of this value is vital for designing battery systems for automotive applications. Internal resistance of a cell was determined by current step methods, AC (alternating current) methods, electrochemical impedance spectroscopy and thermal loss methods. The outcomes of these measurements have been compared with each other. If charge or discharge of the cell is limited, current step methods provide the same results as energy loss methods. PMID:22219678

  8. Spatial characterization of the meltwater field from icebergs in the Weddell Sea.

    PubMed

    Helly, John J; Kaufmann, Ronald S; Vernet, Maria; Stephenson, Gordon R

    2011-04-05

    We describe the results from a spatial cyberinfrastructure developed to characterize the meltwater field around individual icebergs and integrate the results with regional- and global-scale data. During the course of the cyberinfrastructure development, it became clear that we were also building an integrated sampling planning capability across multidisciplinary teams that provided greater agility in allocating expedition resources, resulting in new scientific insights. The cyberinfrastructure-enabled method is a complement to the conventional methods of hydrographic sampling in which the ship provides a static platform on a station-by-station basis. We adapted a sea-floor mapping method to more rapidly characterize the sea surface geophysically and biologically. By jointly analyzing the multisource, continuously sampled biological, chemical, and physical parameters, using Global Positioning System time as the data fusion key, this surface-mapping method enables us to examine the relationship between the meltwater field of the iceberg and the larger-scale marine ecosystem of the Southern Ocean. Through geospatial data fusion, we are able to combine very fine-scale maps of dynamic processes with more synoptic but lower-resolution data from satellite systems. Our results illustrate the importance of spatial cyberinfrastructure in the overall scientific enterprise and identify key interfaces and sources of error that require improved controls for the development of future Earth observing systems as we move into an era of peta- and exascale, data-intensive computing.

  9. Spatial characterization of the meltwater field from icebergs in the Weddell Sea

    PubMed Central

    Helly, John J.; Kaufmann, Ronald S.; Vernet, Maria; Stephenson, Gordon R.

    2011-01-01

    We describe the results from a spatial cyberinfrastructure developed to characterize the meltwater field around individual icebergs and integrate the results with regional- and global-scale data. During the course of the cyberinfrastructure development, it became clear that we were also building an integrated sampling planning capability across multidisciplinary teams that provided greater agility in allocating expedition resources, resulting in new scientific insights. The cyberinfrastructure-enabled method is a complement to the conventional methods of hydrographic sampling in which the ship provides a static platform on a station-by-station basis. We adapted a sea-floor mapping method to more rapidly characterize the sea surface geophysically and biologically. By jointly analyzing the multisource, continuously sampled biological, chemical, and physical parameters, using Global Positioning System time as the data fusion key, this surface-mapping method enables us to examine the relationship between the meltwater field of the iceberg and the larger-scale marine ecosystem of the Southern Ocean. Through geospatial data fusion, we are able to combine very fine-scale maps of dynamic processes with more synoptic but lower-resolution data from satellite systems. Our results illustrate the importance of spatial cyberinfrastructure in the overall scientific enterprise and identify key interfaces and sources of error that require improved controls for the development of future Earth observing systems as we move into an era of peta- and exascale, data-intensive computing. PMID:21444769

  10. A Hybrid Key Management Scheme for WSNs Based on PPBR and a Tree-Based Path Key Establishment Method

    PubMed Central

    Zhang, Ying; Liang, Jixing; Zheng, Bingxin; Chen, Wei

    2016-01-01

    With the development of wireless sensor networks (WSNs), in most application scenarios traditional WSNs with static sink nodes will be gradually replaced by Mobile Sinks (MSs), and the corresponding applications require a secure communication environment. Current key management research pays less attention to the security of sensor networks with MSs. This paper proposes a hybrid key management scheme based on Polynomial Pool-based and Basic Random key pre-distribution (PPBR) to be used in WSNs with MSs. The scheme takes full advantage of these two methods to increase the difficulty of cracking the key system. The storage effectiveness and the network resilience can be significantly enhanced as well. The tree-based path key establishment method is introduced to effectively solve the problem of communication link connectivity. Simulation clearly shows that the proposed scheme performs better in terms of network resilience, connectivity and storage effectiveness compared to other widely used schemes. PMID:27070624

  11. Security analysis and improvements to the PsychoPass method.

    PubMed

    Brumen, Bostjan; Heričko, Marjan; Rozman, Ivan; Hölbl, Marko

    2013-08-13

    In a recent paper, Pietro Cipresso et al. proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Our aim was to perform a security analysis on the PsychoPass method and outline the limitations of and possible improvements to the method. We used brute-force and dictionary-attack analyses of the PsychoPass method to outline its weaknesses. The first issue with the PsychoPass method is that it requires the password to be reproduced on the same keyboard layout that was used to generate it. The second issue is a security weakness: although the produced password is 24 characters long, it is still weak. We elaborate on this weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 key distances apart. The improved PsychoPass method yields passwords that, at current computing power, could be broken only in hundreds of years. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength.
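The strength comparison in this record rests on keyspace arithmetic: entropy is length × log2(alphabet), and worst-case exhaustive search time is keyspace divided by guess rate. The sketch below shows that arithmetic; the alphabet sizes and guess rate are illustrative assumptions, not figures from the paper.

```python
import math

def brute_force_years(alphabet_size, length, guesses_per_second=1e12):
    """Worst-case exhaustive-search time for a random password, in years."""
    keyspace = alphabet_size ** length
    seconds = keyspace / guesses_per_second
    return seconds / (365.25 * 24 * 3600)

# Hypothetical comparison: a long password drawn from a small effective
# alphabet vs. a short one drawn from the full printable set.
for alphabet, length in ((26, 24), (94, 10)):
    bits = length * math.log2(alphabet)
    print(f"alphabet={alphabet:3d} length={length:2d} "
          f"entropy={bits:5.1f} bits  years={brute_force_years(alphabet, length):.2e}")
```

Real password-strength analysis also has to model the adjacency constraints of the generation method, which shrink the effective keyspace well below these upper bounds.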

  12. Limited privacy protection and poor sensitivity: Is it time to move on from the statistical linkage key-581?

    PubMed

    Randall, Sean M; Ferrante, Anna M; Boyd, James H; Brown, Adrian P; Semmens, James B

    2016-08-01

    The statistical linkage key (SLK-581) is a common tool for record linkage in Australia, due to its ability to provide some privacy protection. However, newer privacy-preserving approaches may provide greater privacy protection while allowing high-quality linkage. We evaluated the standard SLK-581, the encrypted SLK-581 and a newer privacy-preserving approach using Bloom filters, in terms of both privacy and linkage quality. Linkage quality was compared by conducting linkages on Australian health datasets using these three techniques and examining the results. Privacy was compared qualitatively in relation to a series of scenarios where privacy breaches may occur. The Bloom filter technique offered greater privacy protection and linkage quality than the SLK-based method commonly used in Australia. The adoption of new privacy-preserving methods would allow greater confidence in research results while significantly improving privacy protection. © The Author(s) 2016.
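The Bloom-filter linkage technique referred to in this record is commonly sketched as follows: each identifier is decomposed into bigrams, the bigrams are hashed into a fixed-length bit set, and records are compared on their encodings with a Dice coefficient. The parameters (m, k, the double-hashing scheme) below are illustrative assumptions, not the study's configuration.

```python
import hashlib

def bigrams(s):
    s = f"_{s.lower()}_"                       # pad so edge characters contribute
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bloom_encode(s, m=1000, k=30):
    """Hash each bigram k times into an m-bit Bloom filter (double hashing)."""
    bits = set()
    for g in bigrams(s):
        h1 = int(hashlib.sha1(g.encode()).hexdigest(), 16)
        h2 = int(hashlib.md5(g.encode()).hexdigest(), 16)
        for i in range(k):
            bits.add((h1 + i * h2) % m)
    return bits

def dice(a, b):
    """Dice similarity of two bit sets: 2|A & B| / (|A| + |B|)."""
    return 2 * len(a & b) / (len(a) + len(b))

# Similar names keep a high score even though only encodings are compared:
print(dice(bloom_encode("catherine"), bloom_encode("katherine")))
print(dice(bloom_encode("catherine"), bloom_encode("zhang")))
```

Because only bit patterns leave the data custodian, approximate matching survives while the raw identifiers do not, which is the privacy advantage over a fixed SLK.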

  13. Deep learning based beat event detection in action movie franchises

    NASA Astrophysics Data System (ADS)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage the massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground-truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat-events on this dataset. A training dataset for each of the eleven beat categories is developed and a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels of the key frames in a particular shot are then used to assign a unique label to that shot. A simple sliding-window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented in terms of precision, recall, and F-measure, and compared with the existing technique, recording significant improvements.
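The per-shot labeling and grouping steps described in this record can be sketched simply: a majority vote turns key-frame labels into a shot label, and runs of adjacent shots sharing a label merge into one beat event. This is a simplified stand-in for the paper's sliding-window grouping, with made-up labels.

```python
from collections import Counter
from itertools import groupby

def shot_label(frame_labels):
    """Majority vote over the per-key-frame labels of one shot."""
    return Counter(frame_labels).most_common(1)[0][0]

def group_beat_events(shot_labels):
    """Merge runs of adjacent shots sharing a label into (label, start, end)."""
    events, idx = [], 0
    for label, run in groupby(shot_labels):
        n = len(list(run))
        events.append((label, idx, idx + n - 1))   # inclusive shot indices
        idx += n
    return events

shots = ["fight", "fight", "chase", "chase", "chase", "dialog", "fight"]
print(group_beat_events(shots))
# [('fight', 0, 1), ('chase', 2, 4), ('dialog', 5, 5), ('fight', 6, 6)]
```

In the paper the per-frame labels come from the trained CNN; here they are supplied directly.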

  14. Using movies in family medicine teaching: A reference to EURACT Educational Agenda

    PubMed Central

    Švab, Igor

    2017-01-01

    Abstract Introduction Cinemeducation is a teaching method where popular movies or movie clips are used. We aimed to determine whether family physicians' competencies as listed in the Educational Agenda produced by the European Academy of Teachers in General Practice/Family Medicine (EURACT) can be found in movies, and to propose a template for teaching with these movies. Methods A group of family medicine teachers provided a list of movies that they would use in cinemeducation. The movies were categorised according to the key family medicine competencies, thus creating a framework of competences covered by different movies. These key competencies are Primary care management, Person-centred care, Specific problem-solving skills, Comprehensive approach, Community orientation, and Holistic approach. Results The list consisted of 17 movies. Nine covered primary care management. Person-centred care was covered in 13 movies. Eight movies covered specific problem-solving skills. Comprehensive approach was covered in five movies. Five movies covered community orientation. Holistic approach was covered in five movies. Conclusions All key family medicine competencies listed in the Educational Agenda can be taught using movies. Our results can serve as a template for teachers on how to use any appropriate movies in family medicine education. PMID:28289469

  15. High pressure die casting of Fe-based metallic glass.

    PubMed

    Ramasamy, Parthiban; Szabo, Attila; Borzel, Stefan; Eckert, Jürgen; Stoica, Mihai; Bárdos, András

    2016-10-11

    Soft ferromagnetic Fe-based bulk metallic glass key-shaped specimens with a maximum and minimum width of 25.4 and 5 mm, respectively, were successfully produced using a high pressure die casting (HPDC) method. The influence of die material, alloy temperature and flow rate on the microstructure, thermal stability and soft ferromagnetic properties has been studied. The results suggest that a steel die in which the molten metal flows at low rate and high temperature can be used to produce completely glassy samples. This can be attributed to the laminar filling of the mold and to a lower heat transfer coefficient, which avoids the skin effect in the steel mold. In addition, magnetic measurements reveal that the amorphous structure of the material is maintained throughout the key-shaped samples. Although it is difficult to control the flow and cooling rate of the molten metal in the corners of the key due to different cross sections, this can be overcome by proper tool geometry. The present results confirm that HPDC is a suitable method for the casting of Fe-based bulk glassy alloys even with complex geometries for a broad range of applications.

  16. High pressure die casting of Fe-based metallic glass

    NASA Astrophysics Data System (ADS)

    Ramasamy, Parthiban; Szabo, Attila; Borzel, Stefan; Eckert, Jürgen; Stoica, Mihai; Bárdos, András

    2016-10-01

    Soft ferromagnetic Fe-based bulk metallic glass key-shaped specimens with a maximum and minimum width of 25.4 and 5 mm, respectively, were successfully produced using a high pressure die casting (HPDC) method. The influence of die material, alloy temperature and flow rate on the microstructure, thermal stability and soft ferromagnetic properties has been studied. The results suggest that a steel die in which the molten metal flows at low rate and high temperature can be used to produce completely glassy samples. This can be attributed to the laminar filling of the mold and to a lower heat transfer coefficient, which avoids the skin effect in the steel mold. In addition, magnetic measurements reveal that the amorphous structure of the material is maintained throughout the key-shaped samples. Although it is difficult to control the flow and cooling rate of the molten metal in the corners of the key due to different cross sections, this can be overcome by proper tool geometry. The present results confirm that HPDC is a suitable method for the casting of Fe-based bulk glassy alloys even with complex geometries for a broad range of applications.

  17. High pressure die casting of Fe-based metallic glass

    PubMed Central

    Ramasamy, Parthiban; Szabo, Attila; Borzel, Stefan; Eckert, Jürgen; Stoica, Mihai; Bárdos, András

    2016-01-01

    Soft ferromagnetic Fe-based bulk metallic glass key-shaped specimens with a maximum and minimum width of 25.4 and 5 mm, respectively, were successfully produced using a high pressure die casting (HPDC) method. The influence of die material, alloy temperature and flow rate on the microstructure, thermal stability and soft ferromagnetic properties has been studied. The results suggest that a steel die in which the molten metal flows at low rate and high temperature can be used to produce completely glassy samples. This can be attributed to the laminar filling of the mold and to a lower heat transfer coefficient, which avoids the skin effect in the steel mold. In addition, magnetic measurements reveal that the amorphous structure of the material is maintained throughout the key-shaped samples. Although it is difficult to control the flow and cooling rate of the molten metal in the corners of the key due to different cross sections, this can be overcome by proper tool geometry. The present results confirm that HPDC is a suitable method for the casting of Fe-based bulk glassy alloys even with complex geometries for a broad range of applications. PMID:27725780

  18. Round-Robin approach to data flow optimization

    NASA Technical Reports Server (NTRS)

    Witt, J.

    1978-01-01

    A large data base, circular in structure, was required (for the Voyager Mission to Jupiter/Saturn) with the capability to completely update the data every four hours during high activity periods. The data is stored in key-ordered format for retrieval but is not input in key order. Existing access methods for large data bases with rapid data replacement by keys become inefficient as the volume of data being replaced grows. The Round-Robin method was developed to alleviate this problem. The Round-Robin access method allows rapid updating of the data with continuous self-cleaning, where the oldest data (by key) is deleted and the newest data (by key) is kept regardless of the order of input.
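A minimal sketch of the Round-Robin idea described above, under the assumption that "oldest" means smallest key: inserts may arrive in any order, the store stays key-ordered, and once capacity is exceeded the oldest key is deleted. The class and its details are illustrative, not the original Voyager implementation.

```python
import bisect

class RoundRobinStore:
    """Fixed-capacity, key-ordered store: out-of-order inserts are kept
    sorted by key, and once full the smallest (oldest) key is evicted."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.keys, self.values = [], []

    def insert(self, key, value):
        i = bisect.bisect_left(self.keys, key)
        self.keys.insert(i, key)
        self.values.insert(i, value)
        if len(self.keys) > self.capacity:   # self-cleaning: drop oldest key
            self.keys.pop(0)
            self.values.pop(0)

    def items(self):
        return list(zip(self.keys, self.values))

rr = RoundRobinStore(capacity=3)
for k in (5, 1, 9, 3, 7):            # keys arrive out of key order
    rr.insert(k, f"rec{k}")
print(rr.items())                    # [(5, 'rec5'), (7, 'rec7'), (9, 'rec9')]
```

The newest three keys survive in key order regardless of arrival order, which is the "continuous self-cleaning" behavior the record describes.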

  19. Securing non-volatile memory regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faraboschi, Paolo; Ranganathan, Parthasarathy; Muralimanohar, Naveen

    Methods, apparatus and articles of manufacture to secure non-volatile memory regions are disclosed. An example method disclosed herein comprises associating a first key pair and a second key pair different than the first key pair with a process, using the first key pair to secure a first region of a non-volatile memory for the process, and using the second key pair to secure a second region of the non-volatile memory for the same process, the second region being different than the first region.

  20. Known plaintext attack on double random phase encoding using fingerprint as key and a method for avoiding the attack.

    PubMed

    Tashima, Hideaki; Takeda, Masafumi; Suzuki, Hiroyuki; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2010-06-21

    We have shown that the application of double random phase encoding (DRPE) to biometrics enables the use of biometrics as cipher keys for binary data encryption. However, DRPE is reported to be vulnerable to known-plaintext attacks (KPAs) using a phase recovery algorithm. In this study, we investigated the vulnerability to KPAs of DRPE using fingerprints as cipher keys. By means of computational experiments, we estimated the encryption key and restored the fingerprint image using the estimated key. Further, we propose a method for avoiding KPAs that employ the phase retrieval algorithm. The proposed method makes the amplitude component of the encrypted image constant in order to prevent the amplitude component from being used as a clue for phase retrieval. Computational experiments showed that the proposed method not only avoids revealing the cipher key and the fingerprint but also serves as a sufficiently accurate verification system.

  1. An efficient computational method for the approximate solution of nonlinear Lane-Emden type equations arising in astrophysics

    NASA Astrophysics Data System (ADS)

    Singh, Harendra

    2018-04-01

    The key purpose of this article is to introduce an efficient computational method for the approximate solution of homogeneous as well as non-homogeneous nonlinear Lane-Emden type equations. Using the proposed computational method, the given nonlinear equation is converted into a set of nonlinear algebraic equations whose solution gives the approximate solution of the Lane-Emden type equation. Various nonlinear cases of Lane-Emden type equations, such as the standard Lane-Emden equation, the isothermal gas sphere equation and the white-dwarf equation, are discussed. Results are compared with some well-known numerical methods, and it is observed that our results are more accurate.
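For context, the standard Lane-Emden equation is θ'' + (2/ξ)θ' + θ^n = 0 with θ(0) = 1, θ'(0) = 0, and it can be integrated directly once the ξ = 0 singularity is handled. The sketch below is plain RK4 (not the article's method): it starts one step away from the origin via the series θ ≈ 1 − ξ²/6 and checks itself against the exact n = 1 solution sin(ξ)/ξ.

```python
import math

def lane_emden(n, xi_max=5.0, h=1e-3):
    """Integrate theta'' + (2/xi)*theta' + theta**n = 0 with classic RK4,
    starting one step from the xi = 0 singularity via the series expansion."""
    def f(xi, th, dth):
        # clamp theta at 0 so fractional n stays real past the first zero
        return dth, -2.0 * dth / xi - max(th, 0.0) ** n
    xi = h
    th = 1.0 - h * h / 6.0    # theta ~ 1 - xi^2/6 near the origin
    dth = -h / 3.0            # theta' ~ -xi/3
    xs, ths = [xi], [th]
    while xi < xi_max:
        k1 = f(xi, th, dth)
        k2 = f(xi + h / 2, th + h / 2 * k1[0], dth + h / 2 * k1[1])
        k3 = f(xi + h / 2, th + h / 2 * k2[0], dth + h / 2 * k2[1])
        k4 = f(xi + h, th + h * k3[0], dth + h * k3[1])
        th += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        dth += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        xi += h
        xs.append(xi)
        ths.append(th)
    return xs, ths

# n = 1 has the exact solution sin(xi)/xi -- a convenient accuracy check:
xs, ths = lane_emden(n=1, xi_max=3.0)
print(abs(ths[-1] - math.sin(xs[-1]) / xs[-1]))   # error should be tiny
```

The article's operational-matrix style method instead trades this step-by-step marching for one set of nonlinear algebraic equations.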

  2. A Feminine Care Clinical Research Program Transforms Women's Lives.

    PubMed

    Tzeghai, Ghebre E; Ajayi, Funmilayo O; Miller, Kenneth W; Imbescheid, Frank; Sobel, Jack D; Farage, Miranda A

    2012-12-17

    Feminine hygiene products and menstruation education have transformed the lives of women throughout the world. The P&G Feminine Care Clinical Innovation Research Program has played a key role by expanding scientific knowledge as well as developing technical insights and tools for the development of feminine hygiene products. The aim has been to meet the needs of women throughout their life stages, advancing their urogenital health beyond just menstruation, as well as helping to understand the role of sex hormones in various important health issues that women face. This review article highlights key contributions and research findings in female hygiene products, urogenital health research, and method development. The clinical research team focused on utilizing the results of clinical safety studies to advance the acceptance of feminine hygiene products worldwide. Key findings include that perception of skin sensitivity is not limited to the facial area, but is also relevant to the body and the genital area. Also, they shed light on the role of estrogen in autoimmune diseases as well as premenstrual syndrome. Efforts in the method development area focused on innovative tools that are reliable, predictive of clinical trial results and capable of measuring wear comfort, genital skin health, and the impact of product use on the consumer's quality of life. A novel method, the behind-the-knee (BTK) test, developed to model irritation under normal wear conditions, was the first to account for both chemical and mechanical sources of irritation. The method has been accepted by the FDA as a substitute in clinical trials in some cases, and by the American Society for Testing and Materials as a global standard test method. Additional proprietary methods were developed to enhance visual grading of irritation using cross-polarized light, to measure the amount of lotion transferred from sanitary pads, and to evaluate skin mildness. Finally, the Farage Quality of Life tool was created to measure consumer well-being. Based on the results of this extensive clinical research and the newly developed testing methods, the changing needs of women throughout their life stages are better met.

  3. A Feminine Care Clinical Research Program Transforms Women’s Lives

    PubMed Central

    Tzeghai, Ghebre E.; Ajayi, Funmilayo O.; Miller, Kenneth W.; Imbescheid, Frank; Sobel, Jack D.; Farage, Miranda A.

    2015-01-01

    Feminine hygiene products and menstruation education have transformed the lives of women throughout the world. The P&G Feminine Care Clinical Innovation Research Program has played a key role by expanding scientific knowledge as well as developing technical insights and tools for the development of feminine hygiene products. The aim has been to meet the needs of women throughout their life stages, advancing their urogenital health beyond just menstruation, as well as helping to understand the role of sex hormones in various important health issues that women face. This review article highlights key contributions and research findings in female hygiene products, urogenital health research, and method development. The clinical research team focused on utilizing the results of clinical safety studies to advance the acceptance of feminine hygiene products worldwide. Key findings include that perception of skin sensitivity is not limited to the facial area, but is also relevant to the body and the genital area. Also, they shed light on the role of estrogen in autoimmune diseases as well as premenstrual syndrome. Efforts in the method development area focused on innovative tools that are reliable, predictive of clinical trial results and capable of measuring wear comfort, genital skin health, and the impact of product use on the consumer's quality of life. A novel method, the behind-the-knee (BTK) test, developed to model irritation under normal wear conditions, was the first to account for both chemical and mechanical sources of irritation. The method has been accepted by the FDA as a substitute in clinical trials in some cases, and by the American Society for Testing and Materials as a global standard test method. Additional proprietary methods were developed to enhance visual grading of irritation using cross-polarized light, to measure the amount of lotion transferred from sanitary pads, and to evaluate skin mildness. Finally, the Farage Quality of Life tool was created to measure consumer well-being. Based on the results of this extensive clinical research and the newly developed testing methods, the changing needs of women throughout their life stages are better met. PMID:25946910

  4. Differentiation and Exploration of Model MACP for HE VER 1.0 on Prototype Performance Measurement Application for Higher Education

    NASA Astrophysics Data System (ADS)

    El Akbar, R. Reza; Anshary, Muhammad Adi Khairul; Hariadi, Dennis

    2018-02-01

    Model MACP for HE ver. 1.0 is a model that describes how to measure and monitor performance in Higher Education. Based on a review of the research related to the model, several components of the model merit development in further research, so this research has four main objectives. The first objective is to differentiate the CSF (critical success factor) components of the previous model; the second is to explore the KPIs (key performance indicators) of the previous model; the third, building on the previous objectives, is to design a new and more detailed model. The fourth and final objective is to design a prototype application for performance measurement in higher education, based on the new model. The methods used are exploratory research and application design using the prototype method. The results of this study are, first, a more detailed new model for measurement and monitoring of performance in higher education, obtained by differentiation and exploration of the Model MACP for HE Ver.1; second, a dictionary of college performance measurement compiled by re-evaluating the existing indicators; and third, the design of a prototype application for performance measurement in higher education.

  5. A novel method for inverse fiber Bragg grating structure design

    NASA Astrophysics Data System (ADS)

    Yin, Yu-zhe; Chen, Xiang-fei; Dai, Yi-tang; Xie, Shi-zhong

    2003-12-01

    A novel grating inverse design method is proposed in this paper, which is direct in physical meaning and easy to accomplish. The key point of the method is to design and implement the desired spectral response in the grating-strength modulation domain rather than in the grating-period chirp domain. Simulated results are in good agreement with the design target. By transforming grating period chirp into grating strength modulation, a novel grating with opposite dispersion characteristics is proposed.

  6. Digital games in medical education: Key terms, concepts, and definitions

    PubMed Central

    Bigdeli, Shoaleh; Kaufman, David

    2017-01-01

    Introduction: Game-based education is fast becoming a key instrument in medical education. Method: In this study, papers related to games were filtered and limited to full-text peer-reviewed published in English. Results: To the best of researchers’ knowledge, the concepts used in the literature are varied and distinct, and the literature is not conclusive on the definition of educational games for medical education. Conclusion: This paper attempts to classify terms, concepts and definitions common to gamification in medical education. PMID:29445681

  7. Digital games in medical education: Key terms, concepts, and definitions.

    PubMed

    Bigdeli, Shoaleh; Kaufman, David

    2017-01-01

    Introduction: Game-based education is fast becoming a key instrument in medical education. Method: In this study, papers related to games were filtered and limited to full-text peer-reviewed published in English. Results: To the best of researchers' knowledge, the concepts used in the literature are varied and distinct, and the literature is not conclusive on the definition of educational games for medical education. Conclusion: This paper attempts to classify terms, concepts and definitions common to gamification in medical education.

  8. Protecting Cryptographic Keys and Functions from Malware Attacks

    DTIC Science & Technology

    2010-12-01

    registers. ...modifies RSA private key signing in OpenSSL to use the technique. The resulting system has the following features: 1. No special hardware is... the above method based on OpenSSL, by exploiting the Streaming SIMD Extension (SSE) XMM registers of modern Intel and AMD x86-compatible CPUs [22]... one can store a 2048-bit exponent. Our prototype is based on OpenSSL 0.9.8e, the Ubuntu 6.06 Linux distribution with a 2.6.15 kernel, and SSE2 which...

  9. DNA based random key generation and management for OTP encryption.

    PubMed

    Zhang, Yunpeng; Liu, Xin; Sun, Manhui

    2017-09-01

    One-time pad (OTP) is a principle of key generation applied to the stream ciphering method which offers total privacy. The OTP encryption scheme has proved to be unbreakable in theory, but difficult to realize in practical applications. Because OTP encryption specifically requires the absolute randomness of the key, its development has suffered from dense constraints. DNA cryptography is a new and promising technology in the field of information security. The storage capabilities of DNA chromosomes can be used as one-time pad structures with pseudo-random number generation and indexing in order to encrypt plaintext messages. In this paper, we present a feasible solution to the OTP symmetric key generation and transmission problem with DNA at the molecular level. Through recombinant DNA technology, by using restriction enzymes known only to the sender and receiver to combine the secure key, represented by a DNA sequence, with the T vector, we generate the DNA bio-hiding secure key and then place the recombinant plasmid in implanted bacteria for secure key transmission. The designed bio experiments and simulation results show that the security of key transmission is further improved and the environmental requirements of key transmission are reduced. Analysis has demonstrated that the proposed DNA-based random key generation and management solutions are marked by high security and usability. Published by Elsevier B.V.
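The OTP principle underlying this record is one line of arithmetic: XOR the message with a key that is truly random, at least as long as the message, and never reused. The sketch below substitutes `secrets.token_bytes` for the DNA-carried key; it illustrates only the cipher, not the paper's biological key transport.

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR one-time pad: information-theoretically secure if and only if the
    key is random, at least message-length, and used exactly once."""
    if len(key) < len(plaintext):
        raise ValueError("OTP key must be at least as long as the message")
    return bytes(p ^ k for p, k in zip(plaintext, key))

otp_decrypt = otp_encrypt   # XOR is its own inverse

msg = b"transfer the plasmid"
key = secrets.token_bytes(len(msg))   # stand-in for the DNA-carried random key
ct = otp_encrypt(msg, key)
print(otp_decrypt(ct, key) == msg)    # True
```

Everything hard about OTP in practice is key generation and distribution, which is exactly the part the paper moves to the molecular level.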

  10. [Studies Using Text Mining on the Differences in Learning Effects between the KJ and World Café Method as Learning Strategies].

    PubMed

    Yasuhara, Tomohisa; Sone, Tomomichi; Konishi, Motomi; Kushihata, Taro; Nishikawa, Tomoe; Yamamoto, Yumi; Kurio, Wasako; Kohno, Takeyuki

    2015-01-01

    The KJ method (named for developer Jiro Kawakita; also known as affinity diagramming) is widely used in participatory learning as a means to collect and organize information. In addition, the World Café (WC) has recently become popular. However, differences in the information obtained using each method have not been studied comprehensively. To determine the appropriate information selection criteria, we analyzed differences in the information generated by the WC and KJ methods. Two groups engaged in sessions to collect and organize information using either the WC or KJ method, and small group discussions were held to create "proposals to improve first-year education". Both groups answered two pre- and post-session questionnaires that asked for free descriptions. Key words were extracted from the results of the two questionnaires and categorized using text mining. In the responses to questionnaire 1, which was directly related to the session theme, a significant increase in the number of key words was observed in the WC group (p=0.0050, Fisher's exact test). However, there was no significant increase in the number of key words in the responses to questionnaire 2, which was not directly related to the session theme (p=0.8347, Fisher's exact test). In the KJ method, participants extracted the most notable issues and progressed to a detailed discussion, whereas in the WC method, various information and problems were spread among the participants. The choice between the WC and KJ method should be made to reflect the educational objective and desired direction of discussion.

  11. Standard plane localization in ultrasound by radial component model and selective search.

    PubMed

    Ni, Dong; Yang, Xin; Chen, Xin; Chin, Chien-Ting; Chen, Siping; Heng, Pheng Ann; Li, Shengli; Qin, Jing; Wang, Tianfu

    2014-11-01

    Acquisition of the standard plane is crucial for medical ultrasound diagnosis. However, this process requires substantial experience and a thorough knowledge of human anatomy. Therefore it is very challenging for novices and even time consuming for experienced examiners. We proposed a hierarchical, supervised learning framework for automatically detecting the standard plane from consecutive 2-D ultrasound images. We tested this technique by developing a system that localizes the fetal abdominal standard plane from ultrasound video by detecting three key anatomical structures: the stomach bubble, umbilical vein and spine. We first proposed a novel radial component-based model to describe the geometric constraints of these key anatomical structures. We then introduced a novel selective search method which exploits the vessel probability algorithm to produce probable locations for the spine and umbilical vein. Next, using component classifiers trained by random forests, we detected the key anatomical structures at their probable locations within the regions constrained by the radial component-based model. Finally, a second-level classifier combined the results from the component detection to identify an ultrasound image as either a "fetal abdominal standard plane" or a "non-fetal abdominal standard plane." Experimental results on 223 fetal abdomen videos showed that the detection accuracy of our method was as high as 85.6% and significantly outperformed both the full abdomen and the separate anatomy detection methods without geometric constraints. The experimental results demonstrated that our system shows great promise for application to clinical practice. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  12. Eight-step method to build the clinical content of an evidence-based care pathway: the case for COPD exacerbation

    PubMed Central

    2012-01-01

    Background Optimization of the clinical care process by integration of evidence-based knowledge is one of the active components in care pathways. When studying the impact of a care pathway by using a cluster-randomized design, standardization of the care pathway intervention is crucial. This methodology paper describes the development of the clinical content of an evidence-based care pathway for in-hospital management of chronic obstructive pulmonary disease (COPD) exacerbation in the context of a cluster-randomized controlled trial (cRCT) on care pathway effectiveness. Methods The clinical content of a care pathway for COPD exacerbation was developed based on recognized process design and guideline development methods. Subsequently, based on the COPD case study, a generalized eight-step method was designed to support the development of the clinical content of an evidence-based care pathway. Results A set of 38 evidence-based key interventions and a set of 24 process and 15 outcome indicators were developed in eight different steps. Nine Belgian multidisciplinary teams piloted both the set of key interventions and indicators. The key intervention set was judged by the teams as being valid and clinically applicable. In addition, the pilot study showed that the indicators were feasible for the involved clinicians and patients. Conclusions The set of 38 key interventions and the set of process and outcome indicators were found to be appropriate for the development and standardization of the clinical content of the COPD care pathway in the context of a cRCT on pathway effectiveness. The developed eight-step method may facilitate multidisciplinary teams caring for other patient populations in designing the clinical content of their future care pathways. PMID:23190552

  13. Calculation method for laser radar cross sections of rotationally symmetric targets.

    PubMed

    Cao, Yunhua; Du, Yongzhi; Bai, Lu; Wu, Zhensen; Li, Haiying; Li, Yanhui

    2017-07-01

    The laser radar cross section (LRCS) is a key parameter in the study of target scattering characteristics. In this paper, a practical method for calculating the LRCSs of rotationally symmetric targets is presented. Monostatic LRCSs for four kinds of rotationally symmetric targets (cone, rotating ellipsoid, super ellipsoid, and blunt cone) are calculated, and the results verify the feasibility of the method. Comparison with the results of the triangular patch method confirms the correctness of the proposed method and highlights several of its advantages. For instance, the method does not require geometric modeling or patch discretization. The method uses a generatrix model and a double integral, and its calculation is concise and accurate. This work provides a theoretical analysis for the rapid calculation of the LRCS of common basic targets.
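
    The generatrix reduction described above can be illustrated with a short numerical sketch. It assumes a Lambertian surface viewed along the symmetry axis (the paper's exact scattering model is not reproduced here): the facet-level relation dσ = 4ρ·cos²θ·dA collapses by rotational symmetry to a 1-D integral along the generatrix.

```python
import math

def lrcs_body_of_revolution(r, dr, z0, z1, rho=1.0, n=100000):
    """Monostatic LRCS of a Lambertian body of revolution viewed along its
    symmetry axis. For a Lambertian facet, d_sigma = 4*rho*cos^2(theta)*dA;
    by rotational symmetry this collapses to the 1-D generatrix integral
        sigma = 8*pi*rho * Int r(z) * r'(z)^2 / sqrt(1 + r'(z)^2) dz,
    evaluated here with a simple trapezoid rule."""
    h = (z1 - z0) / n
    total = 0.0
    for i in range(n + 1):
        z = z0 + i * h
        rp = dr(z)
        f = r(z) * rp * rp / math.sqrt(1.0 + rp * rp)
        total += (0.5 if i in (0, n) else 1.0) * f
    return 8.0 * math.pi * rho * total * h

# Sanity check on a sphere of radius R: the illuminated hemisphere gives
# the known closed form sigma = (8/3) * pi * rho * R^2.
R = 1.0
eps = 1e-12  # guard against the square root vanishing at z = R
sigma = lrcs_body_of_revolution(
    r=lambda z: math.sqrt(max(R * R - z * z, eps)),
    dr=lambda z: -z / math.sqrt(max(R * R - z * z, eps)),
    z0=0.0, z1=R - 1e-6)
```

    Reproducing the Lambertian-sphere closed form is a convenient check that the symmetry reduction is implemented consistently.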

  14. HIV Epidemic Appraisals for Assisting in the Design of Effective Prevention Programmes: Shifting the Paradigm Back to Basics

    PubMed Central

    Mishra, Sharmistha; Sgaier, Sema K.; Thompson, Laura H.; Moses, Stephen; Ramesh, B. M.; Alary, Michel; Wilson, David; Blanchard, James F.

    2012-01-01

    Background To design HIV prevention programmes, it is critical to understand the temporal and geographic aspects of the local epidemic and to address the key behaviours that drive HIV transmission. Two methods have been developed to appraise HIV epidemics and guide prevention strategies. The numerical proxy method classifies epidemics based on current HIV prevalence thresholds. The Modes of Transmission (MOT) model estimates the distribution of incidence over one year among risk-groups. Both methods focus on the current state of an epidemic and provide short-term metrics which may not capture the epidemiologic drivers. Through a detailed analysis of country and sub-national data, we explore the limitations of the two traditional methods and propose an alternative approach. Methods and Findings We compared outputs of the traditional methods in five countries for which results were published, and applied the numeric and MOT model to India and six districts within India. We discovered three limitations of the current methods for epidemic appraisal: (1) their results failed to identify the key behaviours that drive the epidemic; (2) they were difficult to apply to local epidemics with heterogeneity across district-level administrative units; and (3) the MOT model was highly sensitive to input parameters, many of which required extraction from non-regional sources. We developed an alternative decision-tree framework for HIV epidemic appraisals, based on a qualitative understanding of epidemiologic drivers, and demonstrated its applicability in India. The alternative framework offered a logical algorithm to characterize epidemics; it required minimal but key data. Conclusions Traditional appraisals that utilize the distribution of prevalent and incident HIV infections in the short-term could misguide prevention priorities and potentially impede efforts to halt the trajectory of the HIV epidemic. 
An approach that characterizes local transmission dynamics provides a potentially more effective tool with which policy makers can design intervention programmes. PMID:22396756

  15. Importance of Heat and Pressure for Solubilization of Recombinant Spider Silk Proteins in Aqueous Solution

    PubMed Central

    Jones, Justin A.; Harris, Thomas I.; Oliveira, Paula F.; Bell, Brianne E.; Alhabib, Abdulrahman; Lewis, Randolph V.

    2016-01-01

    The production of recombinant spider silk proteins continues to be a key area of interest for a number of research groups. Several key obstacles exist in their production as well as in their formulation into usable products. The originally reported method to solubilize recombinant spider silk proteins (rSSp) in an aqueous solution involved using microwaves to quickly generate heat and pressure inside of a sealed vial containing rSSp and water. Fibers produced from this system are remarkable in their mechanical ability and demonstrate the ability to be stretched and recover 100 times. The microwave method dissolves the rSSPs, with dissolution time increasing with higher molecular weight constructs, increasing concentration of rSSPs, protein type, and salt concentration. It has proven successful in solvating a number of different rSSPs including native-like sequences (MaSp1, MaSp2, piriform, and aggregate) as well as chimeric sequences (FlAS) in varied concentrations that have been spun into fibers and formed into films, foams, sponges, gels, coatings, macro- and microspheres and adhesives. The system is effective but inherently unpredictable and difficult to control. Given that the materials that can be generated from this method of dissolution are impressive, an alternative means of applying heat and pressure that is controllable and predictable has been developed. Results indicate that there are combinations of heat and pressure (135 °C and 140 psi) that result in maximal dissolution without degrading the recombinant MaSp2 protein tested, and that heat and pressure are the key elements of the method of dissolution. PMID:27886066

  17. When Outcome Definition Determines the Result in Impact Evaluations: An Illustration Using the Swedish Work-Practice Programme

    ERIC Educational Resources Information Center

    Månsson, Jonas; Lundin, Christofer

    2017-01-01

    In this paper we investigate the effect of differences in outcome definitions on the results of impact evaluations. The Swedish workplace practice programme is evaluated using matching methods. The key finding is that changing how the outcome is defined has a considerable influence on the results of the impact assessment. From the results of this…

  18. Lazy collaborative filtering for data sets with missing values.

    PubMed

    Ren, Yongli; Li, Gang; Zhang, Jun; Zhou, Wanlei

    2013-12-01

    As one of the biggest challenges in research on recommender systems, the data sparsity issue arises mainly because users tend to rate only a small proportion of the huge number of available items. The issue becomes even more problematic for neighborhood-based collaborative filtering (CF) methods, as even fewer ratings are available in the neighborhood of the query item. In this paper, we aim to address the data sparsity issue in the context of neighborhood-based CF. For a given query (user, item), a set of key ratings is first identified by taking the historical information of both the user and the item into account. Then, an auto-adaptive imputation (AutAI) method is proposed to impute the missing values in the set of key ratings. We present a theoretical analysis to show that the proposed imputation method effectively improves the performance of the conventional neighborhood-based CF methods. The experimental results show that our new method of CF with AutAI outperforms six existing recommendation methods in terms of accuracy.
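
    As a rough illustration of imputation-assisted neighborhood CF: the sketch below uses plain user-mean imputation as a stand-in for the paper's auto-adaptive AutAI method, and the toy ratings matrix is invented.

```python
import math

# Toy user-item rating matrix; 0 marks a missing rating.
R = [
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 0, 4],
    [0, 1, 5, 4],
]

def user_mean(u):
    rated = [r for r in R[u] if r > 0]
    return sum(rated) / len(rated)

def impute(u, i):
    """Simplified stand-in for AutAI: replace a missing rating with the
    user's mean (the paper adapts the imputation per query; this does not)."""
    return R[u][i] if R[u][i] > 0 else user_mean(u)

def sim(u, v):
    # Cosine similarity over imputed rating vectors.
    a = [impute(u, i) for i in range(len(R[u]))]
    b = [impute(v, i) for i in range(len(R[v]))]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def predict(u, i, k=2):
    # Weighted average of the k most similar users' (imputed) ratings.
    neigh = sorted((v for v in range(len(R)) if v != u),
                   key=lambda v: sim(u, v), reverse=True)[:k]
    num = sum(sim(u, v) * impute(v, i) for v in neigh)
    den = sum(sim(u, v) for v in neigh)
    return num / den
```

    The point of imputing only the key ratings used by the neighborhood computation, rather than the whole matrix, is that it densifies exactly the entries the similarity and prediction steps need.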

  19. Using the Method of Water Poverty Index (WPI) to Evaluate the Region Water Security

    NASA Astrophysics Data System (ADS)

    Fu, Q.; Kachanoski, G.

    2008-12-01

    Water security is an issue of worldwide concern. A new method, the water poverty index (WPI), has been used to evaluate regional water security. Twelve state farms in Heilongjiang Province, Northeastern China, were selected to evaluate water security status based on 2006 data, using the WPI and the mean deviation grading method. The WPI comprises five key indexes: resources (R), access (A), capacity (C), utilization (U) and environment (E), each of which includes several sub-indexes. According to the WPI results, the grade of each farm was calculated using the mean deviation grading method, and radar charts were then plotted for each farm. From the radar charts, it can be concluded that Farms 853 and Hongqiling were in very safe status, Farm Raohe was in safe status, Farms Youyi, 597, 852, 291 and Jiangchuan were in moderately safe status, Farm Beixing was in low safe status, and Farms Shuangyashan, Shuguang and Baoshan were in unsafe status. The results of this study can provide basic information for decision making on the rational use of water resources and for regulations supporting the regional water safety guarantee system.
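
    The composite index can be sketched as a weighted average of the five key indexes. The equal weights below are an illustrative assumption, not the weighting used in the study, and component scores are taken on a 0-100 scale.

```python
def water_poverty_index(resources, access, capacity, utilization, environment,
                        weights=None):
    """WPI as a weighted average of the five key indexes (each scaled 0-100).
    Equal weights are an illustrative default, not the study's choice."""
    scores = [resources, access, capacity, utilization, environment]
    if weights is None:
        weights = [1.0] * 5
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)
```

    With sub-indexes aggregated into each of R, A, C, U and E beforehand, the farms can then be ranked or graded on the resulting single score.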

  20. Engaging Key Stakeholders to Assess and Improve the Professional Preparation of MPH Health Educators

    PubMed Central

    Steckler, Allan; Maman, Suzanne; Ellenson, Meg; French, Elizabeth; Blanchard, Lynn; Bowling, Mike; Yamanis, Nina; Succop, Stacey; Davenport, Amy; Moracco, Beth

    2010-01-01

    Objectives. We described the process of engaging key stakeholders in a systematic review of requirements for a master of public health (MPH) degree within the Department of Health Behavior and Health Education, University of North Carolina Gillings School of Global Public Health, and summarized resulting changes. Methods. A benchmarking study of 11 peer institutions was completed. Key stakeholders (i.e., current students, alumni, faculty, staff, employers, and practicum preceptors) received online or print surveys. A faculty retreat was convened to process results and reach consensus on program revisions. Results. MPH program changes included (1) improved advising and mentoring program, (2) elimination of research and practice track options, (3) increased elective and decreased required credit hours, (4) replacement of master's paper requirement with “deliverables” (written products such as reports, documents, and forms) produced as part of the required “Capstone” course, (5) extended community field experience to 2 semesters and moved it to year 2 of the program, and (6) allowed practica of either 200, 300, or 400 hours. Conclusions. Engaging key stakeholders in the program review process yielded important changes to the MPH degree program requirements. Others may consider this approach when undertaking curriculum reviews. PMID:20395575

  1. A new method for generating an invariant iris private key based on the fuzzy vault system.

    PubMed

    Lee, Youn Joo; Park, Kang Ryoung; Lee, Sung Joo; Bae, Kwanghyuk; Kim, Jaihie

    2008-10-01

    Cryptographic systems have been widely used in many information security applications. One main challenge that these systems have faced has been how to protect private keys from attackers. Recently, biometric cryptosystems have been introduced as a reliable way of concealing private keys by using biometric data. A fuzzy vault refers to a biometric cryptosystem that can be used to effectively protect private keys and to release them only when legitimate users enter their biometric data. In biometric systems, a critical problem is storing biometric templates in a database. However, fuzzy vault systems do not need to directly store these templates, since they are combined with private keys by using cryptography. Previous fuzzy vault systems were designed using fingerprints, faces, and other modalities. However, there has been no attempt to implement a fuzzy vault system that uses the iris. In biometric applications, it is widely known that the iris can discriminate between persons better than other biometric modalities. In this paper, we propose a reliable fuzzy vault system based on local iris features. We extracted multiple iris features from multiple local regions in a given iris image, and the exact values of the unordered set were then produced using a clustering method. To align the iris templates with the new input iris data, a shift-matching technique was applied. Experimental results showed that 128-bit private keys were securely and robustly generated from any given iris data without requiring prealignment.

  2. Information hiding based on double random-phase encoding and public-key cryptography.

    PubMed

    Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li

    2009-03-02

    A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to make up for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the cipher text and the two phase masks, which is not possible under the DRPE technique alone. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience to efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.
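
    The hybrid arrangement, with RSA carrying key material alongside the optically encrypted payload, can be illustrated with a textbook-sized RSA example. The tiny primes and the mask_seed value are illustrative only and offer no real security; they simply show the encrypt/decrypt round trip for a phase-mask parameter.

```python
# Toy RSA (textbook, insecure key size) illustrating how a phase-mask
# parameter could be sent alongside the DRPE ciphertext.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)           # modular inverse (Python 3.8+)

def rsa_encrypt(m):
    return pow(m, e, n)       # c = m^e mod n

def rsa_decrypt(c):
    return pow(c, d, n)       # m = c^d mod n

mask_seed = 1234 % n          # hypothetical stand-in for a phase-mask value
assert rsa_decrypt(rsa_encrypt(mask_seed)) == mask_seed
```

    In a real deployment the RSA modulus would be thousands of bits, and the phase masks would be serialized and padded before encryption.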

  3. High-precision relative position and attitude measurement for on-orbit maintenance of spacecraft

    NASA Astrophysics Data System (ADS)

    Zhu, Bing; Chen, Feng; Li, Dongdong; Wang, Ying

    2018-02-01

    To realize long-term on-orbit operation of spacecraft such as satellites and space stations, the spacecraft's life can be extended through on-orbit servicing and maintenance, in addition to long-life device design. This makes precise, detailed maintenance of key components necessary. In this paper, a high-precision relative position and attitude measurement method for the maintenance of key components is given. The method mainly considers the design of the passive cooperative marker, the light-emitting device and the high-resolution camera in the presence of spatial stray light and noise. Using a series of algorithms, such as background elimination, feature extraction and position and attitude calculation, the high-precision relative pose parameters between the key operation parts and the maintenance equipment are obtained as the input to the control system. The simulation results show that the algorithm is accurate and effective, satisfying the requirements of the precision operation technique.

  4. Registration algorithm of point clouds based on multiscale normal features

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Peng, Zhongtao; Su, Hang; Xia, GuiHua

    2015-01-01

    The point cloud registration technology for obtaining a three-dimensional digital model is widely applied in many areas. To improve the accuracy and speed of point cloud registration, a registration method based on multiscale normal vectors is proposed. The proposed registration method mainly includes three parts: the selection of key points, the calculation of feature descriptors, and the determination and optimization of correspondences. First, key points are selected from the point cloud based on the changes in magnitude of multiscale curvatures obtained by using principal component analysis. Then a feature descriptor for each key point is proposed, which consists of 21 elements based on multiscale normal vectors and curvatures. The correspondences between a pair of point clouds are determined according to the similarity of the descriptors of key points in the source point cloud and target point cloud. Correspondences are optimized by using a random sampling consistency algorithm and clustering technology. Finally, singular value decomposition is applied to the optimized correspondences so that the rigid transformation matrix between the two point clouds is obtained. Experimental results show that the proposed point cloud registration algorithm has a faster calculation speed, higher registration accuracy, and better antinoise performance.
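
    The final SVD step can be sketched as the standard Kabsch procedure. Key-point selection, descriptor matching and RANSAC filtering are assumed to have already produced the two arrays of corresponding points; this is a generic sketch, not the paper's exact implementation.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~= R @ src + t,
    via SVD of the cross-covariance of the centered correspondences."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflections
    Rm = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - Rm @ src.mean(axis=0)
    return Rm, t
```

    Because the correspondences have already been cleaned by RANSAC and clustering, a single closed-form SVD solve suffices; no iterative refinement is needed at this stage.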

  5. Teaching learning methods of an entrepreneurship curriculum

    PubMed Central

    ESMI, KERAMAT; MARZOUGHI, RAHMATALLAH; TORKZADEH, JAFAR

    2015-01-01

    Introduction One of the most significant elements of entrepreneurship curriculum design is teaching-learning methods, which plays a key role in studies and researches related to such a curriculum. It is the teaching method, and systematic, organized and logical ways of providing lessons that should be consistent with entrepreneurship goals and contents, and should also be developed according to the learners’ needs. Therefore, the current study aimed to introduce appropriate, modern, and effective methods of teaching entrepreneurship and their validation Methods This is a mixed method research of a sequential exploratory kind conducted through two stages: a) developing teaching methods of entrepreneurship curriculum, and b) validating developed framework. Data were collected through “triangulation” (study of documents, investigating theoretical basics and the literature, and semi-structured interviews with key experts). Since the literature on this topic is very rich, and views of the key experts are vast, directed and summative content analysis was used. In the second stage, qualitative credibility of research findings was obtained using qualitative validation criteria (credibility, confirmability, and transferability), and applying various techniques. Moreover, in order to make sure that the qualitative part is reliable, reliability test was used. Moreover, quantitative validation of the developed framework was conducted utilizing exploratory and confirmatory factor analysis methods and Cronbach’s alpha. The data were gathered through distributing a three-aspect questionnaire (direct presentation teaching methods, interactive, and practical-operational aspects) with 29 items among 90 curriculum scholars. Target population was selected by means of purposive sampling and representative sample. 
Results Results obtained from exploratory factor analysis showed that a three factor structure is an appropriate method for describing elements of teaching-learning methods of entrepreneurship curriculum. Moreover, the value for Kaiser Meyer Olkin measure of sampling adequacy equaled 0.72 and the value for Bartlett’s test of variances homogeneity was significant at the 0.0001 level. Except for internship element, the rest had a factor load of higher than 0.3. Also, the results of confirmatory factor analysis showed the model appropriateness, and the criteria for qualitative accreditation were acceptable. Conclusion Developed model can help instructors in selecting an appropriate method of entrepreneurship teaching, and it can also make sure that the teaching is on the right path. Moreover, the model is comprehensive and includes all the effective teaching methods in entrepreneurship education. It is also based on qualities, conditions, and requirements of Higher Education Institutions in Iranian cultural environment. PMID:26457314

  6. Identification and annotation of erotic film based on content analysis

    NASA Astrophysics Data System (ADS)

    Wang, Donghui; Zhu, Miaoliang; Yuan, Xin; Qian, Hui

    2005-02-01

    This paper brings forward a new method for identifying and annotating erotic films based on content analysis. First, the film is decomposed into video and audio streams. Then, the video stream is segmented into shots and key frames are extracted from each shot. We filter the shots that include potential erotic content by finding the nude human body in key frames. A Gaussian model in the YCbCr color space for detecting skin regions is presented. An external polygon that covers the skin regions is used as an approximation of the human body. Finally, we give the degree of nudity by calculating the ratio of skin area to whole body area with weighted parameters. The results of the experiment show the effectiveness of our method.
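
    A minimal sketch of a single-Gaussian skin model in chrominance space follows. The mean and covariance are illustrative textbook-style values, not the parameters fitted in the paper, and the threshold is likewise an assumption.

```python
import math

# Illustrative mean and covariance for skin chrominance in (Cb, Cr);
# a real system would fit these to labeled skin pixels.
MU = (110.0, 150.0)
COV = [[160.0, 20.0], [20.0, 120.0]]

def skin_likelihood(cb, cr):
    """Bivariate Gaussian likelihood of a pixel's (Cb, Cr) under the skin model."""
    det = COV[0][0] * COV[1][1] - COV[0][1] * COV[1][0]
    inv = [[COV[1][1] / det, -COV[0][1] / det],
           [-COV[1][0] / det, COV[0][0] / det]]
    dx, dy = cb - MU[0], cr - MU[1]
    m = dx * (inv[0][0] * dx + inv[0][1] * dy) \
        + dy * (inv[1][0] * dx + inv[1][1] * dy)   # Mahalanobis distance squared
    return math.exp(-0.5 * m) / (2.0 * math.pi * math.sqrt(det))

def is_skin(cb, cr, threshold=1e-5):
    return skin_likelihood(cb, cr) >= threshold
```

    Working in CbCr rather than RGB discards most luminance variation, which is what makes a single Gaussian a workable skin model across lighting conditions.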

  7. Machine Learning: A Crucial Tool for Sensor Design

    PubMed Central

    Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.

    2009-01-01

    Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps: data pre-treatment, feature extraction and dimension reduction, and system modeling, this paper provides a review of the methods that are widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, this paper gives a summary of current algorithms in this field and provides some feasible directions for future studies. PMID:20191110
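
    The three-step decomposition can be illustrated with a minimal pipeline on synthetic data: scaling as pre-treatment, PCA via SVD as dimension reduction, then least-squares modeling. All data and parameter choices here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))                 # synthetic sensor readings
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.normal(size=50)

# Step 1: data pre-treatment -- zero-mean, unit-variance scaling.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# Step 2: feature extraction / dimension reduction -- PCA via SVD, keep 2 PCs.
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
Z = Xs @ Vt[:2].T

# Step 3: system modeling -- ordinary least squares on the reduced features.
A = np.hstack([Z, np.ones((len(Z), 1))])     # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
```

    Any of the three stages can be swapped independently (e.g., wavelet denoising for step 1, or a random forest for step 3), which is why the review organizes methods along this pipeline.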

  8. High performance reconciliation for continuous-variable quantum key distribution with LDPC code

    NASA Astrophysics Data System (ADS)

    Lin, Dakai; Huang, Duan; Huang, Peng; Peng, Jinye; Zeng, Guihua

    2015-03-01

    Reconciliation is a significant procedure in a continuous-variable quantum key distribution (CV-QKD) system. It is employed to extract a secure secret key from the string shared through the quantum channel between two users. However, the efficiency and speed of previous reconciliation algorithms are low. These problems limit the secure communication distance and the secret key rate of CV-QKD systems. In this paper, we propose a high-speed reconciliation algorithm employing a well-structured decoding scheme based on low-density parity-check (LDPC) codes. The complexity of the proposed algorithm is significantly reduced. By using a graphics processing unit (GPU), our method may reach a reconciliation speed of 25 Mb/s for a CV-QKD system, which is currently the highest level and paves the way to high-speed CV-QKD.
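
    Syndrome-based error correction, which underlies LDPC reconciliation, can be illustrated with a Hamming(7,4) toy code. Real CV-QKD reconciliation uses much larger sparse LDPC matrices with soft-decision decoding, so this is only a structural sketch of how one party's syndrome lets the other correct discrepancies in a shared string.

```python
# Parity-check matrix of the Hamming(7,4) code; column i is the binary
# representation of i+1, so a nonzero syndrome encodes the error position.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def syndrome(word):
    """Parity checks of a 7-bit word; all-zero means a valid codeword."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

def correct(word):
    """Fix a single bit error using the syndrome as the error position."""
    s = syndrome(word)
    pos = s[0] + 2 * s[1] + 4 * s[2]   # 1-indexed position of the flipped bit
    if pos:
        word = list(word)
        word[pos - 1] ^= 1
    return word
```

    In a reconciliation protocol, one side would transmit syndromes of blocks of its key so the other can correct its noisy copy without revealing the key bits themselves.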

  9. Feasibility of Extracting Key Elements from ClinicalTrials.gov to Support Clinicians' Patient Care Decisions.

    PubMed

    Kim, Heejun; Bian, Jiantao; Mostafa, Javed; Jonnalagadda, Siddhartha; Del Fiol, Guilherme

    2016-01-01

    Motivation: Clinicians need up-to-date evidence from high quality clinical trials to support clinical decisions. However, applying evidence from the primary literature requires significant effort. Objective: To examine the feasibility of automatically extracting key clinical trial information from ClinicalTrials.gov. Methods: We assessed the coverage of ClinicalTrials.gov for high quality clinical studies that are indexed in PubMed. Using 140 random ClinicalTrials.gov records, we developed and tested rules for the automatic extraction of key information. Results: The rate of high quality clinical trial registration in ClinicalTrials.gov increased from 0.2% in 2005 to 17% in 2015. Trials reporting results increased from 3% in 2005 to 19% in 2015. The accuracy of the automatic extraction algorithm for 10 trial attributes was 90% on average. Future research is needed to improve the algorithm accuracy and to design information displays to optimally present trial information to clinicians.

  10. Teaching home safety and survival skills to latch-key children: a comparison of two manuals and methods.

    PubMed Central

    Peterson, L

    1984-01-01

    I evaluated the influence of two training manuals on latch-key children's acquisition of home safety and survival skills. The widely used, discussion-oriented "Prepared for Today" manual was compared with a behaviorally oriented "Safe at Home" manual. Data were scored by response criteria developed by experts and by parents' and experts' ratings of children's spontaneous answers. With both methods of scoring, three behaviorally trained children demonstrated clear and abrupt increases in skill following training in each of seven trained modules, and these increases largely persisted in real world generalization probes and at 5-month follow-up. Smaller and less stable increases in skill were found in the three discussion-trained children across the seven modules; lower skill levels were also seen in real world generalization probes and at follow-up. Neither group of children demonstrated skill increases in home safety areas that were not explicitly trained. Both training methods produced small decreases in children's self-report of general anxiety and anxiety concerning home safety. Results are discussed in terms of their implications for cost-effective training of latch-key children. PMID:6511698

  11. A national collaboration process: Finnish engineering education for the benefit of people and environment.

    PubMed

    Takala, A; Korhonen-Yrjänheikki, K

    2013-12-01

    The key stakeholders of Finnish engineering education collaborated during 2006-09 to reform the system of education, to face the challenges of the changing business environment and to create a national strategy for Finnish engineering education. The work process was carried out using participatory work methods. The impacts of sustainable development (SD) on engineering education were analysed in one of the subprojects. In addition to participatory workshops, the core part of the work on SD consisted of a study with more than 60 interviews and an extensive literature survey. This paper discusses the results of the research and the work process of the Collaboration Group in the SD subproject. It is suggested that enhancing systematic dialogue among key stakeholders using participatory work methods is crucial for increasing motivation and commitment to incorporating SD in engineering education. Development of the context of learning is essential for improving the skills of engineering graduates in some of the key abilities related to SD: systemic and life-cycle thinking, ethical understanding, collaborative learning and critical reflection skills. This requires changing the educational paradigm from teacher-centred to learner-centred, applying problem- and project-oriented active learning methods.

  12. In vivo quantitative evaluation of vascular parameters for angiogenesis based on sparse principal component analysis and aggregated boosted trees

    NASA Astrophysics Data System (ADS)

    Zhao, Fengjun; Liu, Junting; Qu, Xiaochao; Xu, Xianhui; Chen, Xueli; Yang, Xiang; Cao, Feng; Liang, Jimin; Tian, Jie

    2014-12-01

    To address the multicollinearity issue and the unequal contributions of vascular parameters in the quantification of angiogenesis, we developed a quantitative evaluation method of vascular parameters for angiogenesis based on in vivo micro-CT imaging of hindlimb ischemic model mice. Taking vascular volume as the ground-truth parameter, nine vascular parameters were first assembled into sparse principal components (PCs) to reduce the multicollinearity issue. Aggregated boosted trees (ABTs) were then employed to analyze the importance of the vascular parameters for the quantification of angiogenesis via the loadings of the sparse PCs. The results demonstrated that vascular volume was mainly characterized by vascular area, vascular junction, connectivity density, segment number and vascular length, which indicates that they are the key vascular parameters for the quantification of angiogenesis. The proposed quantitative evaluation method was compared both with the ABTs applied directly to the nine vascular parameters and with Pearson correlation, and the results were consistent. In contrast to the ABTs applied directly to the vascular parameters, the proposed method can select all the key vascular parameters simultaneously, because all the key vascular parameters were assembled into the sparse PCs with the highest relative importance.

  13. Deformation effect simulation and optimization for double front axle steering mechanism

    NASA Astrophysics Data System (ADS)

    Wu, Jungang; Zhang, Siqin; Yang, Qinglong

    2013-03-01

    This paper investigates the tire wear problem of heavy vehicles with a double front axle steering mechanism from the perspective of the flexibility of the steering mechanism, and proposes a structural optimization method that combines traditional static structural theory with a dynamic structural approach, the Equivalent Static Load (ESL) method, to optimize key parts. Good simulation and test results show that this method has high engineering practicality and reference value for addressing the tire wear problem in the design of double front axle steering mechanisms.

  14. Passive decoy-state quantum key distribution with practical light sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curty, Marcos; Ma, Xiongfeng; Qi, Bing

    2010-02-15

    Decoy states have been proven to be a very useful method for significantly enhancing the performance of quantum key distribution systems with practical light sources. Although active modulation of the intensity of the laser pulses is an effective way of preparing decoy states in principle, in practice passive preparation might be desirable in some scenarios. Typical passive schemes involve parametric down-conversion. More recently, it has been shown that phase-randomized weak coherent pulses (WCP) can also be used for the same purpose [M. Curty et al., Opt. Lett. 34, 3238 (2009)]. This proposal requires only linear optics together with a simple threshold photon detector, which shows the practical feasibility of the method. Most importantly, the resulting secret key rate is comparable to the one delivered by an active decoy-state setup with an infinite number of decoy settings. In this article we extend these results, now showing specifically the analysis for other practical scenarios with different light sources and photodetectors. In particular, we consider sources emitting thermal states, phase-randomized WCP, and strong coherent light in combination with several types of photodetectors, like, for instance, threshold photon detectors, photon number resolving detectors, and classical photodetectors. Our analysis includes as well the effect that detection inefficiencies and noise in the form of dark counts shown by current threshold detectors might have on the final secret key rate. Moreover, we provide estimations on the effects that statistical fluctuations due to a finite data size can have in practical implementations.

  15. Information filtering via a scaling-based function.

    PubMed

    Qiu, Tian; Zhang, Zi-Ke; Chen, Guang

    2013-01-01

    Finding a universal description of algorithm optimization is one of the key challenges in personalized recommendation. In this article, for the first time, we introduce a scaling-based algorithm (SCL) independent of the recommendation list length, built on a hybrid algorithm of heat conduction and mass diffusion, by identifying the scaling function relating the tunable parameter to the object average degree. The optimal value of the tunable parameter can be extracted from the scaling function, and is heterogeneous across individual objects. Experimental results obtained from three real datasets, Netflix, MovieLens and RYM, show that the SCL is highly accurate in recommendation. More importantly, compared with a number of excellent algorithms, including the mass diffusion method, the original hybrid method, and even an improved version of the hybrid method, the SCL algorithm remarkably improves personalized recommendation in three other aspects: resolving the accuracy-diversity dilemma, presenting high novelty, and addressing the key challenge of the cold-start problem.
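    The hybrid heat-conduction/mass-diffusion kernel that SCL builds on can be sketched on a toy bipartite network (the adjacency data and the fixed λ value are made up; SCL's contribution, not shown here, is choosing λ per object from the scaling function):

```python
import numpy as np

# Toy user-object bipartite adjacency (rows: users, cols: objects)
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]], dtype=float)
lam = 0.5                   # tunable hybrid parameter (illustrative value)
k_user = A.sum(axis=1)      # user degrees
k_obj = A.sum(axis=0)       # object degrees

# Hybrid transfer matrix (Zhou et al.-style hybrid of HeatS and ProbS):
# W[a, b] = k_a^{-(1-lam)} * k_b^{-lam} * sum_i A[i, a] * A[i, b] / k_user[i]
overlap = (A / k_user[:, None]).T @ A
W = overlap / (k_obj[:, None] ** (1 - lam) * k_obj[None, :] ** lam)

scores = W @ A[0]           # recommendation scores for user 0
print(scores)
```

    Items the user has not collected are then ranked by score; λ→0 recovers pure mass diffusion and λ→1 pure heat conduction.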

  16. Transonic Flow Computations Using Nonlinear Potential Methods

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    This presentation describes the state of transonic flow simulation using nonlinear potential methods for external aerodynamic applications. The presentation begins with a review of the various potential equation forms (with emphasis on the full potential equation) and includes a discussion of pertinent mathematical characteristics and all derivation assumptions. Impact of the derivation assumptions on simulation accuracy, especially with respect to shock wave capture, is discussed. Key characteristics of all numerical algorithm types used for solving nonlinear potential equations, including steady, unsteady, space marching, and design methods, are described. Both spatial discretization and iteration scheme characteristics are examined. Numerical results for various aerodynamic applications are included throughout the presentation to highlight key discussion points. The presentation ends with concluding remarks and recommendations for future work. Overall, nonlinear potential solvers are efficient, highly developed and routinely used in the aerodynamic design environment for cruise conditions. Published by Elsevier Science Ltd. All rights reserved.

  17. Application of L1/2 regularization logistic method in heart disease diagnosis.

    PubMed

    Zhang, Bowen; Chai, Hua; Yang, Ziyi; Liang, Yong; Chu, Gejin; Liu, Xiaoying

    2014-01-01

    Heart disease has become the number one killer of human health, and its diagnosis depends on many features, such as age, blood pressure, heart rate and dozens of other physiological indicators. Although there are many risk factors, doctors usually diagnose the disease depending on their intuition and experience, which requires considerable knowledge and experience for a correct determination. Finding the hidden medical information in existing clinical data is a notable and powerful approach in the study of heart disease diagnosis. In this paper, a sparse logistic regression method with L(1/2) regularization is introduced to detect the key risk factors on real heart disease data. Experimental results show that the sparse logistic L(1/2) regularization method selects fewer but more informative key features than the Lasso, SCAD, MCP and Elastic net regularization approaches. Simultaneously, the proposed method reduces computational complexity, saves the cost and time of medical tests and checkups, and reduces the number of attributes that need to be collected from patients.

  18. Effective Heart Disease Detection Based on Quantitative Computerized Traditional Chinese Medicine Using Representation Based Classifiers.

    PubMed

    Shu, Ting; Zhang, Bob; Tang, Yuan Yan

    2017-01-01

    At present, heart disease is the number one cause of death worldwide. Traditionally, heart disease is commonly detected using blood tests, electrocardiograms, cardiac computerized tomography scans, cardiac magnetic resonance imaging, and so on. However, these traditional diagnostic methods are time consuming and/or invasive. In this paper, we propose an effective noninvasive computerized method based on facial images to quantitatively detect heart disease. Specifically, facial key block color features are extracted from facial images and analyzed using the Probabilistic Collaborative Representation Based Classifier. The idea of facial key block color analysis is founded in Traditional Chinese Medicine. A new dataset consisting of 581 heart disease and 581 healthy samples was used to evaluate the proposed method. In order to optimize the Probabilistic Collaborative Representation Based Classifier, an analysis of its parameters was performed. According to the experimental results, the proposed method obtains the highest accuracy compared with other classifiers and is shown to be effective at heart disease detection.

  19. Key Competencies and Characteristics for Innovative Teaching among Secondary School Teachers: A Mixed-Methods Research

    ERIC Educational Resources Information Center

    Zhu, Chang; Wang, Di

    2014-01-01

    This research aims to understand the key competencies and characteristics for innovative teaching as perceived by Chinese secondary teachers. A mixed-methods research was used to investigate secondary teachers' views. First, a qualitative study was conducted with interviews of teachers to understand the perceived key competencies and…

  20. Optical cryptography with biometrics for multi-depth objects.

    PubMed

    Yan, Aimin; Wei, Yang; Hu, Zhijuan; Zhang, Jingtao; Tsang, Peter Wai Ming; Poon, Ting-Chung

    2017-10-11

    We propose an optical cryptosystem for encrypting images of multi-depth objects based on the combination of the optical heterodyne technique and fingerprint keys. Optical heterodyning requires two optical beams to be mixed. For encryption, each optical beam is modulated by an optical mask containing the fingerprint of either the person sending or the person receiving the image. The pair of optical masks is taken as the encryption keys. Subsequently, the two beams are used to scan over a multi-depth 3-D object to obtain an encrypted hologram. During the decryption process, each sectional image of the 3-D object is recovered by convolving its encrypted hologram (through numerical computation) with the encrypted hologram of a pinhole image that is positioned at the same depth as the sectional image. Our proposed method has three major advantages. First, the lost-key situation can be avoided with the use of fingerprints as the encryption keys. Second, the method can be applied to encrypt 3-D images for subsequent decryption of sectional images. Third, since optical heterodyne scanning is employed to encrypt a 3-D object, the optical system is incoherent, resulting in a negligible amount of speckle noise upon decryption. To the best of our knowledge, this is the first time optical cryptography of 3-D object images has been demonstrated in an incoherent optical system with biometric keys.

  1. KEY COMPARISON: Final report of the CCQM-K56: Ca, Fe, Zn and Cu in whole fat soybean powder

    NASA Astrophysics Data System (ADS)

    Liandi, Ma; Qian, Wang

    2010-01-01

    The CCQM-K56 key comparison was organized by the Inorganic Analysis Working Group (IAWG) of CCQM as a follow-up to the completed pilot study CCQM-P64 to test the abilities of national metrology institutes to measure the amount content of nutritious elements in whole fat soybean powder. A pilot study, CCQM-P64.1, was conducted in parallel with this key comparison. The National Institute of Metrology (NIM), P. R. China, acted as the coordinating laboratory. Eleven NMIs participated in CCQM-K56. Four elements - Ca, Fe, Zn and Cu - at different concentration levels were studied. Different measurement methods (IDMS, ICP-MS, ICP-OES, AAS and INAA) and the microwave digestion method were used. The agreement of the results of CCQM-K56 is very good, and clearly better than that of the original P64, showing that the capabilities of all participants improved between the original pilot study and this key comparison. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).

  2. KEY COMPARISON: Final report on CCEM key comparison CCEM.RF-K10.CL (GT-RF/99-2) 'Power in 50 Ω coaxial lines, frequency: 50 MHz to 26 GHz' measurement techniques and results

    NASA Astrophysics Data System (ADS)

    Janik, Dieter; Inoue, T.; Michaud, A.

    2006-01-01

    This report summarizes the results and measurement methods of an international key comparison between twelve national metrology institutes (NMIs) concerning the calibration factor of RF power sensors in the coaxial 3.5 mm line for frequencies up to 26 GHz. Two RF power travelling standards fitted with male PC 3.5 mm connectors were measured at seven frequencies. The following NMIs participated: NMIJ (Japan), NRC (Canada), NIST (USA), METAS (Switzerland), CSIR-NML (South Africa), NMIA (Australia), NPL (UK), SiQ (Slovenia), IEN (Italy), VNIIFTRI (Russian Federation), SPRING (Singapore) and PTB (Germany), as the pilot laboratory. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCEM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).

  3. Secure polarization-independent subcarrier quantum key distribution in optical fiber channel using BB84 protocol with a strong reference.

    PubMed

    Gleim, A V; Egorov, V I; Nazarov, Yu V; Smirnov, S V; Chistyakov, V V; Bannik, O I; Anisimov, A A; Kynev, S M; Ivanova, A E; Collins, R J; Kozlov, S A; Buller, G S

    2016-02-08

    A quantum key distribution system based on the subcarrier wave modulation method has been demonstrated which employs the BB84 protocol with a strong reference to generate secure bits at a rate of 16.5 kbit/s with an error of 0.5% over an optical channel of 10 dB loss, and 18 bits/s with an error of 0.75% over 25 dB of channel loss. To the best of our knowledge, these results represent the highest channel loss reported for secure quantum key distribution using the subcarrier wave approach. A passive unidirectional scheme has been used to compensate for the polarization dependence of the phase modulators in the receiver module, which resulted in a high visibility of 98.8%. The system is thus fully insensitive to polarization fluctuations and robust to environmental changes, making the approach promising for use in optical telecommunication networks. Further improvements in secure key rate and transmission distance can be achieved by implementing the decoy states protocol or by optimizing the mean photon number used in line with experimental parameters.

  4. A Standards-Based Approach for Reporting Assessment Results in South Africa

    ERIC Educational Resources Information Center

    Kanjee, Anil; Moloi, Qetelo

    2016-01-01

    This article proposes the use of a standards-based approach to reporting results from large-scale assessment surveys in South Africa. The use of this approach is intended to address the key shortcomings observed in the current reporting framework prescribed in the national curriculum documents. Using the Angoff method and data from the Annual…

  5. High-efficiency Gaussian key reconciliation in continuous variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bai, ZengLiang; Wang, XuYang; Yang, ShenShen; Li, YongMin

    2016-01-01

    Efficient reconciliation is a crucial step in continuous variable quantum key distribution. The progressive-edge-growth (PEG) algorithm is an efficient method to construct relatively short block length low-density parity-check (LDPC) codes. The quasi-cyclic construction method can extend short block length codes and further eliminate the shortest cycle. In this paper, by combining the PEG algorithm and the quasi-cyclic construction method, we design long block length irregular LDPC codes with high error-correcting capacity. Based on these LDPC codes, we achieve high-efficiency Gaussian key reconciliation with slice reconciliation based on multilevel coding/multistage decoding with an efficiency of 93.7%.

  6. Evaluation of generalized degrees of freedom for sparse estimation by replica method

    NASA Astrophysics Data System (ADS)

    Sakata, A.

    2016-12-01

    We develop a method to evaluate the generalized degrees of freedom (GDF) for linear regression with sparse regularization. The GDF is a key factor in model selection, and thus its evaluation is useful in many modelling applications. An analytical expression for the GDF is derived using the replica method in the large-system-size limit with random Gaussian predictors. The resulting formula has a universal form that is independent of the type of regularization, providing us with a simple interpretation. Within the framework of replica symmetric (RS) analysis, GDF has a physical meaning as the effective fraction of non-zero components. The validity of our method in the RS phase is supported by the consistency of our results with previous mathematical results. The analytical results in the RS phase are calculated numerically using the belief propagation algorithm.
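    The GDF referred to above is conventionally defined (following Ye, 1998) as the total sensitivity of the fitted values to their own observations; in LaTeX:

```latex
% Generalized degrees of freedom of a fitting procedure \hat{y}(y):
% the sum of sensitivities of each fitted value to its own observation.
\mathrm{GDF}(\hat{y}) \;=\; \sum_{i=1}^{n} \frac{\partial \hat{y}_i}{\partial y_i}
```

    Under the replica-symmetric analysis described in the abstract, this sum concentrates on the active set, matching the interpretation of the GDF as the effective fraction of non-zero components.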

  7. Efficient and Security Enhanced Anonymous Authentication with Key Agreement Scheme in Wireless Sensor Networks

    PubMed Central

    Jung, Jaewook; Moon, Jongho; Lee, Donghoon; Won, Dongho

    2017-01-01

    At present, users can utilize an authenticated key agreement protocol in a Wireless Sensor Network (WSN) to securely obtain desired information, and numerous studies have investigated authentication techniques to construct efficient, robust WSNs. Chang et al. recently presented an authenticated key agreement mechanism for WSNs and claimed that their authentication mechanism can both prevent various types of attacks and preserve security properties. However, we have discovered that Chang et al.'s method possesses some security weaknesses. First, their mechanism cannot guarantee protection against a password guessing attack, user impersonation attack or session key compromise. Second, the mechanism results in a high load on the gateway node because the gateway node must always maintain the verifier tables. Third, there is no session key verification process in the authentication phase. To this end, we describe how the previously-stated weaknesses occur and propose a security-enhanced version for WSNs. We present a detailed analysis of the security and performance of our authenticated key agreement mechanism, which not only enhances security compared to that of related schemes, but also takes efficiency into consideration. PMID:28335572

  9. Cleaning by clustering: methodology for addressing data quality issues in biomedical metadata.

    PubMed

    Hu, Wei; Zaveri, Amrapali; Qiu, Honglei; Dumontier, Michel

    2017-09-18

    The ability to efficiently search and filter datasets depends on access to high quality metadata. While most biomedical repositories require data submitters to provide a minimal set of metadata, some, such as the Gene Expression Omnibus (GEO), allow users to specify additional metadata in the form of textual key-value pairs (e.g. sex: female). However, since there is no structured vocabulary to guide submitters regarding which metadata terms to use, the 44,000,000+ key-value pairs in GEO suffer from numerous quality issues, including redundancy, heterogeneity, inconsistency, and incompleteness. Such issues hinder the ability of scientists to hone in on datasets that meet their requirements and point to a need for accurate, structured and complete description of the data. In this study, we propose a clustering-based approach to address data quality issues in biomedical, specifically gene expression, metadata. First, we present three different kinds of similarity measures to compare metadata keys. Second, we design a scalable agglomerative clustering algorithm to cluster similar keys together. Our agglomerative clustering algorithm identified metadata keys that were similar to each other, based on (i) name, (ii) core concept and (iii) value similarities, and grouped them together. We evaluated our method using a manually created gold standard in which 359 keys were grouped into 27 clusters based on six types of characteristics: (i) age, (ii) cell line, (iii) disease, (iv) strain, (v) tissue and (vi) treatment. As a result, the algorithm generated 18 clusters containing 355 keys (four clusters with only one key were excluded). Most keys in the 18 clusters were correctly assigned, but 13 keys were unrelated to their clusters. We compared our approach with four other published methods. Our approach significantly outperformed them for most metadata keys and achieved the best average F-Score (0.63). The intuition that underpins cleaning by clustering is that dividing keys into clusters resolves the scalability issues of data observation and cleaning, and that duplicates and errors are easily found among keys within the same cluster. Our algorithm can also be applied to other biomedical data types.
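    The name-based similarity measure and agglomerative grouping can be illustrated with Python's standard-library difflib (the keys, the 0.65 cutoff, and the naive single-pass linkage below are illustrative stand-ins for the paper's three similarity measures and scalable algorithm):

```python
from difflib import SequenceMatcher

# Hypothetical GEO-style metadata keys
keys = ["age", "age (years)", "Age", "cell line", "cellline",
        "tissue", "tissue type", "disease", "disease state"]

def name_sim(a, b):
    # Name-based similarity (the paper also uses core-concept and
    # value-based similarities, not modeled here)
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Naive agglomerative grouping: attach a key to the first cluster
# containing a sufficiently similar member, else start a new cluster
clusters = []
for k in keys:
    for c in clusters:
        if any(name_sim(k, m) >= 0.65 for m in c):
            c.append(k)
            break
    else:
        clusters.append([k])

print(clusters)
```

    Note how purely string-level similarity groups "cell line"/"cellline" and "Age"/"age" but misses "age (years)", which is why the paper adds concept- and value-based measures.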

  10. An image hiding method based on cascaded iterative Fourier transform and public-key encryption algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Sang, Jun; Alam, Mohammad S.

    2013-03-01

    An image hiding method based on the cascaded iterative Fourier transform and a public-key encryption algorithm was proposed. Firstly, the original secret image was encrypted into two phase-only masks M1 and M2 via the cascaded iterative Fourier transform (CIFT) algorithm. Then, the public-key encryption algorithm RSA was adopted to encrypt M2 into M2'. Finally, a host image was enlarged by extending each pixel into 2×2 pixels, and each element of M1 and M2' was multiplied by a superimposition coefficient and added to or subtracted from two different elements in the 2×2 pixels of the enlarged host image. To recover the secret image from the stego-image, the two masks were extracted from the stego-image without the original host image. By applying a public-key encryption algorithm, key distribution was facilitated, and compared with image hiding methods based on optical interference, the proposed method may achieve higher robustness by exploiting the characteristics of the CIFT algorithm. Computer simulations show that this method has good robustness against image processing.
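    The enlargement-and-superimposition embedding, and why the masks can be extracted blindly, can be sketched with numpy (random arrays stand in for the CIFT phase masks and the RSA-encrypted mask; α and the particular 2×2 block layout are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
host = rng.random((4, 4))   # host image
M1 = rng.random((4, 4))     # stand-in for phase-only mask M1
M2e = rng.random((4, 4))    # stand-in for the RSA-encrypted mask M2'
alpha = 0.01                # superimposition coefficient (assumed small)

# Enlarge the host: each pixel becomes a 2x2 block of equal values
E = np.kron(host, np.ones((2, 2)))
# Add/subtract mask elements at different positions inside each block
E[0::2, 0::2] += alpha * M1
E[0::2, 1::2] -= alpha * M1
E[1::2, 0::2] += alpha * M2e
E[1::2, 1::2] -= alpha * M2e

# Blind extraction: the duplicated host pixels cancel, leaving the masks
M1_rec = (E[0::2, 0::2] - E[0::2, 1::2]) / (2 * alpha)
M2_rec = (E[1::2, 0::2] - E[1::2, 1::2]) / (2 * alpha)
print(np.allclose(M1_rec, M1), np.allclose(M2_rec, M2e))
```

    Because the difference of two equal host pixels is zero, the receiver never needs the original host image, matching the blind-extraction claim in the abstract.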

  11. A high accuracy ultrasonic distance measurement system using binary frequency shift-keyed signal and phase detection

    NASA Astrophysics Data System (ADS)

    Huang, S. S.; Huang, C. F.; Huang, K. N.; Young, M. S.

    2002-10-01

    A highly accurate binary frequency shift-keyed (BFSK) ultrasonic distance measurement system (UDMS) for use in isothermal air is described. This article presents an efficient algorithm which combines both the time-of-flight (TOF) method and the phase-shift method. The proposed method can obtain larger range measurement than the phase-shift method and also get higher accuracy compared with the TOF method. A single-chip microcomputer-based BFSK signal generator and phase detector was designed to record and compute the TOF, two phase shifts, and the resulting distance, which were then sent to either an LCD to display or a PC to calibrate. Experiments were done in air using BFSK with the frequencies of 40 and 41 kHz. Distance resolution of 0.05% of the wavelength corresponding to the frequency of 40 kHz was obtained. The range accuracy was found to be within ±0.05 mm at a range of over 6000 mm. The main advantages of this UDMS system are high resolution, low cost, narrow bandwidth requirement, and ease of implementation.
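    The way the coarse TOF estimate resolves the integer-wavelength ambiguity of the fine phase measurement can be sketched numerically (all values are hypothetical; the real system measures the TOF and two phase shifts in hardware at 40 and 41 kHz):

```python
import math

c = 343.0        # assumed speed of sound in isothermal air (m/s)
f = 40_000.0     # one BFSK carrier frequency (Hz)
lam = c / f      # wavelength, about 8.6 mm

true_d = 1.23456                 # hypothetical target distance (m)

# Coarse estimate from time-of-flight, with an assumed 10 us timing error
tof_est = true_d / c + 1.0e-5
d_coarse = c * tof_est

# Fine estimate: the received phase pins the distance within one
# wavelength; the coarse TOF picks the integer number of wavelengths
phase = (true_d % lam) / lam * 2 * math.pi   # "measured" phase shift
frac = phase / (2 * math.pi)
n = round((d_coarse - frac * lam) / lam)
d_fine = n * lam + frac * lam
print(abs(d_fine - true_d))      # residual far below the TOF error
```

    The combination works whenever the TOF error stays below half a wavelength (here 3.43 mm < 4.29 mm), which is what lets the system keep the phase method's resolution over a multi-metre range.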

  12. Research on key technology of yacht positioning based on binocular parallax

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Wei, Ping; Liu, Zengzhi

    2016-10-01

    Yachting has become a fashionable form of entertainment. However, obtaining the precise location of a yacht docked at a port is one of the concerns of a yacht manager. To deal with this issue, we adopt a positioning method based on the principle of binocular parallax and background difference. Binocular parallax uses two cameras to obtain a multi-dimensional perspective of the yacht based on the geometric principles of imaging. To simplify the yacht localization problem, we install an LED light indicator as the key point on the yacht and let it flash at a fixed frequency during both day and night. Once the distance between the LED and the cameras is known, locating the yacht is easy. Compared with other traditional positioning methods, this method is simpler and easier to implement. In this paper, we study the yacht positioning method using the LED indicator. A simulation experiment was performed on a yacht model at a distance of 3 meters. The experimental results show that our method is feasible and easy to implement, with a positioning error of about 15%.
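    The underlying geometry reduces to the standard disparity-to-depth relation Z = f·B/d for a rectified stereo pair; a sketch with hypothetical calibration values chosen to land near the 3 m experiment:

```python
# Depth from binocular parallax: Z = f * B / d
# All numbers below are hypothetical calibration/measurement values.
f_px = 800.0             # focal length in pixels
B = 0.20                 # baseline between the two cameras (m)
xl, xr = 412.3, 358.9    # LED key-point column in left/right image (px)

d = xl - xr              # disparity in pixels
Z = f_px * B / d         # estimated distance to the LED
print(Z)                 # roughly 3 m
```

    Detecting the flashing LED (e.g. by background difference between on/off frames) gives the matched key point in both images; the depth then follows from the single division above.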

  13. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang

    2017-10-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical three-phase symmetry of a substation in operation, the principal component analysis method is used to separate the metering deviation caused by primary-side fluctuations from that caused by EVT anomalies. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing changes in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line monitoring of the metering performance of an EVT without a standard voltage transformer.

  14. QUANTIFYING SPATIAL POSITION OF WETLANDS FOR STREAM HABITAT QUALITY PREDICTION

    EPA Science Inventory

    A watershed's capacity to store and filter water, and the resulting effects on the hydrologic regime, is a key forcing function for in-stream processes and community structure. However, methods for describing wetland position have traditionally been qualitative. A Geographic Info...

  15. Synchrosqueezing an effective method for analyzing Doppler radar physiological signals.

    PubMed

    Yavari, Ehsan; Rahman, Ashikur; Jia Xu; Mandic, Danilo P; Boric-Lubecke, Olga

    2016-08-01

    Doppler radar can monitor vital signs wirelessly. Respiratory and heart rates exhibit time-varying behavior, and capturing this rate variability provides crucial physiological information. However, common time-frequency methods fail to detect key information. We investigate the Synchrosqueezing method to extract oscillatory components of signals with time-varying spectra. Simulation and experimental results show the potential of the proposed method for analyzing signals with complex time-frequency behavior, such as physiological signals. Respiration and heart signals and their components are extracted with higher resolution and without any pre-filtering or signal conditioning.

  16. The segmentation of bones in pelvic CT images based on extraction of key frames.

    PubMed

    Yu, Hui; Wang, Haijun; Shi, Yao; Xu, Ke; Yu, Xuyao; Cao, Yuzhen

    2018-05-22

    Bone segmentation is important in computed tomography (CT) imaging of the pelvis, as it assists physicians in the early diagnosis of pelvic injury, in planning operations, and in evaluating the effects of surgical treatment. This study developed a new algorithm for the accurate, fast, and efficient segmentation of the pelvis. The proposed method consists of two main parts: the extraction of key frames and the segmentation of pelvic CT images. Key frames were extracted based on pixel difference, mutual information and the normalized correlation coefficient. In the pelvis segmentation phase, skeleton extraction from CT images and a marker-based watershed algorithm were combined to segment the pelvis. To meet the requirements of clinical application, a physician's judgment is needed; the proposed methodology is therefore semi-automated. In this paper, 5 sets of CT data were used to test the overlapping area, and 15 CT images were used to determine the average deviation distance. The average overlapping area of the 5 sets was greater than 94%, and the minimum average deviation distance was approximately 0.58 pixels. In addition, the key frame extraction efficiency and the running time of the proposed method were evaluated on 20 sets of CT data. For each set, approximately 13% of the images were selected as key frames, and the average processing time was approximately 2 min (the time for manual marking was not included). The proposed method achieves accurate, fast, and efficient segmentation of pelvic CT image sequences. The segmentation results not only provide an important reference for early diagnosis and decisions regarding surgical procedures, they also offer more accurate data for medical image registration, recognition and 3D reconstruction.
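    The key-frame criterion can be sketched using the normalized correlation coefficient alone (synthetic frames and an arbitrary 0.9 cutoff; the paper additionally uses pixel difference and mutual information, and keeps a physician in the loop):

```python
import numpy as np

def ncc(a, b):
    # Normalized correlation coefficient between two frames
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
base = rng.random((64, 64))
# Five nearly identical "slices" plus one abruptly different slice
frames = [base + 0.01 * rng.random((64, 64)) for _ in range(5)]
frames.insert(3, rng.random((64, 64)))

# Mark a frame as "key" when it correlates poorly with its predecessor
keys = [i for i in range(1, len(frames))
        if ncc(frames[i - 1], frames[i]) < 0.9]
print(keys)  # the inserted slice and the frame after it
```

    Frames flagged this way mark anatomy changes along the scan, so segmenting only key frames and propagating results keeps the per-set processing time low.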

  17. Adaptive Unscented Kalman Filter Phase Unwrapping Method and Its Application on Gaofen-3 Interferometric SAR Data.

    PubMed

    Gao, Yandong; Zhang, Shubi; Li, Tao; Chen, Qianfu; Li, Shijin; Meng, Pengfei

    2018-06-02

    Phase unwrapping (PU) is a key step in the reconstruction of digital elevation models (DEMs) and the monitoring of surface deformation from interferometric synthetic aperture radar (SAR, InSAR) data. In this paper, an improved PU method is proposed that combines an amended matrix pencil model, an adaptive unscented Kalman filter (AUKF), an efficient quality-guided strategy based on heapsort, and a circular median filter. PU theory and the existing UKFPU method are reviewed, and the improved method is then presented with emphasis on the AUKF and the circular median filter. The AUKF has been used successfully in other fields, but to the best of our knowledge this is the first time it has been applied to PU of interferometric images. First, the amended matrix pencil model is used to estimate the phase gradient. Then, an AUKF model is used to unwrap the interferometric phase following an efficient quality-guided strategy based on heapsort. Finally, the key results are obtained by filtering with a circular median filter. The proposed method is compared with the minimum cost network flow (MCF), statistical cost network flow (SNAPHU), regularized phase tracking technique (RPTPU), and UKFPU methods using two sets of simulated data and two sets of experimental GF-3 SAR data. The improved method is shown to yield the greatest accuracy in the interferometric phase maps among the methods considered in this paper. Furthermore, the improved method is shown to be the most robust to noise and is thus most suitable for PU of GF-3 SAR data in high-noise and low-coherence regions.
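    Phase unwrapping itself is straightforward in one dimension when the true phase changes by less than π per sample; the difficulty the paper tackles is noisy, low-coherence 2-D data. A 1-D illustration with numpy (synthetic phase ramp, not InSAR data):

```python
import numpy as np

t = np.linspace(0.0, 4.0 * np.pi, 100)
true_phase = 1.5 * t                          # steadily increasing phase
wrapped = np.angle(np.exp(1j * true_phase))   # wrapped into (-pi, pi]
unwrapped = np.unwrap(wrapped)                # restore 2*pi jumps
print(np.allclose(unwrapped, true_phase))     # True
```

    Noise that pushes a sample-to-sample jump past π breaks this simple rule, which is why 2-D methods rely on quality maps, filtering, and statistical models such as the AUKF.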

  18. Encryption and decryption using FPGA

    NASA Astrophysics Data System (ADS)

    Nayak, Nikhilesh; Chandak, Akshay; Shah, Nisarg; Karthikeyan, B.

    2017-11-01

    In this paper, we apply multiple cryptographic methods to a set of data and compare their outputs. The AES and RSA algorithms are used. Using the AES algorithm, an 8-bit input (plain text) is encrypted with a cipher key and the result is displayed on Tera Term (serially). For simulation, a 128-bit input is used and operated on with a 128-bit cipher key to generate the encrypted text. The reverse operations are then performed to obtain the decrypted text. In the RSA algorithm, file handling is used to input the plain text. This text is then operated on to produce the encrypted and decrypted data, which are stored in a file. Finally, the results of the two algorithms are compared.
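    The RSA half of the comparison rests on modular exponentiation, which can be shown end-to-end with toy primes in pure Python (an illustration of the arithmetic only, unrelated to the paper's FPGA implementation; real keys are 2048 bits or more):

```python
# Toy RSA with tiny primes (never use key sizes like this in practice)
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent via modular inverse: 2753

m = 65                     # "plain text" encoded as an integer < n
c = pow(m, e, n)           # encrypt: m^e mod n
m2 = pow(c, d, n)          # decrypt: c^d mod n
print(c, m2)               # 2790 65
```

    The same square-and-multiply structure behind Python's three-argument pow() is what an FPGA implementation pipelines in hardware.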

  19. Research on simplified parametric finite element model of automobile frontal crash

    NASA Astrophysics Data System (ADS)

    Wu, Linan; Zhang, Xin; Yang, Changhai

    2018-05-01

    The modeling method and key technologies of a simplified parametric finite element model for automobile frontal crash are studied in this paper. By establishing the auto-body topological structure, extracting and parameterizing the stiffness properties of the substructures, and choosing appropriate material models for them, a simplified parametric FE model of the M6 car is built. Comparison of the results indicates that the simplified parametric FE model can accurately calculate the automobile crash responses and the deformation of the key substructures, while the simulation time is reduced from 6 hours to 2 minutes.

  20. Research on key technology of planning and design for AC/DC hybrid distribution network

    NASA Astrophysics Data System (ADS)

    Shen, Yu; Wu, Guilian; Zheng, Huan; Deng, Junpeng; Shi, Pengjia

    2018-04-01

    With the increasing demand for DC generation and DC loads and the development of DC technology, integrated AC/DC distribution networks will become an important form of the future distribution network. In this paper, key technologies for the planning and design of AC/DC hybrid distribution networks are proposed, including the selection of AC and DC voltage levels, the design of typical grid structures, and a comprehensive evaluation method for planning schemes. The research results provide ideas and directions for the future development of AC/DC hybrid distribution networks.

  1. Development of an automated analysis system for data from flow cytometric intracellular cytokine staining assays from clinical vaccine trials

    PubMed Central

    Shulman, Nick; Bellew, Matthew; Snelling, George; Carter, Donald; Huang, Yunda; Li, Hongli; Self, Steven G.; McElrath, M. Juliana; De Rosa, Stephen C.

    2008-01-01

    Background Intracellular cytokine staining (ICS) by multiparameter flow cytometry is one of the primary methods for determining T cell immunogenicity in HIV-1 clinical vaccine trials. Data analysis requires considerable expertise and time. The amount of data is quickly increasing as more and larger trials are performed, so there is a critical need for high-throughput methods of data analysis. Methods A web-based flow cytometric analysis system, LabKey Flow, was developed for analysis of data from standardized ICS assays. A gating template was created manually in commercially available flow cytometric analysis software. Using this template, the system automatically compensated and analyzed all data sets. Quality control queries were designed to identify potentially incorrect sample collections. Results Comparison of the semi-automated analysis performed by LabKey Flow with the manual analysis performed using FlowJo software demonstrated excellent concordance (concordance correlation coefficient >0.990). Manual inspection of the analyses performed by LabKey Flow on 8-color ICS data files from several clinical vaccine trials indicates that template gates can be used appropriately for most data sets. Conclusions The semi-automated LabKey Flow analysis system can accurately analyze large ICS data files. Routine use of the system does not require specialized expertise. This high-throughput analysis will provide great utility for rapid evaluation of complex multiparameter flow cytometric measurements collected from large clinical trials. PMID:18615598
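
    The concordance correlation coefficient reported above (Lin's CCC, which penalizes both poor correlation and systematic bias between two methods) is straightforward to compute; a minimal sketch, with illustrative sample values rather than trial data:

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient: agreement between two
    measurement methods (1.0 = perfect concordance)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                 # population variances
    cov = ((x - mx) * (y - my)).mean()        # population covariance
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```

Identical measurements give exactly 1.0; a constant offset between methods pulls the coefficient below 1 even when the correlation is perfect.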

  2. Heterogeneity image patch index and its application to consumer video summarization.

    PubMed

    Dang, Chinh T; Radha, Hayder

    2014-06-01

    Automatic video summarization is indispensable for fast browsing and efficient management of large video libraries. In this paper, we introduce an image feature that we refer to as the heterogeneity image patch (HIP) index. The proposed HIP index provides a new entropy-based measure of the heterogeneity of patches within any picture. By evaluating this index for every frame in a video sequence, we generate a HIP curve for that sequence. We exploit the HIP curve in solving two categories of video summarization applications: key frame extraction and dynamic video skimming. Under the key frame extraction framework, a set of candidate key frames is selected from the abundant video frames based on the HIP curve. A proposed patch-based image dissimilarity measure is then used to create an affinity matrix of these candidates. Finally, a set of key frames is extracted from the affinity matrix using a min-max based algorithm. Under video skimming, we propose a method to measure the distance between a video and its skimmed representation. The video skimming problem is then mapped into an optimization framework and solved by minimizing a HIP-based distance for a set of extracted excerpts. The HIP framework is pixel-based and requires neither semantic information nor complex camera motion estimation. Our simulation results are based on experiments performed on consumer videos and are compared with state-of-the-art methods. The HIP approach is shown to outperform other leading methods while maintaining low complexity.
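
    An entropy-based patch heterogeneity score can be illustrated with a simplified stand-in: the mean Shannon entropy of per-patch intensity histograms. The published HIP index is defined differently; this sketch (patch size and bin count are arbitrary choices) only conveys the idea of scoring a frame by how heterogeneous its patches are.

```python
import numpy as np

def patch_entropy_index(frame, patch=8, bins=16):
    """Mean Shannon entropy of the intensity histograms of
    non-overlapping patches of an 8-bit grayscale frame.
    A HIP-inspired illustration, not the published HIP index."""
    h, w = frame.shape
    entropies = []
    for r in range(0, h - patch + 1, patch):
        for c in range(0, w - patch + 1, patch):
            block = frame[r:r + patch, c:c + patch]
            hist, _ = np.histogram(block, bins=bins, range=(0, 256))
            p = hist / hist.sum()
            p = p[p > 0]                       # drop empty bins
            entropies.append(-(p * np.log2(p)).sum())
    return float(np.mean(entropies))
```

A flat frame scores 0 (every patch has a one-bin histogram), while a noisy frame scores near the maximum log2(bins), so the per-frame score traces a curve over a video sequence analogous in spirit to the HIP curve.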

  3. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft, which will see an increase in electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components are of key importance. DC-DC power converters are power electronics systems typically employed as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed, making use of empirical degradation models and physics-inspired degradation models, with a focus on key components such as electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms is difficult in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  4. Teaching learning methods of an entrepreneurship curriculum.

    PubMed

    Esmi, Keramat; Marzoughi, Rahmatallah; Torkzadeh, Jafar

    2015-10-01

    One of the most significant elements of entrepreneurship curriculum design is the teaching-learning methods, which play a key role in studies of such a curriculum. The teaching method, and the systematic, organized, and logical way of delivering lessons, should be consistent with entrepreneurship goals and contents and should be developed according to the learners' needs. Therefore, the current study aimed to introduce appropriate, modern, and effective methods of teaching entrepreneurship and to validate them. This is a mixed-methods study of the sequential exploratory type, conducted in two stages: a) developing the teaching methods of an entrepreneurship curriculum, and b) validating the developed framework. Data were collected through triangulation (study of documents, investigation of the theoretical background and literature, and semi-structured interviews with key experts). Since the literature on this topic is very rich and the views of the key experts are broad, directed and summative content analysis was used. In the second stage, the qualitative credibility of the research findings was established using qualitative validation criteria (credibility, confirmability, and transferability) and various techniques, and a reliability test was applied to the qualitative part. Quantitative validation of the developed framework was conducted using exploratory and confirmatory factor analysis and Cronbach's alpha. The data were gathered by distributing a three-aspect questionnaire (direct-presentation, interactive, and practical-operational teaching methods) with 29 items among 90 curriculum scholars. The target population was sampled purposively to obtain a representative sample. Results from the exploratory factor analysis showed that a three-factor structure is appropriate for describing the elements of the teaching-learning methods of an entrepreneurship curriculum. The value of the Kaiser-Meyer-Olkin measure of sampling adequacy was 0.72, and Bartlett's test was significant at the 0.0001 level. Except for the internship element, all elements had factor loadings higher than 0.3. The results of the confirmatory factor analysis confirmed the model's appropriateness, and the criteria for qualitative accreditation were acceptable. The developed model can help instructors select an appropriate method of entrepreneurship teaching and verify that the teaching is on the right path. The model is comprehensive, includes the effective teaching methods in entrepreneurship education, and is based on the qualities, conditions, and requirements of higher education institutions in the Iranian cultural environment.
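
    The Cronbach's alpha used in the quantitative validation can be computed directly from an item-score matrix; a minimal sketch (the data shape is assumed for illustration, not taken from the study):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Perfectly consistent items yield alpha = 1, and values above roughly 0.7 are conventionally read as acceptable internal consistency.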

  5. Development of a Nationally Coordinated Evaluation Plan for the Ghana National Strategy for Key Populations

    PubMed Central

    Reynolds, Heidi W; Atuahene, Kyeremeh; Sutherland, Elizabeth; Amenyah, Richard; Kwao, Isaiah Doe; Larbi, Emmanuel Tettey

    2015-01-01

    Objective Just as HIV prevention programs need to be tailored to the local epidemic, so should evaluations be country-owned and country-led to ensure the use of their results in decision making and policy. The objective of this paper is to describe the process undertaken in Ghana to develop a national evaluation plan for the Ghana national strategy for key populations. Methods This was a participatory process involving meetings between the Ghana AIDS Commission (GAC), other partners in Ghana working to prevent HIV among key populations, and MEASURE Evaluation. The process included three two-day, highly structured yet participatory meetings over the course of 12 months, during which participants shared information about ongoing and planned data collection and identified research questions and methods. Results An evaluation plan was prepared to inform stakeholders about which data collection activities need to be prioritized for funding, who would implement each study, the timing of data collection, the research question each data source will help answer, and the analysis methods. The plan discusses various methods that can be used, including a recommended study design drawing on multiple data sources. It includes an evaluation conceptual model, proposed analyses, proposed definitions of independent variables, estimated costs for filling data gaps, roles and responsibilities of stakeholders in carrying out the plan, and considerations for ethics, data sharing, and authorship. Conclusion The experience demonstrates that it is possible to design an evaluation responsive to national strategies and priorities with country leadership, regardless of stakeholders' experience with evaluations. This process may be replicable elsewhere, where stakeholders want to plan and implement an evaluation of a large-scale program at the national or subnational level that is responsive to national priorities and part of a comprehensive monitoring and evaluation system. PMID:26120495

  6. Practical performance of real-time shot-noise measurement in continuous-variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Huang, Peng; Zhou, Yingming; Liu, Weiqi; Zeng, Guihua

    2018-01-01

    In a practical continuous-variable quantum key distribution (CVQKD) system, real-time shot-noise measurement (RTSNM) is an essential procedure for preventing an eavesdropper from exploiting practical security loopholes. However, the performance of this procedure itself has not been analyzed under real-world conditions. We therefore characterize the practical performance of RTSNM and investigate its effects on the CVQKD system. In particular, due to the finite-size effect, the shot-noise measurement at the receiver's side may decrease the precision of parameter estimation and consequently result in a tighter security bound. To mitigate this, we optimize the block size for RTSNM under the ensemble-size limitation to maximize the secure key rate. Moreover, the effect of the finite dynamics of the amplitude modulator in this scheme is studied, and a mitigation method is proposed. Our work indicates the practical performance of RTSNM and provides the real secret key rate achievable under it.

  7. Improved statistical fluctuation analysis for measurement-device-independent quantum key distribution with four-intensity decoy-state method.

    PubMed

    Mao, Chen-Chen; Zhou, Xing-Yu; Zhu, Jian-Rong; Zhang, Chun-Hui; Zhang, Chun-Mei; Wang, Qin

    2018-05-14

    Recently, Zhang et al. [Phys. Rev. A 95, 012333 (2017)] developed a new approach to estimating the failure probability for the decoy-state BB84 QKD system when the finite-size key effect is taken into account; it offers security comparable to the Chernoff bound while resulting in an improved key rate and transmission distance. Building on that work, we extend the approach to measurement-device-independent quantum key distribution (MDI-QKD) and, for the first time, implement it for the four-intensity decoy-state MDI-QKD system. Moreover, by utilizing joint constraints and collective error-estimation techniques, we can markedly increase the performance of practical MDI-QKD systems compared with either three- or four-intensity decoy-state MDI-QKD using Chernoff-bound analysis, and achieve a much higher level of security than schemes applying Gaussian-approximation analysis.
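
    The role of a Chernoff-type bound in finite-size analysis can be illustrated with a toy calculation: given an expected count and a target failure probability, solve for the allowed relative deviation. This sketch uses one standard multiplicative form of the bound, exp(-delta^2 * mu / (2 + delta)), and is far simpler than the estimation procedures in the paper.

```python
import math

def chernoff_deviation(mean, epsilon):
    """Relative deviation delta such that the upper-tail Chernoff bound
    exp(-delta^2 * mean / (2 + delta)) equals the target failure
    probability epsilon (solved by bisection)."""
    target = math.log(1.0 / epsilon)
    lo, hi = 0.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if mid * mid * mean / (2 + mid) < target:
            lo = mid          # bound still looser than epsilon: widen delta
        else:
            hi = mid
    return hi

def finite_size_bounds(mean, epsilon):
    """Symmetric illustrative confidence interval on the true mean."""
    d = chernoff_deviation(mean, epsilon)
    return mean * (1 - d), mean * (1 + d)
```

For a block of a million expected events and a failure probability of 1e-10, the allowed relative deviation is well under one percent, which is why larger blocks yield tighter parameter estimates and better key rates.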

  8. Resolving Environmental Effects of Wind Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinclair, Karin C; DeGeorge, Elise M; Copping, Andrea E.

    Concerns about potential wildlife impacts from land-based and offshore wind energy have created challenges for wind project development. Research is not always adequately supported, and results are not always readily accessible or satisfactorily disseminated, so decisions are often made on the best available information, which may be missing key findings. The potential for high impacts to avian and bat species and to marine mammals has been used by wind project opponents to stop, downsize, or severely delay project development. The global nature of the wind industry, combined with the understanding that many affected species cross national boundaries and in many cases migrate between continents, also points to the need for collaboration at an international level. The International Energy Agency (IEA) Wind Technology Collaboration Programme facilitates coordination on key research issues. IEA Wind Task 34, WREN (Working Together to Resolve Environmental Effects of Wind Energy), is a collaborative forum for sharing lessons gained from field research and modeling, including management methods, wildlife monitoring methods, best practices, study results, and successful approaches to mitigating impacts and addressing the cumulative effects of wind energy on wildlife.

  9. Uncertainty in the analysis of the overall equipment effectiveness on the shop floor

    NASA Astrophysics Data System (ADS)

    Rößler, M. P.; Abele, E.

    2013-06-01

    This article presents an approach that supports transparency regarding the effectiveness of manufacturing equipment by combining fuzzy set theory with overall equipment effectiveness (OEE) analysis. One of the key principles of lean production, and a fundamental task in production optimization projects, is the prior analysis of the current state of a production system using key performance indicators to derive possible future states. The current state of the art in OEE analysis is usually a cumulation of different machine states from decentralized data collection, without consideration of uncertainty. In manual or semi-automated plant data collection systems, the quality of the derived data often varies, leading optimization teams to distorted conclusions about the real optimization potential of manufacturing equipment. The method discussed in this paper helps practitioners obtain more reliable results in the analysis phase and thus better outcomes for optimization projects. The results are discussed in the context of a case study.
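
    The underlying OEE computation is the product of availability, performance, and quality. The sketch below also propagates (min, max) intervals through that product as a crude stand-in for the fuzzy-set treatment discussed in the paper (the numbers in the usage note are illustrative):

```python
def oee(availability, performance, quality):
    """Overall equipment effectiveness: the product of its three factors,
    each expressed as a fraction in [0, 1]."""
    return availability * performance * quality

def oee_interval(a, p, q):
    """Propagate (min, max) uncertainty intervals through the OEE product.
    A simple interval-arithmetic stand-in for a fuzzy-number analysis:
    the product is monotone in each factor, so the endpoints suffice."""
    lo = a[0] * p[0] * q[0]
    hi = a[1] * p[1] * q[1]
    return lo, hi
```

For example, availability 0.9, performance 0.95, and quality 0.99 give an OEE of about 0.846; if manual data collection only pins availability down to the interval (0.85, 0.9), the resulting OEE interval makes the uncertainty in the indicator explicit instead of hiding it in a single number.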

  10. Optical multiple-image hiding based on interference and grating modulation

    NASA Astrophysics Data System (ADS)

    He, Wenqi; Peng, Xiang; Meng, Xiangfeng

    2012-07-01

    We present a method for multiple-image hiding based on an interference-based encryption architecture and grating modulation. Using a modified phase retrieval algorithm, we can separately hide a number of secret images in one arbitrarily preselected host image associated with a set of phase-only masks (POMs), which are regarded as secret keys. Thereafter, a grating modulation operation is introduced to multiplex and store the different POMs in a single key mask, which is then privately assigned to the authorized users. For recovery, after an appropriate demultiplexing process, one can reconstruct the distributions of all the secret keys and then recover the corresponding hidden images with suppressed crosstalk. Computer simulation results are presented to validate the feasibility of our approach.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gehin, Jess C; Godfrey, Andrew T; Evans, Thomas M

    The Consortium for Advanced Simulation of Light Water Reactors (CASL) is developing a collection of methods and software products known as VERA, the Virtual Environment for Reactor Applications, including a core simulation capability called VERA-CS. A key milestone for this endeavor is to validate VERA against measurements from operating nuclear power reactors. The first step in validation against plant data is to determine the ability of VERA to accurately simulate the initial startup physics tests for Watts Bar Nuclear Power Station, Unit 1 (WBN1), cycle 1. VERA-CS calculations were performed with the Insilico code developed at ORNL, using cross section processing from the SCALE system and the transport capabilities of the Denovo transport code with the SPN method. The calculations used ENDF/B-VII.0 cross sections in 252 groups (collapsed to 23 groups for the 3D transport solution). The key results of the comparison of calculations with measurements include initial criticality, control rod critical configurations, control rod worth, differential boron worth, and the isothermal temperature reactivity coefficient (ITC). The VERA results for these parameters show good agreement with measurements, with the exception of the ITC, which requires additional investigation. Results are also compared to those obtained with Monte Carlo methods and a current industry core simulator.

  12. A method for modelling peak signal statistics on a mobile satellite transponder

    NASA Technical Reports Server (NTRS)

    Bilodeau, Andre; Lecours, Michel; Pelletier, Marcel; Delisle, Gilles Y.

    1990-01-01

    A simulation method is proposed for modeling the duration and energy content of signal peaks on a mobile communication satellite transponder operating in Frequency Division Multiple Access (FDMA) mode. It provides an estimate of the power peaks for a system in which the channels are modeled as band-limited Gaussian noise, taken as a reasonable representation of Amplitude Companded Single Sideband (ACSSB), Minimum Shift Keying (MSK), or Phase Shift Keying (PSK) modulated signals. The simulation results show that, under this hypothesis, the signal power levels exceeded for 10 percent, 1 percent, and 0.1 percent of the time are well described by a Rayleigh law, and that the peak durations are extremely short and inversely proportional to the total FDM system bandwidth.
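
    The Rayleigh-law behavior of the envelope of Gaussian noise can be checked numerically: for an envelope normalized by its RMS value, the level exceeded a fraction p of the time is sqrt(-ln p). A Monte Carlo sketch, in which white complex Gaussian samples stand in for the band-limited FDM channels:

```python
import numpy as np

def envelope_exceedance(n=200_000, seed=0):
    """Simulate the envelope of complex Gaussian noise and return the
    levels exceeded 10%, 1% and 0.1% of the time, normalized by the
    RMS envelope. For a Rayleigh law these levels are sqrt(-ln p)."""
    rng = np.random.default_rng(seed)
    i = rng.standard_normal(n)           # in-phase component
    q = rng.standard_normal(n)           # quadrature component
    env = np.hypot(i, q)                 # envelope = sqrt(i^2 + q^2)
    rms = np.sqrt(np.mean(env ** 2))
    return np.quantile(env / rms, [0.9, 0.99, 0.999])
```

The simulated exceedance levels land close to the Rayleigh predictions of about 1.52, 2.15, and 2.63 RMS units, consistent with the law reported above.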

  13. Multiple-image encryption via lifting wavelet transform and XOR operation based on compressive ghost imaging scheme

    NASA Astrophysics Data System (ADS)

    Li, Xianye; Meng, Xiangfeng; Yang, Xiulun; Wang, Yurong; Yin, Yongkai; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi

    2018-03-01

    A multiple-image encryption method via lifting wavelet transform (LWT) and XOR operation is proposed, based on a row-scanning compressive ghost imaging scheme. In the encryption process, a scrambling operation is applied to the sparse images transformed by LWT, the XOR operation is then performed on the scrambled images, and the resulting XOR images are compressed by row-scanning compressive ghost imaging, through which the ciphertext images can be detected by bucket detector arrays. During decryption, a participant who possesses the correct key-group can successfully reconstruct the corresponding plaintext image through measurement key regeneration, compression algorithm reconstruction, the XOR operation, sparse image recovery, and inverse LWT (iLWT). Theoretical analysis and numerical simulations validate the feasibility of the proposed method.
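
    The XOR stage of such a scheme is easy to isolate: XOR with a key stream is its own inverse, which is what lets the key-holding participant undo it during decryption. A sketch of just that stage (the LWT, scrambling, and ghost-imaging stages are omitted, and the key-stream generator is an illustrative choice):

```python
import numpy as np

def make_key_stream(shape, seed):
    """Pseudo-random byte key stream; the seed plays the role of a key."""
    rng = np.random.default_rng(seed)
    return rng.integers(0, 256, size=shape, dtype=np.uint8)

def xor_cipher(image, key_stream):
    """XOR an 8-bit image with a key stream. Applying the same
    operation twice with the same key recovers the plaintext."""
    return np.bitwise_xor(image, key_stream)
```

Because XOR is an involution, encryption and decryption are the same function, so only the key stream (here, the seed) needs to be kept secret.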

  14. Parameterization of aquatic ecosystem functioning and its natural variation: Hierarchical Bayesian modelling of plankton food web dynamics

    NASA Astrophysics Data System (ADS)

    Norros, Veera; Laine, Marko; Lignell, Risto; Thingstad, Frede

    2017-10-01

    Methods for extracting empirically and theoretically sound parameter values are urgently needed in aquatic ecosystem modelling to describe key flows and their variation in the system. Here, we compare three Bayesian formulations for mechanistic model parameterization that differ in their assumptions about the variation in parameter values between various datasets: 1) global analysis - no variation, 2) separate analysis - independent variation and 3) hierarchical analysis - variation arising from a shared distribution defined by hyperparameters. We tested these methods, using computer-generated and empirical data, coupled with simplified and reasonably realistic plankton food web models, respectively. While all methods were adequate, the simulated example demonstrated that a well-designed hierarchical analysis can result in the most accurate and precise parameter estimates and predictions, due to its ability to combine information across datasets. However, our results also highlighted sensitivity to hyperparameter prior distributions as an important caveat of hierarchical analysis. In the more complex empirical example, hierarchical analysis was able to combine precise identification of parameter values with reasonably good predictive performance, although the ranking of the methods was less straightforward. We conclude that hierarchical Bayesian analysis is a promising tool for identifying key ecosystem-functioning parameters and their variation from empirical datasets.

  15. Markerless identification of key events in gait cycle using image flow.

    PubMed

    Vishnoi, Nalini; Duric, Zoran; Gerber, Naomi Lynn

    2012-01-01

    Gait analysis has been an interesting area of research for several decades. In this paper, we propose image-flow-based methods to compute the motion and velocities of different body segments automatically, using a single inexpensive video camera. We then identify and extract different events of the gait cycle (double-support, mid-swing, toe-off and heel-strike) from video images. Experiments were conducted in which four walking subjects were captured from the sagittal plane. Automatic segmentation was performed to isolate the moving body from the background. The head excursion and the shank motion were then computed to identify the key frames corresponding to different events in the gait cycle. Our approach does not require calibrated cameras or special markers to capture movement. We have also compared our method with the Optotrak 3D motion capture system and found our results in good agreement with the Optotrak results. The development of our method has potential use in the markerless and unencumbered video capture of human locomotion. Monitoring gait in homes and communities provides a useful application for the aged and the disabled. Our method could potentially be used as an assessment tool to determine gait symmetry or to establish the normal gait pattern of an individual.

  16. Multi-Criterion Preliminary Design of a Tetrahedral Truss Platform

    NASA Technical Reports Server (NTRS)

    Wu, K. Chauncey

    1995-01-01

    An efficient method is presented for multi-criterion preliminary design and demonstrated for a tetrahedral truss platform. The present method requires minimal analysis effort and permits rapid estimation of optimized truss behavior for preliminary design. A 14-m-diameter, 3-ring truss platform represents a candidate reflector support structure for space-based science spacecraft. The truss members are divided into 9 groups by truss ring and position. Design variables are the cross-sectional area of all members in a group, and are either 1, 3 or 5 times the minimum member area. Non-structural mass represents the node and joint hardware used to assemble the truss structure. Taguchi methods are used to efficiently identify key points in the set of Pareto-optimal truss designs. Key points identified using Taguchi methods are the maximum frequency, minimum mass, and maximum frequency-to-mass ratio truss designs. Low-order polynomial curve fits through these points are used to approximate the behavior of the full set of Pareto-optimal designs. The resulting Pareto-optimal design curve is used to predict frequency and mass for optimized trusses. Performance improvements are plotted in frequency-mass (criterion) space and compared to results for uniform trusses. Application of constraints to frequency and mass and sensitivity to constraint variation are demonstrated.
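
    The low-order curve fit through key Pareto-optimal points can be sketched with an ordinary polynomial least-squares fit; the mass and frequency values below are made up for illustration, not taken from the truss study.

```python
import numpy as np

def pareto_fit(mass_points, freq_points, order=2):
    """Fit a low-order polynomial through a few key Pareto-optimal
    designs (e.g. minimum-mass, maximum frequency-to-mass ratio, and
    maximum-frequency points) to approximate the full frequency-vs-mass
    trade-off curve."""
    return np.polynomial.polynomial.Polynomial.fit(
        mass_points, freq_points, deg=order)

# hypothetical key-point data: mass in kg, fundamental frequency in Hz
key_mass = [100.0, 200.0, 300.0]
key_freq = [1.0, 1.8, 2.2]
curve = pareto_fit(key_mass, key_freq)
```

With three key points and a quadratic, the fit interpolates them exactly; evaluating `curve` at intermediate masses then predicts frequency for optimized trusses without further analysis, which is the economy the Taguchi-based approach is after.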

  17. Analysis on pseudo excitation of random vibration for structure of time flight counter

    NASA Astrophysics Data System (ADS)

    Wu, Qiong; Li, Dapeng

    2015-03-01

    Traditional computing methods are inefficient for obtaining the key dynamical parameters of complicated structures. The pseudo-excitation method (PEM) is an effective method for random vibration calculations. Because of the complicated, coupled random vibration during rocket or shuttle launch, a new staged white-noise mathematical model is deduced according to the practical launch environment. This model is applied with the PEM to the specific structure of a time-of-flight counter (ToFC). The power spectral density responses and the relevant dynamic characteristic parameters of the ToFC are obtained at the flight acceptance test level. Considering the stiffness of the fixture structure, random vibration experiments are conducted in three directions for comparison with the revised PEM. The experimental results show that the structure can bear the random vibration caused by launch without damage, and the key dynamical parameters of the ToFC are obtained. The revised PEM agrees with the random vibration experiments in dynamical parameters and responses, as the comparative results show; the maximum error is within 9%. The sources of error are analyzed to improve the reliability of the calculation. This research provides an effective method for computing the dynamical characteristic parameters of complicated structures during rocket or shuttle launch.
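
    The core of the pseudo-excitation method can be shown on a single-degree-of-freedom oscillator: replace the random input of PSD S_ff(w) with the deterministic pseudo excitation sqrt(S_ff)*exp(i*w*t), pass it through the frequency response, and the squared magnitude of the pseudo response is the response PSD, |H(w)|^2 * S_ff(w). The parameter values below are illustrative, not the ToFC structure.

```python
import numpy as np

def sdof_response_psd(omega, s_ff, m=1.0, c=0.2, k=100.0):
    """Pseudo-excitation method for a single-DOF oscillator
    m*y'' + c*y' + k*y = f(t): form the pseudo excitation
    sqrt(S_ff)*exp(i*w*t), apply H(w) = 1/(k - m*w^2 + i*c*w),
    and recover the response PSD as |pseudo response|^2."""
    h = 1.0 / (k - m * omega ** 2 + 1j * c * omega)
    pseudo_f = np.sqrt(s_ff)       # the exp(i*w*t) factor cancels in |.|^2
    pseudo_y = h * pseudo_f
    return np.abs(pseudo_y) ** 2   # equals |H|^2 * S_ff
```

For a white-noise input the response PSD peaks at the resonance sqrt(k/m); the gain of the PEM shows up in multi-DOF problems, where one deterministic harmonic analysis per frequency replaces the full spectral matrix computation.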

  18. A method for data-driven exploration to pinpoint key features in medical data and facilitate expert review

    PubMed Central

    Juhlin, Kristina; Norén, G. Niklas

    2017-01-01

    Purpose To develop a method for data-driven exploration in pharmacovigilance and illustrate its use by identifying the key features of individual case safety reports related to medication errors. Methods We propose vigiPoint, a method that contrasts the relative frequency of covariate values in a data subset of interest to those within one or more comparators, utilizing odds ratios with adaptive statistical shrinkage. Nested analyses identify higher-order patterns, and permutation analysis is employed to protect against chance findings. For illustration, a total of 164,000 adverse event reports related to medication errors were characterized and contrasted to the other 7,833,000 reports in VigiBase, the WHO global database of individual case safety reports, as of May 2013. The initial scope included 2000 features, such as patient age groups, reporter qualifications, and countries of origin. Results vigiPoint highlighted 109 key features of medication error reports. The most prominent were that the vast majority of medication error reports were from the United States (89% compared with 49% for other reports in VigiBase); that the majority of reports were sent by consumers (53% vs 17% for other reports); that pharmacists (12% vs 5.3%) and lawyers (2.9% vs 1.5%) were overrepresented; and that there were more medication error reports than expected for patients aged 2-11 years (10% vs 5.7%), particularly in Germany (16%). Conclusions vigiPoint effectively identified key features of medication error reports in VigiBase. More generally, it reduces lead times for analysis and ensures reproducibility and transparency. An important next step is to evaluate its use in other data. PMID:28815800
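
    A shrinkage-adjusted odds ratio of the kind used for this screening can be sketched as follows. The constant `k` and the exact form are assumptions for illustration; vigiPoint's actual shrinkage differs in detail.

```python
import math

def shrunk_log_or(x1, n1, x0, n0, k=0.5):
    """Shrinkage-adjusted log2 odds ratio comparing the frequency of a
    feature in a subset of interest (x1 of n1 reports) with a comparator
    (x0 of n0 reports). The pseudo-count k pulls estimates for rare
    features toward 0, damping spurious extremes."""
    odds1 = (x1 + k) / (n1 - x1 + k)
    odds0 = (x0 + k) / (n0 - x0 + k)
    return math.log2(odds1 / odds0)
```

A feature equally frequent in subset and comparator scores 0; strongly overrepresented features (like the United States origin in the results above) score well above 0, and the pseudo-count keeps features seen only a handful of times from topping the ranking.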

  19. Participatory Research for Chronic Disease Prevention in Inuit Communities

    ERIC Educational Resources Information Center

    Gittelsohn, Joel; Roache, Cindy; Kratzmann, Meredith; Reid, Rhonda; Ogina, Julia; Sharma, Sangita

    2010-01-01

    Objective: To develop a community-based chronic disease prevention program for Inuit in Nunavut, Canada. Methods: Stakeholders contributed to intervention development through formative research [in-depth interviews (n = 45), dietary recalls (n = 42)], community workshops, group feedback and implementation training. Results: Key cultural themes…

  20. Numerical analysis of a red blood cell flowing through a thin micropore.

    PubMed

    Omori, Toshihiro; Hosaka, Haruki; Imai, Yohsuke; Yamaguchi, Takami; Ishikawa, Takuji

    2014-01-01

    Red blood cell (RBC) deformability plays a key role in microcirculation, especially in vessels that have diameters even smaller than the nominal cell size. In this study, we numerically investigate the dynamics of an RBC in a thin micropore. The RBC is modeled as a capsule with a thin hyperelastic membrane. In a numerical simulation, we employ a boundary element method for fluid mechanics and a finite element method for membrane mechanics. The resulting RBC deformation towards the flow direction is suppressed considerably by increased cytoplasm viscosity, whereas the gap between the cell membrane and solid wall becomes smaller with higher cytoplasm viscosity. We also measure the transit time of the RBC and find that nondimensional transit time increases nonlinearly with respect to the viscosity ratio, whereas it is invariant to the capillary number. In conclusion, cytoplasmic viscosity plays a key role in the dynamics of an RBC in a thin pore. The results of this study will be useful for designing a microfluidic device to measure cytoplasmic viscosity.

  1. Review of Railgun Modeling Techniques: The Computation of Railgun Force and Other Key Factors

    NASA Astrophysics Data System (ADS)

    Eckert, Nathan James

Currently, railgun force modeling either uses the simple "railgun force equation" or finite element methods. It is proposed here that a middle ground exists that does not require the solution of partial differential equations, is more readily implemented than finite element methods, and is more accurate than the traditional force equation. To develop this method, it is necessary to examine the core railgun factors: power supply mechanisms, the distribution of current in the rails and in the projectile which slides between them (called the armature), the magnetic field created by the current flowing through these rails, the inductance gradient (a key factor in simplifying railgun analysis, referred to as L'), the resultant Lorentz force, and the heating which accompanies this action. Common power supply technologies are investigated, and the shapes of their current pulses are modeled. The main causes of current concentration are described, and a rudimentary method for computing current distribution in solid rails and a rectangular armature is shown to have promising accuracy with respect to outside finite element results. The magnetic field is modeled with two methods using the Biot-Savart law, and generally good agreement is obtained with respect to finite element methods (5.8% error on average); to obtain this agreement, a factor of 2 is added to the original formulation after a consistent offset from the FEM results was observed. Three inductance gradient calculations are assessed, and though all agree with FEM results, the Kerrisk method and a regression analysis method developed by Murugan et al. (referred to as the LRM here) perform the best. Six railgun force computation methods are investigated, including the traditional railgun force equation, an equation produced by Waindok and Piekielny, and four methods inspired by the work of Xu et al. 
Overall, good agreement between the models and outside data is found, but each model's accuracy varies significantly between comparisons. Lastly, an approximation of the temperature profile in railgun rails originally presented by McCorkle and Bahder is replicated. In total, this work describes railgun technology and moderately complex railgun modeling methods, but is inconclusive about the presence of a middle-ground modeling method.
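The Biot-Savart building block used in the rail-field models above can be sketched for a single straight current-carrying segment. The geometry and variable names are my own, and the abstract's empirical factor of 2 is deliberately not applied here:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def b_field_segment(current, z1, z2, d):
    """Magnitude of the magnetic field at perpendicular distance d from a
    straight segment lying on the z-axis from z1 to z2, evaluated in the
    plane z = 0 (the standard Biot-Savart result for a finite wire)."""
    s1 = z1 / math.hypot(z1, d)
    s2 = z2 / math.hypot(z2, d)
    return MU0 * current / (4 * math.pi * d) * (s2 - s1)

# Sanity check: a very long segment should approach the infinite-wire
# field mu0*I/(2*pi*d).
b_long = b_field_segment(1000.0, -1e6, 1e6, 0.01)
b_inf = MU0 * 1000.0 / (2 * math.pi * 0.01)
```

Summing such segment contributions over a discretized rail pair is the kind of middle-ground field computation the review describes, without solving partial differential equations.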

  2. Key-Generation Algorithms for Linear Piece In Hand Matrix Method

    NASA Astrophysics Data System (ADS)

    Tadaki, Kohtaro; Tsujii, Shigeo

The linear Piece In Hand (PH, for short) matrix method with random variables was proposed in our former work. It is a general prescription applicable to any type of multivariate public-key cryptosystem (MPKC) for the purpose of enhancing its security. We showed experimentally that the linear PH matrix method with random variables can indeed enhance the security of HFE, one of the major variants of multivariate public-key cryptosystems, against the Gröbner basis attack. In 1998 Patarin, Goubin, and Courtois introduced the plus method as a general prescription which, like the linear PH matrix method with random variables, aims to enhance the security of any given MPKC. In this paper we prove the equivalence between the plus method and the primitive linear PH matrix method, which was introduced in our previous work to explain the notion of the PH matrix method in an illustrative manner, and not for practical use in enhancing the security of a given MPKC. Based on this equivalence, we show that the linear PH matrix method with random variables has a substantial advantage over the plus method with respect to security enhancement. In the linear PH matrix method with random variables, three matrices, including the PH matrix, play a central role in the secret key and public key. In this paper, we clarify how to generate these matrices and present two probabilistic polynomial-time algorithms for doing so. The second one in particular has a concise form, and is obtained as a byproduct of the proof of the equivalence between the plus method and the primitive linear PH matrix method.

  3. Secure Cryptographic Key Management System (CKMS) Considerations for Smart Grid Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T; Aldridge, Hal

    2011-01-01

In this paper, we examine some unique challenges associated with key management in the Smart Grid and concomitant research initiatives: 1) effectively model security requirements and their implementations, and 2) manage keys and key distribution for very large scale deployments such as Smart Meters over a long period of performance. This will set the stage to: 3) develop innovative, low cost methods to protect keying material, and 4) provide high assurance authentication services. We will present our perspective on key management and will discuss some key issues within the life cycle of a cryptographic key designed to achieve the following: 1) control systems designed, installed, operated, and maintained to survive an intentional cyber assault with no loss of critical function, and 2) widespread implementation of methods for secure communication between remote access devices and control centers that are scalable and cost-effective to deploy.

  4. Hybrid Cryptosystem Using Tiny Encryption Algorithm and LUC Algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, Dian; Sharif, Amer; Jaysilen; Andri Budiman, Mohammad

    2018-01-01

Security is a very important issue in data transmission, and there are many methods for making files more secure. One of these methods is cryptography: securing a file by transforming it into a hidden code that conceals the original content, so that anyone without the key cannot decrypt the hidden code to read the original file. Among the many approaches used in cryptography is the hybrid cryptosystem, which uses a symmetric algorithm to secure the file and an asymmetric algorithm to secure the symmetric algorithm's key. In this research, the TEA algorithm is used as the symmetric algorithm and the LUC algorithm as the asymmetric algorithm. The system is tested by encrypting and decrypting a file with the TEA algorithm and using the LUC algorithm to encrypt and decrypt the TEA key. The result of this research is that when the TEA algorithm encrypts the file, the ciphertext consists of ASCII (American Standard Code for Information Interchange) characters rendered as hexadecimal numbers, and the ciphertext size increases by sixteen bytes for every eight characters added to the plaintext length.
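The symmetric half of the scheme is the standard 32-round TEA block cipher, which can be sketched directly; the file padding, the LUC key wrapping, and the hexadecimal output formatting described in the abstract are omitted here:

```python
DELTA = 0x9E3779B9   # TEA's key-schedule constant, floor(2^32 / golden ratio)
MASK = 0xFFFFFFFF    # keep arithmetic in 32-bit words

def tea_encrypt(block, key):
    """Encrypt one 64-bit block (two 32-bit words) with a 128-bit key
    (four 32-bit words) using the standard 32-round TEA."""
    v0, v1 = block
    s = 0
    for _ in range(32):
        s = (s + DELTA) & MASK
        v0 = (v0 + (((v1 << 4) + key[0]) ^ (v1 + s) ^ ((v1 >> 5) + key[1]))) & MASK
        v1 = (v1 + (((v0 << 4) + key[2]) ^ (v0 + s) ^ ((v0 >> 5) + key[3]))) & MASK
    return v0, v1

def tea_decrypt(block, key):
    """Invert tea_encrypt by running the rounds in reverse order."""
    v0, v1 = block
    s = (DELTA * 32) & MASK
    for _ in range(32):
        v1 = (v1 - (((v0 << 4) + key[2]) ^ (v0 + s) ^ ((v0 >> 5) + key[3]))) & MASK
        v0 = (v0 - (((v1 << 4) + key[0]) ^ (v1 + s) ^ ((v1 >> 5) + key[1]))) & MASK
        s = (s - DELTA) & MASK
    return v0, v1

key = (0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210)  # example key only
plain = (0xDEADBEEF, 0x0BADF00D)
cipher = tea_encrypt(plain, key)
```

In a hybrid scheme such as the one studied, the 128-bit `key` tuple would itself be encrypted with the asymmetric LUC algorithm before transmission.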

  5. A modular modulation method for achieving increases in metabolite production.

    PubMed

    Acerenza, Luis; Monzon, Pablo; Ortega, Fernando

    2015-01-01

    Increasing the production of overproducing strains represents a great challenge. Here, we develop a modular modulation method to determine the key steps for genetic manipulation to increase metabolite production. The method consists of three steps: (i) modularization of the metabolic network into two modules connected by linking metabolites, (ii) change in the activity of the modules using auxiliary rates producing or consuming the linking metabolites in appropriate proportions and (iii) determination of the key modules and steps to increase production. The mathematical formulation of the method in matrix form shows that it may be applied to metabolic networks of any structure and size, with reactions showing any kind of rate laws. The results are valid for any type of conservation relationships in the metabolite concentrations or interactions between modules. The activity of the module may, in principle, be changed by any large factor. The method may be applied recursively or combined with other methods devised to perform fine searches in smaller regions. In practice, it is implemented by integrating to the producer strain heterologous reactions or synthetic pathways producing or consuming the linking metabolites. The new procedure may contribute to develop metabolic engineering into a more systematic practice. © 2015 American Institute of Chemical Engineers.

  6. Developing and applying the adverse outcome pathway ...

    EPA Pesticide Factsheets

To support a paradigm shift in regulatory toxicology testing and risk assessment, the Adverse Outcome Pathway (AOP) concept has recently been proposed. This concept is similar to that for Mode of Action (MOA), describing a sequence of measurable key events triggered by a molecular initiating event in which a stressor interacts with a biological target. The resulting cascade of key events includes molecular, cellular, structural and functional changes in biological systems, resulting in a measurable adverse outcome. Thereby, an AOP ideally provides information relevant to chemical structure-activity relationships as a basis to predict effects for structurally similar compounds. AOPs could potentially also form the basis for qualitative and quantitative predictive modeling of the human adverse outcome resulting from molecular initiating or other key events for which higher-throughput testing methods are available or can be developed. A variety of cellular and molecular processes are known to be critical to normal function of the central (CNS) and peripheral nervous systems (PNS). Because of the biological and functional complexity of the CNS and PNS, it has been challenging to establish causative links and quantitative relationships between key events that comprise the pathways leading from chemical exposure to an adverse outcome in the nervous system. Following introduction of principles of the description and assessment of MOA and AOPs, examples of adverse out

  7. Prefixed-threshold real-time selection method in free-space quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Wenyuan; Xu, Feihu; Lo, Hoi-Kwong

    2018-03-01

    Free-space quantum key distribution allows two parties to share a random key with unconditional security, between ground stations, between mobile platforms, and even in satellite-ground quantum communications. Atmospheric turbulence causes fluctuations in transmittance, which further affect the quantum bit error rate and the secure key rate. Previous postselection methods to combat atmospheric turbulence require a threshold value determined after all quantum transmission. In contrast, here we propose a method where we predetermine the optimal threshold value even before quantum transmission. Therefore, the receiver can discard useless data immediately, thus greatly reducing data storage requirements and computing resources. Furthermore, our method can be applied to a variety of protocols, including, for example, not only single-photon BB84 but also asymptotic and finite-size decoy-state BB84, which can greatly increase its practicality.
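The prefixed-threshold idea can be illustrated with a toy post-selection over simulated channel transmittances. The log-normal turbulence model and the threshold value below are illustrative assumptions, not the paper's optimized choice:

```python
import random

def prefixed_threshold_select(transmittances, eta_threshold):
    """Keep only the time slots whose channel transmittance meets a
    threshold fixed *before* transmission, so sub-threshold data can be
    discarded on the fly instead of being stored for post-processing."""
    return [eta for eta in transmittances if eta >= eta_threshold]

random.seed(0)
# Toy turbulence model: log-normally fluctuating transmittance, clipped to [0, 1].
samples = [min(1.0, random.lognormvariate(-2.0, 0.5)) for _ in range(10000)]
kept = prefixed_threshold_select(samples, 0.15)
retained_fraction = len(kept) / len(samples)
```

The retained slots have a higher average transmittance than the raw stream, which is what improves the quantum bit error rate of the post-selected data; choosing the threshold optimally before transmission is the paper's contribution and is not modeled here.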

  8. Human action recognition based on spatial-temporal descriptors using key poses

    NASA Astrophysics Data System (ADS)

    Hu, Shuo; Chen, Yuxin; Wang, Huaibao; Zuo, Yaqing

    2014-11-01

Human action recognition is an important area of pattern recognition today due to its direct application in settings such as surveillance and virtual reality. In this paper, a simple and effective human action recognition method is presented based on the key poses of the human silhouette and a spatio-temporal feature. Firstly, the contour points of the human silhouette are extracted, and the key poses are learned by K-means clustering based on the Euclidean distance between each contour point and the centre point of the human silhouette; the type of each action is then labeled for later matching. Secondly, we obtain the trajectory of the centre point across frames and create a spatio-temporal feature value, denoted W, to describe the motion direction and speed of each action. The value W contains the location and temporal order of each point on the trajectory. Finally, the matching stage is performed by comparing the key poses and W between training sequences and test sequences; the nearest-neighbor sequence is found and its label supplies the final result. Experiments on the publicly available Weizmann datasets show the proposed method improves accuracy by distinguishing ambiguous poses and increases suitability for real-time applications by reducing the computational cost.
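The key-pose learning step can be sketched with a minimal K-means over contour-distance descriptors. The descriptors below are synthetic and the centroids are seeded from the data for determinism, so this illustrates the clustering idea rather than the authors' exact pipeline:

```python
import numpy as np

def kmeans(X, init_centroids, iters=20):
    """Plain K-means: alternate nearest-centroid assignment and centroid
    update. X is (n_samples, n_features); init_centroids is (k, n_features)."""
    centroids = init_centroids.astype(float).copy()
    for _ in range(iters):
        # Distance from every sample to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

rng = np.random.default_rng(42)
# Synthetic contour-to-centre distance descriptors for two distinct poses:
pose_a = rng.normal(1.0, 0.05, size=(30, 16))   # e.g. "arms down" frames
pose_b = rng.normal(2.0, 0.05, size=(30, 16))   # e.g. "arms raised" frames
X = np.vstack([pose_a, pose_b])
labels, centroids = kmeans(X, X[[0, 30]])        # seed one centroid per pose
```

In the paper's setting, each learned centroid would be a key pose, and the cluster label sequence of a test video is what feeds the nearest-neighbor matching stage.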

  9. Academic Primer Series: Eight Key Papers about Education Theory

    PubMed Central

    Gottlieb, Michael; Boysen-Osborn, Megan; Chan, Teresa M.; Krzyzaniak, Sara M.; Pineda, Nicolas; Spector, Jordan; Sherbino, Jonathan

    2017-01-01

    Introduction Many teachers adopt instructional methods based on assumptions of best practices without attention to or knowledge of supporting education theory. Familiarity with a variety of theories informs education that is efficient, strategic, and evidence-based. As part of the Academic Life in Emergency Medicine Faculty Incubator Program, a list of key education theories for junior faculty was developed. Methods A list of key papers on theories relevant to medical education was generated using an expert panel, a virtual community of practice synthetic discussion, and a social media call for resources. A three-round, Delphi-informed voting methodology including novice and expert educators produced a rank order of the top papers. Results These educators identified 34 unique papers. Eleven papers described the general use of education theory, while 23 papers focused on a specific theory. The top three papers on general education theories and top five papers on specific education theory were selected and summarized. The relevance of each paper for junior faculty and faculty developers is also presented. Conclusion This paper presents a reading list of key papers for junior faculty in medical education roles. Three papers about general education theories and five papers about specific educational theories are identified and annotated. These papers may help provide foundational knowledge in education theory to inform junior faculty teaching practice. PMID:28210367

  10. Security enhancement of optical encryption based on biometric array keys

    NASA Astrophysics Data System (ADS)

    Yan, Aimin; Wei, Yang; Zhang, Jingtao

    2018-07-01

A novel optical image encryption method is proposed using a Dammann grating and biometric array keys. The Dammann grating is utilized to create a finite 2D uniform-intensity spot array. In encryption, a fingerprint array is used as the private encryption key, and an original image is encrypted by a scanning Fresnel zone plate array. Encrypted signals are processed by an optical coherent heterodyne detection system. Biometric array keys and optical scanning cryptography are integrated with each other to greatly enhance information security. Numerical simulations are performed to demonstrate the feasibility and validity of this method. Analyses of key sensitivity and resistance against possible attacks are provided.

  11. Expanding Access to a New, More Affordable Levonorgestrel Intrauterine System in Kenya: Service Delivery Costs Compared With Other Contraceptive Methods and Perspectives of Key Opinion Leaders

    PubMed Central

    Rademacher, Kate H; Solomon, Marsden; Brett, Tracey; Bratt, John H; Pascual, Claire; Njunguru, Jesse; Steiner, Markus J

    2016-01-01

ABSTRACT Background: The levonorgestrel intrauterine system (LNG IUS) is one of the most effective forms of contraception and offers important non-contraceptive health benefits. However, it is not widely available in developing countries, largely due to the high price of existing products. Medicines360 plans to introduce its new, more affordable LNG IUS in Kenya. The public-sector transfer price will vary by volume between US$12 and US$16 per unit; for an order of 100,000 units, it will be approximately US$15 per unit. Methods: We calculated the direct service delivery cost per couple-year of protection (CYP) of various family planning methods. The model includes the costs of contraceptive commodities, consumable supplies, instruments per client visit, and direct labor for counseling, insertion, removal, and resupply, if required. The model does not include costs of demand creation or training. We conducted interviews with key opinion leaders in Kenya to identify considerations for scale-up of a new LNG IUS, including strategies to overcome barriers that have contributed to low uptake of the copper intrauterine device. Results: The direct service delivery cost of Medicines360's LNG IUS per CYP compares favorably with other contraceptive methods commonly procured for public-sector distribution in Kenya. The cost is slightly lower than that of the 3-month contraceptive injectable, which is currently the most popular method in Kenya. Almost all key opinion leaders agreed that introducing a more affordable LNG IUS could increase demand and uptake of the method. They thought that women seeking the product's non-contraceptive health benefits would be a key market segment, and most agreed that the reduced menstrual bleeding associated with the method would likely be viewed as an advantage. 
The key opinion leaders indicated that myths and misconceptions among providers and clients about IUDs must be addressed, and that demand creation and provider training should be prioritized. Conclusion: Introducing a new, more affordable LNG IUS product could help expand choice for women in Kenya and increase use of long-acting reversible contraception. Further evaluation is needed to identify the full costs required for introduction—including the cost of demand creation—as well as research among potential and actual LNG IUS users, their partners, and health care providers to help inform scale-up of the method. PMID:27540128
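The cost comparison in the study reduces to a direct cost-per-CYP calculation, which can be sketched as follows. All input figures except the approximately US$15 transfer price are hypothetical placeholders, not the study's data:

```python
def cost_per_cyp(commodity, supplies, labor, cyp_per_unit):
    """Direct service-delivery cost per couple-year of protection (CYP):
    the total direct cost of providing one unit of a method, divided by
    the years of protection that unit yields."""
    return (commodity + supplies + labor) / cyp_per_unit

# Hypothetical inputs: ~US$15 LNG IUS transfer price (from the abstract),
# placeholder supply/labor costs, and an assumed 4.8 CYP per insertion.
lng_ius = cost_per_cyp(commodity=15.0, supplies=2.0, labor=3.0, cyp_per_unit=4.8)

# Placeholder 3-month injectable: ~0.25 CYP per dose, so resupply labor and
# commodity costs recur four times per year of protection.
injectable = cost_per_cyp(commodity=1.0, supplies=0.5, labor=1.0, cyp_per_unit=0.25)
```

Because a long-acting method spreads a one-time cost over several years of protection, its cost per CYP can undercut a cheap-per-dose method, which is the direction of the result reported in the abstract.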

  12. Development and Application of the Key Technologies for the Quality Control and Inspection of National Geographical Conditions Survey Products

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Zhang, L.; Ma, W.; Zhang, P.; Zhao, T.

    2018-04-01

The First National Geographical Conditions Survey is a foundational task for dynamically tracking the basic state of nature, ecology, and human activity on the earth's surface, and it is a brand-new mapping and geographic information engineering project. To ensure comprehensive, true, and accurate survey results and to achieve the quality management targets (a 100 % pass rate and a first-time yield above 80 %), quality control and result inspection must be carried out for the national geographical conditions survey on a national scale. To meet these targets, this paper develops a "five-in-one" quality control methodology comprising a quality control system for the national geographical conditions survey, a quality inspection technology system, a quality evaluation system, a quality inspection information management system, and linked national quality control institutions. The approach addresses the survey's large scale, wide coverage, many undertaking units and management levels, evolving technology, numerous production processes, and marked regional differences, together with its novel result formats, complicated dependencies, specialized reference data, and large data volumes. Drawing on domestic and foreign research results and production practice, and on the needs of production and technology development, the project stipulates the inspection methods and technical requirements for each stage of quality inspection of geographical conditions survey results, extends traditional inspection and acceptance techniques, and solves key technical problems urgently needed in the first national geographic survey.

  13. Spectral-element Method for 3D Marine Controlled-source EM Modeling

    NASA Astrophysics Data System (ADS)

    Liu, L.; Yin, C.; Zhang, B., Sr.; Liu, Y.; Qiu, C.; Huang, X.; Zhu, J.

    2017-12-01

As one of the predrill reservoir appraisal methods, marine controlled-source EM (MCSEM) has been widely used in mapping oil reservoirs to reduce the risk of deep water exploration. With the technical development of MCSEM, the need for improved forward modeling tools has become evident. We introduce in this paper the spectral-element method (SEM) for 3D MCSEM modeling. It combines the flexibility of finite-element methods with the high accuracy of spectral methods. We use the Galerkin weighted residual method to discretize the vector Helmholtz equation, with curl-conforming Gauss-Lobatto-Chebyshev (GLC) polynomials chosen as vector basis functions. As high-order complete orthogonal polynomials, the GLC polynomials exhibit exponential convergence; this allows the matrix elements to be derived analytically and improves the modeling accuracy. Numerical 1D models using SEM with different orders show that the SEM delivers accurate results, and the accuracy improves markedly with increasing SEM order. Further, we compare our SEM with the finite-difference (FD) method for a 3D reservoir model (Figure 1). The results show that the SEM is more effective than the FD method: only when the mesh is fine enough can FD achieve the same accuracy as SEM. Therefore, to obtain the same precision, SEM greatly reduces the degrees of freedom and cost. Numerical experiments with different models (not shown here) demonstrate that SEM is an efficient and effective tool for MCSEM modeling with significant advantages over traditional numerical methods. This research is supported by the Key Program of the National Natural Science Foundation of China (41530320), the China Natural Science Foundation for Young Scientists (41404093), and the Key National Research Project of China (2016YFC0303100, 2017YFC0601900).

  14. Influence of the quality of intraoperative fluoroscopic images on the spatial positioning accuracy of a CAOS system.

    PubMed

    Wang, Junqiang; Wang, Yu; Zhu, Gang; Chen, Xiangqian; Zhao, Xiangrui; Qiao, Huiting; Fan, Yubo

    2018-06-01

Spatial positioning accuracy is a key issue in a computer-assisted orthopaedic surgery (CAOS) system. Since intraoperative fluoroscopic images are among the most important input data to a CAOS system, their quality should have a significant influence on its accuracy; however, the regularities and mechanism of this influence have yet to be studied. Two typical spatial positioning methods - a C-arm calibration-based method and a bi-planar positioning method - are used to study the influence of different image quality parameters, such as resolution, distortion, contrast, and signal-to-noise ratio, on positioning accuracy. The propagation of image error through the two spatial positioning methods is analyzed by the Monte Carlo method. Correlation analysis showed that resolution and distortion had a significant influence on spatial positioning accuracy; in addition, the C-arm calibration-based method was more sensitive to image distortion, while the bi-planar positioning method was more susceptible to image resolution. Image contrast and signal-to-noise ratio had no significant influence on spatial positioning accuracy. The Monte Carlo analysis showed that, in general, the bi-planar positioning method was more sensitive to image quality than the C-arm calibration-based method. The quality of intraoperative fluoroscopic images is thus a key issue for the spatial positioning accuracy of a CAOS system; although the two positioning methods rest on very similar mathematical principles, they showed different sensitivities to different image quality parameters. The results of this research may help establish a realistic standard for intraoperative fluoroscopic images in CAOS systems. Copyright © 2018 John Wiley & Sons, Ltd.
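The Monte Carlo error-propagation step can be sketched with a toy bi-planar model: a 3D point observed in two orthogonal views, with Gaussian image noise propagated through the reconstruction. The geometry and noise level are illustrative assumptions, not the paper's imaging model:

```python
import numpy as np

def reconstruct(view_xz, view_yz):
    """Toy bi-planar reconstruction: view 1 observes (x, z), view 2
    observes (y, z); the redundant z coordinate is averaged."""
    x, z1 = view_xz
    y, z2 = view_yz
    return np.array([x, y, 0.5 * (z1 + z2)])

rng = np.random.default_rng(1)
truth = np.array([10.0, -4.0, 25.0])
sigma = 0.5  # image measurement noise, in the same units as the coordinates

# Monte Carlo: perturb both views repeatedly and collect reconstruction errors.
errors = np.array([
    reconstruct(truth[[0, 2]] + rng.normal(0, sigma, 2),
                truth[[1, 2]] + rng.normal(0, sigma, 2)) - truth
    for _ in range(20000)
])
z_std = errors[:, 2].std()   # z is measured twice and averaged: expect sigma/sqrt(2)
```

Repeating such runs while varying one image-quality parameter at a time (noise level, distortion model, and so on) is the general pattern behind the sensitivity analysis described in the abstract.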

  15. Improved symbol rate identification method for on-off keying and advanced modulation format signals based on asynchronous delayed sampling

    NASA Astrophysics Data System (ADS)

    Cui, Sheng; Jin, Shang; Xia, Wenjuan; Ke, Changjian; Liu, Deming

    2015-11-01

Symbol rate identification (SRI) based on asynchronous delayed sampling is accurate, cost-effective, and robust to impairments. For on-off keying (OOK) signals, the symbol rate can be derived from the periodicity of the second-order autocorrelation function (ACF2) of the delay-tap samples. However, when this method is applied to advanced modulation format signals with auxiliary amplitude modulation (AAM), incorrect results may be produced, because AAM has a significant impact on ACF2 periodicity and can make the symbol period harder or even impossible to identify correctly. In this paper it is demonstrated that for these signals the first-order autocorrelation function (ACF1) has stronger periodicity and can replace ACF2 to produce more accurate and robust results. Utilizing the characteristics of the ACFs, an improved SRI method is proposed that accommodates both OOK and advanced modulation format signals in a transparent manner. Furthermore, it is proposed that by minimizing the peak-to-average power ratio (PAPR) of the delay-tap samples with an additional tunable dispersion compensator (TDC), the limited dispersion tolerance can be expanded to the desired values.
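The general idea of recovering a symbol rate from waveform statistics can be illustrated with a simple transition-spectrum estimator on a synthetic OOK waveform. This is a generic illustration only, not the authors' asynchronous delay-tap ACF pipeline:

```python
import numpy as np

def estimate_symbol_period(signal):
    """Estimate the symbol period (in samples) of a binary-intensity
    waveform: squared transitions form an impulse train whose spectrum
    contains discrete lines at multiples of the symbol rate; the lowest
    strong line gives the rate."""
    edges = np.diff(signal) ** 2          # 1 at symbol transitions, else 0
    edges = np.append(edges, 0.0)         # keep the FFT length a round number
    spectrum = np.abs(np.fft.rfft(edges - edges.mean()))
    floor = 0.5 * spectrum[1:].max()
    line = 1 + np.flatnonzero(spectrum[1:] > floor)[0]   # lowest strong line
    return len(edges) / line

rng = np.random.default_rng(7)
period = 16                                  # samples per symbol
bits = rng.integers(0, 2, 256)
ook = np.repeat(bits, period).astype(float)  # rectangular NRZ / OOK waveform
estimated = estimate_symbol_period(ook)
```

Because transitions can only occur on symbol boundaries, their spectrum carries a clean rate signature even though the random bit pattern itself is aperiodic; the paper's ACF1/ACF2 statistics exploit related periodicity in asynchronously sampled data.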

  16. [Monitoring method for macroporous resin column chromatography process of salvianolic acids based on near infrared spectroscopy].

    PubMed

    Hou, Xiang-Mei; Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-07-01

To study and establish a monitoring method for the macroporous resin column chromatography process of salvianolic acids using near infrared (NIR) spectroscopy as a process analytical technology (PAT), a multivariate statistical process control (MSPC) model was developed based on 7 normal operation batches, and 2 test batches (one normal and one abnormal) were used to verify its monitoring performance. The results showed that the MSPC model had a good monitoring ability for the column chromatography process. Meanwhile, an NIR quantitative calibration model was established for three key quality indexes (rosmarinic acid, lithospermic acid and salvianolic acid B) using the partial least squares (PLS) algorithm; verification demonstrated that this model had satisfactory prediction performance. The combined application of the two models can effectively achieve real-time monitoring of the macroporous resin column chromatography process of salvianolic acids and can be used for on-line analysis of key quality indexes. The established process monitoring method provides a reference for the development of process analytical technology in the manufacture of traditional Chinese medicines. Copyright© by the Chinese Pharmaceutical Association.
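The MSPC idea can be sketched with a Hotelling T² chart on synthetic process data; the variables, control limit, and data below are illustrative stand-ins for the NIR-derived quantities in the study:

```python
import numpy as np

def hotelling_t2(X_train, x):
    """Hotelling's T-squared statistic of observation x relative to the
    in-control training data X_train (rows = observations)."""
    mu = X_train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X_train, rowvar=False))
    d = x - mu
    return float(d @ cov_inv @ d)

rng = np.random.default_rng(3)
# Synthetic "normal operation" data: 3 correlated process variables.
normal = rng.normal(0.0, 1.0, size=(700, 3)) @ np.array([[1.0, 0.3, 0.0],
                                                         [0.0, 1.0, 0.2],
                                                         [0.0, 0.0, 1.0]])
# Empirical 99% control limit taken from the in-control data themselves.
t2_train = np.array([hotelling_t2(normal, row) for row in normal])
limit = np.quantile(t2_train, 0.99)

in_control = hotelling_t2(normal, normal.mean(axis=0))    # on target: T2 = 0
fault = hotelling_t2(normal, normal.mean(axis=0) + 5.0)   # large shift: out of control
```

A batch whose T² trace stays below the limit is declared normal, while an excursion above it flags an abnormal operation, which mirrors how the MSPC model discriminated the two test batches.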

  17. The Combined Effects of Adaptive Control and Virtual Reality on Robot-Assisted Fine Hand Motion Rehabilitation in Chronic Stroke Patients: A Case Study.

    PubMed

    Huang, Xianwei; Naghdy, Fazel; Naghdy, Golshah; Du, Haiping; Todd, Catherine

    2018-01-01

    Robot-assisted therapy is regarded as an effective and reliable method for the delivery of highly repetitive training that is needed to trigger neuroplasticity following a stroke. However, the lack of fully adaptive assist-as-needed control of the robotic devices and an inadequate immersive virtual environment that can promote active participation during training are obstacles hindering the achievement of better training results with fewer training sessions required. This study thus focuses on these research gaps by combining these 2 key components into a rehabilitation system, with special attention on the rehabilitation of fine hand motion skills. The effectiveness of the proposed system is tested by conducting clinical trials on a chronic stroke patient and verified through clinical evaluation methods by measuring the key kinematic features such as active range of motion (ROM), finger strength, and velocity. By comparing the pretraining and post-training results, the study demonstrates that the proposed method can further enhance the effectiveness of fine hand motion rehabilitation training by improving finger ROM, strength, and coordination. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  18. Sensitivity analysis of key components in large-scale hydroeconomic models

    NASA Astrophysics Data System (ADS)

    Medellin-Azuara, J.; Connell, C. R.; Lund, J. R.; Howitt, R. E.

    2008-12-01

This paper explores the likely impact of different estimation methods on key components of hydro-economic models, such as hydrology and economic costs or benefits, using the CALVIN hydro-economic optimization model for water supply in California. We perform our analysis under two climate scenarios: historical and warm-dry. The components compared were perturbed hydrology using six versus eighteen basins, highly elastic urban water demands, and different valuations of agricultural water scarcity. Results indicate that large-scale hydro-economic models are often rather robust to a variety of estimation methods for ancillary models and components. Increasing the level of detail in the hydrologic representation of this system might not greatly affect overall estimates of climate effects and adaptations for California's water supply. More price-responsive urban water demands will have a limited role in allocating water optimally among competing uses. Different estimation methods for the economic value of water and scarcity in agriculture may influence economically optimal water allocation; however, land conversion patterns may have a stronger influence on this allocation. Overall, optimization results of large-scale hydro-economic models remain useful across a wide range of assumptions for eliciting promising water management alternatives.

  19. Estimating the number of female sex workers in Côte d'Ivoire: results and lessons learned.

    PubMed

    Vuylsteke, Bea; Sika, Lazare; Semdé, Gisèle; Anoma, Camille; Kacou, Elise; Laga, Marie

    2017-09-01

    To report on the results of three size estimations of the populations of female sex workers (FSW) in five cities in Côte d'Ivoire and on operational lessons learned, which may be relevant for key population programmes in other parts of the world. We applied three methods: mapping and census, capture-recapture and service multiplier. All were applied between 2008 and 2009 in Abidjan, San Pedro, Bouaké, Yamoussoukro and Abengourou. Abidjan was the city with the highest number of FSW by far, with estimations between 7880 (census) and 13 714 (service multiplier). The estimations in San Pedro, Bouaké and Yamoussoukro were very similar, with figures ranging from 1160 (Yamoussoukro, census) to 1916 (San Pedro, capture-recapture). Important operational lessons were learned, including strategies for mapping, the importance of involving peer sex workers for implementing the capture-recapture and the identification of the right question for the multiplier method. Successful application of three methods to estimate the population size of FSW in five cities in Côte d'Ivoire enabled us to make recommendations for size estimations of key population in low-income countries. © 2017 John Wiley & Sons Ltd.
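The capture-recapture estimate used above can be sketched with the Chapman (bias-corrected Lincoln-Petersen) estimator; the counts below are hypothetical, not the Côte d'Ivoire data:

```python
def chapman_estimate(marked, captured, recaptured):
    """Chapman's bias-corrected Lincoln-Petersen population estimate:
    marked = individuals tagged in the first round, captured = individuals
    seen in the second round, recaptured = individuals seen in both."""
    return (marked + 1) * (captured + 1) / (recaptured + 1) - 1

# Hypothetical two-round exercise: 200 FSW listed in round one, 150
# contacted in round two, 30 of whom were already on the first list.
estimate = chapman_estimate(marked=200, captured=150, recaptured=30)
```

The estimator assumes a closed population and equal catchability in both rounds; the operational lessons in the abstract (for example, involving peer sex workers in the recapture round) are aimed precisely at making those assumptions more plausible in the field.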

  20. A Novel Real-Time Reference Key Frame Scan Matching Method.

    PubMed

    Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu

    2017-05-07

    Unmanned aerial vehicles (UAVs) represent an effective technology for indoor search and rescue operations. Typically, the environments of most indoor missions are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by the simultaneous localization and mapping (SLAM) approach, using either local or global scan matching. Both approaches suffer from accumulated errors and high processing time due to the iterative nature of the scan matching method. Moreover, point-to-point scan matching is prone to outliers in the association process. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF). RKF is a hybrid scan matching technique comprising feature-to-feature and point-to-point approaches. The algorithm aims at mitigating error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. It falls back on the iterative closest point (ICP) algorithm when linear features are lacking, as is typical of unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results with very short computational time, which indicates its potential for use in real-time systems.
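The point-to-point fallback described above rests on one core operation: given matched point pairs from two scans, recover the rotation and translation that aligns them. In 2D this has a closed form, which each ICP iteration applies after nearest-neighbour matching. A minimal sketch of that alignment step with synthetic points, not the paper's data or its full RKF pipeline:

```python
import math

def align_2d(src, dst):
    """Closed-form 2D rigid alignment of matched point pairs.

    Returns (theta, tx, ty) such that rotating src by theta and then
    translating by (tx, ty) best fits dst in the least-squares sense.
    """
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    # Cross- and dot-correlations of the centred point sets give the angle.
    s_cross = s_dot = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy
        bx, by = dx - cdx, dy - cdy
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

# Recover a known pose: rotate src by 0.3 rad, then translate by (1, -2).
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0)]
c0, s0 = math.cos(0.3), math.sin(0.3)
dst = [(c0 * x - s0 * y + 1.0, s0 * x + c0 * y - 2.0) for x, y in src]
theta, tx, ty = align_2d(src, dst)
print(round(theta, 6), round(tx, 6), round(ty, 6))  # -> 0.3 1.0 -2.0
```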

  1. Enhancing Electromagnetic Side-Channel Analysis in an Operational Environment

    NASA Astrophysics Data System (ADS)

    Montminy, David P.

    Side-channel attacks exploit the unintentional emissions from cryptographic devices to determine the secret encryption key. This research identifies methods to make attacks demonstrated in an academic environment more operationally relevant. Algebraic cryptanalysis is used to reconcile redundant information extracted from side-channel attacks on the AES key schedule. A novel thresholding technique is used to select key byte guesses for a satisfiability solver, resulting in a 97.5% success rate in scenarios where standard methods fail for 100% of attacks. Two techniques are developed to compensate for differences in emissions from training and test devices, dramatically improving the effectiveness of cross-device template attacks. Mean and variance normalization improves same-part-number attack success rates from 65.1% to 100%, and increases the number of locations at which an attack can be performed by 226%. When normalization is combined with a novel technique to identify and filter out signals in collected traces not related to the encryption operation, the number of traces required for a successful attack is reduced by 85.8% on average. Finally, software-defined radios are shown to be an effective low-cost method for collecting side-channel emissions in real time, eliminating the need to modify or profile the target encryption device to gain precise timing information.

  2. At-line monitoring of key parameters of nisin fermentation by near infrared spectroscopy, chemometric modeling and model improvement.

    PubMed

    Guo, Wei-Liang; Du, Yi-Ping; Zhou, Yong-Can; Yang, Shuang; Lu, Jia-Hui; Zhao, Hong-Yu; Wang, Yao; Teng, Li-Rong

    2012-03-01

    An analytical procedure has been developed for at-line (fast off-line) monitoring of four key parameters during a nisin fermentation process: nisin titer (NT), reducing sugar concentration, cell concentration and pH. The procedure is based on near infrared (NIR) spectroscopy and partial least squares (PLS) regression. Samples without any preprocessing were collected at intervals of 1 h during fifteen batch fermentations, carried out in three different 5 L fermentors under various conditions. NIR spectra of the samples were collected within 10 min, and PLS was then used to model the relationship between the NIR spectra and the key parameters, which were determined by reference methods. Monte Carlo partial least squares (MCPLS) was applied to identify outliers and to select the most efficacious spectral preprocessing methods, wavelengths and number of latent variables (n(LV)). The optimum models for determining NT, reducing sugar concentration, cell concentration and pH were then established, with calibration-set correlation coefficients (R(c)) of 0.8255, 0.9000, 0.9883 and 0.9581, respectively. These results demonstrate that the method can be successfully applied to at-line monitoring of NT, reducing sugar concentration, cell concentration and pH during nisin fermentation processes.
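PLS regression of the kind used here projects the spectra onto a few latent variables that covary with the measured parameter. A minimal single-component PLS1 sketch in pure Python, on synthetic data; the actual models used spectral preprocessing, multiple latent variables and MCPLS-based selection, none of which is reproduced here:

```python
def pls1_one_component(X, y):
    """Fit a one-latent-variable PLS1 model; returns a predict function."""
    n, p = len(X), len(X[0])
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
    yc = [v - ym for v in y]
    # Weight vector: covariance direction between X and y, normalised.
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # Scores along the latent variable, and the inner regression coefficient.
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    q = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    def predict(x):
        score = sum((x[j] - xm[j]) * w[j] for j in range(p))
        return ym + q * score
    return predict

# Toy "spectra": two wavelengths, the analyte tracks the first one exactly.
X = [[1.0, 0.2], [2.0, 0.2], [3.0, 0.2], [4.0, 0.2]]
y = [3.0, 6.0, 9.0, 12.0]
predict = pls1_one_component(X, y)
print(round(predict([2.5, 0.2]), 3))  # -> 7.5
```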

  3. Method and apparatus for staking optical elements

    DOEpatents

    Woods, Robert O.

    1988-01-01

    A method and apparatus for staking two optical elements together in order to retain their alignment is disclosed. The apparatus includes a removable adaptor made up of first and second adaptor bodies each having a lateral slot in their front and side faces. The adaptor also includes a system for releasably attaching each adaptor body to a respective optical element such that when the two optical elements are positioned relative to one another the adaptor bodies are adjacent and the lateral slots therein are aligned to form key slots. The adaptor includes keys which are adapted to fit into the key slots. A curable filler material is employed to retain the keys in the key slots and thereby join the first and second adaptor bodies to form the adaptor. Also disclosed is a method for staking together two optical elements employing the adaptor of the present invention.

  4. Method and apparatus for staking optical elements

    DOEpatents

    Woods, Robert O.

    1988-10-04

    A method and apparatus for staking two optical elements together in order to retain their alignment is disclosed. The apparatus includes a removable adaptor made up of first and second adaptor bodies each having a lateral slot in their front and side faces. The adaptor also includes a system for releasably attaching each adaptor body to a respective optical element such that when the two optical elements are positioned relative to one another the adaptor bodies are adjacent and the lateral slots therein are aligned to form key slots. The adaptor includes keys which are adapted to fit into the key slots. A curable filler material is employed to retain the keys in the key slots and thereby join the first and second adaptor bodies to form the adaptor. Also disclosed is a method for staking together two optical elements employing the adaptor of the present invention.

  5. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data

    PubMed Central

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

    Agent-based models (ABM) and differential equation (DE) models are two commonly used methods for immune system simulation. However, it is difficult for an ABM to estimate key parameters of the model by incorporating experimental data, whereas a DE model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It combines the advantages of ABM and DE by employing an ABM to mimic the multi-scale immune system with its various phenotypes and cell types, while using the input and output of the ABM to build a Loess regression for key parameter estimation. Next, we employed a greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set, and used the ABM to describe a 3D immune system similar to previous studies that employed the DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes and cell types, but can also accurately infer key parameters like a DE model. This study therefore develops a complex-system modeling mechanism that can simulate the complicated immune system in detail like an ABM, while validating the model's reliability and efficiency against experimental data like a DE model. PMID:26535589

  6. Physical Unclonable Function Hardware Keys Utilizing Kirchhoff-Law Secure Key Exchange and Noise-Based Logic

    NASA Astrophysics Data System (ADS)

    Kish, Laszlo B.; Kwan, Chiman

    A weak physical unclonable function (PUF) encryption key means that the manufacturer of the hardware can clone the key but nobody else can. A strong PUF encryption key means that even the manufacturer of the hardware is unable to clone the key. In this paper, we first introduce an "ultra" strong PUF with intrinsic dynamical randomness, which is not only unclonable but is also renewed to an independent key (with fresh randomness) during each use via unconditionally secure key exchange. The solution utilizes the Kirchhoff-law-Johnson-noise (KLJN) method for dynamical key renewal and a one-time-pad secure key for the challenge/response process. The secure key is stored in a flash memory on the chip to provide tamper resistance and nonvolatile storage with zero power requirements in standby mode. Simplified PUF keys are also shown: a strong PUF utilizing the KLJN protocol during the first run and a noise-based logic (NBL) hyperspace vector string verification method for the challenge/response during the rest of its life or until it is re-initialized. Finally, the simplest PUF utilizes NBL without KLJN; thus it can be cloned by the manufacturer but not by anybody else.
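The one-time-pad challenge/response mentioned above is, at its core, an XOR of the stored secret key with each message, with every pad segment consumed exactly once. A minimal sketch of that idea; the class name and framing are illustrative, not the paper's hardware protocol:

```python
import secrets

class OneTimePad:
    """Consumes a shared random pad one segment at a time; never reuses bytes."""
    def __init__(self, pad: bytes):
        self.pad = pad
        self.offset = 0

    def xor(self, message: bytes) -> bytes:
        segment = self.pad[self.offset:self.offset + len(message)]
        if len(segment) < len(message):
            raise ValueError("pad exhausted: a one-time pad must never be reused")
        self.offset += len(message)
        return bytes(m ^ k for m, k in zip(message, segment))

# Both sides hold the same pad (in the paper's setting, refreshed via KLJN).
pad = secrets.token_bytes(64)
device, verifier = OneTimePad(pad), OneTimePad(pad)
challenge = b"challenge-01"
response = device.xor(challenge)            # encrypt with the next pad segment
assert verifier.xor(response) == challenge  # same segment decrypts it
```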

  7. A comprehensive method for the fracability evaluation of shale combined with brittleness and stress sensitivity

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoqiong; Ge, Hongkui; Wang, Daobing; Wang, Jianbo; Chen, Hao

    2017-12-01

    An effective evaluation of the fracability of the fracture network is key to the whole process of shale gas exploitation. At present, neither a standard criterion nor a generally accepted evaluation method exists. Well log and laboratory results have shown that the commonly used brittleness index calculated from the mineralogical composition is not entirely consistent with that obtained from the elastic moduli of the rock, and the two are sometimes even contradictory. The brittle minerals reflect the brittleness of the rock matrix, while the stress sensitivity of the wave velocity reflects the degree of development of the natural fracture system; both are key factors controlling the morphology of the propagating fracture. Thus, in this study, a novel fracability evaluation method for shale was developed that combines brittleness and stress sensitivity. Based on this method, the fracability of three shale gas plays was evaluated. Cored cylindrical samples were loaded under uniaxial stress up to 30 MPa, and the compressional wave velocities along the axial stress direction were obtained at each MPa of stress. The stress sensitivity coefficients were then obtained from the stress-velocity evolution. Our results show that the fracability of the Niutitang shale is better than that of the Lujiaping shale, which in turn is better than that of the Longmaxi shale. This result is in good agreement with acoustic emission activity measurements. The new fracability evaluation method enables a comprehensive reflection of the characteristics of rock matrix brittleness and the natural fracture system, and is valuable for the future evaluation of hydraulic fracturing effects in unconventional oil and gas reservoirs.
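One simple way to reduce a stress-velocity curve to a single stress sensitivity coefficient is a least-squares slope of velocity against stress, normalised by the fitted zero-stress velocity. The abstract does not spell out the authors' exact definition, so the coefficient below is an illustrative assumption, as are the synthetic readings:

```python
def stress_sensitivity(stress_mpa, velocity_ms):
    """Least-squares slope of Vp vs. stress, normalised by the intercept.

    Illustrative definition: a larger magnitude means a more stress-sensitive
    rock, which the study links to a better-developed natural fracture system.
    """
    n = len(stress_mpa)
    sm = sum(stress_mpa) / n
    vm = sum(velocity_ms) / n
    sxy = sum((s - sm) * (v - vm) for s, v in zip(stress_mpa, velocity_ms))
    sxx = sum((s - sm) ** 2 for s in stress_mpa)
    slope = sxy / sxx
    v0 = vm - slope * sm  # fitted zero-stress velocity
    return slope / v0

# Synthetic Vp readings at 0..30 MPa (not measured data): Vp = 3000 + 12*stress.
stress = list(range(0, 31))
vp = [3000.0 + 12.0 * s for s in stress]
print(round(stress_sensitivity(stress, vp), 6))  # -> 0.004 (= 12 / 3000)
```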

  8. A mesh generation and machine learning framework for Drosophila gene expression pattern image analysis

    PubMed Central

    2013-01-01

    Background Multicellular organisms consist of cells of many different types that are established during development. Each type of cell is characterized by a unique combination of expressed gene products resulting from spatiotemporal gene regulation. Currently, a fundamental challenge in regulatory biology is to elucidate the gene expression controls that generate complex body plans during development. Recent advances in high-throughput biotechnologies have generated spatiotemporal expression patterns for thousands of genes in the model organism fruit fly Drosophila melanogaster. Existing qualitative methods, enhanced by the quantitative analysis based on the computational tools we present in this paper, would provide promising ways of addressing key scientific questions. Results We develop a set of computational methods and open source tools for identifying co-expressed embryonic domains and the associated genes simultaneously. To map the expression patterns of many genes into the same coordinate space and to account for embryonic shape variations, we develop a mesh generation method that deforms a meshed generic ellipse to each individual embryo. We then develop a co-clustering formulation that clusters the genes and the mesh elements, thereby identifying co-expressed embryonic domains and the associated genes simultaneously. Experimental results indicate that the gene and mesh co-clusters can be correlated to key developmental events during the stages of embryogenesis we study. The open source software tool has been made available at http://compbio.cs.odu.edu/fly/. Conclusions Our mesh generation and machine learning methods and tools improve upon the flexibility, ease-of-use and accuracy of existing methods. PMID:24373308

  9. A Study on Environmental Research Trends Using Text-Mining Method - Focus on Spatial information and ICT -

    NASA Astrophysics Data System (ADS)

    Lee, M. J.; Oh, K. Y.; Joung-ho, L.

    2016-12-01

    Recently there has been much research analysing the interactions between entities by text-mining in various fields. In this paper, we quantitatively analyse research trends in environmental research relating to either spatial information or ICT (Information and Communications Technology) using text-mining analysis. To do this, we applied a low-dimensional embedding method, clustering analysis, and association rules to find meaningful associative patterns among the key words frequently appearing in the articles. As the authors suppose that KCI (Korea Citation Index) articles reflect academic demand, a total of 1228 KCI articles published from 1996 to 2015 were reviewed and analysed by the text-mining method. First, we retrieved the KCI articles from the NDSL (National Discovery for Science Leaders) site. We then pre-processed their key words, extracted from the abstracts, and classified them into separable sectors. We investigated the appearance rates and association rules of key words for articles in the two fields, spatial information and ICT. In order to detect historical trends, the analysis was conducted separately for four periods: 1996-2000, 2001-2005, 2006-2010 and 2011-2015. These analyses were conducted using the R software. As a result, we confirmed that environmental research relating to spatial information mainly focused on such fields as 'GIS' (35%), 'Remote Sensing' (25%) and 'environmental theme maps' (15.7%). Next, 'ICT technology' (23.6%), 'ICT services' (5.4%), 'mobile' (24%), 'big data' (10%) and 'AI' (7%) primarily emerge from environmental research relating to ICT. From these results, this paper asserts that the research trends and academic progress are well structured for reviewing recent spatial information and ICT technology, and that the outcomes of the analysis can serve as guidelines for establishing environmental policies and strategies.
KEY WORDS: Big data, Text-mining, Environmental research, Spatial information, ICT. Acknowledgements: The authors appreciate the support that this study has received from 'Building application frame of environmental issues, to respond to the latest ICT trends'.
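The association-rule step above boils down to counting how often pairs of key words co-occur across articles and comparing that against their individual frequencies. A minimal support/confidence sketch in Python, with toy keyword lists; the study itself used R, and these documents and thresholds are illustrative assumptions:

```python
from itertools import combinations
from collections import Counter

def association_rules(docs, min_support=0.3):
    """Support and confidence for keyword pairs across documents."""
    n = len(docs)
    single = Counter()
    pair = Counter()
    for kws in docs:
        kws = set(kws)
        single.update(kws)
        pair.update(frozenset(p) for p in combinations(sorted(kws), 2))
    rules = []
    for p, count in pair.items():
        support = count / n
        if support < min_support:
            continue
        a, b = sorted(p)
        rules.append((a, b, support, count / single[a]))  # confidence of a -> b
    return sorted(rules, key=lambda r: -r[2])

docs = [["GIS", "remote sensing"], ["GIS", "remote sensing", "ICT"],
        ["ICT", "big data"], ["GIS", "ICT"]]
for a, b, sup, conf in association_rules(docs, 0.5):
    print(f"{a} -> {b}: support={sup:.2f}, confidence={conf:.2f}")
```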

  10. Round-robin differential-phase-shift quantum key distribution with a passive decoy state method

    PubMed Central

    Liu, Li; Guo, Fen-Zhuo; Qin, Su-Juan; Wen, Qiao-Yan

    2017-01-01

    Recently, a new type of protocol named round-robin differential-phase-shift quantum key distribution (RRDPS QKD) was proposed, in which security can be guaranteed without monitoring conventional signal disturbances. The active decoy state method can be used in this protocol to overcome the imperfections of the source, but it may open side channel attacks and break the security of QKD systems. In this paper, we apply the passive decoy state method to the RRDPS QKD protocol. Not only can more environmental disturbance be tolerated, but side channel attacks on the sources can also be overcome. Importantly, we derive a new key generation rate formula for our RRDPS protocol using passive decoy states, enhancing the key generation rate. We also compare the performance of our RRDPS QKD with that of the active decoy state method and of the original RRDPS QKD without any decoy states. Numerical simulations show the performance improvement achieved by our new method. PMID:28198808

  11. Analysis of digital communication signals and extraction of parameters

    NASA Astrophysics Data System (ADS)

    Al-Jowder, Anwar

    1994-12-01

    The signal classification performance of four types of electronic support measures (ESM) communications detection systems is compared from the standpoint of the unintended receiver (interceptor). Typical digital communication signals considered include binary phase shift keying (BPSK), quadrature phase shift keying (QPSK), frequency shift keying (FSK), and on-off keying (OOK). The analysis emphasizes the use of available signal processing software. Detection methods compared include broadband energy detection, FFT-based narrowband energy detection, and two correlation methods that employ the fast Fourier transform (FFT). The correlation methods utilize modified time-frequency distributions, one of which is based on the Wigner-Ville distribution (WVD). Gaussian white noise is added to the signal to simulate various signal-to-noise ratios (SNRs).
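Of the detectors compared, FFT-based narrowband energy detection is the easiest to sketch: transform a block of samples, then flag any frequency bin whose energy stands well above the average bin energy. A minimal DFT-based version in pure Python, with a synthetic tone rather than the thesis's signals; a real implementation would use an FFT library and a calibrated threshold:

```python
import cmath, math

def narrowband_energy_detect(samples, threshold_ratio=5.0):
    """Return DFT bins whose energy exceeds threshold_ratio * mean bin energy."""
    n = len(samples)
    energy = []
    for k in range(n // 2):  # positive-frequency bins of a real signal
        x = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        energy.append(abs(x) ** 2)
    mean = sum(energy) / len(energy)
    return [k for k, e in enumerate(energy) if e > threshold_ratio * mean]

# Synthetic intercept: a carrier at bin 5 (e.g. an OOK "on" burst) in weak clutter.
n = 64
signal = [math.cos(2 * math.pi * 5 * t / n)
          + 0.05 * math.cos(2 * math.pi * 13.7 * t / n) for t in range(n)]
print(narrowband_energy_detect(signal))  # -> [5]
```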

  12. Key success factors of health research centers: A mixed method study

    PubMed Central

    Tofighi, Shahram; Teymourzadeh, Ehsan; Heydari, Majid

    2017-01-01

    Background In order to achieve success in their future goals and activities, health research centers are required to identify their key success factors. Objective This study aimed to extract and rank the factors affecting the success of research centers at one of the medical universities in Iran. Methods This mixed-method (qualitative-quantitative) study was conducted between May and October 2016. The study setting was 22 health research centers. In the qualitative phase, we extracted the factors affecting success in research centers through purposeful interviews with 10 experts from the centers and classified them into themes and sub-themes. In the quantitative phase, we prepared a questionnaire, and the factors were scored by 54 participants and ranked using the Friedman test. Results Nine themes and 42 sub-themes were identified. The themes, rated in order as components of success in research centers, were: strategic orientation, management, human capital, support, projects, infrastructure, communications and collaboration, paradigm, and innovation. Among the 42 identified factors, the top 10 key success factors were, in order: science and technology road map, strategic plan, evaluation indexes, committed human resources, scientific evaluation of members and centers, innovation in research and implementation, financial support, capable researchers, equipment infrastructure and teamwork. Conclusion According to the results, strategic orientation was the most important component in the success of research centers. Therefore, managers and authorities of research centers should pay more attention to strategic areas in future planning, including the science and technology road map and the strategic plan. PMID:28979733

  13. Key ingredients of anti-stigma programs for health care providers: a data synthesis of evaluative studies.

    PubMed

    Knaak, Stephanie; Modgill, Geeta; Patten, Scott B

    2014-10-01

    As part of its ongoing effort to combat stigma against mental illness among health care providers, the Mental Health Commission of Canada partnered with organizations conducting anti-stigma interventions. Our objective was to evaluate program effectiveness and to better understand what makes some programs more effective than others. Our paper reports the elements of these programs found to be most strongly associated with favourable outcomes. Our study employed a multi-phased, mixed-methods design. First, a grounded theory qualitative study was undertaken to identify key program elements. Next, each program (n = 22) was coded according to the presence or absence of the identified key program ingredients. Then, random-effects, meta-regression modelling was used to examine the association between program outcomes and the key ingredients. The qualitative analysis led to a 6-ingredient model of key program elements. Results of the quantitative analysis showed that programs that included all 6 of these ingredients performed significantly better than those that did not. Individual analyses of each of the 6 ingredients showed that including multiple forms of social contact and emphasizing recovery were characteristics of the most effective programs. The results provide a validation of a 6-ingredient model of key program elements for anti-stigma programming for health care providers. Emphasizing recovery and including multiple types of social contact are of particular importance for maximizing the effectiveness of anti-stigma programs for health care providers.

  14. Partial Variance of Increments Method in Solar Wind Observations and Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Greco, A.; Matthaeus, W. H.; Perri, S.; Osman, K. T.; Servidio, S.; Wan, M.; Dmitruk, P.

    2018-02-01

    The method called PVI (Partial Variance of Increments) has been increasingly used in the analysis of spacecraft and numerical simulation data since its inception in 2008. The purpose of the method is to study the kinematics and formation of coherent structures in space plasmas, a topic that has gained considerable attention, leading to the development of identification methods, observations, and associated theoretical research based on numerical simulations. This review paper summarizes the key features of the method and provides a synopsis of the main results obtained by various groups using it. This will enable new users, or those considering methods of this type, to find details and background collected in one place.
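The PVI index itself is simple to compute: for a field B sampled in time, PVI(t, τ) = |B(t+τ) − B(t)| / √⟨|ΔB|²⟩, where the denominator is the root-mean-square increment at that lag. High PVI values flag candidate coherent structures such as current sheets. A minimal sketch on a synthetic scalar series; spacecraft data would use vector magnetic-field increments:

```python
import math

def pvi(series, lag=1):
    """Partial Variance of Increments: increment magnitudes normalised by RMS."""
    increments = [series[i + lag] - series[i] for i in range(len(series) - lag)]
    rms = math.sqrt(sum(d * d for d in increments) / len(increments))
    return [abs(d) / rms for d in increments]

# Smooth synthetic signal with one sharp jump (a "current sheet") at index 50.
b = [math.sin(0.1 * i) + (2.0 if i >= 50 else 0.0) for i in range(100)]
p = pvi(b)
print(max(range(len(p)), key=lambda i: p[i]))  # -> 49: the jump stands out
```

By construction the mean of PVI² over the series is exactly 1, so a threshold such as PVI > 3 selects increments far out on the tail of the distribution.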

  15. Method for adding nodes to a quantum key distribution system

    DOEpatents

    Grice, Warren P

    2015-02-24

    An improved quantum key distribution (QKD) system and method are provided. The system and method introduce new clients at intermediate points along a quantum channel, where any two clients can establish a secret key without the need for a secret meeting between the clients. The new clients perform operations on photons as they pass through nodes in the quantum channel, and participate in a non-secret protocol that is amended to include the new clients. The system and method significantly increase the number of clients that can be supported by a conventional QKD system, with only a modest increase in cost. The system and method are compatible with a variety of QKD schemes, including polarization, time-bin, continuous variable and entanglement QKD.

  16. Several key issues on using 137Cs method for soil erosion estimation

    USDA-ARS?s Scientific Manuscript database

    This work was to examine several key issues of using the cesium-137 method to estimate soil erosion rates in order to improve and standardize the method. Based on the comprehensive review and synthesis of a large body of published literature and the author’s extensive research experience, several k...

  17. Feasibility of Extracting Key Elements from ClinicalTrials.gov to Support Clinicians’ Patient Care Decisions

    PubMed Central

    Kim, Heejun; Bian, Jiantao; Mostafa, Javed; Jonnalagadda, Siddhartha; Del Fiol, Guilherme

    2016-01-01

    Motivation: Clinicians need up-to-date evidence from high quality clinical trials to support clinical decisions. However, applying evidence from the primary literature requires significant effort. Objective: To examine the feasibility of automatically extracting key clinical trial information from ClinicalTrials.gov. Methods: We assessed the coverage of ClinicalTrials.gov for high quality clinical studies that are indexed in PubMed. Using 140 random ClinicalTrials.gov records, we developed and tested rules for the automatic extraction of key information. Results: The rate of high quality clinical trial registration in ClinicalTrials.gov increased from 0.2% in 2005 to 17% in 2015. Trials reporting results increased from 3% in 2005 to 19% in 2015. The accuracy of the automatic extraction algorithm for 10 trial attributes was 90% on average. Future research is needed to improve the algorithm accuracy and to design information displays to optimally present trial information to clinicians. PMID:28269867
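Rule-based extraction of trial attributes of the kind evaluated here can be sketched as regular expressions applied to a registry record's text fields. The field names, patterns and sample record below are illustrative assumptions, not the authors' actual rules or ClinicalTrials.gov's format:

```python
import re

# Hypothetical rules: each trial attribute is matched by a simple pattern.
RULES = {
    "enrollment": re.compile(r"Enrollment:\s*(\d+)"),
    "phase": re.compile(r"Phase:\s*(Phase\s+[1-4](?:/[1-4])?)"),
    "primary_outcome": re.compile(r"Primary Outcome:\s*(.+)"),
}

def extract_attributes(record_text):
    """Apply each rule to the record; unmatched attributes come back as None."""
    out = {}
    for name, pattern in RULES.items():
        m = pattern.search(record_text)
        out[name] = m.group(1).strip() if m else None
    return out

record = """Enrollment: 240
Phase: Phase 3
Primary Outcome: Change in systolic blood pressure at 12 weeks"""
print(extract_attributes(record)["enrollment"])  # -> 240
```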

  18. Combination of advanced encryption standard 256 bits with md5 to secure documents on android smartphone

    NASA Astrophysics Data System (ADS)

    Pasaribu, Hendra; Sitanggang, Delima; Rizki Damanik, Rudolfo; Rudianto Sitompul, Alex Chandra

    2018-04-01

    File transfer using a smartphone has security issues such as data theft by irresponsible parties. To improve the quality of data security systems on smartphones, this research proposes the integration of the 256-bit AES algorithm with MD5 hashing. The use of MD5 aims to increase the key strength of the encryption and decryption of document files. The test results show that the proposed method can increase the key strength of the encryption and decryption process for document files. Encryption and decryption using the AES and MD5 combination is faster than using AES alone for the *.txt file type, with the reverse result for *.docx, *.xlsx, *.pptx and *.pdf files.
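The abstract does not specify how the MD5 digest feeds the AES-256 key, and MD5 outputs only 16 bytes while AES-256 needs 32. One plausible sketch, clearly an assumption rather than the paper's construction, chains two digests; note that MD5 is not a recommended key-derivation primitive for new designs:

```python
import hashlib

def derive_aes256_key(passphrase: str) -> bytes:
    """Stretch a passphrase into a 32-byte AES-256 key via chained MD5.

    MD5 outputs only 16 bytes, so two digests are concatenated. This exact
    construction is an illustrative assumption, and MD5 is cryptographically
    weak; modern designs would use a KDF such as PBKDF2 or scrypt instead.
    """
    d1 = hashlib.md5(passphrase.encode("utf-8")).digest()
    d2 = hashlib.md5(d1 + passphrase.encode("utf-8")).digest()
    return d1 + d2  # 32 bytes: the right length for an AES-256 key

key = derive_aes256_key("correct horse battery staple")
print(len(key))  # -> 32
```

The resulting 32-byte value would then be passed to an AES-256 implementation (e.g. from a crypto library) as the cipher key.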

  19. Parametric Representation of the Speaker's Lips for Multimodal Sign Language and Speech Recognition

    NASA Astrophysics Data System (ADS)

    Ryumin, D.; Karpov, A. A.

    2017-05-01

    In this article, we propose a new method for the parametric representation of the human lip region. The functional diagram of the method is described, and implementation details, with an explanation of its key stages and features, are given. The results of automatic detection of the regions of interest are illustrated, and the processing speed of the method on several computers with different performance levels is reported. This universal method allows the parametric representation of the speaker's lips to be applied to tasks in biometrics, computer vision, machine learning, and automatic recognition of faces, sign language elements, and audio-visual speech, including lip-reading.

  20. A Novel Method for Reconstructing Broken Contour Lines Extracted from Scanned Topographic Maps

    NASA Astrophysics Data System (ADS)

    Wang, Feng; Liu, Pingzhi; Yang, Yun; Wei, Haiping; An, Xiaoya

    2018-05-01

    It is known that after segmentation and morphological operations on scanned topographic maps, gaps occur in the contour lines, and filling these gaps to reconstruct the contour lines with high accuracy and completeness is not an easy problem. In this paper, a novel method is proposed for automatically or semi-automatically filling gaps and reconstructing broken contour lines in binary images. After introducing the reconstruction procedure, the key step of automatically matching and reconnecting end points is discussed in depth, and several key algorithms and mechanisms are presented and realized, including multiple incremental back-tracing to obtain the weighted average direction angle of end points, a maximum-constraint-angle control mechanism based on multiple gradient ranks, the combination of weighted Euclidean distance and deviation angle to determine the optimum matching end point, and bidirectional parabola control. Finally, experimental comparisons on typical samples are carried out between the proposed method and a representative existing method; the results indicate that the former achieves higher accuracy and completeness, as well as better stability and applicability.
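The end-point matching criterion above can be sketched as a cost that mixes the Euclidean gap distance with how far a candidate lies off the broken line's extension direction. The weights and example points below are illustrative assumptions, not the paper's tuned values:

```python
import math

def match_cost(end_pt, end_dir, cand_pt, w_dist=1.0, w_angle=40.0):
    """Cost of reconnecting end_pt (heading end_dir, radians) to cand_pt.

    Combines the gap length with the deviation angle between the line's
    extension direction and the direction towards the candidate end point.
    """
    dx, dy = cand_pt[0] - end_pt[0], cand_pt[1] - end_pt[1]
    dist = math.hypot(dx, dy)
    deviation = abs(math.atan2(dy, dx) - end_dir)
    deviation = min(deviation, 2 * math.pi - deviation)  # wrap to [0, pi]
    return w_dist * dist + w_angle * deviation

# A contour end at (0, 0) heading along +x, and two candidate end points.
end, heading = (0.0, 0.0), 0.0
close_but_backwards = (-3.0, 0.0)   # near, but opposite the line direction
farther_but_aligned = (8.0, 0.5)    # farther, but on the line's extension
best = min([close_but_backwards, farther_but_aligned],
           key=lambda c: match_cost(end, heading, c))
print(best)  # -> (8.0, 0.5)
```

Weighting the angle term heavily, as here, encodes the paper's idea that a reconnection should follow the contour's direction rather than simply jump to the nearest end point.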

  1. In-vitro antimicrobial activity of marine actinobacteria against multidrug resistance Staphylococcus aureus

    PubMed Central

    Sathish, Kumar SR; Kokati, Venkata Bhaskara Rao

    2012-01-01

    Objective To investigate the antibacterial activity of marine actinobacteria against multidrug-resistant Staphylococcus aureus (MDRSA). Methods Fifty-one actinobacterial strains were isolated from salt pan soils in the coastal area of Kothapattanam, Ongole, Andhra Pradesh. Primary screening against MDRSA was done using the cross-streak method. Bioactive compounds were extracted from the most active actinobacteria by solvent extraction, and the antimicrobial activity of the crude and solvent extracts was assessed using the Kirby-Bauer method. The MIC of the ethyl acetate extract was determined by a modified agar well diffusion method. The potent actinobacteria were identified using the Nonomura key and the Shirling and Gottlieb (1966) scheme together with Bergey's Manual of Determinative Bacteriology. Results Among the fifty-one isolates screened for antibacterial activity, SRB25 was found to be effective against MDRSA. The ethyl acetate extract showed high inhibition of the test organism, with an MIC against MDRSA of 1 000 µg/mL. The isolated actinobacterium was identified as a Streptomyces sp. with the help of the Nonomura key. Conclusions The current investigation reveals that marine actinobacteria from salt pan environments can produce new drug molecules against drug-resistant microorganisms. PMID:23569848

  2. Health treaty dilution: a case study of Japan's influence on the language of the WHO Framework Convention on Tobacco Control

    PubMed Central

    Assunta, Mary; Chapman, Simon

    2006-01-01

    Background The Japanese government is an important shareholder in the Japanese tobacco industry. Negotiations to develop the WHO's historic Framework Convention on Tobacco Control (FCTC) were based on consensus, resulting in countries needing to agree to the lowest acceptable common denominator in clause development. Objective To illustrate Japan's role in negotiating key optional language in the FCTC text. Methods Summary reports, text proposals, conference papers, and speeches related to the six FCTC negotiation sessions were reviewed for repeated words, concepts and emerging themes. Key stakeholders were interviewed. Key words such as “sovereignty”, “appropriate”, “latitude”, “individual”, “flexibility”, and “may” representing optional language were examined. Results The Japanese government's proposals for “appropriate” and optional measures are reflected in the final FCTC text that accommodates flexibility on interpretation and implementation on key tobacco controls. While Japan was not alone in proposing optional language, consensus accommodated their proposals. Conclusion Japan's success in arguing for extensive optional language seriously weakened the FCTC. Accordingly, international tobacco control can be expected to be less successful in reducing the burden of disease caused by tobacco use. PMID:16905717

  3. Collaborating with Youth to Inform and Develop Tools for Psychotropic Decision Making

    PubMed Central

    Murphy, Andrea; Gardner, David; Kutcher, Stan; Davidson, Simon; Manion, Ian

    2010-01-01

    Introduction: Youth-oriented and informed resources designed to support psychopharmacotherapeutic decision-making are essentially unavailable. This article outlines the approach taken to design such resources, the product that resulted, and the lessons learned from the process. Methods: A project team with psychopharmacology expertise was assembled. The project team reviewed best practices regarding medication educational materials and related decision-support tools. The team collaborated with key stakeholders identified as primary end users and target groups. A graphic designer and a plain language consultant were also retained. Results: Through an iterative and collaborative process over approximately 6 months, Med Ed and Med Ed Passport were developed. Literature and input from key stakeholders, in particular youth, were instrumental to the development of the tools and materials within Med Ed. A training program utilizing a train-the-trainer model was developed to facilitate the implementation of Med Ed in Ontario, which is currently ongoing. Conclusion: An evidence-informed process that includes youth and key stakeholder engagement is required for developing tools that support psychopharmacotherapeutic decision-making. The development process fostered an environment of reciprocity between the project team and key stakeholders. PMID:21037916

  4. Optical hiding with visual cryptography

    NASA Astrophysics Data System (ADS)

    Shi, Yishi; Yang, Xiubo

    2017-11-01

    We propose an optical hiding method based on visual cryptography. In the hiding process, we convert the secret information into a set of fabricated phase-keys, which are completely independent of each other, intensity-detection-proof, and image-covered, leading to high security. During the extraction process, the covered phase-keys are illuminated with laser beams and then incoherently superimposed to extract the hidden information directly by human vision, without complicated optical implementations or any additional computation, making extraction convenient. Also, the phase-keys are manufactured as diffractive optical elements that are robust to attacks such as blocking and phase noise. Optical experiments verify that high security, easy extraction, and strong robustness are all obtainable with the visual-cryptography-based optical hiding.
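
    The superposition principle behind visual cryptography can be illustrated with a minimal digital analogue: an XOR-based secret sharing scheme in which each fabricated share is individually random and the secret appears only when all shares are superimposed. This is a hypothetical sketch of the general idea, not the paper's optical phase-key implementation.

```python
import random

def make_shares(secret_bits, n_shares=2, seed=None):
    """Split a bit list into n XOR shares: all but one share are random,
    and the last is chosen so the XOR of all shares equals the secret.
    Any subset of fewer than n shares is statistically independent of
    the secret."""
    rng = random.Random(seed)
    shares = [[rng.randint(0, 1) for _ in secret_bits]
              for _ in range(n_shares - 1)]
    last = list(secret_bits)
    for share in shares:
        last = [a ^ b for a, b in zip(last, share)]
    shares.append(last)
    return shares

def superimpose(shares):
    """Recover the secret by XOR-superimposing all shares."""
    out = list(shares[0])
    for share in shares[1:]:
        out = [a ^ b for a, b in zip(out, share)]
    return out
```

    As in the optical scheme, extraction is a pure superposition step with no additional computation beyond combining the shares.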

  5. Parameter optimization in biased decoy-state quantum key distribution with both source errors and statistical fluctuations

    NASA Astrophysics Data System (ADS)

    Zhu, Jian-Rong; Li, Jian; Zhang, Chun-Mei; Wang, Qin

    2017-10-01

    The decoy-state method has been widely used in commercial quantum key distribution (QKD) systems. In view of practical decoy-state QKD with both source errors and statistical fluctuations, we propose a universal model for full parameter optimization in biased decoy-state QKD with phase-randomized sources. We then apply this model to simulations of two widely used sources: the weak coherent source (WCS) and the heralded single-photon source (HSPS). Results show that full parameter optimization can significantly improve not only the secure transmission distance but also the final key generation rate. Moreover, when source errors and statistical fluctuations are taken into account, the performance of decoy-state QKD using an HSPS suffers less than that of decoy-state QKD using a WCS.

  6. A biometric method to secure telemedicine systems.

    PubMed

    Zhang, G H; Poon, Carmen C Y; Li, Ye; Zhang, Y T

    2009-01-01

    Security and privacy are among the most crucial issues for data transmission in telemedicine systems. This paper proposes a solution for securing wireless data transmission in telemedicine systems, i.e., within a body sensor network (BSN), between the BSN and the server, as well as between the server and the professionals who have access to the server. A unique feature of this solution is the generation of random keys from physiological data (i.e., a biometric approach) to secure communication at all three levels. In the performance analysis, the inter-pulse interval of the photoplethysmogram is used as an example to generate these biometric keys to protect wireless data transmission. The results of the statistical analysis and computational complexity suggest that this type of key is random enough to make telemedicine systems resistant to attacks.
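
    The key-generation step can be sketched as follows: coarsely quantize a sequence of inter-pulse intervals (IPIs) so that two sensors on the same body, whose measurements differ only by small noise, derive identical key bits. The quantization step, bit width, and IPI values below are illustrative assumptions, not the parameters used in the paper.

```python
def ipi_to_key(ipis_ms, step_ms=8, bits_per_ipi=4):
    """Derive key bits by coarse quantization of inter-pulse intervals
    (IPIs, in ms). Coarse steps tolerate the small measurement noise
    between two sensors on the same body; step size and bit width are
    illustrative assumptions."""
    key = []
    for ipi in ipis_ms:
        q = round(ipi / step_ms) % (1 << bits_per_ipi)
        key.extend((q >> b) & 1 for b in reversed(range(bits_per_ipi)))
    return key
```

    Two devices measuring the same heartbeat sequence with a few milliseconds of noise land in the same quantization bins and thus agree on the key, while a device on a different body does not.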

  7. An aggregate method to calibrate the reference point of cumulative prospect theory-based route choice model for urban transit network

    NASA Astrophysics Data System (ADS)

    Zhang, Yufeng; Long, Man; Luo, Sida; Bao, Yu; Shen, Hanxia

    2015-12-01

    The transit route choice model is a key technology for public transit systems planning and management. Traditional route choice models are mostly based on expected utility theory, which has an evident shortcoming: it cannot accurately portray travelers' subjective route choice behavior because their risk preferences are not taken into consideration. Cumulative prospect theory (CPT) can be used to describe travelers' decision-making process under uncertain transit supply and the risk preferences of multiple types of travelers. The method used to calibrate the reference point, a key parameter of the CPT-based transit route choice model, determines the precision of the model to a great extent. In this paper, a new method is put forward to obtain the value of the reference point, combining theoretical calculation with field investigation results. Comparing the proposed method with the traditional method shows that the new method improves the quality of the CPT-based model by more accurately simulating travelers' route choice behavior, based on a transit trip investigation from Nanjing City, China. The proposed method is of great significance to rational transit planning and management, and to some extent remedies the defect that obtaining the reference point has been based solely on qualitative analysis.
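
    The CPT machinery the model relies on can be sketched with the standard value and probability-weighting functions, evaluated against a reference travel time (outcomes shorter than the reference count as gains, longer ones as losses). The parameter defaults are the classic Tversky-Kahneman estimates, used here only for illustration, and simple rather than cumulative weighting is applied for brevity.

```python
def cpt_value(outcomes, reference, alpha=0.88, beta=0.88, lam=2.25, gamma=0.61):
    """Evaluate a list of (travel_time, probability) outcomes against a
    reference travel time: times below the reference are gains, times
    above it are losses weighted more heavily (loss aversion)."""
    def value(x):
        return x ** alpha if x >= 0 else -lam * (-x) ** beta

    def weight(p):
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    return sum(weight(p) * value(reference - t) for t, p in outcomes)
```

    With these parameters, a reliable 28-minute route scores above a risky route that is usually 25 minutes but occasionally 40, capturing the risk aversion that expected utility misses.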

  8. Information Filtering via a Scaling-Based Function

    PubMed Central

    Qiu, Tian; Zhang, Zi-Ke; Chen, Guang

    2013-01-01

    Finding a universal description of algorithm optimization is one of the key challenges in personalized recommendation. In this article, for the first time, we introduce a scaling-based algorithm (SCL), independent of the recommendation list length, built on a hybrid of heat conduction and mass diffusion, by finding the scaling function relating the tunable parameter to the object average degree. The optimal value of the tunable parameter can be extracted from the scaling function, which is heterogeneous across individual objects. Experimental results obtained from three real datasets, Netflix, MovieLens and RYM, show that the SCL is highly accurate in recommendation. More importantly, compared with a number of excellent algorithms, including the mass diffusion method, the original hybrid method, and even an improved version of the hybrid method, the SCL algorithm remarkably improves personalized recommendation in three other aspects: solving the accuracy-diversity dilemma, presenting high novelty, and addressing the key challenge of the cold start problem. PMID:23696829
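
    The underlying hybrid of heat conduction and mass diffusion (before the scaling step) can be sketched as follows, with a tunable parameter lam interpolating between mass diffusion (lam=1) and heat conduction (lam=0). The toy matrix and parameter value are illustrative only, and this is a sketch of the standard hybrid kernel rather than the SCL algorithm itself.

```python
def hybrid_scores(A, user, lam=0.5):
    """Score uncollected objects for `user` with the hybrid
    heat-conduction/mass-diffusion kernel: lam=1 recovers mass diffusion,
    lam=0 heat conduction. A[u][o] is a binary user-object matrix."""
    n_users, n_objs = len(A), len(A[0])
    k_obj = [sum(A[u][o] for u in range(n_users)) for o in range(n_objs)]
    k_usr = [sum(A[u][o] for o in range(n_objs)) for u in range(n_users)]
    scores = {}
    for b in range(n_objs):          # target object
        if A[user][b]:
            continue                 # skip objects already collected
        s = 0.0
        for a in range(n_objs):      # objects the user has collected
            if not A[user][a] or k_obj[a] == 0 or k_obj[b] == 0:
                continue
            overlap = sum(1 / k_usr[u] for u in range(n_users)
                          if A[u][a] and A[u][b])
            s += overlap / (k_obj[b] ** (1 - lam) * k_obj[a] ** lam)
        scores[b] = s
    return scores
```

    The SCL contribution is then to replace the single global lam with an object-dependent value read off the scaling function.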

  9. Identifying Dispositions That Matter: Reaching for Consensus Using a Delphi Study

    ERIC Educational Resources Information Center

    Bair, Mary Antony

    2017-01-01

    This article describes how one institution used the Delphi technique to identify and operationalize key professional dispositions to be addressed in its teacher education program. Participants included teacher educators, methods course instructors, and school administrators. Data collection occurred in three phases, with the results of each phase…

  10. Predictors of Care-Giver Stress in Families of Preschool-Aged Children with Developmental Disabilities

    ERIC Educational Resources Information Center

    Plant, K. M.; Sanders, M. R.

    2007-01-01

    Background: This study examined the predictors, mediators and moderators of parent stress in families of preschool-aged children with developmental disability. Method: One hundred and five mothers of preschool-aged children with developmental disability completed assessment measures addressing the key variables. Results: Analyses demonstrated that…

  11. Psychosomatic Medicine: The Scientific Foundation of the Biopsychosocial Model

    ERIC Educational Resources Information Center

    Novack, Dennis H.; Cameron, Oliver; Epel, Elissa; Ader, Robert; Waldstein, Shari R.; Levenstein, Susan; Antoni, Michael H.; Wainer, Alicia Rojas

    2007-01-01

    Objective: This article presents major concepts and research findings from the field of psychosomatic medicine that the authors believe should be taught to all medical students. Method: The authors asked senior scholars involved in psychosomatic medicine to summarize key findings in their respective fields. Results: The authors provide an overview…

  12. Downsized Boosted Engine Benchmarking Method and Results (SAE Paper 2015-01-1266)

    EPA Science Inventory

    Light-duty vehicle greenhouse gas (GHG) and fuel economy (FE) standards for MYs 2012-2025 are requiring vehicle powertrains to become much more efficient. One key technology strategy that vehicle manufacturers are using to help comply with GHG and FE standards is to replace natu...

  13. ANALYSIS OF CONCORDANCE OF PROBABILISTIC AGGREGATE EXPOSURE PREDICTIONS WITH OBSERVED BIOMONITORING RESULTS: AN EXAMPLE USING CTEPP DATA

    EPA Science Inventory

    Three key areas of scientific inquiry in the study of human exposure to environmental contaminants are 1) assessment of aggregate (i.e., multi-pathway, multi-route) exposures, 2) application of probabilistic methods to exposure prediction, and 3) the interpretation of biomarker m...

  14. Attitude Surveys Document Sampler.

    ERIC Educational Resources Information Center

    Walker, Albert, Comp.

    This packet presents results of a series of attitude surveys representing a variety of purposes, methods and defined publics. They range from a simple questionnaire prepared and mailed to a small group of key individuals by a public relations staff to scientifically derived surveys purchased from Louis Harris and Associates and other research…

  15. Self-Assembled Resonance Energy Transfer Keys for Secure Communication over Classical Channels.

    PubMed

    Nellore, Vishwa; Xi, Sam; Dwyer, Chris

    2015-12-22

    Modern authentication and communication protocols increasingly use physical keys in lieu of conventional software-based keys for security. This shift is primarily driven by the ability to derive a unique, unforgeable signature from a physical key. The sole demonstration of an unforgeable key, thus far, has been through quantum key distribution, which suffers from limited communication distances and expensive infrastructure requirements. Here, we show a method for creating unclonable keys by molecular self-assembly of resonance energy transfer (RET) devices. It is infeasible to clone the RET-key due to the inability to characterize the key using current technology, the large number of input-output combinations per key, and the variation of the key's response with time. However, the manufacturer can produce multiple identical devices, which enables inexpensive, secure authentication and communication over classical channels, and thus any distance. Through a detailed experimental survey of the nanoscale keys, we demonstrate that legitimate users are successfully authenticated 99.48% of the time and the false positives are only 0.39%, over two attempts. We estimate that a legitimate user would have a computational advantage of more than 10^340 years over an attacker. Our method enables the discovery of physical-key-based multiparty authentication and communication schemes that are both practical and possess unprecedented security.

  16. Objective comparison of particle tracking methods.

    PubMed

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R; Godinez, William J; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E G; Jaldén, Joakim; Blau, Helen M; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P; Dan, Han-Wei; Tsai, Yuh-Show; Ortiz de Solórzano, Carlos; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-03-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Because manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized an open competition in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to notable practical conclusions for users and developers.
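
    As a point of reference for what such trackers do, the simplest baseline linker, greedy nearest-neighbour association with a gating distance, can be sketched as follows; it is a generic illustration, not one of the competition entries.

```python
import math

def link_frames(prev_pts, next_pts, max_dist=5.0):
    """Greedy nearest-neighbour linking of detections between two frames:
    repeatedly accept the closest unused pair within the gating distance.
    Returns {index_in_prev: index_in_next}; unmatched detections are left
    out (a generic baseline, not one of the competing methods)."""
    pairs = sorted((math.dist(p, q), i, j)
                   for i, p in enumerate(prev_pts)
                   for j, q in enumerate(next_pts))
    links, used_prev, used_next = {}, set(), set()
    for d, i, j in pairs:
        if d > max_dist:
            break                      # pairs are sorted; the rest are farther
        if i in used_prev or j in used_next:
            continue
        links[i] = j
        used_prev.add(i)
        used_next.add(j)
    return links
```

    The competing methods differ mainly in how they improve on this step: probabilistic detection, multi-frame association, and motion models in place of pure proximity.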

  17. Method Development in Forensic Toxicology.

    PubMed

    Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona

    2017-01-01

    In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high-quality analytical methods is thorough method development. This article provides an overview of the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems, as well as establishing a versatile sample preparation. Method development is concluded by an optimization process, after which the new method is subject to method validation.

  18. Orthogonal Array Testing for Transmit Precoding based Codebooks in Space Shift Keying Systems

    NASA Astrophysics Data System (ADS)

    Al-Ansi, Mohammed; Alwee Aljunid, Syed; Sourour, Essam; Mat Safar, Anuar; Rashidi, C. B. M.

    2018-03-01

    In Space Shift Keying (SSK) systems, transmit precoding based codebook approaches have been proposed to improve performance in limited-feedback channels. The receiver performs an exhaustive search in a predefined Full-Combination (FC) codebook to select the optimal codeword that maximizes the Minimum Euclidean Distance (MED) between the received constellations. This research aims to reduce the codebook size in order to minimize the selection time and the number of feedback bits. Therefore, we propose to construct the codebooks based on Orthogonal Array Testing (OAT) methods, owing to their powerful inherent properties. These methods make it possible to acquire a short codebook whose codewords cover almost all the possible effects included in the FC codebook. Numerical results show the effectiveness of the proposed OAT codebooks in terms of system performance and complexity.
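
    The defining property that makes orthogonal arrays attractive here, that a small array still covers every pairwise combination of factor levels, can be checked directly. The L4 array below is the textbook two-level, three-factor example, shown to illustrate the OAT idea rather than as the paper's actual SSK codebook.

```python
from itertools import combinations, product

# L4(2^3) orthogonal array: 4 codewords over 3 binary factors. Every pair
# of columns contains each of the 4 level combinations exactly once
# (strength 2) -- half the size of the full 2^3 combination set.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def is_strength_2(array, levels=2):
    """Check that every pair of columns covers all level combinations."""
    for c1, c2 in combinations(range(len(array[0])), 2):
        seen = {(row[c1], row[c2]) for row in array}
        if seen != set(product(range(levels), repeat=2)):
            return False
    return True
```

    This pairwise-coverage property is why a short OAT codebook can stand in for the exhaustive FC codebook during codeword selection.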

  19. An improved network model for railway traffic

    NASA Astrophysics Data System (ADS)

    Li, Keping; Ma, Xin; Shao, Fubo

    In railway traffic, safety analysis is a key issue for controlling train operation, and the identification and ordering of key factors are very important. In this paper, a new network model is constructed for analyzing railway safety, in which nodes represent causation factors and links represent possible relationships among those factors. Our aim is to rank these nodes by importance and to find the in-depth relationships among them, including how failures spread. Based on the constructed network model, we propose a control method that ensures the safe state by assigning each node a threshold. As a result, by protecting the hub node of the constructed network, the spread of railway accidents can be controlled well. The efficiency of such a method is further tested with the help of a numerical example.

  20. One-click scanning of large-size documents using mobile phone camera

    NASA Astrophysics Data System (ADS)

    Liu, Sijiang; Jiang, Bo; Yang, Yuanjie

    2016-07-01

    Current mobile apps for document scanning do not provide convenient operations for large-size documents. In this paper, we present a one-click scanning approach for large-size documents using a mobile phone camera. After capturing a continuous video of the document, our approach automatically extracts several key frames by optical flow analysis. Then, based on the key frames, a mobile-GPU-based image stitching method is adopted to generate a complete document image with high detail. No extra manual intervention is required in the process, and experimental results show that our app performs well, offering convenience and practicality for daily use.
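
    The key-frame selection step can be sketched as follows: accumulate inter-frame motion and emit a key frame whenever the accumulated motion crosses a threshold. Mean absolute frame difference is used here as a simple stand-in for the optical-flow analysis the paper performs, and the threshold is an illustrative assumption.

```python
def select_key_frames(frames, motion_threshold=30.0):
    """Emit a key frame whenever accumulated inter-frame motion crosses a
    threshold. Frames are 2D grayscale arrays (lists of rows); mean
    absolute difference stands in for optical-flow magnitude, and the
    threshold is an illustrative assumption."""
    def mean_abs_diff(f, g):
        total = sum(abs(a - b)
                    for row_f, row_g in zip(f, g)
                    for a, b in zip(row_f, row_g))
        return total / (len(f) * len(f[0]))

    keys = [0]                 # always keep the first frame
    accumulated = 0.0
    for i in range(1, len(frames)):
        accumulated += mean_abs_diff(frames[i - 1], frames[i])
        if accumulated >= motion_threshold:
            keys.append(i)
            accumulated = 0.0
    return keys
```

    The selected frames, spaced by roughly constant camera motion, are then what the stitching stage consumes.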

  1. Novel image encryption algorithm based on multiple-parameter discrete fractional random transform

    NASA Astrophysics Data System (ADS)

    Zhou, Nanrun; Dong, Taiji; Wu, Jianhua

    2010-08-01

    A new method of digital image encryption is presented by utilizing a new multiple-parameter discrete fractional random transform. Image encryption and decryption are performed based on the index additivity and multiple parameters of the multiple-parameter fractional random transform. The plaintext and ciphertext are respectively in the spatial domain and in the fractional domain determined by the encryption keys. The proposed algorithm can resist statistical analyses effectively. The computer simulation results show that the proposed encryption algorithm is sensitive to the multiple keys, and that it has considerable robustness, noise immunity and security.

  2. A prototype system for perinatal knowledge engineering using an artificial intelligence tool.

    PubMed

    Sokol, R J; Chik, L

    1988-01-01

    Though several perinatal expert systems are extant, the use of artificial intelligence has, as yet, had minimal impact in medical computing. In this evaluation of the potential of AI techniques in the development of a computer-based "Perinatal Consultant," a "top down" approach to the development of a perinatal knowledge base was taken, using as a source for such a knowledge base a 30-page manuscript of a chapter concerning high-risk pregnancy. The UNIX utility "style" was used to parse sentences and obtain key words and phrases, both as part of a natural language interface and to identify key perinatal concepts. Compared with the "gold standard" of sentences containing key facts as chosen by the experts, a semiautomated method using a nonmedical speller to identify key words and phrases in context functioned with a sensitivity of 79%, i.e., approximately 8 in 10 key sentences were detected as the basis for PROLOG rules and facts for the knowledge base. These encouraging results suggest that functional perinatal expert systems may well be expedited by using programming utilities in conjunction with AI tools and published literature.
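
    The semi-automated screening idea, flagging sentences that contain domain key words and scoring the result against an expert gold standard, can be sketched as follows. The sentences, key terms, and helper names are hypothetical, not the chapter text or the UNIX "style" tooling used in the study.

```python
def screen_sentences(sentences, key_terms):
    """Flag sentence indexes containing any domain key term (a stand-in
    for the keyword-in-context screening described above)."""
    return [i for i, s in enumerate(sentences)
            if any(term in s.lower() for term in key_terms)]

def sensitivity(flagged, gold):
    """Fraction of expert-chosen key sentences the screen detected."""
    return len(set(flagged) & set(gold)) / len(gold)
```

    The flagged sentences would then be the raw material for hand-translation into PROLOG rules and facts.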

  3. Measuring Sexual Behavior Stigma to Inform Effective HIV Prevention and Treatment Programs for Key Populations

    PubMed Central

    Hargreaves, James R; Sprague, Laurel; Stangl, Anne L; Baral, Stefan D

    2017-01-01

    Background The levels of coverage of human immunodeficiency virus (HIV) treatment and prevention services needed to change the trajectory of the HIV epidemic among key populations, including gay men and other men who have sex with men (MSM) and sex workers, have consistently been shown to be limited by stigma. Objective The aim of this study was to propose an agenda for the goals and approaches of a sexual behavior stigma surveillance effort for key populations, with a focus on collecting surveillance data from 4 groups: (1) members of key population groups themselves (regardless of HIV status), (2) people living with HIV (PLHIV) who are also members of key populations, (3) members of nonkey populations, and (4) health workers. Methods We discuss strengths and weaknesses of measuring multiple different types of stigma including perceived, anticipated, experienced, perpetrated, internalized, and intersecting stigma as measured among key populations themselves, as well as attitudes or beliefs about key populations as measured among other groups. Results With the increasing recognition of the importance of stigma, consistent and validated stigma metrics for key populations are needed to monitor trends and guide immediate action. Evidence-based stigma interventions may ultimately be the key to overcoming the barriers to coverage and retention in life-saving antiretroviral-based HIV prevention and treatment programs for key populations. Conclusions Moving forward necessitates the integration of validated stigma scales in routine HIV surveillance efforts, as well as HIV epidemiologic and intervention studies focused on key populations, as a means of tracking progress toward a more efficient and impactful HIV response. PMID:28446420

  4. Finite element modeling of truss structures with frequency-dependent material damping

    NASA Technical Reports Server (NTRS)

    Lesieutre, George A.

    1991-01-01

    A physically motivated modelling technique for structural dynamic analysis that accommodates frequency dependent material damping was developed. Key features of the technique are the introduction of augmenting thermodynamic fields (AFT) to interact with the usual mechanical displacement field, and the treatment of the resulting coupled governing equations using finite element analysis methods. The AFT method is fully compatible with current structural finite element analysis techniques. The method is demonstrated in the dynamic analysis of a 10-bay planar truss structure, a structure representative of those contemplated for use in future space systems.

  5. Problems Involved in an Emergency Method of Guiding a Gliding Vehicle from High Altitudes to a High Key Position

    NASA Technical Reports Server (NTRS)

    Jewel, Joseph W., Jr.; Whitten, James B.

    1960-01-01

    An investigation has been conducted to determine the problems involved in an emergency method of guiding a gliding vehicle from high altitudes to a high key position (initial position) above a landing field. A jet airplane in a simulated flameout condition, conventional ground-tracking radar, and a scaled wire for guidance programming on the radar plotting board were used in the tests. Starting test altitudes varied from 30,000 feet to 46,500 feet, and starting positions ranged from 8.4 to 67 nautical miles from the high key. Specified altitudes of the high key were 12,000, 10,000, or 4,000 feet. Lift-drag ratios of the aircraft of 17, 16, or 6 were held constant during any given flight; however, for a few flights the lift-drag ratio was varied from 11 to 6. Indicated airspeeds were held constant at either 160 or 250 knots. Results from these tests indicate that a gliding vehicle having a lift-drag ratio of 16 and an indicated approach speed of 160 knots can be guided to within 800 feet vertically and 2,400 feet laterally of a high key position. When the lift-drag ratio of the vehicle is reduced to 6 and the indicated approach speed is raised to 250 knots, the radar controller was able to guide the vehicle to within 2,400 feet vertically and au feet laterally of the high key. It was also found that radar stations giving only azimuth-distance information could control the glide path of a gliding vehicle as well as stations receiving azimuth-distance-altitude information, provided that altitude information is supplied by the pilot.

  6. Active Learning Framework for Non-Intrusive Load Monitoring: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Xin

    2016-05-16

    Non-Intrusive Load Monitoring (NILM) is a set of techniques that estimate the electricity usage of individual appliances from power measurements taken at a limited number of locations in a building. One of the key challenges in NILM is having too much data without class labels yet being unable to label the data manually for cost or time constraints. This paper presents an active learning framework that helps existing NILM techniques to overcome this challenge. Active learning is an advanced machine learning method that interactively queries a user for the class label information. Unlike most existing NILM systems that heuristically request user inputs, the proposed method only needs minimally sufficient information from a user to build a compact and yet highly representative load signature library. Initial results indicate the proposed method can reduce the user inputs by up to 90% while still achieving similar disaggregation performance compared to a heuristic method. Thus, the proposed method can substantially reduce the burden on the user, improve the performance of a NILM system with limited user inputs, and overcome the key market barriers to the wide adoption of NILM technologies.
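
    The query step of such an active learning loop can be sketched with classic uncertainty sampling: ask the user to label only the sample whose predicted probability is closest to 0.5. The toy classifier, oracle, and wattage values below are illustrative assumptions, not the paper's implementation.

```python
def most_uncertain(pool, predict_proba):
    """Uncertainty sampling: pick the sample whose predicted probability
    of the positive class is closest to 0.5."""
    return min(pool, key=lambda x: abs(predict_proba(x) - 0.5))

def active_learning_loop(pool, oracle, predict_proba, budget):
    """Query loop: ask the user (oracle) to label only the most
    informative samples, up to `budget` requests."""
    labeled = {}
    for _ in range(budget):
        candidates = [x for x in pool if x not in labeled]
        x = most_uncertain(candidates, predict_proba)
        labeled[x] = oracle(x)   # the only user input requested
    return labeled
```

    In a NILM setting the pool would be power measurements and the labels appliance classes; the budget caps how often the user is interrupted.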

  7. Parameter estimation in large-scale systems biology models: a parallel and self-adaptive cooperative strategy.

    PubMed

    Penas, David R; González, Patricia; Egea, Jose A; Doallo, Ramón; Banga, Julio R

    2017-01-21

    The development of large-scale kinetic models is one of the current key issues in computational systems biology and bioinformatics. Here we consider the problem of parameter estimation in nonlinear dynamic models. Global optimization methods can be used to solve this type of problems but the associated computational cost is very large. Moreover, many of these methods need the tuning of a number of adjustable search parameters, requiring a number of initial exploratory runs and therefore further increasing the computation times. Here we present a novel parallel method, self-adaptive cooperative enhanced scatter search (saCeSS), to accelerate the solution of this class of problems. The method is based on the scatter search optimization metaheuristic and incorporates several key new mechanisms: (i) asynchronous cooperation between parallel processes, (ii) coarse and fine-grained parallelism, and (iii) self-tuning strategies. The performance and robustness of saCeSS is illustrated by solving a set of challenging parameter estimation problems, including medium and large-scale kinetic models of the bacterium E. coli, baker's yeast S. cerevisiae, the vinegar fly D. melanogaster, Chinese Hamster Ovary cells, and a generic signal transduction network. The results consistently show that saCeSS is a robust and efficient method, allowing very significant reduction of computation times with respect to several previous state of the art methods (from days to minutes, in several cases) even when only a small number of processors is used. The new parallel cooperative method presented here allows the solution of medium and large scale parameter estimation problems in reasonable computation times and with small hardware requirements. Further, the method includes self-tuning mechanisms which facilitate its use by non-experts. We believe that this new method can play a key role in the development of large-scale and even whole-cell dynamic models.

  8. Ranking of critical species to preserve the functionality of mutualistic networks using the k-core decomposition

    PubMed Central

    García-Algarra, Javier; Pastor, Juan Manuel; Iriondo, José María

    2017-01-01

    Background Network analysis has become a relevant approach to analyze cascading species extinctions resulting from perturbations on mutualistic interactions as a result of environmental change. In this context, it is essential to be able to point out key species whose stability would prevent cascading extinctions and the consequent loss of ecosystem function. In this study, we aim to explain how the k-core decomposition sheds light on understanding the robustness of bipartite mutualistic networks. Methods We defined three k-magnitudes based on the k-core decomposition: k-radius, k-degree, and k-risk. The first one, k-radius, quantifies the distance from a node to the innermost shell of the partner guild, while k-degree provides a measure of centrality in the k-shell based decomposition. k-risk is a way to measure the vulnerability of a network to the loss of a particular species. Using these magnitudes we analyzed 89 mutualistic networks involving plant pollinators or seed dispersers. Two static extinction procedures were implemented in which k-degree and k-risk were compared against other commonly used ranking indexes, such as MusRank, explained in detail in Materials and Methods. Results When extinctions take place in both guilds, k-risk is the best ranking index if the goal is to identify the key species to preserve the giant component. When species are removed only in the primary class and cascading extinctions are measured in the secondary class, the most effective ranking index to identify the key species to preserve the giant component is k-degree. However, the MusRank index was more effective when the goal was to identify the key species to preserve the greatest species richness in the secondary class. Discussion The k-core decomposition offers a new topological view of the structure of mutualistic networks. The new k-radius, k-degree and k-risk magnitudes take advantage of its properties and provide new insight into the structure of mutualistic networks. The k-risk and k-degree ranking indexes are especially effective approaches to identify key species to preserve when conservation practitioners focus on the preservation of ecosystem functionality over species richness. PMID:28533969
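
    The k-core decomposition that the three magnitudes build on can be computed by iterative peeling: repeatedly remove all nodes of degree below k, assigning removed nodes the shell index k-1. A minimal sketch on a toy (non-bipartite) graph, for illustration only:

```python
def k_shell_indices(adj):
    """Compute each node's k-shell index by iterative peeling: at stage k,
    repeatedly remove every node of degree < k; a node removed at stage k
    belongs to shell k-1. `adj` maps node -> set of neighbours."""
    adj = {u: set(vs) for u, vs in adj.items()}
    shell, k = {}, 0
    while adj:
        k += 1
        while True:
            peel = [u for u, vs in adj.items() if len(vs) < k]
            if not peel:
                break
            for u in peel:
                shell[u] = k - 1
                for v in adj.pop(u):
                    if v in adj:
                        adj[v].discard(u)
    return shell
```

    In a triangle with a pendant node, the pendant lands in shell 1 and the triangle in shell 2: nodes deep in the innermost shells are the ones whose loss the k-risk and k-degree rankings flag as most dangerous.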

  9. The Model Method: Singapore Children's Tool for Representing and Solving Algebraic Word Problems

    ERIC Educational Resources Information Center

    Ng, Swee Fong; Lee, Kerry

    2009-01-01

    Solving arithmetic and algebraic word problems is a key component of the Singapore elementary mathematics curriculum. One heuristic taught, the model method, involves drawing a diagram to represent key information in the problem. We describe the model method and a three-phase theoretical framework supporting its use. We conducted 2 studies to…

  10. Numerical simulations of motion-insensitive diffusion imaging based on the distant dipolar field effects.

    PubMed

    Lin, Tao; Sun, Huijun; Chen, Zhong; You, Rongyi; Zhong, Jianhui

    2007-12-01

    Diffusion weighting in MRI is commonly achieved with the pulsed-gradient spin-echo (PGSE) method. When combined with spin-warping image formation, this method often results in ghosts due to the sample's macroscopic motion. It has been shown experimentally (Kennedy and Zhong, MRM 2004;52:1-6) that these motion artifacts can be effectively eliminated by the distant dipolar field (DDF) method, which relies on the refocusing of spatially modulated transverse magnetization by the DDF within the sample itself. In this report, diffusion-weighted images (DWIs) using both DDF and PGSE methods in the presence of macroscopic sample motion were simulated. Numerical simulation results quantify the dependence of signals in DWI on several key motion parameters and demonstrate that the DDF DWIs are much less sensitive to macroscopic sample motion than the traditional PGSE DWIs. The results also show that the dipolar correlation distance (d(c)) can alter contrast in DDF DWIs. The simulated results are in good agreement with the experimental results reported previously.

  11. Practical passive decoy state measurement-device-independent quantum key distribution with unstable sources.

    PubMed

    Liu, Li; Guo, Fen-Zhuo; Wen, Qiao-Yan

    2017-09-12

    Measurement-device-independent quantum key distribution (MDI-QKD) with the active decoy state method can remove all detector loopholes and resist imperfections of the sources, but active modulation may open side channels and compromise the security of the QKD system. In this paper, we apply the passive decoy state method to MDI-QKD based on polarization encoding. Not only are all attacks on the detectors removed, but side channel attacks on the sources are also overcome. We show that MDI-QKD with our passive decoy state method achieves a performance comparable to the protocol with the active decoy state method. To meet the demands of practical application, we include intensity fluctuation in the security analysis of the MDI-QKD protocol using the passive decoy state method, and derive the key generation rate for our protocol under intensity fluctuation. Intensity fluctuation has a non-negligible adverse effect on the key generation rate, especially for small total numbers of transmitted signals and long transmission distances. We give specific simulations of the relationship between intensity fluctuation and the key generation rate. Furthermore, the statistical fluctuation due to the finite length of data is also taken into account.

  12. Information Security Scheme Based on Computational Temporal Ghost Imaging.

    PubMed

    Jiang, Shan; Wang, Yurong; Long, Tao; Meng, Xiangfeng; Yang, Xiulun; Shu, Rong; Sun, Baoqing

    2017-08-09

    An information security scheme based on computational temporal ghost imaging is proposed. A sequence of independent 2D random binary patterns is used as the encryption key and multiplied with the 1D data stream. The ciphertext is obtained by summing the weighted encryption key. Decryption is realized by a correlation measurement between the encrypted information and the encryption key. Owing to the intrinsic high-level randomness of the key, the security of this method is well guaranteed. The feasibility of this method and its robustness against both occlusion and additive noise attacks are demonstrated by simulation.
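
    The multiply-sum-correlate arithmetic described above can be sketched in a few lines. This is an illustrative toy (pure Python, with 1D rows standing in for the flattened 2D patterns; the data stream, pattern count, and correlation estimator are assumptions, not the paper's parameters):

```python
import random

random.seed(1)

T = 64     # length of the 1D data stream
N = 4000   # number of random binary key patterns

# Hypothetical data stream to protect (a square wave with values 0.5 and 1.0).
data = [0.5 + 0.5 * ((t // 8) % 2) for t in range(T)]

# Encryption key: a sequence of independent random binary patterns.
key = [[random.randint(0, 1) for _ in range(T)] for _ in range(N)]

# Encryption: each ciphertext value is the data stream multiplied by one
# key pattern and summed (a "bucket" value in ghost-imaging terms).
cipher = [sum(k[t] * data[t] for t in range(T)) for k in key]

# Decryption: correlation measurement between ciphertext and key,
# x_hat(t) ~ <B * P(t)> - <B><P(t)>.
B_mean = sum(cipher) / N
recon = []
for t in range(T):
    p_mean = sum(key[i][t] for i in range(N)) / N
    cov = sum(cipher[i] * key[i][t] for i in range(N)) / N - B_mean * p_mean
    recon.append(cov)
# recon is proportional to data up to estimation noise (scale ~ Var(P) = 0.25).
```

    The occlusion robustness mentioned in the abstract follows from the same estimator: dropping a fraction of the (ciphertext, pattern) pairs only increases the estimation noise rather than destroying the reconstruction.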

  13. Stakeholder-focused evaluation of an online course for health care providers.

    PubMed

    Dunet, Diane O; Reyes, Michele

    2006-01-01

    Different people who have a stake or interest in a training course (stakeholders) may have markedly different definitions of what constitutes "training success" and how they will use evaluation results. Stakeholders at multiple levels within and outside of the organization guided the development of an evaluation plan for a Web-based training course on hemochromatosis. Stakeholder interests and values were reflected in the type, level, and rigor of evaluation methods selected. Our mixed-method evaluation design emphasized small sample sizes and repeated measures. Limited resources for evaluation were leveraged by focusing on the data needs of key stakeholders, understanding how they wanted to use evaluation results, and collecting data needed for stakeholder decision making. Regular feedback to key stakeholders provided opportunities for updating the course evaluation plan to meet emerging needs for new or different information. Early and repeated involvement of stakeholders in the evaluation process also helped build support for the final product. Involving patient advocacy groups, managers, and representative course participants improved the course and enhanced product dissemination. For training courses, evaluation planning is an opportunity to tailor methods and data collection to meet the information needs of particular stakeholders. Rigorous evaluation research of every training course may be infeasible or unwarranted; however, course evaluations can be improved by good planning. A stakeholder-focused approach can build a picture of the results and impact of training while fostering the practical use of evaluation data.

  14. Optical image encryption by random shifting in fractional Fourier domains

    NASA Astrophysics Data System (ADS)

    Hennelly, B.; Sheridan, J. T.

    2003-02-01

    A number of methods have recently been proposed in the literature for the encryption of two-dimensional information by use of optical systems based on the fractional Fourier transform. Typically, these methods require random phase screen keys for decrypting the data, which must be stored at the receiver and must be carefully aligned with the received encrypted data. A new technique based on a random shifting, or jigsaw, algorithm is proposed. This method does not require the use of phase keys. The image is encrypted by juxtaposition of sections of the image in fractional Fourier domains. The new method has been compared with existing methods and shows comparable or superior robustness to blind decryption. Optical implementation is discussed, and the sensitivity of the various encryption keys to blind decryption is examined.

  15. Comparative Study of Impedance Eduction Methods, Part 2: NASA Tests and Methodology

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Howerton, Brian M.; Busse-Gerstengarbe, Stefan

    2013-01-01

    A number of methods have been developed at NASA Langley Research Center for eduction of the acoustic impedance of sound-absorbing liners mounted in the wall of a flow duct. This investigation uses methods based on the Pridmore-Brown and convected Helmholtz equations to study the acoustic behavior of a single-layer, conventional liner fabricated by the German Aerospace Center and tested in the NASA Langley Grazing Flow Impedance Tube. Two key assumptions are explored in this portion of the investigation. First, a comparison of results achieved with uniform-flow and shear-flow impedance eduction methods is considered. Also, an approach based on the Prony method is used to extend these methods from single-mode to multi-mode implementations. Finally, a detailed investigation into the effects of harmonic distortion on the educed impedance is performed, and the results are used to develop guidelines regarding acceptable levels of harmonic distortion.

  16. Optical detection of random features for high security applications

    NASA Astrophysics Data System (ADS)

    Haist, T.; Tiziani, H. J.

    1998-02-01

    Optical detection of random features, in combination with digital signatures based on public-key codes, is discussed as a means of recognizing counterfeit objects. Objects are protected against counterfeiting without applying expensive production techniques. Verification is done off-line by optical means without a central authority. The method is applied to protecting banknotes, and experimental results for this application are presented. The method is also applicable to identity verification of a credit- or chip-card holder.

  17. Optimization of spent fuel pool weir gate driving mechanism

    NASA Astrophysics Data System (ADS)

    Liu, Chao; Du, Lin; Tao, Xinlei; Wang, Shijie; Shang, Ertao; Yu, Jianjiang

    2018-04-01

    The spent fuel pool is a crucial facility for fuel storage and nuclear safety, and the spent fuel pool weir gate is its key related equipment. To achieve more efficient transfer of the driving force, the loading during the opening/closing process is analyzed and an optimized calculation method for the dimensions of the driving mechanism is proposed. An optimization example shows that the method can be applied to the design of weir gates with similar driving mechanisms.

  18. Ground Deployment Demonstration and Material Testing for Solar Sail

    NASA Astrophysics Data System (ADS)

    Huang, Xiaoqi; Cheng, Zhengai; Liu, Yufei; Wang, Li

    2016-07-01

    A solar sail is a kind of spacecraft that can achieve extremely high velocity using light pressure instead of chemical fuel. The large acceleration relies on a high area-to-mass ratio, so solar sails are designed in huge sizes and use ultra-thin, lightweight materials. For a 100-meter-class solar sail, two key points must be considered in the design process: the fold-deployment method, and material property changes in the space environment. To test and verify the fold-deployment technology, an 8 m × 8 m principle prototype was developed. Sail membranes folded by the IKAROS method, the Nanosail-D method, and a newly proposed L-shape folding pattern were tested on this prototype; their deployment properties were investigated in detail and compared. The space-environment suitability of ultra-thin polyimide films as a candidate solar sail material was also analyzed. The preliminary test results showed that membranes folded by all three methods deployed well; moreover, the membrane folded in the L-shape pattern deployed more rapidly and in a more organized manner than the other two. The mechanical properties of the polyimide showed no significant change after electron irradiation. As preliminary research on the key technologies of solar sail spacecraft, the results of this study provide an important basis for selecting large-scale solar sail membranes and designing fold-deployment methods.

  19. Secure and Efficient Signature Scheme Based on NTRU for Mobile Payment

    NASA Astrophysics Data System (ADS)

    Xia, Yunhao; You, Lirong; Sun, Zhe; Sun, Zhixin

    2017-10-01

    Mobile payment is becoming more and more popular; however, traditional public-key encryption algorithms place high demands on hardware and are not suitable for mobile terminals with limited computing resources. In addition, these public-key algorithms are not resistant to quantum computing. This paper studies the quantum-resistant public-key algorithm NTRU by analyzing the influence of the parameters q and k on the probability of generating a reasonable signature value. Two methods are proposed to increase this probability: first, increase the value of the parameter q; second, add an authentication condition during the signature phase that checks whether the reasonable-signature requirements are met. Experimental results show that the proposed signature scheme achieves zero leakage of private-key information from the signature value, increases the probability of generating a reasonable signature value, improves the signature rate, and avoids the propagation of invalid signatures in the network, although the scheme places certain restrictions on parameter selection.

  20. Rethinking key–value store for parallel I/O optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kougkas, Anthony; Eslami, Hassan; Sun, Xian-He

    2015-01-26

    Key-value stores are being widely used as the storage system for large-scale internet services and cloud storage systems. However, they are rarely used in HPC systems, where parallel file systems are the dominant storage solution. In this study, we examine the architecture differences and performance characteristics of parallel file systems and key-value stores. We propose using key-value stores to optimize overall Input/Output (I/O) performance, especially for workloads that parallel file systems cannot handle well, such as the cases with intense data synchronization or heavy metadata operations. We conducted experiments with several synthetic benchmarks, an I/O benchmark, and a real application. We modeled the performance of these two systems using collected data from our experiments, and we provide a predictive method to identify which system offers better I/O performance given a specific workload. The results show that we can optimize the I/O performance in HPC systems by utilizing key-value stores.
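
    A toy contrast can make the argument concrete. The sketch below is illustrative only (the class, names, and flush strategy are assumptions, not the paper's system): many small writes are absorbed in memory by put() calls and issued as one large sequential write, which is the kind of workload reshaping a key-value store enables for synchronization- or metadata-heavy I/O:

```python
import os
import tempfile

class ToyKVStore:
    """In-memory key-value store used as a write-behind buffer."""

    def __init__(self):
        self.data = {}
        self.flushes = 0

    def put(self, key, value):
        self.data[key] = value            # small write absorbed in memory

    def flush(self, path):
        # One large sequential write replaces len(self.data) small I/Os.
        with open(path, "w") as f:
            for k in sorted(self.data):
                f.write(f"{k}\t{self.data[k]}\n")
        self.flushes += 1

kv = ToyKVStore()
for rank in range(4):                     # 4 simulated parallel writers
    for i in range(100):
        kv.put((rank, i), rank * 1000 + i)

out = os.path.join(tempfile.gettempdir(), "toy_kv_dump.tsv")
kv.flush(out)                             # 400 puts, a single file write pass
```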

  1. Video Feedback in Key Word Signing Training for Preservice Direct Support Staff.

    PubMed

    Rombouts, Ellen; Meuris, Kristien; Maes, Bea; De Meyer, Anne-Marie; Zink, Inge

    2016-04-01

    Research has demonstrated that formal training is essential for professionals to learn key word signing. Yet, the particular didactic strategies have not been studied. Therefore, this study compared the effectiveness of verbal and video feedback in a key word signing training for future direct support staff. Forty-nine future direct support staff were randomly assigned to 1 of 3 key word signing training programs: modeling and verbal feedback (classical method [CM]), additional video feedback (+ViF), and additional video feedback and photo reminder (+ViF/R). Signing accuracy and training acceptability were measured 1 week after and 7 months after training. Participants from the +ViF/R program achieved significantly higher signing accuracy compared with the CM group. Acceptability ratings did not differ between any of the groups. Results suggest that at an equal time investment, the programs containing more training components were more effective. Research on the effect of rehearsal on signing maintenance is warranted.

  2. Distributed Factorization Computation on Multiple Volunteered Mobile Resource to Break RSA Key

    NASA Astrophysics Data System (ADS)

    Jaya, I.; Hardi, S. M.; Tarigan, J. T.; Zamzami, E. M.; Sihombing, P.

    2017-01-01

    Like other asymmetric encryption schemes, RSA can be cracked using a series of mathematical calculations: the private key used to decrypt the message can be computed from the public key. However, finding the private key may require a massive amount of calculation. In this paper, we propose a method to perform this calculation of RSA's private key as a distributed computation. The proposed method uses multiple volunteered mobile devices that contribute to the calculation process. Our objective is to demonstrate that volunteer computing on mobile devices can be a feasible option for reducing the time required to break a weak RSA encryption, and to observe the behavior and running time of the application on mobile devices.
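
    The underlying calculation is easy to state. The single-machine sketch below uses a deliberately tiny, hypothetical 15-bit modulus; real key sizes make the trial-division step astronomically larger, which is what motivates distributing it across devices:

```python
def factor(n):
    # Trial division: the work the volunteered devices would divide up,
    # e.g. by assigning each device a disjoint range of candidate factors.
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    raise ValueError("no nontrivial odd factor found")

e, n = 65537, 20711          # hypothetical public key; n = 139 * 149
p, q = factor(n)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private exponent recovered (Python 3.8+)

m = 1234                     # a sample message
c = pow(m, e, n)             # encrypt with the public key
assert pow(c, d, n) == m     # decrypt with the recovered private key
```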

  3. ‘Scared of going to the clinic’: Contextualising healthcare access for men who have sex with men, female sex workers and people who use drugs in two South African cities

    PubMed Central

    2018-01-01

    Background Men who have sex with men (MSM), sex workers (SW) and people who use drugs (PWUD) are at increased risk for HIV because of multiple socio-structural barriers and do not have adequate access to appropriate HIV prevention, diagnosis and treatment services. Objective To examine the context of access to healthcare experienced by these three ‘Key Populations’, we conducted a qualitative study in two South African cities: Bloemfontein in the Free State province and Mafikeng in the North West province. Method We carried out in-depth interviews to explore healthcare workers’ perceptions, beliefs and attitudes towards Key Populations. Focus group discussions were also conducted with members of Key Populations exploring their experiences of accessing healthcare. Results Healthcare workers described their own attitudes towards Key Populations and demonstrated a lack of relevant knowledge, skills and training to manage the particular health needs and vulnerabilities facing Key Populations. Female SW, MSM and PWUD described their experiences of stigmatisation, and of being made to feel guilt, shame and a loss of dignity as a result of the discrimination by healthcare providers and other community members. Our findings suggest that the uptake and effectiveness of health services amongst Key Populations in South Africa is limited by internalised stigma, reluctance to seek care, and unwillingness to disclose risk behaviours to healthcare workers, combined with a lack of knowledge and understanding on the part of the broader community members, including healthcare workers. Conclusion This research highlights the need to address the broader healthcare provision environment, improving alignment of policies and programming in order to strengthen provision of effective health services that people from Key Populations will be able to access. PMID:29568645

  4. A Polynomial Subset-Based Efficient Multi-Party Key Management System for Lightweight Device Networks.

    PubMed

    Mahmood, Zahid; Ning, Huansheng; Ghafoor, AtaUllah

    2017-03-24

    Wireless Sensor Networks (WSNs) consist of lightweight devices that measure sensitive data and are highly vulnerable to security attacks due to their constrained resources. In a similar manner, the internet-based lightweight devices used in the Internet of Things (IoT) face severe security and privacy issues because their connection to the internet makes them directly accessible. Complex and resource-intensive security schemes are infeasible and reduce the network lifetime. In this regard, we have explored polynomial distribution-based key establishment schemes and identified the issue that the resultant polynomial value is either storage-intensive or infeasible to compute when large values are multiplied. It becomes more costly when these polynomials are regenerated dynamically after each node join or leave operation and whenever the key is refreshed. To reduce the computation, we have proposed an Efficient Key Management (EKM) scheme for multiparty communication-based scenarios. The proposed session key management protocol is established by applying a symmetric polynomial for group members, with the group head acting as the responsible node. The polynomial generation method uses security credentials and a secure hash function. The symmetric cryptographic parameters are efficient in computation, communication, and the storage required. The security justification of the proposed scheme has been completed using Rubin logic, which guarantees that the protocol strongly attains the mutual-validation and session-key-agreement properties among the participating entities. Simulation scenarios are performed using NS 2.35 to validate the results for storage, communication, latency, energy, and polynomial calculation costs during the authentication, session key generation, node migration, secure joining, and leaving phases. EKM is efficient regarding storage, computation, and communication overhead and can protect WSN-based IoT infrastructure.
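
    A classic symmetric bivariate polynomial scheme (Blundo-style) illustrates the key idea behind polynomial-based pairwise keys. This is a generic sketch, not the paper's EKM construction itself, and the prime, degree, and node ids are arbitrary choices:

```python
import random

random.seed(7)
P = 2_147_483_647            # prime modulus (2^31 - 1), illustrative
t = 3                        # polynomial degree = collusion threshold

# The group head picks a symmetric coefficient matrix a[i][j] == a[j][i],
# defining f(x, y) = sum a[i][j] * x**i * y**j (mod P), so f(x, y) = f(y, x).
a = [[0] * (t + 1) for _ in range(t + 1)]
for i in range(t + 1):
    for j in range(i, t + 1):
        a[i][j] = a[j][i] = random.randrange(P)

def share(node_id):
    # A node's share is the univariate polynomial g(y) = f(node_id, y),
    # stored as its t+1 coefficients.
    return [sum(a[i][j] * pow(node_id, i, P) for i in range(t + 1)) % P
            for j in range(t + 1)]

def pair_key(my_share, peer_id):
    # Evaluating one's own share at the peer's id yields f(me, peer).
    return sum(c * pow(peer_id, j, P) for j, c in enumerate(my_share)) % P

s5, s9 = share(5), share(9)
assert pair_key(s5, 9) == pair_key(s9, 5)   # both sides derive the same key
```

    The degree t acts as the security threshold: any t + 1 colluding nodes can reconstruct f, which is why regenerating the polynomial on membership changes is costly and worth optimizing.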

  6. A Novel Method for Measuring the Diffusion, Partition and Convective Mass Transfer Coefficients of Formaldehyde and VOC in Building Materials

    PubMed Central

    Xiong, Jianyin; Huang, Shaodan; Zhang, Yinping

    2012-01-01

    The diffusion coefficient (D_m) and the material/air partition coefficient (K) are two key parameters characterizing the sorption behavior of formaldehyde and volatile organic compounds (VOC) in building materials. Based on the sorption process in an airtight chamber, this paper proposes a novel method to measure these two key parameters, as well as the convective mass transfer coefficient (h_m). Compared to traditional methods, it has the following merits: (1) K, D_m and h_m can be obtained simultaneously, which makes it convenient to use; (2) it is time-saving, since just one sorption process in an airtight chamber is required; (3) the determination of h_m is based on the formaldehyde and VOC concentration data in the test chamber rather than on the empirical correlations generally obtained from the heat and mass transfer analogy, and is thus more accurate; this can be regarded as a significant improvement. The present method is applied to measure the three parameters using experimental data from the literature, and good results are obtained, which validates the effectiveness of the method. The new method also provides a potential pathway for measuring h_m of semi-volatile organic compounds (SVOC) by using that of VOC. PMID:23145156
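
    The airtight-chamber model behind such measurements is usually a coupled mass balance of the following generic textbook form (a sketch; the paper's exact formulation and boundary conditions may differ):

```latex
% Chamber air (V: air volume, A: exposed material area):
V \frac{dC_a}{dt} = -\, h_m A \left( C_a - \frac{C_m(L,t)}{K} \right)
% 1-D diffusion in the material slab (0 <= x <= L):
\frac{\partial C_m}{\partial t} = D_m \frac{\partial^2 C_m}{\partial x^2}
% Flux continuity at the material--air interface (x = L):
D_m \left. \frac{\partial C_m}{\partial x} \right|_{x=L}
  = h_m \left( C_a - \frac{C_m(L,t)}{K} \right)
```

    Fitting the measured decay of C_a from a single sorption run against such a model is what allows K, D_m and h_m to be estimated simultaneously.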

  7. Computational methods using genome-wide association studies to predict radiotherapy complications and to identify correlative molecular processes

    NASA Astrophysics Data System (ADS)

    Oh, Jung Hun; Kerns, Sarah; Ostrer, Harry; Powell, Simon N.; Rosenstein, Barry; Deasy, Joseph O.

    2017-02-01

    The biological cause of clinically observed variability of normal tissue damage following radiotherapy is poorly understood. We hypothesized that machine/statistical learning methods using single nucleotide polymorphism (SNP)-based genome-wide association studies (GWAS) would identify groups of patients of differing complication risk, and furthermore could be used to identify key biological sources of variability. We developed a novel learning algorithm, called pre-conditioned random forest regression (PRFR), to construct polygenic risk models using hundreds of SNPs, thereby capturing genomic features that confer small differential risk. Predictive models were trained and validated on a cohort of 368 prostate cancer patients for two post-radiotherapy clinical endpoints: late rectal bleeding and erectile dysfunction. The proposed method achieves better predictive performance than existing computational methods. Gene ontology enrichment analysis and protein-protein interaction network analysis were used to identify key biological processes and proteins that are plausible in light of other published studies. In conclusion, we confirm that novel machine learning methods can produce large predictive models (hundreds of SNPs), yielding clinically useful risk stratification models, as well as identifying important underlying biological processes in the radiation damage and tissue repair process. The methods are generally applicable to GWAS data and are not specific to radiotherapy endpoints.

  8. Effective Social Media Practices for Communicating Climate Change Science to Community Leaders

    NASA Astrophysics Data System (ADS)

    Estrada, M.; DeBenedict, C.; Bruce, L.

    2016-12-01

    Climate Education Partners (CEP) uses an action research approach to increase climate knowledge and informed decision-making among key influential (KI) leaders in San Diego county. Social media has been one method for disseminating knowledge, and its use has proliferated during CEP's project years. To capitalize on this trend, CEP iteratively developed a strategic method to engage KIs. First, as with all climate education, CEP identified the audience: the three primary Facebook and Twitter audiences were CEP's internal team, local KIs, and strategic partner organizations. Second, post contents were chosen based on interest to CEP's key audiences and followed CEP's communications message triangle, which incorporates the Tripartite Integration Model of Social Influence (TIMSI). This message triangle focuses on San Diegans' valued quality of life, the future challenges we face due to the changing climate, and the ways in which we are working together to protect our quality of life for future generations. Third, an editorial calendar was created to carefully time posts, capitalizing on when target audiences were using social media most and maintaining consistency. The results of these three actions were significant. Results were obtained using Facebook and Twitter data, which track post reach, total followers/likes, and engagement (likes, comments, mentions, shares). For example, we found that specifically mentioning KIs produced more re-tweets and reached a broader audience. Overall, the data show that CEP's reach to audiences of like-minded individuals and organizations now extends beyond CEP's original local network and reached more than 20,000 accounts on Twitter this year (compared with 460 on Twitter the year before). In summary, by posting and participating in the online conversation strategically, CEP disseminated key educational climate resources and relevant climate change news to educate and engage its target audiences and amplify its work.

  9. Securing Digital Audio using Complex Quadratic Map

    NASA Astrophysics Data System (ADS)

    Suryadi, MT; Satria Gunawan, Tjandra; Satria, Yudi

    2018-03-01

    In this digital era, exchanging data is common and easy, and data are therefore vulnerable to attack and manipulation by unauthorized parties. One data type that is vulnerable to attack is digital audio, so we need a securing method that is both robust and fast. One method that matches these criteria is securing the data using a chaos function. The chaos function used in this research is the complex quadratic map (CQM). For certain parameter values, the key stream generated by the CQM function passes all 15 NIST tests, which means the key stream generated by the CQM is demonstrably random. In addition, samples of the encrypted digital audio, when tested using a goodness-of-fit test, are shown to be uniform, so audio secured by this method is not vulnerable to frequency-analysis attack. The key space is very large, about 8.1×10^31 possible keys, and the key sensitivity is very small, about 10^-10, so this method is also not vulnerable to brute-force attack. Finally, the processing speed for both encryption and decryption is on average about 450 times faster than the digital audio's duration.
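
    The keystream idea can be sketched in a few lines. The map parameter, seed, byte-extraction rule, and boundedness guard below are all illustrative assumptions, not the paper's NIST-tested values:

```python
def cqm_keystream(z, c, n):
    # Iterate the complex quadratic map z <- z^2 + c and extract one
    # key byte per step from the real component.
    out = []
    for _ in range(n):
        z = z * z + c
        if abs(z) > 2:                  # illustrative guard: keep the orbit bounded
            z = z / (abs(z) ** 2)
        out.append(int(abs(z.real) * 1e6) % 256)
    return out

def crypt(data, c=complex(-0.1, 0.65), seed=complex(0.1, 0.1)):
    # XOR with the chaotic keystream; applying it twice restores the input.
    ks = cqm_keystream(seed, c, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

audio = bytes(range(32))                # stand-in for raw PCM audio bytes
enc = crypt(audio)
assert crypt(enc) == audio              # the same (c, seed) pair decrypts
```

    The key-sensitivity claim corresponds to the fact that perturbing c or the seed by a tiny amount yields an unrelated keystream after a few iterations.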

  10. Discussion and a new method of optical cryptosystem based on interference

    NASA Astrophysics Data System (ADS)

    Lu, Dajiang; He, Wenqi; Liao, Meihua; Peng, Xiang

    2017-02-01

    A discussion and an objective security analysis of the well-known interference-based optical image encryption scheme are presented in this paper. A new method is also proposed to eliminate the security risk of the original cryptosystem. For practical application, we extend this new method into a hierarchical authentication scheme. In this authentication system, with a pre-generated, fixed random phase lock, different target images indicating different authentication levels are analytically encoded into corresponding phase-only masks (phase keys) and amplitude-only masks (amplitude keys). In the authentication process, a legitimate user obtains the specified target image at the output plane if his/her phase key and amplitude key, which must be placed close against the fixed internal phase lock, are respectively illuminated by two coherent beams. By comparing the target image with the standard certification images in the database, the system can verify the user's legality and even his/her identity level. Moreover, although the internal phase lock of the system is fixed, the crosstalk between pairs of keys held by different users is low. Theoretical analysis and numerical simulation both demonstrate the validity of this method.

  11. Techniques for Sea Ice Characteristics Extraction and Sea Ice Monitoring Using Multi-Sensor Satellite Data in the Bohai Sea-Dragon 3 Programme Final Report (2012-2016)

    NASA Astrophysics Data System (ADS)

    Zhang, Xi; Zhang, Jie; Meng, Junmin

    2016-08-01

    The objectives of the Dragon-3 programme (ID: 10501) were to develop methods for classifying sea ice types and retrieving ice thickness from multi-sensor data. In this final results paper, we give a brief introduction to our research work and main results. Key words: the Bohai Sea ice, Sea ice, optical and

  12. Exploring the potential for using results-based financing to address non-communicable diseases in low- and middle-income countries

    PubMed Central

    2013-01-01

    Background The burden of disease due to non-communicable diseases (NCDs) is rising in low- and middle-income countries (LMICs) and funding for global health is increasingly limited. As a large contributor of development assistance for health, the US government has the potential to influence overall trends in NCDs. Results-based financing (RBF) has been proposed as a strategy to increase aid effectiveness and efficiency through incentives for positive performance and results in health programs, but its potential for addressing NCDs has not been explored. Methods Qualitative methods including literature review and key informant interviews were used to identify promising RBF mechanisms for addressing NCDs in resource-limited settings. Eight key informants identified by area of expertise participated in semi-structured interviews. Results The majority of RBF schemes to date have been applied to maternal and child health. Evidence from existing RBF programs suggests that RBF principles can be applied to health programs for NCDs. Several options were identified for US involvement with RBF for NCDs. Conclusion There is potential for the US to have a significant impact on NCDs in LMICs through a comprehensive RBF strategy for global health. RBF mechanisms should be tested for use in NCD programs through pilot programs incorporating robust impact evaluations. PMID:23368959

  13. State Recognition of Bone Drilling Based on Acoustic Emission in Pedicle Screw Operation.

    PubMed

    Guan, Fengqing; Sun, Yu; Qi, Xiaozhi; Hu, Ying; Yu, Gang; Zhang, Jianwei

    2018-05-09

    Pedicle drilling is an important step in pedicle screw fixation, and the most significant challenge in this operation is determining a key point in the transition region between cancellous and inner cortical bone. The purpose of this paper is to find a method for recognizing this key point. After acquiring acoustic emission (AE) signals during the drilling process, we propose a novel frequency-distribution-based algorithm (FDB) to analyze the AE signals in the frequency domain after certain preprocessing. We then select a specific frequency band of the signal, apply standard operations, and fit the resulting sequence with a fitting function whose characteristics are extracted as outputs for identifying the different bone layers. Results obtained by force-signal detection and direct measurement are given in the paper for comparison; the results obtained from the AE signals distinguish the different bone layers and are more accurate and precise. The outputs of the algorithm are used to train a neural network for identification, and the recognition rate reaches 84.2%. The proposed method is shown to be efficient and can be used for bone-layer identification in pedicle screw fixation.

  14. Associations between the Five-Factor Model of Personality and Health Behaviors among College Students

    ERIC Educational Resources Information Center

    Raynor, Douglas A.; Levine, Heidi

    2009-01-01

    Objective: In fall 2006, the authors examined associations between the five-factor model of personality and several key health behaviors. Methods: College students (N = 583) completed the American College Health Association-National College Health Assessment and the International Personality Item Pool Big Five short-form questionnaire. Results:…

  15. Pre-Service Versus In-Service Science Teachers' Views of NOS

    ERIC Educational Resources Information Center

    Hoh, Yin Kiong

    2013-01-01

    This article reports on the results of a paper-pen questionnaire study involving certain key aspects of the nature of science. The questionnaire covers, among other things, aspects such as uniqueness of the scientific method, objectivity of scientific data, and immutability of scientific laws. The survey was given out to eighty trainee teachers…

  16. Exploring Parent Perceptions of the Food Environment in Youth Sport

    ERIC Educational Resources Information Center

    Thomas, Megan; Nelson, Toben F.; Harwood, Eileen; Neumark-Sztainer, Dianne

    2012-01-01

    Objective: To examine parent perceptions of the food environment in youth sport. Methods: Eight focus group discussions were held with parents (n = 60) of youth aged 6-13 years participating in basketball programs in Minnesota. Key themes and concepts were identified via transcript-based analysis. Results: Parents reported that youth commonly…

  17. Environmental Contaminants, Metabolites, Cells, Organ Tissues, and Water: All in a Day’s Work at the EPA Analytical Chemistry Research Core

    EPA Science Inventory

    The talk will highlight key aspects and results of analytical methods the EPA National Health and Environmental Effects Research Laboratory (NHEERL) Analytical Chemistry Research Core (ACRC) develops and uses to provide data on disposition, metabolism, and effects of environmenta...

  18. Latino High School Students' Perceptions of Caring: Keys to Success

    ERIC Educational Resources Information Center

    Garza, Rubén; Soto Huerta, Mary Esther

    2014-01-01

    This mixed methods investigation specifically examined Latino high school adolescents' perceptions of teacher behaviors that demonstrate caring. A chi-square test was conducted to analyze the frequency of responses, and focus group interviews were conducted to expand on the results. The data indicated that although Latino male students were as…

  19. Effectiveness of Solution-Focused Brief Therapy: A Systematic Qualitative Review of Controlled Outcome Studies

    ERIC Educational Resources Information Center

    Gingerich, Wallace J.; Peterson, Lance T.

    2013-01-01

    Objective: We review all available controlled outcome studies of solution-focused brief therapy (SFBT) to evaluate evidence of its effectiveness. Method: Forty-three studies were located and key data abstracted on problem, setting, SFBT intervention, design characteristics, and outcomes. Results: Thirty-two (74%) of the studies reported…

  20. Ambiguous Loss and Posttraumatic Stress in School-Age Children of Prisoners

    ERIC Educational Resources Information Center

    Bocknek, Erika London; Sanderson, Jessica; Britner, Preston A., IV

    2009-01-01

    We describe a sample of school-age children of incarcerated parents enrolled in a federally funded mentoring program. A mixed methods approach was applied to discern key themes related to caregiver incarceration. Results demonstrated a high prevalence of posttraumatic stress as well as high rates of internalizing and externalizing behaviors.…

  1. Effective Computer-Aided Assessment of Mathematics; Principles, Practice and Results

    ERIC Educational Resources Information Center

    Greenhow, Martin

    2015-01-01

    This article outlines some key issues for writing effective computer-aided assessment (CAA) questions in subjects with substantial mathematical or statistical content, especially the importance of control of random parameters and the encoding of wrong methods of solution (mal-rules) commonly used by students. The pros and cons of using CAA and…

  2. Collaborative Writing in a Statistics and Research Methods Course.

    ERIC Educational Resources Information Center

    Dunn, Dana S.

    1996-01-01

    Describes a collaborative writing project in which students must identify key variables, search and read relevant literature, and reason through a research idea by working closely with a partner. The end result is a polished laboratory report in the APA style. The class includes a peer review workshop prior to final editing. (MJP)

  3. Compact fusion energy based on the spherical tokamak

    NASA Astrophysics Data System (ADS)

    Sykes, A.; Costley, A. E.; Windsor, C. G.; Asunta, O.; Brittles, G.; Buxton, P.; Chuyanov, V.; Connor, J. W.; Gryaznevich, M. P.; Huang, B.; Hugill, J.; Kukushkin, A.; Kingham, D.; Langtry, A. V.; McNamara, S.; Morgan, J. G.; Noonan, P.; Ross, J. S. H.; Shevchenko, V.; Slade, R.; Smith, G.

    2018-01-01

    Tokamak Energy Ltd, UK, is developing spherical tokamaks using high temperature superconductor magnets as a possible route to fusion power using relatively small devices. We present an overview of the development programme including details of the enabling technologies, the key modelling methods and results, and the remaining challenges on the path to compact fusion.

  4. Understanding Textual Authorship in the Digital Environment: Lessons from Historical Perspectives

    ERIC Educational Resources Information Center

    Velagic, Zoran; Hasenay, Damir

    2013-01-01

    Introduction: The paper explains how the modern understanding of authorship developed and sets out the problems to be considered when discussing digital authorship. Method: The contextual analysis of contents of the key themes is employed; in the articulation of the conclusions, analytic and synthetic approaches are used. Results: At each turning…

  5. The Importance of Experiential Learning

    NASA Astrophysics Data System (ADS)

    Stanford, Jennifer

    2017-04-01

    As student numbers increase year on year, the ability to provide experiential learning opportunities and individual formative feedback is decreasing. As an important mechanism for cementing understanding of key concept thresholds in physical Earth sciences, practical based learning is paramount, especially for students with diverse learning abilities. According to Steinaker & Bell's taxonomy, experiential learning and dissemination of information to peers is key for students to make the transition to being much deeper learners. Furthermore, practical based learning also provides opportunity for varied methods of assessment, which are otherwise more challenging to devise. I here present results from practical, experiential based learning within the context of Foundation Year teaching, which shows that predominantly, students found experiential learning to be both a positive and rewarding part of their curriculum. Key aspects of these findings are now being translated to the design of new curricula.

  6. Joint image encryption and compression scheme based on a new hyperchaotic system and curvelet transform

    NASA Astrophysics Data System (ADS)

    Zhang, Miao; Tong, Xiaojun

    2017-07-01

    This paper proposes a joint image encryption and compression scheme based on a new hyperchaotic system and the curvelet transform. A new five-dimensional hyperchaotic system based on the Rabinovich system is presented, and a new pseudorandom key stream generator is constructed from it. The algorithm adopts a diffusion-and-confusion structure to perform encryption, based on the key stream generator and the proposed hyperchaotic system. The key sequence used for image encryption is related to the plaintext. By means of the second-generation curvelet transform, run-length coding, and Huffman coding, the image data are compressed, so that compression and encryption are performed jointly in a single process. The security test results indicate that the proposed method has high security and a good compression effect.
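    A keystream-driven diffusion step of the kind described can be sketched with a one-dimensional logistic map standing in for the paper's five-dimensional hyperchaotic Rabinovich-based generator (an assumption made here purely for illustration):

```python
def logistic_keystream(x0, r, n, skip=100):
    """Pseudorandom byte stream from a logistic map -- an illustrative
    stand-in for the paper's 5-D hyperchaotic key stream generator."""
    x = x0
    for _ in range(skip):          # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

def xor_cipher(data, key):
    """Simple diffusion by XOR with the keystream (symmetric)."""
    return bytes(b ^ k for b, k in zip(data, key))

plain = b"curvelet coefficients"
ks = logistic_keystream(0.3141592, 3.9999, len(plain))
cipher = xor_cipher(plain, ks)
recovered = xor_cipher(cipher, ks)   # applying XOR again decrypts
```

    A plaintext-dependent key sequence, as in the paper, would additionally mix a digest of the plaintext into `x0`.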

  7. Turbulence study in the vicinity of piano key weir: relevance, instrumentation, parameters and methods

    NASA Astrophysics Data System (ADS)

    Tiwari, Harinarayan; Sharma, Nayan

    2017-05-01

    This research paper focuses on the need to study turbulence, on instruments reliable enough to capture it, on different turbulence parameters, and on advanced methodologies that can decompose turbulence structures at different levels near hydraulic structures. Small-scale turbulence research has promising prospects in open channel flow. The relevance of the study is amplified because any hydraulic structure introduced in the channel disturbs the natural flow and creates a discontinuity. To recover this discontinuity, the piano key weir (PKW) may be used with sloped keys. The scarcity of empirical results in the vicinity of the PKW necessitates extensive laboratory experiments with fair and reliable instrumentation techniques. Using principal component analysis, the acoustic Doppler velocimeter was established to be best suited, within a range of limitations. Wavelet analysis is proposed to better decompose the underlying turbulence structure.
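    The scale-by-scale decomposition that wavelet analysis provides can be illustrated with a minimal Haar transform. The paper does not specify a wavelet family, so Haar is an assumption made here for simplicity:

```python
def haar_decompose(signal, levels):
    """Multi-level Haar wavelet decomposition: at each level the signal is
    split into a coarser approximation and a detail (fluctuation) series,
    the kind of scale-by-scale separation used for turbulence structures."""
    coeffs, approx = [], list(signal)
    for _ in range(levels):
        pairs = list(zip(approx[0::2], approx[1::2]))
        detail = [(a - b) / 2 for a, b in pairs]   # small-scale fluctuation
        approx = [(a + b) / 2 for a, b in pairs]   # large-scale trend
        coeffs.append(detail)
    return approx, coeffs

# toy velocity record (length must be divisible by 2**levels)
sig = [1.0, 3.0, 2.0, 2.0, 5.0, 1.0, 0.0, 4.0]
approx, details = haar_decompose(sig, 3)
```

    After a full decomposition the final approximation reduces to the signal mean, while the detail series at each level carry the fluctuations at that scale.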

  8. Channel-parameter estimation for satellite-to-submarine continuous-variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Guo, Ying; Xie, Cailang; Huang, Peng; Li, Jiawei; Zhang, Ling; Huang, Duan; Zeng, Guihua

    2018-05-01

    This paper deals with channel-parameter estimation for continuous-variable quantum key distribution (CV-QKD) over a satellite-to-submarine link. In particular, we focus on the channel transmittances and the excess noise, which are affected by atmospheric turbulence, surface roughness, zenith angle of the satellite, wind speed, submarine depth, etc. The estimation method is based on the proposed algorithms and is applied to low-Earth orbits using a Monte Carlo approach. For light at 550 nm with a repetition frequency of 1 MHz, the effects of the estimated parameters on the performance of the CV-QKD system are assessed via simulation by comparing the secret key bit rate in the daytime and at night. Our results show the feasibility of satellite-to-submarine CV-QKD, providing an unconditionally secure approach toward global networks for underwater communications.

  9. Use of conserved key amino acid positions to morph protein folds.

    PubMed

    Reddy, Boojala V B; Li, Wilfred W; Bourne, Philip E

    2002-07-15

    By using three-dimensional (3D) structure alignments and a previously published method to determine Conserved Key Amino Acid Positions (CKAAPs) we propose a theoretical method to design mutations that can be used to morph the protein folds. The original Paracelsus challenge, met by several groups, called for the engineering of a stable but different structure by modifying less than 50% of the amino acid residues. We have used the sequences from the Protein Data Bank (PDB) identifiers 1ROP, and 2CRO, which were previously used in the Paracelsus challenge by those groups, and suggest mutation to CKAAPs to morph the protein fold. The total number of mutations suggested is less than 40% of the starting sequence theoretically improving the challenge results. From secondary structure prediction experiments of the proposed mutant sequence structures, we observe that each of the suggested mutant protein sequences likely folds to a different, non-native potentially stable target structure. These results are an early indicator that analyses using structure alignments leading to CKAAPs of a given structure are of value in protein engineering experiments. Copyright 2002 Wiley Periodicals, Inc.

  10. The Psychedelic Debriefing in Alcohol Dependence Treatment: Illustrating Key Change Phenomena through Qualitative Content Analysis of Clinical Sessions

    PubMed Central

    Nielson, Elizabeth M.; May, Darrick G.; Forcehimes, Alyssa A.; Bogenschutz, Michael P.

    2018-01-01

    Research on the clinical applications of psychedelic-assisted psychotherapy has demonstrated promising early results for treatment of alcohol dependence. Detailed description of the content and methods of psychedelic-assisted psychotherapy, as it is conducted in clinical settings, is scarce. Methods: An open-label pilot (proof-of-concept) study of psilocybin-assisted treatment of alcohol dependence (NCT01534494) was conducted to generate data for a phase 2 RCT (NCT02061293) of a similar treatment in a larger population. The present paper presents a qualitative content analysis of the 17 debriefing sessions conducted in the pilot study, which occurred the day after corresponding psilocybin medication sessions. Results: Participants articulated a series of key phenomena related to change in drinking outcomes and acute subjective effects of psilocybin. Discussion: The data illuminate change processes in patients' own words during clinical sessions, shedding light on potential therapeutic mechanisms of change and how participants express effects of psilocybin. This study is unique in analyzing actual clinical sessions, as opposed to interviews of patients conducted separately from treatment. PMID:29515449

  11. Biased random key genetic algorithm with insertion and gender selection for capacitated vehicle routing problem with time windows

    NASA Astrophysics Data System (ADS)

    Rochman, Auliya Noor; Prasetyo, Hari; Nugroho, Munajat Tri

    2017-06-01

    The Vehicle Routing Problem (VRP) often occurs when manufacturers need to distribute their products to customers or outlets. The distribution process is typically restricted by the capacity of the vehicle and the working hours at the distributor; this type of VRP is known as the Capacitated Vehicle Routing Problem with Time Windows (CVRPTW). A Biased Random Key Genetic Algorithm (BRKGA) was designed and coded in MATLAB to solve a CVRPTW case of soft drink distribution. The standard BRKGA was then modified by applying chromosome insertion into the initial population and by defining chromosome gender for parents undergoing the crossover operation. The performance of the resulting algorithms was compared to a heuristic procedure for the same soft drink distribution problem. The findings are: (1) the total distribution cost of BRKGA with insertion (BRKGA-I) represents a saving of 39% compared to the total cost of the heuristic method; (2) BRKGA with gender selection (BRKGA-GS) could further improve on the heuristic method, although it tends to yield worse results than the standard BRKGA.
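    The random-key representation at the heart of a BRKGA can be illustrated with a hypothetical decoder for capacitated routing: each customer receives a key in [0, 1), customers are visited in key order, and routes are cut greedily at the capacity limit. Time windows and the bias/gender mechanisms are omitted here for brevity; this is a sketch, not the paper's MATLAB implementation.

```python
import random

def decode_route(keys, demands, capacity):
    """Decode a random-key chromosome into capacitated routes: sort
    customers by key, then split greedily whenever capacity is exceeded."""
    order = sorted(range(len(keys)), key=lambda i: keys[i])
    routes, cur, load = [], [], 0
    for c in order:
        if load + demands[c] > capacity:
            routes.append(cur)
            cur, load = [], 0
        cur.append(c)
        load += demands[c]
    if cur:
        routes.append(cur)
    return routes

random.seed(1)
demands = [4, 3, 2, 5, 1]                     # hypothetical customer demands
keys = [random.random() for _ in demands]     # one chromosome
routes = decode_route(keys, demands, capacity=6)
```

    Inside a BRKGA, the genetic operators act only on the key vectors; the decoder maps any key vector to a feasible solution, which is what makes the random-key encoding convenient.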

  12. Residual stress evaluation of components produced via direct metal laser sintering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kemerling, Brandon; Lippold, John C.; Fancher, Christopher M.

    Direct metal laser sintering is an additive manufacturing process which is capable of fabricating three-dimensional components using a laser energy source and metal powder particles. Despite the numerous benefits offered by this technology, the process maturity is low with respect to traditional subtractive manufacturing methods. Relationships between key processing parameters and final part properties are generally lacking and require further development. In this study, residual stresses were evaluated as a function of key process variables. The variables evaluated included laser scan strategy and build plate preheat temperature. Residual stresses were measured experimentally via neutron diffraction and computationally via finite element analysis. Good agreement was shown between the experimental and computational results. Results showed variations in the residual stress profile as a function of laser scan strategy. Compressive stresses were dominant along the build height (z) direction, and tensile stresses were dominant in the x and y directions. Build plate preheating was shown to be an effective method for alleviating residual stress due to the reduction in thermal gradient.

  13. Residual stress evaluation of components produced via direct metal laser sintering

    DOE PAGES

    Kemerling, Brandon; Lippold, John C.; Fancher, Christopher M.; ...

    2018-03-22

    Direct metal laser sintering is an additive manufacturing process which is capable of fabricating three-dimensional components using a laser energy source and metal powder particles. Despite the numerous benefits offered by this technology, the process maturity is low with respect to traditional subtractive manufacturing methods. Relationships between key processing parameters and final part properties are generally lacking and require further development. In this study, residual stresses were evaluated as a function of key process variables. The variables evaluated included laser scan strategy and build plate preheat temperature. Residual stresses were measured experimentally via neutron diffraction and computationally via finite element analysis. Good agreement was shown between the experimental and computational results. Results showed variations in the residual stress profile as a function of laser scan strategy. Compressive stresses were dominant along the build height (z) direction, and tensile stresses were dominant in the x and y directions. Build plate preheating was shown to be an effective method for alleviating residual stress due to the reduction in thermal gradient.

  14. Performance analysis of EM-based blind detection for ON-OFF keying modulation over atmospheric optical channels

    NASA Astrophysics Data System (ADS)

    Dabiri, Mohammad Taghi; Sadough, Seyed Mohammad Sajad

    2018-04-01

    In free-space optical (FSO) links, atmospheric turbulence leads to scintillation in the received signal. Due to its ease of implementation, intensity modulation with direct detection (IM/DD) based on ON-OFF keying (OOK) is a popular signaling scheme in these systems. Over a turbulence channel, to detect OOK symbols blindly, i.e., without sending pilot symbols, an expectation-maximization (EM)-based detection method was recently proposed in the FSO communication literature. However, the performance of EM-based detection severely depends on the length of the observation interval (Ls). To choose the optimum value of Ls at the target bit error rates (BERs) of FSO communications, which are commonly lower than 10^-9, Monte Carlo simulations would be very cumbersome and require a very long processing time. To facilitate performance evaluation, in this letter we derive analytic expressions for the BER and outage probability. Numerical results validate the accuracy of the derived expressions, which may serve to evaluate the optimum value of Ls without resorting to time-consuming Monte Carlo simulations.
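    The blind-EM idea can be sketched on a toy model: without pilots, EM estimates the "off" and "on" intensity levels from an observation window of received samples, and a decision threshold is set midway. This simplified stand-in assumes a fixed channel gain and a shared Gaussian noise variance; it is not the letter's analytic derivation.

```python
import random, math

def em_ook(samples, iters=50):
    """Blindly estimate OOK 'off'/'on' levels by EM on a two-component
    Gaussian mixture with a shared variance (simplified sketch)."""
    mu0, mu1, var = min(samples), max(samples), 1.0
    for _ in range(iters):
        # E-step: responsibility of the 'on' component for each sample
        r = []
        for x in samples:
            p0 = math.exp(-(x - mu0) ** 2 / (2 * var))
            p1 = math.exp(-(x - mu1) ** 2 / (2 * var))
            r.append(p1 / (p0 + p1))
        # M-step: re-estimate the two means and the shared variance
        w = sum(r)
        mu1 = sum(ri * x for ri, x in zip(r, samples)) / w
        mu0 = sum((1 - ri) * x for ri, x in zip(r, samples)) / (len(samples) - w)
        var = sum(ri * (x - mu1) ** 2 + (1 - ri) * (x - mu0) ** 2
                  for ri, x in zip(r, samples)) / len(samples)
    return mu0, mu1

random.seed(2)
bits = [random.randint(0, 1) for _ in range(400)]          # observation window
rx = [b * 2.0 + random.gauss(0, 0.3) for b in bits]        # fixed gain + AWGN
mu0, mu1 = em_ook(rx)
decided = [int(x > (mu0 + mu1) / 2) for x in rx]           # midpoint threshold
errors = sum(d != b for d, b in zip(decided, bits))
```

    The window length (here 400 samples) plays the role of Ls: shorter windows give noisier level estimates, which is exactly why the letter's analytic BER expressions are useful.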

  15. Assessing Opportunities for Student Pharmacist Leadership Development at Schools of Pharmacy in the United States.

    PubMed

    Feller, Tara T; Doucette, William R; Witry, Matthew J

    2016-06-25

    Objective. To summarize student pharmacist leadership development opportunities delivered by pharmacy programs, to describe selected opportunities, and to assess how these opportunities meet leadership development competencies. Methods. A multi-method study was conducted that comprised a systematic content analysis of pharmacy education journals, pharmacy program websites, and telephone interviews with key informants, which included open-ended questions and scaled responses. Results. Review of six articles, 37 American Association of Colleges of Pharmacy (AACP) Annual Meeting abstracts, and 138 websites resulted in the identification of 191 leadership development opportunities. These consisted of courses, projects/programs, and events/speaker series. Interviews with 12 key informants detailed unique events that developed leadership competencies. Formal assessments of student leadership development were limited and primarily focused on informal feedback and course evaluations. Conclusion. Most US pharmacy programs offer their students an array of opportunities to develop leadership abilities. Pharmacy programs should consider expanding opportunities beyond elective courses, learn from the successes of others to implement new leadership development opportunities, and bolster the assessment of student leadership competencies and outcomes.

  16. Assessing Opportunities for Student Pharmacist Leadership Development at Schools of Pharmacy in the United States

    PubMed Central

    Feller, Tara T.; Witry, Matthew J.

    2016-01-01

    Objective. To summarize student pharmacist leadership development opportunities delivered by pharmacy programs, to describe selected opportunities, and to assess how these opportunities meet leadership development competencies. Methods. A multi-method study was conducted that comprised a systematic content analysis of pharmacy education journals, pharmacy program websites, and telephone interviews with key informants, which included open-ended questions and scaled responses. Results. Review of six articles, 37 American Association of Colleges of Pharmacy (AACP) Annual Meeting abstracts, and 138 websites resulted in the identification of 191 leadership development opportunities. These consisted of courses, projects/programs, and events/speaker series. Interviews with 12 key informants detailed unique events that developed leadership competencies. Formal assessments of student leadership development were limited and primarily focused on informal feedback and course evaluations. Conclusion. Most US pharmacy programs offer their students an array of opportunities to develop leadership abilities. Pharmacy programs should consider expanding opportunities beyond elective courses, learn from the successes of others to implement new leadership development opportunities, and bolster the assessment of student leadership competencies and outcomes. PMID:27402982

  17. Security Analysis of Measurement-Device-Independent Quantum Key Distribution in Collective-Rotation Noisy Environment

    NASA Astrophysics Data System (ADS)

    Li, Na; Zhang, Yu; Wen, Shuang; Li, Lei-lei; Li, Jian

    2018-01-01

    Noise is a problem that communication channels cannot avoid, so it is beneficial to analyze the security of MDI-QKD in a noisy environment. An analysis model for collective-rotation noise is introduced, and information-theoretic methods are used to analyze the security of the protocol. The maximum amount of information that Eve can eavesdrop is 50%, and the eavesdropping can always be detected if the noise level ɛ ≤ 0.68. Therefore, the MDI-QKD protocol is secure as a quantum key distribution protocol. The maximum probability that the relay outputs successful results under eavesdropping is 16%; moreover, this probability is higher with eavesdropping than without it. The paper validates that the MDI-QKD protocol has good robustness.

  18. A Novel Real-Time Reference Key Frame Scan Matching Method

    PubMed Central

    Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu

    2017-01-01

    Unmanned aerial vehicles represent an effective technology for indoor search and rescue operations. Typically, most indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by a simultaneous localization and mapping approach using either local or global methods. Both suffer from accumulated errors and high processing time due to the iterative nature of the scan matching method; moreover, point-to-point scan matching is prone to outlier associations. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF), a hybrid scan matching technique comprising feature-to-feature and point-to-point approaches. The algorithm aims at mitigating error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. It falls back on the iterative closest point algorithm when linear features are lacking, as is typical in unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, the mapping performance and time consumption are compared with various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results and very short computational time, indicating its potential for use in real-time systems. PMID:28481285
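    The point-to-point fallback the abstract mentions, the iterative closest point (ICP) algorithm, can be sketched in 2D. This minimal version (nearest-neighbour association plus a closed-form rotation/translation update) is illustrative only; it omits the RKF and feature-based components entirely.

```python
import math

def icp_2d(src, dst, iters=20):
    """Minimal point-to-point ICP in 2D: nearest-neighbour matching, then a
    closed-form Procrustes rotation+translation, repeated until aligned."""
    pts = list(src)
    for _ in range(iters):
        # nearest-neighbour association (the step prone to outliers)
        pairs = [(p, min(dst, key=lambda q: (q[0]-p[0])**2 + (q[1]-p[1])**2))
                 for p in pts]
        n = len(pairs)
        mx = sum(p[0] for p, _ in pairs) / n
        my = sum(p[1] for p, _ in pairs) / n
        nx = sum(q[0] for _, q in pairs) / n
        ny = sum(q[1] for _, q in pairs) / n
        # closed-form 2D rotation from cross/dot accumulators
        s = sum((p[0]-mx)*(q[1]-ny) - (p[1]-my)*(q[0]-nx) for p, q in pairs)
        c = sum((p[0]-mx)*(q[0]-nx) + (p[1]-my)*(q[1]-ny) for p, q in pairs)
        dth = math.atan2(s, c)
        cos, sin = math.cos(dth), math.sin(dth)
        pts = [(cos*(p[0]-mx) - sin*(p[1]-my) + nx,
                sin*(p[0]-mx) + cos*(p[1]-my) + ny) for p in pts]
    return pts

# a square "scan", rotated 10 degrees and shifted; ICP should snap it back
dst = [(float(x), float(y)) for x in range(5) for y in range(5)]
a = math.radians(10)
src = [(math.cos(a)*x - math.sin(a)*y + 0.5,
        math.sin(a)*x + math.cos(a)*y - 0.3) for x, y in dst]
aligned = icp_2d(src, dst)
err = max(math.hypot(p[0]-q[0], p[1]-q[1]) for p, q in zip(aligned, dst))
```

    Each iteration re-associates points and re-solves the transform, which is the iterative cost the RKF approach is designed to reduce.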

  19. The construction and assessment of a statistical model for the prediction of protein assay data.

    PubMed

    Pittman, J; Sacks, J; Young, S Stanley

    2002-01-01

    The focus of this work is the development of a statistical model for a bioinformatics database whose distinctive structure makes model assessment an interesting and challenging problem. The key components of the statistical methodology, including a fast approximation to the singular value decomposition and the use of adaptive spline modeling and tree-based methods, are described, and preliminary results are presented. These results are shown to compare favorably with results achieved using comparative methods. An attempt to determine the predictive ability of the model through cross-validation experiments is discussed. In conclusion, a synopsis of the results of these experiments and their implications for the analysis of bioinformatics databases in general is presented.
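    A fast approximation to the SVD of the kind the abstract alludes to can be sketched with power iteration, which recovers the leading singular triple without computing a full decomposition. The paper's actual approximation is not specified here, so this is an illustrative example only.

```python
import math, random

def top_singular(A, iters=200, seed=0):
    """Leading singular triple (sigma, u, v) of matrix A (list of rows)
    by power iteration on A^T A -- a simple fast-SVD approximation."""
    m, n = len(A), len(A[0])
    rng = random.Random(seed)
    v = [rng.random() for _ in range(n)]
    for _ in range(iters):
        # v <- normalize(A^T (A v))
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
        v = [sum(A[i][j] * w[i] for i in range(m)) for j in range(n)]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]
    sigma = math.sqrt(sum(x * x for x in w))
    u = [x / sigma for x in w]
    return sigma, u, v

# rank-1 test matrix: outer([2,1], [1,0,2]), so sigma = |[2,1]|*|[1,0,2]| = 5
A = [[2.0, 0.0, 4.0],
     [1.0, 0.0, 2.0]]
sigma, u, v = top_singular(A)
```

    Repeatedly deflating (subtracting `sigma * u v^T`) yields further singular triples, giving a truncated SVD at a fraction of the cost of a full one.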

  20. Assessing and Valuing Historical Geospatial Data for Decisions

    NASA Astrophysics Data System (ADS)

    Sylak-Glassman, E.; Gallo, J.

    2016-12-01

    We will present a method for assessing the use and valuation of historical geospatial data and information products derived from Earth observations (EO). Historical data are widely used in the establishment of baseline reference cases, time-series analysis, and Earth system modeling, and serve diverse application areas such as risk assessment in the insurance and reinsurance industry, disaster preparedness and response planning, historical demography, land-use change analysis, and paleoclimate research, among others. Establishing the current value of previously collected data, often from EO systems that are no longer operating, is difficult since the costs associated with their preservation, maintenance, and dissemination are current, while the costs associated with their original collection are sunk. Understanding their current use and value can aid in funding decisions about the data management infrastructure and workforce allocation required to maintain their availability. Using a value-tree framework to trace the application of data from EO systems, sensors, networks, and surveys to weighted key Federal objectives, we are able to estimate the relative contribution of individual EO systems, sensors, networks, and surveys to meeting those objectives. The analysis relies on a modified Delphi method to elicit relative levels of reliance on individual EO data inputs, including historical data, from subject matter experts. This results in the identification of a representative portfolio of all EO data used to meet key Federal objectives. Because historical data are evaluated alongside all other EO data within a weighted framework, their contribution to meeting key Federal objectives can be specifically identified and evaluated in relation to other EO data. The results of this method could be applied to better understand and project the long-term value of data from current and future EO systems.
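    The value-tree rollup can be sketched as a weighted sum: each key objective carries a weight, experts elicit a reliance share on each data source per objective, and a system's contribution is the weight-reliance product summed over objectives. The weights and reliance shares below are invented for illustration; they are not from the study.

```python
def system_value(reliance, objective_weights):
    """Value-tree rollup: a data source's score is the weighted sum, over
    key objectives, of the expert-elicited reliance on that source."""
    return sum(objective_weights[obj] * share
               for obj, share in reliance.items())

# hypothetical objective weights and reliance shares (illustration only)
weights = {"disaster_response": 0.6, "climate_research": 0.4}
historical_reliance = {"disaster_response": 0.3, "climate_research": 0.5}
score = system_value(historical_reliance, weights)   # 0.6*0.3 + 0.4*0.5
```

    Scoring every system, sensor, network, and survey the same way yields the representative portfolio the abstract describes, with historical data directly comparable to current data.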

  1. Innovation in mental health services: what are the key components of success?

    PubMed Central

    2011-01-01

    Background Service development innovation in health technology and practice is viewed as a pressing need within the field of mental health, yet it is relatively poorly understood. Macro-level theories have been criticised for their limited explanatory power, and they may not be appropriate for understanding local and fine-grained uncertainties of services and barriers to the sustainability of change. This study aimed to identify contextual influences inhibiting or promoting the acceptance and integration of innovations in mental health services in both National Health Service (NHS) and community settings. Methods A comparative study using qualitative and case study data collection methods, including semi-structured interviews with key stakeholders and follow-up telephone interviews over a one-year period. The analysis was informed by learning organisation theory. Drawn from 11 mental health innovation projects within community, voluntary and NHS settings, 65 participants were recruited, including service users, commissioners, health and non-health professionals, managers, and caregivers. The methods deployed in this evaluation focused on process-outcome links within and between the 11 projects. Results Key barriers to innovation included resistance from corporate departments and middle management, complexity of the innovation, and the availability of and access to resources on a prospective basis within the host organisation. The results informed the construction of a proposed model of innovation implementation within mental health services, the main components of which are context, process, and outcomes. Conclusions The study produced a model of conducive and impeding factors drawn from the composite picture of 11 innovative mental health projects, and this is discussed in light of relevant literature. The model provides a rich agenda to consider for services wanting to innovate or adopt innovations from elsewhere. The evaluation suggested the importance of studying innovation with a focus on context, process, and outcomes. PMID:22029930

  2. Genetic algorithms in teaching artificial intelligence (automated generation of specific algebras)

    NASA Astrophysics Data System (ADS)

    Habiballa, Hashim; Jendryscik, Radek

    2017-11-01

    The problem of teaching essential Artificial Intelligence (AI) methods is an important task for an educator in the branch of soft computing. The key focus is often placed on proper understanding of the principles of AI methods in two essential points: why we use soft-computing methods at all, and how we apply these methods to generate reasonable results in sensible time. We present an interesting problem solved in non-educational research, the automated generation of specific algebras in a huge search space, and use it to emphasize the above-mentioned points as an educational case study.
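    A minimal genetic algorithm of the kind used to introduce these points can be written in a few lines. The sketch below solves the toy OneMax problem (maximize the number of 1-bits), not the algebra-generation task from the research itself:

```python
import random

def onemax_ga(length=20, pop_size=30, gens=80, seed=3):
    """Minimal genetic algorithm on OneMax: tournament selection,
    one-point crossover, and occasional bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    fit = sum                                   # fitness = number of 1-bits
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fit)   # tournament of 3
            p2 = max(rng.sample(pop, 3), key=fit)
            cut = rng.randrange(1, length)
            child = p1[:cut] + p2[cut:]             # one-point crossover
            if rng.random() < 0.2:                  # bit-flip mutation
                i = rng.randrange(length)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fit)

best = onemax_ga()
```

    The pedagogical points map directly: the search space (2^20 strings here, enormously larger for algebras) motivates why exhaustive search is hopeless, and the selection/crossover/mutation loop shows how reasonable results emerge in sensible time.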

  3. [An EMD based time-frequency distribution and its application in EEG analysis].

    PubMed

    Li, Xiaobing; Chu, Meng; Qiu, Tianshuang; Bao, Haiping

    2007-10-01

    The Hilbert-Huang transform (HHT) is a new time-frequency analysis method for nonlinear and non-stationary signals. The key step of this method is empirical mode decomposition (EMD), with which any complicated signal can be decomposed into a finite and small number of intrinsic mode functions (IMFs). In this paper, a new EMD-based method for suppressing the cross-term of the Wigner-Ville distribution (WVD) is developed and applied to analyze epileptic EEG signals. The simulation data and analysis results show that the new method suppresses the cross-term of the WVD effectively, with excellent resolution.

  4. Research and Implementation of Tibetan Word Segmentation Based on Syllable Methods

    NASA Astrophysics Data System (ADS)

    Jiang, Jing; Li, Yachao; Jiang, Tao; Yu, Hongzhi

    2018-03-01

    Tibetan word segmentation (TWS) is an important problem in Tibetan information processing, and abbreviated word recognition is one of the key and most difficult problems in TWS. Most existing methods for Tibetan abbreviated word recognition are rule-based approaches, which need vocabulary support. In this paper, we propose a method based on a sequence tagging model for abbreviated word recognition, and then implement it in TWS systems with sequence labeling models. The experimental results show that our abbreviated word recognition method is fast and effective, can be combined easily with the segmentation model, and significantly improves the performance of Tibetan word segmentation.

  5. Retrieving the aerosol lidar ratio profile by combining ground- and space-based elastic lidars.

    PubMed

    Feiyue, Mao; Wei, Gong; Yingying, Ma

    2012-02-15

    The aerosol lidar ratio is a key parameter for the retrieval of aerosol optical properties from elastic lidar, and it varies widely for aerosols with different chemical and physical properties. We propose a method for retrieving the aerosol lidar ratio profile by combining simultaneous ground- and space-based elastic lidars. The method was tested on a simulated case and a real case at a wavelength of 532 nm. The results demonstrate that our method is robust and can obtain accurate lidar ratio and extinction coefficient profiles. Our method can be useful for determining local and global lidar ratios and for validating space-based lidar datasets.

  6. Anatomising proton NMR spectra with pure shift 2D J-spectroscopy: A cautionary tale

    NASA Astrophysics Data System (ADS)

    Kiraly, Peter; Foroozandeh, Mohammadali; Nilsson, Mathias; Morris, Gareth A.

    2017-09-01

    Analysis of proton NMR spectra has been a key tool in structure determination for over 60 years. A classic tool is 2D J-spectroscopy, but common problems are the difficulty of obtaining the absorption mode lineshapes needed for accurate results, and the need for a 45° shear of the final 2D spectrum. A novel 2D NMR method is reported here that allows straightforward determination of homonuclear couplings, using a modified version of the PSYCHE method to suppress couplings in the direct dimension. The method illustrates the need for care when combining pure shift data acquisition with multiple pulse methods.

  7. Study on evaluation methods for Rayleigh wave dispersion characteristic

    USGS Publications Warehouse

    Shi, L.; Tao, X.; Kayen, R.; Shi, H.; Yan, S.

    2005-01-01

    The evaluation of Rayleigh wave dispersion characteristics is the key step in detecting S-wave velocity structure. By comparing dispersion curves directly with those from the spectral analysis of surface waves (SASW) method, rather than comparing S-wave velocity structures, the validity and precision of the microtremor-array method (MAM) can be evaluated more objectively. The results from the China - US joint surface wave investigation at 26 sites in Tangshan, China, show that the MAM has the same precision as the SASW method at 83% of the 26 sites. The MAM is valid for testing Rayleigh wave dispersion characteristics and has great application potential for detecting site S-wave velocity structure.

  8. Real-Time Biologically Inspired Action Recognition from Key Poses Using a Neuromorphic Architecture.

    PubMed

    Layher, Georg; Brosch, Tobias; Neumann, Heiko

    2017-01-01

    Intelligent agents, such as robots, have to serve a multitude of autonomous functions. Examples are collision avoidance, navigation and route planning, active sensing of the environment, and interaction and non-verbal communication with people in the extended reach space. Here, we focus on the recognition of the actions of a human agent based on a biologically inspired visual architecture for analyzing articulated movements. The proposed processing architecture builds upon coarsely segregated streams of sensory processing along different pathways which separately process form and motion information (Layher et al., 2014). Action recognition is performed in an event-based scheme by identifying representations of characteristic pose configurations (key poses) in an image sequence. In line with perceptual studies, key poses are selected in an unsupervised fashion using a feature-driven criterion which combines extrema in the motion energy with the horizontal and vertical extendedness of a body shape. Per-class representations of key pose frames are learned using a deep convolutional neural network consisting of 15 convolutional layers. The network is trained using the energy-efficient deep neuromorphic networks (Eedn) framework (Esser et al., 2016), which realizes the mapping of the trained synaptic weights onto the IBM Neurosynaptic System platform (Merolla et al., 2014). After the mapping, the trained network achieves real-time capability, processing input streams and classifying input images at about 1,000 frames per second while the computational stages consume only about 70 mW of energy (without spike transduction). Particularly for mobile robotic systems, such a low energy profile might be crucial in a variety of application scenarios. Cross-validation results are reported for two different datasets and compared to state-of-the-art action recognition approaches. The results demonstrate that (I) the presented approach is on par with other key-pose-based methods described in the literature, which select key pose frames by optimizing classification accuracy, (II) compared to training on the full set of frames, representations trained on key pose frames result in a higher confidence in class assignments, and (III) key pose representations show promising generalization capabilities in a cross-dataset evaluation.
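    The unsupervised key pose criterion combines motion-energy extrema with the extendedness of the body shape. The motion-energy half of that criterion can be sketched in a few lines (synthetic frames and a simplified local-maximum test; the shape term and the original selection details are omitted):

```python
import numpy as np

def key_pose_frames(frames):
    """Pick frame indices at local maxima of motion energy (frame-difference norm)."""
    frames = np.asarray(frames, dtype=float)
    # energy[t] measures the change from frame t to frame t + 1.
    energy = np.square(np.diff(frames, axis=0)).sum(axis=(1, 2))
    keys = [t for t in range(1, len(energy) - 1)
            if energy[t] > energy[t - 1] and energy[t] >= energy[t + 1]]
    return [t + 1 for t in keys]   # report the frame the motion leads into

# Ten 8x8 "frames" whose per-step motion magnitudes peak twice.
step_sizes = [0, 1, 4, 1, 0, 2, 6, 2, 0]
steps = np.array([np.full((8, 8), v, dtype=float) for v in step_sizes])
frames = np.cumsum(np.vstack([np.zeros((1, 8, 8)), steps]), axis=0)
key_frames = key_pose_frames(frames)   # indices of the two motion peaks
```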

  9. Real-Time Biologically Inspired Action Recognition from Key Poses Using a Neuromorphic Architecture

    PubMed Central

    Layher, Georg; Brosch, Tobias; Neumann, Heiko

    2017-01-01

    Intelligent agents, such as robots, have to serve a multitude of autonomous functions. Examples are collision avoidance, navigation and route planning, active sensing of the environment, and interaction and non-verbal communication with people in the extended reach space. Here, we focus on the recognition of the actions of a human agent based on a biologically inspired visual architecture for analyzing articulated movements. The proposed processing architecture builds upon coarsely segregated streams of sensory processing along different pathways which separately process form and motion information (Layher et al., 2014). Action recognition is performed in an event-based scheme by identifying representations of characteristic pose configurations (key poses) in an image sequence. In line with perceptual studies, key poses are selected in an unsupervised fashion using a feature-driven criterion which combines extrema in the motion energy with the horizontal and vertical extendedness of a body shape. Per-class representations of key pose frames are learned using a deep convolutional neural network consisting of 15 convolutional layers. The network is trained using the energy-efficient deep neuromorphic networks (Eedn) framework (Esser et al., 2016), which realizes the mapping of the trained synaptic weights onto the IBM Neurosynaptic System platform (Merolla et al., 2014). After the mapping, the trained network achieves real-time capability, processing input streams and classifying input images at about 1,000 frames per second while the computational stages consume only about 70 mW of energy (without spike transduction). Particularly for mobile robotic systems, such a low energy profile might be crucial in a variety of application scenarios. Cross-validation results are reported for two different datasets and compared to state-of-the-art action recognition approaches. The results demonstrate that (I) the presented approach is on par with other key-pose-based methods described in the literature, which select key pose frames by optimizing classification accuracy, (II) compared to training on the full set of frames, representations trained on key pose frames result in a higher confidence in class assignments, and (III) key pose representations show promising generalization capabilities in a cross-dataset evaluation. PMID:28381998

  10. How Is This Flower Pollinated? A Polyclave Key to Use in Teaching.

    ERIC Educational Resources Information Center

    Tyrrell, Lucy

    1989-01-01

    Presents an identification method which uses the process of elimination to identify pollination systems. Provides the polyclave key, methodology for using the key, a sample worksheet, and abbreviation codes for pollination systems. (MVL)

  11. Analysis of complex decisionmaking processes. [with application to jet engine development

    NASA Technical Reports Server (NTRS)

    Hill, J. D.; Ollila, R. G.

    1978-01-01

    The analysis of corporate decisionmaking processes related to major system developments is unusually difficult because of the number of decisionmakers involved in the process and the long development cycle. A method for analyzing such decision processes is developed and illustrated through its application to the analysis of the commercial jet engine development process. The method uses interaction matrices as the key tool for structuring the problem, recording data, and analyzing the data to establish the rank order of the major factors affecting development decisions. In the example, the use of interaction matrices permitted analysts to collect and analyze approximately 50 factors that influenced decisions during the four phases of the development cycle, and to determine the key influencers of decisions at each development phase. The results of this study indicate that the cost of new technology installed on an aircraft is the prime concern of the engine manufacturer.

  12. Body measurements of Chinese males in dynamic postures and application.

    PubMed

    Wang, Y J; Mok, P Y; Li, Y; Kwok, Y L

    2011-11-01

    It is generally accepted that there is a relationship between body dimensions, body movement and clothing wearing ease design, yet previous research in this area has been neither sufficient nor systematic. This paper proposes a method to measure the human body in the static state and in 17 dynamic postures, so as to understand the dimensional changes of different body parts during dynamic movements. Experimental work is carried out to collect 30 measurements of 10 male Chinese subjects in both static and dynamic states. Factor analysis is used to analyse the static body measurement data and to identify key measurements that describe the characteristics of different body figures. Moreover, one-way ANOVA is used to analyse how dynamic postures affect these key body measurements. Finally, an application of the research results is suggested: a dynamic block patternmaking method for high-performance clothing design. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  13. A watermarking algorithm for polysomnography data.

    PubMed

    Jamasebi, R; Johnson, N L; Kaffashi, F; Redline, S; Loparo, K A

    2008-01-01

    A blind watermarking algorithm for polysomnography (PSG) data in European Data Format (EDF) has been developed for the identification and attribution of shared data. This is accomplished by hiding a unique identifier in the phase spectrum of each PSG epoch using an undisclosed key so that a third party cannot retrieve the watermark without knowledge of the key. A pattern discovery algorithm is developed to find the watermark pattern even though the data may have been altered. The method is evaluated using 25 PSG studies from the Sleep Heart Health Study database. The integrity of the signal data was determined using time series measures of both the original and watermarked signals, and by determining its effect on scoring sleep stages from the PSG data. The results of the analysis indicate that the proposed watermarking method for PSG data is an effective and efficient way to identify shared data without compromising its intended use.
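    The abstract does not disclose the embedding details (by design). One generic way to hide an identifier in the phase spectrum of an epoch under a secret key is sketched below; the bin-selection rule, the ±π/2 phase encoding and every name here are our own assumptions, not the published scheme:

```python
import numpy as np

def embed(epoch, bits, key):
    """Hide bits in the phases of key-selected FFT bins, preserving magnitudes."""
    rng = np.random.default_rng(key)          # the secret key seeds the bin choice
    X = np.fft.rfft(epoch)
    bins = rng.choice(np.arange(2, len(X) - 1), size=len(bits), replace=False)
    for b, k in zip(bits, bins):
        X[k] = np.abs(X[k]) * np.exp(1j * (np.pi / 2 if b else -np.pi / 2))
    return np.fft.irfft(X, n=len(epoch))

def extract(epoch, n_bits, key):
    """Without the key, the watermarked bins cannot be located."""
    rng = np.random.default_rng(key)
    X = np.fft.rfft(epoch)
    bins = rng.choice(np.arange(2, len(X) - 1), size=n_bits, replace=False)
    return [1 if np.angle(X[k]) > 0 else 0 for k in bins]

epoch = np.random.default_rng(0).standard_normal(3000)   # one epoch of samples
watermark = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed(epoch, watermark, key=42)
recovered = extract(marked, len(watermark), key=42)
```

    Preserving the magnitude spectrum is what keeps the time-domain signal close to the original; robustness to alteration, as in the paper, would need the separate pattern discovery step.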

  14. Waveguide-type optical circuits for recognition of optical 8QAM-coded label

    NASA Astrophysics Data System (ADS)

    Surenkhorol, Tumendemberel; Kishikawa, Hiroki; Goto, Nobuo; Gonchigsumlaa, Khishigjargal

    2017-10-01

    Optical signal processing is expected to be applied in network nodes. In photonic routers, label recognition is one of the important functions. We have so far studied label recognition methods for on-off keying, binary phase-shift keying, quadrature phase-shift keying, and 16 quadrature amplitude modulation-coded labels. Here we propose a method based on waveguide circuits to recognize an optical eight quadrature amplitude modulation (8QAM)-coded label by simple passive optical signal processing. The recognition performance of the proposed method is theoretically analyzed and numerically simulated by the finite difference beam propagation method. The noise tolerance is discussed, and the bit-error rate against optical signal-to-noise ratio is evaluated. The scalability of the proposed method is also discussed theoretically for two-symbol-length 8QAM-coded labels.

  15. Methods for synchronizing a countdown routine of a timer key and electronic device

    DOEpatents

    Condit, Reston A.; Daniels, Michael A.; Clemens, Gregory P.; Tomberlin, Eric S.; Johnson, Joel A.

    2015-06-02

    A timer key relating to monitoring a countdown time of a countdown routine of an electronic device is disclosed. The timer key comprises a processor configured to respond to a countdown time associated with operation of the electronic device, a display operably coupled with the processor, and a housing configured to house at least the processor. The housing has an associated structure configured to engage with the electronic device to share the countdown time between the electronic device and the timer key. The processor is configured to begin a countdown routine based at least in part on the countdown time, wherein the countdown routine is at least substantially synchronized with a countdown routine of the electronic device when the timer key is removed from the electronic device. A system and method for synchronizing countdown routines of a timer key and an electronic device are also disclosed.

  16. Apparatus, system, and method for synchronizing a timer key

    DOEpatents

    Condit, Reston A; Daniels, Michael A; Clemens, Gregory P; Tomberlin, Eric S; Johnson, Joel A

    2014-04-22

    A timer key relating to monitoring a countdown time of a countdown routine of an electronic device is disclosed. The timer key comprises a processor configured to respond to a countdown time associated with operation of the electronic device, a display operably coupled with the processor, and a housing configured to house at least the processor. The housing has an associated structure configured to engage with the electronic device to share the countdown time between the electronic device and the timer key. The processor is configured to begin a countdown routine based at least in part on the countdown time, wherein the countdown routine is at least substantially synchronized with a countdown routine of the electronic device when the timer key is removed from the electronic device. A system and method for synchronizing countdown routines of a timer key and an electronic device are also disclosed.

  17. A State-of-the-Art Review: Personalization of Tinnitus Sound Therapy.

    PubMed

    Searchfield, Grant D; Durai, Mithila; Linford, Tania

    2017-01-01

    Background: There are several established, and an increasing number of putative, therapies using sound to treat tinnitus. There appear to be few guidelines for sound therapy selection and application. Aim: To review current approaches to personalizing sound therapy for tinnitus. Methods: A "state-of-the-art" review (Grant and Booth, 2009) was undertaken to answer the question: how do current sound-based therapies for tinnitus adjust for tinnitus heterogeneity? Scopus, Google Scholar, Embase and PubMed were searched for the 10-year period 2006-2016. The search strategy used the following key words: "tinnitus" AND "sound" AND "therapy" AND "guidelines" OR "personalized" OR "customized" OR "individual" OR "questionnaire" OR "selection." The results of the review were cataloged and organized into themes. Results: In total 165 articles were reviewed in full, 83 contained sufficient details to contribute to answering the study question. The key themes identified were hearing compensation, pitched-match therapy, maskability, reaction to sound and psychosocial factors. Although many therapies mentioned customization, few could be classified as being personalized. Several psychoacoustic and questionnaire-based methods for assisting treatment selection were identified. Conclusions: Assessment methods are available to assist clinicians to personalize sound-therapy and empower patients to be active in therapy decision-making. Most current therapies are modified using only one characteristic of the individual and/or their tinnitus.

  18. Actinobacillus succinogenes ATCC 55618 Fermentation Medium Optimization for the Production of Succinic Acid by Response Surface Methodology

    PubMed Central

    Zhu, Li-Wen; Wang, Cheng-Cheng; Liu, Rui-Sang; Li, Hong-Mei; Wan, Duan-Ji; Tang, Ya-Jie

    2012-01-01

    As a potential intermediary feedstock, succinic acid occupies an important place in bulk chemical production. For the first time, a method combining Plackett-Burman design (PBD), the steepest ascent method (SA), and Box-Behnken design (BBD) was developed to optimize the Actinobacillus succinogenes ATCC 55618 fermentation medium. First, glucose, yeast extract, and MgCO3 were identified as the key medium components by PBD. Second, a preliminary optimization was run by the SA method to reach the region of the optimal key medium component concentrations. Finally, the response, that is, the production of succinic acid, was optimized by BBD, and the optimal concentrations were determined to be 84.6 g L−1 of glucose, 14.5 g L−1 of yeast extract, and 64.7 g L−1 of MgCO3. A verification experiment indicated that a maximal succinic acid production of 52.7 ± 0.8 g L−1 was obtained under the identified optimal conditions, in good agreement with the predicted value. Compared with the basic medium, the production of succinic acid and the yield of succinic acid against glucose were enhanced by 67.3% and 111.1%, respectively. The results obtained in this study may be useful for the industrial production of succinic acid. PMID:23093852
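    The BBD stage fits a second-order polynomial to the measured responses and reads the optimum off the fitted surface. That step can be outlined as follows (two factors and synthetic, noise-free data with a known optimum at (1.5, -0.5); the actual study used three factors and experimental responses):

```python
import numpy as np

def fit_quadratic(X, y):
    """Least-squares fit of a full second-order (response-surface) model."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def stationary_point(beta):
    """Solve grad = 0 for the fitted surface (candidate optimum)."""
    _, b1, b2, b11, b22, b12 = beta
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    return np.linalg.solve(H, -np.array([b1, b2]))

# Synthetic "fermentation" responses with a known optimum at (1.5, -0.5).
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(30, 2))
y = 10 - (X[:, 0] - 1.5)**2 - 2 * (X[:, 1] + 0.5)**2
opt = stationary_point(fit_quadratic(X, y))
```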

  19. High-efficiency reconciliation for continuous variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Bai, Zengliang; Yang, Shenshen; Li, Yongmin

    2017-04-01

    Quantum key distribution (QKD) is the most mature application of quantum information technology. Information reconciliation is a crucial step in QKD and significantly affects the final secret key rate shared between the two legitimate parties. We analyze and compare various construction methods for low-density parity-check (LDPC) codes and design high-performance irregular LDPC codes with a block length of 10⁶. Starting from these good codes and exploiting the slice reconciliation technique based on multilevel coding and multistage decoding, we realize high-efficiency Gaussian key reconciliation with efficiency higher than 95% for signal-to-noise ratios above 1. Our demonstrated method can be readily applied in continuous variable QKD.

  20. Image Mosaic Method Based on SIFT Features of Line Segment

    PubMed Central

    Zhu, Jun; Ren, Mingwu

    2014-01-01

    This paper proposes a novel image mosaic method based on the SIFT (Scale Invariant Feature Transform) features of line segments, aiming to handle scaling, rotation, changes in lighting conditions, and similar differences between the two images in the panoramic image mosaic process. The method first uses the Harris corner detection operator to detect key points. Second, it constructs directed line segments, describes them with SIFT features, and matches those directed segments to acquire a rough point matching. Finally, the RANSAC method is used to eliminate wrong pairs in order to accomplish the image mosaic. Experimental results on four pairs of images show that our method is strongly robust to resolution, lighting, rotation, and scaling. PMID:24511326
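    The final step, eliminating wrong pairs with RANSAC, can be sketched for the simplest motion model, a pure translation between the two images (the matched key points are assumed to come from the earlier Harris/SIFT stages; the data and tolerance here are synthetic):

```python
import numpy as np

def ransac_translation(src, dst, iters=200, tol=1.0, seed=0):
    """RANSAC: keep the translation hypothesis with the most inlier matches."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i = rng.integers(len(src))               # one match fixes a translation
        t = dst[i] - src[i]
        inliers = np.linalg.norm(src + t - dst, axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on the inlier consensus set only.
    t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return t, best_inliers

rng = np.random.default_rng(1)
src = rng.uniform(0, 100, size=(40, 2))
dst = src + np.array([12.0, -7.0])               # true shift between images
dst[:8] = rng.uniform(0, 100, size=(8, 2))       # eight wrong matches
t, inliers = ransac_translation(src, dst)
```

    A full mosaic would estimate a homography from the surviving inliers rather than a translation; the consensus logic is the same.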

  1. Objective comparison of particle tracking methods

    PubMed Central

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F.; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R.; Godinez, William J.; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E. G.; Jaldén, Joakim; Blau, Helen M.; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L.; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P.; Dan, Han-Wei; Tsai, Yuh-Show; de Solórzano, Carlos Ortiz; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-01-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Since manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized, for the first time, an open competition, in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to important practical conclusions for users and developers. PMID:24441936

  2. Bridge Condition Assessment Using D Numbers

    PubMed Central

    Hu, Yong

    2014-01-01

    Bridge condition assessment is a complex problem influenced by many factors, and an uncertain environment increases its complexity further. Due to the uncertainty in the assessment process, one of the key problems is the representation of assessment results. Although many methods exist that can deal with uncertain information, they all have deficiencies of one kind or another. In this paper, a new representation of uncertain information, called D numbers, is presented; it extends the Dempster-Shafer theory. Using D numbers, a new method is developed for bridge condition assessment. Compared to existing methods, the proposed method is simpler and more effective. An illustrative case is given to show the effectiveness of the new method. PMID:24696639

  3. High performance frame synchronization for continuous variable quantum key distribution systems.

    PubMed

    Lin, Dakai; Huang, Peng; Huang, Duan; Wang, Chao; Peng, Jinye; Zeng, Guihua

    2015-08-24

    In a practical continuous variable quantum key distribution (CVQKD) system, synchronization is of significant importance, as it is hardly possible to extract secret keys from unsynchronized strings. In this paper, we propose a high-performance frame synchronization method for CVQKD systems which is capable of operating under low signal-to-noise ratios (SNRs) and is compatible with the random phase shift induced by the quantum channel. A practical implementation of this method with low complexity is presented and its performance is analysed. By adjusting the length of the synchronization frame, the method can work well over a large range of SNR values, which paves the way for longer-distance CVQKD.
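    The paper's algorithm is not detailed in the abstract; a common building block for frame synchronization at low SNR is a sliding correlation against a publicly known synchronization sequence, sketched here with illustrative parameters (the sequence design and decision logic of the actual method are not reproduced):

```python
import numpy as np

def find_frame_offset(received, sync_seq):
    """Locate a known sync sequence in a noisy stream by sliding correlation."""
    n = len(sync_seq)
    scores = np.array([np.dot(received[i:i + n], sync_seq)
                       for i in range(len(received) - n + 1)])
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
sync = rng.choice([-1.0, 1.0], size=128)     # public ±1 synchronization frame
payload = rng.standard_normal(500)           # stand-in for the data that follows
received = np.concatenate([rng.standard_normal(137), sync, payload])
received += 0.5 * rng.standard_normal(received.size)   # channel noise
offset = find_frame_offset(received, sync)   # recovers the 137-sample delay
```

    Lengthening the sync sequence raises the correlation peak relative to the noise floor, which mirrors the paper's observation that adjusting the frame length trades overhead for robustness at low SNR.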

  4. Optimisation of DNA extraction from the crustacean Daphnia

    PubMed Central

    Athanasio, Camila Gonçalves; Chipman, James K.; Viant, Mark R.

    2016-01-01

    Daphnia are key model organisms for mechanistic studies of phenotypic plasticity, adaptation and microevolution, which have led to an increasing demand for genomics resources. A key step in any genomics analysis, such as high-throughput sequencing, is the availability of sufficient high-quality DNA. Although commercial kits exist to extract genomic DNA from several species, preparation of high-quality DNA from Daphnia spp. and other chitinous species can be challenging. Here, we optimise methods for tissue homogenisation, DNA extraction and quantification, customised for different downstream analyses (e.g., LC-MS/MS, HiSeq, mate pair sequencing or Nanopore). We demonstrate that if Daphnia magna are homogenised as whole animals (including the carapace), absorbance-based DNA quantification methods significantly over-estimate the amount of DNA, resulting in the use of insufficient starting material for experiments such as the preparation of sequencing libraries. This is attributed to the high refractive index of chitin in Daphnia’s carapace at 260 nm. Therefore, unless the carapace is removed by overnight proteinase digestion, the extracted DNA should be quantified with fluorescence-based methods. However, overnight proteinase digestion results in partial fragmentation of DNA, so the prepared DNA is not suitable for downstream methods that require high molecular weight DNA, such as PacBio, mate pair sequencing and Nanopore. In conclusion, we found that the MasterPure DNA purification kit, coupled with grinding of frozen tissue, is the best method for extracting high molecular weight DNA, as long as the extracted DNA is quantified with fluorescence-based methods. This method generated a high yield of high molecular weight DNA (3.10 ± 0.63 ng/µg dry mass, fragments >60 kb), free of organic contaminants (phenol, chloroform), and is suitable for a large number of downstream analyses. PMID:27190714

  5. a Gross Error Elimination Method for Point Cloud Data Based on Kd-Tree

    NASA Astrophysics Data System (ADS)

    Kang, Q.; Huang, G.; Yang, S.

    2018-04-01

    Point cloud data has become one of the most widely used data sources in the field of remote sensing. The key steps of point cloud pre-processing focus on gross error elimination and quality control. Owing to the volume of point cloud data, existing gross error elimination methods consume massive amounts of memory and time. This paper employs a new method that builds a kd-tree over the points, searches it with a k-nearest-neighbor algorithm, and applies an appropriate threshold to judge whether each target point is an outlier. Experimental results show that the proposed algorithm removes gross errors from point cloud data while decreasing memory consumption and improving efficiency.
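    The pipeline described (build a kd-tree, run k-nearest-neighbour queries, threshold the result) might be sketched as follows, here using SciPy's `cKDTree` and a mean-plus-two-sigma threshold of our own choosing rather than the paper's settled threshold:

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_gross_errors(points, k=8, n_sigma=2.0):
    """Flag points whose mean k-NN distance is far above the global average."""
    tree = cKDTree(points)
    # query returns each point itself at distance 0, hence k + 1 neighbours.
    dist, _ = tree.query(points, k=k + 1)
    mean_knn = dist[:, 1:].mean(axis=1)
    thresh = mean_knn.mean() + n_sigma * mean_knn.std()
    return points[mean_knn <= thresh], mean_knn > thresh

rng = np.random.default_rng(0)
cloud = rng.uniform(0, 1, size=(500, 3))         # dense "surface" points
outliers = rng.uniform(5, 6, size=(5, 3))        # gross errors far away
pts = np.vstack([cloud, outliers])
cleaned, is_outlier = remove_gross_errors(pts)
```

    Because the kd-tree answers each k-NN query in roughly logarithmic time, this avoids the quadratic pairwise-distance matrix that brute-force filtering would allocate.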

  6. Optimizing prognosis-related key miRNA-target interactions responsible for cancer metastasis.

    PubMed

    Zhao, Hongying; Yuan, Huating; Hu, Jing; Xu, Chaohan; Liao, Gaoming; Yin, Wenkang; Xu, Liwen; Wang, Li; Zhang, Xinxin; Shi, Aiai; Li, Jing; Xiao, Yun

    2017-12-12

    Increasing evidence suggests that the abnormality of microRNAs (miRNAs) and their downstream targets is frequently implicated in the pathogenesis of human cancers; however, the clinical benefit of causal miRNA-target interactions has seldom been studied. Here, we propose a computational method to optimize prognosis-related key miRNA-target interactions by combining transcriptome and clinical data from thousands of TCGA tumors across 16 cancer types. We obtained a total of 1,956 prognosis-related key miRNA-target interactions between 112 miRNAs and 1,443 of their target genes. Interestingly, these key target genes are specifically involved in tumor progression-related functions, such as 'cell adhesion' and 'cell migration'. Furthermore, they are most significantly correlated with 'tissue invasion and metastasis', a hallmark of metastasis, in ten distinct types of cancer through the hallmark analysis. These results indicate that the prognosis-related key miRNA-target interactions are highly associated with cancer metastasis. Finally, we observed that the combination of these key miRNA-target interactions allowed patients with good prognosis to be distinguished from those with poor prognosis, both in most TCGA cancer types and in independent validation sets, highlighting their roles in cancer metastasis. We provide a user-friendly database named miRNATarget (freely available at http://biocc.hrbmu.edu.cn/miRNATar/), which provides an overview of the prognosis-related key miRNA-target interactions across 16 cancer types.

  7. Modeling a space-based quantum link that includes an adaptive optics system

    NASA Astrophysics Data System (ADS)

    Duchane, Alexander W.; Hodson, Douglas D.; Mailloux, Logan O.

    2017-10-01

    Quantum Key Distribution uses optical pulses to generate shared random bit strings between two locations. If a high percentage of the optical pulses consist of single photons, then the statistical nature of light and information theory can be used to generate secure shared random bit strings, which can then be converted to keys for encryption systems. When these keys are incorporated along with symmetric encryption techniques such as a one-time pad, this method of key generation and encryption is resistant to future advances in quantum computing, which will significantly degrade the effectiveness of current asymmetric key sharing techniques. This research first reviews the transition of Quantum Key Distribution free-space experiments from the laboratory environment to field experiments and, finally, to ongoing space experiments. Next, a propagation model for an optical pulse from low-earth orbit to ground is described, along with the effects of turbulence on the transmitted pulse. An Adaptive Optics system is modeled to correct for the aberrations caused by the atmosphere. The long-term point spread function of the complete low-earth-orbit-to-ground optical system is explored in the results section. Finally, the impact of this optical system and its point spread function on an overall quantum key distribution system is described, together with the future work necessary to demonstrate this impact.

  8. Standing on the shoulders of giants: improving medical image segmentation via bias correction.

    PubMed

    Wang, Hongzhi; Das, Sandhitsu; Pluta, John; Craige, Caryne; Altinay, Murat; Avants, Brian; Weiner, Michael; Mueller, Susanne; Yushkevich, Paul

    2010-01-01

    We propose a simple strategy to improve automatic medical image segmentation. The key idea is that without deep understanding of a segmentation method, we can still improve its performance by directly calibrating its results with respect to manual segmentation. We formulate the calibration process as a bias correction problem, which is addressed by machine learning using training data. We apply this methodology on three segmentation problems/methods and show significant improvements for all of them.

  9. Gas demand forecasting by a new artificial intelligent algorithm

    NASA Astrophysics Data System (ADS)

    Khatibi. B, Vahid; Khatibi, Elham

    2012-01-01

    Energy demand forecasting is a key issue for consumers and generators in all energy markets in the world. This paper presents a new forecasting algorithm for daily gas demand prediction. The algorithm combines a wavelet transform with forecasting models such as the multi-layer perceptron (MLP), linear regression or GARCH. The proposed method is applied to real data from the UK gas markets to evaluate its performance. The results show that the forecasting accuracy is improved significantly by using the proposed method.
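    In such hybrid schemes, the demand series is typically decomposed by a wavelet transform, each sub-series is forecast separately (by the MLP, regression or GARCH model), and the forecasts are recombined. The decomposition half can be illustrated with a hand-rolled one-level Haar transform (the forecasting models themselves are omitted; all names and data are ours):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar wavelet transform: approximation and detail."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: interleave the reconstructed sample pairs."""
    out = np.empty(2 * len(approx))
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

x = np.sin(np.linspace(0.0, 4.0 * np.pi, 64))      # stand-in demand series
approx, detail = haar_dwt(x)
smooth = haar_idwt(approx, np.zeros_like(detail))  # crude low-pass component
```

    Forecasting `approx` and `detail` separately and applying `haar_idwt` to the forecasts is the recombination step; production systems would use a deeper decomposition and a smoother wavelet.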

  10. Unified Least Squares Methods for the Evaluation of Diagnostic Tests With the Gold Standard

    PubMed Central

    Tang, Liansheng Larry; Yuan, Ao; Collins, John; Che, Xuan; Chan, Leighton

    2017-01-01

    The article proposes a unified least squares method to estimate the receiver operating characteristic (ROC) parameters for continuous and ordinal diagnostic tests, such as cancer biomarkers. The method is based on a linear model framework using the empirically estimated sensitivities and specificities as input “data.” It gives consistent estimates for regression and accuracy parameters when the underlying continuous test results are normally distributed after some monotonic transformation. The key difference between the proposed method and the method of Tang and Zhou lies in the response variable. The response variable in the latter is the transformed empirical ROC curve at different thresholds. It takes on many values for continuous test results, but few values for ordinal test results. The limited number of values for the response variable makes it impractical for ordinal data. The response variable in the proposed method, however, takes on many more distinct values, so the method yields valid estimates for ordinal data. Extensive simulation studies are conducted to investigate and compare the finite sample performance of the proposed method with an existing method, and the method is then used to analyze two real cancer diagnostic examples as an illustration. PMID:28469385
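The least squares idea behind such ROC estimators can be sketched under the binormal model: probit-transformed empirical (FPR, TPR) points fall on a straight line, whose intercept and slope are the ROC parameters. The sketch below uses simulated data with hypothetical parameters and an ordinary least squares fit; it is an illustration of the general approach, not the authors' estimator (which also handles covariates and ordinal tests):

```python
import random
from statistics import NormalDist

nd = NormalDist()
random.seed(0)

# Illustrative binormal test scores (hypothetical, not the article's data):
# healthy ~ N(0, 1), diseased ~ N(1.2, 1)  =>  probit(TPR) = 1.2 + probit(FPR).
healthy = [random.gauss(0.0, 1.0) for _ in range(2000)]
diseased = [random.gauss(1.2, 1.0) for _ in range(2000)]

# Empirically estimated sensitivities/specificities serve as input "data".
xs, ys = [], []
for t in [i / 10 for i in range(-15, 26)]:
    fpr = sum(h > t for h in healthy) / len(healthy)
    tpr = sum(d > t for d in diseased) / len(diseased)
    if 0.0 < fpr < 1.0 and 0.0 < tpr < 1.0:
        xs.append(nd.inv_cdf(fpr))   # probit of 1 - specificity
        ys.append(nd.inv_cdf(tpr))   # probit of sensitivity

# Ordinary least squares fit of probit(TPR) = a + b * probit(FPR).
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx
# a should be near the true mean separation 1.2 and b near 1.0.
assert abs(a - 1.2) < 0.3 and abs(b - 1.0) < 0.3
```

Because each threshold contributes one regression point, ordinal tests with few thresholds give few points, which is exactly the limitation the proposed method addresses by changing the response variable.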

  11. Perceptions and Use of Technology to Support Older Adults with Multimorbidity.

    PubMed

    Murphy, Emma; Doyle, Julie; Hannigan, Caoimhe; Smith, Suzanne; Kuiper, Janneke; Jacobs, An; Hoogerwerf, Evert-Jan; Desideri, Lorenzo; Fiordelmondo, Valentina; Maluccelli, Lorenza; Brady, Anne-Marie; Dinsmore, John

    2017-01-01

    Digital technologies hold great potential to improve and advance home-based integrated care for older people living with multiple chronic health conditions. In this paper, we present the results of a user requirements study for a planned digital integrated care system, based on the experiences and needs of key stakeholders. We present rich, multi-stakeholder, qualitative data on the perceptions and use of technology among older people with multiple chronic health conditions and their key support actors. We also outline our future work on the design of the system, which will involve continuous stakeholder engagement through a user-centred co-design method.

  12. Combining Cryptography with EEG Biometrics

    PubMed Central

    Kazanavičius, Egidijus; Woźniak, Marcin

    2018-01-01

    Cryptographic frameworks depend on key sharing for ensuring security of data. While the keys in cryptographic frameworks must be correctly reproducible and not unequivocally connected to the identity of a user, in biometric frameworks this is different. Joining cryptography techniques with biometrics can solve these issues. We present a biometric authentication method based on the discrete logarithm problem and Bose-Chaudhuri-Hocquenghem (BCH) codes, perform its security analysis, and demonstrate its security characteristics. We evaluate a biometric cryptosystem using our own dataset of electroencephalography (EEG) data collected from 42 subjects. The experimental results show that the described biometric user authentication system is effective, achieving an Equal Error Rate (EER) of 0.024.

  13. Combining Cryptography with EEG Biometrics.

    PubMed

    Damaševičius, Robertas; Maskeliūnas, Rytis; Kazanavičius, Egidijus; Woźniak, Marcin

    2018-01-01

    Cryptographic frameworks depend on key sharing for ensuring security of data. While the keys in cryptographic frameworks must be correctly reproducible and not unequivocally connected to the identity of a user, in biometric frameworks this is different. Joining cryptography techniques with biometrics can solve these issues. We present a biometric authentication method based on the discrete logarithm problem and Bose-Chaudhuri-Hocquenghem (BCH) codes, perform its security analysis, and demonstrate its security characteristics. We evaluate a biometric cryptosystem using our own dataset of electroencephalography (EEG) data collected from 42 subjects. The experimental results show that the described biometric user authentication system is effective, achieving an Equal Error Rate (EER) of 0.024.
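The Equal Error Rate reported above is the operating point where the false accept rate (FAR) equals the false reject rate (FRR). A minimal sketch of how an EER is read off from matcher scores, using hypothetical Gaussian score distributions rather than the paper's EEG data:

```python
import random

random.seed(1)

# Hypothetical matcher scores (higher = more likely a genuine user);
# these distributions are illustrative, not the EEG system's data.
genuine = [random.gauss(0.7, 0.1) for _ in range(1000)]
impostor = [random.gauss(0.4, 0.1) for _ in range(1000)]

def far_frr(threshold):
    far = sum(s >= threshold for s in impostor) / len(impostor)  # false accepts
    frr = sum(s < threshold for s in genuine) / len(genuine)     # false rejects
    return far, frr

# The EER is where the FAR and FRR curves cross; scan thresholds for it.
thresholds = [i / 100 for i in range(101)]
t_star = min(thresholds, key=lambda t: abs(far_frr(t)[0] - far_frr(t)[1]))
far, frr = far_frr(t_star)
eer = (far + frr) / 2
assert 0.0 < eer < 0.15
```

Lower EER means better separation between genuine and impostor score distributions; an EER of 0.024 as in the abstract would correspond to far less overlap than in this toy example.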

  14. A Novel Image Encryption Algorithm Based on DNA Subsequence Operation

    PubMed Central

    Zhang, Qiang; Xue, Xianglian; Wei, Xiaopeng

    2012-01-01

    We present a novel image encryption algorithm based on DNA subsequence operations. Unlike traditional DNA encryption methods, our algorithm does not use complex biological operations; it simply uses the idea of DNA subsequence operations (such as elongation, truncation, and deletion), combined with the logistic chaotic map, to scramble the locations and values of the image's pixels. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large secret key space and strong sensitivity to the secret key, and can resist exhaustive and statistical attacks. PMID:23093912
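The logistic-chaotic-map scrambling step (not the DNA subsequence operations themselves) can be sketched as follows. The key values below are hypothetical; in such schemes the secret key is the map's initial condition and parameter, and a tiny change in either yields a completely different permutation:

```python
def logistic_sequence(x0, r, n):
    # Iterate the logistic map x -> r*x*(1-x); for r near 4 the orbit is
    # chaotic, so the sort order of the orbit gives a key-dependent permutation.
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

x0, r = 0.3456789, 3.99          # hypothetical secret key
n = 16                           # e.g. the pixel positions of a tiny 4x4 image
seq = logistic_sequence(x0, r, n)
perm = sorted(range(n), key=lambda i: seq[i])   # scramble order

pixels = list(range(n))                  # stand-in pixel values
scrambled = [pixels[p] for p in perm]    # location-scrambling step

# Descrambling inverts the permutation using the same key.
restored = [0] * n
for dst, src in enumerate(perm):
    restored[src] = scrambled[dst]
assert restored == pixels
```

Real schemes additionally alter pixel values (diffusion), so that changing one plaintext pixel changes many ciphertext pixels; this sketch shows only the permutation half.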

  15. Luminescence Dating Work From The Heidelberg Group: A Key Technology In Geoarchaeology

    NASA Astrophysics Data System (ADS)

    Wagner, G. A.; Kadereit, A.

    Geoarchaeology is a growing discipline in archaeological science. It addresses the natural environment as the context of past human societies and the interaction between the two, with environment and humans forming a joint ecosystem. This topic is also of considerable concern to present societies. Like other historical sciences, geoarchaeology requires accurate chronologies. Since geoarchaeology deals predominantly with sediments and rocks, luminescence methods play a key role. This is demonstrated in two case studies, from Phlious in southern Greece and Nasca in southern Peru. The results clearly show climatically triggered social developments and feedbacks to the environment.

  16. Magnetic Field Response Measurement Acquisition System

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Taylor, Bryant D.; Shams, Qamar A.; Fox, Robert L.

    2007-01-01

    This paper presents a measurement acquisition method that alleviates many shortcomings of traditional measurement systems: a finite number of measurement channels, the weight penalty associated with measurements, electrical arcing, wire degradation due to wear or chemical decay, and the logistics needed to add new sensors. Wire degradation has resulted in aircraft fatalities and critical space launches being delayed. The key to this method is the use of sensors designed as passive inductor-capacitor circuits that produce magnetic field responses. The response attributes correspond to the states of the physical properties that the sensors measure. Power is provided wirelessly to the sensing element by Faraday induction. A radio frequency antenna produces a time-varying magnetic field used to power the sensor and receive its magnetic field response. An interrogation system for discerning changes in the sensor response frequency, resistance and amplitude has been developed and is presented herein. Multiple sensors can be interrogated with this method, which eliminates the need for a data acquisition channel dedicated to each sensor and does not require the sensors to be near the acquisition hardware. Methods of developing magnetic field response sensors and the influence of key parameters on measurement acquisition are discussed. Examples of magnetic field response sensors and the respective measurement characterizations are presented. Implementation of this method on an aerospace system is discussed.
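The sensing principle of such a passive inductor-capacitor circuit rests on the standard resonance relation f = 1/(2π√(LC)): when the measured physical state changes the capacitance (or inductance), the resonant frequency the interrogator reads shifts. A minimal sketch with illustrative component values (not taken from the paper):

```python
import math

def resonant_frequency(inductance, capacitance):
    # f = 1 / (2*pi*sqrt(L*C)) for a passive inductor-capacitor circuit.
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance * capacitance))

# Illustrative values: a 10 uH coil with a 100 pF sensing capacitor.
L = 10e-6
f0 = resonant_frequency(L, 100e-12)   # baseline response, about 5.03 MHz
f1 = resonant_frequency(L, 110e-12)   # capacitance grew 10% with the measurand
assert f1 < f0                        # the interrogator reads this shift
```

Because the circuit is powered by Faraday induction from the antenna's field, no wiring or on-board power is needed; the measurand is recovered entirely from the response frequency shift.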

  17. Scale invariant texture descriptors for classifying celiac disease

    PubMed Central

    Hegenbart, Sebastian; Uhl, Andreas; Vécsei, Andreas; Wimmer, Georg

    2013-01-01

    Scale invariant texture recognition methods are applied to the computer assisted diagnosis of celiac disease. In particular, emphasis is given to techniques enhancing the scale invariance of multi-scale and multi-orientation wavelet transforms and to methods based on fractal analysis. After fine-tuning to specific properties of our celiac disease imagery database, which consists of endoscopic images of the duodenum, some scale invariant (and often even viewpoint invariant) methods provide classification results improving the current state of the art. However, not every investigated scale invariant method can be applied successfully to our dataset. Therefore, the scale invariance of the employed approaches is explicitly assessed, and many of the analyzed methods are found not to be as scale invariant as they theoretically should be. The results imply that scale invariance is not a key feature required for successful classification of our celiac disease dataset. PMID:23481171

  18. Redistributive effects of the National Health Insurance on physicians in Taiwan: a natural experiment time series study

    PubMed Central

    2013-01-01

    Background Previous studies have evaluated the effects of various health manpower policies but did not fully consider the effect of universal health insurance on physician redistribution. This study examines the effects of implementing National Health Insurance (NHI) on the problem of geographic mal-distribution of health providers in Taiwan. Methods Data on health providers and population between 1971 and 2001 are obtained from relevant governmental publications in Taiwan. Gini coefficients derived from the Lorenz curve are used under a spline regression model to examine the impact of the NHI on the geographic distribution of health providers. Results The equality of the geographic distribution of the three key health providers improved significantly after the implementation of the NHI program. After accounting for the influences of other confounding factors, the Gini coefficients of the three key providers show a net reduction of 1.248% for dentists, 0.365% for western medicine physicians, and 0.311% for Chinese medicine physicians. Overall, the absolute values of the three key providers’ Gini coefficients also became close to one another. Conclusions This study found that the NHI’s universal health coverage for all citizens, combined with proper financial incentives, has resulted in more equal geographic distributions among the key health care providers in Taiwan. PMID:23374629
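The Gini coefficient used in the study measures the area between the Lorenz curve and the line of perfect equality. A minimal sketch of the computation from a discrete Lorenz curve, using hypothetical physicians-per-region counts rather than the Taiwan data:

```python
def gini(values):
    # Gini coefficient via the trapezoidal area under the discrete Lorenz
    # curve: 0 means a perfectly equal distribution, values approaching 1
    # mean the providers are concentrated in very few regions.
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    area, cum, prev = 0.0, 0.0, 0.0
    for x in xs:
        cum += x
        frac = cum / total          # cumulative share of providers
        area += (prev + frac) / (2 * n)
        prev = frac
    return 1.0 - 2.0 * area

# Hypothetical counts for four regions (illustrative only):
equal = gini([10, 10, 10, 10])       # every region equally staffed
concentrated = gini([0, 0, 0, 40])   # all providers in one region
assert abs(equal) < 1e-12
assert abs(concentrated - 0.75) < 1e-12
```

A "net reduction" in the Gini coefficient, as reported above, therefore means the provider distribution moved closer to the equal-staffing diagonal.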

  19. Method and system for source authentication in group communications

    NASA Technical Reports Server (NTRS)

    Roy-Chowdhury, Ayan (Inventor); Baras, John S. (Inventor)

    2013-01-01

    A method and system for authentication is provided. A central node for issuing certificates to a plurality of nodes associated with the central node in a network is also provided. The central node receives a first key from at least one node from among the plurality of nodes and generates a second key based on the received first key and generates a certificate for the at least one node. The generated certificate is transmitted to the at least one node.

  20. Flammability Indices for Refrigerants

    NASA Astrophysics Data System (ADS)

    Kataoka, Osami

    This paper introduces a new index to classify flammable refrigerants. A question about the flammability indices employed by ASHRAE arose from combustion test results for R152a and ammonia. The conventional classification methods of ASHRAE, ISO and the Japanese High Pressure Gas Safety Law are evaluated to show why these methods conflict with the test results. The key finding of this paper is that the ratio of the stoichiometric concentration to the LFL concentration (R factor) represents the test results most precisely. In addition, it correlates well with other flammability parameters such as flame speed and the pressure rise coefficient. Classification according to this index gives a reasonable flammability ordering of substances, including ammonia, R152a and carbon monoxide. The theoretical background for why this index correlates so well is also discussed, along with the limitations of the method.
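The R factor above is a simple ratio, so it can be computed from textbook combustion data. The sketch below uses propane as a worked example with standard stoichiometry and a commonly cited LFL value; the fuel choice and numbers are illustrative, not taken from the paper:

```python
def stoichiometric_fraction(o2_moles_per_mole_fuel, o2_in_air=0.21):
    # Volume fraction of fuel in a stoichiometric fuel-air mixture:
    # 1 mole fuel plus enough air to supply the required O2.
    return 1.0 / (1.0 + o2_moles_per_mole_fuel / o2_in_air)

# Propane: C3H8 + 5 O2 -> 3 CO2 + 4 H2O  (standard stoichiometry).
c_st = stoichiometric_fraction(5.0) * 100   # about 4.03 vol%
lfl = 2.1                                    # commonly cited propane LFL, vol%
r_factor = c_st / lfl                        # about 1.9
assert 1.8 < r_factor < 2.0
```

A larger R factor means the lean flammability limit sits far below the stoichiometric concentration, i.e. the fuel ignites over a wider lean range, which is why the ratio tracks flammability severity.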

  1. Capturing how age-friendly communities foster positive health, social participation and health equity: a study protocol of key components and processes that promote population health in aging Canadians.

    PubMed

    Levasseur, Mélanie; Dubois, Marie-France; Généreux, Mélissa; Menec, Verena; Raina, Parminder; Roy, Mathieu; Gabaude, Catherine; Couturier, Yves; St-Pierre, Catherine

    2017-05-25

    To address the challenges of the global aging population, the World Health Organization promoted age-friendly communities as a way to foster the development of active aging community initiatives. Accordingly, key components (i.e., policies, services and structures related to the communities' physical and social environments) should be designed to be age-friendly and help all aging adults to live safely, enjoy good health and stay involved in their communities. Although age-friendly communities are believed to be a promising way to help aging Canadians lead healthy and active lives, little is known about which key components best foster positive health, social participation and health equity, and about their underlying mechanisms. This study aims to better understand which key components of age-friendly communities best foster positive health, social participation and health equity in aging Canadians, and how. Specifically, the research objectives are to: 1) describe and compare age-friendly key components of communities across Canada; 2) identify the key components best associated with positive health, social participation and health equity of aging adults; and 3) explore how these key components foster positive health, social participation and health equity. METHODS: A mixed-method sequential explanatory design will be used. The quantitative part will involve a survey of Canadian communities and secondary analysis of cross-sectional data from the Canadian Longitudinal Study on Aging (CLSA). The survey will include an age-friendly questionnaire targeting key components in seven domains: physical environment, housing options, social environment, opportunities for participation, community supports and healthcare services, transportation options, and communication and information. The CLSA is a large, national prospective study representative of the Canadian aging population designed to examine health transitions and trajectories of adults as they age.
In the qualitative part, a multiple case study will be conducted in five Canadian communities performing best on positive health, social participation and health equity. Building on new and existing collaborations and generating evidence from real-world interventions, the results of this project will help communities to promote age-friendly policies, services and structures which foster positive health, social participation and health equity at a population level.

  2. Computation of transmitted and received B1 fields in magnetic resonance imaging.

    PubMed

    Milles, Julien; Zhu, Yue Min; Chen, Nan-Kuei; Panych, Lawrence P; Gimenez, Gérard; Guttmann, Charles R G

    2006-05-01

    Computation of B1 fields is a key issue for the determination and correction of intensity nonuniformity in magnetic resonance images. This paper presents a new method for computing transmitted and received B1 fields. Our method combines a modified MRI acquisition protocol with an estimation technique based on the Levenberg-Marquardt algorithm and spatial filtering. It enables accurate estimation of transmitted and received B1 fields for both homogeneous and heterogeneous objects. The method is validated using numerical simulations and experimental data from phantom and human scans. The experimental results are in agreement with theoretical expectations.
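To make the B1 estimation problem concrete, the sketch below uses the standard double-angle relation often taught for B1 mapping: two scans with nominal flip angles α and 2α give signals S1 ∝ sin(α) and S2 ∝ sin(2α), so the actual flip angle (and hence the transmitted B1 scale) follows from α = arccos(S2 / 2S1). This is a textbook relation used as an illustration, not the authors' modified protocol or their Levenberg-Marquardt estimator:

```python
import math

def flip_angle_from_two_scans(s1, s2):
    # Double-angle B1 mapping: S1 ~ sin(a), S2 ~ sin(2a) = 2 sin(a) cos(a),
    # hence a = arccos(S2 / (2 * S1)) for flip angles in (0, pi).
    return math.acos(s2 / (2.0 * s1))

nominal = math.radians(60.0)
b1_scale = 0.9                     # transmitted B1 is 10% low at this voxel
actual = nominal * b1_scale        # 54 degrees actually achieved
s1, s2 = math.sin(actual), math.sin(2 * actual)   # noiseless toy signals
est = flip_angle_from_two_scans(s1, s2)
assert abs(est - actual) < 1e-9    # recovers the 54-degree actual flip angle
```

With noisy multi-scan data the same unknowns are instead fitted by nonlinear least squares, which is where a Levenberg-Marquardt solver, as used in the paper, comes in.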

  3. Improved wavelet de-noising method of rail vibration signal for wheel tread detection

    NASA Astrophysics Data System (ADS)

    Zhao, Quan-ke; Gao, Xiao-rong; Luo, Lin

    2011-12-01

    The irregularities of wheel treads can be detected by processing the acceleration vibration signal of the rail. Various kinds of noise from different sources, such as wheel-rail resonance, bad weather and human factors, are the key influences on detection accuracy. A method using wavelet threshold de-noising is investigated to reduce noise in the detection signal, and an improved signal processing algorithm based on it has been established. The results of simulations and field experiments show that the proposed method can effectively increase the signal-to-noise ratio (SNR) of the rail vibration signal and improve detection accuracy.
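The core of wavelet threshold de-noising is to transform the signal, shrink small (noise-dominated) detail coefficients toward zero, and invert the transform. A one-level Haar soft-threshold sketch, deliberately much simpler than the paper's improved multi-level algorithm, with synthetic data:

```python
import math
import random

random.seed(0)

def haar_denoise(signal, threshold):
    # One-level Haar wavelet transform, soft-threshold the detail
    # coefficients, then invert. Signal length must be even.
    s2 = math.sqrt(2.0)
    approx = [(a + b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s2 for a, b in zip(signal[::2], signal[1::2])]
    # Soft thresholding: shrink each coefficient toward zero by `threshold`.
    soft = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, soft):
        out += [(a + d) / s2, (a - d) / s2]
    return out

# Slow sine wave plus broadband noise stands in for the rail vibration signal.
clean = [math.sin(2 * math.pi * i / 64) for i in range(128)]
noisy = [c + random.gauss(0.0, 0.2) for c in clean]
denoised = haar_denoise(noisy, threshold=0.3)

err_noisy = sum((x - c) ** 2 for x, c in zip(noisy, clean))
err_den = sum((x - c) ** 2 for x, c in zip(denoised, clean))
assert err_den < err_noisy   # thresholding improves the SNR
```

Practical schemes use several decomposition levels and data-driven thresholds (e.g. noise-level estimates per level); the improvement claimed in the paper comes from refinements of exactly those choices.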

  4. Novel optical scanning cryptography using Fresnel telescope imaging.

    PubMed

    Yan, Aimin; Sun, Jianfeng; Hu, Zhijuan; Zhang, Jingtao; Liu, Liren

    2015-07-13

    We propose a new method, called modified optical scanning cryptography, that uses the Fresnel telescope imaging technique for the encryption and decryption of remote objects. An image or object can be optically encrypted on the fly by the Fresnel telescope scanning system together with an encryption key. For image decryption, the encrypted signals are received and processed with an optical coherent heterodyne detection system. The proposed method performs strongly through its use of secure Fresnel telescope scanning with orthogonally polarized beams and efficient all-optical information processing. The validity of the proposed method is demonstrated by numerical simulations and experimental results.

  5. Automation of On-Board Flightpath Management

    NASA Technical Reports Server (NTRS)

    Erzberger, H.

    1981-01-01

    The status of concepts and techniques for the design of onboard flight path management systems is reviewed. Such systems are designed to increase flight efficiency and safety by automating the optimization of flight procedures onboard aircraft. After a brief review of the origins and functions of such systems, two complementary methods are described for attacking the key design problem, namely, the synthesis of efficient trajectories. One method optimizes en route, the other optimizes terminal area flight; both methods are rooted in optimal control theory. Simulation and flight test results are reviewed to illustrate the potential of these systems for fuel and cost savings.

  6. A new distributed systems scheduling algorithm: a swarm intelligence approach

    NASA Astrophysics Data System (ADS)

    Haghi Kashani, Mostafa; Sarvizadeh, Raheleh; Jameii, Mahdi

    2011-12-01

    The scheduling problem in distributed systems is known to be NP-complete, and methods based on heuristic or metaheuristic search have been proposed to obtain optimal and suboptimal solutions. Task scheduling is a key factor in achieving better performance in distributed systems. In this paper, an efficient method based on a memetic algorithm is developed to solve the distributed systems scheduling problem. To balance load efficiently, Artificial Bee Colony (ABC) is applied as the local search in the proposed memetic algorithm. The proposed method is compared to an existing memetic-based approach in which the Learning Automata method is used as the local search. The results demonstrate that the proposed method outperforms the above-mentioned method in terms of communication cost.

  7. Issues in Moderation of National Curriculum Assessment at Key Stage 3.

    ERIC Educational Resources Information Center

    Cowling, Les

    1994-01-01

    Highlights the issues of moderation of teacher judgments for accountability and moderation for achieving consistent assessments. Discusses the meaning of moderation, the need for moderating teacher judgments, approaches to moderation, methods of moderation, and most appropriate methods of moderation at Key Stage 3. Presents an approach to…

  8. Incorrect Match Detection Method for Arctic Sea-Ice Reconstruction Using UAV Images

    NASA Astrophysics Data System (ADS)

    Kim, J.-I.; Kim, H.-C.

    2018-05-01

    Shapes and surface roughness, which are considered key indicators in understanding Arctic sea-ice, can be measured from a digital surface model (DSM) of the target area. Unmanned aerial vehicles (UAVs) flying at low altitudes in principle enable accurate DSM generation. However, the characteristics of sea-ice, namely its textureless surface and incessant motion, make image matching for DSM generation difficult. In this paper, we propose a method for effectively detecting incorrect matches before correcting a sea-ice DSM derived from UAV images. The proposed method variably adjusts the size of the search window to analyze the matching results of the generated DSM and identify incorrect matches. Experimental results showed that the sea-ice DSM contained large errors along textureless surfaces, and that the incorrect matches could be effectively detected by the proposed method.

  9. Application of the three-dimensional aperiodic Fourier modal method using arc elements in curvilinear coordinates.

    PubMed

    Bucci, Davide; Martin, Bruno; Morand, Alain

    2012-03-01

    This paper deals with a full vectorial generalization of the aperiodic Fourier modal method (AFMM) in cylindrical coordinates. The goal is to predict some key characteristics such as the bending losses of waveguides having an arbitrary distribution of the transverse refractive index. After a description of the method, we compare the results of the cylindrical coordinates AFMM with simulations by the finite-difference time-domain (FDTD) method performed on an S-bend structure made by a 500 nm × 200 nm silicon core (n=3.48) in silica (n=1.44) at a wavelength λ=1550 nm, the bending radius varying from 0.5 up to 2 μm. The FDTD and AFMM results show differences comparable to the variations obtained by changing the parameters of the FDTD simulations.

  10. Preliminary comparative assessment of PM10 hourly measurement results from new types of monitoring stations using stochastic and exploratory methodology and models

    NASA Astrophysics Data System (ADS)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents selected key issues from the preliminary stage of a proposed extended equivalence assessment for new portable devices: the comparability of hourly PM10 concentration series with reference station measurements, evaluated with statistical methods. The technical aspects of the new portable meters are presented. Emphasis is placed on assessing the comparability of the results using a stochastic and exploratory methodology. The concept is based on the observation that a simple comparison of result series in the time domain is insufficient; regularity should instead be compared in three complementary fields of statistical modeling: time, frequency and space. The proposal is based on modeling results for five annual measurement series from the new mobile devices and the WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The obtained results indicate both the completeness of the comparison methodology and the high correspondence between the new devices' measurements and the reference.

  11. The experimental verification on the shear bearing capacity of exposed steel column foot

    NASA Astrophysics Data System (ADS)

    Liu, Xijin

    2017-04-01

    Many studies, both in China and abroad, have addressed the shear bearing capacity of the exposed steel column foot. However, most are limited to theoretical analysis, and few include experimental analysis. Based on the prototype of an industrial plant in Beijing, this paper designs an experimental model composed of six steel structural members in two groups: three members without a shear key and three members with a shear key. The shear bearing capacity of the two groups is checked under different axial forces. The experiment shows that the anchor bolts of the exposed steel column foot carry a relatively large share of the shear bearing capacity, which should not be neglected. The results derived from the calculation methods proposed in this paper for the two situations match the experimental results for the shear bearing capacity of the steel column foot. Suggestions are also proposed for revising the Code for Design of Steel Structures with regard to setting the shear key in the steel column foot.

  12. Thermooptics of magnetoactive media: Faraday isolators for high average power lasers

    NASA Astrophysics Data System (ADS)

    Khazanov, E. A.

    2016-09-01

    The Faraday isolator, one of the key high-power laser elements, provides optical isolation between a master oscillator and a power amplifier or between a laser and its target, for example, a gravitational wave detector interferometer. However, the absorbed radiation inevitably heats the magnetoactive medium and leads to thermally induced polarization and phase distortions in the laser beam. This self-action process limits the use of Faraday isolators in high average power lasers. A unique property of magnetoactive medium thermooptics is that parasitic thermal effects arise on the background of circular birefringence rather than in an isotropic medium. Also, even insignificant polarization distortions of the radiation result in a worse isolation ratio, which is the key characteristic of the Faraday isolator. All possible laser beam distortions are analyzed for their deteriorating effect on the Faraday isolator parameters. The mechanisms responsible for and key physical parameters associated with different kinds of distortions are identified and discussed. Methods for compensating and suppressing parasitic thermal effects are described in detail, the published experimental data are systematized, and avenues for further research are discussed based on the results achieved.

  13. Identifying Regional Key Eco-Space to Maintain Ecological Security Using GIS

    PubMed Central

    Xie, Hualin; Yao, Guanrong; Wang, Peng

    2014-01-01

    Ecological security and environmental sustainability are the foundations of sustainable development. With the acceleration of urbanization, increasing human activities have had ever greater impacts on the eco-spaces that maintain ecological security. Identifying regional key eco-space has become a primary need for maintaining environmental sustainability, as such space can provide society with continued ecosystem services. In this paper, considering the security of water resources, biodiversity conservation, disaster avoidance and protection, and natural recreation, an integrated index of eco-space importance was established and a method for identifying key eco-space was created using GIS, with Lanzhou City, China as a case study. The results show that the area of core eco-space in Lanzhou City is approximately 50,908.7 hm2, accounting for 40% of the region’s total area. These areas mainly consist of geological hazard protection zones and the core zones of regional river systems, wetlands, nature reserves, forest parks and scenic spots. The results of this study provide guidance for the management of ecological security, ecological restoration and environmental sustainability. PMID:24590051

  14. Encoding plaintext by Fourier transform hologram in double random phase encoding using fingerprint keys

    NASA Astrophysics Data System (ADS)

    Takeda, Masafumi; Nakano, Kazuya; Suzuki, Hiroyuki; Yamaguchi, Masahiro

    2012-09-01

    It has been shown that biometric information can be used as a cipher key for binary data encryption by applying double random phase encoding. In such methods, binary data are encoded in a bit pattern image, and the decrypted image becomes a plain image when the key is genuine; otherwise, decrypted images become random images. In some cases, images decrypted by imposters may not be fully random, such that the blurred bit pattern can be partially observed. In this paper, we propose a novel bit coding method based on a Fourier transform hologram, which makes images decrypted by imposters more random. Computer experiments confirm that the method increases the randomness of images decrypted by imposters while keeping the false rejection rate as low as in the conventional method.
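The double random phase encoding core that this scheme builds on can be sketched in one dimension: a random phase mask is applied in the input plane, a second in the Fourier plane, and decryption reverses both with conjugate masks. The sketch below shows only that classical DRPE core (with a naive DFT so it is self-contained), not the authors' fingerprint-derived keys or the Fourier-hologram bit coding:

```python
import cmath
import random

def dft(x, inverse=False):
    # Naive discrete Fourier transform; fine for a toy 1-D "image".
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def random_phase(n, seed):
    # Unit-modulus random phase mask; the seed plays the role of the key.
    rng = random.Random(seed)
    return [cmath.exp(2j * cmath.pi * rng.random()) for _ in range(n)]

def drpe_encrypt(signal, key1, key2):
    masked = [s * p for s, p in zip(signal, key1)]          # input-plane mask
    spectrum = [v * p for v, p in zip(dft(masked), key2)]   # Fourier-plane mask
    return dft(spectrum, inverse=True)

def drpe_decrypt(cipher, key1, key2):
    spectrum = [v * p.conjugate() for v, p in zip(dft(cipher), key2)]
    masked = dft(spectrum, inverse=True)
    return [m * p.conjugate() for m, p in zip(masked, key1)]

plain = [0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0]    # toy bit-pattern "image"
k1, k2 = random_phase(8, seed=1), random_phase(8, seed=2)
cipher = drpe_encrypt(plain, k1, k2)
recovered = drpe_decrypt(cipher, k1, k2)
assert all(abs(r - p) < 1e-9 for r, p in zip(recovered, plain))
```

Decrypting with a wrong mask pair yields a noise-like field rather than the bit pattern; the paper's contribution is a bit coding that keeps impostor decryptions looking random even when the biometric key is only partly wrong.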

  15. Ultrasonic characterization of the fiber-matrix interfacial bond in aerospace composites.

    PubMed

    Aggelis, D G; Kleitsa, D; Matikas, T E

    2013-01-01

    The properties of advanced composites rely on the quality of the fiber-matrix bonding. Service-induced damage results in deterioration of bonding quality, seriously compromising the load-bearing capacity of the structure. While traditional methods to assess bonding are destructive, herein a nondestructive methodology based on shear wave reflection is numerically investigated. Reflection relies on the bonding quality and results in discernable changes in the received waveform. The key element is the "interphase" model material with varying stiffness. The study is an example of how computational methods enhance the understanding of delicate features concerning the nondestructive evaluation of materials used in advanced structures.

  16. Microspine Gripping Mechanism for Asteroid Capture

    NASA Technical Reports Server (NTRS)

    Merriam, Ezekiel G.; Berg, Andrew B.; Willig, Andrew; Parness, Aaron; Frey, Tim; Howell, Larry L.

    2016-01-01

    This paper details the development and early testing of a compliant suspension for a microspine gripper device for asteroid capture or micro-gravity percussive drilling. The microspine gripper architecture is reviewed, and a proposed microspine suspension design is presented and discussed. Prototyping methods are discussed, as well as testing methods and results. A path forward is identified from the results of the testing completed thus far. Key findings include: the microspine concept has been established as a valid architecture and the compliant suspension exhibits the desired stiffness characteristics for good gripping behavior. These developments will aid in developing the capability to grasp irregularly shaped boulders in micro-gravity.

  17. Computations of Drop Collision and Coalescence

    NASA Technical Reports Server (NTRS)

    Tryggvason, Gretar; Juric, Damir; Nas, Selman; Mortazavi, Saeed

    1996-01-01

    Computations of drop collisions, coalescence, and other problems involving drops are presented. The computations are made possible by a finite difference/front tracking technique that allows direct solutions of the Navier-Stokes equations for a multi-fluid system with complex, unsteady internal boundaries. This method has been used to examine the various collision modes for binary collisions of drops of equal size, the mixing of two drops of unequal size, the behavior of a suspension of drops in linear and parabolic shear flows, and the thermal migration of several drops. The key results from these simulations are reviewed. Extensions of the method to phase change problems and preliminary results for boiling are also shown.

  18. After-visit summaries in primary care: mixed methods results from a literature review and stakeholder interviews.

    PubMed

    Lyles, Courtney R; Gupta, Reena; Tieu, Lina; Fernandez, Alicia

    2018-05-28

    After-visit summary (AVS) documents presenting key information from each medical encounter have become standard in the USA due to federal health care reform. Little is known about how they are used or whether they improve patient care. First, we completed a literature review and described the totality of the literature on AVS by article type and major outcome measures. Next, we used reputational sampling from large-scale US studies on primary care to identify and interview nine stakeholders on their perceptions of AVS across high-performing primary care practices. Interviews were transcribed and coded for AVS use in practice, perceptions of the best and worst features, and recommendations for improving AVS utility in routine care. The literature review resulted in 17 studies; patients reported higher perceived value of AVS compared with providers, despite poor recall of specific AVS content and varied post-visit use. In the interviews, key informants expressed enthusiasm for the potential of using AVS to reinforce key information with patients, especially if AVS were customizable. Despite this potential, key informants found that AVS included incorrect information and did not feel that patients or their practices were using AVS to enhance care. There is a gap between the potential of AVS and how providers and patients are using it in routine care. Suggestions for improved use of AVS include increasing customization, establishing care team responsibilities and workflows, and ensuring patients with communication barriers have dedicated support to review AVS during visits.

  19. Adapting the Quebecois method for assessing implementation to the French National Alzheimer Plan 2008–2012: lessons for gerontological services integration

    PubMed Central

    Somme, Dominique; Trouvé, Hélène; Perisset, Catherine; Corvol, Aline; Ankri, Joël; Saint-Jean, Olivier; de Stampa, Matthieu

    2014-01-01

    Introduction Many countries face ageing-related demographic and epidemiological challenges, notably neurodegenerative disorders, which, because of the multiple care services they require, argue for a more integrated system of care. The integrated Quebecois method derived from the Programme of Research to Integrate Services for the Maintenance of Autonomy inspired a French pilot experiment and the National Alzheimer Plan 2008–2012. Programme of Research to Integrate Services for the Maintenance of Autonomy method implementation was rated with an evaluation grid adapted to assess its successive degrees of completion. Discussion The approaching end of the president's term led to the method's institutionalization (2011–2012), before the implementation study ended. When the government changed, the study was interrupted. The results extracted from that ‘lost’ study (presented herein) have, nonetheless, ‘found’ some key lessons. Key lessons/conclusion It was possible to implement a Quebecois integrated-care method in France. We describe the lessons and pitfalls encountered in adapting this evaluation tool. This process is necessarily multidisciplinary and requires a test phase. A simple tool for quantitative assessment of integration was obtained. The first assessment of the tool was unsatisfactory, and further studies are required. In the meantime, we recommend using mixed methodologies to assess the services integration level. PMID:24959112

  20. Using molecular functional networks to manifest connections between obesity and obesity-related diseases

    PubMed Central

    Yang, Jialiang; Qiu, Jing; Wang, Kejing; Zhu, Lijuan; Fan, Jingjing; Zheng, Deyin; Meng, Xiaodi; Yang, Jiasheng; Peng, Lihong; Fu, Yu; Zhang, Dahan; Peng, Shouneng; Huang, Haiyun; Zhang, Yi

    2017-01-01

    Obesity is a primary risk factor for many diseases such as certain cancers. In this study, we have developed three algorithms, including a random-walk based method OBNet, a shortest-path based method OBsp and a direct-overlap method OBoverlap, to reveal obesity-disease connections in protein-interaction subnetworks corresponding to thousands of biological functions and pathways. Through literature mining, we also curated an obesity-associated disease list, against which we compared the methods. As a result, OBNet outperforms the other two methods. OBNet can predict whether a disease is obesity-related based on its associated genes. Meanwhile, OBNet identifies extensive connections between obesity genes and genes associated with a few diseases at various functional modules and pathways. Using breast cancer and Type 2 diabetes as two examples, OBNet identifies meaningful genes that may play key roles in connecting obesity and the two diseases. For example, TGFB1 and VEGFA are inferred to be the top two key genes mediating the obesity-breast cancer connection in modules associated with brain development. Finally, the top modules identified by OBNet in breast cancer significantly overlap with modules identified from the TCGA breast cancer gene expression study, revealing the power of OBNet in identifying biological processes involved in the disease. PMID:29156709
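    The abstract does not give OBNet's exact formulation, so as an illustration only, here is a minimal random-walk-with-restart sketch over a toy protein-interaction network; the 4-gene network, the seed choice, and the restart probability are all hypothetical.

```python
import numpy as np

def random_walk_with_restart(adj, seeds, restart=0.5, tol=1e-10):
    """Propagate seed-gene scores over a protein-interaction network.

    adj    : symmetric adjacency matrix (n x n)
    seeds  : indices of seed (e.g. obesity-associated) genes
    restart: probability of jumping back to the seed set at each step
    """
    n = adj.shape[0]
    # Column-normalize so each column sums to 1 (transition probabilities).
    col_sums = adj.sum(axis=0)
    W = adj / np.where(col_sums == 0, 1, col_sums)
    p0 = np.zeros(n)
    p0[seeds] = 1.0 / len(seeds)
    p = p0.copy()
    while True:
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

# Toy 4-gene network: genes 0-1-2-3 form a path; the seed is gene 0.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
scores = random_walk_with_restart(adj, seeds=[0])
print(scores)  # genes closer to the seed receive higher scores
```

The stationary scores give each gene's proximity to the seed set; thresholding such scores (or aggregating them per functional module) is one common way methods of this family decide whether a disease gene set is connected to the seeds.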

  1. Key Components in eHealth Interventions Combining Self-Tracking and Persuasive eCoaching to Promote a Healthier Lifestyle: A Scoping Review

    PubMed Central

    Oldenhuis, Hilbrand KE; de Groot, Martijn; Polstra, Louis; Velthuijsen, Hugo; van Gemert-Pijnen, Julia EWC

    2017-01-01

    Background The combination of self-tracking and persuasive eCoaching in automated interventions is a new and promising approach for healthy lifestyle management. Objective The aim of this study was to identify key components of self-tracking and persuasive eCoaching in automated healthy lifestyle interventions that contribute to their effectiveness on health outcomes, usability, and adherence. A secondary aim was to identify the way in which these key components should be designed to contribute to improved health outcomes, usability, and adherence. Methods The scoping review methodology proposed by Arksey and O’Malley was applied. Scopus, EMBASE, PsycINFO, and PubMed were searched for publications dated from January 1, 2013 to January 31, 2016 that included (1) self-tracking, (2) persuasive eCoaching, and (3) healthy lifestyle intervention. Results The search resulted in 32 publications, 17 of which provided results regarding the effect on health outcomes, 27 of which provided results regarding usability, and 13 of which provided results regarding adherence. Among the 32 publications, 27 described an intervention. The most commonly applied persuasive eCoaching components in the described interventions were personalization (n=24), suggestion (n=19), goal-setting (n=17), simulation (n=17), and reminders (n=15). As for self-tracking components, most interventions utilized an accelerometer to measure steps (n=11). Furthermore, the medium through which the user could access the intervention was usually a mobile phone (n=10). 
The following key components and their specific design seem to influence both health outcomes and usability in a positive way: reduction by setting short-term goals to eventually reach long-term goals, personalization of goals, praise messages, reminders to input self-tracking data into the technology, use of validity-tested devices, integration of self-tracking and persuasive eCoaching, and provision of face-to-face instructions during implementation. In addition, health outcomes or usability were not negatively affected when more effort was requested from participants to input data into the technology. The data extracted from the included publications provided limited ability to identify key components for adherence. However, one key component was identified for both usability and adherence, namely the provision of personalized content. Conclusions This scoping review provides a first overview of the key components in automated healthy lifestyle interventions combining self-tracking and persuasive eCoaching that can be utilized during the development of such interventions. Future studies should focus on the identification of key components for effects on adherence, as adherence is a prerequisite for an intervention to be effective. PMID:28765103

  2. Human body motion capture from multi-image video sequences

    NASA Astrophysics Data System (ADS)

    D'Apuzzo, Nicola

    2003-01-01

    This paper presents a method to capture the motion of the human body from multi-image video sequences without using markers. The process is composed of five steps: acquisition of video sequences, calibration of the system, surface measurement of the human body for each frame, 3-D surface tracking and tracking of key points. The image acquisition system is currently composed of three synchronized progressive scan CCD cameras and a frame grabber which acquires a sequence of triplet images. Self-calibration methods are applied to obtain the exterior orientation of the cameras, the parameters of interior orientation and the parameters modeling the lens distortion. From the video sequences, two kinds of 3-D information are extracted: a three-dimensional surface measurement of the visible parts of the body for each triplet and 3-D trajectories of points on the body. The approach for surface measurement is based on multi-image matching, using the adaptive least squares method. A fully automatic matching process determines a dense set of corresponding points in the triplets. The 3-D coordinates of the matched points are then computed by forward ray intersection using the orientation and calibration data of the cameras. The tracking process is also based on least squares matching techniques. Its basic idea is to track triplets of corresponding points in the three images through the sequence and compute their 3-D trajectories. The spatial correspondences between the three images at the same time and the temporal correspondences between subsequent frames are determined with a least squares matching algorithm. The results of the tracking process are the coordinates of a point in the three images through the sequence, thus the 3-D trajectory is determined by computing the 3-D coordinates of the point at each time step by forward ray intersection. Velocities and accelerations are also computed. 
The advantage of this tracking process is twofold: it can track natural points, without using markers; and it can track local surfaces on the human body. In the latter case, the tracking process is applied to all the points matched in the region of interest. The result can be seen as a vector field of trajectories (position, velocity and acceleration). The last step of the process is the definition of selected key points of the human body. A key point is a 3-D region defined in the vector field of trajectories, whose size can vary and whose position is defined by its center of gravity. The key points are tracked in a simple way: the position at the next time step is established by the mean value of the displacement of all the trajectories inside its region. The tracked key points lead to a final result comparable to the conventional motion capture systems: 3-D trajectories of key points which can afterwards be analyzed and used for animation or medical purposes.
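    Forward ray intersection, used above to recover 3-D coordinates from matched image points, amounts to finding the point closest in a least-squares sense to the rays cast from each camera. The sketch below uses a standard linear formulation with synthetic camera geometry; it is not the authors' exact implementation.

```python
import numpy as np

def forward_ray_intersection(centers, directions):
    """Least-squares 3-D point closest to a set of camera rays.

    centers    : camera centers c_i (3-vectors)
    directions : ray directions d_i (3-vectors, need not be unit length)
    Minimizes sum_i || (I - d_i d_i^T)(x - c_i) ||^2 over x.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

# Three synthetic cameras whose rays all pass through the point (1, 2, 3).
target = np.array([1.0, 2.0, 3.0])
centers = [np.array([0.0, 0.0, 0.0]),
           np.array([5.0, 0.0, 0.0]),
           np.array([0.0, 5.0, 1.0])]
dirs = [target - c for c in centers]
print(forward_ray_intersection(centers, dirs))  # ≈ [1. 2. 3.]
```

With noisy matched points the rays no longer meet exactly, and the same solve returns the point minimizing the summed squared distances to all rays, which is the practical case in a three-camera triplet.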

  3. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis

    PubMed Central

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    Background This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Methodology/Principal Findings Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006–2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006–2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including “natural products and polymers” with nine key technical points, “fermentation industry” with twelve, “electrical medical equipment” with four, and “diagnosis, surgery” with four. 
Conclusions/Significance The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new technological opportunities. PMID:26599967
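    The per-year standardization and regression-based trend described above can be sketched as follows. The counts are invented for illustration, and the within-year z-scoring across fields is one plausible reading of the abstract, not the paper's verified formula.

```python
import numpy as np

years = np.arange(2006, 2013)
# Hypothetical application counts: rows = fields, cols = years (not real data).
counts = np.array([
    [310, 350, 420, 460, 520, 590, 640],   # field A: growing
    [500, 480, 470, 465, 450, 440, 430],   # field B: declining
    [120, 150, 200, 260, 340, 450, 600],   # field C: fast-growing
], dtype=float)

# Standardize each year separately: z-score across fields within each column.
z = (counts - counts.mean(axis=0)) / counts.std(axis=0, ddof=1)

# Mean of the seven standardized values -> relative amount of applications.
relative_amount = z.mean(axis=1)
# Regression of standardized values on year -> trend (slope) for each field.
trend = np.array([np.polyfit(years, row, 1)[0] for row in z])

# Fields in the quadrant with high relative amount or increasing trend
# would be flagged as key fields in the two-dimensional quadrant analysis.
for name, ra, tr in zip("ABC", relative_amount, trend):
    print(f"field {name}: relative={ra:+.2f}, trend={tr:+.3f}")
```

Plotting `relative_amount` against `trend` reproduces the two-dimensional quadrant view: here field A scores high on both axes, field B has a negative trend, and field C has a low relative amount but a strongly positive trend.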

  4. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    PubMed

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve, "electrical medical equipment" with four, and "diagnosis, surgery" with four. 
The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new technological opportunities.

  5. Key Working for Families with Young Disabled Children

    PubMed Central

    Carter, Bernie; Thomas, Megan

    2011-01-01

    For families with a disabled child, the usual challenges of family life can be further complicated by the need to access a wide range of services provided by a plethora of professionals and agencies. Key working aims to support children and their families in navigating these complexities, ensuring easy access to relevant, high-quality, and coordinated care. The aim of this paper is to explore the key worker role in relation to “being a key worker” and “having a key worker”. The data within this paper draw on a larger evaluation study of the Blackpool Early Support Pilot Programme. The qualitative study used an appreciative and narrative approach and utilised mixed methods (interviews, surveys and a nominal group workshop). Data were collected from 43 participants (parents, key workers, and other stakeholders). All stakeholders who had been involved with the service were invited to participate. In the paper we present and discuss the ways in which key working made a difference to the lives of children and their families. We also consider how key working transformed the perspectives of the key workers, creating a deeper and richer understanding of family lives and the ways in which other disciplines and agencies worked. Key working contributed to the shift to a much more family-centred approach, and communication and information sharing between professionals and agencies improved. This resulted in families feeling more informed. Key workers acted in an entrepreneurial fashion, forging new relationships with families and between families and other stakeholders. Parents of young disabled children and their service providers benefited from key working. Much of the benefit came from strong, relational, and social-professional networking, which facilitated the embedding of new ways of working into everyday practice. 
Using an appreciative inquiry approach provided an effective and relevant way of engaging with parents, professionals, and other stakeholders to explore what was working well with key working within an Early Support Pilot Programme. PMID:21994827

  6. Virtual Sliding QWERTY: A new text entry method for smartwatches using Tap-N-Drag.

    PubMed

    Cha, Jae-Min; Choi, Eunjung; Lim, Jihyoun

    2015-11-01

    The smaller screens of smartwatches compared to conventional mobile devices such as PDAs and smartphones are one of the main factors that make text input difficult for users. However, several studies have only proposed concepts for entering text on smartwatches without usability tests, while other studies showed low text input performance. In this study, we proposed a new text entry method called Virtual Sliding QWERTY (VSQ), which utilizes a virtual qwerty-layout keyboard and a 'Tap-N-Drag' method to move the keyboard to the desired position. In addition, to verify VSQ we conducted a usability test with 20 participants across combinations of 5 key sizes and 4 CD-gains. As a result, VSQ achieved an average of 11.9 words per minute, which was higher than in previous studies. In particular, VSQ at a 5 × 5 key size and 2× or 3× CD-gain had the highest performance in the quantitative and qualitative usability tests. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
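    The words-per-minute figure reported above follows the standard text-entry convention that one "word" is five characters, including spaces. A one-line sketch (the character and time figures below are hypothetical, chosen only to reproduce the reported 11.9 WPM average):

```python
def words_per_minute(transcribed_chars, seconds):
    """Standard text-entry WPM: one 'word' = 5 characters (incl. spaces)."""
    return (transcribed_chars / 5) / (seconds / 60)

# e.g. 119 characters entered in 2 minutes gives the reported average rate.
print(words_per_minute(119, 120))  # 11.9
```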

  7. Mobile phones and social structures: an exploration of a closed user group in rural Ghana

    PubMed Central

    2013-01-01

    Background In the Millennium Villages Project site of Bonsaaso, Ghana, the Health Team is using a mobile phone closed user group to place calls amongst one another at no cost. Methods In order to determine the utilization and acceptability of the closed user group amongst users, social network analysis and qualitative methods were used. Key informants were identified and interviewed. The key informants also kept prospective call journals. Billing statements and de-identified call data from the closed user group were used to generate data for analyzing the social structure revealed by the network traffic. Results The majority of communication within the closed user group was personal and not for professional purposes. The members of the CUG felt that the group improved their efficiency at work. Conclusions The methods used present an interesting way to investigate the social structure surrounding communication via mobile phones. In addition, the benefits identified from the exploration of this closed user group make a case for supporting mobile phone closed user groups amongst professional groups. PMID:24007331

  8. A new method for teaching physical examination to junior medical students

    PubMed Central

    Sayma, Meelad; Williams, Hywel Rhys

    2016-01-01

    Introduction Teaching effective physical examination is a key component in the education of medical students. Preclinical medical students often have insufficient clinical knowledge to apply to physical examination recall, which may hinder their learning when taught through certain understanding-based models. This pilot project aimed to develop a method to teach physical examination to preclinical medical students using “core clinical cases”, overcoming the need for “rote” learning. Methods This project was developed utilizing three cycles of planning, action, and reflection. Thematic analysis of feedback was used to improve this model, and ensure it met student expectations. Results and discussion A model core clinical case developed in this project is described, with gout as the basis for a “foot and ankle” examination. Key limitations and difficulties encountered on implementation of this pilot are discussed for future users, including the difficulty encountered in “content overload”. Conclusion This approach aims to teach junior medical students physical examination through understanding, using a simulated patient environment. Robust research is now required to demonstrate efficacy and repeatability in the physical examination of other systems. PMID:26937208

  9. Evidence for a Familial Speech Sound Disorder Subtype in a Multigenerational Study of Oral and Hand Motor Sequencing Ability

    ERIC Educational Resources Information Center

    Peter, Beate; Raskind, Wendy H.

    2011-01-01

    Purpose: To evaluate phenotypic expressions of speech sound disorder (SSD) in multigenerational families with evidence of familial forms of SSD. Method: Members of five multigenerational families (N = 36) produced rapid sequences of monosyllables and disyllables and tapped computer keys with repetitive and alternating movements. Results: Measures…

  10. Simulating a Time-of-Flight Mass Spectrometer: A LabView Exercise

    ERIC Educational Resources Information Center

    Marty, Michael T.; Beussman, Douglas J.

    2013-01-01

    An in-depth understanding of all parameters that affect an instrumental analysis method, allowing students to explore how these instruments work so that they are not just a "black box," is key to being able to optimize the technique and obtain the best possible results. It is, however, impractical to provide such in depth coverage of…

  11. Investigating Diversity and Equality: Methods, Challenges and Implications

    ERIC Educational Resources Information Center

    Smith, Maria

    2007-01-01

    This paper sets out the background and key findings from a number of research projects about diversity and equality at a UK university. The works were commissioned as a result of changes in legislation as well as a genuine concern to investigate the issues of inequity and institutional racism within the university. The paper explores the…

  12. Meaningful Messages: Adults in the Lower Mississippi Delta Provide Cultural Insight into Strategies for Promoting the MyPyramid

    ERIC Educational Resources Information Center

    Zoellner, Jamie; Bounds, Wendy; Connell, Carol; Yadrick, Kathy; Crook, LaShaundrea

    2010-01-01

    Objective: To explore cultural perceptions of the MyPyramid key messages and identify factors that may impact adoption of these recommendations. Methods: Systematic content analysis of transcripts from in-depth, structured interviews with 23 adults, primarily African American females, residing in the Lower Mississippi Delta. Results: When asked to…

  13. A Study of the Relationship between Professional Development Evaluation and Middle School Mathematics Achievement

    ERIC Educational Resources Information Center

    Agnant Rogers, Myriam

    2013-01-01

    As a result of poor student performance, professional development has emerged as a key strategy for improving instruction and achievement. In times of reduced resources and increased accountability, schools must evaluate their efforts in order to make sound decisions about policy and practice. This mixed method study was designed to investigate…

  14. The root iron reductase assay: an examination of key factors that must be respected to generate meaningful assay results

    USDA-ARS?s Scientific Manuscript database

    Plant iron researchers have been quantifying root iron reductase activity since the 1970's, using a simple spectrophotometric method based on the color change of a ferrous iron chromophore. The technique was used by Chaney, Brown, and Tiffin (1972) to demonstrate the obligatory reduction of ferric i...

  15. Development of 3D Ice Accretion Measurement Method

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Broeren, Andy P.; Addy, Harold E., Jr.; Sills, Robert; Pifer, Ellen M.

    2012-01-01

    Icing wind tunnels are designed to simulate in-flight icing environments. The chief product of such facilities is the ice accretion that forms on various test articles. Documentation of the resulting ice accretion is a key piece of data in icing-wind-tunnel tests. A number of options are currently used for documenting ice accretion in icing-wind-tunnel testing.

  16. Evaluation of hydrogen embrittlement and temper embrittlement by key curve method in instrumented Charpy test

    NASA Astrophysics Data System (ADS)

    Ohtsuka, N.; Shindo, Y.; Makita, A.

    2010-06-01

    An instrumented Charpy test was conducted on small-sized specimens of 2 1/4Cr-1Mo steel. In the test, the single-specimen key curve method was applied to determine the fracture toughness for the initiation of crack extension in the hydrogen-free condition, KIC, and for hydrogen embrittlement cracking, KIH. The tearing modulus, as a parameter for resistance to crack extension, was also determined. The role of these parameters was discussed at an upper-shelf temperature and at a transition temperature. The key curve method combined with the instrumented Charpy test was thus shown to be usable for evaluating not only temper embrittlement but also hydrogen embrittlement.

  17. Zoning method for environmental engineering geological patterns in underground coal mining areas.

    PubMed

    Liu, Shiliang; Li, Wenping; Wang, Qiqing

    2018-09-01

    Environmental engineering geological patterns (EEGPs) are used to express the trend and intensity of eco-geological environment change caused by mining in underground coal mining areas, a complex process controlled by multiple factors. A new zoning method for EEGPs was developed based on the variable-weight theory (VWT), in which the weights of factors vary with their values. The method was applied to the Yushenfu mining area, Shaanxi, China. First, the mechanism of the EEGPs caused by mining was elucidated, and four types of EEGPs were proposed. Subsequently, 13 key control factors were selected from mining conditions, lithosphere, hydrosphere, ecosphere, and climatic conditions; their thematic maps were constructed using ArcGIS software and remote-sensing technologies. Then, a stimulation-punishment variable-weight model, derived from the partition of the basic evaluation units of the study area, the construction of the partition state-variable-weight vector, and the determination of the variable-weight intervals, was built to calculate the variable weights of each factor. On this basis, a zoning mathematical model of EEGPs was established, and the zoning results were analyzed. For comparison, the traditional constant-weight theory (CWT) was also applied to divide the EEGPs. Finally, the zoning results obtained using VWT and CWT were compared. Field verification indicates that VWT is more accurate and reliable than CWT. The zoning results are consistent with the actual conditions and are key to planning the rational development of coal resources and the protection of the eco-geological environment. Copyright © 2018 Elsevier B.V. All rights reserved.
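    The abstract does not give the stimulation-punishment model's exact form, so the toy state function below is purely an illustrative assumption: it shows how, unlike constant weighting, a variable-weight scheme can "punish" a zone by boosting the weight of a factor whose normalized value is poor.

```python
# Minimal variable-weight sketch (illustrative state function, hypothetical
# intervals and weights; not the paper's formulation).

def state_weight(x, low=0.3, high=0.7):
    """Boost the weight of factors whose normalized value falls in the
    punishment interval (below `low`); mildly stimulate very good values."""
    if x < low:
        return 1.0 + (low - x)       # punishment: poor factor gains influence
    if x > high:
        return 1.0 + 0.5 * (x - high)  # stimulation: strong factor gains a little
    return 1.0                        # neutral interval: weight unchanged

def variable_weights(base_weights, values):
    """Rescale constant weights by each factor's state, then renormalize."""
    raw = [w * state_weight(x) for w, x in zip(base_weights, values)]
    total = sum(raw)
    return [r / total for r in raw]

base = [0.4, 0.3, 0.3]        # constant weights (e.g. from expert scoring)
values = [0.9, 0.5, 0.1]      # normalized factor scores for one zone
vw = variable_weights(base, values)
score_cw = sum(w * x for w, x in zip(base, values))  # constant-weight score
score_vw = sum(w * x for w, x in zip(vw, values))    # variable-weight score
print(vw, score_cw, score_vw)
```

Because the poor third factor gains weight, the variable-weight score for the zone drops below the constant-weight score; this is the mechanism by which VWT can flag zones that a constant-weight average would smooth over.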

  18. Key drivers of airline loyalty.

    PubMed

    Dolnicar, Sara; Grabler, Klaus; Grün, Bettina; Kulnig, Anna

    2011-10-01

    This study investigates drivers of airline loyalty. It contributes to the body of knowledge in the area by investigating loyalty for a number of a priori market segments identified by airline management and by using a method which accounts for the multi-step nature of the airline choice process. The study is based on responses from 687 passengers. Results indicate that, at aggregate level, frequent flyer membership, price, the status of being a national carrier and the reputation of the airline as perceived by friends are the variables which best discriminate between travellers loyal to the airline and those who are not. Differences in drivers of airline loyalty for a number of segments were identified. For example, loyalty programs play a key role for business travellers whereas airline loyalty of leisure travellers is difficult to trace back to single factors. For none of the calculated models satisfaction emerged as a key driver of airline loyalty.

  19. A digital memories based user authentication scheme with privacy preservation.

    PubMed

    Liu, JunLiang; Lyu, Qiuyun; Wang, Qiuhua; Yu, Xiangxiang

    2017-01-01

    The traditional username/password or PIN based authentication scheme, which still remains the most popular form of authentication, has been proved insecure, unmemorable and vulnerable to guessing, dictionary attack, key-logger, shoulder-surfing and social engineering. Based on this, a large number of new alternative methods have recently been proposed. However, most of them rely on users being able to accurately recall complex and unmemorable information or using extra hardware (such as a USB Key), which makes authentication more difficult and confusing. In this paper, we propose a Digital Memories based user authentication scheme adopting homomorphic encryption and a public key encryption design which can protect users' privacy effectively, prevent tracking and provide multi-level security in an Internet & IoT environment. Also, we prove the superior reliability and security of our scheme compared to other schemes and present a performance analysis and promising evaluation results.

  20. A digital memories based user authentication scheme with privacy preservation

    PubMed Central

    Liu, JunLiang; Lyu, Qiuyun; Wang, Qiuhua; Yu, Xiangxiang

    2017-01-01

    The traditional username/password or PIN based authentication scheme, which still remains the most popular form of authentication, has been proved insecure, unmemorable and vulnerable to guessing, dictionary attack, key-logger, shoulder-surfing and social engineering. Based on this, a large number of new alternative methods have recently been proposed. However, most of them rely on users being able to accurately recall complex and unmemorable information or using extra hardware (such as a USB Key), which makes authentication more difficult and confusing. In this paper, we propose a Digital Memories based user authentication scheme adopting homomorphic encryption and a public key encryption design which can protect users’ privacy effectively, prevent tracking and provide multi-level security in an Internet & IoT environment. Also, we prove the superior reliability and security of our scheme compared to other schemes and present a performance analysis and promising evaluation results. PMID:29190659

  1. Listening for Prescriptions: A National Consultation on Pharmaceutical Policy Issues

    PubMed Central

    Morgan, Steve; Cunningham, Colleen M.

    2010-01-01

    Objectives and Methods: Pharmaceutical policy is an increasingly costly, essential and challenging component of health system management. We sought to identify priority pharmaceutical policy issues in Canada and to translate them into research priorities using key informant interviews, stakeholder surveys and a deliberative workshop. Results: We found consensus on overarching policy goals: to provide all Canadians with equitable and sustainable access to necessary medicines. We also found widespread frustration that many key pharmaceutical policy issues in Canada — including improving prescription drug financing and pricing — have been persistent challenges owing to a lack of policy coordination. The coverage of extraordinarily costly medicines for serious conditions was identified as a rapidly emerging policy issue. Conclusion: Targeted research and knowledge translation activities can help address key policy issues and, importantly, challenges of policy coordination in Canada and thereby reduce inequity and inefficiency in policy approaches and outcomes. PMID:22043223

  2. Key drivers of airline loyalty

    PubMed Central

    Dolnicar, Sara; Grabler, Klaus; Grün, Bettina; Kulnig, Anna

    2011-01-01

    This study investigates drivers of airline loyalty. It contributes to the body of knowledge in the area by investigating loyalty for a number of a priori market segments identified by airline management and by using a method which accounts for the multi-step nature of the airline choice process. The study is based on responses from 687 passengers. Results indicate that, at aggregate level, frequent flyer membership, price, the status of being a national carrier and the reputation of the airline as perceived by friends are the variables which best discriminate between travellers loyal to the airline and those who are not. Differences in drivers of airline loyalty for a number of segments were identified. For example, loyalty programs play a key role for business travellers whereas airline loyalty of leisure travellers is difficult to trace back to single factors. For none of the calculated models satisfaction emerged as a key driver of airline loyalty. PMID:27064618

  3. Evidence-based development and evaluation of mobile cognitive support apps for people on the autism spectrum: methodological conclusions from two R+D projects.

    PubMed

    Gyori, Miklos; Stefanik, Krisztina; Kanizsai-Nagy, Ildikó

    2015-01-01

    A growing body of evidence confirms that mobile digital devices have key potential as assistive/educational tools for people with autism spectrum disorders. The aim of this paper is to outline key aspects of development and evaluation methodologies that build on, and provide, systematic evidence on the effects of using such apps. We rely on the results of two R+D projects, both using quantitative and qualitative methods to support development and to evaluate the developed apps (n=54 and n=22). Analyzing the methodological conclusions from these studies, we outline guidelines for an 'ideal' R+D methodology, but we also point to important trade-offs between the need for the best systematic evidence and the limitations on development time and costs. We see these trade-offs as a key issue to be resolved in this field.

  4. Identifying key genes in glaucoma based on a benchmarked dataset and the gene regulatory network.

    PubMed

    Chen, Xi; Wang, Qiao-Ling; Zhang, Meng-Hui

    2017-10-01

    The current study aimed to identify key genes in glaucoma based on a benchmarked dataset and a gene regulatory network (GRN). Local and global noise was added to the gene expression dataset to produce a benchmarked dataset. Differentially-expressed genes (DEGs) between patients with glaucoma and normal controls were identified utilizing the Linear Models for Microarray Data (Limma) package based on the benchmarked dataset. A total of 5 GRN inference methods, including Zscore, GeneNet, the context likelihood of relatedness (CLR) algorithm, Partial Correlation coefficient with Information Theory (PCIT) and GEne Network Inference with Ensemble of Trees (Genie3), were evaluated using receiver operating characteristic (ROC) and precision-recall (PR) curves. The inference method with the best performance was selected to construct the GRN. Subsequently, topological centrality analysis (degree, closeness and betweenness) was conducted to identify key genes in the GRN of glaucoma. Finally, the key genes were validated by reverse transcription-quantitative polymerase chain reaction (RT-qPCR). A total of 176 DEGs were detected from the benchmarked dataset. The ROC and PR curves of the 5 methods were analyzed, and Genie3 had a clear advantage over the other methods; thus, Genie3 was used to construct the GRN. Following topological centrality analysis, 14 key genes for glaucoma were identified, including IL6, EPHA2 and GSTT1, and 5 of these 14 key genes were validated by RT-qPCR. Therefore, the current study identified 14 key genes in glaucoma, which may be potential biomarkers for the diagnosis of glaucoma and aid in identifying the molecular mechanism of this disease.
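
    The topological-centrality step above can be sketched in plain Python. The gene network and edges below are hypothetical, and only degree and closeness are computed (betweenness is omitted for brevity):

```python
from collections import deque

# toy undirected gene-regulatory graph (edges are hypothetical, for illustration only)
edges = [("IL6", "EPHA2"), ("IL6", "GSTT1"), ("IL6", "TP53"),
         ("EPHA2", "GSTT1"), ("TP53", "MDM2")]
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def degree(node):
    return len(graph[node])

def closeness(node):
    # (n-1) divided by the sum of shortest-path distances to every other node
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return (len(dist) - 1) / sum(dist.values())

# rank genes by (degree, closeness); the top entries are candidate key genes
ranking = sorted(graph, key=lambda g: (degree(g), closeness(g)), reverse=True)
```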

  5. Joint Carrier-Phase Synchronization and LDPC Decoding

    NASA Technical Reports Server (NTRS)

    Simon, Marvin; Valles, Esteban

    2009-01-01

    A method has been proposed to increase the degree of synchronization of a radio receiver with the phase of a suppressed carrier signal modulated with a binary-phase-shift-keying (BPSK) or quaternary-phase-shift-keying (QPSK) signal representing a low-density parity-check (LDPC) code. This method is an extended version of the method described in Using LDPC Code Constraints to Aid Recovery of Symbol Timing (NPO-43112), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 54. Both methods and the receiver architectures in which they would be implemented belong to a class of timing-recovery methods and corresponding receiver architectures characterized as pilotless in that they do not require transmission and reception of pilot signals. The proposed method calls for the use of what is known in the art as soft decision feedback to remove the modulation from a replica of the incoming signal prior to feeding this replica to a phase-locked loop (PLL) or other carrier-tracking stage in the receiver. Soft decision feedback refers to suitably processed versions of intermediate results of iterative computations involved in the LDPC decoding process. Unlike a related prior method in which hard decision feedback (the final sequence of decoded symbols) is used to remove the modulation, the proposed method does not require estimation of the decoder error probability. In a basic digital implementation of the proposed method, the incoming signal (having carrier phase θ_c) plus noise would first be converted to in-phase (I) and quadrature (Q) baseband signals by mixing it with I and Q signals at the carrier frequency [ω_c/(2π)] generated by a local oscillator. The resulting demodulated signals would be processed through one-symbol-period integrate-and-dump filters, the outputs of which would be sampled and held, then multiplied by a soft-decision version of the baseband modulated signal. The resulting I and Q products consist of terms proportional to the cosine and sine of the carrier phase θ_c as well as correlated noise components. These products would be fed as inputs to a digital PLL that would include a number-controlled oscillator (NCO), which provides an estimate of the carrier phase, θ_c.
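
    A minimal, hypothetical sketch of the soft-decision feedback idea (not the flight implementation): each BPSK sample is derotated by the current phase estimate, a tanh soft decision strips the modulation, and the residual quadrature term drives the loop update. All loop constants are invented for illustration:

```python
import cmath
import math
import random

random.seed(1)

theta = 0.4          # unknown carrier phase to be tracked (radians)
sigma = 0.1          # noise standard deviation per quadrature
est, mu = 0.0, 0.05  # phase estimate and loop gain

for _ in range(2000):
    bit = random.choice([-1.0, 1.0])                       # BPSK symbol
    noise = complex(random.gauss(0, sigma), random.gauss(0, sigma))
    r = bit * cmath.exp(1j * theta) + noise                # received sample
    d = r * cmath.exp(-1j * est)                           # derotate by current estimate
    soft = math.tanh(2 * d.real / sigma**2)                # soft-decision feedback
    err = soft * d.imag                                    # modulation-stripped phase error
    est += mu * err                                        # NCO / loop update
```

    Because the soft decision approximates the transmitted symbol sign, the product `soft * d.imag` is proportional to sin(θ_c − est) regardless of the data, so the loop pulls the estimate toward the true carrier phase.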

  6. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    PubMed Central

    Smith, Justin D.

    2013-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis); overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord: comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support among the entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874

  7. L-C Measurement Acquisition Method for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Taylor, B. Douglas; Shams, Qamar A.; Fox, Robert L.

    2003-01-01

    This paper describes a measurement acquisition method for aerospace systems that eliminates the need for sensors to have a physical connection to a power source (i.e., no lead wires) or to data acquisition equipment. Furthermore, the method does not require the sensors to be in proximity to any form of acquisition hardware. Multiple sensors can be interrogated using this method. The sensors consist of a capacitor, C(p), whose capacitance changes with changes to a physical property, p, electrically connected to an inductor, L. The method uses an antenna to broadcast electromagnetic energy that electrically excites one or more inductive-capacitive sensors via Faraday induction. This method facilitates measurements that were not previously possible because there was no practical means of providing power and data acquisition electrical connections to a sensor. Unlike traditional sensors, which measure only a single physical property, the manner in which the sensing element is interrogated allows simultaneous measurement of at least two unrelated physical properties (e.g., displacement rate and fluid level) using each constituent of the L-C element. The keys to using the method for aerospace applications are to increase the distance between the L-C elements and the interrogating antenna, to make all key components non-obtrusive, and to develop sensing elements that can easily be implemented. Techniques that have resulted in increased distance between the antenna and the sensor are presented, and fluid-level and pressure measurements using the acquisition method are demonstrated.
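
    The physical relation underlying the method is the L-C resonance f = 1/(2π√(LC)): interrogating the resonant frequency recovers the capacitance, and hence the physical property. A small sketch with a hypothetical 10 µH coil:

```python
import math

L_H = 10e-6  # inductance of the sensing coil, henries (hypothetical value)

def resonant_freq(c_farads):
    # the L-C pair rings at f = 1 / (2*pi*sqrt(L*C))
    return 1.0 / (2 * math.pi * math.sqrt(L_H * c_farads))

def capacitance_from_freq(f_hz):
    # invert the relation to recover C(p) from the interrogated frequency
    return 1.0 / (L_H * (2 * math.pi * f_hz) ** 2)

# round trip: a 100 pF sensor capacitance maps to a frequency and back
f = resonant_freq(100e-12)
c_recovered = capacitance_from_freq(f)
```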

  8. A Taxonomy-Based Approach to Shed Light on the Babel of Mathematical Models for Rice Simulation

    NASA Technical Reports Server (NTRS)

    Confalonieri, Roberto; Bregaglio, Simone; Adam, Myriam; Ruget, Francoise; Li, Tao; Hasegawa, Toshihiro; Yin, Xinyou; Zhu, Yan; Boote, Kenneth; Buis, Samuel; hide

    2016-01-01

    For most biophysical domains, differences in model structures are seldom quantified. Here, we used a taxonomy-based approach to characterise thirteen rice models. Classification keys and binary attributes for each key were identified, and models were categorised into five clusters using a binary similarity measure and the unweighted pair-group method with arithmetic mean. Principal component analysis was performed on model outputs at four sites. Results indicated that (i) differences in structure often resulted in similar predictions and (ii) similar structures can lead to large differences in model outputs. User subjectivity during calibration may have hidden expected relationships between model structure and behaviour. This explanation, if confirmed, highlights the need for shared protocols to reduce the degrees of freedom during calibration, and to limit, in turn, the risk that user subjectivity influences model performance.
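
    The clustering step can be illustrated with a toy UPGMA implementation over binary attribute vectors. The models and attributes below are invented for illustration, and Jaccard distance stands in for whichever binary similarity measure the authors used:

```python
def jaccard_dist(a, b):
    # distance between two binary attribute vectors
    inter = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return 1.0 - inter / union if union else 0.0

def upgma(items, k):
    # agglomerate until k clusters remain, always merging the pair with the
    # smallest arithmetic-mean pairwise distance (the UPGMA criterion)
    clusters = [[i] for i in range(len(items))]
    def avg_dist(c1, c2):
        return sum(jaccard_dist(items[i], items[j])
                   for i in c1 for j in c2) / (len(c1) * len(c2))
    while len(clusters) > k:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda p: avg_dist(clusters[p[0]], clusters[p[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

# binary attribute vectors for four hypothetical models (1 = process represented)
models = [[1, 1, 0], [1, 1, 0], [0, 0, 1], [0, 1, 1]]
clusters = upgma(models, 2)
```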

  9. Mixed Methods Research: What Are the Key Issues to Consider?

    ERIC Educational Resources Information Center

    Ghosh, Rajashi

    2016-01-01

    Mixed methods research (MMR) is increasingly becoming a popular methodological approach in several fields due to the promise it holds for comprehensive understanding of complex problems being researched. However, researchers interested in MMR often lack reference to a guide that can explain the key issues pertaining to the paradigm wars…

  10. Teaching Teamwork and Problem Solving Concurrently

    ERIC Educational Resources Information Center

    Goltz, Sonia M.; Hietapelto, Amy B.; Reinsch, Roger W.; Tyrell, Sharon K.

    2008-01-01

    Teamwork and problem-solving skills have frequently been identified by business leaders as being key competencies; thus, teaching methods such as problem-based learning and team-based learning have been developed. However, the focus of these methods has been on teaching one skill or the other. A key argument for teaching the skills concurrently is…

  11. Understanding Interdependencies between Heterogeneous Earth Observation Systems When Applied to Federal Objectives

    NASA Astrophysics Data System (ADS)

    Gallo, J.; Sylak-Glassman, E.

    2017-12-01

    We will present a method for assessing interdependencies between heterogeneous Earth observation (EO) systems when applied to key Federal objectives. Using data from the National Earth Observation Assessment (EOA), we present a case study that examines how frequently measurements from each of the Landsat 8 sensors are used in conjunction with heterogeneous measurements from other Earth observation sensors to develop data and information products. These EOA data allow us to map the most frequent interactions between Landsat measurements and measurements from other sensors, identify high-impact data and information products where these interdependencies occur, and identify where these combined measurements contribute most to meeting a key Federal objective within one of the 13 Societal Benefit Areas used in the EOA study. Using a value-tree framework to trace the application of data from EO systems to weighted key Federal objectives, we are able to estimate the relative contribution of individual EO systems to meeting those objectives, as well as the interdependencies between measurements from all EO systems within the EOA study. The analysis relies on a modified Delphi method to elicit from subject matter experts their relative levels of reliance on individual measurements from EO systems, including combinations of measurements. The result is a representative portfolio of all EO systems used to meet key Federal objectives. Understanding the interdependencies among a heterogeneous set of measurements, which modify the impact of any one measurement on meeting a key Federal objective, is especially important when the measurements originate from multiple agencies or from state/local/tribal, international, academic and commercial sources; this understanding can inform agency decision-making regarding mission requirements and clarify user needs.
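
    The value-tree rollup can be sketched as a weighted sum of reliance shares; all objectives, system names and numbers below are toy values, not EOA results:

```python
# toy value tree: objective weights x per-objective reliance on each EO system
objective_weights = {"weather": 0.5, "agriculture": 0.3, "disasters": 0.2}
reliance = {  # fraction of each objective's information attributed to each system
    "weather":     {"Landsat8": 0.10, "GOES": 0.70, "other": 0.20},
    "agriculture": {"Landsat8": 0.60, "GOES": 0.10, "other": 0.30},
    "disasters":   {"Landsat8": 0.30, "GOES": 0.40, "other": 0.30},
}

# roll reliance up the tree: each system's contribution to the weighted objectives
contribution = {}
for obj, w in objective_weights.items():
    for system, share in reliance[obj].items():
        contribution[system] = contribution.get(system, 0.0) + w * share
```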

  12. Final report on key comparison CCQM-K100: Analysis of copper in ethanol

    NASA Astrophysics Data System (ADS)

    Zhou, Tao; Kakoulides, Elias; Zhu, Yanbei; Jaehrling, Reinhard; Rienitz, Olaf; Saxby, David; Phukphatthanachai, Pranee; Yafa, Charun; Labarraque, Guillaume; Cankur, Oktay; Can, Süleyman Z.; Konopelko, Leonid A.; Kustikov, Yu A.; Caciano de Sena, Rodrigo; Marques Rodrigues, Janaina; Fonseca Sarmanho, Gabriel; Fortunato de Carvalho Rocha, Werickson; dos Reis, Lindomar Augusto

    2014-01-01

    The increase in renewable sources in the energy matrix of many countries is an effort to reduce dependency on crude oil and the environmental impacts associated with its use. In order to help overcome the lack of widely accepted quality standards for fuel ethanol and to guarantee its competitiveness in the international trade market, national metrology institutes (NMIs) have been working to develop certified reference materials for bio-fuels and measurement methods. Inorganic impurities such as Cu, Na and Fe may be present in fuel ethanol, and their presence is associated with corrosion and the formation of oxide deposits in some engine parts. The key comparison CCQM-K100 was carried out under the auspices of the Inorganic Analysis Working Group (IAWG) and the coordination of the National Institute of Metrology, Quality and Technology (INMETRO). The objective of this key comparison was to compare the measurement capabilities of the participants for the determination of Cu in fuel ethanol. Ten NMIs participated in this exercise and most of them used the isotopic dilution method for determining the amount of Cu. The median was chosen as the key comparison reference value (KCRV). The assigned KCRV for the Cu content was 0.3589 µg/g with a combined standard uncertainty of 0.0014 µg/g. In general, there is good agreement among the participants' results. This text appears in Appendix B of the BIPM key comparison database kcdb.bipm.org. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
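
    The KCRV computation is simply the median of the participants' results, and each lab's degree of equivalence is its deviation from that value. The lab results below are invented; only the middle value is set to the reported KCRV by construction:

```python
import statistics

# hypothetical participant results for Cu content in fuel ethanol, ug/g
results = {"NMI-A": 0.3575, "NMI-B": 0.3589, "NMI-C": 0.3601,
           "NMI-D": 0.3590, "NMI-E": 0.3583}

kcrv = statistics.median(results.values())          # KCRV = median of lab values
# degree of equivalence: each lab's deviation from the reference value
doe = {lab: round(x - kcrv, 4) for lab, x in results.items()}
```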

  13. Integrating optical finger motion tracking with surface touch events.

    PubMed

    MacRitchie, Jennifer; McPherson, Andrew P

    2015-01-01

    This paper presents a method of integrating two contrasting sensor systems for studying human interaction with a mechanical system, using piano performance as the case study. Piano technique requires both precise small-scale motion of fingers on the key surfaces and planned large-scale movement of the hands and arms. Where studies of performance often focus on one of these scales in isolation, this paper investigates the relationship between them. Two sensor systems were installed on an acoustic grand piano: a monocular high-speed camera tracking the position of painted markers on the hands, and capacitive touch sensors attached to the key surfaces which measure the location of finger-key contacts. This paper highlights a method of fusing the data from these systems, including temporal and spatial alignment, segmentation into notes and automatic fingering annotation. Three case studies demonstrate the utility of the multi-sensor data: analysis of finger flexion or extension based on touch and camera marker location, timing analysis of finger-key contact preceding and following key presses, and characterization of individual finger movements in the transitions between successive key presses. Piano performance is the focus of this paper, but the sensor method could equally apply to other fine motor control scenarios, with applications to human-computer interaction.
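
    The temporal-alignment step can be illustrated with a brute-force cross-correlation search for the lag that best aligns two event streams (toy data, not the paper's alignment pipeline):

```python
def best_lag(a, b, max_lag):
    # lag (in samples) that maximizes the raw cross-correlation of the streams
    def score(lag):
        return sum(a[i] * b[i + lag]
                   for i in range(len(a)) if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=score)

# toy event streams: the touch-sensor stream lags the camera stream by 2 samples
camera = [0, 0, 1, 3, 1, 0, 0, 2, 0]
touch  = [0, 0, 0, 0, 1, 3, 1, 0, 0, 2, 0]

lag = best_lag(camera, touch, 4)
```

    In practice one would align timestamped streams at their native sampling rates; the brute-force search over a small lag window is the same idea.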

  14. Integrating optical finger motion tracking with surface touch events

    PubMed Central

    MacRitchie, Jennifer; McPherson, Andrew P.

    2015-01-01

    This paper presents a method of integrating two contrasting sensor systems for studying human interaction with a mechanical system, using piano performance as the case study. Piano technique requires both precise small-scale motion of fingers on the key surfaces and planned large-scale movement of the hands and arms. Where studies of performance often focus on one of these scales in isolation, this paper investigates the relationship between them. Two sensor systems were installed on an acoustic grand piano: a monocular high-speed camera tracking the position of painted markers on the hands, and capacitive touch sensors attached to the key surfaces which measure the location of finger-key contacts. This paper highlights a method of fusing the data from these systems, including temporal and spatial alignment, segmentation into notes and automatic fingering annotation. Three case studies demonstrate the utility of the multi-sensor data: analysis of finger flexion or extension based on touch and camera marker location, timing analysis of finger-key contact preceding and following key presses, and characterization of individual finger movements in the transitions between successive key presses. Piano performance is the focus of this paper, but the sensor method could equally apply to other fine motor control scenarios, with applications to human-computer interaction. PMID:26082732

  15. SLAR image interpretation keys for geographic analysis

    NASA Technical Reports Server (NTRS)

    Coiner, J. C.

    1972-01-01

    Interpretation keys, provided as an easily implemented interpretation model, are suggested as a means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associative dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are also described.

  16. Synthesis of Novel 1,4- Dihydropyridine Derivatives Bearing Biphenyl-2'-Tetrazole Substitution as Potential Dual Angiotensin II Receptors and Calcium Channel Blockers

    PubMed Central

    Shahbazi Mojarrad, Javid; Zamani, Zahra; Nazemiyeh, Hossein; Ghasemi, Saeed; Asgari, Davoud

    2011-01-01

    Introduction: We report the synthesis of novel 1,4-dihydropyridine derivatives containing biphenyl-2'-tetrazole moieties. We hypothesized that merging the key structural elements present in an AT1 receptor antagonist with key structural elements of 1,4-dihydropyridine calcium channel blockers would yield novel analogs with potential dual activity at both receptors. This strategy led to the design and synthesis of dialkyl 1,4-dihydro-2,6-dimethyl-4-[2-n-alkyl-1-[2'-(1H-tetrazole-5-yl)biphenyl-4-yl]methyl]imidazole-4(or 5)-yl]-3,5-pyridinedicarboxylate analogs. Methods: These compounds were obtained by two methods starting from biphenyltetrazolyl-4-(or 5)-imidazolecarboxaldehyde intermediates employed in the classical Hantzsch condensation reaction. In the first method, the triphenylmethyl protecting group of the 4- or 5-carboxaldehyde intermediate was first removed in acidic media, and the classical Hantzsch reaction was then employed to obtain the final products. In the second method, the protected 4- or 5-carboxaldehyde intermediate was used directly in the Hantzsch reaction without a separate deprotection step. Results: The second method was more efficient than the first because the deprotection and ring-closure reactions occur simultaneously in one pot. Conclusion: Eight novel dihydropyridine analogs were synthesized using the classic Hantzsch condensation reaction. The chemical structures of the compounds were characterized by 1H NMR, infrared and mass spectroscopy. PMID:24312750

  17. Characterization of Factors Affecting Nanoparticle Tracking Analysis Results With Synthetic and Protein Nanoparticles.

    PubMed

    Krueger, Aaron B; Carnell, Pauline; Carpenter, John F

    2016-04-01

    In many manufacturing and research areas, the ability to accurately monitor and characterize nanoparticles is becoming increasingly important. Nanoparticle tracking analysis is rapidly becoming a standard method for this characterization, yet several key factors in data acquisition and analysis may affect results. Nanoparticle tracking analysis is prone to user input and bias because of the large number of adjustable parameters, analyzes only a limited sample volume, and individual sample characteristics such as polydispersity or complex protein solutions may affect analysis results. This study systematically addressed these key issues. The integrated syringe pump was used to increase the sample volume analyzed. It was observed that measurements recorded under flow yielded lower total particle counts for both polystyrene and protein particles than those collected under static conditions. In addition, data for polydisperse samples tended to lose peak resolution at higher flow rates, masking distinct particle populations. Furthermore, in a bimodal particle population, a bias was seen toward the larger species within the sample. The impacts of filtration on an agitated intravenous immunoglobulin sample and of operating parameters including "MINexps" and "blur" were investigated to optimize the method. Taken together, this study provides recommendations on instrument settings and sample preparation to properly characterize complex samples.

  18. The "Key" Method of Identifying Igneous and Metamorphic Rocks in Introductory Laboratory.

    ERIC Educational Resources Information Center

    Eves, Robert Leo; Davis, Larry Eugene

    1987-01-01

    Proposes that identification keys provide an orderly strategy for the identification of igneous and metamorphic rocks in an introductory geology course. Explains the format employed in the system and includes the actual key guides for both igneous and metamorphic rocks. (ML)

  19. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles, yet rigorous practical evaluation of bogies remains a challenge, with current practice overreliant on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can accurately assess the risk state of a bogie system. PMID:25574159
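
    The entropy weight computation follows the standard recipe: normalize each indicator column, compute its information entropy, and weight each indicator by its divergence. The matrix below is a toy example, not the bogie inspection data:

```python
import math

def entropy_weights(matrix):
    # rows = evaluation samples, columns = risk indicators (positive values)
    m, n = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    p = [[row[j] / col_sums[j] for j in range(n)] for row in matrix]
    k = 1.0 / math.log(m)
    e = [-k * sum(p[i][j] * math.log(p[i][j]) for i in range(m) if p[i][j] > 0)
         for j in range(n)]                      # information entropy per indicator
    d = [max(0.0, 1.0 - ej) for ej in e]         # divergence (clamped for rounding)
    total = sum(d)
    return [dj / total for dj in d]

# indicator 1 is identical across samples (carries no information), while
# indicator 2 varies strongly, so it should receive nearly all the weight
w = entropy_weights([[5.0, 1.0], [5.0, 2.0], [5.0, 9.0]])
```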

  20. Risk evaluation of bogie system based on extension theory and entropy weight method.

    PubMed

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles, yet rigorous practical evaluation of bogies remains a challenge, with current practice overreliant on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can accurately assess the risk state of a bogie system.

  1. A novel method for repeatedly generating speckle patterns used in digital image correlation

    NASA Astrophysics Data System (ADS)

    Zhang, Juan; Sweedy, Ahmed; Gitzhofer, François; Baroud, Gamal

    2018-01-01

    Speckle patterns play a key role in Digital Image Correlation (DIC) measurement, and generating an optimal speckle pattern has been a goal for decades. The usual method of generating a speckle pattern is to spray paint manually onto the specimen. However, this makes it difficult to reproduce the optimal pattern, maintain identical testing conditions and achieve consistent DIC results. This study proposed and evaluated a novel method that uses an atomization system to repeatedly generate speckle patterns. To verify the repeatability of the speckle patterns generated by this system, simulation and experimental studies were systematically performed. The results of both studies showed that with the proposed atomization system the speckle patterns, and accordingly the DIC measurements, are highly accurate and repeatable.

  2. A Hybrid On-line Verification Method of Relay Setting

    NASA Astrophysics Data System (ADS)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    Along with the rapid development of the power industry, grid structures have become more sophisticated, and the validity and rationality of protective relay settings are vital to the security of power systems. To increase that security, it is essential to verify relay setting values online. Traditional verification methods mainly include comparison of the protection range and comparison of the calculated setting value; for on-line verification, verification speed is the key. Comparing the protection range gives accurate results, but the computational burden is heavy and verification is slow. Comparing the calculated setting value is much faster, but the result is conservative and inaccurate. Taking overcurrent protection as an example, this paper analyses the advantages and disadvantages of the two traditional methods and proposes a hybrid on-line verification method that synthesizes their advantages. This hybrid method can meet the requirements of accurate on-line verification.
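
    A hypothetical sketch of the hybrid idea for an overcurrent relay: a cheap calculated-setting ratio screens most cases, and the slower protection-range scan runs only when the ratio falls inside a margin band. All system parameters below are invented for illustration:

```python
E = 110e3 / 3**0.5   # phase EMF of a 110 kV system, V (hypothetical)
Zs = 5.0             # source impedance, ohm
z = 0.4              # line impedance per unit length, ohm/km
L = 50.0             # protected line length, km

def fault_current(x_km):
    # metallic three-phase fault current at distance x down the line
    return E / (Zs + z * x_km)

def range_check(i_set, step=0.1):
    # protection-range scan: slow but accurate coverage computation
    x = 0.0
    while x < L and fault_current(x) >= i_set:
        x += step
    return x / L >= 0.8          # require at least 80% line coverage

def hybrid_verify(i_set, K=1.3, band=0.15):
    ratio = fault_current(0.8 * L) / i_set   # cheap calculated-setting screen
    if abs(ratio - K) > band:                # far from the margin: screen is decisive
        return ratio >= K
    return range_check(i_set)                # near the margin: fall back to exact scan
```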

  3. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique for producing digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method was evaluated on a data set of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms existing methods.
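
    The first step, edge point chaining, can be sketched as grouping 8-connected edge pixels into contiguous chains. This is a simplification of the paper's method (no ordering along the chain), with toy pixel coordinates:

```python
def chain_edges(edge_pixels):
    # group 8-connected edge pixels into contiguous chains (edge segments)
    remaining = set(edge_pixels)
    chains = []
    while remaining:
        seed = remaining.pop()
        stack, chain = [seed], []
        while stack:
            y, x = stack.pop()
            chain.append((y, x))
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    neighbour = (y + dy, x + dx)
                    if neighbour in remaining:
                        remaining.discard(neighbour)
                        stack.append(neighbour)
        chains.append(chain)
    return chains

# two disjoint diagonal strokes -> two separate edge segments
chains = chain_edges({(0, 0), (1, 1), (2, 2), (5, 5), (6, 6)})
```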

  4. A path-level exact parallelization strategy for sequential simulation

    NASA Astrophysics Data System (ADS)

    Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.

    2018-01-01

    Sequential Simulation is a well known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is to generate identical realizations as with the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedup results in the best scenarios using 16 threads of execution in a single machine.
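
    One way to form parallel batches while preserving the sequential result is to scan the path in order and flush the current batch whenever the next node conflicts (i.e., falls within the kriging neighbourhood of a node already in the batch). This is a simplified sketch, not the authors' exact re-arrangement strategy; the grid, radius and seed are arbitrary:

```python
import random

random.seed(7)

R = 2  # conflict radius: nodes closer than R (Chebyshev) share neighbourhoods (hypothetical)
nodes = [(x, y) for x in range(8) for y in range(8)]
path = random.sample(nodes, len(nodes))   # random simulation path over the grid

def conflict(a, b):
    return max(abs(a[0] - b[0]), abs(a[1] - b[1])) < R

batches, current = [], []
for node in path:
    if any(conflict(node, other) for other in current):
        batches.append(current)   # next node conflicts: flush, start a new batch
        current = [node]
    else:
        current.append(node)
batches.append(current)
# nodes within one batch have disjoint neighbourhoods, so they can be simulated
# in parallel without changing the realization produced by the sequential path
```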

  5. Pilot-multiplexed continuous-variable quantum key distribution with a real local oscillator

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Huang, Peng; Zhou, Yingming; Liu, Weiqi; Zeng, Guihua

    2018-01-01

    We propose a pilot-multiplexed continuous-variable quantum key distribution (CVQKD) scheme based on a local local oscillator (LLO). Our scheme utilizes time-multiplexing and polarization-multiplexing techniques to dramatically isolate the quantum signal from the pilot, employs two heterodyne detectors to separately detect the signal and the pilot, and adopts a phase compensation method to almost eliminate the multifrequency phase jitter. In order to analyze the performance of our scheme, a general LLO noise model is constructed. Besides the phase noise and the modulation noise, the photon-leakage noise from the reference path and the quantization noise due to the analog-to-digital converter (ADC) are also considered, both analyzed here for the first time in the LLO regime. Under such a general noise model, our scheme has a higher key rate and longer secure distance than the preexisting LLO schemes. We also conduct an experiment to verify our pilot-multiplexed scheme. Results show that it maintains a low level of phase noise and is expected to obtain a 554 kbps secure key rate within a 15 km distance under the finite-size effect.
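
    The phase-compensation step can be sketched as estimating the common drift from averaged pilot observations and derotating the signal quadratures accordingly. The numbers are hypothetical and the model is greatly simplified (a single static drift, Gaussian noise):

```python
import cmath
import random

random.seed(3)

drift = 0.7        # unknown common phase drift between transmitter and LLO (radians)
pilot_tx = 1 + 0j  # pilot symbol known to both parties

# average several noisy pilot observations to estimate the drift
obs = [pilot_tx * cmath.exp(1j * drift) +
       complex(random.gauss(0, 0.05), random.gauss(0, 0.05)) for _ in range(100)]
est = cmath.phase(sum(obs))                         # phase of the averaged pilot

signal_rx = (0.5 + 0.2j) * cmath.exp(1j * drift)    # signal quadratures, same drift
corrected = signal_rx * cmath.exp(-1j * est)        # compensate using pilot estimate
```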

  6. Spatial patch occupancy patterns of the Lower Keys marsh rabbit

    USGS Publications Warehouse

    Eaton, Mitchell J.; Hughes, Phillip T.; Nichols, James D.; Morkill, Anne; Anderson, Chad

    2011-01-01

    Reliable estimates of presence or absence of a species can provide substantial information on management questions related to distribution and habitat use but should incorporate the probability of detection to reduce bias. We surveyed for the endangered Lower Keys marsh rabbit (Sylvilagus palustris hefneri) in habitat patches on 5 Florida Key islands, USA, to estimate occupancy and detection probabilities. We derived detection probabilities using spatial replication of plots and evaluated hypotheses that patch location (coastal or interior) and patch size influence occupancy and detection. Results demonstrate that detection probability, given rabbits were present, was <0.5 and suggest that naïve estimates (i.e., estimates without consideration of imperfect detection) of patch occupancy are negatively biased. We found that patch size and location influenced probability of occupancy but not detection. Our findings will be used by Refuge managers to evaluate population trends of Lower Keys marsh rabbits from historical data and to guide management decisions for species recovery. The sampling and analytical methods we used may be useful for researchers and managers of other endangered lagomorphs and cryptic or fossorial animals occupying diverse habitats.
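
    The detection-bias argument above can be illustrated numerically. This is a minimal sketch with invented values, not the study's estimates: with per-plot detection probability p and k spatial replicates, an occupied patch is detected at least once with probability 1 − (1 − p)^k, so a naive occupancy estimate that counts only detected patches is negatively biased.

```python
def naive_occupancy(psi, p, k):
    """Expected naive occupancy estimate: true occupancy (psi) times the
    probability of at least one detection across k replicate plots."""
    p_star = 1 - (1 - p) ** k   # probability of >= 1 detection, given presence
    return psi * p_star

true_psi = 0.6                  # illustrative true patch occupancy
for p, k in [(0.3, 3), (0.45, 5)]:
    est = naive_occupancy(true_psi, p, k)
    assert est < true_psi       # naive estimate under-counts occupied patches
```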

  7. Key parameters design of an aerial target detection system on a space-based platform

    NASA Astrophysics Data System (ADS)

    Zhu, Hanlu; Li, Yejin; Hu, Tingliang; Rao, Peng

    2018-02-01

    To ensure the flight safety of aircraft and avoid a recurrence of aircraft collisions, a multi-information fusion method is proposed to design the key parameters of an aircraft target detection system on a space-based platform. The key parameters of detection waveband and spatial resolution were determined using the target-background absolute contrast, the target-background relative contrast, and the signal-to-clutter ratio. This study also used the signal-to-interference ratio to analyze system performance. Key parameters were obtained through the simulation of a specific aircraft. The simulation results show that the boundary ground sampling distances are 30 m and 35 m in the mid-wavelength infrared (MWIR) and long-wavelength infrared (LWIR) bands for most aircraft detection, and that the most suitable detection wavebands are 3.4 to 4.2 μm and 4.35 to 4.5 μm in the MWIR band and 9.2 to 9.8 μm in the LWIR band. We also found that the direction of detection has a great impact on detection efficiency, especially in the MWIR band.

  8. Signal processing for non-destructive testing of railway tracks

    NASA Astrophysics Data System (ADS)

    Heckel, Thomas; Casperson, Ralf; Rühe, Sven; Mook, Gerhard

    2018-04-01

    Increased speed, heavier loads, altered materials and modern drive systems result in an increasing number of rail flaws. The appearance of these flaws also changes continually due to the rapid change in the damage mechanisms of modern rolling stock. Hence, interpretation has become difficult when evaluating non-destructive rail testing results. Due to the changed interplay between detection methods and flaws, the recorded signals may correspond to unclassified types of rail flaws. Methods for automatic rail inspection (defect detection and classification) undergo continual development. Signal processing is a key technology for mastering the challenge of classification and maintaining resolution and detection quality independent of operation speed. The basic ideas of signal processing for classification purposes, based on the Glassy-Rail-Diagram, are presented herein. Examples of the detection of damage caused by rolling contact fatigue are also given, and synergetic effects of the combined evaluation of diverse inspection methods are shown.

  9. [Ways to improve measurement accuracy of blood glucose sensing by mid-infrared spectroscopy].

    PubMed

    Wang, Yan; Li, Ning; Xu, Kexin

    2006-06-01

    Mid-infrared (MIR) spectroscopy is applicable to blood glucose sensing without the use of any reagent; however, owing to inadequate accuracy, this method has so far not been used in clinical detection. The principle and key technologies of blood glucose sensing by MIR spectroscopy are presented in this paper. Along with our experimental results, the paper analyzes ways to enhance measurement and prediction accuracy by four methods: selection of an optimized spectral region; application of spectral data processing methods; elimination of interference from other components in the blood; and improvement of the system hardware. For these four methods we designed four corresponding experiments: strict determination of the MIR region where the spectrum changes most sensitively with glucose concentration; application of a genetic algorithm for wavelength selection; normalization of spectra to enhance measurement reproducibility; and utilization of a CO2 laser as the light source. The results show that the measurement accuracy of blood glucose concentration is enhanced almost to the level required for clinical detection.

  10. Patient-Reported Outcomes and Survivorship in Radiation Oncology: Overcoming the Cons

    PubMed Central

    Siddiqui, Farzan; Liu, Arthur K.; Watkins-Bruner, Deborah; Movsas, Benjamin

    2014-01-01

    Purpose Although patient-reported outcomes (PROs) have become a key component of clinical oncology trials, many challenges exist regarding their optimal application. The goal of this article is to methodically review these barriers and suggest strategies to overcome them. This review will primarily focus on radiation oncology examples, will address issues regarding the “why, how, and what” of PROs, and will provide strategies for difficult problems such as methods for reducing missing data. This review will also address cancer survivorship because it closely relates to PROs. Methods Key articles focusing on PROs, quality of life, and survivorship issues in oncology trials are highlighted, with an emphasis on radiation oncology clinical trials. Publications and Web sites of various governmental and regulatory agencies are also reviewed. Results The study of PROs in clinical oncology trials has become well established. There are guidelines provided by organizations such as the US Food and Drug Administration that clearly indicate the importance of and methodology for studying PROs. Clinical trials in oncology have repeatedly demonstrated the value of studying PROs and suggested ways to overcome some of the key challenges. The Radiation Therapy Oncology Group (RTOG) has led some of these efforts, and their contributions are highlighted. The current state of cancer survivorship guidelines is also discussed. Conclusion The study of PROs presents significant benefits in understanding and treating toxicities and enhancing quality of life; however, challenges remain. Strategies are presented to overcome these hurdles, which will ultimately improve cancer survivorship. PMID:25113760

  11. Lessons Learned by Community Stakeholders in the Massachusetts Childhood Obesity Research Demonstration (MA-CORD) Project, 2013–2014

    PubMed Central

    Ganter, Claudia; Aftosmes-Tobio, Alyssa; Chuang, Emmeline; Kwass, Jo-Ann; Land, Thomas

    2017-01-01

    Introduction Childhood obesity is a multifaceted disease that requires sustainable, multidimensional approaches that support change at the individual, community, and systems levels. The Massachusetts Childhood Obesity Research Demonstration project addressed this need by using clinical and public health evidence-based methods to prevent childhood obesity. To date, little information is known about successes and lessons learned from implementing such large-scale interventions. To address this gap, we examined perspectives of community stakeholders from various sectors on successes achieved and lessons learned during the implementation process. Methods We conducted 39 semistructured interviews with key stakeholders from 6 community sectors in 2 low-income communities from November 2013 through April 2014, during project implementation. Interviews were audio-recorded, transcribed, and analyzed by using the constant comparative method. Data were analyzed by using QSR NVivo 10. Results Successes included increased parental involvement in children’s health and education, increased connections within participating organizations and within the broader community, changes in organizational policies and environments to better support healthy living, and improvements in health behaviors in children, parents, and stakeholders. Lessons learned included the importance of obtaining administrative and leadership support, involving key stakeholders early in the program planning process, creating buffers that allow for unexpected changes, and establishing opportunities for regular communication within and across sectors. Conclusion Study findings indicate that multidisciplinary approaches support health behavior change and provide insight into key issues to consider in developing and implementing such approaches in low-income communities. PMID:28125400

  12. Development of a Measure of Asthma-Specific Quality of Life among Adults

    PubMed Central

    Eberhart, Nicole K.; Sherbourne, Cathy D.; Edelen, Maria Orlando; Stucky, Brian D.; Sin, Nancy L.; Lara, Marielena

    2014-01-01

    Purpose A key goal in asthma treatment is improvement in quality of life (QoL), but existing measures often confound QoL with symptoms and functional impairment. The current study addresses these limitations and the need for valid patient-reported outcome measures by using state-of-the-art methods to develop an item bank assessing QoL in adults with asthma. This article describes the process for developing an initial item pool for field testing. Methods Five focus group interviews were conducted with a total of 50 asthmatic adults. We used “pile sorting/binning” and “winnowing” methods to identify key QoL dimensions and develop a pool of items based on statements made in the focus group interviews. We then conducted a literature review and consulted with an expert panel to ensure that no key concepts were omitted. Finally, we conducted individual cognitive interviews to ensure that items were well understood and inform final item refinement. Results 661 QoL statements were identified from focus group interview transcripts and subsequently used to generate a pool of 112 items in 16 different content areas. Conclusions Items covering a broad range of content were developed that can serve as a valid gauge of individuals’ perceptions of the effects of asthma and its treatment on their lives. These items do not directly measure symptoms or functional impairment, yet they include a broader range of content than most existent measures of asthma-specific QoL. PMID:24062237

  13. Towards a New Definition of Return-to-Work Outcomes in Common Mental Disorders from a Multi-Stakeholder Perspective

    PubMed Central

    Hees, Hiske L.; Nieuwenhuijsen, Karen; Koeter, Maarten W. J.; Bültmann, Ute; Schene, Aart H.

    2012-01-01

    Objectives To examine the perspectives of key stakeholders involved in the return-to-work (RTW) process regarding the definition of successful RTW outcome after sickness absence related to common mental disorders (CMD’s). Methods A mixed-method design was used: First, we used qualitative methods (focus groups, interviews) to identify a broad range of criteria important for the definition of successful RTW (N = 57). Criteria were grouped into content-related clusters. Second, we used a quantitative approach (online questionnaire) to identify, among a larger stakeholder sample (N = 178), the clusters and criteria most important for successful RTW. Results A total of 11 clusters, consisting of 52 unique criteria, were identified. In defining successful RTW, supervisors and occupational physicians regarded “Sustainability” and “At-work functioning” most important, while employees regarded “Sustainability,” “Job satisfaction,” “Work-home balance,” and “Mental Functioning” most important. Despite agreement on the importance of certain criteria, considerable differences among stakeholders were observed. Conclusions Key stakeholders vary in the aspects and criteria they regard as important when defining successful RTW after CMD-related sickness absence. Current definitions of RTW outcomes used in scientific research may not accurately reflect these key stakeholder perspectives. Future studies should be more aware of the perspective from which they aim to evaluate the effectiveness of a RTW intervention, and define their RTW outcomes accordingly. PMID:22768180

  14. [Experimental research of turbidity influence on water quality monitoring of COD in UV-visible spectroscopy].

    PubMed

    Tang, Bin; Wei, Biao; Wu, De-Cao; Mi, De-Ling; Zhao, Jing-Xiao; Feng, Peng; Jiang, Shang-Hai; Mao, Ben-Jiang

    2014-11-01

    Eliminating the influence of turbidity is a key technical problem in the direct spectroscopic detection of COD. UV-visible detection of key water quality parameters depends on an accurate and effective analytical model, and turbidity is an important parameter that affects the modeling. In this paper, we used formazine turbidity solutions and a standard solution of potassium hydrogen phthalate to study the effect of turbidity on UV-visible absorption spectroscopic detection of COD. At the characteristic wavelengths of 245, 300, 360 and 560 nm, the change of absorbance with turbidity was fitted by the least-squares method, and the variation of absorbance with turbidity was analyzed. The results show that in the ultraviolet range of 240 to 380 nm, because the particles causing turbidity interact with the organics, the effect of turbidity on the ultraviolet spectrum of the water is relatively complicated; in the visible region of 380 to 780 nm, the effect of turbidity on the spectrum weakens as the wavelength increases. On this basis, we studied multiplicative scatter correction for calibrating water-sample spectra affected by turbidity. Comparison with the original spectra shows that the wavelength-dependent baseline shift caused by turbidity is effectively corrected and that the features in the ultraviolet region are not diminished. We then applied multiplicative scatter correction to three selected UV-visible absorption spectra; the experimental results show that, while preserving the characteristics of the UV-visible absorption spectra of the water samples, the method not only improves the signal-to-noise ratio of spectroscopic COD detection but also provides an efficient data-conditioning step for establishing accurate chemical measurement models.
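
    Multiplicative scatter correction, as used above, regresses each spectrum on a reference spectrum (commonly the mean spectrum) and removes the fitted additive offset and multiplicative slope. A minimal NumPy sketch with synthetic spectra, not the paper's data:

```python
import numpy as np

def msc(spectra, reference=None):
    """Multiplicative scatter correction: fit each spectrum x as
    x ≈ a + b * reference, then return (x - a) / b."""
    X = np.asarray(spectra, dtype=float)
    ref = X.mean(axis=0) if reference is None else np.asarray(reference, dtype=float)
    corrected = np.empty_like(X)
    for i, x in enumerate(X):
        b, a = np.polyfit(ref, x, 1)   # least-squares slope and intercept
        corrected[i] = (x - a) / b
    return corrected

# Synthetic spectra: one common shape distorted by scatter offsets/slopes.
shape = np.sin(np.linspace(0, 3, 50))
raw = np.array([1.5 * shape + 0.2, 0.7 * shape - 0.1, shape])
fixed = msc(raw)
# After correction, all spectra collapse onto the common reference shape.
assert np.allclose(fixed[0], fixed[1], atol=1e-6)
```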

  15. High Precision Seawater Sr/Ca Measurements in the Florida Keys by Inductively Coupled Plasma Atomic Emission Spectrometry: Analytical Method and Implications for Coral Paleothermometry

    NASA Astrophysics Data System (ADS)

    Khare, A.; Kilbourne, K. H.; Schijf, J.

    2017-12-01

    Standard methods of reconstructing past sea surface temperatures (SSTs) with coral skeletal Sr/Ca ratios assume the seawater Sr/Ca ratio is constant. However, there are few data to support this assumption, in part because analytical techniques capable of determining seawater Sr/Ca with sufficient accuracy and precision are expensive and time-consuming. We demonstrate a method to measure seawater Sr/Ca using inductively coupled plasma atomic emission spectrometry in which we employ an intensity ratio calibration routine that reduces the self-matrix effects of calcium and cancels out the matrix effects that are common to both calcium and strontium. A seawater standard solution cross-calibrated with multiple instruments is used to correct for long-term instrument drift and any remnant matrix effects. The resulting method produces accurate seawater Sr/Ca determinations rapidly, inexpensively, and with a precision better than 0.2%. This method will make it easier for coral paleoclimatologists to quantify potentially problematic fluctuations in seawater Sr/Ca at their study locations. We apply our method to test for variability in surface seawater Sr/Ca along the Florida Keys Reef Tract. We are collecting winter and summer samples for two years in a grid with eleven nearshore-to-offshore transects across the reef, as well as continuous samples collected by osmotic pumps at four locations adjacent to our grid. Our initial analysis of the grid samples indicates a trend of decreasing Sr/Ca values offshore, potentially due to a decreasing groundwater influence. The values differ by as much as 0.05 mmol/mol, which could lead to an error of 1°C in mean SST reconstructions. Future work involves continued sampling in the Florida Keys to test for seasonal and interannual variability in seawater Sr/Ca, as well as collecting data from small reefs in the Virgin Islands to test the stability of seawater Sr/Ca under different geologic, hydrologic and hydrographic environments.
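
    The intensity-ratio calibration idea can be sketched as follows. All numbers here are invented for illustration (Sr/Ca in mmol/mol); the drift factor stands in for the correction derived from the cross-calibrated seawater consistency standard.

```python
import numpy as np

# Calibration standards: known Sr/Ca molar ratios and measured Sr/Ca
# emission-intensity ratios (synthetic linear response for illustration).
known_ratio = np.array([7.0, 8.0, 9.0, 10.0])        # mmol/mol
intensity_ratio = 0.031 * known_ratio + 0.002        # pretend instrument response

# Fit the inverse calibration: molar ratio as a function of intensity ratio.
slope, intercept = np.polyfit(intensity_ratio, known_ratio, 1)

def sr_ca(sample_intensity_ratio, drift_factor=1.0):
    """Convert a measured intensity ratio to Sr/Ca (mmol/mol); drift_factor
    would come from a cross-calibrated consistency standard."""
    return slope * (sample_intensity_ratio * drift_factor) + intercept

# A sample with a true ratio of 8.6 mmol/mol is recovered by the calibration.
measured = 0.031 * 8.6 + 0.002
assert abs(sr_ca(measured) - 8.6) < 1e-6
```

Ratioing the Sr and Ca intensities before calibration is what cancels matrix effects common to both elements; only the residual, element-specific effects remain for the standards to absorb.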

  16. Adjusting survival time estimates to account for treatment switching in randomized controlled trials--an economic evaluation context: methods, limitations, and recommendations.

    PubMed

    Latimer, Nicholas R; Abrams, Keith R; Lambert, Paul C; Crowther, Michael J; Wailoo, Allan J; Morden, James P; Akehurst, Ron L; Campbell, Michael J

    2014-04-01

    Treatment switching commonly occurs in clinical trials of novel interventions in the advanced or metastatic cancer setting. However, methods to adjust for switching have been used inconsistently and potentially inappropriately in health technology assessments (HTAs). We present recommendations on the use of methods to adjust survival estimates in the presence of treatment switching in the context of economic evaluations. We provide background on the treatment switching issue and summarize methods used to adjust for it in HTAs. We discuss the assumptions and limitations associated with adjustment methods and draw on results of a simulation study to make recommendations on their use. We demonstrate that methods used to adjust for treatment switching have important limitations and often produce bias in realistic scenarios. We present an analysis framework that aims to increase the probability that suitable adjustment methods can be identified on a case-by-case basis. We recommend that the characteristics of clinical trials, and the treatment switching mechanism observed within them, should be considered alongside the key assumptions of the adjustment methods. Key assumptions include the "no unmeasured confounders" assumption associated with the inverse probability of censoring weights (IPCW) method and the "common treatment effect" assumption associated with the rank preserving structural failure time model (RPSFTM). The limitations associated with switching adjustment methods such as the RPSFTM and IPCW mean that they are appropriate in different scenarios. In some scenarios, both methods may be prone to bias; "2-stage" methods should be considered, and intention-to-treat analyses may sometimes produce the least bias. The data requirements of adjustment methods also have important implications for clinical trialists.

  17. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  18. 48 CFR 915.408-70 - Key personnel clause.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Key personnel clause. 915.408-70 Section 915.408-70 Federal Acquisition Regulations System DEPARTMENT OF ENERGY CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 915.408-70 Key personnel clause...

  19. Physician leaders of medical groups face increasing challenges.

    PubMed

    Gerbarg, Zack

    2002-10-01

    Physician leadership has emerged as one of the biggest challenges and opportunities for medical group success. The environment for medical groups has become increasingly complex as the result of five major factors: 1) varying reimbursement methods, 2) growth in the size of groups, 3) technology investments, 4) sale and merger of groups, and 5) regulatory and legal issues. Striking the right balance between too little or too much physician involvement in leading medical groups is a key business decision. Most large, successful businesses view investment in their leaders as critical for success. Medical groups can learn from other businesses that investment in education, coaching, and succession planning for leaders is a key to long-term success.

  20. The engineering of a scalable multi-site communications system utilizing quantum key distribution (QKD)

    NASA Astrophysics Data System (ADS)

    Tysowski, Piotr K.; Ling, Xinhua; Lütkenhaus, Norbert; Mosca, Michele

    2018-04-01

    Quantum key distribution (QKD) is a means of generating keys between a pair of computing hosts that is theoretically secure against cryptanalysis, even by a quantum computer. Although there is much active research into improving the QKD technology itself, there is still significant work to be done to apply engineering methodology and determine how it can be practically built to scale within an enterprise IT environment. Significant challenges exist in building a practical key management service (KMS) for use in a metropolitan network. QKD is generally a point-to-point technique only and is subject to steep performance constraints. The integration of QKD into enterprise-level computing has been researched, to enable quantum-safe communication. A novel method for constructing a KMS is presented that allows arbitrary computing hosts on one site to establish multiple secure communication sessions with the hosts of another site. A key exchange protocol is proposed where symmetric private keys are granted to hosts while satisfying the scalability needs of an enterprise population of users. The KMS operates within a layered architectural style that is able to interoperate with various underlying QKD implementations. Variable levels of security for the host population are enforced through a policy engine. A network layer provides key generation across a network of nodes connected by quantum links. Scheduling and routing functionality allows quantum key material to be relayed across trusted nodes. Optimizations are performed to match the real-time host demand for key material with the capacity afforded by the infrastructure. The result is a flexible and scalable architecture that is suitable for enterprise use and independent of any specific QKD technology.
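
    The trusted-node relay mentioned above can be sketched with one-time-pad hops: each trusted node removes the pad applied with the previous link's QKD key and re-pads with the next link's key. A minimal illustration, not the authors' KMS:

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def relay_key(end_key: bytes, link_keys: list) -> bytes:
    """Relay an end-to-end key across trusted nodes: each hop one-time-pads
    the key with its QKD link key, and the next node removes that pad."""
    k = end_key
    for link in link_keys:
        ciphertext = xor(k, link)    # sent over the (authenticated) channel
        k = xor(ciphertext, link)    # recovered at the next trusted node
    return k

end_key = os.urandom(32)                      # key generated at the source site
links = [os.urandom(32) for _ in range(3)]    # per-link QKD key material
assert relay_key(end_key, links) == end_key   # destination obtains the same key
```

The security caveat this illustrates is the one the architecture must manage: every intermediate node briefly holds the end-to-end key in the clear, hence "trusted" nodes.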

  1. A Computational Procedure for Identifying Bilinear Representations of Nonlinear Systems Using Volterra Kernels

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G.; Silva, Walter A.

    2008-01-01

    A computational procedure for identifying the state-space matrices corresponding to discrete bilinear representations of nonlinear systems is presented. A key feature of the method is the use of first- and second-order Volterra kernels (first- and second-order pulse responses) to characterize the system. The present method is based on an extension of a continuous-time bilinear system identification procedure given in a 1971 paper by Bruni, di Pillo, and Koch. The analytical and computational considerations that underlie the original procedure and its extension to the title problem are presented and described, pertinent numerical considerations associated with the process are discussed, and results obtained from the application of the method to a variety of nonlinear problems from the literature are presented. The results of these exploratory numerical studies are decidedly promising and provide sufficient credibility for further examination of the applicability of the method.
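
    In standard notation (not necessarily the symbols used by Bruni, di Pillo, and Koch or by the present paper), the discrete bilinear state-space representation being identified has the form

```latex
x_{k+1} = A\,x_k + N\,x_k\,u_k + b\,u_k, \qquad y_k = c^{\mathsf{T}} x_k,
```

    where the first- and second-order Volterra kernels (the first- and second-order pulse responses) constrain $A$, $b$, $c$, and the bilinear coupling matrix $N$ up to a choice of state coordinates.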

  2. Virtual-optical information security system based on public key infrastructure

    NASA Astrophysics Data System (ADS)

    Peng, Xiang; Zhang, Peng; Cai, Lilong; Niu, Hanben

    2005-01-01

    A virtual-optics based encryption model with the aid of a public key infrastructure (PKI) is presented in this paper. The proposed model employs a hybrid architecture in which our previously published encryption method based on a virtual-optics scheme (VOS) is used to encipher and decipher data, while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). The whole information security model runs under the framework of the international standard ITU-T X.509 PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOS security approach has additional features such as confidentiality, authentication, and integrity for data encryption in a network environment. Numerical experiments prove the effectiveness of the method. The security of the proposed model is briefly analyzed by examining some possible attacks from the viewpoint of cryptanalysis.
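
    The hybrid pattern described above (a symmetric cipher for the bulk data, RSA for the session key) can be sketched as below. The XOR stream cipher is a toy stand-in for the virtual-optics scheme, and the RSA parameters are illustrative textbook values with no padding; a real system would use a vetted cipher and padded RSA.

```python
import hashlib, os

# --- toy symmetric cipher standing in for the virtual-optics scheme (VOS) ---
def stream(key: bytes, n: int) -> bytes:
    """Derive n keystream bytes from the session key via a hash counter."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def sym_encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, stream(key, len(data))))

sym_decrypt = sym_encrypt    # an XOR stream cipher is its own inverse

# --- textbook RSA (tiny toy parameters) wraps the session key ---
p, q, e = 61, 53, 17
n, phi = p * q, (p - 1) * (q - 1)
d = pow(e, -1, phi)          # private exponent (modular inverse of e)

session_key = os.urandom(16)
data = b"virtual-optics payload"
ciphertext = sym_encrypt(session_key, data)
wrapped = [pow(b, e, n) for b in session_key]       # sender: RSA-encrypt key bytes
unwrapped = bytes(pow(c, d, n) for c in wrapped)    # receiver: RSA-decrypt key bytes
assert sym_decrypt(unwrapped, ciphertext) == data
```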

  3. Forensic steganalysis: determining the stego key in spatial domain steganography

    NASA Astrophysics Data System (ADS)

    Fridrich, Jessica; Goljan, Miroslav; Soukal, David; Holotyak, Taras

    2005-03-01

    This paper is an extension of our work on stego key search for JPEG images published at EI SPIE in 2004. We provide a more general theoretical description of the methodology, apply our approach to the spatial domain, and add a method that determines the stego key from multiple images. We show that in the spatial domain the stego key search can be made significantly more efficient by working with the noise component of the image obtained using a denoising filter. The technique is tested on the LSB embedding paradigm and on a special case of embedding by noise adding (the +/-1 embedding). The stego key search can be performed for a wide class of steganographic techniques even for sizes of secret message well below those detectable using known methods. The proposed strategy may prove useful to forensic analysts and law enforcement.
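
    A toy version of stego key search on the LSB embedding paradigm (the simple exhaustive-search principle, not the paper's noise-component technique; all names and parameters are invented) can be sketched as follows: embedding follows a key-derived pseudo-random pixel path, so a candidate key is tested by extracting along its path and checking for a known header.

```python
import random

def lsb_path(key: int, n_pixels: int, n_bits: int):
    """Pseudo-random embedding path derived deterministically from the key."""
    return random.Random(key).sample(range(n_pixels), n_bits)

def embed(pixels, key, bits):
    out = list(pixels)
    for pos, bit in zip(lsb_path(key, len(pixels), len(bits)), bits):
        out[pos] = (out[pos] & ~1) | bit    # overwrite the least significant bit
    return out

def extract(pixels, key, n_bits):
    return [pixels[pos] & 1 for pos in lsb_path(key, len(pixels), n_bits)]

header = [0, 1, 1, 0, 1, 0, 0, 1]           # known magic prefix of the message
payload = header + [random.Random(7).randint(0, 1) for _ in range(64)]
cover = [random.Random(1).randint(0, 255) for _ in range(4096)]
stego = embed(cover, key=1234, bits=payload)

# Brute-force candidate keys; the correct key reproduces the known header
# (wrong keys may occasionally match by chance and need a longer check).
hits = [k for k in range(2000) if extract(stego, k, len(header)) == header]
assert 1234 in hits
```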

  4. Key node selection in minimum-cost control of complex networks

    NASA Astrophysics Data System (ADS)

    Ding, Jie; Wen, Changyun; Li, Guoqi

    2017-11-01

    Finding the key node set that is connected to a given number of external control sources for driving a complex network from an initial state to any predefined state with minimum cost, known as the minimum-cost control problem, is critically important but remains largely open. By defining an importance index for each node, we propose a revisited projected gradient method extension (R-PGME) in a Monte-Carlo scenario to determine the key node set. It is found that the importance index of a node is strongly correlated with the rate at which that node is selected as a key node in Monte-Carlo realizations, for three elementary topologies as well as Erdős-Rényi and scale-free networks. We also discover the distribution patterns of the key nodes when the control cost reaches its minimum. Specifically, the importance indices of all nodes in an elementary stem show a quasi-periodic distribution with high peak values at the beginning and end of a quasi-period, while they approach a uniform distribution in an elementary cycle. We further point out that an elementary dilation can be regarded as two elementary stems of the closest possible lengths, with the importance indices in each stem distributed much as in an elementary stem. Our results provide a better understanding of, and deeper insight into, locating the key nodes in different topologies with minimum control cost.

  5. Improved One-Way Hash Chain and Revocation Polynomial-Based Self-Healing Group Key Distribution Schemes in Resource-Constrained Wireless Networks

    PubMed Central

    Chen, Huifang; Xie, Lei

    2014-01-01

    Self-healing group key distribution (SGKD) aims to deal with the key distribution problem over an unreliable wireless network. In this paper, we investigate the SGKD issue in resource-constrained wireless networks. We propose two improved SGKD schemes using the one-way hash chain (OHC) and the revocation polynomial (RP), the OHC&RP-SGKD schemes. In the proposed OHC&RP-SGKD schemes, by introducing the unique session identifier and binding the joining time with the capability of recovering previous session keys, the problem of the collusion attack between revoked users and new joined users in existing hash chain-based SGKD schemes is resolved. Moreover, novel methods for utilizing the one-way hash chain and constructing the personal secret, the revocation polynomial and the key updating broadcast packet are presented. Hence, the proposed OHC&RP-SGKD schemes eliminate the limitation of the maximum allowed number of revoked users on the maximum allowed number of sessions, increase the maximum allowed number of revoked/colluding users, and reduce the redundancy in the key updating broadcast packet. Performance analysis and simulation results show that the proposed OHC&RP-SGKD schemes are practical for resource-constrained wireless networks in bad environments, where a strong collusion attack resistance is required and many users could be revoked. PMID:25529204
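
    The one-way hash chain at the heart of such schemes can be sketched with SHA-256. This is a minimal illustration of the chain mechanism only, not the paper's full OHC&RP construction: chain values are released in reverse order of computation, so holding the value for session j lets a member recompute every earlier released value (self-healing) but no later one.

```python
import hashlib

def hash_chain(seed: bytes, length: int):
    """Build a one-way hash chain; element j is released in session j, and
    hashing any element reproduces the element released before it."""
    chain = [seed]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain[::-1]    # reverse: earliest-released value is the most-hashed

chain = hash_chain(b"group-seed", 5)

# A member who obtains chain[3] can recover the session-0 value by hashing
# three times, but cannot invert the hash to obtain chain[4].
recovered = chain[3]
for _ in range(3):
    recovered = hashlib.sha256(recovered).digest()
assert recovered == chain[0]
```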

  6. [Discussion on appraisal methods and key technologies of arbuscular mycorrhizal fungi and medicinal plant symbiosis system].

    PubMed

    Chen, Meilan; Guo, Lanping; Yang, Guang; Chen, Min; Yang, Li; Huang, Luqi

    2011-11-01

    Applications of arbuscular mycorrhizal fungi in research on medicinal plant cultivation have increased in recent years. The habitats of medicinal plants are complicated and their roots contain many inclusions, whereas crop habitats are simple and crop roots contain few inclusions. The appraisal methods and key technologies developed for the symbiotic system of crops and arbuscular mycorrhizal fungi are therefore not completely suitable for the symbiotic system of medicinal plants and arbuscular mycorrhizal fungi. This article discusses the appraisal methods and key technologies for the symbiotic system of medicinal plants and arbuscular mycorrhizal fungi, covering the isolation and identification of arbuscular mycorrhizae and the appraisal of colonization intensity, and provides guidance for applied research on arbuscular mycorrhizal fungi in the cultivation of medicinal plants.

  7. Key techniques and risk management for the application of the Pile-Beam-Arch (PBA) excavation method: a case study of the Zhongjie subway station.

    PubMed

    Guan, Yong-ping; Zhao, Wen; Li, Shen-gang; Zhang, Guo-bin

    2014-01-01

    The design and construction of shallow-buried tunnels in densely populated urban areas involve many challenges. The ground movements induced by tunneling effects pose potential risks to infrastructure such as surface buildings, pipelines, and roads. In this paper, a case study of the Zhongjie subway station located in Shenyang, China, is examined to investigate the key construction techniques and the influence of the Pile-Beam-Arch (PBA) excavation method on the surrounding environment. This case study discusses the primary risk factors affecting the environmental safety and summarizes the corresponding risk mitigation measures and key techniques for subway station construction using the PBA excavation method in a densely populated urban area.

  8. RSA-Based Password-Authenticated Key Exchange, Revisited

    NASA Astrophysics Data System (ADS)

    Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki

    The RSA-based Password-Authenticated Key Exchange (PAKE) protocols have been proposed to realize both mutual authentication and generation of secure session keys in a setting where a client shares only his/her password with a server, and the server must generate its RSA public/private key pair (e, n), (d, n) every time due to the lack of a PKI (Public-Key Infrastructure). One way to avoid a special kind of off-line attack (the so-called e-residue attack) in RSA-based PAKE protocols is to deploy a challenge/response method by which a client verifies the relative primality of e and φ(n) interactively with a server. However, previous RSA-based PAKE protocols of this kind gave no proof of the underlying challenge/response method and therefore could not specify the exact complexity of their protocols, since the challenge/response method requires an additional security parameter. In this paper, we first present an RSA-based PAKE (RSA-PAKE) protocol that can deploy two different challenge/response methods (denoted Challenge/Response Method1 and Challenge/Response Method2). The main contributions of this work are as follows: (1) based on number theory, we prove that Challenge/Response Method1 and Challenge/Response Method2 are secure against e-residue attacks for any odd prime e; (2) with the security parameter for on-line attacks, we show that the RSA-PAKE protocol is provably secure in the random oracle model, where none of the off-line attacks is more efficient than an on-line dictionary attack; and (3) by considering the Hamming weight of e and its complexity in the RSA-PAKE protocol, we search for primes recommended for practical use. We also compare the RSA-PAKE protocol with previous ones, mainly in terms of computation and communication complexities.
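The e-residue check the abstract refers to ultimately establishes that gcd(e, φ(n)) = 1, i.e. that exponentiation by e is a permutation of the residues mod n. A minimal sketch of that underlying number-theoretic condition (not the interactive challenge/response protocol itself, whose details are in the paper):

```python
from math import gcd

def rsa_exponent_valid(e: int, p: int, q: int) -> bool:
    """An RSA public exponent e is usable iff gcd(e, phi(n)) == 1,
    i.e. x -> x**e mod n is a permutation of the residues mod n = p*q."""
    phi = (p - 1) * (q - 1)
    return gcd(e, phi) == 1

# e = 3 fails when 3 divides phi(n): phi = 6 * 10 = 60 here
assert not rsa_exponent_valid(3, 7, 11)
# e = 7 works for p = 11, q = 13: gcd(7, 120) == 1
assert rsa_exponent_valid(7, 11, 13)
```

A malicious server in the PKI-free setting could send an e that violates this condition, which is what enables e-residue attacks; the challenge/response methods let the client gain confidence in the condition without learning the factorization of n.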

  9. A State-of-the-Art Review: Personalization of Tinnitus Sound Therapy

    PubMed Central

    Searchfield, Grant D.; Durai, Mithila; Linford, Tania

    2017-01-01

    Background: There are several established, and an increasing number of putative, therapies using sound to treat tinnitus. There appear to be few guidelines for sound therapy selection and application. Aim: To review current approaches to personalizing sound therapy for tinnitus. Methods: A “state-of-the-art” review (Grant and Booth, 2009) was undertaken to answer the question: how do current sound-based therapies for tinnitus adjust for tinnitus heterogeneity? Scopus, Google Scholar, Embase and PubMed were searched for the 10-year period 2006–2016. The search strategy used the following key words: “tinnitus” AND “sound” AND “therapy” AND “guidelines” OR “personalized” OR “customized” OR “individual” OR “questionnaire” OR “selection.” The results of the review were cataloged and organized into themes. Results: In total, 165 articles were reviewed in full; 83 contained sufficient detail to contribute to answering the study question. The key themes identified were hearing compensation, pitch-matched therapy, maskability, reaction to sound, and psychosocial factors. Although many therapies mentioned customization, few could be classified as being personalized. Several psychoacoustic and questionnaire-based methods for assisting treatment selection were identified. Conclusions: Assessment methods are available to help clinicians personalize sound therapy and empower patients to be active in therapy decision-making. Most current therapies are modified using only one characteristic of the individual and/or their tinnitus. PMID:28970812

  10. Noise-free recovery of optodigital encrypted and multiplexed images.

    PubMed

    Henao, Rodrigo; Rueda, Edgar; Barrera, John F; Torroba, Roberto

    2010-02-01

    We present a method that allows storing multiple encrypted data using digital holography and a joint transform correlator architecture with a controllable angle reference wave. In this method, the information is multiplexed by using a key and a different reference wave angle for each object. In the recovering process, the use of different reference wave angles prevents noise produced by the nonrecovered objects from being superimposed on the recovered object; moreover, the position of the recovered object in the exit plane can be fully controlled. We present the theoretical analysis and the experimental results that show the potential and applicability of the method.

  11. Hiding message into DNA sequence through DNA coding and chaotic maps.

    PubMed

    Liu, Guoyan; Liu, Hongjun; Kadir, Abdurahman

    2014-09-01

    The paper proposes an improved reversible substitution method to hide data in a deoxyribonucleic acid (DNA) sequence. Four measures are taken to enhance robustness and enlarge the hiding capacity: encoding the secret message by DNA coding, encrypting it with a pseudo-random sequence, generating the relative hiding locations with a piecewise linear chaotic map, and embedding the encoded and encrypted message into a randomly selected DNA sequence using the complementary rule. The key space and the hiding capacity are analyzed. Experimental results indicate that the proposed method outperforms competing methods with respect to robustness and capacity.
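As an illustration of one ingredient the abstract names, a piecewise linear chaotic map (PWLCM) can generate pseudo-random hiding locations from a secret seed. The sketch below is a generic PWLCM; the quantization of map states to sequence indices is an assumption for illustration, not the paper's exact scheme:

```python
def pwlcm(x: float, p: float) -> float:
    """Piecewise linear chaotic map on (0, 1) with control parameter 0 < p < 0.5."""
    if x < p:
        return x / p
    if x <= 0.5:
        return (x - p) / (0.5 - p)
    return pwlcm(1.0 - x, p)  # the map is symmetric about x = 0.5

def hiding_locations(seed: float, p: float, count: int, seq_len: int) -> list:
    """Iterate the map from a secret seed and quantize each state to an
    index into the DNA sequence (quantization scheme is illustrative)."""
    locs, x = [], seed
    for _ in range(count):
        x = pwlcm(x, p)
        locs.append(int(x * seq_len) % seq_len)
    return locs
```

Because the orbit is determined entirely by the seed and the parameter p, both act as part of the secret key: a receiver with the same (seed, p) regenerates the same location sequence.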

  12. The method for froth floatation condition recognition based on adaptive feature weighted

    NASA Astrophysics Data System (ADS)

    Wang, Jieran; Zhang, Jun; Tian, Jinwen; Zhang, Daimeng; Liu, Xiaomao

    2018-03-01

    The fusion of foam features can play a complementary role in expressing the content of a froth image, and the weighting of the foam features is the key to making full use of the relationships between the different features. In this paper, an adaptive feature-weighted method for froth flotation condition recognition is proposed. Foam features without and with weights are both classified using a support vector machine (SVM). The classification accuracy and the optimal weighting under each ore grade are taken as the results of the adaptive feature-weighting algorithm, and the effectiveness of the adaptive weighted method is demonstrated.
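The abstract does not give the weighting rule, so the sketch below substitutes a simple separability-based (Fisher-score) weighting to show the general idea of scaling features before feeding them to an SVM; the weight formula and the synthetic data are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def fisher_weights(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Per-feature weights from two-class separability (Fisher score),
    normalized to sum to 1; a stand-in for the paper's adaptive weighting."""
    c0, c1 = X[y == 0], X[y == 1]
    score = (c0.mean(0) - c1.mean(0)) ** 2 / (c0.var(0) + c1.var(0) + 1e-12)
    return score / score.sum()

# Synthetic data: only feature 0 separates the two "conditions"
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)),
               rng.normal([2.0, 0.0, 0.0], 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
w = fisher_weights(X, y)
X_weighted = X * w  # weighted features would then be fed to the SVM
```

Scaling each feature by its weight stretches the discriminative axes of the input space, so the subsequent SVM margin is dominated by informative features rather than noisy ones.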

  13. Advances in high throughput DNA sequence data compression.

    PubMed

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz

    2016-06-01

    Advances in high throughput sequencing technologies and reduction in cost of sequencing have led to exponential growth in high throughput DNA sequence data. This growth has posed challenges such as storage, retrieval, and transmission of sequencing data. Data compression is used to cope with these challenges. Various methods have been developed to compress genomic and sequencing data. In this article, we present a comprehensive review of compression methods for genome and reads compression. Algorithms are categorized as referential or reference free. Experimental results and comparative analysis of various methods for data compression are presented. Finally, key challenges and research directions in DNA sequence data compression are highlighted.

  14. Spatio-Temporal Process Simulation of Dam-Break Flood Based on SPH

    NASA Astrophysics Data System (ADS)

    Wang, H.; Ye, F.; Ouyang, S.; Li, Z.

    2018-04-01

    On the basis of introducing the SPH (Smoothed Particle Hydrodynamics) simulation method, solutions are given in this paper for the key research problems: the spatial and temporal scales adapted to GIS (Geographical Information System) applications, the boundary condition equations combined with the underlying surface, and the kernel function and parameters applicable to dam-break flood simulation. On this basis, a calculation method for spatio-temporal process emulation of dam-break floods with elaborate particles is proposed, and the spatio-temporal process is dynamically simulated using GIS modelling and visualization. The results show that the method provides more information, objectivity, and realism.
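For readers unfamiliar with SPH, field values are interpolated through a smoothing kernel W(r, h). A common choice is the cubic spline kernel, sketched below in its 2-D form; the kernel and parameters actually chosen in the paper may differ:

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """2-D cubic spline smoothing kernel W(r, h) with support radius 2h;
    a standard SPH choice, not necessarily the paper's kernel."""
    sigma = 10.0 / (7.0 * np.pi * h ** 2)  # 2-D normalization constant
    q = np.asarray(r, dtype=float) / h
    return sigma * np.where(
        q <= 1.0, 1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
        np.where(q <= 2.0, 0.25 * (2.0 - q) ** 3, 0.0),
    )
```

The compact support (W vanishes beyond 2h) is what makes particle interactions local, and the smoothing length h is one of the parameters that must be tuned to the spatial scale of the GIS terrain data.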

  15. RESEARCH: An Ecoregional Approach to the Economic Valuation of Land- and Water-Based Recreation in the United States

    PubMed

    Bhat; Bergstrom; Teasley; Bowker; Cordell

    1998-01-01

    This paper describes a framework for estimating the economic value of outdoor recreation across different ecoregions. Ten ecoregions in the continental United States were defined based on similarly functioning ecosystem characters. The individual travel cost method was employed to estimate recreation demand functions for activities such as motor boating and waterskiing, developed and primitive camping, coldwater fishing, sightseeing and pleasure driving, and big game hunting for each ecoregion. While our ecoregional approach differs conceptually from previous work, our results appear consistent with previous travel cost method valuation studies. KEY WORDS: Recreation; Ecoregion; Travel cost method; Truncated Poisson model
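The truncated Poisson model named in the key words handles trip counts that are observed only for people who actually visited (k ≥ 1). A minimal sketch of the zero-truncated Poisson pmf such count-data travel cost models build on:

```python
from math import exp, factorial

def trunc_poisson_pmf(k: int, lam: float) -> float:
    """Zero-truncated Poisson pmf P(K = k | K >= 1), the count-data
    distribution underlying truncated Poisson travel cost models."""
    if k < 1:
        return 0.0
    return (lam ** k) * exp(-lam) / (factorial(k) * (1.0 - exp(-lam)))
```

In a travel cost regression, lam is typically modeled as exp(Xβ) with travel cost among the covariates; dividing by 1 − exp(−lam) corrects the likelihood for the fact that non-visitors (k = 0) never enter the on-site sample.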

  16. Standardization of Tc-99 by two methods and participation at the CCRI(II)-K2. Tc-99 comparison.

    PubMed

    Sahagia, M; Antohe, A; Ioan, R; Luca, A; Ivan, C

    2014-05-01

    The work accomplished through participation in the 2012 key comparison of Tc-99 is presented. The solution was standardized for the first time in IFIN-HH by two methods: LSC-TDCR and 4π(PC)β-γ efficiency tracing. The methods are described and the results are compared. For the LSC-TDCR method, the program TDCR07c, written and provided by P. Cassette, was used for processing the measurement data. The results are 2.1% higher than those obtained with the TDCR06b program; the higher value, calculated with the TDCR07c software, was used for reporting the final result in the comparison. The tracer used for the 4π(PC)β-γ efficiency tracer method was a standard (60)Co solution. The sources were prepared from the mixed (60)Co + (99)Tc solution, and a general extrapolation curve of the type N_β(Tc-99)/M(Tc-99) = f[1 − ε(Co-60)] was drawn. This value was not used for the final result of the comparison. The difference between the activity concentrations obtained by the two methods was within the combined standard uncertainty of the difference of the two results. © 2013 Published by Elsevier Ltd.
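The extrapolation step of the efficiency tracer method fits the measured count-rate ratio against 1 − ε and reads off the intercept at full tracer efficiency. A sketch with made-up data points (the real curve shape, uncertainties, and units are in the paper):

```python
import numpy as np

# Hypothetical efficiency-tracer data (numbers invented for illustration):
# measured ratio N_beta(Tc-99) / M(Tc-99) versus (1 - epsilon_Co60).
one_minus_eff = np.array([0.05, 0.10, 0.15, 0.20])
ratio = np.array([101.0, 102.1, 102.9, 104.2])

# Linear fit; the activity concentration is the intercept at
# (1 - epsilon) = 0, i.e. extrapolated to full tracer efficiency.
slope, intercept = np.polyfit(one_minus_eff, ratio, 1)
activity_concentration = intercept
```

In practice the extrapolation function f need not be linear; its form is chosen from the measured efficiency dependence, and the fit uncertainty propagates into the reported standard uncertainty.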

  17. Topology-optimization-based design method of flexures for mounting the primary mirror of a large-aperture space telescope.

    PubMed

    Hu, Rui; Liu, Shutian; Li, Quhao

    2017-05-20

    For the development of a large-aperture space telescope, one of the key techniques is the method for designing the flexures for mounting the primary mirror, as the flexures are the key components. In this paper, a topology-optimization-based method for designing flexures is presented. The structural performances of the mirror system under multiple load conditions, including static gravity and thermal loads, as well as the dynamic vibration, are considered. The mirror surface shape error caused by gravity and the thermal effect is treated as the objective function, and the first-order natural frequency of the mirror structural system is taken as the constraint. The pattern repetition constraint is added, which can ensure symmetrical material distribution. The topology optimization model for flexure design is established. The substructuring method is also used to condense the degrees of freedom (DOF) of all the nodes of the mirror system, except for the nodes that are linked to the mounting flexures, to reduce the computation effort during the optimization iteration process. A potential optimized configuration is achieved by solving the optimization model and post-processing. A detailed shape optimization is subsequently conducted to optimize its dimension parameters. Our optimization method deduces new mounting structures that significantly enhance the optical performance of the mirror system compared to the traditional methods, which only focus on the parameters of existing structures. Design results demonstrate the effectiveness of the proposed optimization method.

  18. Long-Term Outcome from a Medium Secure Service for People with Intellectual Disability

    ERIC Educational Resources Information Center

    Alexander, R. T.; Crouch, K.; Halstead, S.; Piachaud, J.

    2006-01-01

    Background: The purpose of this paper is to describe long-term outcomes for patients discharged over a 12-year period from a medium secure service for people with intellectual disabilities (ID). Methods: A cohort study using case-notes analysis and a structured interview of current key informants. Results: Eleven per cent of the sample was…

  19. Using Nominal Technique to Inform a Sexual Health Program for Black Youth

    ERIC Educational Resources Information Center

    Annang, Lucy; Hannon, Lonnie; Fletcher, Faith E.; Horn, Wendy Sykes; Cornish, Disa

    2011-01-01

    Objectives: To describe how nominal group technique (NGT) was used to inform the development of a sexual health education program for black high school youth in the South. Methods: NGT was used with a community advisory board (CAB) to obtain information regarding the key components of a sexual health program for youth in their community. Results:…

  20. "I Do but I Don't": The Search for Identity in Urban African American Adolescents

    ERIC Educational Resources Information Center

    Gullan, Rebecca Lakin; Hoffman, Beth Necowitz; Leff, Stephen S.

    2011-01-01

    Achievement of a coherent and strong sense of self is critical to positive academic outcomes for urban minority youth. The present study utilized a mixed-methods approach to explore key aspects of identity development for African American adolescents living in a high-poverty, urban neighborhood. Results suggest that efforts to develop a sense of…

  1. [Essential procedure and key methods for survey of traditional knowledge related to Chinese materia medica resources].

    PubMed

    Cheng, Gong; Huang, Lu-qi; Xue, Da-yuan; Zhang, Xiao-bo

    2014-12-01

    The survey of traditional knowledge related to Chinese materia medica resources is an important component and one of the innovative aspects of the fourth national survey of Chinese materia medica resources. China has rich traditional knowledge of traditional Chinese medicine (TCM), and the comprehensive investigation of TCM traditional knowledge aims to promote the conservation and sustainable use of Chinese materia medica resources. Building upon the field work of pilot investigations, this paper introduces the essential procedures and key methods for conducting the survey of traditional knowledge related to Chinese materia medica resources. The essential procedures are as follows. First is the preparation phase. It is important to review all relevant literature and provide training to the survey teams so that they have a clear understanding of the concept of traditional knowledge and master the key survey methods. Second is the field investigation phase. When conducting field investigations, survey teams should identify traditional knowledge holders by using the 'snowball method' and record the traditional knowledge after obtaining prior informed consent from the traditional knowledge holders. Researchers should fill out the survey forms provided by the Technical Specification of the Fourth National Survey of Chinese Materia Medica Resources. Researchers should pay particular attention to the scope of traditional knowledge and the method of inheriting the knowledge, which are the key information for traditional knowledge holders and potential users to reach mutually agreed terms and achieve benefit sharing. Third is the data compilation and analysis phase. Researchers should compile and edit the TCM traditional knowledge in accordance with intellectual property rights requirements so that the information collected through the national survey can serve as the basic data for the TCM traditional knowledge database. The key methods of the survey include regional division of Chinese materia medica resources, interviews with key information holders, and standardization of information. In particular, using the 'snowball method' can effectively identify traditional knowledge holders in the targeted regions, and ensuring that traditional knowledge holders give prior informed consent before sharing information with researchers protects the rights of the knowledge holders. Employing the right survey methods is not only the key to obtaining traditional knowledge related to Chinese materia medica resources, but also the pathway to fulfilling the objectives of access and benefit sharing stipulated in the Convention on Biological Diversity. It will promote the legal protection of TCM traditional knowledge and the conservation of TCM intangible cultural heritage.

  2. Numerical Simulation and Optimization of Directional Solidification Process of Single Crystal Superalloy Casting

    PubMed Central

    Zhang, Hang; Xu, Qingyan; Liu, Baicheng

    2014-01-01

    The rapid development of numerical modeling techniques has led to more accurate results in modeling metal solidification processes. In this study, the cellular automaton-finite difference (CA-FD) method was used to simulate the directional solidification (DS) process of single crystal (SX) superalloy blade samples. Experiments were carried out to validate the simulation results. Meanwhile, an intelligent model based on fuzzy control theory was built to optimize the complicated DS process. Several key parameters, such as the mushy zone width and the temperature difference at the cast-mold interface, were chosen as the input variables. The input variables were processed with multivariable fuzzy rules to obtain the output adjustment of the withdrawal rate (v), a key technological parameter. The multivariable fuzzy rules were built based on the structural features of the casting (such as the section area), the delay time of the temperature change response to changes in v, and the professional experience of the operator. The fuzzy control model coupled with the CA-FD method could then be used to optimize v in real time during the manufacturing process. The optimized process was proven to be more flexible and adaptive for a steady, stray-grain-free DS process. PMID:28788535
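A Sugeno-style toy version of such a fuzzy adjustment rule, with illustrative membership ranges and output values that are not the paper's calibration, might look like:

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adjust_withdrawal_rate(mushy_width: float, temp_diff: float) -> float:
    """Toy two-rule Sugeno controller: a wide mushy zone OR a small
    interface temperature difference calls for slowing down; a narrow
    zone AND a large difference allows speeding up. All ranges and
    outputs are illustrative, not the paper's calibration."""
    slow = max(tri(mushy_width, 5.0, 10.0, 15.0), tri(temp_diff, 0.0, 2.0, 5.0))
    fast = min(tri(mushy_width, 0.0, 2.0, 6.0), tri(temp_diff, 4.0, 8.0, 12.0))
    num = slow * (-0.2) + fast * 0.2   # rule outputs: delta-v down / up
    den = slow + fast
    return num / den if den else 0.0   # weighted-average defuzzification
```

The point of the fuzzy formulation is exactly what the abstract describes: operator experience ("if the mushy zone is wide, withdraw more slowly") is encoded directly as rules rather than as an explicit control law.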

  3. On supervised graph Laplacian embedding CA model & kernel construction and its application

    NASA Astrophysics Data System (ADS)

    Zeng, Junwei; Qian, Yongsheng; Wang, Min; Yang, Yongzhong

    2017-01-01

    There are many methods to construct a kernel from given data attribute information, and the Gaussian radial basis function (RBF) kernel is one of the most popular. The key observation is that real-world data carry not only attribute information but also label information indicating the data class. To make use of both, we propose a supervised kernel construction method in this work: supervised information from the training data is integrated into the standard kernel construction process to improve the discriminative property of the resulting kernel. As another key application, a supervised Laplacian embedding cellular automaton model is developed for two-lane heterogeneous traffic flow with safe-distance rules and large-scale trucks. Based on the properties of traffic flow in China, we re-calibrate the cell length, velocity, random slowing mechanism and lane-change conditions, and use simulation tests to study the relationships among speed, density and flux. The numerical results show that large-scale trucks have great effects on the traffic flow, depending on the proportion of large-scale trucks, the random slowing rate and the frequency of lane changes.
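One simple way to inject label information into an RBF kernel is to boost entries for same-class training pairs; the scheme below is a generic illustration of supervised kernel construction, not necessarily the construction used in the paper:

```python
import numpy as np

def supervised_rbf_kernel(X: np.ndarray, y: np.ndarray, gamma: float = 1.0,
                          boost: float = 2.0) -> np.ndarray:
    """Gaussian RBF kernel with label information folded in: entries for
    same-class training pairs are scaled up by `boost`. One simple
    supervised construction; the paper's exact scheme may differ."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq_dists)                       # standard RBF part
    same_class = (y[:, None] == y[None, :]).astype(float)
    return K * (1.0 + (boost - 1.0) * same_class)       # label-aware scaling

X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0, 0, 1])
K = supervised_rbf_kernel(X, y)
```

Scaling same-class similarities upward increases within-class affinity relative to between-class affinity, which is the discriminative property the abstract aims at; the resulting matrix remains symmetric.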

  4. An efficient and stable hydrodynamic model with novel source term discretization schemes for overland flow and flood simulations

    NASA Astrophysics Data System (ADS)

    Xia, Xilin; Liang, Qiuhua; Ming, Xiaodong; Hou, Jingming

    2017-05-01

    Numerical models solving the full 2-D shallow water equations (SWEs) have been increasingly used to simulate overland flows and better understand the transient flow dynamics of flash floods in a catchment. However, key challenges remain unresolved for the development of fully dynamic overland flow models, related to (1) the difficulty of maintaining numerical stability and accuracy in the limit of disappearing water depth and (2) inaccurate estimation of velocities and discharges on slopes as a result of the strong nonlinearity of the friction terms. This paper aims to tackle these key research challenges and presents a new numerical scheme for accurately and efficiently modeling large-scale transient overland flows over complex terrains. The proposed scheme features a novel surface reconstruction method (SRM) to correctly compute slope source terms and maintain numerical stability at small water depth, and a new implicit discretization method to handle the highly nonlinear friction terms. The resulting shallow water overland flow model is first validated against analytical and experimental test cases and then applied to simulate a hypothetical rainfall event in the 42 km² Haltwhistle Burn, UK.

  5. Preserving Lagrangian Structure in Nonlinear Model Reduction with Application to Structural Dynamics

    DOE PAGES

    Carlberg, Kevin; Tuminaro, Ray; Boggs, Paul

    2015-03-11

    Our work proposes a model-reduction methodology that preserves Lagrangian structure and achieves computational efficiency in the presence of high-order nonlinearities and arbitrary parameter dependence. As such, the resulting reduced-order model retains key properties such as energy conservation and symplectic time-evolution maps. We focus on parameterized simple mechanical systems subjected to Rayleigh damping and external forces, and consider an application to nonlinear structural dynamics. To preserve structure, the method first approximates the system's “Lagrangian ingredients”---the Riemannian metric, the potential-energy function, the dissipation function, and the external force---and subsequently derives reduced-order equations of motion by applying the (forced) Euler--Lagrange equation with these quantities. Moreover, from the algebraic perspective, key contributions include two efficient techniques for approximating parameterized reduced matrices while preserving symmetry and positive definiteness: matrix gappy proper orthogonal decomposition and reduced-basis sparsification. Our results for a parameterized truss-structure problem demonstrate the practical importance of preserving Lagrangian structure and illustrate the proposed method's merits: it reduces computation time while maintaining high accuracy and stability, in contrast to existing nonlinear model-reduction techniques that do not preserve structure.

  6. Preserving Lagrangian Structure in Nonlinear Model Reduction with Application to Structural Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlberg, Kevin; Tuminaro, Ray; Boggs, Paul

    Our work proposes a model-reduction methodology that preserves Lagrangian structure and achieves computational efficiency in the presence of high-order nonlinearities and arbitrary parameter dependence. As such, the resulting reduced-order model retains key properties such as energy conservation and symplectic time-evolution maps. We focus on parameterized simple mechanical systems subjected to Rayleigh damping and external forces, and consider an application to nonlinear structural dynamics. To preserve structure, the method first approximates the system's “Lagrangian ingredients”---the Riemannian metric, the potential-energy function, the dissipation function, and the external force---and subsequently derives reduced-order equations of motion by applying the (forced) Euler--Lagrange equation with these quantities. Moreover, from the algebraic perspective, key contributions include two efficient techniques for approximating parameterized reduced matrices while preserving symmetry and positive definiteness: matrix gappy proper orthogonal decomposition and reduced-basis sparsification. Our results for a parameterized truss-structure problem demonstrate the practical importance of preserving Lagrangian structure and illustrate the proposed method's merits: it reduces computation time while maintaining high accuracy and stability, in contrast to existing nonlinear model-reduction techniques that do not preserve structure.

  7. Structured Illumination Microscopy for the Investigation of Synaptic Structure and Function.

    PubMed

    Hong, Soyon; Wilton, Daniel K; Stevens, Beth; Richardson, Douglas S

    2017-01-01

    The neuronal synapse is a primary building block of the nervous system, and alterations in its structure or function can result in numerous pathologies. Studying its formation and elimination is key to understanding how brains are wired during development, maintained through plasticity in adulthood, and disrupted during disease. However, due to its diffraction-limited size, investigations of the synaptic junction at the structural level have primarily relied on labor-intensive electron microscopy or ultra-thin-section array tomography. Recent advances in the field of super-resolution light microscopy now allow researchers to image synapses and associated molecules with high spatial resolution, while taking advantage of the key characteristics of light microscopy, such as easy sample preparation and the ability to detect multiple targets with molecular specificity. One such super-resolution technique, Structured Illumination Microscopy (SIM), has emerged as an attractive method to examine synapse structure and function. SIM requires little change to standard light microscopy sample preparation steps but yields a twofold improvement in both lateral and axial resolution compared to widefield microscopy. The following protocol outlines a method for imaging synaptic structures at resolutions capable of resolving the intricacies of these neuronal connections.

  8. Classification of focal liver lesions on ultrasound images by extracting hybrid textural features and using an artificial neural network.

    PubMed

    Hwang, Yoo Na; Lee, Ju Hwan; Kim, Ga Young; Jiang, Yuan Yuan; Kim, Sung Min

    2015-01-01

    This paper focuses on improving the diagnostic accuracy of focal liver lesions by quantifying the key features of cysts, hemangiomas, and malignant lesions on ultrasound images. The focal liver lesions comprised 29 cysts, 37 hemangiomas, and 33 malignancies. A total of 42 hybrid textural features were extracted, composed of 5 first-order statistics, 18 gray-level co-occurrence matrix features, 18 Laws' texture features, and echogenicity. The 29 key features selected by principal component analysis were used as the set of inputs for a feed-forward neural network. For each lesion, diagnostic performance was evaluated using the positive predictive value, negative predictive value, sensitivity, specificity, and accuracy. The experimental results indicate that the proposed method performs well, with a diagnostic accuracy of over 96% for all focal liver lesion groups (cyst vs. hemangioma, cyst vs. malignant, and hemangioma vs. malignant) on ultrasound images. The accuracy increased slightly when echogenicity was included in the optimal feature set. These results indicate that the proposed method could be applied clinically.

  9. The influence of the dispersion method on the electrical properties of vapor-grown carbon nanofiber/epoxy composites

    PubMed Central

    2011-01-01

    The influence of the dispersion of vapor-grown carbon nanofibers (VGCNF) on the electrical properties of VGCNF/Epoxy composites has been studied. A homogenous dispersion of the VGCNF does not imply better electrical properties. In fact, it is demonstrated that the most simple of the tested dispersion methods results in higher conductivity, since the presence of well-distributed nanofiber clusters appears to be a key factor for increasing composite conductivity. PACS: 72.80.Tm; 73.63.Fg; 81.05.Qk PMID:21711873

  10. Mathematics in modern immunology

    DOE PAGES

    Castro, Mario; Lythe, Grant; Molina-París, Carmen; ...

    2016-02-19

    Mathematical and statistical methods enable multidisciplinary approaches that catalyse discovery. Together with experimental methods, they identify key hypotheses, define measurable observables and reconcile disparate results. Here, we collect a representative sample of studies in T-cell biology that illustrate the benefits of modelling–experimental collaborations and that have proven valuable or even groundbreaking. Furthermore, we conclude that it is possible to find excellent examples of synergy between mathematical modelling and experiment in immunology, which have brought significant insight that would not be available without these collaborations, but that much remains to be discovered.

  11. African Primary Care Research: Qualitative data analysis and writing results

    PubMed Central

    Govender, Indiran; Ogunbanjo, Gboyega A.; Mash, Bob

    2014-01-01

    Abstract This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given. PMID:26245437

  12. Mathematics in modern immunology

    PubMed Central

    Castro, Mario; Lythe, Grant; Molina-París, Carmen; Ribeiro, Ruy M.

    2016-01-01

    Mathematical and statistical methods enable multidisciplinary approaches that catalyse discovery. Together with experimental methods, they identify key hypotheses, define measurable observables and reconcile disparate results. We collect a representative sample of studies in T-cell biology that illustrate the benefits of modelling–experimental collaborations and that have proven valuable or even groundbreaking. We conclude that it is possible to find excellent examples of synergy between mathematical modelling and experiment in immunology, which have brought significant insight that would not be available without these collaborations, but that much remains to be discovered. PMID:27051512

  13. Cross Correlation versus Normalized Mutual Information on Image Registration

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Tilton, James C.; Lin, Guoqing

    2016-01-01

    This is the first study to quantitatively assess and compare the cross correlation and normalized mutual information methods used to register images at subpixel scale. The study shows that the normalized mutual information method is less sensitive than cross correlation to unaligned edges caused by spectral response differences. This characteristic makes normalized mutual information a better candidate for band-to-band registration. Improved band-to-band registration of data from satellite-borne instruments will result in improved retrievals of key science measurements such as cloud properties, vegetation, snow, and fire.
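Normalized mutual information between two images can be computed from their joint grey-level histogram as NMI = (H(A) + H(B)) / H(A, B). A compact sketch (the bin count and other implementation details are illustrative choices):

```python
import numpy as np

def normalized_mutual_information(a: np.ndarray, b: np.ndarray,
                                  bins: int = 16) -> float:
    """NMI(A, B) = (H(A) + H(B)) / H(A, B) from the joint grey-level
    histogram; identical, perfectly aligned images give NMI = 2."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    h_xy = -(pxy[nz] * np.log(pxy[nz])).sum()      # joint entropy
    h_x = -(px[px > 0] * np.log(px[px > 0])).sum() # marginal entropies
    h_y = -(py[py > 0] * np.log(py[py > 0])).sum()
    return (h_x + h_y) / h_xy
```

Because NMI depends only on the joint intensity distribution, it tolerates the spectral response differences between bands that bias intensity-based cross correlation; a registration search maximizes NMI over candidate shifts.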

  14. Ribosomal RNA: a key to phylogeny

    NASA Technical Reports Server (NTRS)

    Olsen, G. J.; Woese, C. R.

    1993-01-01

    As molecular phylogeny increasingly shapes our understanding of organismal relationships, no molecule has been applied to more questions than have ribosomal RNAs. We review this role of the rRNAs and some of the insights that have been gained from them. We also offer some of the practical considerations in extracting the phylogenetic information from the sequences. Finally, we stress the importance of comparing results from multiple molecules, both as a method for testing the overall reliability of the organismal phylogeny and as a method for more broadly exploring the history of the genome.

  15. African Primary Care Research: qualitative data analysis and writing results.

    PubMed

    Mabuza, Langalibalele H; Govender, Indiran; Ogunbanjo, Gboyega A; Mash, Bob

    2014-06-05

    This article is part of a series on African primary care research and gives practical guidance on qualitative data analysis and the presentation of qualitative findings. After an overview of qualitative methods and analytical approaches, the article focuses particularly on content analysis, using the framework method as an example. The steps of familiarisation, creating a thematic index, indexing, charting, interpretation and confirmation are described. Key concepts with regard to establishing the quality and trustworthiness of data analysis are described. Finally, an approach to the presentation of qualitative findings is given.

  16. Attenuation correction with region growing method used in the positron emission mammography imaging system

    NASA Astrophysics Data System (ADS)

    Gu, Xiao-Yue; Li, Lin; Yin, Peng-Fei; Yun, Ming-Kai; Chai, Pei; Huang, Xian-Chao; Sun, Xiao-Li; Wei, Long

    2015-10-01

    The Positron Emission Mammography imaging system (PEMi) provides a novel nuclear diagnosis method dedicated to breast imaging. With a better resolution than whole-body PET, PEMi can detect millimeter-sized breast tumors. To address the requirement of semi-quantitative analysis with a radiotracer concentration map of the breast, a new attenuation correction method based on a three-dimensional seeded region growing image segmentation (3DSRG-AC) method has been developed. The method gives a 3D connected region as the segmentation result instead of image slices. The continuity of the segmented region makes the new method insensitive to activity variation across breast tissues. Choosing the threshold value is the key step in the segmentation method. The first valley in the grey-level histogram of the reconstructed image is set as the lower threshold, which works well in clinical application. Results show that attenuation correction for PEMi improves the image quality and the quantitative accuracy of radioactivity distribution determination. Attenuation correction also improves the probability of detecting small and early breast tumors. Supported by Knowledge Innovation Project of The Chinese Academy of Sciences (KJCX2-EW-N06)
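    The two steps described above, picking the first histogram valley as a lower threshold and growing one connected 3D region from a seed, can be sketched as follows. This is a toy phantom with a noise-free bright sphere, not PEMi reconstruction data; the grid size, seed, and bin count are illustrative assumptions.

```python
import numpy as np
from collections import deque

def first_valley_threshold(image, bins=64):
    """Lower threshold = first local minimum of the grey-level histogram."""
    hist, edges = np.histogram(image, bins=bins)
    for i in range(1, bins - 1):
        if hist[i] < hist[i - 1] and hist[i] <= hist[i + 1]:
            return edges[i]
    return edges[1]   # fallback if no interior valley is found

def seeded_region_growing(image, seed, threshold):
    """Grow a single connected region from `seed`, keeping voxels >= threshold."""
    region = np.zeros(image.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        idx = queue.popleft()
        if region[idx] or image[idx] < threshold:
            continue
        region[idx] = True
        for axis in range(image.ndim):        # 6-connected (face) neighbours
            for step in (-1, 1):
                nb = list(idx)
                nb[axis] += step
                if 0 <= nb[axis] < image.shape[axis]:
                    queue.append(tuple(nb))
    return region

# Synthetic phantom: a bright sphere (radius 10) in a dark 32^3 volume.
z, y, x = np.mgrid[:32, :32, :32]
phantom = ((z - 16) ** 2 + (y - 16) ** 2 + (x - 16) ** 2 < 10 ** 2).astype(float)

t = first_valley_threshold(phantom)
mask = seeded_region_growing(phantom, (16, 16, 16), t)
print(mask.sum())   # number of voxels in the single connected region
```

    On a real reconstruction the histogram is noisy, so the valley search would typically run on a smoothed histogram; the output here is one connected boolean mask, matching the "3D connected region instead of image slices" property the abstract emphasises.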

  17. Extending the solvent-free MALDI sample preparation method.

    PubMed

    Hanton, Scott D; Parees, David M

    2005-01-01

    Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is an important technique for characterizing many different materials, including synthetic polymers. MALDI mass spectral data can be used to determine the polymer average molecular weights, repeat units, and end groups. One of the key issues in traditional MALDI sample preparation is making good solutions of the analyte and the matrix. Solvent-free sample preparation methods have been developed to address these issues. Previous results of solvent-free or dry prepared samples show some advantages over traditional wet sample preparation methods. Although the published solvent-free sample preparation methods produced excellent mass spectra, we found the method to be very time-consuming, with significant tool cleaning, which presents a significant possibility of cross-contamination. To address these issues, we developed an extension of the solvent-free method that replaces mortar-and-pestle grinding with ball milling the sample in a glass vial with two small steel balls. This new method generates mass spectra of equal quality to the previous methods, but offers significant productivity advantages, eliminates cross-contamination, and is applicable to liquid and soft or waxy analytes.

  18. The efficiency and effectiveness of utilizing diagrams in interviews: an assessment of participatory diagramming and graphic elicitation

    PubMed Central

    Umoquit, Muriah J; Dobrow, Mark J; Lemieux-Charles, Louise; Ritvo, Paul G; Urbach, David R; Wodchis, Walter P

    2008-01-01

    Background This paper focuses on measuring the efficiency and effectiveness of two diagramming methods employed in key informant interviews with clinicians and health care administrators. The two methods are 'participatory diagramming', where the respondent creates a diagram that assists in their communication of answers, and 'graphic elicitation', where a researcher-prepared diagram is used to stimulate data collection. Methods These two diagramming methods were applied in key informant interviews and their value in efficiently and effectively gathering data was assessed based on quantitative measures and qualitative observations. Results Assessment of the two diagramming methods suggests that participatory diagramming is an efficient method for collecting data in graphic form, but may not generate the depth of verbal response that many qualitative researchers seek. In contrast, graphic elicitation was more intuitive, better understood and preferred by most respondents, and often provided more contemplative verbal responses; however, this was achieved at the expense of additional interview time. Conclusion Diagramming methods are important for eliciting interview data that are often difficult to obtain through traditional verbal exchanges. Subject to the methodological limitations of the study, our findings suggest that while participatory diagramming and graphic elicitation have specific strengths and weaknesses, their combined use can provide complementary information that would not likely emerge with the application of only one diagramming method. The methodological insights gained by examining the efficiency and effectiveness of these diagramming methods in our study should be helpful to other researchers considering their incorporation into qualitative research designs. PMID:18691410

  19. Demodulation of acoustic telemetry binary phase shift keying signal based on high-order Duffing system

    NASA Astrophysics Data System (ADS)

    Yan, Bing-Nan; Liu, Chong-Xin; Ni, Jun-Kang; Zhao, Liang

    2016-10-01

    In order to grasp the downhole situation immediately, logging while drilling (LWD) technology is adopted. One of the LWD technologies, called acoustic telemetry, can be successfully applied to modern drilling. It is critical for acoustic telemetry technology that the signal is successfully transmitted to the ground. In this paper, binary phase shift keying (BPSK) is used to modulate carrier waves for the transmission, and a new BPSK demodulation scheme based on Duffing chaos is investigated. Firstly, a high-order system is given in order to enhance the signal detection capability, and it is realized by building a virtual circuit using an electronic workbench (EWB). Secondly, a new BPSK demodulation scheme is proposed based on the intermittent chaos phenomena of the new Duffing system. Finally, a system variable crossing zero-point equidistance method is proposed to obtain the phase difference between the system and the BPSK signal. The phase difference then determines whether the digital signal transmitted from the bottom of the well is '0' or '1'. The simulation results show that the demodulation method is feasible. Project supported by the National Natural Science Foundation of China (Grant No. 51177117) and the National Key Science & Technology Special Projects, China (Grant No. 2011ZX05021-005).
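    The underlying BPSK principle, a '1' shifts the carrier phase by pi and the receiver decides each bit from the phase difference against a reference carrier, can be sketched with simple coherent detection. This is an illustrative toy, not the paper's Duffing-based intermittent-chaos detector; the carrier frequency, sampling rate, and bit timing are assumed values.

```python
import numpy as np

FC = 1_000.0    # carrier frequency, Hz (illustrative)
FS = 50_000.0   # sampling rate, Hz
SPB = 500       # samples per bit = 10 carrier cycles per bit

def bpsk_modulate(bits):
    """Shift the carrier phase by pi for a '1'; leave it unchanged for a '0'."""
    t = np.arange(len(bits) * SPB) / FS
    phase = np.repeat(np.pi * np.asarray(bits), SPB)
    return np.cos(2 * np.pi * FC * t + phase)

def bpsk_demodulate(signal):
    """Coherent detection: correlate each bit slot with a reference carrier.

    A phase difference of pi flips the sign of the correlation, so a
    negative correlation decodes as bit '1'.
    """
    t = np.arange(len(signal)) / FS
    ref = np.cos(2 * np.pi * FC * t)
    corr = (signal * ref).reshape(-1, SPB).sum(axis=1)
    return [int(c < 0) for c in corr]

bits = [0, 1, 1, 0, 1, 0, 0, 1]
noise = 0.2 * np.random.default_rng(2).standard_normal(len(bits) * SPB)
rx = bpsk_modulate(bits) + noise
print(bpsk_demodulate(rx))   # recovers the transmitted bit pattern
```

    The Duffing approach replaces the correlator with a chaotic oscillator whose transition between chaotic and periodic states reveals the phase difference, which is more robust at the very low signal-to-noise ratios of the drill-string channel.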

  20. Determinants of sustainability in solid waste management--the Gianyar Waste Recovery Project in Indonesia.

    PubMed

    Zurbrügg, Christian; Gfrerer, Margareth; Ashadi, Henki; Brenner, Werner; Küper, David

    2012-11-01

    According to most experts, integrated and sustainable solid waste management should not only be given top priority, but must go beyond technical aspects to include various key elements of sustainability to ensure the success of any solid waste project. Aside from a project's sustainable impacts, the overall enabling environment is the key feature determining the performance and success of an integrated and affordable solid waste system. This paper describes a project-specific approach to assess typical success or failure factors. A questionnaire-based assessment method covers issues of: (i) social mobilisation and acceptance (social element); (ii) stakeholder, legal and institutional arrangements comprising roles, responsibilities and management functions (institutional element); (iii) financial and operational requirements, as well as cost recovery mechanisms (economic element). The Gianyar Waste Recovery Project in Bali, Indonesia was analysed using this integrated assessment method. The results clearly identified chief characteristics and key factors to consider when planning country-wide replication, but also major barriers and obstacles which must be overcome to ensure project sustainability. The Gianyar project consists of a composting unit processing 60 tons of municipal waste per day from 500,000 inhabitants, including manual waste segregation and subsequent composting of the biodegradable organic fraction. Copyright © 2012 Elsevier Ltd. All rights reserved.
