Sample records for digital signature algorithm

  1. Implementation of Digital Signature Using Aes and Rsa Algorithms as a Security in Disposition System of Letter

    NASA Astrophysics Data System (ADS)

    Siregar, H.; Junaeti, E.; Hayatno, T.

    2017-03-01

    Correspondence is frequently used by agencies and companies, so institutions set up special divisions to handle issues related to letter management. Because most letters are distributed through electronic media, they should be kept confidential to avoid undesirable outcomes. Security can be provided by using cryptography or by applying a digital signature. In this study, asymmetric and symmetric algorithms, namely RSA and AES, were added to the digital signature process to maintain data security. The RSA algorithm was used to generate the digital signature, while the AES algorithm was used to encrypt the message sent to the receiver. Based on the research, it can be concluded that adding the AES and RSA algorithms to the digital signature meets four objectives of cryptography: secrecy, data integrity, authentication, and non-repudiation.
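    The hybrid construction this record describes (sign with RSA, then protect the transported message with AES) maps onto a few lines of Python. The sketch below uses the third-party cryptography package; the key sizes, PSS padding, and AES-GCM mode are illustrative assumptions, not details taken from the paper.

    ```python
    # Hedged sketch of the RSA-sign / AES-encrypt hybrid described above.
    # Key sizes, PSS padding, and AES-GCM mode are illustrative assumptions.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    signer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    letter = b"Disposition: forward to division head."

    # 1. Sign the letter with the sender's RSA private key.
    signature = signer_key.sign(
        letter,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )

    # 2. Encrypt letter + signature with a shared AES key for transport.
    aes_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, letter + signature, None)

    # Receiver: decrypt, split, then verify with the sender's public key.
    plain = AESGCM(aes_key).decrypt(nonce, ciphertext, None)
    recovered, sig = plain[:-256], plain[-256:]  # 2048-bit RSA -> 256-byte signature
    signer_key.public_key().verify(
        sig,
        recovered,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    ```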

  2. A comparative study of Message Digest 5 (MD5) and SHA256 algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Tarigan, J. T.; Ginting, A. B. C.

    2018-03-01

    A document is a collection of written or printed data containing information. As technology advances ever more rapidly, the integrity of documents must be maintained. Because a document is open in nature, its contents can be read and modified by many parties, so the integrity of the information it contains is not preserved by default. To maintain data integrity, a mechanism called a digital signature is needed. A digital signature is a specific code generated by a signature-producing function. One class of algorithms used to create digital signatures is hash functions, of which there are many; two of them are Message Digest 5 (MD5) and SHA256. Each algorithm has its own advantages and disadvantages. The purpose of this research is to determine which algorithm is better, using running time and complexity as the comparison parameters. The results show that MD5 and SHA256 have the same complexity, Θ(N), but in terms of speed MD5 performs better than SHA256.
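    A minimal way to reproduce the running-time side of this comparison (though not the paper's exact benchmark) is to time Python's hashlib implementations of both algorithms on the same input; both run in time linear in the input size, matching the Θ(N) complexity reported. The input size and the iteration count below are arbitrary choices.

    ```python
    # Hedged sketch of the MD5-vs-SHA256 running-time comparison;
    # the 16 MiB input is an arbitrary stand-in for a document.
    import hashlib
    import os
    import time

    data = os.urandom(16 * 1024 * 1024)  # 16 MiB test document

    for name in ("md5", "sha256"):
        start = time.perf_counter()
        digest = hashlib.new(name, data).hexdigest()
        elapsed = time.perf_counter() - start
        print(f"{name:>6}: {elapsed * 1000:.1f} ms  digest={digest[:16]}...")
    ```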

  3. Photonic quantum digital signatures operating over kilometer ranges in installed optical fiber

    NASA Astrophysics Data System (ADS)

    Collins, Robert J.; Fujiwara, Mikio; Amiri, Ryan; Honjo, Toshimori; Shimizu, Kaoru; Tamaki, Kiyoshi; Takeoka, Masahiro; Andersson, Erika; Buller, Gerald S.; Sasaki, Masahide

    2016-10-01

    The security of electronic communications is a topic that has gained noteworthy public interest in recent years. As a result, there is an increasing public recognition of the existence and importance of mathematically based approaches to digital security. Many of these implement digital signatures to ensure that a malicious party has not tampered with the message in transit, that a legitimate receiver can validate the identity of the signer and that messages are transferable. The security of most digital signature schemes relies on the assumed computational difficulty of solving certain mathematical problems. However, reports in the media have shown that certain implementations of such signature schemes are vulnerable to algorithmic breakthroughs and emerging quantum processing technologies. Indeed, even without quantum processors, the possibility remains that classical algorithmic breakthroughs will render these schemes insecure. There is ongoing research into information-theoretically secure signature schemes, where the security is guaranteed against an attacker with arbitrary computational resources. One such approach is quantum digital signatures. Quantum signature schemes can be made information-theoretically secure based on the laws of quantum mechanics while comparable classical protocols require additional resources such as anonymous broadcast and/or a trusted authority. Previously, most early demonstrations of quantum digital signatures required dedicated single-purpose hardware and operated over restricted ranges in a laboratory environment. Here, for the first time, we present a demonstration of quantum digital signatures conducted over several kilometers of installed optical fiber. The system reported here operates at a higher signature generation rate than previous fiber systems.

  4. Signature Verification Using N-tuple Learning Machine.

    PubMed

    Maneechot, Thanin; Kitjaidure, Yuttana

    2005-01-01

    This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures captured on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely horizontal and vertical pen-tip position (x-y position), pen-tip pressure, and pen altitude angle. Verification uses the N-tuple technique with Gaussian thresholding.

  5. Digital camera with apparatus for authentication of images produced from an image file

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1993-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even a one-bit change in the image file will cause its computed hash to be totally different from the secure image hash.

  6. Digital Camera with Apparatus for Authentication of Images Produced from an Image File

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1996-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.
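    The sign-then-verify flow these two patent records describe (hash the image file, encrypt the hash with the camera's private key, later decrypt with the public key and compare against a freshly computed hash) corresponds directly to a modern signature API. A sketch, with RSA-2048 and PKCS#1 v1.5 as assumptions standing in for the camera's embedded scheme:

    ```python
    # Sketch of the patents' authentication flow; RSA-2048 with PKCS#1 v1.5
    # is an assumption standing in for the camera's embedded scheme.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    camera_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    image_file = b"\x89PNG...raw image bytes..."  # stand-in for the stored image file

    # In the camera: hash the image file and encrypt the hash with the private key.
    digital_signature = camera_key.sign(image_file, padding.PKCS1v15(), hashes.SHA256())

    # Later, anywhere: recompute the hash and check it against the decrypted signature.
    try:
        camera_key.public_key().verify(
            digital_signature, image_file, padding.PKCS1v15(), hashes.SHA256())
        print("image file is authentic")
    except InvalidSignature:
        print("image file was altered")  # even a one-bit change fails verification
    ```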

  7. 76 FR 11433 - Federal Transition To Secure Hash Algorithm (SHA)-256

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... generating digital signatures. Current information systems, Web servers, applications and workstation operating systems were designed to process and use SHA-1-generated signatures. National Institute of... cryptographic keys, and more robust algorithms by December 2013. Government systems may begin to encounter...

  8. Internet Protocol Security (IPSEC): Testing and Implications on IPv4 and IPv6 Networks

    DTIC Science & Technology

    2008-08-27

    Message Authentication Code-Message Digest 5-96). Due to the processing power consumption and slowness of public key authentication methods, RSA ...MODP) group with a 768-bit modulus 2. a MODP group with a 1024-bit modulus 3. an Elliptic Curve Group over GF[2^n] (EC2N) group with a 155-bit...nonces, digital signatures using the Digital Signature Algorithm, and the Rivest-Shamir-Adleman (RSA) algorithm. For more information about the

  9. Evaluation of security algorithms used for security processing on DICOM images

    NASA Astrophysics Data System (ADS)

    Chen, Xiaomeng; Shuai, Jie; Zhang, Jianguo; Huang, H. K.

    2005-04-01

    In this paper, we developed a security approach to provide security measures and features for PACS image acquisition and teleradiology image transmission. The security processing of medical images was based on a public key infrastructure (PKI) and included digital signatures and data encryption to achieve confidentiality, privacy, authenticity, integrity, and non-repudiation. Many algorithms can be used in a PKI for data encryption and digital signatures. In this research, we selected several algorithms to perform security processing on different DICOM images in a PACS environment, evaluated the security processing performance of these algorithms, and examined the relationship between performance and image type, image size, and implementation method.
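    In the same spirit as this evaluation (though not the authors' actual test harness), signing times for different PKI algorithms can be compared on a dummy image buffer. The algorithms, key sizes, and buffer size below are illustrative assumptions, and all DICOM handling is omitted.

    ```python
    # Hedged benchmark sketch comparing signing algorithms on a dummy image
    # buffer; not the paper's harness, and DICOM parsing is omitted.
    import os
    import time
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec, padding, rsa

    image = os.urandom(512 * 1024)  # stand-in for DICOM pixel data

    rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    ec_key = ec.generate_private_key(ec.SECP256R1())

    def bench(label, sign, runs=20):
        start = time.perf_counter()
        for _ in range(runs):
            sign()
        print(f"{label}: {(time.perf_counter() - start) / runs * 1e3:.2f} ms/signature")

    bench("RSA-2048/SHA-256  ", lambda: rsa_key.sign(image, padding.PKCS1v15(), hashes.SHA256()))
    bench("ECDSA-P256/SHA-256", lambda: ec_key.sign(image, ec.ECDSA(hashes.SHA256())))
    ```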

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    AISL-CRYPTO is a library of cryptography functions supporting other AISL software. It provides various crypto functions for Common Lisp, including Digital Signature Algorithm, Data Encryption Standard, Secure Hash Algorithm, and public-key cryptography.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hymel, Ross

    The Public Key (PK) FPGA software performs asymmetric authentication using the 163-bit Elliptic Curve Digital Signature Algorithm (ECDSA) on an embedded FPGA platform. A digital signature is created on user-supplied data, and communication with a host system is performed via a Serial Peripheral Interface (SPI) bus. Software includes all components necessary for signing, including custom random number generator for key creation and SHA-256 for data hashing.
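    The sign-and-verify cycle this FPGA software performs can be sketched with a software ECDSA. Note two substitutions relative to the record: the NIST P-256 prime curve replaces the 163-bit binary curve, and the library's default RNG replaces the custom generator; SHA-256 matches the record.

    ```python
    # Software sketch of the ECDSA sign/verify cycle; P-256 is substituted
    # for the record's 163-bit binary curve, SHA-256 matches the record.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    device_key = ec.generate_private_key(ec.SECP256R1())

    user_data = b"payload received over the SPI bus"
    signature = device_key.sign(user_data, ec.ECDSA(hashes.SHA256()))

    # The host verifies with the device's public key; a bad signature
    # raises cryptography.exceptions.InvalidSignature.
    device_key.public_key().verify(signature, user_data, ec.ECDSA(hashes.SHA256()))
    ```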

  12. Formal Methods for Cryptographic Protocol Analysis: Emerging Issues and Trends

    DTIC Science & Technology

    2003-01-01

    signatures, which depend upon the homomorphic properties of RSA. Other algorithms and data structures, such as Chaum mixes [17], designed for...Communications Security, pages 176–185. ACM, November 2001. [17] D. Chaum. Untraceable electronic mail, return addresses and digital signatures ...something like the Diffie-Hellman algorithm, which depends, as a minimum, on the commutative properties of exponentiation, or something like Chaum’s blinded

  13. Design Time Optimization for Hardware Watermarking Protection of HDL Designs

    PubMed Central

    Castillo, E.; Morales, D. P.; García, A.; Parrilla, L.; Todorovich, E.; Meyer-Baese, U.

    2015-01-01

    HDL-level design offers important advantages for the application of watermarking to IP cores, but its complexity also requires tools automating these watermarking algorithms. A new tool for signature distribution through combinational logic is proposed in this work. IPP@HDL, a previously proposed high-level watermarking technique, has been employed for evaluating the tool. IPP@HDL relies on spreading the bits of a digital signature at the HDL design level using combinational logic included within the original system. The development of this new tool for the signature distribution has not only extended and eased the applicability of this IPP technique, but it has also improved the signature hosting process itself. Three algorithms were studied in order to develop this automated tool. The selection of a cost function determines the best hosting solutions in terms of area and performance penalties on the IP core to protect. A 1D-DWT core and MD5 and SHA1 digital signatures were used in order to illustrate the benefits of the new tool and its optimization related to the extraction logic resources. Among the proposed algorithms, the alternative based on simulated annealing reduces the additional resources while maintaining an acceptable computation time and also saving designer effort and time. PMID:25861681

  14. A high speed implementation of the random decrement algorithm

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.

    1982-01-01

    The algorithm is useful for measuring net system damping levels in stochastic processes and for developing equivalent linearized system response models. The algorithm works by summing together all subrecords which occur after a predefined threshold level is crossed. The random decrement signature is normally developed by scanning stored data and adding subrecords together. The high speed implementation of the random decrement algorithm exploits the digital character of sampled data and uses fixed record lengths of 2^n samples to greatly speed up the process. The contribution of each data point to the random decrement signature is calculated only once and in the same sequence as the data were taken. A hardware implementation of the algorithm using random logic is diagrammed, and the process is shown to be limited only by the record size and the threshold crossing frequency of the sampled data. With a hardware cycle time of 200 ns and a 1024-point signature, a threshold crossing frequency of 5000 Hz can be processed and a stably averaged signature presented in real time.
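    The core of the algorithm, averaging fixed-length subrecords that start at each threshold crossing, is compact enough to sketch in NumPy. This is a software analogue of the hardware described; the synthetic test signal and parameter values are assumptions.

    ```python
    # Software sketch of the random decrement signature; the hardware
    # version computes the same sums incrementally as samples arrive.
    import numpy as np

    def random_decrement(x, threshold, length=1024):
        """Average all length-sample subrecords starting at upward
        threshold crossings of the sampled signal x."""
        x = np.asarray(x, dtype=float)
        # indices where the signal crosses the threshold going upward
        starts = np.where((x[:-1] < threshold) & (x[1:] >= threshold))[0] + 1
        starts = starts[starts + length <= x.size]
        if starts.size == 0:
            raise ValueError("no threshold crossings in record")
        return np.mean([x[s:s + length] for s in starts], axis=0)

    # Example: an oscillation buried in noise yields an averaged
    # signature from which the damping level can be read off.
    t = np.arange(50_000) / 5_000.0
    signal = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.randn(t.size)
    signature = random_decrement(signal, threshold=1.0)
    ```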

  15. Research on key technologies for data-interoperability-based metadata, data compression and encryption, and their application

    NASA Astrophysics Data System (ADS)

    Yu, Xu; Shao, Quanqin; Zhu, Yunhai; Deng, Yuejin; Yang, Haijun

    2006-10-01

    With the development of informatization and the separation between data management departments and application departments, spatial data sharing has become one of the most important objectives of spatial information infrastructure construction, and spatial metadata management systems, data transmission security, and data compression are the key technologies for realizing it. This paper discusses the key technologies for metadata based on data interoperability; investigates data compression algorithms such as the adaptive Huffman, LZ77, and LZ78 algorithms; and studies the application of digital signature techniques to spatial data, which can not only identify the transmitter of spatial data but also promptly detect whether the spatial data have been tampered with during network transmission. Based on an analysis of the symmetric encryption algorithms 3DES and AES and the asymmetric encryption algorithm RSA, combined with a hash algorithm, an improved hybrid encryption method for spatial data is presented. Digital signature technology and digital watermarking technology are also discussed. A new solution for spatial data network distribution is then put forward, which adopts a three-layer architecture. Based on this framework, we present a spatial data network distribution system that is efficient and safe, and we demonstrate the feasibility and validity of the proposed solution.

  16. Offline Signature Verification Using the Discrete Radon Transform and a Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Coetzer, J.; Herbst, B. M.; du Preez, J. A.

    2004-12-01

    We developed a system that automatically authenticates offline handwritten signatures using the discrete Radon transform (DRT) and a hidden Markov model (HMM). Given the robustness of our algorithm and the fact that only global features are considered, satisfactory results are obtained. Using a database of 924 signatures from 22 writers, our system achieves an equal error rate (EER) of 18% when only high-quality forgeries (skilled forgeries) are considered and an EER of 4.5% in the case of only casual forgeries. These signatures were originally captured offline. Using another database of 4800 signatures from 51 writers, our system achieves an EER of 12.2% when only skilled forgeries are considered. These signatures were originally captured online and then digitally converted into static signature images. These results compare well with the results of other algorithms that consider only global features.
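    The equal error rate quoted here is the operating point where false acceptance of forgeries equals false rejection of genuine signatures. A small sketch of how it is typically computed from verifier scores follows; the score arrays are hypothetical, not the authors' data.

    ```python
    # Hedged sketch of equal-error-rate computation from verifier scores;
    # the score distributions below are hypothetical stand-ins.
    import numpy as np

    def equal_error_rate(genuine_scores, forgery_scores):
        """Scan thresholds; return the point where FAR ~= FRR."""
        thresholds = np.sort(np.concatenate([genuine_scores, forgery_scores]))
        best_gap, eer = np.inf, None
        for t in thresholds:
            far = np.mean(forgery_scores >= t)  # forgeries accepted
            frr = np.mean(genuine_scores < t)   # genuine rejected
            if abs(far - frr) < best_gap:
                best_gap, eer = abs(far - frr), (far + frr) / 2
        return eer

    genuine = np.random.normal(0.8, 0.1, 500)  # hypothetical HMM scores
    skilled = np.random.normal(0.6, 0.1, 500)
    print(f"EER ~ {equal_error_rate(genuine, skilled):.3f}")
    ```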

  17. Novel algorithm to identify and differentiate specific digital signature of breath sound in patients with diffuse parenchymal lung disease.

    PubMed

    Bhattacharyya, Parthasarathi; Mondal, Ashok; Dey, Rana; Saha, Dipanjan; Saha, Goutam

    2015-05-01

    Auscultation is an important part of the clinical examination in different lung diseases. Objective analysis of lung sounds based on their underlying characteristics, and subsequent automatic interpretation, may help clinical practice. We collected breath sounds from 8 normal subjects and 20 diffuse parenchymal lung disease (DPLD) patients using a newly developed instrument and then filtered off the heart sounds using a novel technology. The collected sounds were then analysed digitally for several characteristics, such as dynamical complexity, texture information, and regularity index, to find and define unique digital signatures differentiating normality from abnormality. For convenience of testing, these characteristic signatures of normal and DPLD lung sounds were transformed into coloured visual representations. The predictive power of these images was validated by six independent observers, including three physicians. The proposed method gives a classification accuracy of 100% on composite features for both normal and DPLD lung sound signals. When tested by independent observers on the visually transformed images, the positive predictive value for diagnosing normality and DPLD remained 100%. Lung sounds from normal and DPLD subjects could thus be differentiated and expressed according to their digital signatures. On visual transformation to coloured images, they retain 100% predictive power. This technique may assist physicians in diagnosing DPLD from visual images bearing the digital signature of the condition. © 2015 Asian Pacific Society of Respirology.

  18. Limitations and requirements of content-based multimedia authentication systems

    NASA Astrophysics Data System (ADS)

    Wu, Chai W.

    2001-08-01

    Recently, a number of authentication schemes have been proposed for multimedia data such as images and sound data. They include both label based systems and semifragile watermarks. The main requirement for such authentication systems is that minor modifications such as lossy compression which do not alter the content of the data preserve the authenticity of the data, whereas modifications which do modify the content render the data not authentic. These schemes can be classified into two main classes depending on the model of image authentication they are based on. One of the purposes of this paper is to look at some of the advantages and disadvantages of these image authentication schemes and their relationship with fundamental limitations of the underlying model of image authentication. In particular, we study feature-based algorithms which generate an authentication tag based on some inherent features in the image such as the location of edges. The main disadvantage of most proposed feature-based algorithms is that similar images generate similar features, and therefore it is possible for a forger to generate dissimilar images that have the same features. On the other hand, the class of hash-based algorithms utilizes a cryptographic hash function or a digital signature scheme to reduce the data and generate an authentication tag. It inherits the security of digital signatures to thwart forgery attacks. The main disadvantage of hash-based algorithms is that the image needs to be modified in order to be made authenticatable. The amount of modification is on the order of the noise the image can tolerate before it is rendered inauthentic. The other purpose of this paper is to propose a multimedia authentication scheme which combines some of the best features of both classes of algorithms. The proposed scheme utilizes cryptographic hash functions and digital signature schemes and the data does not need to be modified in order to be made authenticatable. Several applications including the authentication of images on CD-ROM and handwritten documents will be discussed.

  19. Terrain type recognition using ERTS-1 MSS images

    NASA Technical Reports Server (NTRS)

    Gramenopoulos, N.

    1973-01-01

    For the automatic recognition of earth resources from ERTS-1 digital tapes, both multispectral and spatial pattern recognition techniques are important. Recognition of terrain types is based on spatial signatures that become evident by processing small portions of an image through selected algorithms. An investigation of spatial signatures that are applicable to ERTS-1 MSS images is described. Artifacts in the spatial signatures seem to be related to the multispectral scanner. A method for suppressing such artifacts is presented. Finally, results of terrain type recognition for one ERTS-1 image are presented.

  20. The application of data encryption technology in computer network communication security

    NASA Astrophysics Data System (ADS)

    Gong, Lina; Zhang, Li; Zhang, Wei; Li, Xuhong; Wang, Xia; Pan, Wenwen

    2017-04-01

    With the rapid development of the Internet and the extensive application of computer technology, information security has become an increasingly serious concern, and information security technology with data encryption at its core has developed greatly. Data encryption technology can not only encrypt and decrypt data but also realize digital signatures, authentication, and other functions, thus ensuring the confidentiality, integrity, and confirmation of data transmitted over the network. To improve the security of data in network communication, this paper uses a hybrid encryption system: data are encrypted and decrypted with the high-security triple DES algorithm, and the two keys are encrypted with the RSA algorithm, thus ensuring the security of the triple DES keys and solving the problem of key management. At the same time, digital signatures are realized using Java security software to ensure data integrity and non-repudiation. Finally, the data encryption system is developed in the Java language. The system is simple and effective, with good security and practicality.
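    The paper's hybrid design, bulk data under triple DES with the DES keys protected by RSA, is the standard key-wrapping pattern. A sketch follows, in Python rather than the paper's Java; the CBC mode, OAEP padding, and key sizes are illustrative assumptions.

    ```python
    # Sketch of the 3DES + RSA hybrid (Python instead of the paper's Java);
    # CBC mode, OAEP padding, and key sizes are illustrative assumptions.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Bulk-encrypt the message with triple DES (8-byte blocks, so pad to fit).
    des3_key, iv = os.urandom(24), os.urandom(8)
    message = b"communication payload padded to eight!!!"  # 40 bytes = 5 blocks
    encryptor = Cipher(algorithms.TripleDES(des3_key), modes.CBC(iv)).encryptor()
    ciphertext = encryptor.update(message) + encryptor.finalize()

    # Protect the 3DES key itself with the receiver's RSA public key.
    wrapped_key = receiver_key.public_key().encrypt(
        des3_key,
        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    # The receiver unwraps with the RSA private key, then decrypts with 3DES.
    ```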

  1. Global lightning studies

    NASA Technical Reports Server (NTRS)

    Goodman, Steven J.; Wright, Pat; Christian, Hugh; Blakeslee, Richard; Buechler, Dennis; Scharfen, Greg

    1991-01-01

    Global lightning signatures were analyzed from DMSP Optical Linescan System (OLS) imagery archived at the National Snow and Ice Data Center. A transition to analysis of the digital archive will be made as it becomes available, allowing annual, interannual, and seasonal variations to be compared with other global data sets. An initial survey of the quality of the existing film archive was completed, and lightning signatures were digitized for the summer months of 1986 to 1987. The relationship is studied between: (1) global and regional lightning activity and rainfall, and (2) storm electrical development and environment. Remote sensing data sets obtained from field programs are used in conjunction with satellite/radar/lightning data to develop and improve precipitation estimation algorithms, and to provide a better understanding of the co-evolving electrical, microphysical, and dynamical structure of storms.

  2. SHAMROCK: A Synthesizable High Assurance Cryptography and Key Management Coprocessor

    DTIC Science & Technology

    2016-11-01

    and excluding devices from a communicating group as they become trusted or untrusted. An example of using rekeying to dynamically adjust group...algorithms, such as the Elliptic Curve Digital Signature Algorithm (ECDSA), work by computing a cryptographic hash of a message using, for example, the...material is based upon work supported by the Assistant Secretary of Defense for Research and Engineering under Air Force Contract No. FA8721-05-C

  3. Modelling Digital Thunder

    ERIC Educational Resources Information Center

    Blanco, Francesco; La Rocca, Paola; Petta, Catia; Riggi, Francesco

    2009-01-01

    An educational model simulation of the sound produced by lightning in the sky has been employed to demonstrate realistic signatures of thunder and its connection to the particular structure of the lightning channel. Algorithms used in the past have been revisited and implemented, making use of current computer techniques. The basic properties of…

  4. Secure smart grid communications and information integration based on digital watermarking in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Yan, Xin; Zhang, Ling; Wu, Yang; Luo, Youlong; Zhang, Xiaoxing

    2017-02-01

    As more and more wireless sensor nodes and networks are employed to acquire and transmit the state information of power equipment in smart grid, we are in urgent need of some viable security solutions to ensure secure smart grid communications. Conventional information security solutions, such as encryption/decryption, digital signature and so forth, are not applicable to wireless sensor networks in smart grid any longer, where bulk messages need to be exchanged continuously. The reason is that these cryptographic solutions will account for a large portion of the extremely limited resources on sensor nodes. In this article, a security solution based on digital watermarking is adopted to achieve the secure communications for wireless sensor networks in smart grid by data and entity authentications at a low cost of operation. Our solution consists of a secure framework of digital watermarking, and two digital watermarking algorithms based on alternating electric current and time window, respectively. Both watermarking algorithms are composed of watermark generation, embedding and detection. The simulation experiments are provided to verify the correctness and practicability of our watermarking algorithms. Additionally, a new cloud-based architecture for the information integration of smart grid is proposed on the basis of our security solutions.
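    As a rough illustration of the low-cost authentication idea (a generic fragile-watermark sketch, not the authors' alternating-current or time-window algorithms), keyed-hash bits can be embedded into the least significant bits of a window of sensor readings; any tampering with the protected bits then breaks the mark. The 8-bit readings and key below are assumptions.

    ```python
    # Illustrative fragile-watermark sketch for 8-bit sensor readings; a
    # generic LSB scheme, not the paper's AC- or time-window algorithm.
    import hashlib
    import hmac

    def embed(readings, key):
        """Hide HMAC bits of the high-order data in the readings' LSBs."""
        carrier = bytes(r >> 1 for r in readings)  # payload the mark protects
        mac = hmac.new(key, carrier, hashlib.sha256).digest()
        bits = [(mac[i // 8] >> (i % 8)) & 1 for i in range(len(readings))]
        return [(r & ~1) | b for r, b in zip(readings, bits)]

    def verify(readings, key):
        """Recompute the mark; tampering with the high bits breaks it."""
        return embed(readings, key) == list(readings)

    key = b"shared-sensor-key"
    marked = embed([17, 250, 33, 91, 128, 7, 64, 200], key)
    assert verify(marked, key)
    ```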

  5. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    PubMed Central

    Kabir, Muhammad N.; Alginahi, Yasser M.

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247

  6. Dual function seal: visualized digital signature for electronic medical record systems.

    PubMed

    Yu, Yao-Chang; Hou, Ting-Wei; Chiang, Tzu-Chiang

    2012-10-01

    A digital signature is an important cryptographic technology used to provide integrity and non-repudiation in electronic medical record systems (EMRS), and it is required by law. However, digital signatures normally appear in forms unrecognizable to medical staff, which may reduce the trust of staff who are used to handwritten signatures or seals. Therefore, in this paper we propose a dual function seal to extend user trust from the traditional seal to the digital signature. The proposed dual function seal is a prototype that combines the traditional seal with a digital seal. With this prototype, medical personnel can not only put a seal on paper but also generate a visualized digital signature for electronic medical records. Medical personnel can then look at the visualized digital signature and know directly which medical personnel generated it, just as with a traditional seal. The discrete wavelet transform (DWT) is used as the image processing method to generate the visualized digital signature, and the peak signal-to-noise ratio (PSNR) is calculated to verify that distortions of all converted images are beyond human recognition; the results for our converted images range from 70 dB to 80 dB. Signature recoverability is also tested to ensure that the visualized digital signature is verifiable. A simulated EMRS is implemented to show how the visualized digital signature can be integrated into an EMRS.
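    The PSNR figure used here to check that the embedded signature stays invisible is straightforward to compute. A sketch for 8-bit images, with NumPy arrays standing in for the EMRS images:

    ```python
    # PSNR sketch for 8-bit images; values above roughly 40 dB are usually
    # taken as imperceptible, and the paper reports 70-80 dB.
    import numpy as np

    def psnr(original, converted):
        diff = original.astype(np.float64) - converted.astype(np.float64)
        mse = np.mean(diff ** 2)
        if mse == 0:
            return float("inf")
        return 10 * np.log10(255.0 ** 2 / mse)

    original = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
    converted = original.copy()
    converted[::64, ::64] ^= 1  # flip a few least significant bits
    print(f"PSNR = {psnr(original, converted):.1f} dB")
    ```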

  7. ECDSA B-233 with Precomputation 1.0 Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draelos, Timothy; Schroeppel, Richard; Schoeneman, Barry

    2009-12-11

    This software, written in C, performs two functions: 1) the generation of digital signatures using ECDSA with the B-233 curve and a table of precomputed values, and 2) the generation and encryption of a table of precomputed values to support the generation of many digital signatures. The computationally expensive operations of ECDSA signature generation are precomputed, stored in a table, and protected with AES encryption. This allows digital signatures to be generated in low-power, computationally-constrained environments, such as are often found in non-proliferation monitoring applications. The encrypted, precomputed table and digital signature generation software are used to provide public key data authentication for sensor data. When digital data is presented for signing, a set of values from the table is decrypted and used to generate an ECDSA digital signature.
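    The precomputation trick works because the expensive part of an ECDSA signature, the scalar multiplication k·G that produces r, does not depend on the message, so (k, r) pairs can be tabulated in advance and consumed one per signature. A self-contained sketch follows; secp256k1 substitutes for the record's B-233 binary curve, and the AES protection of the table is omitted.

    ```python
    # Sketch of ECDSA signing from a precomputed (k, r) table; secp256k1
    # replaces the record's B-233 curve and the table is left unencrypted.
    import hashlib
    import secrets

    # secp256k1 domain parameters
    p = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
    n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
    G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
         0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

    def _add(P, Q):
        if P is None: return Q
        if Q is None: return P
        if P[0] == Q[0] and (P[1] + Q[1]) % p == 0: return None
        if P == Q:
            lam = 3 * P[0] * P[0] * pow(2 * P[1], -1, p) % p
        else:
            lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p
        x = (lam * lam - P[0] - Q[0]) % p
        return (x, (lam * (P[0] - x) - P[1]) % p)

    def _mul(k, P):
        R = None
        while k:
            if k & 1: R = _add(R, P)
            P, k = _add(P, P), k >> 1
        return R

    def precompute(count):
        # Expensive offline step: one scalar multiplication per future
        # signature. (The record's system AES-encrypts this table.)
        return [(k, _mul(k, G)[0] % n)
                for k in (secrets.randbelow(n - 1) + 1 for _ in range(count))]

    def sign(d, message, table):
        # Cheap online step: one hash plus modular arithmetic.
        k, r = table.pop()  # each table entry must be used once only
        h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
        s = pow(k, -1, n) * (h + r * d) % n
        return r, s         # the r == 0 or s == 0 retry is omitted

    d = secrets.randbelow(n - 1) + 1  # signer's private key
    table = precompute(8)
    print(sign(d, b"sensor record", table))
    ```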

  8. Removal of instrument signature from Mariner 9 television images of Mars

    NASA Technical Reports Server (NTRS)

    Green, W. B.; Jepsen, P. L.; Kreznar, J. E.; Ruiz, R. M.; Schwartz, A. A.; Seidman, J. B.

    1975-01-01

    The Mariner 9 spacecraft was inserted into orbit around Mars in November 1971. The two vidicon camera systems returned over 7300 digital images during orbital operations. The high volume of returned data and the scientific objectives of the Television Experiment made development of automated digital techniques for the removal of camera system-induced distortions from each returned image necessary. This paper describes the algorithms used to remove geometric and photometric distortions from the returned imagery. Enhancement processing of the final photographic products is also described.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    I. W. Ginsberg

    Multiresolutional decompositions known as spectral fingerprints are often used to extract spectral features from multispectral/hyperspectral data. In this study, the authors investigate the use of wavelet-based algorithms for generating spectral fingerprints. The wavelet-based algorithms are compared to the currently used method, traditional convolution with first-derivative Gaussian filters. The comparison analysis consists of two parts: (a) the computational expense of the new method is compared with that of the current method, and (b) the outputs of the wavelet-based methods are compared with those of the current method to determine any practical differences in the resulting spectral fingerprints. The results show that the wavelet-based algorithms can greatly reduce the computational expense of generating spectral fingerprints, while practically no differences exist in the resulting fingerprints. The analysis is conducted on a database of hyperspectral signatures, namely, Hyperspectral Digital Image Collection Experiment (HYDICE) signatures. The reduction in computational expense is by a factor of about 30, and the average Euclidean distance between resulting fingerprints is on the order of 0.02.

  10. What a Difference a Year Makes.

    ERIC Educational Resources Information Center

    Birt, Carina

    1998-01-01

    Addresses the growth of signatures in document management. Describes the three basic types of electronic signature technology: image signatures, digital signatures, and digitized biometric signatures. Discusses legal and regulatory acceptability and bringing signatures into document management. (AEF)

  11. Realization of Quantum Digital Signatures without the Requirement of Quantum Memory

    NASA Astrophysics Data System (ADS)

    Collins, Robert J.; Donaldson, Ross J.; Dunjko, Vedran; Wallden, Petros; Clarke, Patrick J.; Andersson, Erika; Jeffers, John; Buller, Gerald S.

    2014-07-01

    Digital signatures are widely used to provide security for electronic communications, for example, in financial transactions and electronic mail. Currently used classical digital signature schemes, however, only offer security relying on unproven computational assumptions. In contrast, quantum digital signatures offer information-theoretic security based on laws of quantum mechanics. Here, security against forging relies on the impossibility of perfectly distinguishing between nonorthogonal quantum states. A serious drawback of previous quantum digital signature schemes is that they require long-term quantum memory, making them impractical at present. We present the first realization of a scheme that does not need quantum memory and which also uses only standard linear optical components and photodetectors. In our realization, the recipients measure the distributed quantum signature states using a new type of quantum measurement, quantum state elimination. This significantly advances quantum digital signatures as a quantum technology with potential for real applications.

  12. Realization of quantum digital signatures without the requirement of quantum memory.

    PubMed

    Collins, Robert J; Donaldson, Ross J; Dunjko, Vedran; Wallden, Petros; Clarke, Patrick J; Andersson, Erika; Jeffers, John; Buller, Gerald S

    2014-07-25

    Digital signatures are widely used to provide security for electronic communications, for example, in financial transactions and electronic mail. Currently used classical digital signature schemes, however, only offer security relying on unproven computational assumptions. In contrast, quantum digital signatures offer information-theoretic security based on laws of quantum mechanics. Here, security against forging relies on the impossibility of perfectly distinguishing between nonorthogonal quantum states. A serious drawback of previous quantum digital signature schemes is that they require long-term quantum memory, making them impractical at present. We present the first realization of a scheme that does not need quantum memory and which also uses only standard linear optical components and photodetectors. In our realization, the recipients measure the distributed quantum signature states using a new type of quantum measurement, quantum state elimination. This significantly advances quantum digital signatures as a quantum technology with potential for real applications.

  13. 31 CFR 370.39 - To what extent is a digital signature admissible in any civil litigation or dispute?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false To what extent is a digital signature... Submission of Transaction Requests Through the Bureau of the Public Debt § 370.39 To what extent is a digital signature admissible in any civil litigation or dispute? In asserting a digital signature against you in any...

  14. 31 CFR 370.39 - To what extent is a digital signature admissible in any civil litigation or dispute?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false To what extent is a digital signature... Submission of Transaction Requests Through the Bureau of the Public Debt § 370.39 To what extent is a digital signature admissible in any civil litigation or dispute? In asserting a digital signature against you in any...

  15. Digital pathology in nephrology clinical trials, research, and pathology practice.

    PubMed

    Barisoni, Laura; Hodgin, Jeffrey B

    2017-11-01

    In this review, we will discuss (i) how the recent advancements in digital technology and computational engineering are currently applied to nephropathology in the setting of clinical research, trials, and practice; (ii) the benefits of the new digital environment; (iii) how recognizing its challenges provides opportunities for transformation; and (iv) nephropathology in the upcoming era of kidney precision and predictive medicine. Recent studies highlighted how new standardized protocols facilitate the harmonization of digital pathology database infrastructure and morphologic, morphometric, and computer-aided quantitative analyses. Digital pathology enables robust protocols for clinical trials and research, with the potential to identify previously underused or unrecognized clinically useful parameters. The integration of digital pathology with molecular signatures is leading the way to establishing clinically relevant morpho-omic taxonomies of renal diseases. The introduction of digital pathology in clinical research and trials, and the progressive implementation of the modern software ecosystem, opens opportunities for the development of new predictive diagnostic paradigms and computer-aided algorithms, transforming the practice of renal disease into a modern computational science.

  16. Digital Signature Management.

    ERIC Educational Resources Information Center

    Hassler, Vesna; Biely, Helmut

    1999-01-01

    Describes the Digital Signature Project that was developed in Austria to establish an infrastructure for applying smart card-based digital signatures in banking and electronic-commerce applications. Discusses the need to conform to international standards, an international certification infrastructure, and security features for a public directory…

  17. Modal identification of structures from the responses and random decrement signatures

    NASA Technical Reports Server (NTRS)

    Brahim, S. R.; Goglia, G. L.

    1977-01-01

    The theory and application of a method which utilizes the free response of a structure to determine its vibration parameters is described. The time-domain free response is digitized and used in a digital computer program to determine the number of modes excited, the natural frequencies, the damping factors, and the modal vectors. The technique is applied to a complex generalized payload model previously tested using the sine-sweep method and analyzed by NASTRAN. Ten modes of the payload model are identified. In case the free-decay response is not readily available, an algorithm is developed to obtain the free responses of a structure from its random responses, due to some known or unknown random input or inputs, using the random decrement technique without changing the time correlation between signals. The algorithm is tested using random responses from a generalized payload model and from the space shuttle model.

  18. Digital imaging and remote sensing image generator (DIRSIG) as applied to NVESD sensor performance modeling

    NASA Astrophysics Data System (ADS)

    Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.

    2016-05-01

    The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.

  19. Device interoperability and authentication for telemedical appliance based on the ISO/IEEE 11073 Personal Health Device (PHD) Standards.

    PubMed

    Caranguian, Luther Paul R; Pancho-Festin, Susan; Sison, Luis G

    2012-01-01

    In this study, we focused on the interoperability and authentication of medical devices in the context of telemedical systems. A recent standard called the ISO/IEEE 11073 Personal Health Device (X73-PHD) Standards addresses the device interoperability problem by defining common protocols for the agent (medical device) and manager (appliance) interface. The X73-PHD standard, however, has not addressed security and authentication of medical devices, which is important in establishing the integrity of a telemedical system. We have designed and implemented a security policy within the X73-PHD standards. The policy enables device authentication using asymmetric-key cryptography with the RSA algorithm as the digital signature scheme. We used two approaches for performing the digital signatures: direct software implementation and use of embedded security modules (ESM). The two approaches were evaluated and compared in terms of execution time and memory requirement. For standard 2048-bit RSA, the ESM calculates digital signatures in only 12% of the time required by the direct implementation. Moreover, analysis shows that the ESM offers additional security advantages, such as secure storage of keys, compared to the direct implementation. Interoperability with other systems was verified by testing the system with LNI Healthlink, a manager software that implements the X73-PHD standard. Lastly, a security analysis was done: the system's response to common attacks on authentication systems was analyzed, and several measures were implemented to protect the system against them.

  20. SU-F-BRA-01: A Procedure for the Fast Semi-Automatic Localization of Catheters Using An Electromagnetic Tracker (EMT) for Image-Guided Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damato, A; Viswanathan, A; Cormack, R

    2015-06-15

    Purpose: To evaluate the feasibility of brachytherapy catheter localization through use of an EMT and 3D image set. Methods: A 15-catheter phantom mimicking an interstitial implantation was built and CT-scanned. Baseline catheter reconstruction was performed manually. An EMT was used to acquire the catheter coordinates in the EMT frame of reference. N user-identified catheter tips, without catheter number associations, were used to establish registration with the CT frame of reference. Two algorithms were investigated: brute-force registration (BFR), in which all possible permutations of N identified tips with the EMT tips were evaluated; and signature-based registration (SBR), in which a distance matrix was used to generate a list of matching signatures describing possible N-point matches with the registration points. Digitization error (average of the distance between corresponding EMT and baseline dwell positions; average, standard deviation, and worst-case scenario over all possible registration-point selections) and algorithm inefficiency (maximum number of rigid registrations required to find the matching fusion for all possible selections of registration points) were calculated. Results: Digitization errors on average <2 mm were observed for N ≥5, with standard deviation <2 mm for N ≥6, and worst-case scenario error <2 mm for N ≥11. Algorithm inefficiencies were: N = 5, 32,760 (BFR) and 9900 (SBR); N = 6, 360,360 (BFR) and 21,660 (SBR); N = 11, 5.45*10^10 (BFR) and 12 (SBR). Conclusion: A procedure was proposed for catheter reconstruction using EMT and only requiring user identification of catheter tips without catheter localization. Digitization errors <2 mm were observed on average with 5 or more registration points, and in any scenario with 11 or more points. Inefficiency for N = 11 was 9 orders of magnitude lower for SBR than for BFR. Funding: Kaye Family Award.
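    The "signature" in SBR is built from inter-point distances, which a rigid registration cannot change, so candidate tip correspondences can be proposed by comparing sorted distance rows instead of testing all N! permutations. A rough sketch of that idea, simplified relative to the abstract's matching of N-point subsets; the tolerance value and synthetic point clouds are assumptions.

    ```python
    # Simplified sketch of signature-based point matching: rigid motions
    # preserve inter-point distances, so sorted distance rows can propose
    # correspondences without testing all permutations (cf. BFR vs SBR).
    import numpy as np

    def distance_signatures(points):
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        return np.sort(d, axis=1)  # one sorted distance row per point

    def propose_matches(tips_emt, tips_ct, tol=2.0):
        sig_a = distance_signatures(tips_emt)
        sig_b = distance_signatures(tips_ct)
        matches = []
        for i, row in enumerate(sig_a):
            residuals = np.abs(sig_b - row).mean(axis=1)
            j = int(np.argmin(residuals))
            if residuals[j] < tol:  # mm-scale tolerance, an assumption
                matches.append((i, j))
        return matches

    rng = np.random.default_rng(0)
    ct_tips = rng.uniform(0, 100, (11, 3))
    theta = 0.4  # apply an arbitrary rigid motion to simulate the EMT frame
    R = np.array([[np.cos(theta), -np.sin(theta), 0],
                  [np.sin(theta),  np.cos(theta), 0],
                  [0, 0, 1]])
    emt_tips = ct_tips @ R.T + np.array([5.0, -3.0, 12.0])
    print(propose_matches(emt_tips, ct_tips))
    ```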

  1. 27 CFR 73.3 - What terms must I know to understand this part?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... and/or actions are both unique to that individual and measurable. Digital signature. An electronic... verified. A signer creates a digital signature by using public-key encryption to transform a message digest of an electronic message. If a recipient of the digital signature has an electronic message, message...

  2. 27 CFR 73.3 - What terms must I know to understand this part?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... and/or actions are both unique to that individual and measurable. Digital signature. An electronic... verified. A signer creates a digital signature by using public-key encryption to transform a message digest of an electronic message. If a recipient of the digital signature has an electronic message, message...

  3. Design of a mutual authentication based on NTRUsign with a perturbation and inherent multipoint control protocol frames in an Ethernet-based passive optical network

    NASA Astrophysics Data System (ADS)

    Yin, Aihan; Ding, Yisheng

    2014-11-01

    Identity-related security issues inherently present in passive optical networks (PON) still exist in the current (1G) and next-generation (10G) Ethernet-based passive optical network (EPON) systems. We propose a mutual authentication scheme that integrates an NTRUsign digital signature algorithm with inherent multipoint control protocol (MPCP) frames over an EPON system between the optical line terminal (OLT) and the optical network unit (ONU). Here, the primitive NTRUsign algorithm is significantly modified through the use of a new perturbation so that it can be effectively used to complete the signature and authentication functions simultaneously on the OLT and ONU sides. Also, in order for the two sides to transmit their individual sensitive messages, including the public key, signature, random values, and so forth, to each other, we redefine three unique frames according to the MPCP frame format. These messages can be added to the frames and delivered to the other side, allowing the OLT and the ONU to carry out a mutual identity authentication process to verify each other's legal identity. Our simulation results show that the proposed scheme performs very well in resisting security attacks and has little influence on the registration efficiency of to-be-registered ONUs. A performance comparison with traditional authentication algorithms is also presented. To the best of our knowledge, no detailed design of mutual authentication in EPON can be found in the literature up to now.

  4. Modelling digital thunder

    NASA Astrophysics Data System (ADS)

    Blanco, Francesco; La Rocca, Paola; Petta, Catia; Riggi, Francesco

    2009-01-01

    An educational model simulation of the sound produced by lightning in the sky has been employed to demonstrate realistic signatures of thunder and its connection to the particular structure of the lightning channel. Algorithms used in the past have been revisited and implemented, making use of current computer techniques. The basic properties of the mathematical model, together with typical results and suggestions for additional developments are discussed. The paper is intended as a teaching aid for students and teachers in the context of introductory physics courses at university level.

  5. Digital Image Processing Technique for Breast Cancer Detection

    NASA Astrophysics Data System (ADS)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the Digital Database for Screening Mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.

  6. Experimental demonstration of quantum digital signatures over 43 dB channel loss using differential phase shift quantum key distribution.

    PubMed

    Collins, Robert J; Amiri, Ryan; Fujiwara, Mikio; Honjo, Toshimori; Shimizu, Kaoru; Tamaki, Kiyoshi; Takeoka, Masahiro; Sasaki, Masahide; Andersson, Erika; Buller, Gerald S

    2017-06-12

    Ensuring the integrity and transferability of digital messages is an important challenge in modern communications. Although purely mathematical approaches exist, they usually rely on the computational complexity of certain functions, in which case there is no guarantee of long-term security. Alternatively, quantum digital signatures offer security guaranteed by the physical laws of quantum mechanics. Prior experimental demonstrations of quantum digital signatures in optical fiber have typically been limited to operation over short distances and/or operated in a laboratory environment. Here we report the experimental transmission of quantum digital signatures over channel losses of up to 42.8 ± 1.2 dB in a link comprised of 90 km of installed fiber with additional optical attenuation introduced to simulate longer distances. The channel loss of 42.8 ± 1.2 dB corresponds to an equivalent distance of 134.2 ± 3.8 km and this represents the longest effective distance and highest channel loss that quantum digital signatures have been shown to operate over to date. Our theoretical model indicates that this represents close to the maximum possible channel attenuation for this quantum digital signature protocol, defined as the loss for which the signal rate is comparable to the dark count rate of the detectors.

  7. Access Control to Information in Pervasive Computing Environments

    DTIC Science & Technology

    2005-08-01

    for foo’s public key. (Digital signatures are omitted.) indicate a set of location and time intervals. A service will return location information only...stands for foo’s public key. (Digital signatures are omitted.) it describes the resource to which access is granted. Currently, we allow only resources...information relationship. Alice’s location information is bun- dled in her personal information. (The digital signature is omitted.) We use extended

  8. Securing Digital Images Integrity using Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Hajji, Tarik; Itahriouan, Zakaria; Ouazzani Jamil, Mohammed

    2018-05-01

    Digital image signing is a technique used to protect image integrity, and it can serve several imaging applications in smart cities. The objective of this work is to propose two methods to protect digital image integrity. We describe two approaches that use artificial neural networks (ANN) to digitally sign an image: the first is "Direct Signature without learning" and the second is "Direct Signature with learning". This paper presents the theory behind the proposed approaches and an experimental study testing their effectiveness.

  9. An in fiber experimental approach to photonic quantum digital signatures that does not require quantum memory

    NASA Astrophysics Data System (ADS)

    Collins, Robert J.; Donaldon, Ross J.; Dunjko, Vedran; Wallden, Petros; Clarke, Patrick J.; Andersson, Erika; Jeffers, John; Buller, Gerald S.

    2014-10-01

    Classical digital signatures are commonly used in e-mail, electronic financial transactions and other forms of electronic communications to ensure that messages have not been tampered with in transit, and that messages are transferrable. The security of commonly used classical digital signature schemes relies on the computational difficulty of inverting certain mathematical functions. However, at present, there are no such one-way functions which have been proven to be hard to invert. With enough computational resources certain implementations of classical public key cryptosystems can be, and have been, broken with current technology. It is nevertheless possible to construct information-theoretically secure signature schemes, including quantum digital signature schemes. Quantum signature schemes can be made information-theoretically secure based on the laws of quantum mechanics, while comparable classical protocols require additional resources such as secret communication and a trusted authority. Early demonstrations of quantum digital signatures required quantum memory, rendering them impractical at present. Our present implementation is based on a protocol that does not require quantum memory. It also uses the new technique of unambiguous quantum state elimination. Here we report experimental results for a test-bed system, recorded with a variety of different operating parameters, along with a discussion of aspects of the system security.

  10. Real time recognition of explosophorous group and explosive material using laser induced photoacoustic spectroscopy associated with novel algorithm for time and frequency domain analysis.

    PubMed

    El-Sharkawy, Yasser H; Elbasuney, Sherif

    2018-06-07

    Energy-rich bonds such as nitrates (NO3-) and perchlorates (ClO4-) have an explosive nature and are frequently encountered in high energy materials. These bonds encompass two highly electronegative atoms competing for electrons. Common explosive materials including urea nitrate, ammonium nitrate, and ammonium perchlorate were subjected to photoacoustic spectroscopy. The captured signal was processed using a novel digital algorithm designed for time and frequency domain analysis. Frequency domain analysis offered not only characteristic frequencies for NO3- and ClO4- groups but also characteristic fingerprint spectra (based on thermal, acoustical, and optical properties) for different materials. The main outcome of this study is that phase-shift domain analysis offered an outstanding signature for each explosive material, enabling novel discrimination between explosive and similar non-explosive materials. Photoacoustic spectroscopy offered distinct characteristic signatures that can be employed for real-time detection with stand-off capabilities, since no two materials can have the same optical, thermal, and acoustical properties. Copyright © 2018 Elsevier B.V. All rights reserved.
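
    The frequency- and phase-domain fingerprinting described above can be illustrated with a plain FFT. A minimal sketch, not the authors' algorithm; the signal, sampling rate, and tone frequencies are hypothetical:

    ```python
    import numpy as np

    def photoacoustic_fingerprint(signal: np.ndarray, fs: float):
        """Return frequencies, magnitude and unwrapped phase spectra.

        Peaks in the magnitude spectrum would correspond to characteristic
        frequencies (e.g., of NO3- or ClO4- groups); the phase spectrum
        supplies the phase-shift signature the abstract highlights.
        """
        spectrum = np.fft.rfft(signal * np.hanning(len(signal)))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return freqs, np.abs(spectrum), np.unwrap(np.angle(spectrum))

    # Hypothetical usage with a synthetic trace sampled at 100 kHz.
    fs = 100_000.0
    t = np.arange(0, 0.01, 1.0 / fs)
    trace = np.sin(2 * np.pi * 12_000 * t) + 0.3 * np.sin(2 * np.pi * 27_000 * t)
    freqs, mag, phase = photoacoustic_fingerprint(trace, fs)
    print(freqs[np.argmax(mag)])  # dominant characteristic frequency (~12 kHz)
    ```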

  11. Public-key quantum digital signature scheme with one-time pad private-key

    NASA Astrophysics Data System (ADS)

    Chen, Feng-Lin; Liu, Wan-Fang; Chen, Su-Gen; Wang, Zhi-Hua

    2018-01-01

    A quantum digital signature scheme is proposed based on a public-key quantum cryptosystem. In the scheme, the verification public-key is derived from the signer's identity information (such as an e-mail address) following the idea of identity-based encryption, and the signature private-key is generated by a one-time pad (OTP) protocol. The public-key and private-key pair consists of classical bits, but the signature cipher consists of quantum bits. After the signer announces the public-key and generates the final quantum signature, each verifier can publicly verify whether the signature is valid with the public-key and the quantum digital digest. Analysis shows that the proposed scheme satisfies non-repudiation and unforgeability. Information-theoretic security of the scheme is ensured by the indistinguishability of quantum states and the OTP protocol. Being based on a public-key cryptosystem, the proposed scheme is easier to realize under current technical conditions than other quantum signature schemes.

  12. An algorithm of discovering signatures from DNA databases on a computer cluster.

    PubMed

    Lee, Hsiao Ping; Sheu, Tzu-Fang

    2014-10-05

    Signatures are short sequences that are unique and not similar to any other sequence in a database, and they can be used as the basis for identifying different species. Although several signature discovery algorithms have been proposed in the past, they require the entire database to be loaded into memory, restricting the amount of data they can process and making them unable to handle large databases. Moreover, those algorithms use sequential models and have slower discovery speeds, so their efficiency can be improved. In this research, we introduce a divide-and-conquer strategy for signature discovery and propose a parallel signature discovery algorithm that runs on a computer cluster. The algorithm applies the divide-and-conquer strategy to overcome the existing algorithms' inability to process large databases, and uses a parallel computing mechanism to improve the efficiency of signature discovery. Even when run with only the memory of a regular personal computer, the algorithm can still process large databases, such as the human whole-genome EST database, which existing algorithms were previously unable to process. The proposed algorithm is not limited by the amount of usable memory and can rapidly find signatures in large databases, making it useful in applications such as Next Generation Sequencing and other large-database analysis and processing. The implementation of the proposed algorithm is available at http://www.cs.pu.edu.tw/~fang/DDCSDPrograms/DDCSD.htm.
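
    The divide-and-conquer idea (split the database into chunks so only part of it is in memory at once, then merge the per-chunk results) can be sketched as follows. This is a simplification under stated assumptions: exact-match uniqueness only, a single machine rather than a cluster, and a hypothetical chunk size:

    ```python
    from collections import Counter
    from itertools import islice

    def kmers(seq: str, k: int):
        """All length-k substrings of a sequence."""
        return (seq[i:i + k] for i in range(len(seq) - k + 1))

    def discover_signatures(sequences, k=12, chunk_size=10_000):
        """Find k-mers that occur exactly once in the whole database.

        Divide: count k-mers chunk by chunk, so only one chunk's sequences
        are held in memory at a time (on a cluster, each node would take a
        share of the chunks). Conquer: merge the counts and keep k-mers
        with a global count of one, the candidate signatures.
        """
        total = Counter()
        it = iter(sequences)
        while True:
            chunk = list(islice(it, chunk_size))
            if not chunk:
                break
            for seq in chunk:
                total.update(kmers(seq, k))
        return [kmer for kmer, n in total.items() if n == 1]

    print(discover_signatures(["ACGTACGTACGTAA", "ACGTACGTACGTCC"], k=8))
    ```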

  13. Invisibly Sanitizable Digital Signature Scheme

    NASA Astrophysics Data System (ADS)

    Miyazaki, Kunihiko; Hanaoka, Goichiro; Imai, Hideki

    A digital signature does not allow any alteration of the document to which it is attached. Appropriate alteration of some signed documents, however, should be allowed because there are security requirements other than the integrity of the document. In the disclosure of official information, for example, sensitive information such as personal information or national secrets is masked when an official document is sanitized so that its nonsensitive information can be disclosed when it is requested by a citizen. If this disclosure is done digitally by using the current digital signature schemes, the citizen cannot verify the disclosed information because it has been altered to prevent the leakage of sensitive information. The confidentiality of official information is thus incompatible with the integrity of that information, and this is called the digital document sanitizing problem. Conventional solutions such as content extraction signatures and digitally signed document sanitizing schemes with disclosure condition control can either let the sanitizer assign disclosure conditions or hide the number of sanitized portions. The digitally signed document sanitizing scheme we propose here is based on the aggregate signature derived from bilinear maps and can do both. Moreover, the proposed scheme can sanitize a signed document invisibly, that is, no one can distinguish whether the signed document has been sanitized or not.
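
    The core idea, signing a document so that designated portions can later be masked without breaking verification of the rest, can be illustrated with a far simpler per-block hash commitment than the bilinear aggregate construction the paper actually uses. A minimal sketch; unlike the proposed scheme, this naive version is not invisible, since it reveals which portions were sanitized:

    ```python
    import hashlib, secrets

    def commit_blocks(blocks):
        """Commit to each block with a salted hash.

        The digest list is what would be signed with an ordinary signature
        scheme (signing itself is omitted here); the salts keep sanitized
        content from being guessed by brute force.
        """
        salts = [secrets.token_bytes(16) for _ in blocks]
        digests = [hashlib.sha256(s + b.encode()).hexdigest()
                   for s, b in zip(salts, blocks)]
        return salts, digests

    def sanitize(blocks, salts, index):
        """Mask one block: reveal neither its text nor its salt."""
        blocks, salts = list(blocks), list(salts)
        blocks[index], salts[index] = None, None
        return blocks, salts

    def verify(blocks, salts, digests):
        """Check every unmasked block against the signed digest list."""
        return all(
            b is None or hashlib.sha256(s + b.encode()).hexdigest() == d
            for b, s, d in zip(blocks, salts, digests))

    blocks = ["public part", "sensitive name", "public part 2"]
    salts, digests = commit_blocks(blocks)
    san_blocks, san_salts = sanitize(blocks, salts, 1)
    print(verify(san_blocks, san_salts, digests))  # True
    ```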

  14. AdaBoost-based on-line signature verifier

    NASA Astrophysics Data System (ADS)

    Hongo, Yasunori; Muramatsu, Daigo; Matsumoto, Takashi

    2005-03-01

    Authentication of individuals is rapidly becoming an important issue. The authors previously proposed a pen-input online signature verification algorithm. The algorithm considers a writer's signature as a trajectory of pen position, pen pressure, pen azimuth, and pen altitude that evolves over time, so that it is dynamic and biometric. Many algorithms have been proposed and reported to achieve accuracy for on-line signature verification, but setting the threshold value for these algorithms is a problem. In this paper, we introduce a user-generic model generated by AdaBoost, which resolves this problem. When user-specific models (one model for each user) are used for signature verification, the models must be generated using only genuine signatures, because imposters do not provide forged signatures for training in advance. By introducing a user-generic model, however, we can make use of other users' forged signatures in addition to genuine signatures for learning. AdaBoost is a well-known classification algorithm that makes its final decision based on the sign of the output value, so no threshold needs to be set. A preliminary experiment was performed on a database consisting of data from 50 individuals. This set consists of western-alphabet-based signatures provided by a European research group. In this experiment, our algorithm gives an FRR of 1.88% and an FAR of 1.60%. Since no fine-tuning was done, this preliminary result looks very promising.
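
    The user-generic model can be sketched with scikit-learn: train one AdaBoost classifier on genuine-versus-forged feature vectors pooled across writers, then accept or reject by the sign of the ensemble output. Feature extraction from pen trajectories is elided and the arrays are hypothetical:

    ```python
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    rng = np.random.default_rng(0)
    # Hypothetical distance-based features comparing a test signature with a
    # user's enrolled references (e.g., over position, pressure, azimuth).
    X_genuine = rng.normal(0.3, 0.1, size=(200, 4))
    X_forged = rng.normal(0.8, 0.2, size=(200, 4))
    X = np.vstack([X_genuine, X_forged])
    y = np.array([1] * 200 + [0] * 200)  # 1 = genuine, 0 = forgery

    # One user-generic model for all writers: no per-user threshold needed,
    # the decision is simply the sign of the boosted ensemble's output.
    clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X, y)
    test = rng.normal(0.35, 0.1, size=(1, 4))
    print("accept" if clf.decision_function(test)[0] > 0 else "reject")
    ```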

  15. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    PubMed

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.
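
    The Cache Common Segments idea, avoiding re-verification of signatures over path segments already validated, can be sketched with a dictionary keyed by a digest of the signed segment. Names are hypothetical, and the injected verify callback stands in for BGPSEC's per-segment cryptographic check:

    ```python
    import hashlib

    class SegmentVerifier:
        """Cache verification results for signed AS-path segments (CCS sketch)."""

        def __init__(self, crypto_verify):
            self._crypto_verify = crypto_verify   # expensive signature check
            self._cache = {}                      # digest -> bool

        def verify_segment(self, segment_bytes: bytes, signature: bytes) -> bool:
            key = hashlib.sha256(segment_bytes + signature).digest()
            if key not in self._cache:            # pay the crypto cost once
                self._cache[key] = self._crypto_verify(segment_bytes, signature)
            return self._cache[key]

        def verify_update(self, segments) -> bool:
            """A BGPSEC update validates only if every segment validates."""
            return all(self.verify_segment(seg, sig) for seg, sig in segments)

    # Hypothetical usage: verifier = SegmentVerifier(my_bgpsec_segment_check)
    ```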

  16. Digital "Testimonio" as a Signature Pedagogy for Latin@ Studies

    ERIC Educational Resources Information Center

    Benmayor, Rina

    2012-01-01

    This article proposes the curricular integration of digital "testimonio" as a "signature" pedagogy in Latin@ Studies. The "testimonio" tradition of urgent narratives and the creative multimedia languages of digital storytelling--text, voice, image, and sound--invite historically marginalized subjects, especially younger generations, to author and…

  17. Comparison of Fingerprint and Iris Biometric Authentication for Control of Digital Signatures

    PubMed Central

    Zuckerman, Alan E.; Moon, Kenneth A.; Eaddy, Kenneth

    2002-01-01

    Biometric authentication systems can be used to control digital signature of medical documents. This pilot study evaluated the use of two different fingerprint technologies and one iris technology to control creation of digital signatures on a central server using public-private key pairs stored on the server. Documents and signatures were stored in XML for portability. Key pairs and authentication certificates were generated during biometric enrollment. Usability and user acceptance were guarded, and limitations of biometric systems prevented use of the system with all test subjects. The system detected alterations in the data content and provided future signer re-authentication for non-repudiation.

  18. Experimental demonstration of quantum digital signatures using phase-encoded coherent states of light

    PubMed Central

    Clarke, Patrick J.; Collins, Robert J.; Dunjko, Vedran; Andersson, Erika; Jeffers, John; Buller, Gerald S.

    2012-01-01

    Digital signatures are frequently used in data transfer to prevent impersonation, repudiation and message tampering. Currently used classical digital signature schemes rely on public key encryption techniques, where the complexity of so-called ‘one-way' mathematical functions is used to provide security over sufficiently long timescales. No mathematical proofs are known for the long-term security of such techniques. Quantum digital signatures offer a means of sending a message, which cannot be forged or repudiated, with security verified by information-theoretical limits and quantum mechanics. Here we demonstrate an experimental system, which distributes quantum signatures from one sender to two receivers and enables message sending ensured against forging and repudiation. Additionally, we analyse the security of the system in some typical scenarios. Our system is based on the interference of phase-encoded coherent states of light and our implementation utilizes polarization-maintaining optical fibre and photons with a wavelength of 850 nm. PMID:23132024

  19. Digital signature feasibility study

    DOT National Transportation Integrated Search

    2008-06-01

    The purpose of this study was to assess the advantages and disadvantages of using digital signatures to assist the Arizona Department of Transportation in conducting business. The Department is evaluating the potential of performing more electronic t...

  20. Artificial neural networks for acoustic target recognition

    NASA Astrophysics Data System (ADS)

    Robertson, James A.; Mossing, John C.; Weber, Bruce A.

    1995-04-01

    Acoustic sensors can be used to detect, track and identify non-line-of-sight targets passively. Attempts to alter acoustic emissions often result in an undesirable performance degradation. This research project investigates the use of neural networks for differentiating between features extracted from the acoustic signatures of sources. Acoustic data were filtered and digitized using a commercially available analog-to-digital converter. The digital data were transformed to the frequency domain for additional processing using the FFT. Narrowband peak detection algorithms were incorporated to select peaks above a user-defined SNR. These peaks were then used to generate a set of robust features which relate specifically to target components in varying background conditions. The features were then used as input into a backpropagation neural network. A K-means unsupervised clustering algorithm was used to determine the natural clustering of the observations. Comparisons were made between a feature set consisting of the normalized amplitudes of the first 250 frequency bins of the power spectrum and a set of 11 harmonically related features. Initial results indicate that even though some different target types had a tendency to group in the same clusters, the neural network was able to differentiate the targets. Successful identification of acoustic sources under varying operational conditions with high confidence levels was achieved.
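
    The narrowband peak-picking step, keeping only spectral bins that stand a user-defined margin above the noise floor, can be sketched as below. The noise estimate (median of the magnitude spectrum) is an assumption; the paper does not specify one:

    ```python
    import numpy as np

    def narrowband_peaks(signal: np.ndarray, fs: float, snr_db: float = 10.0):
        """Return (frequency, magnitude) of spectral peaks above an SNR threshold."""
        mag = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        noise_floor = np.median(mag)              # assumed noise estimate
        threshold = noise_floor * 10 ** (snr_db / 20.0)
        # A peak is a local maximum that clears the threshold.
        is_peak = (mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:]) \
                  & (mag[1:-1] > threshold)
        idx = np.flatnonzero(is_peak) + 1
        return freqs[idx], mag[idx]
    ```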

  1. Combinational Circuit Obfuscation Through Power Signature Manipulation

    DTIC Science & Technology

    2011-06-01

    Excerpt from the report's table of contents: power signature estimation results for c264, c5355, and c499 circuit variants per algorithm, produced by SID and by SPICE simulation (Appendix B), and smart SSR selection of rear-level components and gates with 1000 iterations.

  2. Practical quantum digital signature

    NASA Astrophysics Data System (ADS)

    Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing

    2016-03-01

    Guaranteeing non-repudiation, unforgeability and transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signature (QDS) aims to provide information-theoretic security for this cryptographic task. To date, however, the previously proposed QDS protocols have been impractical due to various challenging problems, most importantly the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Moreover, our QDS protocol can be practically implemented over more than 100 km with the mature technology currently used in quantum key distribution.

  3. Improving Spectral Image Classification through Band-Ratio Optimization and Pixel Clustering

    NASA Astrophysics Data System (ADS)

    O'Neill, M.; Burt, C.; McKenna, I.; Kimblin, C.

    2017-12-01

    The Underground Nuclear Explosion Signatures Experiment (UNESE) seeks to characterize non-prompt observables from underground nuclear explosions (UNE). As part of this effort, we evaluated the ability of DigitalGlobe's WorldView-3 (WV3) to detect and map UNE signatures. WV3 is the current state-of-the-art commercial multispectral imaging satellite; however, it has relatively limited spectral and spatial resolutions. These limitations impede image classifiers from detecting targets that are spatially small and lack distinct spectral features. In order to improve classification results, we developed custom algorithms to reduce false positive rates while increasing true positive rates via a band-ratio optimization and pixel clustering front-end. The clusters resulting from these algorithms were processed with standard spectral image classifiers such as Mixture-Tuned Matched Filter (MTMF) and Adaptive Coherence Estimator (ACE). WV3 and AVIRIS data of Cuprite, Nevada, were used as a validation data set. These data were processed with a standard classification approach using MTMF and ACE algorithms, and also using the custom front-end prior to the standard approach. A comparison of the results shows that the custom front-end significantly increases the true positive rate and decreases the false positive rate. This work was done by National Security Technologies, LLC, under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy. DOE/NV/25946-3283.

  4. Secure and Efficient Signature Scheme Based on NTRU for Mobile Payment

    NASA Astrophysics Data System (ADS)

    Xia, Yunhao; You, Lirong; Sun, Zhe; Sun, Zhixin

    2017-10-01

    Mobile payment is becoming more and more popular; however, traditional public-key encryption algorithms place high demands on hardware, making them unsuitable for mobile terminals with limited computing resources, and they lack resistance to quantum computing. This paper studies NTRU, a public-key algorithm resistant to quantum computation, by analyzing the influence of the parameters q and k on the probability of generating a reasonable signature value. Two methods are proposed to improve this probability: first, increase the value of the parameter q; second, add an authentication condition during the signature phase that checks whether the reasonable-signature requirements are met. Experimental results show that the proposed signature scheme achieves zero leakage of the private-key information from the signature value, increases the probability of generating a reasonable signature value, improves the signing rate, and avoids the propagation of invalid signatures in the network, although the scheme imposes certain restrictions on parameter selection.

  5. U.S. Army Research Laboratory (ARL) multimodal signatures database

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly

    2008-04-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) is a centralized collection of sensor data of various modalities that are co-located and co-registered. The signatures include ground and air vehicles, personnel, mortar, artillery, small arms gunfire from potential sniper weapons, explosives, and many other high value targets. This data is made available to Department of Defense (DoD) and DoD contractors, intelligence agencies, other government agencies (OGA), and academia for use in developing target detection, tracking, and classification algorithms and systems to protect our Soldiers. A platform-independent Web interface disseminates the signatures to researchers and engineers within the scientific community. Hierarchical Data Format 5 (HDF5) signature models provide an excellent solution for the sharing of complex multimodal signature data for algorithmic development and database requirements. Many open source tools for viewing and plotting HDF5 signatures are available over the Web. Seamless integration of HDF5 signatures is possible in both proprietary computational environments, such as MATLAB, and Free and Open Source Software (FOSS) computational environments, such as Octave and Python, for performing signal processing, analysis, and algorithm development. Future developments include extending the Web interface into a portal system for accessing ARL algorithms and signatures and High Performance Computing (HPC) resources, and integrating existing database and signature architectures into sensor networking environments.
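
    Reading such HDF5 signature files with open tools is exactly the portability argument the abstract makes. A minimal sketch with h5py; the file name and dataset layout are hypothetical, and real MMSDB schemas will differ:

    ```python
    import h5py

    # Hypothetical file and dataset names; actual MMSDB layouts will differ.
    with h5py.File("vehicle_run042.h5", "r") as f:
        f.visit(print)                              # list group/dataset paths
        acoustic = f["acoustic/channel_0"][...]     # one modality as ndarray
        fs = f["acoustic/channel_0"].attrs.get("sample_rate_hz")
        print(acoustic.shape, fs)
    ```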

  6. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1974-01-01

    The MIDAS System is described as a third-generation fast multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turnaround time and significant gains in throughput. The hardware and software are described. The system contains a mini-computer to control the various high-speed processing elements in the data path, and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 200,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation.
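
    The decision rule MIDAS implements in hardware is the classical multivariate-Gaussian maximum-likelihood classifier: assign each pixel to the class whose estimated mean and covariance give the highest log-likelihood. A minimal software sketch, with the per-class signatures (means and covariances) assumed already extracted:

    ```python
    import numpy as np

    def gaussian_ml_classify(pixels, means, covs):
        """Assign each pixel (row) to the class maximizing the Gaussian
        log-likelihood, given per-class means and covariance matrices."""
        scores = []
        for mu, cov in zip(means, covs):
            inv, (_, logdet) = np.linalg.inv(cov), np.linalg.slogdet(cov)
            d = pixels - mu
            # log N(x; mu, cov) up to a constant shared by all classes.
            scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", d, inv, d)))
        return np.argmax(scores, axis=0)

    # Hypothetical two-class, four-band example.
    rng = np.random.default_rng(1)
    means = [np.zeros(4), np.ones(4)]
    covs = [np.eye(4), 0.5 * np.eye(4)]
    pixels = rng.normal(size=(10, 4))
    print(gaussian_ml_classify(pixels, means, covs))
    ```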

  7. Signature Verification Based on Handwritten Text Recognition

    NASA Astrophysics Data System (ADS)

    Viriri, Serestina; Tapamo, Jules-R.

    Signatures continue to be an important biometric trait because they remain widely used to authenticate the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm which verifies signatures even when they are composed of special unconstrained cursive characters that are superimposed and embellished. This algorithm extends the character-based signature verification technique. The experiments carried out on the GPDS signature database, and on an additional database created from signatures captured using the ePadInk tablet, show that the approach is effective and efficient, with a positive verification rate of 94.95%.

  8. Information security system based on virtual-optics imaging methodology and public key infrastructure

    NASA Astrophysics Data System (ADS)

    Peng, Xiang; Zhang, Peng; Cai, Lilong

    In this paper, we present a virtual-optics-based information security system model built with public-key-infrastructure (PKI) techniques. The proposed model employs a hybrid architecture in which our previously published encryption algorithm based on virtual-optics imaging methodology (VOIM) is used to encipher and decipher data, while an asymmetric algorithm, for example RSA, is applied to encipher and decipher the session key(s). For an asymmetric system, given an encryption key, it is computationally infeasible to determine the decryption key, and vice versa. The whole information security model runs under the framework of PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOIM security approach provides additional features such as confidentiality, authentication, and integrity for data encryption in a network environment.
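
    The hybrid pattern the model follows, a symmetric cipher for the bulk data and an asymmetric one for the session key, is standard and can be sketched with the Python cryptography package. AES-GCM stands in for the VOIM cipher here, since the optical algorithm has no software equivalent in this sketch:

    ```python
    import os
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Receiver's RSA key pair (a PKI would distribute the public half).
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    # Symmetric session key encrypts the bulk data (AES-GCM stands in for VOIM).
    session_key, nonce = AESGCM.generate_key(bit_length=128), os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, b"image payload", None)

    # RSA-OAEP protects the session key in transit.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_key = public_key.encrypt(session_key, oaep)

    # Receiver unwraps the session key and decrypts the payload.
    recovered = AESGCM(private_key.decrypt(wrapped_key, oaep)).decrypt(
        nonce, ciphertext, None)
    assert recovered == b"image payload"
    ```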

  9. Experimental measurement-device-independent quantum digital signatures over a metropolitan network

    NASA Astrophysics Data System (ADS)

    Yin, Hua-Lei; Wang, Wei-Long; Tang, Yan-Lin; Zhao, Qi; Liu, Hui; Sun, Xiang-Xiang; Zhang, Wei-Jun; Li, Hao; Puthoor, Ittoop Vergheese; You, Li-Xing; Andersson, Erika; Wang, Zhen; Liu, Yang; Jiang, Xiao; Ma, Xiongfeng; Zhang, Qiang; Curty, Marcos; Chen, Teng-Yun; Pan, Jian-Wei

    2017-04-01

    Quantum digital signatures (QDSs) provide a means for signing electronic communications with information-theoretic security. However, all previous demonstrations of quantum digital signatures assume trusted measurement devices. This renders them vulnerable against detector side-channel attacks, just like quantum key distribution. Here we exploit a measurement-device-independent (MDI) quantum network, over a metropolitan area, to perform a field test of a three-party MDI QDS scheme that is secure against any detector side-channel attack. In so doing, we are able to successfully sign a binary message with a security level of about 10^-7. Remarkably, our work demonstrates the feasibility of MDI QDSs for practical applications.

  10. The Maximum Likelihood Estimation of Signature Transformation /MLEST/ algorithm. [for affine transformation of crop inventory data

    NASA Technical Reports Server (NTRS)

    Thadani, S. G.

    1977-01-01

    The Maximum Likelihood Estimation of Signature Transformation (MLEST) algorithm is used to obtain maximum likelihood estimates (MLE) of affine transformation. The algorithm has been evaluated for three sets of data: simulated (training and recognition segment pairs), consecutive-day (data gathered from Landsat images), and geographical-extension (large-area crop inventory experiment) data sets. For each set, MLEST signature extension runs were made to determine MLE values and the affine-transformed training segment signatures were used to classify the recognition segments. The classification results were used to estimate wheat proportions at 0 and 1% threshold values.
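
    In the simplest noise-free setting, the object MLEST estimates, an affine transformation x' ≈ Ax + b between signature spaces, reduces to a least-squares fit. A minimal sketch using ordinary least squares rather than the full maximum-likelihood machinery:

    ```python
    import numpy as np

    def fit_affine(X, Y):
        """Least-squares estimate of (A, b) with Y ~ X @ A.T + b.

        X, Y: (n, d) arrays of matched spectral samples from training and
        recognition segments. MLEST estimates the same transform by maximum
        likelihood; this sketch uses plain least squares for illustration.
        """
        X1 = np.hstack([X, np.ones((len(X), 1))])      # append bias column
        W, *_ = np.linalg.lstsq(X1, Y, rcond=None)     # (d+1, d) coefficients
        return W[:-1].T, W[-1]                         # A, b

    rng = np.random.default_rng(2)
    X = rng.normal(size=(100, 4))
    A_true, b_true = rng.normal(size=(4, 4)), rng.normal(size=4)
    Y = X @ A_true.T + b_true + 0.01 * rng.normal(size=(100, 4))
    A, b = fit_affine(X, Y)
    print(np.allclose(A, A_true, atol=0.05))           # True
    ```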

  11. Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Shi, Ronghua; Ding, Wanting; Shi, Jinjing

    2018-03-01

    A novel arbitrated quantum signature (AQS) scheme is proposed, motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithms are designed based on HA, which enables the scheme to rely less on computational complexity. It is unnecessary to recover the original messages when verifying signatures since blind quantum computation is applied, which improves the simplicity and operability of our scheme. It is proved that the scheme can be deployed securely, and the extended AQS has extensive applications in e-payment systems, e-government, e-business, etc.

  12. Detection of soil erosion with Thematic Mapper (TM) satellite data within Pinyon-Juniper woodlands

    NASA Technical Reports Server (NTRS)

    Price, Kevin Paul

    1987-01-01

    Pinyon-Juniper woodlands dominate approximately 24.3 million hectares (60 million acres) in the western United States. The overall objective was to test the sensitivity of the LANDSAT Thematic Mapper (TM) spectral data for detecting varying degrees of soil erosion within the Pinyon-Juniper woodlands. A second objective was to assess the potential of the spectral data for assigning the Universal Soil Loss Equation (USLE) crop management (C) factor values to varying cover types within the woodland. Thematic Mapper digital data for June 2, 1984 on channels 2, 3, 4, and 5 were used. Digital data analysis was performed using the ELAS software package. Best results were achieved using CLUS, an unsupervised clustering algorithm. Fifteen of the 40 Pinyon-Juniper signatures were identified as being relatively pure Pinyon-Juniper woodland. Final analysis resulted in the grouping of the 15 signatures into three major groups. Ten study sites were selected from each of the three groups and located on the ground. At each site the following field measurements were taken: percent tree canopy and percent understory cover, soil texture, total soil loss, and soil erosion rate estimates. A technique for measuring soil erosion within Pinyon-Juniper woodlands was developed. A theoretical model of site degradation after Pinyon-Juniper invasion is presented.

  13. Application of ultrasonic signature analysis for fatigue detection in complex structures

    NASA Technical Reports Server (NTRS)

    Zuckerwar, A. J.

    1974-01-01

    Ultrasonic signature analysis shows promise of being a singularly well-suited method for detecting fatigue in structures as complex as aircraft. The method employs instrumentation centered about a Fourier analyzer system, which features analog-to-digital conversion, digital data processing, and digital display of cross-correlation functions and cross-spectra. These features are essential to the analysis of ultrasonic signatures according to the procedure described here. In order to establish the feasibility of the method, the initial experiments were confined to simple plates with simulated and fatigue-induced defects respectively. In the first test the signature proved sensitive to the size of a small hole drilled into the plate. In the second test, performed on a series of fatigue-loaded plates, the signature proved capable of indicating both the initial appearance and subsequent growth of a fatigue crack. In view of these encouraging results it is concluded that the method has reached a sufficiently advanced stage of development to warrant application to small-scale structures or even actual aircraft.

  14. Secure Hierarchical Multicast Routing and Multicast Internet Anonymity

    DTIC Science & Technology

    1998-06-01

    Multimedia, Summer 94, pages 76-79, 94. [15] David Chaum. Blind signatures for untraceable payments. In Proc. Crypto '82, pages 199-203, 1982. [16] David L...use of digital signatures, which consist of a cryptographic hash of the message encrypted with the private key of the signer. Digitally-signed messages...signature on the request and on the certificate it contains. Notice that the location service need not retrieve the initiator's public key as it is contained

  15. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 3: Wiring diagrams

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

    The Midas System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 100,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. The MIDAS construction and wiring diagrams are given.

  16. MIDAS, prototype Multivariate Interactive Digital Analysis System, Phase 1. Volume 2: Diagnostic system

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

    The MIDAS System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 10^5 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. Diagnostic programs used to test MIDAS' operations are presented.

  17. Application of Compressive Sensing to Digital Holography

    DTIC Science & Technology

    2015-05-01

  18. Methods of extending crop signatures from one area to another

    NASA Technical Reports Server (NTRS)

    Minter, T. C. (Principal Investigator)

    1979-01-01

    Efforts to develop a technology for signature extension during LACIE phases 1 and 2 are described. A number of haze and Sun angle correction procedures were developed and tested. These included the ROOSTER and OSCAR cluster-matching algorithms and their modifications, the MLEST and UHMLE maximum likelihood estimation procedures, and the ATCOR procedure. All these algorithms were tested on simulated data and consecutive-day LANDSAT imagery. The ATCOR, OSCAR, and MLEST algorithms were also tested for their capability to geographically extend signatures using LANDSAT imagery.

  19. Systems and methods for performing wireless financial transactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCown, Steven Harvey

    2012-07-03

    A secure computing module (SCM) is configured for connection with a host device. The SCM includes a processor for performing secure processing operations, a host interface for coupling the processor to the host device, and a memory connected to the processor wherein the processor logically isolates at least some of the memory from access by the host device. The SCM also includes a proximate-field wireless communicator connected to the processor to communicate with another SCM associated with another host device. The SCM generates a secure digital signature for a financial transaction package and communicates the package and the signature to the other SCM using the proximate-field wireless communicator. Financial transactions are performed from person to person using the secure digital signature of each person's SCM and possibly message encryption. The digital signatures and transaction details are communicated to appropriate financial organizations to authenticate the transaction parties and complete the transaction.

  1. 21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...

  2. 21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...

  3. 21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...

  4. Integrating Geo-Spatial Data for Regional Landslide Susceptibility Modeling in Consideration of Run-Out Signature

    NASA Astrophysics Data System (ADS)

    Lai, J.-S.; Tsai, F.; Chiang, S.-H.

    2016-06-01

    This study implements a data-mining-based algorithm, the random forests classifier, with geo-spatial data to construct a regional, rainfall-induced landslide susceptibility model. The developed model also takes into account landslide regions (source, non-occurrence and run-out signatures) from the original landslide inventory in order to increase the reliability of the susceptibility modeling. A total of ten causative factors were collected and used in this study, including aspect, curvature, elevation, slope, faults, geology, NDVI (Normalized Difference Vegetation Index), rivers, roads and soil data. This study transforms the landslide inventory and vector-based causative factors into a pixel-based format in order to overlay them with other raster data for constructing the random-forests-based model. This study also uses original and edited topographic data in the analysis to understand their impacts on the susceptibility modeling. Experimental results demonstrate that after identifying the run-out signatures, the overall accuracy and Kappa coefficient reach more than 85% and 0.8, respectively. In addition, correcting unreasonable topographic features of the digital terrain model also produces more reliable modeling results.
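
    The modeling pipeline can be sketched with scikit-learn: stack the ten causative-factor rasters into per-pixel feature vectors, label pixels from the landslide inventory (non-occurrence, source, run-out), and fit a random forests classifier. The arrays below are hypothetical stand-ins:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score, cohen_kappa_score

    rng = np.random.default_rng(3)
    # Stand-in for 10 causative-factor rasters flattened to pixel rows.
    X = rng.normal(size=(5000, 10))      # aspect, slope, NDVI, geology, ...
    y = rng.integers(0, 3, size=5000)    # 0 non-occurrence, 1 source, 2 run-out

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
    model = RandomForestClassifier(n_estimators=200,
                                   random_state=0).fit(X_tr, y_tr)

    pred = model.predict(X_te)
    # Scores on these synthetic labels are meaningless; real rasters are
    # needed to reproduce the paper's accuracy and Kappa figures.
    print("overall accuracy:", accuracy_score(y_te, pred))
    print("kappa:", cohen_kappa_score(y_te, pred))
    ```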

  5. Investigation of correlation classification techniques

    NASA Technical Reports Server (NTRS)

    Haskell, R. E.

    1975-01-01

    A two-step classification algorithm for processing multispectral scanner data was developed and tested. The first step is a single pass clustering algorithm that assigns each pixel, based on its spectral signature, to a particular cluster. The output of that step is a cluster tape in which a single integer is associated with each pixel. The cluster tape is used as the input to the second step, where ground truth information is used to classify each cluster using an iterative method of potentials. Once the clusters have been assigned to classes the cluster tape is read pixel-by-pixel and an output tape is produced in which each pixel is assigned to its proper class. In addition to the digital classification programs, a method of using correlation clustering to process multispectral scanner data in real time by means of an interactive color video display is also described.

  6. Variable Star Signature Classification using Slotted Symbolic Markov Modeling

    NASA Astrophysics Data System (ADS)

    Johnston, K. B.; Peter, A. M.

    2017-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern-day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. This paper focuses on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern classification algorithm for the identification of variable stars. A methodology for the reduction of stellar variable observations (time-domain data) into a novel feature space representation is introduced. The methodology presented will be referred to as Slotted Symbolic Markov Modeling (SSMM) and has a number of advantages which will be demonstrated to be beneficial, specifically to the supervised classification of stellar variables. It will be shown that the methodology outperformed a baseline standard methodology on a standardized set of stellar light curve data. The performance on a set of data derived from the LINEAR dataset will also be shown.

  7. Variable Star Signature Classification using Slotted Symbolic Markov Modeling

    NASA Astrophysics Data System (ADS)

    Johnston, Kyle B.; Peter, Adrian M.

    2016-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern-day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. Our research focuses on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern classification algorithm for the identification of variable stars. A methodology for the reduction of stellar variable observations (time-domain data) into a novel feature space representation is introduced. The methodology presented will be referred to as Slotted Symbolic Markov Modeling (SSMM) and has a number of advantages which will be demonstrated to be beneficial, specifically to the supervised classification of stellar variables. It will be shown that the methodology outperformed a baseline standard methodology on a standardized set of stellar light curve data. The performance on a set of data derived from the LINEAR dataset will also be shown.

  8. Seismic signature analysis for discrimination of people from animals

    NASA Astrophysics Data System (ADS)

    Damarla, Thyagaraju; Mehmood, Asif; Sabatier, James M.

    2013-05-01

    Cadence analysis has been the main focus for discriminating between the seismic signatures of people and animals. However, cadence analysis fails when multiple targets are generating the signatures. We analyze the mechanism of human walking and the signature generated by a human walker, and compare it with the signature generated by a quadruped. We develop Fourier-based analysis to differentiate the human signatures from the animal signatures. We extract a set of basis vectors to represent the human and animal signatures using non-negative matrix factorization, and use them to separate and classify both the targets. Grazing animals such as deer, cows, etc., often produce sporadic signals as they move around from patch to patch of grass and one must characterize them so as to differentiate their signatures from signatures generated by a horse steadily walking along a path. These differences in the signatures are used in developing a robust algorithm to distinguish the signatures of animals from humans. The algorithm is tested on real data collected in a remote area.
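
    The separation step can be sketched with scikit-learn's NMF: learn non-negative basis vectors for human and animal spectra, then label a new signature by which basis reconstructs it best. Spectrogram extraction is elided and the arrays are hypothetical:

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(4)
    # Hypothetical non-negative spectral frames (rows) for each target class.
    human_frames = rng.random((300, 64))
    animal_frames = rng.random((300, 64))

    # Learn a small non-negative basis per class.
    basis = {label: NMF(n_components=8, max_iter=500, random_state=0).fit(fr)
             for label, fr in [("human", human_frames),
                               ("animal", animal_frames)]}

    def classify(frame: np.ndarray) -> str:
        """Label a frame by the class basis with lower reconstruction error."""
        def err(m):
            w = m.transform(frame[None, :])      # non-negative activations
            return np.linalg.norm(frame - w @ m.components_)
        return min(basis, key=lambda label: err(basis[label]))

    print(classify(rng.random(64)))
    ```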

  9. Evaluation of algorithms for estimating wheat acreage from multispectral scanner data. [Kansas and Texas

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Richardson, W.; Pentland, A. P.

    1976-01-01

    The author has identified the following significant results. Fourteen different classification algorithms were tested for their ability to estimate the proportion of wheat in an area. For some algorithms, accuracy of classification in field centers was observed. The data base consisted of ground truth and LANDSAT data from 55 sections (1 x 1 mile) from five LACIE intensive test sites in Kansas and Texas. Signatures obtained from training fields selected at random from the ground truth were generally representative of the data distribution patterns. LIMMIX, an algorithm that chooses a pure signature when the data point is close enough to a signature mean and otherwise chooses the best mixture of a pair of signatures, reduced the average absolute error to 6.1% and the bias to 1.0%. QRULE run with a null test achieved a similar reduction.

  10. A Third Approach to Gene Prediction Suggests Thousands of Additional Human Transcribed Regions

    PubMed Central

    Glusman, Gustavo; Qin, Shizhen; El-Gewely, M. Raafat; Siegel, Andrew F; Roach, Jared C; Hood, Leroy; Smit, Arian F. A

    2006-01-01

    The identification and characterization of the complete ensemble of genes is a main goal of deciphering the digital information stored in the human genome. Many algorithms for computational gene prediction have been described, ultimately derived from two basic concepts: (1) modeling gene structure and (2) recognizing sequence similarity. Successful hybrid methods combining these two concepts have also been developed. We present a third orthogonal approach to gene prediction, based on detecting the genomic signatures of transcription, accumulated over evolutionary time. We discuss four algorithms based on this third concept: Greens and CHOWDER, which quantify mutational strand biases caused by transcription-coupled DNA repair, and ROAST and PASTA, which are based on strand-specific selection against polyadenylation signals. We combined these algorithms into an integrated method called FEAST, which we used to predict the location and orientation of thousands of putative transcription units not overlapping known genes. Many of the newly predicted transcriptional units do not appear to code for proteins. The new algorithms are particularly apt at detecting genes with long introns and lacking sequence conservation. They therefore complement existing gene prediction methods and will help identify functional transcripts within many apparent “genomic deserts.” PMID:16543943

  11. CHEMINFORMATICS TOOLS FOR TOXICANT CHARACTERIZATION

    EPA Science Inventory

    • Continued development of the Shape Signatures method is planned. This effort will include further development of the Shape Signatures database of PDB-extracted ligands and of the clustering algorithm.
    • In addition, we plan to develop an...

  12. Detection and classification of concealed weapons using a magnetometer-based portal

      NASA Astrophysics Data System (ADS)

      Kotter, Dale K.; Roybal, Lyle G.; Polk, Robert E.

      2002-08-01

      A concealed weapons detection technology was developed through the support of the National Institute of Justice (NIJ) to provide a non-intrusive means for rapid detection, location, and archiving of data (including visual) of potential suspects and weapon threats. This technology, developed by the Idaho National Engineering and Environmental Laboratory (INEEL), has been applied in a portal-style weapons detection system using passive magnetic sensors as its basis. This paper reports on enhancements to the weapon detection system to enable weapon classification and to discriminate threats from non-threats. Advanced signal processing algorithms were used to analyze the magnetic spectrum generated when a person passes through a portal. These algorithms analyzed multiple variables, including variance in the magnetic signature from random weapon placement and/or orientation. They perform pattern recognition and calculate the probability that the collected magnetic signature correlates to a known database of weapon versus non-weapon responses. Neural networks were used to further discriminate weapon type and identify controlled electronic items such as cell phones and pagers. False alarms were further reduced by analyzing the magnetic detector response using a Joint Time Frequency Analysis digital signal processing technique, deriving the frequency components and power spectrum for a given sensor response. This unique fingerprint provided additional information to aid in signal analysis. This technology has the potential to produce major improvements in weapon detection and classification.

  13. DSP-Based dual-polarity mass spectrum pattern recognition for bio-detection

      DOE Office of Scientific and Technical Information (OSTI.GOV)

      Riot, V; Coffee, K; Gard, E

      2006-04-21

      The Bio-Aerosol Mass Spectrometry (BAMS) instrument analyzes single aerosol particles using a dual-polarity time-of-flight mass spectrometer, simultaneously recording spectra of thirty to a hundred thousand points on each polarity. We describe here a real-time pattern recognition algorithm developed at Lawrence Livermore National Laboratory that has been implemented on a nine-Digital-Signal-Processor (DSP) system from Signatec Incorporated. The algorithm first preprocesses the raw time-of-flight data independently through an adaptive baseline removal routine. The next step is a polarity-dependent calibration to a mass-to-charge representation, reducing the data to about five hundred to a thousand channels per polarity. The last step is the identification step, using a pattern recognition algorithm based on a library of known particle signatures including threat agents and background particles. The identification step integrates the two polarities for a final determination using a score-based rule tree. This algorithm, operating on multiple channels per polarity and multiple polarities, is well suited for parallel real-time processing. It has been implemented on the PMP8A from Signatec Incorporated, a computer-based board that can interface directly to the two one-Giga-Sample digitizers (PDA1000 from Signatec Incorporated) used to record the two polarities of time-of-flight data. By using optimized data separation, pipelining, and parallel processing across the nine DSPs, it is possible to achieve a processing speed of up to a thousand particles per second while maintaining the recognition rate observed in a non-real-time implementation. This embedded system has allowed the BAMS technology to improve its throughput, and therefore its sensitivity, while maintaining a large dynamic range (number of channels and two polarities) and thus the system's specificity for bio-detection.

  14. SU-F-T-20: Novel Catheter Lumen Recognition Algorithm for Rapid Digitization

      DOE Office of Scientific and Technical Information (OSTI.GOV)

      Dise, J; McDonald, D; Ashenafi, M

      Purpose: Manual catheter recognition remains a time-consuming aspect of high-dose-rate brachytherapy (HDR) treatment planning. In this work, a novel catheter lumen recognition algorithm was created for accurate and rapid digitization. Methods: MatLab v8.5 was used to create the catheter recognition algorithm. Initially, the algorithm searches the patient CT dataset using an intensity-based k-means filter designed to locate catheters. Once the catheters have been located, seed points are manually selected to initialize digitization of each catheter. From each seed point, the algorithm searches locally in order to automatically digitize the remaining catheter. This digitization is accomplished by finding pixels with similar image curvature and divergence parameters compared to the seed pixel. Newly digitized pixels are treated as new seed positions, and Hessian image analysis is used to direct the algorithm toward neighboring catheter pixels and to make the algorithm insensitive to adjacent catheters that are unresolvable on CT, air pockets, and high-Z artifacts. The algorithm was tested using 11 HDR treatment plans, including the Syed template, tandem and ovoid applicator, and multi-catheter lung brachytherapy. Digitization error was calculated by comparing manually determined catheter positions to those determined by the algorithm. Results: The digitization error was 0.23 mm ± 0.14 mm axially and 0.62 mm ± 0.13 mm longitudinally at the tip. The time of digitization, following initial seed placement, was less than 1 second per catheter. The maximum total time required to digitize all tested applicators was 4 minutes (Syed template with 15 needles). Conclusion: This algorithm successfully digitizes HDR catheters for a variety of applicators with or without CT markers. The minimal axial error demonstrates the accuracy of the algorithm and its insensitivity to image artifacts and challenging catheter positioning. Future work to automatically place initial seed positions would improve the algorithm's speed.

  15. Digital image envelope: method and evaluation

      NASA Astrophysics Data System (ADS)

      Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang

      2003-05-01

      Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines addressing data security (such as the DICOM standard) continue to be published by organizing bodies in healthcare; however, no systematic method has been developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network, and the reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
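
      The envelope construction described above (hash the image, encrypt the header, combine the two, embed, then re-encrypt for transit) can be sketched in outline with hashlib and Fernet. This is an illustrative simplification: the paper embeds the envelope inside the pixel data, whereas here it is simply packaged alongside it:

      ```python
      import base64, hashlib, json
      from cryptography.fernet import Fernet

      key = Fernet.generate_key()          # shared by sending/receiving sites
      f = Fernet(key)

      def send(pixels: bytes, header: dict) -> bytes:
          """Build the envelope (image signature + encrypted header), package
          it with the image, and encrypt the whole package for transit."""
          envelope = {
              "signature": hashlib.sha256(pixels).hexdigest(),
              "enc_header": f.encrypt(json.dumps(header).encode()).decode(),
          }
          package = {"envelope": envelope,
                     "pixels": base64.b64encode(pixels).decode()}
          return f.encrypt(json.dumps(package).encode())

      def receive(blob: bytes) -> bool:
          """Recompute the image signature after transit; equality means the
          pixel data was not altered on the way."""
          package = json.loads(f.decrypt(blob))
          pixels = base64.b64decode(package["pixels"])
          return (hashlib.sha256(pixels).hexdigest()
                  == package["envelope"]["signature"])

      print(receive(send(b"\x00\x01raw pixel data",
                         {"PatientID": "anon-001"})))  # True
      ```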

  16. HDL-level automated watermarking of IP cores

      NASA Astrophysics Data System (ADS)

      Castillo, E.; Meyer-Baese, U.; Parrilla, L.; García, A.; Lloris, A.

      2008-04-01

      This paper presents significant improvements to our previous watermarking technique for Intellectual Property Protection (IPP) of IP cores. The technique relies on hosting the bits of a digital signature at the HDL design level using resources included within the original system. Thus, any attack trying to change or remove the digital signature will damage the design. The technique also includes a procedure for secure signature extraction requiring minimal modifications to the system. The new advances increase the applicability of this watermarking technique to any design, not only those including look-up tables, and provide an automatic tool for signature hosting. Synthesis results show that applying the proposed watermarking strategy results in negligible degradation of system performance and very low area penalties, and that the automated tool, in addition to easing signature hosting, leads to reduced area penalties.

  17. Assessment of Gamma-Ray-Spectra Analysis Method Utilizing the Fireworks Algorithm for Various Error Measures

      DOE PAGES

      Alamaniotis, Miltiadis; Tsoukalas, Lefteri H.

      2018-01-01

      The analysis of measured data plays a significant role in enhancing nuclear nonproliferation, mainly by inferring the presence of patterns associated with special nuclear materials. Among the various types of measurements, gamma-ray spectra are the most widely utilized data in nonproliferation applications. In this paper, a method that employs the fireworks algorithm (FWA) to analyze gamma-ray spectra and detect gamma signatures is presented. In particular, FWA is utilized to fit a set of known signatures to a measured spectrum by optimizing an objective function, where non-zero coefficients indicate the detected signatures. FWA was tested on a set of experimentally obtained measurements, optimizing various objective functions (MSE, RMSE, Theil-2, MAE, MAPE, MAP), with results exhibiting its potential to provide highly accurate and precise signature detection. Furthermore, FWA was benchmarked against genetic algorithms and multiple linear regression, showing its superiority over those algorithms regarding precision with respect to the MAE, MAPE, and MAP measures.
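
      The underlying fitting problem, expressing a measured spectrum as a non-negative combination of known signatures, has a classical baseline in non-negative least squares. A sketch with scipy; the signature library and measurement are synthetic stand-ins, not the paper's data:

      ```python
      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(5)
      n_channels, n_signatures = 512, 6

      # Columns of S are known gamma-ray signatures; b is the measured spectrum.
      S = rng.random((n_channels, n_signatures))
      true_coeffs = np.array([0.0, 2.5, 0.0, 1.0, 0.0, 0.0])
      b = S @ true_coeffs + 0.01 * rng.random(n_channels)

      coeffs, residual = nnls(S, b)       # non-negative least-squares fit
      detected = np.flatnonzero(coeffs > 0.1)
      print("detected signature indices:", detected)   # expect [1, 3]
      ```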

    • Long-term verifiability of the electronic healthcare records' authenticity.

      PubMed

      Lekkas, Dimitrios; Gritzalis, Dimitris

      2007-01-01

      To investigate whether the long-term preservation of the authenticity of electronic healthcare records (EHR) is possible. To propose a mechanism that enables the secure validation of an EHR for long periods, far beyond the lifespan of a digital signature and at least as long as the lifetime of a patient. The study is based on the fact that although the attributes of data authenticity, i.e. integrity and origin verifiability, can be preserved by digital signatures, the necessary period for the retention of EHRs is far beyond the lifespan of a simple digital signature. It is identified that the lifespan of signed data is restricted by the validity period of the relevant keys and the digital certificates, by the future unavailability of signature-verification data, and by suppression of trust relationships. In this paper, the notarization paradigm is exploited, and a mechanism for cumulative notarization of signed EHR is proposed. The proposed mechanism implements a successive trust transition towards new entities, modern technologies, and refreshed data, eliminating any dependency of the relying party on ceased entities, obsolete data, or weak old technologies. The mechanism also exhibits strength against various threat scenarios. A future relying party will have to trust only the fresh technology and information provided by the last notary, in order to verify the authenticity of an old signed EHR. A Cumulatively Notarized Signature is strong even in the case of the compromise of a notary in the chain.
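
      A minimal sketch of the cumulative notarization idea, assuming Ed25519 keys from the `cryptography` library; each successive notary signs the record together with the entire previous signature chain, so a verifier only needs to trust the most recent notary. The data structures and timestamping are illustrative, not the authors' protocol.

        import time
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        def notarize(record: bytes, chain: list, notary_key: Ed25519PrivateKey) -> list:
            # Sign the record concatenated with every earlier signature, so the
            # fresh signature re-protects the whole history with current technology.
            history = b"".join(stamp + sig for _, stamp, sig in chain)
            stamp = str(time.time()).encode()
            sig = notary_key.sign(record + history + stamp)
            return chain + [(notary_key.public_key(), stamp, sig)]

        def verify_latest(record: bytes, chain: list) -> None:
            # A future relying party checks only the freshest notary's signature,
            # which covers the record and all earlier (possibly obsolete) signatures.
            pub, stamp, sig = chain[-1]
            history = b"".join(s + g for _, s, g in chain[:-1])
            pub.verify(sig, record + history + stamp)  # raises InvalidSignature on failure

        # Usage: the original signer starts the chain; later notaries extend it
        # before the earlier keys or algorithms expire.
        ehr = b"signed electronic healthcare record"
        chain = notarize(ehr, [], Ed25519PrivateKey.generate())     # original signature
        chain = notarize(ehr, chain, Ed25519PrivateKey.generate())  # notary, years later
        verify_latest(ehr, chain)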

    • Digital signal processing algorithms for automatic voice recognition

      NASA Technical Reports Server (NTRS)

      Botros, Nazeih M.

      1987-01-01

      Current digital signal analysis algorithms implemented in automatic voice recognition are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on digital signal analysis rather than linguistic analysis of the speech signal. Several digital signal processing algorithms are available for voice recognition, among them Linear Predictive Coding (LPC), short-time Fourier analysis, and cepstrum analysis. Of these, LPC is the most widely used: it has a short execution time and does not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency domain algorithms with fewer assumptions, but they are not widely implemented or investigated. With recent advances in digital technology, namely signal processors, these two frequency domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real-time, microprocessor-based recognition algorithms.
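
      As an illustration of the LPC analysis mentioned above, here is a minimal NumPy sketch of the autocorrelation method with Levinson-Durbin recursion; the frame length, model order, and synthetic test signal are arbitrary choices, not values from the report.

        import numpy as np

        def lpc(frame, order):
            """Linear Predictive Coding via autocorrelation + Levinson-Durbin."""
            n = len(frame)
            # Autocorrelation lags r[0..order] of the (windowed) speech frame.
            r = np.correlate(frame, frame, mode="full")[n - 1:n + order]
            a = np.zeros(order + 1)
            a[0] = 1.0
            err = r[0]
            for i in range(1, order + 1):
                # Reflection coefficient from the current prediction residual.
                acc = r[i] + np.dot(a[1:i], r[1:i][::-1])
                k = -acc / err
                a[1:i + 1] = a[1:i + 1] + k * a[:i][::-1]  # coefficient update
                err *= (1.0 - k * k)                        # residual energy update
            return a, err

        # Toy usage: 10th-order LPC of a 30 ms frame of a vowel-like signal.
        fs = 8000
        t = np.arange(int(0.03 * fs)) / fs
        frame = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 400 * t)
        coeffs, residual = lpc(frame * np.hamming(len(frame)), order=10)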

    • Online signature recognition using principal component analysis and artificial neural network

      NASA Astrophysics Data System (ADS)

      Hwang, Seung-Jun; Park, Seung-Je; Baek, Joong-Hwan

      2016-12-01

      In this paper, we propose an algorithm for on-line signature recognition using the fingertip point traced in the air, from depth images acquired by Kinect. We extract 10 statistical features from each of the X, Y, and Z axes, which are invariant to shifting and scaling of the signature trajectories in three-dimensional space. An artificial neural network is adopted to solve the complex signature classification problem. The 30-dimensional features are converted into 10 principal components using principal component analysis, preserving 99.02% of the total variance. We implement the proposed algorithm and test it on actual on-line signatures. In experiments, we verify that the proposed method successfully classifies 15 different on-line signatures. Experimental results show a recognition rate of 98.47% when using only 10 feature vectors.
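
      A minimal scikit-learn sketch of the PCA-plus-neural-network pipeline described above; the array shapes and network size are assumptions for illustration, and the 30-dimensional feature extraction from Kinect trajectories is replaced by placeholder random data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline

        # Placeholder data: 15 signers x 20 samples, 30 statistical features each
        # (10 per axis); real features would come from the Kinect trajectories.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 30)) + np.repeat(np.arange(15), 20)[:, None] * 0.5
        y = np.repeat(np.arange(15), 20)

        # 30-D features -> 10 principal components -> neural network classifier.
        model = make_pipeline(PCA(n_components=10),
                              MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000))
        model.fit(X, y)
        print("training accuracy:", model.score(X, y))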

  1. Assuring image authenticity within a data grid using lossless digital signature embedding and a HIPAA-compliant auditing system

    NASA Astrophysics Data System (ADS)

    Lee, Jasper C.; Ma, Kevin C.; Liu, Brent J.

    2008-03-01

    A Data Grid for medical images has been developed at the Image Processing and Informatics Laboratory, USC, to provide distribution and fault-tolerant storage of medical imaging studies across Internet2 and the public domain. Although back-up policies and grid certificates guarantee privacy and authenticity of grid access points, there is still no method to guarantee that sensitive DICOM images have not been altered or corrupted during transmission across a public domain. This paper takes steps toward achieving full image transfer security within the Data Grid by utilizing DICOM image authentication and a HIPAA-compliant auditing system. The 3-D lossless digital signature embedding procedure, also developed at the IPILab, involves a private 64-byte signature that is embedded into each original DICOM image volume; on the receiving end the signature can be extracted and verified following the DICOM transmission. The HIPAA-Compliant Auditing System (H-CAS) is required to monitor embedding and verification events, and allows monitoring of other grid activity as well. The H-CAS federates the logs of transmission and authentication events at each grid access point and stores them in a HIPAA-compliant database. The auditing toolkit is installed at the local grid access point and utilizes Syslog [1], a client-server standard for log messaging over an IP network, to send messages to the centralized H-CAS database. By integrating digital image signatures and centralized logging capabilities, DICOM image integrity within the Medical Imaging and Informatics Data Grid can be monitored and guaranteed without any loss of image quality.
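
    To make the embed-then-verify flow concrete, here is a toy NumPy sketch that hides a 64-byte payload in the least significant bits of an image volume and checks it on receipt. Plain LSB substitution is used purely for illustration; it is not the authors' 3-D lossless embedding method, and the 64-byte signature is faked with a SHA-512 digest standing in for a real digital signature.

      import hashlib
      import numpy as np

      def embed_lsb(volume: np.ndarray, payload: bytes) -> np.ndarray:
          # Write payload bits into the least significant bit of the first voxels.
          bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
          out = volume.copy().ravel()
          out[:bits.size] = (out[:bits.size] & ~np.uint8(1)) | bits
          return out.reshape(volume.shape)

      def extract_lsb(volume: np.ndarray, n_bytes: int) -> bytes:
          bits = volume.ravel()[:n_bytes * 8] & 1
          return np.packbits(bits).tobytes()

      # Sender: "sign" the volume (digest as a 64-byte stand-in) and embed it.
      vol = np.random.randint(0, 256, size=(4, 64, 64), dtype=np.uint8)
      sig = hashlib.sha512(vol.tobytes()).digest()  # 64 bytes
      sent = embed_lsb(vol, sig)

      # Receiver: extract the signature. A real lossless scheme would verify
      # against the image region left untouched (or restored) by the embedding.
      recovered = extract_lsb(sent, 64)
      assert recovered == sig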

  2. Machine learning algorithm accurately detects fMRI signature of vulnerability to major depression.

    PubMed

    Sato, João R; Moll, Jorge; Green, Sophie; Deakin, John F W; Thomaz, Carlos E; Zahn, Roland

    2015-08-30

    Standard functional magnetic resonance imaging (fMRI) analyses cannot assess the potential of a neuroimaging signature as a biomarker to predict individual vulnerability to major depression (MD). Here, we use machine learning for the first time to address this question. Using a recently identified neural signature of guilt-selective functional disconnection, the classification algorithm was able to distinguish remitted MD from control participants with 78.3% accuracy. This demonstrates the high potential of our fMRI signature as a biomarker of MD vulnerability. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  3. Measurement-device-independent quantum digital signatures

    NASA Astrophysics Data System (ADS)

    Puthoor, Ittoop Vergheese; Amiri, Ryan; Wallden, Petros; Curty, Marcos; Andersson, Erika

    2016-08-01

    Digital signatures play an important role in software distribution, modern communication, and financial transactions, where it is important to detect forgery and tampering. Signatures are a cryptographic technique for validating the authenticity and integrity of messages, software, or digital documents. The security of currently used classical schemes relies on computational assumptions. Quantum digital signatures (QDS), on the other hand, provide information-theoretic security based on the laws of quantum physics. Recent work on QDS [Amiri et al., Phys. Rev. A 93, 032325 (2016); Yin, Fu, and Zeng-Bing, Phys. Rev. A 93, 032316 (2016)] shows that such schemes do not require trusted quantum channels and are unconditionally secure against general coherent attacks. However, in practical QDS, just as in quantum key distribution (QKD), the detectors can be subjected to side-channel attacks, which can make the actual implementations insecure. Motivated by the idea of measurement-device-independent quantum key distribution (MDI-QKD), we present a measurement-device-independent QDS (MDI-QDS) scheme, which is secure against all detector side-channel attacks. Based on the rapid development of practical MDI-QKD, our MDI-QDS protocol could also be experimentally implemented, since it requires a similar experimental setup.

  4. 27 CFR 73.1 - What does this part do?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ....1 Section 73.1 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT OF THE TREASURY (CONTINUED) PROCEDURES AND PRACTICES ELECTRONIC SIGNATURES; ELECTRONIC SUBMISSION... conditions under which we will allow you to: (1) Use electronic signatures or digital signatures executed to...

  5. 41 CFR Appendix C to Chapter 301 - Standard Data Elements for Federal Travel [Traveler Identification

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Method employee used to purchase transportation tickets Method Indicator GTR U.S. Government Transportation Request Central Billing Account A contractor centrally billed account Government Charge Card In.../Date Fields Claimant Signature Traveler's signature, or digital representation. The signature signifies...

  6. Quantum random oracle model for quantum digital signature

    NASA Astrophysics Data System (ADS)

    Shang, Tao; Lei, Qi; Liu, Jianwei

    2016-10-01

    The goal of this work is to provide a general security analysis tool, namely the quantum random oracle (QRO), for facilitating the security analysis of quantum cryptographic protocols, especially protocols based on quantum one-way functions. The QRO is used to model a quantum one-way function, and different queries to the QRO are used to model quantum attacks. A typical application of the quantum one-way function is the quantum digital signature, whose progress has been hampered by the slow pace of experimental realization. Alternatively, we use the QRO model to analyze the provable security of a quantum digital signature scheme and elaborate the analysis procedure. The QRO model differs from the prior quantum-accessible random oracle in that it can output quantum states as public keys and give responses to different queries. This tool can serve as a test bed for the cryptanalysis of more quantum cryptographic protocols based on quantum one-way functions.

  7. Basis for the implementation of digital signature in Argentine's health environment

    NASA Astrophysics Data System (ADS)

    Escobar, P. P.; Formica, M.

    2007-11-01

    The growth of telemedical applications and electronic transactions in health environments is paced by constant technological evolution. This implies a big cultural change in traditional medicine and among users of hospital information systems, whose arrival has been delayed, basically, by the lack of solid laws and a well-defined role-based infrastructure. The use of the digital signature as a means of identification, authentication, confidentiality, and non-repudiation is the most suitable tool for securing electronic transactions and protecting patients' data. The implementation of a Public Key Infrastructure (PKI) in a health environment allows for authentication, encryption, and use of the digital signature for assuring confidentiality and controlling the movement of sensitive information. This work defines the minimum technological, legal, and procedural basis for a successful PKI implementation and establishes the roles for the different actors in the chain of trust in the public health environment of Argentina.

  8. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
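
    A small NumPy sketch of the bootstrap construction at the heart of such simulations: prey signatures are resampled with replacement and mixed according to a known diet to form a pseudo-predator signature. The diet vector, sample sizes, and data shapes are invented for illustration; the paper's actual contribution, choosing the bootstrap sample sizes objectively, is not reproduced here.

      import numpy as np

      def pseudo_predator(prey_sigs, diet, n_boot, rng):
          """Build one pseudo-predator fatty acid signature with known diet.

          prey_sigs: dict prey_type -> (n_samples, n_fatty_acids) signature matrix
          diet:      dict prey_type -> known diet proportion (sums to 1)
          n_boot:    dict prey_type -> bootstrap sample size per prey type
          """
          parts = []
          for prey, prop in diet.items():
              sigs = prey_sigs[prey]
              idx = rng.integers(0, sigs.shape[0], size=n_boot[prey])  # with replacement
              parts.append(prop * sigs[idx].mean(axis=0))              # mean bootstrap signature
          sig = np.sum(parts, axis=0)
          return sig / sig.sum()  # fatty acid signatures are proportions

      # Toy example: two prey types, 8 fatty acids.
      rng = np.random.default_rng(42)
      prey = {"seal": rng.dirichlet(np.ones(8), size=30),
              "fish": rng.dirichlet(np.ones(8), size=25)}
      pp = pseudo_predator(prey, {"seal": 0.7, "fish": 0.3},
                           {"seal": 15, "fish": 15}, rng)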

  9. Pattern classifier for health monitoring of helicopter gearboxes

    NASA Technical Reports Server (NTRS)

    Chin, Hsinyung; Danai, Kourosh; Lewicki, David G.

    1993-01-01

    The application of a newly developed diagnostic method to a helicopter gearbox is demonstrated. This method is a pattern classifier which uses a multi-valued influence matrix (MVIM) as its diagnostic model. The method benefits from a fast learning algorithm, based on error feedback, that enables it to estimate gearbox health from a small set of measurement-fault data. The MVIM method can also assess the diagnosability of the system and the variability of fault signatures as a basis for improving fault signatures. The method was tested on vibration signals reflecting various faults in an OH-58A main rotor transmission gearbox. The vibration signals were digitized and processed by a vibration signal analyzer to enhance and extract various features of the vibration data. The parameters obtained from this analyzer were used to train and test the performance of the MVIM method in both detection and diagnosis. The results indicate that the MVIM method provided excellent detection results when the full range of fault effects on the measurements was included in training, and it had a correct diagnostic rate of 95 percent when the faults were included in training.

  10. New efficient algorithm for recognizing handwritten Hindi digits

    NASA Astrophysics Data System (ADS)

    El-Sonbaty, Yasser; Ismail, Mohammed A.; Karoui, Kamal

    2001-12-01

    In this paper a new algorithm for recognizing handwritten Hindi digits is proposed. The proposed algorithm is based on using topological characteristics combined with statistical properties of the given digits in order to extract a set of features that can be used in the process of digit classification. 10,000 handwritten digits were used in the experiments: 1100 digits for training and another 5500 unseen digits for testing. The recognition rate reached 97.56%, with a substitution rate of 1.822% and a rejection rate of 0.618%.

  11. A novel algorithm for simplification of complex gene classifiers in cancer

    PubMed Central

    Wilson, Raphael A.; Teng, Ling; Bachmeyer, Karen M.; Bissonnette, Mei Lin Z.; Husain, Aliya N.; Parham, David M.; Triche, Timothy J.; Wing, Michele R.; Gastier-Foster, Julie M.; Barr, Frederic G.; Hawkins, Douglas S.; Anderson, James R.; Skapek, Stephen X.; Volchenboum, Samuel L.

    2013-01-01

    The clinical application of complex molecular classifiers as diagnostic or prognostic tools has been limited by the time and cost needed to apply them to patients. Using an existing fifty-gene expression signature known to separate two molecular subtypes of the pediatric cancer rhabdomyosarcoma, we show that an exhaustive iterative search algorithm can distill this complex classifier down to two or three features with equal discrimination. We validated the two-gene signatures using three separate and distinct data sets, including one that uses degraded RNA extracted from formalin-fixed, paraffin-embedded material. Finally, to demonstrate the generalizability of our algorithm, we applied it to a lung cancer data set to find minimal gene signatures that can distinguish survival. Our approach can easily be generalized and coupled to existing technical platforms to facilitate the discovery of simplified signatures that are ready for routine clinical use. PMID:23913937
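
    A minimal sketch of the exhaustive search idea: every two-gene pair drawn from a larger signature is scored by cross-validated discrimination, and the best pairs are kept. scikit-learn and the simple logistic-regression scorer are illustrative stand-ins; the authors' exact scoring rule is not reproduced here.

      from itertools import combinations

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      def best_gene_pairs(X, y, gene_names, top_k=5):
          """Exhaustively score all 2-gene subsets of an expression matrix X."""
          results = []
          for i, j in combinations(range(X.shape[1]), 2):
              acc = cross_val_score(LogisticRegression(max_iter=1000),
                                    X[:, [i, j]], y, cv=5).mean()
              results.append((acc, gene_names[i], gene_names[j]))
          return sorted(results, reverse=True)[:top_k]

      # Toy data: 60 samples x 50 genes, two classes separated by genes 3 and 7.
      rng = np.random.default_rng(1)
      y = np.repeat([0, 1], 30)
      X = rng.normal(size=(60, 50))
      X[y == 1, 3] += 2.0
      X[y == 1, 7] -= 2.0
      print(best_gene_pairs(X, y, [f"g{i}" for i in range(50)]))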

  12. Assessment of Gamma-Ray Spectra Analysis Method Utilizing the Fireworks Algorithm for various Error Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alamaniotis, Miltiadis; Tsoukalas, Lefteri H.

    2018-01-01

    The analysis of measured data, and the inference of whether special nuclear materials are present in them, plays a significant role in enhancing nuclear nonproliferation. Among the various types of measurements, gamma-ray spectra are the most widely used type of data in nonproliferation analysis. In this chapter, a method that employs the fireworks algorithm (FWA) for analyzing gamma-ray spectra, aiming at detecting gamma signatures, is presented. In particular, FWA is utilized to fit a set of known signatures to a measured spectrum by optimizing an objective function, with non-zero coefficients expressing the detected signatures. FWA is tested on a set of experimentally obtained measurements with various objective functions (MSE, RMSE, Theil-2, MAE, MAPE, MAP), with results exhibiting its potential in providing high accuracy and high precision of detected signatures. Furthermore, FWA is benchmarked against genetic algorithms and multiple linear regression, with results exhibiting its superiority over the other tested algorithms with respect to precision for the MAE, MAPE, and MAP measures.

  13. Feasibility study for locating archaeological village sites by satellite remote sensing techniques. [multispectral photography of Alaska

    NASA Technical Reports Server (NTRS)

    Cook, J. P. (Principal Investigator); Stringer, W. J.

    1974-01-01

    The author has identified the following significant results. The objective is to determine the feasibility of detecting large Alaskan archaeological sites by satellite remote sensing techniques and mapping such sites. The approach used is to develop digital multispectral signatures of dominant surface features including vegetation, exposed soils and rock, hydrological patterns and known archaeological sites. ERTS-1 scenes are then printed out digitally in a map-like array with a letter reflecting the most appropriate classification representing each pixel. Preliminary signatures were developed and tested. It was determined that there was a need to tighten up the archaeological site signature by developing accurate signatures for all naturally-occurring vegetation and surface conditions in the vicinity of the test area. These second generation signatures have been tested by means of computer printouts and classified tape displays on the University of Alaska CDU-200 and by comparison with aerial photography. It has been concluded that the archaeological signatures now in use are as good as can be developed. Plans are to print out signatures for the entire test area and locate on topographic maps the likely locations of archaeological sites within the test area.

  14. Watermarking protocols for authentication and ownership protection based on timestamps and holograms

    NASA Astrophysics Data System (ADS)

    Dittmann, Jana; Steinebach, Martin; Croce Ferri, Lucilla

    2002-04-01

    Digital watermarking has become an accepted technology for enabling multimedia protection schemes. One problem here is the security of these schemes. Without a suitable framework, watermarks can be replaced and manipulated. We discuss different protocols providing security against rightful-ownership attacks and other fraud attempts. We compare the characteristics of existing protocols for different media, such as direct embedding or seed-based approaches, and the required attributes of the watermarking technology, such as robustness or payload. We introduce two new media-independent protocol schemes for rightful ownership authentication. With the first scheme we ensure the security of digital watermarks used for ownership protection with a combination of two watermarks: a first watermark from the copyright holder and a second watermark from a Trusted Third Party (TTP). It is based on hologram embedding, and the watermark consists of, for example, a company logo. As an example we use digital images and specify the properties of the embedded additional security information. We identify the components necessary for the security protocol, such as timestamps, PKI, and cryptographic algorithms. The second scheme is used for authentication. It is designed for invertible watermarking applications which require high data integrity. We combine digital signature schemes and digital watermarking to provide publicly verifiable integrity. The original data can only be reproduced with a secret key. Both approaches provide solutions for copyright and authentication watermarking and are introduced for image data, but they can easily be adopted for video and audio data as well.

  15. The Integrity of Digital Information: Mechanics and Definitional Issues.

    ERIC Educational Resources Information Center

    Lynch, Clifford A.

    1994-01-01

    Considers issues regarding the migration of a system of literature into electronic formats. Highlights include integrity in an information distribution system; digest technology; tracings that permit detection of copied digital objects; verifying sources; digital signature technology and cryptography; electronic publishing; and intellectual…

  16. Detecting and visualizing weak signatures in hyperspectral data

    NASA Astrophysics Data System (ADS)

    MacPherson, Duncan James

    This thesis evaluates existing techniques for detecting weak spectral signatures from remotely sensed hyperspectral data. Algorithms are presented that successfully detect hard-to-find 'mystery' signatures in unknown cluttered backgrounds. The term 'mystery' is used to describe a scenario where the spectral target and background endmembers are unknown. Sub-Pixel analysis and background suppression are used to find deeply embedded signatures which can be less than 10% of the total signal strength. Existing 'mystery target' detection algorithms are derived and compared. Several techniques are shown to be superior both visually and quantitatively. Detection performance is evaluated using confidence metrics that are developed. A multiple algorithm approach is shown to improve detection confidence significantly. Although the research focuses on remote sensing applications, the algorithms presented can be applied to a wide variety of diverse fields such as medicine, law enforcement, manufacturing, earth science, food production, and astrophysics. The algorithms are shown to be general and can be applied to both the reflective and emissive parts of the electromagnetic spectrum. The application scope is a broad one and the final results open new opportunities for many specific applications including: land mine detection, pollution and hazardous waste detection, crop abundance calculations, volcanic activity monitoring, detecting diseases in food, automobile or airplane target recognition, cancer detection, mining operations, extracting galactic gas emissions, etc.
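
    One standard way to suppress an unknown cluttered background when the target spectrum is known is the matched filter, sketched below in NumPy. It is given here as a representative weak-signature detector, not as one of the thesis's specific 'mystery target' algorithms (which assume the target spectrum is also unknown); the toy cube and target are invented.

      import numpy as np

      def matched_filter_scores(cube, target):
          """Matched filter detection for a hyperspectral image.

          cube:   (rows, cols, bands) hyperspectral data
          target: (bands,) known target spectrum
          Returns a (rows, cols) detection score map (pure target scores ~1).
          """
          X = cube.reshape(-1, cube.shape[-1]).astype(float)
          mu = X.mean(axis=0)
          cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized
          w = np.linalg.solve(cov, target - mu)
          w /= (target - mu) @ w            # normalize so a pure target scores 1
          scores = (X - mu) @ w
          return scores.reshape(cube.shape[:2])

      # Toy cube with a sub-pixel target (10% abundance) implanted at (5, 5).
      rng = np.random.default_rng(0)
      cube = rng.normal(size=(32, 32, 20))
      t = np.linspace(0.0, 1.0, 20) * 50
      cube[5, 5] = 0.9 * cube[5, 5] + 0.1 * t
      print(matched_filter_scores(cube, t)[5, 5])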

  17. Handwritten digits recognition based on immune network

    NASA Astrophysics Data System (ADS)

    Li, Yangyang; Wu, Yunhui; Jiao, Lc; Wu, Jianshe

    2011-11-01

    With the development of society, handwritten digit recognition techniques have been widely applied in production and daily life, yet the underlying problems remain difficult in the field of pattern recognition. In this paper, a new method is presented for handwritten digit recognition. The digit samples are first preprocessed and their features extracted. Based on these features, a novel immune network classification algorithm is designed and applied to handwritten digit recognition. The proposed algorithm builds on Jerne's immune network model for feature selection and the KNN method for classification; its characteristic is a novel network with parallel computing and learning. The performance of the proposed method is evaluated on the MNIST handwritten digit dataset and compared with other recognition algorithms: KNN, ANN, and SVM. The results show that the novel classification algorithm based on an immune network gives promising performance and stable behavior for handwritten digit recognition.

  18. Quantum key management

    DOEpatents

    Hughes, Richard John; Thrasher, James Thomas; Nordholt, Jane Elizabeth

    2016-11-29

    Innovations for quantum key management harness quantum communications to form a cryptography system within a public key infrastructure framework. In example implementations, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a Merkle signature scheme (using Winternitz one-time digital signatures or other one-time digital signatures, and Merkle hash trees) to constitute a cryptography system. More generally, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a hash-based signature scheme. This provides a secure way to identify, authenticate, verify, and exchange secret cryptographic keys. Features of the quantum key management innovations further include secure enrollment of users with a registration authority, as well as credential checking and revocation with a certificate authority, where the registration authority and/or certificate authority can be part of the same system as a trusted authority for quantum key distribution.
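
    To give a flavor of the hash-based one-time signatures mentioned above, here is a minimal Lamport one-time signature sketch using SHA-256. Winternitz signatures and the Merkle tree that aggregates many one-time keys are omitted, and this toy is illustrative rather than the patented system's construction.

      import hashlib
      import secrets

      H = lambda b: hashlib.sha256(b).digest()

      def keygen():
          # Two secret 32-byte values per message-digest bit; the public key
          # is their hashes. Each key pair must be used to sign only once.
          sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
          pk = [[H(s0), H(s1)] for s0, s1 in sk]
          return sk, pk

      def sign(message: bytes, sk):
          # Reveal, for each digest bit, the secret matching that bit.
          digest = H(message)
          bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
          return [sk[i][b] for i, b in enumerate(bits)]

      def verify(message: bytes, sig, pk) -> bool:
          digest = H(message)
          bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
          return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits)))

      sk, pk = keygen()
      sig = sign(b"quantum-distributed key id 42", sk)
      assert verify(b"quantum-distributed key id 42", sig, pk)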

  19. Engineering analysis of LANDSAT 1 data for Southeast Asian agriculture

    NASA Technical Reports Server (NTRS)

    Mcnair, A. J.; Heydt, H. L.; Liang, T.; Levine, G. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. LANDSAT spatial resolution was estimated to be adequate, but barely so, for the purpose of detailed assessment of rice or site status. This was due to the spatially fine-grained, heterogeneous nature of most rice areas. Use of two spectral bands of digital data (MSS 5 and MSS 6 or 7) appeared to be adequate for site recognition and gross site status assessment. Spectral/temporal signatures were found to be more powerful than spectral signatures alone and virtually essential for most analyses of rice growth and rice sites in the Philippine environment. Two-band, two-date signatures were estimated to be adequate for most purposes, although good results were achieved using one-band two- or four-date signatures. A radiometric resolution of 64 levels in each band was found adequate for the analyses of LANDSAT digital data for site recognition and gross site or rice growth assessment.

  20. Variability of Grip Kinetics during Adult Signature Writing

    PubMed Central

    Ghali, Bassma; Thalanki Anantha, Nayanashri; Chan, Jennifer; Chau, Tom

    2013-01-01

    Grip kinetics and their variation are emerging as important considerations in the clinical assessment of handwriting pathologies, fine motor rehabilitation, biometrics, forensics and ergonomic pen design. This study evaluated the intra- and inter-participant variability of grip shape kinetics in adults during signature writing. Twenty (20) adult participants wrote on a digitizing tablet using an instrumented pen that measured the forces exerted on its barrel. Signature samples were collected over 10 days, 3 times a day, to capture temporal variations in grip shape kinetics. A kinetic topography (i.e., grip shape image) was derived per signature by time-averaging the measured force at each of 32 locations around the pen barrel. The normalized cross correlations (NCC) of grip shape images were calculated within- and between-participants. Several classification algorithms were implemented to gauge the error rate of participant discrimination based on grip shape kinetics. Four different grip shapes emerged and several participants made grip adjustments (change in grip shape or grip height) or rotated the pen during writing. Nonetheless, intra-participant variation in grip kinetics was generally much smaller than inter-participant force variations. Using the entire grip shape images as a 32-dimensional input feature vector, a K-nearest neighbor classifier achieved an error rate of 1.2±0.4% in discriminating among participants. These results indicate that writers had unique grip shape kinetics that were repeatable over time but distinct from those of other participants. The topographic analysis of grip kinetics may inform the development of personalized interventions or customizable grips in clinical and industrial applications, respectively. PMID:23658812

  1. Variability of grip kinetics during adult signature writing.

    PubMed

    Ghali, Bassma; Thalanki Anantha, Nayanashri; Chan, Jennifer; Chau, Tom

    2013-01-01

    Grip kinetics and their variation are emerging as important considerations in the clinical assessment of handwriting pathologies, fine motor rehabilitation, biometrics, forensics and ergonomic pen design. This study evaluated the intra- and inter-participant variability of grip shape kinetics in adults during signature writing. Twenty (20) adult participants wrote on a digitizing tablet using an instrumented pen that measured the forces exerted on its barrel. Signature samples were collected over 10 days, 3 times a day, to capture temporal variations in grip shape kinetics. A kinetic topography (i.e., grip shape image) was derived per signature by time-averaging the measured force at each of 32 locations around the pen barrel. The normalized cross correlations (NCC) of grip shape images were calculated within- and between-participants. Several classification algorithms were implemented to gauge the error rate of participant discrimination based on grip shape kinetics. Four different grip shapes emerged and several participants made grip adjustments (change in grip shape or grip height) or rotated the pen during writing. Nonetheless, intra-participant variation in grip kinetics was generally much smaller than inter-participant force variations. Using the entire grip shape images as a 32-dimensional input feature vector, a K-nearest neighbor classifier achieved an error rate of 1.2±0.4% in discriminating among participants. These results indicate that writers had unique grip shape kinetics that were repeatable over time but distinct from those of other participants. The topographic analysis of grip kinetics may inform the development of personalized interventions or customizable grips in clinical and industrial applications, respectively.

  2. The effects of gray scale image processing on digital mammography interpretation performance.

    PubMed

    Cole, Elodia B; Pisano, Etta D; Zeng, Donglin; Muller, Keith; Aylward, Stephen R; Park, Sungwook; Kuzmiak, Cherie; Koomen, Marcia; Pavic, Dag; Walsh, Ruth; Baker, Jay; Gimenez, Edgardo I; Freimanis, Rita

    2005-05-01

    To determine the effects of three image-processing algorithms on the diagnostic accuracy of digital mammography in comparison with conventional screen-film mammography. A total of 201 cases, consisting of nonprocessed soft-copy versions of digital mammograms acquired from GE, Fischer, and Trex digital mammography systems (1997-1999) and conventional screen-film mammograms of the same patients, were interpreted by nine radiologists. The raw digital data were processed with each of three different image-processing algorithms, creating three presentations: the manufacturer's default (applied and laser printed to film by each of the manufacturers), MUSICA, and PLAHE, presented in soft-copy display. There were three radiologists per presentation. The area under the receiver operating characteristic (ROC) curve for GE digital mass cases was worse than screen-film for all digital presentations. The area under the ROC curve for Trex digital mass cases was better, but only with images processed with the manufacturer's default algorithm. Sensitivity for GE digital mass cases was worse than screen-film for all digital presentations. Specificity for Fischer digital calcification cases was worse than screen-film for images processed with the default and PLAHE algorithms. Specificity for Trex digital calcification cases was worse than screen-film for images processed with MUSICA. Specific image-processing algorithms may be necessary for optimal presentation for interpretation, based on machine and lesion type.

  3. Cognitive state monitoring and the design of adaptive instruction in digital environments: lessons learned from cognitive workload assessment using a passive brain-computer interface approach

    PubMed Central

    Gerjets, Peter; Walter, Carina; Rosenstiel, Wolfgang; Bogdan, Martin; Zander, Thorsten O.

    2014-01-01

    According to Cognitive Load Theory (CLT), one of the crucial factors for successful learning is the type and amount of working-memory load (WML) learners experience while studying instructional materials. Optimal learning conditions are characterized by providing challenges for learners without inducing cognitive over- or underload. Thus, presenting instruction in a way that WML is constantly held within an optimal range with regard to learners' working-memory capacity might be a good method to provide these optimal conditions. The current paper elaborates how digital learning environments, which achieve this goal can be developed by combining approaches from Cognitive Psychology, Neuroscience, and Computer Science. One of the biggest obstacles that needs to be overcome is the lack of an unobtrusive method of continuously assessing learners' WML in real-time. We propose to solve this problem by applying passive Brain-Computer Interface (BCI) approaches to realistic learning scenarios in digital environments. In this paper we discuss the methodological and theoretical prospects and pitfalls of this approach based on results from the literature and from our own research. We present a strategy on how several inherent challenges of applying BCIs to WML and learning can be met by refining the psychological constructs behind WML, by exploring their neural signatures, by using these insights for sophisticated task designs, and by optimizing algorithms for analyzing electroencephalography (EEG) data. Based on this strategy we applied machine-learning algorithms for cross-task classifications of different levels of WML to tasks that involve studying realistic instructional materials. We obtained very promising results that yield several recommendations for future work. PMID:25538544

  4. Spatial analysis of fluvial terraces in GRASS GIS accessing R functionality

    NASA Astrophysics Data System (ADS)

    Józsa, Edina

    2017-04-01

    Terrace research along the Danube is a major topic of traditional Hungarian geomorphology because of the socio-economic role of terrace surfaces and their importance in paleo-environmental reconstructions. Semi-automated mapping of fluvial landforms from a coherent digital elevation dataset allows objective analysis of hydrogeomorphic characteristics with low time and cost requirements, and new results obtained with unified GIS-based algorithms can be integrated with previous findings regarding landscape evolution. The complementary functionality of GRASS GIS and R provides the possibility to develop a flexible terrain-analysis tool for the delineation and quantifiable analysis of terrace remnants. Using R as an intermediate analytical environment and visualisation tool gives great added value to the algorithm, while GRASS GIS is capable of handling the large digital elevation datasets and performing the demanding computations needed to prepare the necessary raster derivatives (Bivand, R.S. et al. 2008). The proposed terrace mapping algorithm is based on the work of Demoulin, A. et al. (2007), but it is further improved in the form of a GRASS GIS script tool accessing R functionality. In the first step the hydrogeomorphic signatures of the given study site are explored and the area is divided along clearly recognizable structural-morphological boundaries. The algorithm then cuts up the subregions into parallel sections in the flow direction and determines cells potentially belonging to terrace surfaces based on local slope characteristics and a minimum area size threshold. As a result, an output report is created that contains a histogram of altitudes, a swath profile of the landscape, scatter plots relating the relative elevations and slope values in the analysed sections, and a final plot showing the longitudinal profile of the river with the determined height ranges of terrace levels. The algorithm also produces a raster map of extracted terrace remnants. From this dataset it is possible to interpolate a new digital elevation model approximating the former terraced valley surface using the Ordinary Kriging method (Troiani, F. and Della Seta, M. 2011). The applicability of the algorithm was tested on the northern foreland of the Gerecse Mountains, an antecedent valley section of the Danube, with terrace remnants expected in 6 to 8 altitude ranges. Methodological issues arising from determining the optimal threshold values were explored using an artificial hillslope model, while the terrace profiles and the terrace-top surface raster generated from the digital elevation model were validated against previous findings of traditional geomorphological surveys. This research was supported by the Human Capacities Grant Management Office and the Hungarian Ministry of Human Capacities in the framework of the NTP-NFTÖ-16 project. References: Bivand, R.S. et al. (2008). Applied Spatial Data Analysis with R. New York: Springer. 378 p. Demoulin, A. et al. (2007). An automated method to extract fluvial terraces from digital elevation models: The Vesdre valley, a case study in eastern Belgium. Geomorphology 91 (1-2): 51-64. Troiani, F. and Della Seta, M. (2011). Geomorphological response of fluvial and coastal terraces to Quaternary tectonics and climate as revealed by geostatistical topographic analysis. Earth Surface Processes and Landforms 36: 1193-1208.

  5. CrossLink: a novel method for cross-condition classification of cancer subtypes.

    PubMed

    Ma, Chifeng; Sastry, Konduru S; Flore, Mario; Gehani, Salah; Al-Bozom, Issam; Feng, Yusheng; Serpedin, Erchin; Chouchane, Lotfi; Chen, Yidong; Huang, Yufei

    2016-08-22

    We considered the prediction of cancer classes (e.g. subtypes) using patient gene expression profiles that contain both systematic and condition-specific biases when compared with the training reference dataset. Conventional normalization-based approaches cannot guarantee that the gene signatures in the reference and prediction datasets always have the same distribution under all conditions, because the class-specific gene signatures change with the condition; a classifier trained under one condition therefore works well under that condition but not under another. To address this limitation of current normalization approaches, we propose a novel algorithm called CrossLink (CL). CL recognizes that there is no universal, condition-independent normalization mapping of signatures. Instead, it exploits the fact that a signature is unique to its associated class under any condition and thus employs an unsupervised clustering algorithm to discover this unique signature. We assessed the performance of CL for cross-condition prediction of PAM50 subtypes of breast cancer using a simulated dataset modeled after TCGA BRCA tumor samples with a cross-validation scheme, as well as datasets with known and unknown PAM50 classifications. CL achieved a prediction accuracy of >73%, the highest among the methods we evaluated. We also applied the algorithm to a set of breast cancer tumors derived from an Arab population to assign a PAM50 classification to each tumor based on its gene expression profile. In summary, we propose CrossLink, a novel algorithm for cross-condition prediction of cancer classes; in all test datasets, CL showed robust and consistent improvement in prediction performance over other state-of-the-art normalization and classification algorithms.

  6. A data-hiding technique with authentication, integration, and confidentiality for electronic patient records.

    PubMed

    Chao, Hui-Mei; Hsu, Chin-Ming; Miaou, Shaou-Gang

    2002-03-01

    A data-hiding technique called the "bipolar multiple-number base" was developed to provide authentication, integration, and confidentiality for an electronic patient record (EPR) transmitted among hospitals through the Internet. The proposed technique is capable of hiding EPR-related data, such as diagnostic reports, electrocardiograms, and digital signatures from doctors or a hospital, within a mark image. The mark image could be the mark of a hospital, used to identify the origin of an EPR, and the digital signatures from doctors and the hospital can be applied for EPR authentication. Thus, different types of medical data can be integrated into the same mark image. Confidentiality is ultimately achieved by decrypting the EPR-related data and digital signatures with an exact copy of the original mark image. The experimental results validate the integrity and invisibility of the hidden EPR-related data. This newly developed technique allows all of the hidden data to be separated and restored perfectly by authorized users.

  7. Statistical Methods for Passive Vehicle Classification in Urban Traffic Surveillance and Control

    DOT National Transportation Integrated Search

    1980-01-01

    A statistical approach to passive vehicle classification using the phase-shift signature from electromagnetic presence-type vehicle detectors is developed with digitized samples of the analog phase-shift signature, the problem of classifying vehicle ...

  8. A Fast lattice-based polynomial digital signature system for m-commerce

    NASA Astrophysics Data System (ADS)

    Wei, Xinzhou; Leung, Lin; Anshel, Michael

    2003-01-01

    Privacy and data integrity are not guaranteed in current wireless communications due to a security hole inside the Wireless Application Protocol (WAP) version 1.2 gateway. One remedy is to provide end-to-end security in m-commerce by applying application-level security on top of current WAP 1.2. Traditional security technologies such as RSA and ECC, applied on an enterprise's server, are not practical for wireless devices because these devices have relatively weak computational power and limited memory compared with servers. In this paper, we develop a lattice-based polynomial digital signature system based on NTRU's Polynomial Authentication and Signature Scheme (PASS), which makes it feasible to apply high-level security on both the server and wireless device sides.

  9. MIDAS, prototype Multivariate Interactive Digital Analysis System for large area earth resources surveys. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Christenson, D.; Gordon, M.; Kistler, R.; Kriegler, F.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1977-01-01

    A third-generation, fast, low-cost, multispectral recognition system (MIDAS), able to keep pace with the large quantity and high rates of data acquisition from large regions with present and projected sensors, is described. The program can process a complete ERTS frame in forty seconds and provide a color map of sixteen constituent categories in a few minutes. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator, thus obtaining large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in the overall program are described. The system contains a midi-computer to control the various high-speed processing elements in the data path, a preprocessor to condition data, and a classifier which implements an all-digital prototype multivariate Gaussian maximum likelihood or Bayesian decision algorithm. Sufficient software was developed to perform signature extraction, control the preprocessor, compute classifier coefficients, control the classifier operation, operate the color display and printer, and diagnose operation.

  10. Automated thematic mapping and change detection of ERTS-A images. [digital interpretation of Arizona imagery

    NASA Technical Reports Server (NTRS)

    Gramenopoulos, N. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. For the recognition of terrain types, spatial signatures are developed from the diffraction patterns of small areas of ERTS-1 images. This knowledge is exploited for the measurement of a small number of meaningful spatial features from the digital Fourier transforms of ERTS-1 image cells containing 32 x 32 picture elements. Using these spatial features and a heuristic algorithm, the terrain types in the vicinity of Phoenix, Arizona were recognized by the computer with high accuracy. When the spatial features were combined with spectral features and the maximum likelihood criterion was used, the recognition accuracy of terrain types increased substantially. It was determined that the recognition accuracy with the maximum likelihood criterion depends on the statistics of the feature vectors; nonlinear transformations of the feature vectors are required so that the terrain class statistics become approximately Gaussian. It was also determined that, for a given geographic area, the statistics of the classes remain invariant over a period of a month but vary substantially between seasons.
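
    The maximum-likelihood step described above has a compact form: each feature vector is assigned to the class whose multivariate Gaussian gives it the highest likelihood. A minimal NumPy sketch follows, with invented class statistics standing in for trained terrain-class parameters.

      import numpy as np

      def gaussian_ml_classify(X, means, covs):
          """Assign each feature vector to the class with maximum Gaussian likelihood.

          X:     (n_samples, n_features) feature vectors (e.g., spectral + spatial)
          means: list of (n_features,) class mean vectors
          covs:  list of (n_features, n_features) class covariance matrices
          """
          log_lik = np.empty((X.shape[0], len(means)))
          for k in range(len(means)):
              d = X - means[k]
              inv = np.linalg.inv(covs[k])
              _, logdet = np.linalg.slogdet(covs[k])
              # log N(x; mu, Sigma), up to a constant shared by all classes
              log_lik[:, k] = -0.5 * (np.einsum("ij,jk,ik->i", d, inv, d) + logdet)
          return log_lik.argmax(axis=1)

      # Toy usage with two invented terrain classes in a 4-D feature space.
      rng = np.random.default_rng(3)
      means = [np.zeros(4), np.full(4, 2.0)]
      covs = [np.eye(4), 0.5 * np.eye(4)]
      X = np.vstack([rng.multivariate_normal(m, c, size=50) for m, c in zip(means, covs)])
      labels = gaussian_ml_classify(X, means, covs)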

  11. Digital Image Sensor-Based Assessment of the Status of Oat (Avena sativa L.) Crops after Frost Damage

    PubMed Central

    Macedo-Cruz, Antonia; Pajares, Gonzalo; Santos, Matilde; Villegas-Romero, Isidro

    2011-01-01

    The aim of this paper is to classify land covered with oat crops and to quantify frost damage on oats while the plants are still in the flowering stage. The images are taken by a CCD-based digital colour camera sensor. Unsupervised classification methods are applied because the plants present different spectral signatures depending on two main factors: illumination and their affected state. The colour space used in this application is CIELab, based on the decomposition of colour into three channels, because it is the closest to human colour perception. The histogram of each channel is successively split into regions by thresholding. The best threshold to apply is automatically obtained as a combination of three thresholding strategies: (a) Otsu's method, (b) the Isodata algorithm, and (c) fuzzy thresholding. The fusion of these automatic thresholding techniques and the design of the classification strategy are among the main findings of the paper, allowing an estimation of the damage and a prediction of oat production. PMID:22163940
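
    A brief scikit-image sketch of the multi-threshold idea on one CIELab channel: Otsu and Isodata thresholds are computed with library calls, a triangle threshold stands in for the paper's fuzzy thresholding (which has no standard library implementation), and the three are fused by taking their median. The channel choice and fusion rule are assumptions, not the published strategy.

      import numpy as np
      from skimage import color, filters

      def fused_threshold(rgb_image):
          """Segment one CIELab channel with a fusion of three thresholds."""
          lab = color.rgb2lab(rgb_image)
          a = lab[..., 1]                         # a* channel (green-red axis)
          t_otsu = filters.threshold_otsu(a)
          t_iso = filters.threshold_isodata(a)
          t_tri = filters.threshold_triangle(a)   # stand-in for fuzzy thresholding
          t = np.median([t_otsu, t_iso, t_tri])   # simple fusion of the three
          return a > t, t

      # Toy usage on a synthetic image: greenish "healthy" vs brownish "damaged".
      rng = np.random.default_rng(0)
      img = np.zeros((64, 64, 3))
      img[:, :32] = [0.2, 0.6, 0.2]               # healthy oats (greenish)
      img[:, 32:] = [0.5, 0.4, 0.2]               # frost-damaged (brownish)
      img += 0.05 * rng.random(img.shape)
      mask, threshold = fused_threshold(np.clip(img, 0, 1))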

  12. Digital image sensor-based assessment of the status of oat (Avena sativa L.) crops after frost damage.

    PubMed

    Macedo-Cruz, Antonia; Pajares, Gonzalo; Santos, Matilde; Villegas-Romero, Isidro

    2011-01-01

    The aim of this paper is to classify land covered with oat crops and to quantify frost damage on oats while the plants are still in the flowering stage. The images are taken by a CCD-based digital colour camera sensor. Unsupervised classification methods are applied because the plants present different spectral signatures depending on two main factors: illumination and their affected state. The colour space used in this application is CIELab, based on the decomposition of colour into three channels, because it is the closest to human colour perception. The histogram of each channel is successively split into regions by thresholding. The best threshold to apply is automatically obtained as a combination of three thresholding strategies: (a) Otsu's method, (b) the Isodata algorithm, and (c) fuzzy thresholding. The fusion of these automatic thresholding techniques and the design of the classification strategy are among the main findings of the paper, allowing an estimation of the damage and a prediction of oat production.

  13. 3D measurement by digital photogrammetry

    NASA Astrophysics Data System (ADS)

    Schneider, Carl T.

    1993-12-01

    Photogrammetry is well known in geodetic surveys as aerial photogrammetry, and in close-range applications such as architectural photogrammetry. Photogrammetric methods and algorithms, combined with digital cameras and digital image processing methods, are now being introduced for industrial applications such as automation and quality control. This paper describes the photogrammetric and digital image processing algorithms and the calibration methods, demonstrated with application examples: a digital photogrammetric workstation as a mobile multi-purpose 3D measuring tool, and a tube measuring system as an example of a single-purpose tool.

  14. Glint-induced false alarm reduction in signature adaptive target detection

    NASA Astrophysics Data System (ADS)

    Crosby, Frank J.

    2002-07-01

    The signal adaptive target detection algorithm developed by Crosby and Riley uses target geometry to discern anomalies in local backgrounds. Detection is not restricted based on specific target signatures. The robustness of the algorithm is limited by an increased false alarm potential. The base algorithm is extended to eliminate one common source of false alarms in a littoral environment. This common source is glint reflected on the surface of water. The spectral and spatial transience of glint prevent straightforward characterization and complicate exclusion. However, the statistical basis of the detection algorithm and its inherent computations allow for glint discernment and the removal of its influence.

  15. Doppler-based motion compensation algorithm for focusing the signature of a rotorcraft.

    PubMed

    Goldman, Geoffrey H

    2013-02-01

    A computationally efficient algorithm was developed and tested to compensate for the effects of motion on the acoustic signature of a rotorcraft. For target signatures with large spectral peaks that vary slowly in amplitude and have near constant frequency, the time-varying Doppler shift can be tracked and then removed from the data. The algorithm can be used to preprocess data for classification, tracking, and nulling algorithms. The algorithm was tested on rotorcraft data. The average instantaneous frequency of the first harmonic of a rotorcraft was tracked with a fixed-lag smoother. Then, state space estimates of the frequency were used to calculate a time warping that removed the effect of a time-varying Doppler shift from the data. The algorithm was evaluated by analyzing the increase in the amplitude of the harmonics in the spectrum of a rotorcraft. The results depended upon the frequency of the harmonics and the processing interval duration. Under good conditions, the results for the fundamental frequency of the target (~11 Hz) almost achieved an estimated upper bound. The results for higher frequency harmonics had larger increases in the amplitude of the peaks, but significantly lower than the estimated upper bounds.
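
    A compact NumPy/SciPy sketch of the two stages described above: track the slowly varying frequency of a strong harmonic from a spectrogram (a simple peak track stands in for the paper's fixed-lag smoother), then resample the signal on a warped time axis so the harmonic's frequency becomes constant. The parameters and the synthetic wobbling harmonic are illustrative.

      import numpy as np
      from scipy.signal import stft

      def remove_doppler(x, fs, f_nominal):
          """Time-warp a signal so a tracked harmonic sits at constant frequency."""
          # 1) Track the harmonic's instantaneous frequency via spectrogram peaks.
          f, t, Z = stft(x, fs=fs, nperseg=1024)
          band = (f > 0.5 * f_nominal) & (f < 1.5 * f_nominal)
          f_track = f[band][np.abs(Z[band]).argmax(axis=0)]  # peak per frame
          # 2) Warped time: tau(t) = integral of f_track / f_nominal dt.
          tn = np.arange(len(x)) / fs
          rate = np.interp(tn, t, f_track) / f_nominal
          tau = np.cumsum(rate) / fs
          # 3) Resample x onto a uniform grid in warped time tau.
          tau_uniform = np.arange(tau[0], tau[-1], 1 / fs)
          return np.interp(tau_uniform, tau, x)

      # Toy usage: an 11 Hz harmonic with a slow sinusoidal Doppler wobble.
      fs = 1000
      tn = np.arange(0, 20, 1 / fs)
      inst_freq = 11 * (1 + 0.03 * np.sin(2 * np.pi * 0.1 * tn))
      x = np.sin(2 * np.pi * np.cumsum(inst_freq) / fs)
      y = remove_doppler(x, fs, f_nominal=11.0)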

  16. Topogrid Derived 10 Meter Resolution Digital Elevation Model of the Shenandoah National Park and Surrounding Region, Virginia

    USGS Publications Warehouse

    Chirico, Peter G.; Tanner, Seth D.

    2004-01-01

    Explanation: The purpose of developing a new 10 m resolution DEM of the Shenandoah National Park region was to more accurately depict the geologic structure, surficial geology, and landforms of the region in preparation for automated landform classification. Previously, only a 30 m resolution DEM was available through the National Elevation Dataset (NED). During production of the Shenandoah 10 m DEM, the Geography Discipline of the USGS completed a revised 10 m DEM to be included in the NED; however, different methodologies were used to produce the two similar DEMs. The ANUDEM algorithm was used to develop the Shenandoah DEM data. This algorithm allows the inclusion of contours, streams, rivers, lake and water body polygons, as well as spot height data, to control the elevation model. A statistical analysis using over 800 National Geodetic Survey (NGS) first- and second-order vertical control points reveals that the Shenandoah 10 m DEM, produced as part of the Appalachian Blue Ridge Landscape project, has a vertical accuracy of ±4.87 meters; the metadata for the 10 m NED data reports a vertical accuracy of ±7 meters. A table listing the NGS control points, the elevation comparison, and the RMSE for the Shenandoah 10 m DEM is provided. The process of automated terrain classification involves developing statistical signatures from the DEM for each type of surficial deposit and landform type. The signature is a measure of several characteristics derived from the elevation data, including slope, aspect, planform curvature, and profile curvature. The quality of the DEM is of critical importance when extracting terrain signatures; the highest possible horizontal and vertical accuracy is required. The more accurate Shenandoah 10 m DEM can now be analyzed and integrated with geologic observations to yield statistical correlations between the two in the development of landform and surficial geology mapping projects.

  17. The optimal digital filters of sine and cosine transforms for geophysical transient electromagnetic method

    NASA Astrophysics Data System (ADS)

    Zhao, Yun-wei; Zhu, Zi-qiang; Lu, Guang-yin; Han, Bo

    2018-03-01

    The sine and cosine transforms implemented with digital filters have been used in transient electromagnetic methods for a few decades. Kong (2007) proposed a method of obtaining filter coefficients, which are computed in the sample domain by a Hankel transform pair. However, the curve shape of the Hankel transform pair changes with a parameter, which is usually set to 1 or 3 in the process of obtaining the digital filter coefficients of the sine and cosine transforms. First, this study investigates the influence of this parameter on the digital filter algorithm for the sine and cosine transforms, based on the digital filter algorithm for the Hankel transform and the relationship between the sine and cosine functions and the ±1/2-order Bessel functions of the first kind. The results show that the selection of the parameter strongly influences the precision of the digital filter algorithm. Second, given the optimal selection of the parameter, it is found that an optimal sampling interval s also exists that achieves the best precision of the digital filter algorithm. Finally, this study proposes four groups of sine and cosine transform digital filter coefficients of different lengths, which may help to develop the digital filter algorithm for the sine and cosine transforms and promote its application.

  18. Identity-Based Verifiably Encrypted Signatures without Random Oracles

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Wu, Qianhong; Qin, Bo

    Fair exchange protocols play an important role in electronic commerce in the case of exchanging digital contracts. Verifiably encrypted signatures provide an optimistic solution to these scenarios with an off-line trusted third party. In this paper, we propose an identity-based verifiably encrypted signature scheme. The scheme is non-interactive in generating verifiably encrypted signatures, and the resulting encrypted signature consists of only four group elements. Based on the computational Diffie-Hellman assumption, our scheme is proven secure without using random oracles. To the best of our knowledge, this is the first identity-based verifiably encrypted signature scheme provably secure in the standard model.

  19. Digital pulse processing for planar TlBr detectors

    NASA Astrophysics Data System (ADS)

    Nakhostin, M.; Hitomi, K.; Ishii, K.; Kikuchi, Y.

    2010-04-01

    We report on a digital pulse processing algorithm for correction of charge trapping in planar TlBr detectors. The algorithm operates on signals digitized at the preamplifier stage; it is very simple and is implemented with little computational effort. Using a digitizer with a sampling rate of 250 MSample/s and 8-bit resolution, an energy resolution of 6.5% is achieved at 511 keV with a 0.7 mm thick detector.

  20. Autonomous proximity operations using machine vision for trajectory control and pose estimation

    NASA Technical Reports Server (NTRS)

    Cleghorn, Timothy F.; Sternberg, Stanley R.

    1991-01-01

    A machine vision algorithm was developed which permits guidance control to be maintained during autonomous proximity operations. At present this algorithm exists as a simulation, running on an 80386-based personal computer and using a ModelMATE CAD package to render the target vehicle. However, the algorithm is sufficiently simple that, following off-line training on a known target vehicle, it should run in real time with existing vision hardware. The basis of the algorithm is a sequence of single-camera images of the target vehicle, upon which radial transforms are performed. Selected points of the resulting radial signatures are fed through a decision tree to determine whether the signature matches the known reference signatures for a particular view of the target. Based upon recognized scenes, the position of the maneuvering vehicle with respect to the target vehicle can be calculated and adjustments made in the former's trajectory. In addition, the pose and spin rates of the target satellite can be estimated using this method.
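
    The sketch below illustrates the radial-transform step under simple assumptions (a grayscale image array and a known target centroid); the flight algorithm would feed selected points of such signatures into its decision tree.

    ```python
    import numpy as np

    def radial_signature(image, center, n_angles=64, n_radii=32):
        # Sample intensity along rays from the target centroid; the resulting
        # 1-D signature is compared against stored reference views.
        h, w = image.shape
        cy, cx = center
        angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
        radii = np.linspace(0.0, min(h, w) / 2.0 - 1.0, n_radii)
        sig = np.empty(n_angles)
        for i, a in enumerate(angles):
            ys = np.clip((cy + radii * np.sin(a)).astype(int), 0, h - 1)
            xs = np.clip((cx + radii * np.cos(a)).astype(int), 0, w - 1)
            sig[i] = image[ys, xs].mean()   # mean intensity along the ray
        return sig
    ```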

  1. Fast reconstruction of off-axis digital holograms based on digital spatial multiplexing.

    PubMed

    Sha, Bei; Liu, Xuan; Ge, Xiao-Lu; Guo, Cheng-Shan

    2014-09-22

    A method for fast reconstruction of off-axis digital holograms based on a digital multiplexing algorithm is proposed. Instead of the existing angular multiplexing (AM), the new method utilizes a spatial multiplexing (SM) algorithm, in which four off-axis holograms recorded in sequence are synthesized into one SM function by multiplying each hologram with a tilted plane wave and then adding them up. In comparison with conventional methods, the SM algorithm reduces the two-dimensional (2-D) Fourier transforms (FTs) of four N*N arrays to a single 2-D FT of one N*N array. Experimental results demonstrate that, using the SM algorithm, the computational efficiency can be improved while the reconstructed wavefronts keep the same quality as those retrieved with the existing AM method. This algorithm may be useful in the design of a fast preview system for dynamic wavefront imaging in digital holography.
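
    A minimal sketch of the SM synthesis described above, with illustrative tilt vectors rather than the experiment's actual carrier frequencies: each hologram is multiplied by a tilted plane wave and the sum is transformed once.

    ```python
    import numpy as np

    def spatially_multiplex(holograms, tilts, pixel_pitch=1.0):
        # Multiply each hologram by a tilted plane wave exp(i(kx*x + ky*y))
        # and sum, so one 2-D FFT separates all spectra in Fourier space.
        n, m = holograms[0].shape
        y, x = np.mgrid[0:n, 0:m] * pixel_pitch
        sm = np.zeros((n, m), dtype=complex)
        for h, (kx, ky) in zip(holograms, tilts):
            sm += h * np.exp(1j * (kx * x + ky * y))
        return np.fft.fftshift(np.fft.fft2(sm))   # one FT for all holograms
    ```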

  2. Importance of correlation between gene expression levels: application to the type I interferon signature in rheumatoid arthritis.

    PubMed

    Reynier, Frédéric; Petit, Fabien; Paye, Malick; Turrel-Davin, Fanny; Imbert, Pierre-Emmanuel; Hot, Arnaud; Mougin, Bruno; Miossec, Pierre

    2011-01-01

    The analysis of gene expression data shows that many genes display similarity in their expression profiles, suggesting some co-regulation. Here, we investigated the co-expression patterns in gene expression data and proposed a correlation-based research method to stratify individuals. We investigated gene expression profiles from the whole blood of rheumatoid arthritis (RA) patients using Affymetrix microarray technology. Co-expressed genes were analyzed by a biclustering method, followed by gene ontology analysis of the relevant biclusters. Taking the type I interferon (IFN) pathway as an example, a classification algorithm was developed from the 102 RA patients and extended to 10 systemic lupus erythematosus (SLE) patients and 100 healthy volunteers to further characterize individuals. We developed a correlation-based algorithm referred to as Classification Algorithm Based on a Biological Signature (CABS), an alternative to other approaches focused specifically on expression levels. This algorithm, applied to the expression of 35 IFN-related genes, showed that the IFN signature presents a heterogeneous expression between RA, SLE and healthy controls, which could reflect the level of global IFN signature activation. Moreover, monitoring of the IFN-related genes during anti-TNF treatment identified changes in type I IFN gene activity induced in RA patients. In conclusion, we have proposed an original method to analyze genes sharing an expression pattern and a biological function, showing that the activation level of a biological signature can be characterized by its overall state of correlation.
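
    As a rough sketch of the correlation-based idea (not the exact CABS statistic, whose definition is given in the paper), the snippet below scores how coherently a signature's genes co-vary in a cohort.

    ```python
    import numpy as np

    def signature_correlation_score(expr):
        # expr: (samples x genes) matrix restricted to the signature genes,
        # e.g. the 35 IFN-related genes discussed above.
        c = np.corrcoef(expr, rowvar=False)        # gene-gene correlations
        upper = c[np.triu_indices_from(c, k=1)]    # unique off-diagonal pairs
        return float(upper.mean())
    ```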

  3. Issues and approaches for electronic document approval and transmittal using digital signatures and text authentication: Prototype documentation

    NASA Astrophysics Data System (ADS)

    Boling, M. E.

    1989-09-01

    Prototypes were assembled pursuant to recommendations made in report K/DSRD-96, Issues and Approaches for Electronic Document Approval and Transmittal Using Digital Signatures and Text Authentication, and to examine and discover the possibilities for integrating available hardware and software to provide cost effective systems for digital signatures and text authentication. These prototypes show that on a LAN, a multitasking, windowed, mouse/keyboard menu-driven interface can be assembled to provide easy and quick access to bit-mapped images of documents, electronic forms and electronic mail messages with a means to sign, encrypt, deliver, receive or retrieve and authenticate text and signatures. In addition they show that some of this same software may be used in a classified environment using host to terminal transactions to accomplish these same operations. Finally, a prototype was developed demonstrating that binary files may be signed electronically and sent by point to point communication and over ARPANET to remote locations where the authenticity of the code and signature may be verified. Related studies on the subject of electronic signatures and text authentication using public key encryption were done within the Department of Energy. These studies include timing studies of public key encryption software and hardware and testing of experimental user-generated host resident software for public key encryption. This software used commercially available command-line source code. These studies are responsive to an initiative within the Office of the Secretary of Defense (OSD) for the protection of unclassified but sensitive data. It is notable that these related studies are all built around the same commercially available public key encryption products from the private sector and that the software selection was made independently by each study group.

  4. All-optical signatures of strong-field QED in the vacuum emission picture

    NASA Astrophysics Data System (ADS)

    Gies, Holger; Karbstein, Felix; Kohlfürst, Christian

    2018-02-01

    We study all-optical signatures of the effective nonlinear couplings among electromagnetic fields in the quantum vacuum, using the collision of two focused high-intensity laser pulses as an example. The experimental signatures of quantum vacuum nonlinearities are encoded in signal photons, whose kinematic and polarization properties differ from the photons constituting the macroscopic laser fields. We implement an efficient numerical algorithm allowing for the theoretical investigation of such signatures in realistic field configurations accessible in experiment. This algorithm is based on a vacuum emission scheme and can readily be adapted to the collision of more laser beams or further involved field configurations. We solve the case of two colliding pulses in full 3 +1 -dimensional spacetime and identify experimental geometries and parameter regimes with improved signal-to-noise ratios.

  5. Signature molecular descriptor : advanced applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Visco, Donald Patrick, Jr.

    In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in high-throughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Also notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area, with serendipity as the mechanism of discovery. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge into a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the molecule which is designed might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest on a given problem, as well as how to go from an output of the algorithm to an actual chemical structure. This report provides details on a technique to describe molecules on a computer, called Signature, as well as the computer-aided molecular design algorithm built around Signature. Two applications of the CAMD algorithm with Signature are provided. The first describes the design of green solvents based on data in the GlaxoSmithKline (GSK) Solvent Selection Guide. The second provides novel non-steroidal glucocorticoid receptor ligands with optimally predicted properties. In addition to using the CAMD algorithm with Signature, it is demonstrated how to employ Signature in a high-throughput screening study. Here, after classifying both active and inactive inhibitors for the protein Factor XIa using Signature, the model developed is used to screen a large, publicly available database called PubChem for the most active compounds.

  6. WE-A-17A-06: Evaluation of An Automatic Interstitial Catheter Digitization Algorithm That Reduces Treatment Planning Time and Provides Means for Adaptive Re-Planning in HDR Brachytherapy of Gynecologic Cancers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dise, J; Liang, X; Lin, L

    Purpose: To evaluate an automatic interstitial catheter digitization algorithm that reduces treatment planning time and provides means for adaptive re-planning in HDR brachytherapy of gynecologic cancers. Methods: The semi-automatic catheter digitization tool utilizes a region growing algorithm in conjunction with a spline model of the catheters. The CT images were first pre-processed to enhance the contrast between the catheters and soft tissue. Several seed locations were selected in each catheter for the region growing algorithm. The spline model of the catheters assisted the region growing by preventing inter-catheter cross-over caused by air or metal artifacts. Source dwell positions from day one CT scans were applied to subsequent CTs and forward calculated using the automatically digitized catheter positions. This method was applied to 10 patients who had received HDR interstitial brachytherapy on an IRB-approved image-guided radiation therapy protocol. The prescribed dose was 18.75 or 20 Gy delivered in 5 fractions, twice daily, over 3 consecutive days. Dosimetric comparisons were made between automatic and manual digitization on day two CTs. Results: The region growing algorithm, assisted by the spline model of the catheters, was able to digitize all catheters. The difference between automatic and manually digitized positions was 0.8±0.3 mm. The digitization time ranged from 34 minutes to 43 minutes, with a mean digitization time of 37 minutes. The bulk of the time was spent on manual selection of initial seed positions and spline parameter adjustments. There was no significant difference in dosimetric parameters between the automatic and manually digitized plans. D90% to the CTV was 91.5±4.4% for the manual digitization versus 91.4±4.4% for the automatic digitization (p=0.56). Conclusion: A region growing algorithm was developed to semi-automatically digitize interstitial catheters in HDR brachytherapy using the Syed-Neblett template. This automatic digitization tool was shown to be accurate compared to manual digitization.
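
    A minimal sketch of the region-growing step, assuming a CT volume as a NumPy array and a single seed per catheter; the clinical tool additionally constrains growth with a spline model of each catheter to prevent cross-over.

    ```python
    import numpy as np
    from collections import deque

    def region_grow(volume, seed, tol=200):
        # 6-connected growth from one seed, accepting voxels whose intensity
        # stays within `tol` of the seed value (catheters are bright in CT).
        grown = np.zeros(volume.shape, dtype=bool)
        ref = float(volume[seed])
        queue = deque([seed])
        while queue:
            z, y, x = queue.popleft()
            if grown[z, y, x] or abs(float(volume[z, y, x]) - ref) > tol:
                continue
            grown[z, y, x] = True
            for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                nz, ny, nx = z + dz, y + dy, x + dx
                if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                        and 0 <= nx < volume.shape[2] and not grown[nz, ny, nx]):
                    queue.append((nz, ny, nx))
        return grown
    ```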

  7. Realization and optimization of AES algorithm on the TMS320DM6446 based on DaVinci technology

    NASA Astrophysics Data System (ADS)

    Jia, Wen-bin; Xiao, Fu-hai

    2013-03-01

    The application of the AES algorithm in a digital cinema system protects video data from illegal theft and malicious tampering, solving its security problems. To meet the requirements for real-time, transparent encryption of high-speed audio and video data streams in the information security field, this paper analyzes the principles of the AES algorithm in depth and proposes specific realization methods and optimization solutions for the AES algorithm in a digital video system, based on the TMS320DM6446 hardware platform and the DaVinci software framework. The test results show that digital movies encrypted with AES-128 cannot play normally, which ensures the security of the digital movies. A comparison of AES-128 performance before and after optimization verifies the correctness and validity of the improved algorithm.
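
    For illustration, a minimal AES-128 round trip using PyCryptodome; the cited work instead implements and optimizes the cipher on the TMS320DM6446 DSP for real-time video streams.

    ```python
    # AES-128 in CTR mode, a stream-friendly choice for video payloads.
    from Crypto.Cipher import AES
    from Crypto.Random import get_random_bytes

    key = get_random_bytes(16)                 # 128-bit key
    cipher = AES.new(key, AES.MODE_CTR)
    frame = b"example video payload bytes"
    ciphertext = cipher.encrypt(frame)

    # Decryption must reuse the same key and nonce.
    decipher = AES.new(key, AES.MODE_CTR, nonce=cipher.nonce)
    assert decipher.decrypt(ciphertext) == frame
    ```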

  8. wHospital: a web-based application with digital signature for drugs dispensing management.

    PubMed

    Rossi, Lorenzo; Margola, Lorenzo; Manzelli, Vacia; Bandera, Alessandra

    2006-01-01

    wHospital is the result of an information technology research project based on the utilization of a web-based application for managing hospital drug dispensing. Part of the wHospital backbone, and its key distinguishing characteristic, is the adoption of the digital signature system initially deployed by the Government of Lombardia, a Northern Italy Region, through the distribution of smart cards to all healthcare and hospital staff. The developed system is a web-based application with a proposed Health Records Digital Signature (HReDS) handshake to comply with the national law and with the Joint Commission International Standards. The prototype application, for a single hospital Operative Unit (OU), has focused on data and process management related to drug therapy. Following a multi-faceted selection process, the Infective Disease OU of the Hospital in Busto Arsizio, Lombardia, was chosen for the development and prototype implementation. The project lead time, from user requirement analysis to training and deployment, was approximately 8 months. This paper highlights the applied project methodology, the system architecture, and the preliminary results achieved.

  9. Digital holography of intracellular dynamics to probe tissue physiology.

    PubMed

    Merrill, Daniel; An, Ran; Turek, John; Nolte, David D

    2015-01-01

    Digital holography provides improved capabilities for imaging through dense tissue. Using a short-coherence source, the digital hologram recorded from backscattered light performs laser ranging that maintains fidelity of information acquired from depths much greater than possible by traditional imaging techniques. Biodynamic imaging (BDI) is a developing technology for live-tissue imaging of up to a millimeter in depth that uses the hologram intensity fluctuations as label-free image contrast and can study tissue behavior in native microenvironments. In this paper BDI is used to investigate the change in adhesion-dependent tissue response in 3D cultures. The results show that increasing density of cellular adhesions slows motion inside tissue and alters the response to cytoskeletal drugs. A clear signature of membrane fluctuations was observed in mid-frequencies (0.1-1 Hz) and was enhanced by the application of cytochalasin-D that degrades the actin cortex inside the cell membrane. This enhancement feature is only observed in tissues that have formed adhesions, because cell pellets initially do not show this signature, but develop this signature only after incubation enables adhesions to form.

  10. Joint Calibration of 3d Laser Scanner and Digital Camera Based on Dlt Algorithm

    NASA Astrophysics Data System (ADS)

    Gao, X.; Li, M.; Xing, L.; Liu, Y.

    2018-04-01

    We design a calibration target that can be scanned by a 3D laser scanner while being photographed by a digital camera, yielding a point cloud and photos of the same target. A method to jointly calibrate the 3D laser scanner and digital camera based on the Direct Linear Transformation (DLT) algorithm is proposed. This method adds a digital camera distortion model to the traditional DLT algorithm; after repeated iteration, it solves the interior and exterior orientation elements of the camera as well as the joint calibration of the 3D laser scanner and digital camera. Experiments prove that this method is reliable.
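
    A sketch of the core DLT estimation under the usual 11-parameter model, omitting the distortion terms and iteration that the cited method adds; the point arrays are assumed inputs.

    ```python
    import numpy as np

    def dlt_solve(obj_pts, img_pts):
        # Each 3-D/2-D point pair gives two linear equations in the 11 DLT
        # parameters; at least six well-distributed control points are needed.
        rows, rhs = [], []
        for (X, Y, Z), (u, v) in zip(obj_pts, img_pts):
            rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
            rhs.append(u)
            rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
            rhs.append(v)
        L, *_ = np.linalg.lstsq(np.asarray(rows, float),
                                np.asarray(rhs, float), rcond=None)
        return L   # parameters L1..L11
    ```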

  11. Recursive Algorithms for Real-Time Digital CR-RCn Pulse Shaping

    NASA Astrophysics Data System (ADS)

    Nakhostin, M.

    2011-10-01

    This paper reports on recursive algorithms for the real-time implementation of CR-(RC)n filters in digital nuclear spectroscopy systems. The algorithms are derived by calculating the Z-transfer function of the filters for filter orders up to n=4. The performance of the filters is compared with that of the conventional digital trapezoidal filter using a noise generator which separately generates pure series, 1/f, and parallel noise. The results of our study enable one to select the optimum digital filter for different noise and rate conditions.
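
    As an illustration of the same shaping-filter family, the sketch below maps the analog prototype to a recursive digital filter with SciPy's bilinear transform; the paper derives its recursions directly from the Z-transfer function, so its coefficients will differ.

    ```python
    import numpy as np
    from scipy import signal

    def cr_rc_filter(x, tau, fs, n=4):
        # Analog prototype: one CR differentiator and n RC integrators,
        # H(s) = (s * tau) / (1 + s * tau)**(n + 1), discretized recursively.
        num = [tau, 0.0]
        den = np.poly([-1.0 / tau] * (n + 1)) * tau ** (n + 1)
        b, a = signal.bilinear(num, den, fs=fs)
        return signal.lfilter(b, a, x)   # IIR filtering of the sampled pulse
    ```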

  12. Radiogenomics analysis identifies correlations of digital mammography with clinical molecular signatures in breast cancer.

    PubMed

    Tamez-Peña, Jose-Gerardo; Rodriguez-Rojas, Juan-Andrés; Gomez-Rueda, Hugo; Celaya-Padilla, Jose-Maria; Rivera-Prieto, Roxana-Alicia; Palacios-Corona, Rebeca; Garza-Montemayor, Margarita; Cardona-Huerta, Servando; Treviño, Victor

    2018-01-01

    In breast cancer, well-known gene expression subtypes have been related to a specific clinical outcome. However, their impact on the breast tissue phenotype has been poorly studied. Here, we investigate the association of imaging data of tumors with gene expression signatures from 71 patients with breast cancer who underwent pre-treatment digital mammograms and tumor biopsies. From digital mammograms, a semi-automated radiogenomics analysis generated 1,078 features describing the shape, signal distribution, and texture of tumors, along with their contralateral image used as a control. From tumor biopsy, we estimated the OncotypeDX and PAM50 recurrence scores using gene expression microarrays. Then, we used multivariate analysis under stringent cross-validation to train models predicting recurrence scores. Few univariate features reached Spearman correlation coefficients above 0.4. Nevertheless, multivariate analysis yielded significantly correlated models for both signatures (correlation of OncotypeDX = 0.49 ± 0.07 and PAM50 = 0.32 ± 0.10 in stringent cross-validation, and OncotypeDX = 0.83 and PAM50 = 0.78 for a unique model). Equivalent models trained from the unaffected contralateral breast were not correlated, suggesting that the image signatures were tumor-specific and that overfitting was not a considerable issue. We also noted that models were improved by combining clinical information (triple negative status and progesterone receptor). The models used mostly wavelet and fractal features, suggesting their importance in capturing tumor information. Our results suggest that molecular-based recurrence risk and breast cancer subtypes have observable radiographic phenotypes. To our knowledge, this is the first study associating mammographic information with gene expression recurrence signatures.

  13. Radiogenomics analysis identifies correlations of digital mammography with clinical molecular signatures in breast cancer

    PubMed Central

    Tamez-Peña, Jose-Gerardo; Rodriguez-Rojas, Juan-Andrés; Gomez-Rueda, Hugo; Celaya-Padilla, Jose-Maria; Rivera-Prieto, Roxana-Alicia; Palacios-Corona, Rebeca; Garza-Montemayor, Margarita; Cardona-Huerta, Servando

    2018-01-01

    In breast cancer, well-known gene expression subtypes have been related to a specific clinical outcome. However, their impact on the breast tissue phenotype has been poorly studied. Here, we investigate the association of imaging data of tumors with gene expression signatures from 71 patients with breast cancer who underwent pre-treatment digital mammograms and tumor biopsies. From digital mammograms, a semi-automated radiogenomics analysis generated 1,078 features describing the shape, signal distribution, and texture of tumors, along with their contralateral image used as a control. From tumor biopsy, we estimated the OncotypeDX and PAM50 recurrence scores using gene expression microarrays. Then, we used multivariate analysis under stringent cross-validation to train models predicting recurrence scores. Few univariate features reached Spearman correlation coefficients above 0.4. Nevertheless, multivariate analysis yielded significantly correlated models for both signatures (correlation of OncotypeDX = 0.49 ± 0.07 and PAM50 = 0.32 ± 0.10 in stringent cross-validation, and OncotypeDX = 0.83 and PAM50 = 0.78 for a unique model). Equivalent models trained from the unaffected contralateral breast were not correlated, suggesting that the image signatures were tumor-specific and that overfitting was not a considerable issue. We also noted that models were improved by combining clinical information (triple negative status and progesterone receptor). The models used mostly wavelet and fractal features, suggesting their importance in capturing tumor information. Our results suggest that molecular-based recurrence risk and breast cancer subtypes have observable radiographic phenotypes. To our knowledge, this is the first study associating mammographic information with gene expression recurrence signatures. PMID:29596496

  14. Adaptive Two Dimensional RLS (Recursive Least Squares) Algorithms

    DTIC Science & Technology

    1989-03-01

    Excerpts: Adaptive algorithms have been used successfully for many years in a wide range of digital signal... The 2-D FRLS algorithm was tested both on computer-generated data and on digitized images. For a baseline reference the 2-D LMS...

  15. A covert authentication and security solution for GMOs.

    PubMed

    Mueller, Siguna; Jafari, Farhad; Roth, Don

    2016-09-21

    Proliferation and expansion of security risks necessitate new measures to ensure authenticity and validation of GMOs. Watermarking and other cryptographic methods are available which conceal and recover the original signature, but in the process reveal the authentication information. In many scenarios watermarking and standard cryptographic methods are necessary but not sufficient, and new, more advanced cryptographic protocols are needed. Herein, we present a new crypto protocol that is applicable in broader settings and embeds the authentication string indistinguishably from a random element in the signature space, such that the string is verified or denied without disclosing the actual signature. Results show that in a nucleotide string of 1000, the algorithm gives a correlation of 0.98 or higher between the distribution of the codons and that of E. coli, making the signature virtually invisible. This algorithm may be used to securely authenticate and validate GMOs without disclosing the actual signature. While this protocol uses watermarking, its novelty is in the use of more complex cryptographic techniques based on zero knowledge proofs to encode information.

  16. Radar Detection of Marine Mammals

    DTIC Science & Technology

    2011-09-30

    Excerpts: ...BFT-BPT algorithm for use with our radar data. This track-before-detect algorithm had been effective in enhancing small but persistent signatures in... will be possible with the detect-before-track algorithm... We next evaluated the track-before-detect algorithm, the BFT-BPT, on the CEDAR data...

  17. Laser vibrometry exploitation for vehicle identification

    NASA Astrophysics Data System (ADS)

    Nolan, Adam; Lingg, Andrew; Goley, Steve; Sigmund, Kevin; Kangas, Scott

    2014-06-01

    Vibration signatures sensed from distant vehicles using laser vibrometry systems provide valuable information that may be used to help identify key vehicle features such as engine type, engine speed, and number of cylinders. Through the use of physics models of the vibration phenomenology, features are chosen to support classification algorithms. Various individual exploitation algorithms were developed using these models to classify vibration signatures into engine type (piston vs. turbine), engine configuration (Inline 4 vs. Inline 6 vs. V6 vs. V8 vs. V12) and vehicle type. The results of these algorithms will be presented for an 8-class problem. Finally, the benefits of using a factor graph representation to link these independent algorithms together will be presented; the factor graph constructs a classification hierarchy for the vibration exploitation problem.

  18. Geometric analysis and restitution of digital multispectral scanner data arrays

    NASA Technical Reports Server (NTRS)

    Baker, J. R.; Mikhail, E. M.

    1975-01-01

    An investigation was conducted to define causes of geometric defects within digital multispectral scanner (MSS) data arrays, to analyze the resulting geometric errors, and to investigate restitution methods to correct or reduce these errors. Geometric transformation relationships for scanned data, from which collinearity equations may be derived, served as the basis of parametric methods of analysis and restitution of MSS digital data arrays. The linearization of these collinearity equations is presented. Algorithms considered for use in analysis and restitution included the MSS collinearity equations, piecewise polynomials based on linearized collinearity equations, and nonparametric algorithms. A proposed system for geometric analysis and restitution of MSS digital data arrays was used to evaluate these algorithms, utilizing actual MSS data arrays. It was shown that collinearity equations and nonparametric algorithms both yield acceptable results, but nonparametric algorithms possess definite advantages in computational efficiency. Piecewise polynomials were found to yield inferior results.

  19. A Weak Quantum Blind Signature with Entanglement Permutation

    NASA Astrophysics Data System (ADS)

    Lou, Xiaoping; Chen, Zhigang; Guo, Ying

    2015-09-01

    Motivated by the permutation encryption algorithm, a weak quantum blind signature (QBS) scheme is proposed. It involves three participants, the sender Alice, the signatory Bob and the trusted entity Charlie, in four phases: an initializing phase, a blinding phase, a signing phase and a verifying phase. In a small-scale quantum computation network, Alice blinds the message based on a quantum entanglement permutation encryption algorithm that embraces a chaotic position string. Bob signs the blinded message with private parameters shared beforehand, while Charlie verifies the signature's validity and recovers the original message. Analysis shows that the proposed scheme achieves secure blindness for the signer and traceability for the message owner with the aid of the authentic arbitrator, who plays a crucial role when a dispute arises. In addition, the signature can neither be forged nor disavowed by malicious attackers. The scheme has wide application in e-voting and e-payment systems.

  20. Classification of a large microarray data set: Algorithm comparison and analysis of drug signatures

    PubMed Central

    Natsoulis, Georges; El Ghaoui, Laurent; Lanckriet, Gert R.G.; Tolley, Alexander M.; Leroy, Fabrice; Dunlea, Shane; Eynon, Barrett P.; Pearson, Cecelia I.; Tugendreich, Stuart; Jarnagin, Kurt

    2005-01-01

    A large gene expression database has been produced that characterizes the gene expression and physiological effects of hundreds of approved and withdrawn drugs, toxicants, and biochemical standards in various organs of live rats. In order to derive useful biological knowledge from this large database, a variety of supervised classification algorithms were compared using a 597-microarray subset of the data. Our studies show that several types of linear classifiers based on Support Vector Machines (SVMs) and Logistic Regression can be used to derive readily interpretable drug signatures with high classification performance. Both methods can be tuned to produce classifiers of drug treatments in the form of short, weighted gene lists which upon analysis reveal that some of the signature genes have a positive contribution (act as “rewards” for the class-of-interest) while others have a negative contribution (act as “penalties”) to the classification decision. The combination of reward and penalty genes enhances performance by keeping the number of false positive treatments low. The results of these algorithms are combined with feature selection techniques that further reduce the length of the drug signatures, an important step towards the development of useful diagnostic biomarkers and low-cost assays. Multiple signatures with no genes in common can be generated for the same classification end-point. Comparison of these gene lists identifies biological processes characteristic of a given class. PMID:15867433
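
    A minimal sketch of the reward/penalty interpretation using an L1-regularized logistic regression (one of several linear classifiers the study compares); the expression matrix, labels, and gene names are assumed inputs.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def drug_signature(X, y, gene_names, top=10):
        # L1 regularization keeps the signature short; positive weights act
        # as "rewards" and negative weights as "penalties" for the class.
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
        clf.fit(X, y)
        w = clf.coef_.ravel()
        order = np.argsort(-np.abs(w))[:top]
        return [(gene_names[i], float(w[i])) for i in order if w[i] != 0.0]
    ```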

  1. Practical target location and accuracy indicator in digital close range photogrammetry using consumer grade cameras

    NASA Astrophysics Data System (ADS)

    Moriya, Gentaro; Chikatsu, Hirofumi

    2011-07-01

    Recently, the pixel counts and functions of consumer grade digital cameras have been increasing remarkably thanks to modern semiconductor and digital technology, and there are many low-priced consumer grade digital cameras with more than 10 megapixels on the market in Japan. In these circumstances, digital photogrammetry using consumer grade cameras is keenly anticipated in various application fields. There is a large body of literature on the calibration of consumer grade digital cameras and on circular target location. Target location with subpixel accuracy has been investigated as a star tracker issue, and many target location algorithms have been developed. It is widely accepted that the least squares model with ellipse fitting is the most accurate algorithm. However, there are still problems for efficient digital close range photogrammetry. These problems are: reconfirmation of the subpixel target location algorithms for consumer grade digital cameras, the relationship between the number of edge points along the target boundary and accuracy, and an indicator for estimating the accuracy of normal digital close range photogrammetry using consumer grade cameras. With this motive, empirical testing of several subpixel target location algorithms and an indicator for estimating the accuracy are investigated in this paper using real data acquired indoors with 7 consumer grade digital cameras ranging from 7.2 megapixels to 14.7 megapixels.
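
    The simplest subpixel locator, an intensity-weighted centroid, is sketched below for contrast; the paper finds least squares ellipse fitting of the target boundary to be the most accurate alternative.

    ```python
    import numpy as np

    def subpixel_centroid(patch):
        # Intensity-weighted centroid of a small window around the target.
        ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
        total = float(patch.sum())
        return (float((ys * patch).sum()) / total,
                float((xs * patch).sum()) / total)
    ```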

  2. 10 CFR 2.304 - Formal requirements for documents; signatures; acceptance for filing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Management for NRC Adjudicatory Hearings § 2.304 Formal requirements for documents; signatures; acceptance... section, it may be struck. (1) An electronic document must be signed using a participant's or a... paragraph (d) of this section. (i) When signing an electronic document using a digital ID certificate, the...

  3. 10 CFR 2.304 - Formal requirements for documents; signatures; acceptance for filing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Management for NRC Adjudicatory Hearings § 2.304 Formal requirements for documents; signatures; acceptance... section, it may be struck. (1) An electronic document must be signed using a participant's or a... paragraph (d) of this section. (i) When signing an electronic document using a digital ID certificate, the...

  4. Gamma watermarking

    DOEpatents

    Ishikawa, Muriel Y.; Wood, Lowell L.; Lougheed, Ronald W.; Moody, Kenton J.; Wang, Tzu-Fang

    2004-05-25

    A covert gamma-ray "signature" is used as a "watermark" for property identification. This new watermarking technology is based on a unique steganographic or "hidden writing" digital signature, implemented in tiny quantities of gamma-ray-emitting radioisotopic material combinations, generally covertly emplaced on or within an object. This digital signature may be readily recovered at distant future times by placing a sensitive, high energy-resolution gamma-ray detecting instrument reasonably precisely over the location of the watermark, which location may be known only to the object's owner; however, the signature is concealed from all ordinary detection means because its exceedingly low level of activity is obscured by the natural radiation background (including the gamma radiation naturally emanating from the object itself, from cosmic radiation and material surroundings, from human bodies, etc.). The "watermark" is used in object tagging for establishing object identity, history or ownership. It thus may serve as an aid to law enforcement officials in identifying stolen property and prosecuting theft thereof. Highly effective, potentially very low cost identification-on-demand of items of almost all types is thus made possible.

  5. Experimental measurement-device-independent quantum digital signatures.

    PubMed

    Roberts, G L; Lucamarini, M; Yuan, Z L; Dynes, J F; Comandar, L C; Sharpe, A W; Shields, A J; Curty, M; Puthoor, I V; Andersson, E

    2017-10-23

    The development of quantum networks will be paramount towards practical and secure telecommunications. These networks will need to sign and distribute information between many parties with information-theoretic security, requiring both quantum digital signatures (QDS) and quantum key distribution (QKD). Here, we introduce and experimentally realise a quantum network architecture, where the nodes are fully connected using a minimum amount of physical links. The central node of the network can act either as a totally untrusted relay, connecting the end users via the recently introduced measurement-device-independent (MDI)-QKD, or as a trusted recipient directly communicating with the end users via QKD. Using this network, we perform a proof-of-principle demonstration of QDS mediated by MDI-QKD. For that, we devised an efficient protocol to distil multiple signatures from the same block of data, thus reducing the statistical fluctuations in the sample and greatly enhancing the final QDS rate in the finite-size scenario.

  6. Beyond the scope of Free-Wilson analysis: building interpretable QSAR models with machine learning algorithms.

    PubMed

    Chen, Hongming; Carlsson, Lars; Eriksson, Mats; Varkonyi, Peter; Norinder, Ulf; Nilsson, Ingemar

    2013-06-24

    A novel methodology was developed to build Free-Wilson like local QSAR models by combining R-group signatures and the SVM algorithm. Unlike Free-Wilson analysis this method is able to make predictions for compounds with R-groups not present in a training set. Eleven public data sets were chosen as test cases for comparing the performance of our new method with several other traditional modeling strategies, including Free-Wilson analysis. Our results show that the R-group signature SVM models achieve better prediction accuracy compared with Free-Wilson analysis in general. Moreover, the predictions of R-group signature models are also comparable to the models using ECFP6 fingerprints and signatures for the whole compound. Most importantly, R-group contributions to the SVM model can be obtained by calculating the gradient for R-group signatures. For most of the studied data sets, a significant correlation with that of a corresponding Free-Wilson analysis is shown. These results suggest that the R-group contribution can be used to interpret bioactivity data and highlight that the R-group signature based SVM modeling method is as interpretable as Free-Wilson analysis. Hence the signature SVM model can be a useful modeling tool for any drug discovery project.

  7. A Quantum Proxy Blind Signature Scheme Based on Genuine Five-Qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Zeng, Chuan; Zhang, Jian-Zhong; Xie, Shu-Cui

    2017-06-01

    In this paper, a quantum proxy blind signature scheme based on controlled quantum teleportation is proposed. This scheme uses a genuine five-qubit entangled state as the quantum channel and adopts the classical Vernam algorithm to blind the message. We use the physical characteristics of quantum mechanics to implement delegation, signature and verification. Security analysis shows that our scheme is valid and satisfies the properties of a proxy blind signature, such as blindness, verifiability, unforgeability and undeniability.
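
    Only the classical Vernam blinding step has a straightforward classical sketch; the XOR below is self-inverse, so the same operation blinds and unblinds, while the quantum teleportation and verification phases have no analogue here.

    ```python
    import secrets

    def vernam_blind(message: bytes, pad: bytes) -> bytes:
        # XOR with a one-time pad; applying the same pad again restores input.
        assert len(pad) == len(message)
        return bytes(m ^ p for m, p in zip(message, pad))

    msg = b"contract terms"
    pad = secrets.token_bytes(len(msg))      # shared secret, used once
    blinded = vernam_blind(msg, pad)
    assert vernam_blind(blinded, pad) == msg
    ```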

  8. Free-Space Quantum Signatures Using Heterodyne Measurements

    NASA Astrophysics Data System (ADS)

    Croal, Callum; Peuntinger, Christian; Heim, Bettina; Khan, Imran; Marquardt, Christoph; Leuchs, Gerd; Wallden, Petros; Andersson, Erika; Korolkova, Natalia

    2016-09-01

    Digital signatures guarantee the authorship of electronic communications. Currently used "classical" signature schemes rely on unproven computational assumptions for security, while quantum signatures rely only on the laws of quantum mechanics to sign a classical message. Previous quantum signature schemes have used unambiguous quantum measurements. Such measurements, however, sometimes give no result, reducing the efficiency of the protocol. Here, we instead use heterodyne detection, which always gives a result, although there is always some uncertainty. We experimentally demonstrate feasibility in a real environment by distributing signature states through a noisy 1.6 km free-space channel. Our results show that continuous-variable heterodyne detection improves the signature rate for this type of scheme and therefore represents an interesting direction in the search for practical quantum signature schemes. For transmission values ranging from 100% to 10%, but otherwise assuming an ideal implementation with no other imperfections, the signature length is shorter by a factor of 2 to 10. As compared with previous relevant experimental realizations, the signature length in this implementation is several orders of magnitude shorter.

  9. Exploring the Use of Radar for a Physically Based Lightning Cessation Nowcasting Tool

    NASA Technical Reports Server (NTRS)

    Schultz, Elise V.; Petersen, Walter A.; Carey, Lawrence D.

    2011-01-01

    NASA's Marshall Space Flight Center (MSFC) and the University of Alabama in Huntsville (UAHuntsville) are collaborating with the 45th Weather Squadron (45WS) at Cape Canaveral Air Force Station (CCAFS) to enable improved nowcasting of lightning cessation. This project centers on the use of dual-polarimetric radar capabilities, and in particular, the new C-band dual-polarimetric weather radar acquired by the 45WS. Special emphasis is placed on the development of a physically based operational algorithm to predict lightning cessation. While previous studies have developed statistically based lightning cessation algorithms, we believe that dual-polarimetric radar variables offer the possibility to improve existing algorithms through the inclusion of physically meaningful trends reflecting interactions between in-cloud electric fields and hydrometeors. Specifically, decades of polarimetric radar research using propagation differential phase have demonstrated the presence of distinct phase and ice crystal alignment signatures in the presence of the strong electric fields associated with lightning. One question yet to be addressed is: to what extent can these ice-crystal alignment signatures be used to nowcast the cessation of lightning activity in a given storm? Accordingly, data from the UAHuntsville Advanced Radar for Meteorological and Operational Research (ARMOR) along with the NASA-MSFC North Alabama Lightning Mapping Array are used in this study to investigate the radar signatures present before and after lightning cessation. Thus far, our case study results suggest that the negative differential phase shift signature weakens and disappears after the analyzed storms ceased lightning production (i.e., after the last lightning flash occurred). This is a key observation because it suggests that while strong electric fields may still have been present, the lightning cessation signature encompassed the period of the polarimetric negative phase shift signature. To the extent this behavior is repeatable in other cases, even if only in a substantial fraction of those cases, the case analyses suggest that differential propagation phase may prove to be a useful parameter for future lightning cessation algorithms. Indeed, analysis of 15+ cases has shown additional indications of the weakening and disappearance of this ice alignment signature with lightning cessation. A summary of results will be presented.

  10. A Double-function Digital Watermarking Algorithm Based on Chaotic System and LWT

    NASA Astrophysics Data System (ADS)

    Yuxia, Zhao; Jingbo, Fan

    A double-function digital watermarking technology is studied, and a double-function digital watermarking algorithm for color images is presented based on a chaotic system and the lifting wavelet transform (LWT). The algorithm realizes the dual aims of copyright protection and integrity authentication of image content. Making use of features of the human visual system (HVS), the watermark image is embedded into the color image's low-frequency component and middle-frequency components by different means. The algorithm achieves strong security by using two kinds of chaotic maps together with Arnold scrambling of the watermark image, and good efficiency by using the LWT. Emulation experiments indicate that the algorithm is efficient and secure and that the watermark is well concealed.
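
    A minimal sketch of embedding watermark bits in low-frequency wavelet coefficients, using a one-level Haar DWT from PyWavelets as a stand-in for the lifting wavelet transform, and omitting the chaotic scrambling the paper applies for security.

    ```python
    import numpy as np
    import pywt

    def embed_watermark(image, bits, alpha=2.0):
        # One-level 2-D DWT; cA holds the low-frequency approximation band.
        cA, detail = pywt.dwt2(image.astype(float), "haar")
        signs = 2.0 * np.asarray(bits, dtype=float) - 1.0   # 0/1 -> -1/+1
        cA.flat[: len(bits)] += alpha * signs               # additive embed
        return pywt.idwt2((cA, detail), "haar")             # watermarked image
    ```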

  11. CoGAPS matrix factorization algorithm identifies transcriptional changes in AP-2alpha target genes in feedback from therapeutic inhibition of the EGFR network

    PubMed Central

    Thakar, Manjusha; Howard, Jason D.; Kagohara, Luciane T.; Krigsfeld, Gabriel; Ranaweera, Ruchira S.; Hughes, Robert M.; Perez, Jimena; Jones, Siân; Favorov, Alexander V.; Carey, Jacob; Stein-O'Brien, Genevieve; Gaykalova, Daria A.; Ochs, Michael F.; Chung, Christine H.

    2016-01-01

    Patients with oncogene driven tumors are treated with targeted therapeutics including EGFR inhibitors. Genomic data from The Cancer Genome Atlas (TCGA) demonstrates molecular alterations to EGFR, MAPK, and PI3K pathways in previously untreated tumors. Therefore, this study uses bioinformatics algorithms to delineate interactions resulting from EGFR inhibitor use in cancer cells with these genetic alterations. We modify the HaCaT keratinocyte cell line model to simulate cancer cells with constitutive activation of EGFR, HRAS, and PI3K in a controlled genetic background. We then measure gene expression after treating modified HaCaT cells with gefitinib, afatinib, and cetuximab. The CoGAPS algorithm distinguishes a gene expression signature associated with the anticipated silencing of the EGFR network. It also infers a feedback signature with EGFR gene expression itself increasing in cells that are responsive to EGFR inhibitors. This feedback signature has increased expression of several growth factor receptors regulated by the AP-2 family of transcription factors. The gene expression signatures for AP-2alpha are further correlated with sensitivity to cetuximab treatment in HNSCC cell lines and changes in EGFR expression in HNSCC tumors with low CDKN2A gene expression. In addition, the AP-2alpha gene expression signatures are also associated with inhibition of MEK, PI3K, and mTOR pathways in the Library of Integrated Network-Based Cellular Signatures (LINCS) data. These results suggest that AP-2 transcription factors are activated as feedback from EGFR network inhibition and may mediate EGFR inhibitor resistance. PMID:27650546

  12. Synthesis of optical polarization signatures of military aircraft

    NASA Astrophysics Data System (ADS)

    Egan, Walter G.; Duggin, Michael J.

    2002-01-01

    Focal plane wide band IR imagery will be compared with visual wide band focal plane digital imagery of a camouflaged B-52 bomber. Extreme enhancement is possible using digital polarized imagery. The experimental observations will be compared to theoretical calculations and modeling results for both specular and shadowed areas to allow extrapolation to the synthesis of the optical polarization signatures of other aircraft. The relationship of both the specular and the shadowed areas to surface structure, orientation, specularity, roughness, shadowing, and the complex index of refraction will be illustrated. The imagery was obtained in two plane-polarized directions. Many aircraft locations were measured as well as the sky background.

  13. Digital classification of Landsat data for vegetation and land-cover mapping in the Blackfoot River watershed, southeastern Idaho

    USGS Publications Warehouse

    Pettinger, L.R.

    1982-01-01

    This paper documents the procedures, results, and final products of a digital analysis of Landsat data used to produce a vegetation and landcover map of the Blackfoot River watershed in southeastern Idaho. Resource classes were identified at two levels of detail: generalized Level I classes (for example, forest land and wetland) and detailed Levels II and III classes (for example, conifer forest, aspen, wet meadow, and riparian hardwoods). Training set statistics were derived using a modified clustering approach. Environmental stratification that separated uplands from lowlands improved discrimination between resource classes having similar spectral signatures. Digital classification was performed using a maximum likelihood algorithm. Classification accuracy was determined on a single-pixel basis from a random sample of 25-pixel blocks. These blocks were transferred to small-scale color-infrared aerial photographs, and the image area corresponding to each pixel was interpreted. Classification accuracy, expressed as percent agreement of digital classification and photo-interpretation results, was 83.0 ± 2.1 percent (0.95 probability level) for generalized (Level I) classes and 52.2 ± 2.8 percent (0.95 probability level) for detailed (Levels II and III) classes. After the classified images were geometrically corrected, two types of maps were produced of Level I and Levels II and III resource classes: color-coded maps at a 1:250,000 scale, and flatbed-plotter overlays at a 1:24,000 scale. The overlays are more useful because of their larger scale, familiar format to users, and compatibility with other types of topographic and thematic maps of the same scale.
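
    A minimal sketch of the maximum likelihood classification rule described above, assuming per-class mean vectors and covariance matrices already estimated from the training data (here derived by clustering).

    ```python
    import numpy as np

    def ml_classify(pixels, means, covs):
        # Assign each pixel spectrum to the class with the highest Gaussian
        # log-likelihood under the training-set statistics.
        scores = []
        for mu, cov in zip(means, covs):
            d = pixels - mu                    # (n_pixels, n_bands) residuals
            inv = np.linalg.inv(cov)
            _, logdet = np.linalg.slogdet(cov)
            maha = np.einsum("ij,jk,ik->i", d, inv, d)   # Mahalanobis terms
            scores.append(-0.5 * (maha + logdet))
        return np.argmax(np.stack(scores), axis=0)       # class per pixel
    ```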

  14. Develop security architecture for both in-house healthcare information systems and electronic patient record

    NASA Astrophysics Data System (ADS)

    Zhang, Jianguo; Chen, Xiaomeng; Zhuang, Jun; Jiang, Jianrong; Zhang, Xiaoyan; Wu, Dongqing; Huang, H. K.

    2003-05-01

    In this paper, we present a new security approach to provide security measures and features for both healthcare information systems (PACS, RIS/HIS) and the electronic patient record (EPR). We introduce two security components, a certificate authority (CA) system and a digital signature management system for patient records (DSPR), together with electronic envelope technology, into the current hospital healthcare information infrastructure to provide security measures and functions such as confidentiality or privacy, authenticity, integrity, reliability, non-repudiation, and authentication for the daily operation of in-house healthcare information systems and for EPR exchange among hospitals or healthcare administration levels. The DSPR component manages all the digital signatures of patient medical records signed using asymmetric-key encryption technologies. The electronic envelopes used for EPR exchange are created based on the information of signers, digital signatures, and identifications of patient records stored in the CA and DSPR systems, as well as the destinations and the remote users. The CA and DSPR systems were developed and integrated into a RIS-integrated PACS, and the integration of these new security components is seamless and painless. The electronic envelopes designed for EPR were used successfully in multimedia data transmission.
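
    For illustration, a minimal sign/verify round trip with the python-cryptography library showing the kind of digital-signature service the DSPR component provides for patient records; key distribution through the CA system is outside this sketch.

    ```python
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    record = b"patient report bytes"

    # Sign the record with RSA-PSS over SHA-256.
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    signature = private_key.sign(record, pss, hashes.SHA256())

    # verify() raises InvalidSignature if record or signature was altered.
    private_key.public_key().verify(signature, record, pss, hashes.SHA256())
    ```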

  15. Dynamics, Heat Transport, Spectral Composition and Acoustic Signatures of Mesoscale Variability in the Ocean

    DTIC Science & Technology

    2013-12-01

    Excerpts: Eastward background flow; EOS, equation of state; GDEM, Generalized Digital Environmental Model; GRB, Growth Rate Balance model; HPCMP, High Performance... the Naval Research Lab (NRL) Generalized Digital Environmental Model (GDEM). This provides a realistic and detailed profile for a known turbulent...

  16. Ballistic Signature Identification System Study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The first phase of a research project directed toward development of a high speed automatic process to be used to match gun barrel signatures imparted to fired bullets was documented. An optical projection technique has been devised to produce and photograph a planar image of the entire signature, and the phototransparency produced is subjected to analysis using digital Fourier transform techniques. The success of this approach appears to be limited primarily by the accuracy of the photographic step since no significant processing limitations have been encountered.

  17. Analytical redundancy management mechanization and flight data analysis for the F-8 digital fly-by-wire aircraft flight control sensors

    NASA Technical Reports Server (NTRS)

    Deckert, J. C.

    1983-01-01

    The details are presented of an onboard digital computer algorithm designed to reliably detect and isolate the first failure in a duplex set of flight control sensors aboard the NASA F-8 digital fly-by-wire aircraft. The algorithm's successful flight test program is summarized, and specific examples are presented of algorithm behavior in response to software-induced signal faults, both with and without aircraft parameter modeling errors.

  18. De-Dopplerization of Acoustic Measurements

    DTIC Science & Technology

    2017-08-10

    Excerpts: Band energy obtained from fractional octave band digital filters generates a de-Dopplerized spectrum without complex resampling algorithms. An equation... Due to the fractional octave representation and the smearing that occurs within the spectrum, digital filtering techniques were not considered by these earlier...

  19. Providing integrity, authenticity, and confidentiality for header and pixel data of DICOM images.

    PubMed

    Al-Haj, Ali

    2015-04-01

    Exchange of medical images over public networks is subject to different types of security threats. This has triggered persistent demands for secure telemedicine implementations that provide confidentiality, authenticity, and integrity for the transmitted images. The medical image exchange standard (DICOM) offers mechanisms to provide confidentiality for the header data of the image but not for the pixel data. On the other hand, it offers mechanisms to achieve authenticity and integrity for the pixel data but not for the header data. In this paper, we propose a crypto-based algorithm that provides confidentiality, authenticity, and integrity for the pixel data as well as for the header data. This is achieved by applying strong cryptographic primitives utilizing internally generated security data, such as encryption keys, hashing codes, and digital signatures. The security data are generated internally from the header and the pixel data, thus a strong bond is established between the DICOM data and the corresponding security data. The proposed algorithm has been evaluated extensively using DICOM images of different modalities. Simulation experiments show that confidentiality, authenticity, and integrity have been achieved, as reflected by the results we obtained for normalized correlation, entropy, PSNR, histogram analysis, and robustness.

  20. A Network Topology Control and Identity Authentication Protocol with Support for Movable Sensor Nodes

    PubMed Central

    Zhang, Ying; Chen, Wei; Liang, Jixing; Zheng, Bingxin; Jiang, Shengming

    2015-01-01

    It is expected that in the near future wireless sensor network (WSNs) will be more widely used in the mobile environment, in applications such as Autonomous Underwater Vehicles (AUVs) for marine monitoring and mobile robots for environmental investigation. The sensor nodes’ mobility can easily cause changes to the structure of a network topology, and lead to the decline in the amount of transmitted data, excessive energy consumption, and lack of security. To solve these problems, a kind of efficient Topology Control algorithm for node Mobility (TCM) is proposed. In the topology construction stage, an efficient clustering algorithm is adopted, which supports sensor node movement. It can ensure the balance of clustering, and reduce the energy consumption. In the topology maintenance stage, the digital signature authentication based on Error Correction Code (ECC) and the communication mechanism of soft handover are adopted. After verifying the legal identity of the mobile nodes, secure communications can be established, and this can increase the amount of data transmitted. Compared to some existing schemes, the proposed scheme has significant advantages regarding network topology stability, amounts of data transferred, lifetime and safety performance of the network. PMID:26633405

  2. Advanced information processing system: Authentication protocols for network communication

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Adams, Stuart J.; Babikyan, Carol A.; Butler, Bryan P.; Clark, Anne L.; Lala, Jaynarayan H.

    1994-01-01

    In safety-critical I/O and intercomputer communication networks, reliable message transmission is an important concern. Difficulties of communication and fault identification in networks arise primarily because the sender of a transmission cannot be identified with certainty, an intermediate node can corrupt a message without certainty of detection, and a babbling node cannot be identified and silenced without lengthy diagnosis and reconfiguration. Authentication protocols use digital signature techniques to verify the authenticity of messages with high probability. Such protocols appear to provide an efficient solution to many of these problems. The objective of this program is to develop, demonstrate, and evaluate intercomputer communication architectures which employ authentication. As a context for the evaluation, the authentication protocol-based communication concept was demonstrated under this program by hosting a real-time flight-critical guidance, navigation and control algorithm on a distributed, heterogeneous, mixed-redundancy system of workstations and embedded fault-tolerant computers.
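
    The core mechanism can be sketched in a few lines: the sender appends a signature, and any receiver can both identify the sender and detect corruption by an intermediate node. This is a minimal sketch using Ed25519 for brevity, not the signature scheme used in the program.

        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
        from cryptography.exceptions import InvalidSignature

        sender_key = Ed25519PrivateKey.generate()
        message = b"guidance update: channel 3"
        signed = message + sender_key.sign(message)     # append 64-byte signature

        # Receiver verifies with the sender's public key; a corrupted or
        # mis-attributed message raises InvalidSignature and is rejected.
        body, sig = signed[:-64], signed[-64:]
        try:
            sender_key.public_key().verify(sig, body)
            print("accepted:", body)
        except InvalidSignature:
            print("rejected: corrupted or not from the claimed sender")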

  3. Virtual-optical information security system based on public key infrastructure

    NASA Astrophysics Data System (ADS)

    Peng, Xiang; Zhang, Peng; Cai, Lilong; Niu, Hanben

    2005-01-01

    A virtual-optics based encryption model with the aid of public key infrastructure (PKI) is presented in this paper. The proposed model employs a hybrid architecture in which our previously published encryption method based on a virtual-optics scheme (VOS) is used to encipher and decipher data, while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). The whole information security model runs under the framework of the international standard ITU-T X.509 PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOS security approach has additional features such as confidentiality, authentication, and integrity for data encryption in a networked environment. Numerical experiments prove the effectiveness of the method. The security of the proposed model is briefly analyzed by examining some possible attacks from the viewpoint of cryptanalysis.
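
    The hybrid architecture can be sketched generically: a symmetric cipher carries the bulk data while RSA protects only the short session key. In this sketch AES-GCM merely stands in for the paper's virtual-optics cipher, which is not reproduced here.

        import os
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        receiver = rsa.generate_private_key(public_exponent=65537, key_size=2048)

        # Symmetric stage (stand-in for VOS): session key enciphers the payload.
        session_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        ciphertext = AESGCM(session_key).encrypt(nonce, b"image payload", None)

        # Asymmetric stage: RSA enciphers only the short session key.
        oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)
        wrapped = receiver.public_key().encrypt(session_key, oaep)

        # Receiver unwraps the session key, then deciphers the bulk data.
        recovered = AESGCM(receiver.decrypt(wrapped, oaep)).decrypt(nonce, ciphertext, None)
        assert recovered == b"image payload"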

  4. First stage identification of syntactic elements in an extra-terrestrial signal

    NASA Astrophysics Data System (ADS)

    Elliott, John

    2011-02-01

    By investigating the generic attributes of a representative set of terrestrial languages at varying levels of abstraction, it is our endeavour to try and isolate elements of the signal universe, which are computationally tractable for its detection and structural decipherment. Ultimately, our aim is to contribute in some way to the understanding of what 'languageness' actually is. This paper describes algorithms and software developed to characterise and detect generic intelligent language-like features in an input signal, using natural language learning techniques: looking for characteristic statistical "language-signatures" in test corpora. As a first step towards such species-independent language-detection, we present a suite of programs to analyse digital representations of a range of data, and use the results to extrapolate whether or not there are language-like structures which distinguish this data from other sources, such as music, images, and white noise.

  5. Surface imaging microscope

    NASA Astrophysics Data System (ADS)

    Rogala, Eric W.; Bankman, Isaac N.

    2008-04-01

    The three-dimensional shapes of microscopic objects are becoming increasingly important for battlespace CBRNE sensing. Potential applications of microscopic 3D shape observations include characterization of biological weapon particles and manufacturing of micromechanical components. Aerosol signatures of stand-off lidar systems, using elastic backscatter or polarization, are dictated by the aerosol particle shapes and sizes that must be well characterized in the lab. A low-cost, fast instrument for 3D surface shape microscopy will be a valuable point sensor for biological particle sensing applications. Both the cost and imaging durations of traditional techniques such as confocal microscopes, atomic force microscopes, and electron scanning microscopes are too high. We investigated the feasibility of a low-cost, fast interferometric technique for imaging the 3D surface shape of microscopic objects at frame rates limited only by the camera in the system. The system operates at two laser wavelengths producing two fringe images collected simultaneously by a digital camera, and a specialized algorithm we developed reconstructs the surface map of the microscopic object. The current implementation assembled to test the concept and develop the new 3D reconstruction algorithm has 0.25 micron resolution in the x and y directions, and about 0.1 micron accuracy in the z direction, as tested on a microscopic glass test object manufactured with etching techniques. We describe the interferometric instrument, present the reconstruction algorithm, and discuss further development.

  6. Algorithmic Skin: Health-Tracking Technologies, Personal Analytics and the Biopedagogies of Digitized Health and Physical Education

    ERIC Educational Resources Information Center

    Williamson, Ben

    2015-01-01

    The emergence of digitized health and physical education, or "eHPE", embeds software algorithms in the organization of health and physical education pedagogies. Particularly with the emergence of wearable and mobile activity trackers, biosensors and personal analytics apps, algorithmic processes have an increasingly powerful part to play…

  7. Mental Computation or Standard Algorithm? Children's Strategy Choices on Multi-Digit Subtractions

    ERIC Educational Resources Information Center

    Torbeyns, Joke; Verschaffel, Lieven

    2016-01-01

    This study analyzed children's use of mental computation strategies and the standard algorithm on multi-digit subtractions. Fifty-eight Flemish 4th graders of varying mathematical achievement level were individually offered subtractions that either stimulated the use of mental computation strategies or the standard algorithm in one choice and two…

  8. Reconstruction of a digital core containing clay minerals based on a clustering algorithm.

    PubMed

    He, Yanlong; Pu, Chunsheng; Jing, Cheng; Gu, Xiaoyu; Chen, Qingdong; Liu, Hongzhi; Khan, Nasir; Dong, Qiaoling

    2017-10-01

    It is difficult to obtain core samples and supporting information for digital core reconstruction of mature sandstone reservoirs around the world, especially for unconsolidated sandstone reservoirs. Meanwhile, reconstruction and division of clay minerals play a vital role in digital core reconstruction, since two-dimensional data-based reconstruction methods are particularly suitable for microstructure simulation of sandstone reservoirs. However, the reconstruction of the various clay minerals in digital cores remains challenging from a research viewpoint. In the present work, the clay-mineral content was considered on the basis of two-dimensional information about the reservoir. After application of the hybrid method, and in comparison with the model reconstructed by the process-based method, the output was a digital core containing clay clusters without labels for cluster number, size, or texture. The statistics and geometry of the reconstructed model were similar to those of the reference model. In addition, the Hoshen-Kopelman algorithm was used to label the connected, unclassified clay clusters in the initial model, recording the number and size of the clay clusters. The K-means clustering algorithm was then applied to divide the labeled, large connected clusters into smaller clusters on the basis of differences in cluster characteristics. According to clay-mineral characteristics such as type, texture, and distribution, the digital core containing clay minerals was reconstructed by means of the clustering algorithm and clay-cluster structure judgment. The distributions and textures of the clay minerals in the digital core were reasonable. The clustering algorithm improved the digital core reconstruction and provides an alternative method for simulating different clay minerals in digital cores.
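
    The two clustering steps named in the abstract can be sketched roughly as follows: connected-component labeling (scipy's ndimage.label implements the same union-find idea as Hoshen-Kopelman) followed by K-means splitting of oversized clusters by voxel coordinates. The binary clay mask and the size threshold are hypothetical; the paper's texture-based criteria are not modeled.

        import numpy as np
        from scipy import ndimage
        from scipy.cluster.vq import kmeans2

        clay = np.random.rand(64, 64, 64) < 0.1           # hypothetical binary clay mask
        labels, n = ndimage.label(clay)                   # connected-cluster labeling
        sizes = ndimage.sum(clay, labels, index=np.arange(1, n + 1))

        refined = labels.copy()
        next_id = n + 1
        for cid in np.where(sizes > 500)[0] + 1:          # split oversized clusters only
            coords = np.argwhere(labels == cid).astype(float)
            _, parts = kmeans2(coords, 3, minit="points") # K-means on voxel coordinates
            for p in range(1, 3):                         # re-label two of the three parts
                sel = coords[parts == p].astype(int)
                refined[sel[:, 0], sel[:, 1], sel[:, 2]] = next_id
                next_id += 1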

  9. Parallel digital forensics infrastructure.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liebrock, Lorie M.; Duggan, David Patrick

    2009-10-01

    This report documents the architecture and implementation of a parallel digital forensics (PDF) infrastructure. This infrastructure is necessary for supporting the design, implementation, and testing of new classes of parallel digital forensics tools. Digital forensics has become extremely difficult with data sets of one terabyte and larger. The only way to overcome the processing time of these large sets is to identify and develop new parallel algorithms for performing the analysis. To support algorithm research, a flexible base infrastructure is required. A candidate architecture for this base infrastructure was designed, instantiated, and tested by this project, in collaboration with New Mexico Tech. Previous infrastructures were not designed and built specifically for the development and testing of parallel algorithms. With the size of forensics data sets expected to keep increasing significantly, this type of infrastructure support is necessary for continued research in parallel digital forensics.

  10. Tmax Determined Using a Bayesian Estimation Deconvolution Algorithm Applied to Bolus Tracking Perfusion Imaging: A Digital Phantom Validation Study.

    PubMed

    Uwano, Ikuko; Sasaki, Makoto; Kudo, Kohsuke; Boutelier, Timothé; Kameda, Hiroyuki; Mori, Futoshi; Yamashita, Fumio

    2017-01-10

    The Bayesian estimation algorithm improves the precision of bolus tracking perfusion imaging. However, this algorithm cannot directly calculate Tmax, the time scale widely used to identify ischemic penumbra, because Tmax is a non-physiological, artificial index that reflects the tracer arrival delay (TD) and other parameters. We calculated Tmax from the TD and mean transit time (MTT) obtained by the Bayesian algorithm and determined its accuracy in comparison with Tmax obtained by singular value decomposition (SVD) algorithms. The TD and MTT maps were generated by the Bayesian algorithm applied to digital phantoms with time-concentration curves that reflected a range of values for various perfusion metrics using a global arterial input function. Tmax was calculated from the TD and MTT using constants obtained by a linear least-squares fit to Tmax obtained from the two SVD algorithms that showed the best benchmarks in a previous study. Correlations between the Tmax values obtained by the Bayesian and SVD methods were examined. The Bayesian algorithm yielded accurate TD and MTT values relative to the true values of the digital phantom. Tmax calculated from the TD and MTT values with the least-squares fit constants showed excellent correlation (Pearson's correlation coefficient = 0.99) and agreement (intraclass correlation coefficient = 0.99) with Tmax obtained from SVD algorithms. Quantitative analyses of Tmax values calculated from Bayesian-estimation algorithm-derived TD and MTT from a digital phantom correlated and agreed well with Tmax values determined using SVD algorithms.
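
    The Tmax calculation described here reduces to a linear least-squares fit of Tmax against TD and MTT. A minimal sketch with hypothetical phantom values (the true fit constants are determined against the SVD-derived Tmax, as in the study):

        import numpy as np

        rng = np.random.default_rng(0)
        td = rng.uniform(0.0, 5.0, 200)                  # tracer arrival delay (s)
        mtt = rng.uniform(2.0, 12.0, 200)                # mean transit time (s)
        tmax_svd = td + 0.5 * mtt + rng.normal(0, 0.1, 200)   # hypothetical SVD Tmax

        # Fit Tmax ~ a*TD + b*MTT + c by linear least squares.
        A = np.column_stack([td, mtt, np.ones_like(td)])
        (a, b, c), *_ = np.linalg.lstsq(A, tmax_svd, rcond=None)

        tmax_bayes = a * td + b * mtt + c                # Tmax from Bayesian TD and MTT
        r = np.corrcoef(tmax_bayes, tmax_svd)[0, 1]      # agreement check (Pearson's r)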

  11. Algorithms exploiting ultrasonic sensors for subject classification

    NASA Astrophysics Data System (ADS)

    Desai, Sachi; Quoraishee, Shafik

    2009-09-01

    Proposed here is a series of techniques exploiting micro-Doppler ultrasonic sensors capable of characterizing detected mammalian targets based on their physiological movements, captured as a series of robust features. A combination of unique and conventional digital signal processing techniques is employed, arranged in such a manner that they become capable of classifying a series of walkers. The feature-extraction process develops a robust feature space capable of discriminating movements generated by bipeds and quadrupeds, further subdivided into large or small. These movements can be exploited to provide specific information about a given signature, dividing it into a series of subset signatures, with wavelets used to generate start/stop times. After viewing a series of spectrograms of the signature we are able to see distinct differences, and utilizing kurtosis we generate an envelope detector capable of isolating each of the corresponding step cycles generated during a walk. The walk cycle is defined as one complete sequence of walking/running, from the foot pushing off the ground to its return to the ground. This timing information segments the events that are readily seen in the spectrogram but obstructed in the temporal domain into individual walk sequences. Each walking sequence is then translated into a three-dimensional waterfall plot defining the expected energy value associated with the motion at a particular instant of time and frequency. This value is repeatable for each particular class and can be employed to discriminate the events. Highly reliable classification is realized by a classifier trained on a candidate sample space derived from the gyrations created by the motion of actors of interest. The classifier developed herein provides a capability to classify events as adult humans, children, horses, or dogs at potentially high rates based on the tested sample space. The algorithm described will bring utility to an underused sensor modality for human intrusion detection, which currently suffers from a high rate of false alarms. The active ultrasonic sensor, coupled in a multi-modal sensor suite with binary, less descriptive sensors such as seismic devices, realizes a greater detection accuracy for persons of interest for homeland security purposes.

  12. Relational Reasoning about Numbers and Operations--Foundation for Calculation Strategy Use in Multi-Digit Multiplication and Division

    ERIC Educational Resources Information Center

    Schulz, Andreas

    2018-01-01

    Theoretical analysis of whole number-based calculation strategies and digit-based algorithms for multi-digit multiplication and division reveals that strategy use includes two kinds of reasoning: reasoning about the relations between numbers and reasoning about the relations between operations. In contrast, algorithms aim to reduce the necessary…

  13. Study of Dynamic Characteristics of Aeroelastic Systems Utilizing Randomdec Signatures

    NASA Technical Reports Server (NTRS)

    Chang, C. S.

    1975-01-01

    The feasibility of utilizing the random decrement method in conjunction with a signature analysis procedure to determine the dynamic characteristics of an aeroelastic system, for the purpose of on-line prediction of the potential onset of flutter, was examined. Digital computer programs were developed to simulate sampled response signals of a two-mode aeroelastic system. Simulated response data were used to test the random decrement method. A special curve-fit approach was developed for analyzing the resulting signatures. A number of numerical 'experiments' were conducted on the combined processes. The method is capable of determining frequency and damping values accurately from randomdec signatures of carefully selected lengths.
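
    The random decrement idea itself is compact: average fixed-length segments triggered each time the response crosses a chosen level, so the random excitation cancels and an estimate of the free-decay signature remains. A minimal sketch on hypothetical sampled data (trigger level and segment length are illustrative choices):

        import numpy as np

        def randomdec(x, level, seglen):
            # Indices of upward crossings of the trigger level.
            starts = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
            segs = [x[i:i + seglen] for i in starts if i + seglen <= len(x)]
            return np.mean(segs, axis=0)    # randomdec signature (free-decay estimate)

        # Hypothetical two-mode response sampled at 100 Hz, buried in noise.
        t = np.arange(0, 60, 0.01)
        x = (np.exp(-0.05 * t) * np.sin(2 * np.pi * 1.5 * t)
             + 0.5 * np.sin(2 * np.pi * 4.0 * t) + np.random.randn(t.size))
        sig = randomdec(x, level=x.std(), seglen=400)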

  14. Shape Analysis of Planar Multiply-Connected Objects Using Conformal Welding.

    PubMed

    Lok Ming Lui; Wei Zeng; Shing-Tung Yau; Xianfeng Gu

    2014-07-01

    Shape analysis is a central problem in the field of computer vision. In 2D shape analysis, classification and recognition of objects from their observed silhouettes are extremely crucial but difficult. It usually involves an efficient representation of 2D shape space with a metric, so that its mathematical structure can be used for further analysis. Although the study of 2D simply-connected shapes has been the subject of a large body of literature, the analysis of multiply-connected shapes is comparatively less studied. In this work, we propose a representation for general 2D multiply-connected domains with arbitrary topologies using conformal welding. A metric can be defined on the proposed representation space, which gives a metric to measure dissimilarities between objects. The main idea is to map the exterior and interior of the domain conformally to unit disks and circle domains (unit disks with several inner disks removed), using holomorphic 1-forms. A set of diffeomorphisms of the unit circle S^1 can be obtained, which together with the conformal modules are used to define the shape signature. A shape distance between shape signatures can be defined to measure dissimilarities between shapes. We prove theoretically that the proposed shape signature uniquely determines the multiply-connected objects under suitable normalization. We also introduce a reconstruction algorithm to obtain shapes from their signatures. This completes our framework and allows us to move back and forth between shapes and signatures. With that, a morphing algorithm between shapes can be developed through the interpolation of the Beltrami coefficients associated with the signatures. Experiments have been carried out on shapes extracted from real images. Results demonstrate the efficacy of our proposed algorithm as a stable shape representation scheme.

  15. Linearization of digital derived rate algorithm for use in linear stability analysis

    NASA Technical Reports Server (NTRS)

    Graham, R. E.; Porada, T. W.

    1985-01-01

    The digital derived rate (DDR) algorithm is used to calculate the rate of rotation of the Centaur upper-stage rocket. The DDR is a highly nonlinear algorithm, and classical linear stability analysis of the spacecraft cannot be performed without linearization. The performance of this rate algorithm is characterized by gain and phase curves that drop off at the same frequency, a characteristic desirable for many applications. A linearization technique for the DDR algorithm is investigated, and the linearization method is described. Examples of the results of the linearization technique are illustrated, and the effects of linearization are described. A linear digital filter may be used as a substitute when performing classical linear stability analyses, while the DDR itself may be used in time response analysis.

  16. Free-Space Quantum Signatures Using Heterodyne Measurements.

    PubMed

    Croal, Callum; Peuntinger, Christian; Heim, Bettina; Khan, Imran; Marquardt, Christoph; Leuchs, Gerd; Wallden, Petros; Andersson, Erika; Korolkova, Natalia

    2016-09-02

    Digital signatures guarantee the authorship of electronic communications. Currently used "classical" signature schemes rely on unproven computational assumptions for security, while quantum signatures rely only on the laws of quantum mechanics to sign a classical message. Previous quantum signature schemes have used unambiguous quantum measurements. Such measurements, however, sometimes give no result, reducing the efficiency of the protocol. Here, we instead use heterodyne detection, which always gives a result, although there is always some uncertainty. We experimentally demonstrate feasibility in a real environment by distributing signature states through a noisy 1.6 km free-space channel. Our results show that continuous-variable heterodyne detection improves the signature rate for this type of scheme and therefore represents an interesting direction in the search for practical quantum signature schemes. For transmission values ranging from 100% to 10%, but otherwise assuming an ideal implementation with no other imperfections, the signature length is shorter by a factor of 2 to 10. As compared with previous relevant experimental realizations, the signature length in this implementation is several orders of magnitude shorter.

  17. Cryptography Would Reveal Alterations In Photographs

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L.

    1995-01-01

    A public-key cryptographic method is proposed to guarantee the authenticity of photographic images represented in the form of digital files. In this method, a digital camera generates the original data from an image in a standard public format and also produces a coded signature with which the standard-format image data can be verified. The scheme also helps protect against other forms of lying, such as attaching false captions.

  18. A Prototype of Mathematical Treatment of Pen Pressure Data for Signature Verification.

    PubMed

    Li, Chi-Keung; Wong, Siu-Kay; Chim, Lai-Chu Joyce

    2018-01-01

    A prototype using simple mathematical treatment of the pen pressure data recorded by a digital pen movement recording device was derived. In this study, a total of 48 sets of signature and initial specimens were collected. Pearson's correlation coefficient was used to compare the data of the pen pressure patterns. From the 820 pair comparisons of the 48 sets of genuine signatures, a high degree of matching was found in which 95.4% (782 pairs) and 80% (656 pairs) had rPA > 0.7 and rPA > 0.8, respectively. In the comparison of the 23 forged signatures with their corresponding control signatures, 20 of them (89.2% of pairs) had rPA values < 0.6, showing a lower degree of matching when compared with the results of the genuine signatures. The prototype could be used as a complementary technique to improve the objectivity of signature examination and also has a good potential to be developed as a tool for automated signature identification. © 2017 American Academy of Forensic Sciences.
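
    The mathematical treatment reported here is essentially Pearson's correlation between pen-pressure traces, which a few lines of Python can sketch: resample two traces to a common length and compute r, with thresholds in the spirit of the 0.7/0.8 values in the study (the resampling step and the threshold are our illustrative assumptions).

        import numpy as np

        def pressure_similarity(p1, p2, n=256):
            # Resample both pen-pressure traces to a common length, then
            # compute Pearson's correlation coefficient between them.
            g = np.linspace(0.0, 1.0, n)
            a = np.interp(g, np.linspace(0, 1, len(p1)), p1)
            b = np.interp(g, np.linspace(0, 1, len(p2)), p2)
            return np.corrcoef(a, b)[0, 1]

        # Hypothetical traces: a genuine pair should score high (e.g. r > 0.7).
        genuine = np.sin(np.linspace(0, 3 * np.pi, 300)) + 1.2
        repeat = genuine[::2] + np.random.normal(0, 0.05, 150)
        print(pressure_similarity(genuine, repeat))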

  19. Digital compression algorithms for HDTV transmission

    NASA Technical Reports Server (NTRS)

    Adkins, Kenneth C.; Shalkhauser, Mary JO; Bibyk, Steven B.

    1990-01-01

    Digital compression of video images is a possible avenue for high-definition television (HDTV) transmission. Compression needs to be optimized while picture quality remains high. Two techniques for compressing the digital images are explained, and comparisons are drawn between the human vision system and artificial compression techniques. Suggestions for improving compression algorithms through the use of neural and analog circuitry are given.

  20. Enhanced spectral resolution by high-dimensional NMR using the filter diagonalization method and "hidden" dimensions.

    PubMed

    Meng, Xi; Nguyen, Bao D; Ridge, Clark; Shaka, A J

    2009-01-01

    High-dimensional (HD) NMR spectra have poorer digital resolution than low-dimensional (LD) spectra, for a fixed amount of experiment time. This has led to "reduced-dimensionality" strategies, in which several LD projections of the HD NMR spectrum are acquired, each with higher digital resolution; an approximate HD spectrum is then inferred by some means. We propose a strategy that moves in the opposite direction, by adding more time dimensions to increase the information content of the data set, even if only a very sparse time grid is used in each dimension. The full HD time-domain data can be analyzed by the filter diagonalization method (FDM), yielding very narrow resonances along all of the frequency axes, even those with sparse sampling. Integrating over the added dimensions of HD FDM NMR spectra reconstitutes LD spectra with enhanced resolution, often more quickly than direct acquisition of the LD spectrum with a larger number of grid points in each of the fewer dimensions. If the extra dimensions do not appear in the final spectrum, and are used solely to boost information content, we propose the moniker hidden-dimension NMR. This work shows that HD peaks have unmistakable frequency signatures that can be detected as single HD objects by an appropriate algorithm, even though their patterns would be tricky for a human operator to visualize or recognize, and even if digital resolution in an HD FT spectrum is very coarse compared with natural line widths.
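
    As a sketch of the model underlying FDM (in our notation, not necessarily the authors'), the sampled time signal along one dimension is fitted to a sum of damped complex exponentials:

        \[
          c(n\tau) \;\approx\; \sum_{k=1}^{K} d_k \, e^{-i n \tau \omega_k},
          \qquad \omega_k = 2\pi\nu_k - i\gamma_k ,
        \]

    FDM recovers the amplitudes d_k and complex frequencies omega_k by recasting this harmonic-inversion fit as a small generalized eigenvalue problem in a spectral window, which is why narrow lines can be obtained even from sparsely sampled dimensions.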

  2. A quantum proxy group signature scheme based on an entangled five-qubit state

    NASA Astrophysics Data System (ADS)

    Wang, Meiling; Ma, Wenping; Wang, Lili; Yin, Xunru

    2015-09-01

    A quantum proxy group signature (QPGS) scheme based on controlled teleportation is presented, using entangled five-qubit states as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature, and verification. The security of the scheme is guaranteed by the entanglement correlations of the entangled five-qubit state, by secret keys based on quantum key distribution (QKD) and the one-time pad algorithm, all of which have been proven to be unconditionally secure, and by the anonymity of the signature.

  3. Implementation of digital image encryption algorithm using logistic function and DNA encoding

    NASA Astrophysics Data System (ADS)

    Suryadi, MT; Satria, Yudi; Fauzi, Muhammad

    2018-03-01

    Cryptography is a method to secure information that may be in the form of a digital image. Based on past research, in order to increase the security level of chaos-based and DNA-based encryption algorithms, an encryption algorithm using a logistic function and DNA encoding was proposed. The digital image encryption algorithm using a logistic function and DNA encoding scrambles the pixel values into DNA bases and scrambles them with DNA addition, DNA complement, and XOR operations. The logistic function in this algorithm is used as the random number generator needed in the DNA complement and XOR operations. The test results show that the PSNR values of the cipher images are 7.98-7.99 dB, the entropy values are close to 8, the histograms of the cipher images are uniformly distributed, and the correlation coefficients of the cipher images are near 0. Thus, the cipher image can be decrypted perfectly, and the encryption algorithm has good resistance to entropy attack and statistical attack.
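
    The logistic-map keystream at the heart of such schemes is easy to sketch. This shows only the chaotic keystream and the XOR stage; the DNA addition/complement stages are omitted, and the parameters x0 and r are hypothetical key values.

        import numpy as np

        def logistic_keystream(x0, r, n):
            # Iterate x <- r*x*(1-x) and quantize each chaotic state to one byte.
            ks = np.empty(n, dtype=np.uint8)
            x = x0
            for i in range(n):
                x = r * x * (1.0 - x)
                ks[i] = int(x * 256) % 256
            return ks

        pixels = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
        ks = logistic_keystream(x0=0.71, r=3.99, n=pixels.size)
        cipher = pixels ^ ks.reshape(pixels.shape)        # XOR stage of the scheme
        assert np.array_equal(cipher ^ ks.reshape(pixels.shape), pixels)  # decrypts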

  4. Queued History based Mediator Identification for an Incentive Attached peer to peer Electronic Coupon System

    NASA Astrophysics Data System (ADS)

    Shojima, Taiki; Ikkai, Yoshitomo; Komoda, Norihisa

    An incentive-attached peer-to-peer (P2P) electronic coupon system is proposed in which users forward e-coupons to potential users, with incentives provided to the mediators. Because the system is intended for a pure P2P environment, a service provider needs to acquire the distribution history for incentive payment by recording UserIDs (UIDs) in the e-coupons. This raises the problem of dishonest alteration of the distribution history. To solve this problem, the distribution history is realized as a pair of queues: the UID queue and the public key queue. Each element of the UID queue at the initial state consists of an index, a secret key, and a digital signature. When recording one's UID, the encrypted UID is enqueued to the UID queue together with a new digital signature created with the secret key of the dequeued element, so that no UID can be altered. The public key queue provides the functionality for validating digital signatures on mobile devices. This method makes it possible to certify both each UID and their sequence. The availability of the method is evaluated by quantifying risk reduction using Fault Tree Analysis (FTA), and the method is found to be better than common encryption methods.

  5. Exploring the Use of Radar for Physically-Based Nowcasting of Lightning Cessation

    NASA Technical Reports Server (NTRS)

    Schultz, Elise V.; Petersen, Walter A.; Carey, Lawrence D.

    2011-01-01

    NASA's Marshall Space Flight Center and the University of Alabama in Huntsville (UAHuntsville) are collaborating with the 45th Weather Squadron (45WS) at Cape Canaveral Air Force Station (CCAFS) to enable improved nowcasting of lightning cessation. This project centers on use of dual-polarimetric radar capabilities, and in particular, the new C-band dual polarimetric weather radar acquired by the 45WS. Special emphasis is placed on the development of a physically-based operational algorithm to predict lightning cessation. While previous studies have developed statistically based lightning cessation algorithms driven primarily by trending in the actual total lightning flash rate, we believe that dual polarimetric radar variables offer the possibility to improve existing algorithms through the inclusion of physically meaningful trends reflecting interactions between in-cloud electric fields and ice-microphysics. Specifically, decades of polarimetric radar research using propagation differential phase has demonstrated the presence of distinct phase and ice crystal alignment signatures in the presence of strong electric fields associated with lightning. One question yet to be addressed is: To what extent can propagation phase-based ice-crystal alignment signatures be used to nowcast the cessation of lightning activity in a given storm? Accordingly, data from the UAHuntsville Advanced Radar for Meteorological and Operational Research (ARMOR) along with the NASA-MSFC North Alabama Lightning Mapping Array are used in this study to investigate the radar signatures present before and after lightning cessation. Thus far our case study results suggest that the negative differential phase shift signature weakens and disappears after the analyzed storms ceased lightning production (i.e., after the last lightning flash occurred). This is a key observation because it suggests that while strong electric fields may still have been present, the lightning cessation signature was encompassed in the period of the polarimetric negative phase shift signature. To the extent this behavior is repeatable in other cases, even if only in a substantial fraction of those cases, the analysis suggests that differential propagation phase may prove to be a useful parameter for future lightning cessation algorithms. Indeed, a preliminary analysis of 15+ cases has shown additional indications of the weakening and disappearance of this ice alignment signature with lightning cessation. A summary of these case-study results is presented.

  6. A Subsystem Test Bed for Chinese Spectral Radioheliograph

    NASA Astrophysics Data System (ADS)

    Zhao, An; Yan, Yihua; Wang, Wei

    2014-11-01

    The Chinese Spectral Radioheliograph is a solar-dedicated radio interferometric array that will produce images of the Sun with high spatial, temporal, and spectral resolution simultaneously in the decimetre and centimetre wave range. Digital processing of the intermediate-frequency (IF) signal is an important part of a radio telescope. This paper describes a flexible, high-speed digital down conversion (DDC) system for the CSRH that applies complex mixing, parallel filtering, and decimation algorithms to process the IF signal, and incorporates canonic-signed-digit coding and a bit-plane method to improve efficiency. The DDC system is intended to be a subsystem test bed for simulation and testing for CSRH. Simulation software and FPGA-based hardware-description algorithms were written that use fewer hardware resources while achieving high performance, such as processing a high-speed (1 GHz) data stream with 10 MHz spectral resolution. An experiment with the test bed is illustrated using geostationary satellite data observed on March 20, 2014. Because the algorithms on the FPGA are easy to alter, the data can be recomputed with different digital signal processing algorithms to select the optimum algorithm.
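
    The DDC signal path (complex mixing, low-pass filtering, decimation) can be sketched in a few lines. Sample rate, local-oscillator frequency, and filter parameters below are illustrative; the CSRH's parallel and canonic-signed-digit FPGA implementation is not modeled.

        import numpy as np
        from scipy.signal import firwin, lfilter

        fs, f_lo, decim = 1.0e9, 250.0e6, 50             # illustrative parameters
        t = np.arange(100_000) / fs
        x = np.cos(2 * np.pi * 252.0e6 * t)              # IF input: a tone near the LO

        lo = np.exp(-2j * np.pi * f_lo * t)              # complex mixing to baseband
        taps = firwin(129, cutoff=8.0e6, fs=fs)          # low-pass (anti-alias) FIR
        y = lfilter(taps, 1.0, x * lo)[::decim]          # filter, then decimate

        # y now carries a 2 MHz complex baseband tone sampled at 20 MS/s.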

  7. Electronic medical archives: a different approach to applying re-signing mechanisms to digital signatures.

    PubMed

    Chen, Tzer-Long; Lin, Frank Y S

    2011-08-01

    Electronic medical records can be defined as a digital format of the traditionally paper-based anamneses, containing the history of a patient such as past illnesses, current health problems, and chronic treatments. An electronic anamnesis is meant to make the patient's health information more conveniently accessible and transferable between different medical institutions, and easier to keep for a long time. Because of this transferability and accessibility, fewer resources than before are needed to store patients' medical information. This also means that medical care providers can save funds on record-keeping and access a patient's medical background, shown directly on the computer screen, more quickly and easily. Overall, the service quality improves greatly. However, the use of electronic anamneses raises issues such as the relevant legal requirements and the security of the patient's confidential information. Because of these concerns, a secure medical networking scheme must be taken into consideration. Nowadays, administrators at medical institutions face growing challenges in monitoring computers and network systems because of dramatic advances in this field; for instance, a trusted third party may be authorized to access some medical records for a certain period of time. For security purposes, all the electronic medical records are protected with both public-key infrastructure (PKI) cryptography and digital signature techniques so as to ensure the records are well protected. Since signatures become invalid upon revocation or time expiration, the security of the records would under this premise become vulnerable. Hence, we propose a re-signing scheme whose purpose is to have an expiring digital signature re-signed in time, in keeping with the premise of not conflicting with laws, morals, and privacy while maintaining the security of the electronic medical records.
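
    One plausible reading of such a re-signing mechanism can be sketched as follows: before the old signature expires, verify it, then sign the record together with the prior signature and a timestamp so the evidential chain survives the expiry. This is our interpretation, not the authors' protocol; Ed25519 is used only for brevity.

        import time
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        record = b"anamnesis #1024"
        old_key = Ed25519PrivateKey.generate()
        old_sig = old_key.sign(record)

        def resign(record, old_sig, old_pub, new_key):
            old_pub.verify(old_sig, record)              # raises if the old signature is bad
            stamp = str(int(time.time())).encode()
            # The new signature covers the record, the prior signature, and the
            # re-signing time, preserving the chain of custody past expiry.
            return stamp, new_key.sign(record + old_sig + stamp)

        new_key = Ed25519PrivateKey.generate()
        stamp, new_sig = resign(record, old_sig, old_key.public_key(), new_key)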

  8. Radiologists' preferences for digital mammographic display. The International Digital Mammography Development Group.

    PubMed

    Pisano, E D; Cole, E B; Major, S; Zong, S; Hemminger, B M; Muller, K E; Johnston, R E; Walsh, R; Conant, E; Fajardo, L L; Feig, S A; Nishikawa, R M; Yaffe, M J; Williams, M B; Aylward, S R

    2000-09-01

    To determine the preferences of radiologists among eight different image processing algorithms applied to digital mammograms obtained for screening and diagnostic imaging tasks. Twenty-eight images representing histologically proved masses or calcifications were obtained by using three clinically available digital mammographic units. Images were processed and printed on film by using manual intensity windowing, histogram-based intensity windowing, mixture-model intensity windowing, peripheral equalization, multiscale image contrast amplification (MUSICA), contrast-limited adaptive histogram equalization, Trex processing, and unsharp masking. Twelve radiologists compared the processed digital images with screen-film mammograms obtained in the same patient for breast cancer screening and breast lesion diagnosis. For the screening task, screen-film mammograms were preferred to all digital presentations, but the acceptability of images processed with the Trex and MUSICA algorithms was not significantly different. All printed digital images were preferred to screen-film radiographs in the diagnosis of masses; mammograms processed with unsharp masking were significantly preferred. For the diagnosis of calcifications, no processed digital mammogram was preferred to screen-film mammograms. When digital mammograms were preferred to screen-film mammograms, radiologists selected different digital processing algorithms for each of three mammographic reading tasks and for different lesion types. Soft-copy display will eventually allow radiologists to select among these options more easily.
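
    Of the eight algorithms compared, unsharp masking (preferred for mass diagnosis) is simple enough to sketch: subtract a blurred copy to isolate detail, then add the detail back scaled. The gain and blur scale below are illustrative, not the study's settings.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def unsharp_mask(img, sigma=8.0, amount=1.5):
            # Detail = image minus its blurred copy; boost it and re-add.
            blurred = gaussian_filter(img, sigma)
            return np.clip(img + amount * (img - blurred), 0.0, 1.0)

        mammo = np.random.rand(512, 512)                 # stand-in for a digital mammogram
        enhanced = unsharp_mask(mammo)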

  9. Seismic and acoustic signal identification algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LADD,MARK D.; ALAM,M. KATHLEEN; SLEEFE,GERARD E.

    2000-04-03

    This paper will describe an algorithm for detecting and classifying seismic and acoustic signals for unattended ground sensors. The algorithm must be computationally efficient and continuously process a data stream in order to establish whether or not a desired signal has changed state (turned on or off). The paper will focus on describing a Fourier-based technique that compares the running power spectral density estimate of the data to a predetermined signature in order to determine if the desired signal has changed state. How to establish the signature and the detection thresholds will be discussed, as well as the theoretical statistics of the algorithm for the Gaussian noise case, with results from simulated data. Actual seismic data results will also be discussed along with techniques used to reduce false alarms due to the inherent nonstationary noise environments found with actual data.
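
    A minimal sketch of such a Fourier detector: estimate the running power spectral density of each incoming block with Welch's method and compare it to a stored signature, declaring a state change when the similarity crosses a threshold. The signal, signature, and threshold below are hypothetical.

        import numpy as np
        from scipy.signal import welch

        fs = 1000.0

        def psd(block):
            _, p = welch(block, fs=fs, nperseg=256)
            return p / p.sum()                           # shape-only spectral estimate

        # Stored "on-state" signature, e.g. a 60 Hz machinery line.
        signature = psd(np.sin(2 * np.pi * 60 * np.arange(4096) / fs))

        def state(block, threshold=0.9):
            # Correlate the running PSD with the signature; above threshold => "on".
            r = np.corrcoef(psd(block), signature)[0, 1]
            return "on" if r > threshold else "off"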

  10. Overview of Digital Forensics Algorithms in DSLR Cameras

    NASA Astrophysics Data System (ADS)

    Aminova, E.; Trapeznikov, I.; Priorov, A.

    2017-05-01

    The widespread use of mobile technologies and the improvement of digital photo devices have led to more frequent cases of image falsification, including in judicial practice. Consequently, an important task for up-to-date digital image processing tools is the development of algorithms for determining the source and model of a DSLR (Digital Single Lens Reflex) camera and for improving image formation algorithms. Most research in this area is based on the observation that a unique sensor trace of a DSLR camera can be extracted at a certain stage of the in-camera imaging process. The study focuses on the problem of determining unique features of DSLR cameras based on optical subsystem artifacts and sensor noise.
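
    A heavily simplified version of the sensor-trace idea (in the spirit of PRNU fingerprinting, which this entry does not name explicitly): estimate a camera fingerprint as the mean denoising residual over known images, then test a questioned image by correlating its residual against that fingerprint. Real forensic pipelines use wavelet denoising and far more careful statistics; everything below is illustrative.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def residual(img):
            # Noise residual: image minus a denoised (smoothed) version of itself.
            return img - gaussian_filter(img, sigma=2.0)

        flats = [np.random.rand(256, 256) for _ in range(10)]  # stand-in camera images
        fingerprint = np.mean([residual(f) for f in flats], axis=0)

        def same_camera(img, threshold=0.01):
            # Crude source test: correlation of the questioned residual with the fingerprint.
            r = np.corrcoef(residual(img).ravel(), fingerprint.ravel())[0, 1]
            return r > threshold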

  11. Advanced digital SAR processing study

    NASA Technical Reports Server (NTRS)

    Martinson, L. W.; Gaffney, B. P.; Liu, B.; Perry, R. P.; Ruvin, A.

    1982-01-01

    A highly programmable, land-based, real-time synthetic aperture radar (SAR) processor requiring a processed pixel rate of 2.75 MHz or more in a four-look system was designed. Variations in range and azimuth compression, number of looks, range swath, range migration, and SAR mode were specified. Alternative range and azimuth processing algorithms were examined in conjunction with projected integrated-circuit, digital-architecture, and software technologies. The advanced digital SAR processor (ADSP) employs an FFT convolver algorithm for both range and azimuth processing in a parallel architecture configuration. Algorithm performance comparisons, system design, implementation tradeoffs, and the results of a supporting survey of integrated-circuit and digital-architecture technologies are reported. Cost tradeoffs and projections with alternate implementation plans are presented.
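
    The FFT convolver at the heart of such a design reduces compression to frequency-domain multiplication with the conjugate spectrum of the reference pulse. A minimal range-compression sketch with illustrative chirp parameters (azimuth compression follows the same pattern with a different reference function):

        import numpy as np

        fs, T, B = 10e6, 20e-6, 4e6                      # illustrative chirp parameters
        t = np.arange(int(T * fs)) / fs
        chirp = np.exp(1j * np.pi * (B / T) * t**2)      # reference LFM pulse

        rx = np.zeros(2048, complex)
        rx[300:300 + chirp.size] = chirp                 # echo placed at range bin 300

        # FFT convolver: multiply by the conjugate reference spectrum, inverse FFT.
        n = rx.size
        compressed = np.fft.ifft(np.fft.fft(rx, n) * np.conj(np.fft.fft(chirp, n)))
        print(int(np.argmax(np.abs(compressed))))        # compressed peak at bin 300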

  12. Thermal imaging as a biometrics approach to facial signature authentication.

    PubMed

    Guzman, A M; Goryawala, M; Wang, Jin; Barreto, A; Andrian, J; Rishe, N; Adjouadi, M

    2013-01-01

    A new thermal imaging framework with unique feature extraction and similarity measurements for face recognition is presented. The research premise is to design specialized algorithms that would extract vasculature information, create a thermal facial signature and identify the individual. The proposed algorithm is fully integrated and consolidates the critical steps of feature extraction through the use of morphological operators, registration using the Linear Image Registration Tool, and matching through unique similarity measures designed for this task. The novel approach of developing a thermal signature template using four images taken at various instants of time ensured that unforeseen changes in the vasculature over time did not affect the biometric matching process, as the authentication process relied only on consistent thermal features. Thirteen subjects were used for testing the developed technique on an in-house thermal imaging system. The matching using the similarity measures showed an average accuracy of 88.46% for skeletonized signatures and 90.39% for anisotropically diffused signatures. The highly accurate results obtained in the matching process clearly demonstrate the ability of the thermal infrared system to extend in application to other thermal imaging based systems. Empirical results applying this approach to an existing database of thermal images prove this assertion.

  13. Genome signature analysis of thermal virus metagenomes reveals Archaea and thermophilic signatures

    PubMed Central

    Pride, David T; Schoenfeld, Thomas

    2008-01-01

    Background: Metagenomic analysis provides a rich source of biological information for otherwise intractable viral communities. However, study of viral metagenomes has been hampered by its nearly complete reliance on BLAST algorithms for identification of DNA sequences. We sought to develop algorithms for examination of viral metagenomes to identify the origin of sequences independent of BLAST algorithms. We chose viral metagenomes obtained from two hot springs, Bear Paw and Octopus, in Yellowstone National Park, as they represent simple microbial populations where comparatively large contigs were obtained. Thermal spring metagenomes have high proportions of sequences without significant Genbank homology, which has hampered identification of viruses and their linkage with hosts. To analyze each metagenome, we developed a method to classify DNA fragments using genome signature-based phylogenetic classification (GSPC), where metagenomic fragments are compared to a database of oligonucleotide signatures for all previously sequenced Bacteria, Archaea, and viruses. Results: From both Bear Paw and Octopus hot springs, each assembled contig had more similarity to other metagenome contigs than to any sequenced microbial genome based on GSPC analysis, suggesting a genome signature common to each of these extreme environments. While viral metagenomes from Bear Paw and Octopus share some similarity, the genome signatures from each locale are largely unique. GSPC using a microbial database predicts most of the Octopus metagenome has archaeal signatures, while bacterial signatures predominate in Bear Paw; a finding consistent with those of Genbank BLAST. When using a viral database, the majority of the Octopus metagenome is predicted to belong to archaeal virus Families Globuloviridae and Fuselloviridae, while none of the Bear Paw metagenome is predicted to belong to archaeal viruses. As expected, when microbial and viral databases are combined, each of the Octopus and Bear Paw metagenomic contigs are predicted to belong to viruses rather than to any Bacteria or Archaea, consistent with the apparent viral origin of both metagenomes. Conclusion: That BLAST searches identify no significant homologs for most metagenome contigs, while GSPC suggests their origin as archaeal viruses or bacteriophages, indicates GSPC provides a complementary approach in viral metagenomic analysis. PMID:18798991
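
    The signature comparison underlying GSPC can be sketched in a toy form: represent each sequence as a tetranucleotide frequency vector and classify a fragment by its nearest reference signature. The reference sequences and the correlation-based nearest-neighbor rule below are illustrative, not the paper's full phylogenetic classifier.

        from itertools import product
        import numpy as np

        KMERS = ["".join(p) for p in product("ACGT", repeat=4)]
        INDEX = {k: i for i, k in enumerate(KMERS)}

        def signature(seq):
            # Tetranucleotide frequency vector: the "genome signature".
            v = np.zeros(len(KMERS))
            for i in range(len(seq) - 3):
                v[INDEX[seq[i:i + 4]]] += 1
            return v / max(v.sum(), 1)

        def classify(fragment, refs):
            # Nearest reference signature by Pearson correlation.
            s = signature(fragment)
            return max(refs, key=lambda name: np.corrcoef(s, refs[name])[0, 1])

        refs = {"archaeal_virus": signature("ACGT" * 500),
                "bacteriophage": signature("AAGTC" * 400)}   # hypothetical references
        print(classify("ACGTACGAACGT" * 40, refs))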

  15. Influence Of Momentum Excess On The Pattern And Dynamics Of Intermediate-Range Stratified Wakes

    DTIC Science & Technology

    2016-06-01

    Momentum excess is used in order to model the fundamental differences between signatures generated by towed and self-propelled bodies in various ocean states; the results can be used on the operational level for developing and improving algorithms for non-acoustic signature prediction and detection.

  16. GESearch: An Interactive GUI Tool for Identifying Gene Expression Signature.

    PubMed

    Ye, Ning; Yin, Hengfu; Liu, Jingjing; Dai, Xiaogang; Yin, Tongming

    2015-01-01

    The huge amount of gene expression data generated by microarray and next-generation sequencing technologies presents challenges to exploiting their biological meanings. When searching for coexpression genes, the data mining process is largely affected by the selection of algorithms. Thus, it is highly desirable to provide multiple options of algorithms in a user-friendly analytical toolkit to explore gene expression signatures. For this purpose, we developed GESearch, an interactive graphical user interface (GUI) toolkit, which is written in MATLAB and supports a variety of gene expression data files. This analytical toolkit provides four models, including the mean, the regression, the delegate, and the ensemble models, to identify the coexpression genes, and enables the users to filter data and to select gene expression patterns by browsing the display window or by importing knowledge-based genes. Subsequently, the utility of this analytical toolkit is demonstrated by analyzing two sets of real-life microarray datasets from cell-cycle experiments. Overall, we have developed an interactive GUI toolkit that allows for choosing multiple algorithms for analyzing the gene expression signatures.

  17. Age and gender-invariant features of handwritten signatures for verification systems

    NASA Astrophysics Data System (ADS)

    AbdAli, Sura; Putz-Leszczynska, Joanna

    2014-11-01

    The handwritten signature is one of the most natural biometrics in the study of human physiological and behavioral patterns. Behavioral biometrics includes signatures, which may differ with the owner's gender or age because of intrinsic or extrinsic factors. This paper presents the results of the authors' research on the influence of age and gender on verification factors. The experiments in this research were conducted using a database that contains signatures and their associated metadata. The algorithm used is based on the universal forgery feature idea, where a global classifier is able to classify a signature as genuine or as a forgery without actual knowledge of the signature template and its owner. Additionally, the reduction of the dimensionality with the MRMR method is discussed.

  18. Advanced power system protection and incipient fault detection and protection of spaceborne power systems

    NASA Technical Reports Server (NTRS)

    Russell, B. Don

    1989-01-01

    This research concentrated on the application of advanced signal processing, expert system, and digital technologies for the detection and control of low grade, incipient faults on spaceborne power systems. The researchers have considerable experience in the application of advanced digital technologies and the protection of terrestrial power systems. This experience was used in the current contracts to develop new approaches for protecting the electrical distribution system in spaceborne applications. The project was divided into three distinct areas: (1) investigate the applicability of fault detection algorithms developed for terrestrial power systems to the detection of faults in spaceborne systems; (2) investigate the digital hardware and architectures required to monitor and control spaceborne power systems with full capability to implement new detection and diagnostic algorithms; and (3) develop a real-time expert operating system for implementing diagnostic and protection algorithms. Significant progress has been made in each of the above areas. Several terrestrial fault detection algorithms were modified to better adapt to spaceborne power system environments. Several digital architectures were developed and evaluated in light of the fault detection algorithms.

  19. Super-Nyquist shaping and processing technologies for high-spectral-efficiency optical systems

    NASA Astrophysics Data System (ADS)

    Jia, Zhensheng; Chien, Hung-Chang; Zhang, Junwen; Dong, Ze; Cai, Yi; Yu, Jianjun

    2013-12-01

    Implementations of super-Nyquist pulse generation, either in the digital domain using a digital-to-analog converter (DAC) or with an optical filter at the transmitter side, are introduced. Three corresponding receiver-side signal processing algorithms are presented and compared for high-spectral-efficiency (SE) optical systems employing spectral prefiltering. These algorithms are designed to mitigate the inter-symbol-interference (ISI) and inter-channel-interference (ICI) impairments caused by the bandwidth constraint, and comprise a 1-tap constant modulus algorithm (CMA) with 3-tap maximum likelihood sequence estimation (MLSE), a regular CMA and digital filter with 2-tap MLSE, and a constant multi-modulus algorithm (CMMA) with 2-tap MLSE. The principles and prefiltering tolerance are given through numerical and experimental results.
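
    The CMA common to these receivers is a short stochastic-gradient loop. A minimal sketch, generalized to a small FIR equalizer (tap count, step size, and modulus are illustrative; the MLSE stage that follows in the paper is not shown):

        import numpy as np

        def cma_equalize(x, ntaps=7, mu=1e-3, R2=1.0):
            # Constant modulus algorithm: drive |y|^2 toward the target modulus R2.
            w = np.zeros(ntaps, complex)
            w[ntaps // 2] = 1.0                          # center-spike initialization
            y = np.empty(len(x) - ntaps, complex)
            for n in range(len(y)):
                xb = x[n:n + ntaps][::-1]                # most recent samples first
                y[n] = np.vdot(w, xb)                    # equalizer output y = w^H x
                err = y[n] * (R2 - abs(y[n]) ** 2)       # CM error term
                w += mu * np.conj(err) * xb              # stochastic-gradient update
            return y

    In the super-Nyquist receivers above, such an equalizer would be followed by the 2- or 3-tap MLSE to resolve the controlled ISI left by the prefiltering.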

  20. Low-Light Image Enhancement Using Adaptive Digital Pixel Binning

    PubMed Central

    Yoo, Yoonjong; Im, Jaehyun; Paik, Joonki

    2015-01-01

    This paper presents an image enhancement algorithm for low-light scenes in an environment with insufficient illumination. Simple amplification of intensity exhibits various undesired artifacts: noise amplification, intensity saturation, and loss of resolution. In order to enhance low-light images without undesired artifacts, a novel digital binning algorithm is proposed that considers brightness, context, noise level, and anti-saturation of a local region in the image. The proposed algorithm does not require any modification of the image sensor or additional frame-memory; it needs only two line-memories in the image signal processor (ISP). Since the proposed algorithm does not use an iterative computation, it can be easily embedded in an existing digital camera ISP pipeline containing a high-resolution image sensor. PMID:26121609
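
    A stripped-down version of the binning idea: average 2x2 pixel blocks and blend the result with the original frame, weighting toward binning where the scene is dark (and therefore noisy). This captures only the brightness-adaptive flavor; the paper's context, noise-level, and anti-saturation terms are omitted, and `dark_level` is an illustrative parameter.

        import numpy as np

        def adaptive_binning(img, dark_level=0.25):
            # 2x2 average binning, upsampled back to full size.
            h, w = img.shape
            binned = img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
            binned = binned.repeat(2, axis=0).repeat(2, axis=1)
            # Blend: more binning (alpha -> 1) in darker, noisier regions.
            alpha = np.clip(1.0 - img / dark_level, 0.0, 1.0)
            return (1.0 - alpha) * img + alpha * binned

        low_light = np.random.rand(480, 640) * 0.2       # stand-in low-light frame
        out = adaptive_binning(low_light)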

  1. Authenticity techniques for PACS images and records

    NASA Astrophysics Data System (ADS)

    Wong, Stephen T. C.; Abundo, Marco; Huang, H. K.

    1995-05-01

    Along with the digital radiology environment supported by picture archiving and communication systems (PACS) comes a new problem: How to establish trust in multimedia medical data that exist only in the easily altered memory of a computer. Trust is characterized in terms of integrity and privacy of digital data. Two major self-enforcing techniques can be used to assure the authenticity of electronic images and text -- key-based cryptography and digital time stamping. Key-based cryptography associates the content of an image with the originator using one or two distinct keys and prevents alteration of the document by anyone other than the originator. A digital time stamping algorithm generates a characteristic `digital fingerprint' for the original document using a mathematical hash function, and checks that it has not been modified. This paper discusses these cryptographic algorithms and their appropriateness for a PACS environment. It also presents experimental results of cryptographic algorithms on several imaging modalities.

  2. Histopathological Image Analysis: A Review

    PubMed Central

    Gurcan, Metin N.; Boucheron, Laura; Can, Ali; Madabhushi, Anant; Rajpoot, Nasir; Yener, Bulent

    2010-01-01

    Over the past decade, dramatic increases in computational power and improvement in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has now become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging to complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement to the opinion of the pathologist. In this paper, we review the recent state of the art CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology related problems being pursued in the United States and Europe. PMID:20671804

  3. Target discrimination of man-made objects using passive polarimetric signatures acquired in the visible and infrared spectral bands

    NASA Astrophysics Data System (ADS)

    Lavigne, Daniel A.; Breton, Mélanie; Fournier, Georges; Charette, Jean-François; Pichette, Mario; Rivet, Vincent; Bernier, Anne-Pier

    2011-10-01

    Surveillance operations and search and rescue missions regularly exploit electro-optic imaging systems to detect targets of interest in both the civilian and military communities. By incorporating the polarization of light as supplementary information to such electro-optic imaging systems, it is possible to increase their target discrimination capabilities, considering that man-made objects are known to depolarize light in a different manner than natural backgrounds. Since electromagnetic radiation emitted and reflected from a smooth surface observed near a grazing angle becomes partially polarized in the visible and infrared wavelength bands, additional information about the shape, roughness, shading, and surface temperatures of difficult targets can be extracted by effectively processing such reflected/emitted polarized signatures. This paper presents a set of polarimetric image processing algorithms devised to extract meaningful information from a broad range of man-made objects. Passive polarimetric signatures are acquired in the visible, shortwave infrared, midwave infrared, and longwave infrared bands using a fully automated imaging system developed at DRDC Valcartier. A fusion algorithm is used to enable the discrimination of some objects lying in shadowed areas. Performance metrics, derived from the computed Stokes parameters, characterize the degree of polarization of man-made objects. Field experiments conducted during winter and summer time demonstrate: 1) the utility of the imaging system to collect polarized signatures of different objects in the visible and infrared spectral bands, and 2) the enhanced performance of target discrimination and fusion algorithms to exploit the polarized signatures of man-made objects against cluttered backgrounds.
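
    For reference, the Stokes parameters and the degree of linear polarization from which such metrics are typically derived can be computed from four analyzer orientations as follows; this is the standard textbook formulation, with array names assumed:

        import numpy as np

        def degree_of_linear_polarization(i0, i45, i90, i135):
            """Stokes parameters and degree of linear polarization (DoLP) from
            intensity images taken through a polarizer at 0/45/90/135 deg."""
            s0 = i0 + i90                      # total intensity
            s1 = i0 - i90                      # horizontal vs. vertical
            s2 = i45 - i135                    # +45 vs. -45 degrees
            dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
            return s0, s1, s2, dolp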

  4. Are We Ready for Another Change? Digital Signatures Can Change How We Handle the Academic Record

    ERIC Educational Resources Information Center

    Black, Thomas C.; Mohr, John

    2004-01-01

    In this electronic age, where information is digital and service is virtual, the registrar profession is changing rapidly to keep up with increasing standards and expectations. EDI and now XML standards enable system-to-system exchanges of academic records information. While many of the registrar's profession display student academic records under…

  5. The geometric signature: Quantifying landslide-terrain types from digital elevation models

    USGS Publications Warehouse

    Pike, R.J.

    1988-01-01

    Topography of various types and scales can be fingerprinted by computer analysis of altitude matrices (digital elevation models, or DEMs). The critical analytic tool is the geometric signature, a set of measures that describes topographic form well enough to distinguish among geomorphically disparate landscapes. Different surficial processes create topography with diagnostic forms that are recognizable in the field. The geometric signature abstracts those forms from contour maps or their DEMs and expresses them numerically. This multivariate characterization enables once-intractable problems to be addressed. The measures that constitute a geometric signature express different but complementary attributes of topographic form. Most parameters used here are statistical estimates of central tendency and dispersion for five major categories of terrain geometry: altitude, altitude variance spectrum, slope between slope reversals, and slope and its curvature at fixed slope lengths. As an experimental application of geometric signatures, two mapped terrain types associated with different processes of shallow landsliding in Marin County, California, were distinguished consistently by a 17-variable description of topography from 21×21 DEMs (30-m grid spacing). The small matrix is a statistical window that can be used to scan large DEMs by computer, thus potentially automating the mapping of contrasting terrain types. The two types in Marin County host either (1) slow slides: earth flows and slump-earth flows, or (2) rapid flows: debris avalanches and debris flows. The signature approach should adapt to terrain taxonomy and mapping in other areas, where conditions differ from those in Central California. © 1988 International Association for Mathematical Geology.
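
    A minimal sketch of computing a few such signature measures from one DEM window (a simplified subset of the paper's 17 variables; the chosen measures and names are illustrative):

        import numpy as np

        def geometric_signature(dem, spacing=30.0):
            """Central-tendency/dispersion measures of a DEM window
            (e.g. 21x21 cells at 30-m spacing)."""
            dy, dx = np.gradient(dem, spacing)          # elevation derivatives
            slope = np.degrees(np.arctan(np.hypot(dx, dy)))
            return {"alt_mean": dem.mean(), "alt_std": dem.std(),
                    "slope_mean": slope.mean(), "slope_std": slope.std()}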

  6. Contextualising the topographic signature of historic mining, a scaling analysis

    NASA Astrophysics Data System (ADS)

    Reinhardt, Liam

    2017-04-01

    Mining is globally one of the most significant means by which humans alter landscapes; we do so through erosion (mining), transport, and deposition of extracted sediments (waste). The iconic Dartmoor mountain landscape of SW England (~700 km²) has experienced over 1000 years of shallow (Cu & Sn) mining that has left a pervasive imprint on the landscape. The availability of high-resolution digital elevation models (≤1 m) and aerial photographs (12.5 cm resolution), combined with historic records of mining activity and output, makes this an ideal location to investigate the topographic signature of mining. Conceptually I ask the question: how much (digital elevation model) smoothing is required to remove the human imprint from this landscape? While we may have entered the Anthropocene, other gravity-driven processes have imparted distinct scale-dependent signatures. How might the human signature differ from these processes and how pervasive is it at the landscape scale? Spatial scaling analysis (curvature & semi-variance) was used to quantify the topographic signature of historic mining and to determine how it differs from a) natural landforms such as bedrock tors; and b) the morphology of biological activity (e.g. peat formation). Other forms of historic activity such as peat cutting and quarrying were also investigated. The existence of 400 years of mine activity archives also makes it possible to distinguish between the imprints of differing forms of mine technology and their spatio-temporal signatures. Interestingly, the higher-technology 19th C mines have left a much smaller topographic legacy than Medieval miners, though the former had a much greater impact in terms of heavy metal contamination.
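
    A sketch of the semi-variance half of such a scaling analysis, assuming a gridded DEM and computing the empirical semivariogram along grid rows (a one-directional simplification of a full two-dimensional analysis):

        import numpy as np

        def semivariogram(dem, max_lag=50):
            """Empirical semivariance along grid rows:
            gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2]."""
            gamma = np.empty(max_lag)
            for h in range(1, max_lag + 1):
                d = dem[:, h:] - dem[:, :-h]          # all pairs at lag h
                gamma[h - 1] = 0.5 * np.mean(d ** 2)
            return gamma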

  7. Image compression evaluation for digital cinema: the case of Star Wars: Episode II

    NASA Astrophysics Data System (ADS)

    Schnuelle, David L.

    2003-05-01

    A program of evaluation of compression algorithms proposed for use in a digital cinema application is described and the results presented in general form. The work was intended to aid in the selection of a compression system to be used for the digital cinema release of Star Wars: Episode II, in May 2002. An additional goal was to provide feedback to the algorithm proponents on what parameters and performance levels the feature film industry is looking for in digital cinema compression. The primary conclusion of the test program is that any of the current digital cinema compression proponents will work for digital cinema distribution to today's theaters.

  8. Optimizing of a high-order digital filter using PSO algorithm

    NASA Astrophysics Data System (ADS)

    Xu, Fuchun

    2018-04-01

    A self-adaptive high-order digital filter, which offers the opportunity to simplify the process of tuning parameters and further improve the noise performance, is presented in this paper. The parameters of traditional digital filters are mainly tuned by complex calculation, whereas this paper presents a 5th-order digital filter whose parameters are optimized by a swarm intelligence algorithm. Simulation results for the proposed 5th-order digital filter show an SNR > 122 dB and a noise floor below -170 dB in the frequency range of 5-150 Hz. In further simulation, the robustness of the proposed 5th-order digital filter is analyzed.
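
    A minimal particle swarm optimization (PSO) loop of the kind that could drive such coefficient tuning is sketched below; this is generic textbook PSO, with the cost function, bounds, and hyperparameters as placeholders rather than the paper's settings:

        import numpy as np

        def pso(cost, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-1, 1)):
            """Minimal particle swarm optimizer (minimization)."""
            lo, hi = bounds
            x = np.random.uniform(lo, hi, (n, dim))       # particle positions
            v = np.zeros((n, dim))                        # velocities
            pbest, pcost = x.copy(), np.array([cost(p) for p in x])
            g = pbest[pcost.argmin()]                     # global best
            for _ in range(iters):
                r1, r2 = np.random.rand(n, dim), np.random.rand(n, dim)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                c = np.array([cost(p) for p in x])
                better = c < pcost                        # update personal bests
                pbest[better], pcost[better] = x[better], c[better]
                g = pbest[pcost.argmin()]
            return g, pcost.min()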

  9. Improvements in estimating proportions of objects from multispectral data

    NASA Technical Reports Server (NTRS)

    Horwitz, H. M.; Hyde, P. D.; Richardson, W.

    1974-01-01

    Methods for estimating proportions of objects and materials imaged within the instantaneous field of view of a multispectral sensor were developed further. Improvements in the basic proportion estimation algorithm were devised as well as improved alien object detection procedures. Also, a simplified signature set analysis scheme was introduced for determining the adequacy of signature set geometry for satisfactory proportion estimation. Averaging procedures used in conjunction with the mixtures algorithm were examined theoretically and applied to artificially generated multispectral data. A computationally simpler estimator was considered and found unsatisfactory. Experiments conducted to find a suitable procedure for setting the alien object threshold yielded little definitive result. Mixtures procedures were used on a limited amount of ERTS data to estimate wheat proportion in selected areas. Results were unsatisfactory, partly because of the ill-conditioned nature of the pure signature set.
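
    A common modern formulation of the proportion-estimation step is nonnegative least-squares unmixing followed by normalization; the sketch below illustrates that formulation and is not the ERIM-era algorithm itself:

        import numpy as np
        from scipy.optimize import nnls

        def estimate_proportions(signatures, pixel):
            """Estimate material proportions within one pixel: solve
            min ||S p - x|| subject to p >= 0, then normalize to sum to 1.

            signatures : (n_bands, n_materials) matrix of pure signatures
            pixel      : (n_bands,) observed multispectral vector
            """
            p, _ = nnls(signatures, pixel)
            return p / p.sum() if p.sum() > 0 else p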

  10. Comparison of Neural Networks and Tabular Nearest Neighbor Encoding for Hyperspectral Signature Classification in Unresolved Object Detection

    NASA Astrophysics Data System (ADS)

    Schmalz, M.; Ritter, G.; Key, R.

    Accurate and computationally efficient spectral signature classification is a crucial step in the nonimaging detection and recognition of spaceborne objects. In classical hyperspectral recognition applications using linear mixing models, signature classification accuracy depends on accurate spectral endmember discrimination [1]. If the endmembers cannot be classified correctly, then the signatures cannot be classified correctly, and object recognition from hyperspectral data will be inaccurate. In practice, the number of endmembers accurately classified often depends linearly on the number of inputs. This can lead to potentially severe classification errors in the presence of noise or densely interleaved signatures. In this paper, we present a comparison of emerging technologies for nonimaging spectral signature classification based on a highly accurate, efficient search engine called Tabular Nearest Neighbor Encoding (TNE) [3,4] and a neural network technology called Morphological Neural Networks (MNNs) [5]. Based on prior results, TNE can optimize its classifier performance to track input nonergodicities, as well as yield measures of confidence or caution for evaluation of classification results. Unlike neural networks, TNE does not have a hidden intermediate data structure (e.g., the neural net weight matrix). Instead, TNE generates and exploits a user-accessible data structure called the agreement map (AM), which can be manipulated by Boolean logic operations to effect accurate classifier refinement algorithms. The open architecture and programmability of TNE's agreement map processing allows a TNE programmer or user to determine classification accuracy, as well as characterize in detail the signatures for which TNE did not obtain classification matches, and why such mismatches occurred. In this study, we compare TNE- and MNN-based endmember classification, using performance metrics such as probability of correct classification (Pd) and rate of false detections (Rfa). As proof of principle, we analyze classification of multiple closely spaced signatures from a NASA database of space material signatures. Additional analysis pertains to computational complexity and noise sensitivity, which are superior to Bayesian techniques based on classical neural networks. [1] Winter, M.E. "Fast autonomous spectral end-member determination in hyperspectral data," in Proceedings of the 13th International Conference on Applied Geologic Remote Sensing, Vancouver, B.C., Canada, pp. 337-44 (1999). [2] Keshava, N. "A survey of spectral unmixing algorithms," Lincoln Laboratory Journal 14:55-78 (2003). [3] Key, G., M.S. Schmalz, F.M. Caimi, and G.X. Ritter. "Performance analysis of tabular nearest neighbor encoding algorithm for joint compression and ATR," in Proceedings SPIE 3814:115-126 (1999). [4] Schmalz, M.S. and G. Key. "Algorithms for hyperspectral signature classification in unresolved object detection using tabular nearest neighbor encoding," in Proceedings of the 2007 AMOS Conference, Maui, HI (2007). [5] Ritter, G.X., G. Urcid, and M.S. Schmalz. "Autonomous single-pass endmember approximation using lattice auto-associative memories," Neurocomputing (Elsevier), accepted (June 2008).

  11. Large Scale Assessment of Radio Frequency Interference Signatures in L-band SAR Data

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Nicoll, J.

    2011-12-01

    Imagery of L-band Synthetic Aperture Radar (SAR) systems such as the PALSAR sensor on board the Advanced Land Observing Satellite (ALOS) has proven to be a valuable tool for observing environmental changes around the globe. Besides offering 24/7 operability, the L-band frequency provides improved interferometric coherence, and L-band polarimetric data has shown great potential for vegetation monitoring, sea ice classification, and the observation of glaciers and ice sheets. To maximize the benefit of missions such as ALOS PALSAR for environmental monitoring, data consistency and calibration are vital. Unfortunately, radio frequency interference (RFI) signatures from ground-based radar systems regularly impair L-band SAR data quality and consistency. With this study we present a large-scale analysis of typical RFI signatures that are regularly observed in L-band SAR data over the Americas. Through a study of the vast archive of L-band SAR data in the US Government Research Consortium (USGRC) data pool at the Alaska Satellite Facility (ASF) we were able to address the following research goals: 1. Assessment of RFI Signatures in L-band SAR data and their Effects on SAR Data Quality: An analysis of time-frequency properties of RFI signatures in L-band SAR data of the USGRC data pool is presented. It is shown that RFI-filtering algorithms implemented in the operational ALOS PALSAR processor are not sufficient to remove all RFI-related artifacts. In examples, the deleterious effects of RFI on SAR image quality, polarimetric signature, SAR phase, and interferometric coherence are presented. 2. Large-Scale Assessment of Severity, Spatial Distribution, and Temporal Variation of RFI Signatures in L-band SAR data: L-band SAR data in the USGRC data pool were screened for RFI using a custom algorithm. Per SAR frame, the algorithm creates geocoded frame bounding boxes that are color-coded according to RFI intensity and converted to KML files for analysis in Google Earth. From the screening results, parameters such as RFI severity and spatial distribution of RFI were derived. Through a comparison of RFI signatures in older SAR data from JAXA's Japanese Earth Resources Satellite (JERS-1) and recent ALOS PALSAR data, changes in RFI signatures in the Americas were derived, indicating a strong increase of L-band signal contamination over time. 3. An Optimized RFI Filter and its Performance in Data Restoration: An optimized RFI filter has been developed and tested at ASF. The algorithm has proven to be effective in detecting and removing RFI signatures in L-band SAR data and restoring the advertised quality of SAR imagery, polarization, and interferometric phase. The properties of the RFI filter will be described and its performance will be demonstrated in examples. The presented work is a prime example of large-scale research that is made possible by the availability of SAR data through the extensive data archive of the USGRC data pool at ASF.

  12. Adaptive Aft Signature Shaping of a Low-Boom Supersonic Aircraft Using Off-Body Pressures

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu

    2012-01-01

    The design and optimization of a low-boom supersonic aircraft using the state-of-the-art off-body aerodynamics and sonic boom analysis has long been a challenging problem. The focus of this paper is to demonstrate an effective geometry parameterization scheme and a numerical optimization approach for the aft shaping of a low-boom supersonic aircraft using off-body pressure calculations. A gradient-based numerical optimization algorithm that models the objective and constraints as response surface equations is used to drive the aft ground signature toward a ramp shape. The design objective is the minimization of the variation between the ground signature and the target signature subject to several geometric and signature constraints. The target signature is computed by using a least-squares regression of the aft portion of the ground signature. The parameterization and the deformation of the geometry is performed with a NASA in-house shaping tool. The optimization algorithm uses the shaping tool to drive the geometric deformation of a horizontal tail with a parameterization scheme that consists of seven camber design variables and an additional design variable that describes the spanwise location of the midspan section. The demonstration cases show that numerical optimization using the state-of-the-art off-body aerodynamic calculations is not only feasible and repeatable but also allows the exploration of complex design spaces for which a knowledge-based design method becomes less effective.

  13. Direction of Radio Finding via MUSIC (Multiple Signal Classification) Algorithm for Hardware Design System

    NASA Astrophysics Data System (ADS)

    Zhang, Zheng

    2017-10-01

    Radio direction finding systems are based on digital signal processing algorithms, which make such systems capable of locating and tracking signals. The performance of radio direction finding therefore depends significantly on the effectiveness of the underlying digital signal processing algorithms. Direction of Arrival (DOA) algorithms are used to estimate the number of plane waves incident on the antenna array and their angles of incidence. This manuscript investigates the implementation of the MUSIC DOA algorithm on a uniform linear array in the presence of white noise. The experimental results show that the MUSIC algorithm estimated the radio direction well.
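
    A minimal numpy sketch of the MUSIC pseudospectrum for a uniform linear array is given below; this is the standard formulation, with the element spacing, scan grid, and names as assumptions:

        import numpy as np

        def music_spectrum(X, n_sources, d=0.5, angles=np.linspace(-90, 90, 361)):
            """MUSIC pseudospectrum for a uniform linear array.

            X : (n_sensors, n_snapshots) complex array data
            d : element spacing in wavelengths
            """
            m, n = X.shape
            R = X @ X.conj().T / n                    # sample covariance
            w, v = np.linalg.eigh(R)                  # ascending eigenvalues
            En = v[:, : m - n_sources]                # noise subspace
            p = np.empty(angles.size)
            for i, th in enumerate(np.radians(angles)):
                a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(th))  # steering
                p[i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
            return angles, p                          # peaks mark the DOAs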

  14. Distributed Pheromone-Based Swarming Control of Unmanned Air and Ground Vehicles for RSTA

    DTIC Science & Technology

    2008-03-20

    Forthcoming in Proceedings of SPIE Defense & Security Conference, March 2008, Orlando, FL Distributed Pheromone-Based Swarming Control of Unmanned...describes recent advances in a fully distributed digital pheromone algorithm that has demonstrated its effectiveness in managing the complexity of...onboard digital pheromone responding to the needs of the automatic target recognition algorithms. UAVs and UGVs controlled by the same pheromone algorithm

  15. DEMON-type algorithms for determination of hydro-acoustic signatures of surface ships and of divers

    NASA Astrophysics Data System (ADS)

    Slamnoiu, G.; Radu, O.; Rosca, V.; Pascu, C.; Damian, R.; Surdu, G.; Curca, E.; Radulescu, A.

    2016-08-01

    With the project “System for detection, localization, tracking and identification of risk factors for strategic importance in littoral areas”, developed in the National Programme II, the members of the research consortium intend to develop a functional model of a hydroacoustic passive subsystem for determination of the acoustic signatures of targets such as fast boats and autonomous divers. This paper presents some of the results obtained in the area of hydroacoustic signal processing using DEMON-type algorithms (Detection of Envelope Modulation On Noise). To evaluate the performance of various algorithm variations, we have used both audio recordings of the underwater noise generated by ships and divers in real situations and simulated noises. We have analysed the results of processing these signals using four DEMON algorithm structures as presented in the reference literature and a fifth DEMON algorithm structure proposed by the authors of this paper. The algorithm proposed by the authors generates results similar to those obtained with the traditional algorithms, but requires fewer computing resources and has proven to be more resilient to random noise.
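
    The basic DEMON chain common to the structures compared above (band-pass filtering, envelope detection, then the spectrum of the envelope) can be sketched as follows; the band edges and names are illustrative, and the authors' fifth structure is not reproduced here:

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def demon(x, fs, band=(2000.0, 8000.0)):
            """Basic DEMON chain: the spectrum of the envelope of band-passed
            cavitation noise reveals shaft/blade modulation lines."""
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], "bandpass")
            env = np.abs(hilbert(filtfilt(b, a, x)))   # envelope detection
            env -= env.mean()                          # drop the DC component
            spec = np.abs(np.fft.rfft(env))
            freqs = np.fft.rfftfreq(env.size, 1 / fs)
            return freqs, spec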

  16. Authentication of digital video evidence

    NASA Astrophysics Data System (ADS)

    Beser, Nicholas D.; Duerr, Thomas E.; Staisiunas, Gregory P.

    2003-11-01

    In response to a requirement from the United States Postal Inspection Service, the Technical Support Working Group tasked The Johns Hopkins University Applied Physics Laboratory (JHU/APL) to develop a technique that will ensure the authenticity, or integrity, of digital video (DV). Verifiable integrity is needed if DV evidence is to withstand a challenge to its admissibility in court on the grounds that it can be easily edited. Specifically, the verification technique must detect additions, deletions, or modifications to DV and satisfy the two-part criteria pertaining to scientific evidence as articulated in Daubert et al. v. Merrell Dow Pharmaceuticals Inc., 43 F3d (9th Circuit, 1995). JHU/APL has developed a prototype digital video authenticator (DVA) that generates digital signatures based on public key cryptography at the frame level of the DV. Signature generation and recording is accomplished at the same time as DV is recorded by the camcorder. Throughput supports the consumer-grade camcorder data rate of 25 Mbps. The DVA software is implemented on a commercial laptop computer, which is connected to a commercial digital camcorder via the IEEE-1394 serial interface. A security token provides agent identification and the interface to the public key infrastructure (PKI) that is needed for management of the public keys central to DV integrity verification.
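
    In the spirit of the DVA, a minimal sketch of per-frame signing and verification with public key cryptography using the Python cryptography package; the key size, padding choice, and function names are illustrative assumptions, not JHU/APL's implementation:

        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

        def sign_frame(frame: bytes) -> bytes:
            """Sign one DV frame as it is recorded."""
            return key.sign(frame, padding.PKCS1v15(), hashes.SHA256())

        def verify_frame(frame: bytes, sig: bytes) -> bool:
            """Any addition, deletion, or modification makes this fail."""
            try:
                key.public_key().verify(sig, frame,
                                        padding.PKCS1v15(), hashes.SHA256())
                return True
            except Exception:
                return False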

  17. How digital design shapes political participation: A natural experiment with social information

    PubMed Central

    John, Peter; Margetts, Helen; Yasseri, Taha

    2018-01-01

    Political behaviour increasingly takes place on digital platforms, where people are presented with a range of social information—real-time feedback about the behaviour of peers and reference groups—which can stimulate (or depress) participation. This social information is hypothesized to impact the distribution of political activity, stimulating participation in mobilizations that are increasing in popularity, and depressing participation in those that appear to be less popular, leading to a non-normal distribution. Changes to these platforms can generate natural experiments allowing for an estimate of the impact of different kinds of social information on participation. This paper tests the hypothesis that social information shapes the distribution of political mobilizations by examining the introduction of trending information to the homepage of the UK government petition platform. The introduction of the trending feature did not increase the overall number of signatures per day, but the distribution of signatures across petitions changed significantly—the most popular petitions gained more signatures at the expense of those with fewer signatories. We further find significant differences between petitions trending at different ranks on the homepage. This evidence suggests that the ubiquity of trending information on digital platforms is introducing instability into political markets, as has been shown for cultural markets. As well as highlighting the importance of digital design in shaping political behaviour, the findings suggest that a non-negligible group of individuals visit the homepage of the site looking for petitions to sign, without having decided the issues they wish to support in advance. These ‘aimless petitioners’ are particularly susceptible to changes in social information. PMID:29702664

  18. How digital design shapes political participation: A natural experiment with social information.

    PubMed

    Hale, Scott A; John, Peter; Margetts, Helen; Yasseri, Taha

    2018-01-01

    Political behaviour increasingly takes place on digital platforms, where people are presented with a range of social information-real-time feedback about the behaviour of peers and reference groups-which can stimulate (or depress) participation. This social information is hypothesized to impact the distribution of political activity, stimulating participation in mobilizations that are increasing in popularity, and depressing participation in those that appear to be less popular, leading to a non-normal distribution. Changes to these platforms can generate natural experiments allowing for an estimate of the impact of different kinds of social information on participation. This paper tests the hypothesis that social information shapes the distribution of political mobilizations by examining the introduction of trending information to the homepage of the UK government petition platform. The introduction of the trending feature did not increase the overall number of signatures per day, but the distribution of signatures across petitions changed significantly-the most popular petitions gained more signatures at the expense of those with fewer signatories. We further find significant differences between petitions trending at different ranks on the homepage. This evidence suggests that the ubiquity of trending information on digital platforms is introducing instability into political markets, as has been shown for cultural markets. As well as highlighting the importance of digital design in shaping political behaviour, the findings suggest that a non-negligible group of individuals visit the homepage of the site looking for petitions to sign, without having decided the issues they wish to support in advance. These 'aimless petitioners' are particularly susceptible to changes in social information.

  19. Security authentication using phase-encoded nanoparticle structures and polarized light.

    PubMed

    Carnicer, Artur; Hassanfiroozi, Amir; Latorre-Carmona, Pedro; Huang, Yi-Pai; Javidi, Bahram

    2015-01-15

    Phase-encoded nanostructures such as quick response (QR) codes made of metallic nanoparticles are suggested for use in security and authentication applications. We present a polarimetric optical method able to authenticate random phase-encoded QR codes. The system is illuminated using polarized light, and the QR code is encoded using a phase-only random mask. Using classification algorithms, it is possible to validate the QR code from examination of the polarimetric signature of the speckle pattern. We used the Kolmogorov-Smirnov statistical test and Support Vector Machine algorithms to authenticate the phase-encoded QR codes using polarimetric signatures.

  20. Automated road network extraction from high spatial resolution multi-spectral imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Qiaoping

    For the last three decades, the Geomatics Engineering and Computer Science communities have considered automated road network extraction from remotely-sensed imagery to be a challenging and important research topic. The main objective of this research is to investigate the theory and methodology of automated feature extraction for image-based road database creation, refinement or updating, and to develop a series of algorithms for road network extraction from high resolution multi-spectral imagery. The proposed framework for road network extraction from multi-spectral imagery begins with an image segmentation using the k-means algorithm. This step mainly concerns the exploitation of the spectral information for feature extraction. The road cluster is automatically identified using a fuzzy classifier based on a set of predefined road surface membership functions. These membership functions are established based on the general spectral signature of road pavement materials and the corresponding normalized digital numbers on each multi-spectral band. Shape descriptors of the Angular Texture Signature are defined and used to reduce the misclassifications between roads and other spectrally similar objects (e.g., crop fields, parking lots, and buildings). An iterative and localized Radon transform is developed for the extraction of road centerlines from the classified images. The purpose of the transform is to accurately and completely detect the road centerlines. It is able to find short, long, and even curvilinear lines. The input image is partitioned into a set of subset images called road component images. An iterative Radon transform is locally applied to each road component image. At each iteration, road centerline segments are detected based on an accurate estimation of the line parameters and line widths. Three localization approaches are implemented and compared using qualitative and quantitative methods. Finally, the road centerline segments are grouped into a road network. The extracted road network is evaluated against a reference dataset using a line segment matching algorithm. The entire process is unsupervised and fully automated. Based on extensive experimentation on a variety of remotely-sensed multi-spectral images, the proposed methodology achieves a moderate success in automating road network extraction from high spatial resolution multi-spectral imagery.
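
    A self-contained sketch of the k-means pixel clustering that begins this pipeline, in plain numpy; the number of clusters, iteration count, and names are chosen for illustration:

        import numpy as np

        def kmeans_pixels(img, k=6, iters=20, seed=0):
            """Cluster multispectral pixels; the 'road' cluster would then be
            picked by a spectral membership rule as described above.

            img : (rows, cols, bands) array
            """
            rng = np.random.default_rng(seed)
            X = img.reshape(-1, img.shape[-1]).astype(float)
            centers = X[rng.choice(len(X), k, replace=False)]
            for _ in range(iters):
                labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
                for j in range(k):                    # recompute cluster means
                    if np.any(labels == j):
                        centers[j] = X[labels == j].mean(axis=0)
            return labels.reshape(img.shape[:2]), centers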

  1. The influence of digital filter type, amplitude normalisation method, and co-contraction algorithm on clinically relevant surface electromyography data during clinical movement assessments.

    PubMed

    Devaprakash, Daniel; Weir, Gillian J; Dunne, James J; Alderson, Jacqueline A; Donnelly, Cyril J

    2016-12-01

    There is a large and growing body of surface electromyography (sEMG) research using laboratory-specific signal processing procedures (i.e., digital filter type and amplitude normalisation protocols) and data analyses methods (i.e., co-contraction algorithms) to acquire practically meaningful information from these data. As a result, the ability to compare sEMG results between studies is, and continues to be challenging. The aim of this study was to determine if digital filter type, amplitude normalisation method, and co-contraction algorithm could influence the practical or clinical interpretation of processed sEMG data. Sixteen elite female athletes were recruited. During data collection, sEMG data was recorded from nine lower limb muscles while completing a series of calibration and clinical movement assessment trials (running and sidestepping). Three analyses were conducted: (1) signal processing with two different digital filter types (Butterworth or critically damped), (2) three amplitude normalisation methods, and (3) three co-contraction ratio algorithms. Results showed the choice of digital filter did not influence the clinical interpretation of sEMG; however, choice of amplitude normalisation method and co-contraction algorithm did influence the clinical interpretation of the running and sidestepping task. Care is recommended when choosing amplitude normalisation method and co-contraction algorithms if researchers/clinicians are interested in comparing sEMG data between studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
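
    For concreteness, a zero-lag Butterworth band-pass plus linear-envelope pipeline of the kind compared in such studies can be sketched with scipy as follows; the filter order, band edges, and envelope cutoff are illustrative choices, not the paper's exact settings:

        import numpy as np
        from scipy.signal import butter, filtfilt

        def linear_envelope(emg, fs, band=(20.0, 450.0), lp=6.0):
            """Common sEMG pipeline: zero-lag Butterworth band-pass,
            full-wave rectification, then a low-pass 'linear envelope'."""
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], "bandpass")
            rect = np.abs(filtfilt(b, a, emg))     # zero-lag filter + rectify
            b, a = butter(4, lp / (fs / 2), "lowpass")
            return filtfilt(b, a, rect)            # smooth activation envelope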

  2. Construct validation of an interactive digital algorithm for ostomy care.

    PubMed

    Beitz, Janice M; Gerlach, Mary A; Schafer, Vickie

    2014-01-01

    The purpose of this study was to evaluate construct validity for a previously face and content validated Ostomy Algorithm using digital real-life clinical scenarios. A cross-sectional, mixed-methods Web-based survey design study was conducted. Two hundred ninety-seven English-speaking RNs completed the study; participants practiced in both acute care and postacute settings, with 1 expert ostomy nurse (WOC nurse) and 2 nonexpert nurses. Following written consent, respondents answered demographic questions and completed a brief algorithm tutorial. Participants were then presented with 7 ostomy-related digital scenarios consisting of real-life photos and pertinent clinical information. Respondents used the 11 assessment components of the digital algorithm to choose management options. Participant written comments about the scenarios and the research process were collected. The mean overall percentage of correct responses was 84.23%. Mean percentage of correct responses for respondents with a self-reported basic ostomy knowledge was 87.7%; for those with a self-reported intermediate ostomy knowledge was 85.88% and those who were self-reported experts in ostomy care achieved 82.77% correct response rate. Five respondents reported having no prior ostomy care knowledge at screening and achieved an overall 45.71% correct response rate. No negative comments regarding the algorithm were recorded by participants. The new standardized Ostomy Algorithm remains the only face, content, and construct validated digital clinical decision instrument currently available. Further research on application at the bedside while tracking patient outcomes is warranted.

  3. Security Issues on the Internet.

    ERIC Educational Resources Information Center

    Bar-Ilan, Judit

    1996-01-01

    Discusses some basic notions of modern cryptography: public key systems and digital signatures. Describes how theoretical modern cryptography can help solve security problems on the Internet. (Author/JKP)

  4. A Double Perturbation Method for Reducing Dynamical Degradation of the Digital Baker Map

    NASA Astrophysics Data System (ADS)

    Liu, Lingfeng; Lin, Jun; Miao, Suoxia; Liu, Bocheng

    2017-06-01

    The digital Baker map is widely used in different kinds of cryptosystems, especially for image encryption. However, any chaotic map which is realized on the finite precision device (e.g. computer) will suffer from dynamical degradation, which refers to short cycle lengths, low complexity and strong correlations. In this paper, a novel double perturbation method is proposed for reducing the dynamical degradation of the digital Baker map. Both state variables and system parameters are perturbed by the digital logistic map. Numerical experiments show that the perturbed Baker map can achieve good statistical and cryptographic properties. Furthermore, a new image encryption algorithm is provided as a simple application. With a rather simple algorithm, the encrypted image can achieve high security, which is competitive to the recently proposed image encryption algorithms.
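
    A sketch of the idea, iterating the Baker map in floating point while nudging the state with a logistic-map sequence, is given below; note it perturbs only the state (the paper perturbs system parameters as well), and eps, r, and the function names are assumptions:

        import numpy as np

        def baker(x, y):
            """Baker map on the unit square."""
            return (2 * x, y / 2) if x < 0.5 else (2 * x - 1, (y + 1) / 2)

        def perturbed_baker_orbit(x, y, z, n, r=3.99, eps=1e-10):
            """Baker map iterated in finite precision, with the state nudged
            each step by a logistic-map sequence z to break short cycles."""
            out = []
            for _ in range(n):
                x, y = baker(x, y)
                z = r * z * (1 - z)                # logistic perturbation source
                x = (x + eps * z) % 1.0            # tiny state perturbation
                y = (y + eps * (1 - z)) % 1.0
                out.append((x, y))
            return out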

  5. Eliminating "Hotspots" in Digital Image Processing

    NASA Technical Reports Server (NTRS)

    Salomon, P. M.

    1984-01-01

    Signals from defective picture elements rejected. Image processing program for use with charge-coupled device (CCD) or other mosaic imager augmented with algorithm that compensates for common type of electronic defect. Algorithm prevents false interpretation of "hotspots". Used for robotics, image enhancement, image analysis and digital television.
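
    One common realization of this kind of hot-spot rejection is to replace pixels that deviate strongly from their local median; a sketch under an assumed threshold and names, not necessarily the flight algorithm:

        import numpy as np
        from scipy.ndimage import median_filter

        def suppress_hotspots(img, thresh=50):
            """Replace pixels that deviate strongly from their 3x3 median,
            rejecting defective ('hot') picture elements."""
            med = median_filter(img, size=3)
            hot = np.abs(img.astype(int) - med.astype(int)) > thresh
            out = img.copy()
            out[hot] = med[hot]                    # substitute the local median
            return out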

  6. Novel laser induced photoacoustic spectroscopy for instantaneous trace detection of explosive materials.

    PubMed

    El-Sharkawy, Yasser H; Elbasuney, Sherif

    2017-08-01

    Laser photoacoustic spectroscopy (LPAS) is an attractive technology in terms of simplicity, ruggedness, and overall sensitivity; it detects the time-dependent heat generated (thermo-elastic effect) in the target via interaction with pulsed optical radiation. This study reports on a novel LPAS technique that offers instant and standoff detection capabilities for trace explosives. In the current study, light is generated using a pulsed Q-switched Nd:YAG laser; the photoacoustic response generated in the stimulated explosive material offers signature values that depend on its optical, thermal, and acoustical properties. The generated acoustic waves were captured using a piezoelectric transducer as well as a novel customized optical sensor with a remote laser interferometer probe. A digital signal processing algorithm was employed to identify explosive material signatures via calculation of the characteristic optical properties (absorption coefficient), sound velocity, and frequency response of the generated photoacoustic signal. The customized LPAS technique was employed for instantaneous trace detection of three different high explosive materials: TNT, RDX, and HMX. The main outcome of this study is that the novel customized optical sensor signals were validated against a traditional piezoelectric transducer. Furthermore, the customized optical sensor offered standoff detection capability (10 cm), fast response, high sensitivity, and enhanced signal-to-noise ratio. This manuscript sheds light on the instant detection of trace explosive materials from significant standoffs using the novel customized LPAS technique. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Noise-shaping gradient descent-based online adaptation algorithms for digital calibration of analog circuits.

    PubMed

    Chakrabartty, Shantanu; Shaga, Ravi K; Aono, Kenji

    2013-04-01

    Analog circuits that are calibrated using digital-to-analog converters (DACs) use a digital signal processor-based algorithm for real-time adaptation and programming of system parameters. In this paper, we first show that this conventional framework for adaptation yields suboptimal calibration properties because of artifacts introduced by quantization noise. We then propose a novel online stochastic optimization algorithm called noise-shaping or ΣΔ gradient descent, which can shape the quantization noise out of the frequency regions spanning the parameter adaptation trajectories. As a result, the proposed algorithms demonstrate superior parameter search properties compared to floating-point gradient methods and better convergence properties than conventional quantized gradient-methods. In the second part of this paper, we apply the ΣΔ gradient descent algorithm to two examples of real-time digital calibration: 1) balancing and tracking of bias currents, and 2) frequency calibration of a band-pass Gm-C biquad filter biased in weak inversion. For each of these examples, the circuits have been prototyped in a 0.5-μm complementary metal-oxide-semiconductor process, and we demonstrate that the proposed algorithm is able to find the optimal solution even in the presence of spurious local minima, which are introduced by the nonlinear and non-monotonic response of calibration DACs.
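
    The core of first-order noise-shaped (ΣΔ) gradient descent is to carry the quantization error of each update into the next step rather than discard it; a minimal sketch under assumed names, step size, and DAC resolution:

        import numpy as np

        def sigma_delta_gd(grad, theta0, steps, lr=0.01, delta=0.05):
            """Gradient descent whose updates are quantized to DAC resolution
            `delta`, with first-order error feedback (noise shaping)."""
            theta = np.asarray(theta0, dtype=float)
            carry = np.zeros_like(theta)          # accumulated quantization error
            for _ in range(steps):
                u = -lr * grad(theta) + carry     # desired update + carried error
                q = delta * np.round(u / delta)   # what the DAC can realize
                carry = u - q                     # error fed to the next step
                theta = theta + q
            return theta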

  8. The computation of pi to 29,360,000 decimal digits using Borweins' quartically convergent algorithm

    NASA Technical Reports Server (NTRS)

    Bailey, David H.

    1988-01-01

    The quartically convergent numerical algorithm developed by Borwein and Borwein (1987) for 1/pi is implemented via a prime-modulus-transform multiprecision technique on the NASA Ames Cray-2 supercomputer to compute the first 2.936 x 10 to the 7th digits of the decimal expansion of pi. The history of pi computations is briefly recalled; the most recent algorithms are characterized; the implementation procedures are described; and samples of the output listing are presented. Statistical analyses show that the present decimal expansion is completely random, with only acceptable numbers of long repeating strings and single-digit runs.
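
    Borweins' quartic iteration itself is compact: with a_0 = 6 - 4*sqrt(2) and y_0 = sqrt(2) - 1, a_k converges to 1/pi with the number of correct digits roughly quadrupling each step. A sketch using the mpmath arbitrary-precision library (the iteration is standard; the precision handling here is simplified):

        from mpmath import mp, mpf, sqrt

        def borwein_pi(digits=1000, iterations=10):
            """Borweins' quartically convergent iteration: a_k -> 1/pi."""
            mp.dps = digits + 10                  # working precision
            y = sqrt(mpf(2)) - 1
            a = 6 - 4 * sqrt(mpf(2))
            for k in range(iterations):
                t = (1 - y ** 4) ** mpf("0.25")
                y = (1 - t) / (1 + t)
                a = a * (1 + y) ** 4 - mpf(2) ** (2 * k + 3) * y * (1 + y + y ** 2)
            return 1 / a

        print(str(borwein_pi(50))[:52])           # 3.14159...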

  9. A method for digital image registration using a mathematical programming technique

    NASA Technical Reports Server (NTRS)

    Yao, S. S.

    1973-01-01

    A new algorithm based on a nonlinear programming technique to correct the geometrical distortions of one digital image with respect to another is discussed. This algorithm promises to be superior to existing ones in that it is capable of treating localized differential scaling, translational and rotational errors over the whole image plane. A series of piece-wise 'rubber-sheet' approximations are used, constrained in such a manner that a smooth approximation over the entire image can be obtained. The theoretical derivation is included. The result of using the algorithm to register four channel S065 Apollo IX digitized photography over Imperial Valley, California, is discussed in detail.

  10. Generalized look-ahead number conversion from signed digit to complement representation with optical logic operations

    NASA Astrophysics Data System (ADS)

    Qian, Feng; Li, Guoqiang

    2001-12-01

    In this paper a generalized look-ahead logic algorithm for number conversion from signed-digit to its complement representation is developed. By properly encoding the signed digits, all the operations are performed by binary logic, and unified logical expressions can be obtained for conversion from modified-signed-digit (MSD) to 2's complement, trinary signed-digit (TSD) to 3's complement, and quaternary signed-digit (QSD) to 4's complement. For optical implementation, a parallel logical array module using electron-trapping device is employed, which is suitable for realizing complex logic functions in the form of sum-of-product. The proposed algorithm and architecture are compatible with a general-purpose optoelectronic computing system.
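
    As a reference point for the parallel look-ahead scheme, the naive digit-serial conversion from signed digits to a two's-complement bit string looks like this; the paper's contribution is precisely avoiding this sequential carry chain with look-ahead binary logic:

        def msd_to_twos_complement(digits, width=16):
            """Convert a modified signed-digit number (digits in {-1, 0, 1},
            most significant first) to a two's-complement bit string."""
            value = 0
            for d in digits:
                value = 2 * value + d             # digit-serial weighted sum
            return format(value & ((1 << width) - 1), f"0{width}b")

        print(msd_to_twos_complement([1, 0, -1, 1]))   # 8 - 2 + 1 = 7 -> ...0111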

  11. Authentication, integrity, and confidentiality in DICOM-structured reporting: concept and implementation

    NASA Astrophysics Data System (ADS)

    Riesmeier, Joerg; Eichelberg, Marco; Kleber, Klaus; Groenemeyer, Dietrich H.; Oosterwijk, Herman J.; Jensch, Peter F.

    2002-05-01

    With the release of 'DICOM Structured Reporting' (SR) as an official extension of the standard about two years ago, DICOM has entered a new domain that is only indirectly related to medical imaging. Basically, DICOM SR is a general model that allows medical reports to be encoded in a structured manner in DICOM's tag-based format. Therefore, the existing DICOM infrastructure can be used to archive and communicate structured reports, with only relatively small changes to existing systems. As a consequence of the introduction of medical reports in digital form, the relevance of security measures increases significantly. We have developed a prototype implementation of DICOM structured reporting together with the new security extensions for secure transport connections and digital signatures. The application allows the user to create, read, and modify any SR document, to digitally sign an SR document in whole or in part, and to transmit such documents over a network. While the secure transport connection protects data from modifications or unauthorized access only during transmission, digital signatures provide a lifetime integrity check and, therefore, maintain the legal document status of structured reports. The application has been successfully demonstrated at RSNA 2000 and ECR 2001, and is freely available on the Internet.

  12. Multipurpose image watermarking algorithm based on multistage vector quantization.

    PubMed

    Lu, Zhe-Ming; Xu, Dian-Guo; Sun, Sheng-He

    2005-06-01

    The rapid growth of digital multimedia and Internet technologies has made copyright protection, copy protection, and integrity verification three important issues in the digital world. To solve these problems, the digital watermarking technique has been presented and widely researched. Traditional watermarking algorithms are mostly based on discrete transform domains, such as the discrete cosine transform, discrete Fourier transform (DFT), and discrete wavelet transform (DWT). Most of these algorithms are good for only one purpose. Recently, some multipurpose digital watermarking methods have been presented, which can achieve the goal of content authentication and copyright protection simultaneously. However, they are based on DWT or DFT. Lately, several robust watermarking schemes based on vector quantization (VQ) have been presented, but they can only be used for copyright protection. In this paper, we present a novel multipurpose digital image watermarking method based on the multistage vector quantizer structure, which can be applied to image authentication and copyright protection. In the proposed method, the semi-fragile watermark and the robust watermark are embedded in different VQ stages using different techniques, and both of them can be extracted without the original image. Simulation results demonstrate the effectiveness of our algorithm in terms of robustness and fragility.

  13. A framework for comparing different image segmentation methods and its use in studying equivalences between level set and fuzzy connectedness frameworks

    PubMed Central

    Ciesielski, Krzysztof Chris; Udupa, Jayaram K.

    2011-01-01

    In the current vast image segmentation literature, there seems to be considerable redundancy among algorithms, while there is a serious lack of methods that would allow their theoretical comparison to establish their similarity, equivalence, or distinctness. In this paper, we make an attempt to fill this gap. To accomplish this goal, we argue that: (1) every digital segmentation algorithm A should have a well defined continuous counterpart MA, referred to as its model, which constitutes an asymptotic of A when image resolution goes to infinity; (2) the equality of two such models MA and MA′ establishes a theoretical (asymptotic) equivalence of their digital counterparts A and A′. Such a comparison is of full theoretical value only when, for each involved algorithm A, its model MA is proved to be an asymptotic of A. So far, such proofs do not appear anywhere in the literature, even in the case of algorithms introduced as digitizations of continuous models, like level set segmentation algorithms. The main goal of this article is to explore a line of investigation for formally pairing the digital segmentation algorithms with their asymptotic models, justifying such relations with mathematical proofs, and using the results to compare the segmentation algorithms in this general theoretical framework. As a first step towards this general goal, we prove here that the gradient based thresholding model M∇ is the asymptotic for the fuzzy connectedness Udupa and Samarasekera segmentation algorithm used with gradient based affinity A∇. We also argue that, in a sense, M∇ is the asymptotic for the original front propagation level set algorithm of Malladi, Sethian, and Vemuri, thus establishing a theoretical equivalence between these two specific algorithms. Experimental evidence of this last equivalence is also provided. PMID:21442014

  14. Threshold automatic selection hybrid phase unwrapping algorithm for digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Zhou, Meiling; Min, Junwei; Yao, Baoli; Yu, Xianghua; Lei, Ming; Yan, Shaohui; Yang, Yanlong; Dan, Dan

    2015-01-01

    The conventional quality-guided (QG) phase unwrapping algorithm is difficult to apply to digital holographic microscopy because of its long execution time. In this paper, we present a threshold-automatic-selection hybrid phase unwrapping algorithm that combines the existing QG algorithm and the flood-fill (FF) algorithm to solve this problem. The original wrapped phase map is divided into high- and low-quality sub-maps by automatically selecting a threshold, and then the FF and QG unwrapping algorithms are used to unwrap the phase in the respective sub-maps. The feasibility of the proposed method is proved by experimental results, and the execution speed is shown to be much faster than that of the original QG unwrapping algorithm.

  15. Automated thematic mapping and change detection of ERTS-A images. [farmlands, cities, and mountain identification in Utah, Washington, Arizona, and California

    NASA Technical Reports Server (NTRS)

    Gramenopoulos, N. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. A diffraction pattern analysis of MSS images led to the development of spatial signatures for farm land, urban areas and mountains. Four spatial features are employed to describe the spatial characteristics of image cells in the digital data. Three spectral features are combined with the spatial features to form a seven dimensional vector describing each cell. Then, the classification of the feature vectors is accomplished by using the maximum likelihood criterion. It was determined that the recognition accuracy with the maximum likelihood criterion depends on the statistics of the feature vectors. It was also determined that for a given geographic area the statistics of the classes remain invariable for a period of a month, but vary substantially between seasons. Three ERTS-1 images from the Phoenix, Arizona area were processed, and recognition rates between 85% and 100% were obtained for the terrain classes of desert, farms, mountains, and urban areas. To eliminate the need for training data, a new clustering algorithm has been developed. Seven ERTS-1 images from four test sites have been processed through the clustering algorithm, and high recognition rates have been achieved for all terrain classes.
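
    The maximum likelihood criterion referred to here assigns each seven-dimensional cell vector to the class with the largest Gaussian log-likelihood; a minimal sketch under assumed names, with per-class means and covariances estimated from training data or clustering:

        import numpy as np

        def ml_classify(x, means, covs):
            """Gaussian maximum likelihood classification of one feature vector."""
            scores = []
            for mu, K in zip(means, covs):
                d = x - mu
                _, logdet = np.linalg.slogdet(K)
                # log-likelihood up to a constant: -0.5*(d' K^-1 d + log|K|)
                scores.append(-0.5 * (d @ np.linalg.solve(K, d) + logdet))
            return int(np.argmax(scores))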

  16. A look-up-table digital predistortion technique for high-voltage power amplifiers in ultrasonic applications.

    PubMed

    Gao, Zheng; Gui, Ping

    2012-07-01

    In this paper, we present a digital predistortion technique to improve the linearity and power efficiency of a high-voltage class-AB power amplifier (PA) for ultrasound transmitters. The system is composed of a digital-to-analog converter (DAC), an analog-to-digital converter (ADC), and a field-programmable gate array (FPGA) in which the digital predistortion (DPD) algorithm is implemented. The DPD algorithm updates the error, which is the difference between the ideal signal and the attenuated distorted output signal, in the look-up table (LUT) memory during each cycle of a sinusoidal signal using the least-mean-square (LMS) algorithm. On the next signal cycle, the error data are used to equalize the signal with negative harmonic components to cancel the amplifier's nonlinear response. The algorithm also includes a linear interpolation method applied to the windowed sinusoidal signals for the B-mode and Doppler modes. The measurement test bench uses an arbitrary function generator as the DAC to generate the input signal, an oscilloscope as the ADC to capture the output waveform, and software to implement the DPD algorithm. The measurement results show that the proposed system is able to reduce the second-order harmonic distortion (HD2) by 20 dB and the third-order harmonic distortion (HD3) by 14.5 dB, while at the same time improving the power efficiency by 18%.
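
    The heart of such a scheme, an amplitude-indexed look-up table whose entries are corrected by an LMS rule each signal cycle, can be sketched as follows; the bin count, step size, and unit-amplitude normalization are illustrative assumptions:

        import numpy as np

        class LutPredistorter:
            """Look-up-table predistortion with per-bin LMS error updates,
            driven by the attenuated PA output (amplitudes assumed in [0, 1])."""

            def __init__(self, bins=256, mu=0.05):
                self.err = np.zeros(bins)          # per-bin correction terms
                self.mu = mu
                self.bins = bins

            def _idx(self, x):
                return np.clip((np.abs(x) * self.bins).astype(int), 0, self.bins - 1)

            def predistort(self, x):
                return x + self.err[self._idx(x)]  # apply stored correction

            def update(self, x_ideal, y_measured):
                e = x_ideal - y_measured           # deviation from ideal output
                self.err[self._idx(x_ideal)] += self.mu * e   # LMS table update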

  17. Use of the Hotelling observer to optimize image reconstruction in digital breast tomosynthesis

    PubMed Central

    Sánchez, Adrian A.; Sidky, Emil Y.; Pan, Xiaochuan

    2015-01-01

    We propose an implementation of the Hotelling observer that can be applied to the optimization of linear image reconstruction algorithms in digital breast tomosynthesis. The method is based on considering information within a specific region of interest, and it is applied to the optimization of algorithms for detectability of microcalcifications. Several linear algorithms are considered: simple back-projection, filtered back-projection, back-projection filtration, and Λ-tomography. The optimized algorithms are then evaluated through the reconstruction of phantom data. The method appears robust across algorithms and parameters and leads to the generation of algorithm implementations which subjectively appear optimized for the task of interest. PMID:26702408
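
    For a region of interest with empirical covariance K and mean signal difference Δs, the Hotelling template is w = K⁻¹ Δs with detectability SNR² = Δsᵀ K⁻¹ Δs; a minimal sketch, with regularization added for invertibility and names assumed:

        import numpy as np

        def hotelling_template(signal_imgs, background_imgs):
            """Region-of-interest Hotelling observer template and SNR.

            signal_imgs, background_imgs : (n_samples, n_pixels) flattened ROIs
            """
            A = np.vstack([signal_imgs, background_imgs])
            K = np.cov(A, rowvar=False)            # covariance of pooled ROIs
            ds = signal_imgs.mean(0) - background_imgs.mean(0)
            w = np.linalg.solve(K + 1e-6 * np.eye(K.shape[0]), ds)
            snr2 = ds @ w                          # SNR^2 = ds' K^-1 ds
            return w, np.sqrt(snr2)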

  18. Quantum digital-to-analog conversion algorithm using decoherence

    NASA Astrophysics Data System (ADS)

    SaiToh, Akira

    2015-08-01

    We consider the problem of mapping digital data encoded on a quantum register to analog amplitudes in parallel. It is shown to be unlikely that a fully unitary polynomial-time quantum algorithm exists for this problem; NP would become a subset of BQP if it existed. From a practical point of view, we propose a nonunitary linear-time algorithm using quantum decoherence. It tacitly uses an exponentially large physical resource, typically a huge number of identical molecules. The quantumness of the correlations appearing in the process of the algorithm is also discussed.

  19. Novel Virtual Screening Approach for the Discovery of Human Tyrosinase Inhibitors

    PubMed Central

    Ai, Ni; Welsh, William J.; Santhanam, Uma; Hu, Hong; Lyga, John

    2014-01-01

    Tyrosinase is the key enzyme involved in the human pigmentation process, as well as the undesired browning of fruits and vegetables. Compounds inhibiting tyrosinase catalytic activity are an important class of cosmetic and dermatological agents which show high potential as depigmentation agents used for skin lightening. The multi-step protocol employed for the identification of novel tyrosinase inhibitors incorporated the Shape Signatures computational algorithm for rapid screening of chemical libraries. This algorithm converts the size and shape of a molecule, as well its surface charge distribution and other bio-relevant properties, into compact histograms (signatures) that lend themselves to rapid comparison between molecules. Shape Signatures excels at scaffold hopping across different chemical families, which enables identification of new actives whose molecular structure is distinct from other known actives. Using this approach, we identified a novel class of depigmentation agents that demonstrated promise for skin lightening product development. PMID:25426625

  20. Novel virtual screening approach for the discovery of human tyrosinase inhibitors.

    PubMed

    Ai, Ni; Welsh, William J; Santhanam, Uma; Hu, Hong; Lyga, John

    2014-01-01

    Tyrosinase is the key enzyme involved in the human pigmentation process, as well as the undesired browning of fruits and vegetables. Compounds inhibiting tyrosinase catalytic activity are an important class of cosmetic and dermatological agents which show high potential as depigmentation agents used for skin lightening. The multi-step protocol employed for the identification of novel tyrosinase inhibitors incorporated the Shape Signatures computational algorithm for rapid screening of chemical libraries. This algorithm converts the size and shape of a molecule, as well its surface charge distribution and other bio-relevant properties, into compact histograms (signatures) that lend themselves to rapid comparison between molecules. Shape Signatures excels at scaffold hopping across different chemical families, which enables identification of new actives whose molecular structure is distinct from other known actives. Using this approach, we identified a novel class of depigmentation agents that demonstrated promise for skin lightening product development.

  1. Pose and motion recovery from feature correspondences and a digital terrain map.

    PubMed

    Lerner, Ronen; Rivlin, Ehud; Rotstein, Héctor P

    2006-09-01

    A novel algorithm for pose and motion estimation using corresponding features and a Digital Terrain Map is proposed. Using a Digital Terrain (or Digital Elevation) Map (DTM/DEM) as a global reference enables the elimination of the ambiguity present in vision-based algorithms for motion recovery. As a consequence, the absolute position and orientation of a camera can be recovered with respect to the external reference frame. In order to do this, the DTM is used to formulate a constraint between corresponding features in two consecutive frames. Explicit reconstruction of the 3D world is not required. When considering a number of feature points, the resulting constraints can be solved using nonlinear optimization in terms of position, orientation, and motion. Such a procedure requires an initial guess of these parameters, which can be obtained from dead reckoning or any other source. The feasibility of the algorithm is established through extensive experimentation. Performance is compared with a state-of-the-art alternative algorithm, which intermediately reconstructs the 3D structure and then registers it to the DTM. A clear advantage for the novel algorithm is demonstrated in a variety of scenarios.

  2. Micro-Doppler analysis of multiple frequency continuous wave radar signatures

    NASA Astrophysics Data System (ADS)

    Anderson, Michael G.; Rogers, Robert L.

    2007-04-01

    Micro-Doppler refers to Doppler scattering returns produced by non-rigid-body motion. Micro-Doppler gives rise to many detailed radar image features in addition to those associated with bulk target motion. Targets of different classes (for example, humans, animals, and vehicles) produce micro-Doppler images that are often distinguishable even by nonexpert observers. Micro-Doppler features have great potential for use in automatic target classification algorithms. Although the potential benefit of using micro-Doppler in classification algorithms is high, relatively little experimental (non-synthetic) micro-Doppler data exists. Much of the existing experimental data comes from highly cooperative targets (human or vehicle targets directly approaching the radar). This research involved field data collection and analysis of micro-Doppler radar signatures from non-cooperative targets. The data was collected using a low-cost X-band multiple frequency continuous wave (MFCW) radar with three transmit frequencies. The collected MFCW radar signatures contain data from humans, vehicles, and animals. The presented data includes micro-Doppler signatures previously unavailable in the literature such as crawling humans and various animal species. The animal micro-Doppler signatures include deer, dog, and goat datasets. This research focuses on the analysis of micro-Doppler from non-cooperative targets approaching the radar at various angles, maneuvers, and postures.
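
    Micro-Doppler analysis typically starts from a short-time Fourier transform of the baseband radar return. As a self-contained sketch under invented parameters, the following simulates a return with sinusoidal micro-motion riding on a bulk Doppler shift and computes its spectrogram with scipy:

        import numpy as np
        from scipy.signal import spectrogram

        fs = 2000.0                     # sample rate of the baseband return (Hz)
        t = np.arange(0, 2.0, 1 / fs)
        f_bulk = 200.0                  # Doppler from bulk target motion
        f_dev, f_rate = 60.0, 2.0       # micro-motion: +/-60 Hz deviation at 2 Hz
        phase = 2 * np.pi * f_bulk * t + (f_dev / f_rate) * np.sin(2 * np.pi * f_rate * t)
        x = np.exp(1j * phase) + 0.1 * (np.random.randn(t.size)
                                        + 1j * np.random.randn(t.size))

        # The short-time spectrum shows a sinusoidal micro-Doppler ribbon
        # oscillating around the bulk Doppler line at f_bulk.
        f, tt, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192,
                                 return_onesided=False)
        print(Sxx.shape)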

  3. Four-Digit Numbers Which Are Squared Sums

    ERIC Educational Resources Information Center

    Coughlin, Heather; Jue, Brian

    2009-01-01

    There is a very natural way to divide a four-digit number into 2 two-digit numbers. Applying an algorithm to this pair of numbers, determine how often the original four-digit number reappears. (Contains 3 tables.)
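
    One natural reading of the setup (an interpretation for illustration, not necessarily the article's exact algorithm) asks which four-digit numbers equal the square of the sum of their two two-digit halves; a brute-force search finds them:

        # Split n into two-digit halves a, b and keep n when (a + b)**2 == n.
        hits = [n for n in range(1000, 10000) if (n // 100 + n % 100) ** 2 == n]
        print(hits)   # [2025, 3025, 9801]; e.g. 20 + 25 = 45 and 45**2 == 2025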

  4. Upgraded Readout Electronics for the ATLAS Liquid Argon Calorimeters at the High Luminosity LHC

    NASA Astrophysics Data System (ADS)

    Andeen, Timothy R.; ATLAS Liquid Argon Calorimeter Group

    2012-12-01

    The ATLAS liquid-argon calorimeters produce a total of 182,486 signals which are digitized and processed by the front-end and back-end electronics at every triggered event. In addition, the front-end electronics sum analog signals to provide coarsely grained energy sums, called trigger towers, to the first-level trigger system, which is optimized for nominal LHC luminosities. However, the pile-up background expected during the high luminosity phases of the LHC will be increased by factors of 3 to 7. An improved spatial granularity of the trigger primitives is therefore proposed in order to improve the identification performance for trigger signatures, like electrons or photons, at high background rejection rates. For the first upgrade phase in 2018, new Liquid Argon Trigger Digitizer Boards are being designed to receive higher granularity signals, digitize them on detector and send them via fast optical links to a new, off-detector digital processing system. The digital processing system applies digital filtering and identifies significant energy depositions. The refined trigger primitives are then transmitted to the first level trigger system to extract improved trigger signatures. The general concept of the upgraded liquid-argon calorimeter readout together with the various electronics components to be developed for such a complex system is presented. The research activities and architectural studies undertaken by the ATLAS Liquid Argon Calorimeter Group are described, particularly details of the on-going design of mixed-signal front-end electronics, of radiation tolerant optical-links, and of the high-speed off-detector digital processing system.

  5. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A., Jr.

    1989-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.
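
    The essence of DPCM, predicting each pixel from the previous reconstructed one and quantizing only the prediction error, fits in a few lines. This is a generic previous-sample DPCM sketch with a uniform quantizer, not the CODEC's actual predictor or bit allocation:

        import numpy as np

        def dpcm_encode(row, q_step=8):
            # Predict each sample from the previous *reconstructed* sample and
            # quantize only the prediction error (previous-sample DPCM).
            pred, codes = 0, []
            for x in row:
                q = int(round((int(x) - pred) / q_step))
                codes.append(q)
                pred = int(np.clip(pred + q * q_step, 0, 255))
            return codes

        def dpcm_decode(codes, q_step=8):
            pred, out = 0, []
            for q in codes:
                pred = int(np.clip(pred + q * q_step, 0, 255))
                out.append(pred)
            return out

        row = np.linspace(0, 255, 16).astype(int)
        print(row.tolist())
        print(dpcm_decode(dpcm_encode(row)))   # matches row to roughly q_step/2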

  6. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A.

    1991-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM (differential pulse code modulation)-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the codec are described, and performance results are provided.

  7. Ballistic missile precession frequency extraction based on the Viterbi & Kalman algorithm

    NASA Astrophysics Data System (ADS)

    Wu, Longlong; Xie, Yongjie; Xu, Daping; Ren, Li

    2015-12-01

    Radar micro-Doppler signatures are of great potential for target detection, classification and recognition. In the mid-course phase, warheads flying outside the atmosphere are usually accompanied by precession. Precession may induce additional frequency modulations on the returned radar signal, which can be regarded as a unique signature and provide additional information that is complementary to existing target recognition methods. The main purpose of this paper is to establish a more realistic precession model of a conical ballistic missile warhead and to extract the precession parameters by utilizing a Viterbi & Kalman algorithm, which markedly improves the precession frequency estimation accuracy, especially at low SNR.

  8. Fast Physically Accurate Rendering of Multimodal Signatures of Distributed Fracture in Heterogeneous Materials.

    PubMed

    Visell, Yon

    2015-04-01

    This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
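
    The inverse transform method referenced here is standard: push uniform random draws through the inverse CDF of the target distribution. A minimal sketch, with exponential inter-event times standing in for stress-release jumps and invented parameters:

        import numpy as np

        def sample_inverse_transform(inv_cdf, n, rng):
            # Push uniform(0,1) draws through the inverse CDF to sample the
            # target distribution (the inverse transform method).
            return inv_cdf(rng.uniform(size=n))

        # Toy target: exponential inter-event times of a stress-release jump
        # process with rate lam (parameters invented).
        lam = 5.0
        inv_cdf = lambda u: -np.log1p(-u) / lam
        rng = np.random.default_rng(1)
        waits = sample_inverse_transform(inv_cdf, 10, rng)
        print(np.cumsum(waits))        # event times of the jump process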

  9. An evaluation of the signature extension approach to large area crop inventories utilizing space image data. [Kansas and North Dakota

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Cicone, R. C.; Stinson, J. L.; Balon, R. J.

    1977-01-01

    The author has identified the following significant results. Two examples of haze correction algorithms were tested: CROP-A and XSTAR. The CROP-A was tested in a unitemporal mode on data collected in 1973-74 over ten sample segments in Kansas. Because of the uniformly low level of haze present in these segments, no conclusion could be reached about CROP-A's ability to compensate for haze. It was noted, however, that in some cases CROP-A made serious errors which actually degraded classification performance. The haze correction algorithm XSTAR was tested in a multitemporal mode on 1975-76 LACIE sample segment data over 23 blind sites in Kansas and 18 sample segments in North Dakota, providing a wide range of haze levels and other conditions for algorithm evaluation. It was found that this algorithm substantially improved signature extension classification accuracy when a sum-of-likelihoods classifier was used with an alien rejection threshold.

  10. Fringe pattern demodulation with a two-frame digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-frame digital phase-locked loop for fringe pattern demodulation is presented. In this scheme, two fringe patterns with different spatial carrier frequencies are grabbed for an object. A digital phase-locked loop algorithm tracks and demodulates the phase difference between both fringe patterns by employing the wrapped phase components of one of the fringe patterns as a reference to demodulate the second fringe pattern. The desired phase information can be extracted from the demodulated phase difference. We tested the algorithm experimentally using real fringe patterns. The technique is shown to be suitable for noncontact measurement of objects with rapid surface variations, and it outperforms the Fourier fringe analysis technique in this aspect. Phase maps produced with this algorithm are noisy in comparison with phase maps generated with the Fourier fringe analysis technique.
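
    For intuition, a digital phase-locked loop is a feedback loop in which a phase detector mixes the input with a local reference and the filtered error nudges the reference phase. The toy first-order DPLL below locks onto a single sinusoid's phase offset; the paper's two-frame fringe demodulator is considerably more involved:

        import numpy as np

        # Toy first-order digital phase-locked loop: lock onto the unknown
        # phase offset of a sinusoid (far simpler than the two-frame fringe
        # demodulator, but the same feedback principle).
        fs, f0, theta = 1000.0, 50.0, 0.8
        n = np.arange(500)
        x = np.cos(2 * np.pi * f0 * n / fs + theta)

        phi, gain = 0.0, 0.05
        for k in range(len(x)):
            ref = 2 * np.pi * f0 * k / fs + phi
            err = x[k] * -np.sin(ref)     # phase detector (mixer)
            phi += gain * err             # integrate the error (loop filter)
        print(phi)                        # settles near theta = 0.8 rad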

  11. HYBRID FAST HANKEL TRANSFORM ALGORITHM FOR ELECTROMAGNETIC MODELING

    EPA Science Inventory

    A hybrid fast Hankel transform algorithm has been developed that uses several complementary features of two existing algorithms: Anderson's digital filtering or fast Hankel transform (FHT) algorithm and Chave's quadrature and continued fraction algorithm. A hybrid FHT subprogram ...

  12. Environmental Requirements for Authentication Protocols

    DTIC Science & Technology

    2002-01-01

    Engineering for Information Security, March 2001. 10. D. Chaum. Blind signatures for untraceable payments. In Advances in Cryptology - Proceedings of...the connection, the idea relies on a concept similar to blinding in the sense of Chaum [10], who used it effectively in the design of anonymous payment...digital signature on the key and a nonce provided by the server, in which the client's challenge response was independent of the type of cipher

  13. Methods for using a biometric parameter in the identification of persons

    DOEpatents

    Hively, Lee M [Philadelphia, TN]

    2011-11-22

    Brain waves are used as a biometric parameter to provide for authentication and identification of personnel. The brain waves are sampled using EEG equipment and are processed using phase-space distribution functions to compare digital signature data from enrollment of authorized individuals to data taken from a test subject to determine if the data from the test subject matches the signature data to a degree to support positive identification.

  14. Code conversion from signed-digit to complement representation based on look-ahead optical logic operations

    NASA Astrophysics Data System (ADS)

    Li, Guoqiang; Qian, Feng

    2001-11-01

    We present, for the first time to our knowledge, a generalized look-ahead logic algorithm for number conversion from signed-digit to complement representation. By properly encoding the signed digits, all the operations are performed by binary logic, and unified logical expressions can be obtained for conversion from modified signed-digit (MSD) to 2's complement, trinary signed-digit (TSD) to 3's complement, and quaternary signed-digit (QSD) to 4's complement. For optical implementation, a parallel logical array module using an electron-trapping device is employed and experimental results are shown. This optical module is suitable for implementing complex logic functions in sum-of-products form. The algorithm and architecture are compatible with a general-purpose optoelectronic computing system.
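
    The arithmetic relationship being implemented can be checked in software: a signed-digit string over {-1, 0, 1} evaluates to an ordinary integer whose two's-complement bit pattern is the conversion target. The sketch below performs the conversion arithmetically and deliberately ignores the paper's contribution, the carry-free digit-parallel look-ahead logic:

        def msd_to_twos_complement(digits, width=8):
            # digits: most-significant-first, each in {-1, 0, 1} (MSD form).
            value = 0
            for d in digits:
                value = 2 * value + d
            # Mask to 'width' bits to obtain the two's-complement pattern.
            return format(value & ((1 << width) - 1), f"0{width}b")

        print(msd_to_twos_complement([1, 0, -1]))    # 4 - 1 = 3   -> 00000011
        print(msd_to_twos_complement([-1, 0, 1]))    # -4 + 1 = -3 -> 11111101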

  15. Science of Land Target Spectral Signatures

    DTIC Science & Technology

    2013-04-03

    F. Meriaudeau, T. Downey, A. Wig, A. Passian, M. Buncick, T.L. Ferrell, Fiber optic sensor based on gold island plasmon resonance, Sensors and...processing, detection algorithms, sensor fusion, spectral signature modeling Dr. J. Michael Cathcart Georgia Tech Research Corporation Office of...target detection and sensor fusion. The phenomenology research continued to focus on spectroscopic soil measurements, optical property analyses, field

  16. Water quality parameter measurement using spectral signatures

    NASA Technical Reports Server (NTRS)

    White, P. E.

    1973-01-01

    Regression analysis is applied to the problem of measuring water quality parameters from remote sensing spectral signature data. The equations necessary to perform regression analysis are presented and methods of testing the strength and reliability of a regression are described. An efficient algorithm for selecting an optimal subset of the independent variables available for a regression is also presented.
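
    Optimal-subset selection for regression can be demonstrated by brute force on a small synthetic problem; the paper's algorithm is more efficient, but the objective is the same, the subset with the lowest residual sum of squares:

        import numpy as np
        from itertools import combinations

        def best_subset(X, y, k):
            # Exhaustively try every k-variable subset and keep the one with
            # the lowest residual sum of squares (brute force for illustration).
            best_rss, best_cols = np.inf, None
            for cols in combinations(range(X.shape[1]), k):
                A = np.column_stack([np.ones(len(y)), X[:, cols]])
                beta, res, *_ = np.linalg.lstsq(A, y, rcond=None)
                rss = res[0] if res.size else ((y - A @ beta) ** 2).sum()
                if rss < best_rss:
                    best_rss, best_cols = rss, cols
            return best_cols, best_rss

        rng = np.random.default_rng(1)
        X = rng.normal(size=(40, 6))            # e.g. six spectral band values
        y = 2 * X[:, 1] - 3 * X[:, 4] + 0.1 * rng.normal(size=40)  # water parameter
        print(best_subset(X, y, k=2))           # -> ((1, 4), small RSS)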

  17. Modelling the passive microwave signature from land surfaces: a review of recent results and application to the SMOS & SMAP soil moisture retrieval algorithms

    USDA-ARS?s Scientific Manuscript database

    Two passive microwave missions are currently operating at L-band to monitor surface soil moisture (SM) over continental surfaces. The SMOS sensor, based on an innovative interferometric technology enabling multi-angular signatures of surfaces to be measured, was launched in November 2009....

  18. Verifying Safety Messages Using Relative-Time and Zone Priority in Vehicular Ad Hoc Networks.

    PubMed

    Banani, Sam; Gordon, Steven; Thiemjarus, Surapa; Kittipiyakul, Somsak

    2018-04-13

    In high-density road networks, with each vehicle broadcasting multiple messages per second, the arrival rate of safety messages can easily exceed the rate at which digital signatures can be verified. Since not all messages can be verified, algorithms for selecting which messages to verify are required to ensure that each vehicle receives appropriate awareness about neighbouring vehicles. This paper presents a novel scheme to select important safety messages for verification in vehicular ad hoc networks (VANETs). The proposed scheme uses location and direction of the sender, as well as proximity and relative-time between vehicles, to reduce the number of irrelevant messages verified (i.e., messages from vehicles that are unlikely to cause an accident). Compared with other existing schemes, the analysis results show that the proposed scheme can verify messages from nearby vehicles with lower inter-message delay and reduced packet loss and thus provides a high level of awareness of nearby vehicles.
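
    A selection scheme of this kind reduces to scoring each queued message and spending the limited verification budget on the top scorers. The sketch below is a generic illustration: the scoring function, weights, and message fields are invented and are not the paper's.

        import heapq
        import math

        def priority(msg, me, now):
            # Hypothetical urgency score: favour close senders heading toward
            # us and fresh messages (weights invented for illustration).
            dist = math.hypot(msg["x"] - me["x"], msg["y"] - me["y"])
            return 100.0 * msg["toward_me"] - dist - 10.0 * (now - msg["t"])

        def select_for_verification(msgs, me, now, budget):
            # Spend the limited signature-verification budget on top scorers.
            return heapq.nlargest(budget, msgs, key=lambda m: priority(m, me, now))

        me, now = {"x": 0.0, "y": 0.0}, 10.0
        msgs = [{"x": 5, "y": 0, "t": 9.9, "toward_me": 1},
                {"x": 400, "y": 300, "t": 9.5, "toward_me": 0},
                {"x": 20, "y": 10, "t": 9.8, "toward_me": 1}]
        print(select_for_verification(msgs, me, now, budget=2))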

  19. Low-power cryptographic coprocessor for autonomous wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Olszyna, Jakub; Winiecki, Wiesław

    2013-10-01

    The concept of autonomous wireless sensor networks involves energy harvesting, as well as effective management of system resources. Public-key cryptography (PKC) offers the advantage of elegant key agreement schemes with which a secret key can be securely established over insecure channels. In addition to solving the key management problem, the other major application of PKC is digital signatures, with which non-repudiation of message exchanges can be achieved. The motivation for studying low-power and area-efficient modular arithmetic algorithms comes from enabling public-key security for low-power devices that must perform in constrained environments such as autonomous wireless sensor networks. This paper presents a cryptographic coprocessor tailored to the constraints of autonomous wireless sensor networks. Such a hardware circuit is aimed to support the implementation of different public-key cryptosystems based on modular arithmetic in GF(p) and GF(2m). Key components of the coprocessor are described as GEZEL models and can be easily transformed to VHDL and implemented in hardware.
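
    The workhorse such a coprocessor accelerates is modular exponentiation built from modular multiplications. A plain software reference follows; real hardware would add Montgomery arithmetic and side-channel countermeasures:

        def modexp(base, exp, mod):
            # Left-to-right square-and-multiply: the modular-arithmetic kernel
            # behind RSA/Diffie-Hellman style public-key operations in GF(p).
            result, base = 1, base % mod
            for bit in bin(exp)[2:]:
                result = (result * result) % mod      # square for every bit
                if bit == "1":
                    result = (result * base) % mod    # multiply on set bits
            return result

        print(modexp(5, 117, 19), pow(5, 117, 19))    # identical results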

  20. Perceptual approaches to finding features in data

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.

    2013-03-01

    Electronic imaging applications hinge on the ability to discover features in data. For example, doctors examine diagnostic images for tumors, broken bones and changes in metabolic activity. Financial analysts explore visualizations of market data to find correlations, outliers and interaction effects. Seismologists look for signatures in geological data to tell them where to drill or where an earthquake may begin. These data are very diverse, including images, numbers, graphs, 3-D graphics, and text, and are growing exponentially, largely through the rise in automatic data collection technologies such as sensors and digital imaging. This paper explores important trends in the art and science of finding features in data, such as the tension between bottom-up and top-down processing, the semantics of features, and the integration of human- and algorithm-based approaches. This story is told from the perspective of the IS&T/SPIE Conference on Human Vision and Electronic Imaging (HVEI), which has fostered research at the intersection between human perception and the evolution of new technologies.

  2. Self-tuning control of attitude and momentum management for the Space Station

    NASA Technical Reports Server (NTRS)

    Shieh, L. S.; Sunkel, J. W.; Yuan, Z. Z.; Zhao, X. M.

    1992-01-01

    This paper presents a hybrid state-space self-tuning design methodology using dual-rate sampling for suboptimal digital adaptive control of attitude and momentum management for the Space Station. This new hybrid adaptive control scheme combines an on-line recursive estimation algorithm for indirectly identifying the parameters of a continuous-time system from the available fast-rate sampled data of the inputs and states and a controller synthesis algorithm for indirectly finding the slow-rate suboptimal digital controller from the designed optimal analog controller. The proposed method enables the development of digitally implementable control algorithms for the robust control of Space Station Freedom with unknown environmental disturbances and slowly time-varying dynamics.

  3. A digitally implemented preambleless demodulator for maritime and mobile data communications

    NASA Astrophysics Data System (ADS)

    Chalmers, Harvey; Shenoy, Ajit; Verahrami, Farhad B.

    The hardware design and software algorithms for a low-bit-rate, low-cost, all-digital preambleless demodulator are described. The demodulator operates under severe high-noise conditions, fast Doppler frequency shifts, large frequency offsets, and multipath fading. Sophisticated algorithms, including a fast Fourier transform (FFT)-based burst acquisition algorithm, a cycle-slip resistant carrier phase tracker, an innovative Doppler tracker, and a fast acquisition symbol synchronizer, were developed and extensively simulated for reliable burst reception. The compact digital signal processor (DSP)-based demodulator hardware uses a unique personal computer test interface for downloading test data files. The demodulator test results demonstrate a near-ideal performance within 0.2 dB of theory.
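
    The FFT-based acquisition idea can be shown compactly: zero-pad the received burst, take the FFT, and read the carrier or Doppler offset off the peak bin. A synthetic sketch with arbitrary rates and no modulation, not the demodulator's actual algorithm:

        import numpy as np

        fs = 8000.0
        t = np.arange(0, 0.05, 1 / fs)              # 400-sample burst
        offset_true = 137.0                         # unknown carrier/Doppler offset
        x = np.exp(2j * np.pi * offset_true * t)    # toy unmodulated burst
        x += 0.2 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))

        N = 4096                                    # zero-padding refines the grid
        X = np.fft.fftshift(np.fft.fft(x, N))
        freqs = np.fft.fftshift(np.fft.fftfreq(N, 1 / fs))
        print(freqs[np.abs(X).argmax()])            # ~137 Hz (within ~2 Hz bin)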

  4. Phase Response Design of Recursive All-Pass Digital Filters Using a Modified PSO Algorithm

    PubMed Central

    2015-01-01

    This paper develops a new design scheme for the phase response of an all-pass recursive digital filter. A variant of the particle swarm optimization (PSO) algorithm is utilized for solving this kind of filter design problem. It is here called the modified PSO (MPSO) algorithm, in which an additional adjusting factor is introduced into the velocity-updating formula in order to improve the searching ability. In the proposed method, all of the designed filter coefficients are first collected into a parameter vector, and this vector is regarded as a particle of the algorithm. The MPSO with a modified velocity formula forces all particles to move toward the optimal or near-optimal solution by minimizing a defined objective function of the optimization problem. To show the effectiveness of the proposed method, two different kinds of linear phase response design examples are illustrated, and the general PSO algorithm is compared as well. The obtained results show that the MPSO is superior to the general PSO for the phase response design of digital recursive all-pass filters. PMID:26366168
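
    The structure of such a modification is easy to show in code. In the sketch below the extra velocity term pulls particles toward the swarm mean; that particular term is invented for illustration, since the abstract does not spell out the paper's added factor:

        import numpy as np

        def mpso_step(pos, vel, pbest, gbest, rng, w=0.7, c1=1.4, c2=1.4, c3=0.3):
            # Standard PSO velocity update plus one extra adjusting term (c3)
            # pulling particles toward the swarm mean (a stand-in only).
            r1, r2, r3 = rng.uniform(size=(3,) + pos.shape)
            vel = (w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
                   + c3 * r3 * (pos.mean(axis=0) - pos))
            return pos + vel, vel

        rng = np.random.default_rng(2)
        pos = rng.uniform(-5, 5, size=(20, 4))      # 20 particles, 4 coefficients
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        gbest = pos[np.argmin((pos ** 2).sum(axis=1))].copy()
        for _ in range(200):                        # minimize a sphere objective
            pos, vel = mpso_step(pos, vel, pbest, gbest, rng)
            better = (pos ** 2).sum(axis=1) < (pbest ** 2).sum(axis=1)
            pbest[better] = pos[better]
            gbest = pbest[np.argmin((pbest ** 2).sum(axis=1))]
        print((gbest ** 2).sum())                   # near zero after convergence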

  5. A digital prediction algorithm for a single-phase boost PFC

    NASA Astrophysics Data System (ADS)

    Qing, Wang; Ning, Chen; Weifeng, Sun; Shengli, Lu; Longxing, Shi

    2012-12-01

    A novel digital control algorithm for power factor correction is presented, called the prediction algorithm, which achieves a higher power factor (PF) with lower total harmonic distortion and a faster dynamic response to changes in the input voltage or load current. For a given system, based on the current system state parameters, the prediction algorithm can estimate the trajectories of the output voltage and the inductor current over the next switching cycle and obtain a set of optimized control sequences that closely track the input voltage. The proposed prediction algorithm is verified under different conditions, and computer simulation and experimental results in multiple situations confirm its effectiveness. Under the circumstances that the input voltage is in the range of 90-265 V and the load current in the range of 20%-100%, the PF value is larger than 0.998. The startup and recovery times are about 0.1 s and 0.02 s, respectively, without overshoot. The experimental results also verify the validity of the proposed method.

  6. A classification-based assessment of the optimal spatial and spectral resolution of coastal wetland imagery

    NASA Astrophysics Data System (ADS)

    Becker, Brian L.

    Great Lakes wetlands are increasingly being recognized as vital ecosystem components that provide valuable functions such as sediment retention, wildlife habitat, and nutrient removal. Aerial photography has traditionally provided a cost-effective means to inventory and monitor coastal wetlands, but is limited by its broad spectral sensitivity and non-digital format. Airborne sensor advancements have now made the acquisition of digital imagery with high spatial and spectral resolution a reality. In this investigation, we selected two Lake Huron coastal wetlands, each from a distinct eco-region, over which digital airborne imagery (AISA or CASI-II) was acquired. The 1-meter images contain approximately twenty 10-nanometer-wide spectral bands strategically located throughout the visible and near-infrared. The 4-meter hyperspectral imagery contains 48 contiguous bands across the visible and short-wavelength near-infrared. Extensive in-situ reflectance spectra (SE-590) and sub-meter GPS locations were acquired for the dominant botanical and substrate classes field-delineated at each location. Normalized in-situ spectral signatures were subjected to Principal Components and 2nd Derivative analyses in order to identify the most botanically explanative image bands. Three image-based investigations were implemented in order to evaluate the ability of three classification algorithms (ISODATA, Spectral Angle Mapper and Maximum-Likelihood) to differentiate botanical regions-of-interest. Two additional investigations were completed in order to assess classification changes associated with the independent manipulation of both spatial and spectral resolution. Of the three algorithms tested, the Maximum-Likelihood classifier best differentiated (89%) the regions-of-interest in both study sites. Covariance-based PCA rotation consistently enhanced the performance of the Maximum-Likelihood classifier. Seven non-overlapping bands (425.4, 514.9, 560.1, 685.5, 731.5, 812.3 and 916.7 nanometers) were identified as the best performing bands with respect to classification performance. A spatial resolution of 2 meters or less was determined to be most appropriate for Great Lakes coastal wetland environments. This research represents the first step in evaluating the effectiveness of applying high-resolution, narrow-band imagery to the detailed mapping of coastal wetlands in the Great Lakes region.

  7. Multi-frequency and polarimetric radar backscatter signatures for discrimination between agricultural crops at the Flevoland experimental test site

    NASA Technical Reports Server (NTRS)

    Freeman, A.; Villasenor, J.; Klein, J. D.

    1991-01-01

    We describe the calibration and analysis of multi-frequency, multi-polarization radar backscatter signatures over an agriculture test site in the Netherlands. The calibration procedure involved two stages: in the first stage, polarimetric and radiometric calibrations (ignoring noise) were carried out using square-base trihedral corner reflector signatures and some properties of the clutter background. In the second stage, a novel algorithm was used to estimate the noise level in the polarimetric data channels by using the measured signature of an idealized rough surface with Bragg scattering (the ocean in this case). This estimated noise level was then used to correct the measured backscatter signatures from the agriculture fields. We examine the significance of several key parameters extracted from the calibrated and noise-corrected backscatter signatures. The significance is assessed in terms of the ability to uniquely separate among classes from 13 different backscatter types selected from the test site data, including eleven different crops, one forest and one ocean area. Using the parameters with the highest separation for a given class, we use a hierarchical algorithm to classify the entire image. We find that many classes, including ocean, forest, potato, and beet, can be identified with high reliability, while the classes for which no single parameter exhibits sufficient separation have higher rates of misclassification. We expect that modified decision criteria involving simultaneous consideration of several parameters will increase performance for these classes.

  8. Airborne laser scanning for forest health status assessment and radiative transfer modelling

    NASA Astrophysics Data System (ADS)

    Novotny, Jan; Zemek, Frantisek; Pikl, Miroslav; Janoutova, Ruzena

    2013-04-01

    Structural parameters of forest stands/ecosystems are an important complementary source of information to spectral signatures obtained from airborne imaging spectroscopy when quantitative assessment of forest stands is the focus, such as estimation of forest biomass, biochemical properties (e.g. chlorophyll/water content), etc. The parameterization of radiative transfer (RT) models used in the latter case requires the three-dimensional spatial distribution of green foliage and woody biomass. Airborne LiDAR data acquired over forest sites carry this kind of 3D information. The main objective of the study was to compare the results from several approaches to interpolation of the digital elevation model (DEM) and digital surface model (DSM). We worked with airborne LiDAR data of varying density (TopEye Mk II 1,064 nm instrument, 1-5 points/m2) acquired over the Norway spruce forests situated in the Beskydy Mountains, the Czech Republic. Three interpolation algorithms of increasing complexity were tested: (i) the nearest-neighbour approach implemented in the BCAL software package (Idaho Univ.); (ii) the averaging and linear interpolation techniques used in the OPALS software (Vienna Univ. of Technology); (iii) the active contour technique implemented in the TreeVis software (Univ. of Freiburg). We defined two spatial resolutions for the resulting coupled raster DEM and DSM outputs, 0.4 m and 1 m, calculated by each algorithm. The grids correspond to the same spatial resolutions of the hyperspectral imagery for which the DEMs were used in (a) geometrical correction and (b) building complex tree models for radiative transfer modelling. We applied two types of analyses when comparing the results from the different interpolations/raster resolutions: (1) comparing the calculated DEMs or DSMs with each other; (2) comparing them with field data: DEMs with measurements from a reference GPS, and DSMs with field allometric tree measurements, where tree height was calculated as DSM-DEM. The results of the analyses show that: (1) averaging techniques tend to underestimate the tree height, and the generated surface does not follow the first LiDAR echoes for either the 1 m or the 0.4 m pixel size; (2) we did not find any significant difference between tree heights calculated by the nearest-neighbour algorithm and the active contour technique for the 1 m pixel output, but the difference increased with finer resolution (0.4 m); (3) the accuracy of the DEMs calculated by the tested algorithms is similar.
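
    The DSM-DEM operation at the end is simple to demonstrate: grid the raw returns, take per-cell maxima (surface) and minima (terrain), and subtract. The gridding below is a deliberately crude stand-in for the interpolators the study compares, applied to a synthetic point cloud:

        import numpy as np

        def grid_heights(points, cell=1.0):
            # Crude gridding: DSM = max return height per cell, DEM = min.
            ij = np.floor(points[:, :2] / cell).astype(int)
            ij -= ij.min(axis=0)
            shape = tuple(ij.max(axis=0) + 1)
            dsm = np.full(shape, -np.inf)
            dem = np.full(shape, np.inf)
            for (i, j), z in zip(ij, points[:, 2]):
                dsm[i, j] = max(dsm[i, j], z)
                dem[i, j] = min(dem[i, j], z)
            return dsm, dem

        rng = np.random.default_rng(3)
        pts = np.column_stack([rng.uniform(0, 10, 500), rng.uniform(0, 10, 500),
                               rng.uniform(0, 1, 500)])    # ground returns
        pts[:100, 2] += 15.0                               # canopy returns
        dsm, dem = grid_heights(pts)
        chm = dsm - dem                                    # tree height = DSM - DEM
        print(np.max(chm[np.isfinite(chm)]))               # ~15 m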

  9. Merging Digital Medicine and Economics: Two Moving Averages Unlock Biosignals for Better Health.

    PubMed

    Elgendi, Mohamed

    2018-01-06

    Algorithm development in digital medicine necessitates ongoing knowledge and skills updating to match the current demands and constant progression in the field. In today's chaotic world there is an increasing trend to seek out simple solutions for complex problems that can increase efficiency, reduce resource consumption, and improve scalability. This desire has spilled over into the world of science and research, where many disciplines have taken to investigating and applying more simplistic approaches. Interestingly, a review of current literature and research efforts suggests that the learning and teaching principles in digital medicine continue to push towards the development of sophisticated algorithms with a limited scope, and have not fully embraced or encouraged a shift towards more simple solutions that yield equal or better results. This short note aims to demonstrate that within the world of digital medicine and engineering, simpler algorithms can offer effective and efficient solutions where traditionally more complex algorithms have been used. Moreover, the note demonstrates that bridging different research disciplines is very beneficial and yields valuable insights and results.
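
    The 'two moving averages' idea of the title, as commonly used for event detection in biosignals, amounts to flagging samples where a short-window mean exceeds a long-window mean. A toy version on a synthetic beat-like signal, with arbitrary window lengths:

        import numpy as np

        def two_moving_averages(x, w_short=5, w_long=21):
            # Flag samples where the short-window mean exceeds the long-window
            # mean; runs of True mark candidate events (e.g. beats).
            ma = lambda w: np.convolve(x, np.ones(w) / w, mode='same')
            return ma(w_short) > ma(w_long)

        t = np.linspace(0, 2, 500)
        beats = np.abs(np.sin(2 * np.pi * 1.2 * t)) ** 64   # sharp periodic "beats"
        mask = two_moving_averages(beats)
        print(int(np.diff(mask.astype(int)).clip(min=0).sum()))  # ~ beat count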

  10. Polarimetric signature imaging of anisotropic bio-medical tissues

    NASA Astrophysics Data System (ADS)

    Wu, Stewart H.; Yang, De-Ming; Chiou, Arthur; Nee, Soe-Mie F.; Nee, Tsu-Wei

    2010-02-01

    Polarimetric imaging of the Stokes vector (I, Q, U, V) can provide four independent signatures showing the linear and circular polarizations of biological tissues and cells. Using a recently developed Stokes digital imaging system, we measured the Stokes vector images of tissue samples from sections of rat livers containing normal portions and hematomas. The derived Mueller matrix elements can quantitatively provide multi-signature data of the bio-sample. This polarimetric optical technique is a new biosensing option for inspecting the structure of tissue samples, particularly for discriminating tumor from non-tumor biopsies. This technology is useful for critical disease discrimination and medical diagnostics applications.

  11. Concepts and algorithms in digital photogrammetry

    NASA Technical Reports Server (NTRS)

    Schenk, T.

    1994-01-01

    Despite much progress in digital photogrammetry, there is still a considerable lack of understanding of theories and methods which would allow a substantial increase in the automation of photogrammetric processes. The purpose of this paper is to raise awareness that the automation problem is one that cannot be solved in a bottom-up fashion by a trial-and-error approach. We present a short overview of concepts and algorithms used in digital photogrammetry. This is followed by a more detailed presentation of perceptual organization, a typical middle-level task.

  12. Automated analysis in generic groups

    NASA Astrophysics Data System (ADS)

    Fagerholm, Edvard

    This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas of symbolic cryptography. We define a broad class of generic and symbolic group models for different settings (symmetric or asymmetric (leveled) k-linear groups) and prove "computational soundness" theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems, identifying different classes of assumptions, and obtain decidability and undecidability results. Then, we develop automated procedures for verifying the conditions of our master theorems, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption, and outputs either a proof of its generic hardness or an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating "pairing-product equations". Recent work on structure-preserving signatures studies optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings. However, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of the minimal number of pairing computations needed to verify structure-preserving signatures. We build an automated tool to search for structure-preserving signatures matching a template. Through exhaustive search we conjecture lower bounds for the number of pairings required in the Type II setting and prove our conjecture to be true. Finally, our tool exhibits examples of structure-preserving signatures matching the lower bounds, which proves tightness of our bounds, as well as improves on previously known structure-preserving signature schemes.

  13. What’s in a Name: A Comparative Analysis of the United States Real ID Act and the United Kingdom’s National Identity Scheme

    DTIC Science & Technology

    2015-12-01

    has always been important to modern governments, but the issue has become much more pressing in the Internet age, when a person's digital identity...nationals in the UK are being given different cards. 4. Place of birth 5. Signature - digitally embedded in the card 6. Date of card issue and date...license or personal identification card number; a digital photograph of the person; the person's address of principal residence; the person's

  14. Detection of suspicious pain regions on a digital infrared thermal image using the multimodal function optimization.

    PubMed

    Lee, Junghoon; Lee, Joosung; Song, Sangha; Lee, Hyunsook; Lee, Kyoungjoung; Yoon, Youngro

    2008-01-01

    Automatic detection of suspicious pain regions is very useful in the medical digital infrared thermal imaging research area. To detect those regions, we use the SOFES (Survival Of the Fittest kind of the Evolution Strategy) algorithm, which is one of the multimodal function optimization methods. We apply this algorithm to well-known conditions such as the glycosuric (diabetic) foot, degenerative arthritis, and varicose veins. The SOFES algorithm can detect hot spots or warm lines such as veins, and over a hundred trials it converged very quickly.

  15. Creating an anthropomorphic digital MR phantom—an extensible tool for comparing and evaluating quantitative imaging algorithms

    NASA Astrophysics Data System (ADS)

    Bosca, Ryan J.; Jackson, Edward F.

    2016-01-01

    Assessing and mitigating the various sources of bias and variance associated with image quantification algorithms is essential to the use of such algorithms in clinical research and practice. Assessment is usually accomplished with grid-based digital reference objects (DRO) or, more recently, digital anthropomorphic phantoms based on normal human anatomy. Publicly available digital anthropomorphic phantoms can provide a basis for generating realistic model-based DROs that incorporate the heterogeneity commonly found in pathology. Using a publicly available vascular input function (VIF) and digital anthropomorphic phantom of a normal human brain, a methodology was developed to generate a DRO based on the general kinetic model (GKM) that represented realistic and heterogeneously enhancing pathology. GKM parameters were estimated from a deidentified clinical dynamic contrast-enhanced (DCE) MRI exam. This clinical imaging volume was co-registered with a discrete tissue model, and model parameters estimated from clinical images were used to synthesize a DCE-MRI exam that consisted of normal brain tissues and a heterogeneously enhancing brain tumor. An example application of spatial smoothing was used to illustrate potential applications in assessing quantitative imaging algorithms. A voxel-wise Bland-Altman analysis demonstrated negligible differences between the parameters estimated with and without spatial smoothing (using a small radius Gaussian kernel). In this work, we reported an extensible methodology for generating model-based anthropomorphic DROs containing normal and pathological tissue that can be used to assess quantitative imaging algorithms.

  16. Secure quantum signatures: a practical quantum technology (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Andersson, Erika

    2016-10-01

    Modern cryptography encompasses much more than encryption of secret messages. Signature schemes are widely used to guarantee that messages cannot be forged or tampered with, for example in e-mail, software updates and electronic commerce. Messages are also transferable, which distinguishes digital signatures from message authentication. Transferability means that messages can be forwarded; in other words, that a sender is unlikely to be able to make one recipient accept a message which is subsequently rejected by another recipient if the message is forwarded. Similar to public-key encryption, the security of commonly used signature schemes relies on the assumed computational difficulty of problems such as finding discrete logarithms or factoring large numbers. With quantum computers, such assumptions would no longer be valid. Partly for this reason, it is desirable to develop signature schemes with unconditional or information-theoretic security. Quantum signature schemes are one possible solution. Similar to quantum key distribution (QKD), their unconditional security relies only on the laws of quantum mechanics. Quantum signatures can be realized with the same system components as QKD, but are so far less investigated. This talk aims to provide an introduction to quantum signatures and to review theoretical and experimental progress so far.

  17. SRM 2460/2461 Standard Bullets and Casings Project

    PubMed Central

    Song, J.; Whitenton, E.; Kelley, D.; Clary, R.; Ma, L.; Ballou, S.; Ols, M.

    2004-01-01

    The National Institute of Standards and Technology Standard Reference Material (SRM) 2460/2461 standard bullets and casings project will provide support to firearms examiners and to the National Integrated Ballistics Information Network (NIBIN) in the United States. The SRM bullet is designed as both a virtual and a physical bullet profile signature standard. The virtual standard is a set of six digitized bullet profile signatures originally traced from six master bullets fired at the Bureau of Alcohol, Tobacco and Firearms (ATF) and the Federal Bureau of Investigation (FBI). By using the virtual signature standard to control the tool path on a numerically controlled diamond turning machine, 40 SRM bullets were produced. A profile signature measurement system was established for the SRM bullets. The profile signature differences are quantified by the maximum of the cross correlation function and by the signature difference between pairs of compared profile signatures measured on different SRM bullets. Initial measurement results showed high reproducibility for both the measurement system and production process of the SRM bullets. A traceability scheme has been proposed to establish the measurement traceability for nationwide bullet signature measurements to NIST, ATF and FBI. Prototype SRM casings have also been developed. PMID:27366632
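
    The comparison metric described, the maximum of the cross-correlation function between two profile signatures, can be prototyped directly. A sketch on synthetic profiles, with smoothed noise standing in for a land-engraved bullet profile:

        import numpy as np

        def max_ccf(a, b):
            # Maximum of the normalized cross-correlation; values near 1.0
            # indicate nearly identical surface topography signatures.
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return float(np.correlate(a, b, mode='full').max() / len(a))

        rng = np.random.default_rng(4)
        smooth = lambda n=800, k=15: np.convolve(rng.normal(size=n + k - 1),
                                                 np.ones(k) / k, mode='valid')
        profile = smooth()                                  # toy engraved profile
        replica = np.roll(profile, 25) + 0.05 * rng.normal(size=800)
        print(max_ccf(profile, replica))                    # close to 1
        print(max_ccf(profile, smooth()))                   # much lower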

  18. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced using the early procedures have disclosed many shortcomings, e.g., ghost images, occlusions, and shadows. Thus, providing the technical base (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the forthcoming national deployment of large-scale digital orthophotos and the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing, (2) spatial object/feature extraction using surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) algorithm development for generation of DTM-based and DBM-based orthophotos, (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos, and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.

  19. Contour Connection Method for automated identification and classification of landslide deposits

    NASA Astrophysics Data System (ADS)

    Leshchinsky, Ben A.; Olsen, Michael J.; Tanyu, Burak F.

    2015-01-01

    Landslides are a common hazard worldwide that result in major economic, environmental and social impacts. Despite their devastating effects, inventorying existing landslides, often the regions at highest risk of reoccurrence, is challenging, time-consuming, and expensive. Current landslide mapping techniques include field inventorying, photogrammetric approaches, and use of bare-earth (BE) lidar digital terrain models (DTMs) to highlight regions of instability. However, with the exception of BE DTMs, which can reveal the landscape beneath vegetation and other obstructions and highlight landslide features including scarps, deposits, and fans, many techniques do not have sufficient resolution, detail, and accuracy for mapping at landscape scale. Current approaches to landslide inventorying with lidar-derived BE DTMs include manual digitizing, statistical or machine-learning approaches, and use of alternate sensors (e.g., hyperspectral imaging) with lidar. This paper outlines a novel algorithm to automatically and consistently detect landslide deposits at landscape scale. The proposed method, named the Contour Connection Method (CCM), is based primarily on bare-earth lidar data and requires minimal user input, such as the landslide scarp and deposit gradients. The CCM algorithm functions by applying contours and nodes to a map and using vectors connecting the nodes to evaluate gradient and associated landslide features based on the user-defined input criteria. Beyond detection, CCM can potentially also be used to classify different landscape features: each landslide feature has a distinct set of metadata, specifically the density of connection vectors on each contour, that provides a unique signature. In this paper, CCM is demonstrated by applying the algorithm to the region surrounding the Oso landslide in Washington (March 2014), as well as to two 14,000 ha DTMs in Oregon, which were used to compare CCM with manually delineated landslide deposits. The results show the capability of CCM, with limited data requirements, to agree with manual delineation while achieving the results much faster.

  20. Can specific transcriptional regulators assemble a universal cancer signature?

    NASA Astrophysics Data System (ADS)

    Roy, Janine; Isik, Zerrin; Pilarsky, Christian; Schroeder, Michael

    2013-10-01

    Recently, there has been a lot of interest in using biomarker signatures derived from gene expression data to predict cancer progression. We assembled signatures from 25 published datasets covering 13 types of cancer. How do these signatures compare with each other? On the one hand, signatures answering the same biological question should overlap, whereas signatures predicting different cancer types should differ. On the other hand, there could also be a Universal Cancer Signature that is predictive independently of the cancer type. Initially, we generate signatures for all datasets using classical approaches such as the t-test and fold change, and then we explore signatures resulting from a network-based method that applies the random surfer model of Google's PageRank algorithm. We show that the signatures as published by the authors and the signatures generated with classical methods do not overlap, not even for the same cancer type, whereas the network-based signatures strongly overlap. Selecting 10 out of 37 universal cancer genes gives the optimal prediction for all cancers, thus taking a first step towards a Universal Cancer Signature. We furthermore analyze and discuss the involved genes in terms of the Hallmarks of Cancer and in particular single out SP1, JUN/FOS and NFKB1, and examine their specific role in cancer progression.
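
    The network-based step rests on personalized PageRank: scores diffuse from differentially expressed seed genes across an interaction network, so robust neighborhoods score highly regardless of which exact genes passed a t-test. A small power-iteration sketch on an invented five-gene network:

        import numpy as np

        def personalized_pagerank(A, seed, d=0.85, iters=100):
            # Power iteration for the random-surfer score with restarts at the
            # seed genes (the network smoothing behind the robust signatures).
            W = A / np.maximum(A.sum(axis=0, keepdims=True), 1e-12)
            p = seed / seed.sum()
            r = np.full(A.shape[0], 1.0 / A.shape[0])
            for _ in range(iters):
                r = d * (W @ r) + (1 - d) * p
            return r

        # Invented 5-gene interaction network (adjacency matrix) and seed
        # evidence, e.g. fold-change magnitude of differentially expressed genes.
        A = np.array([[0, 1, 1, 0, 0],
                      [1, 0, 1, 0, 0],
                      [1, 1, 0, 1, 0],
                      [0, 0, 1, 0, 1],
                      [0, 0, 0, 1, 0]], dtype=float)
        seed = np.array([1.0, 0.0, 0.0, 0.0, 1.0])
        print(personalized_pagerank(A, seed).round(3))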

  1. Self-authentication of value documents

    NASA Astrophysics Data System (ADS)

    Hayosh, Thomas D.

    1998-04-01

    To prevent fraud it is critical to distinguish an authentic document from a counterfeit or altered document. Most current technologies rely on difficult-to-print human detectable features which are added to a document to prevent illegal reproduction. Fraud detection is mostly accomplished by human observation and is based upon the examiner's knowledge, experience and time allotted for examination of a document. Another approach to increasing the security of a value document is to add a unique property to each document. Data about that property is then encoded on the document itself and finally secured using a public key based digital signature. In such a scheme, machine readability of authenticity is possible. This paper describes a patent-applied-for methodology using the unique property of magnetic ink printing, magnetic remanence, that provides for full self- authentication when used with a recordable magnetic stripe for storing a digital signature and other document data. Traditionally the authenticity of a document is determined by physical examination for color, background printing, paper texture, printing resolution, and ink characteristics. On an initial level, there may be numerous security features present on a value document but only a few can be detected and evaluated by the untrained individual. Because security features are normally not standardized except on currency, training tellers and cashiers to do extensive security evaluation is not practical, even though these people are often the only people who get a chance to closely examine the document in a payment system which is back-end automated. In the context of this paper, one should be thinking about value documents such as commercial and personal checks although the concepts presented here can easily be applied to travelers cheques, credit cards, event tickets, passports, driver's licenses, motor vehicle titles, and even currency. For a practical self-authentication system, the false alarms should be less than 1% on the first read pass. Causes of false alarms could be the lack of robustness of the taggant discrimination algorithm, excessive document skew as it is being read, or errors in reading the recordable stripe. The false alarm rate is readily tested by reading the magnetic tags and digitally signing documents in one reader and performing authentication in at least two other reading devices. When reading the same check in the same reader where signed, the error metric is typically in the range of 0.0600. When comparing different checks in different readers, the error metric generally reports values in the range of 0.3930. It is clear from tests to date that the taggant patterns are randomly different for checks even when printed serially one after another using the same printing process. Testing results to date on the robustness of the taggant comparison and discrimination algorithms indicate that it is probable that low false alarms and very low false accept rates will be achieved.
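
    The signing step of such a self-authentication scheme is standard public-key machinery: encode the document fields together with the measured taggant data, sign with the issuer's private key, and let any reader verify with the public key alone. A hedged sketch using the Python cryptography package; the payload fields and taggant digest are invented placeholders, not the patent's encoding:

        # Requires: pip install cryptography
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        # Hypothetical payload: document fields plus a digest of the measured
        # magnetic-remanence taggant pattern (placeholder values only).
        payload = b"payee=ACME;amount=125.00;taggant_digest=9f2c41d0"

        pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                          salt_length=padding.PSS.MAX_LENGTH)
        issuer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        signature = issuer_key.sign(payload, pss, hashes.SHA256())

        # A reader device holds only the public key; verify() raises
        # InvalidSignature if the payload or signature was altered.
        issuer_key.public_key().verify(signature, payload, pss, hashes.SHA256())
        print("document authenticated")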

  2. A Study of Bi-Directional Reflectance Distribution Functions and Their Effect on Infrared Signature Models

    DTIC Science & Technology

    2007-03-01

    3.5.1 Specular Reflection Assumption ... 3.5.2 Radiosity ... 3.31 Radiosity Algorithm Flowchart ... 3.32 POV... 3.5.2 Radiosity. The first algorithm implemented to attempt to hemispherically integrate the irradiance contribution was classical radiosity as

  3. Fuzzy Clustering of Multiple Instance Data

    DTIC Science & Technology

    2015-11-30

    depth is not. To illustrate this data, in figure 1 we display the GPR signatures of the same mine buried at 3 in. deep in two geographically different...target signature depends on the soil properties of the site. The same mine type is buried at 3 in. deep in both sites. Since its formal introduction...drug design [15], and the problem of handwritten digit recognition [16]. To the best of our knowledge, Dietterich et al. [1] were the first to

  4. An XML-Based Mission Command Language for Autonomous Underwater Vehicles (AUVs)

    DTIC Science & Technology

    2003-06-01

    P. XML: How To Program. Prentice Hall, Inc., Upper Saddle River, New Jersey, 2001. Digital Signature Activity Statement, W3C, www.w3.org/Signature...languages because it does not directly specify how information is to be presented, but rather defines the structure (and thus semantics) of the...command and control (C2) aspects of using XML to increase the utility of AUVs. XML programming will be addressed. Current mine warfare doctrine will be

  5. Quantification of 235U and 238U activity concentrations for undeclared nuclear materials by a digital gamma-gamma coincidence spectroscopy.

    PubMed

    Zhang, Weihua; Yi, Jing; Mekarski, Pawel; Ungar, Kurt; Hauck, Barry; Kramer, Gary H

    2011-06-01

    The purpose of this study is to investigate the possibility of verifying depleted uranium (DU), natural uranium (NU), low enriched uranium (LEU) and high enriched uranium (HEU) with a newly developed digital gamma-gamma coincidence spectroscopy system. The system consists of two NaI(Tl) scintillators and the XIA LLC Digital Gamma Finder (DGF)/Pixie-4 software and card package. The results demonstrate that the system provides an effective method of (235)U and (238)U quantification based on the count rate of their gamma-gamma coincidence counting signatures. The main advantages of this approach over conventional gamma spectrometry include the low background continuum near the coincidence signatures of (235)U and (238)U, reduced interference from other radionuclides in gamma-gamma coincidence counting, and region-of-interest (ROI) image analysis for uranium enrichment determination. Compared to conventional gamma spectrometry, the method offers the additional advantage of requiring minimal calibration for (235)U and (238)U quantification at different sample geometries. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
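
    Coincidence counting itself is easy to prototype: pair events from the two detectors whose timestamps fall within a short window. A sketch on synthetic timestamps, with window and rates invented:

        import numpy as np

        def coincidence_count(t1, t2, window=1e-7):
            # For each detector-1 event, find the nearest detector-2 event and
            # count it if within the coincidence window (times in seconds,
            # t2 must be sorted).
            i = np.searchsorted(t2, t1)
            left = np.abs(t1 - t2[np.clip(i - 1, 0, len(t2) - 1)])
            right = np.abs(t1 - t2[np.clip(i, 0, len(t2) - 1)])
            return int(np.sum(np.minimum(left, right) <= window))

        rng = np.random.default_rng(5)
        cascade = np.sort(rng.uniform(0, 1.0, 300))        # correlated gamma pairs
        t1 = np.sort(np.concatenate([cascade, rng.uniform(0, 1.0, 200)]))
        t2 = np.sort(np.concatenate([cascade + rng.normal(0, 2e-8, 300),
                                     rng.uniform(0, 1.0, 250)]))
        print(coincidence_count(t1, t2))          # ~300 true + a few accidentals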

  6. Simulation of atmospheric and terrestrial background signatures for detection and tracking scenarios

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin

    2015-10-01

    The field of early warning depends on reliable image exploitation: only if the applied detection and tracking algorithms work efficiently can the threat-approach alert be given fast enough to ensure automatic initiation of the countermeasure. In order to evaluate the performance of those algorithms for a given electro-optical (EO) sensor system, test sequences need to be created as realistically and comprehensively as possible. Since both the background and the target signature depend on the environmental conditions, detailed knowledge of the meteorology and climatology is necessary. Trials measuring these environmental characteristics serve as a solid basis, but may only capture the conditions during a rather short period of time. To represent the entire variation of meteorology and climatology that the future system will be exposed to, the application of comprehensive atmospheric modelling tools is essential. This paper gives an introduction to the atmospheric modelling tools that are currently used at Fraunhofer IOSB to simulate spectral background signatures in the infrared (IR) range. It is also demonstrated how those signatures are affected by changing atmospheric and climatic conditions. In conclusion, and with a special focus on the modelling of different cloud types, sources of error and limits are discussed.

  7. Floating shock fitting via Lagrangian adaptive meshes

    NASA Technical Reports Server (NTRS)

    Vanrosendale, John

    1994-01-01

    In recent works we have formulated a new approach to compressible flow simulation, combining the advantages of shock-fitting and shock-capturing. Using a cell-centered Roe scheme discretization on unstructured meshes, we warp the mesh while marching to steady state, so that mesh edges align with shocks and other discontinuities. This new algorithm, the Shock-fitting Lagrangian Adaptive Method (SLAM) is, in effect, a reliable shock-capturing algorithm which yields shock-fitted accuracy at convergence. Shock-capturing algorithms like this, which warp the mesh to yield shock-fitted accuracy, are new and relatively untried. However, their potential is clear. In the context of sonic booms, accurate calculation of near-field sonic boom signatures is critical to the design of the High Speed Civil Transport (HSCT). SLAM should allow computation of accurate N-wave pressure signatures on comparatively coarse meshes, significantly enhancing our ability to design low-boom configurations for high-speed aircraft.

  8. Discovering causal signaling pathways through gene-expression patterns

    PubMed Central

    Parikh, Jignesh R.; Klinger, Bertram; Xia, Yu; Marto, Jarrod A.; Blüthgen, Nils

    2010-01-01

    High-throughput gene-expression studies result in lists of differentially expressed genes. Most current meta-analyses of these gene lists include searching for significant membership of the translated proteins in various signaling pathways. However, such membership enrichment algorithms do not provide insight into which pathways caused the genes to be differentially expressed in the first place. Here, we present an intuitive approach for discovering upstream signaling pathways responsible for regulating these differentially expressed genes. We identify consistently regulated signature genes specific for signal transduction pathways from a panel of single-pathway perturbation experiments. An algorithm that detects overrepresentation of these signature genes in a gene group of interest is used to infer the signaling pathway responsible for regulation. We expose our novel resource and algorithm through a web server called SPEED: Signaling Pathway Enrichment using Experimental Data sets. SPEED can be freely accessed at http://speed.sys-bio.net/. PMID:20494976
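
    The overrepresentation test at the core of such an approach can be illustrated with a hypergeometric tail probability. A minimal sketch in Python, with scipy's `hypergeom` standing in for SPEED's actual statistic and purely hypothetical counts:

    ```python
    from scipy.stats import hypergeom

    def enrichment_pvalue(n_universe, n_signature, n_list, n_overlap):
        """P(overlap >= n_overlap) when n_list genes are drawn at random
        from a universe containing n_signature pathway-signature genes."""
        return hypergeom.sf(n_overlap - 1, n_universe, n_signature, n_list)

    # Hypothetical counts: 20000 genes total, 150 signature genes for a
    # pathway, 300 differentially expressed genes, 12 of them in the signature.
    print(enrichment_pvalue(20000, 150, 300, 12))
    ```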

  9. 5 CFR 850.103 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... (asymmetric) cryptography is a method of creating a unique mark, known as a digital signature, on an... cryptography means a method of authentication in which a single key is used to sign and verify an electronic...
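
    The asymmetric sign-and-verify workflow defined here can be sketched with the Python `cryptography` package; this is a generic RSA-PSS example, not the specific mechanism mandated by the regulation:

    ```python
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.exceptions import InvalidSignature

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    message = b"electronic record to be signed"
    signature = private_key.sign(message, pss, hashes.SHA256())  # private key signs

    try:
        private_key.public_key().verify(signature, message, pss, hashes.SHA256())
        print("signature valid")                                 # public key verifies
    except InvalidSignature:
        print("signature invalid")
    ```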

  10. [Digital signature: new prospects for the information of the cardiologic clinical card].

    PubMed

    Cervesato, E; Antonini-Canterin, F; Nicolosi, G L

    2001-02-01

    In the last few years, remarkable improvements have been made in the computerized database systems used in cardiology. However, they will not easily lead to further relevant improvements unless the weaknesses and gaps deriving from the legal obligation to create and store case sheets are faced and resolved in an original way. This article covers the topic of the digital signature and how it could form the basis for a new powerful impulse to the computerization of cardiology records. The proposal of elaborating a totally computerized case sheet involves the need to rationalize the flow of clinical information and to implement a management system integrated with the hospital information system. The elimination of paper support will probably lead to an advantageous cycle that will involve the entire hospital, both clinically and administratively.

  11. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
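
    The heart of the method is a sum of log-ratios between the image and Gaussian-smoothed surrounds at several scales. A minimal single-channel sketch (the color restoration step of full MSRCR is omitted, and the scales are illustrative):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def multi_scale_retinex(img, sigmas=(15, 80, 250)):
        """Average of log(image) - log(surround) over several surround scales."""
        img = img.astype(np.float64) + 1.0          # offset avoids log(0)
        out = np.zeros_like(img)
        for sigma in sigmas:
            surround = gaussian_filter(img, sigma)
            out += np.log(img) - np.log(surround)
        return out / len(sigmas)
    ```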

  12. Extraction of small boat harmonic signatures from passive sonar.

    PubMed

    Ogden, George L; Zurk, Lisa M; Jones, Mark E; Peterson, Mary E

    2011-06-01

    This paper investigates the extraction of acoustic signatures from small boats using a passive sonar system. Noise radiated from a small boat consists of broadband noise and harmonically related tones that correspond to engine and propeller specifications. A signal processing method to automatically extract the harmonic structure of noise radiated from small boats is developed. The Harmonic Extraction and Analysis Tool (HEAT) estimates the instantaneous fundamental frequency of the harmonic tones, refines the fundamental frequency estimate using a Kalman filter, and automatically extracts the amplitudes of the harmonic tonals to generate a harmonic signature for the boat. Results are presented that show the HEAT algorithm's ability to extract these signatures. © 2011 Acoustical Society of America
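
    As an illustration of the harmonic-amplitude step, the following numpy sketch samples the spectrum at multiples of an already-estimated fundamental; the fundamental tracking and Kalman refinement that HEAT performs are not reproduced:

    ```python
    import numpy as np

    def harmonic_signature(x, fs, f0, n_harmonics=10):
        """Amplitudes near k*f0 in the windowed spectrum of signal x."""
        spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
        freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
        amps = []
        for k in range(1, n_harmonics + 1):
            idx = int(np.argmin(np.abs(freqs - k * f0)))
            amps.append(spec[max(idx - 2, 0):idx + 3].max())  # peak in a small band
        return np.array(amps)
    ```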

  13. A Coincidence Signature Library for Multicoincidence Radionuclide Analysis Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Ellis, J E.; Valsan, Andrei B.

    Pacific Northwest National Laboratory (PNNL) is currently developing multicoincidence systems to perform trace radionuclide analysis at or near the sample collection point, for applications that include emergency response, nuclear forensics, and environmental monitoring. Quantifying radionuclide concentrations with these systems requires a library of accurate emission intensities for each detected signature, for all candidate radionuclides. To meet this need, a Coincidence Lookup Library (CLL) is being developed to calculate the emission intensities of coincident signatures from a user-specified radionuclide, or conversely, to determine the radionuclides that may be responsible for a specific detected coincident signature. The algorithms used to generate absolute emission intensities and various query modes for our developmental CLL are described.

  14. An adaptive deep Q-learning strategy for handwritten digit recognition.

    PubMed

    Qiao, Junfei; Wang, Gongming; Li, Wenjing; Chen, Min

    2018-02-22

    Handwritten digit recognition has been a challenging problem in recent years. Although many deep learning-based classification algorithms have been studied for handwritten digit recognition, the recognition accuracy and running time still need to be further improved. In this paper, an adaptive deep Q-learning strategy is proposed to improve accuracy and shorten running time for handwritten digit recognition. The adaptive deep Q-learning strategy combines the feature-extracting capability of deep learning and the decision-making of reinforcement learning to form an adaptive Q-learning deep belief network (Q-ADBN). First, Q-ADBN extracts the features of original images using an adaptive deep auto-encoder (ADAE), and the extracted features are considered as the current states of the Q-learning algorithm. Second, Q-ADBN receives the Q-function (reward signal) during recognition of the current states, and the final handwritten digit recognition is implemented by maximizing the Q-function using the Q-learning algorithm. Finally, experimental results on the well-known MNIST dataset show that the proposed Q-ADBN is superior to other similar methods in terms of accuracy and running time. Copyright © 2018 Elsevier Ltd. All rights reserved.
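
    For reference, the Q-learning update being maximized has the familiar tabular form; a toy numpy version follows (in the paper, a deep belief network over auto-encoder features replaces the table):

    ```python
    import numpy as np

    def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
        """One temporal-difference update of Q(s, a) toward r + gamma * max Q(s', .)."""
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        return Q

    Q = np.zeros((10, 10))                     # toy table: 10 states x 10 digit classes
    q_update(Q, s=3, a=7, r=1.0, s_next=4)     # reward 1.0 for a correct label
    ```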

  15. Semi-automated extraction of landslides in Taiwan based on SPOT imagery and DEMs

    NASA Astrophysics Data System (ADS)

    Eisank, Clemens; Hölbling, Daniel; Friedl, Barbara; Chen, Yi-Chin; Chang, Kang-Tsung

    2014-05-01

    The vast availability and improved quality of optical satellite data and digital elevation models (DEMs), as well as the need for complete and up-to-date landslide inventories at various spatial scales, have fostered the development of semi-automated landslide recognition systems. Among the tested approaches for designing such systems, object-based image analysis (OBIA) stood out as a highly promising methodology. OBIA offers a flexible, spatially enabled framework for effective landslide mapping. Most object-based landslide mapping systems, however, have been tailored to specific, mainly small-scale study areas or even to single landslides only. Even though reported mapping accuracies tend to be higher than for pixel-based approaches, accuracy values are still relatively low and depend on the particular study. There is still room to improve the applicability and objectivity of object-based landslide mapping systems. The presented study aims at developing a knowledge-based landslide mapping system implemented in an OBIA environment, i.e. Trimble eCognition. In comparison to previous knowledge-based approaches, the classification of segmentation-derived multi-scale image objects relies on digital landslide signatures. These signatures hold the common operational knowledge on digital landslide mapping, as reported by 25 Taiwanese landslide experts during personal semi-structured interviews. Specifically, the signatures include information on commonly used data layers, spectral and spatial features, and feature thresholds. The signatures guide the selection and implementation of mapping rules that were finally encoded in Cognition Network Language (CNL). Multi-scale image segmentation is optimized by using the improved Estimation of Scale Parameter (ESP) tool. The approach described above is developed and tested for mapping landslides in a sub-region of the Baichi catchment in Northern Taiwan based on SPOT imagery and a high-resolution DEM. An object-based accuracy assessment is conducted by quantitatively comparing extracted landslide objects with landslide polygons that were visually interpreted by local experts. The applicability and transferability of the mapping system are evaluated by comparing initial accuracies with those achieved for the following two tests: first, usage of a SPOT image from the same year, but for a different area within the Baichi catchment; second, usage of SPOT images from multiple years for the same region. The integration of common knowledge via digital landslide signatures is new in object-based landslide studies. In combination with strategies to optimize image segmentation, this may lead to a more objective, transferable and stable knowledge-based system for the mapping of landslides from optical satellite data and DEMs.

  16. Comparison of edge analysis techniques for the determination of the MTF of digital radiographic systems.

    PubMed

    Samei, Ehsan; Buhr, Egbert; Granfors, Paul; Vandenbroucke, Dirk; Wang, Xiaohui

    2005-08-07

    The modulation transfer function (MTF) is well established as a metric to characterize the resolution performance of a digital radiographic system. Implemented by various laboratories, the edge technique is currently the most widespread approach to measure the MTF. However, there can be differences in the results attributed to differences in the analysis technique employed. The objective of this study was to determine whether comparable results can be obtained from different algorithms processing identical images representative of those of current digital radiographic systems. Five laboratories participated in a round-robin evaluation of six different algorithms including one prescribed in the International Electrotechnical Commission (IEC) 62220-1 standard. The algorithms were applied to two synthetic and 12 real edge images from different digital radiographic systems including CR, and direct- and indirect-conversion detector systems. The results were analysed in terms of variability as well as accuracy of the resulting presampled MTFs. The results indicated that differences between the individual MTFs and the mean MTF were largely below 0.02. In the case of the two simulated edge images, all algorithms yielded similar results within 0.01 of the expected true MTF. The findings indicated that all algorithms tested in this round-robin evaluation, including the IEC-prescribed algorithm, were suitable for accurate MTF determination from edge images, provided the images are not excessively noisy. The agreement of the MTF results was judged sufficient for the measurement of the MTF necessary for the determination of the DQE.
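
    The common edge-based pipeline reduces to: differentiate the edge spread function (ESF) to obtain the line spread function (LSF), then take the Fourier transform magnitude. A simplified sketch under that assumption (the IEC 62220-1 procedure additionally uses an angled edge and sub-pixel ESF re-binning, which are omitted here):

    ```python
    import numpy as np

    def mtf_from_esf(esf, dx):
        """Presampled-MTF estimate from a 1-D edge spread function sampled at dx."""
        lsf = np.gradient(esf, dx)          # LSF = derivative of ESF
        lsf *= np.hanning(len(lsf))         # window to tame noise at the tails
        mtf = np.abs(np.fft.rfft(lsf))
        mtf /= mtf[0]                       # normalize to unity at zero frequency
        freqs = np.fft.rfftfreq(len(lsf), dx)
        return freqs, mtf
    ```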

  17. An automatic fuzzy-based multi-temporal brain digital subtraction angiography image fusion algorithm using curvelet transform and content selection strategy.

    PubMed

    Momeni, Saba; Pourghassem, Hossein

    2014-08-01

    Recently, image fusion has taken a prominent role in medical image processing and is useful for diagnosing and treating many diseases. Digital subtraction angiography is one of the most widely used imaging modalities for diagnosing brain vascular diseases and for brain radiosurgery. This paper proposes an automatic fuzzy-based multi-temporal fusion algorithm for 2-D digital subtraction angiography images. In this algorithm, for blood vessel map extraction, the valuable frames of the brain angiography video are automatically determined to form the digital subtraction angiography images, based on a novel definition of vessel dispersion generated by the injected contrast material. Our proposed fusion scheme contains different fusion methods for high and low frequency contents based on the coefficient characteristics of the wrapping-based second-generation curvelet transform and a novel content selection strategy. Our proposed content selection strategy is defined based on sample correlation of the curvelet transform coefficients. In our proposed fuzzy-based fusion scheme, the selection of curvelet coefficients is optimized by applying weighted averaging and maximum selection rules for the high frequency coefficients. For low frequency coefficients, the maximum selection rule based on a local energy criterion is applied for better visual perception. Our proposed fusion algorithm is evaluated on a brain angiography image dataset consisting of one hundred 2-D internal carotid rotational angiography videos. The obtained results demonstrate the effectiveness and efficiency of our proposed fusion algorithm in comparison with common and basic fusion algorithms.
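
    The two selection rules named above are simple to state on transform coefficients; a numpy sketch (the curvelet transform itself and the fuzzy weighting are outside this fragment):

    ```python
    import numpy as np

    def fuse_max_abs(c1, c2):
        """Maximum selection rule: keep the coefficient of larger magnitude."""
        return np.where(np.abs(c1) >= np.abs(c2), c1, c2)

    def fuse_weighted(c1, c2, w1=0.5):
        """Weighted averaging rule for corresponding coefficients."""
        return w1 * c1 + (1.0 - w1) * c2
    ```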

  18. Using Alternative Multiplication Algorithms to "Offload" Cognition

    ERIC Educational Resources Information Center

    Jazby, Dan; Pearn, Cath

    2015-01-01

    When viewed through a lens of embedded cognition, algorithms may enable aspects of the cognitive work of multi-digit multiplication to be "offloaded" to the environmental structure created by an algorithm. This study analyses four multiplication algorithms by viewing different algorithms as enabling cognitive work to be distributed…

  19. Modeling and analysis of LWIR signature variability associated with 3D and BRDF effects

    NASA Astrophysics Data System (ADS)

    Adler-Golden, Steven; Less, David; Jin, Xuemin; Rynes, Peter

    2016-05-01

    Algorithms for retrieval of surface reflectance, emissivity or temperature from a spectral image almost always assume uniform illumination across the scene and horizontal surfaces with Lambertian reflectance. When these algorithms are used to process real 3-D scenes, the retrieved "apparent" values contain the strong, spatially dependent variations in illumination as well as surface bidirectional reflectance distribution function (BRDF) effects. This is especially problematic with horizontal or near-horizontal viewing, where many observed surfaces are vertical, and where horizontal surfaces can show strong specularity. The goals of this study are to characterize long-wavelength infrared (LWIR) signature variability in a HSI 3-D scene and develop practical methods for estimating the true surface values. We take advantage of synthetic near-horizontal imagery generated with the high-fidelity MultiService Electro-optic Signature (MuSES) model, and compare retrievals of temperature and directional-hemispherical reflectance using standard sky downwelling illumination and MuSES-based non-uniform environmental illumination.

  20. A survey of the state-of-the-art and focused research in range systems

    NASA Technical Reports Server (NTRS)

    Kung, Yao; Balakrishnan, A. V.

    1988-01-01

    In this one-year renewal of NASA Contract No. 2-304, basic research, development, and implementation in the areas of modern estimation algorithms and digital communication systems have been performed. In the first area, a basic study on the conversion of general classes of practical signal processing algorithms into systolic array algorithms was conducted, producing four publications. Also studied were the finite word length effects and convergence rates of lattice algorithms, producing two publications. In the second area of study, the use of an efficient importance sampling simulation technique for the evaluation of digital communication system performance was studied, producing two publications.
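
    The importance sampling idea mentioned for communication system evaluation can be shown on a toy error-probability estimate: bias the noise distribution toward the error region, then reweight each sample by the likelihood ratio. The threshold and sample count below are illustrative only:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    t = 4.0                              # error threshold; true P(N(0,1) > 4) ~ 3.2e-5
    n = 200_000
    x = rng.normal(t, 1.0, n)            # biased density centered on the rare event
    w = np.exp(t**2 / 2.0 - t * x)       # likelihood ratio N(0,1) / N(t,1)
    p_hat = np.mean((x > t) * w)         # unbiased estimate of the error probability
    print(p_hat)
    ```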

  1. An adaptive tracker for ShipIR/NTCS

    NASA Astrophysics Data System (ADS)

    Ramaswamy, Srinivasan; Vaitekunas, David A.

    2015-05-01

    A key component in any image-based tracking system is the adaptive tracking algorithm used to segment the image into potential targets, rank-and-select the best candidate target, and the gating of the selected target to further improve tracker performance. This paper will describe a new adaptive tracker algorithm added to the naval threat countermeasure simulator (NTCS) of the NATO-standard ship signature model (ShipIR). The new adaptive tracking algorithm is an optional feature used with any of the existing internal NTCS or user-defined seeker algorithms (e.g., binary centroid, intensity centroid, and threshold intensity centroid). The algorithm segments the detected pixels into clusters, and the smallest set of clusters that meet the detection criterion is obtained by using a knapsack algorithm to identify the set of clusters that should not be used. The rectangular area containing the chosen clusters defines an inner boundary, from which a weighted centroid is calculated as the aim-point. A track-gate is then positioned around the clusters, taking into account the rate of change of the bounding area and compensating for any gimbal displacement. A sequence of scenarios is used to test the new tracking algorithm on a generic unclassified DDG ShipIR model, with and without flares, and demonstrate how some of the key seeker signals are impacted by both the ship and flare intrinsic signatures.
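
    The aim-point step, at least, is compact enough to sketch: an intensity-weighted centroid over the pixels of the chosen clusters. The cluster segmentation and the knapsack-based selection are not reproduced here:

    ```python
    import numpy as np

    def weighted_centroid(intensity, mask):
        """Intensity-weighted centroid (x, y) of the pixels selected by mask."""
        ys, xs = np.nonzero(mask)
        w = intensity[ys, xs].astype(float)
        return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()
    ```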

  2. Spectral mapping tools from the earth sciences applied to spectral microscopy data.

    PubMed

    Harris, A Thomas

    2006-08-01

    Spectral imaging, originating from the field of earth remote sensing, is a powerful tool that is being increasingly used in a wide variety of applications for material identification. Several workers have used techniques like linear spectral unmixing (LSU) to discriminate materials in images derived from spectral microscopy. However, many spectral analysis algorithms rely on assumptions that are often violated in microscopy applications. This study explores algorithms originally developed as improvements on early earth imaging techniques that can be easily translated for use with spectral microscopy. To best demonstrate the application of earth remote sensing spectral analysis tools to spectral microscopy data, earth imaging software was used to analyze data acquired with a Leica confocal microscope with mechanical spectral scanning. For this study, spectral training signatures (often referred to as endmembers) were selected with the ENVI (ITT Visual Information Solutions, Boulder, CO) "spectral hourglass" processing flow, a series of tools that use the spectrally over-determined nature of hyperspectral data to find the most spectrally pure (or spectrally unique) pixels within the data set. This set of endmember signatures was then used in the full range of mapping algorithms available in ENVI to determine locations, and in some cases subpixel abundances of endmembers. Mapping and abundance images showed a broad agreement between the spectral analysis algorithms, supported through visual assessment of output classification images and through statistical analysis of the distribution of pixels within each endmember class. The powerful spectral analysis algorithms available in COTS software, the result of decades of research in earth imaging, are easily translated to new sources of spectral data. Although the scale between earth imagery and spectral microscopy is radically different, the problem is the same: mapping material locations and abundances based on unique spectral signatures. (c) 2006 International Society for Analytical Cytology.
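
    One of the ENVI mapping algorithms referred to, the spectral angle mapper (SAM), is easy to restate: classify each pixel by the angle between its spectrum and each endmember spectrum. A numpy sketch (unoptimized, for illustration only):

    ```python
    import numpy as np

    def spectral_angle(pixel, endmember):
        """Angle (radians) between a pixel spectrum and an endmember spectrum."""
        cos_t = pixel @ endmember / (np.linalg.norm(pixel) * np.linalg.norm(endmember))
        return np.arccos(np.clip(cos_t, -1.0, 1.0))

    def sam_classify(cube, endmembers):
        """Label each pixel of an (H, W, B) cube with its nearest-angle endmember."""
        h, w, b = cube.shape
        flat = cube.reshape(-1, b)
        angles = np.array([[spectral_angle(p, e) for e in endmembers] for p in flat])
        return angles.argmin(axis=1).reshape(h, w)
    ```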

  3. Spectroscopic analysis and control

    DOEpatents

    Tate, James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles

    2017-04-18

    Apparatus for spectroscopic analysis which includes a tunable diode laser spectrometer having a digital output signal and a digital computer for receiving the digital output signal from the spectrometer, the digital computer programmed to process the digital output signal using a multivariate regression algorithm. In addition, a spectroscopic method of analysis using such apparatus. Finally, a method for controlling an ethylene cracker hydrogenator.
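
    The patent does not name the multivariate regression algorithm; partial least squares is one common choice for tunable-diode-laser spectra, and a scikit-learn sketch on placeholder data looks like this:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Placeholder spectra: 100 scans x 500 wavelength channels, one analyte.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 500))
    y = 2.0 * X[:, 40] + rng.normal(scale=0.1, size=100)   # synthetic relation

    model = PLSRegression(n_components=5).fit(X, y)
    concentration = model.predict(X[:1])                   # apply to a new spectrum
    ```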

  4. Preliminary sonic boom correlation of predicted and measured levels for STS-1 entry

    NASA Technical Reports Server (NTRS)

    Garcia, F., Jr.; Morrison, K. M.; Jones, J. H.; Henderson, H. R.

    1982-01-01

    A preliminary analysis correlating peaks from sonic boom pressure signatures recorded during the descent trajectory of the Orbiter Columbia, which landed in the dry lake bed at Edwards Air Force Base (EAFB), California, with measured wind tunnel signatures extrapolated from flight altitudes to the ground has been made for Mach numbers ranging from 1.3 to 6. The flight pressure signatures were recorded by microphones positioned at ground level near the groundtrack, whereas the wind tunnel signatures were measured during a test of a 0.0041-scale model Orbiter. The agreement between overpressure estimates based on wind tunnel data using preliminary flight trajectory data and oscillograph traces from ground measurements appears reasonable at this time for the range of Mach numbers considered. More detailed studies using final flight trajectory data and digitized ground measured data will be performed.

  5. The terrain signatures of administrative units: a tool for environmental assessment.

    PubMed

    Miliaresis, George Ch

    2009-03-01

    The quantification of knowledge related to the terrain and the landuse/landcover of administrative units in Southern Greece (Peloponnesus) is performed using the CGIAR-CSI SRTM digital elevation model and the CORINE landuse/landcover database. Each administrative unit is parametrically represented by a set of attributes related to its relief. Administrative units are classified on the basis of K-means cluster analysis in an attempt to see how they are organized into groups, and cluster-derived geometric signatures are defined. Finally, each cluster is parametrically represented on the basis of the occurrence of the CORINE landuse/landcover classes included, and thus landcover signatures are derived. The geometric and the landuse/landcover signatures revealed a terrain-dependent landuse/landcover organization that was used in the assessment of forest fire impact at a moderate resolution scale.
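
    The clustering step maps directly onto a standard K-means call; a scikit-learn sketch on placeholder data (in the study the rows would be administrative units and the columns SRTM-derived relief attributes):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = rng.normal(size=(147, 6))               # placeholder: units x relief attributes
    X = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize before clustering
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
    ```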

  6. Investigation of FPGA-Based Real-Time Adaptive Digital Pulse Shaping for High-Count-Rate Applications

    NASA Astrophysics Data System (ADS)

    Saxena, Shefali; Hawari, Ayman I.

    2017-07-01

    Digital signal processing techniques have been widely used in radiation spectrometry to provide improved stability and performance in a compact physical size compared with traditional analog signal processing. In this paper, field-programmable gate array (FPGA)-based adaptive digital pulse shaping techniques are investigated for real-time signal processing. A National Instruments (NI) 5761 14-bit, 250-MS/s adapter module is used for digitizing the preamplifier pulses of a high-purity germanium (HPGe) detector. Digital pulse processing algorithms are implemented on the NI PXIe-7975R reconfigurable FPGA (Kintex-7) using the LabVIEW FPGA module. Based on the time separation between successive input pulses, the adaptive shaping algorithm selects the optimum shaping parameters (rise time and flattop time of the trapezoid-shaping filter) for each incoming signal. A digital Sallen-Key low-pass filter is implemented to enhance the signal-to-noise ratio and reduce baseline drift in trapezoid shaping. A recursive trapezoid-shaping filter algorithm is employed for pole-zero compensation of the exponentially decaying (two-decay-constant) preamplifier pulses of an HPGe detector. It allows extraction of pulse height information at the beginning of each pulse, thereby reducing pulse pileup and increasing throughput. The algorithms for the RC-CR2 timing filter, baseline restoration, pile-up rejection, and pulse height determination are digitally implemented for radiation spectroscopy. Traditionally, under high-count-rate conditions a shorter shaping time is preferred to achieve high throughput, which deteriorates energy resolution. In this paper, experimental results are presented for varying count-rate and pulse shaping conditions. Using adaptive shaping, increased throughput is achieved while preserving the energy resolution observed with the longer shaping times.
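
    A floating-point software model of a recursive trapezoidal shaper, in the style of the widely cited Jordanov-Knoll recursion, is sketched below; the parameter names and output normalization are assumptions, and the paper's FPGA implementation is fixed-point:

    ```python
    import numpy as np

    def trapezoid_filter(v, k, l, M):
        """Trapezoidal shaping of a sampled exponential pulse train v.
        k = rise time, l = k + flattop (samples), M ~ decay-time correction."""
        v = np.asarray(v, dtype=float)
        d = v.copy(); d[k:] -= v[:-k]            # v(n) - v(n-k)
        dkl = d.copy(); dkl[l:] -= d[:-l]        # ... - v(n-l) + v(n-k-l)
        p = np.cumsum(dkl)                       # first accumulator
        s = np.cumsum(p + M * dkl)               # second accumulator (pole-zero via M)
        return s / (M * k)                       # rough amplitude normalization
    ```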

  7. Image matching for digital close-range stereo photogrammetry based on constraints of Delaunay triangulated network and epipolar-line

    NASA Astrophysics Data System (ADS)

    Zhang, K.; Sheng, Y. H.; Li, Y. Q.; Han, B.; Liang, Ch.; Sha, W.

    2006-10-01

    In the field of digital photogrammetry and computer vision, the determination of conjugate points in a stereo image pair, referred to as "image matching," is the critical step in realizing automatic surveying and recognition. Traditional matching methods encounter problems in digital close-range stereo photogrammetry because the change of gray-scale or texture is not obvious in close-range stereo images. The main shortcoming of traditional matching methods is that the geometric information of matching points is not fully used, which can lead to wrong matching results in regions with poor texture. To fully use the geometric and gray-scale information, a new stereo image matching algorithm is proposed in this paper, considering the characteristics of digital close-range photogrammetry. Compared with traditional matching methods, the new algorithm offers three improvements on image matching. Firstly, shape factor, fuzzy mathematics and gray-scale projection are introduced into the design of a synthetic matching measure. Secondly, the topological connecting relations of matching points in the Delaunay triangulated network and on the epipolar line are used to decide the matching order and to narrow the search scope for the conjugate point of the matching point. Lastly, the theory of parameter adjustment with constraints is introduced into least squares image matching to carry out subpixel-level matching under the epipolar-line constraint. The new algorithm is applied to actual stereo images of a building taken by a digital close-range photogrammetric system. The experimental result shows that the algorithm has a higher matching speed and matching accuracy than a pyramid image matching algorithm based on gray-scale correlation.

  8. Novel Blind Recognition Algorithm of Frame Synchronization Words Based on Soft-Decision in Digital Communication Systems.

    PubMed

    Qin, Jiangyi; Huang, Zhiping; Liu, Chunwu; Su, Shaojing; Zhou, Jing

    2015-01-01

    A novel blind recognition algorithm for frame synchronization words is proposed to recognize the frame synchronization word parameters in digital communication systems. In this paper, a blind recognition method for frame synchronization words based on hard decisions is derived in detail, and the criteria for parameter recognition are given. Compared with blind recognition based on hard decisions, utilizing soft decisions can improve the accuracy of blind recognition. Therefore, combining the characteristics of the Quadrature Phase Shift Keying (QPSK) signal, an improved blind recognition algorithm based on soft decisions is proposed; the improved algorithm can also be extended to other signal modulation formats. The complete blind recognition steps of the hard-decision algorithm and the soft-decision algorithm are then given in detail. Finally, the simulation results show that both the hard-decision algorithm and the soft-decision algorithm can blindly recognize the parameters of frame synchronization words, and that the improved algorithm clearly enhances the accuracy of blind recognition.
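
    One simple hard-decision heuristic, not the paper's derivation, conveys the idea: bits belonging to a repeated sync word agree across frames, while payload bits do not, so positional agreement scores reveal the sync word's offset:

    ```python
    import numpy as np

    def find_sync(bits, frame_len, sync_len):
        """Score each offset by how consistently bits repeat across frames."""
        usable = len(bits) // frame_len * frame_len
        frames = np.reshape(np.asarray(bits[:usable]), (-1, frame_len))
        consistency = (frames == frames[0]).mean(axis=0)   # agreement with frame 0
        scores = np.convolve(consistency, np.ones(sync_len), mode="valid")
        return int(scores.argmax())                        # likely sync-word offset
    ```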

  9. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    USGS Publications Warehouse

    Gaydos, Leonard

    1978-01-01

    The cost of classifying 5,607 square kilometers (2,165 sq. mi.) in the Portland area was less than 8 cents per square kilometer ($0.0788, or $0.2041 per square mile). Besides saving costs, this and other signature extension techniques may be useful in completing land use and land cover mapping in other large areas where multispectral and multitemporal Landsat data are available in digital form but other source materials are generally lacking.

  10. Security analysis of boolean algebra based on Zhang-Wang digital signature scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Jinbin, E-mail: jbzheng518@163.com

    2014-10-06

    In 2005, Zhang and Wang proposed an improved signature scheme that does not use a one-way hash function or message redundancy. In this paper, we show through an analysis based on Boolean algebra, such as bitwise exclusive-or, that this scheme has potential security weaknesses. We also point out, by analyzing the output of an assembly program segment, that the mapping between assembly instructions and machine code is not actually one-to-one, which may cause security problems unknown to the software.

  11. Method and apparatus for digitally based high speed x-ray spectrometer

    DOEpatents

    Warburton, W.K.; Hubbard, B.

    1997-11-04

    A high speed, digitally based, signal processing system which accepts input data from a detector-preamplifier and produces a spectral analysis of the x-rays illuminating the detector. The system achieves high throughputs at low cost by dividing the required digital processing steps between a "hardwired" processor implemented in combinatorial digital logic, which detects the presence of the x-ray signals in the digitized data stream and extracts filtered estimates of their amplitudes, and a programmable digital signal processing computer, which refines the filtered amplitude estimates and bins them to produce the desired spectral analysis. One set of algorithms allows this hybrid system to match the resolution of analog systems while operating at much higher data rates. A second set of algorithms implemented in the processor allows the system to be self-calibrating as well. The same processor also handles the interface to an external control computer. 19 figs.

  12. Method and apparatus for digitally based high speed x-ray spectrometer

    DOEpatents

    Warburton, William K.; Hubbard, Bradley

    1997-01-01

    A high speed, digitally based, signal processing system which accepts input data from a detector-preamplifier and produces a spectral analysis of the x-rays illuminating the detector. The system achieves high throughputs at low cost by dividing the required digital processing steps between a "hardwired" processor implemented in combinatorial digital logic, which detects the presence of the x-ray signals in the digitized data stream and extracts filtered estimates of their amplitudes, and a programmable digital signal processing computer, which refines the filtered amplitude estimates and bins them to produce the desired spectral analysis. One set of algorithms allows this hybrid system to match the resolution of analog systems while operating at much higher data rates. A second set of algorithms implemented in the processor allows the system to be self-calibrating as well. The same processor also handles the interface to an external control computer.

  13. Direct modeling parameter signature analysis and failure mode prediction of physical systems using hybrid computer optimization

    NASA Technical Reports Server (NTRS)

    Drake, R. L.; Duvoisin, P. F.; Asthana, A.; Mather, T. W.

    1971-01-01

    High speed automated identification and design of dynamic systems, both linear and nonlinear, are discussed. Special emphasis is placed on developing hardware and techniques which are applicable to practical problems. The basic modeling experiment and new results are described. Using the improvements developed, successful identification of several systems, including a physical example as well as simulated systems, was obtained. The advantages of parameter signature analysis over signal signature analysis in go/no-go testing of operational systems were demonstrated. The feasibility of using these ideas for failure mode prediction in operating systems was also investigated. An improved digitally controlled nonlinear function generator was developed, debugged, and completely documented.

  14. Supervised Multi-Authority Scheme with Blind Signature for IoT with Attribute Based Encryption

    NASA Astrophysics Data System (ADS)

    Nissenbaum, O. V.; Ponomarov, K. Y.; Zaharov, A. A.

    2018-04-01

    This article proposes a three-party cryptographic scheme for verifying device attributes with a Supervisor and a Certification Authority (CA) for attribute-based encryption. Two options are suggested: using a message authentication code and using a digital signature. The first version is suitable for networks with one CA, and the second for networks with several CAs, including dynamic systems. The addition of a blind signature to this scheme is also proposed, to preserve the confidentiality of the device attributes from the CA. The introduction gives a definition and a brief historical overview of attribute-based encryption (ABE) and addresses the use of ABE in the Internet of Things.
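
    The message-authentication-code option can be illustrated with Python's standard library; the key and attribute string below are hypothetical:

    ```python
    import hashlib
    import hmac

    shared_key = b"key shared by device and verifier"   # hypothetical secret
    attributes = b"device_id=42;role=sensor"

    tag = hmac.new(shared_key, attributes, hashlib.sha256).hexdigest()

    # Verification recomputes the tag and compares in constant time.
    expected = hmac.new(shared_key, attributes, hashlib.sha256).hexdigest()
    assert hmac.compare_digest(tag, expected)
    ```

    Unlike a digital signature, the same key both creates and checks the tag, which is consistent with the article reserving this variant for the single-CA setting.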

  15. Clustering Educational Digital Library Usage Data: A Comparison of Latent Class Analysis and K-Means Algorithms

    ERIC Educational Resources Information Center

    Xu, Beijie; Recker, Mimi; Qi, Xiaojun; Flann, Nicholas; Ye, Lei

    2013-01-01

    This article examines clustering as an educational data mining method. In particular, two clustering algorithms, the widely used K-means and the model-based Latent Class Analysis, are compared, using usage data from an educational digital library service, the Instructional Architect (IA.usu.edu). Using a multi-faceted approach and multiple data…

  16. Comparison of rotation algorithms for digital images

    NASA Astrophysics Data System (ADS)

    Starovoitov, Valery V.; Samal, Dmitry

    1999-09-01

    The paper presents a comparative study of several algorithms developed for digital image rotation. Without loss of generality, we studied gray-scale images. We tested methods that preserve the gray values of the original images, methods that perform some interpolation, and two procedures implemented in the Corel Photo-Paint and Adobe Photoshop software packages. Methods for rotating color images may be evaluated in a similar way.
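
    The distinction between value-preserving and interpolating rotation can be seen directly with scipy; this sketch is illustrative and does not reproduce the paper's own implementations:

    ```python
    import numpy as np
    from scipy.ndimage import rotate

    img = np.random.default_rng(0).random((64, 64))

    # order=0: nearest neighbour, preserves the original gray values;
    # order=1: bilinear interpolation, introduces new intermediate values.
    nearest = rotate(img, angle=17.0, order=0, reshape=False)
    bilinear = rotate(img, angle=17.0, order=1, reshape=False)
    ```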

  17. Systematic computation with functional gene-sets among leukemic and hematopoietic stem cells reveals a favorable prognostic signature for acute myeloid leukemia.

    PubMed

    Yang, Xinan Holly; Li, Meiyi; Wang, Bin; Zhu, Wanqi; Desgardin, Aurelie; Onel, Kenan; de Jong, Jill; Chen, Jianjun; Chen, Luonan; Cunningham, John M

    2015-03-24

    Genes that regulate stem cell function are suspected to exert adverse effects on prognosis in malignancy. However, diverse cancer stem cell signatures are difficult for physicians to interpret and apply clinically. To connect the transcriptome and stem cell biology, with potential clinical applications, we propose a novel computational "gene-to-function, snapshot-to-dynamics, and biology-to-clinic" framework to uncover core functional gene-set signatures. This framework incorporates three function-centric gene-set analysis strategies: a meta-analysis of both microarray and RNA-seq data, novel dynamic network mechanism (DNM) identification, and a personalized prognostic indicator analysis. This work uses the complex disease acute myeloid leukemia (AML) as a research platform. We introduced an adjustable "soft threshold" to a functional gene-set algorithm and found that two different analysis methods identified distinct gene-set signatures from the same samples. We identified a 30-gene cluster that characterizes leukemic stem cell (LSC)-depleted cells and a 25-gene cluster that characterizes LSC-enriched cells in parallel; both mark favorable prognosis in AML. Genes within each signature significantly share common biological processes and/or molecular functions (empirical p = 6e-5 and 0.03, respectively). The 25-gene signature reflects the abnormal development of stem cells in AML, such as AURKA over-expression. We subsequently determined that the clinical relevance of both signatures is independent of known clinical risk classifications in 214 patients with cytogenetically normal AML. We successfully validated the prognosis of both signatures in two independent cohorts of 91 and 242 patients, respectively (log-rank p < 0.0015 and 0.05; empirical p < 0.015 and 0.08). The proposed algorithms and computational framework will harness systems biology research because they efficiently translate gene-sets (rather than single genes) into biological discoveries about AML and other complex diseases.

  18. Biodynamic digital holography of chemoresistance in a pre-clinical trial of canine B-cell lymphoma.

    PubMed

    Choi, Honggu; Li, Zhe; Sun, Hao; Merrill, Dan; Turek, John; Childress, Michael; Nolte, David

    2018-05-01

    Biodynamic digital holography was used to obtain phenotypic profiles of canine non-Hodgkin B-cell lymphoma biopsies treated with standard-of-care chemotherapy. Biodynamic signatures from the living 3D tissues were extracted using fluctuation spectroscopy from intracellular Doppler light scattering in response to the molecular mechanisms of action of therapeutic drugs that modify a range of internal cellular motions. The standard-of-care to treat B-cell lymphoma in both humans and dogs is a combination CHOP therapy that consists of doxorubicin, prednisolone, cyclophosphamide and vincristine. The proportion of dogs experiencing durable cancer remission following CHOP chemotherapy was 68%, with 13 out of 19 dogs responding favorably to therapy and 6 dogs failing to have progression-free survival times greater than 100 days. Biodynamic signatures were found that correlate with inferior survival times, and biomarker selection was optimized to identify specific Doppler signatures related to chemoresistance. A machine learning classifier was constructed based on feature vector correlations and linear separability in high-dimensional feature space. Hold-out validation predicted patient response to therapy with 84% accuracy. These results point to the potential for biodynamic profiling to contribute to personalized medicine by aiding the selection of chemotherapy for cancer patients.

  19. Progress in interpreting CO2 lidar signatures to obtain cirrus microphysical and optical properties

    NASA Technical Reports Server (NTRS)

    Eberhard, Wynn L.

    1993-01-01

    One cloud/radiation issue at FIRE 2 that has been addressed by the CO2 lidar team is the zenith-enhanced backscatter (ZEB) signature from oriented crystals. A second topic is narrow-beam optical depth measurements using CO2 lidar. This paper describes the theoretical models we have developed for these phenomena and the data-processing algorithms derived from them.

  20. Using a genetic algorithm as an optimal band selector in the mid and thermal infrared (2.5-14 μm) to discriminate vegetation species.

    PubMed

    Ullah, Saleem; Groen, Thomas A; Schlerf, Martin; Skidmore, Andrew K; Nieuwenhuis, Willem; Vaiphasa, Chaichoke

    2012-01-01

    Genetic variation between various plant species determines differences in their physio-chemical makeup and ultimately in their hyperspectral emissivity signatures. The hyperspectral emissivity signatures, on the one hand, account for the subtle physio-chemical changes in the vegetation, but on the other hand, highlight the problem of high dimensionality. The aim of this paper is to investigate the performance of genetic algorithms coupled with the spectral angle mapper (SAM) to identify a meaningful subset of wavebands sensitive enough to discriminate thirteen broadleaved vegetation species from laboratory-measured hyperspectral emissivities. The performance was evaluated using overall classification accuracy and the Jeffries-Matusita distance. For the multiple plant species, the targeted bands based on genetic algorithms resulted in a high overall classification accuracy (90%). Concentrating on the pairwise comparison results, the selected wavebands based on genetic algorithms resulted in higher Jeffries-Matusita (J-M) distances than randomly selected wavebands did. This study concludes that targeted wavebands from leaf emissivity spectra are able to discriminate vegetation species.

  1. E-learning platform for automated testing of electronic circuits using signature analysis method

    NASA Astrophysics Data System (ADS)

    Gherghina, Cǎtǎlina; Bacivarov, Angelica; Bacivarov, Ioan C.; Petricǎ, Gabriel

    2016-12-01

    Dependability of electronic circuits can be ensured only through testing of circuit modules. This is done by generating test vectors and applying them to the circuit. Testability should be viewed as a concerted effort to ensure maximum efficiency throughout the product life cycle, from the conception and design stage, through production, to repairs during product operation. This paper presents the platform developed by the authors for training in testability in electronics in general, and in using the signature analysis method in particular. The platform highlights the two approaches in the field, namely analog and digital circuit signatures. As part of this e-learning platform, a database has been developed of signatures of different electronic components, meant to put into the spotlight different fault-detection techniques and, building on these, self-repairing techniques for systems built from such components. An approach for realizing self-testing circuits based on the MATLAB environment and using the signature analysis method is proposed. The paper also analyses the benefits of the signature analysis method and simulates signature analyzer performance based on the use of pseudo-random sequences.
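
    A signature analyzer of the kind being simulated compresses a long response stream into a short register state with a linear-feedback shift register. A minimal serial-input sketch follows; the 16-bit maximal-length tap set used is one common choice, not necessarily the platform's:

    ```python
    def lfsr_signature(bits, taps=(16, 14, 13, 11), width=16):
        """Compress a bit stream into a width-bit signature register."""
        reg = 0
        for b in bits:
            fb = b
            for t in taps:
                fb ^= (reg >> (t - 1)) & 1      # XOR tapped register bits into feedback
            reg = ((reg << 1) | fb) & ((1 << width) - 1)
        return reg

    good = lfsr_signature([1, 0, 1, 1, 0, 0, 1, 0] * 100)
    faulty = lfsr_signature([1, 0, 1, 1, 0, 1, 1, 0] * 100)   # e.g. a stuck-at fault
    assert good != faulty                    # differing signatures flag the fault
    ```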

  2. Blind Quantum Signature with Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Guo, Ying

    2017-04-01

    Blind quantum computation allows a client without quantum abilities to interact with a quantum server to perform an unconditionally secure computing protocol while protecting the client's privacy. Motivated by the confidentiality of blind quantum computation, a blind quantum signature scheme with a concise structure is designed. Different from traditional signature schemes, the signing and verifying operations are performed through measurement-based quantum computation. Inputs of blind quantum computation are securely controlled with multi-qubit entangled states. The unique signature of the transmitted message is generated by the signer without leaking information over imperfect channels, whereas the receiver can verify the validity of the signature using the quantum matching algorithm. The security is guaranteed by the entanglement of the quantum system used for blind quantum computation. The scheme provides a potential practical application for e-commerce in cloud computing and first-generation quantum computation.

  3. Digital Data Compression Algorithm Performance Comparisons. Proposed NATO Standard Algorithm Provides Better Facsimile in a Noisy Communications Environment Than Present Tactical Digital Facsimile Algorithm.

    DTIC Science & Technology

    1981-04-30

  4. Traveling front solutions to directed diffusion-limited aggregation, digital search trees, and the Lempel-Ziv data compression algorithm.

    PubMed

    Majumdar, Satya N

    2003-08-01

    We use the traveling front approach to derive exact asymptotic results for the statistics of the number of particles in a class of directed diffusion-limited aggregation models on a Cayley tree. We point out that some aspects of these models are closely connected to two different problems in computer science, namely, the digital search tree problem in data structures and the Lempel-Ziv algorithm for data compression. The statistics of the number of particles studied here is related to the statistics of height in digital search trees which, in turn, is related to the statistics of the length of the longest word formed by the Lempel-Ziv algorithm. Implications of our results to these computer science problems are pointed out.
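
    For reference, the Lempel-Ziv (LZ78) parsing whose longest-word statistics the paper connects to digital search tree height can be sketched in a few lines:

    ```python
    def lz78_parse(s):
        """LZ78 parse: each phrase is a longest previously seen prefix plus one char."""
        dictionary, phrases, cur = {"": 0}, [], ""
        for ch in s:
            if cur + ch in dictionary:
                cur += ch                        # extend the current phrase
            else:
                dictionary[cur + ch] = len(dictionary)
                phrases.append(cur + ch)         # emit a new phrase
                cur = ""
        if cur:
            phrases.append(cur)
        return phrases

    phrases = lz78_parse("abababbbabba")
    longest_word = max(len(p) for p in phrases)  # the quantity studied in the paper
    ```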

  5. Traveling front solutions to directed diffusion-limited aggregation, digital search trees, and the Lempel-Ziv data compression algorithm

    NASA Astrophysics Data System (ADS)

    Majumdar, Satya N.

    2003-08-01

    We use the traveling front approach to derive exact asymptotic results for the statistics of the number of particles in a class of directed diffusion-limited aggregation models on a Cayley tree. We point out that some aspects of these models are closely connected to two different problems in computer science, namely, the digital search tree problem in data structures and the Lempel-Ziv algorithm for data compression. The statistics of the number of particles studied here is related to the statistics of height in digital search trees which, in turn, is related to the statistics of the length of the longest word formed by the Lempel-Ziv algorithm. Implications of our results to these computer science problems are pointed out.

  6. Two-step digit-set-restricted modified signed-digit addition-subtraction algorithm and its optoelectronic implementation.

    PubMed

    Qian, F; Li, G; Ruan, H; Jing, H; Liu, L

    1999-09-10

    A novel, to our knowledge, two-step digit-set-restricted modified signed-digit (MSD) addition-subtraction algorithm is proposed. With the introduction of the reference digits, the operand words are mapped into an intermediate carry word with all digits restricted to the set {-1, 0} and an intermediate sum word with all digits restricted to the set {0, 1}, which can be summed to form the final result without carry generation. The operation can be performed in parallel by use of binary logic. An optical system that utilizes an electron-trapping device is suggested for accomplishing the required binary logic operations. By programming of the illumination of data arrays, any complex logic operations of multiple variables can be realized without additional temporal latency of the intermediate results. This technique has a high space-bandwidth product and signal-to-noise ratio. The main structure can be stacked to construct a compact optoelectronic MSD adder-subtracter.

  7. Algorithmic psychometrics and the scalable subject.

    PubMed

    Stark, Luke

    2018-04-01

    Recent public controversies, ranging from the 2014 Facebook 'emotional contagion' study to psychographic data profiling by Cambridge Analytica in the 2016 American presidential election, Brexit referendum and elsewhere, signal watershed moments in which the intersecting trajectories of psychology and computer science have become matters of public concern. The entangled history of these two fields grounds the application of applied psychological techniques to digital technologies, and an investment in applying calculability to human subjectivity. Today, a quantifiable psychological subject position has been translated, via 'big data' sets and algorithmic analysis, into a model subject amenable to classification through digital media platforms. I term this position the 'scalable subject', arguing it has been shaped and made legible by algorithmic psychometrics - a broad set of affordances in digital platforms shaped by psychology and the behavioral sciences. In describing the contours of this 'scalable subject', this paper highlights the urgent need for renewed attention from STS scholars on the psy sciences, and on a computational politics attentive to psychology, emotional expression, and sociality via digital media.

  8. A Survey of Metal Lines at High-redshift. I. SDSS Absorption Line Studies—the Methodology and First Search Results for O VI

    NASA Astrophysics Data System (ADS)

    Frank, S.; Mathur, S.; Pieri, M.; York, D. G.

    2010-09-01

    We report the results of a systematic search for signatures of metal lines in quasar spectra of the Sloan Digital Sky Survey (SDSS) data release 3 (DR3), focusing on finding intervening absorbers via detection of their O VI doublet. Here, we present the search algorithm and criteria for distinguishing candidates from spurious Lyα forest lines. In addition, we compare our findings with simulations of the Lyα forest in order to estimate the detectability of O VI doublets over various redshift intervals. We have obtained a sample of 1756 O VI doublet candidates with rest-frame equivalent width (EW) >= 0.05 Å in 855 active galactic nuclei spectra (out of 3702 objects with redshifts in the accessible range for O VI detection). This sample is further subdivided into three groups according to the likelihood of being real and the potential for follow-up observation of the candidate. The group with the cleanest and most secure candidates comprises 145 candidates. Sixty-nine of these reside at a velocity separation >= 5000 km s^-1 from the QSO and can therefore be classified tentatively as intervening absorbers. Most of these absorbers have not been picked up by earlier, automated QSO absorption line detection algorithms. This sample increases the number of known O VI absorbers at redshifts beyond z_abs >= 2.7 substantially.
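
    The doublet search itself reduces to a wavelength-ratio test: each candidate line fixes a trial redshift, which predicts where its partner must fall. A simplified numpy sketch using the O VI rest wavelengths (the pipeline's significance and Lyα-contamination cuts are omitted):

    ```python
    import numpy as np

    O6_BLUE, O6_RED = 1031.93, 1037.62          # O VI rest wavelengths (Angstroms)

    def o6_candidates(lines, tol=0.5):
        """Pair detected absorption lines consistent with O VI at one redshift."""
        lines = np.asarray(lines)
        pairs = []
        for w_blue in lines:
            z = w_blue / O6_BLUE - 1.0          # trial absorber redshift
            w_red_expected = O6_RED * (1.0 + z)
            for w_red in lines[np.abs(lines - w_red_expected) < tol]:
                pairs.append((w_blue, w_red, z))
        return pairs
    ```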

  9. Identification of Single- and Multiple-Class Specific Signature Genes from Gene Expression Profiles by Group Marker Index

    PubMed Central

    Tsai, Yu-Shuen; Aguan, Kripamoy; Pal, Nikhil R.; Chung, I-Fang

    2011-01-01

    Informative genes from microarray data can be used to construct prediction model and investigate biological mechanisms. Differentially expressed genes, the main targets of most gene selection methods, can be classified as single- and multiple-class specific signature genes. Here, we present a novel gene selection algorithm based on a Group Marker Index (GMI), which is intuitive, of low-computational complexity, and efficient in identification of both types of genes. Most gene selection methods identify only single-class specific signature genes and cannot identify multiple-class specific signature genes easily. Our algorithm can detect de novo certain conditions of multiple-class specificity of a gene and makes use of a novel non-parametric indicator to assess the discrimination ability between classes. Our method is effective even when the sample size is small as well as when the class sizes are significantly different. To compare the effectiveness and robustness we formulate an intuitive template-based method and use four well-known datasets. We demonstrate that our algorithm outperforms the template-based method in difficult cases with unbalanced distribution. Moreover, the multiple-class specific genes are good biomarkers and play important roles in biological pathways. Our literature survey supports that the proposed method identifies unique multiple-class specific marker genes (not reported earlier to be related to cancer) in the Central Nervous System data. It also discovers unique biomarkers indicating the intrinsic difference between subtypes of lung cancer. We also associate the pathway information with the multiple-class specific signature genes and cross-reference to published studies. We find that the identified genes participate in the pathways directly involved in cancer development in leukemia data. Our method gives a promising way to find genes that can involve in pathways of multiple diseases and hence opens up the possibility of using an existing drug on other diseases as well as designing a single drug for multiple diseases. PMID:21909426

  10. Expecting the Unexpected: Towards Robust Credential Infrastructure

    NASA Astrophysics Data System (ADS)

    Xu, Shouhuai; Yung, Moti

    Cryptographic credential infrastructures, such as public key infrastructure (PKI), allow the building of trust relationships in electronic society and electronic commerce. At the center of credential infrastructures is the methodology of digital signatures. However, methods that assure that credentials and signed messages possess trustworthiness and longevity are not well understood, nor are they adequately addressed in either the literature or practice. We believe that, as a basic engineering principle, these properties have to be built into the credential infrastructure rather than be treated as an afterthought, since they are crucial to the long-term success of this notion. In this paper we present a step in the direction of dealing with these issues. Specifically, we present the basic engineering reasoning as well as a model that helps understand (somewhat formally) the trustworthiness and longevity of digital signatures, and then we give basic mechanisms that help improve these notions.

  11. Method for exponentiating in cryptographic systems

    DOEpatents

    Brickell, Ernest F.; Gordon, Daniel M.; McCurley, Kevin S.

    1994-01-01

    An improved cryptographic method utilizing exponentiation is provided which has the advantage of reducing the number of multiplications required to determine the legitimacy of a message or user. The basic method comprises the steps of selecting a key from a preapproved group of integer keys g; exponentiating the key by an integer value e, where e represents a digital signature, to generate a value g^e; transmitting the value g^e to a remote facility by a communications network; receiving the value g^e at the remote facility; and verifying the digital signature as originating from the legitimate user. The exponentiating step comprises the steps of initializing a plurality of memory locations with a plurality of values g^(x_i); computing ... The United States Government has rights in this invention pursuant to Contract No. DE-AC04-76DP00789 between the Department of Energy and AT&T Company.
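
    The multiplication-saving idea rests on precomputing powers of the fixed base once, so that each later exponentiation needs only multiplications. A toy sketch of the simplest bit-wise variant (the patented method uses a more refined decomposition of the exponent):

    ```python
    def precompute_powers(g, p, nbits):
        """Table of g^(2^i) mod p, built once for the fixed base g."""
        table, x = [], g % p
        for _ in range(nbits):
            table.append(x)
            x = x * x % p
        return table

    def fixed_base_pow(table, e, p):
        """g^e mod p using only multiplications against the precomputed table."""
        acc, i = 1, 0
        while e:
            if e & 1:
                acc = acc * table[i] % p
            e >>= 1
            i += 1
        return acc

    p = 2**61 - 1
    table = precompute_powers(5, p, 64)
    assert fixed_base_pow(table, 123456789, p) == pow(5, 123456789, p)
    ```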

  12. Digital asset management.

    PubMed

    Humphrey, Clinton D; Tollefson, Travis T; Kriet, J David

    2010-05-01

    Facial plastic surgeons are accumulating massive digital image databases with the evolution of photodocumentation and widespread adoption of digital photography. Managing and maximizing the utility of these vast data repositories, or digital asset management (DAM), is a persistent challenge. Developing a DAM workflow that incorporates a file naming algorithm and metadata assignment will increase the utility of a surgeon's digital images. Copyright 2010 Elsevier Inc. All rights reserved.
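
    A file naming algorithm of the kind recommended can be as simple as a deterministic template; this is a hypothetical sketch, with invented fields rather than the authors' convention:

    ```python
    from datetime import date

    def asset_name(patient_id, view, seq, ext="jpg"):
        """Build a sortable, self-describing filename for a clinical photo."""
        return f"{date.today():%Y%m%d}_{patient_id}_{view}_{seq:03d}.{ext}"

    # e.g. asset_name("P0042", "lateral-right", 7)
    # -> "20250101_P0042_lateral-right_007.jpg" (date portion varies)
    ```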

  13. Content-based video retrieval by example video clip

    NASA Astrophysics Data System (ADS)

    Dimitrova, Nevenka; Abdel-Mottaleb, Mohamed

    1997-01-01

    This paper presents a novel approach for video retrieval from a large archive of MPEG or Motion JPEG compressed video clips. We introduce a retrieval algorithm that takes a video clip as a query and searches the database for clips with similar contents. Video clips are characterized by a sequence of representative frame signatures, which are constructed from DC coefficients and motion information ("DC+M" signatures). The similarity between two video clips is determined by using their respective signatures. This method facilitates retrieval of clips for the purpose of video editing, broadcast news retrieval, or copyright violation detection.

  14. Algorithms for extraction of structural attitudes from 3D outcrop models

    NASA Astrophysics Data System (ADS)

    Duelis Viana, Camila; Endlein, Arthur; Ademar da Cruz Campanha, Ginaldo; Henrique Grohmann, Carlos

    2016-05-01

    The acquisition of geological attitudes on rock cuts using traditional field compass survey can be a time-consuming, dangerous, or even impossible task depending on the conditions and location of outcrops. The importance of this type of data in rock-mass classifications and structural geology has led to the development of new techniques, in which the application of photogrammetric 3D digital models has had increasing use. In this paper we present two algorithms for extraction of attitudes of geological discontinuities from virtual outcrop models: ply2atti and scanline, implemented in the Python programming language. The ply2atti algorithm allows for the virtual sampling of planar discontinuities appearing on the 3D model as individual exposed surfaces, while the scanline algorithm allows the sampling of discontinuities (surfaces and traces) along a virtual scanline. Application to digital models of a simplified test setup and a rock cut demonstrated a good correlation between surveys undertaken using traditional field compass reading and virtual sampling on 3D digital models.
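
    As a rough illustration of how an attitude can be extracted from virtually sampled points, the sketch below fits a plane to a patch of 3D points by SVD and converts the upward normal into dip direction and dip. This is a generic least-squares reconstruction under an assumed east/north/up axis convention, not the published ply2atti or scanline code.

      import numpy as np

      def plane_attitude(points):
          """Fit a plane to an N x 3 array of points (x=east, y=north, z=up)
          and return (dip_direction, dip) in degrees."""
          centered = points - points.mean(axis=0)
          _, _, vt = np.linalg.svd(centered, full_matrices=False)
          n = vt[-1]                      # normal = smallest singular vector
          if n[2] < 0:                    # orient the normal upward
              n = -n
          dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
          dip_direction = np.degrees(np.arctan2(n[0], n[1])) % 360.0
          return dip_direction, dip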

  15. Digital Terrain from a Two-Step Segmentation and Outlier-Based Algorithm

    NASA Astrophysics Data System (ADS)

    Hingee, Kassel; Caccetta, Peter; Caccetta, Louis; Wu, Xiaoliang; Devereaux, Drew

    2016-06-01

    We present a novel ground filter for remotely sensed height data. Our filter has two phases: the first phase segments the digital surface model (DSM) with a slope threshold and uses gradient direction to identify candidate ground segments; the second phase fits surfaces to the candidate ground points and removes outliers. Digital terrain is obtained by a surface fit to the final set of ground points. We tested the new algorithm on DSMs for a 9600 km² region around Perth, Australia. This region contains a large mix of land uses (urban, grassland, native forest and plantation forest) and includes both a sandy coastal plain and a hillier region (elevations up to 0.5 km). The DSMs are captured annually at 0.2 m resolution using aerial stereo photography, resulting in 1.2 TB of input data per annum. Overall accuracy of the filter was estimated to be 89.6%, and on a small semi-rural subset our algorithm was found to have 40% fewer errors compared to Inpho's Match-T algorithm.

  16. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  17. An FPGA Noise Resistant Digital Temperature Sensor with Auto Calibration

    DTIC Science & Technology

    2012-03-01

    Two digital temperature sensor placement algorithms are compared: (a) grid placement and (b) optimal placement [7]. Grid placement creates a grid of sensors over the FPGA; while this method works reasonably well, it requires many sensors, some of which are unnecessary. The optimal placement avoids these redundant sensors.

  18. Interactive Digital Signal Processor

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1985-01-01

    Interactive Digital Signal Processor, IDSP, consists of set of time series analysis "operators" based on various algorithms commonly used for digital signal analysis. Processing of digital signal time series to extract information usually achieved by application of number of fairly standard operations. IDSP excellent teaching tool for demonstrating application of time series operators to artificially generated signals.

  19. Performance Analysis of Blind Subspace-Based Signature Estimation Algorithms for DS-CDMA Systems with Unknown Correlated Noise

    NASA Astrophysics Data System (ADS)

    Zarifi, Keyvan; Gershman, Alex B.

    2006-12-01

    We analyze the performance of two popular blind subspace-based signature waveform estimation techniques proposed by Wang and Poor and Buzzi and Poor for direct-sequence code division multiple-access (DS-CDMA) systems with unknown correlated noise. Using the first-order perturbation theory, analytical expressions for the mean-square error (MSE) of these algorithms are derived. We also obtain simple high SNR approximations of the MSE expressions which explicitly clarify how the performance of these techniques depends on the environmental parameters and how it is related to that of the conventional techniques that are based on the standard white noise assumption. Numerical examples further verify the consistency of the obtained analytical results with simulation results.

  20. Real-time demonstration hardware for enhanced DPCM video compression algorithm

    NASA Technical Reports Server (NTRS)

    Bizon, Thomas P.; Whyte, Wayne A., Jr.; Marcopoli, Vincent R.

    1992-01-01

    The lack of available wideband digital links as well as the complexity of implementation of bandwidth efficient digital video CODECs (encoder/decoder) has worked to keep the cost of digital television transmission too high to compete with analog methods. Terrestrial and satellite video service providers, however, are now recognizing the potential gains that digital video compression offers and are proposing to incorporate compression systems to increase the number of available program channels. NASA is similarly recognizing the benefits of and trend toward digital video compression techniques for transmission of high quality video from space and, therefore, has developed a digital television bandwidth compression algorithm to process standard National Television Systems Committee (NTSC) composite color television signals. The algorithm is based on differential pulse code modulation (DPCM), but additionally utilizes a non-adaptive predictor, non-uniform quantizer and multilevel Huffman coder to reduce the data rate substantially below that achievable with straight DPCM. The non-adaptive predictor and multilevel Huffman coder combine to set this technique apart from other DPCM encoding algorithms. All processing is done on an intra-field basis to prevent motion degradation and minimize hardware complexity. Computer simulations have shown the algorithm will produce broadcast quality reconstructed video at an average transmission rate of 1.8 bits/pixel. Hardware implementation of the DPCM circuit, non-adaptive predictor and non-uniform quantizer has been completed, providing real-time demonstration of the image quality at full video rates. Video sampling/reconstruction circuits have also been constructed to accomplish the analog video processing necessary for the real-time demonstration. Performance results for the completed hardware compare favorably with simulation results. Hardware implementation of the multilevel Huffman encoder/decoder is currently under development along with implementation of a buffer control algorithm to accommodate the variable data rate output of the multilevel Huffman encoder. A video CODEC of this type could be used to compress NTSC color television signals where high quality reconstruction is desirable (e.g., Space Station video transmission, transmission direct-to-the-home via direct broadcast satellite systems or cable television distribution to system headends and direct-to-the-home).
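
    A stripped-down sketch of the DPCM core described above: a fixed previous-pixel predictor with a non-uniform quantizer, whose level indices would then feed the multilevel Huffman coder. The reconstruction levels below are illustrative stand-ins, not the levels used in the NASA hardware.

      import numpy as np

      LEVELS = np.array([-48, -24, -10, -3, 0, 3, 10, 24, 48], dtype=float)

      def dpcm_encode(line):
          """Quantize prediction errors along one scan line; returns level indices."""
          pred, codes = 128.0, []
          for x in line.astype(float):
              idx = int(np.argmin(np.abs(LEVELS - (x - pred))))
              codes.append(idx)
              # Track the decoder's reconstruction so predictions stay in sync.
              pred = float(np.clip(pred + LEVELS[idx], 0, 255))
          return codes

      def dpcm_decode(codes):
          pred, out = 128.0, []
          for idx in codes:
              pred = float(np.clip(pred + LEVELS[idx], 0, 255))
              out.append(pred)
          return np.array(out)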

  1. Digital sorting of complex tissues for cell type-specific gene expression profiles.

    PubMed

    Zhong, Yi; Wan, Ying-Wooi; Pang, Kaifang; Chow, Lionel M L; Liu, Zhandong

    2013-03-07

    Cellular heterogeneity is present in almost all gene expression profiles. However, transcriptome analysis of tissue specimens often ignores the cellular heterogeneity present in these samples. Standard deconvolution algorithms require prior knowledge of the cell type frequencies within a tissue or their in vitro expression profiles. Furthermore, these algorithms tend to report biased estimations. Here, we describe a Digital Sorting Algorithm (DSA) for extracting cell-type specific gene expression profiles from mixed tissue samples that is unbiased and does not require prior knowledge of cell type frequencies. The results suggest that DSA is a specific and sensitive algorithm for gene expression profile deconvolution and will be useful in studying individual cell types of complex tissues.
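
    In the spirit of the deconvolution problem DSA solves, the sketch below estimates cell-type fractions from marker genes and then recovers per-type profiles by least squares. The marker-based normalization here is a simplifying assumption for illustration, not the published estimator.

      import numpy as np

      def digital_sort(M, markers):
          """M: genes x samples mixed expression matrix; markers[k]: row indices
          of genes expressed only in cell type k. Returns (fractions, profiles)."""
          F = np.vstack([M[idx].mean(axis=0) for idx in markers])  # types x samples
          F = F / F.sum(axis=0, keepdims=True)      # make columns sum to one
          # Solve M ~ P @ F for per-type expression profiles P (genes x types).
          X, *_ = np.linalg.lstsq(F.T, M.T, rcond=None)
          return F, X.T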

  2. Generation of high-dynamic range image from digital photo

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Potemin, Igor S.; Zhdanov, Dmitry D.; Wang, Xu-yang; Cheng, Han

    2016-10-01

    A number of modern applications, such as medical imaging, remote sensing satellite imaging, and virtual prototyping, use High Dynamic Range Images (HDRI). Generally, to obtain an HDRI from an ordinary digital image, the camera must be calibrated. This article proposes a camera calibration method that uses the clear sky as a standard light source, taking sky luminance from the CIE sky model for the corresponding geographical coordinates and time. The article presents basic algorithms for recovering real luminance values from an ordinary digital image, together with their software implementation, and illustrates them with examples of HDRI reconstructed from ordinary images.

  3. The design of digital-adaptive controllers for VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Stengel, R. F.; Broussard, J. R.; Berry, P. W.

    1976-01-01

    Design procedures for VTOL automatic control systems have been developed and are presented. Using linear-optimal estimation and control techniques as a starting point, digital-adaptive control laws have been designed for the VALT Research Aircraft, a tandem-rotor helicopter which is equipped for fully automatic flight in terminal area operations. These control laws are designed to interface with velocity-command and attitude-command guidance logic, which could be used in short-haul VTOL operations. Developments reported here include new algorithms for designing non-zero-set-point digital regulators, design procedures for rate-limited systems, and algorithms for dynamic control trim setting.

  4. Novel Perspectives on the Characterization of Species-Dependent Optical Signatures of Bacterial Colonies by Digital Holography.

    PubMed

    Buzalewicz, Igor; Kujawińska, Małgorzata; Krauze, Wojciech; Podbielska, Halina

    2016-01-01

    The use of light diffraction for the microbiological diagnosis of bacterial colonies was a significant breakthrough with widespread implications for the food industry and clinical practice. We previously confirmed that optical sensors for bacterial colony light diffraction can be used for bacterial identification. This paper is focused on the novel perspectives of this method based on digital in-line holography (DIH), which is able to reconstruct the amplitude and phase properties of examined objects, as well as the amplitude and phase patterns of the optical field scattered/diffracted by the bacterial colony in any chosen observation plane behind the object, from a single digital hologram. Analysis of the amplitude and phase patterns inside a colony revealed its unique optical properties, which are associated with the internal structure and geometry of the bacterial colony. Moreover, on a computational level, it is possible to select the desired scattered/diffracted pattern within the entire observation volume that exhibits the largest amount of unique, differentiating bacterial features. These properties distinguish this method from the already proposed sensing techniques based on light diffraction/scattering of bacterial colonies. The reconstructed diffraction patterns have a spatial distribution similar to that of the recorded Fresnel patterns, previously applied for bacterial identification with over 98% accuracy, but they are characterized by both intensity and phase distributions. Our results using digital holography provide new optical discriminators of bacterial species revealed in a single step in the form of new optical signatures of bacterial colonies: digital holograms, reconstructed amplitude and phase patterns, as well as diffraction patterns from the entire observation space, which exhibit species-dependent features. To the best of our knowledge, this is the first report on bacterial colony analysis via digital holography, and our study represents an innovative approach to the subject.

  5. Novel Perspectives on the Characterization of Species-Dependent Optical Signatures of Bacterial Colonies by Digital Holography

    PubMed Central

    Buzalewicz, Igor; Kujawińska, Małgorzata; Krauze, Wojciech; Podbielska, Halina

    2016-01-01

    The use of light diffraction for the microbiological diagnosis of bacterial colonies was a significant breakthrough with widespread implications for the food industry and clinical practice. We previously confirmed that optical sensors for bacterial colony light diffraction can be used for bacterial identification. This paper is focused on the novel perspectives of this method based on digital in-line holography (DIH), which is able to reconstruct the amplitude and phase properties of examined objects, as well as the amplitude and phase patterns of the optical field scattered/diffracted by the bacterial colony in any chosen observation plane behind the object, from a single digital hologram. Analysis of the amplitude and phase patterns inside a colony revealed its unique optical properties, which are associated with the internal structure and geometry of the bacterial colony. Moreover, on a computational level, it is possible to select the desired scattered/diffracted pattern within the entire observation volume that exhibits the largest amount of unique, differentiating bacterial features. These properties distinguish this method from the already proposed sensing techniques based on light diffraction/scattering of bacterial colonies. The reconstructed diffraction patterns have a spatial distribution similar to that of the recorded Fresnel patterns, previously applied for bacterial identification with over 98% accuracy, but they are characterized by both intensity and phase distributions. Our results using digital holography provide new optical discriminators of bacterial species revealed in a single step in the form of new optical signatures of bacterial colonies: digital holograms, reconstructed amplitude and phase patterns, as well as diffraction patterns from the entire observation space, which exhibit species-dependent features. To the best of our knowledge, this is the first report on bacterial colony analysis via digital holography, and our study represents an innovative approach to the subject. PMID:26943121

  6. Optical spectral signatures of liquids by means of fiber optic technology for product and quality parameter identification

    NASA Astrophysics Data System (ADS)

    Mignani, A. G.; Ciaccheri, L.; Mencaglia, A. A.; Diaz-Herrera, N.; Garcia-Allende, P. B.; Ottevaere, H.; Thienpont, H.; Attilio, C.; Cimato, A.; Francalanci, S.; Paccagnini, A.; Pavone, F. S.

    2009-01-01

    Absorption spectroscopy in the wide 200-1700 nm spectral range is carried out by means of optical fiber instrumentation to achieve a digital mapping of liquids for the prediction of important quality parameters. Extra virgin olive oils from Italy and lubricant oils from turbines with different degrees of degradation were considered as "case studies". The spectral data were processed by means of multivariate analysis so as to obtain a correlation to quality parameters. In practice, the wide range absorption spectra were considered as an optical signature of the liquids from which to extract product quality information. The optical signatures of extra virgin olive oils were used to predict the content of the most important fatty acids. The optical signatures of lubricant oils were used to predict the concentration of the most important parameters for indicating the oil's degree of degradation, such as TAN, JOAP anti-wear index, and water content.
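
    One common multivariate route from wide-range spectra to quality parameters is partial least squares regression. The record does not specify its regression model, so PLS, the data shapes, and the component count in this minimal sketch (with synthetic stand-in data) are all assumptions.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 1500))      # 60 oil samples x 1500 wavelengths
      y = 0.8 * X[:, 400] + rng.normal(scale=0.1, size=60)  # synthetic fatty-acid level

      pls = PLSRegression(n_components=8)
      y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
      print("cross-validated correlation:", np.corrcoef(y, y_cv)[0, 1])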

  7. The FBI compression standard for digitized fingerprint images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brislawn, C.M.; Bradley, J.N.; Onyshczak, R.J.

    1996-10-01

    The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.

  8. FBI compression standard for digitized fingerprint images

    NASA Astrophysics Data System (ADS)

    Brislawn, Christopher M.; Bradley, Jonathan N.; Onyshczak, Remigius J.; Hopper, Thomas

    1996-11-01

    The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
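
    A toy round-trip showing the wavelet/scalar-quantization principle; this is emphatically not the FBI WSQ codec: the wavelet, decomposition depth, and step sizes below are illustrative, and the standard's bit allocation and entropy coding are omitted.

      import numpy as np
      import pywt

      def wsq_like_roundtrip(image, base_step=8.0):
          """Decompose, uniformly quantize each subband, and reconstruct."""
          coeffs = pywt.wavedec2(image.astype(float), "bior4.4", level=3)
          quantized = [np.round(coeffs[0] / base_step) * base_step]
          for level, (h, v, d) in enumerate(coeffs[1:]):
              step = base_step * 2 ** level       # coarser steps at finer scales
              quantized.append(tuple(np.round(c / step) * step for c in (h, v, d)))
          return pywt.waverec2(quantized, "bior4.4")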

  9. A Nonlinear Digital Control Solution for a DC/DC Power Converter

    NASA Technical Reports Server (NTRS)

    Zhu, Minshao

    2002-01-01

    A digital Nonlinear Proportional-Integral-Derivative (NPID) control algorithm was proposed to control a 1-kW PWM DC/DC switching power converter. The NPID methodology is introduced and a practical hardware control solution is obtained. The design of the controller was completed using Matlab (trademark) Simulink, while the hardware-in-the-loop testing was performed using both the dSPACE (trademark) rapid prototyping system and a stand-alone Texas Instruments (trademark) Digital Signal Processor (DSP)-based system. The final nonlinear digital control algorithm was implemented and tested using the ED408043-1 Westinghouse DC-DC switching power converter. The NPID test results are discussed and compared to the results of a standard Proportional-Integral (PI) controller.
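
    A minimal discrete-time sketch of one common nonlinear-PID construction, in which the proportional action is shaped by a power-law error gain. The gain values, exponent, and sample time are illustrative assumptions, not the tuning used on the Westinghouse converter.

      def fal(e, alpha, delta):
          """Nonlinear error gain: linear near zero, |e|^alpha further out."""
          if abs(e) <= delta:
              return e / (delta ** (1.0 - alpha))
          return (abs(e) ** alpha) * (1.0 if e > 0 else -1.0)

      class NPID:
          def __init__(self, kp, ki, kd, alpha=0.5, delta=0.05, dt=1e-4):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.alpha, self.delta, self.dt = alpha, delta, dt
              self.integral, self.prev_err = 0.0, 0.0

          def update(self, setpoint, measured):
              err = setpoint - measured
              self.integral += err * self.dt
              deriv = (err - self.prev_err) / self.dt
              self.prev_err = err
              return (self.kp * fal(err, self.alpha, self.delta)
                      + self.ki * self.integral + self.kd * deriv)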

  10. 78 FR 47785 - Notice of Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... Standards and Technology (NIST) Federal Information Processing Standard (FIPS) 201: Personal Identity...), address, employment history, biometric identifiers (e.g. fingerprints), signature, digital photograph... use of other forms of information technology. Comments submitted in response to this notice will be...

  11. 78 FR 47784 - Notice of Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... Standards and Technology (NIST) Federal Information Processing Standard (FIPS) 201: Personal Identity...), address, employment history, biometric identifiers (e.g. fingerprints), signature, digital photograph... collection techniques or the use of other forms of information technology. Comments submitted in response to...

  12. Repetitive element signature-based visualization, distance computation, and classification of 1766 microbial genomes.

    PubMed

    Lee, Kang-Hoon; Shin, Kyung-Seop; Lim, Debora; Kim, Woo-Chan; Chung, Byung Chang; Han, Gyu-Bum; Roh, Jeongkyu; Cho, Dong-Ho; Cho, Kiho

    2015-07-01

    The genomes of living organisms are populated with pleomorphic repetitive elements (REs) of varying densities. Our hypothesis that genomic RE landscapes are species/strain/individual-specific was implemented into the Genome Signature Imaging system to visualize and compute the RE-based signatures of any genome. Following the occurrence profiling of 5-nucleotide REs/words, the information from top-50 frequency words was transformed into a genome-specific signature and visualized as Genome Signature Images (GSIs), using a CMYK scheme. An algorithm for computing distances among GSIs was formulated using the GSIs' variables (word identity, frequency, and frequency order). The utility of the GSI-distance computation system was demonstrated with control genomes. GSI-based computation of genome-relatedness among 1766 microbes (117 archaea and 1649 bacteria) identified their clustering patterns; although the majority paralleled the established classification, some did not. The Genome Signature Imaging system, with its visualization and distance computation functions, enables genome-scale evolutionary studies involving numerous genomes with varying sizes. Copyright © 2015 Elsevier Inc. All rights reserved.
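
    The occurrence-profiling step translates directly into a few lines of Python: count all 5-nucleotide words and keep the top 50 by frequency. This sketch covers only that stage; the CMYK image rendering and the GSI distance computation are not reproduced here.

      from collections import Counter

      def word_signature(genome, k=5, top=50):
          """Return the `top` most frequent k-nucleotide words with their counts."""
          genome = genome.upper()
          counts = Counter(
              genome[i:i + k]
              for i in range(len(genome) - k + 1)
              if set(genome[i:i + k]) <= set("ACGT")   # skip ambiguous bases
          )
          return counts.most_common(top)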

  13. Color constancy by characterization of illumination chromaticity

    NASA Astrophysics Data System (ADS)

    Nikkanen, Jarno T.

    2011-05-01

    Computational color constancy algorithms play a key role in achieving desired color reproduction in digital cameras. Failure to estimate illumination chromaticity correctly results in an unwanted overall color cast in the image that is easily detected by human observers. A new algorithm is presented for computational color constancy. Low computational complexity and low memory requirements make the algorithm suitable for resource-limited camera devices, such as consumer digital cameras and camera phones. Operation of the algorithm relies on characterization of the range of possible illumination chromaticities in terms of camera sensor response. The fact that only illumination chromaticity is characterized, instead of the full color gamut, for example, increases robustness against variations in sensor characteristics and against failure of the diagonal model of illumination change. Multiple databases are used in order to demonstrate the good performance of the algorithm in comparison with state-of-the-art color constancy algorithms.

  14. EROS Data Center Landsat digital enhancement techniques and imagery availability

    USGS Publications Warehouse

    Rohde, Wayne G.; Lo, Jinn Kai; Pohl, Russell A.

    1978-01-01

    The US Geological Survey's EROS Data Center (EDC) is experimenting with the production of digitally enhanced Landsat imagery. Advanced digital image processing techniques are used to perform geometric and radiometric corrections and to perform contrast and edge enhancements. The enhanced image product is produced from digitally preprocessed Landsat computer compatible tapes (CCTs) on a laser beam film recording system. Landsat CCT data have several geometric distortions which are corrected when NASA produces the standard film products. When producing film images from CCTs, geometric correction of the data is required. The EDC Digital Image Enhancement System (EDIES) compensates for geometric distortions introduced by Earth's rotation, variable line length, non-uniform mirror scan velocity, and detector misregistration. Radiometric anomalies such as bad data lines and striping are common to many Landsat film products and are also in the CCT data. Bad data lines or line segments with more than 150 contiguous bad pixels are corrected by inserting data from the previous line in place of the bad data. Striping, caused by variations in detector gain and offset, is removed with a destriping algorithm applied after digitally enhancing the data. Image enhancement is performed by applying a linear contrast stretch and an edge enhancement algorithm. The linear contrast enhancement algorithm is designed to expand digitally the full range of useful data recorded on the CCT over the range of 256 digital counts. This minimizes the effect of atmospheric scattering and saturates the relative brightness of highly reflecting features such as clouds or snow. It is the intent that no meaningful terrain data are eliminated by the digital processing. The edge enhancement algorithm is designed to enhance boundaries between terrain features that exhibit subtle differences in brightness values along edges of features. After the digital data have been processed, data for each Landsat band are recorded on black-and-white film with a laser beam film recorder (LBR). The LBR corrects for aspect ratio distortions as the digital data are recorded on the recording film over a preselected density range. Positive transparencies of MSS bands 4, 5, and 7 produced by the LBR are used to make color composite transparencies. Color film positives are made photographically from first generation black-and-white products generated on the LBR.
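
    The linear contrast stretch at the heart of EDIES reduces to a simple rescaling. A generic sketch follows; the percentile cut-offs are illustrative assumptions, since the actual system chooses the useful data range from the CCT histogram.

      import numpy as np

      def linear_stretch(band, low_pct=1.0, high_pct=99.0):
          """Expand a band's useful range over the full 256 digital counts,
          saturating the extreme tails (e.g., clouds and snow)."""
          lo, hi = np.percentile(band, [low_pct, high_pct])
          stretched = (band.astype(float) - lo) * 255.0 / max(hi - lo, 1e-9)
          return np.clip(stretched, 0, 255).astype(np.uint8)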

  15. Automatic focusing in digital holography and its application to stretched holograms.

    PubMed

    Memmolo, P; Distante, C; Paturzo, M; Finizio, A; Ferraro, P; Javidi, B

    2011-05-15

    The searching and recovering of the correct reconstruction distance in digital holography (DH) can be a cumbersome and subjective procedure. Here we report on an algorithm for automatically estimating the in-focus image and recovering the correct reconstruction distance for speckle holograms. We have tested the approach in determining the reconstruction distances of stretched digital holograms. Stretching a hologram with a variable elongation parameter makes it possible to change the in-focus distance of the reconstructed image. In this way, the proposed algorithm can be verified at different distances while dispensing with the recording of different holograms. Experimental results are shown with the aim of demonstrating the usefulness of the proposed method, and a comparative analysis has been performed with respect to other existing algorithms developed for DH. © 2011 Optical Society of America
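
    The search itself is a one-dimensional sweep: reconstruct at candidate distances and keep the distance that maximizes a sharpness metric. In the sketch below, `reconstruct` stands in for any numerical propagation routine (e.g., angular spectrum) and the gradient-energy metric is one of several reasonable choices, not necessarily the paper's.

      import numpy as np

      def sharpness(amplitude):
          """Gradient-energy focus metric; variance or Tamura coefficient also work."""
          gy, gx = np.gradient(amplitude)
          return float(np.mean(gx ** 2 + gy ** 2))

      def autofocus(hologram, reconstruct, distances):
          """Return the candidate reconstruction distance with the sharpest image."""
          scores = [sharpness(np.abs(reconstruct(hologram, d))) for d in distances]
          return distances[int(np.argmax(scores))]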

  16. Hyperspectral target detection analysis of a cluttered scene from a virtual airborne sensor platform using MuSES

    NASA Astrophysics Data System (ADS)

    Packard, Corey D.; Viola, Timothy S.; Klein, Mark D.

    2017-10-01

    The ability to predict spectral electro-optical (EO) signatures for various targets against realistic, cluttered backgrounds is paramount for rigorous signature evaluation. Knowledge of background and target signatures, including plumes, is essential for a variety of scientific and defense-related applications including contrast analysis, camouflage development, automatic target recognition (ATR) algorithm development and scene material classification. The capability to simulate any desired mission scenario with forecast or historical weather is a tremendous asset for defense agencies, serving as a complement to (or substitute for) target and background signature measurement campaigns. In this paper, a systematic process for the physical temperature and visible-through-infrared radiance prediction of several diverse targets in a cluttered natural environment scene is presented. The ability of a virtual airborne sensor platform to detect and differentiate targets from a cluttered background, from a variety of sensor perspectives and across numerous wavelengths in differing atmospheric conditions, is considered. The process described utilizes the thermal and radiance simulation software MuSES and provides a repeatable, accurate approach for analyzing wavelength-dependent background and target (including plume) signatures in multiple band-integrated wavebands (multispectral) or hyperspectrally. The engineering workflow required to combine 3D geometric descriptions, thermal material properties, natural weather boundary conditions, all modes of heat transfer and spectral surface properties is summarized. This procedure includes geometric scene creation, material and optical property attribution, and transient physical temperature prediction. Radiance renderings, based on ray-tracing and the Sandford-Robertson BRDF model, are coupled with MODTRAN for the inclusion of atmospheric effects. This virtual hyperspectral/multispectral radiance prediction methodology has been extensively validated and provides a flexible process for signature evaluation and algorithm development.

  17. Authentication Based on Pole-zero Models of Signature Velocity

    PubMed Central

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-01-01

    With the increase of communication and financial transactions through the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authenticity and authorization in modernized society. Therefore, fast and precise algorithms for signature verification are very attractive. The goal of this paper is to model the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested on a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004 and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62% and 3.91%, respectively, on skilled forgeries. PMID:24696797

  18. Examining live cell cultures during apoptosis by digital holographic phase imaging and Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Khmaladze, Alexander

    2017-11-01

    Cellular apoptosis is a unique, organized series of events, leading to programmed cell death. In this work, we present a combined digital holography/Raman spectroscopy technique to study live cell cultures during apoptosis. Digital holographic microscopy measurements of live cell cultures yield information about cell shape and volume, changes to which are indicative of alterations in cell cycle and initiation of cell death mechanisms. Raman spectroscopic measurements provide complementary information about cells, such as protein, lipid and nucleic acid content, and the spectral signatures associated with structural changes in molecules. Our work indicates that the chemical changes in proteins, which were detected by Raman measurements, preceded morphological changes, which were seen with digital holographic microscopy.

  19. SMV⊥: Simplex of maximal volume based upon the Gram-Schmidt process

    NASA Astrophysics Data System (ADS)

    Salazar-Vazquez, Jairo; Mendez-Vazquez, Andres

    2015-10-01

    In recent years, different algorithms for Hyperspectral Image (HI) analysis have been introduced. The high spectral resolution of these images allows the development of algorithms for target detection, material mapping, and material identification, with applications in agriculture, security and defense, industry, etc. Therefore, from the computer science point of view, there is a fertile field of research for improving and developing algorithms in HI analysis. In some applications, the spectral pixels of an HI can be classified using laboratory spectral signatures. Nevertheless, for many others, there is not enough available prior information or spectral signatures, making any analysis a difficult task. One of the most popular algorithms for HI analysis is N-FINDR, because it is easy to understand and provides a way to unmix the original HI into the respective material compositions. However, N-FINDR is computationally expensive, and its performance depends on a random initialization process. This paper proposes a novel idea for reducing the complexity of N-FINDR by implementing a bottom-up approach based on an observation from linear algebra and the use of the Gram-Schmidt process. The Simplex of Maximal Volume Perpendicular (SMV⊥) algorithm is thus proposed for fast endmember extraction in hyperspectral imagery. This novel algorithm has complexity O(n) with respect to the number of pixels. In addition, the evidence shows that SMV⊥ calculates a bigger volume and has lower computational time complexity than other popular algorithms on synthetic and real scenarios.
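
    The geometric core of such a bottom-up approach can be sketched as follows: grow the simplex by repeatedly selecting the pixel whose Gram-Schmidt residual (the component perpendicular to the span of the current endmembers) is largest, since that pixel maximally increases the simplex volume. This illustrates the principle only; it is not the authors' full SMV⊥ implementation.

      import numpy as np

      def grow_endmembers(pixels, k):
          """pixels: N x bands array. Returns indices of k endmember candidates."""
          pixels = pixels.astype(float)
          idx = [int(np.argmax(np.linalg.norm(pixels, axis=1)))]
          basis = []                              # orthonormal basis so far
          for _ in range(k - 1):
              v = pixels[idx[-1]].copy()
              for b in basis:                     # Gram-Schmidt step
                  v -= (v @ b) * b
              basis.append(v / np.linalg.norm(v))
              residual = pixels.copy()
              for b in basis:                     # remove span of chosen endmembers
                  residual -= np.outer(residual @ b, b)
              idx.append(int(np.argmax(np.linalg.norm(residual, axis=1))))
          return idx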

  20. Can SNOMED CT be squeezed without losing its shape?

    PubMed

    López-García, Pablo; Schulz, Stefan

    2016-09-21

    In biomedical applications where the size and complexity of SNOMED CT become problematic, using a smaller subset that can act as a reasonable substitute is usually preferred. In a special class of use cases (such as ontology-based quality assurance, or scaling experiments for real-time performance), it is essential that modules show a shape similar to that of SNOMED CT in terms of concept distribution per sub-hierarchy. Exactly how to extract such balanced modules remains unclear, as most previous work on ontology modularization has focused on other problems. In this study, we investigate to what extent extracting balanced modules that preserve the original shape of SNOMED CT is possible, by presenting and evaluating an iterative algorithm. We used a graph-traversal modularization approach based on an input signature. To conform to our definition of a balanced module, we implemented an iterative algorithm that carefully bootstrapped and dynamically adjusted the signature at each step. We measured the error for each sub-hierarchy and defined convergence as a residual sum of squares <1. Using 2000 concepts as an initial signature, our algorithm converged after seven iterations and extracted a module 4.7% the size of SNOMED CT. Seven sub-hierarchies were over- or under-represented within a range of 1-8%. Our study shows that balanced modules can be extracted from large terminologies using ontology graph-traversal modularization techniques under certain conditions: the process is repeated a number of times, the input signature is dynamically adjusted in each iteration, and a moderate under- or over-representation of some hierarchies is tolerated. In the case of SNOMED CT, our results conclusively show that it can be squeezed to less than 5% of its size without any sub-hierarchy losing its shape by more than 8%, which is likely sufficient in most use cases.

  1. An Iterative Time Windowed Signature Algorithm for Time Dependent Transcription Module Discovery

    PubMed Central

    Meng, Jia; Gao, Shou-Jiang; Huang, Yufei

    2010-01-01

    An algorithm for the discovery of time varying modules using genome-wide expression data is presented here. When applied to large-scale time series data, our method is designed to discover not only the transcription modules but also their timing information, which is rarely annotated by existing approaches. Rather than assuming the commonly defined time-constant transcription modules, a module is depicted as a set of genes that are co-regulated during a specific period of time, i.e., a time dependent transcription module (TDTM). A rigorous mathematical definition of TDTM is provided, which serves as an objective function for retrieving modules. Based on the definition, an effective signature algorithm is proposed that iteratively searches for transcription modules in the time series data. The proposed method was tested on simulated systems and applied to human time series microarray data during Kaposi's sarcoma-associated herpesvirus (KSHV) infection. The results have been verified by Expression Analysis Systematic Explorer. PMID:21552463

  2. Digital pulse processing for planar TlBr detectors, optimized for ballistic deficit and charge-trapping effect

    NASA Astrophysics Data System (ADS)

    Nakhostin, M.; Hitomi, K.

    2012-05-01

    The energy resolution of thallium bromide (TlBr) detectors is significantly limited by the charge-trapping effect and pulse ballistic deficit, both caused by the slow charge collection time. A digital pulse processing algorithm has been developed that compensates for the charge-trapping effect while minimizing ballistic deficit. The algorithm is examined using a 1 mm thick TlBr detector, and an excellent energy resolution of 3.37% at 662 keV is achieved at room temperature. The pulse processing algorithms are presented in recursive form, suitable for real-time implementation.

  3. Experiences on developing digital down conversion algorithms using Xilinx system generator

    NASA Astrophysics Data System (ADS)

    Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi

    2013-07-01

    The Digital Down Conversion (DDC) algorithm is a classical signal processing method widely used in radar and communication systems. In this paper, the DDC function is implemented on an FPGA with the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. that makes it convenient for programmers to manipulate the design and debug the function, especially for complex algorithms. The development of the DDC function in System Generator shows that it is a fast and efficient tool for FPGA design.
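
    The DDC chain itself is compact enough to prototype in a few lines before committing it to System Generator blocks: mix against a numerically controlled oscillator, low-pass filter, and decimate. The sample rate, carrier frequency, and filter order below are illustrative assumptions.

      import numpy as np
      from scipy.signal import firwin, lfilter

      fs, fc, decim = 100e6, 20e6, 10
      t = np.arange(4096) / fs
      x = np.cos(2 * np.pi * fc * t + 0.3)        # example real RF input

      nco = np.exp(-2j * np.pi * fc * t)          # numerically controlled oscillator
      baseband = x * nco                          # complex mix down to 0 Hz
      taps = firwin(127, cutoff=fs / (2 * decim), fs=fs)
      filtered = lfilter(taps, 1.0, baseband)     # anti-alias low-pass filter
      iq = filtered[::decim]                      # decimated I/Q output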

  4. [Research and realization of signal processing algorithms based on FPGA in digital ophthalmic ultrasonography imaging].

    PubMed

    Fang, Simin; Zhou, Sheng; Wang, Xiaochun; Ye, Qingsheng; Tian, Ling; Ji, Jianjun; Wang, Yanqun

    2015-01-01

    To design and improve FPGA-based signal processing algorithms for ophthalmic ultrasonography. Three signal processing modules were implemented in the Verilog HDL hardware language in Quartus II: a fully parallel distributed dynamic filter, digital quadrature demodulation, and logarithmic compression. Compared with the original system, the hardware cost is reduced, the image is clearer and contains more information about the deep eyeball, and the detection depth increases from 5 cm to 6 cm. The new algorithms meet the design requirements, optimize the system, and effectively improve the image quality of existing equipment.

  5. Comparative Study Of Image Enhancement Algorithms For Digital And Film Mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgado-Gonzalez, A.; Sanmiguel, R. E.

    2008-08-11

    Here we discuss the application of edge-enhancement algorithms to images obtained with a mammography system equipped with a selenium detector, on the one hand, and to images obtained from digitized film mammography, on the other. Comparative analysis of such images includes the study of technical aspects of image acquisition, storage, compression and display. A protocol for a local database has been created as a result of this study.

  6. Robust and real-time rotor control with magnetic bearings

    NASA Technical Reports Server (NTRS)

    Sinha, A.; Wang, K. W.; Mease, K. L.

    1991-01-01

    This paper deals with the sliding mode control of a rigid rotor via radial magnetic bearings. The digital control algorithm and the results from numerical simulations are presented for an experimental rig. The experimental system which has been set up to digitally implement and validate the sliding mode control algorithm is described. Two methods for the development of control software are presented. Experimental results for individual rotor axes are discussed.

  7. Information theoretic analysis of linear shift-invariant edge-detection operators

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2012-06-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the influence of the image gathering process. However, experiments show that the image gathering process has a profound impact on the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. We perform an end-to-end, information-theory-based system analysis to assess linear shift-invariant edge-detection algorithms. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling and additive noise, that define the image gathering system. The edge-detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches its maximum possible value. This goal can be achieved only by jointly optimizing all processes. Our information-theoretic assessment provides a new tool that allows us to compare different linear shift-invariant edge detectors in a common environment.

  8. ArrayVigil: a methodology for statistical comparison of gene signatures using segregated-one-tailed (SOT) Wilcoxon's signed-rank test.

    PubMed

    Khan, Haseeb Ahmad

    2005-01-28

    Owing to their versatile diagnostic and prognostic fidelity, molecular signatures, or fingerprints, are anticipated to be the most powerful tools for cancer management in the near future. Notwithstanding the experimental advancements in microarray technology, methods for analyzing either whole arrays or gene signatures have not been firmly established. Recently, an algorithm, ArraySolver, has been reported by Khan for two-group comparison of microarray gene expression data using the two-tailed Wilcoxon signed-rank test. Most molecular signatures are composed of two sets of genes (hybrid signatures) wherein up-regulation of one set and down-regulation of the other set collectively define the purpose of a gene signature. Since the direction of a selected gene's expression (positive or negative) with respect to a particular disease condition is known, application of one-tailed statistics could be a more relevant choice. A novel method, ArrayVigil, is described for comparing hybrid signatures using the segregated-one-tailed (SOT) Wilcoxon signed-rank test, and the results are compared with integrated-two-tailed (ITT) procedures (SPSS and ArraySolver). ArrayVigil resulted in lower P values than those obtained from ITT statistics while comparing real data from four signatures.
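
    The segregated-one-tailed idea can be sketched with SciPy: test the up-regulated and down-regulated halves of a hybrid signature separately, each with the appropriate one-tailed alternative. How the two P values are then combined is left open here, and the gene-dictionary interface is an illustrative assumption.

      from scipy.stats import wilcoxon

      def sot_wilcoxon(case, control, up_genes, down_genes):
          """case/control: dicts mapping gene -> expression value."""
          p_up = wilcoxon([case[g] for g in up_genes],
                          [control[g] for g in up_genes],
                          alternative="greater").pvalue
          p_down = wilcoxon([case[g] for g in down_genes],
                            [control[g] for g in down_genes],
                            alternative="less").pvalue
          return p_up, p_down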

  9. Authenticity examination of compressed audio recordings using detection of multiple compression and encoders' identification.

    PubMed

    Korycki, Rafal

    2014-05-01

    Since the appearance of digital audio recordings, audio authentication has become increasingly difficult. The currently available technologies and free editing software allow a forger to cut or paste any single word without audible artifacts. Nowadays, the only method for digital audio files commonly approved by forensic experts is the ENF criterion. It consists in fluctuation analysis of the mains frequency induced in the electronic circuits of recording devices. Therefore, its effectiveness is strictly dependent on the presence of the mains signal in the recording, which is a rare occurrence. Recently, much attention has been paid to authenticity analysis of compressed multimedia files, and several solutions have been proposed for detection of double compression in both digital video and digital audio. This paper addresses the problem of tampering detection in compressed audio files and discusses new methods that can be used for authenticity analysis of digital recordings. The presented approaches consist in evaluation of statistical features extracted from the MDCT coefficients as well as other parameters that may be obtained from compressed audio files. The calculated feature vectors are used for training selected machine learning algorithms. The detection of multiple compression covers tampering activities as well as identification of traces of montage in digital audio recordings. To enhance the methods' robustness, an encoder identification algorithm based on analysis of inherent compression parameters was developed and applied. The effectiveness of the tampering detection algorithms is tested on a predefined large music database consisting of nearly one million compressed audio files. The influence of the compression algorithms' parameters on classification performance is discussed based on the results of the current study. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. [Orthogonal Vector Projection Algorithm for Spectral Unmixing].

    PubMed

    Song, Mei-ping; Xu, Xing-wei; Chang, Chein-I; An, Ju-bai; Yao, Li

    2015-12-01

    Spectral unmixing is an important part of hyperspectral technology and is essential for material quantity analysis in hyperspectral imagery. Most linear unmixing algorithms require matrix multiplication together with matrix inversion or determinant computation. These are difficult to program and especially hard to realize in hardware, and the computational cost of the algorithms increases significantly as the number of endmembers grows. Here, based on the traditional Orthogonal Subspace Projection algorithm, a new method called Orthogonal Vector Projection is proposed using the orthogonality principle. It simplifies the process by avoiding matrix multiplication and inversion: it first computes, via the Gram-Schmidt process, the final orthogonal vector for each endmember spectrum, and these orthogonal vectors are then used as projection vectors for the pixel signature. The unconstrained abundance is obtained directly by projecting the signature onto the projection vectors and computing the ratio of the projected vector length to the orthogonal vector length. Compared with the Orthogonal Subspace Projection and Least Squares Error algorithms, this method needs no matrix inversion, which is computationally costly and hard to implement in hardware. It completes the orthogonalization process by repeated vector operations, making it easy to apply in both parallel computation and hardware. The soundness of the algorithm is supported by its relationship to the Orthogonal Subspace Projection and Least Squares Error algorithms, and its computational complexity, compared with those two algorithms, is the lowest. Finally, experimental results on synthetic and real images are provided, giving further evidence of the effectiveness of the method.
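
    A compact sketch of the projection step described above: orthogonalize each endmember against the others (here via QR for numerical stability, standing in for explicit Gram-Schmidt) and read off each abundance as a ratio of projections, with no matrix inversion. The endmember matrix layout is an assumption for illustration.

      import numpy as np

      def ovp_abundances(E, x):
          """E: bands x p endmember matrix; x: pixel spectrum. Returns p abundances."""
          p = E.shape[1]
          a = np.zeros(p)
          for k in range(p):
              others = E[:, [j for j in range(p) if j != k]].astype(float)
              q, _ = np.linalg.qr(others)
              u = E[:, k].astype(float)
              u = u - q @ (q.T @ u)       # component orthogonal to the others
              a[k] = (x @ u) / (E[:, k] @ u)
          return a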

  11. Simple algorithms for digital pulse-shape discrimination with liquid scintillation detectors

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2015-01-01

    The development of compact, battery-powered digital liquid scintillation neutron detection systems for field applications requires digital pulse processing (DPP) algorithms with minimum computational overhead. To meet this demand, two DPP algorithms for the discrimination of neutrons and γ-rays with liquid scintillation detectors were developed and examined using an NE213 liquid scintillation detector in a mixed radiation field. The first algorithm is based on the relation between the amplitude of a current pulse at the output of a photomultiplier tube and the amount of charge contained in the pulse. A figure-of-merit (FOM) value of 0.98 at a 450 keVee (electron-equivalent energy) threshold was achieved with this method when pulses were sampled at 250 MSample/s with 8-bit resolution. Compared with the similar charge-comparison method, this method requires only a single integration window, thereby reducing the amount of computation by approximately 40%. The second approach is a digital version of the trailing-edge constant-fraction discrimination method. An FOM value of 0.84 at an energy threshold of 450 keVee was achieved with this method. Compared with the similar rise-time discrimination method, this method requires a single time pick-off, thereby reducing the amount of computation by approximately 50%. The algorithms described in this work are useful for developing portable detection systems for applications such as homeland security, radiation dosimetry and environmental monitoring.
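
    The first method reduces to a single ratio per pulse, which is what makes it so cheap. A sketch follows; the baseline window and the decision threshold are illustrative values that would be calibrated per detector.

      import numpy as np

      def psd_ratio(pulse, baseline_samples=20):
          """Amplitude-to-charge ratio from one digitized PMT current pulse."""
          p = pulse.astype(float) - pulse[:baseline_samples].mean()
          charge = p.sum()                # single integration window
          return p.max() / charge if charge > 0 else 0.0

      def is_neutron(pulse, threshold=0.12):
          # Neutron pulses carry more delayed light, so amplitude/charge is lower.
          return psd_ratio(pulse) < threshold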

  12. A parallel algorithm for viewshed analysis in three-dimensional Digital Earth

    NASA Astrophysics Data System (ADS)

    Feng, Wang; Gang, Wang; Deji, Pan; Yuan, Liu; Liuzhong, Yang; Hongbo, Wang

    2015-02-01

    Viewshed analysis, often supported by geographic information systems, is widely used in the three-dimensional (3D) Digital Earth system. Many of the analyses involve the siting of features and real-time decision-making. Viewshed analysis is usually performed at a large scale, which poses substantial computational challenges as geographic datasets continue to become increasingly large. Previous research on viewshed analysis has generally been limited to a single data structure (i.e., DEM), which cannot be used to analyze viewsheds in complicated scenes. In this paper, a real-time algorithm for viewshed analysis in Digital Earth is presented using the parallel computing of graphics processing units (GPUs). An occlusion for each geometric entity in the neighbor space of the viewshed point is generated according to line-of-sight. The region within the occlusion is marked by a stencil buffer within the programmable 3D visualization pipeline, and the marked region is drawn in red concurrently. In contrast to traditional algorithms based on line-of-sight, the new algorithm, in which the viewshed calculation is integrated with the rendering module, is more efficient and stable. This proposed method of viewshed generation is closer to the reality of the virtual geographic environment. No DEM interpolation, which is seen as a computational burden, is needed. The algorithm was implemented in a 3D Digital Earth system (GeoBeans3D) with the DirectX application programming interface (API) and has been widely used in a range of applications.

  13. Exploring a Physically Based Tool for Lightning Cessation: A Preliminary Study

    NASA Technical Reports Server (NTRS)

    Schultz, Elise V.; Petersen, Walter A.; Carey, Lawrence D.; Deierling, Wiebke

    2010-01-01

    The University of Alabama in Huntsville (UA Huntsville) and NASA's Marshall Space Flight Center are collaborating with the 45th Weather Squadron (45WS) at Cape Canaveral Air Force Station (CCAFS) to enable improved nowcasting of lightning cessation. The project centers on use of dual-polarimetric radar capabilities, and in particular, the new C-band dual-polarimetric weather radar acquired by the 45WS. Special emphasis is placed on the development of a physically based operational algorithm to predict lightning cessation. While previous studies have developed statistically based lightning cessation algorithms, we believe that dual-polarimetric radar variables offer the possibility to improve existing algorithms through the inclusion of physically meaningful trends reflecting interactions between in-cloud electric fields and microphysics. Specifically, decades of polarimetric radar research using propagation differential phase has demonstrated the presence of distinct phase and ice crystal alignment signatures in the presence of strong electric fields associated with lightning. One question yet to be addressed is: To what extent can these ice-crystal alignment signatures be used to nowcast the cessation of lightning activity in a given storm? Accordingly, data from the UA Huntsville Advanced Radar for Meteorological and Operational Research (ARMOR) along with the North Alabama Lightning Mapping Array are used in this study to investigate the radar signatures present before and after lightning cessation. A summary of preliminary results will be presented.

  14. Exploring a Physically Based Tool for Lightning Cessation: Preliminary Results

    NASA Technical Reports Server (NTRS)

    Schultz, Elise V.; Petersen, Walter A.; Carey, Lawrence D.; Buechler, Dennis E.; Gatlin, Patrick N.

    2010-01-01

    The University of Alabama in Huntsville (UAHuntsville) and NASA's Marshall Space Flight Center are collaborating with the 45th Weather Squadron (45WS) at Cape Canaveral Air Force Station (CCAFS) to enable improved nowcasting of lightning cessation. The project centers on use of dual-polarimetric radar capabilities, and in particular, the new C-band dual-polarimetric weather radar acquired by the 45WS. Special emphasis is placed on the development of a physically based operational algorithm to predict lightning cessation. While previous studies have developed statistically based lightning cessation algorithms, we believe that dual-polarimetric radar variables offer the possibility to improve existing algorithms through the inclusion of physically meaningful trends reflecting interactions between in-cloud electric fields and microphysics. Specifically, decades of polarimetric radar research using propagation differential phase has demonstrated the presence of distinct phase and ice crystal alignment signatures in the presence of strong electric fields associated with lightning. One question yet to be addressed is: To what extent can these ice-crystal alignment signatures be used to nowcast the cessation of lightning activity in a given storm? Accordingly, data from the UAHuntsville Advanced Radar for Meteorological and Operational Research (ARMOR) along with the North Alabama Lightning Mapping Array are used in this study to investigate the radar signatures present before and after lightning cessation. A summary of preliminary results will be presented.

  15. Target detection using the background model from the topological anomaly detection algorithm

    NASA Astrophysics Data System (ADS)

    Dorado Munoz, Leidy P.; Messinger, David W.; Ziemann, Amanda K.

    2013-05-01

    The Topological Anomaly Detection (TAD) algorithm has been used as an anomaly detector in hyperspectral and multispectral images. TAD is an algorithm based on graph theory that constructs a topological model of the background in a scene, and computes an anomalousness ranking for all of the pixels in the image with respect to the background in order to identify pixels with uncommon or strange spectral signatures. The pixels that are modeled as background are clustered into groups or connected components, which could be representative of spectral signatures of materials present in the background. Therefore, the idea of using the background components given by TAD in target detection is explored in this paper. These connected components are characterized using three different approaches: the mean signature and endmembers for each component are calculated and used as background basis vectors in Orthogonal Subspace Projection (OSP) and the Adaptive Subspace Detector (ASD). Likewise, the covariance matrix of those connected components is estimated and used in the Constrained Energy Minimization (CEM) and Adaptive Coherence Estimator (ACE) detectors. The performance of these approaches and the different detectors is compared with a global approach, where the background characterization is derived directly from the image. Experiments and results using the self-test data set provided as part of the RIT blind test target detection project are shown.

  16. On the Rapid Computation of Various Polylogarithmic Constants

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Borwein, Peter; Plouffe, Simon

    1996-01-01

We give algorithms for the computation of the d-th digit of certain transcendental numbers in various bases. These algorithms can be easily implemented (multiple precision arithmetic is not needed), require virtually no memory, and feature run times that scale nearly linearly with the order of the digit desired. They make it feasible to compute, for example, the billionth binary digit of log(2) or pi on a modest workstation in a few hours' run time. We demonstrate this technique by computing the ten billionth hexadecimal digit of pi, the billionth hexadecimal digits of pi-squared, log(2) and log-squared(2), and the ten billionth decimal digit of log(9/10). These calculations rest on the observation that very special types of identities exist for certain numbers like pi, pi-squared, log(2) and log-squared(2). These are essentially polylogarithmic ladders in an integer base. A number of these identities that we derive in this work appear to be new, for example a critical identity for pi.
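
    The hexadecimal case is compact enough to sketch. The Bailey-Borwein-Plouffe identity pi = sum_k 16^(-k) [4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)] lets the d-th fractional hex digit be read off using modular exponentiation and ordinary floating point; the Python below is a minimal illustration of that observation (the 1e-17 tail cutoff is a pragmatic choice, not from the paper).

      def pi_hex_digit(d):
          # Return the (d+1)-th hexadecimal digit of the fractional part of
          # pi without computing any of the preceding digits.
          def series(j):
              s = 0.0
              for k in range(d + 1):                  # exact head, mod 8k+j
                  s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
              k = d + 1
              while True:                             # rapidly vanishing tail
                  term = 16.0 ** (d - k) / (8 * k + j)
                  if term < 1e-17:
                      return s
                  s = (s + term) % 1.0
                  k += 1
          frac = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
          return "0123456789ABCDEF"[int(frac * 16)]

      # pi = 3.243F6A88... in hexadecimal, so pi_hex_digit(0) returns '2'.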

  17. A novel blinding digital watermark algorithm based on lab color space

    NASA Astrophysics Data System (ADS)

    Dong, Bing-feng; Qiu, Yun-jie; Lu, Hong-tao

    2010-02-01

A blind digital image watermarking algorithm must extract the watermark information without any extra information beyond the watermarked image itself. However, most current blind watermarking algorithms share the same disadvantage: besides the watermarked image, they also need the size and other information about the original image when extracting the watermark. This paper presents an innovative blind color image watermarking algorithm based on the Lab color space that does not have this disadvantage. The algorithm first marks the watermark region size and position by embedding regular blocks called anchor points in the image spatial domain, and then embeds the watermark into the image. In doing so, the watermark information can be easily extracted even after the image has been cropped or rescaled. Experimental results show that the algorithm is particularly robust against color adjustment and geometric transformation. This algorithm has already been used in a copyright protection project and works very well.

  18. Gamma ray spectroscopy employing divalent europium-doped alkaline earth halides and digital readout for accurate histogramming

    DOEpatents

    Cherepy, Nerine Jane; Payne, Stephen Anthony; Drury, Owen B; Sturm, Benjamin W

    2014-11-11

    A scintillator radiation detector system according to one embodiment includes a scintillator; and a processing device for processing pulse traces corresponding to light pulses from the scintillator, wherein pulse digitization is used to improve energy resolution of the system. A scintillator radiation detector system according to another embodiment includes a processing device for fitting digitized scintillation waveforms to an algorithm based on identifying rise and decay times and performing a direct integration of fit parameters. A method according to yet another embodiment includes processing pulse traces corresponding to light pulses from a scintillator, wherein pulse digitization is used to improve energy resolution of the system. A method in a further embodiment includes fitting digitized scintillation waveforms to an algorithm based on identifying rise and decay times; and performing a direct integration of fit parameters. Additional systems and methods are also presented.
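
    As a rough illustration of the fit-and-integrate idea, the sketch below fits a digitized trace to a single rise/decay exponential and then integrates the fitted model in closed form. The model shape, initial guesses, and time units are assumptions for illustration, not the patent's parameterization.

      import numpy as np
      from scipy.optimize import curve_fit

      def pulse_model(t, a, t_rise, t_decay, t0):
          # Simplified scintillation pulse: exponential rise and decay.
          t = np.clip(t - t0, 0.0, None)
          return a * (np.exp(-t / t_decay) - np.exp(-t / t_rise))

      def pulse_energy(t, trace):
          # Fit the digitized waveform, then directly integrate the *fit
          # parameters*: this model integrates to a * (t_decay - t_rise).
          p0 = [trace.max(), 5.0, 50.0, t[np.argmax(trace)] - 5.0]  # guesses
          (a, t_rise, t_decay, t0), _ = curve_fit(pulse_model, t, trace, p0=p0)
          return a * (t_decay - t_rise)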

  19. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Keller, Brad M.; Nathan, Diane L.; Wang, Yan

Purpose: The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., 'FOR PROCESSING') and vendor postprocessed (i.e., 'FOR PRESENTATION'), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. Methods: This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Results: Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r= 0.82, p < 0.001) and processed (r= 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r= 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). Conclusions: The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies.

  20. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation

    PubMed Central

    Keller, Brad M.; Nathan, Diane L.; Wang, Yan; Zheng, Yuanjie; Gee, James C.; Conant, Emily F.; Kontos, Despina

    2012-01-01

Purpose: The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., “FOR PROCESSING”) and vendor postprocessed (i.e., “FOR PRESENTATION”), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. Methods: This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Results: Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). Conclusions: The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies. PMID:22894417

  1. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation.

    PubMed

    Keller, Brad M; Nathan, Diane L; Wang, Yan; Zheng, Yuanjie; Gee, James C; Conant, Emily F; Kontos, Despina

    2012-08-01

    The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., "FOR PROCESSING") and vendor postprocessed (i.e., "FOR PRESENTATION"), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies.
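
    The fuzzy c-means stage at the heart of this pipeline fits in a few lines of NumPy. The toy version below clusters 1-D gray levels with a fixed cluster count and fuzzifier m = 2, whereas the published algorithm chooses the cluster count per image and follows clustering with SVM classification.

      import numpy as np

      def fuzzy_cmeans(x, n_clusters, m=2.0, n_iter=100, seed=0):
          # x: (N,) gray-level samples. Returns cluster centers and the
          # (N, C) membership matrix u, where u[i, k] is the degree to
          # which sample i belongs to cluster k.
          rng = np.random.default_rng(seed)
          centers = rng.choice(x, n_clusters, replace=False)
          for _ in range(n_iter):
              d = np.abs(x[:, None] - centers[None, :]) + 1e-12
              u = 1.0 / d ** (2.0 / (m - 1.0))
              u /= u.sum(axis=1, keepdims=True)        # fuzzy memberships
              um = u ** m
              centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
          return centers, u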

  2. An assessment of Landsat MSS and TM data for urban and near-urban land-cover digital classification

    NASA Technical Reports Server (NTRS)

    Haack, Barry; Bryant, Nevin; Adams, Steven

    1987-01-01

    The information content of Landsat TM and MSS data was examined to assess the ability to digitally differentiate urban and near-urban land covers around Miami, FL. This examination included comparisons of unsupervised signature extractions for various cover types, training site statistics for intraclass and interclass separability, and band and band combination selection from an 11-band multisensor data set. The principal analytical tool used in this study was transformed divergence calculations. The TM digital data are typically more useful than the MSS data in the homogeneous near-urban land-covers and less useful in the heterogeneous urban areas.

  3. 78 FR 43145 - Announcing Approval of Federal Information Processing Standard 186-4, Digital Signature Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-19

    ... correction of wording and typographical errors, and further aligns the FIPS with Key Cryptography Standard... Cryptography Standard (PKCS) 1. NIST published a Federal Register Notice (77 FR 21538) on April 10, 2012 to...

  4. 10 CFR 2.4 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (including security plans, procedures, and equipment) for the physical protection of source, byproduct, or... computer that contains the participant's name, e-mail address, and participant's digital signature, proves... inspection. It is also the place where NRC makes computer terminals available to access the Publicly...

  5. 10 CFR 2.4 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... (including security plans, procedures, and equipment) for the physical protection of source, byproduct, or... computer that contains the participant's name, e-mail address, and participant's digital signature, proves... inspection. It is also the place where NRC makes computer terminals available to access the Publicly...

  6. Digitized adiabatic quantum computing with a superconducting circuit.

    PubMed

    Barends, R; Shabani, A; Lamata, L; Kelly, J; Mezzacapo, A; Las Heras, U; Babbush, R; Fowler, A G; Campbell, B; Chen, Yu; Chen, Z; Chiaro, B; Dunsworth, A; Jeffrey, E; Lucero, E; Megrant, A; Mutus, J Y; Neeley, M; Neill, C; O'Malley, P J J; Quintana, C; Roushan, P; Sank, D; Vainsencher, A; Wenner, J; White, T C; Solano, E; Neven, H; Martinis, John M

    2016-06-09

    Quantum mechanics can help to solve complex problems in physics and chemistry, provided they can be programmed in a physical device. In adiabatic quantum computing, a system is slowly evolved from the ground state of a simple initial Hamiltonian to a final Hamiltonian that encodes a computational problem. The appeal of this approach lies in the combination of simplicity and generality; in principle, any problem can be encoded. In practice, applications are restricted by limited connectivity, available interactions and noise. A complementary approach is digital quantum computing, which enables the construction of arbitrary interactions and is compatible with error correction, but uses quantum circuit algorithms that are problem-specific. Here we combine the advantages of both approaches by implementing digitized adiabatic quantum computing in a superconducting system. We tomographically probe the system during the digitized evolution and explore the scaling of errors with system size. We then let the full system find the solution to random instances of the one-dimensional Ising problem as well as problem Hamiltonians that involve more complex interactions. This digital quantum simulation of the adiabatic algorithm consists of up to nine qubits and up to 1,000 quantum logic gates. The demonstration of digitized adiabatic quantum computing in the solid state opens a path to synthesizing long-range correlations and solving complex computational problems. When combined with fault-tolerance, our approach becomes a general-purpose algorithm that is scalable.
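
    The digitization idea can be illustrated classically: discretize the schedule s from 0 to 1 and apply a short evolution under the interpolated Hamiltonian at each step. The NumPy/SciPy sketch below does this for a 3-qubit ferromagnetic Ising chain; the hardware experiment further Trotterizes each step into native gates, and the sweep length and step count here are arbitrary.

      import numpy as np
      from scipy.linalg import expm

      n, steps, total_t = 3, 50, 10.0
      X = np.array([[0.0, 1.0], [1.0, 0.0]])
      Z = np.diag([1.0, -1.0])
      I = np.eye(2)

      def op(single, site):
          # Embed a single-qubit operator at `site` in the n-qubit space.
          out = np.array([[1.0]])
          for k in range(n):
              out = np.kron(out, single if k == site else I)
          return out

      H0 = -sum(op(X, k) for k in range(n))                     # driver
      H1 = -sum(op(Z, k) @ op(Z, k + 1) for k in range(n - 1))  # Ising problem
      psi = np.ones(2 ** n) / np.sqrt(2 ** n)                   # H0 ground state
      for step in range(steps):
          s = (step + 1) / steps                                # schedule 0 -> 1
          psi = expm(-1j * (total_t / steps) * ((1 - s) * H0 + s * H1)) @ psi
      probs = np.abs(psi) ** 2   # concentrates on the states |000> and |111>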

  7. Design and Implementation of Hybrid CORDIC Algorithm Based on Phase Rotation Estimation for NCO

    PubMed Central

    Zhang, Chaozhu; Han, Jinan; Li, Ke

    2014-01-01

The numerical controlled oscillator (NCO) has wide application in radar, digital receivers, and software radio systems. This paper first introduces the traditional CORDIC algorithm. Then, in order to improve computing speed and save resources, it proposes a hybrid CORDIC algorithm based on phase rotation estimation for use in an NCO. By estimating the direction of part of the phase rotation, the algorithm eliminates some of the phase rotations and add-subtract units, thereby decreasing delay. Furthermore, the paper simulates and implements the numerical controlled oscillator with the Quartus II and Modelsim software tools. Finally, simulation results indicate that an improvement over the traditional CORDIC algorithm is achieved in terms of ease of computation, resource utilization, and computing speed/delay while maintaining precision. It is suitable for high-speed, high-precision digital modulation and demodulation. PMID:25110750
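
    For reference, the conventional rotation-mode CORDIC that the hybrid scheme builds on is sketched below: each iteration rotates by +/- atan(2^-i), with the sign chosen to drive the residual angle toward zero, and the paper's contribution replaces part of this iteration chain with a phase-rotation estimate. A minimal sketch, valid within the CORDIC convergence range of about +/-1.74 rad:

      import math

      def cordic_sin_cos(theta, n_iter=16):
          angles = [math.atan(2.0 ** -i) for i in range(n_iter)]
          gain = 1.0
          for i in range(n_iter):
              gain /= math.sqrt(1.0 + 2.0 ** (-2 * i))  # rotation gain K
          x, y, z = 1.0, 0.0, theta
          for i, a in enumerate(angles):
              d = 1.0 if z >= 0 else -1.0               # steer z toward 0
              x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
              z -= d * a
          return x * gain, y * gain                     # (cos, sin)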

  8. Blood Based Biomarkers of Early Onset Breast Cancer

    DTIC Science & Technology

    2016-12-01

discretizes the data, and also using logistic elastic net - a form of linear regression - we were unable to build a classifier that could accurately...classifier for differentiating cases from controls off discretized data. The first pass analysis demonstrated a 35 gene signature that differentiated...to the discretized data for mRNA gene signature, the samples used to “train” were also included in the final samples used to “test” the algorithm

  9. Kinematics of Signature Writing in Healthy Aging

    PubMed Central

    Caligiuri, Michael P.; Kim, Chi; Landy, Kelly M.

    2014-01-01

    Forensic document examiners (FDE) called upon to distinguish a genuine from a forged signature of an elderly person are often required to consider the question of age-related deterioration and whether the available exemplars reliably capture the natural effects of aging of the original writer. An understanding of the statistical relationship between advanced age and handwriting movements can reduce the uncertainty that may exist in an examiner’s approach to questioned signatures formed by elderly writers. The primary purpose of this study was to systematically examine age-related changes in signature kinematics in healthy writers. Forty-two healthy subjects between the ages of 60–91 years participated in this study. Signatures were recorded using a digitizing tablet and commercial software was used to examine the temporal and spatial stroke kinematics and pen pressure. Results indicated that vertical stroke duration and dysfluency increased with age, whereas vertical stroke amplitude and velocity decreased with age. Pen pressure decreased with age. We found that a linear model characterized the best-fit relationship between advanced age and handwriting movement parameters for signature formation. Male writers exhibited stronger age effects than female writers, especially for pen pressure and stroke dysfluency. The present study contributes to an understanding of how advanced age alters signature formation in otherwise healthy adults. PMID:24673648

  10. Digital accumulators in phase and frequency tracking loops

    NASA Technical Reports Server (NTRS)

    Hinedi, Sami; Statman, Joseph I.

    1990-01-01

    Results on the effects of digital accumulators in phase and frequency tracking loops are presented. Digital accumulators or summers are used extensively in digital signal processing to perform averaging or to reduce processing rates to acceptable levels. For tracking the Doppler of high-dynamic targets at low carrier-to-noise ratios, it is shown through simulation and experiment that digital accumulators can contribute an additional loss in operating threshold. This loss was not considered in any previous study and needs to be accounted for in performance prediction analysis. Simulation and measurement results are used to characterize the loss due to the digital summers for three different tracking loops: a digital phase-locked loop, a cross-product automatic frequency tracking loop, and an extended Kalman filter. The tracking algorithms are compared with respect to their frequency error performance and their ability to maintain lock during severe maneuvers at various carrier-to-noise ratios. It is shown that failure to account for the effect of accumulators can result in an inaccurate performance prediction, the extent of which depends highly on the algorithm used.
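
    The accumulators in question are accumulate-and-dump stages, sketched below; the stage's sinc-shaped frequency response attenuates high Doppler dynamics, which is the source of the extra threshold loss discussed above. A minimal illustration, not the loop implementations studied in the paper.

      import numpy as np

      def accumulate_and_dump(samples, n):
          # Average n consecutive samples and output at 1/n of the input
          # rate, as used ahead of the loop discriminator to reduce the
          # processing rate.
          m = len(samples) // n
          return samples[:m * n].reshape(m, n).mean(axis=1)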

  11. Development of a compact and cost effective multi-input digital signal processing system

    NASA Astrophysics Data System (ADS)

    Darvish-Molla, Sahar; Chin, Kenrick; Prestwich, William V.; Byun, Soo Hyun

    2018-01-01

A prototype digital signal processing system (DSP) was developed using a microcontroller interfaced with a 12-bit sampling ADC, which offers a considerably inexpensive solution for processing multiple detectors with high throughput. After digitization of the incoming pulses, a simple pulse height analysis algorithm was employed in order to maximize the output counting rate. Moreover, an algorithm aiming at real-time pulse pile-up deconvolution was implemented. The system was tested using a NaI(Tl) detector, in comparison with a traditional analogue system and a commercial digital system, over a variety of count rates. The performance of the prototype system was consistently superior to both the analogue and the commercial digital systems up to an input count rate of 61 kcps; at higher input rates it was slightly inferior to the commercial digital system but still superior to the analogue system. Considering overall cost, size and flexibility, this custom-made multi-input digital signal processing system (MMI-DSP) was the best reliable choice for the purpose of 2D microdosimetric data collection, or for any measurement in which simultaneous multi-data collection is required.

  12. Fringe pattern demodulation with a two-dimensional digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-dimensional digital phase-locked loop (DPLL) for fringe pattern demodulation is presented. This algorithm is more suitable for demodulation of fringe patterns with varying phase in two directions than the existing DPLL techniques that assume that the phase of the fringe patterns varies only in one direction. The two-dimensional DPLL technique assumes that the phase of a fringe pattern is continuous in both directions and takes advantage of the phase continuity; consequently, the algorithm has better noise performance than the existing DPLL schemes. The two-dimensional DPLL algorithm is also suitable for demodulation of fringe patterns with low sampling rates, and it outperforms the Fourier fringe analysis technique in this aspect.

  13. DSP Synthesis Algorithm for Generating Florida Scrub Jay Calls

    NASA Technical Reports Server (NTRS)

    Lane, John; Pittman, Tyler

    2017-01-01

    A prototype digital signal processing (DSP) algorithm has been developed to approximate Florida scrub jay calls. The Florida scrub jay (Aphelocoma coerulescens), believed to have been in existence for 2 million years, living only in Florida, has a complicated social system that is evident by examining the spectrograms of its calls. Audio data was acquired at the Helen and Allan Cruickshank Sanctuary, Rockledge, Florida during the 2016 mating season using three digital recorders sampling at 44.1 kHz. The synthesis algorithm is a first step at developing a robust identification and call analysis algorithm. Since the Florida scrub jay is severely threatened by loss of habitat, it is important to develop effective methods to monitor their threatened population using autonomous means.

  14. Gamut extension for cinema: psychophysical evaluation of the state of the art and a new algorithm

    NASA Astrophysics Data System (ADS)

    Zamir, Syed Waqas; Vazquez-Corral, Javier; Bertalmío, Marcelo

    2015-03-01

    Wide gamut digital display technology, in order to show its full potential in terms of colors, is creating an opportunity to develop gamut extension algorithms (GEAs). To this end, in this work we present two contributions. First we report a psychophysical evaluation of GEAs specifically for cinema using a digital cinema projector under cinematic (low ambient light) conditions; to the best of our knowledge this is the first evaluation of this kind reported in the literature. Second, we propose a new GEA by introducing simple but key modifications to the algorithm of Zamir et al. This new algorithm performs well in terms of skin tones and memory colors, with results that look natural and which are free from artifacts.

  15. An Invisible Text Watermarking Algorithm using Image Watermark

    NASA Astrophysics Data System (ADS)

    Jalil, Zunera; Mirza, Anwar M.

Copyright protection of digital content is very necessary in today's digital world, with efficient communication media such as the Internet. Text is the dominant part of Internet content, and very few techniques are available for text protection. This paper presents a novel algorithm for the protection of plain text, which embeds the logo image of the copyright owner in the text; this logo can later be extracted from the text to prove ownership. The algorithm is robust against content-preserving modifications and, at the same time, is capable of detecting malicious tampering. Experimental results demonstrate the effectiveness of the algorithm against tampering attacks by calculating normalized Hamming distances. The results are also compared with a recent work in this domain.
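
    The evaluation metric mentioned at the end is straightforward; a minimal sketch, assuming the original and extracted watermarks are compared as equal-length bit strings:

      def normalized_hamming(w1: str, w2: str) -> float:
          # 0.0 = watermarks identical, 1.0 = every bit differs.
          assert len(w1) == len(w2)
          return sum(a != b for a, b in zip(w1, w2)) / len(w1)

      # normalized_hamming("10110100", "10100100") -> 0.125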

  16. Spent Fuel Assay with an Ultra-High Rate HPGe Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fast, James; Fulsom, Bryan; Pitts, Karl

    2015-07-01

Traditional verification of spent nuclear fuel (SNF) includes determination of initial enrichment, burnup and cool down time (IE, BU, CT). Along with neutron measurements, passive gamma assay provides important information for determining BU and CT. Other gamma-ray-based assay methods such as passive tomography and active delayed gamma offer the potential to measure the spatial distribution of fission products and the fissile isotopic concentration of the fuel, respectively. All fuel verification methods involving gamma-ray spectroscopy require that the spectrometers manage very high count rates while extracting the signatures of interest. PNNL has developed new digital filtering and analysis techniques to produce an ultra-high rate gamma-ray spectrometer from a standard coaxial high-purity germanium (HPGe) crystal. This 37% relative efficiency detector has been operated for SNF measurements at input count rates of 500-1300 kcps and throughput in excess of 150 kcps. Optimized filtering algorithms preserve the spectroscopic capability of the system even at these high rates. This paper will present the results of both passive and active SNF measurement performed with this system at PNNL. (authors)

  17. Secure and Efficient Transmission of Hyperspectral Images for Geosciences Applications

    NASA Astrophysics Data System (ADS)

    Carpentieri, Bruno; Pizzolante, Raffaele

    2017-12-01

Hyperspectral images are acquired through air-borne or space-borne special cameras (sensors) that collect information coming from the electromagnetic spectrum of the observed terrains. Hyperspectral remote sensing and hyperspectral images are used for a wide range of purposes: originally, they were developed for mining applications and for geology because of the capability of this kind of image to correctly identify various types of underground minerals by analysing the reflected spectra, but their usage has spread to other application fields, such as ecology, military and surveillance, historical research and even archaeology. The large amount of data produced by hyperspectral sensors, the fact that these images are acquired at high cost by air-borne sensors, and the fact that they are generally transmitted to a base station make it necessary to provide an efficient and secure transmission protocol. In this paper, we propose a novel framework that allows secure and efficient transmission of hyperspectral images by combining a reversible invisible watermarking scheme, used in conjunction with digital signature techniques, and a state-of-the-art predictive-based lossless compression algorithm.
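
    A minimal sketch of the compress-and-sign idea, with zlib standing in for the predictive lossless coder and a detached Ed25519 signature (via the Python cryptography package) standing in for the reversible watermark embedding that the paper actually combines with the signature:

      import hashlib
      import zlib
      from cryptography.hazmat.primitives.asymmetric.ed25519 import (
          Ed25519PrivateKey,
      )

      def pack_for_downlink(cube_bytes, private_key):
          # Losslessly compress the hyperspectral payload, then sign a
          # digest of the compressed stream.
          compressed = zlib.compress(cube_bytes, level=9)
          digest = hashlib.sha256(compressed).digest()
          return compressed, private_key.sign(digest)

      key = Ed25519PrivateKey.generate()
      payload, sig = pack_for_downlink(b"...raw cube bytes...", key)
      # Receiver side: raises InvalidSignature if the payload was altered.
      key.public_key().verify(sig, hashlib.sha256(payload).digest())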

  18. New Ground Truth Capability from InSAR Time Series Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, S; Vincent, P; Yang, D

    2005-07-13

We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter per year surface movements when sufficient data exists for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
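
    The "invert conventional displacement maps in a final post-processing step" approach reduces, per pixel, to a small least-squares problem relating each interferogram's phase to the displacement difference between its two acquisition dates. A toy NumPy version, with illustrative dates, pairs, and phase values:

      import numpy as np

      def invert_time_series(pairs, phases, n_dates):
          # Solve for cumulative displacement at each date (date 0 fixed
          # at zero); each interferogram row encodes d[j] - d[i] = phase.
          A = np.zeros((len(pairs), n_dates - 1))
          for row, (i, j) in enumerate(pairs):
              if j > 0:
                  A[row, j - 1] = 1.0
              if i > 0:
                  A[row, i - 1] = -1.0
          d, *_ = np.linalg.lstsq(A, np.asarray(phases), rcond=None)
          return np.concatenate([[0.0], d])

      # Three dates, interferograms spanning 0-1, 1-2, and 0-2:
      disp = invert_time_series([(0, 1), (1, 2), (0, 2)], [1.0, 0.5, 1.6], 3)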

  19. Automated Breast Density Computation in Digital Mammography and Digital Breast Tomosynthesis: Influence on Mean Glandular Dose and BIRADS Density Categorization.

    PubMed

    Castillo-García, Maria; Chevalier, Margarita; Garayoa, Julia; Rodriguez-Ruiz, Alejandro; García-Pinto, Diego; Valverde, Julio

    2017-07-01

    The study aimed to compare the breast density estimates from two algorithms on full-field digital mammography (FFDM) and digital breast tomosynthesis (DBT) and to analyze the clinical implications. We selected 561 FFDM and DBT examinations from patients without breast pathologies. Two versions of a commercial software (Quantra 2D and Quantra 3D) calculated the volumetric breast density automatically in FFDM and DBT, respectively. Other parameters such as area breast density and total breast volume were evaluated. We compared the results from both algorithms using the Mann-Whitney U non-parametric test and the Spearman's rank coefficient for data correlation analysis. Mean glandular dose (MGD) was calculated following the methodology proposed by Dance et al. Measurements with both algorithms are well correlated (r ≥ 0.77). However, there are statistically significant differences between the medians (P < 0.05) of most parameters. The volumetric and area breast density median values from FFDM are, respectively, 8% and 77% higher than DBT estimations. Both algorithms classify 35% and 55% of breasts into BIRADS (Breast Imaging-Reporting and Data System) b and c categories, respectively. There are no significant differences between the MGD calculated using the breast density from each algorithm. DBT delivers higher MGD than FFDM, with a lower difference (5%) for breasts in the BIRADS d category. MGD is, on average, 6% higher than values obtained with the breast glandularity proposed by Dance et al. Breast density measurements from both algorithms lead to equivalent BIRADS classification and MGD values, hence showing no difference in clinical outcomes. The median MGD values of FFDM and DBT examinations are similar for dense breasts (BIRADS d category). Published by Elsevier Inc.
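
    The statistical machinery here is standard; a sketch with SciPy on hypothetical paired density estimates (the study itself used 561 examinations):

      from scipy import stats

      ffdm = [9.1, 14.2, 22.5, 7.8, 18.0, 11.3]    # illustrative FFDM PD%
      dbt = [8.4, 13.1, 20.9, 7.5, 16.8, 10.6]     # illustrative DBT PD%

      rho, p_rho = stats.spearmanr(ffdm, dbt)      # rank correlation
      u_stat, p_u = stats.mannwhitneyu(ffdm, dbt)  # median-difference test
      print(f"Spearman r = {rho:.2f} (p = {p_rho:.3f}); "
            f"Mann-Whitney U = {u_stat:.0f} (p = {p_u:.3f})")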

  20. Enhancing nuclear quadrupole resonance (NQR) signature detection leveraging interference suppression algorithms

    NASA Astrophysics Data System (ADS)

    DeBardelaben, James A.; Miller, Jeremy K.; Myrick, Wilbur L.; Miller, Joel B.; Gilbreath, G. Charmaine; Bajramaj, Blerta

    2012-06-01

    Nuclear quadrupole resonance (NQR) is a radio frequency (RF) magnetic spectroscopic technique that has been shown to detect and identify a wide range of explosive materials containing quadrupolar nuclei. The NQR response signal provides a unique signature of the material of interest. The signal is, however, very weak and can be masked by non-stationary RF interference (RFI) and thermal noise, limiting detection distance. In this paper, we investigate the bounds on the NQR detection range for ammonium nitrate. We leverage a low-cost RFI data acquisition system composed of inexpensive B-field sensing and commercial-off-the-shelf (COTS) software-defined radios (SDR). Using collected data as RFI reference signals, we apply adaptive filtering algorithms to mitigate RFI and enable NQR detection techniques to approach theoretical range bounds in tactical environments.
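
    One common realization of such RFI mitigation is an LMS adaptive noise canceller driven by the reference channel; the sketch below is written under that assumption (the abstract says only "adaptive filtering algorithms"), with illustrative tap count and step size.

      import numpy as np

      def lms_cancel(primary, reference, n_taps=32, mu=1e-3):
          # Predict the RFI in the primary channel from the reference
          # B-field sensor; the error output keeps the (uncorrelated) NQR
          # signal while the correlated interference is subtracted.
          w = np.zeros(n_taps)
          out = np.zeros_like(primary, dtype=float)
          for k in range(n_taps, len(primary)):
              x = reference[k - n_taps:k][::-1]   # newest sample first
              e = primary[k] - w @ x              # cleaned sample
              w += 2.0 * mu * e * x               # LMS weight update
              out[k] = e
          return out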

  1. Lidars for smoke and dust cloud diagnostics

    NASA Astrophysics Data System (ADS)

    Fujimura, S. F.; Warren, R. E.; Lutomirski, R. F.

    1980-11-01

    An algorithm that integrates a time-resolved lidar signature for use in estimating transmittance, extinction coefficient, mass concentration, and CL values generated under battlefield conditions is applied to lidar signatures measured during the DIRT-I tests. Estimates are given for the dependence of the inferred transmittance and extinction coefficient on uncertainties in parameters such as the obscurant backscatter-to-extinction ratio. The enhanced reliability in estimating transmittance through use of a target behind the obscurant cloud is discussed. It is found that the inversion algorithm can produce reliable estimates of smoke or dust transmittance and extinction from all points within the cloud for which a resolvable signal can be detected, and that a single point calibration measurement can convert the extinction values to mass concentration for each resolvable signal point.
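
    Under a constant backscatter-to-extinction ratio, integrating the range-corrected return yields two-way transmittance and extinction in closed form, which is the essence of this kind of inversion. A sketch under that assumption, with the single-point calibration folded into the product C_k (names are illustrative):

      import numpy as np

      def invert_lidar(rng, power, C_k):
          # Range-corrected signal S = P * r^2 = C_k * sigma * T^2, so the
          # running integral of S gives T^2 = 1 - 2 * integral(S) / C_k,
          # and then sigma = S / (C_k * T^2) at every resolvable gate.
          s = power * rng ** 2
          cum = np.concatenate(
              [[0.0], np.cumsum(0.5 * (s[1:] + s[:-1]) * np.diff(rng))]
          )
          T2 = 1.0 - 2.0 * cum / C_k
          sigma = s / (C_k * np.maximum(T2, 1e-9))
          return T2, sigma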

  2. FPGA implementation of digital down converter using CORDIC algorithm

    NASA Astrophysics Data System (ADS)

    Agarwal, Ashok; Lakshmi, Boppana

    2013-01-01

In radio receivers, Digital Down Converters (DDC) are used to translate the signal from the Intermediate Frequency level to baseband. A DDC also decimates the oversampled signal to a lower sample rate, eliminating the need for a high-end digital signal processor. In this paper we implement an architecture for a DDC employing the CORDIC algorithm, which down-converts a 70 MHz (3G) IF signal to a 200 kHz baseband GSM signal with an SFDR greater than 100 dB. The implemented architecture reduces the hardware resource requirements by 15 percent when compared with other architectures available in the literature, owing to the elimination of explicit multipliers and a quadrature phase shifter for mixing.

  3. Picture archiving and communication in radiology.

    PubMed

    Napoli, Marzia; Nanni, Marinella; Cimarra, Stefania; Crisafulli, Letizia; Campioni, Paolo; Marano, Pasquale

    2003-01-01

After over 80 years of exclusive archiving of radiologic films, digital archiving is now increasingly gaining ground in Radiology. Digital archiving allows a considerable reduction in costs and space, but most importantly it makes immediate or remote consultation of all examinations and reports feasible in the hospital's clinical wards. The RIS system, in this case, is the starting point of the electronic archiving process, which is, however, the task of the PACS. The latter can be used as a legally valid radiologic archive provided that it conforms to certain specifications, such as the use of long-term optical storage media or media with an electronic audit trail of changes. The PACS archives, in a hierarchical system, all digital images produced by each diagnostic imaging modality. Images and patient data can be retrieved and used for consultation or remote consultation by the reporting radiologist, who requires images and reports of previous radiologic examinations, or by the referring physician of the ward. Owing to their web servers, modern PACS allow greatly simplified remote access to images and data while still ensuring the required regulatory compliance and access protections. Since the PACS enables simpler data communication within the hospital, security and patient privacy must be protected. A secure and reliable PACS should be able to minimize the risk of accidental data destruction and should prevent unauthorized access to the archive with security measures adequate to current knowledge and technological advances. Archiving the data produced by modern digital imaging is a problem now present even in small Radiology services. Technology can readily solve problems which were extremely complex up to a few years ago, such as the connection between equipment and the archiving system, owing also to the universal adoption of the DICOM 3.0 standard. The evolution of communication networks and the use of standard protocols such as TCP/IP can minimize problems of remote transmission of data and images within the healthcare enterprise as well as over the territory. However, new problems are appearing, such as the security profiles of digital data and the different systems which should ensure them. Among these, algorithms for electronic signatures should be mentioned; in Italy they are validated by law and can therefore be used in legally compliant digital archives.

  4. Data Embedding for Covert Communications, Digital Watermarking, and Information Augmentation

    DTIC Science & Technology

    2000-03-01

proposed an image authentication algorithm based on the fragility of messages embedded in digital images using LSB encoding. In [Walt95], he proposes... [The remainder of this excerpt is table-of-contents residue listing sections on sample data embedding techniques, spatial-domain LSB encoding in intensity images, and figures on LSB encoding effects, the EzStego algorithm, and frequency-domain data embedding.]

  5. An image encryption algorithm based on 3D cellular automata and chaotic maps

    NASA Astrophysics Data System (ADS)

    Del Rey, A. Martín; Sánchez, G. Rodríguez

    2015-05-01

A novel algorithm to encrypt digital images is presented in this work. The digital image is rendered into a three-dimensional (3D) lattice and the protocol consists of two phases: the confusion phase, where 24 chaotic cat maps are applied, and the diffusion phase, where a 3D cellular automaton is evolved. The encryption method is shown to be secure against the most important cryptanalytic attacks.
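
    For the confusion phase, a 2-D Arnold cat map conveys the idea (the paper applies 24 chaotic cat maps on the 3-D lattice); a minimal NumPy sketch:

      import numpy as np

      def cat_map(image, n_rounds=1):
          # Move the pixel at (x, y) to (x + y, x + 2y) mod N. The map has
          # determinant 1, so it merely permutes (confuses) pixel positions
          # and is exactly invertible for decryption.
          n = image.shape[0]                      # assumes a square image
          out = image
          for _ in range(n_rounds):
              x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
              scrambled = np.empty_like(out)
              scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
              out = scrambled
          return out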

  6. Encoding Schemes For A Digital Optical Multiplier Using The Modified Signed-Digit Number Representation

    NASA Astrophysics Data System (ADS)

    Lasher, Mark E.; Henderson, Thomas B.; Drake, Barry L.; Bocker, Richard P.

    1986-09-01

    The modified signed-digit (MSD) number representation offers full parallel, carry-free addition. A MSD adder has been described by the authors. This paper describes how the adder can be used in a tree structure to implement an optical multiply algorithm. Three different optical schemes, involving position, polarization, and intensity encoding, are proposed for realizing the trinary logic system. When configured in the generic multiplier architecture, these schemes yield the combinatorial logic necessary to carry out the multiplication algorithm. The optical systems are essentially three dimensional arrangements composed of modular units. Of course, this modularity is important for design considerations, while the parallelism and noninterfering communication channels of optical systems are important from the standpoint of reduced complexity. The authors have also designed electronic hardware to demonstrate and model the combinatorial logic required to carry out the algorithm. The electronic and proposed optical systems will be compared in terms of complexity and speed.
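
    The carry-free addition that makes the multiplier tree attractive can be sketched in software: each position needs only its own digit sum and the sign of the digit sum one position below, so every output digit is computable in parallel. This uses one standard transfer/interim-sum rule for digits in {-1, 0, 1}, not necessarily the authors' exact encoding.

      def msd_add(x, y):
          # x, y: MSD digit lists, least-significant digit first.
          n = max(len(x), len(y))
          x = x + [0] * (n - len(x))
          y = y + [0] * (n - len(y))
          s = [x[i] + y[i] for i in range(n)]     # position sums, -2..2
          t = [0] * (n + 1)                       # transfer into position i
          w = [0] * n                             # interim sums
          for i in range(n):
              lower = s[i - 1] if i > 0 else 0    # sign of neighbor below
              if s[i] >= 2:
                  t[i + 1], w[i] = 1, s[i] - 2
              elif s[i] == 1:
                  t[i + 1], w[i] = (1, -1) if lower >= 0 else (0, 1)
              elif s[i] == -1:
                  t[i + 1], w[i] = (0, -1) if lower >= 0 else (-1, 1)
              elif s[i] <= -2:
                  t[i + 1], w[i] = -1, s[i] + 2
          return [w[i] + t[i] for i in range(n)] + [t[n]]

      # -1 is written [1, -1] (LSB first); msd_add([1, -1], [1]) sums to 0.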

  7. Hyperspectral signatures and WorldView-3 imagery of Indian River Lagoon and Banana River Estuarine water and bottom types

    NASA Astrophysics Data System (ADS)

    Bostater, Charles R.; Oney, Taylor S.; Rotkiske, Tyler; Aziz, Samin; Morrisette, Charles; Callahan, Kelby; Mcallister, Devin

    2017-10-01

    Hyperspectral signatures and imagery collected during the spring and summer of 2017 and 2016 are presented. Ground sampling distances (GSD) and pixel sizes were sampled from just over a meter to less than 4.0 mm. A pushbroom hyperspectral imager was used to calculate bidirectional reflectance factor (BRF) signatures. Hyperspectral signatures of different water types and bottom habitats such as submerged seagrasses, drift algae and algal bloom waters were scanned using a high spectral and digital resolution solid state spectrograph. WorldView-3 satellite imagery with minimal water wave sun glint effects was used to demonstrate the ability to detect bottom features using a derivative reflectance spectroscopy approach with the 1.3 m GSD multispectral satellite channels centered at the solar induced fluorescence band. The hyperspectral remote sensing data collected from the Banana River and Indian River Lagoon watersheds represents previously unknown signatures to be used in satellite and airborne remote sensing of water in turbid waters along the US Atlantic Ocean coastal region and the Florida littoral zone.

  8. Experiment research on infrared targets signature in mid and long IR spectral bands

    NASA Astrophysics Data System (ADS)

    Wang, Chensheng; Hong, Pu; Lei, Bo; Yue, Song; Zhang, Zhijie; Ren, Tingting

    2013-09-01

Since infrared imaging systems play a significant role in military self-defense and fire-control systems, the radiation signature of IR targets has become an important topic in IR imaging application technology. IR target signatures can be applied to target identification, especially for small and dim targets, as well as to target IR thermal design. To research and analyze target IR signatures systematically, a practical experimental campaign was carried out under different backgrounds and conditions. An infrared radiation acquisition system based on a cooled MWIR thermal imager and a cooled LWIR thermal imager was developed to capture digital infrared images. Furthermore, additional instruments were introduced to provide other parameters. From the original image data and the related parameters in a given scene, the IR signature of the target scene of interest can be calculated. Different backgrounds and targets were measured with this approach, and a comparative analysis is presented in this paper as an example. This experiment demonstrates the validity of the approach, which is useful in detection performance evaluation and further target identification research.

  9. Signature simulation of mixed materials

    NASA Astrophysics Data System (ADS)

    Carson, Tyler D.; Salvaggio, Carl

    2015-05-01

    Soil target signatures vary due to geometry, chemical composition, and scene radiometry. Although radiative transfer models and function-fit physical models may describe certain targets in limited depth, the ability to incorporate all three signature variables is difficult. This work describes a method to simulate the transient signatures of soil by first considering scene geometry synthetically created using 3D physics engines. Through the assignment of spectral data from the Nonconventional Exploitation Factors Data System (NEFDS), the synthetic scene is represented as a physical mixture of particles. Finally, first principles radiometry is modeled using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. With DIRSIG, radiometric and sensing conditions were systematically manipulated to produce and record goniometric signatures. The implementation of this virtual goniometer allows users to examine how a target bidirectional reflectance distribution function (BRDF) will change with geometry, composition, and illumination direction. By using 3D computer graphics models, this process does not require geometric assumptions that are native to many radiative transfer models. It delivers a discrete method to circumnavigate the significant cost of time and treasure associated with hardware-based goniometric data collections.

  10. Optimized algorithm for the spatial nonuniformity correction of an imaging system based on a charge-coupled device color camera.

    PubMed

    de Lasarte, Marta; Pujol, Jaume; Arjona, Montserrat; Vilaseca, Meritxell

    2007-01-10

    We present an optimized linear algorithm for the spatial nonuniformity correction of a CCD color camera's imaging system and the experimental methodology developed for its implementation. We assess the influence of the algorithm's variables on the quality of the correction, that is, the dark image, the base correction image, and the reference level, and the range of application of the correction using a uniform radiance field provided by an integrator cube. The best spatial nonuniformity correction is achieved by having a nonzero dark image, by using an image with a mean digital level placed in the linear response range of the camera as the base correction image and taking the mean digital level of the image as the reference digital level. The response of the CCD color camera's imaging system to the uniform radiance field shows a high level of spatial uniformity after the optimized algorithm has been applied, which also allows us to achieve a high-quality spatial nonuniformity correction of captured images under different exposure conditions.
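
    The optimized correction is, at heart, a two-point linear flat-field rescaled to a reference digital level; a sketch of that arithmetic (array names are illustrative):

      import numpy as np

      def correct_nonuniformity(raw, dark, base, ref_level=None):
          # Subtract the (nonzero) dark image, normalize by the
          # dark-corrected base image of the uniform radiance field, and
          # rescale to the reference digital level -- here the base
          # image's mean, per the optimization described above.
          flat = base.astype(float) - dark
          if ref_level is None:
              ref_level = flat.mean()
          return (raw.astype(float) - dark) * ref_level / np.maximum(flat, 1e-6)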

  11. 10 CFR 2.4 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and safety or the common defense and security; security measures for the physical protection and... computer that contains the participant's name, e-mail address, and participant's digital signature, proves... inspection. It is also the place where NRC makes computer terminals available to access the Publicly...

  12. 10 CFR 2.4 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and safety or the common defense and security; security measures for the physical protection and... computer that contains the participant's name, e-mail address, and participant's digital signature, proves... inspection. It is also the place where NRC makes computer terminals available to access the Publicly...

  13. Real social analytics: A contribution towards a phenomenology of a digital world.

    PubMed

    Couldry, Nick; Fotopoulou, Aristea; Dickens, Luke

    2016-03-01

    This article argues against the assumption that agency and reflexivity disappear in an age of 'algorithmic power' (Lash 2007). Following the suggestions of Beer (2009), it proposes that, far from disappearing, new forms of agency and reflexivity around the embedding in everyday practice of not only algorithms but also analytics more broadly are emerging, as social actors continue to pursue their social ends but mediated through digital interfaces: this is the consequence of many social actors now needing their digital presence, regardless of whether they want this, to be measured and counted. The article proposes 'social analytics' as a new topic for sociology: the sociological study of social actors' uses of analytics not for the sake of measurement itself (or to make profit from measurement) but in order to fulfil better their social ends through an enhancement of their digital presence. The article places social analytics in the context of earlier debates about categorization, algorithmic power, and self-presentation online, and describes in detail a case study with a UK community organization which generated the social analytics approach. The article concludes with reflections on the implications of this approach for further sociological fieldwork in a digital world. © London School of Economics and Political Science 2016.

  14. Digital Timing Recovery for High Speed Optical Drives

    NASA Astrophysics Data System (ADS)

    Ko, Seok Jun; Kim, Pan Soo; Choi, Hyung Jin; Lee, Jae-Wook

    2002-03-01

A new digital timing recovery scheme for optical drive systems is presented. In comparative simulations using digital versatile disc (DVD) patterns with marginal input conditions, the proposed algorithm improves jitter variance by a factor of four and the signal-to-noise ratio (SNR) margin by 3 dB.

  15. Governing Software: Networks, Databases and Algorithmic Power in the Digital Governance of Public Education

    ERIC Educational Resources Information Center

    Williamson, Ben

    2015-01-01

    This article examines the emergence of "digital governance" in public education in England. Drawing on and combining concepts from software studies, policy and political studies, it identifies some specific approaches to digital governance facilitated by network-based communications and database-driven information processing software…

  16. Digital health technology and trauma: development of an app to standardize care.

    PubMed

    Hsu, Jeremy M

    2015-04-01

Standardized practice results in less variation, thereby reducing errors and improving outcomes. Optimal trauma care is achieved through standardization, as evidenced by the widespread adoption of the Advanced Trauma Life Support approach. The challenge for an individual institution is how to educate and promulgate these standardized processes widely and efficiently. In today's world, digital health technology must be considered in the process. The aim of this study was to describe the process of developing an app that includes standardized trauma algorithms. The objective of the app was to allow easy, real-time access to trauma algorithms and therefore reduce omissions and errors. A set of trauma algorithms, relevant to the local setting, was derived from the best available evidence. After obtaining grant funding, a collaborative endeavour was undertaken with an external specialist app-development company. The process required 6 months to translate the existing trauma algorithms into an app. The app contains 32 separate trauma algorithms, each formatted as a single-page flow diagram. It utilizes specific smartphone features such as 'pinch to zoom', jump-words and pop-ups to allow rapid access to the desired information. Improvements in trauma care outcomes result from reducing variation. By incorporating digital health technology, a trauma app has been developed, allowing easy and intuitive access to evidence-based algorithms. © 2015 Royal Australasian College of Surgeons.

  17. The DataBridge: A System For Optimizing The Use Of Dark Data From The Long Tail Of Science

    NASA Astrophysics Data System (ADS)

    Lander, H.; Rajasekar, A.

    2015-12-01

The DataBridge is a National Science Foundation funded collaborative project (OCI-1247652, OCI-1247602, OCI-1247663) designed to assist in the discovery of dark data sets from the long tail of science. The DataBridge aims to build queryable communities of datasets using sociometric network analysis (SNA). This approach is being tested to evaluate the ability to leverage various forms of metadata to facilitate the discovery of new knowledge. Each dataset in the DataBridge has an associated name space used as a first-level partitioning. In addition to testing known algorithms for SNA community building, the DataBridge project has built a message-based platform that allows users to provide their own algorithms for each of the stages in the community-building process. The stages are: Signature Generation (SG): an SG algorithm creates a metadata signature for a dataset; signature algorithms might use text metadata provided by the dataset creator or derive metadata. Relevance Algorithm (RA): an RA compares a pair of datasets and produces a similarity value between 0 and 1 for the two datasets. Sociometric Network Analysis (SNA): the SNA stage operates on the similarity matrix produced by an RA to partition all of the datasets in the name space into a set of clusters; these clusters represent communities of closely related datasets. The DataBridge also includes a web application that produces a visual representation of the clustering. Future work includes a more complete application that will allow different types of searching over the network of datasets. The DataBridge approach is relevant to geoscience research and informatics. In this presentation we will outline the project, illustrate the deployment of the approach, and discuss other potential applications and next steps for the research, such as applying this approach to models. In addition we will explore the relevance of DataBridge to other geoscience projects, such as various EarthCube Building Blocks and DIBBS projects.
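
    A toy end-to-end illustration of the three pluggable stages, with bag-of-words signatures, cosine-similarity relevance, and thresholded connected components standing in for real SNA community detection (every algorithmic choice here is a placeholder for ones the platform accepts):

      import numpy as np

      def signatures(texts):
          # SG stage: term-frequency vectors from dataset metadata text.
          vocab = sorted({w for t in texts for w in t.lower().split()})
          index = {w: i for i, w in enumerate(vocab)}
          sig = np.zeros((len(texts), len(vocab)))
          for r, t in enumerate(texts):
              for w in t.lower().split():
                  sig[r, index[w]] += 1.0
          return sig

      def relevance(sig):
          # RA stage: cosine similarity in [0, 1] for every dataset pair.
          unit = sig / np.linalg.norm(sig, axis=1, keepdims=True)
          return unit @ unit.T

      def communities(sim, threshold=0.3):
          # SNA stage (toy): connected components of the thresholded graph.
          labels = list(range(len(sim)))
          for i in range(len(sim)):
              for j in range(i + 1, len(sim)):
                  if sim[i, j] >= threshold:
                      old, new = labels[j], labels[i]
                      labels = [new if l == old else l for l in labels]
          return labels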

  18. Application of square-root filtering for spacecraft attitude control

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Schmidt, S. F.; Goka, T.

    1978-01-01

    Suitable digital algorithms are developed and tested for providing on-board precision attitude estimation and pointing control for potential use in the Landsat-D spacecraft. These algorithms provide pointing accuracy of better than 0.01 deg. To obtain the necessary precision with efficient software, a six-state-variable square-root Kalman filter combines two star tracker measurements to update attitude estimates obtained from processing three gyro outputs. The validity of the estimation and control algorithms is established, and the sensitivity of their performance to various error sources and software parameters is investigated by detailed digital simulation. Spacecraft computer memory, cycle time, and accuracy requirements are estimated.
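
    For readers unfamiliar with square-root filtering, the numerical idea is to propagate a factor S of the covariance (P = S Sᵀ) instead of P itself, which preserves symmetry and positive semidefiniteness in finite-precision arithmetic. Below is a minimal sketch of Potter's classic square-root measurement update for a scalar measurement; it is a generic textbook form, not the Landsat-D flight software, and the six-state star-tracker/gyro structure is omitted.

    ```python
    import numpy as np

    def potter_update(x, S, h, z, r):
        """Potter square-root measurement update for a scalar measurement
        z = h @ x + v, with measurement variance r and covariance P = S @ S.T."""
        phi = S.T @ h                               # n-vector
        alpha = 1.0 / (phi @ phi + r)               # inverse innovation variance
        beta = alpha / (1.0 + np.sqrt(alpha * r))
        K = alpha * (S @ phi)                       # Kalman gain
        x_new = x + K * (z - h @ x)                 # state update
        S_new = S - beta * np.outer(S @ phi, phi)   # factor update: P+ = S+ @ S+.T
        return x_new, S_new

    # Toy two-state example (angle, rate), measuring the angle only.
    x = np.array([0.0, 0.0])
    S = np.eye(2)                                   # P = I initially
    h = np.array([1.0, 0.0])
    x, S = potter_update(x, S, h, z=1.0, r=0.5)
    print(x, S @ S.T)                               # updated state and covariance
    ```

    Vector measurements are typically handled by applying the scalar update once per component (or via QR-based array forms); the paper's filter additionally interleaves gyro-driven propagation between star-tracker updates.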

  19. A new approach of watermarking technique by means multichannel wavelet functions

    NASA Astrophysics Data System (ADS)

    Agreste, Santa; Puccio, Luigia

    2012-12-01

    Digital piracy involving images, music, movies, books, and so on is a legal problem for which no solution has yet been found. It therefore becomes crucial to create and develop methods and numerical algorithms to solve the copyright problem. In this paper we focus attention on a new approach to watermarking applied to digital color images. Our aim is to describe the realized watermarking algorithm, called MCWM 1.0, based on multichannel wavelet functions with multiplicity r = 3. We report extensive experimentation and some important numerical results showing the robustness of the proposed algorithm to geometrical attacks.

  20. Modeling pilot interaction with automated digital avionics systems: Guidance and control algorithms for contour and nap-of-the-Earth flight

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.

    1990-01-01

    A collection of technical papers is presented covering modeling of pilot interaction with automated digital avionics systems and guidance and control algorithms for contour and nap-of-the-earth flight. The titles of the papers are as follows: (1) Automation effects in a multiloop manual control system; (2) A qualitative model of human interaction with complex dynamic systems; (3) Generalized predictive control of dynamic systems; (4) An application of generalized predictive control to rotorcraft terrain-following flight; (5) Self-tuning generalized predictive control applied to terrain-following flight; and (6) Precise flight path control using a predictive algorithm.

  1. A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Rao, Hariprasad Nannapaneni

    1989-01-01

    The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.

  2. Digital SAR processing using a fast polynomial transform

    NASA Technical Reports Server (NTRS)

    Butman, S.; Lipes, R.; Rubin, A.; Truong, T. K.

    1981-01-01

    A new digital processing algorithm based on the fast polynomial transform is developed for producing images from Synthetic Aperture Radar data. This algorithm enables the computation of the two dimensional cyclic correlation of the raw echo data with the impulse response of a point target, thereby reducing distortions inherent in one dimensional transforms. This SAR processing technique was evaluated on a general-purpose computer and an actual Seasat SAR image was produced. However, regular production runs will require a dedicated facility. It is expected that such a new SAR processing algorithm could provide the basis for a real-time SAR correlator implementation in the Deep Space Network.
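
    The core operation described, a two-dimensional cyclic correlation of the raw echo data with a point-target impulse response, can be written directly with FFTs; the paper's fast polynomial transform is a different, multiplication-saving route to the same cyclic correlation. A minimal numpy sketch with random stand-in data:

    ```python
    import numpy as np

    def cyclic_correlate_2d(echo, impulse_response):
        """2-D cyclic (circular) cross-correlation via the convolution theorem:
        corr = IFFT2( FFT2(echo) * conj(FFT2(h)) ), with h zero-padded."""
        H = np.fft.fft2(impulse_response, s=echo.shape)
        return np.fft.ifft2(np.fft.fft2(echo) * np.conj(H))

    rng = np.random.default_rng(0)
    echo = rng.standard_normal((256, 256)) + 1j * rng.standard_normal((256, 256))
    h = rng.standard_normal((32, 32)) + 1j * rng.standard_normal((32, 32))
    image = np.abs(cyclic_correlate_2d(echo, h))  # focused image magnitude
    print(image.shape)  # (256, 256)
    ```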

  3. Implementation theory of distortion-invariant pattern recognition for optical and digital signal processing systems

    NASA Astrophysics Data System (ADS)

    Lhamon, Michael Earl

    A pattern recognition system which uses complex correlation filter banks requires proportionally more computational effort than single real-valued filters. This introduces an increased computation burden but also a higher level of parallelism that common computing platforms fail to exploit. As a result, we consider algorithm mapping to both optical and digital processors. For digital implementation, we develop computationally efficient pattern recognition algorithms, referred to as vector inner product operators, which require less computational effort than traditional fast Fourier methods. These algorithms do not need correlation and they map readily onto parallel digital architectures, which suggests new architectures for optical processors. These filters exploit circulant-symmetric matrix structures of the training set data representing a variety of distortions. By using the same mathematical basis as with the vector inner product operations, we are able to extend the capabilities of more traditional correlation filtering to what we refer to as "Super Images". These "Super Images" are used to morphologically transform a complicated input scene into a predetermined dot pattern. The orientation of the dot pattern is related to the rotational distortion of the object of interest. The optical implementation of "Super Images" yields the feature reduction necessary for using other techniques, such as artificial neural networks. We propose a parallel digital signal processor architecture based on specific pattern recognition algorithms but general enough to be applicable to other similar problems. Such an architecture is classified as a data flow architecture. Instead of mapping an algorithm to an architecture, we propose mapping the DSP architecture to a class of pattern recognition algorithms. Today's optical processing systems have difficulties implementing full complex filter structures. Typically, optical systems (like the 4f correlators) are limited to phase-only implementation with lower detection performance than full complex electronic systems. Our study includes pseudo-random pixel encoding techniques for approximating full complex filtering. Optical filter bank implementations are possible and have the advantage of time-averaging the entire filter bank at real-time rates. Time-averaged optical filtering is computationally comparable to billions of digital operations per second. For this reason, we believe future trends in high-speed pattern recognition will involve hybrid architectures of both optical and DSP elements.

  4. FPGA implementation of Santos-Victor optical flow algorithm for real-time image processing: an useful attempt

    NASA Astrophysics Data System (ADS)

    Cobos Arribas, Pedro; Monasterio Huelin Macia, Felix

    2003-04-01

    An FPGA-based hardware implementation of the Santos-Victor optical flow algorithm, useful in robot guidance applications, is described in this paper. The system contains an ALTERA FPGA (20K100), an interface to a digital camera, three VRAM memories to hold the input data, and output memories (a VRAM and an EDO) to hold the results. The system had been used previously to develop and test other vision algorithms, such as image compression and optical flow calculation with differential and correlation methods. The designed system allows connecting the digital camera, or the FPGA output (the results of the algorithms), to a PC through its FireWire or USB port. The problems encountered on this occasion have motivated the adoption of another hardware structure for certain vision algorithms with special requirements that need very code-intensive processing.

  5. Digital codec for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary JO; Whyte, Wayne A., Jr.

    1989-01-01

    The authors present the hardware implementation of a digital television bandwidth compression algorithm which processes standard NTSC (National Television Systems Committee) composite color television signals and produces broadcast-quality video in real time at an average of 1.8 b/pixel. The sampling rate used with this algorithm results in 768 samples over the active portion of each video line by 512 active video lines per video frame. The algorithm is based on differential pulse code modulation (DPCM), but additionally utilizes a nonadaptive predictor, nonuniform quantizer, and multilevel Huffman coder to reduce the data rate substantially below that achievable with straight DPCM. The nonadaptive predictor and multilevel Huffman coder combine to set this technique apart from prior-art DPCM encoding algorithms. The authors describe the data compression algorithm and the hardware implementation of the codec and provide performance results.
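
    As a rough illustration of the DPCM core only (not the codec's actual nonadaptive predictor, nonuniform quantizer or multilevel Huffman coder), the sketch below encodes a line of samples with a previous-sample predictor and a uniform quantizer; the encoder tracks the decoder's reconstruction so the two stay in lockstep.

    ```python
    import numpy as np

    def dpcm_encode(samples, step=4):
        """DPCM with a previous-sample predictor and uniform quantizer (toy)."""
        codes, recon = [], 0.0
        for s in samples:
            q = int(round((s - recon) / step))  # quantized prediction error
            codes.append(q)
            recon += q * step                   # decoder-tracking reconstruction
        return codes

    def dpcm_decode(codes, step=4):
        out, recon = [], 0.0
        for q in codes:
            recon += q * step
            out.append(recon)
        return out

    line = np.array([100, 104, 109, 107, 102, 95], dtype=float)
    codes = dpcm_encode(line)
    print(codes)               # small integers: cheap to entropy-code (e.g., Huffman)
    print(dpcm_decode(codes))  # reconstruction within +/- step/2, drift-free
    ```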

  6. Analysis of Gene Expression Profiles of Multiple Skin Diseases Identifies a Conserved Signature of Disrupted Homeostasis.

    PubMed

    Mills, Kevin J; Robinson, Michael K; Sherrill, Joseph D; Schnell, Daniel J; Xu, Jun

    2018-05-28

    Triggers of skin disease pathogenesis vary, but events associated with the elicitation of a lesion share many features in common. Our objective was to examine gene expression patterns in skin disease to develop a molecular signature of disruption of cutaneous homeostasis. Gene expression data from common inflammatory skin diseases (e.g., psoriasis, atopic dermatitis, seborrheic dermatitis and acne), and a novel statistical algorithm were used to define a unifying molecular signature referred to as the "Unhealthy Skin Signature" (USS). Using a pattern matching algorithm, analysis of public data repositories revealed that the USS is found in diverse epithelial diseases. Studies of milder disruptions of epidermal homeostasis have also shown that these conditions converge, to varying degrees, on the USS and that the degree of convergence is related directly to the severity of homeostatic disruption. The USS contains genes that had no prior published association with skin, but that play important roles in many different disease processes, supporting the importance of the USS to homeostasis. Finally, we show through pattern matching that the USS can be used to discover new potential dermatologic therapeutics. The USS provides a new means to further interrogate epithelial homeostasis and potentially develop novel therapeutics with efficacy across a spectrum of skin conditions. This article is protected by copyright. All rights reserved.

  7. Array signal recovery algorithm for a single-RF-channel DBF array

    NASA Astrophysics Data System (ADS)

    Zhang, Duo; Wu, Wen; Fang, Da Gang

    2016-12-01

    An array signal recovery algorithm based on sparse signal reconstruction theory is proposed for a single-RF-channel digital beamforming (DBF) array. A single-RF-channel antenna array is a low-cost antenna array in which signals are obtained from all antenna elements by only one microwave digital receiver. The spatially parallel array signals are converted into time-sequence signals, which are then sampled by the system. The proposed algorithm uses these time-sequence samples to recover the original parallel array signals by exploiting the second-order sparse structure of the array signals. Additionally, an optimization method based on the artificial bee colony (ABC) algorithm is proposed to improve the reconstruction performance. Using the proposed algorithm, the motion compensation problem for the single-RF-channel DBF array can be solved effectively, and the angle and Doppler information for the target can be simultaneously estimated. The effectiveness of the proposed algorithms is demonstrated by the results of numerical simulations.

  8. Rapid execution of fan beam image reconstruction algorithms using efficient computational techniques and special-purpose processors

    NASA Astrophysics Data System (ADS)

    Gilbert, B. K.; Robb, R. A.; Chu, A.; Kenue, S. K.; Lent, A. H.; Swartzlander, E. E., Jr.

    1981-02-01

    Rapid advances during the past ten years of several forms of computer-assisted tomography (CT) have resulted in the development of numerous algorithms to convert raw projection data into cross-sectional images. These reconstruction algorithms are either 'iterative,' in which a large matrix algebraic equation is solved by successive approximation techniques; or 'closed form'. Continuing evolution of the closed form algorithms has allowed the newest versions to produce excellent reconstructed images in most applications. This paper will review several computer software and special-purpose digital hardware implementations of closed form algorithms, either proposed during the past several years by a number of workers or actually implemented in commercial or research CT scanners. The discussion will also cover a number of recently investigated algorithmic modifications which reduce the amount of computation required to execute the reconstruction process, as well as several new special-purpose digital hardware implementations under development in laboratories at the Mayo Clinic.

  9. Quantum Simulation of Tunneling in Small Systems

    PubMed Central

    Sornborger, Andrew T.

    2012-01-01

    A number of quantum algorithms have been performed on small quantum computers; these include Shor's prime factorization algorithm, error correction, Grover's search algorithm and a number of analog and digital quantum simulations. Because of the number of gates and qubits necessary, however, digital quantum particle simulations remain untested. A contributing factor to the system size required is the number of ancillary qubits needed to implement matrix exponentials of the potential operator. Here, we show that a set of tunneling problems may be investigated with no ancillary qubits and a cost of one single-qubit operator per time step for the potential evolution, eliminating at least half of the quantum gates required for the algorithm and more than that in the general case. Such simulations are within reach of current quantum computer architectures. PMID:22916333

  10. Digital Audio Signal Processing and Nde: AN Unlikely but Valuable Partnership

    NASA Astrophysics Data System (ADS)

    Gaydecki, Patrick

    2008-02-01

    In the Digital Signal Processing (DSP) group, within the School of Electrical and Electronic Engineering at The University of Manchester, research is conducted into two seemingly distinct and disparate subjects: instrumentation for nondestructive evaluation, and DSP systems & algorithms for digital audio. We have often found that many of the hardware systems and algorithms employed to recover, extract or enhance audio signals may also be applied to signals provided by ultrasonic or magnetic NDE instruments. Furthermore, modern DSP hardware is so fast (typically performing hundreds of millions of operations per second), that much of the processing and signal reconstruction may be performed in real time. Here, we describe some of the hardware systems we have developed, together with algorithms that can be implemented both in real time and offline. A next generation system has now been designed, which incorporates a processor operating at 0.55 Giga MMACS, six input and eight output analogue channels, digital input/output in the form of S/PDIF, a JTAG and a USB interface. The software allows the user, with no knowledge of filter theory or programming, to design and run standard or arbitrary FIR, IIR and adaptive filters. Using audio as a vehicle, we can demonstrate the remarkable properties of modern reconstruction algorithms when used in conjunction with such hardware; applications in NDE include signal enhancement and recovery in acoustic, ultrasonic, magnetic and eddy current modalities.
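
    The user-facing filter-design workflow described (specifying a standard FIR filter without knowing filter theory or programming) is conceptually close to the following scipy sketch, which designs a low-pass FIR filter and runs it over a noisy test tone. The sample rate, cutoff and tap count are arbitrary illustrative choices, not parameters of the described hardware.

    ```python
    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 48_000                                        # audio-style sample rate (Hz)
    taps = firwin(numtaps=101, cutoff=3_000, fs=fs)    # linear-phase low-pass FIR

    t = np.arange(0, 0.05, 1 / fs)
    signal = np.sin(2 * np.pi * 1_000 * t)             # in-band tone
    noise = 0.5 * np.sin(2 * np.pi * 12_000 * t)       # out-of-band interference
    filtered = lfilter(taps, [1.0], signal + noise)    # FIR: denominator is 1

    # The 12 kHz component is strongly attenuated; the 1 kHz tone passes.
    print(f"input RMS {np.std(signal + noise):.3f} -> filtered RMS {np.std(filtered):.3f}")
    ```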

  11. An all digital phase locked loop for FM demodulation.

    NASA Technical Reports Server (NTRS)

    Greco, J.; Garodnick, J.; Schilling, D. L.

    1972-01-01

    A phase-locked loop designed with all-digital circuitry, which avoids certain problems, and a digital voltage-controlled oscillator algorithm are described. The system operates synchronously and performs all required digital calculations within one sampling period, thereby acting as a real-time special-purpose computer. The signal-to-noise ratio (SNR) is computed for frequency offsets and sinusoidal modulation, and experimental results verify the theoretical calculations.
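
    A digital PLL FM demodulator can be sketched compactly at complex baseband: a phase detector, a PI loop filter, and an NCO whose frequency-control word is the demodulated message. This is a generic illustration under those assumptions, not the paper's 1972 all-digital hardware loop, and the gains are arbitrary.

    ```python
    import numpy as np

    def pll_fm_demod(x, fs, kp=0.1, ki=0.01):
        """Digital PLL FM demodulator (complex-baseband sketch). The PI
        loop-filter output is the NCO phase increment, which tracks the
        instantaneous frequency, i.e., the FM message."""
        phase, integ = 0.0, 0.0
        out = np.empty(len(x))
        for n, s in enumerate(x):
            err = np.angle(s * np.exp(-1j * phase))  # phase detector
            integ += ki * err
            ctrl = kp * err + integ                  # PI loop filter
            phase += ctrl                            # NCO phase update
            out[n] = ctrl * fs / (2 * np.pi)         # instantaneous frequency (Hz)
        return out

    fs = 100_000
    t = np.arange(0, 0.02, 1 / fs)
    message = np.cos(2 * np.pi * 200 * t)
    fm = np.exp(1j * 2 * np.pi * 2_000 * np.cumsum(message) / fs)  # 2 kHz deviation
    demod = pll_fm_demod(fm, fs)
    print(np.corrcoef(message[500:], demod[500:])[0, 1])  # close to 1 after lock
    ```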

  12. Kramers-Kronig receiver operable without digital upsampling.

    PubMed

    Bo, Tianwai; Kim, Hoon

    2018-05-28

    The Kramers-Kronig (KK) receiver is capable of retrieving the phase information of an optical single-sideband (SSB) signal from the optical intensity when the optical signal satisfies the minimum phase condition. Thus, it is possible to direct-detect the optical SSB signal without suffering from signal-signal beat interference and linear transmission impairments. However, due to the spectral broadening induced by the nonlinear operations in the conventional KK algorithm, it is necessary to employ digital upsampling at the beginning of the digital signal processing (DSP). The increased number of samples in the DSP would hinder the real-time implementation of this attractive receiver. Hence, we propose a new DSP algorithm for the KK receiver operable at 2 samples per symbol. We adopt a couple of mathematical approximations to avoid the use of nonlinear operations such as the logarithm and exponential functions. Using the proposed algorithm, we demonstrate the transmission of a 112-Gb/s SSB orthogonal frequency-division-multiplexed signal over an 80-km fiber link. The results show that the proposed algorithm operating at 2 samples per symbol exhibits performance similar to the conventional KK algorithm operating at 6 samples per symbol. We also present an error analysis of the proposed algorithm for the KK receiver in comparison with the conventional one.
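
    For context, the conventional KK step the authors simplify recovers the field's phase from intensity alone via a Hilbert transform of the log-amplitude; the logarithm is one of the nonlinear operations that broadens the spectrum and forces upsampling. A minimal sketch of that conventional step, assuming a minimum-phase test signal in which the carrier dominates the sideband:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def kk_phase_retrieval(intensity):
        """Conventional Kramers-Kronig receiver step: for a minimum-phase
        signal, phase = H{ln|E|} = (1/2) H{ln I}, with H the Hilbert transform."""
        log_amp = 0.5 * np.log(intensity)        # ln sqrt(I)
        phase = np.imag(hilbert(log_amp))        # H{.} = imag of analytic signal
        return np.sqrt(intensity) * np.exp(1j * phase)

    # Toy minimum-phase SSB-like field: strong carrier plus one weak sideband.
    n = np.arange(4096)
    field = 1.0 + 0.3 * np.exp(1j * 2 * np.pi * 40 * n / 4096)  # |0.3| < 1
    recovered = kk_phase_retrieval(np.abs(field) ** 2)
    print(np.max(np.abs(recovered - field)))     # numerically tiny
    ```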

  13. A SURVEY OF METAL LINES AT HIGH-REDSHIFT. I. SDSS ABSORPTION LINE STUDIES- THE METHODOLOGY AND FIRST SEARCH RESULTS FOR O VI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frank, S.; Mathur, S.; Pieri, M.

    2010-09-15

    We report the results of a systematic search for signatures of metal lines in quasar spectra of the Sloan Digital Sky Survey (SDSS) data release 3 (DR3), focusing on finding intervening absorbers via detection of their O VI doublet. Here, we present the search algorithm and criteria for distinguishing candidates from spurious Lyα forest lines. In addition, we compare our findings with simulations of the Lyα forest in order to estimate the detectability of O VI doublets over various redshift intervals. We have obtained a sample of 1756 O VI doublet candidates with rest-frame equivalent width (EW) ≥ 0.05 Å in 855 active galactic nuclei spectra (out of 3702 objects with redshifts in the accessible range for O VI detection). This sample is further subdivided into three groups according to the likelihood of being real and the potential for follow-up observation of the candidate. The group with the cleanest and most secure candidates comprises 145 candidates. Sixty-nine of these reside at a velocity separation ≥ 5000 km s⁻¹ from the QSO and can therefore be classified tentatively as intervening absorbers. Most of these absorbers had not been picked up by earlier, automated QSO absorption line detection algorithms. This sample substantially increases the number of known O VI absorbers at redshifts beyond z_abs ≥ 2.7.

  14. Nonrigid Image Registration in Digital Subtraction Angiography Using Multilevel B-Spline

    PubMed Central

    2013-01-01

    We address the problem of motion artifact reduction in digital subtraction angiography (DSA) using image registration techniques. Most registration algorithms proposed for application in DSA have been designed for peripheral and cerebral angiography images, in which we mainly deal with global rigid motions. These algorithms did not yield good results when applied to coronary angiography images because of the complex nonrigid motions that exist in this type of angiography image. Multiresolution and iterative algorithms have been proposed to cope with this problem, but they are associated with a high computational cost, which makes them unacceptable for real-time clinical applications. In this paper we propose a nonrigid image registration algorithm for coronary angiography images that is significantly faster than multiresolution and iterative blocking methods and outperforms competing algorithms evaluated on the same data sets. This algorithm is based on a sparse set of matched feature point pairs, and the elastic registration is performed by means of multilevel B-spline image warping. Experimental results with several clinical data sets demonstrate the effectiveness of our approach. PMID:23971026

  15. Development of a digital method for neutron/gamma-ray discrimination based on matched filtering

    NASA Astrophysics Data System (ADS)

    Korolczuk, S.; Linczuk, M.; Romaniuk, R.; Zychor, I.

    2016-09-01

    Neutron/gamma-ray discrimination is crucial for measurements with detectors sensitive to both neutron and gamma-ray radiation. Different techniques to discriminate between neutrons and gamma-rays based on pulse shape analysis are widely used in many applications, e.g., homeland security, radiation dosimetry, environmental monitoring, fusion experiments, nuclear spectroscopy. A common requirement is to improve a radiation detection level with a high detection reliability. Modern electronic components, such as high speed analog to digital converters and powerful programmable digital circuits for signal processing, allow us to develop a fully digital measurement system. With this solution it is possible to optimize digital signal processing algorithms without changing any electronic components in an acquisition signal path. We report on results obtained with a digital acquisition system DNG@NCBJ designed at the National Centre for Nuclear Research. A 2'' × 2'' EJ309 liquid scintillator was used to register mixed neutron and gamma-ray radiation from PuBe sources. A dedicated algorithm for pulse shape discrimination, based on real-time filtering, was developed and implemented in hardware.
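
    In its simplest digital form, a matched-filter discriminator of the kind described correlates each acquired pulse against neutron and gamma template pulses and compares the normalized peak responses. The sketch below uses synthetic two-exponential pulses as stand-ins for real EJ309 templates; all shapes and time constants are illustrative assumptions.

    ```python
    import numpy as np

    def template(t, fast, slow, tau_fast=7.0, tau_slow=120.0):
        """Toy scintillation pulse: fast + slow decay components (time in ns)."""
        return fast * np.exp(-t / tau_fast) + slow * np.exp(-t / tau_slow)

    t = np.arange(0, 600, 2.0)                     # 2 ns sampling
    gamma_tpl = template(t, fast=0.85, slow=0.15)  # gammas: weaker slow component
    neutron_tpl = template(t, fast=0.60, slow=0.40)

    def classify(pulse):
        """Matched filtering: pick the template with the larger peak
        correlation against the unit-norm template."""
        scores = []
        for tpl in (neutron_tpl, gamma_tpl):
            tpl_n = tpl / np.linalg.norm(tpl)
            scores.append(np.max(np.correlate(pulse, tpl_n, mode="same")))
        return "neutron" if scores[0] > scores[1] else "gamma"

    rng = np.random.default_rng(1)
    pulse = neutron_tpl + 0.02 * rng.standard_normal(len(t))  # noisy neutron event
    print(classify(pulse))  # -> "neutron"
    ```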

  16. Post interaural neural net-based vowel recognition

    NASA Astrophysics Data System (ADS)

    Jouny, Ismail I.

    2001-10-01

    Interaural head-related transfer functions are used to process speech signatures prior to neural-net-based recognition. Data representing the head-related transfer function of a dummy head have been collected at MIT and made available on the Internet. These data are used to pre-process vowel signatures to mimic the effects of the human ear on speech perception. Signatures representing various vowels of the English language are then presented to a multi-layer perceptron trained using the back propagation algorithm for recognition purposes. The focus in this paper is to assess the effects of the human interaural system on vowel recognition performance, particularly when using a classification system that mimics the human brain, such as a neural net.

  17. A satellite digital controller or 'play that PID tune again, Sam'. [Position, Integral, Derivative feedback control algorithm for design strategy

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1976-01-01

    The problem discussed is to design a digital controller for a typical satellite. The controlled plant is considered to be a rigid body acting in a plane. The controller is assumed to be a digital computer which, when combined with the proposed control algorithm, can be represented as a sampled-data system. The objective is to present a design strategy and technique for selecting numerical values for the control gains (assuming position, integral, and derivative feedback) and the sample rate. The technique is based on the parameter plane method and requires that the system be amenable to z-transform analysis.
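
    A position-form discrete PID controller of the sort being tuned here (gains plus sample period as the free design parameters) has the generic shape below; the double-integrator plant, gains and sample period are arbitrary stand-ins, not values from the paper.

    ```python
    class DigitalPID:
        """Position-form PID: u[k] = Kp*e[k] + Ki*T*sum(e) + Kd*(e[k]-e[k-1])/T."""

        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Toy rigid-body attitude plant: angle'' = u (double integrator), Euler-stepped.
    pid = DigitalPID(kp=4.0, ki=0.2, kd=3.0, dt=0.1)
    angle, rate = 0.0, 0.0
    for _ in range(200):
        u = pid.update(setpoint=1.0, measurement=angle)
        rate += u * 0.1
        angle += rate * 0.1
    print(round(angle, 3))  # settles near the 1.0 rad setpoint
    ```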

  18. Recognition of digital characteristics based new improved genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Meng; Xu, Guoqiang; Lin, Zihao

    2017-08-01

    In the field of digital signal processing, estimating the modulation parameters of a signal is a significant research direction. Based on an in-depth study of a new improved genetic algorithm, this paper determines the set of eigenvalues that best discriminates between digital signal modulation types. Firstly, these eigenvalues are taken as the initial gene pool; secondly, the gene pool is evolved through selection, crossover and elimination; finally, a strategy of strengthened competition and punishment is adopted to further optimize the gene pool and ensure that each generation consists of high-quality genes. The simulation results show that this method not only has global convergence and stability but also a faster convergence speed.
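
    The evolutionary loop sketched in the abstract (a gene pool refined by selection, crossover and elimination against a fitness measure) has the familiar generic form below. The bit-string encoding, toy fitness function and rates are illustrative placeholders, not the paper's improved operators or its modulation eigenvalues.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(pop, weights):
        """Toy fitness: weighted count of selected features per individual."""
        return pop @ weights

    def evolve(weights, pop_size=30, n_bits=8, generations=40, mut_rate=0.02):
        pop = rng.integers(0, 2, size=(pop_size, n_bits))
        for _ in range(generations):
            fit = fitness(pop, weights)
            survivors = pop[np.argsort(fit)[pop_size // 2:]]   # keep the better half
            half = len(survivors) // 2
            cut = rng.integers(1, n_bits)
            # One-point crossover between two random groups of survivors.
            shuffled = survivors[rng.permutation(len(survivors))]
            children = np.concatenate(
                [shuffled[:half, :cut], shuffled[half:2 * half, cut:]], axis=1)
            fresh = rng.integers(0, 2, (pop_size - len(survivors) - half, n_bits))
            pop = np.concatenate([survivors, children, fresh])
            flips = rng.random(pop.shape) < mut_rate           # rare bit-flip mutation
            pop = np.where(flips, 1 - pop, pop)
        return pop[np.argmax(fitness(pop, weights))]

    weights = np.array([3, -1, 2, -2, 5, -3, 1, 4])
    print(evolve(weights))  # converges toward selecting only positive-weight bits
    ```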

  19. Generation of signature databases with fast codes

    NASA Astrophysics Data System (ADS)

    Bradford, Robert A.; Woodling, Arthur E.; Brazzell, James S.

    1990-09-01

    Using the FASTSIG signature code to generate optical signature databases for the Ground-based Surveillance and Tracking System (GSTS) Program has improved the efficiency of the database generation process. The goal of the current GSTS database is to provide standardized, threat-representative target signatures that can easily be used for acquisition and tracking studies, discrimination algorithm development, and system simulations. Large databases, with as many as eight interpolation parameters, are required to maintain the fidelity demands of discrimination and to generalize their application to other strategic systems. As the need increases for quick availability of long wave infrared (LWIR) target signatures for an evolving design-to-threat, FASTSIG has become a database generation alternative to using the industry-standard Optical Signatures Code (OSC). FASTSIG, developed in 1985 to meet the unique strategic systems demands imposed by the discrimination function, has the significant advantage of being a faster running signature code than the OSC, typically requiring two percent of the cpu time. It uses analytical approximations to model axisymmetric targets, with the fidelity required for discrimination analysis. Access of the signature database is accomplished through use of the waveband integration and interpolation software, INTEG and SIGNAT. This paper gives details of this procedure as well as sample interpolated signatures, and also covers sample verification by comparison to the OSC, in order to establish the fidelity of the FASTSIG-generated database.

  20. Acquisition and analysis of coastal ground-truth data for correlation with ERTS-1 imagery. [surface radiance data for Santa Monica Bay and Santa Barbara Channel

    NASA Technical Reports Server (NTRS)

    Anikouchine, W. A. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Radiance profiles drawn along cruise tracks have been examined for use in correlating digital radiance levels with ground truth data. Preliminary examination results are encouraging. Adding weighted levels from the 4 MSS bands appears to enhance specular surface reflections while rendering sensor noise white. Comparing each band signature to the added specular signature ought to enhance non-specular effects caused by ocean turbidity. Preliminary examination of radiance profiles and ground truth turbidity measurements revealed substantial correlation.

  1. Digital Simulation of Thunder from Three-Dimensional Lightning

    NASA Astrophysics Data System (ADS)

    Dunkin, James; Fleisch, Daniel

    2010-04-01

    The physics of lightning and its resultant thunder have been investigated by many people, but we still don't have a full understanding of the governing processes. In this study, we have constructed a three-dimensional model of lightning using MATLAB software, and used N-waves as postulated by Ribner and Roy to synthesize the resultant thunder signature. In addition, we have taken an FFT of the thunder signature, and compared the time-domain waveform and frequency spectrum to recordings of thunder taken over the summer of 2009. This analysis is done with the goal of further understanding the processes of thunder production.

  2. Digitally balanced detection for optical tomography.

    PubMed

    Hafiz, Rehan; Ozanyan, Krikor B

    2007-10-01

    Analog balanced photodetection has found extensive use for sensing a weak absorption signal buried in laser intensity noise. This paper proposes schemes for a compact, affordable, and flexible digital implementation of the already established analog balanced detection, as part of a multichannel digital tomography system. Variants of digitally balanced detection (DBD) schemes, suitable for weak signals on a largely varying background or weakly varying envelopes of high-frequency carrier waves, are introduced analytically and elaborated in terms of algorithmic and hardware flow. The DBD algorithms are implemented on low-cost general-purpose reconfigurable hardware (a field-programmable gate array), utilizing less than half of its resources. The performance of the DBD schemes compares favorably with their analog counterpart: a common-mode rejection ratio of 50 dB was observed over a bandwidth of 300 kHz, limited mainly by the host digital hardware. The close relationship between the DBD outputs and those of known analog balancing circuits is discussed in principle and shown experimentally in the example case of propane gas detection.

  3. Utility of Characterizing and Monitoring Suspected Underground Nuclear Sites with VideoSAR

    NASA Astrophysics Data System (ADS)

    Dauphin, S. M.; Yocky, D. A.; Riley, R.; Calloway, T. M.; Wahl, D. E.

    2016-12-01

    Sandia National Laboratories proposed using airborne synthetic aperture RADAR (SAR) collected in VideoSAR mode to characterize the Underground Nuclear Explosion Signature Experiment (UNESE) test bed site at the Nevada National Security Site (NNSS). The SNL SAR collected airborne Ku-band (16.8 GHz center frequency) imagery at 0.2032 m ground resolution over the NNSS in August 2014 and fully polarimetric X-band (9.6 GHz) SAR at 0.1016 m ground resolution in April 2015. This paper reports the findings of processing and exploiting VideoSAR for creating digital elevation maps, detecting cultural artifacts, and exploiting full-circle polarimetric signatures. VideoSAR collects a continuous circle of phase history data; therefore, imagery can be formed over the full 360 degrees of the site. Since the Ku-band VideoSAR had two antennas suitable for interferometric digital elevation model (DEM) generation, DEMs could be generated over numerous aspect angles, filling in holes created by targets with height by imaging from all sides. Since the X-band VideoSAR was fully polarimetric, scattering signatures could be gleaned from all angles as well. Both of these collections can be used to find man-made objects and changes in elevation that might indicate testing activities. VideoSAR provides a unique, coherent measure of ground objects, allowing one to create accurate DEMs, locate man-made objects, and identify scattering signatures via polarimetric exploitation. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. The authors would like to thank the National Nuclear Security Administration, Defense Nuclear Nonproliferation Research and Development, for sponsoring this work. We would also like to thank the Underground Nuclear Explosion Signatures Experiment team, a multi-institutional and interdisciplinary group of scientists and engineers, for its technical contributions.

  4. Free-Piston Shock Tunnel Test Technique Development: An AEDC/DLR Cooperative Program

    DTIC Science & Technology

    2003-02-01

    [No abstract is recoverable for this record; the scanned text consists only of extraction fragments from the report's reference list (e.g., AIAA-95-6039, April 1995) and its memorandum-of-understanding signature page (project dates April 1997 to April 2002).]

  5. Isobaric Reconstruction of the Baryonic Acoustic Oscillation

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Yu, Hao-Ran; Zhu, Hong-Ming; Yu, Yu; Pan, Qiaoyin; Pen, Ue-Li

    2017-06-01

    In this Letter, we report a significant recovery of the linear baryonic acoustic oscillation (BAO) signature by applying the isobaric reconstruction algorithm to the nonlinear matter density field. Assuming only the longitudinal component of the displacement to be cosmologically relevant, this algorithm iteratively solves the coordinate transform between the Lagrangian and Eulerian frames without requiring any specific knowledge of the dynamics. For the dark matter field, it produces the nonlinear displacement potential with very high fidelity. The reconstruction error at the pixel level is within a few percent and is caused only by the emergence of the transverse component after shell-crossing. As it circumvents the strongest nonlinearity of the density evolution, the reconstructed field is well described by linear theory and immune from the bulk-flow smearing of the BAO signature. Therefore, this algorithm could significantly improve the measurement accuracy of the sound horizon scale s. For a perfect large-scale structure survey at redshift zero without Poisson or instrumental noise, the fractional error Δs/s is reduced by a factor of ~2.7, very close to the ideal limit with the linear power spectrum and Gaussian covariance matrix.

  6. Distributed Unmixing of Hyperspectral Data with Sparsity Constraint

    NASA Astrophysics Data System (ADS)

    Khoshsokhan, S.; Rajabi, R.; Zayyani, H.

    2017-09-01

    Spectral unmixing (SU) is a data processing problem in hyperspectral remote sensing. The significant challenge in the SU problem is how to identify endmembers and their weights accurately. For estimation of the signature and fractional abundance matrices in a blind problem, nonnegative matrix factorization (NMF) and its developments are widely used in the SU problem. One of the constraints added to NMF is a sparsity constraint, regularized by the L1/2 norm. In this paper, a new algorithm based on distributed optimization is used for spectral unmixing. In the proposed algorithm, a network of single-node clusters is employed, with each pixel in the hyperspectral image considered a node in this network. The distributed unmixing with sparsity constraint is optimized with the diffusion LMS strategy, and the update equations for the fractional abundance and signature matrices are then obtained. Simulation results based on defined performance metrics illustrate the advantage of the proposed algorithm in spectral unmixing of hyperspectral data compared with other methods. The results show that the AAD and SAD of the proposed approach are improved by about 6 and 27 percent, respectively, compared with distributed unmixing without the sparsity constraint, at SNR = 25 dB.
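
    For orientation, sparsity-constrained NMF of the kind being distributed here is commonly posed as minimizing ||V - WA||^2 + lambda*||A||_{1/2} over a signature matrix W and an abundance matrix A, solved with multiplicative updates. Below is a minimal single-node sketch of the L1/2-regularized update; the diffusion-LMS distribution across pixel nodes is omitted, and lambda and the problem sizes are illustrative assumptions.

    ```python
    import numpy as np

    def l12_nmf(V, n_endmembers, lam=0.1, iters=500, eps=1e-9):
        """Sparsity-constrained NMF: V (bands x pixels) ~ W (signatures) @ A
        (abundances), with an L1/2 penalty on A, via multiplicative updates."""
        rng = np.random.default_rng(0)
        bands, pixels = V.shape
        W = rng.random((bands, n_endmembers)) + eps
        A = rng.random((n_endmembers, pixels)) + eps
        for _ in range(iters):
            W *= (V @ A.T) / (W @ A @ A.T + eps)
            A *= (W.T @ V) / (W.T @ W @ A + 0.5 * lam * A ** -0.5 + eps)
        return W, A

    # Synthetic scene: 3 endmembers mixed into 100 pixels over 50 bands.
    rng = np.random.default_rng(1)
    W_true = rng.random((50, 3))
    A_true = rng.dirichlet(np.ones(3) * 0.3, size=100).T   # sparse-ish abundances
    V = W_true @ A_true + 0.01 * rng.random((50, 100))
    W, A = l12_nmf(V, n_endmembers=3)
    print(np.linalg.norm(V - W @ A) / np.linalg.norm(V))   # small relative residual
    ```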

  7. Renyi entanglement entropy of interacting fermions calculated using the continuous-time quantum Monte Carlo method.

    PubMed

    Wang, Lei; Troyer, Matthias

    2014-09-12

    We present a new algorithm for calculating the Renyi entanglement entropy of interacting fermions using the continuous-time quantum Monte Carlo method. The algorithm only samples the interaction correction of the entanglement entropy, which by design ensures the efficient calculation of weakly interacting systems. Combined with Monte Carlo reweighting, the algorithm also performs well for systems with strong interactions. We demonstrate the potential of this method by studying the quantum entanglement signatures of the charge-density-wave transition of interacting fermions on a square lattice.

  8. Correlation signatures of wet soils and snows. [algorithm development and computer programming

    NASA Technical Reports Server (NTRS)

    Phillips, M. R.

    1972-01-01

    Interpretation, analysis, and development of algorithms have provided the necessary computational and programming tools for soil data processing, data handling, and analysis. The algorithms developed thus far are adequate and have proven successful for several preliminary and fundamental applications such as software interfacing capabilities, probability distributions, grey-level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration, and ground scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.

  9. In-vivo study of blood flow in capillaries using μPIV method

    NASA Astrophysics Data System (ADS)

    Kurochkin, Maxim A.; Fedosov, Ivan V.; Tuchin, Valery V.

    2014-01-01

    A digital optical system for intravital capillaroscopy has been developed. It implements a particle image velocimetry (PIV) based approach for measuring red blood cell velocity in individual capillaries of the human nailfold. We propose using a digital real-time stabilization technique to compensate for the impact of involuntary finger movements on the measurement results. The image stabilization algorithm is based on correlation feature tracking. The efficiency of the designed image stabilization algorithm was experimentally demonstrated.
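
    Correlation-based frame stabilization of this kind reduces, per frame, to estimating a translation and shifting the frame back. The sketch below uses FFT phase correlation as a compact stand-in for the authors' feature-tracking correlation; the test images are synthetic.

    ```python
    import numpy as np

    def phase_correlation_shift(ref, frame):
        """Estimate the integer (dy, dx) translation between two frames from
        the peak of the inverse FFT of the normalized cross-power spectrum."""
        F1, F2 = np.fft.fft2(ref), np.fft.fft2(frame)
        cross = np.conj(F1) * F2
        corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Map peak coordinates to signed shifts.
        if dy > ref.shape[0] // 2: dy -= ref.shape[0]
        if dx > ref.shape[1] // 2: dx -= ref.shape[1]
        return dy, dx

    rng = np.random.default_rng(0)
    ref = rng.random((128, 128))
    moved = np.roll(ref, shift=(5, -9), axis=(0, 1))   # simulated finger movement
    dy, dx = phase_correlation_shift(ref, moved)
    stabilized = np.roll(moved, shift=(-dy, -dx), axis=(0, 1))  # undo the motion
    print((dy, dx), np.allclose(stabilized, ref))      # -> (5, -9) True
    ```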

  10. Predictive Fusion of Geophysical Waveforms using Fisher's Method, under the Alternative Hypothesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmichael, Joshua Daniel; Nemzek, Robert James; Webster, Jeremy David

    2017-05-05

    This presentation addresses how to combine different signatures from an event or source in a defensible way. The objective was to build a digital detector that continuously combines detection statistics recorded from explosions in order to screen sources of interest from null sources.

  11. Secure Obfuscation for Encrypted Group Signatures

    PubMed Central

    Fan, Hongfei; Liu, Qin

    2015-01-01

    In recent years, group signature techniques have been widely used in constructing privacy-preserving security schemes for various information systems. However, conventional techniques keep the schemes secure only in normal black-box attack contexts. In other words, these schemes suppose that (the implementation of) the group signature generation algorithm is running on a platform that is perfectly protected from various intrusions and attacks. As a complement to existing studies, how to generate group signatures securely in a more austere security context, such as a white-box attack context, is studied in this paper. We use obfuscation as an approach to acquire a higher level of security. Concretely, we introduce a special group signature functionality, an encrypted group signature, and then provide an obfuscator for the proposed functionality. A series of new security notions for both the functionality and its obfuscator has been introduced. The most important one is the average-case secure virtual black-box property w.r.t. dependent oracles and restricted dependent oracles, which captures the requirement of protecting the output of the proposed obfuscator against collision attacks from group members. The security notions fit many other specialized obfuscators, such as obfuscators for identity-based signatures, threshold signatures and key-insulated signatures. Finally, the correctness and security of the proposed obfuscator have been proven. Thereby, the obfuscated encrypted group signature functionality can be applied to variants of privacy-preserving security schemes and enhance the security level of these schemes. PMID:26167686

  12. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    NASA Astrophysics Data System (ADS)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How to Capture and Preserve Digital Evidence Securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected in the crime scene has a vital importance. On one side, it is a very challenging task for forensics professionals to collect them without any loss or damage. On the other, there is the second problem of providing the integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is not any previous work proposing a systematic model having a holistic view to address all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
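
    The cryptographic building block PKIDEV leans on, signing captured evidence and later verifying it, can be sketched with the widely used Python cryptography package. This is a generic RSA-PSS sign/verify example, not the PKIDEV implementation; the secure time-stamping, GPS and EDGE components are out of scope here.

    ```python
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Key pair of the evidence-collection device (generated once; in a PKI
    # setting the public key would be certified by a CA).
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    evidence = b"disk image bytes ..."   # the captured digital evidence

    # Sign: binds the evidence to the device key; any later change breaks it.
    signature = private_key.sign(
        evidence,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )

    # Verify: raises InvalidSignature if the evidence was tampered with.
    public_key.verify(
        signature,
        evidence,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("evidence integrity and origin verified")
    ```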

  13. Digital SAR processing using a fast polynomial transform

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Lipes, R. G.; Butman, S. A.; Reed, I. S.; Rubin, A. L.

    1984-01-01

    A new digital processing algorithm based on the fast polynomial transform is developed for producing images from Synthetic Aperture Radar data. This algorithm enables the computation of the two dimensional cyclic correlation of the raw echo data with the impulse response of a point target, thereby reducing distortions inherent in one dimensional transforms. This SAR processing technique was evaluated on a general-purpose computer and an actual Seasat SAR image was produced. However, regular production runs will require a dedicated facility. It is expected that such a new SAR processing algorithm could provide the basis for a real-time SAR correlator implementation in the Deep Space Network. Previously announced in STAR as N82-11295

  14. Methods and systems for detecting abnormal digital traffic

    DOEpatents

    Goranson, Craig A [Kennewick, WA; Burnette, John R [Kennewick, WA

    2011-03-22

    Aspects of the present invention encompass methods and systems for detecting abnormal digital traffic by assigning characterizations of network behaviors according to knowledge nodes and calculating a confidence value based on the characterizations from at least one knowledge node and on weighting factors associated with the knowledge nodes. The knowledge nodes include a characterization model based on prior network information. At least one of the knowledge nodes should not be based on fixed thresholds or signatures. The confidence value includes a quantification of the degree of confidence that the network behaviors constitute abnormal network traffic.

  15. Design of minimum multiplier fractional order differentiator based on lattice wave digital filter.

    PubMed

    Barsainya, Richa; Rawat, Tarun Kumar; Kumar, Manjeet

    2017-01-01

    In this paper, a novel design of a fractional order differentiator (FOD) based on a lattice wave digital filter (LWDF) is proposed which requires the minimum number of multipliers for its structural realization. Firstly, the FOD design problem is formulated as an optimization problem using the transfer function of the lattice wave digital filter. Then, three optimization algorithms, namely the genetic algorithm (GA), particle swarm optimization (PSO) and the cuckoo search algorithm (CSA), are applied to determine the optimal LWDF coefficients. The realization of the FOD using the LWD structure increases the design accuracy, as only N coefficients need to be optimized for an Nth-order FOD. Finally, two design examples of 3rd- and 5th-order lattice wave digital fractional order differentiators (LWDFODs) are demonstrated to justify the design accuracy. The performance analysis of the proposed design is carried out based on magnitude response, absolute magnitude error (dB), root mean square (RMS) magnitude error, arithmetic complexity, convergence profile and computation time. Simulation results are presented to compare the proposed LWDFOD with published works, and an improvement of 29% is observed in the proposed design. The proposed LWDFOD approximates the ideal FOD and surpasses existing designs reasonably well in the mid and high frequency range, making the proposed LWDFOD a promising technique for the design of digital FODs. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  16. A Deep Learning Approach to Digitally Stain Optical Coherence Tomography Images of the Optic Nerve Head.

    PubMed

    Devalla, Sripad Krishna; Chin, Khai Sing; Mari, Jean-Martial; Tun, Tin A; Strouthidis, Nicholas G; Aung, Tin; Thiéry, Alexandre H; Girard, Michaël J A

    2018-01-01

    To develop a deep learning approach to digitally stain optical coherence tomography (OCT) images of the optic nerve head (ONH). A horizontal B-scan was acquired through the center of the ONH using OCT (Spectralis) for one eye of each of 100 subjects (40 healthy and 60 glaucoma). All images were enhanced using adaptive compensation. A custom deep learning network was then designed and trained with the compensated images to digitally stain (i.e., highlight) six tissue layers of the ONH. The accuracy of our algorithm was assessed (against manual segmentations) using the Dice coefficient, sensitivity, specificity, intersection over union (IU), and accuracy. We studied the effect of compensation and of the number of training images, and compared performance between glaucoma and healthy subjects. For images it had not previously seen, our algorithm was able to digitally stain the retinal nerve fiber layer + prelamina, the RPE, all other retinal layers, the choroid, and the peripapillary sclera and lamina cribrosa. For all tissues, the Dice coefficient, sensitivity, specificity, IU, and accuracy (mean) were 0.84 ± 0.03, 0.92 ± 0.03, 0.99 ± 0.00, 0.89 ± 0.03, and 0.94 ± 0.02, respectively. Our algorithm performed significantly better when compensated images were used for training (P < 0.001). Besides offering good reliability, digital staining also performed well on OCT images of both glaucoma and healthy individuals. Our deep learning algorithm can simultaneously stain the neural and connective tissues of the ONH, offering a framework to automatically measure multiple key structural parameters of the ONH that may be critical to improve glaucoma management.
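
    The reported evaluation metrics are straightforward to compute from binary masks. A small sketch of Dice, sensitivity, specificity and IU for one tissue class follows; the masks are synthetic stand-ins for the manual and predicted segmentations.

    ```python
    import numpy as np

    def segmentation_metrics(pred, truth):
        """Dice, sensitivity, specificity and intersection-over-union for
        binary masks (1 = tissue class, 0 = background)."""
        tp = np.sum((pred == 1) & (truth == 1))
        tn = np.sum((pred == 0) & (truth == 0))
        fp = np.sum((pred == 1) & (truth == 0))
        fn = np.sum((pred == 0) & (truth == 1))
        return {
            "dice": 2 * tp / (2 * tp + fp + fn),
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "iou": tp / (tp + fp + fn),
        }

    rng = np.random.default_rng(0)
    truth = (rng.random((256, 256)) < 0.2).astype(int)
    pred = truth.copy()
    flip = rng.random((256, 256)) < 0.02          # simulate 2% disagreement
    pred[flip] = 1 - pred[flip]
    print({k: round(v, 3) for k, v in segmentation_metrics(pred, truth).items()})
    ```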

  17. Systematic review of dermoscopy and digital dermoscopy/artificial intelligence for the diagnosis of melanoma.

    PubMed

    Rajpara, S M; Botello, A P; Townend, J; Ormerod, A D

    2009-09-01

    Dermoscopy improves diagnostic accuracy of the unaided eye for melanoma, and digital dermoscopy with artificial intelligence or computer diagnosis has also been shown useful for the diagnosis of melanoma. At present there is no clear evidence regarding the diagnostic accuracy of dermoscopy compared with artificial intelligence. To evaluate the diagnostic accuracy of dermoscopy and digital dermoscopy/artificial intelligence for melanoma diagnosis and to compare the diagnostic accuracy of the different dermoscopic algorithms with each other and with digital dermoscopy/artificial intelligence for the detection of melanoma. A literature search on dermoscopy and digital dermoscopy/artificial intelligence for melanoma diagnosis was performed using several databases. Titles and abstracts of the retrieved articles were screened using a literature evaluation form. A quality assessment form was developed to assess the quality of the included studies. Heterogeneity among the studies was assessed. Pooled data were analysed using meta-analytical methods and comparisons between different algorithms were performed. Of 765 articles retrieved, 30 studies were eligible for meta-analysis. Pooled sensitivity for artificial intelligence was slightly higher than for dermoscopy (91% vs. 88%; P = 0.076). Pooled specificity for dermoscopy was significantly better than for artificial intelligence (86% vs. 79%; P < 0.001). Pooled diagnostic odds ratio was 51.5 for dermoscopy and 57.8 for artificial intelligence, which were not significantly different (P = 0.783). There were no significant differences in diagnostic odds ratio among the different dermoscopic diagnostic algorithms. Dermoscopy and artificial intelligence performed equally well for diagnosis of melanocytic skin lesions. There was no significant difference in the diagnostic performance of various dermoscopy algorithms. The three-point checklist, the seven-point checklist and the Menzies score had better diagnostic odds ratios than the others; however, these results need to be confirmed by a large-scale, high-quality, population-based study.

  18. Search for a Signature of Interaction between Relativistic Jet and Progenitor in Gamma-Ray Bursts

    NASA Astrophysics Data System (ADS)

    Yoshida, Kazuki; Yoneoku, Daisuke; Sawano, Tatsuya; Ito, Hirotaka; Matsumoto, Jin; Nagataki, Shigehiro

    2017-11-01

    The time variability of prompt emission in gamma-ray bursts (GRBs) is expected to originate from the temporal behavior of the central engine activity and the jet propagation in the massive stellar envelope. Using a pulse search algorithm for bright GRBs, we investigate the time variability of gamma-ray light curves to search for a signature of the interaction between the jet and the inner structure of the progenitor. Since this signature might appear in the earlier phase of prompt emission, we divide the light curves into the initial phase and the late phase by referring to the trigger time and the burst duration of each GRB. We also apply this algorithm to GRBs associated with supernovae/hypernovae, which are certainly accompanied by massive stars. However, there is no difference between the pulse interval distributions, each described by a lognormal distribution, in the two phases. We confirm that this result can be explained by the photospheric emission model if the energy injection of the central engine is not steady or completely periodic but episodic and described by the lognormal distribution with a mean of ˜1 s.
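
    The statistical comparison described, fitting lognormal distributions to the pulse-interval samples of the two phases and checking whether they differ, can be sketched with scipy; the interval data below are synthetic, drawn to mimic the paper's finding of a common lognormal with a mean near 1 s.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Synthetic pulse intervals (seconds) for the two light-curve phases,
    # both drawn from the same lognormal, as the paper finds.
    initial = rng.lognormal(mean=0.0, sigma=0.8, size=200)
    late = rng.lognormal(mean=0.0, sigma=0.8, size=200)

    # Fit a lognormal to each phase (location fixed at 0).
    shape_i, _, scale_i = stats.lognorm.fit(initial, floc=0)
    shape_l, _, scale_l = stats.lognorm.fit(late, floc=0)
    print(f"initial: sigma={shape_i:.2f} median={scale_i:.2f} s")
    print(f"late:    sigma={shape_l:.2f} median={scale_l:.2f} s")

    # Two-sample KS test: a large p-value -> no evidence the phases differ.
    print(stats.ks_2samp(initial, late))
    ```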

  19. Polarimetric Signatures of Initiating Convection During MC3E

    NASA Technical Reports Server (NTRS)

    Emory, Amber

    2012-01-01

    One of the goals of the Mid-latitude Continental Convective Clouds Experiment (MC3E) field campaign was to provide constraints for space-based rainfall retrieval algorithms over land. This study used datasets collected during the 2011 field campaign to combine radiometer and ground-based polarimetric radar retrievals in order to better understand hydrometeor type, habit and distribution for initiating continental convection. Cross-track and conically scanning nadir views from the Conical Scanning Millimeter-wave Imaging Radiometer (CoSMIR) were compared with ground-based polarimetric radar retrievals along the ER-2 flight track. Polarimetric signatures from both airborne radiometers and ground-based radars were well co-located with deep convection, relating radiometric signatures to low-level polarimetric radar data for hydrometeor identification and diameter estimation. For the time period of study, Z_DR values indicated no presence of hail at the surface. However, the Z_DR column extended well above the melting level into the mixed-phase region, suggesting a possible source of frozen drop embryos for the future formation of hail. The results from this study contribute ground truth datasets for GPM PR algorithm development for convective events, an improvement upon the previous stratiform-precipitation-centered framework.

  20. Search for a Signature of Interaction between Relativistic Jet and Progenitor in Gamma-Ray Bursts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshida, Kazuki; Yoneoku, Daisuke; Sawano, Tatsuya

    The time variability of prompt emission in gamma-ray bursts (GRBs) is expected to originate from the temporal behavior of the central engine activity and the jet propagation in the massive stellar envelope. Using a pulse search algorithm for bright GRBs, we investigate the time variability of gamma-ray light curves to search for a signature of the interaction between the jet and the inner structure of the progenitor. Since this signature might appear in the earlier phase of prompt emission, we divide the light curves into the initial phase and the late phase by referring to the trigger time and the burst duration of each GRB. We also apply this algorithm to GRBs associated with supernovae/hypernovae, which are certainly accompanied by massive stars. However, there is no difference between the pulse interval distributions, each described by a lognormal distribution, in the two phases. We confirm that this result can be explained by the photospheric emission model if the energy injection of the central engine is not steady or completely periodic but episodic and described by the lognormal distribution with a mean of ∼1 s.

  1. Information theoretic analysis of edge detection in visual communication

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the design of digital image processing algorithms and the design of image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theoretic system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and of the parameters, such as sampling and additive noise, that define the image gathering system. An edge detection algorithm is regarded as high-performing only if the information rate from the scene to the edge approaches the maximum possible; this goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods; there has been no common tool for evaluating the performance of the different algorithms and guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment provides such a tool, allowing the different edge detection operators to be compared in a common environment.

  2. Algorithms and applications of aberration correction and American standard-based digital evaluation in surface defects evaluating system

    NASA Astrophysics Data System (ADS)

    Wu, Fan; Cao, Pin; Yang, Yongying; Li, Chen; Chai, Huiting; Zhang, Yihui; Xiong, Haoliang; Xu, Wenlin; Yan, Kai; Zhou, Lin; Liu, Dong; Bai, Jian; Shen, Yibing

    2016-11-01

    The inspection of surface defects is a significant part of optical surface quality evaluation. Based on microscopic scattering dark-field imaging and sub-aperture scanning and stitching, the Surface Defects Evaluating System (SDES) can acquire a full-aperture image of defects on an optical element's surface and then extract the geometric size and position of defects with image processing such as feature recognition. However, optical distortion in the SDES badly affects the inspection precision of surface defects. In this paper, a distortion correction algorithm based on a standard lattice pattern is proposed. Feature extraction, polynomial fitting, and bilinear interpolation, in combination with adjacent sub-aperture stitching, are employed to correct the optical distortion of the SDES automatically and with high accuracy. Subsequently, in order to evaluate surface defects digitally against the American military standard MIL-PRF-13830B, using the surface defect information obtained from the SDES, an American-standard-based digital evaluation algorithm is proposed, which mainly includes a judgment method for surface defect concentration. The judgment method establishes a weight region for each defect and calculates defect concentration from the overlap of weight regions. This algorithm takes full advantage of the convenience of matrix operations and has the merits of low complexity and fast execution, which make it well suited for high-efficiency inspection of surface defects. Finally, various experiments are conducted and the correctness of these algorithms is verified. At present, these algorithms are in use in the SDES.
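
    A simplified sketch of the polynomial-fitting part of such a correction, assuming matched lattice points are already extracted; the actual SDES pipeline also involves feature extraction, bilinear interpolation, and sub-aperture stitching.

        import numpy as np

        def fit_distortion(observed, ideal, order=2):
            """Least-squares 2-D polynomial map from observed to ideal coordinates.

            observed, ideal: (N, 2) arrays of matched lattice point positions.
            Returns the x- and y-coefficient vectors of the correction polynomial.
            """
            x, y = observed[:, 0], observed[:, 1]
            terms = [x**i * y**j for i in range(order + 1) for j in range(order + 1 - i)]
            A = np.stack(terms, axis=1)
            cx, *_ = np.linalg.lstsq(A, ideal[:, 0], rcond=None)
            cy, *_ = np.linalg.lstsq(A, ideal[:, 1], rcond=None)
            # Evaluating the same monomials at any pixel gives its corrected position.
            return cx, cy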

  3. Top-attack modeling and automatic target detection using synthetic FLIR scenery

    NASA Astrophysics Data System (ADS)

    Weber, Bruce A.; Penn, Joseph A.

    2004-09-01

    A series of experiments has been performed to verify the utility of algorithmic tools for the modeling and analysis of cold-target signatures in synthetic, top-attack, FLIR video sequences. The tools include: MuSES/CREATION for the creation of synthetic imagery with targets, an ARL target detection algorithm to detect embedded synthetic targets in scenes, and an ARL scoring algorithm, using Receiver Operating Characteristic (ROC) curve analysis, to evaluate detector performance. Cold-target detection variability was examined as a function of target emissivity, surrounding clutter type, and target placement in non-obscuring clutter locations. Detector metrics were also scored individually so as to characterize the effect of signature/clutter variations. Results show that, using these tools, a detailed, physically meaningful target detection analysis is possible, and that scenario-specific target detectors may be developed by selective choice and/or weighting of detector metrics. However, developing these tools into a reliable predictive capability will require extending these results to the modeling and analysis of a large number of data sets configured for a wide range of target and clutter conditions. Finally, these tools should also be useful for the comparison of competing detection algorithms, by providing well-defined and controllable target detection scenarios, as well as for the training and testing of expert human observers.
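
    For scoring, a basic ROC computation over detector scores looks like the following generic sketch (not the ARL scoring code); the example scores and labels are made up.

        import numpy as np

        def roc_points(scores, labels):
            """Sweep a threshold over detector scores; return the false-alarm
            rate and probability of detection at each operating point."""
            order = np.argsort(-scores)            # descending score order
            hits = labels[order].astype(float)
            pd = np.cumsum(hits) / hits.sum()
            pfa = np.cumsum(1.0 - hits) / (len(hits) - hits.sum())
            return pfa, pd

        scores = np.array([0.9, 0.8, 0.55, 0.4, 0.3])
        labels = np.array([1, 0, 1, 0, 0])          # 1 = true target
        print(roc_points(scores, labels))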

  4. An Interactive Program on Digitizing Historical Seismograms

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Xu, T.

    2013-12-01

    Retrieving information from historical seismograms is of great importance, since they are the unique sources that provide quantitative information about historical earthquakes. Modern techniques of seismology require digital forms of seismograms, which are essentially sequences of time-amplitude pairs. Historical seismograms, however, after being scanned into computers, are two-dimensional arrays in which each element contains the grayscale or RGB value of the corresponding pixel. The problem of digitizing historical seismograms, that is, converting them to digital seismograms, can be formulated as an inverse problem of generating sequences of time-amplitude pairs from a two-dimensional array; this problem has infinitely many solutions. The algorithm presented here for automatic digitization of historical seismograms uses several features of seismograms, including the continuity and smoothness of the seismic traces, as prior information, and assumes that the amplitude is a single-valued function of time. An interactive program based on the algorithm is also presented. The program is developed using the Matlab GUI and supports both automatic and manual digitization; users can easily switch between the two modes and try different combinations to obtain optimal results. Several examples are given to illustrate the results of digitizing seismograms with the program, including a photographic record and a wide-angle reflection/refraction seismogram. [Figure: digitized result of the program, redrawn using Golden Software Surfer for a high-resolution image; (a) the result of automatic digitization, (b) the result after manual correction.]
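
    A toy version of the single-valued-trace assumption: pick the darkest pixel in each column of the scanned image. This ignores the continuity and smoothness priors the actual algorithm uses, and the function name and centerline convention are illustrative.

        import numpy as np

        def digitize_trace(image, dt=1.0):
            """Convert a scanned seismogram (2-D grayscale array, dark trace on
            a light background) to time-amplitude pairs, one per column."""
            rows = np.argmin(image, axis=0)            # darkest row in each column
            times = np.arange(image.shape[1]) * dt
            amplitudes = image.shape[0] / 2.0 - rows   # measured from the centerline
            return times, amplitudes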

  5. Ant-cuckoo colony optimization for feature selection in digital mammogram.

    PubMed

    Jona, J B; Nagaveni, N

    2014-01-15

    Digital mammography is the only effective screening method to detect breast cancer. Gray Level Co-occurrence Matrix (GLCM) textural features are extracted from the mammogram, but not all of the features are essential for classification; identifying the relevant features is therefore the aim of this work. Feature selection improves the classification rate and accuracy of any classifier. In this study, a new hybrid metaheuristic named Ant-Cuckoo Colony Optimization, a hybrid of Ant Colony Optimization (ACO) and Cuckoo Search (CS), is proposed for feature selection in digital mammograms. ACO is a good metaheuristic optimization technique, but its drawback is that ants walk along paths where the pheromone density is high, which makes the whole process slow; hence CS is employed to carry out the local search of ACO. A Support Vector Machine (SVM) classifier with a Radial Basis Function (RBF) kernel is used along with the ACO to separate normal from abnormal mammograms. Experiments are conducted on the miniMIAS database. The performance of the new hybrid algorithm is compared with the ACO and PSO algorithms. The results show that the hybrid Ant-Cuckoo Colony Optimization algorithm is more accurate than the other techniques.
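
    Inside such a wrapper metaheuristic, each candidate feature subset is typically scored by the classifier itself; a hedged sketch of that fitness step, assuming scikit-learn is available (the function name and cross-validation setup are illustrative, not the paper's code).

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        def subset_fitness(X, y, mask):
            """Cross-validated accuracy of an RBF-kernel SVM on the selected
            GLCM features (boolean mask); the metaheuristic maximizes this."""
            if not mask.any():
                return 0.0
            clf = SVC(kernel="rbf", gamma="scale")
            return cross_val_score(clf, X[:, mask], y, cv=5).mean()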

  6. Multiresolution image registration in digital x-ray angiography with intensity variation modeling.

    PubMed

    Nejati, Mansour; Pourghassem, Hossein

    2014-02-01

    Digital subtraction angiography (DSA) is a widely used technique for visualization of vessel anatomy in diagnosis and treatment. However, due to unavoidable patient motion, both external and internal, the subtracted angiography images often suffer from motion artifacts that adversely affect the quality of the medical diagnosis. To cope with this problem and improve the quality of DSA images, registration algorithms are often employed before subtraction. In this paper, a novel elastic registration algorithm for digital X-ray angiography images, particularly of the coronary arteries, is proposed. The algorithm includes a multiresolution search strategy in which a global transformation is calculated iteratively from local searches in coarse and fine sub-image blocks. The local searches are carried out in a differential multiscale framework that allows both large- and small-scale transformations to be captured. The local registration transformation also explicitly accounts for local variations in image intensity, which are incorporated into the model as changes of local contrast and brightness. These local transformations are then smoothly interpolated using a thin-plate spline interpolation function to obtain the global model. Experimental results with several clinical datasets demonstrate the effectiveness of the algorithm in motion artifact reduction.
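
    The thin-plate spline interpolation step can be sketched with SciPy's RBFInterpolator; this is an assumption for illustration (the paper does not specify its implementation), and the control points and displacements below are made up.

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        # Block centers and their locally estimated displacements.
        centers = np.array([[10.0, 10.0], [10.0, 50.0], [50.0, 10.0], [50.0, 50.0]])
        shifts = np.array([[1.0, 0.5], [0.8, 0.2], [0.5, 0.9], [0.3, 0.1]])

        # A thin-plate spline smoothly extends the local shifts to any pixel.
        tps = RBFInterpolator(centers, shifts, kernel="thin_plate_spline")
        print(tps(np.array([[30.0, 30.0]])))  # interpolated displacement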

  7. Simulation of Acoustic Noise Generated by an Airbreathing, Beam-Powered Launch Vehicle

    NASA Astrophysics Data System (ADS)

    Kennedy, W. C.; Van Laak, P.; Scarton, H. A.; Myrabo, L. N.

    2005-04-01

    A simple acoustic model is developed for predicting the noise signature vs. power level of advanced laser-propelled lightcraft capable of single-stage flights into low Earth orbit. The model predicts the noise levels generated by a pulsed detonation engine (PDE) during the initial lift-off and acceleration phase for two representative `tractor-beam' lightcraft designs: a 1-place `Mercury' vehicle (2.5-m diameter, 900 kg) and a larger 5-place `Apollo' vehicle (5-m diameter, 5555 kg), both the subject of an earlier study. The use of digital techniques to simulate the expected PDE noise signature is discussed, and three examples of fly-by noise signatures are presented. The reduction, or complete elimination, of perceptible noise from such engines can be accomplished by shifting the pulse frequency into the supra-audible or sub-audible range.

  8. A digitalized silicon microgyroscope based on embedded FPGA.

    PubMed

    Xia, Dunzhu; Yu, Cheng; Wang, Yuliang

    2012-09-27

    This paper presents a novel digital miniaturization method for a prototype silicon microgyroscope (SMG) with a symmetrical and decoupled structure. The overall system consists of a high-precision analog front-end interface, a high-speed 18-bit analog-to-digital converter, a high-performance Field Programmable Gate Array (FPGA) core, and peripherals such as high-speed serial ports for transmitting data. In drive mode, the closed-loop drive circuit is implemented by an automatic gain control (AGC) loop and a software phase-locked loop (SPLL) based on the COordinate Rotation DIgital Computer (CORDIC) algorithm. The sense demodulation module, based on varying-step least-mean-square demodulation (LMSD), is also addressed in detail. All of the algorithms were simulated with the Simulink and DSP Builder tools, in good agreement with the theoretical design. The experimental results fully demonstrate the stability and flexibility of the system.

  9. A Digitalized Silicon Microgyroscope Based on Embedded FPGA

    PubMed Central

    Xia, Dunzhu; Yu, Cheng; Wang, Yuliang

    2012-01-01

    This paper presents a novel digital miniaturization method for a prototype silicon microgyroscope (SMG) with a symmetrical and decoupled structure. The overall system consists of a high-precision analog front-end interface, a high-speed 18-bit analog-to-digital converter, a high-performance Field Programmable Gate Array (FPGA) core, and peripherals such as high-speed serial ports for transmitting data. In drive mode, the closed-loop drive circuit is implemented by an automatic gain control (AGC) loop and a software phase-locked loop (SPLL) based on the COordinate Rotation DIgital Computer (CORDIC) algorithm. The sense demodulation module, based on varying-step least-mean-square demodulation (LMSD), is also addressed in detail. All of the algorithms were simulated with the Simulink and DSP Builder tools, in good agreement with the theoretical design. The experimental results fully demonstrate the stability and flexibility of the system. PMID:23201990
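
    A software model of the vectoring-mode CORDIC iteration used for phase extraction; floating-point here for clarity, whereas the FPGA version uses fixed-point shifts and a small arctangent lookup table. This assumes the input vector lies in the right half-plane (x > 0).

        import math

        def cordic_phase(y, x, iterations=16):
            """Rotate (x, y) onto the x-axis using shift-and-add style
            updates, accumulating the rotation angle (radians)."""
            angle = 0.0
            for i in range(iterations):
                step = math.atan(2.0 ** -i)   # stored as a table in hardware
                if y > 0:
                    x, y = x + y * 2.0 ** -i, y - x * 2.0 ** -i
                    angle += step
                else:
                    x, y = x - y * 2.0 ** -i, y + x * 2.0 ** -i
                    angle -= step
            return angle

        print(cordic_phase(1.0, 1.0))  # ~ 0.785 (pi/4)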

  10. Programming the Gesture of Writing: On the Algorithmic Paratexts of the Digital

    ERIC Educational Resources Information Center

    Adams, Catherine

    2016-01-01

    In the wake of the digital, some have recommended that we abandon the tedium of teaching handwriting to children in service of promoting "more creative" digital literacies. Others worry that an early diet of keyboard and screen may have deleterious effects on children's social, emotional, and cognitive development, as well as their…

  11. Integration of On-Line and Off-Line Diagnostic Algorithms for Aircraft Engine Health Management

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2007-01-01

    This paper investigates the integration of on-line and off-line diagnostic algorithms for aircraft gas turbine engines. The on-line diagnostic algorithm is designed for in-flight fault detection: it continuously monitors engine outputs for anomalous signatures induced by faults. The off-line diagnostic algorithm is designed to track engine health degradation over the lifetime of an engine, estimating the degradation periodically over the course of the engine's life. The estimate generated by the off-line algorithm is used to update the on-line algorithm. Through this integration, the on-line algorithm becomes aware of engine health degradation, and its effectiveness at detecting faults can be maintained while the engine continues to degrade. The benefit of this integration is investigated in a simulation environment using a nonlinear engine model.
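
    Conceptually, the integration amounts to the off-line trend estimate re-baselining the on-line fault detector; a toy sketch under that reading, with made-up values and names.

        import numpy as np

        def fault_detected(measurement, baseline, threshold=3.0):
            """On-line check: flag residuals that exceed the fault threshold."""
            return abs(measurement - baseline) > threshold

        # Off-line update: periodically absorb slow health degradation into
        # the baseline so it is not mistaken for an abrupt fault.
        health_history = np.array([100.0, 99.6, 99.1, 98.7])
        baseline = health_history[-3:].mean()
        print(fault_detected(95.0, baseline))   # True: abrupt deviation
        print(fault_detected(98.4, baseline))   # False: consistent with slow degradation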

  12. Parallel computing of a digital hologram and particle searching for microdigital-holographic particle-tracking velocimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Satake, Shin-ichi; Kanamori, Hiroyuki; Kunugi, Tomoaki

    2007-02-01

    We have developed a parallel algorithm for microdigital-holographic particle-tracking velocimetry. The algorithm is used for (1) numerical reconstruction of particle images from a digital hologram and (2) searching for particles. The numerical reconstruction from the digital hologram makes use of the Fresnel diffraction equation and the FFT (fast Fourier transform), whereas the particle search algorithm looks for local maxima of gradation in a reconstruction field represented by a 3D matrix. To achieve high-performance computing for both calculations (reconstruction and particle search), two memory partitions are allocated to the 3D matrix: the reconstruction part consists of horizontally placed 2D memory partitions on the x-y plane for the FFT, whereas the particle search part consists of vertically placed 2D memory partitions along the z axis. Consequently, scalability is obtained in proportion to the number of processor elements; benchmarks for the parallel computation were carried out on an SGI Altix machine.
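
    The reconstruction step can be sketched with the Fresnel transfer-function method; this is a generic single-plane formulation, and the paper's parallel memory layout is not reproduced here.

        import numpy as np

        def fresnel_reconstruct(hologram, wavelength, dx, z):
            """Propagate a digital hologram a distance z using the Fresnel
            transfer function evaluated on the FFT frequency grid."""
            ny, nx = hologram.shape
            fx = np.fft.fftfreq(nx, d=dx)
            fy = np.fft.fftfreq(ny, d=dx)
            FX, FY = np.meshgrid(fx, fy)
            H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
            return np.fft.ifft2(np.fft.fft2(hologram) * H)

        # Particle candidates are then local maxima of |field| in the stack
        # of planes reconstructed at successive depths z.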

  13. Gamma signatures of the C-BORD Tagged Neutron Inspection System

    NASA Astrophysics Data System (ADS)

    Sardet, A.; Pérot, B.; Carasco, C.; Sannié, G.; Moretto, S.; Nebbia, G.; Fontana, C.; Pino, F.; Iovene, A.; Tintori, C.

    2018-01-01

    In the frame of the C-BORD project (an EU H2020 program), a Rapidly relocatable Tagged Neutron Inspection System (RRTNIS) is being developed to non-intrusively detect explosives, chemical threats, and other illicit goods in cargo containers. Material identification is performed through gamma spectroscopy, using twenty NaI detectors and four LaBr3 detectors, to determine the elements composing the inspected item from their specific gamma signatures induced by fast neutrons. An unfolding algorithm decomposes the energy spectrum of a suspect item, selected by X-ray radiography and on which the RRTNIS inspection is focused, on a database of pure-element gamma signatures. This paper reports on simulated signatures for the NaI and LaBr3 detectors, constructed using the MCNP6 code. First experimental spectra of a few elements of interest are also presented.
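
    The unfolding step is essentially a non-negative decomposition of the measured spectrum onto the signature database; a toy sketch using SciPy's nnls, with made-up signatures (the project's actual unfolding algorithm is not specified in this record).

        import numpy as np
        from scipy.optimize import nnls

        # Toy pure-element signatures (counts per channel) and a synthetic mixture.
        carbon = np.array([5.0, 1.0, 0.0, 9.0, 2.0])
        oxygen = np.array([1.0, 7.0, 2.0, 0.0, 3.0])
        library = np.column_stack([carbon, oxygen])
        measured = 2.0 * carbon + 0.5 * oxygen

        weights, _ = nnls(library, measured)   # element contributions, >= 0
        print(weights)                          # ~ [2.0, 0.5]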

  14. Global rotational motion and displacement estimation of digital image stabilization based on the oblique vectors matching algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Fei; Hui, Mei; Zhao, Yue-jin

    2009-08-01

    An image block matching algorithm based on the motion vectors of correlated pixels in the oblique direction is presented for digital image stabilization. Digital image stabilization is a new generation of image stabilization technique that obtains the relative motion among frames of dynamic image sequences by digital image processing. In this method the matching parameters are calculated from the vectors projected in the oblique direction; since these vectors simultaneously contain the motion information of the transverse and vertical directions within the image blocks, better matching information can be obtained by performing the correlation operation in the oblique direction. An iteratively reweighted least squares method, with weights related to the pixels' rotational angle, is used to eliminate block matching errors. The center of rotation and the global motion estimate of the shaking image are obtained by weighted least squares from the estimates of blocks chosen evenly across the image; the shaking image can then be stabilized using this center of rotation and global motion estimate. The algorithm can also run in real time by using simulated annealing in the block matching search. An image processing system based on a DSP was used to test this algorithm; the core processor is TI's TMS320C6416, and a CCD camera with a resolution of 720×576 pixels provides the input video signal. Experimental results show that the algorithm runs on the real-time processing system with accurate matching precision.
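
    For reference, a plain exhaustive block matching step (sum of absolute differences) looks like the sketch below; the paper's oblique-vector projection and simulated-annealing search refine this basic scheme, which is shown here only as a baseline.

        import numpy as np

        def match_block(block, search_area):
            """Exhaustive SAD block matching: return the (dy, dx) offset
            inside search_area that best matches the reference block."""
            bh, bw = block.shape
            best_sad, best = np.inf, (0, 0)
            for dy in range(search_area.shape[0] - bh + 1):
                for dx in range(search_area.shape[1] - bw + 1):
                    sad = np.abs(search_area[dy:dy + bh, dx:dx + bw] - block).sum()
                    if sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            return best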

  15. A Contract for Excellence in Scientific Education: May I Have Your Signature Please?

    ERIC Educational Resources Information Center

    Tate, William F.; Malancharuvil-Berkes, Elizabeth

    2006-01-01

    Recent advances in biology and digital technology represent unique opportunities for teacher educators to rethink the programmatic experiences of prospective secondary science and mathematics teachers. This article discusses the importance of teacher education programs that connect mathematics and science where appropriate, recognize the…

  16. Survey of Munitions Response Technologies

    DTIC Science & Technology

    2006-06-01

    [Available excerpt; the record contains only table-of-contents fragments covering Digital Data Processing, Source Data and Methods, and DGM versus Mag and Flag Processes.] Signatures, surface clutter, variances in operator technique, target selection, and data processing all degrade and affect optimum performance.

  17. Propeller performance analysis and multidisciplinary optimization using a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Burger, Christoph

    A propeller performance analysis program has been developed and integrated into a genetic algorithm for design optimization. The design tool produces optimal propeller geometries for a given goal, which may include performance and/or acoustic signature. A vortex lattice model is used for the propeller performance analysis and a subsonic compact source model is used for the acoustic signature determination. Compressibility effects are taken into account with Prandtl-Glauert domain stretching, and viscous effects are considered with a simple Reynolds-number-based model for the spanwise effects of viscosity. An empirical flow separation model, developed from experimental lift and drag coefficient data for a NACA 0012 airfoil, is included. The propeller geometry is generated using a recently introduced class/shape function methodology to allow efficient use of a wide design space. Optimizing the angle of attack, the chord, the sweep, and the local airfoil sections produced blades with favorable tradeoffs between single- and multiple-point optimizations of propeller performance and acoustic noise signatures. Optimizations using a binary-encoded IMPROVE(c) Genetic Algorithm (GA) and a real-encoded GA were obtained after optimization runs with some premature convergence. The newly developed real-encoded GA, which showed generally better convergence characteristics than the binary-encoded GA, was used to obtain the majority of the results. The optimization trade-offs show that single-point optimized propellers have favorable performance but less smooth circulation distributions than dual-point or multiobjective optimizations. Some of the single-point optimizations generated propellers with proplets, which show a loading shift to the blade tip region. When noise is included in the objective functions, some propellers show a circulation shift to the inboard sections as well as a reduction in propeller diameter; in addition, the propeller number was increased in some optimizations to reduce the acoustic blade signature.
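
    The optimizer's outer loop is a standard real-coded GA; a minimal sketch with a stand-in objective (the real design tool evaluates the vortex-lattice performance and acoustic models at this point, and all values below are illustrative).

        import numpy as np

        rng = np.random.default_rng(1)

        def fitness(design):
            """Stand-in objective; the design tool would score propeller
            performance and acoustic signature here."""
            return -np.sum((design - 0.3) ** 2)

        pop = rng.random((40, 5))                        # 40 candidate designs
        for _ in range(100):
            scores = np.array([fitness(d) for d in pop])
            parents = pop[np.argsort(scores)[-20:]]      # keep the better half
            a = parents[rng.integers(0, 20, size=40)]
            b = parents[rng.integers(0, 20, size=40)]
            w = rng.random((40, 1))
            pop = w * a + (1 - w) * b                    # blend (real) crossover
            pop += rng.normal(0.0, 0.02, pop.shape)      # Gaussian mutation
        print(pop[np.argmax([fitness(d) for d in pop])])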

  18. Hyperspectral and Hypertemporal Longwave Infrared Data Characterization

    NASA Astrophysics Data System (ADS)

    Jeganathan, Nirmalan

    The Army Research Lab conducted a persistent imaging experiment called the Spectral and Polarimetric Imagery Collection Experiment (SPICE) in 2012 and 2013, which focused on collecting and exploiting long-wave infrared hyperspectral and polarimetric imagery. A part of this dataset was released publicly for research and development purposes. This thesis investigated the hyperspectral portion of the released dataset through data characterization and scene characterization of man-made and natural objects. First, the data were compared with MODerate resolution atmospheric TRANsmission (MODTRAN) results and found to be comparable. Instrument noise was characterized using an in-scene black panel and found to be consistent with the sensor manufacturer's specification. The temporal and spatial variation of certain objects in the scene was characterized. Temporal target detection was conducted on man-made objects in the scene using three target detection algorithms: the spectral angle mapper (SAM), the spectral matched filter (SMF), and the adaptive coherence/cosine estimator (ACE). SMF produced the best results for detecting the targets when the training and testing data originated from different time periods, with a time index percentage result of 52.9%. Unsupervised and supervised classification were conducted using spectral and temporal target signatures. Temporal target signatures produced better visual classification than spectral target signatures for unsupervised classification, while supervised classification yielded better results using the spectral target signatures, with a highest weighted accuracy of 99% for a 7-class reference image. Four emissivity retrieval algorithms were applied to this dataset; however, the retrieved emissivities from all four methods did not represent true material emissivity and could not be used for analysis. This spectrally and temporally rich dataset enabled analysis that was not possible with other data collections. Regarding future work, applying noise-reduction techniques before temperature-emissivity retrieval may produce more realistic emissivity values, which could then be used for target detection and material identification.
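
    Of the three detectors, SAM is the simplest to state; a sketch of the generic definition (not the thesis code), with made-up spectra for the usage example.

        import numpy as np

        def spectral_angle(pixel, target):
            """Spectral Angle Mapper score: the angle (radians) between a pixel
            spectrum and a target signature; smaller means a better match."""
            c = np.dot(pixel, target) / (np.linalg.norm(pixel) * np.linalg.norm(target))
            return np.arccos(np.clip(c, -1.0, 1.0))

        target = np.array([0.2, 0.5, 0.9, 0.4])
        pixel = np.array([0.25, 0.48, 0.85, 0.42])
        print(spectral_angle(pixel, target))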

  19. Soil Moisture Active Passive (SMAP) Project Algorithm Theoretical Basis Document SMAP L1B Radiometer Data Product: L1B_TB

    NASA Technical Reports Server (NTRS)

    Piepmeier, Jeffrey; Mohammed, Priscilla; De Amici, Giovanni; Kim, Edward; Peng, Jinzheng; Ruf, Christopher; Hanna, Maher; Yueh, Simon; Entekhabi, Dara

    2016-01-01

    The purpose of the Soil Moisture Active Passive (SMAP) radiometer calibration algorithm is to convert Level 0 (L0) radiometer digital counts into calibrated estimates of brightness temperatures referenced to the Earth's surface within the main beam. The algorithm theory is in most respects similar to what has been developed and implemented for decades for other satellite radiometers; however, SMAP includes two key features heretofore absent from most satellite-borne radiometers: radio frequency interference (RFI) detection and mitigation, and measurement of the third and fourth Stokes parameters using digital correlation. The purpose of this document is to describe the SMAP radiometer and forward model; explain the SMAP calibration algorithm, including approximations, errors, and biases; provide all equations necessary for implementing the calibration algorithm; and detail the RFI detection and mitigation process. Section 2 provides a summary of algorithm objectives and driving requirements. Section 3 is a description of the instrument, and Section 4 covers the forward models upon which the algorithm is based. Section 5 gives the retrieval algorithm and theory. Section 6 describes the orbit simulator, which implements the forward model and is the key to deriving antenna pattern correction coefficients and testing the overall algorithm.
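
    At its core, counts-to-brightness-temperature conversion is a two-point linear calibration; a generic sketch with made-up reference values (SMAP's actual algorithm adds RFI mitigation, Stokes correlation, and antenna-pattern corrections on top of this).

        def counts_to_tb(counts, c_cold, c_hot, tb_cold, tb_hot):
            """Two-point calibration: map raw radiometer counts to brightness
            temperature (K) using cold and hot reference measurements."""
            gain = (tb_hot - tb_cold) / (c_hot - c_cold)
            return tb_cold + gain * (counts - c_cold)

        # Illustrative reference counts and temperatures only.
        print(counts_to_tb(5200, c_cold=4000, c_hot=8000, tb_cold=2.7, tb_hot=300.0))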

  20. Track-Before-Detect Algorithm for Faint Moving Objects based on Random Sampling and Consensus

    NASA Astrophysics Data System (ADS)

    Dao, P.; Rast, R.; Schlaegel, W.; Schmidt, V.; Dentamaro, A.

    2014-09-01

    Many algorithms have been developed for tracking and detecting faint moving objects against congested backgrounds. One obvious application is detection of targets in images where each pixel corresponds to the received power at a particular location. In our application, a visible imager operated in stare mode observes geostationary objects as fixed, stars as moving, and non-geostationary objects as drifting in the field of view; we would like to achieve high-sensitivity detection of the drifters. The ability to improve SNR with track-before-detect (TBD) processing, where target information is collected and collated before the detection decision is made, allows respectable performance against dim moving objects. Generally, a TBD algorithm consists of a pre-processing stage that highlights potential targets and a temporal filtering stage. However, the algorithms that have been successfully demonstrated, e.g. Viterbi-based and Bayesian-based, demand formidable processing power and memory. We propose an algorithm that exploits the quasi-constant velocity of objects, the predictability of the stellar clutter, and the intrinsically low false alarm rate of detecting signature candidates in 3-D. It is based on an iterative method called "RANdom SAmple Consensus" (RANSAC) and can run in real time on a typical PC; the technique is tailored for searching for objects with small telescopes in stare mode. Our RANSAC-MT (Moving Target) algorithm estimates the parameters of a mathematical model (e.g., linear motion) from a set of observed data containing a significant number of outliers, while identifying the inliers. In the pre-processing phase, candidate blobs are selected based on morphology and an intensity threshold that would normally generate an unacceptable level of false alarms; the RANSAC sampling then rejects candidates that conform to the predictable motion of the stars. Data collected with a 17-inch telescope by AFRL/RH and with a COTS lens/EM-CCD sensor by the AFRL/RD Satellite Assessment Center are used to assess the performance of the algorithm. In a second application, a visible imager operated in sidereal mode observes geostationary objects as moving, stars as fixed except for field rotation, and non-geostationary objects as drifting; RANSAC-MT is used to detect the drifter, and in this dataset a drifting space object was detected at a distance of 13,800 km. The AFRL/RH data, collected in stare mode, contained the signatures of two geostationary satellites; the signature of a moving object was simulated and added to the sequence of frames to determine the sensitivity in magnitude. The performance compares well with the more computation-intensive TBD algorithms reported in the literature.
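
    A minimal RANSAC consensus step over candidate detections, fitting a straight (constant-velocity) track in image coordinates; a sketch of the generic technique, not RANSAC-MT itself, with illustrative parameter values.

        import math
        import numpy as np

        rng = np.random.default_rng(2)

        def ransac_track(points, n_iter=500, tol=1.5):
            """Find the largest set of detections consistent with a straight
            track, despite heavy outlier contamination.

            points: (N, 2) array of candidate detection positions.
            Returns a boolean inlier mask."""
            best = np.zeros(len(points), dtype=bool)
            for _ in range(n_iter):
                i, j = rng.choice(len(points), size=2, replace=False)
                p, q = points[i], points[j]
                dx, dy = q - p
                norm = math.hypot(dx, dy)
                if norm == 0.0:
                    continue
                # Perpendicular distance of every point to the line through p, q.
                dist = np.abs(dx * (points[:, 1] - p[1]) - dy * (points[:, 0] - p[0])) / norm
                inliers = dist < tol
                if inliers.sum() > best.sum():
                    best = inliers
            return best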
