NASA Astrophysics Data System (ADS)
Pérez-Cabré, Elisabet; Millán, María S.; Javidi, Bahram
2006-09-01
Verification of a piece of information and/or authentication of a given object or person are common operations carried out by automatic security systems that can be applied, for instance, to control entrance to restricted areas, access to public buildings, identification of cardholders, etc. The vulnerability of such security systems may depend on how easily the information used as a piece of identification for verification and authentication can be counterfeited. To protect data against tampering, the signature that identifies an object is usually encrypted to prevent easy recognition by the human eye and easy reproduction using conventional imaging or scanning devices. To make counterfeiting even more difficult, we propose to combine data from the visible and near-infrared (NIR) spectral bands. By doing this, neither the visible content nor the NIR data by themselves are sufficient to allow signature recognition and thus identification of a given object. Only the appropriate combination of both signals permits satisfactory authentication. In addition, the resulting signature is encrypted following a fully phase encryption technique and the obtained complex-amplitude distribution is encoded on an ID tag. Spatial multiplexing of the encrypted signature allows us to build a distortion-invariant ID tag, so that remote authentication can be achieved even if the tag is captured under rotation or at different distances. We also explore the possibility of using partial information of the encrypted signature to simplify the ID tag design.
Offline Signature Verification Using the Discrete Radon Transform and a Hidden Markov Model
NASA Astrophysics Data System (ADS)
Coetzer, J.; Herbst, B. M.; du Preez, J. A.
2004-12-01
We developed a system that automatically authenticates offline handwritten signatures using the discrete Radon transform (DRT) and a hidden Markov model (HMM). Given the robustness of our algorithm and the fact that only global features are considered, satisfactory results are obtained. Using a database of 924 signatures from 22 writers, our system achieves an equal error rate (EER) of 18% when only high-quality forgeries (skilled forgeries) are considered and an EER of 4.5% in the case of only casual forgeries. These signatures were originally captured offline. Using another database of 4800 signatures from 51 writers, our system achieves an EER of 12.2% when only skilled forgeries are considered. These signatures were originally captured online and then digitally converted into static signature images. These results compare well with the results of other algorithms that consider only global features.
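The equal error rate (EER) quoted above is the operating point where the false acceptance rate (FAR) and false rejection rate (FRR) coincide. A minimal sketch of how an EER is read off a set of match scores; the score lists below are made-up illustrations, not data from the paper:

```python
def far_frr(genuine, forgery, threshold):
    # FRR: fraction of genuine scores rejected (below threshold).
    frr = sum(g < threshold for g in genuine) / len(genuine)
    # FAR: fraction of forgery scores accepted (at or above threshold).
    far = sum(f >= threshold for f in forgery) / len(forgery)
    return far, frr

def equal_error_rate(genuine, forgery):
    # Sweep candidate thresholds; the EER is where FAR and FRR meet.
    best = None
    for t in sorted(genuine + forgery):
        far, frr = far_frr(genuine, forgery, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

# Toy match scores (higher = more genuine-like); a real system would use
# HMM log-likelihoods of DRT feature sequences here.
genuine = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]
forgery = [0.3, 0.5, 0.65, 0.2, 0.4, 0.1]
print(round(equal_error_rate(genuine, forgery), 3))  # 0.167
```

On these toy lists the curves cross at threshold 0.65, where one genuine score is rejected and one forgery accepted.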
Heckeroth, J; Boywitt, C D
2017-06-01
Considering the increasing relevance of handwritten electronically captured signatures, we evaluated the ability of forensic handwriting examiners (FHEs) to distinguish between authentic and simulated electronic signatures. Sixty-six professional FHEs examined the authenticity of electronic signatures captured with software by signotec on a smartphone Galaxy Note 4 by Samsung and signatures made with a ballpoint pen on paper (conventional signatures). In addition, we experimentally varied the name ("J. König" vs. "A. Zaiser") and the status (authentic vs. simulated) of the signatures in question. FHEs' conclusions about the authenticity did not show a statistically significant general difference between electronic and conventional signatures. Furthermore, no significant discrepancies between electronic and conventional signatures were found with regard to other important aspects of the authenticity examination such as questioned signatures' graphic information content, the suitability of the provided sample signatures, the necessity of further examinations and the levels of difficulty of the cases under examination. Thus, this study did not reveal any indications that electronic signatures captured with software by signotec on a Galaxy Note 4 are less well suited than conventional signatures for the examination of authenticity, precluding potential technical problems concerning the integrity of electronic signatures. Copyright © 2017 Elsevier B.V. All rights reserved.
On Hunting Animals of the Biometric Menagerie for Online Signature.
Houmani, Nesma; Garcia-Salicetti, Sonia
2016-01-01
Individuals behave differently with respect to biometric authentication systems. This fact was formalized in the literature by the concept of the Biometric Menagerie, which defines and labels user groups with animal names in order to reflect their characteristics with respect to biometric systems. This concept has been illustrated for the face, fingerprint, iris, and speech modalities. The present study extends the Biometric Menagerie to online signatures by proposing a novel methodology that ties specific quality measures for signatures to categories of the Biometric Menagerie. Such measures are combined to automatically retrieve writer categories of the extended version of the Biometric Menagerie. Performance analysis with different types of classifiers shows the pertinence of our approach on the well-known MCYT-100 database.
About machine-readable travel documents
NASA Astrophysics Data System (ADS)
Vaudenay, S.; Vuagnoux, M.
2007-07-01
Passports are documents that help immigration officers to identify people. In order to strongly authenticate their data and to automatically identify people, they are now equipped with RFID chips. These contain private information, biometrics, and a digital signature by issuing authorities. Although they substantially increase security at the border controls, they also come with new security and privacy issues. In this paper, we survey existing protocols and their weaknesses.
26 CFR 301.6064-1 - Signature presumed authentic.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Signature presumed authentic. 301.6064-1 Section 301.6064-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED....6064-1 Signature presumed authentic. An individual's name signed to a return, statement, or other...
Comparison of Fingerprint and Iris Biometric Authentication for Control of Digital Signatures
Zuckerman, Alan E.; Moon, Kenneth A.; Eaddy, Kenneth
2002-01-01
Biometric authentication systems can be used to control digital signing of medical documents. This pilot study evaluated the use of two different fingerprint technologies and one iris technology to control the creation of digital signatures on a central server using public/private key pairs stored on the server. Documents and signatures were stored in XML for portability. Key pairs and authentication certificates were generated during biometric enrollment. Usability and user acceptance were guarded, and limitations of the biometric systems prevented use of the system with all test subjects. The system detected alterations in the data content and provided for future signer re-authentication for non-repudiation.
A covert authentication and security solution for GMOs.
Mueller, Siguna; Jafari, Farhad; Roth, Don
2016-09-21
Proliferation and expansion of security risks necessitate new measures to ensure the authenticity and validation of GMOs. Watermarking and other cryptographic methods are available that conceal and recover the original signature, but in the process they reveal the authentication information. In many scenarios watermarking and standard cryptographic methods are necessary but not sufficient, and new, more advanced cryptographic protocols are needed. Herein, we present a new cryptographic protocol that is applicable in broader settings: it embeds the authentication string so that it is indistinguishable from a random element of the signature space, and the string can be verified or denied without disclosing the actual signature. Results show that in a nucleotide string of 1000, the algorithm gives a correlation of 0.98 or higher between the codon distribution and that of E. coli, making the signature virtually invisible. This algorithm may be used to securely authenticate and validate GMOs without disclosing the actual signature. While this protocol uses watermarking, its novelty lies in the use of more complex cryptographic techniques, based on zero-knowledge proofs, to encode information.
27 CFR 70.52 - Signature presumed authentic.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Signature presumed authentic. 70.52 Section 70.52 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE... Collection of Excise and Special (Occupational) Tax Collection-General Provisions § 70.52 Signature presumed...
Research on Signature Verification Method Based on Discrete Fréchet Distance
NASA Astrophysics Data System (ADS)
Fang, J. L.; Wu, W.
2018-05-01
This paper proposes a multi-feature signature template based on the discrete Fréchet distance, which breaks through the limitation of traditional signature authentication schemes that rely on a single signature feature. It addresses the heavy computational workload of extracting global-feature templates in online handwritten signature authentication, as well as the problem of unreasonable feature selection. In the experiments, the false acceptance rate (FAR) and false rejection rate (FRR) are measured and the average equal error rate (AEER) is calculated. The feasibility of the combined-template scheme is verified by comparing the average equal error rate of the combined template with that of the original template.
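The discrete Fréchet distance underlying such templates can be computed with the classic Eiter and Mannila dynamic program. A minimal sketch; the two point sequences are hypothetical pen trajectories, not data from the paper:

```python
from math import dist

def discrete_frechet(p, q):
    """Discrete Fréchet distance between polylines p and q (lists of 2-D points)."""
    n, m = len(p), len(q)
    ca = [[-1.0] * m for _ in range(n)]  # memo table for coupling distances

    def c(i, j):
        if ca[i][j] >= 0:
            return ca[i][j]
        d = dist(p[i], q[j])
        if i == 0 and j == 0:
            ca[i][j] = d
        elif i == 0:
            ca[i][j] = max(c(0, j - 1), d)
        elif j == 0:
            ca[i][j] = max(c(i - 1, 0), d)
        else:
            # Advance along p, along q, or along both; keep the cheapest walk.
            ca[i][j] = max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
        return ca[i][j]

    return c(n - 1, m - 1)

a = [(0, 0), (1, 0), (2, 0)]
b = [(0, 1), (1, 1), (2, 1)]
print(discrete_frechet(a, b))  # 1.0 -- the curves stay exactly one unit apart
```

A verification scheme would threshold this distance between a questioned signature's trajectory and each feature's stored template.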
NASA Astrophysics Data System (ADS)
Boling, M. E.
1989-09-01
Prototypes were assembled pursuant to recommendations made in report K/DSRD-96, Issues and Approaches for Electronic Document Approval and Transmittal Using Digital Signatures and Text Authentication, and to examine and discover the possibilities for integrating available hardware and software to provide cost effective systems for digital signatures and text authentication. These prototypes show that on a LAN, a multitasking, windowed, mouse/keyboard menu-driven interface can be assembled to provide easy and quick access to bit-mapped images of documents, electronic forms and electronic mail messages with a means to sign, encrypt, deliver, receive or retrieve and authenticate text and signatures. In addition they show that some of this same software may be used in a classified environment using host to terminal transactions to accomplish these same operations. Finally, a prototype was developed demonstrating that binary files may be signed electronically and sent by point to point communication and over ARPANET to remote locations where the authenticity of the code and signature may be verified. Related studies on the subject of electronic signatures and text authentication using public key encryption were done within the Department of Energy. These studies include timing studies of public key encryption software and hardware and testing of experimental user-generated host resident software for public key encryption. This software used commercially available command-line source code. These studies are responsive to an initiative within the Office of the Secretary of Defense (OSD) for the protection of unclassified but sensitive data. It is notable that these related studies are all built around the same commercially available public key encryption products from the private sector and that the software selection was made independently by each study group.
Security authentication using phase-encoded nanoparticle structures and polarized light.
Carnicer, Artur; Hassanfiroozi, Amir; Latorre-Carmona, Pedro; Huang, Yi-Pai; Javidi, Bahram
2015-01-15
Phase-encoded nanostructures such as quick response (QR) codes made of metallic nanoparticles are suggested to be used in security and authentication applications. We present a polarimetric optical method able to authenticate random phase-encoded QR codes. The system is illuminated using polarized light, and the QR code is encoded using a phase-only random mask. Using classification algorithms, it is possible to validate the QR code from the examination of the polarimetric signature of the speckle pattern. We used Kolmogorov-Smirnov statistical test and Support Vector Machine algorithms to authenticate the phase-encoded QR codes using polarimetric signatures.
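The two-sample Kolmogorov-Smirnov test mentioned above compares empirical distributions of polarimetric speckle measurements. A minimal sketch of the KS statistic itself, with made-up intensity samples standing in for real speckle data:

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the two empirical cumulative distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)
    values = sorted(set(a + b))
    d = 0.0
    for v in values:
        cdf_a = sum(x <= v for x in a) / len(a)
        cdf_b = sum(x <= v for x in b) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d

# Hypothetical speckle-intensity samples from a genuine and a suspect QR tag.
genuine = [0.2, 0.4, 0.5, 0.7, 0.9]
suspect = [0.1, 0.15, 0.3, 0.35, 0.6]
print(ks_statistic(genuine, suspect))
```

A large statistic suggests the suspect tag's polarimetric signature was drawn from a different distribution than the genuine reference, flagging a possible counterfeit.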
Authentication Based on Pole-zero Models of Signature Velocity
Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad
2013-01-01
With the increase of communication and financial transactions over the internet, online signature verification is an accepted biometric technology for access control and plays a significant role in authenticity and authorization in modern society. Fast and precise algorithms for signature verification are therefore very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and has demonstrated good potential. The signatures were collected from three databases: a proprietary database, and the SVC2004 and Sabanci University (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004, and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62%, and 3.91% on skilled forgeries, respectively. PMID:24696797
78 FR 39200 - Authentication of Electronic Signatures on Electronically Filed Statements of Account
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-01
... LIBRARY OF CONGRESS Copyright Office 37 CFR Part 201 [Docket No. 2013-5] Authentication of Electronic Signatures on Electronically Filed Statements of Account AGENCY: U.S. Copyright Office, Library of Congress. ACTION: Notice of proposed rulemaking; correction. SUMMARY: The U.S. Copyright Office published a...
Chemical markup, XML and the World-Wide Web. 3. Toward a signed semantic chemical web of trust.
Gkoutos, G V; Murray-Rust, P; Rzepa, H S; Wright, M
2001-01-01
We describe how a collection of documents expressed in XML-conforming languages such as CML and XHTML can be authenticated and validated against digital signatures which make use of established X.509 certificate technology. These can be associated either with specific nodes in the XML document or with the entire document. We illustrate this with two examples. An entire journal article expressed in XML has its individual components digitally signed by separate authors, and the collection is placed in an envelope and again signed. The second example involves using a software robot agent to acquire a collection of documents from a specified URL, to perform various operations and transformations on the content, including expressing molecules in CML, and to automatically sign the various components and deposit the result in a repository. We argue that these operations can be used as components for building what we term an authenticated and semantic chemical web of trust.
Digital Camera with Apparatus for Authentication of Images Produced from an Image File
NASA Technical Reports Server (NTRS)
Friedman, Gary L. (Inventor)
1996-01-01
A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.
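The hash-then-sign scheme described above can be illustrated with textbook RSA. This is a toy sketch using insecure 16-bit parameters purely for readability, not the camera's actual implementation or key sizes:

```python
import hashlib

# Toy RSA key pair (tiny primes for illustration only; wholly insecure).
p, q = 61, 53
n = p * q                  # public modulus, 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent (stamped on the camera housing)
d = pow(e, -1, phi)        # private exponent (embedded in the camera), 2753

def sign(image_bytes):
    # Hash the "image file", reduce mod n, then encrypt the hash
    # with the private key to produce the digital signature.
    h = int.from_bytes(hashlib.sha256(image_bytes).digest(), "big") % n
    return pow(h, d, n)

def verify(image_bytes, signature):
    # Decrypt the signature with the public key and compare to a
    # freshly computed hash of the file.
    h = int.from_bytes(hashlib.sha256(image_bytes).digest(), "big") % n
    return pow(signature, e, n) == h

image = b"raw sensor data"
sig = sign(image)
print(verify(image, sig))  # True
```

Any alteration of the stored file changes its hash, so the decrypted signature no longer matches and authentication fails.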
Wang, Wei; Wang, Chunqiu; Zhao, Min
2014-03-01
To ease the burden on hospitalization capacity, an emerging swallowable-capsule technology has evolved to serve as a remote gastrointestinal (GI) disease examination technique with the aid of the wireless body sensor network (WBSN). Secure multimedia transmission in such a swallowable-capsule-based WBSN faces critical challenges including energy efficiency and content quality guarantee. In this paper, we propose a joint resource allocation and stream authentication scheme to maintain the best possible video quality while ensuring security and energy efficiency in GI-WBSNs. The contribution of this research is twofold. First, we establish a unique signature-hash (S-H) diversity approach in the authentication domain to optimize video authentication robustness and the authentication bit rate overhead over a wireless channel. Based on the full exploration of S-H authentication diversity, we propose a new two-tier signature-hash (TTSH) stream authentication scheme to improve the video quality by reducing authentication dependence overhead while protecting its integrity. Second, we propose to combine this authentication scheme with a unique S-H oriented unequal resource allocation (URA) scheme to improve the energy-distortion-authentication performance of wireless video delivery in GI-WBSN. Our analysis and simulation results demonstrate that the proposed TTSH with URA scheme achieves considerable gain in both authenticated video quality and energy efficiency.
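The paper's TTSH construction is specific, but the general signature-hash idea it builds on, amortizing one expensive signature over a whole stream by carrying hashes between blocks, can be sketched as follows. An HMAC stands in for the asymmetric signature, and the frame names are hypothetical:

```python
import hashlib
import hmac

SIGNING_KEY = b"shared-secret"  # stand-in for an asymmetric signing key

def hash_chain_sign(blocks):
    """Authenticate a stream with one signature: each block carries the
    hash of the following block, and only the head hash is signed."""
    tagged = []
    next_hash = b""
    for block in reversed(blocks):
        payload = block + next_hash          # append hash of the next block
        tagged.append(payload)
        next_hash = hashlib.sha256(payload).digest()
    tagged.reverse()
    head_sig = hmac.new(SIGNING_KEY, next_hash, hashlib.sha256).digest()
    return head_sig, tagged

def hash_chain_verify(head_sig, tagged):
    expected = hmac.new(SIGNING_KEY, hashlib.sha256(tagged[0]).digest(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(expected, head_sig):
        return False
    for i in range(len(tagged) - 1):
        carried = tagged[i][-32:]            # SHA-256 digest stored in block i
        if carried != hashlib.sha256(tagged[i + 1]).digest():
            return False
    return True

sig, stream = hash_chain_sign([b"frame-1", b"frame-2", b"frame-3"])
print(hash_chain_verify(sig, stream))  # True
```

One signature verification plus cheap hash checks per frame is what makes such schemes attractive on energy-constrained capsule sensors.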
Limitations and requirements of content-based multimedia authentication systems
NASA Astrophysics Data System (ADS)
Wu, Chai W.
2001-08-01
Recently, a number of authentication schemes have been proposed for multimedia data such as images and sound data. They include both label based systems and semifragile watermarks. The main requirement for such authentication systems is that minor modifications such as lossy compression which do not alter the content of the data preserve the authenticity of the data, whereas modifications which do modify the content render the data not authentic. These schemes can be classified into two main classes depending on the model of image authentication they are based on. One of the purposes of this paper is to look at some of the advantages and disadvantages of these image authentication schemes and their relationship with fundamental limitations of the underlying model of image authentication. In particular, we study feature-based algorithms which generate an authentication tag based on some inherent features in the image such as the location of edges. The main disadvantage of most proposed feature-based algorithms is that similar images generate similar features, and therefore it is possible for a forger to generate dissimilar images that have the same features. On the other hand, the class of hash-based algorithms utilizes a cryptographic hash function or a digital signature scheme to reduce the data and generate an authentication tag. It inherits the security of digital signatures to thwart forgery attacks. The main disadvantage of hash-based algorithms is that the image needs to be modified in order to be made authenticatable. The amount of modification is on the order of the noise the image can tolerate before it is rendered inauthentic. The other purpose of this paper is to propose a multimedia authentication scheme which combines some of the best features of both classes of algorithms. The proposed scheme utilizes cryptographic hash functions and digital signature schemes and the data does not need to be modified in order to be made authenticatable. 
Several applications, including the authentication of images on CD-ROM and of handwritten documents, will be discussed.
Filtering methods for broadcast authentication against PKC-based denial of service in WSN: a survey
NASA Astrophysics Data System (ADS)
Afianti, Farah; Wirawan, Iwan; Suryani, Titiek
2017-11-01
Broadcast authentication is used to distinguish legitimate packets sent by an authorized user; a received packet can then be forwarded or used for further processing. The use of digital signatures is one promising method, but it carries high complexity, especially in the verification process. Adversaries exploit this by forcing nodes to verify large numbers of false packets. This kind of Denial of Service (DoS) attack against the main signature can be mitigated by using pre-authentication methods as a first layer that filters out false packet data. The objective of the filter is not to replace the main signature but to precede the actual verification in the sensor node. This paper compares the computation, storage, and communication costs of several filters. The results show that the Pre-Authenticator and the DoS-Attack-Resistant scheme have lower overhead than the others, at the cost of requiring a powerful sender. Moreover, the key chain is a promising method because of its efficiency and effectiveness.
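The one-way key chain mentioned at the end, as used in TESLA-style broadcast authentication, can be sketched in a few lines. The seed and chain length here are arbitrary illustrations:

```python
import hashlib

def make_key_chain(seed, length):
    """One-way key chain: K[i] = H(K[i+1]). The anchor K[0] is
    distributed authentically first; later keys are disclosed in order."""
    chain = [seed]
    for _ in range(length):
        chain.append(hashlib.sha256(chain[-1]).digest())
    chain.reverse()  # chain[0] is the public anchor, chain[-1] the seed
    return chain

def verify_disclosed_key(anchor, key, steps):
    # A receiver holding only the anchor validates a disclosed key by
    # hashing it forward `steps` times; forging a preimage is infeasible.
    for _ in range(steps):
        key = hashlib.sha256(key).digest()
    return key == anchor

chain = make_key_chain(b"sender-secret-seed", 5)
print(verify_disclosed_key(chain[0], chain[3], 3))  # True
```

Verification costs only a few hashes, which is why key chains filter forged broadcast packets far more cheaply than public-key signature checks on constrained sensor nodes.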
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin
2010-04-01
The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location-based applications, and biometric authentication such as fingerprint verification and face and handwritten-signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes built-in self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques use the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, achieving an optimum security level with effective processing time, i.e., ensuring that the authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
Game meat authentication through rare earth elements fingerprinting.
Danezis, G P; Pappas, A C; Zoidis, E; Papadomichelakis, G; Hadjigeorgiou, I; Zhang, P; Brusic, V; Georgiou, C A
2017-10-23
Accurate labelling of meat (e.g. wild versus farmed, geographical and genetic origin, organic versus conventional, processing treatment) is important to inform consumers about the products they buy. Meat and meat products declared as game have higher commercial value, making them a target for fraudulent labelling practices and replacement with non-game meat. We have developed and validated a new method for the authentication of wild rabbit meat using an elemental metabolomics approach. Elemental analysis was performed using rapid ultra-trace multi-element measurement by inductively coupled plasma mass spectrometry (ICP-MS). Elemental signatures showed excellent ability to discriminate wild rabbit from non-wild rabbit meat. Our results demonstrate the usefulness of metabolic markers, namely rare earth element signatures as well as other trace element signatures, for game meat authentication. Copyright © 2017 Elsevier B.V. All rights reserved.
Practical quantum digital signature
NASA Astrophysics Data System (ADS)
Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing
2016-03-01
Guaranteeing nonrepudiation, unforgeability, and transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signatures (QDS) aim to provide information-theoretic security for this cryptographic task. However, to date, previously proposed QDS protocols have been impractical due to various challenging problems, most importantly the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Moreover, our QDS protocol can be practically implemented over more than 100 km with current mature technology as used in quantum key distribution.
Department of Defense Law of War Manual
2015-06-12
... pretences, he obtains or endeavours to obtain information in the zone of operations of a belligerent, with the intention of communicating it to the ... received within their territory. All written communications made by the National POW Information Bureau shall be authenticated by a signature or a ... All communications in writing made by any National Protected Person Information Bureau shall be authenticated by a signature or a seal. ...
25 CFR 82.7 - Notarization of petition signatures.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 25 Indians 1 2010-04-01 2010-04-01 false Notarization of petition signatures. 82.7 Section 82.7... signatures. (a) Signatures to a petition must be authenticated in one of the following ways: (1) Through having each signer subscribe or acknowledge his/her signature before a notary public; (2) Through having...
Cryptographic framework for document-objects resulting from multiparty collaborative transactions.
Goh, A
2000-01-01
Multiparty transactional frameworks--i.e. Electronic Data Interchange (EDI) or Health Level (HL) 7--often result in composite documents which can be accurately modelled using hyperlinked document-objects. The structural complexity arising from multiauthor involvement and transaction-specific sequencing would be poorly handled by conventional digital signature schemes based on a single evaluation of a one-way hash function and asymmetric cryptography. In this paper we outline the generation of structure-specific authentication hash-trees for the authentication of transactional document-objects, followed by asymmetric signature generation on the hash-tree value. Server-side multi-client signature verification would probably constitute the single most compute-intensive task, hence the motivation for our use of the Rabin signature protocol, which results in significantly reduced verification workloads compared to the more commonly applied Rivest-Shamir-Adleman (RSA) protocol. Data privacy is handled via symmetric encryption of message traffic using session-specific keys obtained through key-negotiation mechanisms based on discrete-logarithm cryptography. Individual client-to-server channels can be secured using a double key-pair variation of Diffie-Hellman (DH) key negotiation, usage of which also enables bidirectional node authentication. The reciprocal server-to-client multicast channel is secured through Burmester-Desmedt (BD) key negotiation, which enjoys significant advantages over the usual multiparty extensions to the DH protocol. The implementation of hash-tree signatures and bi/multidirectional key negotiation results in a comprehensive cryptographic framework for multiparty document-objects satisfying both authentication and data privacy requirements.
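The hash-tree idea, sign a single root value instead of signing every component separately, can be sketched with a plain Merkle tree. This is a generic illustration, not the paper's exact structure-specific construction:

```python
import hashlib

def h(data):
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root of a hash tree over document components; only this one value
    needs an asymmetric signature to authenticate the whole composite."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical components of a composite transactional document.
parts = [b"patient-record", b"lab-results", b"physician-notes"]
root = merkle_root(parts)

# Changing any single component changes the root, so one signature over
# `root` covers every hyperlinked document-object at once.
print(merkle_root(parts) == root)                        # True
print(merkle_root([b"patient-record", b"lab-results",
                   b"tampered-notes"]) == root)          # False
```

Verification can also be done per component with a logarithmic-size path of sibling hashes, which is what makes the tree attractive for multiauthor documents.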
Long-term verifiability of the electronic healthcare records' authenticity.
Lekkas, Dimitrios; Gritzalis, Dimitris
2007-01-01
To investigate whether the long-term preservation of the authenticity of electronic healthcare records (EHR) is possible. To propose a mechanism that enables the secure validation of an EHR for long periods, far beyond the lifespan of a digital signature and at least as long as the lifetime of a patient. The study is based on the fact that although the attributes of data authenticity, i.e. integrity and origin verifiability, can be preserved by digital signatures, the necessary period for the retention of EHRs is far beyond the lifespan of a simple digital signature. It is identified that the lifespan of signed data is restricted by the validity period of the relevant keys and the digital certificates, by the future unavailability of signature-verification data, and by suppression of trust relationships. In this paper, the notarization paradigm is exploited, and a mechanism for cumulative notarization of signed EHR is proposed. The proposed mechanism implements a successive trust transition towards new entities, modern technologies, and refreshed data, eliminating any dependency of the relying party on ceased entities, obsolete data, or weak old technologies. The mechanism also exhibits strength against various threat scenarios. A future relying party will have to trust only the fresh technology and information provided by the last notary, in order to verify the authenticity of an old signed EHR. A Cumulatively Notarized Signature is strong even in the case of the compromise of a notary in the chain.
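The cumulative notarization idea, where each successive notary re-attests the previous evidence under fresh keys and technology, can be sketched as a chain of keyed attestations. HMACs stand in for the notaries' signatures, and the keys and notes are hypothetical:

```python
import hashlib
import hmac
import json

def notarize(previous_token, notary_key, note):
    # Each notary binds the prior evidence (the original signature or an
    # earlier notary token) into a fresh attestation under its own key.
    record = {"prev": previous_token.hex(), "note": note}
    blob = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(notary_key, blob, hashlib.sha256).digest()
    return record, tag

def check(record, tag, notary_key, previous_token):
    blob = json.dumps(record, sort_keys=True).encode()
    tag_ok = hmac.compare_digest(
        hmac.new(notary_key, blob, hashlib.sha256).digest(), tag)
    return tag_ok and record["prev"] == previous_token.hex()

# Stand-in for the original digital signature over the EHR.
original_sig = hashlib.sha256(b"signed EHR").digest()
rec1, tag1 = notarize(original_sig, b"notary-1-key", "re-attested 2010")
rec2, tag2 = notarize(tag1, b"notary-2-key", "re-attested 2020")

# A future relying party trusts only the freshest notary's key, then
# walks the chain back to the original signature.
print(check(rec2, tag2, b"notary-2-key", tag1)
      and check(rec1, tag1, b"notary-1-key", original_sig))  # True
```

Because each link re-covers everything before it, an expired or compromised key early in the chain does not invalidate the evidence, provided the later notarizations happened while it was still trustworthy.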
Evaluation of the automatic optical authentication technologies for control systems of objects
NASA Astrophysics Data System (ADS)
Averkin, Vladimir V.; Volegov, Peter L.; Podgornov, Vladimir A.
2000-03-01
The report considers the evaluation of automatic optical authentication technologies for the automated integrated system of physical protection, control, and accounting of nuclear materials at RFNC-VNIITF, and for supporting the nuclear materials nonproliferation regime. The report presents the nuclear object authentication objectives and strategies, the methodology of automatic optical authentication, and the results of the development of pattern recognition techniques carried out under ISTC project #772 with the purpose of identifying unique features of the surface structure of a controlled object and the effects of its random treatment. The report describes current solutions to the following functional control tasks: confirmation of the item's authenticity (proof of the absence of its substitution by an item of similar shape), control over unforeseen changes of item state, and control over unauthorized access to the item. The most important distinctive feature of all the techniques is not a comprehensive description of some properties of the controlled item, but unique identification of the item using the minimum necessary set of parameters, which together constitute the item's identification attribute. The main emphasis in the technical approach is on the development of rather simple technological methods intended, for the first time, for use in systems of physical protection, control, and accounting of nuclear materials. The developed authentication devices and system are described.
NASA Astrophysics Data System (ADS)
Lee, Jasper C.; Ma, Kevin C.; Liu, Brent J.
2008-03-01
A Data Grid for medical images has been developed at the Image Processing and Informatics Laboratory, USC, to provide distribution and fault-tolerant storage of medical imaging studies across Internet2 and the public domain. Although back-up policies and grid certificates guarantee privacy and authenticity of grid-access-points, there is still no method to guarantee that sensitive DICOM images have not been altered or corrupted during transmission across a public domain. This paper takes steps toward achieving full image transfer security within the Data Grid by utilizing DICOM image authentication and a HIPAA-compliant auditing system. The 3-D lossless digital signature embedding procedure, also developed at the IPILab, involves a private 64-byte signature that is embedded into each original DICOM image volume; on the receiving end the signature can be extracted and verified following the DICOM transmission. The HIPAA-Compliant Auditing System (H-CAS) is required to monitor embedding and verification events, and allows monitoring of other grid activity as well. The H-CAS system federates the logs of transmission and authentication events at each grid-access-point and stores them in a HIPAA-compliant database. The auditing toolkit is installed at the local grid-access-point and utilizes Syslog [1], a client-server standard for log messaging over an IP network, to send messages to the H-CAS centralized database. By integrating digital image signatures and centralized logging capabilities, DICOM image integrity within the Medical Imaging and Informatics Data Grid can be monitored and guaranteed without any loss of image quality.
NASA Astrophysics Data System (ADS)
Yin, Aihan; Ding, Yisheng
2014-11-01
Identity-related security issues inherent in passive optical networks (PON) still exist in the current (1G) and next-generation (10G) Ethernet-based passive optical network (EPON) systems. We propose a mutual authentication scheme that integrates an NTRUsign digital signature algorithm with the inherent multipoint control protocol (MPCP) frames of an EPON system between the optical line terminal (OLT) and the optical network unit (ONU). Here, the primitive NTRUsign algorithm is significantly modified through the use of a new perturbation so that it can be used to complete the signature and authentication functions on the OLT and ONU sides simultaneously. In addition, to let the two parties exchange their sensitive messages (public key, signature, random values, and so forth), we redefine three unique frames based on the MPCP frame format. The generated messages can be carried in these frames and delivered to the other party, allowing the OLT and the ONU to carry out a mutual identity authentication process to verify each other's legal identity. Our simulation results show that the proposed scheme performs well in resisting security attacks and has little impact on the registration efficiency of to-be-registered ONUs. A performance comparison with traditional authentication algorithms is also presented. To the best of our knowledge, no detailed design of mutual authentication in EPON has been reported in the literature to date.
ERIC Educational Resources Information Center
Pesonen, Inari
2008-01-01
This paper examines two improvisational processes, Authentic Movement (AM) and automatic drawing (AD), the possibility of their presentation to the viewer and the meanings such presentation may bring to the work presented. Improvisation has traditionally been used in the process of creating a finished work of art rather than in the finished art…
Digital camera with apparatus for authentication of images produced from an image file
NASA Technical Reports Server (NTRS)
Friedman, Gary L. (Inventor)
1993-01-01
A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing bears a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises first means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so that they will be available together. Apparatus for authenticating the image file at any time as being free of any alteration uses the public key to decrypt the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus then calculates an image hash from the image file using the same algorithm as before. By comparing this image hash with the secure image hash, the authenticity of the image file is established if they match, since even a single-bit change in the image file will cause its hash to differ completely from the secure hash.
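The hash-sign-verify flow the patent describes can be sketched with textbook RSA. The tiny key below is for illustration only and offers no security; a real camera would use full-size RSA or ECDSA, and the image bytes are a placeholder.

```python
import hashlib

# Textbook RSA with toy primes (insecure; illustration only).
p, q = 61, 53
n = p * q                          # public modulus, 3233
e = 17                             # public exponent (on the camera housing)
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (embedded in the camera)

def sign_image(image_file: bytes) -> list[int]:
    digest = hashlib.sha256(image_file).digest()
    # Encrypt the image hash with the private key -> digital signature.
    return [pow(b, d, n) for b in digest]

def authenticate(image_file: bytes, signature: list[int]) -> bool:
    # Decrypt the signature with the public key -> secure image hash ...
    secure_hash = bytes(pow(s, e, n) for s in signature)
    # ... and compare against a freshly computed hash of the image file.
    return secure_hash == hashlib.sha256(image_file).digest()

image = b"raw-image-bytes"        # placeholder image data
sig = sign_image(image)
assert authenticate(image, sig)
assert not authenticate(image + b"\x00", sig)   # any alteration is detected
```

Because the private key never leaves the camera, a matching hash pair is evidence that the file is unaltered since capture, which is exactly the patent's claim.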
Object migration and authentication. [in computer operating systems design
NASA Technical Reports Server (NTRS)
Gligor, V. D.; Lindsay, B. G.
1979-01-01
The paper presents a mechanism permitting a type manager to fabricate a migrated object representation which can be entrusted to other subsystems or transmitted outside of the control of a local computer system. The migrated object representation is signed by the type manager in such a way that the type manager's signature cannot be forged and the manager is able to authenticate its own signature. Subsequently, the type manager can retrieve the migrated representation and validate its contents before reconstructing the object in its original representation. This facility allows type managers to authenticate the contents of off-line or network storage and solves problems stemming from the hierarchical structure of the system itself.
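The type manager's self-authenticating migrated representation can be sketched with a keyed hash standing in for the manager's unforgeable signature: the key never leaves the manager, so only it can create or validate seals. The JSON encoding, key, and field names are assumptions of this sketch, not the paper's design.

```python
import hashlib
import hmac
import json

MANAGER_KEY = b"type-manager-secret"   # held only by the type manager

def migrate(obj: dict) -> bytes:
    # Fabricate a migrated representation that may leave the local system.
    rep = json.dumps(obj, sort_keys=True).encode()
    seal = hashlib.blake2b(rep, key=MANAGER_KEY, digest_size=16).hexdigest().encode()
    return rep + b"\n" + seal

def reconstruct(migrated: bytes) -> dict:
    # Validate the manager's own seal before trusting the contents.
    rep, seal = migrated.rsplit(b"\n", 1)
    expected = hashlib.blake2b(rep, key=MANAGER_KEY, digest_size=16).hexdigest().encode()
    if not hmac.compare_digest(seal, expected):
        raise ValueError("migrated representation failed authentication")
    return json.loads(rep)

obj = {"type": "mailbox", "owner": "user1", "state": [1, 2, 3]}
blob = migrate(obj)             # safe to entrust to the network or off-line storage
assert reconstruct(blob) == obj
```

Any modification of the externalized bytes invalidates the seal, so the manager can safely rebuild the object in its original representation only from blobs it itself produced.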
NASA Astrophysics Data System (ADS)
Markman, Adam; Carnicer, Artur; Javidi, Bahram
2017-05-01
We overview our recent work [1] on utilizing three-dimensional (3D) optical phase codes for object authentication using the random forest classifier. A simple 3D optical phase code (OPC) is generated by combining multiple diffusers and glass slides. This tag is then placed on a quick-response (QR) code, a barcode capable of storing information that can be scanned under non-uniform illumination conditions, rotation, and slight degradation. A coherent light source illuminates the OPC, and the transmitted light is captured by a CCD to record the unique signature. Features are extracted from the signature and fed into a pre-trained random forest classifier for authentication.
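The classification step can be sketched with a miniature random forest built from decision stumps. The synthetic "optical signatures" below are invented stand-ins for the CCD-captured features; a real system would train a full library implementation (e.g. scikit-learn) on measured data.

```python
import random

def train_forest(X, y, n_trees=25, seed=7):
    # Each "tree" is a stump: a random feature, a bootstrap-estimated split,
    # and the majority label of the bootstrap samples above the split.
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        f = rng.randrange(len(X[0]))
        boot = [rng.randrange(len(X)) for _ in range(len(X))]
        split = sum(X[i][f] for i in boot) / len(boot)
        above = [y[i] for i in boot if X[i][f] > split]
        label = 1 if above and sum(above) * 2 >= len(above) else 0
        forest.append((f, split, label))
    return forest

def authenticate(forest, x):
    # Majority vote across stumps: 1 = authentic tag, 0 = fake.
    votes = sum(label if x[f] > split else 1 - label for f, split, label in forest)
    return votes * 2 > len(forest)

rng = random.Random(0)
genuine = [[0.8 + rng.uniform(-0.05, 0.05) for _ in range(6)] for _ in range(10)]
fake = [[0.2 + rng.uniform(-0.05, 0.05) for _ in range(6)] for _ in range(10)]
forest = train_forest(genuine + fake, [1] * 10 + [0] * 10)

assert authenticate(forest, [0.82] * 6)       # genuine-looking signature
assert not authenticate(forest, [0.18] * 6)   # spoofed tag
```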
22 CFR 204.12 - Guaranty eligibility.
Code of Federal Regulations, 2010 CFR
2010-04-01
... representative of A.I.D. together with a certificate of authentication manually executed by a Paying Agent whose... signature shall be binding on A.I.D. The certificate of authentication of the Paying Agent issued pursuant...
Efficient cost-sensitive human-machine collaboration for offline signature verification
NASA Astrophysics Data System (ADS)
Coetzer, Johannes; Swanepoel, Jacques; Sabourin, Robert
2012-01-01
We propose a novel strategy for the optimal combination of human and machine decisions in a cost-sensitive environment. The proposed algorithm should be especially beneficial to financial institutions where off-line signatures, each associated with a specific transaction value, require authentication. When presented with a collection of genuine and fraudulent training signatures, produced by so-called guinea pig writers, the proficiency of a workforce of human employees and a score-generating machine can be estimated and represented in receiver operating characteristic (ROC) space. Using a set of Boolean fusion functions, the majority vote decision of the human workforce is combined with each threshold-specific machine-generated decision. The performance of the candidate ensembles is estimated and represented in ROC space, after which only the optimal ensembles and associated decision trees are retained. When presented with a questioned signature linked to an arbitrary writer, the system first uses the ROC-based cost gradient associated with the transaction value to select the ensemble that minimises the expected cost, and then uses the corresponding decision tree to authenticate the signature in question. We show that, when utilising the entire human workforce, the incorporation of a machine streamlines the authentication process and decreases the expected cost for all operating conditions.
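The cost-gradient selection step can be sketched as follows; the operating points, fraud prior, and cost figures are invented for illustration and are not taken from the paper.

```python
def expected_cost(far, frr, p_fraud, c_fa, c_fr):
    # c_fa: cost of accepting a forgery; c_fr: cost of rejecting a genuine signature.
    return c_fa * far * p_fraud + c_fr * frr * (1 - p_fraud)

# Candidate ensembles represented as operating points in ROC space: (FAR, FRR).
candidates = [(0.30, 0.01), (0.10, 0.05), (0.02, 0.20)]

def select(p_fraud, c_fa, c_fr):
    # Pick the ensemble that minimises the expected cost for this transaction.
    return min(candidates, key=lambda pt: expected_cost(pt[0], pt[1], p_fraud, c_fa, c_fr))

# A cheap transaction tolerates false acceptances; a high-value one does not,
# so the transaction value (via c_fa) steers the system to a stricter ensemble.
assert select(p_fraud=0.1, c_fa=1, c_fr=1) == (0.30, 0.01)
assert select(p_fraud=0.1, c_fa=50, c_fr=1) == (0.02, 0.20)
```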
Internet Protocol Security (IPSEC): Testing and Implications on IPv4 and IPv6 Networks
2008-08-27
Message Authentication Code-Message Digest 5-96). Due to the processing power consumption and slowness of public key authentication methods, RSA ... 1. a modular exponentiation (MODP) group with a 768-bit modulus, 2. a MODP group with a 1024-bit modulus, 3. an Elliptic Curve Group over GF[2^n] (EC2N) with a 155-bit ... nonces, digital signatures using the Digital Signature Algorithm, and the Rivest-Shamir-Adleman (RSA) algorithm. For more information about the
Chao, Hui-Mei; Hsu, Chin-Ming; Miaou, Shaou-Gang
2002-03-01
A data-hiding technique called the "bipolar multiple-number base" was developed to provide capabilities of authentication, integration, and confidentiality for an electronic patient record (EPR) transmitted among hospitals through the Internet. The proposed technique is capable of hiding EPR-related data, such as diagnostic reports, electrocardiograms, and digital signatures from doctors or a hospital, in a mark image. The mark image could be the mark of a hospital, used to identify the origin of an EPR. The digital signatures from doctors and the hospital can be applied for EPR authentication. Thus, different types of medical data can be integrated into the same mark image. Confidentiality is ultimately achieved because the hidden EPR-related data and digital signatures can only be recovered with an exact copy of the original mark image. The experimental results validate the integrity and the invisibility of the hidden EPR-related data. This newly developed technique allows all of the hidden data to be separated and restored perfectly by authorized users.
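The embed/extract cycle can be sketched with least-significant-bit hiding. This is a simple stand-in, not the authors' "bipolar multiple-number base" scheme (which the abstract does not specify); pixels are modeled as a flat list of 8-bit values.

```python
def embed(mark_pixels, payload: bytes):
    # Hide the payload, bit by bit, in the least significant bit of each pixel.
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(mark_pixels):
        raise ValueError("mark image too small for payload")
    stego = list(mark_pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit
    return stego

def extract(stego_pixels, n_bytes: int) -> bytes:
    # Recover the hidden bytes from the LSBs (same LSB-first bit order).
    bits = [p & 1 for p in stego_pixels[: n_bytes * 8]]
    return bytes(sum(bits[b * 8 + i] << i for i in range(8)) for b in range(n_bytes))

mark = [120, 121, 130, 90] * 100                # toy 400-pixel hospital mark image
epr = b"report|ECG|doctor-signature"            # placeholder EPR data and signatures
stego = embed(mark, epr)
assert extract(stego, len(epr)) == epr                     # perfect restoration
assert max(abs(a - b) for a, b in zip(stego, mark)) <= 1   # change is invisible
```

The last assertion illustrates the invisibility property: each pixel changes by at most one intensity level.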
Caranguian, Luther Paul R; Pancho-Festin, Susan; Sison, Luis G
2012-01-01
In this study, we focused on the interoperability and authentication of medical devices in the context of telemedical systems. A recent standard called the ISO/IEEE 11073 Personal Health Device (X73-PHD) Standards addresses the device interoperability problem by defining common protocols for the agent (medical device) and manager (appliance) interface. The X73-PHD standard, however, has not addressed security and authentication of medical devices, which is important in establishing the integrity of a telemedical system. We have designed and implemented a security policy within the X73-PHD standards. The policy enables device authentication using asymmetric-key cryptography with the RSA algorithm as the digital signature scheme. We used two approaches for performing the digital signatures: direct software implementation and the use of embedded security modules (ESM). The two approaches were evaluated and compared in terms of execution time and memory requirement. For standard 2048-bit RSA, the ESM calculates digital signatures in only 12% of the time required by the direct implementation. Moreover, analysis shows that the ESM offers additional security advantages, such as secure storage of keys, compared to the direct implementation. Interoperability with other systems was verified by testing the system with LNI Healthlink, a manager software that implements the X73-PHD standard. Lastly, a security analysis was done: the system's response to common attacks on authentication systems was analyzed, and several measures were implemented to protect the system against them.
Schwartze, J; Haarbrandt, B; Fortmeier, D; Haux, R; Seidel, C
2014-01-01
Integration of electronic signatures embedded in health care processes in Germany challenges health care service and supply facilities. The suitability of the signature level of an eligible authentication procedure is confirmed for a large part of documents in clinical practice. However, the concrete design of such a procedure remains unclear. The goal was to create a summary of usable user authentication systems suitable for clinical workflows. A systematic literature review was conducted on nine online bibliographic databases. Search keywords included authentication, access control, information systems, information security and biometrics, with the terms user authentication, user identification and login in title or abstract. Searches were run between 7 and 12 September 2011. Relevant conference proceedings were searched manually in February 2013. A backward reference search of the selected results was performed. Only publications fully describing authentication systems in use or usable were included; algorithms and purely theoretical concepts were excluded. Three authors performed the selection independently. Semi-structured extraction of system characteristics was done by the main author. The identified procedures were assessed for security and fulfillment of relevant laws and guidelines as well as for applicability. Suitability for clinical workflows was derived from the assessments using a weighted sum proposed by Bonneau. Of 7575 citations retrieved, 55 publications met our inclusion criteria. They describe 48 different authentication systems; 39 were biometric systems and nine were graphical password systems. Assessment of the authentication systems showed high error rates above European CENELEC standards and a lack of applicability of biometric systems. Graphical passwords did not add overall value compared to conventional passwords. Continuous authentication can add an additional layer of safety. Only a few systems are suitable, partially or entirely, for use in clinical processes.
Suitability strongly depends on national or institutional requirements. Four authentication systems seem to fulfill the requirements of authentication procedures for clinical workflows. Research is needed in the area of continuous authentication with biometric methods. A proper authentication system should combine all factors of authentication, implementing and connecting secure individual measures.
A Continuous Identity Authentication Scheme Based on Physiological and Behavioral Characteristics.
Wu, Guannan; Wang, Jian; Zhang, Yongrong; Jiang, Shuai
2018-01-10
Wearable devices have flourished over the past ten years, providing great advantages to people, and recently they have also been used for identity authentication. Most authentication methods adopt a one-time manner that cannot provide continuous certification. To address this issue, we present a two-step authentication method based on a custom-built fingertip sensor device that can capture motion data (e.g., acceleration and angular velocity) and physiological data (e.g., a photoplethysmography (PPG) signal) simultaneously. When the device is worn on the user's fingertip, it automatically recognizes whether the wearer is a legitimate user or not. More specifically, multisensor data is collected and analyzed to extract representative and intensive features. Then, human activity recognition is applied as the first step to enhance the practicability of the authentication system. After correctly discriminating the motion state, a one-class machine learning algorithm is applied for identity authentication as the second step. While a user wears the device, the authentication process is carried out automatically at set intervals. Analyses were conducted using data from 40 individuals across various operational scenarios. Extensive experiments were executed to examine the effectiveness of the proposed approach, which achieved an average accuracy rate of 98.5% and an F1-score of 86.67%. Our results suggest that the proposed scheme provides a feasible and practical solution for authentication.
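The second step (one-class identity check) can be sketched with a simple per-feature tolerance model. The feature values below are synthetic placeholders for the motion/PPG features; the real system's activity-recognition step and one-class classifier are more elaborate.

```python
def enroll(samples):
    # Learn the legitimate user's profile: per-feature mean and spread.
    n, dim = len(samples), len(samples[0])
    mean = [sum(s[i] for s in samples) / n for i in range(dim)]
    std = [max((sum((s[i] - mean[i]) ** 2 for s in samples) / n) ** 0.5, 1e-6)
           for i in range(dim)]
    return mean, std

def authenticate(model, x, k=3.0):
    # One-class decision: accept only if every feature lies within k sigma.
    mean, std = model
    return all(abs(xi - mi) <= k * si for xi, mi, si in zip(x, mean, std))

# Synthetic (acceleration, angular-velocity, PPG) features for the device owner.
owner = [[0.51, 0.20, 0.80], [0.49, 0.22, 0.78], [0.50, 0.19, 0.82],
         [0.52, 0.21, 0.79], [0.48, 0.20, 0.81]]
model = enroll(owner)

assert authenticate(model, [0.50, 0.21, 0.80])      # legitimate wearer
assert not authenticate(model, [0.90, 0.60, 0.30])  # imposter wearing the device
```

Run at set intervals on fresh sensor windows, this yields the continuous authentication behaviour the paper targets: only data resembling the enrolled wearer is accepted.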
Counterfeit-resistant materials and a method and apparatus for authenticating materials
Ramsey, J. Michael; Klatt, Leon N.
2001-01-01
Fluorescent dichroic fibers randomly incorporated within a medium provide an improved method for authentication and counterfeiting protection. The dichroism is provided by an alignment of fluorescent molecules along the length of the fibers. The fluorescent fibers provide an authentication mechanism of varying levels of capability. The authentication signature depends on four parameters: the x,y position, the dichroism, and the local environment. The availability of so many non-deterministic variables makes production of counterfeit articles (e.g., currency, credit cards, etc.) essentially impossible. Counterfeit-resistant articles, an apparatus for authenticating articles, and a process for forming counterfeit-resistant media are also provided.
The construction of a public key infrastructure for healthcare information networks in Japan.
Sakamoto, N
2001-01-01
The digital signature is a key technology in the forthcoming Internet society for electronic healthcare as well as for electronic commerce. Efficient exchange of authorized information with a digital signature in healthcare information networks requires the construction of a public key infrastructure (PKI). In order to introduce a PKI to healthcare information networks in Japan, we proposed the development of a user authentication system based on a PKI for user management, user authentication and privilege management of healthcare information systems. In this paper, we describe the design of the user authentication system and its implementation. The user authentication system provides a certification authority service and a privilege management service, and comprises a user authentication client and user authentication servers. It is designed on the basis of an X.509 PKI and is implemented using OpenSSL and OpenLDAP. It was incorporated into the financial information management system for the national university hospitals and has worked successfully for about one year. The hospitals plan to use it as a user authentication method for their whole healthcare information systems. One implementation of the system is free to the national university hospitals with permission of the Japanese Ministry of Education, Culture, Sports, Science and Technology. Another implementation is open to other healthcare institutes with the support of the Medical Information System Development Center (MEDIS-DC). We are moving toward a nation-wide construction of a PKI for healthcare information networks based on this system.
NASA Astrophysics Data System (ADS)
Ricci, R.; Chollet, G.; Crispino, M. V.; Jassim, S.; Koreman, J.; Olivar-Dimas, M.; Garcia-Salicetti, S.; Soria-Rodriguez, P.
2006-05-01
This article presents an overview of the SecurePhone project, with an account of the first results obtained. SecurePhone's primary aim is to realise a mobile phone prototype - the 'SecurePhone' - in which biometric authentication enables users to carry out secure, dependable transactions over a mobile network. The SecurePhone is based on a commercial PDA-phone, supplemented with specific software modules and a customised SIM card. It integrates in a single environment a number of advanced features: access to cryptographic keys through strong multimodal biometric authentication; appending and verification of digital signatures; and real-time exchange and interactive modification of (e-signed) documents and voice recordings. SecurePhone's 'biometric recogniser' is based on original research. A fused combination of three different biometric methods - speaker, face and handwritten signature verification - is exploited, with no need for dedicated hardware components. The adoption of non-intrusive, psychologically neutral biometric techniques is expected to mitigate the rejection problems that often inhibit the social use of biometrics, and to speed up the spread of e-signature technology. Successful biometric authentication grants access to SecurePhone's built-in e-signature services through a user-friendly interface. Special emphasis is accorded to the definition of a trustworthy security chain model covering all aspects of system operation. The SecurePhone is expected to boost m-commerce and open new scenarios for m-business and m-work, by changing the way people interact and by improving trust and confidence in information technologies, which are often considered intimidating and difficult to use. Exploitation plans will also explore other application domains (physical and logical access control, secured mobile communications).
Signature Verification Based on Handwritten Text Recognition
NASA Astrophysics Data System (ADS)
Viriri, Serestina; Tapamo, Jules-R.
Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm that verifies signatures even when they are composed of special unconstrained cursive characters which are superimposed and embellished. The algorithm extends the character-based signature verification technique. Experiments carried out on the GPDS signature database, and on an additional database created from signatures captured using the ePadInk tablet, show that the approach is effective and efficient, with a positive verification rate of 94.95%.
On Robust Key Agreement Based on Public Key Authentication
NASA Astrophysics Data System (ADS)
Hao, Feng
We describe two new attacks on the HMQV protocol. The first attack raises a serious question on the basic definition of "authentication" in HMQV, while the second attack is generally applicable to many other protocols. In addition, we present a new authenticated key agreement protocol called YAK. Our approach is to depend on well-established techniques such as Schnorr's signature. Among all the related protocols, YAK appears to be the simplest so far. We believe simplicity is an important engineering principle.
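Schnorr's signature, the well-established primitive YAK leans on, can be sketched with a toy subgroup. The parameters below are classroom-sized and insecure (illustration only), and a fresh random nonce k must be used per signature in practice.

```python
import hashlib

# Toy Schnorr group: q divides p - 1 and g has order q (insecure sizes).
p, q, g = 2039, 1019, 4

x = 123                # private key
y = pow(g, x, p)       # public key

def h(r: int, m: bytes) -> int:
    # Challenge hash e = H(r || m) reduced into the exponent group.
    return int.from_bytes(hashlib.sha256(str(r).encode() + m).digest(), "big") % q

def sign(m: bytes, k: int = 677):      # k must be fresh and random per signature
    r = pow(g, k, p)                   # commitment
    e = h(r, m)                        # challenge
    s = (k + x * e) % q                # response
    return r, s

def verify(m: bytes, sig) -> bool:
    # Check g^s == r * y^e (mod p), which holds since s = k + x*e.
    r, s = sig
    return pow(g, s, p) == (r * pow(y, h(r, m), p)) % p

msg = b"key agreement transcript"
sig = sign(msg)
assert verify(msg, sig)
r, s = sig
assert not verify(msg, (r, (s + 1) % q))   # a tampered response fails
```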
Basis for the implementation of digital signature in Argentine's health environment
NASA Astrophysics Data System (ADS)
Escobar, P. P.; Formica, M.
2007-11-01
The growth of telemedical applications and electronic transactions in health environments is paced by constant technological evolution. This implies a big cultural change for traditional medicine and for the users of hospital information systems, whose arrival is delayed, basically, by the lack of solid laws and a well-defined role-based infrastructure. The use of the digital signature as a means of identification, authentication, confidentiality and non-repudiation is the most suitable tool for securing electronic transactions and protecting patients' data. The implementation of a Public Key Infrastructure (PKI) in a health environment allows for authentication, encryption and the use of digital signatures to assure confidentiality and to control the movement of sensitive information. This work defines the minimum technological, legal and procedural basis for a successful PKI implementation and establishes the roles of the different actors in the chain of trust in the public health environment of Argentina.
Integrating Iris and Signature Traits for Personal Authentication Using User-Specific Weighting
Viriri, Serestina; Tapamo, Jules R.
2012-01-01
Biometric systems based on uni-modal traits are characterized by noisy sensor data, restricted degrees of freedom, non-universality and susceptibility to spoof attacks. Multi-modal biometric systems seek to alleviate some of these drawbacks by providing multiple pieces of evidence of the same identity. In this paper, a user-score-based weighting technique for integrating the iris and signature traits is presented. This user-specific weighting technique has proved to be an efficient and effective fusion scheme which increases the authentication accuracy rate of multi-modal biometric systems. The weights are used to indicate the importance of the matching scores output by each biometric trait. The experimental results show that our biometric system based on the integration of iris and signature traits achieves a false rejection rate (FRR) of 0.08% and a false acceptance rate (FAR) of 0.01%. PMID:22666032
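Score-level fusion with user-specific weights can be sketched as a weighted sum of normalized per-trait matching scores. The weights, scores, and threshold below are invented for illustration and are not the paper's values.

```python
def fuse(scores, weights):
    # Weighted sum of per-trait matching scores (each normalized to [0, 1]).
    return sum(w * s for w, s in zip(weights, scores))

def authenticate(scores, weights, threshold=0.6):
    return fuse(scores, weights) >= threshold

# Hypothetical user-specific weights: for this user, the iris matcher is
# more reliable than the signature matcher, so it carries more weight.
w_user = (0.7, 0.3)                            # (iris, signature)
assert authenticate((0.9, 0.4), w_user)        # fused score 0.75 -> accept
assert not authenticate((0.3, 0.8), w_user)    # fused score 0.45 -> reject
```

The second case shows the point of per-user weighting: a strong score on the less reliable trait alone is not enough to authenticate.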
ECDSA B-233 with Precomputation 1.0 Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draelos, Timothy; Schroeppel, Richard; Schoeneman, Barry
2009-12-11
This software, written in C, performs two functions: 1) the generation of digital signatures using ECDSA with the B-233 curve and a table of precomputed values, and 2) the generation and encryption of a table of precomputed values to support the generation of many digital signatures. The computationally expensive operations of ECDSA signature generation are precomputed, stored in a table, and protected with AES encryption. This allows digital signatures to be generated in low-power, computationally-constrained environments, such as are often found in non-proliferation monitoring applications. The encrypted, precomputed table and digital signature generation software are used to provide public key data authentication for sensor data. When digital data is presented for signing, a set of values from the table is decrypted and used to generate an ECDSA digital signature.
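The precomputation idea (do the expensive modular exponentiation ahead of time so each signature becomes cheap) can be sketched with toy DSA parameters. ECDSA over B-233 and the AES protection of the table are replaced here by an insecure classroom-sized group and an unencrypted table; this is a sketch of the technique, not of the OSTI software.

```python
import hashlib
import random

p, q, g = 2039, 1019, 4     # toy DSA group with q | p - 1 (insecure sizes)
x = 777                     # private key
y = pow(g, x, p)            # public key, used by verifiers of the sensor data

def H(m: bytes) -> int:
    return int.from_bytes(hashlib.sha256(m).digest(), "big") % q

def precompute_table(n: int, seed: int = 1):
    # The expensive per-signature work: one modular exponentiation per entry.
    rng = random.Random(seed)
    table = []
    while len(table) < n:
        k = rng.randrange(1, q)
        r = pow(g, k, p) % q
        if r:
            table.append((pow(k, -1, q), r))
    return table

def sign(m: bytes, table):
    # Signing is now just cheap modular arithmetic plus a table entry.
    while True:
        k_inv, r = table.pop()
        s = (k_inv * (H(m) + x * r)) % q
        if s:
            return r, s

def verify(m: bytes, sig) -> bool:
    r, s = sig
    w = pow(s, -1, q)
    v = (pow(g, (H(m) * w) % q, p) * pow(y, (r * w) % q, p)) % p % q
    return v == r

table = precompute_table(4)        # done offline, on a powerful machine
sig = sign(b"sensor reading 17", table)
assert verify(b"sensor reading 17", sig)
assert len(table) < 4              # each signature consumes table entries
```

Each table entry must be used at most once; in the real system the stored entries are additionally protected with AES so that a stolen table does not leak the nonces.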
Thermal imaging as a biometrics approach to facial signature authentication.
Guzman, A M; Goryawala, M; Wang, Jin; Barreto, A; Andrian, J; Rishe, N; Adjouadi, M
2013-01-01
A new thermal imaging framework with unique feature extraction and similarity measurements for face recognition is presented. The research premise is to design specialized algorithms that extract vasculature information, create a thermal facial signature and identify the individual. The proposed algorithm is fully integrated and consolidates the critical steps of feature extraction through the use of morphological operators, registration using the Linear Image Registration Tool, and matching through unique similarity measures designed for this task. The novel approach of developing a thermal signature template using four images taken at various instants of time ensures that unforeseen changes in the vasculature over time do not affect the biometric matching process, as the authentication process relies only on consistent thermal features. Thirteen subjects were used to test the developed technique on an in-house thermal imaging system. Matching using the similarity measures showed an average accuracy of 88.46% for skeletonized signatures and 90.39% for anisotropically diffused signatures. The highly accurate results obtained in the matching process clearly demonstrate the ability of the thermal infrared system to extend in application to other thermal imaging based systems. Empirical results applying this approach to an existing database of thermal images support this assertion.
NASA Astrophysics Data System (ADS)
Vielhauer, Claus; Croce Ferri, Lucilla
2003-06-01
Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders: the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on the handwritten signature and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) from the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval and dispute protocols, analyzing their requirements, advantages and disadvantages in relation to security requirements.
Identity-Based Authentication for Cloud Computing
NASA Astrophysics Data System (ADS)
Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao
Cloud computing is a recently developed technology for complex systems with massive-scale services shared among numerous users. Authentication of both users and services is therefore a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), when applied to cloud computing, becomes so complicated that users face a heavy load in both computation and communication. Based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, this paper presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with the model's great scalability, makes it well suited to massive-scale clouds.
Spectroscopically Enhanced Method and System for Multi-Factor Biometric Authentication
NASA Astrophysics Data System (ADS)
Pishva, Davar
This paper proposes a spectroscopic method and system for preventing spoofing of biometric authentication. One of its aims is to enhance biometric authentication with a spectroscopic method in a multifactor manner, such that a person's unique 'spectral signatures' or 'spectral factors' are recorded and compared in addition to a non-spectroscopic biometric signature, to reduce the likelihood of an imposter getting authenticated. Using 'spectral factors' extracted from reflectance spectra of real fingers and employing cluster analysis, the paper shows how an authentic fingerprint image presented by a real finger can be distinguished from an authentic fingerprint image embossed on an artificial finger, or molded on a fingertip cover worn by an imposter. The paper also shows how to augment two widely used biometric systems (fingerprint and iris recognition devices) with spectral biometric capabilities in a practical manner, without creating much overhead or inconveniencing their users.
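The liveness decision from clustered spectral factors can be illustrated with a nearest-centroid check. This is a toy sketch under invented numbers: the four "bands" and reflectance values below are illustrative, not measured data, and a nearest-centroid rule stands in for the paper's cluster analysis.

```python
import math

# Toy 4-band reflectance "spectral factors": real skin is assumed here to show
# a stronger NIR rise than an artificial finger material (illustrative values).
real_fingers = [[0.35, 0.42, 0.55, 0.70], [0.33, 0.40, 0.57, 0.72], [0.36, 0.44, 0.53, 0.69]]
fake_fingers = [[0.34, 0.41, 0.40, 0.38], [0.36, 0.43, 0.42, 0.40], [0.33, 0.39, 0.38, 0.37]]

def centroid(cluster):
    """Per-band mean of a cluster of spectra."""
    return [sum(col) / len(cluster) for col in zip(*cluster)]

REAL, FAKE = centroid(real_fingers), centroid(fake_fingers)

def classify(spectrum):
    """Nearest-centroid liveness check on the spectral factors."""
    d_real = math.dist(spectrum, REAL)
    d_fake = math.dist(spectrum, FAKE)
    return "real" if d_real < d_fake else "fake"

probe = [0.34, 0.41, 0.56, 0.71]  # spectrum measured from the presented finger
assert classify(probe) == "real"
assert classify([0.35, 0.42, 0.41, 0.39]) == "fake"
```

The spectral factor acts as an extra authentication factor: even a perfect fingerprint replica fails if its material's reflectance falls in the wrong cluster.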
A Secure and Efficient Threshold Group Signature Scheme
NASA Astrophysics Data System (ADS)
Zhang, Yansheng; Wang, Xueming; Qiu, Gege
The paper presents a secure and efficient threshold group signature scheme that addresses two problems of current threshold group signature schemes: conspiracy attacks and inefficiency. The proposed scheme separates a designated clerk, who is responsible for collecting and authenticating each individual signature, from the group. The designated clerk does not participate in the distribution of the group secret key, has his own public and private keys, and must sign part of the information of the threshold group signature after collecting the individual signatures. A verifier therefore has to validate the designated clerk's signature before verifying the group signature. The scheme is proved secure against conspiracy attacks and is shown to be more efficient than comparable schemes.
Fast, Accurate and Automatic Ancient Nucleosome and Methylation Maps with epiPALEOMIX.
Hanghøj, Kristian; Seguin-Orlando, Andaine; Schubert, Mikkel; Madsen, Tobias; Pedersen, Jakob Skou; Willerslev, Eske; Orlando, Ludovic
2016-12-01
The first epigenomes from archaic hominins (AH) and ancient anatomically modern humans (AMH) have recently been characterized, based, however, on a limited number of samples. The extent to which ancient genome-wide epigenetic landscapes can be reconstructed thus remains contentious. Here, we present epiPALEOMIX, an open-source and user-friendly pipeline that exploits post-mortem DNA degradation patterns to reconstruct ancient methylomes and nucleosome maps from shotgun and/or capture-enrichment data. Applying epiPALEOMIX to the sequence data underlying 35 ancient genomes including AMH, AH, equids and aurochs, we investigate the temporal, geographical and preservation range of ancient epigenetic signatures. We first assess the quality of inferred ancient epigenetic signatures within well-characterized genomic regions. We find that tissue-specific methylation signatures can be obtained across a wider range of DNA preparation types than previously thought, including when no particular experimental procedures have been used to remove deaminated cytosines prior to sequencing. We identify a large subset of samples for which DNA associated with nucleosomes is protected from post-mortem degradation, and nucleosome positioning patterns can be reconstructed. Finally, we describe parameters and conditions such as DNA damage levels and sequencing depth that limit the preservation of epigenetic signatures in ancient samples. When such conditions are met, we propose that epigenetic profiles of CTCF binding regions can be used to help data authentication. Our work, including epiPALEOMIX, opens the way for further investigations of ancient epigenomes through time, especially aimed at tracking possible epigenetic changes during major evolutionary, environmental, socioeconomic, and cultural shifts. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Zhao, Zhenguo; Shi, Wenbo
2014-01-01
Probabilistic signature schemes have been widely used in modern electronic commerce since they provide integrity, authenticity, and nonrepudiation. Recently, Wu and Lin proposed a novel probabilistic signature (PS) scheme using the bilinear square Diffie-Hellman (BSDH) problem, and extended it to a universal designated verifier signature (UDVS) scheme. In this paper, we analyze the security of Wu et al.'s PS and UDVS schemes. Through concrete attacks, we demonstrate that neither scheme is unforgeable. The security analysis shows that the schemes are not suitable for practical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hymel, Ross
The Public Key (PK) FPGA software performs asymmetric authentication using the 163-bit Elliptic Curve Digital Signature Algorithm (ECDSA) on an embedded FPGA platform. A digital signature is created over user-supplied data, and communication with a host system is performed via a Serial Peripheral Interface (SPI) bus. The software includes all components necessary for signing, including a custom random number generator for key creation and SHA-256 for data hashing.
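The hash-then-sign flow ECDSA performs can be sketched in pure Python. This is a didactic sketch only: it uses the prime-field curve secp256k1 rather than the binary-field 163-bit curve named above, and a fixed nonce for reproducibility, which is insecure in practice (a real signer, like the custom RNG mentioned above, must use a fresh unpredictable nonce per signature).

```python
import hashlib

# secp256k1 domain parameters (a prime-field curve standing in for sect163).
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p1, p2):
    """Affine point addition on y^2 = x^3 + 7 (mod P); None is the identity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                   # P + (-P) = infinity
    if p1 == p2:
        m = (3 * x1 * x1) * pow(2 * y1, -1, P) % P    # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P       # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

def sign(priv, msg, k):
    """ECDSA: hash the data, then (r, s) from nonce k and private key."""
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % N
    r = ec_mul(k, G)[0] % N
    s = pow(k, -1, N) * (z + r * priv) % N
    return (r, s)

def verify(pub, msg, sig):
    r, s = sig
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % N
    w = pow(s, -1, N)
    x = ec_add(ec_mul(z * w % N, G), ec_mul(r * w % N, pub))[0]
    return x % N == r

priv = 0x1234567890ABCDEF
pub = ec_mul(priv, G)
sig = sign(priv, b"user-supplied data", k=0x5555)  # fixed nonce: demo only
assert verify(pub, b"user-supplied data", sig)
assert not verify(pub, b"tampered data", sig)
```

The FPGA implementation performs the same three stages in hardware: SHA-256 over the data, nonce generation, and the scalar multiplications that dominate the cost.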
ERIC Educational Resources Information Center
Chen, Hao-Jan Howard
2011-01-01
Authentic videos are always motivational for foreign language learners. According to the findings of many empirical studies, subtitled L2 videos are particularly useful for foreign language learning. Although there are many authentic English videos available on the Internet, most of these videos do not have subtitles. If subtitles can be added to…
Kim, Daehee; Kim, Dongwan; An, Sunshin
2016-07-09
Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Because WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent work on dynamic packet size control in WSNs enhances the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless at the next hop, where the packet size can vary according to that hop's link quality. In this paper, we propose three source authentication schemes for code dissemination that support dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication in environments where the packet size changes at each hop, with smaller energy consumption.
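The baseline this work improves on, hash-chained code pages with a single signature, can be sketched as follows. This is the classic fixed-packet construction, not the paper's dynamic-packet-size schemes, and an HMAC with a demo key stands in for the base station's signature; it also shows why the embedded tokens break once a hop repacketizes the stream.

```python
import hashlib, hmac

BS_KEY = b"base-station-key"  # stand-in for the base station's signing key

def build_chain(pages):
    """Append to each page the hash of the packet that follows it, then sign
    only the hash of the first packet: verifying hop-by-hop authenticates
    the whole code image with a single signature."""
    packets, next_hash = [], b""
    for page in reversed(pages):
        packet = page + next_hash
        next_hash = hashlib.sha256(packet).digest()
        packets.append(packet)
    packets.reverse()
    signature = hmac.new(BS_KEY, next_hash, hashlib.sha256).digest()
    return packets, signature

def verify_stream(packets, signature):
    """Check the signed anchor, then each packet's embedded next-hash."""
    anchor = hashlib.sha256(packets[0]).digest()
    if not hmac.compare_digest(hmac.new(BS_KEY, anchor, hashlib.sha256).digest(), signature):
        return False
    for cur, nxt in zip(packets, packets[1:]):
        if cur[-32:] != hashlib.sha256(nxt).digest():
            return False
    return True

pages = [b"page-%d" % i for i in range(5)]
packets, sig = build_chain(pages)
assert verify_stream(packets, sig)
packets[2] = b"evil" + packets[2][4:]          # injected code page
assert not verify_stream(packets, sig)
```

Because each token is a hash over a specific packet boundary, re-splitting packets at a hop (as dynamic packet size control does) invalidates every downstream token, which is exactly the problem the proposed schemes address.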
Advanced information processing system: Authentication protocols for network communication
NASA Technical Reports Server (NTRS)
Harper, Richard E.; Adams, Stuart J.; Babikyan, Carol A.; Butler, Bryan P.; Clark, Anne L.; Lala, Jaynarayan H.
1994-01-01
In safety critical I/O and intercomputer communication networks, reliable message transmission is an important concern. Difficulties of communication and fault identification in networks arise primarily because the sender of a transmission cannot be identified with certainty, an intermediate node can corrupt a message without certainty of detection, and a babbling node cannot be identified and silenced without lengthy diagnosis and reconfiguration. Authentication protocols use digital signature techniques to verify the authenticity of messages with high probability. Such protocols appear to provide an efficient solution to many of these problems. The objective of this program is to develop, demonstrate, and evaluate intercomputer communication architectures which employ authentication. As a context for the evaluation, the authentication protocol-based communication concept was demonstrated under this program by hosting a real-time flight critical guidance, navigation and control algorithm on a distributed, heterogeneous, mixed redundancy system of workstations and embedded fault-tolerant computers.
New Capabilities in Security and QoS Using the Updated MANET Routing Protocol OLSRv2
2010-09-01
integrity, by the authentication of packets or messages, and confidentiality. These are discussed in the following sections. Issues of availability... fully specified in [2] is the addition of a TLV including a cryptographic signature that will allow the authentication of the received information... The objective is to ensure the integrity of the ad hoc network, that only authorised routers can join the network, because unauthorised routers will
Bercovich, A; Edan, Y; Alchanatis, V; Moallem, U; Parmet, Y; Honig, H; Maltz, E; Antler, A; Halachmi, I
2013-01-01
Body condition evaluation is a common tool to assess energy reserves of dairy cows and to estimate their fatness or thinness. This study presents a computer-vision tool that automatically estimates a cow's body condition score. Top-view images of 151 cows were collected on an Israeli research dairy farm using a digital still camera located at the entrance to the milking parlor. The cow's tailhead area and its contour were segmented and extracted automatically. Two types of features of the tailhead contour were extracted: (1) the angles and distances between 5 anatomical points; and (2) the cow signature, which is a 1-dimensional vector of the Euclidean distances from each point in the normalized tailhead contour to the shape center. Two methods were applied to describe the cow's signature and to reduce its dimension: (1) partial least squares regression, and (2) Fourier descriptors of the cow signature. Three prediction models were compared with manual scores of an expert. Results indicate that (1) it is possible to automatically extract and predict body condition from color images without any manual interference; and (2) Fourier descriptors of the cow's signature result in improved performance (R² = 0.77). Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
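The contour signature and its Fourier descriptors can be sketched directly. This is a generic illustration on a toy elliptical contour, not the study's data or exact pipeline; normalizing the DFT magnitudes by the DC term makes the descriptor insensitive to scale and to the starting point of the contour traversal.

```python
import cmath, math

def cow_signature(contour):
    """1-D signature: distance from each contour point to the shape center."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    return [math.hypot(x - cx, y - cy) for x, y in contour]

def fourier_descriptors(sig, n_coeffs=8):
    """Magnitudes of the low-order DFT coefficients, normalized by |F0|, so the
    descriptor ignores scale and the contour's starting point."""
    n = len(sig)
    coeffs = [sum(sig[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
              for k in range(n_coeffs)]
    f0 = abs(coeffs[0])
    return [abs(c) / f0 for c in coeffs[1:]]

# Toy elliptical "tailhead" contour sampled at 64 points.
contour = [(20 * math.cos(2 * math.pi * t / 64), 12 * math.sin(2 * math.pi * t / 64))
           for t in range(64)]
fd = fourier_descriptors(cow_signature(contour))

# Starting the traversal elsewhere on the contour (a cyclic shift of the
# signature) leaves the descriptor unchanged: DFT magnitudes are shift-invariant.
shifted = contour[10:] + contour[:10]
fd_shifted = fourier_descriptors(cow_signature(shifted))
assert all(abs(a - b) < 1e-9 for a, b in zip(fd, fd_shifted))
```

The resulting low-dimensional vector is the kind of compact representation that can then be fed to a regression model against expert body condition scores.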
Glove-based approach to online signature verification.
Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A
2008-06-01
Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel online signature verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique uses the SVD to find the r singular vectors sensing the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-principal subspace, signature authentication is performed by finding the angles between the different subspaces. The data glove is shown to be an effective high-bandwidth data entry device for signature verification. This SVD-based signature verification technique is tested and shown to recognize forgery signatures with a false acceptance rate of less than 1.2%.
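The subspace-angle idea can be sketched for the simplest case r = 1. This is a toy illustration under invented data, not the paper's system: power iteration on AᵀA recovers the dominant right singular vector of a small synthetic "glove data" matrix, and authentication compares the angle between enrolled and probe subspaces.

```python
import math, random

def top_right_singular_vector(A, iters=200):
    """Power iteration on A^T A: the right singular vector carrying the
    maximal energy of data matrix A (its 1-D principal subspace)."""
    n = len(A[0])
    v = [1.0] * n
    for _ in range(iters):
        Av = [sum(row[j] * v[j] for j in range(n)) for row in A]       # A v
        w = [sum(A[i][j] * Av[i] for i in range(len(A))) for j in range(n)]  # A^T (A v)
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def subspace_angle(u, v):
    """Angle between two 1-D principal subspaces (sign-insensitive)."""
    return math.acos(min(1.0, abs(sum(a * b for a, b in zip(u, v)))))

# Toy glove samples lying mainly along one finger-motion direction.
random.seed(0)
def samples(direction, n=40, noise=0.05):
    out = []
    for _ in range(n):
        s = random.uniform(0.8, 1.2)                    # gesture strength varies
        out.append([c * s + random.gauss(0, noise) for c in direction])
    return out

enrolled = top_right_singular_vector(samples([1.0, 2.0, 0.5]))
genuine  = top_right_singular_vector(samples([1.0, 2.0, 0.5]))
forgery  = top_right_singular_vector(samples([2.0, -1.0, 1.5]))

assert subspace_angle(enrolled, genuine) < 0.1     # same signer: small angle
assert subspace_angle(enrolled, forgery) > 0.5     # forger: large angle
```

With r > 1 the same idea uses principal angles between r-dimensional subspaces, but the acceptance rule (threshold on the angle) is unchanged.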
Code of Federal Regulations, 2010 CFR
2010-01-01
... (asymmetric) cryptography is a method of creating a unique mark, known as a digital signature, on an... cryptography means a method of authentication in which a single key is used to sign and verify an electronic...
An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks.
Zhu, Hongfei; Tan, Yu-An; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang
2018-05-22
With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives through, for example, e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on the number theorem research unit lattice; this scheme mainly uses rejection sampling instead of constructing a trapdoor. Meanwhile, the scheme does not depend on a complex public key infrastructure and can resist quantum computer attacks. We then design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other traditional identity-based blind signature schemes in signing and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.
Secure and Efficient Signature Scheme Based on NTRU for Mobile Payment
NASA Astrophysics Data System (ADS)
Xia, Yunhao; You, Lirong; Sun, Zhe; Sun, Zhixin
2017-10-01
Mobile payment is becoming more and more popular; however, traditional public-key encryption algorithms have hardware requirements too high for mobile terminals with limited computing resources. In addition, these public-key algorithms are not resistant to quantum computing. This paper studies the public-key algorithm NTRU with respect to quantum computation by analyzing the influence of the parameters q and k on the probability of generating a reasonable signature value. Two methods are proposed to improve this probability: first, increase the value of the parameter q; second, add an authentication condition during the signature phase that checks the reasonable-signature requirements. Experimental results show that the proposed signature scheme achieves zero leakage of the private key information from the signature value and increases the probability of generating a reasonable signature value. It also improves the signing rate and avoids the propagation of invalid signatures in the network, although the scheme places certain restrictions on parameter selection.
Shea, S; Sengupta, S; Crosswell, A; Clayton, P D
1992-01-01
The developing Integrated Academic Information System (IAIMS) at Columbia-Presbyterian Medical Center provides data sharing links between two separate corporate entities, namely Columbia University Medical School and The Presbyterian Hospital, using a network-based architecture. Multiple database servers with heterogeneous user authentication protocols are linked to this network. "One-stop information shopping" implies one log-on procedure per session, not separate log-on and log-off procedures for each server or application used during a session. These circumstances provide challenges at the policy and technical levels to data security at the network level and to ensuring smooth information access for end users of these network-based services. Five activities being conducted as part of our security project are described: (1) policy development; (2) an authentication server for the network; (3) Kerberos as a tool for providing mutual authentication, encryption, and time stamping of authentication messages; (4) a prototype interface using Kerberos services to authenticate users accessing a network database server; and (5) a Kerberized electronic signature.
Extraction of small boat harmonic signatures from passive sonar.
Ogden, George L; Zurk, Lisa M; Jones, Mark E; Peterson, Mary E
2011-06-01
This paper investigates the extraction of acoustic signatures from small boats using a passive sonar system. Noise radiated from a small boat consists of broadband noise and harmonically related tones that correspond to engine and propeller specifications. A signal processing method to automatically extract the harmonic structure of noise radiated from small boats is developed. The Harmonic Extraction and Analysis Tool (HEAT) estimates the instantaneous fundamental frequency of the harmonic tones, refines the fundamental frequency estimate using a Kalman filter, and automatically extracts the amplitudes of the harmonic tonals to generate a harmonic signature for the boat. Results are presented that show the HEAT algorithm's ability to extract these signatures. © 2011 Acoustical Society of America
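The fundamental-frequency refinement step can be sketched with a scalar random-walk Kalman filter. This is a minimal illustration, not the HEAT implementation: the process/measurement noise values and the 60 Hz toy track are invented for the example.

```python
def kalman_track(measurements, q=1e-3, r=4.0):
    """Scalar random-walk Kalman filter that refines a noisy instantaneous
    fundamental-frequency track before harmonic amplitudes are extracted."""
    x, p = measurements[0], r            # state estimate and its variance
    track = [x]
    for z in measurements[1:]:
        p += q                           # predict: random-walk process noise
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # update with the new measurement
        p *= (1 - k)
        track.append(x)
    return track

# A propeller turning steadily at 60 Hz, observed with +/-2 Hz jitter.
true_f0 = 60.0
noisy = [true_f0 + (2.0 if i % 2 == 0 else -2.0) for i in range(50)]
track = kalman_track(noisy)
assert abs(track[-1] - true_f0) < 0.5    # filtered estimate settles near 60 Hz
assert abs(noisy[-1] - true_f0) == 2.0   # raw measurements stay 2 Hz off
```

Once the fundamental is tracked, the k-th harmonic amplitude can be read off the spectrum near k times the filtered frequency, which is how a harmonic signature vector is assembled.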
A sessional blind signature based on quantum cryptography
NASA Astrophysics Data System (ADS)
Khodambashi, Siavash; Zakerolhosseini, Ali
2014-01-01
In this paper, we present a sessional blind signature protocol whose security is guaranteed by fundamental principles of quantum physics. It allows a message owner to get his message signed by an authorized signatory; however, the signatory is not capable of reading the message contents, and everyone can verify the authenticity of the message. For this purpose, we take advantage of a sessional signature as well as quantum entangled pairs which are generated with respect to it in our proposed protocol. We describe our proposed blind signature through an example and briefly discuss its unconditional security. Due to the feasibility of the protocol, it can be widely employed for e-payment, e-government, e-business, and so on.
Environmental Requirements for Authentication Protocols
2002-01-01
Engineering for Information Security, March 2001. 10. D. Chaum. Blind signatures for untraceable payments. In Advances in Cryptology: Proceedings of...the connection, the idea relies on a concept similar to blinding in the sense of Chaum [10], who used it effectively in the design of anonymous payment...digital signature on the key and a nonce provided by the server, in which the client's challenge response was independent of the type of cipher
Methods for using a biometric parameter in the identification of persons
Hively, Lee M [Philadelphia, TN]
2011-11-22
Brain waves are used as a biometric parameter to provide for authentication and identification of personnel. The brain waves are sampled using EEG equipment and are processed using phase-space distribution functions to compare digital signature data from enrollment of authorized individuals to data taken from a test subject to determine if the data from the test subject matches the signature data to a degree to support positive identification.
Nonintrusive multibiometrics on a mobile device: a comparison of fusion techniques
NASA Astrophysics Data System (ADS)
Allano, Lorene; Morris, Andrew C.; Sellahewa, Harin; Garcia-Salicetti, Sonia; Koreman, Jacques; Jassim, Sabah; Ly-Van, Bao; Wu, Dalei; Dorizzi, Bernadette
2006-04-01
In this article we test a number of score fusion methods for the purpose of multimodal biometric authentication. These tests were made for the SecurePhone project, whose aim is to develop a prototype mobile communication system enabling biometrically authenticated users to conclude legally binding m-contracts during a mobile phone call on a PDA. The three biometrics of voice, face and signature were selected because they are all traditional, non-intrusive and easy-to-use means of authentication which can readily be captured on a PDA. By combining multiple biometrics of relatively low security it may be possible to obtain a combined level of security which is at least as high as that provided by a PIN or handwritten signature, traditionally used for user authentication. As the relative success of different fusion methods depends on the database used and tests made, the database we used was recorded on a suitable PDA (the Qtek2020) and the test protocol was designed to reflect the intended application scenario, which is expected to use short text prompts. Not all of the fusion methods tested are original; they were selected for their suitability for implementation within the constraints imposed by the application. All of the methods tested are based on fusion of the match scores output by each modality. Though computationally simple, all four fusion methods tested obtain a significant performance increase and show very promising results.
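The simplest member of this family, weighted-sum fusion of normalized match scores, can be sketched as follows. This is a generic illustration, not the SecurePhone system: the raw scores, normalization bounds, weights, and threshold below are all invented for the example.

```python
def minmax_normalize(score, lo, hi):
    """Map a raw matcher score onto [0, 1] using bounds estimated on training data."""
    return (score - lo) / (hi - lo)

def weighted_sum_fusion(scores, weights):
    """Fuse per-modality match scores; weights typically reflect each
    modality's standalone accuracy and sum to 1."""
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical raw scores for one access attempt on the PDA.
voice = minmax_normalize(7.2, lo=0.0, hi=10.0)       # voice matcher score
face = 0.61                                          # already in [0, 1]
signature = minmax_normalize(143.0, lo=50.0, hi=200.0)

fused = weighted_sum_fusion([voice, face, signature], weights=[0.4, 0.3, 0.3])
THRESHOLD = 0.5                                      # tuned on a development set
decision = fused >= THRESHOLD
assert 0.0 <= fused <= 1.0
assert decision                                      # this attempt is accepted
```

Because fusion happens at the score level, each matcher stays a black box; only its output range and a weight need to be calibrated, which suits the PDA's implementation constraints.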
Data to hardware binding with physical unclonable functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamlet, Jason
The various technologies presented herein relate to binding data (e.g., software) to the hardware that is to utilize the data. The generated binding can be used to detect whether the hardware or the data has been modified between an initial moment (enrollment) and a later moment (authentication). During enrollment, an enrollment value is generated that includes a signature of the data, a first response from a PUF located on the hardware, and a code word. During authentication, a second response from the PUF is utilized to authenticate the content of the enrollment value, and based upon this authentication, a determination can be made as to whether the hardware and/or the data have been modified. If modification is detected, a mitigating operation can be performed, e.g., the hardware is prevented from utilizing the data. If no modification is detected, the data can be utilized.
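The enrollment/authentication flow can be sketched with hashes. This is a simplified illustration, not the patented construction: a keyed hash stands in for the physical PUF, the PUF is assumed noiseless (a real PUF's noisy response is first stabilized using the code word and error correction), and all function names are invented.

```python
import hashlib
from secrets import token_bytes

def puf_response(challenge, device_seed):
    """Stand-in PUF: a real PUF derives its response from physical variation
    in the silicon; here a keyed hash plays that role for illustration."""
    return hashlib.sha256(device_seed + challenge).digest()

def enroll(data, challenge, device_seed):
    """Bind the data to this hardware instance at enrollment time."""
    sig = hashlib.sha256(data).digest()           # signature of the data
    r1 = puf_response(challenge, device_seed)     # first PUF response
    return hashlib.sha256(sig + r1).digest()      # enrollment value

def authenticate(data, challenge, device_seed, enrollment_value):
    """Re-derive the binding from a fresh PUF response; any change to the
    data or the hardware breaks the match."""
    sig = hashlib.sha256(data).digest()
    r2 = puf_response(challenge, device_seed)     # second PUF response
    return hashlib.sha256(sig + r2).digest() == enrollment_value

firmware = b"device firmware v1.0"
seed, challenge = token_bytes(16), b"challenge-01"
ev = enroll(firmware, challenge, seed)
assert authenticate(firmware, challenge, seed, ev)                  # unmodified
assert not authenticate(b"patched firmware", challenge, seed, ev)   # data modified
assert not authenticate(firmware, challenge, token_bytes(16), ev)   # other hardware
```

On a failed check the mitigating operation described above would fire, for example refusing to boot the firmware.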
[Study of automatic marine oil spills detection using imaging spectroscopy].
Liu, De-Lian; Han, Liang; Zhang, Jian-Qi
2013-11-01
To reduce manual auxiliary work in the oil spill detection process, an automatic oil spill detection method based on an adaptive matched filter is presented. First, the characteristics of the reflectance spectral signature of the C-H bond in an oil spill are analyzed, and an oil spill spectral signature extraction model is designed using this spectral feature; it is then used to obtain the reference spectral signature for the subsequent oil spill detection step. Second, the reflectance spectral signatures of sea water, clouds, and oil spill are compared, and the bands in which they differ most are selected. Using these bands, the sea water pixels are segmented, and the background parameters are then calculated. Finally, the classical adaptive matched filter from target detection is improved and applied to oil spill detection. The proposed method is applied to a real airborne visible/infrared imaging spectrometer (AVIRIS) hyperspectral image captured during the Deepwater Horizon oil spill in the Gulf of Mexico. The results show that the proposed method is highly efficient, requires no manual assistance, and can be used for automatic detection of marine oil spills.
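The per-pixel detector score can be sketched for the two-band case. This is the textbook adaptive matched filter, not the paper's improved variant: a diagonal background covariance is assumed for simplicity, and the band values below are invented for the example.

```python
def amf_score(pixel, mean, var, target):
    """Adaptive matched filter with a diagonal background covariance C:
    score = (s^T C^-1 (x - mu))^2 / (s^T C^-1 s)."""
    xm = [p - m for p, m in zip(pixel, mean)]
    num = sum(t * d / v for t, d, v in zip(target, xm, var)) ** 2
    den = sum(t * t / v for t, v in zip(target, var))
    return num / den

# Two illustrative bands: a band near a C-H absorption feature and a reference band.
water_mean = [0.30, 0.32]      # background (sea water) statistics from segmented pixels
water_var = [0.0004, 0.0004]
oil_sig = [0.12, -0.02]        # reference oil-spill signature, as an offset from water

oil_pixel = [0.43, 0.31]
water_pixel = [0.31, 0.33]
assert amf_score(oil_pixel, water_mean, water_var, oil_sig) > \
       amf_score(water_pixel, water_mean, water_var, oil_sig)
```

Pixels whose score exceeds a threshold are flagged as oil, so the only inputs the detector needs are the reference signature and the background statistics, both of which the pipeline above estimates automatically.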
Structural analysis of paintings based on brush strokes
NASA Astrophysics Data System (ADS)
Sablatnig, Robert; Kammerer, Paul; Zolda, Ernestine
1998-05-01
The origin of works of art often cannot be attributed to a certain artist. Likewise it is difficult to say whether paintings or drawings are originals or forgeries. In various fields of art, new technical methods are used to examine the age, the state of preservation and the origin of the materials used. For the examination of paintings, radiological methods like X-ray and infra-red diagnosis, digital radiography, computer tomography, etc., and color analyses are employed to authenticate art. But all these methods do not relate certain characteristics in an artwork to a specific artist -- the artist's personal style. In order to study this personal style of a painter, experts in art history and image processing try to examine the 'structural signature' based on brush strokes within paintings, in particular in portrait miniatures. A computer-aided classification and recognition system for portrait miniatures is developed, which enables semi-automatic classification and forgery detection based on content, color, and brush strokes. A hierarchically structured classification scheme is introduced which separates the classification into three different levels of information: color, shape of region, and structure of brush strokes.
NASA Astrophysics Data System (ADS)
Zhang, Jianguo; Chen, Xiaomeng; Zhuang, Jun; Jiang, Jianrong; Zhang, Xiaoyan; Wu, Dongqing; Huang, H. K.
2003-05-01
In this paper, we present a new security approach to provide security measures and features in both healthcare information systems (PACS, RIS/HIS) and the electronic patient record (EPR). We introduce two security components, a certificate authority (CA) system and a digital signature of patient record (DSPR) management system, as well as electronic envelope technology, into the current hospital healthcare information infrastructure to provide security measures and functions such as confidentiality or privacy, authenticity, integrity, reliability, non-repudiation, and authentication, both for the daily operation of in-house healthcare information systems and for EPR exchange among hospitals or healthcare administration levels. The DSPR component manages all the digital signatures of patient medical records signed using asymmetric-key encryption technologies. The electronic envelopes used for EPR exchange are created based on the information of the signers, digital signatures, and identifications of patient records stored in the CA and DSPR systems, as well as the destinations and the remote users. The CA and DSPR systems were developed and integrated into a RIS-integrated PACS, and the integration of these new security components is seamless and painless. The electronic envelopes designed for the EPR were used successfully in multimedia data transmission.
Bland, Andrew J; Topping, Annie; Tobbell, Jane
2014-07-01
High-fidelity patient simulation is a method of education increasingly utilised by educators of nursing to provide authentic learning experiences. Fidelity and authenticity, however, are not conceptually equivalent. Whilst fidelity is important when striving to replicate a life experience such as clinical practice, authenticity can be produced with low fidelity. A challenge for educators of undergraduate nursing is to ensure authentic representation of the clinical situation which is a core component for potential success. What is less clear is the relationship between fidelity and authenticity in the context of simulation based learning. Authenticity does not automatically follow fidelity and as a result, educators of nursing cannot assume that embracing the latest technology-based educational tools will in isolation provide a learning environment perceived authentic by the learner. As nursing education programmes increasingly adopt simulators that offer the possibility of representing authentic real world situations, there is an urgency to better articulate and understand the terms fidelity and authenticity. Without such understanding there is a real danger that simulation as a teaching and learning resource in nurse education will never reach its potential and be misunderstood, creating a potential barrier to learning. This paper examines current literature to promote discussion within nurse education, concluding that authenticity in the context of simulation-based learning is complex, relying on far more than engineered fidelity. Copyright © 2014 Elsevier Ltd. All rights reserved.
Jin, Chunhua; Xu, Chunxiang; Zhang, Xiaojun; Zhao, Jining
2015-03-01
Radio Frequency Identification (RFID) is an automatic identification technology which can be widely used in healthcare environments to locate and track staff, equipment and patients. However, potential security and privacy problems in RFID systems remain a challenge. In this paper, we design a mutual authentication protocol for RFID based on elliptic curve cryptography (ECC). We use a pre-computing method within the tag's communication, so that our protocol achieves better efficiency. In terms of security, our protocol provides confidentiality, unforgeability, mutual authentication, tag anonymity, availability and forward security, and it overcomes the weaknesses of existing protocols. Therefore, our protocol is suitable for healthcare environments.
Authentication of digital video evidence
NASA Astrophysics Data System (ADS)
Beser, Nicholas D.; Duerr, Thomas E.; Staisiunas, Gregory P.
2003-11-01
In response to a requirement from the United States Postal Inspection Service, the Technical Support Working Group tasked The Johns Hopkins University Applied Physics Laboratory (JHU/APL) to develop a technique that will ensure the authenticity, or integrity, of digital video (DV). Verifiable integrity is needed if DV evidence is to withstand a challenge to its admissibility in court on the grounds that it can be easily edited. Specifically, the verification technique must detect additions, deletions, or modifications to DV and satisfy the two-part criteria pertaining to scientific evidence as articulated in Daubert et al. v. Merrell Dow Pharmaceuticals Inc., 43 F3d (9th Circuit, 1995). JHU/APL has developed a prototype digital video authenticator (DVA) that generates digital signatures based on public key cryptography at the frame level of the DV. Signature generation and recording are accomplished at the same time as the DV is recorded by the camcorder. Throughput supports the consumer-grade camcorder data rate of 25 Mbps. The DVA software is implemented on a commercial laptop computer, which is connected to a commercial digital camcorder via the IEEE-1394 serial interface. A security token provides agent identification and the interface to the public key infrastructure (PKI) that is needed for management of the public keys central to DV integrity verification.
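A minimal sketch of frame-level integrity protection in the spirit of the DVA: hash each frame chained to its predecessor, so additions, deletions, and modifications all break subsequent tags. For brevity a keyed HMAC stands in for the public-key signature the real system uses; the key name and frame contents are made up.

```python
# Sketch: chained frame hashes + per-frame authentication tags.
# HMAC stands in for the DVA's public-key signatures (illustrative only).
import hashlib, hmac

KEY = b"agent-signing-key"   # stand-in for the agent's private key

def sign_frames(frames):
    """Chain frame hashes so insertions, deletions, and edits break
    every subsequent tag, then authenticate each link."""
    tags, prev = [], b"\x00" * 32
    for frame in frames:
        link = hashlib.sha256(prev + frame).digest()
        tags.append(hmac.new(KEY, link, hashlib.sha256).digest())
        prev = link
    return tags

def verify_frames(frames, tags):
    prev, ok = b"\x00" * 32, len(frames) == len(tags)
    for frame, tag in zip(frames, tags):
        link = hashlib.sha256(prev + frame).digest()
        ok &= hmac.compare_digest(hmac.new(KEY, link, hashlib.sha256).digest(), tag)
        prev = link
    return ok

video = [b"frame-0", b"frame-1", b"frame-2"]
tags = sign_frames(video)
assert verify_frames(video, tags)                     # untouched video passes
assert not verify_frames([video[0], video[2]], tags)  # deletion detected
```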
Current Federal Identity Management and the Dynamic Signature Biometrics Option
2009-03-01
A lack of general public knowledge on biometrics combined with a lack of open discussion and detailed product advertising has created an atmosphere...Authentication: Fingerprint Sensor Product Guidelines. Version 1.03, September 2003. http://www.intel.com/design/mobile/platform
Multimodal person authentication on a smartphone under realistic conditions
NASA Astrophysics Data System (ADS)
Morris, Andrew C.; Jassim, Sabah; Sellahewa, Harin; Allano, Lorene; Ehlers, Johan; Wu, Dalei; Koreman, Jacques; Garcia-Salicetti, Sonia; Ly-Van, Bao; Dorizzi, Bernadette
2006-05-01
Verification of a person's identity by the combination of more than one biometric trait strongly increases the robustness of person authentication in real applications. This is particularly the case in applications involving signals of degraded quality, as for person authentication on mobile platforms. The context of mobility generates degradations of input signals due to the variety of environments encountered (ambient noise, lighting variations, etc.), while the sensors' lower quality further contributes to a decrease in system performance. Our aim in this work is to combine traits from the three biometric modalities of speech, face and handwritten signature in a concrete application, performing non-intrusive biometric verification on a personal mobile device (smartphone/PDA). Most available biometric databases have been acquired in more or less controlled environments, which makes it difficult to predict performance in a real application. Our experiments are performed on a database acquired on a PDA as part of the SecurePhone project (IST-2002-506883 project "Secure Contracts Signed by Mobile Phone"). This database contains 60 virtual subjects balanced in gender and age. Virtual subjects are obtained by coupling audio-visual signals from real English-speaking subjects with signatures from other subjects captured on the touch screen of the PDA. Video data for the PDA database was recorded in 2 recording sessions separated by at least one week. Each session comprises 4 acquisition conditions: 2 indoor and 2 outdoor recordings (with, in each case, a good and a degraded quality recording). Handwritten signatures were captured in one session in realistic conditions. Different scenarios of matching between training and test conditions are tested to measure the resistance of various fusion systems to different types of variability and different amounts of enrolment data.
A Secure and Robust Object-Based Video Authentication System
NASA Astrophysics Data System (ADS)
He, Dajun; Sun, Qibin; Tian, Qi
2004-12-01
An object-based video authentication system, which combines watermarking, error correction coding (ECC), and digital signature techniques, is presented for protecting the authenticity between video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART) coefficients is selected as the feature to represent the video object and the background, respectively. ECC and cryptographic hashing are applied to those selected coefficients to generate the robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT) coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling while securely preventing malicious object modifications. The proposed solution can be further incorporated into public key infrastructure (PKI).
The application of data encryption technology in computer network communication security
NASA Astrophysics Data System (ADS)
Gong, Lina; Zhang, Li; Zhang, Wei; Li, Xuhong; Wang, Xia; Pan, Wenwen
2017-04-01
With the rapid development of the Internet and the extensive application of computer technology, information security has become an increasingly serious concern, and information security technology with data encryption technology at its core has developed greatly. Data encryption technology not only can encrypt and decrypt data, but can also realize digital signature, authentication, and other functions, thus ensuring the confidentiality, integrity and confirmation of data transmitted over the network. In order to improve the security of data in network communication, this paper uses a hybrid encryption system: data are encrypted and decrypted with the high-security triple DES algorithm, and the triple DES keys are in turn encrypted with the RSA algorithm, thus ensuring the security of the triple DES keys and solving the problem of key management. At the same time, digital signatures are realized using Java security software to ensure data integrity and non-repudiation. Finally, the data encryption system is developed in the Java language. The data encryption system is simple and effective, with good security and practicality.
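The hybrid pattern described above (a symmetric cipher for the bulk data, an asymmetric cipher to wrap the symmetric key) can be sketched as follows. The paper uses triple DES and RSA via Java; Python's standard library ships neither, so a SHA-256 counter keystream stands in for the bulk cipher and textbook RSA with deliberately tiny primes (n = 61 x 53) stands in for key wrapping. Both stand-ins are insecure toys, shown only to make the key-management flow concrete.

```python
# Sketch: hybrid encryption — bulk cipher on the data, RSA on the key.
import hashlib, secrets

# Toy RSA key (n = 61 * 53); real systems use >= 2048-bit moduli.
N, E, D = 3233, 17, 2753

def keystream(key, n):
    """Stand-in bulk cipher: SHA-256 counter-mode keystream."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(data, ks):
    return bytes(a ^ b for a, b in zip(data, ks))

def rsa_wrap(key_bytes):
    """Wrap each session-key byte with toy RSA (illustration only)."""
    return [pow(b, E, N) for b in key_bytes]

def rsa_unwrap(wrapped):
    return bytes(pow(c, D, N) for c in wrapped)

session_key = secrets.token_bytes(16)
msg = b"confidential report"
ciphertext = xor(msg, keystream(session_key, len(msg)))
wrapped = rsa_wrap(session_key)        # only the RSA key holder can recover
recovered = rsa_unwrap(wrapped)
assert xor(ciphertext, keystream(recovered, len(ciphertext))) == msg
```

The design point the paper makes survives the simplification: the symmetric key never travels in the clear, so key distribution reduces to protecting one RSA private key.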
USign--a security enhanced electronic consent model.
Li, Yanyan; Xie, Mengjun; Bian, Jiang
2014-01-01
Electronic consent is becoming increasingly popular in the healthcare sector given the many benefits it provides. However, security concerns, e.g., how to verify the identity of a person who is remotely accessing the electronic consent system in a secure and user-friendly manner, also arise along with the popularity of electronic consent. Unfortunately, existing electronic consent systems do not pay sufficient attention to those issues. They mainly rely on conventional password-based authentication to verify the identity of an electronic consent user, which is far from sufficient given that the threat of identity theft is real and significant. In this paper, we present a security-enhanced electronic consent model called USign. USign enhances identity protection and authentication for electronic consent systems by leveraging handwritten signatures, with which everyone is familiar, and mobile computing technologies that are becoming ubiquitous. We developed a prototype of USign and conducted a preliminary evaluation of the accuracy and usability of signature verification. Our experimental results show the feasibility of the proposed model.
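The abstract does not say which comparison algorithm USign uses, so the following is only a sketch of one common technique for comparing captured signature traces: dynamic time warping (DTW) over stroke coordinates, which tolerates differences in writing speed. The sample point sequences are hypothetical.

```python
# Sketch: dynamic time warping over (x, y) stroke points — a common
# dynamic-signature comparison technique (assumed, not USign's algorithm).
def dtw_distance(a, b):
    """DTW alignment cost between two stroke sequences of (x, y) points."""
    INF = float("inf")
    n, m = len(a), len(b)
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i-1][0] - b[j-1][0]) + abs(a[i-1][1] - b[j-1][1])
            cost[i][j] = d + min(cost[i-1][j], cost[i][j-1], cost[i-1][j-1])
    return cost[n][m]

enrolled = [(0, 0), (1, 2), (2, 3), (4, 3)]
genuine  = [(0, 0), (1, 2), (2, 3), (4, 3), (4, 3)]  # slower, same shape
forgery  = [(0, 3), (3, 0), (1, 1), (5, 5)]
assert dtw_distance(enrolled, genuine) < dtw_distance(enrolled, forgery)
```

A verifier would enrol several reference signatures and accept a new trace when its DTW distance to the references falls below a tuned threshold.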
NASA Astrophysics Data System (ADS)
Lenzini, Gabriele
We describe an existing software architecture for context and proximity aware services that enables trust-based and context-aware authentication. A service is proximity aware when it automatically detects the presence of entities in its proximity. Authentication is context-aware when it uses contextual information to discern among different identities and to evaluate to which extent they are authentic. The software architecture that we describe here is functioning in our Institute: it manages a sensor network to detect the presence and location of users and their devices. A context manager is responsible for merging the different sources of contextual information, resolving potential contradictions, and determining the level of authentication of the identity of the person approaching one of the services offered in the coffee-break corners of our Institute. In our solution for context-aware authentication, sensors are managed as if they were recommenders having subjective belief, disbelief, and uncertainty (i.e., trust) on the position and identity of users. A sensor's subjective trust depends on what it has been sensing in the environment. We discuss the results of an array of simulations that we conducted to validate our concept of trust-based and context-aware authentication. We use Subjective Logic to manage trust.
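The belief/disbelief/uncertainty triples mentioned above are Subjective Logic opinions, and combining two sensors' opinions is done with a fusion operator. A minimal sketch of the standard cumulative fusion operator follows; the sensor readings are made-up illustrations, not values from the paper's simulations.

```python
# Sketch: cumulative fusion of two Subjective Logic opinions (b, d, u),
# the operator typically used to merge independent recommenders.
def fuse(op1, op2):
    (b1, d1, u1), (b2, d2, u2) = op1, op2
    k = u1 + u2 - u1 * u2            # normalisation (requires u1, u2 > 0)
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)

badge_reader = (0.7, 0.1, 0.2)   # fairly confident it is Alice
camera       = (0.5, 0.1, 0.4)   # weaker evidence, same identity
b, d, u = fuse(badge_reader, camera)
assert abs(b + d + u - 1.0) < 1e-9   # opinions stay normalised
assert u < min(0.2, 0.4)             # fusion reduces uncertainty
assert b > 0.7                       # agreeing sensors raise belief
```

A context manager of the kind described above would fuse every available sensor opinion and grant the service only when the resulting belief clears a threshold.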
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykstra, D.; Blomer, J.
Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.
Supervised Multi-Authority Scheme with Blind Signature for IoT with Attribute Based Encryption
NASA Astrophysics Data System (ADS)
Nissenbaum, O. V.; Ponomarov, K. Y.; Zaharov, A. A.
2018-04-01
This article proposes a three-side cryptographic scheme for verifying device attributes with a Supervisor and a Certification Authority (CA) for attribute-based encryption. Two options are suggested: using a message authentication code and using a digital signature. The first version is suitable for networks with one CA, and the second for networks with several CAs, including dynamic systems. We also propose augmenting this scheme with a blind signature to preserve the confidentiality of the device attributes from the CA. The introduction gives a definition and a brief historical overview of attribute-based encryption (ABE) and addresses the use of ABE in the Internet of Things.
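The first option, a message authentication code over the device's attribute set, can be sketched with a shared key between the Supervisor and the single CA. The key, the device identifier, and the attribute encoding below are illustrative assumptions, not the paper's exact construction.

```python
# Sketch of the MAC option: an HMAC tag over a device's attribute set,
# computed under a key shared by Supervisor and CA (single-CA setting).
import hashlib, hmac

SHARED_KEY = b"supervisor-ca-shared-key"   # hypothetical shared key

def attribute_tag(device_id, attributes):
    # Sort attributes so the encoding is order-independent.
    msg = device_id.encode() + b"|" + b",".join(sorted(a.encode() for a in attributes))
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()

def check(device_id, attributes, tag):
    return hmac.compare_digest(attribute_tag(device_id, attributes), tag)

tag = attribute_tag("sensor-42", {"indoor", "temperature"})
assert check("sensor-42", {"temperature", "indoor"}, tag)
assert not check("sensor-42", {"indoor", "actuator"}, tag)  # forged attribute
```

The digital-signature option replaces the shared key with each CA's key pair, which is what removes the single-CA restriction.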
22 CFR 92.78 - Translating documents.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Service are not authorized to translate documents or to certify to the correctness of translations... of a translation; to take an acknowledgment of the preparation of a translation; and to authenticate the seal and signature of a local official affixed to a translation. Separate fees should be charged...
Flexible and Transparent User Authentication for Mobile Devices
NASA Astrophysics Data System (ADS)
Clarke, Nathan; Karatzouni, Sevasti; Furnell, Steven
The mobile device has become a ubiquitous technology that is capable of supporting an increasingly large array of services, applications and information. Given their increasing importance, it is imperative to ensure that such devices are not misused or abused. Unfortunately, a key enabling control to prevent this, user authentication, has not kept up with the advances in device technology. This paper presents the outcomes of a two-year study that proposes the use of transparent and continuous biometric authentication of the user: providing more comprehensive identity verification; minimizing user inconvenience; and providing security throughout the period of use. A Non-Intrusive and Continuous Authentication (NICA) system is described that maintains a continuous measure of confidence in the identity of the user, removing access to sensitive services and information with low confidence levels and providing automatic access with higher confidence levels. An evaluation of the framework is undertaken from an end-user perspective via a trial involving 27 participants. Whilst the findings raise concerns over education, privacy and intrusiveness, overall 92% of users felt the system offered a more secure environment when compared to existing forms of authentication.
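The continuous confidence measure described above can be sketched as a score that decays during inactivity and is updated by transparent biometric samples, gating services by sensitivity. The decay constant, update weights, and thresholds below are illustrative assumptions, not NICA's actual parameters.

```python
# Sketch of a NICA-style continuous confidence monitor (parameters assumed).
import math

class ConfidenceMonitor:
    def __init__(self, half_life_s=300.0):
        self.confidence = 0.0
        self.rate = math.log(2) / half_life_s   # exponential decay constant

    def decay(self, elapsed_s):
        """Confidence erodes while no biometric evidence arrives."""
        self.confidence *= math.exp(-self.rate * elapsed_s)

    def observe(self, match_score):
        """match_score in [0, 1] from a background face/voice sample."""
        self.confidence = 0.7 * self.confidence + 0.3 * match_score

    def allowed(self, sensitivity):
        """High-sensitivity services demand high confidence."""
        return self.confidence >= sensitivity

m = ConfidenceMonitor()
m.observe(0.95)               # a good transparent sample
assert m.allowed(0.2)         # low-sensitivity service opens automatically
m.decay(3600)                 # an hour with no evidence
assert not m.allowed(0.2)     # sensitive access now requires re-authentication
```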
Huang, Qinlong; Yang, Yixian; Shi, Yuxiang
2018-02-24
With the growing number of vehicles and popularity of various services in vehicular cloud computing (VCC), message exchanging among vehicles under traffic conditions and in emergency situations is one of the most pressing demands, and has attracted significant attention. However, it is an important challenge to authenticate the legitimate sources of broadcast messages and achieve fine-grained message access control. In this work, we propose SmartVeh, a secure and efficient message access control and authentication scheme in VCC. A hierarchical, attribute-based encryption technique is utilized to achieve fine-grained and flexible message sharing, which ensures that vehicles whose persistent or dynamic attributes satisfy the access policies can access the broadcast message with equipped on-board units (OBUs). Message authentication is enforced by integrating an attribute-based signature, which achieves message authentication and maintains the anonymity of the vehicles. In order to reduce the computations of the OBUs in the vehicles, we outsource the heavy computations of encryption, decryption and signing to a cloud server and road-side units. The theoretical analysis and simulation results reveal that our secure and efficient scheme is suitable for VCC.
Providing integrity, authenticity, and confidentiality for header and pixel data of DICOM images.
Al-Haj, Ali
2015-04-01
Exchange of medical images over public networks is subjected to different types of security threats. This has triggered persisting demands for secured telemedicine implementations that will provide confidentiality, authenticity, and integrity for the transmitted images. The medical image exchange standard (DICOM) offers mechanisms to provide confidentiality for the header data of the image but not for the pixel data. On the other hand, it offers mechanisms to achieve authenticity and integrity for the pixel data but not for the header data. In this paper, we propose a crypto-based algorithm that provides confidentiality, authenticity, and integrity for the pixel data, as well as for the header data. This is achieved by applying strong cryptographic primitives utilizing internally generated security data, such as encryption keys, hashing codes, and digital signatures. The security data are generated internally from the header and the pixel data, thus a strong bond is established between the DICOM data and the corresponding security data. The proposed algorithm has been evaluated extensively using DICOM images of different modalities. Simulation experiments show that confidentiality, authenticity, and integrity have been achieved as reflected by the results we obtained for normalized correlation, entropy, PSNR, histogram analysis, and robustness.
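The "strong bond" idea above — security data derived internally from both header and pixel data — can be sketched as hashing the two parts into one bond value and then authenticating that bond. HMAC-SHA256, the key, and the sample DICOM fields below are stand-ins for the paper's actual primitives.

```python
# Sketch: bind DICOM header and pixel data together, then authenticate
# the bond. HMAC stands in for the paper's signature primitive.
import hashlib, hmac

SIGNING_KEY = b"sender-private-key"   # stand-in for a real signature key

def security_data(header, pixels):
    """Derive the bond from the image content itself, then sign it."""
    bond = hashlib.sha256(hashlib.sha256(header).digest()
                          + hashlib.sha256(pixels).digest()).digest()
    return hmac.new(SIGNING_KEY, bond, hashlib.sha256).hexdigest()

def verify_image(header, pixels, sig):
    return hmac.compare_digest(security_data(header, pixels), sig)

header = b"PatientName=DOE^JOHN;Modality=MR"
pixels = b"\x00\x01\x02\x03" * 64
sig = security_data(header, pixels)
assert verify_image(header, pixels, sig)
assert not verify_image(header.replace(b"JOHN", b"JANE"), pixels, sig)  # header tamper
assert not verify_image(header, pixels[::-1], sig)                      # pixel tamper
```

Because the bond covers both parts, tampering with either the header or the pixel data invalidates the same single piece of security data.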
8 CFR 1287.6 - Proof of official records.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Proof of official records. 1287.6 Section 1287.6 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE... existence on a certain date, and official and notarial authentication of signatures. (4) In accordance with...
8 CFR 1287.6 - Proof of official records.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Proof of official records. 1287.6 Section 1287.6 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE... existence on a certain date, and official and notarial authentication of signatures. (4) In accordance with...
8 CFR 1287.6 - Proof of official records.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 8 Aliens and Nationality 1 2011-01-01 2011-01-01 false Proof of official records. 1287.6 Section 1287.6 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE... existence on a certain date, and official and notarial authentication of signatures. (4) In accordance with...
8 CFR 1287.6 - Proof of official records.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Proof of official records. 1287.6 Section 1287.6 Aliens and Nationality EXECUTIVE OFFICE FOR IMMIGRATION REVIEW, DEPARTMENT OF JUSTICE... existence on a certain date, and official and notarial authentication of signatures. (4) In accordance with...
78 FR 11814 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-20
...Authentication Service provides public and government businesses single sign-on capability for USDA applications... be collected; (d) ways to minimize the burden of the collection of information on those who are to... (GPEA, Pub. L. 105-277), the Electronic Signatures in Global and National Commerce Act (E-Sign, Pub. L...
21 CFR 11.10 - Controls for closed systems.
Code of Federal Regulations, 2014 CFR
2014-04-01
... controls designed to ensure the authenticity, integrity, and, when appropriate, the confidentiality of... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Controls for closed systems. 11.10 Section 11.10... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.10 Controls for closed systems. Persons who use...
21 CFR 11.10 - Controls for closed systems.
Code of Federal Regulations, 2010 CFR
2010-04-01
... controls designed to ensure the authenticity, integrity, and, when appropriate, the confidentiality of... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Controls for closed systems. 11.10 Section 11.10... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.10 Controls for closed systems. Persons who use...
21 CFR 11.10 - Controls for closed systems.
Code of Federal Regulations, 2013 CFR
2013-04-01
... controls designed to ensure the authenticity, integrity, and, when appropriate, the confidentiality of... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Controls for closed systems. 11.10 Section 11.10... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.10 Controls for closed systems. Persons who use...
21 CFR 11.10 - Controls for closed systems.
Code of Federal Regulations, 2011 CFR
2011-04-01
... controls designed to ensure the authenticity, integrity, and, when appropriate, the confidentiality of... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Controls for closed systems. 11.10 Section 11.10... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.10 Controls for closed systems. Persons who use...
21 CFR 11.10 - Controls for closed systems.
Code of Federal Regulations, 2012 CFR
2012-04-01
... controls designed to ensure the authenticity, integrity, and, when appropriate, the confidentiality of... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Controls for closed systems. 11.10 Section 11.10... RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.10 Controls for closed systems. Persons who use...
Physical unclonable functions: A primer
Bauer, Todd; Hamlet, Jason
2014-11-01
Physical unclonable functions (PUFs) make use of the measurable intrinsic randomness of physical systems to establish signatures for those systems. Thus, PUFs provide a means to generate unique keys that don't need to be stored in nonvolatile memory, and they offer exciting opportunities for new authentication and supply chain security technologies.
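The defining PUF behaviour above — responses that are stable per device but differ across devices, up to measurement noise — can be modelled in a few lines. The seeded hash, response width, and noise model below are a toy simulation for illustration, not a real PUF construction.

```python
# Toy PUF model: a device's response to a challenge is a fixed function
# of (device seed, challenge), plus a few flipped bits of measurement noise.
import hashlib, random

def puf_response(device_seed, challenge, noise_bits=0, rng=None):
    digest = hashlib.sha256(device_seed + challenge).digest()
    bits = list(bin(int.from_bytes(digest, "big"))[2:].zfill(256)[:64])
    rng = rng or random.Random(0)
    for _ in range(noise_bits):                 # flaky re-measurements
        i = rng.randrange(64)
        bits[i] = "1" if bits[i] == "0" else "0"
    return "".join(bits)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

challenge = b"c1"
enrolled = puf_response(b"device-A", challenge)
noisy    = puf_response(b"device-A", challenge, noise_bits=3)
other    = puf_response(b"device-B", challenge)
assert hamming(enrolled, noisy) <= 3    # same device: responses stay close
assert hamming(enrolled, other) > 10    # different device: responses far apart
```

Authentication then amounts to comparing a fresh response against the enrolled one with a Hamming-distance threshold; real deployments add fuzzy extractors so the noisy response still yields an exact key.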
Automatic removal of cosmic ray signatures in Deep Impact images
NASA Astrophysics Data System (ADS)
Ipatov, S. I.; A'Hearn, M. F.; Klaasen, K. P.
We analyzed the results of several authors' codes for recognizing cosmic ray (CR) signatures on single images taken during the Deep Impact mission. For automatic removal of CR signatures on many images, we suggest using the code imgclean (http://pdssbn.astro.umd.edu/volume/didoc_0001/document/calibration_software/dical_v5/) written by E. Deutsch, as the other codes considered do not work properly automatically with a large number of images and do not run to completion for some images; however, other codes can be better for analysis of certain specific images. Sometimes imgclean detects false CR signatures near the edge of a comet nucleus, and it often does not recognize all pixels of long CR signatures. Our code rmcr is the only code among those considered that allows one to work with raw images. For most visual images made during low solar activity at exposure times t > 4 s, the number of clusters of bright pixels on an image per second per sq. cm of CCD was about 2-4, both for dark and normal sky images. At high solar activity, it sometimes exceeded 10. The ratio of the number of CR signatures consisting of n pixels obtained at high solar activity to that at low solar activity was greater for greater n. The number of clusters detected as CR signatures on a single infrared image is greater by at least a factor of several than the actual number of CR signatures; the number of clusters based on analysis of two successive dark infrared frames is in agreement with the expected number of CR signatures. Some false CR signatures are glitches: bright pixels repeatedly present on different infrared images. Our interactive code imr allows a user to choose the regions of an image where glitches detected by imgclean as CR signatures are ignored. In other regions chosen by the user, the brightness of some pixels is replaced by the local median brightness if the brightness of these pixels exceeds the median brightness by some factor.
The interactive code also allows one to delete long CR signatures and prevents removal of false CR signatures near the edge of the comet nucleus, and it can be applied to editing any digital images. The results obtained can be used for other missions to comets.
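The core cleaning rule described above — replace a pixel with the local median when it exceeds that median by some factor — can be sketched directly. The 3x3 window, threshold factor, and tiny test frame are illustrative choices, not the authors' exact parameters.

```python
# Sketch: median-threshold cosmic-ray cleaning (window and factor assumed).
import statistics

def clean(image, factor=5.0):
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]
    for r in range(rows):
        for c in range(cols):
            # Local median over the 3x3 neighbourhood, excluding the pixel.
            window = [image[i][j]
                      for i in range(max(0, r - 1), min(rows, r + 2))
                      for j in range(max(0, c - 1), min(cols, c + 2))
                      if (i, j) != (r, c)]
            med = statistics.median(window)
            if med > 0 and image[r][c] > factor * med:
                out[r][c] = med            # treat as a cosmic-ray hit
    return out

frame = [[10, 11, 10,  9],
         [10, 500, 11, 10],    # 500: simulated cosmic-ray hit
         [ 9, 10, 10, 11],
         [10,  9, 11, 10]]
cleaned = clean(frame)
assert cleaned[1][1] == 10     # hit replaced by the local median
assert cleaned[0][0] == 10     # ordinary pixels untouched
```

Note that medians are computed from the original frame, not the partially cleaned one, so the replacement order cannot bias neighbouring pixels.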
Modelling Metamorphism by Abstract Interpretation
NASA Astrophysics Data System (ADS)
Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.
Metamorphic malware applies semantics-preserving transformations to its own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from such malware. We introduce a semantics for self-modifying code, called phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the metamorphic code behavior by providing a set of traces of programs which correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as a finite state automata abstraction of the phase semantics.
Security in the CernVM File System and the Frontier Distributed Database Caching System
NASA Astrophysics Data System (ADS)
Dykstra, D.; Blomer, J.
2014-06-01
Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.
Digital image envelope: method and evaluation
NASA Astrophysics Data System (ADS)
Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang
2003-05-01
Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines addressing data security (such as the DICOM standard) continue to be published by organizing bodies in healthcare; however, no systematic method has been developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network. The reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission, and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
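The envelope flow above — sign the pixel data, encrypt the header, combine the two, and compare signatures before and after transfer — can be sketched step by step. The XOR cipher, fixed key, and sample fields are toy stand-ins; the real method also embeds and re-encrypts the envelope, which is omitted here for brevity.

```python
# Sketch of the digital envelope flow (toy cipher and keys, illustration only).
import hashlib, json

KEY = hashlib.sha256(b"site-key").digest()

def xor_bytes(data, key):
    """Toy stand-in for a real cipher."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def make_envelope(header, pixels):
    signature = hashlib.sha256(pixels).hexdigest()   # image signature
    enc_header = xor_bytes(header, KEY).hex()        # encrypted header
    return json.dumps({"sig": signature, "hdr": enc_header})

def verify_envelope(envelope, header, pixels):
    env = json.loads(envelope)
    return (env["sig"] == hashlib.sha256(pixels).hexdigest()
            and xor_bytes(bytes.fromhex(env["hdr"]), KEY) == header)

header, pixels = b"PatientID=1234", b"\x10\x20\x30" * 100
env = make_envelope(header, pixels)
assert verify_envelope(env, header, pixels)                # intact transfer
assert not verify_envelope(env, header, pixels + b"\x00")  # altered image
```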
A Fast lattice-based polynomial digital signature system for m-commerce
NASA Astrophysics Data System (ADS)
Wei, Xinzhou; Leung, Lin; Anshel, Michael
2003-01-01
The privacy and data integrity are not guaranteed in current wireless communications due to the security hole inside the Wireless Application Protocol (WAP) version 1.2 gateway. One of the remedies is to provide end-to-end security in m-commerce by applying application-level security on top of current WAP 1.2. Traditional security technologies like RSA and ECC applied on an enterprise's server are not practical for wireless devices, because these devices have relatively weak computation power and limited memory compared with a server. In this paper, we developed a lattice-based polynomial digital signature system based on NTRU's Polynomial Authentication and Signature Scheme (PASS), which enabled the feasibility of applying high-level security on both the server and wireless device sides.
Secure quantum signatures: a practical quantum technology (Conference Presentation)
NASA Astrophysics Data System (ADS)
Andersson, Erika
2016-10-01
Modern cryptography encompasses much more than encryption of secret messages. Signature schemes are widely used to guarantee that messages cannot be forged or tampered with, for example in e-mail, software updates and electronic commerce. Messages are also transferrable, which distinguishes digital signatures from message authentication. Transferability means that messages can be forwarded; in other words, that a sender is unlikely to be able to make one recipient accept a message which is subsequently rejected by another recipient if the message is forwarded. Similar to public-key encryption, the security of commonly used signature schemes relies on the assumed computational difficulty of problems such as finding discrete logarithms or factoring large integers. With quantum computers, such assumptions would no longer be valid. Partly for this reason, it is desirable to develop signature schemes with unconditional or information-theoretic security. Quantum signature schemes are one possible solution. Similar to quantum key distribution (QKD), their unconditional security relies only on the laws of quantum mechanics. Quantum signatures can be realized with the same system components as QKD, but are so far less investigated. This talk aims to provide an introduction to quantum signatures and to review theoretical and experimental progress so far.
Practical Computer Security through Cryptography
NASA Technical Reports Server (NTRS)
McNab, David; Twetev, David (Technical Monitor)
1998-01-01
The core protocols upon which the Internet was built are insecure. Weak authentication and the lack of low-level encryption services introduce vulnerabilities that propagate upwards in the network stack. Using statistics based on CERT/CC Internet security incident reports, the relative likelihood of attacks via these vulnerabilities is analyzed. The primary conclusion is that the standard UNIX BSD-based authentication system is by far the most commonly exploited weakness. Encryption of sensitive password data and the adoption of cryptographically-based authentication protocols can greatly reduce these vulnerabilities. Basic cryptographic terminology and techniques are presented, with attention focused on the ways in which technology such as encryption and digital signatures can be used to protect against the most commonly exploited vulnerabilities. A survey of contemporary security software demonstrates that tools based on cryptographic techniques, such as Kerberos, ssh, and PGP, are readily available and effectively close many of the most serious security holes. Nine practical recommendations for improving security are described.
Clone-preventive technique that features magnetic microfibers and cryptography
NASA Astrophysics Data System (ADS)
Matsumoto, Hiroyuki; Suzuki, Keiichi; Matsumoto, Tsutomu
1998-04-01
We have used the term 'clone' to refer to those things which are produced by methods such as counterfeiting, alteration, duplication or simulation. To satisfy the requirements of secure and low-cost techniques for preventing card fraud, we have recently developed a clone preventive system called 'FibeCrypt (Fiber Cryptosystem)' which utilizes physical characteristics. Each card has a canonical domain (i.e. a distinctive part), similar to fingerprints as the biometric measurement, made up of magnetic micro-fibers scattered randomly inside. We have applied cryptosystems to the system. FibeCrypt examines and authenticates the unique pattern of the canonical domain using pre-stored reference data and a digital signature. In our paper, the schemes and the features of this system are described in detail. The results of our examinations show the accuracy of authentication of the system. We conclude that this authentication technique which utilizes physical characteristics can be very effective for clone prevention in various fields.
Optimised to Fail: Card Readers for Online Banking
NASA Astrophysics Data System (ADS)
Drimer, Saar; Murdoch, Steven J.; Anderson, Ross
The Chip Authentication Programme (CAP) has been introduced by banks in Europe to deal with the soaring losses due to online banking fraud. A handheld reader is used together with the customer’s debit card to generate one-time codes for both login and transaction authentication. The CAP protocol is not public, and was rolled out without any public scrutiny. We reverse engineered the UK variant of card readers and smart cards and here provide the first public description of the protocol. We found numerous weaknesses that are due to design errors such as reusing authentication tokens, overloading data semantics, and failing to ensure freshness of responses. The overall strategic error was excessive optimisation. There are also policy implications. The move from signature to PIN for authorising point-of-sale transactions shifted liability from banks to customers; CAP introduces the same problem for online banking. It may also expose customers to physical harm.
Authentication of beef versus horse meat using 60 MHz 1H NMR spectroscopy.
Jakes, W; Gerdova, A; Defernez, M; Watson, A D; McCallum, C; Limer, E; Colquhoun, I J; Williamson, D C; Kemsley, E K
2015-05-15
This work reports a candidate screening protocol to distinguish beef from horse meat based upon comparison of triglyceride signatures obtained by 60 MHz ¹H NMR spectroscopy. Using a simple chloroform-based extraction, we obtained classic low-field triglyceride spectra from typically a 10 min acquisition time. Peak integration was sufficient to differentiate samples of fresh beef (76 extractions) and horse (62 extractions) using Naïve Bayes classification. Principal component analysis gave a two-dimensional "authentic" beef region (p = 0.001) against which further spectra could be compared. This model was challenged using a subset of 23 freeze-thawed training samples. The outcomes indicated that storing samples by freezing does not adversely affect the analysis. Of a further collection of extractions from previously unseen samples, 90/91 beef spectra were classified as authentic, and 16/16 horse spectra as non-authentic. We conclude that 60 MHz ¹H NMR represents a feasible high-throughput approach for screening raw meat. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
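The classification step described above can be illustrated with a toy Gaussian Naïve Bayes classifier built from scratch. The feature values below are invented stand-ins for triglyceride peak integrals, not the paper's data; the class means, spreads, and feature count are assumptions made for the sketch.

```python
import math
import random

def fit_gaussian_nb(X, y):
    """Fit per-class priors and per-feature Gaussian (mean, variance) parameters."""
    model = {}
    for label in set(y):
        rows = [x for x, t in zip(X, y) if t == label]
        n = len(rows)
        stats = []
        for j in range(len(rows[0])):
            col = [r[j] for r in rows]
            mu = sum(col) / n
            var = sum((v - mu) ** 2 for v in col) / n + 1e-9  # smoothed variance
            stats.append((mu, var))
        model[label] = (n / len(X), stats)
    return model

def predict(model, x):
    """Return the class with the highest log-posterior for feature vector x."""
    def log_post(label):
        prior, stats = model[label]
        lp = math.log(prior)
        for v, (mu, var) in zip(x, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
        return lp
    return max(model, key=log_post)

# Synthetic "peak integrals": two invented features per extraction.
random.seed(0)
beef = [[1.0 + random.gauss(0, 0.05), 0.30 + random.gauss(0, 0.02)] for _ in range(76)]
horse = [[0.8 + random.gauss(0, 0.05), 0.45 + random.gauss(0, 0.02)] for _ in range(62)]
X = beef + horse
y = ["beef"] * 76 + ["horse"] * 62
nb = fit_gaussian_nb(X, y)
print(predict(nb, [1.02, 0.31]))  # a beef-like profile -> "beef"
print(predict(nb, [0.79, 0.46]))  # a horse-like profile -> "horse"
```

In practice a library implementation (e.g. a standard Gaussian NB classifier) would be used, but the arithmetic is exactly this: a log-prior plus a Gaussian log-likelihood per peak integral.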
NASA Astrophysics Data System (ADS)
Siregar, H.; Junaeti, E.; Hayatno, T.
2017-03-01
Correspondence is a routine activity in agencies and companies, many of which set up a special division to handle letter management. Since most letters are distributed through electronic media, they must be kept confidential to avoid undesirable disclosure. Security can be provided by using cryptography and digital signatures. In this study, asymmetric and symmetric algorithms, namely RSA and AES, were added to the digital signature process to maintain data security. The RSA algorithm was used to generate the digital signature, while the AES algorithm was used to encrypt the message sent to the receiver. Based on the research, it can be concluded that adding the AES and RSA algorithms to the digital signature meets four objectives of cryptography: secrecy, data integrity, authentication and non-repudiation.
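The sign-then-verify flow described above can be sketched with a toy RSA signature: hash the letter, then exponentiate the digest with the private key. The primes below are deliberately tiny and insecure, used only to keep the sketch self-contained; a real system would use 2048-bit keys through a vetted library, and the AES encryption of the message body is omitted here.

```python
import hashlib

# Toy RSA parameters (educational only -- real keys are 2048+ bits).
p, q = 999983, 1000003
n = p * q
phi = (p - 1) * (q - 1)
e = 65537
d = pow(e, -1, phi)  # private exponent (requires Python 3.8+)

def h(msg: bytes) -> int:
    """Hash the message and reduce the digest below the modulus."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    return pow(h(msg), d, n)          # done with the private key

def verify(msg: bytes, sig: int) -> bool:
    return pow(sig, e, n) == h(msg)   # done with the public key

letter = b"Confidential memo #42"
s = sign(letter)
print(verify(letter, s))             # True
print(verify(b"tampered memo", s))   # False
```

Any change to the letter changes the digest, so the receiver's verification fails, which is the integrity and non-repudiation property the abstract refers to.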
78 FR 38240 - Authentication of Electronic Signatures on Electronically Filed Statements of Account
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-26
... up by any trick, scheme, or device a material fact; (2) makes any materially false, fictitious, or fraudulent statement or representation; or (3) makes or uses any false writing or document knowing the same to contain any materially false, fictitious, or fraudulent statement or entry; shall be fined under...
Cryptography Would Reveal Alterations In Photographs
NASA Technical Reports Server (NTRS)
Friedman, Gary L.
1995-01-01
Public-key decryption method proposed to guarantee authenticity of photographic images represented in form of digital files. In method, digital camera generates original data from image in standard public format; also produces coded signature to verify standard-format image data. Scheme also helps protect against other forms of lying, such as attaching false captions.
48 CFR 28.101-3 - Authority of an attorney-in-fact for a bid bond.
Code of Federal Regulations, 2010 CFR
2010-10-01
... responsiveness; and (2) Treat questions regarding the authenticity and enforceability of the power of attorney at..., or a photocopy or facsimile of an original, power of attorney is sufficient evidence of such... and dates on the power of attorney shall be considered original signatures, seals and dates, without...
Kabir, Muhammad N.; Alginahi, Yasser M.
2014-01-01
This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover media to achieve its goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247
Systems and methods for performing wireless financial transactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCown, Steven Harvey
2012-07-03
A secure computing module (SCM) is configured for connection with a host device. The SCM includes a processor for performing secure processing operations, a host interface for coupling the processor to the host device, and a memory connected to the processor wherein the processor logically isolates at least some of the memory from access by the host device. The SCM also includes a proximate-field wireless communicator connected to the processor to communicate with another SCM associated with another host device. The SCM generates a secure digital signature for a financial transaction package and communicates the package and the signature to the other SCM using the proximate-field wireless communicator. Financial transactions are performed from person to person using the secure digital signature of each person's SCM and possibly message encryption. The digital signatures and transaction details are communicated to appropriate financial organizations to authenticate the transaction parties and complete the transaction.
A Weak Quantum Blind Signature with Entanglement Permutation
NASA Astrophysics Data System (ADS)
Lou, Xiaoping; Chen, Zhigang; Guo, Ying
2015-09-01
Motivated by the permutation encryption algorithm, a weak quantum blind signature (QBS) scheme is proposed. It involves three participants, including the sender Alice, the signatory Bob and the trusted entity Charlie, in four phases, i.e., initializing phase, blinding phase, signing phase and verifying phase. In a small-scale quantum computation network, Alice blinds the message based on a quantum entanglement permutation encryption algorithm that embraces the chaotic position string. Bob signs the blinded message with private parameters shared beforehand while Charlie verifies the signature's validity and recovers the original message. Analysis shows that the proposed scheme achieves secure blindness for the signer and traceability for the message owner with the aid of the authentic arbitrator, who plays a crucial role when a dispute arises. In addition, the signature can neither be forged nor disavowed by malicious attackers. It has wide applications to e-voting and e-payment systems.
Hughes, Richard John; Thrasher, James Thomas; Nordholt, Jane Elizabeth
2016-11-29
Innovations for quantum key management harness quantum communications to form a cryptography system within a public key infrastructure framework. In example implementations, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a Merkle signature scheme (using Winternitz one-time digital signatures or other one-time digital signatures, and Merkle hash trees) to constitute a cryptography system. More generally, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a hash-based signature scheme. This provides a secure way to identify, authenticate, verify, and exchange secret cryptographic keys. Features of the quantum key management innovations further include secure enrollment of users with a registration authority, as well as credential checking and revocation with a certificate authority, where the registration authority and/or certificate authority can be part of the same system as a trusted authority for quantum key distribution.
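The Merkle hash tree component mentioned above can be sketched in a few lines: leaves (for example, hashes of one-time public keys) are combined pairwise up to a single root, and any leaf can later be verified against that root using only its authentication path of sibling hashes. This is a generic illustration of the data structure, not the patented implementation.

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Hash a list of leaves (length a power of two) pairwise up to one root."""
    level = leaves
    while len(level) > 1:
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, index):
    """Collect the sibling hash at each level from leaf `index` up to the root."""
    path, level, i = [], leaves, index
    while len(level) > 1:
        path.append(level[i ^ 1])  # sibling shares all bits except the lowest
        level = [H(level[j] + level[j + 1]) for j in range(0, len(level), 2)]
        i //= 2
    return path

def verify_path(leaf, index, path, root):
    """Recompute the root from one leaf and its authentication path."""
    node, i = leaf, index
    for sib in path:
        node = H(sib + node) if i & 1 else H(node + sib)
        i //= 2
    return node == root

# Eight hypothetical one-time public keys, committed to by a single root.
keys = [H(b"one-time key %d" % k) for k in range(8)]
root = merkle_root(keys)
print(verify_path(keys[5], 5, auth_path(keys, 5), root))  # True
```

Publishing only the root commits to all one-time keys at once; each signature then ships its leaf plus a logarithmic-size path, which is what makes hash-based schemes like Merkle/Winternitz practical.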
Understanding overlay signatures using machine learning on non-lithography context information
NASA Astrophysics Data System (ADS)
Overcast, Marshall; Mellegaard, Corey; Daniel, David; Habets, Boris; Erley, Georg; Guhlemann, Steffen; Thrun, Xaver; Buhl, Stefan; Tottewitz, Steven
2018-03-01
Overlay errors between two layers can be caused by non-lithography processes. While these errors can be compensated by the run-to-run system, such process and tool signatures are not always stable. In order to monitor the impact of non-lithography context on overlay at regular intervals, a systematic approach is needed. Using various machine learning techniques, significant context parameters that relate to deviating overlay signatures are automatically identified. Once the most influential context parameters are found, a run-to-run simulation is performed to see how much improvement can be obtained. The resulting analysis shows good potential for reducing the influence of hidden context parameters on overlay performance. Non-lithographic contexts are significant contributors, and their automatic detection and classification will enable the overlay roadmap, given the corresponding control capabilities.
A network identity authentication system based on Fingerprint identification technology
NASA Astrophysics Data System (ADS)
Xia, Hong-Bin; Xu, Wen-Bo; Liu, Yuan
2005-10-01
Fingerprint verification is one of the most reliable personal identification methods. However, most automatic fingerprint identification systems (AFIS) do not run in an Internet/Intranet environment, as today's growing electronic commerce requirements demand. This paper describes the design and implementation of a prototype identity authentication system based on fingerprint biometrics that can run in an Internet environment. In our system, COM and ASP technologies are used to integrate fingerprint recognition with Web database technology: the fingerprint image preprocessing algorithms are packaged as COM components deployed on the Internet Information Server. The system's design and structure are presented, and the key points are discussed. The prototype system has been successfully tested and evaluated in our university's distance education applications in an Internet environment.
Scalable Authenticated Tree Based Group Key Exchange for Ad-Hoc Groups
NASA Astrophysics Data System (ADS)
Desmedt, Yvo; Lange, Tanja; Burmester, Mike
Task-specific groups are often formed in an ad-hoc manner within large corporate structures, such as companies. Take the following typical scenario: A director decides to set up a task force group for some specific project. An order is passed down the hierarchy where it finally reaches a manager who selects some employees to form the group. The members should communicate in a secure way and for efficiency, a symmetric encryption system is chosen. To establish a joint secret key for the group, a group key exchange (GKE) protocol is used. We show how to use an existing Public Key Infrastructure (PKI) to achieve authenticated GKE by modifying the protocol and particularly by including signatures.
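The paper builds authenticated GKE by adding signatures to an existing protocol. As a flavor of how an unauthenticated GKE establishes a joint group key in the first place, here is a toy sketch of the classic Burmester-Desmedt protocol (due to two of the authors); the signature-based authentication the paper adds is omitted, and the group parameters below are illustrative only, far too small for real use.

```python
import random

# Toy public parameters (real deployments use a large safe-prime or elliptic-curve group).
p = 2**61 - 1   # a Mersenne prime
g = 3

def inv(x):
    return pow(x, p - 2, p)  # modular inverse via Fermat's little theorem

n = 4                                                        # group size
random.seed(1)
r = [random.randrange(2, p - 1) for _ in range(n)]           # each member's secret
z = [pow(g, ri, p) for ri in r]                              # round 1: broadcast g^r_i
X = [pow(z[(i + 1) % n] * inv(z[(i - 1) % n]) % p, r[i], p)  # round 2: (z_{i+1}/z_{i-1})^r_i
     for i in range(n)]

def key(i):
    """Member i combines its neighbour's z with everyone's X to get the group key."""
    k = pow(z[(i - 1) % n], n * r[i], p)
    for j in range(1, n):
        k = k * pow(X[(i + j - 1) % n], n - j, p) % p
    return k

# Every member independently derives the same key g^(r1*r2 + r2*r3 + ... + rn*r1).
print(len({key(i) for i in range(n)}) == 1)  # True
```

With a PKI in place, each broadcast would be signed by its sender, which is the modification the paper describes for ad-hoc corporate groups.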
Seo, Jung Woo; Lee, Sang Jin
2016-01-01
Weather information supports the economic activity of the nation and contributes to a safe working environment by helping to prevent natural disasters, which can cause large-scale casualties and property damage. In times of war, weather information can play an even more important role than strategy, tactics, and intelligence about the enemy, and it is essential for the take-off and landing of fighter jets and the sailing of warships. If weather information, which plays such a major role in national security and the economy, were misused for cyber terrorism to produce false weather reports, it could become a serious threat to national security and the economy. We propose a plan to safely transmit measured values from meteorological sensors through a meteorological telecommunication network, guaranteeing the confidentiality and integrity of the data despite cyber-attacks, and to produce reliable weather forecasts by performing mutual authentication through authentication devices. To achieve this, an identity-based signature is applied to ensure the integrity of measured data, and the encrypted weather information is transmitted with mutual authentication between the authentication devices. This approach has two merits: unlike the Public Key Infrastructure methodology, it does not require managing authentication certificates, and its low key-management burden provides a strong security measure that can be realized in a small-scale computing environment such as a meteorological observation system.
Secure Authentication for Remote Patient Monitoring with Wireless Medical Sensor Networks †
Hayajneh, Thaier; Mohd, Bassam J; Imran, Muhammad; Almashaqbeh, Ghada; Vasilakos, Athanasios V.
2016-01-01
There is broad consensus that remote health monitoring will benefit all stakeholders in the healthcare system and that it has the potential to save billions of dollars. Among the major concerns that are preventing the patients from widely adopting this technology are data privacy and security. Wireless Medical Sensor Networks (MSNs) are the building blocks for remote health monitoring systems. This paper helps to identify the most challenging security issues in the existing authentication protocols for remote patient monitoring and presents a lightweight public-key-based authentication protocol for MSNs. In MSNs, the nodes are classified into sensors that report measurements about the human body and actuators that receive commands from the medical staff and perform actions. Authenticating these commands is a critical security issue, as any alteration may lead to serious consequences. The proposed protocol is based on the Rabin authentication algorithm, which is modified in this paper to improve its signature signing process, making it suitable for delay-sensitive MSN applications. To prove the efficiency of the Rabin algorithm, we implemented the algorithm with different hardware settings using Tmote Sky motes and also programmed the algorithm on an FPGA to evaluate its design and performance. Furthermore, the proposed protocol is implemented and tested using the MIRACL (Multiprecision Integer and Rational Arithmetic C/C++) library. The results show that secure, direct, instant and authenticated commands can be delivered from the medical staff to the MSN nodes. PMID:27023540
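The Rabin signature idea underlying the protocol can be sketched as follows: hash the command together with a counter until the digest is a quadratic residue modulo n = pq, then publish a modular square root as the signature. Verification is a single squaring, which is what makes the scheme attractive for resource-constrained sensor nodes. The primes and the counter-padding rule below are toy assumptions for the sketch, not the paper's modified signing process.

```python
import hashlib

# Toy Rabin parameters: both primes ≡ 3 (mod 4), so square roots are one exponentiation.
p, q = 10007, 10039
n = p * q

def h(msg: bytes, counter: int) -> int:
    """Digest of the message plus a counter, reduced below the modulus."""
    data = msg + counter.to_bytes(4, "big")
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def is_qr(x: int, r: int) -> bool:
    return pow(x, (r - 1) // 2, r) == 1  # Euler's criterion

def sign(msg: bytes):
    """Bump the counter until the digest is a quadratic residue, then take its root."""
    c = 0
    while True:
        x = h(msg, c)
        if x % p and x % q and is_qr(x, p) and is_qr(x, q):
            break
        c += 1
    sp = pow(x, (p + 1) // 4, p)   # square root mod p (valid since p ≡ 3 mod 4)
    sq = pow(x, (q + 1) // 4, q)   # square root mod q
    # Combine the two roots with the Chinese Remainder Theorem.
    s = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n
    return s, c

def verify(msg: bytes, s: int, c: int) -> bool:
    return pow(s, 2, n) == h(msg, c)  # a single modular squaring

cmd = b"actuator: set pump rate 3"
s, c = sign(cmd)
print(verify(cmd, s, c))                  # True
print(verify(b"set pump rate 9", s, c))   # False
```

The asymmetry (signing needs the factorization, verifying only one squaring) is exactly what the abstract exploits: the heavy work stays with the medical staff's device, not the motes.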
Automatic Plagiarism Detection with PAIRwise 2.0
ERIC Educational Resources Information Center
Knight, Allan; Almeroth, Kevin
2011-01-01
As part of the research carried out at the University of California, Santa Barbara's Center for Information Technology and Society (CITS), the Paper Authentication and Integrity Research (PAIR) project was launched. We began by investigating how one recent technology affected student learning outcomes. One aspect of this research was to study the…
A Framework for Designing Scaffolds that Improve Motivation and Cognition
ERIC Educational Resources Information Center
Belland, Brian R.; Kim, ChanMin; Hannafin, Michael J.
2013-01-01
A problematic, yet common, assumption among educational researchers is that when teachers provide authentic, problem-based experiences, students will automatically be engaged. Evidence indicates that this is often not the case. In this article, we discuss (a) problems with ignoring motivation in the design of learning environments, (b)…
Podcasts as a Learning Tool: German Language and Culture Every Day
ERIC Educational Resources Information Center
Schmidt, Johannes
2008-01-01
Podcasts provide a straightforward opportunity to stay connected with language, culture, and recent events of German-speaking countries. Podcasts offer clearly articulated, authentic material that can be automatically and regularly delivered to your computer and classrooms; continuously exposing students and teachers to German. This article…
McGettigan, C.; Walsh, E.; Jessop, R.; Agnew, Z. K.; Sauter, D. A.; Warren, J. E.; Scott, S. K.
2015-01-01
Humans express laughter differently depending on the context: polite titters of agreement are very different from explosions of mirth. Using functional MRI, we explored the neural responses during passive listening to authentic amusement laughter and controlled, voluntary laughter. We found greater activity in anterior medial prefrontal cortex (amPFC) to the deliberate, Emitted Laughs, suggesting an obligatory attempt to determine others' mental states when laughter is perceived as less genuine. In contrast, passive perception of authentic Evoked Laughs was associated with greater activity in bilateral superior temporal gyri. An individual differences analysis found that greater accuracy on a post hoc test of authenticity judgments of laughter predicted the magnitude of passive listening responses to laughter in amPFC, as well as several regions in sensorimotor cortex (in line with simulation accounts of emotion perception). These medial prefrontal and sensorimotor sites showed enhanced positive connectivity with cortical and subcortical regions during listening to involuntary laughter, indicating a complex set of interacting systems supporting the automatic emotional evaluation of heard vocalizations. PMID:23968840
Enhancing the authenticity of assessments through grounding in first impressions.
Humă, Bogdana
2015-09-01
This article examines first impressions through a discursive and interactional lens. Until now, social psychologists have studied first impressions in laboratory conditions, in isolation from their natural environment, thus overlooking their discursive roles as devices for managing situated interactional concerns. I examine fragments of text and talk in which individuals spontaneously invoke first impressions of other persons as part of assessment activities in settings where the authenticity of speakers' stances might be threatened: (1) in activities with inbuilt evaluative components and (2) in sequential contexts where recipients have been withholding affiliation to speakers' actions. I discuss the relationship between authenticity, as a type of credibility issue related to intersubjective trouble, and the characteristics of first impression assessments, which render them useful for dealing with this specific credibility concern. I identify four features of first impression assessments which make them effective in enhancing authenticity: witness positioning (Potter, 1996, Representing reality: Discourse, rhetoric and social construction, Sage, London), (dis)location in time and space, automaticity, and extreme formulations (Edwards, 2003, Analyzing race talk: Multidisciplinary perspectives on the research interview, Cambridge University Press, New York). © 2014 The British Psychological Society.
Threshold Things That Think: Authorisation for Resharing
NASA Astrophysics Data System (ADS)
Peeters, Roel; Kohlweiss, Markulf; Preneel, Bart
As we are evolving towards ubiquitous computing, users carry an increasing number of mobile devices with sensitive information. The security of this information can be protected using threshold cryptography, in which secret computations are shared between multiple devices. Threshold cryptography can be made more robust by resharing protocols, which allow recovery from partial compromises. This paper introduces user-friendly and secure protocols for the authorisation of resharing protocols. We present both automatic and manual protocols, utilising a group manual authentication protocol to add a new device. We analyse the security of these protocols: our analysis considers permanent and temporary compromises, denial of service attacks, and manual authentication errors by the user.
AdaBoost-based on-line signature verifier
NASA Astrophysics Data System (ADS)
Hongo, Yasunori; Muramatsu, Daigo; Matsumoto, Takashi
2005-03-01
Authentication of individuals is rapidly becoming an important issue. The authors previously proposed a pen-input online signature verification algorithm. The algorithm considers a writer's signature as a trajectory of pen position, pen pressure, pen azimuth, and pen altitude that evolves over time, so that it is dynamic and biometric. Many algorithms have been proposed and reported to achieve accuracy for on-line signature verification, but setting the threshold value for these algorithms is a problem. In this paper, we introduce a user-generic model generated by AdaBoost, which resolves this problem. When user-specific models (one model for each user) are used for signature verification, we need to generate the models using only genuine signatures: forged signatures are not available, because impostors do not supply forgeries for training in advance. By introducing a user-generic model, however, we can make use of others' forged signatures in addition to genuine signatures for learning. AdaBoost is a well-known classification algorithm that makes its final decision depending on the sign of the output value, so no threshold needs to be set. A preliminary experiment was performed on a database consisting of data from 50 individuals. This set consists of western-alphabet-based signatures provided by a European research group. In this experiment, our algorithm gives an FRR of 1.88% and an FAR of 1.60%. Since no fine-tuning was done, this preliminary result looks very promising.
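The threshold-free decision rule described above (accept or reject by the sign of the weighted vote) can be illustrated with a minimal AdaBoost over one-feature decision stumps. The two "signature features" and the training points below are invented for the sketch, not the paper's pen-trajectory features.

```python
import math

def stump_predict(x, feat, thresh, polarity):
    """A weak learner: threshold one feature, output +1 or -1."""
    return polarity if x[feat] >= thresh else -polarity

def train_adaboost(X, y, rounds=5):
    """AdaBoost: repeatedly pick the lowest-weighted-error stump and reweight."""
    m = len(X)
    w = [1.0 / m] * m
    ensemble = []
    for _ in range(rounds):
        best = None
        for feat in range(len(X[0])):
            for thresh in sorted({x[feat] for x in X}):
                for polarity in (+1, -1):
                    err = sum(wi for wi, x, t in zip(w, X, y)
                              if stump_predict(x, feat, thresh, polarity) != t)
                    if best is None or err < best[0]:
                        best = (err, feat, thresh, polarity)
        err, feat, thresh, polarity = best
        err = max(err, 1e-12)                      # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)    # stump weight
        ensemble.append((alpha, feat, thresh, polarity))
        w = [wi * math.exp(-alpha * t * stump_predict(x, feat, thresh, polarity))
             for wi, x, t in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]                   # renormalize sample weights
    return ensemble

def classify(ensemble, x):
    """Accept (+1) or reject (-1) by the sign of the weighted vote: no threshold tuning."""
    score = sum(a * stump_predict(x, f, th, pol) for a, f, th, pol in ensemble)
    return 1 if score >= 0 else -1

# Invented 2-D "signature features": genuine = +1, forgery = -1.
genuine = [[0.2, 0.3], [0.25, 0.2], [0.3, 0.35], [0.15, 0.25]]
forged = [[0.7, 0.8], [0.8, 0.6], [0.65, 0.75], [0.9, 0.7]]
X = genuine + forged
y = [1] * 4 + [-1] * 4
model = train_adaboost(X, y)
print(classify(model, [0.22, 0.28]))  # 1  (accepted as genuine)
print(classify(model, [0.85, 0.70]))  # -1 (rejected as forgery)
```

Because the decision is the sign of the ensemble score, the user-generic model sidesteps the per-user threshold selection the abstract identifies as the main practical difficulty.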
Authentication of gold nanoparticle encoded pharmaceutical tablets using polarimetric signatures.
Carnicer, Artur; Arteaga, Oriol; Suñé-Negre, Josep M; Javidi, Bahram
2016-10-01
The counterfeiting of pharmaceutical products represents concerns for both industry and the safety of the general public. Falsification produces losses to companies and poses health risks for patients. In order to detect fake pharmaceutical tablets, we propose producing film-coated tablets with gold nanoparticle encoding. These coated tablets contain unique polarimetric signatures. We present experiments to show that ellipsometric optical techniques, in combination with machine learning algorithms, can be used to distinguish genuine and fake samples. To the best of our knowledge, this is the first report using gold nanoparticles encoded with optical polarimetric classifiers to prevent the counterfeiting of pharmaceutical products.
Patients’ Data Management System Protected by Identity-Based Authentication and Key Exchange
Rivero-García, Alexandra; Santos-González, Iván; Hernández-Goya, Candelaria; Caballero-Gil, Pino; Yung, Moti
2017-01-01
A secure and distributed framework for the management of patients’ information in emergency and hospitalization services is proposed here in order to seek improvements in efficiency and security in this important area. In particular, confidentiality protection, mutual authentication, and automatic identification of patients are provided. The proposed system is based on two types of devices: Near Field Communication (NFC) wristbands assigned to patients, and mobile devices assigned to medical staff. Two other main elements of the system are an intermediate server to manage the involved data, and a second server with a private key generator to define the information required to protect communications. An identity-based authentication and key exchange scheme is essential to provide confidential communication and mutual authentication between the medical staff and the private key generator through an intermediate server. The identification of patients is carried out through a keyed-hash message authentication code. Thanks to the combination of the aforementioned tools, a secure alternative mobile health (mHealth) scheme for managing patients’ data is defined for emergency and hospitalization services. Different parts of the proposed system have been implemented, including mobile application, intermediate server, private key generator and communication channels. Apart from that, several simulations have been performed, and, compared with the current system, significant improvements in efficiency have been observed. PMID:28362328
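The keyed-hash message authentication code used above for patient identification can be illustrated with the standard HMAC construction from the Python standard library. The secret key and identifier format below are hypothetical; in the proposed scheme, keys come from the identity-based private key generator rather than a hard-coded value.

```python
import hashlib
import hmac

# Hypothetical server-side secret; the paper derives keys via a private key generator.
SECRET = b"hospital-master-key"

def patient_tag(patient_id: str) -> str:
    """Keyed-hash message authentication code identifying a patient record."""
    return hmac.new(SECRET, patient_id.encode(), hashlib.sha256).hexdigest()

def check_tag(patient_id: str, tag: str) -> bool:
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(patient_tag(patient_id), tag)

tag = patient_tag("patient-0042")     # e.g. written to the NFC wristband
print(check_tag("patient-0042", tag))  # True
print(check_tag("patient-0043", tag))  # False
```

Only a holder of the key can mint a valid tag, so a wristband identifier cannot be forged or transplanted to another patient record without detection.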
Automatic Target Recognition Based on Cross-Plot
Wong, Kelvin Kian Loong; Abbott, Derek
2011-01-01
Automatic target recognition that relies on rapid feature extraction of real-time targets from photo-realistic imagery will enable efficient identification of target patterns. To achieve this objective, Cross-plots of binary patterns are explored as potential signatures for the observed target, capturing the crucial spatial features at high speed using minimal computational resources. Target recognition was implemented based on the proposed pattern recognition concept and tested rigorously for its precision and recall performance. We conclude that Cross-plotting is able to produce a digital fingerprint of a target that correlates efficiently and effectively with signatures of patterns sharing its identity in a target repository. PMID:21980508
Zampella, Mariavittoria; Quétel, Christophe R; Paredes, Eduardo; Goitom Asfaha, Daniel; Vingiani, Simona; Adamo, Paola
2011-10-15
We propose a method for the authentication of the origin of vegetables grown under similar weather conditions, at sites less than 10 km from the sea and distributed over a rather small-scale area (58651 km²). We studied how the strontium (Sr) isotopic signature and selected elemental concentrations ([Mn], [Cu], [Zn], [Rb], [Sr] and [Cd]) in early potatoes from three neighbouring administrative regions in the south of Italy were related to the geological substrate (alluvial sediments, volcanic substrates and carbonate rocks) and to selected soil chemical properties influencing the bioavailability of elements in soils (pH, cation exchange capacity and total carbonate content). Through multiple-step multivariate statistics (PLS-DA) we could assign 26 potatoes (including two already commercialised samples) to their respective eight sites of production, corresponding to the first two types of geological substrate. The other 12 potatoes, from four sites of production with similar characteristics in terms of the geological substrate (third type) and of these soil properties, could only be grouped together. In this case, more discriminative parameters would be required to allow differentiation between sites. The validation of our models included external prediction tests with data from potatoes harvested the year before and a study on the robustness of the uncertainties of the measurement results. Annual variations between multi-elemental and Sr isotopic fingerprints were observed in potatoes harvested from soils overlying carbonate rocks, stressing the importance of testing long-term variations in authentication studies. Copyright © 2011 John Wiley & Sons, Ltd. and European Union [2011].
Optical ID Tags for Secure Verification of Multispectral Visible and NIR Signatures
NASA Astrophysics Data System (ADS)
Pérez-Cabré, Elisabet; Millán, María S.; Javidi, Bahram
2008-04-01
We propose to combine information from visible (VIS) and near infrared (NIR) spectral bands to increase the robustness of security systems and deter unauthorized use of optical tags that permit the identification of a given person or object. The signature that identifies the element under surveillance can only be obtained by the appropriate combination of the visible content and the NIR data. The fully-phase encryption technique is applied to prevent easy recognition of the resultant signature by the naked eye and easy reproduction using conventional devices for imaging or scanning. The obtained complex-amplitude encrypted distribution is encoded on an identity (ID) tag. Spatial multiplexing of the encrypted signature allows us to build a distortion-invariant ID tag, so that remote authentication can be achieved even if the tag is captured under rotation or at different distances. We explore the possibility of using partial information of the encrypted distribution. Simulation results are provided and discussed.
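Fully-phase encryption of this kind is commonly realized as double random phase encoding applied to a phase-encoded input. The sketch below follows that classical scheme under our own assumptions (an arbitrary 8×8 "signature" and random masks), not the authors' exact optical setup:

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.random((8, 8))                  # normalized signature image in [0, 1)

# fully-phase encoding of the input image
g = np.exp(1j * np.pi * f)

# two statistically independent random phase masks (the keys)
m1 = np.exp(2j * np.pi * rng.random(f.shape))   # input plane
m2 = np.exp(2j * np.pi * rng.random(f.shape))   # Fourier plane

# encryption: mask, Fourier transform, mask again, inverse transform
psi = np.fft.ifft2(np.fft.fft2(g * m1) * m2)

# decryption with the correct keys recovers the phase-encoded signature
g_rec = np.fft.ifft2(np.fft.fft2(psi) * np.conj(m2)) * np.conj(m1)
f_rec = np.angle(g_rec) / np.pi
assert np.allclose(f_rec, f, atol=1e-8)
```

The encrypted distribution `psi` is complex-valued white-noise-like data; without both phase masks, neither the amplitude nor the phase reveals the signature.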
ERIC Educational Resources Information Center
Hill, K. Dara
2017-01-01
The current climate of reading instruction calls for fluency strategies that stress automaticity, accuracy, and prosody, within the scope of prescribed reading programs that compromise teacher autonomy, with texts that are often irrelevant to the students' experiences. Consequently, accuracy and speed are developed, but deep comprehension is…
The Suitability of Cloud-Based Speech Recognition Engines for Language Learning
ERIC Educational Resources Information Center
Daniels, Paul; Iwago, Koji
2017-01-01
As online automatic speech recognition (ASR) engines become more accurate and more widely implemented with CALL (computer-assisted language learning) software, it becomes important to evaluate the effectiveness and the accuracy of these recognition engines using authentic speech samples. This study investigates two of the most prominent cloud-based speech recognition engines--Apple's…
A privacy authentication scheme based on cloud for medical environment.
Chen, Chin-Ling; Yang, Tsai-Tung; Chiang, Mao-Lun; Shih, Tzay-Farn
2014-11-01
With the rapid development of information technology, health care technologies have matured; for example, electronic medical records can now be easily stored. However, how to access medical resources more conveniently remains a pressing issue. Although many studies have discussed medical systems, these systems still face many security challenges, the most important of which is patients' privacy. Therefore, we propose a privacy authentication scheme based on a cloud environment. In our scheme, we use mobile devices' characteristics, allowing people to use medical resources in the cloud environment and find medical advice conveniently. A digital signature is used to ensure the security of the medical information, which is certified by the medical department in our proposed scheme.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2008-03-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using a helical CT scanner for lung cancer mass screening. Functions to observe suspicious shadows in detail are provided in a computer-aided diagnosis workstation together with these screening algorithms. We have also developed a telemedicine network using a Web-based medical image conference system with secure image transmission, a biometric fingerprint authentication system and a biometric face authentication system. Biometric face authentication used on site in telemedicine makes file encryption and login verification effective; as a result, patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our film-free radiological information system using the computer-aided diagnosis workstation, together with our telemedicine network system, can increase diagnostic speed and diagnostic accuracy while improving the security of medical information.
ERIC Educational Resources Information Center
Keane, Brian P.; Mettler, Everett; Tsoi, Vicky; Kellman, Philip J.
2011-01-01
Multiple object tracking (MOT) is an attentional task wherein observers attempt to track multiple targets among moving distractors. Contour interpolation is a perceptual process that fills-in nonvisible edges on the basis of how surrounding edges (inducers) are spatiotemporally related. In five experiments, we explored the automaticity of…
Isotopic (δ18O/δ2H) integrity of water samples collected and stored by automatic samplers
USDA-ARS?s Scientific Manuscript database
Stable water isotopes are increasingly becoming part of routine monitoring programs that utilize automatic samplers. The objectives of this study were to quantify the uncertainty in isotope signatures due to the length of sample storage (1-24 d) inside autosamplers over a range of air temperatures (...
Cross spectral, active and passive approach to face recognition for improved performance
NASA Astrophysics Data System (ADS)
Grudzien, A.; Kowalski, M.; Szustakowski, M.
2017-08-01
Biometrics is a technique for the automatic recognition of a person based on physiological or behavioral characteristics. Since the characteristics used are unique, biometrics can create a direct link between a person and an identity, based on a variety of characteristics. The human face is one of the most important biometric modalities for automatic authentication. The most popular method of face recognition, which relies on processing of visual information, seems to be imperfect. Thermal infrared imagery may be a promising alternative or complement to visible-range imaging for several reasons. This paper presents an approach combining both methods.
Ballistic Signature Identification System Study
NASA Technical Reports Server (NTRS)
1976-01-01
The first phase of a research project directed toward development of a high speed automatic process to be used to match gun barrel signatures imparted to fired bullets was documented. An optical projection technique has been devised to produce and photograph a planar image of the entire signature, and the phototransparency produced is subjected to analysis using digital Fourier transform techniques. The success of this approach appears to be limited primarily by the accuracy of the photographic step since no significant processing limitations have been encountered.
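The Fourier-based matching idea can be sketched as a circular cross-correlation of two 1-D signature profiles computed via the FFT. The simulated profiles below are illustrative random data, not real ballistic measurements:

```python
import numpy as np

def correlate_signatures(a, b):
    """Peak of the circular cross-correlation of two 1-D signature profiles,
    computed via the Fourier transform and normalized so a perfect match
    scores 1.0 regardless of rotational offset."""
    a = (a - a.mean()) / (np.linalg.norm(a - a.mean()) + 1e-12)
    b = (b - b.mean()) / (np.linalg.norm(b - b.mean()) + 1e-12)
    xcorr = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real
    return xcorr.max()

rng = np.random.default_rng(2)
barrel = rng.normal(size=256)          # simulated barrel signature profile
same = np.roll(barrel, 37)             # same signature, rotated on the bullet
other = rng.normal(size=256)           # a different barrel
assert correlate_signatures(barrel, same) > 0.99
assert correlate_signatures(barrel, other) < 0.5
```

Because the correlation is circular, the match score is invariant to the rotational offset at which the bullet's signature was photographed, which is the property a planar "unwrapped" signature image makes exploitable.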
Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum
2010-08-01
activity by providing a check on the relevance and currency of the process used to develop the MSwA2010 curriculum content. Figure 2 is an expansion of...random oracle model, symmetric crypto primitives, modes of operations, asymmetric crypto primitives (Chapter 5) [16] Detailed design...encryption, public key encryption, digital signatures, message authentication codes, crypto protocols, cryptanalysis, and further detailed crypto
Evaluation of security algorithms used for security processing on DICOM images
NASA Astrophysics Data System (ADS)
Chen, Xiaomeng; Shuai, Jie; Zhang, Jianguo; Huang, H. K.
2005-04-01
In this paper, we developed a security approach to provide security measures and features in PACS image acquisition and tele-radiology image transmission. The security processing on medical images was based on a public key infrastructure (PKI) and included digital signatures and data encryption to achieve the security features of confidentiality, privacy, authenticity, integrity, and non-repudiation. There are many algorithms that can be used in a PKI for data encryption and digital signatures. In this research, we selected several algorithms to perform security processing on different DICOM images in a PACS environment, evaluated the security processing performance of these algorithms, and examined the relationship between performance and image type, size, and implementation method.
Advanced Information Assurance Handbook
2004-03-01
and extraction of stored or recorded digital /electronic evidence (from the Latin root “ forensis ,” relating to the forum/legal business). 4.3...up over 100 MB of memory ) and also open the door to potential security incidents. Services can be set to start automatically upon boot or they can... digital certificates, thereby increasing the reliability of the authentication considerably. 12
Authenticity screening of stained glass windows using optical spectroscopy
Meulebroeck, Wendy; Wouters, Hilde; Nys, Karin; Thienpont, Hugo
2016-01-01
Civilized societies should safeguard their heritage as it plays an important role in community building. Moreover, past technologies often inspire new technology. Authenticity is, besides conservation and restoration, a key aspect in preserving our past, for example in museums when exposing showpieces. The classification of being authentic relies on an interdisciplinary approach integrating art historical and archaeological research complemented with applied research. In recent decades, analytical dating tools have been based on determining the raw materials used. However, the traditionally applied non-portable chemical techniques are destructive and time-consuming. Since museums oftentimes only consent to research actions which are completely non-destructive, optical spectroscopy might offer a solution. As a case study we apply this technique to two stained glass panels for which the 14th century dating is nowadays questioned. With this research we were able to show how simultaneous mapping of spectral signatures measured with a low-cost optical spectrum analyser unveils information regarding the production period. The significance of this research extends beyond the re-dating of these panels to the 19th century, as it provides an instant tool enabling authenticity questions to be answered immediately during the conservation process of stained glass, thereby providing the necessary data for solving deontological questions about heritage preservation. PMID:27883056
Self-Assembled Resonance Energy Transfer Keys for Secure Communication over Classical Channels.
Nellore, Vishwa; Xi, Sam; Dwyer, Chris
2015-12-22
Modern authentication and communication protocols increasingly use physical keys in lieu of conventional software-based keys for security. This shift is primarily driven by the ability to derive a unique, unforgeable signature from a physical key. The sole demonstration of an unforgeable key, thus far, has been through quantum key distribution, which suffers from limited communication distances and expensive infrastructure requirements. Here, we show a method for creating unclonable keys by molecular self-assembly of resonance energy transfer (RET) devices. It is infeasible to clone the RET-key due to the inability to characterize the key using current technology, the large number of input-output combinations per key, and the variation of the key's response with time. However, the manufacturer can produce multiple identical devices, which enables inexpensive, secure authentication and communication over classical channels, and thus over any distance. Through a detailed experimental survey of the nanoscale keys, we demonstrate that legitimate users are successfully authenticated 99.48% of the time and the false positives are only 0.39%, over two attempts. We estimate that a legitimate user would have a computational advantage of more than 10^340 years over an attacker. Our method enables the discovery of physical-key-based multiparty authentication and communication schemes that are both practical and possess unprecedented security.
Optical Radiation: Susceptibility and Countermeasures
1998-12-01
1995). "Early Visual Acuity Side Effects After Laser Retinal Photocoagulation in Diabetic Retinopathy ," W.D. Kosnik, L. Marouf, and M. Myers...tests. The automatic positioner (AMPS) coupled with the automatic optical test system (PEATS) permits timely and consistent evaluation of candidate...Science and Engineering OR:S&C Optical Radiation: Susceptibility and Countermeasures OSADS Optical Signature, Acquisition, and Detection System
Self-authentication of value documents
NASA Astrophysics Data System (ADS)
Hayosh, Thomas D.
1998-04-01
To prevent fraud it is critical to distinguish an authentic document from a counterfeit or altered document. Most current technologies rely on difficult-to-print human detectable features which are added to a document to prevent illegal reproduction. Fraud detection is mostly accomplished by human observation and is based upon the examiner's knowledge, experience and time allotted for examination of a document. Another approach to increasing the security of a value document is to add a unique property to each document. Data about that property is then encoded on the document itself and finally secured using a public key based digital signature. In such a scheme, machine readability of authenticity is possible. This paper describes a patent-applied-for methodology using the unique property of magnetic ink printing, magnetic remanence, that provides for full self-authentication when used with a recordable magnetic stripe for storing a digital signature and other document data. Traditionally the authenticity of a document is determined by physical examination for color, background printing, paper texture, printing resolution, and ink characteristics. On an initial level, there may be numerous security features present on a value document but only a few can be detected and evaluated by the untrained individual. Because security features are normally not standardized except on currency, training tellers and cashiers to do extensive security evaluation is not practical, even though these people are often the only people who get a chance to closely examine the document in a payment system which is back-end automated. In the context of this paper, one should be thinking about value documents such as commercial and personal checks although the concepts presented here can easily be applied to travelers cheques, credit cards, event tickets, passports, driver's licenses, motor vehicle titles, and even currency.
For a practical self-authentication system, the false alarms should be less than 1% on the first read pass. Causes of false alarms could be the lack of robustness of the taggant discrimination algorithm, excessive document skew as it is being read, or errors in reading the recordable stripe. The false alarm rate is readily tested by reading the magnetic tags and digitally signing documents in one reader and performing authentication in at least two other reading devices. When reading the same check in the same reader where signed, the error metric is typically in the range of 0.0600. When comparing different checks in different readers, the error metric generally reports values in the range of 0.3930. It is clear from tests to date that the taggant patterns are randomly different for checks even when printed serially one after another using the same printing process. Testing results to date on the robustness of the taggant comparison and discrimination algorithms indicate that it is probable that low false alarms and very low false accept rates will be achieved.
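A decision rule separating the two reported operating points (about 0.06 for genuine same-document reads, about 0.393 for different documents) might look like the following. The midpoint threshold is our illustrative choice, not the system's actual criterion:

```python
GENUINE_METRIC = 0.0600   # typical error metric, same check read where signed
IMPOSTOR_METRIC = 0.3930  # typical error metric, different checks/readers

# place the decision threshold between the two observed operating points
THRESHOLD = (GENUINE_METRIC + IMPOSTOR_METRIC) / 2

def authenticate(error_metric: float) -> bool:
    """Accept the document when the taggant comparison error is low enough."""
    return error_metric < THRESHOLD

assert authenticate(0.0600)
assert not authenticate(0.3930)
```

The wide gap between the two metric ranges is what makes low false-alarm and very low false-accept rates plausible: the threshold can sit far from both distributions.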
Keystroke dynamics in the pre-touchscreen era
Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.
2013-01-01
Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints, signatures, or iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals’ typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568
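The timing variables mentioned above (periods between key presses and releases) can be sketched as dwell and flight times. The event tuples and millisecond values below are invented for illustration:

```python
def keystroke_features(events):
    """Compute dwell times (press-to-release of one key) and flight times
    (release of one key to press of the next) from timestamped key events.
    `events` is a list of (key, press_time, release_time) tuples."""
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight

# typing "cat": (key, press time in ms, release time in ms)
events = [("c", 0, 90), ("a", 140, 230), ("t", 300, 380)]
dwell, flight = keystroke_features(events)
assert dwell == [90, 90, 80]
assert flight == [50, 70]
```

Vectors of such dwell/flight statistics, collected over many typing sessions, are the raw material the statistical, neural-network, and fuzzy-logic classifiers in this literature operate on.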
Signature detection and matching for document image retrieval.
Zhu, Guangyu; Zheng, Yefeng; Doermann, David; Jaeger, Stefan
2009-11-01
As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from clustered background is currently an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as query in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches.
A Framework for Designing Scaffolds That Improve Motivation and Cognition
Belland, Brian R.; Kim, ChanMin; Hannafin, Michael J.
2013-01-01
A problematic, yet common, assumption among educational researchers is that when teachers provide authentic, problem-based experiences, students will automatically be engaged. Evidence indicates that this is often not the case. In this article, we discuss (a) problems with ignoring motivation in the design of learning environments, (b) problem-based learning and scaffolding as one way to help, (c) how scaffolding has strayed from what was originally equal parts motivational and cognitive support, and (d) a conceptual framework for the design of scaffolds that can enhance motivation as well as cognitive outcomes. We propose guidelines for the design of computer-based scaffolds to promote motivation and engagement while students are solving authentic problems. Remaining questions and suggestions for future research are then discussed. PMID:24273351
Terrain type recognition using ERTS-1 MSS images
NASA Technical Reports Server (NTRS)
Gramenopoulos, N.
1973-01-01
For the automatic recognition of earth resources from ERTS-1 digital tapes, both multispectral and spatial pattern recognition techniques are important. Recognition of terrain types is based on spatial signatures that become evident by processing small portions of an image through selected algorithms. An investigation of spatial signatures that are applicable to ERTS-1 MSS images is described. Artifacts in the spatial signatures seem to be related to the multispectral scanner. A method for suppressing such artifacts is presented. Finally, results of terrain type recognition for one ERTS-1 image are presented.
Modeling the Lexical Morphology of Western Handwritten Signatures
Diaz-Cabrera, Moises; Ferrer, Miguel A.; Morales, Aythami
2015-01-01
A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a General Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwriting signatures. PMID:25860942
Authentication, Time-Stamping and Digital Signatures
NASA Technical Reports Server (NTRS)
Levine, Judah
1996-01-01
Time and frequency data are often transmitted over public packet-switched networks, and the use of this mode of distribution is likely to increase in the near future as high-speed logical circuits transmitted via networks replace point-to-point physical circuits. Although these networks have many technical advantages, they are susceptible to eavesdropping, spoofing, and the alteration of messages en route using techniques that are relatively simple to implement and quite difficult to detect. I will discuss a number of solutions to these problems, including the authentication mechanism used in the Network Time Protocol (NTP) and the more general technique of signing time-stamps using public key cryptography. This public key method can also be used to implement the digital analog of a Notary Public, and I will discuss how such a system could be realized on a public network such as the Internet.
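A symmetric-key variant of time-stamp signing, in the spirit of NTP's keyed-hash authentication, can be sketched as below. The abstract also discusses public-key signing, which this sketch does not implement; the key value and packing format are our assumptions:

```python
import hashlib
import hmac
import struct

def sign_timestamp(key: bytes, t: float) -> bytes:
    """Append a keyed-hash message authentication code to a packed
    timestamp, so alteration en route is detectable by the receiver."""
    payload = struct.pack("!d", t)
    return payload + hmac.new(key, payload, hashlib.sha256).digest()

def verify_timestamp(key: bytes, message: bytes) -> float:
    payload, mac = message[:8], message[8:]
    if not hmac.compare_digest(mac, hmac.new(key, payload, hashlib.sha256).digest()):
        raise ValueError("timestamp failed authentication")
    return struct.unpack("!d", payload)[0]

key = b"shared-ntp-key"
msg = sign_timestamp(key, 1234567890.5)
assert verify_timestamp(key, msg) == 1234567890.5

# flipping a single bit of the timestamp invalidates the MAC
tampered = bytes([msg[0] ^ 1]) + msg[1:]
try:
    verify_timestamp(key, tampered)
    raise AssertionError("tampered message should not verify")
except ValueError:
    pass
```

A shared-key MAC detects alteration and spoofing but provides no non-repudiation; that is what the public-key "digital Notary Public" approach in the abstract adds.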
Image-based electronic patient records for secured collaborative medical applications.
Zhang, Jianguo; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Yao, Yihong; Cai, Weihua; Jin, Jin; Zhang, Guozhen; Sun, Kun
2005-01-01
We developed a Web-based system to interactively display image-based electronic patient records (EPR) for secured intranet and Internet collaborative medical applications. The system consists of four major components: EPR DICOM gateway (EPR-GW), Image-based EPR repository server (EPR-Server), Web Server and EPR DICOM viewer (EPR-Viewer). In the EPR-GW and EPR-Viewer, the security modules of Digital Signature and Authentication are integrated to perform the security processing on the EPR data with integrity and authenticity. The privacy of EPR in data communication and exchanging is provided by SSL/TLS-based secure communication. This presentation gave a new approach to create and manage image-based EPR from actual patient records, and also presented a way to use Web technology and DICOM standard to build an open architecture for collaborative medical applications.
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan A.
2012-06-01
This paper proposes a new technique to obfuscate an authentication-challenge program (named LocProg) using randomly generated data together with a client's current location in real-time. LocProg can be used to enable any handset application on mobile devices (e.g. mCommerce on Smartphones) that requires authentication with a remote authenticator (e.g. bank). The motivation of this novel technique is to a) enhance the security against replay attacks, which is currently based on using real-time nonce(s), and b) add a new security factor, which is location verified by two independent sources, to challenge / response methods for authentication. To assure a secure live transaction, thus reducing the possibility of replay and other remote attacks, the authors have devised a novel technique to obtain the client's location from two independent sources: GPS on the client's side and the cellular network on the authenticator's side. The algorithm of LocProg is based on obfuscating "random elements plus a client's data" with a location-based key, generated on the bank side. LocProg is then sent to the client and is designed so it will automatically integrate into the target application on the client's handset. The client can then de-obfuscate LocProg if s/he is within a certain range around the location calculated by the bank and if the correct personal data is supplied. LocProg also has features to protect against trial/error attacks. Analysis of LocAuth's security (trust, threat and system models) and trials based on a prototype implementation (on the Android platform) prove the viability and novelty of LocAuth.
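The location-based key idea can be sketched by quantizing coordinates to a grid cell before hashing, so that the bank's cellular-network fix and the client's GPS fix yield the same key when they agree. The cell size and helper names are our assumptions, and a real scheme would need overlapping cells (or similar) to tolerate fixes that straddle a cell boundary:

```python
import hashlib
import hmac
import math

def location_key(secret: bytes, lat: float, lon: float, cell_deg: float = 0.01) -> bytes:
    """Derive an obfuscation key from a location quantized to a grid cell,
    so nearby positions (within roughly 1 km) yield the same key."""
    cell = (math.floor(lat / cell_deg), math.floor(lon / cell_deg))
    return hmac.new(secret, repr(cell).encode(), hashlib.sha256).digest()

secret = b"bank-side-secret"
k_bank = location_key(secret, 51.50140, -0.14190)    # cellular-network fix
k_client = location_key(secret, 51.50190, -0.14230)  # nearby GPS fix
k_far = location_key(secret, 48.85660, 2.35220)      # wrong city
assert k_bank == k_client
assert k_bank != k_far
```

With matching keys, the client can de-obfuscate the challenge program; a replay from another location derives a different key and fails, which is the second authentication factor the paper targets.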
Ten-Ecosystem Study (TES) site 9, Washington County, Missouri
NASA Technical Reports Server (NTRS)
Echert, W. H. (Principal Investigator)
1979-01-01
The author has identified the following significant results. Sufficient spectral separability exists among softwood, hardwood, grassland, and water to develop a level 2 classification and inventory. Using the tested automatic data processing technology, softwood and grassland signatures can be extended across the county with acceptable accuracy; with more dense sampling, the hardwood signature probably could also be extended. Fall was found to be the best season for mapping this ecosystem.
Li, Chuan; Peng, Juan; Liang, Ming
2014-01-01
Oil debris sensors are effective tools to monitor wear particles in lubricants. For in situ applications, surrounding noise and vibration interferences often distort the oil debris signature of the sensor. Hence extracting oil debris signatures from sensor signals is a challenging task for wear particle monitoring. In this paper we employ the maximal overlap discrete wavelet transform (MODWT) with optimal decomposition depth to enhance the wear particle monitoring capability. The sensor signal is decomposed by the MODWT into different depths for detecting the wear particle existence. To extract the authentic particle signature with minimal distortion, the root mean square deviation of kurtosis value of the segmented signal residue is adopted as a criterion to obtain the optimal decomposition depth for the MODWT. The proposed approach is evaluated using both simulated and experimental wear particles. The results show that the present method can improve the oil debris monitoring capability without structural upgrade requirements. PMID:24686730
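The decomposition step can be illustrated with an undecimated Haar transform, a simple member of the MODWT family, together with a kurtosis measure of impulsiveness. This is a toy sketch under stated assumptions, not the authors' implementation: the paper uses the root-mean-square deviation of kurtosis of the segmented residue as its depth criterion, while here kurtosis is shown only as a flag for particle-like impulses.

```python
def modwt_haar(x, depth):
    """Undecimated (MODWT-style) Haar decomposition to the given depth.
    Returns the list of detail series and the final approximation.
    At level j the filter taps are 2**(j-1) samples apart (circular)."""
    n, approx, details = len(x), list(x), []
    for j in range(1, depth + 1):
        s = 2 ** (j - 1)
        detail = [(approx[t] - approx[(t - s) % n]) / 2.0 for t in range(n)]
        approx = [(approx[t] + approx[(t - s) % n]) / 2.0 for t in range(n)]
        details.append(detail)
    return details, approx

def kurtosis(x):
    """Sample kurtosis; high values flag impulsive (particle-like) content."""
    m = sum(x) / len(x)
    var = sum((v - m) ** 2 for v in x) / len(x)
    return sum((v - m) ** 4 for v in x) / (len(x) * var * var) if var else 0.0

# toy signal: flat baseline with one particle-like impulse
sig = [0.0] * 64
sig[32] = 1.0
details, approx = modwt_haar(sig, 3)
```

Because the transform is undecimated, the details plus the final approximation sum back to the original signal sample by sample, and the impulse shows up as a sharp kurtosis peak in the detail series.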
ElMasry, Gamal; Nakauchi, Shigeki
2016-03-01
A simulation method for approximating the spectral signatures of minced meat samples was developed based on the concentrations and optical properties of the major chemical constituents. Minced beef samples of different compositions scanned with a near-infrared spectrometer and a hyperspectral imaging system were examined. Chemical compositions determined heuristically and optical properties collected from authenticated references were used to simulate and approximate the samples' spectral signatures. In the short-wave infrared range, the resulting spectrum equals the sum of the absorption of three individual absorbers, that is, water, protein, and fat. By assuming homogeneous distributions of the main chromophores in the mince samples, the obtained absorption spectra are found to be a linear combination of the absorption spectra of the major chromophores present in the sample. Results revealed that the developed models were good enough to derive the spectral signatures of minced meat samples with a reasonable level of robustness: an agreement index above 0.90 and a ratio of performance to deviation above 1.4.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kakinuma, Ryutaru; Moriyama, Noriyuki
2009-02-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. Moreover, there is a shortage of doctors in Japan to read such medical images. To overcome these problems, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using a helical CT scanner for lung cancer mass screening. Functions to observe suspicious shadows in detail are provided in a computer-aided diagnosis workstation together with these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system and a biometric face authentication system. Biometric face authentication used on site in telemedicine makes file encryption and login verification effective; as a result, patients' private information is protected. The screen of the Web medical image conference system can be shared by two or more conference terminals at the same time, and opinions can be exchanged using a camera and a microphone connected to the workstation. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time.
The results of this study indicate that our filmless radiological information system, using the computer-aided diagnosis workstation and our telemedicine network system, can increase diagnostic speed and accuracy and improve the security of medical information.
Built-in test by signature inspection (BITSI)
Bergeson, Gary C.; Morneau, Richard A.
1991-01-01
A system and method for fault detection in electronic circuits. A stimulus generator sends a signal to the input of the circuit under test. Signature inspection logic compares the resultant signal from test nodes on the circuit to an expected signal. If the signals do not match, the signature inspection logic sends a signal to the control logic to indicate fault detection in the circuit. A data input multiplexer between the test nodes of the circuit under test and the signature inspection logic can provide for identification of the specific node at fault by the signature inspection logic. Control logic responsive to the signature inspection logic conveys information about fault detection for use in determining the condition of the circuit. When used in conjunction with a system test controller, the built-in test by signature inspection system and method can be used to poll a plurality of circuits automatically and continuously for faults and to record the results of such polling in the system test controller.
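The signature-inspection idea, compressing each node's response stream into a short signature and comparing it against a golden table, can be sketched in software. This is an illustrative sketch only: CRC-32 stands in for the hardware signature register, and the node names and byte streams are hypothetical.

```python
import zlib

def node_signature(samples: bytes) -> int:
    """Compress a node's response stream into a short signature
    (CRC-32 as a stand-in for a hardware signature register)."""
    return zlib.crc32(samples)

def inspect(golden: dict, captured: dict) -> list:
    """Compare captured signatures against the golden table and
    return the nodes whose signatures mismatch (detected faults)."""
    return [node for node, sig in captured.items() if golden.get(node) != sig]

# golden table recorded once from a known-good circuit under the test stimulus
golden = {"node_A": node_signature(b"\x01\x03\x07\x0f"),
          "node_B": node_signature(b"\xaa\x55\xaa\x55")}

# a fault on node_B alters its response stream, so its signature changes
captured = {"node_A": node_signature(b"\x01\x03\x07\x0f"),
            "node_B": node_signature(b"\xaa\x55\xaa\xff")}
faults = inspect(golden, captured)
```

The multiplexer in the patented design corresponds to polling each node separately, which is what lets `inspect` name the specific node at fault rather than only flagging the board.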
Angular relational signature-based chest radiograph image view classification.
Santosh, K C; Wendling, Laurent
2018-01-22
In a computer-aided diagnosis (CAD) system, especially for chest radiograph or chest X-ray (CXR) screening, CXR image view information is required. Automatically separating CXR image views, frontal and lateral, can ease the subsequent CXR screening process, since the same techniques may not work equally well for both views. We present a novel technique to classify frontal and lateral CXR images, in which we introduce an angular relational signature through the force histogram to extract features, and apply three different state-of-the-art classifiers (multi-layer perceptron, random forest, and support vector machine) to make a decision. We validated our fully automatic technique on a set of 8100 images hosted by the U.S. National Library of Medicine (NLM), National Institutes of Health (NIH), and achieved an accuracy close to 100%. Our method outperforms the state-of-the-art methods in terms of processing time (less than or close to 2 s for the whole test data) with comparable accuracy, which justifies its practicality. Graphical Abstract: Interpreting chest X-ray (CXR) through the angular relational signature.
Marini, Francesco; Marzi, Carlo A.
2016-01-01
The visual system leverages organizational regularities of perceptual elements to create meaningful representations of the world. One clear example of such a function, which has been formalized in the Gestalt psychology principles, is the perceptual grouping of simple visual elements (e.g., lines and arcs) into unitary objects (e.g., forms and shapes). The present study sought to characterize automatic attentional capture and related cognitive processing of Gestalt-like visual stimuli at the psychophysiological level by using event-related potentials (ERPs). We measured ERPs during a simple visual reaction time task with bilateral presentations of physically matched elements with or without a Gestalt organization. Results showed that Gestalt (vs. non-Gestalt) stimuli are characterized by a larger N2pc together with enhanced ERP amplitudes of non-lateralized components (N1, N2, P3) starting around 150 ms post-stimulus onset. Thus, we conclude that Gestalt stimuli capture attention automatically and entail characteristic psychophysiological signatures at both early and late processing stages. Highlights: We studied the neural signatures of the automatic processes of visual attention elicited by Gestalt stimuli. We found that a reliable early correlate of attentional capture is the N2pc component. Perceptual and cognitive processing of Gestalt stimuli is associated with larger N1, N2, and P3 components. PMID:27630555
The trustworthy digital camera: Restoring credibility to the photographic image
NASA Technical Reports Server (NTRS)
Friedman, Gary L.
1994-01-01
The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continue to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of 'electronic original' is no longer meaningful.
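The authentication property, any alteration of the image bytes is detectable, can be sketched with Python's standard library. Note the hedge: HMAC is a symmetric stand-in used here because it is in the standard library; the camera described in the paper would use a public-key signature so that anyone can verify the image without holding the camera's secret.

```python
import hashlib, hmac

def sign_image(image_bytes: bytes, camera_key: bytes) -> str:
    """Produce an authentication tag over the raw image data.
    (HMAC is a symmetric stand-in for the camera's public-key signature.)"""
    return hmac.new(camera_key, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str, camera_key: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_image(image_bytes, camera_key), tag)

key = b"secret-embedded-in-camera"     # hypothetical per-camera secret
original = b"\x89PNG...raw sensor data..."
tag = sign_image(original, key)

assert verify_image(original, tag, key)             # untouched image passes
assert not verify_image(original + b"!", tag, key)  # any alteration is detected
```

Flipping a single byte anywhere in the image changes the tag completely, which is exactly the tamper-evidence the trustworthy camera relies on.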
Secure method for biometric-based recognition with integrated cryptographic functions.
Chiou, Shin-Yan
2013-01-01
Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the ratio for certification in biometric systems needs not to achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.
Kim, Heejoong; Suresh Kumar, K; Shin, Kyung-Hoon
2015-04-01
Globalisation of seafood and aquaculture products and their convenient marketing worldwide increase the possibility of mislabelled products being distributed, thereby underlining the need to identify their origin. Stable isotope analysis is a promising approach to verifying the authenticity and traceability of seafood and aquaculture products. In this investigation, we measured the carbon and nitrogen stable isotope ratios (δ(13)C and δ(15)N) of three commercial fish, viz. mackerel, yellow croaker and pollock, originating from various countries. Apart from the species-dependent variation in the isotopic values, marked differences in the δ(13)C and δ(15)N ratios were also observed with respect to the country of origin. This suggests that C and N isotopic signatures could be reliable tools to identify and trace the origin of commercial fish. Copyright © 2014 Elsevier Ltd. All rights reserved.
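The δ notation used above is the relative deviation of a sample's isotope ratio from an international standard, in per mil: δ = (R_sample/R_standard − 1) × 1000. The sketch below computes δ values and then assigns an origin by nearest centroid in (δ13C, δ15N) space; the sample ratio and the country centroids are hypothetical illustration values, not data from the paper.

```python
# 13C/12C ratio of the VPDB standard (a published reference value)
R_VPDB = 0.0112372

def delta_permil(r_sample: float, r_standard: float) -> float:
    """delta value in per mil: relative deviation of the sample's
    isotope ratio from the standard, multiplied by 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# hypothetical measured 13C/12C ratio for a fish muscle sample
d13C = delta_permil(0.0110500, R_VPDB)   # comes out near -16.7 per mil

# hypothetical country-of-origin centroids in (d13C, d15N) space
centroids = {"Korea": (-18.0, 12.5), "Norway": (-20.5, 9.0)}

def classify(d13c: float, d15n: float) -> str:
    """Toy traceability: assign origin by nearest centroid in isotope space."""
    return min(centroids, key=lambda c: (centroids[c][0] - d13c) ** 2
                                        + (centroids[c][1] - d15n) ** 2)
```

A real traceability model would use many reference samples per origin and a proper classifier, but the geometry, origins separating into clusters in isotope space, is the same.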
[Trust in the care relationship].
Sureau, Patrick
2018-04-01
A relationship of trust is an expression often used by caregivers, to such an extent that it almost seems self-evident. It is nevertheless important to give some thought to this aspect in order to construct a reliable, authentic and ethical care relationship. Indeed, trust is not automatic. It requires reciprocity, a deliberate choice on the part of the caregiver and the patient. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Kuznetsov, Michael V.
2006-05-01
Reliable interworking of various automatic telecommunication systems, including the transmission systems of optical communication networks, requires authentic recognition of the signals of one- or two-frequency service signaling systems. Analysis of the time parameters of a received signal makes it possible to increase the reliability of detecting and recognizing service signaling against a background of speech.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru; Sasagawa, Michizou
2006-03-01
The multi-helical CT scanner has remarkably increased the speed at which chest CT images are acquired for mass screening. Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. We have also developed an electronic medical recording system and a prototype Internet system for community health across two or more regions, using Virtual Private Network routers, a biometric fingerprint authentication system and a biometric face authentication system for the safety of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies that have been conducted to evaluate this new system. The results of this study indicate that our computer-aided diagnosis workstation and network system can increase diagnostic speed and diagnostic accuracy and improve the safety of medical information.
Rapid, Potentially Automatable Method to Extract Biomarkers for HPLC/ESI/MS/MS to Detect and Identify BW Agents
1997-11-01
... status can sometimes be reflected in the infectious potential or drug resistance of those pathogens. For example, in Mycobacterium tuberculosis ... Mycobacterium tuberculosis, its antibiotic resistance and prediction of pathogenicity amongst Mycobacterium spp. based on signature lipid biomarkers ...
Watermarking protocols for authentication and ownership protection based on timestamps and holograms
NASA Astrophysics Data System (ADS)
Dittmann, Jana; Steinebach, Martin; Croce Ferri, Lucilla
2002-04-01
Digital watermarking has become an accepted technology for enabling multimedia protection schemes. One problem here is the security of these schemes: without a suitable framework, watermarks can be replaced and manipulated. We discuss different protocols providing security against rightful-ownership attacks and other fraud attempts. We compare the characteristics of existing protocols for different media, such as direct embedding or seed-based approaches, and the required attributes of the watermarking technology, such as robustness or payload. We introduce two new media-independent protocol schemes for rightful-ownership authentication. With the first scheme we ensure the security of digital watermarks used for ownership protection with a combination of two watermarks: a first watermark from the copyright holder and a second watermark from a Trusted Third Party (TTP). It is based on hologram embedding, and the watermark consists of, e.g., a company logo. As an example we use digital images and specify the properties of the embedded additional security information. We identify the components necessary for the security protocol, such as timestamps, a PKI and cryptographic algorithms. The second scheme is used for authentication. It is designed for invertible watermarking applications, which require high data integrity. We combine digital signature schemes and digital watermarking to provide publicly verifiable integrity. The original data can only be reproduced with a secret key. Both approaches provide solutions for copyright and authentication watermarking and are introduced for image data but can easily be adopted for video and audio data as well.
HDL-level automated watermarking of IP cores
NASA Astrophysics Data System (ADS)
Castillo, E.; Meyer-Baese, U.; Parrilla, L.; García, A.; Lloris, A.
2008-04-01
This paper presents significant improvements to our previous watermarking technique for Intellectual Property Protection (IPP) of IP cores. The technique relies on hosting the bits of a digital signature at the HDL design level, using resources included within the original system. Thus, any attack trying to change or remove the digital signature will damage the design. The technique also includes a procedure for secure signature extraction requiring minimal modifications to the system. The new advances increase the applicability of this watermarking technique to any design, not only to those including look-up tables, and provide an automatic tool for signature hosting. Synthesis results show that the application of the proposed watermarking strategy results in negligible degradation of system performance and very low area penalties, and that the use of the automated tool, in addition to easing signature hosting, leads to reduced area penalties.
Implementing traceability using particle randomness-based textile printed tags
NASA Astrophysics Data System (ADS)
Agrawal, T. K.; Koehl, L.; Campagne, C.
2017-10-01
This article introduces a random particle-based traceability tag for textiles. The proposed tag not only acts as a unique signature for the corresponding textile product but is also easy to manufacture and hard to copy. It seeks applications in brand authentication and traceability in the textile and clothing (T&C) supply chain. A prototype has been developed by a screen printing process, in which micron-scale particles were mixed with the printing paste and printed on cotton fabrics to attain the required randomness. To encode the randomness, an image of the developed tag was taken and analyzed using image processing. The randomness of the particles acts as a product key or unique signature which is required to decode the tag. Finally, washing and abrasion resistance tests were conducted to check the durability of the printed tag.
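The encoding step, turning the random particle layout into a product key, can be sketched as follows. This is a toy under stated assumptions: the "image" is a tiny grayscale grid, each dark pixel stands for one particle, and the thresholding and hashing choices are hypothetical; a real tag would need blob detection on a camera image.

```python
import hashlib

def particle_positions(image, threshold=128):
    """Return sorted (row, col) positions of dark pixels, treating each
    as one printed particle. `image` is a toy 2-D grayscale list."""
    return sorted((r, c) for r, row in enumerate(image)
                         for c, v in enumerate(row) if v < threshold)

def tag_key(image) -> str:
    """Hash the particle layout into the product key that decodes the tag."""
    payload = ";".join(f"{r},{c}" for r, c in particle_positions(image))
    return hashlib.sha256(payload.encode()).hexdigest()[:16]

tag = [[255] * 8 for _ in range(8)]
for r, c in [(1, 2), (4, 6), (6, 1)]:   # random particle placement
    tag[r][c] = 0

counterfeit = [row[:] for row in tag]    # a copy with one particle misplaced
counterfeit[4][6] = 255
counterfeit[4][5] = 0
```

Because the key is a hash of the exact layout, re-imaging the genuine tag reproduces the key, while even a single misplaced particle in a counterfeit yields a different key.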
In Internet-Based Visualization System Study about Breakthrough Applet Security Restrictions
NASA Astrophysics Data System (ADS)
Chen, Jie; Huang, Yan
In the process of realization Internet-based visualization system of the protein molecules, system needs to allow users to use the system to observe the molecular structure of the local computer, that is, customers can generate the three-dimensional graphics from PDB file on the client computer. This requires Applet access to local file, related to the Applet security restrictions question. In this paper include two realization methods: 1.Use such as signature tools, key management tools and Policy Editor tools provided by the JDK to digital signature and authentication for Java Applet, breakthrough certain security restrictions in the browser. 2. Through the use of Servlet agent implement indirect access data methods, breakthrough the traditional Java Virtual Machine sandbox model restriction of Applet ability. The two ways can break through the Applet's security restrictions, but each has its own strengths.
HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation
NASA Astrophysics Data System (ADS)
Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.
2006-03-01
As a governmental regulation, the Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals, living or deceased. HIPAA requires security services supporting the implementation features of access control, audit controls, authorization control, data authentication, and entity authentication. These controls, proposed in the HIPAA Security Standards, are supported here by audit trails. Audit trails can be used for surveillance, to detect when interesting events that warrant further investigation might be happening; or they can be used forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve high and continuous availability, we designed a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation. The system consists of two parts: monitoring agents running on each PACS component computer, and a Monitor Server running on a remote computer. Monitoring agents are deployed on all computer nodes in the RIS-integrated PACS system to collect the audit trail messages defined by Supplement 95 of the DICOM standard: Audit Trail Messages. The Monitor Server then gathers all audit messages and processes them to provide security information at three levels: system resources, PACS/RIS applications, and user/patient data access. RIS-integrated PACS managers can now monitor and control the entire RIS-integrated PACS operation through the web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation, and gives the preliminary results obtained with this monitoring system on a clinical RIS-integrated PACS.
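The Monitor Server's aggregation step, gathering agent messages and bucketing them into the three reporting levels, can be sketched as below. The flat event dictionaries and the event-type names are hypothetical simplifications; real DICOM Supplement 95 audit messages are structured XML with many more fields.

```python
from collections import defaultdict

# hypothetical flattened audit events as collected by the monitoring agents
events = [
    {"host": "pacs-archive", "type": "CPUOverload"},
    {"host": "ris-gw",       "type": "ApplicationStart"},
    {"host": "workstation3", "type": "PatientRecordAccessed", "user": "dr_lee"},
    {"host": "workstation3", "type": "PatientRecordAccessed", "user": "dr_lee"},
]

# map event types onto the paper's three reporting levels
LEVEL = {"CPUOverload": "system_resources",
         "ApplicationStart": "applications",
         "PatientRecordAccessed": "data_access"}

def summarize(events):
    """Group audit events into the three monitoring levels the server reports."""
    summary = defaultdict(list)
    for e in events:
        summary[LEVEL.get(e["type"], "unclassified")].append(e)
    return summary

report = summarize(events)
```

From `report`, a web front end could render per-level views: resource alarms for operators, application events for PACS admins, and data-access records for privacy auditing.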
Neural Signatures of Controlled and Automatic Retrieval Processes in Memory-based Decision-making.
Khader, Patrick H; Pachur, Thorsten; Weber, Lilian A E; Jost, Kerstin
2016-01-01
Decision-making often requires retrieval from memory. Drawing on the neural ACT-R theory [Anderson, J. R., Fincham, J. M., Qin, Y., & Stocco, A. A central circuit of the mind. Trends in Cognitive Sciences, 12, 136-143, 2008] and other neural models of memory, we delineated the neural signatures of two fundamental retrieval aspects during decision-making: automatic and controlled activation of memory representations. To disentangle these processes, we combined a paradigm developed to examine neural correlates of selective and sequential memory retrieval in decision-making with a manipulation of associative fan (i.e., the decision options were associated with one, two, or three attributes). The results show that both the automatic activation of all attributes associated with a decision option and the controlled sequential retrieval of specific attributes can be traced in material-specific brain areas. Moreover, the two facets of memory retrieval were associated with distinct activation patterns within the frontoparietal network: The dorsolateral prefrontal cortex was found to reflect increasing retrieval effort during both automatic and controlled activation of attributes. In contrast, the superior parietal cortex only responded to controlled retrieval, arguably reflecting the sequential updating of attribute information in working memory. This dissociation in activation pattern is consistent with ACT-R and constitutes an important step toward a neural model of the retrieval dynamics involved in memory-based decision-making.
Design of Provider-Provisioned Website Protection Scheme against Malware Distribution
NASA Astrophysics Data System (ADS)
Yagi, Takeshi; Tanimoto, Naoto; Hariu, Takeo; Itoh, Mitsutaka
Vulnerabilities in web applications expose computer networks to security threats, and many websites are used by attackers as hopping sites to attack other websites and user terminals. These incidents prevent service providers from constructing secure networking environments. To protect websites from attacks exploiting vulnerabilities in web applications, service providers use web application firewalls (WAFs). WAFs filter accesses from attackers by using signatures, which are generated based on the exploit codes of previous attacks. However, WAFs cannot filter unknown attacks because the signatures cannot reflect new types of attacks. In service provider environments, the number of exploit codes has recently increased rapidly because of the spread of vulnerable web applications that have been developed through cloud computing. Thus, generating signatures for all exploit codes is difficult. To solve these problems, our proposed scheme detects and filters malware downloads that are sent from websites which have already received exploit codes. In addition, to collect information for detecting malware downloads, web honeypots, which automatically extract the communication records of exploit codes, are used. According to the results of experiments using a prototype, our scheme can filter attacks automatically so that service providers can provide secure and cost-effective network environments.
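The limitation the scheme addresses, that signature-based WAF filtering catches only previously seen exploit codes, can be shown in a few lines. The regex signatures below are hypothetical examples of the kind generated from past attacks, not the authors' actual rule set.

```python
import re

# hypothetical signatures generated from exploit codes of previous attacks
SIGNATURES = [
    re.compile(r"(?i)union\s+select"),   # SQL injection
    re.compile(r"(?i)<script[^>]*>"),    # reflected XSS
    re.compile(r"\.\./\.\./"),           # path traversal
]

def waf_filter(request_uri: str) -> bool:
    """Return True if the request matches a known-attack signature.
    Unknown (zero-day) exploit codes slip through; that gap is what the
    proposed malware-download detection is meant to close."""
    return any(sig.search(request_uri) for sig in SIGNATURES)

assert waf_filter("/item?id=1 UNION SELECT password FROM users")
assert not waf_filter("/item?id=1;call_new_unknown_exploit()")  # zero-day passes
```

This is why the paper moves the detection point: instead of matching every exploit code at the request, it watches for the malware download that a compromised website subsequently serves.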
On continuous user authentication via typing behavior.
Roth, Joseph; Liu, Xiaoming; Metaxas, Dimitris
2014-10-01
We hypothesize that an individual computer user has a unique and consistent habitual pattern of hand movements, independent of the text, while typing on a keyboard. As a result, this paper proposes a novel biometric modality named typing behavior (TB) for continuous user authentication. Given a webcam pointing toward a keyboard, we develop real-time computer vision algorithms to automatically extract hand movement patterns from the video stream. Unlike the typical continuous biometrics, such as keystroke dynamics (KD), TB provides a reliable authentication with a short delay, while avoiding explicit key-logging. We collect a video database where 63 unique subjects type static text and free text for multiple sessions. For one typing video, the hands are segmented in each frame and a unique descriptor is extracted based on the shape and position of hands, as well as their temporal dynamics in the video sequence. We propose a novel approach, named bag of multi-dimensional phrases, to match the cross-feature and cross-temporal pattern between a gallery sequence and probe sequence. The experimental results demonstrate a superior performance of TB when compared with KD, which, together with our ultrareal-time demo system, warrant further investigation of this novel vision application and biometric modality.
Enhancing the AliEn Web Service Authentication
NASA Astrophysics Data System (ADS)
Zhu, Jianlin; Saiz, Pablo; Carminati, Federico; Betev, Latchezar; Zhou, Daicui; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Grigoras, Costin; Furano, Fabrizio; Schreiner, Steffen; Vladimirovna Datskova, Olga; Sankar Banerjee, Subho; Zhang, Guoping
2011-12-01
Web Services are an XML-based technology that allows applications to communicate with each other across disparate systems, and they are becoming the de facto standard for interoperability between heterogeneous processes and systems. AliEn2 is a grid environment based on web services. The AliEn2 services can be divided into three categories: central services, deployed once per organization; site services, deployed at each of the participating centers; and Job Agents, running automatically on the worker nodes. A security model to protect these services is essential for the whole system. Current web server implementations, such as Apache, are not suitable for use within the grid environment: Apache with mod_ssl and OpenSSL supports only X.509 certificates, but in the grid environment the common credential is the proxy certificate, used to provide restricted proxying and delegation. An authentication framework was therefore developed for the AliEn2 web services to add to the Apache Web Server the ability to accept both X.509 certificates and proxy certificates from the client side. The authentication framework also allows the generation of access control policies to limit access to the AliEn2 web services.
Measurement-device-independent quantum digital signatures
NASA Astrophysics Data System (ADS)
Puthoor, Ittoop Vergheese; Amiri, Ryan; Wallden, Petros; Curty, Marcos; Andersson, Erika
2016-08-01
Digital signatures play an important role in software distribution, modern communication, and financial transactions, where it is important to detect forgery and tampering. Signatures are a cryptographic technique for validating the authenticity and integrity of messages, software, or digital documents. The security of currently used classical schemes relies on computational assumptions. Quantum digital signatures (QDS), on the other hand, provide information-theoretic security based on the laws of quantum physics. Recent work on QDS [Amiri et al., Phys. Rev. A 93, 032325 (2016), 10.1103/PhysRevA.93.032325; Yin, Fu, and Zeng-Bing, Phys. Rev. A 93, 032316 (2016), 10.1103/PhysRevA.93.032316] shows that such schemes do not require trusted quantum channels and are unconditionally secure against general coherent attacks. However, in practical QDS, just as in quantum key distribution (QKD), the detectors can be subjected to side-channel attacks, which can make the actual implementations insecure. Motivated by the idea of measurement-device-independent quantum key distribution (MDI-QKD), we present a measurement-device-independent QDS (MDI-QDS) scheme, which is secure against all detector side-channel attacks. Based on the rapid development of practical MDI-QKD, our MDI-QDS protocol could also be experimentally implemented, since it requires a similar experimental setup.
A Standard Nomenclature for Referencing and Authentication of Pluripotent Stem Cells.
Kurtz, Andreas; Seltmann, Stefanie; Bairoch, Amos; Bittner, Marie-Sophie; Bruce, Kevin; Capes-Davis, Amanda; Clarke, Laura; Crook, Jeremy M; Daheron, Laurence; Dewender, Johannes; Faulconbridge, Adam; Fujibuchi, Wataru; Gutteridge, Alexander; Hei, Derek J; Kim, Yong-Ou; Kim, Jung-Hyun; Kokocinski, Anja Kolb-; Lekschas, Fritz; Lomax, Geoffrey P; Loring, Jeanne F; Ludwig, Tenneille; Mah, Nancy; Matsui, Tohru; Müller, Robert; Parkinson, Helen; Sheldon, Michael; Smith, Kelly; Stachelscheid, Harald; Stacey, Glyn; Streeter, Ian; Veiga, Anna; Xu, Ren-He
2018-01-09
Unambiguous cell line authentication is essential to avoid loss of association between data and cells. The risk for loss of references increases with the rapidity that new human pluripotent stem cell (hPSC) lines are generated, exchanged, and implemented. Ideally, a single name should be used as a generally applied reference for each cell line to access and unify cell-related information across publications, cell banks, cell registries, and databases and to ensure scientific reproducibility. We discuss the needs and requirements for such a unique identifier and implement a standard nomenclature for hPSCs, which can be automatically generated and registered by the human pluripotent stem cell registry (hPSCreg). To avoid ambiguities in PSC-line referencing, we strongly urge publishers to demand registration and use of the standard name when publishing research based on hPSC lines. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Automatic Threshold Design for a Bound Document Scanner.
1982-12-01
[Garbled OCR of a scanned thesis record; most of the text is illegible. Recoverable details: a thesis on automatic threshold design (ATC) for a bound document scanner; reported errors are attributed to data uncertainty and other shortcomings in the scanner rather than in the ATC scheme; page count 224; the author grants permission to reproduce and distribute copies of the thesis document in whole or in part.]
Secure electronic commerce communication system based on CA
NASA Astrophysics Data System (ADS)
Chen, Deyun; Zhang, Junfeng; Pei, Shujun
2001-07-01
In this paper, we introduce the state of security in electronic commerce and analyze the working process and security of the SSL protocol. We then propose a secure electronic commerce communication system based on a certificate authority (CA). The system provides secure services such as encryption, integrity, peer authentication, and non-repudiation for the application-layer communication software of browser clients and web servers. By setting up the CA in the network, the system can implement automatic key distribution and unified key management.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2007-03-01
Multislice CT scanners have dramatically increased the speed at which chest CT images can be acquired for mass screening. Mass screening based on multislice CT images requires a considerable number of images to be read, and it is this time-consuming step that currently makes the use of helical CT for mass screening impractical. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. We have also applied the lung cancer screening algorithm in a mobile helical CT scanner for lung cancer mass screening in regions without hospitals. In addition, we have developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using Virtual Private Network routers together with biometric fingerprint and face authentication systems to safeguard medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes the basic studies conducted to evaluate this new system.
Feature maps driven no-reference image quality prediction of authentically distorted images
NASA Astrophysics Data System (ADS)
Ghadiyaram, Deepti; Bovik, Alan C.
2015-03-01
Current blind image quality prediction models rely on benchmark databases comprised of singly and synthetically distorted images, thereby learning image features that are only adequate to predict human perceived visual quality on such inauthentic distortions. However, real world images often contain complex mixtures of multiple distortions. Rather than a) discounting the effect of these mixtures of distortions on an image's perceptual quality and considering only the dominant distortion or b) using features that are only proven to be efficient for singly distorted images, we deeply study the natural scene statistics of authentically distorted images, in different color spaces and transform domains. We propose a feature-maps-driven statistical approach which avoids any latent assumptions about the type of distortion(s) contained in an image, and focuses instead on modeling the remarkable consistencies in the scene statistics of real world images in the absence of distortions. We design a deep belief network that takes model-based statistical image features derived from a very large database of authentically distorted images as input and discovers good feature representations by generalizing over different distortion types, mixtures, and severities, which are later used to learn a regressor for quality prediction. We demonstrate the remarkable competence of our features for improving automatic perceptual quality prediction on a benchmark database and on the newly designed LIVE Authentic Image Quality Challenge Database and show that our approach of combining robust statistical features and the deep belief network dramatically outperforms the state-of-the-art.
An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology
Winata, Doni
2018-01-01
The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next generation payment systems. The growth of smartphone usage nowadays has fostered a new and popular mobile payment environment. Most of the current generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices. It is plausible to construct an Over-the-Air BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: initialization phase, session key construction phase, and authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system will automatically detect the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified the customer is standing within the payment-enabled area, the payment system will invoke authentication process between POS and the customer’s smartphone through BLE communication channel to generate a secure session key and establish an authenticated communication session to perform the payment transaction accordingly. A prototype is implemented to assess the performance of the proposed design for mobile payment system. In addition, security analysis is conducted to evaluate the security strength of the proposed protocol. PMID:29587399
NASA Technical Reports Server (NTRS)
Mielke, H. W. (Principal Investigator)
1980-01-01
The use of LANDSAT as a tool for geobotanical prospecting was studied in a 13,137 sq km area from southeastern Pennsylvania to northern Virginia. Vegetation differences between known serpentine and non-serpentine sites were most easily distinguished on early summer images. A multispectral signature was derived from vegetation of two known serpentine sites and a map was produced of 159 similar signatures of vegetation in the study area. Authenticity of the serpentine nature of the mapped sites was checked via geochemical analysis of soils collected from 14% of the sites. Overall success of geobotanical prospecting was about 35% for the total study area. When vegetation distribution was taken into account, the success rate was 67% for the region north of the Potomac, demonstrating the effectiveness of multispectral satellite imagery for quickly and accurately locating mineral-sensitive vegetation communities over vast tracts of land.
Security, privacy, and confidentiality issues on the Internet
Kelly, Grant; McKenzie, Bruce
2002-01-01
We introduce the issues around protecting information about patients and related data sent via the Internet. We begin by reviewing three concepts necessary to any discussion about data security in a healthcare environment: privacy, confidentiality, and consent, and we give some advice on how to protect local data. Authentication and privacy of e-mail via encryption are offered by Pretty Good Privacy (PGP) and Secure Multipurpose Internet Mail Extensions (S/MIME). The de facto Internet standard for encrypting Web-based information interchanges is Secure Sockets Layer (SSL), more recently known as Transport Layer Security (TLS). A public key infrastructure process can 'sign' a message: the sender's private key is applied to a hash of the message, and the result can then be verified against the sender's public key. This ensures the data's authenticity and origin without conferring privacy, and is called a 'digital signature'. The best protection against viruses is not opening e-mails from unknown sources or those containing unusual message headers. PMID:12554559
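The 'digital signature' process described above (hash the message, apply the private key, verify against the public key) can be illustrated with textbook RSA. The tiny hard-coded parameters below are for demonstration only and offer no security; real deployments use vetted libraries and padding schemes such as RSA-PSS.

```python
import hashlib

# Toy textbook-RSA parameters (tiny primes: for illustration only,
# never for real security).
p, q = 61, 53
n = p * q              # modulus 3233
e = 17                 # public exponent
d = 2753               # private exponent: d*e = 1 mod (p-1)(q-1)

def digest(msg: bytes) -> int:
    # Hash the message, then reduce into the RSA group.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    return pow(digest(msg), d, n)         # "sign" = hash^d mod n

def verify(msg: bytes, sig: int) -> bool:
    return pow(sig, e, n) == digest(msg)  # recover the hash with the public key

sig = sign(b"patient record v1")
```

Verification succeeds only if the signature was produced with the matching private key, which is exactly the authenticity-and-origin guarantee the abstract describes.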
NASA Astrophysics Data System (ADS)
Currie, L. A.; Klinedinst, D. B.; Burch, R.; Feltham, N.; Dorsch, R.
2000-10-01
There are twin pressures mounting in US industry for increased utilization of biomass feedstocks and biotechnology in production. The more demanding pressure relates to economic sustainability, that is, because of increased competition globally, businesses will fail unless a minimum margin of profit is maintained while meeting the demands of consumers for less expensive products. The second pressure relates to "Green Technology" where environmental sustainability, linked for example to concerns about climate change and the preservation of natural resources, represents a worldwide driving force to reduce the consumption of fossil hydrocarbons. The resulting transition of biomass production in the industrial plant, as opposed to the agricultural plant, has resulted in an increasing need for isotopic methods of authenticating and dating feedstocks, intermediates and industrial products. The research described represents a prototypical case study leading to the definition of a unique dual isotopic (13C, 14C) signature or "fingerprint" for a new biomass-based commercial polymer, polypropylene terephthalate (3GT).
Environmental Impact on Fossil Record for Palaeoecological Reconstruction Studies
NASA Astrophysics Data System (ADS)
Paraskevi, Chantzi; Elissavet, Dotsika; Brunella, Raco; Konstadinos, Albanakis; Anastasia, Poutouki; Eleni, Samarztidou
2016-10-01
Paleoecological studies play an important role in understanding past environmental, dietary, and societal changes, but they require the authentic signature of fossil materials; a significant part of such studies therefore concerns isolating the material's authentic matrix. Bone hydroxyapatite from different animal species from the archaeological site of Dispilio in the Kastoria Lake basin in northern Greece was subjected to mineral analysis to determine whether the bones are suitable for palaeoecological studies. Calcium, phosphorus, oxygen, and hydrogen are the main components of bone, providing the rigidity, hardness, and compressive strength of its structure. However, different bone structures yield different calcium-phosphate phases and different compositions, including Ca/P ratios. These disparities may be attributable to different physiological characteristics, the conditions under which the bones were formed, or the burial environment. Trace element analysis (Ca/P, Sr/P, Fe/Mn) concluded that the treated fossil bones retained their biochemical signal without any strong influence from the surrounding soil, though this does not imply that no chemical alteration has occurred.
Strict integrity control of biomedical images
NASA Astrophysics Data System (ADS)
Coatrieux, Gouenou; Maitre, Henri; Sankur, Bulent
2001-08-01
The control of the integrity and authentication of medical images is becoming ever more important within the Medical Information Systems (MIS). The intra- and interhospital exchange of images, such as in the PACS (Picture Archiving and Communication Systems), and the ease of copying, manipulation and distribution of images have brought forth the security aspects. In this paper we focus on the role of watermarking for MIS security and address the problem of integrity control of medical images. We discuss alternative schemes to extract verification signatures and compare their tamper detection performance.
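The paper compares schemes for extracting verification signatures; one simple baseline in this family (an assumption here, not necessarily the authors' scheme) is a fragile LSB watermark: hash the image content excluding the LSB plane, embed the hash bits in the LSBs, and flag any image whose recomputed hash no longer matches the embedded bits.

```python
import hashlib

def content_hash_bits(pixels):
    """Hash the image content ignoring the LSB plane (where the
    signature itself will live), then expand the hash to a bit list."""
    msb = bytes(p & 0xFE for p in pixels)
    h = hashlib.sha256(msb).digest()
    return [(byte >> i) & 1 for byte in h for i in range(8)]

def embed(pixels):
    """Write the 256 hash bits into the LSBs of the first 256 pixels."""
    bits = content_hash_bits(pixels)
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & 0xFE) | b
    return out

def verify(pixels):
    """Recompute the content hash and compare it with the stored bits."""
    bits = content_hash_bits(pixels)
    return all((pixels[i] & 1) == b for i, b in enumerate(bits))

image = [((i * 37) ^ (i >> 3)) & 0xFF for i in range(512)]  # dummy 'image'
marked = embed(image)
```

Any tamper that touches the content plane changes the recomputed hash and is detected; medical-grade schemes add robustness, keying, and localization on top of this basic idea.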
Mathematical and information maintenance of biometric systems
NASA Astrophysics Data System (ADS)
Boriev, Z.; Sokolov, S.; Nyrkov, A.; Nekrasova, A.
2016-04-01
This article describes different mathematical methods for processing biometric data. A brief overview of methods for recognizing a person by means of a signature is given. Mathematical solutions for a dynamic authentication method are considered, and recommendations on the use of particular mathematical methods, depending on the specific task, are provided. Based on the analysis conducted and the choice made in favor of wavelet analysis, a brief basis for its use in developing software for biometric personal identification is given with a view to practical application.
Scanner OPC signatures: automatic vendor-to-vendor OPE matching
NASA Astrophysics Data System (ADS)
Renwick, Stephen P.
2009-03-01
As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.
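The matching model itself is proprietary, but its statistical core can be hinted at with a toy: treat each scanner's CD-through-pitch curve as its OPE signature and fit a least-squares linear map from one curve to the other. All CD values and pitches below are invented for illustration.

```python
def fit_ope_offset(cd_ref, cd_other):
    """Least-squares fit cd_ref = a*cd_other + b: a linear 'matching'
    correction mapping one scanner's CD-through-pitch curve onto
    another's (an illustrative stand-in for the full OPC-matching model)."""
    n = len(cd_ref)
    mx = sum(cd_other) / n
    my = sum(cd_ref) / n
    sxx = sum((x - mx) ** 2 for x in cd_other)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cd_other, cd_ref))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Synthetic CD (nm) through pitch for two hypothetical scanners:
pitches  = [90, 100, 120, 150, 200, 300]
scanner1 = [45.0, 45.8, 46.5, 47.1, 47.6, 48.0]
scanner2 = [44.0, 44.8, 45.5, 46.1, 46.6, 47.0]   # same shape, 1 nm offset
a, b = fit_ope_offset(scanner1, scanner2)
```

A robust production version would add outlier checks and cross-validation, as the abstract's emphasis on statistical robustness suggests.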
A new approach for SSVEP detection using PARAFAC and canonical correlation analysis.
Tello, Richard; Pouryazdian, Saeed; Ferreira, Andre; Beheshti, Soosan; Krishnan, Sridhar; Bastos, Teodiano
2015-01-01
This paper presents a new way for automatic detection of SSVEPs through correlation analysis between tensor models. 3-way EEG tensor of channel × frequency × time is decomposed into constituting factor matrices using PARAFAC model. PARAFAC analysis of EEG tensor enables us to decompose multichannel EEG into constituting temporal, spectral and spatial signatures. SSVEPs characterized with localized spectral and spatial signatures are then detected exploiting a correlation analysis between extracted signatures of the EEG tensor and the corresponding simulated signatures of all target SSVEP signals. The SSVEP that has the highest correlation is selected as the intended target. Two flickers blinking at 8 and 13 Hz were used as visual stimuli and the detection was performed based on data packets of 1 second without overlapping. Five subjects participated in the experiments and the highest classification rate of 83.34% was achieved, leading to the Information Transfer Rate (ITR) of 21.01 bits/min.
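The final selection step described above (pick the stimulus whose simulated signature best correlates with the signature extracted from the EEG tensor) reduces to an argmax over Pearson correlations; the toy spectral signatures below are invented, not real EEG features:

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def detect_target(extracted, templates):
    """Pick the stimulus whose simulated signature correlates best
    with the signature extracted from the EEG tensor."""
    scores = {hz: pearson(extracted, t) for hz, t in templates.items()}
    return max(scores, key=scores.get)

# Toy spectral signatures over nine frequency bins (assumed layout):
templates = {
    8:  [0, 0, 1.0, 0.2, 0, 0, 0, 0.1, 0],   # peak in the 8 Hz bin
    13: [0, 0, 0.1, 0, 0, 0, 0, 1.0, 0.2],   # peak in the 13 Hz bin
}
observed = [0.05, 0.1, 0.9, 0.25, 0.1, 0.0, 0.05, 0.15, 0.1]
target = detect_target(observed, templates)
```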
Spectral signature of alpine snow cover from the Landsat Thematic Mapper
NASA Technical Reports Server (NTRS)
Dozier, Jeff
1989-01-01
In rugged terrain, snow in the shadows can appear darker than soil or vegetation in the sunlight, making it difficult to interpret satellite data images of rugged terrains. This paper discusses methods for using Thematic Mapper (TM) and SPOT data for automatic analyses of alpine snow cover. Typical spectral signatures of the Landsat TM are analyzed for a range of snow types, atmospheric profiles, and topographic illumination conditions. A number of TM images of Sierra Nevada are analyzed to distinguish several classes of snow from other surface covers.
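A minimum-distance classifier over band reflectances is the simplest form of the signature-based discrimination discussed above; the mean signatures below are invented placeholders, not values from the paper.

```python
def classify(pixel, signatures):
    """Assign the pixel to the class whose mean spectral signature
    is nearest in Euclidean distance (minimum-distance classifier)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(signatures, key=lambda cls: dist2(pixel, signatures[cls]))

# Invented mean reflectances in three TM bands (illustrative only).
# Shadowed snow is dim overall, which is why a per-class signature,
# not raw brightness, is needed to separate it from sunlit soil:
signatures = {
    "snow_sunlit":   [0.90, 0.85, 0.10],   # bright, but dark in SWIR
    "snow_shadowed": [0.30, 0.28, 0.04],
    "vegetation":    [0.10, 0.45, 0.25],
    "soil":          [0.20, 0.30, 0.35],
}
label = classify([0.28, 0.26, 0.05], signatures)
```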
Invariants of polarization transformations.
Sadjadi, Firooz A
2007-05-20
The use of polarization-sensitive sensors is being explored in a variety of applications. Polarization diversity has been shown to improve the performance of the automatic target detection and recognition in a significant way. However, it also brings out the problems associated with processing and storing more data and the problem of polarization distortion during transmission. We present a technique for extracting attributes that are invariant under polarization transformations. The polarimetric signatures are represented in terms of the components of the Stokes vectors. Invariant algebra is then used to extract a set of signature-related attributes that are invariant under linear transformation of the Stokes vectors. Experimental results using polarimetric infrared signatures of a number of manmade and natural objects undergoing systematic linear transformations support the invariancy of these attributes.
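One standard example of such an invariant, consistent with the Stokes-vector framing above: the degree of polarization is unchanged when the measurement basis is rotated, a linear transformation that rotates (Q, U) by twice the rotation angle and leaves I and V fixed.

```python
import math

def degree_of_polarization(s):
    """DOP = sqrt(Q^2 + U^2 + V^2) / I for a Stokes vector (I, Q, U, V)."""
    i, q, u, v = s
    return math.sqrt(q * q + u * u + v * v) / i

def rotate_basis(s, theta):
    """Rotating the measurement basis by theta rotates (Q, U) by 2*theta
    and leaves I and V unchanged (a standard linear Stokes transform)."""
    i, q, u, v = s
    c, sn = math.cos(2 * theta), math.sin(2 * theta)
    return [i, c * q + sn * u, -sn * q + c * u, v]

s = [1.0, 0.3, -0.2, 0.1]
dop_before = degree_of_polarization(s)
dop_after = degree_of_polarization(rotate_basis(s, 0.7))
```

Because the rotation preserves Q^2 + U^2, the DOP is one attribute that survives this family of linear transformations, in the spirit of the invariant algebra the abstract describes.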
Perceptual quality prediction on authentically distorted images using a bag of features approach
Ghadiyaram, Deepti; Bovik, Alan C.
2017-01-01
Current top-performing blind perceptual image quality prediction models are generally trained on legacy databases of human quality opinion scores on synthetically distorted images. Therefore, they learn image features that effectively predict human visual quality judgments of inauthentic and usually isolated (single) distortions. However, real-world images usually contain complex composite mixtures of multiple distortions. We study the perceptually relevant natural scene statistics of such authentically distorted images in different color spaces and transform domains. We propose a “bag of feature maps” approach that avoids assumptions about the type of distortion(s) contained in an image and instead focuses on capturing consistencies—or departures therefrom—of the statistics of real-world images. Using a large database of authentically distorted images, human opinions of them, and bags of features computed on them, we train a regressor to conduct image quality prediction. We demonstrate the competence of the features toward improving automatic perceptual quality prediction by testing a learned algorithm using them on a benchmark legacy database as well as on a newly introduced distortion-realistic resource called the LIVE In the Wild Image Quality Challenge Database. We extensively evaluate the perceptual quality prediction model and algorithm and show that it is able to achieve good-quality prediction power that is better than other leading models. PMID:28129417
Authentication Sensing System Using Resonance Evaluation Spectroscopy (ASSURES)
NASA Astrophysics Data System (ADS)
Trolinger, James D.; Dioumaev, Andrei K.; Lal, Amit K.; Dimas, Dave
2017-08-01
This paper describes an ongoing instrument development project to distinguish genuine manufactured components from counterfeit components; we call the instrument ASSURES (Authentication Sensing System Using Resonance Evaluation Spectroscopy). The system combines Laser Doppler Vibrometry with acoustical resonance spectroscopy, augmented with finite element analysis. Vibrational properties of components, such as resonant modes, damping, and spectral frequency response to various forcing functions, depend strongly upon the mechanical properties of the material, including its size, shape, internal hardness, tensile strength, alloy/composite compositions, flaws, defects, and other internal material properties. Although acoustic resonant spectroscopy has seen limited application, the information-rich signals in the vibrational spectra of objects provide a pathway to many new applications. Components with the same shape but made of different materials, or with different fatigue histories, damage, tampering, or heat treatment, will respond differently to high frequency stimulation. Laser Doppler Vibrometry offers high sensitivity and frequency bandwidth to measure the component's frequency spectrum, and overcomes many issues that limit conventional acoustical resonance spectroscopy, since the sensor laser beam can be aimed anywhere along the part, as well as to multiple locations on a part, in a non-contact way. ASSURES is especially promising for use in additive manufacturing technology by providing signatures as digital codes that are unique to specific objects and even to specific locations on objects. We believe that such signatures can be employed to address many important issues in the manufacturing industry, including ensuring that the part meets the often very rigid specifications of the customer and detecting non-visible internal manufacturing defects or non-visible damage that has occurred after manufacturing.
Gerhardt, Natalie; Birkenmeier, Markus; Schwolow, Sebastian; Rohn, Sascha; Weller, Philipp
2018-02-06
This work describes a simple approach for the untargeted profiling of volatile compounds for the authentication of the botanical origins of honey based on resolution-optimized HS-GC-IMS combined with optimized chemometric techniques, namely PCA, LDA, and kNN. A direct comparison of the PCA-LDA models between the HS-GC-IMS and ¹H NMR data demonstrated that HS-GC-IMS profiling could be used as a complementary tool to NMR-based profiling of honey samples. Whereas NMR profiling still requires comparatively precise sample preparation, pH adjustment in particular, HS-GC-IMS fingerprinting may be considered an alternative approach for a truly fully automatable, cost-efficient, and in particular highly sensitive method. It was demonstrated that all tested honey samples could be distinguished on the basis of their botanical origins. Loading plots revealed the volatile compounds responsible for the differences among the monofloral honeys. The HS-GC-IMS-based PCA-LDA model was composed of two linear functions of discrimination and 10 selected PCs that discriminated canola, acacia, and honeydew honeys with a predictive accuracy of 98.6%. Application of the LDA model to an external test set of 10 authentic honeys clearly proved the high predictive ability of the model by correctly classifying them into three variety groups with 100% correct classifications. The constructed model presents a simple and efficient method of analysis and may serve as a basis for the authentication of other food types.
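The kNN step mentioned above amounts to a majority vote among the nearest samples in a reduced score space; the 2-D scores below are invented placeholders, not measured honey data:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Majority vote among the k nearest training samples
    (Euclidean distance in PCA-score space)."""
    by_dist = sorted(train, key=lambda item:
                     sum((a - b) ** 2 for a, b in zip(item[0], query)))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Invented 2-D PCA scores for three honey varieties:
train = [
    ([0.9, 0.1], "canola"), ([1.0, 0.2], "canola"), ([0.8, 0.0], "canola"),
    ([-0.8, 0.9], "acacia"), ([-0.9, 1.0], "acacia"), ([-1.0, 0.8], "acacia"),
    ([0.1, -1.0], "honeydew"), ([0.0, -0.9], "honeydew"), ([0.2, -1.1], "honeydew"),
]
variety = knn_predict(train, [0.85, 0.15], k=3)
```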
An Identity Based Key Exchange Protocol in Cloud Computing
NASA Astrophysics Data System (ADS)
Molli, Venkateswara Rao; Tiwary, Omkar Nath
2012-10-01
Workflow systems often use delegation to enhance the flexibility of authorization; delegation transfers privileges among users across different administrative domains and facilitates information sharing. We present an independently verifiable delegation mechanism, in which a delegation credential can be verified without the participation of domain administrators. This protocol, called role-based cascaded delegation (RBCD), supports simple and efficient cross-domain delegation of authority. RBCD enables a role member to create delegations based on the dynamic needs of collaboration; in the meantime, a delegation chain can be verified by anyone without the participation of role administrators. We also propose the Measurable Risk Adaptive decentralized Role-based Delegation framework to address this problem, and we describe an efficient realization of RBCD using aggregate signatures, in which the authentication information for an arbitrarily long role-based delegation chain is captured by one short signature of constant size. The protocol is general and can be realized by any signature scheme; we describe a specific realization with a hierarchical certificate-based encryption scheme that yields compact delegation credentials.
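Setting the aggregate-signature machinery aside, per-link verification of a delegation chain can be sketched as follows; HMAC stands in here for a real public-key signature scheme, and all keys and identities are invented:

```python
import hmac, hashlib

def sign(key: bytes, payload: bytes) -> bytes:
    # HMAC stands in for a real public-key signature scheme.
    return hmac.new(key, payload, hashlib.sha256).digest()

def delegate(chain, delegator_key, delegate_id: bytes):
    """Extend the chain: the current holder signs the next delegate's
    identity together with the previous link, binding the whole chain."""
    prev = chain[-1][1] if chain else b"root"
    return chain + [(delegate_id, sign(delegator_key, delegate_id + prev))]

def verify_chain(chain, keys):
    """Anyone holding the verification keys can check every link
    without contacting the role administrators."""
    prev = b"root"
    for (who, sig), key in zip(chain, keys):
        if not hmac.compare_digest(sig, sign(key, who + prev)):
            return False
        prev = sig
    return True

k_admin, k_alice = b"admin-key", b"alice-key"
chain = delegate([], k_admin, b"alice")    # admin delegates to alice
chain = delegate(chain, k_alice, b"bob")   # alice re-delegates to bob
ok = verify_chain(chain, [k_admin, k_alice])
```

In the RBCD construction the per-link signatures are additionally aggregated into one constant-size signature, which this stand-in does not attempt to reproduce.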
Automatic identification of bullet signatures based on consecutive matching striae (CMS) criteria.
Chu, Wei; Thompson, Robert M; Song, John; Vorburger, Theodore V
2013-09-10
The consecutive matching striae (CMS) numeric criteria for firearm and toolmark identifications have been widely accepted by forensic examiners, although there have been questions concerning its observer subjectivity and limited statistical support. In this paper, based on signal processing and extraction, a model for the automatic and objective counting of CMS is proposed. The position and shape information of the striae on the bullet land is represented by a feature profile, which is used for determining the CMS number automatically. Rapid counting of CMS number provides a basis for ballistics correlations with large databases and further statistical and probability analysis. Experimental results in this report using bullets fired from ten consecutively manufactured barrels support this developed model. Published by Elsevier Ireland Ltd.
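A minimal version of the automatic CMS count described above: mark each stria in one profile as matched if the other profile has a stria within tolerance, then take the longest consecutive run of matches. The stria positions and tolerance below are invented, not forensic data:

```python
def cms_count(striae_a, striae_b, tol=0.5):
    """Longest run of consecutive striae in profile A that each find a
    match (within `tol`, e.g. micrometres) in profile B: the CMS number."""
    matched = [any(abs(a - b) <= tol for b in striae_b) for a in striae_a]
    best = run = 0
    for m in matched:
        run = run + 1 if m else 0
        best = max(best, run)
    return best

# Invented stria positions along the bullet land (arbitrary units):
profile_a = [1.0, 2.1, 3.0, 4.2, 6.0, 7.1, 8.0]
profile_b = [1.1, 2.0, 3.2, 5.5, 7.0, 8.1]
n = cms_count(profile_a, profile_b, tol=0.3)
```

The published model works on extracted feature profiles rather than raw position lists, but the counting logic it automates reduces to this kind of run-length computation.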
Automatic location of L/H transition times for physical studies with a large statistical basis
NASA Astrophysics Data System (ADS)
González, S.; Vega, J.; Murari, A.; Pereira, A.; Dormido-Canto, S.; Ramírez, J. M.; contributors, JET-EFDA
2012-06-01
Completely automatic techniques to estimate and validate L/H transition times can be essential in L/H transition analyses. The generation of databases with hundreds of transition times, without human intervention, is an important step towards (a) L/H transition physics analysis, (b) validation of L/H theoretical models and (c) creation of L/H scaling laws. An entirely unattended methodology is presented in this paper to build large databases of transition times in JET using time series. The proposed technique has been applied to a dataset of 551 JET discharges between campaigns C21 and C26. For discharges that show a clear signature in the time series, a prediction is made through the locating properties of the wavelet transform; the prediction is accurate, with an uncertainty interval of ±3.2 ms. Discharges without a clear pattern in the time series are handled by an L/H mode classifier trained on the discharges with a clear signature; in this case, the estimation error shows a distribution with mean 27.9 ms and standard deviation 37.62 ms. Two different regression methods have been applied to the measurements acquired at the transition times identified by the automatic system. The resulting scaling laws for the threshold power are not significantly different from those obtained using data at the transition times determined manually by experts. The automatic methods make it possible to perform physical studies with a large number of discharges, showing, for example, that there are statistically different types of transitions characterized by different scaling laws.
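A crude stand-in for the locating step can be sketched in a few lines: find the time of the sharpest jump in a diagnostic trace. The wavelet transform used in the paper localizes such singularities far more robustly; this simple first-difference version only illustrates the idea, and `locate_transition` and its sampling step are assumptions:

```python
def locate_transition(signal, dt=0.001):
    """Return the time (in seconds) of the largest sample-to-sample jump
    in `signal`, a naive proxy for the wavelet's locating property."""
    jumps = [abs(b - a) for a, b in zip(signal, signal[1:])]
    return jumps.index(max(jumps)) * dt

# Synthetic trace: a quiet L-mode level, then an abrupt rise at sample 500.
trace = [0.1] * 500 + [1.0] * 500
print(locate_transition(trace))  # 0.499, the last sample before the jump
```

On real JET signals, noise makes a single-difference detector unreliable, which is exactly why the multiscale wavelet approach (and the classifier fallback for unclear discharges) is needed.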
Lavan, Nadine; Lima, César F; Harvey, Hannah; Scott, Sophie K; McGettigan, Carolyn
2015-01-01
It is well established that categorising the emotional content of facial expressions may differ depending on contextual information. Whether this malleability is observed in the auditory domain and in genuine emotion expressions is poorly explored. We examined the perception of authentic laughter and crying in the context of happy, neutral and sad facial expressions. Participants rated the vocalisations on separate unipolar scales of happiness and sadness and on arousal. Although they were instructed to focus exclusively on the vocalisations, consistent context effects were found: For both laughter and crying, emotion judgements were shifted towards the information expressed by the face. These modulations were independent of response latencies and were larger for more emotionally ambiguous vocalisations. No effects of context were found for arousal ratings. These findings suggest that the automatic encoding of contextual information during emotion perception generalises across modalities, to purely non-verbal vocalisations, and is not confined to acted expressions.
Static Signature Synthesis: A Neuromotor Inspired Approach for Biometrics.
Ferrer, Miguel A; Diaz-Cabrera, Moises; Morales, Aythami
2015-03-01
In this paper we propose a new method for generating synthetic handwritten signature images for biometric applications. The procedures we introduce imitate the mechanism of motor equivalence, which divides human handwriting into two steps: the working out of an effector-independent action plan, and its execution via the corresponding neuromuscular path. The action plan is represented as a trajectory on a spatial grid, containing both the signature text and its flourish, if there is one. The neuromuscular path is simulated by applying a kinematic Kaiser filter to the trajectory plan. The length of the filter depends on the pen speed, which is generated using a scalar version of the sigma-lognormal model. An ink deposition model, applied pixel by pixel to the pen trajectory, provides realistic static signature images. The lexical and morphological properties of the synthesized signatures, as well as the range of the synthesis parameters, have been estimated from real signature databases such as the MCYT Off-line and GPDS960GraySignature corpora. The performance experiments show that by tuning only four parameters it is possible to generate synthetic identities with different stability and forgers with different skills. It is therefore possible to create datasets of synthetic signatures with a performance similar to databases of real signatures. Moreover, the created dataset can be customized to produce skilled forgeries or simple forgeries that are easier to detect, depending on what the researcher needs. Perceptual evaluation gives an average confusion of 44.06 percent between real and synthetic signatures, which shows the realism of the synthetic ones. The utility of the synthesized signatures is demonstrated by studying the influence of the pen type and number of users on an automatic signature verifier.
Using Quantum Confinement to Uniquely Identify Devices
Roberts, J.; Bagci, I. E.; Zawawi, M. A. M.; Sexton, J.; Hulbert, N.; Noori, Y. J.; Young, M. P.; Woodhead, C. S.; Missous, M.; Migliorato, M. A.; Roedig, U.; Young, R. J.
2015-01-01
Modern technology unintentionally provides resources that enable the trust of everyday interactions to be undermined. Some authentication schemes address this issue using devices that give a unique output in response to a challenge. These signatures are generated by hard-to-predict physical responses derived from structural characteristics, which lend themselves to two different architectures, known as unique objects (UNOs) and physically unclonable functions (PUFs). The classical design of UNOs and PUFs limits their size and, in some cases, their security. Here we show that quantum confinement lends itself to the provision of unique identities at the nanoscale, by using fluctuations in tunnelling measurements through quantum wells in resonant tunnelling diodes (RTDs). This provides an uncomplicated measurement of identity without conventional resource limitations whilst providing robust security. The confined energy levels are highly sensitive to the specific nanostructure within each RTD, resulting in a distinct tunnelling spectrum for every device, as they contain a unique and unpredictable structure that is presently impossible to clone. This new class of authentication device operates with minimal resources in simple electronic structures above room temperature. PMID:26553435
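The identity extraction can be pictured as quantizing the device-specific fluctuations of a measured tunnelling spectrum into a bit string. The sketch below is an illustrative assumption (a 3-point moving-average threshold), not the paper's measurement protocol:

```python
def spectrum_to_id(currents):
    """Turn a sampled tunnelling spectrum into a bit-string identity:
    each bit records whether a sample sits above or below its local
    3-point moving average, capturing device-specific fluctuations."""
    bits = []
    for i in range(1, len(currents) - 1):
        local = (currents[i - 1] + currents[i] + currents[i + 1]) / 3
        bits.append("1" if currents[i] > local else "0")
    return "".join(bits)

# Two hypothetical RTDs measured over the same bias sweep give
# different fingerprints because their confined levels differ.
iv_a = [1.0, 1.3, 1.1, 1.6, 1.2, 1.8]
iv_b = [1.0, 1.1, 1.4, 1.2, 1.7, 1.5]
assert spectrum_to_id(iv_a) != spectrum_to_id(iv_b)
```

A deployed scheme would also need error-correction against measurement noise so that repeated readings of the same device reproduce the same identity.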
Eat-by-light fiber-optic and micro-optic devices for food quality and safety assessment
NASA Astrophysics Data System (ADS)
Mignani, A. G.; Ciaccheri, L.; Cucci, C.; Mencaglia, A. A.; Cimato, A.; Attilio, C.; Thienpont, H.; Ottevaere, H.; Paolesse, R.; Mastroianni, M.; Monti, D.; Buonocore, G.; Del Nobile, A.; Mentana, A.; Grimaldi, M. F.; Dall'Asta, C.; Faccini, A.; Galaverna, G.; Dossena, A.
2007-06-01
A selection is presented of fiber-optic and micro-optic devices that have been designed and tested for guaranteeing the quality and safety of typical foods, such as extra virgin olive oil, beer, and milk. Scattered colorimetry is used to authenticate various types of extra virgin olive oil and beer, while a fiber-optic-based device for UV-VIS-NIR absorption spectroscopy is exploited in order to obtain the hyperspectral optical signature of olive oil. This is done not only for authentication purposes, but also so as to correlate the spectral data with the content of fatty acids, which are important nutritional factors. A micro-optic sensor for the detection of olive oil aroma that is capable of distinguishing different ageing levels of extra virgin olive oil is also presented. It shows effective potential for acting as a smart cap of bottled olive oil in order to achieve a non-destructive olfactory perception of oil ageing. Lastly, a compact portable fluorometer for the rapid monitoring of the carcinogenic aflatoxin M1 in milk is demonstrated.
Eat-by-light: fiber-optic and micro-optic devices for food safety and quality assessment
NASA Astrophysics Data System (ADS)
Mignani, A. G.; Ciaccheri, L.; Cucci, C.; Mencaglia, A. A.; Cimato, A.; Attilio, C.; Thienpont, H.; Ottevaere, H.; Paolesse, R.; Mastroianni, M.; Monti, D.; Buonocore, G.; Del Nobile, A.; Mentana, A.; Dall'Asta, C.; Faccini, A.; Galaverna, G.; Dossena, A.
2007-07-01
A selection of fiber-optic and micro-optic devices, designed and tested for monitoring the quality and safety of typical foods, namely extra virgin olive oil, beer, and milk, is presented. Scattered colorimetry is used for the authentication of various types of extra virgin olive oil and beer, while a fiber-optic-based device for UV-VIS-NIR absorption spectroscopy is exploited in order to obtain the hyperspectral optical signature of olive oil. This is done not only for authentication purposes, but also so as to correlate the spectral data with the content of fatty acids that are important nutritional factors. A micro-optic sensor for the detection of olive oil aroma is presented; it is capable of distinguishing different ageing levels of extra virgin olive oil. It shows effective potential for acting as a smart cap of bottled olive oil in order to achieve a non-destructive olfactory perception of oil ageing. Lastly, a compact portable fluorometer for the rapid monitoring of the carcinogenic aflatoxin M1 in milk is demonstrated.
2018-01-01
Vehicular ad hoc networks (VANETs) are a promising network scenario for greatly improving traffic efficiency and safety, in which smart vehicles can communicate with other vehicles or roadside units. For VANETs to be viable, it is very important to address their security and privacy problems. In this paper, based on certificateless cryptography and elliptic curve cryptography, we present a certificateless signature with message recovery (CLS-MR), which we believe is of independent interest. A practical certificateless conditional privacy-preserving authentication (PCPA) scheme is then proposed by incorporating the CLS-MR scheme. Furthermore, the security analysis shows that PCPA satisfies all security and privacy requirements. The evaluation results indicate that PCPA achieves low computation and communication costs because it requires no bilinear pairing or map-to-point hash operations. Moreover, extensive simulations show that PCPA is feasible and performs well in terms of message delay and message loss ratio, and is thus more suitable for the deployment and adoption of VANETs. PMID:29762511
Evans, Jane A; Pashley, Vanessa; Richards, Gemma J; Brereton, Nicola; Knowles, Toby G
2015-12-15
This paper presents lead (Pb) isotope data from samples of farm livestock raised in three areas of Britain that have elevated natural Pb levels: Central Wales, the Mendips and the Derbyshire Peak District. The study highlights three important observations: first, the Pb found in modern British meat from these three areas is geogenic and shows no clear evidence of a modern tetraethyl anthropogenic Pb contribution; second, the generally excellent match between the biological samples and the ore field data, particularly for the Mendip and Welsh data, suggests that this technique might, under favourable conditions, be used to provenance biological products to specific ore sites; and third, modern systems reflect a process of biosphere averaging, analogous to cultural focusing in human archaeological studies, in which biological averaging leads to a homogenised isotope signature with increasing Pb concentration. Copyright © 2015. Published by Elsevier B.V.
Real time biometric surveillance with gait recognition
NASA Astrophysics Data System (ADS)
Mohapatra, Subasish; Swain, Anisha; Das, Manaswini; Mohanty, Subhadarshini
2018-04-01
Biometric surveillance has become indispensable for every system in recent years. Biometric authentication, identification and screening are widely used in various domains for preventing unauthorized access. A large amount of data needs to be updated, segregated and safeguarded from malicious software and misuse. Biometrics are the intrinsic characteristics of each individual. Fingerprints, iris scans, passwords, unique keys and cards are commonly used for authentication purposes, but these methods have various issues related to security and confidentiality, and such systems are not yet sufficiently automated to provide safety and security. The gait recognition system is an alternative that overcomes these drawbacks of current biometric authentication systems. Gait recognition is newer and has not yet been implemented in real-world scenarios. It is an unintrusive system that requires no knowledge or cooperation of the subject. Gait is a unique behavioral characteristic of every human being that is hard to imitate: the walking style of an individual, combined with the orientation of the joints in the skeletal structure and the inclinations between them, imparts this unique characteristic. A person can alter their external appearance but not their skeletal structure. These are real-time, automatic systems that can even process low-resolution images and video frames. In this paper, we propose a gait recognition system and compare its performance with conventional biometric identification systems.
Novel continuous authentication using biometrics
NASA Astrophysics Data System (ADS)
Dubey, Prakash; Patidar, Rinku; Mishra, Vikas; Norman, Jasmine; Mangayarkarasi, R.
2017-11-01
We explore whether a classifier can consistently verify clients interacting with the computer, using the camera and the behavior of users. In this paper we propose a new way of authenticating users that captures many images of the user at random times and analyses their touch-biometric behavior. In this system, the touch behavior of a client/user recorded during an enrolment stage is stored in the database, and the mean behavior over equal partitions of time is then checked against it. This touch behavior allows the system to accept or reject the user, making biometric authentication more accurate in use. The planned workflow is as follows: the user is asked a single time to allow a picture to be taken before login; the system then takes images of the user automatically, without further permission, and stores them in the database. These images are compared with the existing image of the user, and acceptance or rejection depends on the comparison. The user's touch behavior continues to be stored as the number of touches made in equal intervals of time. Together, the touch behavior and images finally authenticate the user automatically.
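The touch-cadence check described above can be sketched as a simple comparison of mean touches per time slice. The function name, the relative tolerance, and the accept/reject rule are illustrative assumptions; the paper's system would combine this with the face-image comparison:

```python
from statistics import mean

def verify_touch_cadence(enrolled_counts, session_counts, tol=0.25):
    """Accept the session if its mean touches-per-interval lies within
    a relative tolerance `tol` of the enrolled mean."""
    e, s = mean(enrolled_counts), mean(session_counts)
    return abs(s - e) / e <= tol

enrolled = [12, 14, 13, 15, 12]   # touches per equal time slice at enrolment
print(verify_touch_cadence(enrolled, [13, 12, 14]))  # True: similar cadence
print(verify_touch_cadence(enrolled, [3, 4, 2]))     # False: reject session
```

Real continuous-authentication systems score richer features (pressure, swipe geometry, timing distributions) rather than a single mean, but the enrol-then-compare structure is the same.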
Automatic Target Recognition for Hyperspectral Imagery
2012-03-01
[Extraction fragment from the report's list of figures and body text. Recoverable figure captions: b) NDVI representation; Vegetation Reflectance Spectra, taken directly from (Eismann, 2011); Example NDVI Mean and Shade Spectrum Signatures. Recoverable text: to locate vegetation within an image, the normalized-difference vegetation index (NDVI) is applied; NDVI was first introduced by Rouse et al. while monitoring […]]
Stability across environments of the coffee variety near infrared spectral signature.
Posada, H; Ferrand, M; Davrieux, F; Lashermes, P; Bertrand, B
2009-02-01
A previous study on food plants showed that near infrared (NIR) spectral methods seem effective for authenticating coffee varieties. We confirm that result, but also show that inter-variety differences are not stable from one harvest to the next. We put forward the hypothesis that the spectral signature is affected by environmental factors. The purpose of this study was to find a way of reducing this environmental variance to increase the method's reliability and to enable practical application in breeding. Spectral collections were obtained from ground green coffee samples from multilocation trials. Two harvests of bean samples from 11 homozygous introgressed lines, and the cv 'Caturra' as the control, supplied from three different sites, were compared. For each site, squared Euclidean distances among the 12 varieties were estimated from the NIR spectra. Matrix correlation coefficients were assessed by the Mantel test. We obtained very good stability (high correlations) for inter-variety differences across the sites when using data from the two harvests. If only the most heritable zones of the spectrum were used, there was a marked improvement in the efficiency of the method. This improvement was achieved by treating the spectrum as a succession of phenotypic variables, each resulting from an environmental and genetic effect. Heritabilities were calculated with confidence intervals. A near infrared spectroscopy signature, acquired over a set of harvests, can therefore effectively characterize a coffee variety. We indicated how this typical signature can be used in breeding to assist in selection.
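The distance-matrix comparison at the heart of this analysis can be sketched in a few lines: squared Euclidean distances between variety spectra at each site, then the Pearson correlation between the two distance matrices (the Mantel test adds a permutation step on top of this correlation). Function names and the toy spectra are illustrative assumptions:

```python
def sq_dist_matrix(spectra):
    """Squared Euclidean distances between every pair of variety spectra."""
    n = len(spectra)
    return [[sum((a - b) ** 2 for a, b in zip(spectra[i], spectra[j]))
             for j in range(n)] for i in range(n)]

def matrix_correlation(d1, d2):
    """Pearson correlation of the upper triangles of two distance matrices."""
    x = [d1[i][j] for i in range(len(d1)) for j in range(i + 1, len(d1))]
    y = [d2[i][j] for i in range(len(d2)) for j in range(i + 1, len(d2))]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

site1 = [[0.1, 0.2, 0.3], [0.4, 0.2, 0.1], [0.1, 0.5, 0.2]]  # 3 toy varieties
site2 = [[0.2, 0.3, 0.4], [0.5, 0.3, 0.2], [0.2, 0.6, 0.3]]  # same + offset
print(round(matrix_correlation(sq_dist_matrix(site1), sq_dist_matrix(site2)), 3))
```

Here the second site is the first plus a uniform environmental offset, so the inter-variety distances, and hence the matrix correlation, are preserved: a high correlation indicates a stable varietal signature across sites.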
DEVELOPMENT OF A CERAMIC TAMPER INDICATING SEAL: SRNL CONTRIBUTIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krementz, D.; Brinkman, K.; Martinez-Rodriguez, M.
2013-06-03
Savannah River National Laboratory (SRNL) and Sandia National Laboratories (SNL) are collaborating on development of a Ceramic Seal, also sometimes designated the Intrinsically Tamper Indicating Ceramic Seal (ITICS), which is a tamper indicating seal for international safeguards applications. The Ceramic Seal is designed to be a replacement for metal loop seals that are currently used by the IAEA and other safeguards organizations. The Ceramic Seal has numerous features that enhance the security of the seal, including a frangible ceramic body, protective and tamper indicating coatings, an intrinsic unique identifier using Laser Surface Authentication, electronics incorporated into the seal that provide cryptographic seal authentication, and user-friendly seal wire capture. A second generation prototype of the seal is currently under development whose seal body is of Low Temperature Co-fired Ceramic (LTCC) construction. SRNL has developed the mechanical design of the seal in an iterative process incorporating comments from the SNL vulnerability review team. SRNL is developing fluorescent tamper indicating coatings, with recent development focusing on optimizing the durability of the coatings and working with a vendor to develop a method to apply coatings on a 3-D surface. SRNL performed a study on the effects of radiation on the electronics of the seal and possible radiation shielding techniques to minimize the effects. SRNL is also investigating implementation of Laser Surface Authentication (LSA) as a means of unique identification of each seal and the effects of the surface coatings on the LSA signature.
Mishra, Priyanka; Shukla, Ashutosh K; Sundaresan, Velusamy
2018-01-01
Senna alexandrina (Fabaceae) is a medicinal plant globally recognized for its laxative properties, the only source of sennosides, and a highly exported bulk herb from India. Its procurement is almost exclusively from limited cultivation, which creates risks of deliberate or unintended adulteration. The raw materials on the market are in powdered or finished-product form, which makes authentication difficult. Here, DNA barcode tags based on chloroplast genes (rbcL and matK) and intergenic spacers (psbA-trnH and ITS) were developed for S. alexandrina along with the allied species. The ability and performance of the ITS1 region to discriminate among the Senna species led to the present proposal of ITS1 tags as a successful barcode. Further, these tags were coupled with high-resolution melting (HRM) curve analysis in a real-time PCR genotyping method to derive Bar-HRM (Barcoding-HRM) assays. Suitable HRM primer sets were designed through SNP detection and mutation scanning in genomic signatures of Senna species. The melting profiles of S. alexandrina and S. italica subsp. micrantha were almost identical, while the remaining five species were clearly separated and can be differentiated by the HRM method. The sensitivity of the method was used to authenticate market samples [Herbal Sample Assays (HSAs)]. HSA01 (an S. alexandrina crude drug sample from Bangalore) and HSA06 (an S. alexandrina crude drug sample from Tuticorin, Tamil Nadu, India) were found to be highly contaminated with S. italica subsp. micrantha. Species admixture samples mixed in varying percentages were identified sensitively, with detection of contamination as low as 1%. The melting profiles of PCR amplicons are clearly distinct, which enables the authentic differentiation of species by the HRM method. This study reveals that DNA barcoding coupled with HRM is an efficient molecular tool to authenticate Senna herbal products in the market for quality control in the drug supply chain. CIMAP Communication Number: CIMAP/PUB/2017/31.
Laser surface alloying of coins for authenticity
NASA Astrophysics Data System (ADS)
Liu, Zhu; Watkins, Kenneth G.; Steen, William M.; Hatherley, P. G.
1997-08-01
This paper presents an exploratory investigation into the feasibility of using a laser surface alloying technique to produce designs in the surface of coinage blanks. The specific aim of the work concerns the production of design features in coins that are difficult to produce by other techniques, and which hence act as a barrier to forgery, and features which permit automatic recognition in vending machines, particularly as a means of establishing the authenticity of the coins. Coins in many countries today are commonly manufactured from metal composites, where one substrate metal or alloy is coated with another by a process of electrodeposition or by mechanical bonding. The technique described here entails the use of a high power CO2 laser to bring about localized melting of the two layers. Visible distinction between alloyed and unalloyed regions, or differences in other physical properties such as conductivity or magnetic properties, can be obtained. The work also involved a fundamental study of the influence of the thermal properties of the materials on the CO2 laser alloying process. It was found that thermal properties such as the thermal conductivity of the substrate materials, and the difference in melting points between the coating layer and the substrate materials, played an important role in the process. Laser control variables required for localized alloying for different substrate and coating types were determined. The influence of both thermal properties and laser control variables on alloy type and alloy depth was investigated. Initial work on coin validation showed promising results for automatic recognition of laser-treated coins.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Zachary C.; Stephenson, David E.; Christ, Josef F.
The significant rise of additive manufacturing (AM) in recent years is in part due to the open sourced nature of the printing processes and reduced cost and capital barriers relative to traditional manufacturing. However, this democratization of manufacturing spurs an increased demand for producers and end-users to verify the authenticity and quality of individual parts. To this end, we introduce an anti-counterfeiting method composed of first embedding engineered nanomaterials into features of a 3D-printed part followed by non-destructive interrogation of these features to quantify a chemical signature profile. The part-specific chemical signature data is then linked to a securitized, distributed, and time-stamped blockchain ledger entry. To demonstrate the utility of this approach, lanthanide-aspartic acid nanoscale coordination polymers (Ln3+-Asp NCs) / poly(lactic) acid (PLA) composites were formulated and transformed into a filament feedstock for fused deposition modeling (FDM) 3D printing. In the present case, a quick-response (QR) code containing the doped Ln3+-Asp NCs was printed using a dual-extruder FDM printer into pure PLA parts. The QR code provides a searchable reference to an Ethereum-based blockchain entry. The QR code physical features also serve as defined areas to probe the signatures arising from the embedded Ln3+-Asp NCs. Visible fluorescence emission with UV-excitation was quantified in terms of color using a smartphone camera and incorporated into blockchain entries. Ultimately, linking unique chemical signature data to blockchain databases is anticipated to make the costs of counterfeiting AM materials significantly more prohibitive and transactions between those in the supply chain more trustworthy.
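The linking step can be pictured as hashing the measured chemical signature into a verifiable, time-stamped record. The sketch below is a minimal stand-in: the part identifier, field names, and RGB reading are illustrative assumptions, and a real deployment would anchor the digest in an Ethereum transaction rather than return it:

```python
import hashlib
import json

def ledger_entry(part_id, fluorescence_rgb, timestamp):
    """Build a time-stamped record of a part's chemical signature and a
    SHA-256 digest over it that any verifier can recompute."""
    payload = {"part": part_id,
               "signature_rgb": fluorescence_rgb,
               "time": timestamp}
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return payload, digest

entry, digest = ledger_entry("QR-0042", [181, 64, 203], timestamp=1700000000)
# Re-measuring the same part reproduces the digest; a tampered reading
# (or a counterfeit part lacking the dopant) does not.
assert ledger_entry("QR-0042", [181, 64, 203], 1700000000)[1] == digest
assert ledger_entry("QR-0042", [180, 64, 203], 1700000000)[1] != digest
```

In practice the comparison would tolerate measurement noise (e.g. match color within a calibrated band before hashing), since a smartphone camera will not reproduce an RGB triple exactly.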
A Dynamic Time Warping based covariance function for Gaussian Processes signature identification
NASA Astrophysics Data System (ADS)
Silversides, Katherine L.; Melkumyan, Arman
2016-11-01
Modelling stratiform deposits requires a detailed knowledge of the stratigraphic boundaries. In Banded Iron Formation (BIF) hosted ores of the Hamersley Group in Western Australia these boundaries are often identified using marker shales. Both Gaussian Processes (GP) and Dynamic Time Warping (DTW) have been previously proposed as methods to automatically identify marker shales in natural gamma logs. However, each method has different advantages and disadvantages. We propose a DTW based covariance function for the GP that combines the flexibility of the DTW with the probabilistic framework of the GP. The three methods are tested and compared on their ability to identify two natural gamma signatures from a Marra Mamba type iron ore deposit. These tests show that while all three methods can identify boundaries, the GP with the DTW covariance function combines and balances the strengths and weaknesses of the individual methods. This method identifies more positive signatures than the GP with the standard covariance function, and has a higher accuracy for identified signatures than the DTW. The combined method can handle larger variations in the signature without requiring multiple libraries, has a probabilistic output and does not require manual cut-off selections.
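A minimal sketch of the combination: compute the classic DTW distance by dynamic programming and substitute it into an exponential covariance. This naive substitution is only illustrative; the paper's construction is more careful, since plugging DTW into a kernel is not in general guaranteed to be positive semi-definite, which a production GP must address:

```python
from math import exp, inf

def dtw(a, b):
    """Dynamic Time Warping distance between two sequences, classic
    O(len(a)*len(b)) dynamic programme with squared-difference local cost."""
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def dtw_kernel(a, b, length_scale=1.0):
    """DTW-based covariance sketch: warping distance in an exponential kernel."""
    return exp(-dtw(a, b) / length_scale)

log1 = [0.0, 0.1, 0.9, 1.0, 0.2]        # toy natural-gamma marker signature
log2 = [0.0, 0.0, 0.1, 0.9, 1.0, 0.2]   # same marker, shifted and stretched
assert dtw_kernel(log1, log2) > dtw_kernel(log1, [1.0, 1.0, 1.0, 1.0])
```

Because DTW warps the time axis, the shifted/stretched copy of the marker scores a much higher covariance than an unrelated flat log, which is exactly the flexibility the combined method exploits.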
A Mechanism for Anonymous Credit Card Systems
NASA Astrophysics Data System (ADS)
Tamura, Shinsuke; Yanase, Tatsuro
This paper proposes a mechanism for anonymous credit card systems, in which each credit card holder can conceal individual transactions from the credit card company, while enabling the credit card company to calculate the total expenditures of transactions of individual card holders during specified periods, and to identify card holders who executed dishonest transactions. Based on three existing mechanisms, i.e. anonymous authentication, blind signature and secure statistical data gathering, together with implicit transaction links proposed here, the proposed mechanism enables development of anonymous credit card systems without assuming any absolutely trustworthy entity like tamper resistant devices or organizations faithful both to the credit card company and card holders.
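The blind-signature building block mentioned above can be demonstrated with textbook RSA and tiny, insecure demo parameters (the numbers below are for illustrating the protocol flow only, not the paper's construction):

```python
# Toy RSA blind signature: the card holder blinds message m, the credit
# card company signs without seeing it, and unblinding yields an
# ordinary RSA signature on m that anyone can verify.
n, e, d = 3233, 17, 2753   # n = 61*53; e*d = 1 mod phi(n) = 3120 (insecure demo)
m = 1234                   # toy transaction digest, m < n
r = 7                      # holder's random blinding factor, gcd(r, n) == 1

blinded = (m * pow(r, e, n)) % n       # holder -> company: m * r^e mod n
blind_sig = pow(blinded, d, n)         # company signs blindly: (m*r^e)^d = m^d * r
sig = (blind_sig * pow(r, -1, n)) % n  # holder strips the blinding factor
assert pow(sig, e, n) == m             # standard RSA verification succeeds
assert blinded != m                    # the company never saw m itself
```

This is why the company can certify individual transactions while remaining unable to link the resulting signatures back to what it signed; the anonymous-authentication and statistical-gathering mechanisms then let it total a holder's expenditure without seeing the transactions.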
RFID identity theft and countermeasures
NASA Astrophysics Data System (ADS)
Herrigel, Alexander; Zhao, Jian
2006-02-01
This paper reviews the ICAO security architecture for biometric passports. An attack enabling RFID identity theft for later misuse is presented, and specific countermeasures against this attack are described. Furthermore, it is shown that robust, high-capacity digital watermarking for embedding and retrieving binary digital signature data can be applied as an effective means against RFID identity theft. This approach requires only minimal modifications of the passport manufacturing process and is an enhancement of already proposed solutions. The approach may also be applied in combination with an RFID as a backup solution (e.g. for a damaged RFID chip) to verify the authenticity and integrity of the passport data with asymmetric cryptographic techniques.
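The embed/retrieve round trip for signature bits can be illustrated with a least-significant-bit scheme. This is a deliberate simplification: the paper requires *robust* watermarking that survives printing and scanning, which LSB does not; the function names and sample values are assumptions:

```python
def embed_bits(pixels, bits):
    """Write each watermark bit into the least-significant bit of a pixel,
    leaving any remaining pixels untouched."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def extract_bits(pixels, n):
    """Read back the first n embedded bits."""
    return [p & 1 for p in pixels[:n]]

signature_bits = [1, 0, 1, 1, 0, 0, 1, 0]   # e.g. a slice of signature data
cover = [200, 131, 54, 77, 90, 18, 255, 42, 10, 99]   # 8-bit pixel values
marked = embed_bits(cover, signature_bits)
assert extract_bits(marked, 8) == signature_bits
assert all(abs(a - b) <= 1 for a, b in zip(marked, cover))  # imperceptible
```

A robust scheme would instead spread each bit over many transform-domain coefficients so the signature survives the print/scan channel of a physical passport page.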
Ben Salem, Samira; Bacha, Khmais; Chaari, Abdelkader
2012-09-01
In this work we suggest an original fault signature based on an improved combination of Hilbert and Park transforms. Starting from this combination we create two fault signatures: the Hilbert modulus current space vector (HMCSV) and the Hilbert phase current space vector (HPCSV). These two fault signatures are subsequently analysed using the classical fast Fourier transform (FFT). The effects of mechanical faults on the HMCSV and HPCSV spectra are described, and the related frequencies are determined. The magnitudes of the spectral components relative to the studied faults (air-gap eccentricity and outer raceway ball bearing defect) are extracted in order to develop the input vector necessary for learning and testing the support vector machine, with the aim of automatically classifying the various states of the induction motor. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
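The Park (current space vector) part of this combination can be sketched directly from the three phase currents; the sketch below computes the vector's modulus and phase, omitting the Hilbert-transform stage the paper applies first:

```python
from math import atan2, cos, pi, sqrt

def park_vector(ia, ib, ic):
    """Current space vector from three-phase samples (Clarke/Park form):
    i_s = sqrt(2/3) * (ia + a*ib + a^2*ic), with a = exp(j*2*pi/3).
    Returns (modulus, phase)."""
    i_alpha = sqrt(2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    i_beta = sqrt(2.0 / 3.0) * (sqrt(3.0) / 2.0) * (ib - ic)
    return sqrt(i_alpha ** 2 + i_beta ** 2), atan2(i_beta, i_alpha)

# Balanced sinusoidal currents give a rotating vector of constant modulus;
# mechanical faults modulate this modulus, which is what the FFT of the
# HMCSV signature then reveals.
w, I = 2 * pi * 50, 10.0
mods = []
for k in range(5):
    t = k / 1000.0
    ia = I * cos(w * t)
    ib = I * cos(w * t - 2 * pi / 3)
    ic = I * cos(w * t + 2 * pi / 3)
    mods.append(park_vector(ia, ib, ic)[0])
assert max(mods) - min(mods) < 1e-9  # healthy machine: flat modulus
```

An eccentricity or bearing defect would superimpose a modulation on `mods`, producing fault-related sidebands in its FFT spectrum.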
Management of natural resources through automatic cartographic inventory. [France
NASA Technical Reports Server (NTRS)
Rey, P.; Gourinard, Y.; Cambou, F. (Principal Investigator)
1974-01-01
The author has identified the following significant results. (1) Accurate recognition of previously known ground features from ERTS-1 imagery has been confirmed and a probable detection range for the major signatures can be given. (2) Unidentified elements, however, must be decoded by means of the equal densitometric value zone method. (3) Determination of these zonings involves an analogical treatment of images using the color equidensity methods (pseudo-color), color composites and especially temporal color composite (repetitive superposition). (4) After this analogical preparation, the digital equidensities can be processed by computer in the four MSS bands, according to a series of transfer operations from imagery and automatic cartography.
NASA Astrophysics Data System (ADS)
El Bekri, Nadia; Angele, Susanne; Ruckhäberle, Martin; Peinsipp-Byma, Elisabeth; Haelke, Bruno
2015-10-01
This paper introduces an interactive recognition assistance system for imaging reconnaissance. The system supports aerial image analysts on missions in two main tasks: object recognition and infrastructure analysis. Object recognition concentrates on the classification of a single object. Infrastructure analysis deals with the description of the components of an infrastructure and the recognition of the infrastructure type (e.g. military airfield). Based on satellite or aerial images, aerial image analysts are able to extract single object features and thereby recognize different object types; this is one of the most challenging tasks in imaging reconnaissance. Currently, no high-potential ATR (automatic target recognition) applications are available; as a consequence, the human observer cannot be replaced entirely. State-of-the-art ATR applications cannot match human perception and interpretation in equal measure. Why is this still such a critical issue? First, cluttered and noisy images make it difficult to automatically extract, classify and identify object types. Second, due to changed warfare and the rise of asymmetric threats, it is nearly impossible to create an underlying data set containing all features, objects or infrastructure types. Many other factors, such as environmental parameters or aspect angles, further complicate the application of ATR. Due to the lack of suitable ATR procedures, the human factor remains important and so far irreplaceable. In order to use the potential benefits of human perception and computational methods in a synergistic way, both are unified in an interactive assistance system. RecceMan® (Reconnaissance Manual) offers two different modes for aerial image analysts on missions: the object recognition mode and the infrastructure analysis mode. The aim of the object recognition mode is to recognize a certain object type based on the object features derived from the image signatures.
The infrastructure analysis mode pursues the goal of analyzing the function of the infrastructure. The image analyst visually extracts certain target object signatures, assigns them to corresponding object features and is finally able to recognize the object type. The system offers the possibility of assigning the image signatures to features given by sample images. The underlying data set contains a wide range of object features and object types for different domains such as ships or land vehicles. Each domain has its own feature tree developed by expert aerial image analysts. By selecting the corresponding features, the possible solution set of objects is automatically reduced to only those objects that contain the selected features. Moreover, we give an outlook on current research in the field of ground target analysis, in which we deal with partly automated methods to extract image signatures and assign them to the corresponding features. This research includes methods for automatically determining the orientation of an object and geometric features such as the width and length of the object. This step makes it possible to automatically reduce the set of possible object types offered to the image analyst by the interactive recognition assistance system.
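The feature-driven reduction of the solution set described above amounts to a set intersection over a feature catalog. A minimal sketch follows; the object types and feature names are invented for illustration and do not come from the RecceMan® data set:

```python
# hypothetical feature catalog: object type -> set of observable features
CATALOG = {
    "frigate": {"hull", "superstructure", "gun_turret"},
    "tanker": {"hull", "deck_piping"},
    "main_battle_tank": {"tracks", "gun_turret"},
    "truck": {"wheels", "cargo_bed"},
}

def reduce_candidates(selected_features, catalog=CATALOG):
    """Keep only the object types that exhibit every selected feature."""
    return {name for name, feats in catalog.items()
            if selected_features <= feats}

# selecting "gun_turret" leaves only the armed object types
print(sorted(reduce_candidates({"gun_turret"})))  # ['frigate', 'main_battle_tank']
```

Each additional selected feature can only shrink the candidate set, which mirrors how the analyst narrows the solution set interactively.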
Magsat investigation. [Canadian shield
NASA Technical Reports Server (NTRS)
Hall, D. H. (Principal Investigator)
1980-01-01
A computer program was prepared for modeling segments of the Earth's crust allowing for heterogeneity in magnetization in calculating the Earth's field at Magsat heights. This permits investigation of a large number of possible models in assessing the magnetic signatures of subprovinces of the Canadian shield. The fit between the model field and observed fields is optimized in a semi-automatic procedure.
Telescope Automation and Remote Observing System (TAROS)
NASA Astrophysics Data System (ADS)
Wilson, G.; Czezowski, A.; Hovey, G. R.; Jarnyk, M. A.; Nielsen, J.; Roberts, B.; Sebo, K.; Smith, D.; Vaccarella, A.; Young, P.
2005-12-01
TAROS is a system that will allow the Australian National University telescopes at a remote location to be operated automatically or interactively, with authenticated control via the internet. TAROS is operated through a Java front-end GUI and employs several Java technologies: Java Message Service (JMS) for communication between the telescope and the remote observer, the Java Native Interface to integrate existing data acquisition software written in C++ (CICADA) with new Java programs, and the JSky collection of Java GUI components for parts of the remote observer client. In this poster the design and implementation of TAROS are described.
Vehicle fault diagnostics and management system
NASA Astrophysics Data System (ADS)
Gopal, Jagadeesh; Gowthamsachin
2017-11-01
This project applies a kind of advanced automatic identification technology that is more and more widely used in the fields of transportation and logistics. It covers the main functions, such as vehicle management and vehicle speed limiting and control. The system starts with an authentication process to keep itself secure. Sensors are connected to the STM32 board, which in turn is connected to the car through an Ethernet cable, as Ethernet is capable of sending large amounts of data at high speeds. The technology involved clearly shows how a careful combination of software and hardware can produce an extremely cost-effective solution to a problem.
Ng, Tsz-Tsun; So, Pui-Kin; Zheng, Bo; Yao, Zhong-Ping
2015-07-16
Authentication of edible oils is a long-term issue in food safety, and has become particularly important with the emergence and widespread use of gutter oils in recent years. Due to the very high analytical demand and the diversity of gutter oils, a high-throughput analytical method and a versatile strategy for the authentication of mixed edible oils and gutter oils are highly desirable. In this study, an improved matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) method has been developed for direct analysis of edible oils. This method involves on-target sample loading, automatic data acquisition and simple data processing. MALDI-MS spectra of high quality and high reproducibility have been obtained with this method, and a preliminary spectral database of edible oils has been set up. The authenticity of an edible oil sample can be determined by comparing its MALDI-MS spectrum and principal component analysis (PCA) results with those of its labeled oil in the database. The method is simple, and the whole process takes only several minutes for the analysis of one oil sample. We demonstrated that the method is sensitive to changes in oil composition and can be used for measuring the compositions of mixed oils. The method's capability to detect mislabeling enables rapid screening of gutter oils, since fraudulent mislabeling is a common feature of gutter oils. Copyright © 2015 Elsevier B.V. All rights reserved.
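The database-comparison step (PCA on spectra) can be sketched as below. The "spectra" are synthetic stand-ins with invented intensity profiles; real MALDI-MS intensity vectors would take their place:

```python
import numpy as np

def pca_scores(spectra, k=2):
    # mean-centre and project onto the top-k principal components (via SVD)
    X = spectra - spectra.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T

# synthetic stand-ins: rows = oil samples, columns = m/z intensity bins
rng = np.random.default_rng(0)
profile_a = np.linspace(1.0, 0.0, 40)   # hypothetical "oil A" profile
profile_b = np.linspace(0.0, 1.0, 40)   # hypothetical "oil B" profile
samples = np.vstack(
    [profile_a + rng.normal(0, 0.05, 40) for _ in range(5)] +
    [profile_b + rng.normal(0, 0.05, 40) for _ in range(5)])
scores = pca_scores(samples)
# the first principal component separates the two oil types
```

A query sample whose score falls outside the cluster of its labeled oil would be flagged as potentially mislabeled, which is the screening logic the abstract describes.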
x509-free access to WLCG resources
NASA Astrophysics Data System (ADS)
Short, H.; Manzi, A.; De Notaris, V.; Keeble, O.; Kiryanov, A.; Mikkonen, H.; Tedesco, P.; Wartel, R.
2017-10-01
Access to WLCG resources is authenticated using an x509-based PKI infrastructure. Even though HEP users have always been exposed to certificates directly, the development of modern web applications by the LHC experiments calls for simplified authentication processes that keep the underlying software unmodified. In this work we show a solution whose goal is to provide access to WLCG resources using the users' home organisations' credentials, without the need for user-acquired x509 certificates. In particular, we focus on identity providers within eduGAIN, which interconnects research and education organisations worldwide and enables the trustworthy exchange of identity-related information. eduGAIN has been integrated into the CERN SSO infrastructure so that users can authenticate without the need for a CERN account. This solution achieves x509-free access to Grid resources with the help of two services: an STS and an online CA. The STS (Security Token Service) allows credential translation from the SAML2 format used by identity federations to the VOMS-enabled x509 format used by most of the Grid. The IOTA CA (Identifier-Only Trust Assurance Certification Authority) is responsible for the automatic issuing of short-lived x509 certificates. The IOTA CA deployed at CERN has been accepted by EUGridPMA as the CERN LCG IOTA CA, included in the IGTF trust anchor distribution and installed by the sites in WLCG. We also describe the first pilot projects that are integrating the solution.
Design and implementation of a smart card based healthcare information system.
Kardas, Geylani; Tunali, E Turhan
2006-01-01
Smart cards are used in information technologies as portable integrated devices with data storage and data processing capabilities. As in other fields, smart card use in health systems has become popular due to their increased capacity and performance. Their efficient use, with easy and fast data access facilities, has led to particularly widespread adoption in security systems. In this paper, a smart card based healthcare information system is developed. The system uses smart cards for personal identification and the transfer of health data, and provides data communication via a distributed protocol developed particularly for this study. Two smart card software modules are implemented that run on the patient and healthcare professional smart cards, respectively. In addition to personal information, general health information about the patient is also loaded onto the patient smart card. Health care providers use their own smart cards to be authenticated on the system and to access data on patient cards. Encryption keys and digital signature keys stored on the smart cards are used for secure and authenticated data communication between clients and database servers over a distributed object protocol. The system is developed on the Java platform using an object oriented architecture and design patterns.
Security in Distributed Collaborative Environments: Limitations and Solutions
NASA Astrophysics Data System (ADS)
Saadi, Rachid; Pierson, Jean-Marc; Brunie, Lionel
The main goal of establishing collaboration between heterogeneous environments is to create a pervasive context that provides nomadic users with ubiquitous access to digital information and surrounding resources. However, the constraints of mobility and heterogeneity raise a number of crucial issues related to security, especially authentication, access control and privacy. First of all, in this chapter we explore the trust paradigm, especially its transitive capability, to enable trusted peer-to-peer collaboration. In this manner, when each organization sets its own security policy to recognize (authenticate) users who are members of a trusted community and provide them with local access (access control), trust transitivity between peers allows users to gain broader, controlled access inside the pervasive environment. Next, we study the problem of users' privacy. In pervasive and ubiquitous environments, nomadic users gather and exchange certificates or credentials that provide them with rights to access, by transitivity, unknown and trusted environments. These signed documents embed an increasing number of attributes that need to be filtered according to the contextual situation. In this chapter, we propose a new morph signature enabling each certificate owner to preserve his privacy by disclosing or blinding sensitive attributes according to the situation at hand.
Zhang, Ying; Chen, Wei; Liang, Jixing; Zheng, Bingxin; Jiang, Shengming
2015-01-01
It is expected that in the near future wireless sensor networks (WSNs) will be more widely used in mobile environments, in applications such as Autonomous Underwater Vehicles (AUVs) for marine monitoring and mobile robots for environmental investigation. The sensor nodes' mobility can easily cause changes to the structure of the network topology and lead to a decline in the amount of transmitted data, excessive energy consumption, and a lack of security. To solve these problems, an efficient Topology Control algorithm for node Mobility (TCM) is proposed. In the topology construction stage, an efficient clustering algorithm that supports sensor node movement is adopted. It can ensure balanced clustering and reduce energy consumption. In the topology maintenance stage, digital signature authentication based on an Error Correction Code (ECC) and a soft-handover communication mechanism are adopted. After the legal identity of a mobile node is verified, secure communications can be established, which increases the amount of data transmitted. Compared to some existing schemes, the proposed scheme has significant advantages regarding network topology stability, amount of data transferred, network lifetime and safety performance. PMID:26633405
Spangenberg, Jorge E; Lavric, Jost V; Meisser, Nicolas; Serneels, Vincent
2010-10-15
The most valuable pigment of Roman wall paintings was the red color obtained from powdered cinnabar (Minium Cinnabaris pigment), the red mercury sulfide (HgS), which was brought from mercury (Hg) deposits in the Roman Empire. To address the question of whether sulfur isotope signatures can serve as a rapid method for establishing the provenance of the red pigment in Roman frescoes, we have measured the sulfur isotope composition (δ(34)S value in ‰ VCDT) in samples of wall painting from the Roman city Aventicum (Avenches, Vaud, Switzerland) and compared them with values from cinnabar from European mercury deposits (Almadén in Spain, Idria in Slovenia, Monte Amiata in Italy, Moschellandsberg in Germany, and Genepy in France). Our study shows that the δ(34)S values of cinnabar from the studied Roman wall paintings fall within or close to the range of Almadén cinnabar; thus, the provenance of the raw material may be deduced. This approach may provide information on provenance and authenticity in archaeological, restoration and forensic studies of Roman and Greek frescoes. Copyright © 2010 John Wiley & Sons, Ltd.
Automatic limb identification and sleeping parameters assessment for pressure ulcer prevention.
Baran Pouyan, Maziyar; Birjandtalab, Javad; Nourani, Mehrdad; Matthew Pompeo, M D
2016-08-01
Pressure ulcers (PUs) are common among vulnerable patients, such as the elderly, the bedridden and diabetics. PUs are very painful for patients and costly for hospitals and nursing homes. Assessment of sleeping parameters on at-risk limbs is critical for ulcer prevention. An effective assessment depends on automatic identification and tracking of at-risk limbs. An accurate limb identification can be used to analyze the pressure distribution and assess risk for each limb. In this paper, we propose a graph-based clustering approach to extract the body limbs from the pressure data collected by a commercial pressure map system. A robust signature-based technique is employed to automatically label each limb. Finally, an assessment technique is applied to evaluate the stress experienced by each limb over time. The experimental results indicate high performance, with more than 94% average accuracy for the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Carestia, Mariachiara; Pizzoferrato, Roberto; Gelfusa, Michela; Cenciarelli, Orlando; Ludovici, Gian Marco; Gabriele, Jessica; Malizia, Andrea; Murari, Andrea; Vega, Jesus; Gaudio, Pasquale
2015-11-01
Biosecurity and biosafety are key concerns of modern society. Although nanomaterials are improving the capacities of point detectors, standoff detection still appears to be an open issue. Laser-induced fluorescence of biological agents (BAs) has proved to be one of the most promising optical techniques to achieve early standoff detection, but its strengths and weaknesses are still to be fully investigated. In particular, different BAs tend to have similar fluorescence spectra due to the ubiquity of biological endogenous fluorophores producing a signal in the UV range, making data analysis extremely challenging. The Universal Multi Event Locator (UMEL), a general method based on support vector regression, is commonly used to identify characteristic structures in arrays of data. In the first part of this work, we investigate the fluorescence emission spectra of different simulants of BAs and apply UMEL for their automatic classification. In the second part of this work, we elaborate a strategy for applying UMEL to the discrimination of the spectra of different BA simulants. Through this strategy, it has been possible to discriminate between these BA simulants despite the high similarity of their fluorescence spectra. These preliminary results support the use of SVR methods to classify BA spectral signatures.
Micro-Doppler analysis of multiple frequency continuous wave radar signatures
NASA Astrophysics Data System (ADS)
Anderson, Michael G.; Rogers, Robert L.
2007-04-01
Micro-Doppler refers to Doppler scattering returns produced by non-rigid-body motion. Micro-Doppler gives rise to many detailed radar image features in addition to those associated with bulk target motion. Targets of different classes (for example, humans, animals, and vehicles) produce micro-Doppler images that are often distinguishable even by nonexpert observers. Micro-Doppler features have great potential for use in automatic target classification algorithms. Although the potential benefit of using micro-Doppler in classification algorithms is high, relatively little experimental (non-synthetic) micro-Doppler data exists. Much of the existing experimental data comes from highly cooperative targets (human or vehicle targets directly approaching the radar). This research involved field data collection and analysis of micro-Doppler radar signatures from non-cooperative targets. The data was collected using a low-cost X-band multiple frequency continuous wave (MFCW) radar with three transmit frequencies. The collected MFCW radar signatures contain data from humans, vehicles, and animals. The presented data includes micro-Doppler signatures previously unavailable in the literature, such as crawling humans and various animal species. The animal micro-Doppler signatures include deer, dog, and goat datasets. This research focuses on the analysis of micro-Doppler from non-cooperative targets approaching the radar at various angles, maneuvers, and postures.
Recognition of surface lithologic and topographic patterns in southwest Colorado with ADP techniques
NASA Technical Reports Server (NTRS)
Melhorn, W. N.; Sinnock, S.
1973-01-01
Analysis of ERTS-1 multispectral data by automatic pattern recognition procedures is applicable toward grappling with current and future resource stresses by providing a means for refining existing geologic maps. The procedures used in the current analysis already yield encouraging results toward the eventual machine recognition of extensive surface lithologic and topographic patterns. Automatic mapping of a series of hogbacks, strike valleys, and alluvial surfaces along the northwest flank of the San Juan Basin in Colorado can be obtained by minimal man-machine interaction. The determination of causes for separable spectral signatures is dependent upon extensive correlation of micro- and macro field based ground truth observations and aircraft underflight data with the satellite data.
EnzML: multi-label prediction of enzyme classes using InterPro signatures
2012-01-01
Background Manual annotation of enzymatic functions cannot keep up with automatic genome sequencing. In this work we explore the capacity of InterPro sequence signatures to automatically predict enzymatic function. Results We present EnzML, a multi-label classification method that can also efficiently account for proteins with multiple enzymatic functions: 50,000 in UniProt. EnzML was evaluated using a standard set of 300,747 proteins for which the manually curated Swiss-Prot and KEGG databases have agreeing Enzyme Commission (EC) annotations. EnzML achieved more than 98% subset accuracy (exact match of all correct Enzyme Commission classes of a protein) for the entire dataset and between 87 and 97% subset accuracy in reannotating eight entire proteomes: human, mouse, rat, mouse-ear cress, fruit fly, the S. pombe yeast, the E. coli bacterium and the M. jannaschii archaebacterium. To understand the role played by the dataset size, we compared the cross-evaluation results of smaller datasets, either constructed at random or from specific taxonomic domains such as archaea, bacteria, fungi, invertebrates, plants and vertebrates. The results were confirmed even when the redundancy in the dataset was reduced using UniRef100, UniRef90 or UniRef50 clusters. Conclusions InterPro signatures are a compact and powerful attribute space for the prediction of enzymatic function. This representation makes multi-label machine learning feasible in reasonable time (30 minutes to train on 300,747 instances with 10,852 attributes and 2,201 class values) using the Mulan Binary Relevance Nearest Neighbours algorithm implementation (BR-kNN). PMID:22533924
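A binary-relevance nearest-neighbour classifier over signature sets can be sketched in a few lines. The InterPro accessions and EC numbers below are placeholders invented for illustration; the actual system uses Mulan's BR-kNN implementation over hundreds of thousands of proteins:

```python
from collections import Counter

def br_knn_predict(train, query_signatures, k=3):
    """Binary-relevance kNN: each EC label is voted on independently by the
    k nearest neighbours (Jaccard similarity on InterPro signature sets)."""
    def jaccard(a, b):
        union = a | b
        return len(a & b) / len(union) if union else 0.0
    neighbours = sorted(train,
                        key=lambda item: -jaccard(item[0], query_signatures))[:k]
    votes = Counter(label for _, labels in neighbours for label in labels)
    return {label for label, n in votes.items() if n > k / 2}

# placeholder training data: (InterPro signature set, EC label set)
train = [
    ({"IPR000001", "IPR000002"}, {"EC 1.1.1.1"}),
    ({"IPR000001", "IPR000003"}, {"EC 1.1.1.1"}),
    ({"IPR000009"}, {"EC 2.7.7.7"}),
]
```

Because each label is decided by its own majority vote, a query protein can receive zero, one, or several EC classes, which is exactly the multi-label behaviour the abstract highlights.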
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jared Verba; Michael Milvich
2008-05-01
Current Intrusion Detection System (IDS) technology is not suited to be widely deployed inside a Supervisory, Control and Data Acquisition (SCADA) environment. Anomaly- and signature-based IDS technologies have developed methods to cover information-technology-based network activity and protocols effectively. However, these IDS technologies do not include the fine protocol granularity required to ensure network security inside an environment with weak protocols lacking authentication and encryption. By implementing a more specific and more intelligent packet inspection mechanism, tailored traffic flow analysis, and unique packet tampering detection, IDS technology developed specifically for SCADA environments can be deployed with confidence in detecting malicious activity.
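The kind of protocol-granular inspection argued for above can be illustrated with a toy Modbus/TCP check; the whitelist policy and the example frames are hypothetical, not part of the system described in the abstract:

```python
import struct

# hypothetical policy: (unit id, function code) pairs this RTU may receive
ALLOWED = {(1, 3), (1, 16)}   # read holding registers, write multiple registers

def inspect_modbus_tcp(frame: bytes) -> str:
    """Parse the MBAP header plus function code and flag anything outside
    the whitelist -- finer granularity than a generic IT IDS rule."""
    if len(frame) < 8:
        return "malformed"
    transaction_id, protocol_id, length, unit_id, function = struct.unpack(
        ">HHHBB", frame[:8])
    if protocol_id != 0 or length != len(frame) - 6:
        return "malformed"   # Modbus/TCP protocol id must be 0; length must match
    if (unit_id, function) not in ALLOWED:
        return "alert"       # e.g. function 5 (write single coil) is not permitted
    return "ok"

read_request = struct.pack(">HHHBB", 1, 0, 6, 1, 3) + struct.pack(">HH", 0, 10)
coil_write = struct.pack(">HHHBB", 2, 0, 6, 1, 5) + struct.pack(">HH", 19, 0xFF00)
```

A generic IT signature would see both frames as valid TCP payloads; only a parser aware of the SCADA protocol's fields can distinguish a permitted read from an unauthorized write.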
Design of Complex BPF with Automatic Digital Tuning Circuit for Low-IF Receivers
NASA Astrophysics Data System (ADS)
Kondo, Hideaki; Sawada, Masaru; Murakami, Norio; Masui, Shoichi
This paper describes the architecture and implementations of an automatic digital tuning circuit for a complex bandpass filter (BPF) in a low-power and low-cost transceiver for applications such as personal authentication and wireless sensor network systems. The architectural design analysis demonstrates that an active RC filter in a low-IF architecture can be at least 47.7% smaller in area than a conventional gm-C filter; in addition, it features a simple implementation of the associated tuning circuit. The principle of simultaneous tuning of both the center frequency and the bandwidth through calibration of a capacitor array is illustrated based on an analysis of the filter characteristics, and a scalable automatic digital tuning circuit with simple analog blocks and control logic of only 835 gates is introduced. The developed capacitor tuning technique can achieve a tuning error of less than ±3.5% and reduce peaking in the passband filter characteristics. An experimental complex BPF in 0.18µm CMOS technology successfully reduces the tuning error from an initial value of -20% to less than ±2.5% after tuning. The filter block dimensions are 1.22mm × 1.01mm; measurements of the developed complex BPF with the automatic digital tuning circuit show a current consumption of 705µA and an image rejection ratio of 40.3dB. A complete evaluation of the BPF indicates that this technique can be applied to low-power, low-cost transceivers.
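The capacitor-array calibration idea, picking a digital code so that the resulting RC product hits the target centre frequency, can be sketched as follows. All component values here are invented for illustration and do not correspond to the fabricated filter:

```python
import math

def tune_capacitor_code(r, c_fixed, c_unit, f_target, bits=6):
    """Return the capacitor-array code whose RC pole frequency
    1 / (2*pi*R*(C_fixed + code*C_unit)) is closest to the target."""
    def pole_freq(code):
        return 1.0 / (2.0 * math.pi * r * (c_fixed + code * c_unit))
    return min(range(2 ** bits), key=lambda code: abs(pole_freq(code) - f_target))

# e.g. R = 100 kOhm, 2 pF fixed + 0.25 pF unit capacitors, 400 kHz target
code = tune_capacitor_code(100e3, 2e-12, 0.25e-12, 400e3)
```

Because both the centre frequency and the bandwidth of an active RC stage scale with the same RC products, one capacitor code corrects both at once, which is the simultaneous-tuning property the paper exploits.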
Computer loss experience and predictions
NASA Astrophysics Data System (ADS)
Parker, Donn B.
1996-03-01
The types of losses organizations must anticipate have become more difficult to predict because of the eclectic nature of computers and data communications, and because of the decrease in news media reporting of computer-related losses as they become commonplace. Total business crime is conjectured to be decreasing in frequency and increasing in loss per case as a result of increasing computer use. Computer crimes are probably increasing, however, as their share of the decreasing business crime rate grows. Ultimately all business crime will involve computers in some way, and we could see a decline of both together. The important information security measures in high-loss business crime generally concern controls over authorized people engaged in unauthorized activities. Such controls include authentication of users, analysis of detailed audit records, unannounced audits, segregation of development and production systems and duties, shielding the viewing of screens, and security awareness and motivation controls in high-value transaction areas. Computer crimes that involve highly publicized, intriguing computer misuse methods, such as privacy violations, radio frequency emanations eavesdropping, and computer viruses, have been reported in waves that periodically have saturated the news media during the past 20 years. We must be able to anticipate such highly publicized crimes and reduce the impact and embarrassment they cause.
On the basis of our most recent experience, I propose nine new types of computer crime to be aware of: computer larceny (theft and burglary of small computers), automated hacking (use of computer programs to intrude), electronic data interchange fraud (business transaction fraud), Trojan bomb extortion and sabotage (code secretly inserted into others' systems that can be triggered to cause damage), LANarchy (unknown equipment in use), desktop forgery (computerized forgery and counterfeiting of documents), information anarchy (indiscriminate use of crypto without control), Internet abuse (antisocial use of data communications), and international industrial espionage (governments stealing business secrets). A wide variety of safeguards are necessary to deal with these new crimes. The most powerful controls include (1) carefully controlled use of cryptography and digital signatures with good key management and overriding business and government decryption capability and (2) use of tokens such as smart cards to increase the strength of secret passwords for the authentication of computer users. Jewelry-type security for small computers, including registration of serial numbers and security inventorying of equipment, software, and connectivity, will be necessary. Other safeguards include automatic monitoring of computer use and detection of unusual activities, segmentation and filtering of networks, special paper and ink for documents, and reduction of paper documents. Finally, international cooperation of governments to create trusted environments for business is essential.
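The second control named above, tokens that strengthen secret passwords, can be sketched as a challenge-response in which a high-entropy secret held on the token is mixed with the user's password. This is an illustrative scheme under assumed parameters, not a specific smart-card standard:

```python
import hashlib
import hmac
import secrets

def enroll(password: str):
    """Create a token secret and derive the verification key the server stores."""
    token_secret = secrets.token_bytes(32)   # lives only on the token
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), token_secret, 100_000)
    return token_secret, key                 # server keeps only `key`

def respond(password: str, token_secret: bytes, challenge: bytes) -> bytes:
    """Token-side answer: re-derive the key and MAC the server's challenge."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), token_secret, 100_000)
    return hmac.new(key, challenge, "sha256").digest()

def verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(key, challenge, "sha256").digest()
    return hmac.compare_digest(expected, response)

token_secret, server_key = enroll("short PIN")
challenge = secrets.token_bytes(16)
```

Even a weak password becomes hard to attack remotely, because a valid response requires the 256-bit token secret as well; stealing the password alone is no longer sufficient.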
Simulation of atmospheric and terrestrial background signatures for detection and tracking scenarios
NASA Astrophysics Data System (ADS)
Schweitzer, Caroline; Stein, Karin
2015-10-01
In the field of early warning, one depends on reliable image exploitation: only if the applied detection and tracking algorithms work efficiently can the threat-approach alert be given fast enough to ensure automatic initiation of the countermeasure. In order to evaluate the performance of those algorithms for a certain electro-optical (EO) sensor system, test sequences need to be created as realistically and comprehensively as possible. Since both the background and the target signature depend on the environmental conditions, detailed knowledge of the meteorology and climatology is necessary. Trials measuring these environmental characteristics serve as a solid basis, but may only represent the conditions during a rather short period of time. To represent the entire variation of meteorology and climatology that the future system will be exposed to, the application of comprehensive atmospheric modelling tools is essential. This paper gives an introduction to the atmospheric modelling tools that are currently used at Fraunhofer IOSB to simulate spectral background signatures in the infrared (IR) range. It is also demonstrated how those signatures are affected by changing atmospheric and climatic conditions. In conclusion, and with a special focus on the modelling of different cloud types, sources of error and limits are discussed.
Qualification of security printing features
NASA Astrophysics Data System (ADS)
Simske, Steven J.; Aronoff, Jason S.; Arnabat, Jordi
2006-02-01
This paper describes the statistical and hardware processes involved in qualifying two related printing features for their deployment in product (e.g. document and package) security. The first is a multi-colored tiling feature that can also be combined with microtext to provide additional forms of security protection. The color information is authenticated automatically with a variety of handheld, desktop and production scanners. The microtext is authenticated either following magnification or manually by a field inspector. The second security feature can also be tile-based. It involves the use of two inks that provide the same visual color, but differ in their transparency to infrared (IR) wavelengths. One of the inks is effectively transparent to IR wavelengths, allowing emitted IR light to pass through. The other ink is effectively opaque to IR wavelengths. These inks allow the printing of a seemingly uniform, or spot, color over a (truly) uniform IR emitting ink layer. The combination converts a uniform covert ink and a spot color to a variable data region capable of encoding identification sequences with high density. Also, it allows the extension of variable data printing for security to ostensibly static printed regions, affording greater security protection while meeting branding and marketing specifications.
Real time automatic detection of bearing fault in induction machine using kurtogram analysis.
Tafinine, Farid; Mokrani, Karim
2012-11-01
A signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. The technique starts by investigating the resonance signatures over selected frequency bands to extract the representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective bearing fault automatic detection method and gives a good basis for an integrated induction machine condition monitor.
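A single-resolution building block of the kurtogram, the spectral kurtosis of the STFT magnitude in each frequency band, can be sketched as below. The fault signal is synthetic: periodic impacts ringing an assumed 3 kHz resonance, with all parameters invented for illustration:

```python
import numpy as np

def spectral_kurtosis(x, nperseg=256):
    """Kurtosis of the STFT magnitude per frequency bin; close to 0 for a
    stationary Gaussian signal, large in bands excited only intermittently
    (e.g. by repetitive bearing impacts)."""
    hop = nperseg // 2
    window = np.hanning(nperseg)
    frames = np.array([x[i:i + nperseg] * window
                       for i in range(0, len(x) - nperseg + 1, hop)])
    mag = np.abs(np.fft.rfft(frames, axis=1))
    m2 = (mag ** 2).mean(axis=0)
    m4 = (mag ** 4).mean(axis=0)
    return m4 / m2 ** 2 - 2.0

fs = 12_000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(1)
background = rng.normal(size=t.size)          # stationary broadband noise
ring = np.exp(-500 * t[:128]) * np.sin(2 * np.pi * 3000 * t[:128])
impacts = np.zeros_like(t)
impacts[::2400] = 1.0                         # 5 impacts per second
x_fault = background + 20 * np.convolve(impacts, ring)[:t.size]

sk = spectral_kurtosis(x_fault)
freqs = np.fft.rfftfreq(256, 1 / fs)
# sk peaks near the 3 kHz resonance band excited by the impacts
```

The full kurtogram repeats this computation over a range of band/resolution combinations and selects the band with maximal kurtosis, which is then used for envelope analysis of the bearing signal.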
Visualization of hyperspectral imagery
NASA Astrophysics Data System (ADS)
Hogervorst, Maarten A.; Bijl, Piet; Toet, Alexander
2007-04-01
We developed four new techniques to visualize hyperspectral image data for man-in-the-loop target detection. The methods respectively: (1) display the subsequent bands as a movie ("movie"), (2) map the data onto three channels and display these as a colour image ("colour"), (3) display the correlation between the pixel signatures and a known target signature ("match") and (4) display the output of a standard anomaly detector ("anomaly"). The movie technique requires no assumptions about the target signature and involves no information loss. The colour technique produces a single image that can be displayed in real-time. A disadvantage of this technique is loss of information. A display of the match between a target signature and pixels can be interpreted easily and quickly, but this technique relies on precise knowledge of the target signature. The anomaly detector flags pixels with signatures that deviate from the (local) background. We performed a target detection experiment with human observers to determine their relative performance with the four techniques. The results show that the "match" presentation yields the best performance, followed by "movie" and "anomaly", while performance with the "colour" presentation was the poorest. Each scheme has its advantages and disadvantages and is more or less suited for real-time and post-hoc processing. The rationale is that the final interpretation is best done by a human observer. In contrast to automatic target recognition systems, the interpretation of hyperspectral imagery by the human visual system is robust to noise and image transformations and requires a minimal number of assumptions (about the signature of target and background, target shape, etc.). When more knowledge about target and background is available, this may be used to help the observer interpret the data (aided target detection).
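A toy sketch of two of the four displays, "colour" and "match", follows. The band-averaging RGB mapping and the cosine-similarity match score are illustrative assumptions, not the paper's exact mappings, and the data cube is synthetic.

```python
import numpy as np

def to_rgb(cube):
    """'Colour' display: average thirds of the spectral axis into R, G, B."""
    n = cube.shape[-1]
    thirds = [cube[..., i * n // 3:(i + 1) * n // 3].mean(axis=-1) for i in range(3)]
    img = np.stack(thirds, axis=-1)
    img = img - img.min()
    return img / (img.max() + 1e-12)

def match_map(cube, target):
    """'Match' display: cosine similarity between each pixel and a target signature."""
    flat = cube.reshape(-1, cube.shape[-1])
    sim = flat @ target / (np.linalg.norm(flat, axis=1) * np.linalg.norm(target) + 1e-12)
    return sim.reshape(cube.shape[:-1])

# Toy cube: 4x4 pixels, 12 bands; one pixel carries the target signature.
rng = np.random.default_rng(1)
cube = rng.uniform(0.2, 0.4, size=(4, 4, 12))
target = np.linspace(1.0, 2.0, 12)
cube[2, 3] = target
rgb = to_rgb(cube)
m = match_map(cube, target)
```

As the abstract notes, the match display is easy to read (the target pixel lights up) but depends entirely on knowing the target signature in advance.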
Joint Dictionary Learning for Multispectral Change Detection.
Lu, Xiaoqiang; Yuan, Yuan; Zheng, Xiangtao
2017-04-01
Change detection is one of the most important applications of remote sensing technology. It is a challenging task due to the obvious variations in the radiometric value of spectral signature and the limited capability of utilizing spectral information. In this paper, an improved sparse coding method for change detection is proposed. The intuition of the proposed method is that unchanged pixels in different images can be well reconstructed by the joint dictionary, which corresponds to knowledge of unchanged pixels, while changed pixels cannot. First, a query image pair is projected onto the joint dictionary to constitute the knowledge of unchanged pixels. Then the reconstruction error is obtained to discriminate between the changed and unchanged pixels in the different images. To select the proper thresholds for determining changed regions, an automatic threshold selection strategy is presented by minimizing the reconstruction errors of the changed pixels. Extensive experiments on multispectral data have been conducted, and the experimental results compared with the state-of-the-art methods prove the superiority of the proposed method. Contributions of the proposed method can be summarized as follows: 1) joint dictionary learning is proposed to explore the intrinsic information of different images for change detection. In this case, change detection can be transformed into a sparse representation problem. To the authors' knowledge, few publications utilize joint dictionary learning in change detection; 2) an automatic threshold selection strategy is presented, which minimizes the reconstruction errors of the changed pixels without the prior assumption of the spectral signature. As a result, the threshold value provided by the proposed method can adapt to different data due to the characteristic of joint dictionary learning; and 3) the proposed method makes no prior assumption of the modeling and the handling of the spectral signature, which can be adapted to different data.
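The reconstruction-error intuition can be illustrated with plain least squares standing in for true sparse coding (an intentional simplification): stack each pixel's spectra from the two dates into one joint vector, build a "dictionary" from pixels assumed unchanged, and flag pixels whose joint vector cannot be reconstructed from it. All data below are synthetic.

```python
import numpy as np

def reconstruction_error(D, x):
    """Least-squares residual of x projected onto the column span of D."""
    coef, *_ = np.linalg.lstsq(D, x, rcond=None)
    return float(np.linalg.norm(D @ coef - x))

# Joint vectors stack a pixel's spectra from the two acquisition dates.
rng = np.random.default_rng(2)
n_bands = 4
base = rng.uniform(0.0, 1.0, size=(n_bands, 6))   # six pixels assumed unchanged
D = np.vstack([base, base * 1.1])                 # date 2 = gain-scaled date 1
x_unchanged = np.concatenate([base[:, 0], base[:, 0] * 1.1])
x_changed = np.concatenate([base[:, 0], rng.uniform(0, 1, n_bands)])
e_u = reconstruction_error(D, x_unchanged)
e_c = reconstruction_error(D, x_changed)
changed = e_c > e_u   # the paper selects this threshold automatically
```

An unchanged pixel obeys the same date-1/date-2 relationship as the dictionary atoms and reconstructs almost exactly; a changed pixel breaks that relationship and leaves a large residual.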
NASA Astrophysics Data System (ADS)
Al-Mansoori, Saeed; Kunhu, Alavi
2013-10-01
This paper proposes a blind multi-watermarking scheme based on designing two back-to-back encoders. The first encoder is implemented to embed a robust watermark into remote sensing imagery by applying a Discrete Cosine Transform (DCT) approach. Such a watermark is used in many applications to protect the copyright of the image. The second encoder embeds a fragile watermark using the 'SHA-1' hash function. The purpose behind embedding a fragile watermark is to prove the authenticity of the image (i.e. tamper-proofing). Thus, the proposed technique was developed as a result of new challenges with piracy of remote sensing imagery ownership. This led researchers to look for different means to secure the ownership of satellite imagery and prevent the illegal use of these resources. Therefore, the Emirates Institution for Advanced Science and Technology (EIAST) proposed utilizing an existing data security concept by embedding a digital signature, "watermark", into DubaiSat-1 satellite imagery. In this study, DubaiSat-1 images with 2.5 meter resolution are used as a cover and a colored EIAST logo is used as a watermark. In order to evaluate the robustness of the proposed technique, several attacks are applied, such as JPEG compression, rotation and synchronization attacks. Furthermore, tampering attacks are applied to prove image authenticity.
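A minimal sketch of the two layers follows, assuming quantization-based embedding in one mid-frequency DCT coefficient (the coefficient index and quantization step are arbitrary illustrative choices, not the paper's parameters) plus an SHA-1 digest of the image as the fragile layer.

```python
import numpy as np
import hashlib

def dct_matrix(n):
    """Orthonormal DCT-II matrix (numpy-only, so no scipy dependence)."""
    k = np.arange(n)[:, None]
    M = np.cos(np.pi * (2 * np.arange(n)[None, :] + 1) * k / (2 * n))
    M[0] *= np.sqrt(1.0 / n)
    M[1:] *= np.sqrt(2.0 / n)
    return M

def embed_bit(block, bit, coef=(3, 2), step=8.0):
    """Robust layer: quantize one mid-frequency DCT coefficient to a parity."""
    M = dct_matrix(block.shape[0])
    C = M @ block @ M.T
    q = np.round(C[coef] / step)
    if int(q) % 2 != bit:
        q += 1
    C[coef] = q * step
    return M.T @ C @ M          # inverse DCT of the modified coefficients

def extract_bit(block, coef=(3, 2), step=8.0):
    M = dct_matrix(block.shape[0])
    C = M @ block @ M.T
    return int(np.round(C[coef] / step)) % 2

rng = np.random.default_rng(3)
img = rng.uniform(0, 255, size=(8, 8))
wm = embed_bit(img, 1)
fragile = hashlib.sha1(np.round(wm, 4).tobytes()).hexdigest()  # fragile layer
```

The parity-quantized coefficient survives mild distortion (robust), while any pixel change after hashing invalidates the SHA-1 digest (fragile), which is the complementary pairing the abstract describes.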
A MEMS-based, wireless, biometric-like security system
NASA Astrophysics Data System (ADS)
Cross, Joshua D.; Schneiter, John L.; Leiby, Grant A.; McCarter, Steven; Smith, Jeremiah; Budka, Thomas P.
2010-04-01
We present a system for secure identification applications that is based upon biometric-like MEMS chips. The MEMS chips have unique frequency signatures resulting from fabrication process variations. The MEMS chips possess something analogous to a "voiceprint". The chips are vacuum encapsulated, rugged, and suitable for low-cost, high-volume mass production. Furthermore, the fabrication process is fully integrated with standard CMOS fabrication methods. One is able to operate the MEMS-based identification system similarly to a conventional RFID system: the reader (essentially a custom network analyzer) detects the power reflected across a frequency spectrum from a MEMS chip in its vicinity. We demonstrate prototype "tags" - MEMS chips placed on a credit card-like substrate - to show how the system could be used in standard identification or authentication applications. We have integrated power scavenging to provide DC bias for the MEMS chips through the use of a 915 MHz source in the reader and an RF-DC conversion circuit on the tag. The system enables a high level of protection against typical RFID hacking attacks. There is no need for signal encryption, so back-end infrastructure is minimal. We believe this system would make a viable low-cost, high-security system for a variety of identification and authentication applications.
Towards a Scalable Group Vehicle-based Security System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, Jason M
2016-01-01
In August 2014, the National Highway Traffic Safety Administration (NHTSA) proposed new rulemaking to require V2V communication in light vehicles. To establish trust in the basic safety messages (BSMs) that are exchanged by vehicles to improve driver safety, a vehicle public key infrastructure (VPKI) is required. We outline a system where a group or groups of vehicles manage and generate their own BSM signing keys and authenticating certificates -- a Vehicle-Based Security System (VBSS). Based on our preliminary examination, we assert the mechanisms exist to implement a VBSS that supports V2V communications; however, maintaining uniform trust throughout the system while protecting individual privacy does require reliance on nascent group signature technology, which may require a significant amount of communication overhead for trust maintenance. To better evaluate the VBSS approach, we compare it to the proposed Security Credential Management System (SCMS) in four major areas including bootstrapping, pseudonym provisioning, BSM signing and authentication, and revocation. System scale, driver privacy, and the distribution and dynamics of participants make designing an effective VPKI an interesting and challenging problem; no clear-cut strategy exists to satisfy the security and privacy expectations in a highly efficient way. More work is needed in VPKI research, so the life-saving promise of V2V technology can be achieved.
On the nature of short and long gamma-ray bursts
NASA Astrophysics Data System (ADS)
Ruffini, Remo; Fryer, Chris; Muccino, Marco; Rueda Hernandez, Jorge
2016-03-01
For a long GRB (L-GRB) the induced gravitational collapse (IGC) paradigm proposes as progenitor a binary system made up of a carbon-oxygen core undergoing a supernova (SN) that triggers hypercritical accretion onto a neutron star (NS) companion. For a short GRB (S-GRB), a NS-NS merger is adopted. We divide L-GRBs and S-GRBs into subclasses, depending on whether or not a black hole (BH) is formed. For long bursts, when no BH is formed we have the X-ray flashes (XRFs), with isotropic energy Eiso <= 10^52 erg and rest-frame spectral peak energy Ep,i <= 200 keV. When a BH is formed we have authentic L-GRBs, with Eiso > 10^52 erg and Ep,i > 200 keV. For short bursts, when no BH is formed we have short gamma-ray flashes (S-GRFs) with Eiso <= 10^52 erg and Ep,i <= 2 MeV, while authentic S-GRBs occur if a BH is formed, with Eiso > 10^52 erg and Ep,i > 2 MeV. We give examples and observational signatures of the four subclasses. In the case of S-GRBs and BdHNe, evidence is given of the coincidence of the onset of the high-energy GeV emission with the birth of a Kerr-Newman BH.
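The four-subclass scheme reduces to a simple decision rule; the helper below is a toy restatement of the thresholds quoted above (the long/short split is taken as given, and boundary cases where only one threshold is exceeded are simplified to a single conjunction).

```python
def grb_subclass(duration, e_iso, e_p):
    """Toy classifier for the four subclasses described above.
    duration: 'long' or 'short'; e_iso in erg; e_p (rest-frame peak) in keV."""
    if duration == "long":
        if e_iso > 1e52 and e_p > 200:      # BH formed
            return "L-GRB (BH formed)"
        return "XRF (no BH)"
    if e_iso > 1e52 and e_p > 2000:         # short-burst thresholds: 2 MeV
        return "S-GRB (BH formed)"
    return "S-GRF (no BH)"
```

For example, a short burst with Eiso of 1e53 erg and Ep,i of 3 MeV falls in the authentic S-GRB (BH-forming) subclass.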
Protection of Health Imagery by Region Based Lossless Reversible Watermarking Scheme
Priya, R. Lakshmi; Sadasivam, V.
2015-01-01
Providing authentication and integrity in medical images is a challenge, and this work proposes a new blind fragile region based lossless reversible watermarking technique to improve the trustworthiness of medical images. The proposed technique embeds the watermark using a reversible least significant bit embedding scheme. The scheme combines hashing, compression, and digital signature techniques to create a content dependent watermark, making use of the compressed region of interest (ROI) for recovery of the ROI as reported in the literature. Experiments were carried out to prove the performance of the scheme, and the assessment reveals that the ROI is extracted intact and that the PSNR values obtained indicate that the presented scheme offers strong protection for health imagery. PMID:26649328
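A simplified, non-reversible sketch of the content-dependent LSB idea follows: hash the ROI and hide the digest in the least significant bits of another region. SHA-256, the 4-byte digest length, and the region sizes are arbitrary illustrative choices, and the reversible/compression machinery of the actual scheme is omitted.

```python
import numpy as np
import hashlib

def embed_lsb(img, bits):
    """Embed a bit sequence into pixel LSBs (simplified, non-reversible sketch)."""
    flat = img.flatten()                    # flatten() copies, original untouched
    assert len(bits) <= flat.size
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.array(bits, dtype=flat.dtype)
    return flat.reshape(img.shape)

def extract_lsb(img, n):
    return (img.flatten()[:n] & 1).tolist()

rng = np.random.default_rng(4)
roi = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)     # region of interest
border = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)  # embedding region
digest = hashlib.sha256(roi.tobytes()).digest()[:4]         # content-dependent mark
bits = [(byte >> i) & 1 for byte in digest for i in range(8)]
marked = embed_lsb(border, bits)
```

At verification time the receiver recomputes the ROI hash and compares it with the extracted bits; any tampering with the ROI breaks the match.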
NASA Astrophysics Data System (ADS)
Kiktenko, E. O.; Pozhar, N. O.; Anufriev, M. N.; Trushechkin, A. S.; Yunusov, R. R.; Kurochkin, Y. V.; Lvovsky, A. I.; Fedorov, A. K.
2018-07-01
Blockchain is a distributed database which is cryptographically protected against malicious modifications. While promising for a wide range of applications, current blockchain platforms rely on digital signatures, which are vulnerable to attacks by means of quantum computers. The same, albeit to a lesser extent, applies to cryptographic hash functions that are used in preparing new blocks, so parties with access to quantum computation would have an unfair advantage in procuring mining rewards. Here we propose a possible solution to the quantum era blockchain challenge and report an experimental realization of a quantum-safe blockchain platform that utilizes quantum key distribution across an urban fiber network for information-theoretically secure authentication. These results address important questions about the realizability and scalability of quantum-safe blockchains for commercial and governmental applications.
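A toy hash-chained ledger with symmetric authentication can illustrate the structure. Here an HMAC with a pre-shared key stands in for the information-theoretically secure authentication that QKD-distributed keys would enable in the platform described; block layout, key handling, and field names are all invented for the sketch.

```python
import hashlib, hmac, json

def make_block(prev_hash, payload, key):
    """Block with a hash link and a MAC standing in for QKD-based authentication."""
    body = json.dumps({"prev": prev_hash, "data": payload}, sort_keys=True).encode()
    return {"prev": prev_hash, "data": payload,
            "mac": hmac.new(key, body, hashlib.sha256).hexdigest(),
            "hash": hashlib.sha256(body).hexdigest()}

def verify_chain(chain, key):
    prev = "0" * 64
    for blk in chain:
        body = json.dumps({"prev": blk["prev"], "data": blk["data"]},
                          sort_keys=True).encode()
        if blk["prev"] != prev:
            return False
        if blk["hash"] != hashlib.sha256(body).hexdigest():
            return False
        if not hmac.compare_digest(blk["mac"],
                                   hmac.new(key, body, hashlib.sha256).hexdigest()):
            return False
        prev = blk["hash"]
    return True

key = b"shared-qkd-key"          # in the paper, keys come from the QKD links
chain, prev = [], "0" * 64
for tx in ["tx1", "tx2", "tx3"]:
    blk = make_block(prev, tx, key)
    chain.append(blk)
    prev = blk["hash"]
```

Replacing the signature primitive with keyed authentication is what removes the quantum-vulnerable public-key step; the hash chaining itself is unchanged.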
Non-invasive detection of cocaine dissolved in beverages using displaced Raman spectroscopy.
Eliasson, C; Macleod, N A; Matousek, P
2008-01-21
We demonstrate the potential of Raman spectroscopy to detect cocaine concealed inside transparent glass bottles containing alcoholic beverages. A clear Raman signature of cocaine with good signal-to-noise was obtained from an approximately 300 g solution of adulterated cocaine (purity 75%) in a 0.7 L authentic brown bottle of rum with 1 s acquisition time. The detection limit was estimated to be of the order of 9 g of pure cocaine per 0.7 L (approximately 0.04 mol L^-1) with 1 s acquisition time. The technique holds great promise for the fast, non-invasive detection of concealed illicit compounds inside beverages using portable Raman instruments, thus permitting drug trafficking to be combated more effectively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strait, R.S.; Wagner, E.E.
1994-07-01
The US Department of Energy (DOE) Office of Safeguards and Security initiated the DOE Integrated Security System / Electronic Transfer (DISS/ET) for the purpose of reducing the time required to process security clearance requests. DISS/ET will be an integrated system using electronic commerce technologies for the collection and processing of personnel security clearance data, and its transfer between DOE local security clearance offices, DOE Operations Offices, and the Office of Personnel Management. The system will use electronic forms to collect clearance applicant data. The forms data will be combined with electronic fingerprint images and packaged in a secure encrypted electronic mail envelope for transmission across the Internet. Information provided by the applicant will be authenticated using digital signatures. All processing will be done electronically.
A bimodal biometric identification system
NASA Astrophysics Data System (ADS)
Laghari, Mohammad S.; Khuwaja, Gulzar A.
2013-03-01
Biometrics consists of methods for uniquely recognizing humans based upon one or more intrinsic physical or behavioral traits. Physical traits relate to the shape of the body; behavioral traits relate to the behavior of a person. However, biometric authentication systems suffer from imprecision and difficulty in person recognition due to a number of reasons, and no single biometric is expected to effectively satisfy the requirements of all verification and/or identification applications. Bimodal biometric systems are expected to be more reliable due to the presence of two pieces of evidence and also be able to meet the severe performance requirements imposed by various applications. This paper presents a neural network based bimodal biometric identification system using human face and handwritten signature features.
Design of Distortion-Invariant Optical ID Tags for Remote Identification and Verification of Objects
NASA Astrophysics Data System (ADS)
Pérez-Cabré, Elisabet; Millán, María Sagrario; Javidi, Bahram
Optical identification (ID) tags [1] have a promising future in a number of applications such as the surveillance of vehicles in transportation, control of restricted areas for homeland security, item tracking on conveyor belts or other industrial environments, etc. More specifically, the passive optical ID tag [1] was introduced as an optical code containing a signature (that is, a characteristic image or other relevant information of the object), which permits its real-time remote detection and identification. Since their introduction in the literature [1], some contributions have been proposed to increase their usefulness and robustness. To increase security and avoid counterfeiting, the signature was introduced in the optical code as an encrypted function [2-5] following the double-phase encryption technique [6]. Moreover, the design of the optical ID tag was done in such a way that tolerance to variations in scale and rotation was achieved [2-5]. To do that, the encrypted information was multiplexed and distributed in the optical code following an appropriate topology. Further studies were carried out to analyze the influence of different sources of noise. In some proposals [5, 7], the designed ID tag consists of two optical codes where the complex-valued encrypted signature was separately introduced in two real-valued functions according to its magnitude and phase distributions. This solution was introduced to overcome some difficulties in the readout of complex values in outdoor environments. Recently, the fully-phase encryption technique [8] has been proposed to increase the noise robustness of the authentication system.
Motor signatures of emotional reactivity in frontotemporal dementia.
Marshall, Charles R; Hardy, Chris J D; Russell, Lucy L; Clark, Camilla N; Bond, Rebecca L; Dick, Katrina M; Brotherhood, Emilie V; Mummery, Cath J; Schott, Jonathan M; Rohrer, Jonathan D; Kilner, James M; Warren, Jason D
2018-01-18
Automatic motor mimicry is essential to the normal processing of perceived emotion, and disrupted automatic imitation might underpin socio-emotional deficits in neurodegenerative diseases, particularly the frontotemporal dementias. However, the pathophysiology of emotional reactivity in these diseases has not been elucidated. We studied facial electromyographic responses during emotion identification on viewing videos of dynamic facial expressions in 37 patients representing canonical frontotemporal dementia syndromes versus 21 healthy older individuals. Neuroanatomical associations of emotional expression identification accuracy and facial muscle reactivity were assessed using voxel-based morphometry. Controls showed characteristic profiles of automatic imitation, and this response predicted correct emotion identification. Automatic imitation was reduced in the behavioural and right temporal variant groups, while the normal coupling between imitation and correct identification was lost in the right temporal and semantic variant groups. Grey matter correlates of emotion identification and imitation were delineated within a distributed network including primary visual and motor, prefrontal, insular, anterior temporal and temporo-occipital junctional areas, with common involvement of supplementary motor cortex across syndromes. Impaired emotional mimesis may be a core mechanism of disordered emotional signal understanding and reactivity in frontotemporal dementia, with implications for the development of novel physiological biomarkers of socio-emotional dysfunction in these diseases.
NASA Astrophysics Data System (ADS)
Larnier, H.; Sailhac, P.; Chambodut, A.
2018-01-01
Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of the continuous wavelet transform. We focus specifically on three types of sources, namely atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it to real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, an overseas French territory. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess the quality of response functions obtained through processing.
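A numpy-only sketch of the detection principle follows, assuming a Morlet wavelet and a synthetic wave packet standing in for an atmospheric; the wavelet parameters, sampling rate, and signal are illustrative, not those of the study.

```python
import numpy as np

def morlet_cwt(x, fs, freqs, w0=6.0):
    """Naive continuous wavelet transform with a Morlet wavelet (numpy only)."""
    out = np.empty((len(freqs), len(x)), dtype=complex)
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)              # scale giving centre frequency f
        half = int(4 * s * fs) + 1            # truncate the kernel at ~4 sigma
        t = np.arange(-half, half + 1) / fs
        psi = np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2) / np.sqrt(s)
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")  # correlation
    return out

# Synthetic 'atmospheric': a short 500 Hz wave packet buried in noise.
fs, n = 4000, 4000
rng = np.random.default_rng(5)
x = 0.1 * rng.standard_normal(n)
x[1800:2000] += np.hanning(200) * np.sin(2 * np.pi * 500 * np.arange(200) / fs)
freqs = np.array([125.0, 250.0, 500.0, 1000.0])
W = np.abs(morlet_cwt(x, fs, freqs))
f_idx, t_idx = np.unravel_index(int(np.argmax(W)), W.shape)  # signature peak
```

The event shows up as a localized maximum in the time-frequency plane at its arrival time and dominant frequency, which is what makes automatic detection by signature possible.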
NASA Astrophysics Data System (ADS)
Riesmeier, Joerg; Eichelberg, Marco; Kleber, Klaus; Groenemeyer, Dietrich H.; Oosterwijk, Herman J.; Jensch, Peter F.
2002-05-01
With the release of 'DICOM Structured Reporting' (SR) as an official extension of the standard about two years ago, DICOM has entered a new domain that is only indirectly related to medical imaging. Basically, DICOM SR is a general model that allows medical reports to be encoded in a structured manner in DICOM's tag-based format. Therefore, the existing DICOM infrastructure can be used to archive and communicate structured reports, with only relatively small changes to existing systems. As a consequence of the introduction of medical reports in a digital form, the relevance of security measures increases significantly. We have developed a prototype implementation of DICOM structured reporting together with the new security extensions for secure transport connections and digital signatures. The application can create, read and modify any SR document, digitally sign an SR document in whole or in part, and transmit such documents over a network. While the secure transport connection protects data from modifications or unauthorized access only during transmission, digital signatures provide a lifetime integrity check and, therefore, maintain the legal document status of structured reports. The application has been successfully demonstrated at RSNA 2000 and ECR 2001, and is freely available on the Internet.
Lightweight and scalable secure communication in VANET
NASA Astrophysics Data System (ADS)
Zhu, Xiaoling; Lu, Yang; Zhu, Xiaojuan; Qiu, Shuwei
2015-05-01
To prevent messages from being tampered with or forged in a vehicular ad hoc network (VANET), the digital signature method is adopted by IEEE 1609.2. However, the costs of the method are excessively high for large-scale networks. This paper addresses the issue with a secure communication framework that introduces some lightweight cryptography primitives. In our framework, point-to-point and broadcast communications for vehicle-to-infrastructure (V2I) and vehicle-to-vehicle (V2V) are studied, mainly based on symmetric cryptography. A new issue incurred is symmetric key management. Thus, we develop key distribution and agreement protocols for two-party keys and group keys under different environments, whether a road side unit (RSU) is deployed or not. The analysis shows that our protocols provide confidentiality, authentication, perfect forward secrecy, forward secrecy and backward secrecy. The proposed group key agreement protocol especially solves the key leak problem caused by members joining or leaving in existing key agreement protocols. Due to aggregated signature and the substitution of XOR for point addition, the average computation and communication costs do not significantly increase with the number of vehicles; hence, our framework provides good scalability.
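The core cost saving, replacing per-message digital signatures with symmetric-key authentication, can be sketched as follows. The message fields, key names, and key handling are hypothetical (real BSM contents are defined by SAE J2735, and the group key would come from the paper's agreement protocol).

```python
import hmac, hashlib, json

def sign_bsm(msg, key):
    """Authenticate a basic safety message with a symmetric MAC, the lightweight
    stand-in for the per-message digital signatures of IEEE 1609.2."""
    body = json.dumps(msg, sort_keys=True).encode()
    return hmac.new(key, body, hashlib.sha256).hexdigest()

def verify_bsm(msg, tag, key):
    return hmac.compare_digest(tag, sign_bsm(msg, key))

group_key = b"group-session-key"   # agreed via the key-agreement protocol
bsm = {"id": "veh42", "speed": 17.3, "heading": 92}
tag = sign_bsm(bsm, group_key)
```

An HMAC costs a few hash evaluations instead of an elliptic-curve point multiplication, which is the kind of asymmetric-to-symmetric trade the framework exploits; the price is the key management problem the paper then has to solve.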
Lessons learned from the implementation of clinical messaging systems.
Barnes, Mike
2007-10-11
The Regenstrief Institute has designed and implemented two clinical messaging systems over the past six years, both called DOCS4DOCS. These systems receive HL7 messages from data sources and deliver results to clinicians via the web, fax, or as HL7 directed to an EMR. This paper focuses on some of the lessons we have learned, both good and bad. We discuss important issues in clinical messaging including provider mapping, document delivery and duplicate prevention, creating uniform HL7 outbound feeds, user authentication, the problems of allowing Active-X controls, why automatic printing of documents is not important although a frequently requested feature, and assorted other pearls of wisdom we have acquired.
System and method for authentication
Duerksen, Gary L.; Miller, Seth A.
2015-12-29
Described are methods and systems for determining authenticity. For example, the method may include providing an object of authentication, capturing characteristic data from the object of authentication, deriving authentication data from the characteristic data of the object of authentication, and comparing the authentication data with an electronic database comprising reference authentication data to provide an authenticity score for the object of authentication. The reference authentication data may correspond to one or more reference objects of authentication other than the object of authentication.
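The compare-and-score flow described in the patent abstract can be sketched minimally. The feature vectors and the distance-based scoring metric below are hypothetical illustrations, not the claimed method.

```python
import numpy as np

def authenticity_score(features, reference_db):
    """Compare captured characteristic data against reference records;
    score = best similarity over the database (hypothetical metric)."""
    f = np.asarray(features, dtype=float)
    best = 0.0
    for ref in reference_db:
        r = np.asarray(ref, dtype=float)
        sim = 1.0 / (1.0 + np.linalg.norm(f - r))   # map distance to a (0, 1] score
        best = max(best, sim)
    return best

reference_db = [[1.0, 2.0, 3.0], [0.5, 0.1, 0.9]]   # reference objects' data
genuine = authenticity_score([1.0, 2.0, 3.0], reference_db)
fake = authenticity_score([9.0, 9.0, 9.0], reference_db)
```

A genuine object's derived data lies close to some reference record and scores near 1, while unrelated data scores near 0; a decision threshold on the score completes the verification step.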
Authentic Teachers: Student Criteria Perceiving Authenticity of Teachers
ERIC Educational Resources Information Center
De Bruyckere, Pedro; Kirschner, Paul A.
2016-01-01
Authenticity is seen by many as a key for good learning and education. There is talk of authentic instruction, authentic learning, authentic problems, authentic assessment, authentic tools and authentic teachers. The problem is that while authenticity is an often-used adjective describing almost all aspects of teaching and learning, the concept…
Qiu, Shi; Yang, Wen-Zhi; Yao, Chang-Liang; Qiu, Zhi-Dong; Shi, Xiao-Jian; Zhang, Jing-Xian; Hou, Jin-Jun; Wang, Qiu-Rong; Wu, Wan-Ying; Guo, De-An
2016-07-01
A key segment in authentication of herbal medicines is the establishment of robust biomarkers that embody the intrinsic metabolites difference independent of the growing environment or processing technics. We present a strategy by nontargeted metabolomics and "Commercial-homophyletic" comparison-induced biomarkers verification with new bioinformatic vehicles, to improve the efficiency and reliability in authentication of herbal medicines. The chemical differentiation of five different parts (root, leaf, flower bud, berry, and seed) of Panax ginseng was illustrated as a case study. First, an optimized ultra-performance liquid chromatography/quadrupole time-of-flight-MS(E) (UPLC/QTOF-MS(E)) approach was established for global metabolites profiling. Second, UNIFI™ combined with search of an in-house library was employed to automatically characterize the metabolites. Third, pattern recognition multivariate statistical analysis of the MS(E) data of different parts of commercial and homophyletic samples were separately performed to explore potential biomarkers. Fourth, potential biomarkers deduced from commercial and homophyletic root and leaf samples were cross-compared to infer robust biomarkers. Fifth, discriminating models by artificial neutral network (ANN) were established to identify different parts of P. ginseng. Consequently, 164 compounds were characterized, and 11 robust biomarkers enabling the differentiation among root, leaf, flower bud, and berry, were discovered by removing those structurally unstable and possibly processing-related ones. The ANN models using the robust biomarkers managed to exactly discriminate four different parts and root adulterant with leaf as well. Conclusively, biomarkers verification using homophyletic samples conduces to the discovery of robust biomarkers. The integrated strategy facilitates authentication of herbal medicines in a more efficient and more intelligent manner. Copyright © 2016 Elsevier B.V. All rights reserved.
Pi, Yiming
2017-01-01
The frequency of terahertz radar ranges from 0.1 THz to 10 THz, which is higher than that of microwaves. Multi-modal signals, including high-resolution range profile (HRRP) and Doppler signatures, can be acquired by the terahertz radar system. These two kinds of information are commonly used in automatic target recognition; however, dynamic gesture recognition is rarely discussed in the terahertz regime. In this paper, a dynamic gesture recognition system using a terahertz radar is proposed, based on multi-modal signals. The HRRP sequences and Doppler signatures were first achieved from the radar echoes. Considering the electromagnetic scattering characteristics, a feature extraction model is designed using location parameter estimation of scattering centers. Dynamic Time Warping (DTW) extended to multi-modal signals is used to accomplish the classifications. Ten types of gesture signals, collected from a terahertz radar, are applied to validate the analysis and the recognition system. The results of the experiment indicate that the recognition rate reaches more than 91%. This research verifies the potential applications of dynamic gesture recognition using a terahertz radar. PMID:29267249
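The DTW matching at the heart of the classifier can be sketched for 1-D sequences. The gesture templates below are toy data, and the paper's extension of DTW to multi-modal HRRP/Doppler signals and its scattering-center features are omitted.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(query, templates):
    """Nearest-template label under DTW distance."""
    return min(templates, key=lambda lbl: dtw_distance(query, templates[lbl]))

templates = {"wave": [0, 1, 2, 1, 0, 1, 2, 1, 0], "push": [0, 1, 2, 3, 4, 4, 4]}
query = [0, 0, 1, 1, 2, 1, 0, 1, 2, 1, 0]   # a time-stretched 'wave'
label = classify(query, templates)
```

The warping path absorbs the time stretching, so a gesture performed at a different speed still matches its template, which is why DTW suits dynamic gesture signatures.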
Automatic Identification of Application I/O Signatures from Noisy Server-Side Traces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yang; Gunasekaran, Raghul; Ma, Xiaosong
2014-01-01
Competing workloads on a shared storage system cause I/O resource contention and application performance vagaries. This problem is already evident in today's HPC storage systems and is likely to become acute at exascale. We need more interaction between application I/O requirements and system software tools to help alleviate the I/O bottleneck, moving towards I/O-aware job scheduling. However, this requires rich techniques to capture application I/O characteristics, which remain evasive in production systems. Traditionally, I/O characteristics have been obtained using client-side tracing tools, with drawbacks such as non-trivial instrumentation/development costs, large trace traffic, and inconsistent adoption. We present a novel approach, I/O Signature Identifier (IOSI), to characterize the I/O behavior of data-intensive applications. IOSI extracts signatures from noisy, zero-overhead server-side I/O throughput logs that are already collected on today's supercomputers, without interfering with the compiling/execution of applications. We evaluated IOSI using the Spider storage system at Oak Ridge National Laboratory, the S3D turbulence application (running on 18,000 Titan nodes), and benchmark-based pseudo-applications. Through our experiments we confirmed that IOSI effectively extracts an application's I/O signature despite significant server-side noise. Compared to client-side tracing tools, IOSI is transparent, interface-agnostic, and incurs no overhead. Compared to alternative data alignment techniques (e.g., dynamic time warping), it offers higher signature accuracy and shorter processing time.
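One simple way to see why multiple noisy server-side runs let a signature emerge is sketched below. This is a heuristic illustration, not IOSI's actual algorithm (which aligns runs and handles varying noise): because background traffic from other workloads is unlikely to recur at the same offset in every run, an element-wise minimum across aligned runs suppresses it.

```python
import numpy as np

rng = np.random.default_rng(6)
# A bursty application I/O signature (synthetic throughput samples).
signature = np.concatenate([np.zeros(20), np.full(10, 5.0), np.zeros(20),
                            np.full(10, 5.0), np.zeros(20)])
# Server-side logs: the same application run three times, each polluted by
# non-negative background noise from other workloads sharing the storage.
runs = [signature + np.abs(rng.normal(0.0, 1.0, signature.size)) for _ in range(3)]
# Element-wise minimum keeps the recurring application pattern and discards
# noise spikes that appear in only some runs.
estimate = np.minimum.reduce(runs)
err_single = np.abs(runs[0] - signature).mean()
err_est = np.abs(estimate - signature).mean()
```

With three runs the mean error of the combined estimate is well below that of any single log, which mirrors the paper's premise that repeated executions plus cheap server-side logs suffice to recover a per-application signature.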
Bhattacharyya, Parthasarathi; Mondal, Ashok; Dey, Rana; Saha, Dipanjan; Saha, Goutam
2015-05-01
Auscultation is an important part of the clinical examination in different lung diseases. Objective analysis of lung sounds based on their underlying characteristics, and its subsequent automatic interpretation, may help clinical practice. We collected breath sounds from 8 normal subjects and 20 diffuse parenchymal lung disease (DPLD) patients using a newly developed instrument and then filtered off the heart sounds using a novel technology. The collected sounds were then analysed digitally for several characteristics, such as dynamical complexity, texture information and regularity index, to find and define unique digital signatures differentiating normality from abnormality. For convenience of testing, these characteristic signatures of normal and DPLD lung sounds were transformed into coloured visual representations. The predictive power of these images was validated by six independent observers, including three physicians. The proposed method gives a classification accuracy of 100% on composite features for both the normal lung sound signals and those from DPLD patients. When tested by independent observers on the visually transformed images, the positive predictive value for diagnosing normality and DPLD remained 100%. The lung sounds of normal and DPLD subjects could thus be differentiated and expressed according to their digital signatures, and on visual transformation to coloured images they retain 100% predictive power. This technique may assist physicians in diagnosing DPLD from visual images bearing the digital signature of the condition. © 2015 Asian Pacific Society of Respirology.
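The abstract does not define its dynamical-complexity or regularity measures. Sample entropy is one standard regularity measure applied to biomedical signals and is sketched here purely as an illustration of the kind of feature involved:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy of a signal: lower values mean a more regular
    (self-similar) signal. The tolerance r is a fraction of the
    signal's standard deviation (a common convention)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def match_count(mm):
        # count template pairs of length mm within Chebyshev distance r
        templates = np.array([x[i:i + mm] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1  # exclude the self-match
        return count

    b = match_count(m)
    a = match_count(m + 1)
    return -np.log(a / b)
```

A periodic breath-sound-like signal scores lower (more regular) than broadband noise.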
IRLooK: an advanced mobile infrared signature measurement, data reduction, and analysis system
NASA Astrophysics Data System (ADS)
Cukur, Tamer; Altug, Yelda; Uzunoglu, Cihan; Kilic, Kayhan; Emir, Erdem
2007-04-01
Infrared signature measurement capability plays a key role in the development of electronic warfare (EW) self-protection systems. In this article, the IRLooK system and its capabilities are introduced. IRLooK is an innovative mobile infrared signature measurement system whose design, manufacturing and integration were accomplished entirely by ASELSAN. IRLooK measures the infrared signatures of military and civil platforms such as fixed- and rotary-wing aircraft, tracked and wheeled vehicles, and navy vessels. It provides data acquisition, pre-processing, post-processing, analysis, storage and archiving over the short-wave, mid-wave and long-wave infrared spectrum by means of its high-resolution radiometric sensors and sophisticated software analysis tools. The sensor suite includes imaging and non-imaging radiometers and a spectroradiometer. Single or simultaneous multiple in-band measurements, as well as high-radiant-intensity measurements, can be performed. The system provides detailed information on the spectral, spatial and temporal infrared signature characteristics of targets, and also determines IR decoy characteristics. It is equipped with a high-quality, field-proven two-axis tracking mount to facilitate target tracking; manual or automatic tracking is achieved using a passive imaging tracker. The system also includes a high-quality weather station and field-calibration equipment, including cavity and extended-area blackbodies. The units composing the system are mounted on flat-bed trailers, and the complete system is designed to be transportable by large-body aircraft.
Automated secured cost effective key refreshing technique to enhance WiMAX privacy key management
NASA Astrophysics Data System (ADS)
Sridevi, B.; Sivaranjani, S.; Rajaram, S.
2013-01-01
The rapid growth and pervasive use of wireless communication have transformed the way we communicate in all walks of life. IEEE 802.16, promoted and launched by an industrial forum, specifies a fixed, bandwidth-rich wireless network termed Worldwide Interoperability for Microwave Access (WiMAX). This technology enables seamless delivery of wireless broadband service for fixed and/or mobile users. A persistent difficulty is the long delay that occurs during handoff management in every network. Mobile WiMAX employs an authenticated key management protocol as part of handoff management, in which the Base Station (BS) controls the distribution of keying material to the Mobile Station (MS). The protocol employed is Privacy Key Management Version 2 with Extensible Authentication Protocol (PKMv2-EAP), which is responsible for the normal and periodic authorization of MSs, reauthorization, and key refreshing. The Authorization Key (AK) and Traffic Encryption Key (TEK) play a vital role in the key exchange. When the lifetime of a key expires, the MS has to request a new key from the BS, which leads to a repetition of authorization, authentication and key exchange. To avoid service interruption during reauthorization, the BS transmits two active keys to the MS at the same time. The consequences of the existing scheme are heavy bandwidth utilization, time consumption and large key storage; it also suffers from man-in-the-middle and impersonation attacks due to the lack of security in the key exchange. This paper designs an automatic mutual key refreshing scheme to minimize bandwidth utilization, key storage and time consumption by proposing the Previous key and Iteration Based Key Refreshing Function (PKIBKRF). By integrating PKIBKRF into key generation, the simulation results indicate that bandwidth utilization and key storage are reduced by 21.8% and PKMv2 mutual authentication time is reduced by 66.67%.
The proposed work is simulated with QualNet, backed by MATLAB for processing and MySQL for storing keys.
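The exact PKIBKRF construction is not specified in the abstract. A minimal sketch of the general idea, deriving each new key deterministically from the previous key and an iteration counter so that BS and MS can refresh keys in lockstep without a fresh exchange, might look like this (HMAC-SHA256 and the context label are assumptions, not the paper's definition):

```python
import hmac
import hashlib

def refresh_key(prev_key: bytes, iteration: int,
                context: bytes = b"TEK-refresh") -> bytes:
    """Derive the next traffic key from the previous key and an
    iteration counter. Both sides compute the same value locally,
    avoiding a repeated key-exchange handshake. Illustrative only."""
    msg = context + iteration.to_bytes(4, "big")
    return hmac.new(prev_key, msg, hashlib.sha256).digest()
```

Because the derivation is one-way, compromise of a refreshed key does not reveal earlier keys derived from the same chain.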
Spinelli, Lionel; Carpentier, Sabrina; Montañana Sanchis, Frédéric; Dalod, Marc; Vu Manh, Thien-Phong
2015-10-19
Recent advances in the analysis of high-throughput expression data have led to the development of tools that scale up their focus from the single-gene to the gene-set level. For example, the popular Gene Set Enrichment Analysis (GSEA) algorithm can detect moderate but coordinated expression changes of groups of presumably related genes between pairs of experimental conditions. This considerably improves the extraction of information from high-throughput gene expression data. However, although many gene sets covering a large panel of biological fields are available in public databases, the ability to generate home-made gene sets relevant to one's biological question is crucial but remains a substantial challenge for most biologists lacking statistical or bioinformatics expertise. This is all the more the case when attempting to define a gene set specific to one condition compared with many others. Thus, there is a crucial need for easy-to-use software for the generation of relevant home-made gene sets from complex datasets, their use in GSEA, and the correction of the results when applied to multiple comparisons of many experimental conditions. We developed BubbleGUM (GSEA Unlimited Map), a tool that automatically extracts molecular signatures from transcriptomic data and performs exhaustive GSEA with multiple-testing correction. One original feature of BubbleGUM resides in its capacity to integrate and compare numerous GSEA results in an easy-to-grasp graphical representation. We applied our method to generate transcriptomic fingerprints for murine cell types and to assess their enrichment in human cell types; this analysis allowed us to confirm homologies between mouse and human immunocytes. BubbleGUM is an open-source software tool that automatically generates molecular signatures from complex expression datasets and directly assesses their enrichment by GSEA on independent datasets.
Enrichments are displayed in a graphical output that helps interpret the results. This innovative methodology has recently been used to answer important questions in functional genomics, such as the degree of similarity between microarray datasets from different laboratories or from different experimental models or clinical cohorts. BubbleGUM is executable through an intuitive interface so that both bioinformaticians and biologists can use it. It is available at http://www.ciml.univ-mrs.fr/applications/BubbleGUM/index.html .
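The core GSEA statistic that such tools apply exhaustively is a running-sum enrichment score. A minimal unweighted sketch (real GSEA weights each step by the gene's correlation with the phenotype; this simplification is mine):

```python
def enrichment_score(ranked_genes, gene_set):
    """GSEA-style running-sum enrichment score, unweighted variant:
    walk down the ranked gene list, stepping up when a gene belongs
    to the set and down otherwise; the ES is the maximum deviation
    of the running sum from zero."""
    hits = [g in gene_set for g in ranked_genes]
    n, nh = len(ranked_genes), sum(hits)
    up, down = 1.0 / nh, 1.0 / (n - nh)
    running, best = 0.0, 0.0
    for h in hits:
        running += up if h else -down
        if abs(running) > abs(best):
            best = running
    return best
```

Genes concentrated at the top of the ranking give an ES near +1; genes concentrated at the bottom give an ES near -1.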
NASA Astrophysics Data System (ADS)
Packard, Corey D.; Klein, Mark D.; Viola, Timothy S.; Hepokoski, Mark A.
2016-10-01
The ability to predict electro-optical (EO) signatures of diverse targets against cluttered backgrounds is paramount for signature evaluation and/or management. Knowledge of target and background signatures is essential for a variety of defense-related applications. While there is no substitute for measured target and background signatures to determine contrast and detection probability, the capability to simulate any mission scenario with desired environmental conditions is a tremendous asset for defense agencies. In this paper, a systematic process for the thermal and visible-through-infrared simulation of camouflaged human dismounts in cluttered outdoor environments is presented. This process, utilizing the thermal and EO/IR radiance simulation tool TAIThermIR (and MuSES), provides a repeatable and accurate approach for analyzing contrast, signature and detectability of humans in multiple wavebands. The engineering workflow required to combine natural weather boundary conditions and the human thermoregulatory module developed by ThermoAnalytics is summarized. The procedure includes human geometry creation, human segmental physiology description and transient physical temperature prediction using environmental boundary conditions and active thermoregulation. Radiance renderings, which use Sandford-Robertson BRDF optical surface property descriptions and are coupled with MODTRAN for the calculation of atmospheric effects, are demonstrated. Sensor effects such as optical blurring and photon noise can be optionally included, increasing the accuracy of detection probability outputs that accompany each rendering. This virtual evaluation procedure has been extensively validated and provides a flexible evaluation process that minimizes the difficulties inherent in human-subject field testing. Defense applications such as detection probability assessment, camouflage pattern evaluation, conspicuity tests and automatic target recognition are discussed.
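As an illustration of the radiometric core of such a workflow, the sketch below integrates Planck blackbody radiance over a sensor band to obtain a target-background contrast. Emissivity, atmospheric transmission (MODTRAN's role in the paper) and sensor effects are deliberately omitted, so this is a toy calculation, not the TAIThermIR/MuSES procedure:

```python
import numpy as np

def planck_radiance(wavelength_um, T):
    """Blackbody spectral radiance in W·m^-2·sr^-1·um^-1."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    lam = wavelength_um * 1e-6  # um -> m
    return (2 * h * c**2 / lam**5) / (np.exp(h * c / (lam * k * T)) - 1) * 1e-6

def band_contrast(T_target, T_background, band=(8.0, 12.0), n=200):
    """Band-integrated radiance contrast (rectangle-rule integral)
    between target and background temperatures, e.g. over LWIR."""
    lam = np.linspace(band[0], band[1], n)
    dlam = lam[1] - lam[0]
    Lt = planck_radiance(lam, T_target).sum() * dlam
    Lb = planck_radiance(lam, T_background).sum() * dlam
    return Lt - Lb
```

A human dismount a few kelvin above the background yields a positive LWIR contrast in this model.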
NASA Astrophysics Data System (ADS)
Packard, Corey D.; Viola, Timothy S.; Klein, Mark D.
2017-10-01
The ability to predict spectral electro-optical (EO) signatures for various targets against realistic, cluttered backgrounds is paramount for rigorous signature evaluation. Knowledge of background and target signatures, including plumes, is essential for a variety of scientific and defense-related applications including contrast analysis, camouflage development, automatic target recognition (ATR) algorithm development and scene material classification. The capability to simulate any desired mission scenario with forecast or historical weather is a tremendous asset for defense agencies, serving as a complement to (or substitute for) target and background signature measurement campaigns. In this paper, a systematic process for the physical temperature and visible-through-infrared radiance prediction of several diverse targets in a cluttered natural environment scene is presented. The ability of a virtual airborne sensor platform to detect and differentiate targets from a cluttered background, from a variety of sensor perspectives and across numerous wavelengths in differing atmospheric conditions, is considered. The process described utilizes the thermal and radiance simulation software MuSES and provides a repeatable, accurate approach for analyzing wavelength-dependent background and target (including plume) signatures in multiple band-integrated wavebands (multispectral) or hyperspectrally. The engineering workflow required to combine 3D geometric descriptions, thermal material properties, natural weather boundary conditions, all modes of heat transfer and spectral surface properties is summarized. This procedure includes geometric scene creation, material and optical property attribution, and transient physical temperature prediction. Radiance renderings, based on ray-tracing and the Sandford-Robertson BRDF model, are coupled with MODTRAN for the inclusion of atmospheric effects. 
This virtual hyperspectral/multispectral radiance prediction methodology has been extensively validated and provides a flexible process for signature evaluation and algorithm development.
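Per spectral sample, the coupling with MODTRAN mentioned above amounts to the standard apparent-radiance relation: surface-leaving radiance attenuated by the path transmittance plus path-scattered/emitted radiance. A one-line sketch (variable names are mine, not MuSES/MODTRAN API names):

```python
def apparent_radiance(L_surface, tau_path, L_path):
    """Sensor-reaching radiance at one wavelength: surface-leaving
    radiance times path transmittance, plus path radiance."""
    return tau_path * L_surface + L_path
```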
[Security aspects on the Internet].
Seibel, R M; Kocher, K; Landsberg, P
2000-04-01
Is it possible to use the Internet as a secure medium for the transport of telemedicine, and which risks exist for routine use? In this article, state-of-the-art security methods are analysed. Telemedicine over the Internet carries severe risks, because patient and hospital data on a secure intranet can be manipulated once it is connected to the Web. Establishing a firewall and introducing the Health Professional Card (HPC) minimize the risk of unauthorized access to the hospital server. The HPC provides good security through digital signatures and authentication of the host and client of medical data. For secure e-mail, PGP (Pretty Good Privacy) is an easy-to-use standard protocol. Exact planning of all activities and compliance with legal regulations are important prerequisites for reducing security risks on the Internet.
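PGP itself relies on public-key cryptography (typically via a library such as the `cryptography` package in Python). As a stdlib-only illustration of the weaker but related idea of message authentication with a shared secret, not PGP's actual mechanism, an HMAC sign/verify pair looks like this:

```python
import hmac
import hashlib

def sign(message: bytes, key: bytes) -> str:
    """Attach an HMAC-SHA256 tag proving the message came from a
    holder of the shared key and was not altered in transit."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str, key: bytes) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(message, key), tag)
```

Unlike a true digital signature, this does not provide non-repudiation, since both parties hold the same key.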
Uses of software in digital image analysis: a forensic report
NASA Astrophysics Data System (ADS)
Sharma, Mukesh; Jha, Shailendra
2010-02-01
Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. In this paper, the authors explain these tasks, grouped into three categories: image compression, image enhancement and restoration, and measurement extraction, with the help of examples such as signature comparison, counterfeit currency comparison and footwear sole impressions analysed using the software Canvas and CorelDRAW.
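As an example of the image-enhancement task mentioned above, histogram equalization is a standard technique for bringing out faint patterns such as partial fingerprints. This NumPy sketch is generic, not the specific procedure used with Canvas or CorelDRAW:

```python
import numpy as np

def equalize_histogram(img):
    """Contrast enhancement by histogram equalization: map grey
    levels through the normalized cumulative histogram, spreading
    faint detail across the full 0-255 range."""
    img = np.asarray(img, dtype=np.uint8)
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first non-empty grey level
    scale = max(cdf[-1] - cdf_min, 1)  # guard against constant images
    lut = np.clip(np.round((cdf - cdf_min) / scale * 255), 0, 255)
    return lut.astype(np.uint8)[img]
```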
A new technology for automatic identification and sorting of plastics for recycling.
Ahmad, S R
2004-10-01
A new technology for the automatic sorting of plastics, based on optical identification of the fluorescence signatures of dyes incorporated in the materials at trace concentrations prior to product manufacturing, is described. Three commercial tracers were selected primarily on the basis of their good absorbency in the 310-370 nm spectral band and their identifiable narrow-band fluorescence signatures in the visible band when present in binary combinations. This absorption band was selected because of the availability of strong emission lines in this band from a commercial Hg-arc lamp and the high fluorescence quantum yields of the tracers at this excitation band. The plastics chosen for tracing and identification were HDPE, LDPE, PP, EVA, PVC and PET; the tracers were compatible and chemically non-reactive with the host matrices and did not affect the transparency of the plastics. The designs of the monochromatic, collimated excitation source and of the sensor system are described, and their performance in identifying and sorting plastics doped with tracers at concentrations of a few parts per million is evaluated. In an industrial sorting system, the sensor was able to sort 300 mm long plastic bottles at a conveyor belt speed of 3.5 m/s with a sorting purity of approximately 95%. The limitation was imposed by mechanical singulation irregularities at high speed and the limited processing speed of the computer used.
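The identification logic can be pictured as a lookup from the detected pair of narrow fluorescence bands to a plastic type. The band positions, threshold and codebook below are invented for illustration; the paper does not disclose its actual band assignments:

```python
def identify_plastic(spectrum, band_centers, codebook, threshold=0.5):
    """Decide which binary tracer combination is present by checking
    which narrow emission bands exceed a detection threshold, then
    look the band pair up in a plastic codebook."""
    present = frozenset(b for b in band_centers
                        if spectrum.get(b, 0.0) > threshold)
    return codebook.get(present, "unknown")
```

In practice each of the six plastics would map to a distinct pair drawn from the three tracers' emission bands.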
Lesiak, Ashton D; Cody, Robert B; Dane, A John; Musah, Rabi A
2015-09-01
Plant species identification based on the morphological features of plant parts is a well-established science in botany. However, species identification from seeds has largely been unexplored, despite the fact that the seeds contain all of the genetic information that distinguishes one plant from another. Using seeds of genus Datura plants, we show here that the mass spectrum-derived chemical fingerprints for seeds of the same species are similar. On the other hand, seeds from different species within the same genus display distinct chemical signatures, even though they may contain similar characteristic biomarkers. The intraspecies chemical signature similarities on the one hand, and interspecies fingerprint differences on the other, can be processed by multivariate statistical analysis methods to enable rapid species-level identification and differentiation. The chemical fingerprints can be acquired rapidly and in a high-throughput manner by direct analysis in real time mass spectrometry (DART-MS) analysis of the seeds in their native form, without use of a solvent extract. Importantly, knowledge of the identity of the detected molecules is not required for species level identification. However, confirmation of the presence within the seeds of various characteristic tropane and other alkaloids, including atropine, scopolamine, scopoline, tropine, tropinone, and tyramine, was accomplished by comparison of the in-source collision-induced dissociation (CID) fragmentation patterns of authentic standards, to the fragmentation patterns observed in the seeds when analyzed under similar in-source CID conditions. The advantages, applications, and implications of the chemometric processing of DART-MS derived seed chemical signatures for species level identification and differentiation are discussed.
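Chemometric processing of such fingerprints commonly begins with a projection such as principal component analysis, so that replicate spectra of one species cluster together and species separate. This minimal NumPy sketch illustrates the idea; it is not the authors' specific multivariate method:

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project mass-spectral fingerprints onto their leading
    principal components via SVD of the mean-centered data matrix.
    Replicates of one species should land close together."""
    X = np.asarray(spectra, dtype=float)
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```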
Protecting against cyber threats in networked information systems
NASA Astrophysics Data System (ADS)
Ertoz, Levent; Lazarevic, Aleksandar; Eilertson, Eric; Tan, Pang-Ning; Dokas, Paul; Kumar, Vipin; Srivastava, Jaideep
2003-07-01
This paper provides an overview of our efforts in detecting cyber attacks in networked information systems. Traditional signature based techniques for detecting cyber attacks can only detect previously known intrusions and are useless against novel attacks and emerging threats. Our current research at the University of Minnesota is focused on developing data mining techniques to automatically detect attacks against computer networks and systems. This research is being conducted as a part of MINDS (Minnesota Intrusion Detection System) project at the University of Minnesota. Experimental results on live network traffic at the University of Minnesota demonstrate that the new techniques show great promise in detecting novel intrusions. In particular, during the past few months our techniques have been successful in automatically identifying several novel intrusions that could not be detected using state-of-the-art tools such as SNORT.
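Anomaly detectors of the kind used in MINDS score connections by how isolated they are from normal traffic, rather than matching known signatures. A simplified k-nearest-neighbour distance score illustrates the principle; MINDS' actual detector is density-based and considerably more elaborate:

```python
import numpy as np

def knn_anomaly_scores(points, k=3):
    """Score each point (e.g. a feature vector describing a network
    connection) by its mean distance to its k nearest neighbours;
    connections unlike anything seen before get high scores."""
    X = np.asarray(points, dtype=float)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # ignore self-distance
    nearest = np.sort(d, axis=1)[:, :k]  # k smallest per row
    return nearest.mean(axis=1)
```

Because no attack signature is needed, such a score can flag novel intrusions that signature tools like SNORT miss.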
A comparison of machine learning techniques for survival prediction in breast cancer
2011-01-01
Background: The ability to accurately classify cancer patients into risk classes, i.e. to predict the outcome of the pathology on an individual basis, is a key ingredient in making therapeutic decisions. In recent years gene expression data have been successfully used to complement the clinical and histological criteria traditionally used in such prediction. Many "gene expression signatures" have been developed, i.e. sets of genes whose expression values in a tumor can be used to predict the outcome of the pathology. Here we investigate the use of several machine learning techniques to classify breast cancer patients using one such signature, the well-established 70-gene signature.
Results: We show that Genetic Programming performs significantly better than Support Vector Machines, Multilayered Perceptrons and Random Forests in classifying patients from the NKI breast cancer dataset, and comparably to the scoring-based method originally proposed by the authors of the 70-gene signature. Furthermore, Genetic Programming is able to perform automatic feature selection.
Conclusions: Since the performance of Genetic Programming is likely to be improvable compared to the out-of-the-box approach used here, and given the biological insight potentially provided by the Genetic Programming solutions, we conclude that Genetic Programming methods are worth further investigation as a tool for cancer patient classification based on gene expression data. PMID:21569330
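The scoring-based method mentioned above classifies a patient by correlating the 70-gene expression profile with the mean good-prognosis profile. A sketch follows; the 0.4 threshold is treated here as an illustrative default rather than a confirmed constant of the original classifier:

```python
import numpy as np

def correlation_score(patient_profile, good_template):
    """Pearson correlation of a patient's 70-gene expression vector
    with the average good-prognosis profile."""
    x = np.asarray(patient_profile, dtype=float)
    t = np.asarray(good_template, dtype=float)
    x, t = x - x.mean(), t - t.mean()
    return float(x @ t / (np.linalg.norm(x) * np.linalg.norm(t)))

def classify(profile, template, threshold=0.4):
    # threshold is an assumption for illustration
    if correlation_score(profile, template) > threshold:
        return "good-prognosis"
    return "poor-prognosis"
```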
NASA Astrophysics Data System (ADS)
Wesemann, Johannes; Burgholzer, Reinhard; Herrnegger, Mathew; Schulz, Karsten
2017-04-01
In recent years, much research effort in hydrological modelling has been invested in improving the automatic calibration of rainfall-runoff models. This includes, for example, (1) the implementation of new optimisation methods, (2) the incorporation of new and different objective criteria and signatures in the optimisation, and (3) the use of auxiliary data sets apart from runoff. Nevertheless, in many applications manual calibration is still justifiable and frequently applied. The hydrologist performing the manual calibration can, with expert knowledge, judge the hydrographs both in their details and in a holistic view. This integrated eyeball-verification procedure is difficult to formulate as objective criteria, even with a multi-criteria approach. Comparing the results of automatic and manual calibration is not straightforward: automatic calibration often relies solely on objective criteria such as the Nash-Sutcliffe efficiency coefficient or the Kling-Gupta efficiency as benchmarks during calibration, so a comparison based on such measures is intrinsically biased towards automatic calibration. Additionally, objective criteria do not cover all aspects of a hydrograph, leaving questions about the quality of a simulation open. This contribution therefore seeks to examine the quality of manually and automatically calibrated hydrographs by interactively involving expert knowledge in the evaluation. Simulations were performed for the Mur catchment in Austria with the rainfall-runoff model COSERO, using two parameter sets, one from a manual and one from an automatic calibration. A subset of the resulting hydrographs for observation and simulation, representing the typical flow conditions and events, is evaluated in this study.
In an interactive crowdsourcing approach, experts attending the session can vote for their preferred simulated hydrograph without knowing which calibration method produced it. The result of the poll can therefore be seen as an additional quality criterion for comparing the two approaches and can help in evaluating the automatic calibration method.
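The two objective criteria named above have compact standard definitions; a direct transcription:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the
    simulation is no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency: combines correlation r, variability
    ratio alpha and bias ratio beta; 1 is a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

These single numbers compress a hydrograph comparison, which is exactly why the study complements them with expert judgement.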
Nationwide forestry applications program. Ten-Ecosystem Study (TES) site 7, Weld County, Colorado
NASA Technical Reports Server (NTRS)
Weaver, J. E. (Principal Investigator); Almond, R. H.
1979-01-01
The author has identified the following significant results. The best dates for automatic data processing analysis appeared to be in midsummer. The level 2 separation of grassland, water, and other resources was reasonably successful, but the level 3 separation of grassland into cultivated (growing crops) and weeds did not appear feasible. Low simulated inventory proportions of grassland indicated that the restricted inventory signature was not representative of all grassland classes and could not be extended with acceptable accuracy.
Wetlands delineation by spectral signature analysis and legal implications
NASA Technical Reports Server (NTRS)
Anderon, R. R.; Carter, V.
1972-01-01
High altitude analysis of wetland resources and the use of such information in an operational mode to address specific problems of wetland preservation at a state level are discussed. Work efforts were directed toward: (1) developing techniques for using large scale color IR photography in state wetlands mapping program, (2) developing methods for obtaining wetlands ecology information from high altitude photography, (3) developing means by which spectral data can be more accurately analyzed visually, and (4) developing spectral data for automatic mapping of wetlands.
Automatic Vetting for Malice in Android Platforms
2016-05-01
be distinguished. For example, both Facebook and Instagram ship a class named Landroid/support/v4/app/Fragment; with different method signatures...relationships (e.g., Lcom/instagram/.../LoadImageTask; is a subclass of the abstract class Landroid/os/AsyncTask;). Figure 2(c) displays a method graph... [Table residue, "(a) Class Loader": entries such as "3 Instagram instagram_classes.dex parent_loader" and "4 JohnNESLite johnneslite_classes.dex parent_loader".]
To eliminate automatic pay adjustments for Members of Congress, and for other purposes.
Rep. Latta, Robert E. [R-OH-5]
2009-01-15
House - 03/23/2009 Motion to Discharge Committee filed by Mr. Latta. Petition No: 111-1. (All Actions) Notes: On 3/23/2009, a motion was filed to discharge the Committee on House Administration and the Committee on Oversight and Government Reform from consideration of H.R. 581. A discharge petition requires 218 signatures for further action. (Discharge Petition No. 111-1: text with...) Status: Introduced.
The Evaluation of Small Arms Effectiveness Criteria, Volume I
1975-05-01
UNCLASSIFIED. AD Number: ADB004382. New limitation change: approved for public release, distribution unlimited; formerly distribution authorized to U.S... [List-of-figures residue: "...aimed and pointed fire" (E-14); E-4, "Frequency distribution of sizes of M16 and BAR bursts of automatic fire" (E-16); E-5, "Percent of times each range bracket...defense range" (F-10); F-4, "Weapon-signature simulator" (F-15); F-5, "Target components in armored target box" (F-17); F-6, "Portable round counter for the M16 rifle".]
Yang, Huan; Zheng, Jie; Wang, Hai-Yan; Li, Nan; Yang, Ya-Ya; Shen, Yu-Ping
2017-01-01
Gelatinous Chinese medicines (GCMs), including Asini Corii Colla, Testudinis Carapacis et Plastri Colla, and Cervi Cornus Colla, are made from reptile shell, mammalian skin or deer horn, and are consumed as popular tonics as well as hemopoietic and hemostatic agents. Misused products do not exert the intended functions, and fake or adulterated products have disturbed the drug market and compromised food and drug safety. GCMs are rich in denatured proteins but poor in usable DNA fragments, hence the commonly used cytochrome c oxidase I barcoding has not been successful for their authentication. In this study, we performed a comparative proteomic analysis of these medicines and their animal sources to identify the composition of their intrinsic proteins for the first time. A reliable and convenient approach was proposed for their authentication, incorporating sodium dodecyl sulfate-polyacrylamide gel electrophoresis, two-dimensional electrophoresis, and matrix-assisted laser desorption/ionization-time of flight/time of flight mass spectrometry (MALDI-TOF/TOF-MS). A total of 26 proteins were identified from the medicinal parts of the source animals, while GCM proteins appeared dispersed in the electrophoresis analyses owing to complicated structural changes of the original proteins caused by long-term decoction and by ingredients added during manufacturing. In addition, by comparison of MALDI-TOF/TOF-MS profiles, 19 signature peptide fragments originating from GCM product proteins were selected according to defined criteria. These could assist in discriminating and identifying adulterants of GCMs and other ACMs, whether as raw medicinal material, powder, or a complex preparation. In summary, a comparative proteomic analysis of three gelatinous Chinese medicines was conducted, and their authentication was based on tryptic-digest peptide profiling using MALDI-TOF/TOF mass spectrometry.
Abbreviations used: GCMs: Gelatinous Chinese medicines, COI: Cytochrome c oxidase I, SDS-PAGE: Sodium dodecyl sulfate polyacrylamide gel electrophoresis, 2-DE: Two-dimensional electrophoresis, MALDI-TOF/TOF-MS: Matrix-assisted laser desorption/ionization-time of flight/time of flight mass spectrometry, LC: Liquid chromatography, ChP: Chinese Pharmacopoeia, HPLC: High performance liquid chromatography, LC-ESI + -MS: Liquid chromatography-electro spray ionization-mass spectrometry, IEF: isoelectric focusing, HCCA: α-Cyano-4-hydroxycinnamic acid.
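In the simplest view, matching an unknown sample's MALDI-TOF peak list against the 19 signature peptides reduces to tolerant mass matching. A hedged sketch; the tolerance value and the matching rule are assumptions, not the study's criteria:

```python
def match_peptides(observed_mz, signature_mz, tol=0.5):
    """Count how many signature peptide masses are found in an
    observed MALDI-TOF peak list within a mass tolerance (Da).
    A high count supports identification of the gelatin source."""
    return sum(any(abs(o - s) <= tol for o in observed_mz)
               for s in signature_mz)
```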
NASA Astrophysics Data System (ADS)
McManus, Catherine E.; Dowe, James; McMillan, Nancy J.
2018-07-01
Many industrial and commercial issues involve authentication: determining the manufacturer or geographic source of a material, performing quality control, verifying whether specific treatments have been properly applied, or establishing whether a material is authentic or fraudulent. Often, multiple analytical techniques and tests are used, resulting in expensive and time-consuming testing procedures. Laser-Induced Breakdown Spectroscopy (LIBS) is a rapid laser-ablation spectroscopic analytical method. Each LIBS spectrum contains information about the concentration of every element, some isotopic ratios, and the molecular structure of the material, making it a unique and comprehensive signature of the material. Quantagenetics® is a multivariate statistical method based on Bayesian statistics that uses the Euclidean distance between LIBS spectra to classify materials (US Patents 9,063,085 and 8,699,022). The fundamental idea behind Quantagenetics® is that LIBS spectra contain sufficient information to determine the origin and history of materials. This study presents two case studies that illustrate the method. LIBS spectra from 510 Colombian emeralds from 18 mines were classified by mine. Overall, 99.4% of the spectra were correctly classified; the success rate for individual mines ranges from 98.2% to 100%. Some of the mines are separated by distances as small as 200 m, indicating that the method uses slight but consistent differences in composition to identify the mine of origin accurately. The second study used bars of 17-4 stainless steel from three manufacturers. Each of the three bars was cut into 90 coupons; 30 from each bar received no further treatment, another 30 from each bar received one tempering and hardening treatment, and the final 30 coupons from each bar received a different heat treatment.
Using LIBS spectra taken from the coupons, the Quantagenetics® method classified the 270 coupons both by manufacturer (composition) and heat treatment (structure) with an overall success rate of 95.3%. Individual success rates range from 92.4% to 97.6%. These case studies were successful despite having no preconceived knowledge of the materials; artificial intelligence allows the materials to classify themselves without human intervention or bias. Multivariate analysis of LIBS spectra using the Quantagenetics® method has promise to improve quality control and authentication of a wide variety of materials in industrial enterprises.
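The abstract describes the classifier only as Bayesian and Euclidean-distance-based, and the method itself is proprietary. As a much-simplified stand-in, a nearest-centroid rule over class spectra captures the distance-based intuition:

```python
import numpy as np

def nearest_centroid(spectrum, class_spectra):
    """Assign a LIBS spectrum to the class whose mean (centroid)
    spectrum is closest in Euclidean distance. class_spectra maps
    class name -> list of training spectra."""
    s = np.asarray(spectrum, dtype=float)
    return min(class_spectra,
               key=lambda c: np.linalg.norm(
                   s - np.mean(class_spectra[c], axis=0)))
```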
Li, Ming; Jia, Bin; Ding, Liying; Hong, Feng; Ouyang, Yongzhong; Chen, Rui; Zhou, Shumin; Chen, Huanwen; Fang, Xiang
2013-09-01
Molecular images of documents were obtained by sequentially scanning the document surface with desorption atmospheric pressure chemical ionization mass spectrometry (DAPCI-MS), operated in either a gasless, solvent-free or a methanol-vapor-assisted mode. The decay of the handwriting ink was monitored by following the signal intensities recorded by DAPCI-MS. Handwriting samples made using four types of ink on four kinds of paper were tested. By studying the dynamic decay of the inks, DAPCI-MS imaging differentiated a 10-min-old sample from two 4-h-old samples. Non-destructive forensic analysis of forged signatures, either handwritten or computer-assisted, was achieved from differences in the contours of the DAPCI images, which were attributed to the writing strength personal to each writer. The order of writing/stamping on documents was distinguished and illegal printings were detected with a spatial resolution of about 140 µm. A program written in Matlab® was developed to facilitate visualization of the similarity between signature images obtained by DAPCI-MS. The experimental results show that DAPCI-MS imaging provides rich information at the molecular level and can thus be used for reliable document analysis in forensic applications. © 2013 The Authors. Journal of Mass Spectrometry published by John Wiley & Sons, Ltd.
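A similarity measure between two signature images, of the kind the Matlab program visualizes, can be sketched as a normalized cross-correlation. This is a generic measure, not necessarily the one implemented in the paper's program:

```python
import numpy as np

def image_similarity(img_a, img_b):
    """Normalized cross-correlation between two chemical images of
    the same size: 1 means identical contours, -1 inverted ones,
    values near 0 unrelated intensity patterns."""
    a = np.asarray(img_a, dtype=float).ravel()
    b = np.asarray(img_b, dtype=float).ravel()
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```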
Ottoni, Claudio; Ricaut, François-X; Vanderheyden, Nancy; Brucato, Nicolas; Waelkens, Marc; Decorte, Ronny
2011-01-01
The archaeological site of Sagalassos is located in Southwest Turkey, in the western part of the Taurus mountain range. Human occupation of its territory is attested from the late 12th millennium BP up to the 13th century AD. By analysing the mtDNA variation in 85 skeletons from Sagalassos dated to the 11th–13th century AD, this study attempts to reconstruct the genetic signature potentially left in this region of Anatolia by the many civilizations that succeeded one another over the centuries until the mid-Byzantine period (13th century AD). Authentic ancient DNA data were determined from the control region and some SNPs in the coding region of the mtDNA in 53 individuals. Comparative analyses with up to 157 modern populations allowed us to reconstruct the origin of the mid-Byzantine people still dwelling in dispersed hamlets in Sagalassos, and to detect the maternal contribution of their potential ancestors. By integrating the genetic data with historical and archaeological information, we were able to attest in Sagalassos a significant maternal genetic signature of Balkan/Greek populations, as well as of ancient Persians and populations from the Italian peninsula. Some contribution from the Levant was also detected, whereas no contribution from Central Asian populations could be ascertained. PMID:21224890
Gao, Zitong; Liu, Yang; Wang, Xiaoyue; Song, Jingyuan; Chen, Shilin; Ragupathy, Subramanyam; Han, Jianping; Newmaster, Steven G
2017-07-19
Lonicerae japonicae Flos has been used to produce hundreds of kinds of Chinese patent medicines (CPMs) in China. Economically motivated adulterants have been documented, leading to market instability and a decline in consumer confidence. ITS2 has been used to identify raw medicinal materials, but it is not suitable for the identification of botanical extracts and complex CPMs. A short barcode for the identification of processed CPMs would therefore be valuable. A 34 bp nucleotide signature (5' CTAGCGGTGGTCGTACGATAGCCAATGCATGAGT 3') was developed from unique motifs in the ITS2 region of Eucommiae Folium. Mixtures of powdered Lonicerae japonicae Flos and Lonicerae Flos resulted in double peaks at the expected SNP (single nucleotide polymorphism) positions, with the heights of the peaks roughly indicative of the species ratio in the mixed powder. Subsequently, we tested 20 extracts and 47 CPMs labelled as containing some species of Lonicera. The results revealed that only 17% of the extracts and 22% of the CPMs were authentic; the rest contained substitutes or adulterants, and 7% contained both adulterants, Eucommiae Folium and Lonicerae Flos. The methods developed in this study will broaden the application of DNA barcoding in the quality assurance of natural health products.
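The screening logic behind such a nucleotide signature can be sketched in a few lines. This is a hypothetical illustration only: the function names and sample sequences are invented, and in the real assay the SNP "double peaks" are read from sequencing chromatograms rather than from strings.

```python
# Hypothetical sketch: screening an ITS2 amplicon for the 34 bp signature
# quoted in the abstract, and locating SNP-style mismatches in an aligned read.
SIGNATURE = "CTAGCGGTGGTCGTACGATAGCCAATGCATGAGT"  # 34 bp, from the abstract

def contains_signature(its2_sequence, signature=SIGNATURE):
    """Exact-match screen: does the amplicon contain the signature?"""
    return signature in its2_sequence.upper()

def snp_positions(read, signature=SIGNATURE):
    """Positions where an aligned 34 bp read differs from the signature
    (in a mixed powder these positions show up as double peaks)."""
    assert len(read) == len(signature)
    return [i for i, (a, b) in enumerate(zip(read.upper(), signature)) if a != b]

authentic = "AAAT" + SIGNATURE + "GGCC"          # signature embedded in a longer amplicon
variant = SIGNATURE[:10] + "A" + SIGNATURE[11:]  # one substituted base
print(contains_signature(authentic))             # True
print(snp_positions(variant))                    # [10]
```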
Spatial-spectral preprocessing for endmember extraction on GPUs
NASA Astrophysics Data System (ADS)
Jimenez, Luis I.; Plaza, Javier; Plaza, Antonio; Li, Jun
2016-10-01
Spectral unmixing focuses on the identification of spectrally pure signatures, called endmembers, and their corresponding abundances in each pixel of a hyperspectral image. While endmember extraction techniques have mainly exploited the spectral information contained in hyperspectral images, they have recently incorporated spatial information to achieve more accurate results. Several algorithms have been developed for automatic or semi-automatic identification of endmembers using spatial and spectral information, including spectral-spatial endmember extraction (SSEE) where, within a preprocessing step of the technique, both sources of information are extracted from the hyperspectral image and used equally for this purpose. Previous works have implemented the SSEE technique in four main steps: 1) local eigenvector calculation in each sub-region into which the original hyperspectral image is divided; 2) computation of the maxima and minima of the projections of all eigenvectors over the entire hyperspectral image in order to obtain a set of candidate pixels; 3) expansion and averaging of the signatures of the candidate set; 4) ranking based on the spectral angle distance (SAD). The result of this method is a list of candidate signatures from which the endmembers can be extracted using various spectral-based techniques, such as orthogonal subspace projection (OSP), vertex component analysis (VCA) or N-FINDR. Considering the large volume of data and the complexity of the calculations, there is a need for efficient implementations. Latest-generation hardware accelerators such as commodity graphics processing units (GPUs) offer a good opportunity for improving computational performance in this context. In this paper, we develop two different GPU implementations of the SSEE algorithm. Both are based on the eigenvector computation within each sub-region of the first step, one using singular value decomposition (SVD) and the other using principal component analysis (PCA).
Based on our experiments with hyperspectral data sets, high computational performance is observed in both cases.
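The SAD ranking of step 4 above is simple to express in code. A minimal sketch with invented toy spectra (plain Python lists; the paper's GPU implementations parallelize this and the preceding steps over the full image):

```python
# Spectral angle distance (SAD) and ranking of candidate signatures,
# as in step 4 of the SSEE preprocessing described above.
import math

def sad(x, y):
    """Spectral angle distance between two spectra, in radians."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return math.acos(max(-1.0, min(1.0, dot / (nx * ny))))

def rank_by_sad(reference, candidates):
    """Order candidate signatures by their spectral angle to a reference."""
    return sorted(candidates, key=lambda c: sad(reference, c))

ref = [0.2, 0.5, 0.9, 0.4]                 # invented reference spectrum
candidates = [[0.21, 0.49, 0.88, 0.41],    # nearly identical direction
              [0.9, 0.1, 0.1, 0.8],        # very different direction
              [0.4, 1.0, 1.8, 0.8]]        # exact scaled copy of ref
ranked = rank_by_sad(ref, candidates)
print(ranked[0])                           # the scaled copy ranks first (zero angle)
```

A candidate pointing in exactly the same spectral direction as the reference, even if scaled in brightness, has an angle of zero, which is why SAD rather than Euclidean distance is the usual choice for comparing spectral signatures.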
In-Flight Technique for Acquiring Mid- And Far-Field Sonic Boom Signatures
NASA Technical Reports Server (NTRS)
Stansbery, Eugene G.; Baize, Daniel G.; Maglieri, Domenic J.
1999-01-01
Flight test experiments have been conducted to establish the feasibility of obtaining sonic boom signature measurements below a supersonic aircraft using the NASA Portable Automatic Triggering System (PATS) mounted in the USMC Pioneer Unmanned Aerial Vehicle (UAV). This study forms a part of the NASA sonic boom minimization activities, specifically the demonstration of the persistence of modified boom signatures to very large distances in a real atmosphere. The basic objective of the measurement effort was to obtain a qualitative view of the sonic boom signature in terms of its shape, the number of shocks, their locations, and their relative strength. Results suggest that the technique may well provide quantitative information on mid-field and far-field boom signatures. The purpose of this presentation is to describe the arrangement and operation of this in-flight system and to present the resulting sonic boom measurements. Adaptation and modification of two PATS to the UAV payload section are described, including transducer location, mounting arrangement and recording system isolation. Ground static runup, takeoff and landing, and cruise flight checkouts regarding the effects of UAV propeller and flow noise on the PATS automated triggering system and recording mode are discussed. For the proof-of-concept tests, the PATS-instrumented UAV was flown under radar control in steady level flight at an altitude of 8,700 feet MSL and at a cruise speed of about 60 knots. The USN F-4N sonic boom generating aircraft was vectored over the UAV on reciprocal headings at altitudes of about 11,000 feet MSL and 13,000 feet MSL at about Mach 1.15. Sonic boom signatures were acquired on both PATS for all six supersonic passes. Although the UAV propeller noise is clearly evident in all the measurements, the F-4 boom signature is clearly distinguishable and is typically N-wave in character, with sharply rising shock fronts and with a mid-shock associated with the inlet-wing juncture.
Consideration is being given to adapting the PATS/UAV measurement technique to the NASA Learjet to determine the feasibility of acquiring in-flight boom signatures in the altitude range of 10,000 feet to 40,000 feet.
NASA Astrophysics Data System (ADS)
Lahamy, H.; Lichti, D.
2012-07-01
The automatic interpretation of human gestures can be used for natural interaction with computers without the use of mechanical devices such as keyboards and mice. The recognition of hand postures has been studied for many years. However, most of the literature in this area has considered 2D images, which cannot provide a full description of hand gestures. In addition, rotation-invariant identification remains an unsolved problem even with the use of 2D images. The objective of the current study is to design a rotation-invariant recognition process using a 3D signature for classifying hand postures. A heuristic, voxel-based signature has been designed and implemented. The hand motion is tracked with a Kalman filter. A single training image per posture is used in the supervised classification. The designed recognition process and the tracking procedure have been successfully evaluated. This study demonstrates the efficiency of the proposed rotation-invariant 3D hand posture signature, which yields a 98.24% recognition rate over 12,723 test samples of 12 gestures taken from the alphabet of American Sign Language.
Autonomous target recognition using remotely sensed surface vibration measurements
NASA Astrophysics Data System (ADS)
Geurts, James; Ruck, Dennis W.; Rogers, Steven K.; Oxley, Mark E.; Barr, Dallas N.
1993-09-01
The remotely measured surface vibration signatures of tactical military ground vehicles are investigated for use in target classification and identification friend or foe (IFF) systems. The use of remote surface vibration sensing by a laser radar reduces the effects of partial occlusion, concealment, and camouflage experienced by automatic target recognition systems using traditional imagery in a tactical battlefield environment. Linear Predictive Coding (LPC) efficiently represents the vibration signatures and nearest neighbor classifiers exploit the LPC feature set using a variety of distortion metrics. Nearest neighbor classifiers achieve an 88 percent classification rate in an eight class problem, representing a classification performance increase of thirty percent from previous efforts. A novel confidence figure of merit is implemented to attain a 100 percent classification rate with less than 60 percent rejection. The high classification rates are achieved on a target set which would pose significant problems to traditional image-based recognition systems. The targets are presented to the sensor in a variety of aspects and engine speeds at a range of 1 kilometer. The classification rates achieved demonstrate the benefits of using remote vibration measurement in a ground IFF system. The signature modeling and classification system can also be used to identify rotary and fixed-wing targets.
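The pipeline described above, LPC features scored by a nearest-neighbour rule, can be sketched generically. The autocorrelation/Levinson-Durbin recursion and the Euclidean metric below are standard textbook choices rather than the paper's specific distortion metrics, and the signals are synthetic:

```python
import math

def lpc(signal, order):
    """LPC coefficients via autocorrelation and the Levinson-Durbin recursion."""
    n = len(signal)
    r = [sum(signal[i] * signal[i + k] for i in range(n - k)) for k in range(order + 1)]
    a, err = [], r[0]
    for i in range(order):
        acc = r[i + 1] - sum(a[j] * r[i - j] for j in range(i))
        k = acc / err                                   # reflection coefficient
        a = [a[j] - k * a[i - 1 - j] for j in range(i)] + [k]
        err *= 1 - k * k                                # residual prediction error
    return a

def nearest(features, labelled):
    """1-NN classification under the Euclidean metric."""
    return min(labelled, key=lambda item: math.dist(features, item[0]))[1]

# Synthetic "vibration signatures": a decaying transient vs. a pure tone.
templates = [(lpc([0.5 ** i for i in range(50)], 2), "idling"),
             (lpc([math.sin(0.3 * i) for i in range(50)], 2), "revving")]
query = lpc([0.5 ** i for i in range(50)], 2)
print(nearest(query, templates))  # "idling": the query matches that template exactly
```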
Ju, Seung-hwan; Seo, Hee-suk; Han, Sung-hyu; Ryou, Jae-cheol; Kwak, Jin
2013-01-01
The prevalence of computers and the development of the Internet have made it easy to access information. As people become concerned about information security, interest in user authentication methods is growing. The most common computer authentication method is the use of alphanumeric usernames and passwords. Password authentication is convenient but vulnerable: anyone who knows the password is authenticated as the user. Fingerprint authentication is strong because it relies on information specific to the user, but it has the disadvantage that the user cannot change the authentication key. In this study, we propose an authentication methodology that combines a numeric password with biometric fingerprint authentication. The information in the user's fingerprint secures the authentication key, while the numeric password allows the key to be changed easily; the authentication keys were thus designed to provide both security and flexibility. PMID:24151601
Joint sparse representation for robust multimodal biometrics recognition.
Shekhar, Sumit; Patel, Vishal M; Nasrabadi, Nasser M; Chellappa, Rama
2014-01-01
Traditional biometric recognition systems rely on a single biometric signature for authentication. While the advantage of using multiple sources of information for establishing identity has been widely recognized, computational models for multimodal biometric recognition have only recently received attention. We propose a multimodal sparse representation method, which represents the test data by a sparse linear combination of training data, while constraining the observations from different modalities of the test subject to share their sparse representations. Thus, we simultaneously take into account correlations as well as coupling information among biometric modalities. A multimodal quality measure is also proposed to weigh each modality as it gets fused. Furthermore, we kernelize the algorithm to handle nonlinearity in the data. The optimization problem is solved using an efficient alternating direction method. Various experiments show that the proposed method compares favorably with competing fusion-based methods.
NASA Astrophysics Data System (ADS)
Peng, Xiang; Zhang, Peng; Cai, Lilong
In this paper, we present a virtual-optics based information security system model with the aid of public-key-infrastructure (PKI) techniques. The proposed model employs a hybrid architecture in which our previously published encryption algorithm based on the virtual-optics imaging methodology (VOIM) is used to encipher and decipher data, while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). For an asymmetric system, given an encryption key, it is computationally infeasible to determine the decryption key, and vice versa. The whole information security model runs under the framework of PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOIM security approach offers additional features such as confidentiality, authentication, and integrity for data encryption in a network environment.
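The hybrid architecture described here, a symmetric algorithm for the data and an asymmetric one for the session key, follows a standard pattern that can be sketched with toy stand-ins. The VOIM step is optical and is not reproduced; an insecure XOR keystream stands in for it, and the RSA below uses deliberately tiny textbook primes. Nothing in this sketch is usable cryptography; it only illustrates the key-wrapping structure.

```python
# Toy sketch of hybrid encryption: symmetric cipher for the payload,
# asymmetric (textbook RSA) wrap for the session key. INSECURE BY DESIGN.
import hashlib

# -- symmetric layer: XOR with a SHA-256-derived keystream (VOIM stand-in) --
def keystream(key: bytes, n: int) -> bytes:
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def sym_encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

sym_decrypt = sym_encrypt          # XOR is its own inverse

# -- asymmetric layer: byte-wise textbook RSA with toy primes ---------------
p, q = 61, 53
n, e = p * q, 17                   # public modulus and exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)

def rsa_wrap(session_key):
    return [pow(b, e, n) for b in session_key]

def rsa_unwrap(wrapped):
    return bytes(pow(c, d, n) for c in wrapped)

session_key = b"k3"                                    # key for the symmetric layer
ciphertext = sym_encrypt(session_key, b"virtual-optics payload")
wrapped_key = rsa_wrap(session_key)                    # only the private key recovers it

recovered = sym_decrypt(rsa_unwrap(wrapped_key), ciphertext)
print(recovered)                   # b'virtual-optics payload'
```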
Mobile-PKI Service Model for Ubiquitous Environment
NASA Astrophysics Data System (ADS)
Jeun, Inkyung; Chun, Kilsoo
One of the most important issues in PKI (Public Key Infrastructure) is private key management. The private key must be handled safely for secure PKI service. Although PKI service is commonly used for the identification and authentication of users in e-commerce, it has many inconvenient aspects. In particular, because the storage media for the private key are limited to a PC hard disk drive or a smart card that users must always carry, current PKI services are inconvenient for the user and ill-suited to ubiquitous networks. This paper suggests a digital signature service using a mobile phone (m-PKI service) which is suitable for future networks. The mobile phone is the most widely used means of personal communication and is highly portable. With m-PKI, we can use the PKI service anytime and anywhere.
Screen-printed nanoparticles as anti-counterfeiting tags
NASA Astrophysics Data System (ADS)
Campos-Cuerva, Carlos; Zieba, Maciej; Sebastian, Victor; Martínez, Gema; Sese, Javier; Irusta, Silvia; Contamina, Vicente; Arruebo, Manuel; Santamaria, Jesus
2016-03-01
Metallic nanoparticles with different physical properties have been screen printed as authentication tags on different types of paper. Gold and silver nanoparticles show unique optical signatures, including sharp emission bandwidths and long lifetimes of the printed label, even under accelerated weathering conditions. Magnetic nanoparticles show distinct physical signals that depend on the size of the nanoparticle itself. They were also screen printed on different substrates and their magnetic signals read out using a magnetic pattern recognition sensor and a vibrating sample magnetometer. The novelty of our work lies in the demonstration that the combination of nanomaterials with optical and magnetic properties on the same printed support is possible, and the resulting combined signals can be used to obtain a user-configurable label, providing a high degree of security in anti-counterfeiting applications using simple commercially-available sensors.
Light Weight MP3 Watermarking Method for Mobile Terminals
NASA Astrophysics Data System (ADS)
Takagi, Koichi; Sakazawa, Shigeyuki; Takishima, Yasuhiro
This paper proposes a novel MP3 watermarking method which is applicable to a mobile terminal with limited computational resources. Considering that in most cases the embedded information is copyright information or metadata, which should be extracted before playing back audio contents, the watermark detection process should be executed at high speed. However, when conventional methods are used with a mobile terminal, it takes a considerable amount of time to detect a digital watermark. This paper focuses on scalefactor manipulation to enable high speed watermark embedding/detection for MP3 audio and also proposes the manipulation method which minimizes audio quality degradation adaptively. Evaluation tests showed that the proposed method is capable of embedding 3 bits/frame information without degrading audio quality and detecting it at very high speed. Finally, this paper describes application examples for authentication with a digital signature.
Haddad, Renato; Catharino, Rodrigo Ramos; Marques, Lygia Azevedo; Eberlin, Marcos Nogueira
2008-11-01
Perfume counterfeiting is an illegal worldwide practice that involves huge economic losses and potential consumer risk. EASI (easy ambient sonic-spray ionization) is a simple, easily performed and rapidly implemented desorption/ionization technique for ambient mass spectrometry (MS). Herein we demonstrate that EASI-MS allows nearly instantaneous perfume typification and counterfeit detection. Samples are simply sprayed onto a glass rod or paper surface and, after a few seconds of ambient drying, a profile of the most polar components of the perfume is acquired. These components provide unique and reproducible chemical signatures for authentic perfume samples. Counterfeiting is readily recognized, since the exact set and relative proportions of the more polar chemicals, sometimes present at low concentrations, are unknown or hard to reproduce by counterfeiters, and hence very distinct and variable EASI-MS profiles are observed for the counterfeit samples.
Quantum Dialogue with Authentication Based on Bell States
NASA Astrophysics Data System (ADS)
Shen, Dongsu; Ma, Wenping; Yin, Xunru; Li, Xiaoping
2013-06-01
We propose an authenticated quantum dialogue protocol based on a shared private entangled quantum channel. In this protocol, EPR pairs are randomly prepared in one of the four Bell states for communication. By performing one of the four Pauli operations on the shared EPR pairs to encode their shared authentication key and secret message, two legitimate users can achieve mutual identity authentication and quantum dialogue without help from a third-party authenticator. Furthermore, because the EPR pairs used for secure communication also implement the authentication, and the whole authentication process is embedded in the direct secure communication process, the protocol requires no additional particles for authentication. The updated authentication key provides the counterparts with a new authentication key for the next round of authentication and direct communication. Compared with other secure communication protocols with authentication, this one is more secure and efficient owing to the combination of authentication and direct communication. Security analysis shows that it is secure against the eavesdropping attack, the impersonation attack and the man-in-the-middle (MITM) attack.
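The encoding step, in which one of four Pauli operations applied to a shared EPR pair selects one of the four Bell states and hence carries two classical bits, can be simulated directly (this is the mechanism behind superdense coding). The sketch below uses unnormalised integer amplitude vectors in the |00>,|01>,|10>,|11> basis and illustrates only the state algebra, not the full protocol or its security properties:

```python
# Applying I, X, Z or XZ to one qubit of |Phi+> yields the four Bell states,
# so each operation encodes a distinct pair of classical bits.
PHI_PLUS = [1, 0, 0, 1]           # shared EPR pair |Phi+> (unnormalised)

def pauli_on_first(op, state):
    """Apply I, X, Z or XZ (Z then X) to the first qubit of a two-qubit state."""
    a00, a01, a10, a11 = state
    if op == "I":
        return [a00, a01, a10, a11]
    if op == "X":                 # bit flip on the first qubit
        return [a10, a11, a00, a01]
    if op == "Z":                 # phase flip on |1?> components
        return [a00, a01, -a10, -a11]
    if op == "XZ":                # Z followed by X
        return [-a10, -a11, a00, a01]
    raise ValueError(op)

BELL = {"Phi+": [1, 0, 0, 1], "Phi-": [1, 0, 0, -1],
        "Psi+": [0, 1, 1, 0], "Psi-": [0, 1, -1, 0]}

def identify(state):
    """Name the Bell state equal to `state`, up to a global sign."""
    for name, b in BELL.items():
        if state == b or state == [-x for x in b]:
            return name
    raise ValueError("not a Bell state")

for op in ("I", "X", "Z", "XZ"):
    print(op, "->", identify(pauli_on_first(op, PHI_PLUS)))
```

Since the four outcomes are distinct Bell states, a Bell-basis measurement on the pair recovers which of the four operations, i.e. which two bits, the sender applied.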
Toward Automatic Verification of Goal-Oriented Flow Simulations
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.
2014-01-01
We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
Automatic Organ Segmentation for CT Scans Based on Super-Pixel and Convolutional Neural Networks.
Liu, Xiaoming; Guo, Shuxu; Yang, Bingtao; Ma, Shuzhi; Zhang, Huimao; Li, Jing; Sun, Changjian; Jin, Lanyi; Li, Xueyan; Yang, Qi; Fu, Yu
2018-04-20
Accurate segmentation of specific organs from computed tomography (CT) scans is a basic and crucial task for accurate diagnosis and treatment. To avoid time-consuming manual optimization and to help physicians distinguish diseases, an automatic organ segmentation framework is presented. The framework utilizes convolutional neural networks (CNN) to classify pixels. To reduce redundant inputs, simple linear iterative clustering (SLIC) of super-pixels and a support vector machine (SVM) classifier are introduced. To establish a precise organ boundary at the single-pixel level, the pixels are classified step by step. First, SLIC is used to cut an image into grids and extract their respective digital signatures. Next, the signatures are classified by the SVM, and rough edges are acquired. Finally, a precise boundary is obtained by the CNN, which operates on patches around each pixel point. The framework is applied to abdominal CT scans of livers and high-resolution computed tomography (HRCT) scans of lungs. The experimental CT scans are derived from two public datasets (Sliver 07 and a Chinese local dataset). Experimental results show that the proposed method can precisely and efficiently detect the organs. The method consumes 38 s/slice for liver segmentation. The Dice coefficient of the liver segmentation results reaches 97.43%; for lung segmentation, the Dice coefficient is 97.93%. These findings demonstrate that the proposed framework is a favorable method for lung segmentation of HRCT scans.
Online fingerprint verification.
Upendra, K; Singh, S; Kumar, V; Verma, H K
2007-01-01
As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented, and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.
Visual cognition during real social interaction
Skarratt, Paul A.; Cole, Geoff G.; Kuhn, Gustav
2012-01-01
Laboratory studies of social visual cognition often simulate the critical aspects of joint attention by having participants interact with a computer-generated avatar. Recently, there has been a movement toward examining these processes during authentic social interaction. In this review, we will focus on attention to faces, attentional misdirection, and a phenomenon we have termed social inhibition of return (Social IOR), that have revealed aspects of social cognition that were hitherto unknown. We attribute these discoveries to the use of paradigms that allow for more realistic social interactions to take place. We also point to an area that has begun to attract a considerable amount of interest—that of Theory of Mind (ToM) and automatic perspective taking—and suggest that this too might benefit from adopting a similar approach. PMID:22754521
Assuring quality in narrative analysis.
Bailey, P H
1996-04-01
Many nurse-researchers using qualitative strategies have been concerned with assuring quality in their work. The early literature reveals that the concepts of validity and reliability, as understood from the positivist perspective, are somehow inappropriate and inadequate when applied to interpretive research. More recent literature suggests that because the positivist and interpretive paradigms are epistemologically divergent, the transfer of quality criteria from one perspective to the other is not automatic, or even reasonable. The purpose of this article, therefore, is to clarify what the terms quality, trustworthiness, credibility, authenticity, and goodness mean for qualitative research findings. The process of assuring quality (validation) in qualitative research will be discussed within the context of an interpretive method, narrative analysis. A brief review of quality in narrative-analysis nursing research will also be presented.
Singh, Anushikha; Dutta, Malay Kishore
2017-12-01
The authentication and integrity verification of medical images is a critical and growing issue for patients in e-health services. Accurate identification of medical images and patient verification is an essential requirement to prevent errors in medical diagnosis. The proposed work presents an imperceptible watermarking system to address the security of medical fundus images for tele-ophthalmology applications and computer-aided automated diagnosis of retinal diseases. In the proposed work, the patient identity is embedded in the fundus image in the singular value decomposition domain with an adaptive quantization parameter, to maintain perceptual transparency for a variety of fundus images, whether healthy or disease-affected. In the proposed method, insertion of the watermark does not affect the automatic image-processing-based diagnosis of retinal objects and pathologies, which ensures uncompromised computer-based diagnosis from the fundus image. The patient ID is correctly recovered from the watermarked fundus image for integrity verification at the diagnosis centre. The proposed watermarking system was tested on a comprehensive database of fundus images, and the results are convincing: they indicate that the proposed watermarking method is imperceptible and does not affect computer-vision-based automated diagnosis of retinal diseases. Correct recovery of the patient ID from the watermarked fundus image makes the proposed watermarking system applicable to the authentication of fundus images for computer-aided diagnosis and tele-ophthalmology applications. Copyright © 2017 Elsevier B.V. All rights reserved.
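The embedding idea, hiding identity bits in transform coefficients under a quantization parameter, can be sketched with a generic quantization index modulation (QIM) step. The scalar below merely stands in for a singular value; the paper's SVD computation and its adaptive, image-dependent choice of quantization step are not reproduced here.

```python
# Generic QIM sketch: even multiples of the quantization step carry a 0 bit,
# odd multiples carry a 1 bit. The coefficient stands in for a singular value.

def embed_bit(coeff, bit, step):
    """Move `coeff` to the nearest multiple of `step` whose parity encodes `bit`."""
    q = round(coeff / step)
    if q % 2 != bit:
        q += 1 if coeff / step >= q else -1   # step to the adjacent multiple
    return q * step

def extract_bit(coeff, step):
    """Recover the embedded bit from the parity of the quantized coefficient."""
    return round(coeff / step) % 2

watermarked = embed_bit(137.2, 1, step=4.0)
print(watermarked, extract_bit(watermarked, 4.0))   # 140.0 1
```

With a step of 4.0 the coefficient moves by at most one quantization step (here 137.2 becomes 140.0); keeping this perturbation small, via an adaptively chosen step, is what preserves perceptual transparency.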
Radar signatures of road vehicles: airborne SAR experiments
NASA Astrophysics Data System (ADS)
Palubinskas, G.; Runge, H.; Reinartz, P.
2005-10-01
The German radar satellite TerraSAR-X is a high-resolution, dual-receive-antenna SAR satellite, which will be launched in spring 2006. Since it will have the capability to measure the velocity of moving targets, the acquired interferometric data can be useful for traffic monitoring applications on a global scale. DLR has already started the development of an automatic and operational processing system which will detect cars, measure their speed and assign them to a road. Statistical approaches are used to derive the vehicle detection algorithm, which requires knowledge of the radar signatures of vehicles, especially in view of the geometry of the radar look direction and the vehicle orientation. Simulation of radar signatures is a very difficult task due to the lack of realistic vehicle models. In this paper the radar signatures of parked cars are presented. They are estimated experimentally from airborne E-SAR X-band data collected during flight campaigns in 2003-2005. Several test cars of the same type placed at carefully selected orientation angles, together with several over-flights at different heading angles, made it possible to cover the whole range of aspect angles from 0° to 180°. The large synthetic aperture length, or beam width angle of 7°, can be divided into several looks; processing each look separately increases the angular resolution. Such a radar signature profile of one type of vehicle over the whole range of aspect angles in fine resolution can be used further for the verification of simulation studies and for performance prediction for traffic monitoring with TerraSAR-X.
ERIC Educational Resources Information Center
Wood, Alex M.; Linley, P. Alex; Maltby, John; Baliousis, Michael; Joseph, Stephen
2008-01-01
This article describes the development of a measure of dispositional authenticity and tests whether authenticity is related to well-being, as predicted by several counseling psychology perspectives. Scales were designed to measure a tripartite conception of authenticity, comprising self-alienation, authentic living, and accepting external…
Measuring Teacher Authenticity: Criteria Students Use in Their Perception of Teacher Authenticity
ERIC Educational Resources Information Center
De Bruyckere, Pedro; Kirschner, Paul A.
2017-01-01
Authenticity is an often-heard term with respect to education. Tasks should be authentic, the learning environment should be authentic and, above all, the teacher should be authentic. Previous qualitative research has shown that there are four primary criteria that students in formal educational settings use when forming their perceptions of…
van den Bosch, Ralph; Taris, Toon W
2014-01-01
Previous research on authenticity has mainly focused on trait conceptualizations of authenticity (e.g., Wood et al., 2008), whereas in specific environments (e.g., at work) state conceptualizations of authenticity (cf. Van den Bosch & Taris, 2013) are at least as relevant. For example, working conditions are subject to change, and this could well have consequences for employees' perceived level of authenticity at work. The current study employs a work-specific, state-like conceptualization of authenticity to investigate the relations between authenticity at work, well-being, and work outcomes. A series of ten separate hierarchical regression analyses using data from 685 participants indicated that, after controlling for selected work characteristics and demographic variables, authenticity at work accounted for, on average, 11% of the variance in various well-being and work outcomes. Of the three subscales of authenticity at work (i.e., authentic living, self-alienation, and accepting external influence), self-alienation was the strongest predictor of outcomes, followed by authentic living and accepting external influence, respectively. These findings are discussed in the light of their practical and theoretical implications.
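The hierarchical-regression logic reported above (the incremental variance explained by authenticity after control variables are entered) can be illustrated with a minimal sketch on synthetic data. Only the sample size matches the study; the variables, coefficients, and the resulting increment in R² below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 685  # sample size matching the study; the data here are synthetic

controls = rng.normal(size=(n, 3))       # e.g. demographics, work characteristics
authenticity = rng.normal(size=n)        # state authenticity at work (invented)
wellbeing = (controls @ np.array([0.2, 0.1, -0.1])
             + 0.5 * authenticity
             + rng.normal(size=n))       # outcome with planted effect

def r2(X, y):
    """R-squared of an OLS fit of y on X plus an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Step 1: controls only; Step 2: controls + authenticity.
r2_step1 = r2(controls, wellbeing)
r2_step2 = r2(np.column_stack([controls, authenticity]), wellbeing)
delta_r2 = r2_step2 - r2_step1  # incremental variance explained by authenticity
print(round(delta_r2, 3))
```

The quantity `delta_r2` is the analogue of the "on average 11% of the variance" figure reported in the abstract, computed here for one synthetic outcome.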
GO-PCA: An Unsupervised Method to Explore Gene Expression Data Using Prior Knowledge
Wagner, Florian
2015-01-01
Genome-wide expression profiling is a widely used approach for characterizing heterogeneous populations of cells, tissues, biopsies, or other biological specimens. The exploratory analysis of such data typically relies on generic unsupervised methods, e.g. principal component analysis (PCA) or hierarchical clustering. However, generic methods fail to exploit prior knowledge about the molecular functions of genes. Here, I introduce GO-PCA, an unsupervised method that combines PCA with nonparametric GO enrichment analysis, in order to systematically search for sets of genes that are both strongly correlated and closely functionally related. These gene sets are then used to automatically generate expression signatures with functional labels, which collectively aim to provide a readily interpretable representation of biologically relevant similarities and differences. The robustness of the results obtained can be assessed by bootstrapping. I first applied GO-PCA to datasets containing diverse hematopoietic cell types from human and mouse, respectively. In both cases, GO-PCA generated a small number of signatures that represented the majority of lineages present, and whose labels reflected their respective biological characteristics. I then applied GO-PCA to human glioblastoma (GBM) data, and recovered signatures associated with four out of five previously defined GBM subtypes. My results demonstrate that GO-PCA is a powerful and versatile exploratory method that reduces an expression matrix containing thousands of genes to a much smaller set of interpretable signatures. In this way, GO-PCA aims to facilitate hypothesis generation, design of further analyses, and functional comparisons across datasets. PMID:26575370
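The core GO-PCA loop described above (rank genes by their PCA loadings, then test functional gene sets for enrichment among the top-ranked genes) can be sketched as follows. This is a toy illustration on synthetic data with a planted gene module standing in for a GO term; the published method uses a nonparametric enrichment test and considerable further machinery, whereas here a plain hypergeometric test is substituted:

```python
import numpy as np
from scipy.stats import hypergeom

rng = np.random.default_rng(0)

# Toy expression matrix: 100 genes x 20 samples. Genes 0-19 co-vary
# with a two-group sample structure, mimicking a functional module.
X = rng.normal(size=(100, 20))
group = np.array([0] * 10 + [1] * 10)
X[:20, :] += 3.0 * group  # module genes track the sample grouping

# PCA via SVD on the row-centered matrix (genes are the variables).
Xc = X - X.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
loadings = U[:, 0]  # gene loadings on the first principal component

# Rank genes by |loading| and test a hypothetical "GO term"
# (here: the planted module, genes 0-19) for enrichment among
# the top-k genes, using a hypergeometric test.
k = 20
top = set(np.argsort(-np.abs(loadings))[:k])
go_term = set(range(20))
overlap = len(top & go_term)
# P(X >= overlap) when drawing k genes from 100 containing 20 "successes".
pval = hypergeom.sf(overlap - 1, 100, len(go_term), k)
print(overlap, pval)
```

An enriched, highly loaded gene set like this one would then be turned into a labeled expression signature in the full method.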
NASA Astrophysics Data System (ADS)
Dricker, I. G.; Friberg, P.; Hellman, S.
2001-12-01
Under contract with the CTBTO, Instrumental Software Technologies Inc. (ISTI) has designed and developed the Standard Station Interface (SSI), a set of executable programs and application programming interface libraries for the acquisition, authentication, archiving and telemetry of seismic and infrasound data for stations of the CTBTO nuclear monitoring network. SSI (written in C) is fully supported under both the Solaris and Linux operating systems and will be shipped with fully documented source code. SSI consists of several interconnected modules. The Digitizer Interface Module maintains a near-real-time data flow between multiple digitizers and the SSI. The Disk Buffer Module is responsible for local data archival. The Station Key Management Module is a low-level tool for data authentication and verification of incoming signatures. The Data Transmission Module supports packetized near-real-time data transmission from the primary CTBTO stations to the designated Data Center. The AutoDRM module allows transport of signed seismic and infrasound data via electronic mail (auxiliary station mode). The Command Interface Module is used to pass remote commands to the digitizers and other modules of SSI. A station operator has access to state-of-health information and waveforms via the Operator Interface Module. The modular design of SSI will allow painless extension of the software system within and outside the boundaries of the CTBTO station requirements. An alpha version of SSI is currently undergoing extensive tests in the lab and on-site.
Access control and confidentiality in radiology
NASA Astrophysics Data System (ADS)
Noumeir, Rita; Chafik, Adil
2005-04-01
A medical record contains a large amount of data about the patient such as height, weight and blood pressure. It also contains sensitive information such as fertility, abortion, psychiatric data, sexually transmitted diseases and diagnostic results. Access to this information must be carefully controlled. Information technology has greatly improved patient care. The recent extensive deployment of digital medical images made diagnostic images promptly available to healthcare decision makers, regardless of their geographic location. Medical images are digitally archived, transferred on telecommunication networks, and visualized on computer screens. However, with the widespread use of computing and communication technologies in healthcare, the issue of data security has become increasingly important. Most of the work until now has focused on the security of data communication to ensure its integrity, authentication, confidentiality and user accountability. The mechanisms that have been proposed to achieve the security of data communication are not specific to healthcare. Data integrity can be achieved with data signature. Data authentication can be achieved with certificate exchange. Data confidentiality can be achieved with encryption. User accountability can be achieved with audits. Although these mechanisms are essential to ensure data security during its transfer on the network, access control is needed in order to ensure data confidentiality and privacy within the information system application. In this paper, we present and discuss an access control mechanism that takes into account the notion of a care process. Radiology information is categorized and a model to enforce data privacy is proposed.
Characterization of Ancient DNA Supports Long-Term Survival of Haloarchaea
Lowenstein, Tim K.; Timofeeff, Michael N.; Schubert, Brian A.; Lum, J. Koji
2014-01-01
Bacteria and archaea isolated from crystals of halite 10⁴ to 10⁸ years old suggest long-term survival of halophilic microorganisms, but the results are controversial. Independent verification of the authenticity of reputed living prokaryotes in ancient salt is required because of the high potential for environmental and laboratory contamination. Low success rates of prokaryote cultivation from ancient halite, however, hamper direct replication experiments. In such cases, culture-independent approaches that use the polymerase chain reaction (PCR) and sequencing of 16S ribosomal DNA are a robust alternative. Here, we use amplification, cloning, and sequencing of 16S ribosomal DNA to investigate the authenticity of halophilic archaea cultured from subsurface halite, Death Valley, California, 22,000 to 34,000 years old. We recovered 16S ribosomal DNA sequences that are identical, or nearly so (>99%), to two strains, Natronomonas DV462A and Halorubrum DV427, which were previously isolated from the same halite interval. These results provide the best independent support to date for the long-term survival of halophilic archaea in ancient halite. PCR-based approaches are sensitive to small amounts of DNA and could allow investigation of even older halites, 10⁶ to 10⁸ years old, from which microbial cultures have been reported. Such studies of microbial life in ancient salt are particularly important as we search for microbial signatures in similar deposits on Mars and elsewhere in the Solar System. Key Words: Ancient DNA—Halite—Haloarchaea—Long-term survival. Astrobiology 14, 553–560. PMID:24977469
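The >99% sequence-identity criterion used above is simply the fraction of matching positions in an alignment of equal-length sequences. A minimal sketch (the 50-base fragments below are invented for illustration, not real 16S sequences):

```python
def percent_identity(a, b):
    """Percent identity between two aligned sequences of equal length."""
    assert len(a) == len(b)
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

# Hypothetical aligned fragments: one mismatch in 50 bases.
ref   = "ACGTACGTTGACCTGAGGAAGTCCAGTACGGCTTAACGGTCAGGCTAGCA"
clone = "ACGTACGTTGACCTGAGGAAGTCCAGTACGGCTTAACGGTCAGGCTAGCT"
print(percent_identity(ref, clone))  # 98.0
```

At this fragment length, a single mismatch already drops identity to 98%, which is why near-full-length 16S sequences are needed to apply a >99% threshold meaningfully.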
An, Younghwa
2012-01-01
Recently, many biometrics-based user authentication schemes using smart cards have been proposed to address security weaknesses in user authentication systems. In 2011, Das proposed an efficient biometric-based remote user authentication scheme using smart cards that can provide strong authentication and mutual authentication. In this paper, we analyze the security of Das's authentication scheme and show that it is still insecure against various attacks. We also propose an enhanced scheme that removes these security problems of Das's authentication scheme, even if the secret information stored in the smart card is revealed to an attacker. Our security analysis shows that the enhanced scheme is secure against the user impersonation attack, the server masquerading attack, the password guessing attack, and the insider attack, and provides mutual authentication between the user and the server. PMID:22899887
NASA Astrophysics Data System (ADS)
Chen, BinQiang; Zhang, ZhouSuo; Zi, YanYang; He, ZhengJia; Sun, Chuang
2013-10-01
Detecting transient vibration signatures is of vital importance for vibration-based condition monitoring and fault detection of rotating machinery. However, raw mechanical signals collected by vibration sensors are generally mixtures of the physical vibrations of the multiple mechanical components installed in the examined machinery, and incipient fault-generated vibration signatures masked by interfering content are difficult to identify. The fast kurtogram (FK) is a concise and efficient tool for characterizing these vibration features, but the multi-rate filter-bank (MRFB) and the spectral kurtosis (SK) indicator of the FK are less powerful when strong interfering vibration content exists, especially when the FK is applied to vibration signals of short duration. Impulsive interfering content not authentically induced by mechanical faults complicates the selection of the optimal analysis subband, so the original FK may miss the essential fault signatures. To enhance the performance of the FK for industrial applications, an improved version, named the "fast spatial-spectral ensemble kurtosis kurtogram", is presented. In the proposed technique, discrete quasi-analytic wavelet tight frame (QAWTF) expansion methods are incorporated as the detection filters. The QAWTF, constructed on the basis of the dual-tree complex wavelet transform, possesses better transient-signature extraction ability and enhanced time-frequency localizability compared with conventional wavelet packet transforms (WPTs). Moreover, in the constructed QAWTF, a non-dyadic ensemble wavelet subband generating strategy is put forward to produce extra wavelet subbands capable of identifying fault features located in the transition bands of the WPT.
In addition, an enhanced indicator of signal impulsiveness, named "spatial-spectral ensemble kurtosis" (SSEK), is put forward and used as the quantitative measure for selecting the optimal analyzing parameters. The SSEK indicator is more robust in evaluating the impulsiveness of vibration signals owing to its better suppression of Gaussian noise, harmonics and sporadic impulsive shocks. Numerical validations, an experimental test and two engineering applications were used to verify the effectiveness of the proposed technique. The results demonstrate that the proposed technique detects transient vibration content more robustly than the original FK and the WPT-based FK method, especially when applied to vibration signals of relatively limited duration.
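The spectral kurtosis statistic that underlies the kurtogram family of methods can be sketched directly from a short-time Fourier transform: SK is near zero on frequency bands carrying stationary Gaussian noise and rises on bands carrying impulsive, fault-like content. The following is a minimal illustration using a plain framed FFT rather than the paper's QAWTF filter bank; the signal and impulse parameters are invented:

```python
import numpy as np

def spectral_kurtosis(x, nperseg=128):
    """Estimate spectral kurtosis per frequency bin from framed FFTs.

    SK(f) = E|X(t,f)|^4 / (E|X(t,f)|^2)^2 - 2, which is ~0 for
    stationary Gaussian noise and rises with impulsive content.
    """
    n = len(x) // nperseg
    frames = x[: n * nperseg].reshape(n, nperseg)
    window = np.hanning(nperseg)
    X = np.fft.rfft(frames * window, axis=1)
    p2 = np.mean(np.abs(X) ** 2, axis=0)
    p4 = np.mean(np.abs(X) ** 4, axis=0)
    return p4 / p2 ** 2 - 2.0

rng = np.random.default_rng(1)
noise = rng.normal(size=16384)

# Add sparse large impulses (simulated fault shocks) to the noise.
impulsive = noise.copy()
impulsive[64::1024] += 40.0

sk_noise = spectral_kurtosis(noise)
sk_fault = spectral_kurtosis(impulsive)
print(sk_noise.mean(), sk_fault.mean())
```

A kurtogram then repeats this kind of computation over a dyadic grid of band/bandwidth combinations and picks the band maximizing the indicator; the paper's SSEK replaces plain SK with a more interference-robust measure.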
Development, implementation and evaluation of satellite-aided agricultural monitoring systems
NASA Technical Reports Server (NTRS)
Cicone, R. C.; Crist, E. P.; Metzler, M.; Nuesch, D.
1982-01-01
Research activities in support of AgRISTARS Inventory Technology Development Project in the use of aerospace remote sensing for agricultural inventory described include: (1) corn and soybean crop spectral temporal signature characterization; (2) efficient area estimation techniques development; and (3) advanced satellite and sensor system definition. Studies include a statistical evaluation of the impact of cultural and environmental factors on crop spectral profiles, the development and evaluation of an automatic crop area estimation procedure, and the joint use of SEASAT-SAR and LANDSAT MSS for crop inventory.
ERIC Educational Resources Information Center
Lüddecke, Florian
2016-01-01
Whereas the importance of authenticity in relation to educational contexts has been highlighted, educational authenticity (EA) has mainly referred to a real-life/world convergence or the notion of teacher authenticity, implying that authenticity can be taught and learnt. This view, however, has largely overlooked philosophical considerations so…
ERIC Educational Resources Information Center
Nematollahi, Shirin; Maghsoudi, Mojtaba
2015-01-01
In this current study the researchers have tried to investigate the possible effect of authentic and non-authentic texts on Iranian EFL learners' vocabulary retention. Despite the great deal of studies conducted in the area of EFL/ESL learning, the effect of authentic versus non-authentic texts have almost gained little attention and been…
Fulfillment of HTTP Authentication Based on Alcatel OmniSwitch 9700
NASA Astrophysics Data System (ADS)
Liu, Hefu
This paper presents a method of HTTP authentication on the Alcatel OmniSwitch 9700. Authenticated VLANs control user access to network resources based on VLAN assignment and user authentication. The user can be authenticated through the switch via any standard web browser, which displays the username and password prompts. HTML forms can then be used to pass the HTTP authentication data when they are submitted. A RADIUS server provides a database of user information that the switch checks whenever a user tries to authenticate through the switch. Before or after authentication, the client can obtain an address from a DHCP server.
Optical identification using imperfections in 2D materials
NASA Astrophysics Data System (ADS)
Cao, Yameng; Robson, Alexander J.; Alharbi, Abdullah; Roberts, Jonathan; Woodhead, Christopher S.; Noori, Yasir J.; Bernardo-Gavito, Ramón; Shahrjerdi, Davood; Roedig, Utz; Fal'ko, Vladimir I.; Young, Robert J.
2017-12-01
The ability to uniquely identify an object or device is important for authentication. Imperfections, locked into structures during fabrication, can be used to provide a fingerprint that is challenging to reproduce. In this paper, we propose a simple optical technique to read unique information from nanometer-scale defects in 2D materials. Imperfections created during crystal growth or fabrication lead to spatial variations in the bandgap of 2D materials that can be characterized through photoluminescence measurements. We show a simple setup involving an angle-adjustable transmission filter, simple optics and a CCD camera can capture spatially-dependent photoluminescence to produce complex maps of unique information from 2D monolayers. Atomic force microscopy is used to verify the origin of the optical signature measured, demonstrating that it results from nanometer-scale imperfections. This solution to optical identification with 2D materials could be employed as a robust security measure to prevent counterfeiting.
Virtual-optical information security system based on public key infrastructure
NASA Astrophysics Data System (ADS)
Peng, Xiang; Zhang, Peng; Cai, Lilong; Niu, Hanben
2005-01-01
A virtual-optical-based encryption model with the aid of a public key infrastructure (PKI) is presented in this paper. The proposed model employs a hybrid architecture in which our previously published encryption method based on a virtual-optics scheme (VOS) is used to encipher and decipher data, while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). The whole information security model runs within the framework of the international standard ITU-T X.509 PKI, which is based on public-key cryptography and digital signatures. This PKI-based VOS security approach provides additional features such as confidentiality, authentication, and integrity for data encryption in a network environment. Numerical experiments prove the effectiveness of the method. The security of the proposed model is briefly analyzed by examining some possible attacks from the viewpoint of cryptanalysis.
Graph State-Based Quantum Group Authentication Scheme
NASA Astrophysics Data System (ADS)
Liao, Longxia; Peng, Xiaoqi; Shi, Jinjing; Guo, Ying
2017-02-01
Motivated by the elegant structure of the graph state, we design an ingenious quantum group authentication scheme, which is implemented by performing appropriate operations on the graph state and can solve the problem of multi-user authentication. Three entities are included: the group authentication server (GAS) as verifier, multiple users as provers, and the trusted third party Trent. GAS and Trent assist the multiple users in completing the authentication process, i.e., GAS is responsible for registering all the users while Trent prepares the graph states. All the users who request authentication encode their authentication keys onto the graph state by performing Pauli operators. This demonstrates that a novel authentication scheme can be achieved through flexible use of the graph state, which can authenticate a large number of users synchronously while provable security is guaranteed.
22 CFR 92.36 - Authentication defined.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Authentication defined. 92.36 Section 92.36... Notarial Acts § 92.36 Authentication defined. An authentication is a certification of the genuineness of... recognized in another jurisdiction. Documents which may require authentication include legal instruments...
Authentic feminist? Authenticity and feminist identity in teenage feminists' talk.
Calder-Dawe, Octavia; Gavey, Nicola
2017-12-01
This article explores how young people's feminist identities take shape in conjunction with a contemporary ideal of personal authenticity: to know and to express the 'real me'. Drawing from interviews with 18 teenagers living in Auckland, New Zealand, we examine a novel convergence of authenticity and feminism in participants' identity talk. For social psychologists interested in identity and politics, this convergence is intriguing: individualizing values such as authenticity are generally associated with disengagement with structural critique and with a repudiation of politicized and activist identities. Rather than seeking to categorize authentic feminism as an instance of either 'good/collective' or 'bad/individualized' feminist politics, we use discourse analysis to examine how the identity position of authentic feminist was constructed and to explore implications for feminist politics. On one hand, interviewees mobilized authentic feminism to affirm their commitment to normative liberal values of authenticity and self-expression. At the same time, the position of authentic feminist appeared to authorize risky feminist identifications and to justify counter-normative feelings, desires, and actions. To conclude, we explore how encountering others' intolerance of authentic feminism exposed interviewees to the limits of authenticity discourse, propelling some towards new understandings of the social world and their space for action within it. © 2017 The British Psychological Society.
22 CFR 92.38 - Forms of certificate of authentication.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Forms of certificate of authentication. 92.38... SERVICES Specific Notarial Acts § 92.38 Forms of certificate of authentication. The form of a certificate of authentication depends on the statutory requirements of the jurisdiction where the authenticated...
18 CFR 375.102 - Custody and authentication of Commission records.
Code of Federal Regulations, 2010 CFR
2010-04-01
... authentication of Commission records. 375.102 Section 375.102 Conservation of Power and Water Resources FEDERAL... Provisions § 375.102 Custody and authentication of Commission records. (a) Custody of official records. (1...) Authentication of Commission action. All orders and other actions of the Commission shall be authenticated or...
ERIC Educational Resources Information Center
Bialystok, Lauren
2015-01-01
Authenticity is often touted as an important virtue for teachers. But what do we mean when we say that a teacher ought to be "authentic"? Research shows that discussions of teacher authenticity frequently refer to other character traits or simply to teacher effectiveness, but authenticity is a unique concept with a long philosophical…
Richard Peters and Valuing Authenticity
ERIC Educational Resources Information Center
Degenhardt, M. A. B.
2009-01-01
Richard Peters has been praised for the authenticity of his philosophy, and inquiry into aspects of the development of his philosophy reveals a profound authenticity. Yet authenticity is something he seems not to favour. The apparent paradox is resolved by observing historical changes in the understanding of authenticity as an important value.…
Defining the questions: a research agenda for nontraditional authentication in arms control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hauck, Danielle K; Mac Arthur, Duncan W; Smith, Morag K
Many traditional authentication techniques have been based on hardware solutions. Thus authentication of measurement system hardware has been considered in terms of physical inspection and destructive analysis. Software authentication has implied hash function analysis or authentication tools such as Rose. Continuity of knowledge is maintained through TIDs and cameras. Although there is ongoing progress improving all of these authentication methods, there has been little discussion of the human factors involved in authentication. Issues of non-traditional authentication include sleight-of-hand substitutions, monitor perception vs. reality, and visual diversions. Since monitor confidence in a measurement system depends on the product of their confidences in each authentication element, it is important to investigate all authentication techniques, including the human factors. This paper will present an initial effort to identify the most important problems that traditional authentication approaches in safeguards have not addressed and are especially relevant to arms control verification. This will include a survey of the literature and direct engagement with nontraditional experts in areas like psychology and human factors. Based on the identification of problem areas, potential research areas will be identified and a possible research agenda will be developed.
Do We Need to Design Course-Based Undergraduate Research Experiences for Authenticity?
Rowland, Susan; Pedwell, Rhianna; Lawrie, Gwen; Lovie-Toon, Joseph; Hung, Yu
2016-01-01
The recent push for more authentic teaching and learning in science, technology, engineering, and mathematics indicates a shared agreement that undergraduates require greater exposure to professional practices. There is considerable variation, however, in how “authentic” science education is defined. In this paper we present our definition of authenticity as it applies to an “authentic” large-scale undergraduate research experience (ALURE); we also look to the literature and the student voice for alternate perceptions around this concept. A metareview of science education literature confirmed the inconsistency in definitions and application of the notion of authentic science education. An exploration of how authenticity was explained in 604 reflections from ALURE and traditional laboratory students revealed contrasting and surprising notions and experiences of authenticity. We consider the student experience in terms of alignment with 1) the intent of our designed curriculum and 2) the literature definitions of authentic science education. These findings contribute to the conversation surrounding authenticity in science education. They suggest two things: 1) educational experiences can have significant authenticity for the participants, even when there is no purposeful design for authentic practice, and 2) the continuing discussion of and design for authenticity in UREs may be redundant. PMID:27909029
Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier
2009-01-01
The increasing capability of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is oriented towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used in the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989
Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; McLaughlin, Gregory; Lednev, Igor K
2013-09-01
Body fluid traces recovered at crime scenes are among the most common and important types of forensic evidence. However, the ability to characterize a biological stain at a crime scene nondestructively has not yet been demonstrated. Here, we expand the Raman spectroscopic approach for the identification of dry traces of pure body fluids to address the problem of heterogeneous contamination, which can impair the performance of conventional methods. The concept of multidimensional Raman signatures was utilized for the identification of blood in dry traces contaminated with sand, dust, and soil. Multiple Raman spectra were acquired from the samples via automatic scanning, and the contribution of blood was evaluated through the fitting quality using spectroscopic signature components. The spatial mapping technique allowed for detection of "hot spots" dominated by blood contribution. The proposed method has great potential for blood identification in highly contaminated samples. © 2013 American Academy of Forensic Sciences.
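The fitting-quality idea described above, evaluating a measured spectrum against a library of signature components, can be sketched with a non-negative least-squares fit. The "spectra" below are synthetic Gaussian bands, not real Raman signatures, and plain NNLS stands in for the paper's multidimensional-signature fitting:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
axis = np.linspace(0, 1, 400)  # normalized spectral axis

def band(center, width):
    """A single Gaussian band on the spectral axis."""
    return np.exp(-((axis - center) / width) ** 2)

# Hypothetical signature components (invented band positions).
blood = band(0.3, 0.02) + 0.6 * band(0.55, 0.03)
sand = band(0.8, 0.05)

# Simulated measurement: a blood-dominated spot with contamination.
measured = 0.7 * blood + 0.3 * sand + 0.01 * rng.normal(size=axis.size)

# Non-negative least-squares fit against the component library;
# the fitted coefficients estimate each component's contribution.
A = np.column_stack([blood, sand])
coef, resid = nnls(A, measured)
blood_fraction = coef[0] / coef.sum()
print(coef, blood_fraction)
```

Scanning such a fit over a spatial map of spectra and thresholding the blood coefficient is one way to locate the "hot spots" the abstract describes.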
Automatic NMR-Based Identification of Chemical Reaction Types in Mixtures of Co-Occurring Reactions
Latino, Diogo A. R. S.; Aires-de-Sousa, João
2014-01-01
The combination of chemoinformatics approaches with NMR techniques and the increasing availability of data allow the resolution of problems far beyond the original application of NMR in structure elucidation/verification. Applications range from process monitoring and metabolic profiling to product authentication and quality control. An application related to the automatic analysis of complex mixtures concerns mixtures of chemical reactions. We encoded mixtures of chemical reactions with the difference between the 1H NMR spectra of the products and the reactants. All the signals arising from all the reactants of the co-occurring reactions were taken together (a simulated spectrum of the mixture of reactants), and the same was done for products. The difference spectrum is taken as the representation of the mixture of chemical reactions. A data set of 181 chemical reactions was used, each reaction manually assigned to one of 6 types. From this dataset, we simulated mixtures in which two reactions of different types occur simultaneously. Automatic learning methods were trained to classify the reactions occurring in a mixture from the 1H NMR-based descriptor of the mixture. Unsupervised learning methods (self-organizing maps) produced a reasonable clustering of the mixtures by reaction type, and allowed the correct classification of 80% and 63% of the mixtures in two independent test sets of different similarity to the training set. With random forests (RF), the percentage of correct classifications was increased to 99% and 80% for the same test sets. The RF probability associated with the predictions yielded a robust indication of their reliability. This study demonstrates the possibility of applying machine learning methods to automatically identify types of co-occurring chemical reactions from NMR data. 
Because no explicit structural information about the reaction participants is used, reaction elucidation is performed without structure elucidation of the molecules in the mixtures. PMID:24551112
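The descriptor construction described above can be sketched in a few lines. The peak lists, bin count, and ppm range below are invented for illustration; the paper's actual spectrum simulation and random-forest training are not reproduced.

```python
# Sketch of the 1H NMR difference-spectrum descriptor for a reaction mixture.
# Peak lists are hypothetical (ppm, intensity) pairs, not real spectra.

def binned_spectrum(peaks, n_bins=64, ppm_max=12.0):
    """Accumulate (ppm, intensity) peaks into a fixed-length vector."""
    spec = [0.0] * n_bins
    for ppm, intensity in peaks:
        idx = min(int(ppm / ppm_max * n_bins), n_bins - 1)
        spec[idx] += intensity
    return spec

def mixture_descriptor(reactant_peaks, product_peaks, n_bins=64):
    """Difference between the simulated product and reactant spectra."""
    reactants = binned_spectrum([p for mol in reactant_peaks for p in mol], n_bins)
    products = binned_spectrum([p for mol in product_peaks for p in mol], n_bins)
    return [p - r for p, r in zip(products, reactants)]

# Toy example: one signal disappears (11.5 ppm) and one appears (4.2 ppm),
# while a 3.8 ppm signal is common to both sides and cancels out.
reactants = [[(11.5, 1.0), (3.8, 2.0)]]
products = [[(4.2, 2.0), (3.8, 2.0)]]
desc = mixture_descriptor(reactants, products)
```

Positive entries mark signals appearing with the products, negative entries signals consumed with the reactants; such vectors would then feed the classifier.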
Sonic Boom Prediction and Minimization of the Douglas Reference OPT5 Configuration
NASA Technical Reports Server (NTRS)
Siclari, Michael J.
1999-01-01
Conventional CFD methods and grids do not yield adequate resolution of the complex shock flow pattern generated by a real aircraft geometry. As a result, a unique grid topology and supersonic flow solver were developed at Northrop Grumman based on the characteristic behavior of supersonic wave patterns emanating from the aircraft. Using this approach, it was possible to compute flow fields with adequate resolution several body lengths below the aircraft. In this region, three-dimensional effects are diminished and conventional two-dimensional modified linear theory (MLT) can be applied to estimate ground pressure signatures or sonic booms. To accommodate real aircraft geometries and alleviate the burdensome grid generation task, an implicit marching multi-block, multi-grid finite-volume Euler code was developed as the basis for the sonic boom prediction methodology. The Thomas two-dimensional extrapolation method is built into the Euler code so that ground signatures can be obtained quickly and efficiently with minimum computational effort, suitable to the aircraft design environment. The loudness levels of these signatures can then be determined using a NASA-generated noise code. Since the Euler code is a three-dimensional flow field solver, the complete circumferential region below the aircraft is computed. The extrapolation of all this field data from a cylinder of constant radius defines the entire boom corridor directly below and off to the side of the aircraft's flight path, yielding an estimate of the extent in miles of the noise "annoyance" corridor as well as its magnitude. An automated multidisciplinary sonic boom design optimization software system was developed during the latter part of HSR Phase 1. Using this system, it was found that sonic boom signatures could be reduced through optimization of a variety of geometric aircraft parameters. 
This system uses a gradient-based nonlinear optimizer as the driver in conjunction with a computationally efficient Euler CFD solver (NIIM3DSB) for computing the three-dimensional near-field characteristics of the aircraft. The intent of the design system is to identify and optimize geometric design variables that have a beneficial impact on the ground sonic boom. The system uses a simple wave drag data format to specify the aircraft geometry. The geometry is internally enhanced, and analytic methods are used to generate marching grids suitable for the multi-block Euler solver. The Thomas extrapolation method is integrated into this system; hence, the aircraft's centerline ground sonic boom signature is also automatically computed for a specified cruise altitude, yielding the parameters necessary to evaluate the design function. The entire design system has been automated, since the gradient-based optimization software requires many flow analyses to obtain the sensitivity derivatives for each design variable and converge on an optimal solution. Hence, once the problem is defined, including the objective function and the geometric and aerodynamic constraints, the system will automatically regenerate the perturbed geometry, the necessary grids, the Euler solution, and finally the ground sonic boom signature at the request of the optimizer.
Hamlet, Jason R; Pierson, Lyndon G
2014-10-21
Detection and deterrence of spoofing of user authentication may be achieved by including a cryptographic fingerprint unit within a hardware device for authenticating a user of the hardware device. The cryptographic fingerprint unit includes an internal physically unclonable function ("PUF") circuit disposed in or on the hardware device, which generates a PUF value. Combining logic is coupled to receive the PUF value and combines it with one or more other authentication factors to generate a multi-factor authentication value. A key generator is coupled to generate a private key and a public key based on the multi-factor authentication value, while a decryptor is coupled to receive an authentication challenge that is posed to the hardware device and encrypted with the public key, and to output a response to the authentication challenge decrypted with the private key.
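As a rough schematic of the flow in this abstract, the sketch below simulates the PUF with a keyed hash and stands in an HMAC challenge-response for the patented public/private-key decryptor, so it stays within the Python standard library. All names, seeds, and factors are hypothetical.

```python
import hashlib
import hmac
import os

# Schematic multi-factor fingerprint flow. The PUF circuit is simulated by a
# fixed "device secret" (stand-in for intrinsic silicon variation), and the
# asymmetric challenge/response is replaced by a symmetric HMAC one.

def puf_response(device_secret: bytes, challenge: bytes) -> bytes:
    """Stand-in for the physically unclonable function circuit."""
    return hashlib.sha256(device_secret + challenge).digest()

def multi_factor_value(puf_value: bytes, *factors: bytes) -> bytes:
    """Combining logic: fold the PUF value with other authentication factors."""
    h = hashlib.sha256(puf_value)
    for f in factors:
        h.update(f)
    return h.digest()

def answer_challenge(mfa_value: bytes, challenge: bytes) -> bytes:
    """Respond to an authentication challenge keyed by the MFA value."""
    return hmac.new(mfa_value, challenge, hashlib.sha256).digest()

device_secret = b"intrinsic-silicon-variation"     # hypothetical PUF seed
puf = puf_response(device_secret, b"enrollment")
mfa = multi_factor_value(puf, b"user-pin-1234")    # hypothetical second factor
challenge = os.urandom(16)
resp = answer_challenge(mfa, challenge)
```

A verifier holding the same factors recomputes the response and compares in constant time; changing any factor changes the derived value entirely.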
Examining the relationship between authenticity and self-handicapping.
Akin, Ahmet; Akin, Umran
2014-12-01
Self-handicapping includes strategies of externalization in which people excuse failure and internalize success, but which also prevent them from behaving in an authentic way. The goal was to investigate the relation of authenticity with self-handicapping. The study was conducted with 366 university students (176 men, 190 women; M age = 20.2 yr.). Participants completed the Turkish version of the Authenticity Scale and the Self-handicapping Scale. Self-handicapping was correlated positively with two factors of authenticity, accepting external influence and self-alienation, and negatively with the authentic living factor. A multiple regression analysis indicated that self-handicapping was predicted positively by self-alienation and accepting external influence, and negatively by authentic living, accounting for 21% of the variance collectively. These results demonstrated the negative association of authenticity with self-handicapping.
An Authentication Protocol for Future Sensor Networks.
Bilal, Muhammad; Kang, Shin-Gak
2017-04-28
Authentication is one of the essential security services in Wireless Sensor Networks (WSNs) for ensuring secure data sessions. Sensor node authentication ensures the confidentiality and validity of data collected by the sensor node, whereas user authentication guarantees that only legitimate users can access the sensor data. In a mobile WSN, sensor and user nodes move across the network and exchange data with multiple nodes, thus experiencing the authentication process multiple times. The integration of WSNs with the Internet of Things (IoT) brings forth a new kind of WSN architecture along with stricter security requirements; for instance, a sensor node or a user node may need to establish multiple concurrent secure data sessions. With concurrent data sessions, the frequency of the re-authentication process increases in proportion to the number of concurrent connections. Moreover, to establish multiple data sessions, it is essential that a protocol participant be capable of running multiple instances of the protocol, which makes the security issue even more challenging. The currently available authentication protocols were designed for autonomous WSNs and do not account for the above requirements. Hence, providing a lightweight and efficient authentication protocol has become crucial. In this paper, we present a novel, lightweight and efficient key exchange and authentication protocol suite called the Secure Mobile Sensor Network (SMSN) Authentication Protocol. In the SMSN, a mobile node goes through an initial authentication procedure and receives a re-authentication ticket from the base station. Later, the mobile node can use this re-authentication ticket when establishing multiple data exchange sessions and/or when moving across the network. This scheme reduces the communication and computational complexity of the authentication process. 
We proved the strength of our protocol with rigorous security analysis (including formal analysis using the BAN-logic) and simulated the SMSN and previously proposed schemes in an automated protocol verifier tool. Finally, we compared the computational complexity and communication cost against well-known authentication protocols.
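The re-authentication ticket idea can be illustrated with a minimal HMAC-based sketch. The key, fields, and lifetimes below are invented, and the real SMSN exchange (key establishment, BAN-logic-verified messages) is much richer than this.

```python
import hashlib
import hmac
import json
import time

# Minimal ticket issuance/verification sketch: the base station binds a node
# identity to an expiry time, and any gateway sharing the key can later admit
# the node with one MAC check instead of a full initial authentication.

BS_KEY = b"base-station-master-key"   # hypothetical key shared with gateways

def issue_ticket(node_id, lifetime_s, now=None):
    """Base station: create a time-bound re-authentication ticket."""
    now = time.time() if now is None else now
    body = {"node": node_id, "exp": now + lifetime_s}
    tag = hmac.new(BS_KEY, json.dumps(body, sort_keys=True).encode(),
                   hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def verify_ticket(ticket, now=None):
    """Peer: one MAC check plus an expiry check replaces re-authentication."""
    now = time.time() if now is None else now
    expected = hmac.new(BS_KEY, json.dumps(ticket["body"], sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, ticket["tag"]) and now < ticket["body"]["exp"]

ticket = issue_ticket("sensor-17", lifetime_s=600, now=1000.0)
```

Within the lifetime the node reuses the ticket for every new session or handoff; after expiry it falls back to the initial authentication procedure.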
Image authentication using distributed source coding.
Lin, Yao-Chung; Varodayan, David; Girod, Bernd
2012-01-01
We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.
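A toy version of the core idea, assuming a seeded random projection in place of the Slepian-Wolf coded projection and omitting the expectation-maximization decoder entirely: the authentication data is a coarsely quantized projection, and the verifier accepts if its copy of the image reprojects into the same bins.

```python
import random

# Toy image-authentication sketch: compact authentication data is a coarsely
# quantized pseudo-random projection; legitimate small variations tend to stay
# in the same quantization bins while large local edits tend not to.

def projection(image, n_coeffs=8, seed=42):
    """Pseudo-random projection of a 2-D pixel array (seed shared with verifier)."""
    rng = random.Random(seed)
    flat = [p for row in image for p in row]
    return [sum(rng.uniform(-1, 1) * p for p in flat) for _ in range(n_coeffs)]

def auth_data(image, step=50.0):
    """Quantized projection: the compact authentication data."""
    return [round(c / step) for c in projection(image)]

def authenticate(received_image, original_auth, step=50.0):
    """Accept if the received image reprojects into the same bins."""
    return auth_data(received_image, step) == original_auth

original = [[100, 120], [130, 90]]     # invented 2x2 "image"
ad = auth_data(original)
# A large local edit (e.g. replacing 90 by 250) will usually shift several
# coefficients into different bins and be rejected:
tampered = [[100, 120], [130, 250]]
```

The quantization step plays the role of the robustness/sensitivity trade-off that the paper's Slepian-Wolf rate controls.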
Evaluation of Grid Modification Methods for On- and Off-Track Sonic Boom Analysis
NASA Technical Reports Server (NTRS)
Nayani, Sudheer N.; Campbell, Richard L.
2013-01-01
Grid modification methods have been under development at NASA to enable better predictions of low-boom pressure signatures from supersonic aircraft. As part of this effort, two new codes, Stretched and Sheared Grid - Modified (SSG) and Boom Grid (BG), have been developed in the past year. The CFD results from these codes have been compared with those from the earlier grid modification codes Stretched and Sheared Grid (SSGRID) and Mach Cone Aligned Prism (MCAP), and also with the available experimental results. NASA's unstructured grid suite of software TetrUSS and the automatic sourcing code AUTOSRC were used for base grid generation and flow solutions. The BG method has been evaluated on three wind tunnel models. Pressure signatures have been obtained up to two body lengths below a Gulfstream aircraft wind tunnel model, with good agreement with the wind tunnel results for both on-track and off-track (up to 53 degrees) cases. On-track pressure signatures up to ten body lengths below a Straight Line Segmented Leading Edge (SLSLE) wind tunnel model have been extracted, again in good agreement with the wind tunnel results. Pressure signatures have been obtained at 1.5 body lengths below a Lockheed Martin aircraft wind tunnel model, with good agreement for both on-track and off-track (up to 40 degrees) cases. Grid sensitivity studies have been carried out to investigate any grid-size-related issues. Methods have been evaluated for fully turbulent, mixed laminar/turbulent, and fully laminar flow conditions.
Authentic leadership: application to women leaders
Hopkins, Margaret M.; O’Neil, Deborah A.
2015-01-01
The purpose of this perspective article is to present the argument that authentic leadership is a gendered representation of leadership. We first provide a brief history of leadership theories and definitions of authentic leadership. We then critique authentic leadership and offer arguments to support the premise that authentic leadership is not gender-neutral and is especially challenging for women. PMID:26236254
Xavier's Take on Authentic Writing: Structuring Choices for Expression and Impact
ERIC Educational Resources Information Center
Behizadeh, Nadia
2015-01-01
Because authenticity in education is a subjective judgment regarding the meaningfulness of an activity, a need exists to co-investigate with students classroom factors increasing authenticity of writing. In this case study, one 8th grade student's needs for authentic writing are explored in detail. Xavier's take on authentic writing…
Localized lossless authentication watermark (LAW)
NASA Astrophysics Data System (ADS)
Celik, Mehmet U.; Sharma, Gaurav; Tekalp, A. Murat; Saber, Eli S.
2003-06-01
A novel framework is proposed for lossless authentication watermarking of images which allows authentication and recovery of original images without any distortions. This overcomes a significant limitation of traditional authentication watermarks that irreversibly alter image data in the process of watermarking and authenticate the watermarked image rather than the original. In particular, authenticity is verified before full reconstruction of the original image, whose integrity is inferred from the reversibility of the watermarking procedure. This reduces computational requirements in situations when either the verification step fails or the zero-distortion reconstruction is not required. A particular instantiation of the framework is implemented using a hierarchical authentication scheme and the lossless generalized-LSB data embedding mechanism. The resulting algorithm, called localized lossless authentication watermark (LAW), can localize tampered regions of the image; has a low embedding distortion, which can be removed entirely if necessary; and supports public/private key authentication and recovery options. The effectiveness of the framework and the instantiation is demonstrated through examples.
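A simplified, non-reversible flavor of the authentication path can be sketched as follows: compute a keyed tag over the image with its LSB plane cleared, then embed the tag bits into those LSBs. The lossless part of LAW (reversibly storing the original LSBs via generalized-LSB embedding) and the hierarchical tamper localization are intentionally omitted, so this sketch authenticates but does not recover the original.

```python
import hashlib

# Simplified LSB authentication watermark: the tag covers only the MSB
# content, so embedding into the LSBs does not invalidate the tag, while any
# change to the MSB content (or to an embedded LSB) breaks verification.

def content_tag(pixels, key: bytes) -> bytes:
    """Keyed hash of the image content, ignoring the LSB plane."""
    cleared = bytes(p & 0xFE for p in pixels)
    return hashlib.sha256(key + cleared).digest()

def embed(pixels, key: bytes):
    """Write the 256 tag bits into the least significant bits."""
    bits = [(b >> i) & 1 for b in content_tag(pixels, key) for i in range(8)]
    out = list(pixels)
    for i, bit in enumerate(bits[:len(out)]):
        out[i] = (out[i] & 0xFE) | bit
    return out

def verify(pixels, key: bytes) -> bool:
    """Recompute the tag from the MSB content and compare with the LSBs."""
    bits = [(b >> i) & 1 for b in content_tag(pixels, key) for i in range(8)]
    return all((pixels[i] & 1) == bit for i, bit in enumerate(bits[:len(pixels)]))

image = list(range(256))                 # toy flattened 16x16 grayscale image
watermarked = embed(image, key=b"secret")
```

The embedding distortion is at most one gray level per pixel, mirroring the "low embedding distortion" property of the full scheme.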
How to Speak an Authentication Secret Securely from an Eavesdropper
NASA Astrophysics Data System (ADS)
O'Gorman, Lawrence; Brotman, Lynne; Sammon, Michael
When authenticating over the telephone or a mobile headset, the user cannot always ensure that no eavesdropper hears the password or authentication secret. We describe an eavesdropper-resistant, challenge-response authentication scheme for spoken authentication where an attacker can hear the user's voiced responses. This scheme requires the user to memorize a small number of plaintext-ciphertext pairs. At authentication, these are challenged in random order and interspersed with camouflage elements. It is shown that the response can be made to appear random, so that no information about the memorized secret can be learned by eavesdroppers. We describe the method along with parameter-value tradeoffs among security strength, authentication time, and memory effort. This scheme was designed for user authentication of wireless headsets used for hands-free communication by healthcare staff at a hospital.
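The challenge-response structure can be sketched as below; the word-digit pairs and camouflage vocabulary are invented, and the voice channel is reduced to strings. Because camouflage items are answered with arbitrary digits, a transcript of the spoken answers alone does not reveal which responses carried the secret.

```python
import random

# Sketch of an eavesdropper-resistant spoken challenge-response: memorized
# plaintext->ciphertext pairs are challenged in random order among camouflage
# items whose answers are free, so responses look like a random digit string.

SECRET_PAIRS = {"red": "7", "blue": "3", "gold": "9"}   # memorized by the user
CAMOUFLAGE = ["oak", "pine", "elm"]                     # answered with any digit

def make_challenge(rng):
    """System: shuffle secret prompts and camouflage prompts together."""
    items = list(SECRET_PAIRS) + CAMOUFLAGE
    rng.shuffle(items)
    return items

def user_response(items, rng):
    """User: speak the memorized digit for secrets, any digit for camouflage."""
    return [SECRET_PAIRS[w] if w in SECRET_PAIRS else str(rng.randrange(10))
            for w in items]

def server_verify(items, spoken):
    """Server: only the secret positions must match; camouflage is ignored."""
    return all(SECRET_PAIRS[w] == a
               for w, a in zip(items, spoken) if w in SECRET_PAIRS)

rng = random.Random(0)
items = make_challenge(rng)
spoken = user_response(items, rng)
```

Security strength, authentication time, and memory load trade off through the number of pairs and camouflage items, echoing the parameter tradeoffs discussed in the paper.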
A Lightweight Continuous Authentication Protocol for the Internet of Things.
Chuang, Yo-Hsuan; Lo, Nai-Wei; Yang, Cheng-Ying; Tang, Ssu-Wei
2018-04-05
Modern societies are moving toward an information-oriented environment. To gather and utilize information in modern life, tiny devices with all kinds of sensors and gateways of various sizes need to be deployed and connected with each other through the Internet or proxy-based wireless sensor networks (WSNs). Within this kind of Internet of Things (IoT) environment, how two communicating devices authenticate each other is a fundamental security issue. As many IoT devices are powered by batteries and need to transmit sensed data periodically, it is necessary for IoT devices to adopt a lightweight authentication protocol to reduce their energy consumption when a device wants to authenticate and transmit data to its targeted peer. In this paper, a lightweight continuous authentication protocol for sensing devices and gateway devices in general IoT environments is introduced. The concept of a valid authentication time period is proposed to enhance the robustness of authentication between IoT devices. To construct the proposed lightweight continuous authentication protocol, token techniques and dynamic features of IoT devices are adopted to reach the design goals: reduced time consumption for consecutive authentications, and energy saving for authenticating devices by reducing the computational complexity during session establishment of continuous authentication. Security analysis is conducted to evaluate the security strength of the proposed protocol. In addition, performance analysis shows that the proposed protocol is a strong competitor among existing protocols for device-to-device authentication in IoT environments.
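One classic lightweight construction in this spirit is a hash chain: a single expensive initial authentication commits the verifier to the chain tip, after which each subsequent message is authenticated with one hash computation. This is an illustrative stand-in for continuous authentication in general, not the paper's exact token design.

```python
import hashlib

# Hash-chain continuous authentication sketch: the device precomputes a chain
# and reveals it in reverse order; the verifier checks each token with a
# single SHA-256 call, then advances its state to the revealed value.

def hash_chain(seed: bytes, n: int):
    """Device side: build the chain; only the final link is ever published."""
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(n):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def continuous_auth(token: bytes, verifier_state: bytes):
    """Verifier side: one-hash check per message; returns (ok, new_state)."""
    if hashlib.sha256(token).digest() == verifier_state:
        return True, token
    return False, verifier_state

chain = hash_chain(b"device-seed", 5)      # hypothetical enrollment secret
verifier_state = chain[-1]                 # learned during initial authentication
ok1, verifier_state = continuous_auth(chain[-2], verifier_state)
ok2, verifier_state = continuous_auth(chain[-3], verifier_state)
forged_ok, _ = continuous_auth(b"forged", verifier_state)
```

A validity window like the paper's "valid authentication time period" could be added by bounding how long a chain may be consumed before a fresh initial authentication is required.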
The flavor-locked flavorful two Higgs doublet model
NASA Astrophysics Data System (ADS)
Altmannshofer, Wolfgang; Gori, Stefania; Robinson, Dean J.; Tuckler, Douglas
2018-03-01
We propose a new framework to generate the Standard Model (SM) quark flavor hierarchies in the context of two Higgs doublet models (2HDM). The `flavorful' 2HDM couples the SM-like Higgs doublet exclusively to the third quark generation, while the first two generations couple exclusively to an additional source of electroweak symmetry breaking, potentially generating striking collider signatures. We synthesize the flavorful 2HDM with the `flavor-locking' mechanism, that dynamically generates large quark mass hierarchies through a flavor-blind portal to distinct flavon and hierarchon sectors: dynamical alignment of the flavons allows a unique hierarchon to control the respective quark masses. We further develop the theoretical construction of this mechanism, and show that in the context of a flavorful 2HDM-type setup, it can automatically achieve realistic flavor structures: the CKM matrix is automatically hierarchical with | V cb | and | V ub | generically of the observed size. Exotic contributions to meson oscillation observables may also be generated, that may accommodate current data mildly better than the SM itself.
Operation and control software for APNEA
DOE Office of Scientific and Technical Information (OSTI.GOV)
McClelland, J.H.; Storm, B.H. Jr.; Ahearn, J.
1997-11-01
The human interface software for the Lockheed Martin Specialty Components (LMSC) Active/Passive Neutron Examination & Analysis System (APNEA) provides a user-friendly operating environment for the movement and analysis of waste drums. It is written in Microsoft Visual C++ on a Windows NT platform. Object-oriented and multitasking techniques are used extensively to maximize the capability of the system. A waste drum is placed on a loading platform with a fork lift and then automatically moved into the APNEA chamber in preparation for analysis. A series of measurements is performed, controlled by menu commands to hardware components attached as peripheral devices, in order to create data files for analysis. The analysis routines use the files to identify the pertinent radioactive characteristics of the drum, including the type, location, and quantity of fissionable material. At the completion of the measurement process, the drum is automatically unloaded and the data are archived in preparation for storage as part of the drum's data signature. 3 figs.
System steganalysis with automatic fingerprint extraction
Sloan, Tom; Hernandez-Castro, Julio; Isasi, Pedro
2018-01-01
This paper tackles the modern challenge of practical steganalysis over large data by presenting a novel approach whose aim is to perform with perfect accuracy and in a completely automatic manner. The objective is to detect changes introduced by the steganographic process in data objects, including signatures related to the tools being used. Our approach achieves this by first extracting reliable regularities through the analysis of pairs of modified and unmodified data objects; it then combines these findings into general patterns present in the data used for training. Finally, we construct a Naive Bayes model that is used to perform classification and operates on attributes extracted using the aforementioned patterns. This technique has been applied to different steganographic tools that operate on media files of several types. We are able to replicate or improve on a number of previously published results, but more importantly, we also present new steganalytic findings for a number of popular tools that had no previously known attacks. PMID:29694366
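The final classification stage can be illustrated with a tiny Bernoulli Naive Bayes over boolean "fingerprint pattern present" attributes. The byte patterns and training files below are invented, and the paper's automatic pattern-extraction step is not reproduced.

```python
import math

# Bernoulli Naive Bayes over boolean attributes: each attribute records
# whether a (hypothetical) tool-signature byte pattern occurs in the file.

PATTERNS = [b"\x00STEGO", b"\xff\xd9\x00\x00"]   # invented tool fingerprints

def attributes(data: bytes):
    return [p in data for p in PATTERNS]

def train(samples):
    """samples: list of (attribute_vector, label); collects per-class counts."""
    stats = {}
    for attrs, label in samples:
        cls = stats.setdefault(label, {"n": 0, "on": [0] * len(attrs)})
        cls["n"] += 1
        for i, a in enumerate(attrs):
            cls["on"][i] += int(a)
    return stats

def classify(stats, attrs):
    """Pick the class maximizing the Laplace-smoothed log-posterior."""
    total = sum(c["n"] for c in stats.values())
    best, best_lp = None, -math.inf
    for label, c in stats.items():
        lp = math.log(c["n"] / total)
        for i, a in enumerate(attrs):
            p_on = (c["on"][i] + 1) / (c["n"] + 2)
            lp += math.log(p_on if a else 1 - p_on)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

clean_example = b"plain jpeg data\xff\xd9"
stego_example = b"payload\x00STEGOtail\xff\xd9"
train_set = ([(attributes(clean_example), "clean")] * 5 +
             [(attributes(stego_example), "stego")] * 5)
stats = train(train_set)
```

In the paper the patterns themselves are learned from clean/stego file pairs; here they are fixed by hand so only the classifier is exercised.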
NASA Astrophysics Data System (ADS)
Sridevi, B.; Supriya, T. S.; Rajaram, S.
2013-01-01
The current generation of wireless networks has been designed predominantly to support voice and, more recently, data traffic. WiMAX is currently one of the hottest technologies in wireless. The main motive of mobile technologies is to provide seamless, cost-effective mobility, but this is hindered by authentication cost and handover delay, since on each handoff the Mobile Station (MS) has to undergo all steps of authentication. Pre-authentication is used to reduce the handover delay and increase the speed of the intra-ASN handover. The proposed pre-authentication method is intended to reduce the authentication delay by having the MS pre-authenticated by a central authority called the Pre-Authentication Authority (PAA). The MS requests a Pre-Authentication Certificate (PAC) from the PAA before performing handoff. The PAA verifies the identity of the MS and provides the PAC to the MS and also to the neighboring target Base Stations (tBSs). An MS holding a time-bound PAC can skip the authentication process when recognized by the target BS during handoff. The scheme also prevents denial-of-service (DoS) and replay attacks, and avoids wasteful, unnecessary key exchanges. The proposed work is simulated with an NS2 model and in MATLAB.
NASA Technical Reports Server (NTRS)
Sparrow, Victor W.; Gionfriddo, Thomas A.
1994-01-01
In this study there were two primary tasks. The first was to develop an algorithm for quantifying the distortion in a sonic boom. Such an algorithm should be somewhat automatic, with minimal human intervention. Once the algorithm was developed, it was used to test the hypothesis that the cause of a sonic boom distortion was due to atmospheric turbulence. This hypothesis testing was the second task. Using readily available sonic boom data, we statistically tested whether there was a correlation between the sonic boom distortion and the distance a boom traveled through atmospheric turbulence.
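The correlation test in the second task can be sketched with a plain Pearson coefficient; both series below are invented placeholders for the distortion metric and the turbulence path length, not the study's actual sonic boom data.

```python
import math

# Pearson correlation between a boom-distortion metric and the distance the
# boom traveled through turbulence (both series are made-up illustrations).

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

turbulence_path_km = [1.0, 2.0, 3.0, 4.0, 5.0]
distortion_metric = [0.11, 0.18, 0.33, 0.38, 0.52]   # invented distortion scores
r = pearson_r(turbulence_path_km, distortion_metric)  # close to 1 for this toy data
```

A significance test on r (e.g. a t-test with n-2 degrees of freedom) would then decide whether the observed correlation supports the turbulence hypothesis.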
The multiscale nature of magnetic pattern on the solar surface
NASA Astrophysics Data System (ADS)
Scardigli, S.; Del Moro, D.; Berrilli, F.
Multiscale magnetic underdense regions (voids) appear in high-resolution magnetograms of the quiet solar surface. These regions may be considered a signature of the underlying convective structure, and the study of the associated pattern paves the way for the study of turbulent convective scales from granular to global. To address the question of whether the magnetic pattern is driven by turbulent convection, we used a novel automatic void detection method to calculate void distributions. The absence of preferred scales of organization in the calculated distributions supports the multiscale nature of flows on the solar surface and the absence of preferred convective scales.
Authentic leadership: a new theory for nursing or back to basics?
Wong, Carol; Cummings, Greta
2009-01-01
Authentic leadership is an emerging theoretical model purported to focus on the root component of effective leadership. The purpose of this paper is to describe the relevance of authentic leadership to the advancement of nursing leadership practice and research and address the question of whether this is a new theory for leadership or an old one in new packaging. The paper outlines the origins and key elements of the model, assesses the theoretical, conceptual and measurement issues associated with authentic leadership and compares it with other leadership theories frequently reported in the nursing literature. The emerging authentic leadership theory holds promise for explaining the underlying processes by which authentic leaders and followers influence work outcomes and organizational performance. Construct validity of authentic leadership has preliminary documentation and a few studies have shown positive relationships between authenticity and trust. Furthermore, the clarity of the authenticity construct and comprehensiveness of the overall theoretical framework provide a fruitful base for future research examining the relationship between authentic leadership and the creation of healthier work environments. A clear focus on the relational aspects of leadership, the foundational moral/ethical component, a potential linkage of positive psychological capital to work engagement and the emphasis on leader and follower development in the authentic leadership framework are closely aligned to current and future nursing leadership practice and research priorities for the creation of sustainable changes in nursing work environments.
Drolet, Matthis; Schubotz, Ricarda I; Fischer, Julia
2013-06-01
Context has been found to have a profound effect on the recognition of social stimuli and correlated brain activation. The present study was designed to determine whether knowledge about emotional authenticity influences emotion recognition expressed through speech intonation. Participants classified emotionally expressive speech in an fMRI experimental design as sad, happy, angry, or fearful. For some trials, stimuli were cued as either authentic or play-acted in order to manipulate participant top-down belief about authenticity, and these labels were presented both congruently and incongruently to the emotional authenticity of the stimulus. Contrasting authentic versus play-acted stimuli during uncued trials indicated that play-acted stimuli spontaneously up-regulate activity in the auditory cortex and regions associated with emotional speech processing. In addition, a clear interaction effect of cue and stimulus authenticity showed up-regulation in the posterior superior temporal sulcus and the anterior cingulate cortex, indicating that cueing had an impact on the perception of authenticity. In particular, when a cue indicating an authentic stimulus was followed by a play-acted stimulus, additional activation occurred in the temporoparietal junction, probably pointing to increased load on perspective taking in such trials. While actual authenticity has a significant impact on brain activation, individual belief about stimulus authenticity can additionally modulate the brain response to differences in emotionally expressive speech.
Lightweight ECC based RFID authentication integrated with an ID verifier transfer protocol.
He, Debiao; Kumar, Neeraj; Chilamkurti, Naveen; Lee, Jong-Hyouk
2014-10-01
The radio frequency identification (RFID) technology has been widely adopted and is being deployed as a dominant identification technology in the health care domain, for applications such as medical information authentication, patient tracking, and blood transfusion medicine. With increasingly stringent security and privacy requirements on RFID-based authentication schemes, elliptic curve cryptography (ECC) based RFID authentication schemes have been proposed to meet the requirements. However, many recently published ECC-based RFID authentication schemes have serious security weaknesses. In this paper, we propose a new ECC-based RFID authentication scheme integrated with an ID verifier transfer protocol that overcomes the weaknesses of the existing schemes. A comprehensive security analysis has been conducted to show the strong security properties provided by the proposed authentication scheme. Moreover, the performance of the proposed authentication scheme is analyzed in terms of computational cost, communication cost, and storage requirement.
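ECC primitives are not in the Python standard library, so the sketch below runs the same challenge-response shape (a Schnorr-style identification) in a toy prime-order multiplicative group; replacing the modular exponentiations with elliptic-curve scalar multiplications recovers an ECC variant. The parameters are illustrative, not secure, and this is not the authors' protocol.

```python
import random

# Schnorr-style tag/reader identification in a toy group: the reader learns
# only the tag's public identity y = G^x, and a valid transcript (t, c, s)
# satisfies G^s == t * y^c without revealing the private key x.

P = 2267                       # toy prime modulus (illustrative only)
Q = 103                        # prime with Q | P-1 (2266 = 2 * 11 * 103)
G = pow(2, (P - 1) // Q, P)    # generator of the order-Q subgroup

def keygen(rng):
    x = rng.randrange(1, Q)            # tag's private key
    return x, pow(G, x, P)             # (private key, public identity)

def tag_commit(rng):
    r = rng.randrange(1, Q)
    return r, pow(G, r, P)             # (nonce, commitment)

def tag_respond(r, x, c):
    return (r + c * x) % Q

def reader_verify(y, t, c, s):
    return pow(G, s, P) == (t * pow(y, c, P)) % P

rng = random.Random(7)
x, y = keygen(rng)                     # enrollment: reader stores public y
r, t = tag_commit(rng)                 # tag sends commitment t
c = rng.randrange(1, Q)                # reader sends random challenge c
s = tag_respond(r, x, c)               # tag answers with response s
```

The three modular exponentiations here map one-for-one onto the point multiplications an ECC scheme would perform on tag and reader.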
Optical authentication based on moiré effect of nonlinear gratings in phase space
NASA Astrophysics Data System (ADS)
Liao, Meihua; He, Wenqi; Wu, Jiachen; Lu, Dajiang; Liu, Xiaoli; Peng, Xiang
2015-12-01
An optical authentication scheme based on the moiré effect of nonlinear gratings in phase space is proposed. According to the phase function relationship of the moiré effect in phase space, an arbitrary authentication image can be encoded into two nonlinear gratings which serve as the authentication lock (AL) and the authentication key (AK). The AL is stored in the authentication system while the AK is assigned to the authorized user. The authentication procedure can be performed using an optoelectronic approach, while the design process is accomplished by a digital approach. Furthermore, this optical authentication scheme can be extended for multiple users with different security levels. The proposed scheme can not only verify the legality of a user identity, but can also discriminate and control the security levels of legal users. Theoretical analysis and simulation experiments are provided to verify the feasibility and effectiveness of the proposed scheme.
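The core property that neither the authentication lock (AL) nor the authentication key (AK) alone reveals the hidden image can be illustrated with a simple digital phase-splitting analogy. This is an assumption-laden sketch of the design step, not the optical moiré system itself.

```python
# Split a target phase pattern into a random "lock" grating and a complementary
# "key" grating; only their superposition (phase addition mod 2*pi) restores
# the pattern. A digital analogy of the moire design step, not the optics.
import math
import random

TWO_PI = 2 * math.pi

def encode(image_phase, seed=0):
    rng = random.Random(seed)
    lock = [rng.uniform(0, TWO_PI) for _ in image_phase]         # stored in system (AL)
    key = [(t - l) % TWO_PI for t, l in zip(image_phase, lock)]  # given to user (AK)
    return lock, key

def decode(lock, key):
    # Superposing the two gratings adds their phase functions modulo 2*pi.
    return [(l + k) % TWO_PI for l, k in zip(lock, key)]
```

Because the lock is uniformly random, each share alone carries no information about the authentication image, mirroring the AL/AK separation described above.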
Vein matching using artificial neural network in vein authentication systems
NASA Astrophysics Data System (ADS)
Noori Hoshyar, Azadeh; Sulaiman, Riza
2011-10-01
Personal identification technology for security systems is developing rapidly. Traditional authentication modes such as keys, passwords, and cards are not safe enough because they can be stolen or easily forgotten. Biometrics, as a maturing technology, has been applied to a wide range of systems. According to different researchers, vein biometrics is a good candidate for authentication systems among other biometric traits such as fingerprint, hand geometry, voice, and DNA. Vein authentication systems can be designed with different methodologies. All the methodologies include a matching stage, which is crucial for the final verification of the system. A neural network is an effective methodology for matching and recognizing individuals in authentication systems. Therefore, this paper explains and implements a neural network methodology for a finger vein authentication system. The neural network is trained in Matlab to match the vein features of the authentication system. The network simulation shows a matching quality of 95%, which is good performance for authentication system matching.
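The matching stage can be sketched with a plain similarity measure standing in for the trained network. The 0.95 acceptance threshold below loosely mirrors the reported 95% matching quality and is an assumption, not a value from the paper.

```python
# Minimal matching-stage sketch: cosine similarity between extracted vein
# feature vectors stands in for the trained neural-network matcher.
def match_score(probe, template):
    dot = sum(p * t for p, t in zip(probe, template))
    norm_p = sum(p * p for p in probe) ** 0.5
    norm_t = sum(t * t for t in template) ** 0.5
    return dot / (norm_p * norm_t)

def authenticate(probe, enrolled, threshold=0.95):
    # enrolled: {user_id: feature_vector}; returns the matched user or None.
    best = max(enrolled, key=lambda uid: match_score(probe, enrolled[uid]))
    return best if match_score(probe, enrolled[best]) >= threshold else None
```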
Kerberos authentication: The security answer for unsecured networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1995-06-01
Traditional authentication schemes do not properly address the problems encountered with today's unsecured networks. Kerberos, developed by MIT, on the other hand, is designed to operate in an open, unsecured network, yet provide good authentication and security, including encrypted session traffic. Basic Kerberos principles, as well as experiences of the ESnet Authentication Pilot Project with Cross-Realm Authentication between four National Laboratories, will also be described.
SSO - Single-Sign-On Profile: Authentication Mechanisms Version 2.0
NASA Astrophysics Data System (ADS)
Taffoni, Giuliano; Schaaf, André; Rixon, Guy; Major, Brian
2017-05-01
Approved client-server authentication mechanisms are described for the IVOA single-sign-on profile: No Authentication; HTTP Basic Authentication; TLS with passwords; TLS with client certificates; Cookies; Open Authentication; Security Assertion Markup Language; OpenID. Normative rules are given for the implementation of these mechanisms, mainly by reference to pre-existing standards. The Authorization mechanisms are out of the scope of this document.
Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia
2018-05-17
Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional "encrypt-then-sign" or "sign-then-encrypt" strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and has been gaining increasing attention. However, in many existing ABSC systems, the computation cost required of end users in signcryption and designcryption is linear in the complexity of the signing and encryption access policy. Moreover, only a single authority responsible for attribute management and key generation exists in previously proposed ABSC schemes, whereas in reality different authorities usually monitor different attributes of a user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation.
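The attribute policies that an ABSC ciphertext carries can be pictured as boolean access trees. A minimal evaluator, modeling only policy satisfaction (not the cryptographic enforcement, and with invented attribute names), might look like:

```python
# Toy evaluator for a boolean attribute access structure of the kind a
# ciphertext-policy scheme attaches to data. Real ABSC enforces this
# cryptographically; here we only check whether an attribute set satisfies it.
def satisfies(policy, arg_attrs):
    op, arg = policy
    if op == "attr":
        return arg in arg_attrs
    results = [satisfies(child, arg_attrs) for child in arg]
    return all(results) if op == "and" else any(results)
```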
Tseng, Yufeng Jane; Kuo, Chun-Ting; Wang, San-Yuan; Liao, Hsiao-Wei; Chen, Guan-Yuan; Ku, Yuan-Ling; Shao, Wei-Cheng; Kuo, Ching-Hua
2013-10-01
This study developed CE and ultra-high-pressure LC (UHPLC) methods coupled with UV detectors to characterize the metabolomic profiles of different rhubarb species. The optimal CE conditions used a BGE with 15 mM sodium tetraborate, 15 mM sodium dihydrogen phosphate monohydrate, 30 mM sodium deoxycholate, and 30% ACN v/v at pH 8.3. The optimal UHPLC conditions used a mobile phase composed of 0.05% phosphate buffer and ACN with gradient elution. The gradient profile increased linearly from 10 to 21% ACN within the first 25 min, then increased to 33% ACN over the next 10 min. It took another 5 min to reach 65% ACN, which was then held for a further 5 min. Sixteen samples of Rheum officinale and Rheum tanguticum collected from various locations were analyzed by the CE and UHPLC methods. The CE metabolite profiles were aligned and baseline corrected before chemometric analysis. Metabolomic signatures of rhubarb species from CE and UHPLC were clustered using principal component analysis and distance-based redundancy analysis; the clusters were able to discriminate not only different species but also different cultivation regions. Similarity measurements were performed by calculating the correlation coefficient of each sample with the authentic samples. A hybrid rhizome was clearly identified through similarity measurement of the UHPLC metabolite profile and later confirmed by gene sequencing. The present study demonstrated that CE and UHPLC are efficient and effective tools to identify and authenticate herbs even when coupled with simple detectors. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
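The similarity measurement described, correlating each sample's profile with the authentic samples, amounts to a Pearson correlation over aligned intensities. A minimal sketch with invented profile values:

```python
# Rank authentic reference profiles by Pearson correlation with a query
# profile, as in the similarity measurement described in the text.
# The intensity values used in the test are invented for illustration.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def closest_authentic(sample, references):
    # references: {species_name: aligned intensity profile}
    return max(references, key=lambda name: pearson(sample, references[name]))
```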
A Lightweight Continuous Authentication Protocol for the Internet of Things
Chuang, Yo-Hsuan; Yang, Cheng-Ying; Tang, Ssu-Wei
2018-01-01
Modern societies are moving toward an information-oriented environment. To gather and utilize information around people’s modern life, tiny devices with all kinds of sensing capabilities and various sizes of gateways need to be deployed and connected with each other through the Internet or proxy-based wireless sensor networks (WSNs). Within this kind of Internet of Things (IoT) environment, how two communicating devices authenticate each other is a fundamental security issue. As many IoT devices are powered by batteries and need to transmit sensed data periodically, it is necessary for IoT devices to adopt a lightweight authentication protocol to reduce their energy consumption when a device wants to authenticate and transmit data to its targeted peer. In this paper, a lightweight continuous authentication protocol for sensing devices and gateway devices in general IoT environments is introduced. The concept of a valid authentication time period is proposed to enhance the robustness of authentication between IoT devices. To construct the proposed lightweight continuous authentication protocol, token techniques and dynamic features of IoT devices are adopted in order to reach the design goals: reducing the time consumed by consecutive authentications and saving energy for authenticating devices by lowering the computational complexity during session establishment of continuous authentication. A security analysis is conducted to evaluate the security strength of the proposed protocol. In addition, performance analysis shows that the proposed protocol is a strong competitor among existing protocols for device-to-device authentication in IoT environments. PMID:29621168
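The valid-authentication-time-period idea can be sketched as an HMAC token that binds a device identity to an expiry time; field names and the window length are illustrative assumptions, not the paper's exact message format.

```python
# Lightweight token sketch: one HMAC covers the device identity and the end of
# its validity window, so re-authentication within the window is a cheap
# MAC check rather than a full session establishment.
import hashlib, hmac

def issue_token(shared_key: bytes, device_id: str, now: float, valid_for: float = 60.0):
    expiry = now + valid_for
    msg = f"{device_id}|{expiry}".encode()
    tag = hmac.new(shared_key, msg, hashlib.sha256).hexdigest()
    return {"device": device_id, "expiry": expiry, "tag": tag}

def verify_token(shared_key: bytes, token, now: float):
    msg = f"{token['device']}|{token['expiry']}".encode()
    good_tag = hmac.new(shared_key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(good_tag, token["tag"]) and now <= token["expiry"]
```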
22 CFR 92.40 - Authentication of foreign extradition papers.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Authentication of foreign extradition papers... RELATED SERVICES Specific Notarial Acts § 92.40 Authentication of foreign extradition papers. Foreign extradition papers are authenticated by chiefs of mission. ...
NASA Astrophysics Data System (ADS)
Gharami, Snigdha; Dinakaran, M.
2017-11-01
There are challenges in authenticating every aspect of electronic usage; from transactions to social interaction, the authenticity and availability of correct information are safeguarded in various ways. Authentication and authorization follow one another, and an authentication process is computed over multiple layers of steps. In this paper we discuss various possibilities for modifying and combining authentication and authorization mechanisms. The idea is to approach authentication through mathematical calculations. We go through various scenarios and find the system of information that best fits the moment of need. We take account of new approaches to authentication and authorization while working on a mathematical paradigm of information. The paper also considers quantum cryptography and discusses how it could help in the present scenario. This paper is divided into sections discussing various paradigms of authentication and how one can achieve it in a secure way; it is part of a research effort whose extended work will analyze the various constraints involved.
Phone, Email and Video Interactions with Characters in an Epidemiology Game: Towards Authenticity
NASA Astrophysics Data System (ADS)
Ney, Muriel; Gonçalves, Celso; Blacheff, Nicolas; Schwartz, Claudine; Bosson, Jean-Luc
A key concern in game-based learning is the level of authenticity that the game requires in order to accurately match what the learners can expect in the real world with what they need to learn. In this paper, we show how four challenges to the designer of authentic games have been addressed in a game for an undergraduate course in a medical school. We focus in particular on the system of interaction with different characters of the game, namely, the patients and a number of professionals. Students use their personal phone and email application, as well as various web sites. First, we analyze the authenticity of the game through four attributes: authenticity of the character, of the content of the feedback, of the mode and channel of communication, and of the constraints. Second, the authenticity perceived by students is analyzed. The latter is threefold, defined by an external authenticity (perceived likeness to a real-life reference), an internal authenticity (perceived internal coherence of the proposed situations) and a didactical authenticity (perceived relevance with respect to learning goals).
22 CFR 131.1 - Certification of documents.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Foreign Relations DEPARTMENT OF STATE MISCELLANEOUS CERTIFICATES OF AUTHENTICATION § 131.1 Certification of documents. The Authentication Officer, Acting Authentication Officer, or any Assistant Authentication Officer designated by either of the former officers may, and is hereby authorized to, sign and...
Study on a Biometric Authentication Model based on ECG using a Fuzzy Neural Network
NASA Astrophysics Data System (ADS)
Kim, Ho J.; Lim, Joon S.
2018-03-01
Traditional authentication methods use numbers or graphic passwords and thus involve the risk of loss or theft. Various studies are underway regarding biometric authentication because it uses the unique biometric data of a human being. Biometric authentication using the ECG relies on signals that record the electrical activity of the heart; these are difficult to manipulate and have the advantage of enabling unconstrained measurement from sensors attached to the skin. This study is on biometric authentication methods using the neural network with weighted fuzzy membership functions (NEWFM). In the biometric authentication process, normalization and ensemble averaging are applied during preprocessing, characteristics are extracted using Haar wavelets, and a registration process called “training” is performed in the fuzzy neural network. In the experiment, biometric authentication was performed on 73 subjects in the Physionet Database. 10-40 ECG waveforms were tested for use in the registration process, and 15 ECG waveforms were deemed the appropriate number for registration. A single ECG waveform was used during the authentication stage to conduct the biometric authentication test. Upon testing the proposed biometric authentication method on 73 subjects from the Physionet Database, the TAR was 98.32% and the FAR was 5.84%.
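The preprocessing and feature steps named above, ensemble averaging followed by Haar-wavelet extraction, can be sketched as follows; the number of decomposition levels is an illustrative assumption.

```python
# Ensemble-average several aligned ECG waveforms, then extract Haar-wavelet
# coefficients to feed a classifier such as NEWFM. Level count is illustrative.
def ensemble_average(waveforms):
    n = len(waveforms)
    return [sum(w[i] for w in waveforms) / n for i in range(len(waveforms[0]))]

def haar_step(signal):
    # One Haar level: pairwise averages (approximation) and differences (detail).
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def haar_features(signal, levels=2):
    feats, current = [], list(signal)
    for _ in range(levels):
        current, detail = haar_step(current)
        feats.extend(detail)
    return feats + current   # detail coefficients plus final approximation
```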
Lin, Shih-Sung; Hung, Min-Hsiung; Tsai, Chang-Lung; Chou, Li-Ping
2012-12-01
The study aims to provide an ease-of-use approach for senior patients to utilize remote healthcare systems. An ease-of-use remote healthcare system (RHS) architecture using RFID (Radio Frequency Identification) and networking technologies is developed. Specifically, the codes in RFID tags are used for authenticating the patients' ID to secure and ease the login process. The patient needs only to take one action, i.e. placing a RFID tag onto the reader, to automatically login and start the RHS and then acquire automatic medical services. An ease-of-use emergency monitoring and reporting mechanism is developed as well to monitor and protect the safety of the senior patients who have to be left alone at home. By just pressing a single button, the RHS can automatically report the patient's emergency information to the clinic side so that the responsible medical personnel can take proper urgent actions for the patient. Besides, Web services technology is used to build the Internet communication scheme of the RHS so that the interoperability and data transmission security between the home server and the clinical server can be enhanced. A prototype RHS is constructed to validate the effectiveness of our designs. Testing results show that the proposed RHS architecture possesses the characteristics of ease to use, simplicity to operate, promptness in login, and no need to preserve identity information. The proposed RHS architecture can effectively increase the willingness of senior patients who act slowly or are unfamiliar with computer operations to use the RHS. The research results can be used as an add-on for developing future remote healthcare systems.
Caira, Simonetta; Pinto, Gabriella; Nicolai, Maria Adalgisa; Chianese, Lina; Addeo, Francesco
2016-08-01
Water buffalo (WB) casein (CN) and curd samples from indigenous Italian and international breeds were examined with the objective of identifying signature peptides that could function as an indicator to determine the origin of their milk products. CN in complex mixtures were digested with trypsin, and peptide fragments were subsequently identified by matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI-TOF MS). The unique presence of a β-CN A variant and an internally deleted αs1-CN (f35-42) variant in international WB milk samples was ascertained by identifying signature tryptic peptides from either dephosphorylated or native CN. Four signature unphosphorylated peptides derived from β-CN A, i.e. (f49-68) Asn(68) (2223.6 Da), (f1-28) Ser(10) (3169.4 Da), (f1-29) Ser(10) (3297.4 Da) and (f33-48) Thr(41) (1982 Da) and two from αs1-CN (f35-42) deleted fragments, i.e. (f23-34) Met(31) (1415.7 Da) and (f43-58) Val(44) (1752.7 Da), were identified. Two signature casein phosphopeptides (CPPs), i.e. β-CN (f1-28) 4P (3489.1 Da) and β-CN (f33-48) 1P (2062.0 Da), were identified in the tryptic hydrolysate of native casein or curd and cheese samples using in-batch hydroxyapatite (HA) chromatography. All these fragments functioned as analytical surrogates of two αs1- and β-casein variants that specifically occur in the milk of international WB breeds. Furthermore, the bovine peptide β-CN (f1-28) 4P had a distinct and lower molecular mass compared with the WB counterpart and functioned as a species-specific marker for all breeds of WB. Advantages of this analytical approach are that (i) peptides are easier to separate than proteins, (ii) signature peptide probes originating from specific casein variants allow for the targeting of all international WB milk, curd and cheese samples and (iii) bovine and WB casein in mixtures can be simultaneously determined in protected designation of origin (PDO) "Mozzarella di Bufala Campana" cheese. 
This analytical method enabled the specific detection of international WB and bovine casein with sensitivity thresholds of 2 % and 0.78 %, respectively. Graphical abstract: Monitoring of prototypic tryptic CPPs by MALDI-TOF analysis in Mediterranean (A), Romanian (B), Indian (C), Polish (D) and Canadian (E) curd samples to guarantee the authenticity of the PDO "Mozzarella di Bufala Campana" cheese.
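The signature-peptide masses quoted above lend themselves to a simple peak-matching sketch; the mass tolerance below is an assumption, not a value from the study.

```python
# Flag which signature peptides appear in a MALDI-TOF peak list. The Da values
# are those quoted in the text; the matching tolerance is an assumption.
SIGNATURE_MASSES = {
    "beta-CN f49-68": 2223.6, "beta-CN f1-28": 3169.4, "beta-CN f1-29": 3297.4,
    "beta-CN f33-48": 1982.0, "alpha-s1 f23-34": 1415.7, "alpha-s1 f43-58": 1752.7,
}

def detect_signatures(observed_masses, tol=0.5):
    return {name: any(abs(m - target) <= tol for m in observed_masses)
            for name, target in SIGNATURE_MASSES.items()}
```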
Location-assured, multifactor authentication on smartphones via LTE communication
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham
2013-05-01
With the added security provided by LTE, geographical location has become an important factor for authentication to enhance the security of remote client authentication during mCommerce applications using Smartphones. Tight combination of geographical location with classic authentication factors like PINs/biometrics in a real-time, remote verification scheme over the LTE layer connection assures the authenticator about the client itself (via PIN/biometric) as well as the client's current location, thus defining the important aspects of "who", "when", and "where" of the authentication attempt without eavesdropping or man-in-the-middle attacks. To securely integrate location as an authentication factor into the remote authentication scheme, the client's location must be verified independently, i.e. the authenticator should not rely solely on the location determined on and reported by the client's Smartphone. The latest wireless data communication technology for mobile phones (4G LTE, Long-Term Evolution), recently being rolled out in various networks, can be employed to meet this requirement of independent location verification. LTE's Control Plane LBS provisions, when integrated with user-based authentication factors and an independent source of localisation, ensure secure, efficient, continuous location tracking of the Smartphone. This tracking can be performed during normal operation of the LTE-based communication between client and network operator, so the authenticator can verify the client's claimed location more securely and accurately. Trials and experiments show that such an implementation is viable for today's Smartphone-based banking via LTE communication.
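The "who, when, where" binding can be sketched as a MAC'd request whose claimed location is cross-checked against an independent network-side fix. The field names, freshness window, and distance threshold are illustrative assumptions, not the paper's message format.

```python
# Client binds identity ("who"), timestamp ("when") and location ("where") into
# one MAC'd request; the authenticator checks the MAC, freshness, and agreement
# with an independent network-side fix (e.g. LTE control-plane LBS).
import hashlib, hmac, json

def make_auth_request(key, user_id, pin_hash, lat, lon, now):
    payload = {"user": user_id, "pin": pin_hash, "lat": lat, "lon": lon, "t": now}
    mac = hmac.new(key, json.dumps(payload, sort_keys=True).encode(),
                   hashlib.sha256).hexdigest()
    return payload, mac

def verify_auth_request(key, payload, mac, network_fix, now, max_age=30, max_km=1.0):
    expect = hmac.new(key, json.dumps(payload, sort_keys=True).encode(),
                      hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expect, mac) or now - payload["t"] > max_age:
        return False
    # Crude planar distance (km) between claimed and network-observed fix.
    dk = 111.0 * ((payload["lat"] - network_fix[0]) ** 2
                  + (payload["lon"] - network_fix[1]) ** 2) ** 0.5
    return dk <= max_km
```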
22 CFR 61.3 - Certification and authentication criteria.
Code of Federal Regulations, 2014 CFR
2014-04-01
... AUDIO-VISUAL MATERIALS § 61.3 Certification and authentication criteria. (a) The Department shall certify or authenticate audio-visual materials submitted for review as educational, scientific and... of the material. (b) The Department will not certify or authenticate any audio-visual material...
22 CFR 61.3 - Certification and authentication criteria.
Code of Federal Regulations, 2013 CFR
2013-04-01
... AUDIO-VISUAL MATERIALS § 61.3 Certification and authentication criteria. (a) The Department shall certify or authenticate audio-visual materials submitted for review as educational, scientific and... of the material. (b) The Department will not certify or authenticate any audio-visual material...
22 CFR 61.3 - Certification and authentication criteria.
Code of Federal Regulations, 2012 CFR
2012-04-01
... AUDIO-VISUAL MATERIALS § 61.3 Certification and authentication criteria. (a) The Department shall certify or authenticate audio-visual materials submitted for review as educational, scientific and... of the material. (b) The Department will not certify or authenticate any audio-visual material...
Managing authenticity: the paradox of great leadership.
Goffee, Rob; Jones, Gareth
2005-12-01
Leaders and followers both associate authenticity with sincerity, honesty, and integrity. It's the real thing--the attribute that uniquely defines great managers. But while the expression of a genuine self is necessary for great leadership, the concept of authenticity is often misunderstood, not least by leaders themselves. They often assume that authenticity is an innate quality--that a person is either genuine or not. In fact, the authors say, authenticity is largely defined by what other people see in you and, as such, can to a great extent be controlled by you. In this article, the authors explore the qualities of authentic leadership. To illustrate their points, they recount the experiences of some of the authentic leaders they have known and studied, including the BBC's Greg Dyke, Nestlé's Peter Brabeck-Letmathe, and Marks & Spencer's Jean Tomlin. Establishing your authenticity as a leader is a two-part challenge. You have to consistently match your words and deeds; otherwise, followers will never accept you as authentic. But it is not enough just to practice what you preach. To get people to follow you, you also have to get them to relate to you. This means presenting different faces to different audiences--a requirement that many people find hard to square with authenticity. But authenticity is not the product of manipulation. It accurately reflects aspects of the leader's inner self, so it can't be an act. Authentic leaders seem to know which personality traits they should reveal to whom, and when. Highly attuned to their environments, authentic leaders rely on an intuition born of formative, sometimes harsh experiences to understand the expectations and concerns of the people they seek to influence. They retain their distinctiveness as individuals, yet they know how to win acceptance in strong corporate and social cultures and how to use elements of those cultures as a basis for radical change.
Comparative Study on Various Authentication Protocols in Wireless Sensor Networks.
Rajeswari, S Raja; Seenivasagam, V
2016-01-01
Wireless sensor networks (WSNs) consist of lightweight devices with low cost, low power, and short-ranged wireless communication. The sensors can communicate with each other to form a network. In WSNs, broadcast transmission is widely used along with the maximum usage of wireless networks and their applications. Hence, it has become crucial to authenticate broadcast messages. Key management is also an active research topic in WSNs. Several key management schemes have been introduced, and their benefits are not recognized in a specific WSN application. Security services are vital for ensuring the integrity, authenticity, and confidentiality of the critical information. Therefore, the authentication mechanisms are required to support these security services and to be resilient to distinct attacks. Various authentication protocols such as key management protocols, lightweight authentication protocols, and broadcast authentication protocols are compared and analyzed for all secure transmission applications. The major goal of this survey is to compare and find out the appropriate protocol for further research. Moreover, the comparisons between various authentication techniques are also illustrated.
Emmerich, Astrid I; Rigotti, Thomas
2017-01-01
This study investigates the role of context-specific authenticity at work for work-related outcomes (intrinsic motivation, work ability) and depressivity, as well as reciprocal relations between work-related authenticity and healthy psychological functioning. Longitudinal data from 1,243 employees from 63 subsidiaries of a non-profit organization in the social sector were analyzed using multilevel structural equation modeling. Work-related authenticity at T1 predicted work ability and depressivity, but not intrinsic motivation, at T2, about 6 months later. Work-related authenticity at T2 was predicted by intrinsic motivation and depressivity, but not by work ability, at T1. We conclude that work-related authenticity and healthy psychological functioning positively reinforce each other. Thus, enabling employees to be authentic is likely to increase their well-being and offers a pivotal opportunity for organizations to foster health- and performance-related indicators like work ability and to prevent negative health indicators like depressivity. At the same time, employees' authenticity can be fostered through workplace health promotion.
Security analysis for biometric data in ID documents
NASA Astrophysics Data System (ADS)
Schimke, Sascha; Kiltz, Stefan; Vielhauer, Claus; Kalker, Ton
2005-03-01
In this paper we analyze chances and challenges with respect to the security of using biometrics in ID documents. We identify goals for ID documents, set by national and international authorities, and discuss the degree of security obtainable by including biometrics in documents like passports. Starting from classical techniques for manual authentication of ID card holders, we expand our view towards automatic methods based on biometrics. We do so by reviewing different human biometric attributes by modality, as well as by discussing possible techniques for storing and handling the particular biometric data on the document. Further, we explore possible vulnerabilities of potential biometric passport systems. Based on the findings of that discussion, we expand upon two exemplary approaches for including digital biometric data in the context of ID documents and present potential risks and attack scenarios, along with technical aspects such as capacity and robustness.
Evaluation of complex gonioapparent samples using a bidirectional spectrometer.
Rogelj, Nina; Penttinen, Niko; Gunde, Marta Klanjšek
2015-08-24
Many applications use gonioapparent targets whose appearance depends on the irradiation and viewing angles; the strongest effects are produced by light diffraction. These targets, optically variable devices (OVDs), are used in both security and authentication applications. This study introduces a bidirectional spectrometer that enables the analysis of samples with the most complex angular and spectral properties. In our work, the spectrometer is evaluated with samples having very different types of reflection in terms of spectral and angular distributions. Furthermore, an OVD containing several different grating patches is evaluated. The device automatically adjusts its exposure time to provide maximum signal dynamics and can move in steps as small as 0.01°. However, even 2° steps for the detector movement showed that this device is more than capable of characterizing the most complex reflecting surfaces. This study presents sRGB visualizations, a discussion of bidirectional reflection, and accurate grating period calculations for all of the grating samples used.
Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities
NASA Astrophysics Data System (ADS)
Shivanand M., Handigund; Shweta, Bhat
The Software Requirements Specification (SRS) of the organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve software tools, hardware devices, manual procedures, application programs and communication commands. These components are appropriately ordered to achieve the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though both methodologies are independent, each complements the other in authenticating its correctness and completeness.
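Determining the critical path of an activity chart reduces to a longest-path computation over the activity DAG; a minimal sketch (task names and durations invented):

```python
# Longest-duration path through an activity DAG (earliest finish times).
# durations: {task: time}; deps: {task: [prerequisite tasks]}. Assumes a DAG.
def critical_path(durations, deps):
    finish = {}

    def visit(t):                      # memoized earliest finish time of task t
        if t not in finish:
            finish[t] = durations[t] + max(
                (visit(p) for p in deps.get(t, [])), default=0)
        return finish[t]

    for t in durations:
        visit(t)
    # Recover the path by walking back from the task with the latest finish.
    path, cur = [], max(finish, key=finish.get)
    while cur is not None:
        path.append(cur)
        preds = deps.get(cur, [])
        cur = max(preds, key=lambda p: finish[p]) if preds else None
    return list(reversed(path)), max(finish.values())
```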
Chang, I-Pin; Lee, Tian-Fu; Lin, Tsung-Hung; Liu, Chuan-Ming
2015-11-30
Key agreements that use only password authentication are convenient in communication networks, but these key agreement schemes often fail to resist possible attacks and therefore provide poor security compared with some other authentication schemes. To increase security, many authentication and key agreement schemes use smartcard authentication in addition to passwords. Thus, two-factor authentication and key agreement schemes using smartcards and passwords are widely adopted in many applications. Vaidya et al. recently presented a two-factor authentication and key agreement scheme for wireless sensor networks (WSNs). Kim et al. observed that the Vaidya et al. scheme fails to resist gateway node bypassing and user impersonation attacks, and then proposed an improved scheme for WSNs. This study analyzes the weaknesses of the two-factor authentication and key agreement scheme of Kim et al., which include vulnerability to impersonation attacks, lost smartcard attacks and man-in-the-middle attacks, violation of session key security, and failure to protect user privacy. An efficient and secure authentication and key agreement scheme for WSNs based on the scheme of Kim et al. is then proposed. The proposed scheme not only overcomes the weaknesses of previous approaches, but also meets stronger security requirements while maintaining low computational cost.
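The general two-factor idea, a smartcard value that only yields the password verifier when combined with other secret state, can be sketched as below. This is a generic illustration of two-factor verification, not the scheme of Kim et al. or the scheme proposed here.

```python
# Generic two-factor sketch: the card stores a salted password verifier
# XOR-masked with a separate secret, so neither a stolen card alone nor a
# guessed password alone suffices to pass the check.
import hashlib, hmac, os

def register(password: str):
    salt = os.urandom(16)
    verifier = hashlib.sha256(salt + password.encode()).digest()
    card_secret = os.urandom(32)
    masked = bytes(v ^ c for v, c in zip(verifier, card_secret))
    # Returns the smartcard contents and the secret kept by the gateway.
    return {"salt": salt, "masked": masked}, card_secret

def login(card, card_secret, password: str):
    recomputed = hashlib.sha256(card["salt"] + password.encode()).digest()
    unmasked = bytes(m ^ c for m, c in zip(card["masked"], card_secret))
    return hmac.compare_digest(unmasked, recomputed)
```

A full WSN scheme additionally derives a session key and protects against replay and impersonation; this sketch covers only the two-factor check itself.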
South African managers in public service: On being authentic
Barnard, Antoni; Simbhoo, Nirvana
2014-01-01
South African managers in public service consistently face challenges related to managing a well-adjusted and productive diverse workforce. Following the notion that leadership authenticity fosters positive psychological employee capacity, the aim of this study was to explore the meaning essence of authenticity as lived in the work–life experiences of senior managers in public service. Five senior managers in public service were purposefully selected based on their articulated challenges with being authentic at work, whilst attending a diversity sensitivity workshop. From a hermeneutic phenomenological perspective, in-depth interviews were used, and an interpretative phenomenological analysis yielded two predominant themes offering a description of what it means to be authentic. Authenticity is experienced as an affective state that results from a continuous self-appraisal of the extent to which expression of self is congruent with a subjective and socially constructed expectation of self in relation to others. Authenticity seems to develop through a continuous process of internal and external adaptation, and it leads to ultimately building a differentiated yet integrated identity of self. A reciprocal dynamic between feeling authentic and self-confidence alludes to the potential importance of authenticity dynamics in identity work. PMID:24434054
Study on the security of the authentication scheme with key recycling in QKD
NASA Astrophysics Data System (ADS)
Li, Qiong; Zhao, Qiang; Le, Dan; Niu, Xiamu
2016-09-01
In quantum key distribution (QKD), information-theoretically secure authentication is necessary to guarantee the integrity and authenticity of the information exchanged over the classical channel. In order to reduce key consumption, the authentication scheme with key recycling (KR), in which a secret but fixed hash function is used for multiple messages while each tag is encrypted with a one-time pad (OTP), is preferred in QKD. Under the assumption that the OTP key is perfect, the security of this authentication scheme has been proved. However, the OTP key used for authentication in a practical QKD system is not perfect. How an imperfect OTP affects the security of the authentication scheme with KR is analyzed thoroughly in this paper. In practical QKD, part of the information about the OTP key produced by QKD is leaked to the adversary. Although the leakage is usually small enough to be neglected in a single round, it leads to increasingly degraded security of the authentication scheme as the system runs continuously. Both our theoretical analysis and simulation results demonstrate that the security level of the authentication scheme with KR, mainly indicated by its substitution probability, degrades exponentially in the number of rounds and gradually diminishes to zero.
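The qualitative behaviour described above can be illustrated with a toy model (the per-round leakage rate ε and the independence assumption are illustrative simplifications, not the paper's derivation): if each round independently leaks a small fraction ε of usable key information, the chance that the accumulated leakage enables a substitution grows roughly as 1 − (1 − ε)^n, which tends to 1 as the number of rounds n grows.

```python
def substitution_bound(eps: float, rounds: int) -> float:
    """Toy bound: probability that at least one round's leaked key
    material enables a successful substitution, assuming independent
    per-round leakage with probability eps."""
    return 1.0 - (1.0 - eps) ** rounds

# Even a tiny per-round leakage compounds as the system runs continuously.
for n in (1, 100, 10_000, 100_000):
    print(n, substitution_bound(1e-4, n))
```

The point of the sketch is only the shape of the curve: security does not fail at any single round, but erodes exponentially with continued key recycling.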
Juan-Albarracín, Javier; Fuster-Garcia, Elies; Pérez-Girbés, Alexandre; Aparici-Robles, Fernando; Alberich-Bayarri, Ángel; Revert-Ventura, Antonio; Martí-Bonmatí, Luis; García-Gómez, Juan M
2018-06-01
Purpose To determine if preoperative vascular heterogeneity of glioblastoma is predictive of overall survival of patients undergoing standard-of-care treatment by using an unsupervised multiparametric perfusion-based habitat-discovery algorithm. Materials and Methods Preoperative magnetic resonance (MR) imaging including dynamic susceptibility-weighted contrast material-enhanced perfusion studies in 50 consecutive patients with glioblastoma were retrieved. Perfusion parameters of glioblastoma were analyzed and used to automatically draw four reproducible habitats that describe the tumor vascular heterogeneity: high-angiogenic and low-angiogenic regions of the enhancing tumor, potentially tumor-infiltrated peripheral edema, and vasogenic edema. Kaplan-Meier and Cox proportional hazard analyses were conducted to assess the prognostic potential of the hemodynamic tissue signature to predict patient survival. Results Cox regression analysis yielded a significant correlation between patients' survival and maximum relative cerebral blood volume (rCBVmax) and maximum relative cerebral blood flow (rCBFmax) in high-angiogenic and low-angiogenic habitats (P < .01, false discovery rate-corrected P < .05). Moreover, rCBFmax in the potentially tumor-infiltrated peripheral edema habitat was also significantly correlated (P < .05, false discovery rate-corrected P < .05). Kaplan-Meier analysis demonstrated significant differences between the observed survival of populations divided according to the median of the rCBVmax or rCBFmax at the high-angiogenic and low-angiogenic habitats (log-rank test P < .05, false discovery rate-corrected P < .05), with an average survival increase of 230 days. Conclusion Preoperative perfusion heterogeneity contains relevant information about overall survival in patients who undergo standard-of-care treatment.
The hemodynamic tissue signature method automatically describes this heterogeneity, providing a set of vascular habitats with high prognostic capabilities. © RSNA, 2018.
Detecting measurement outliers: remeasure efficiently
NASA Astrophysics Data System (ADS)
Ullrich, Albrecht
2010-09-01
Shrinking structures, advanced optical proximity correction (OPC) and complex measurement strategies continually challenge critical dimension (CD) metrology tools and recipe creation processes. One important quality-ensuring task is the control of measurement outlier behavior. Outliers can trigger false-positive alarms for specification violations, impacting cycle time or potentially yield. A constantly high level of outliers not only deteriorates cycle time but also puts unnecessary stress on tool operators, eventually leading to human errors. At tool level, the sources of outliers are natural variations (e.g. beam current), drifts, contrast conditions, focus determination or pattern recognition issues. Some of these can result from suboptimal or even wrong recipe settings, such as focus position or measurement box size. Such outliers, created by an automatic recipe creation process faced with more complicated structures, manifest themselves as systematic variation of measurements rather than as 'pure' tool variation. I analyzed several statistical methods to detect outliers. These range from classical outlier tests for extrema and robust metrics like the interquartile range (IQR) to methods that evaluate the distribution of different populations of measurement sites, like the Cochran test. The latter especially suits the detection of systematic effects. The next level of outlier detection entwines additional information about the mask and the manufacturing process with the measurement results. The methods were reviewed for measured variations assumed to be normally distributed with zero mean, but also for the presence of a statistically significant spatial process signature. I arrive at the conclusion that intelligent outlier detection can greatly influence the efficiency and cycle time of CD metrology.
In combination with process information like target, typical platform variation and signature, one can tailor the detection to the needs of the photomask at hand. By monitoring the outlier behavior carefully, weaknesses of the automatic recipe creation process can be spotted.
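Of the statistical checks mentioned above, the interquartile-range rule is the simplest to automate; a minimal sketch follows (the 1.5×IQR fence is the conventional textbook choice, and the CD values are made-up illustration data, not from this paper):

```python
import statistics

def iqr_outliers(measurements: list[float], k: float = 1.5) -> list[float]:
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR], a robust fence
    that is largely insensitive to the outliers it is trying to find."""
    q1, _, q3 = statistics.quantiles(measurements, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [x for x in measurements if x < lo or x > hi]

# Hypothetical CD measurements (nm) across sites of one feature type.
cd_values = [48.1, 48.3, 48.2, 48.0, 48.2, 51.9, 48.1, 48.3]
print(iqr_outliers(cd_values))   # only the 51.9 nm site stands out
```

Because quartiles barely move when a few extreme values are added, the fence stays anchored to the bulk of the distribution; a test like Cochran's would be the complement for spotting systematic, population-level effects.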
Develop Advanced Nonlinear Signal Analysis Topographical Mapping System
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1997-01-01
During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours with extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS system on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program lies in its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 1 2013-01-01 2013-01-01 false Authentication. 1.22 Section 1.22 Agriculture Office of the Secretary of Agriculture ADMINISTRATIVE REGULATIONS Official Records § 1.22 Authentication. When a request is received for an authenticated copy of a document that the agency determines to make...
Authentic Montessori: The Teacher Makes the Difference
ERIC Educational Resources Information Center
Huxel, Alexa C.
2013-01-01
What are the elements that make up authentic Montessori? Is Montessori something concrete or abstract? Are there intangibles that make Montessori what it is? Many classrooms today have Montessori materials and small tables and chairs. Are they authentic Montessori? When examining areas that traditionally make defining authentic Montessori…
36 CFR 1275.66 - Reproduction and authentication of other materials.
Code of Federal Regulations, 2010 CFR
2010-07-01
... authentication of other materials. 1275.66 Section 1275.66 Parks, Forests, and Public Property NATIONAL ARCHIVES... Reproduction and authentication of other materials. (a) Copying of materials, including tape recordings... materials when necessary for the purpose of the research. (c) The fees for reproduction and authentication...
22 CFR 92.37 - Authentication procedure.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Authentication procedure. 92.37 Section 92.37... Notarial Acts § 92.37 Authentication procedure. (a) The consular officer must compare the foreign official...) Where the State law requires the consular officer's certificate of authentication to show that the...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 1 2010-01-01 2010-01-01 false Authentication. 1.22 Section 1.22 Agriculture Office of the Secretary of Agriculture ADMINISTRATIVE REGULATIONS Official Records § 1.22 Authentication. When a request is received for an authenticated copy of a document that the agency determines to make...
Authentic leadership: becoming and remaining an authentic nurse leader.
Murphy, Lin G
2012-11-01
This article explores how chief nurse executives became and remained authentic leaders. Using narrative inquiry, this qualitative study focused on the life stories of participants. Results demonstrate the importance of reframing, reflection in alignment with values, and the courage needed as nurse leaders progress to authenticity.
Ninth Grade Student Responses to Authentic Science Instruction
NASA Astrophysics Data System (ADS)
Ellison, Michael Steven
This mixed-methods case study documents an effort to implement authentic science and engineering instruction in one teacher's ninth-grade science classrooms in a science-focused public school. The research framework and methodology are a derivative of work developed and reported by Newmann and others (Newmann & Associates, 1996). Based on a working definition of authenticity, data were collected for eight months on the authenticity of the experienced teacher's pedagogy and of student performance. Authenticity was defined as the degree to which a classroom lesson, an assessment task, or an example of student performance demonstrates construction of knowledge through use of the meaning-making processes of science and engineering, and has some value to students beyond demonstrating success in school (Wehlage et al., 1996). Instruments adapted for this study produced a rich description of the authenticity of the teacher's instruction and student performance. The pedagogical practices of the classroom teacher were measured as moderately authentic on average. However, the authenticity model revealed the teacher's strategy of interspersing relatively low-authenticity instructional units focused on building science knowledge with much higher-authenticity tasks requiring students to apply these concepts and skills. The construction-of-knowledge and science meaning-making components of authentic pedagogy were found to be more authentic than the affordances for students to find value in classroom activities beyond demonstrating success in school. Instruction frequently included one aspect of value beyond school, connections to the world outside the classroom, but students were infrequently afforded the opportunity to present their classwork to audiences beyond the teacher.
When the science instruction in the case was measured to afford a greater level of authentic intellectual work, a higher level of authentic student performance on science classwork was also measured. In addition, direct observation measures of student behavioral engagement showed that behavioral engagement was generally high, but not associated with the authenticity of the pedagogy. Direct observation measures of student self-regulation found evidence that when instruction focused on core science and engineering concepts and made stronger connections to the student's world beyond the classroom, student self-regulated learning was greater, and included evidence of student ownership. In light of the alignment between the model of authenticity used in this study and the Next Generation Science Standards (NGSS), the results suggest that further research on the value beyond school component of the model could improve understanding of student engagement and performance in response to the implementation of the NGSS. In particular, it suggests a unique role environmental education can play in affording student success in K-12 science and a tool to measure that role.
Lee, Tian-Fu; Liu, Chuan-Ming
2013-06-01
A smartcard-based authentication scheme for telecare medicine information systems enables patients, doctors, nurses, health visitors and the medicine information systems to establish a secure communication platform over public networks. Zhu recently presented an improved authentication scheme to remedy a weakness of the scheme of Wei et al., which cannot resist off-line password guessing attacks. This investigation shows that Zhu's improved scheme has faults of its own: the authentication scheme cannot execute correctly and is vulnerable to parallel session attacks. Additionally, an enhanced authentication scheme based on Zhu's scheme is proposed. The enhanced scheme not only avoids the weaknesses of the original scheme, but also provides user anonymity and authenticated key agreement for secure data communications.
Kent, Alexander Dale [Los Alamos, NM
2008-09-02
Methods and systems are disclosed for authenticating identifying data transmitted from a client to a server in a data/computer network through a gateway interface system, where the client, gateway, and server are communicatively coupled to each other. An authentication packet transmitted from a client to a server of the data network is intercepted by the interface, the authentication packet having been encrypted with a one-time password for transmission from the client to the server. The one-time password associated with the authentication packet can be verified using a one-time password token system. The authentication packet can then be modified for acceptance by the server, and the response packet generated by the server is thereafter intercepted, verified and modified for transmission back to the client in a similar but reverse process.
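One-time-password token systems of the kind the gateway could verify are often counter-based. A minimal HOTP-style sketch follows (this is the standard RFC 4226 construction, shown as a generic illustration, not the patent's specific token system):

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Counter-based one-time password: HMAC-SHA1 over the counter,
    dynamically truncated to a short decimal code (RFC 4226)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Verifier and token share the secret; each counter value is used once,
# so an intercepted code is useless for replay.
secret = b"12345678901234567890"
print(hotp(secret, 0))   # RFC 4226 test vector: 755224
```

A gateway verifying such codes would track the expected counter per client and accept a small look-ahead window to tolerate desynchronization.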
West, Charles E; Pureveen, Jos; Scarlett, Alan G; Lengger, Sabine K; Wilde, Michael J; Korndorffer, Frans; Tegelaar, Erik W; Rowland, Steven J
2014-05-15
The identification of key acid metabolites ('signature' metabolites) has allowed significant improvements to be made in our understanding of the biodegradation of petroleum hydrocarbons, in reservoir and in contaminated natural systems, such as aquifers and seawater. On this basis, anaerobic oxidation is now more widely accepted as one viable mechanism, for instance. However, identification of metabolites in the complex acid mixtures from petroleum degradation is challenging and would benefit from use of more highly resolving analytical methods. Comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry (GCxGC/TOFMS) with both nominal mass and accurate mass measurement was used to study the complex mixtures of aromatic acids (as methyl esters) in petroleum fractions. Numerous mono- and di-aromatic acid isomers were identified in a commercial naphthenic acids fraction from petroleum and in an acids fraction from a biodegraded petroleum. In many instances, compounds were identified by comparison of mass spectral and retention time data with those of authentic compounds. The identification of a variety of alkyl naphthalene carboxylic and alkanoic and alkyl tetralin carboxylic and alkanoic acids, plus identifications of a range of alkyl indane acids, provides further evidence for 'signature' metabolites of biodegradation of aromatic petroleum hydrocarbons. Identifications such as these now offer the prospect of better differentiation of metabolites of bacterial processes (e.g. aerobic, methanogenic, sulphate-reducing) in polar petroleum fractions. Copyright © 2014 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Żmuda-Trzebiatowska, Iwona; Wachowiak, Mirosław; Klisińska-Kopacz, Anna; Trykowski, Grzegorz; Śliwiński, Gerard
2015-02-01
Raman and complementary spectroscopic analyses were performed, taking advantage of the exceptional opportunity to study the original nineteenth-century paint materials from the artist's palette of J. Matejko stored in the National Museum in Cracow. The yellow and ochre-based paints characteristic of Matejko's workshop, selected from an ensemble of 273 labelled tubes (brand R. Ainé/Paris) supplied during the period 1880-1893, were investigated. Highly specific Raman spectra were obtained for paints containing mixtures of the Zn- and Sn-modified Pb-Sb pigment, and also for the ochre-based ones. A clear pigment discrimination of the mixture of cadmium yellow (CdS), cinnabar (HgS) and lead white (2PbCO3·Pb(OH)2) was possible by means of Raman data collected under different excitations at 514 nm and 785 nm. It was shown that the Raman spectra, complemented by XRF, SEM-EDX and in some cases also LIPS and FTIR data, ensure reliable pigment identification in multi-component paints containing secondary species and impurities. The reported spectral signatures will be used for non-destructive investigation of a collection of about 300 oil paintings by J. Matejko. In view of comparative research on Polish painting, which points out that the richness of modified Naples yellows clearly distinguishes Matejko's artworks from others painted in the period 1850-1883, the Raman data of these paints can provide support in authentication studies.
Above the Noise: The Search for Periodicities in the Inner Heliosphere
NASA Astrophysics Data System (ADS)
Threlfall, James; De Moortel, Ineke; Conlon, Thomas
2017-11-01
Remote sensing of coronal and heliospheric periodicities can provide vital insight into the local conditions and dynamics of the solar atmosphere. We seek to trace long (one hour or longer) periodic oscillatory signatures (previously identified above the limb in the corona by, e.g., Telloni et al. in Astrophys. J. 767, 138, 2013) from their origin at the solar surface out into the heliosphere. To do this, we combined on-disk measurements taken by the Atmospheric Imaging Assembly (AIA) onboard the Solar Dynamics Observatory (SDO) and concurrent extreme ultra-violet (EUV) and coronagraph data from one of the Solar Terrestrial Relations Observatory (STEREO) spacecraft to study the evolution of two active regions in the vicinity of an equatorial coronal hole over several days in early 2011. Fourier and wavelet analysis of signals were performed. Applying white-noise-based confidence levels to the power spectra associated with detrended intensity time series yields detections of oscillatory signatures with periods from 6 - 13 hours in both AIA and STEREO data. As was found by Telloni et al. (2013), these signatures are aligned with local magnetic structures. However, typical spectral power densities all vary substantially as a function of period, indicating spectra dominated by red (rather than white) noise. Contrary to the white-noise-based results, applying global confidence levels based on a generic background-noise model (allowing a combination of white noise, red noise, and transients following Auchère et al. in Astrophys. J. 825, 110, 2016) without detrending the time series uncovers only sporadic, spatially uncorrelated evidence of periodic signatures in either instrument. Automating this method to individual pixels in the STEREO/COR coronagraph field of view is non-trivial. Efforts to identify and implement a more robust automatic background noise model fitting procedure are needed.
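The red-versus-white noise distinction above matters because an AR(1) ("red") background concentrates spectral power at long periods. A toy sketch of why a flat white-noise confidence level then over-detects long-period signatures (the AR(1) coefficient and series length are illustrative, not from the observations):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
white = rng.standard_normal(n)

# AR(1) red noise: each sample remembers the previous one.
red = np.empty(n)
red[0] = 0.0
for i in range(1, n):
    red[i] = 0.9 * red[i - 1] + white[i]

def power(x):
    """One-sided power spectrum, zero-frequency bin dropped."""
    p = np.abs(np.fft.rfft(x - x.mean())) ** 2
    return p[1:]

p_red = power(red)
low = p_red[: len(p_red) // 8]      # longest-period eighth of the bins
high = p_red[-(len(p_red) // 8):]   # shortest-period eighth

# Long-period power dwarfs the short-period tail, so any flat
# (white-noise) threshold crosses the red background at long periods
# and flags them spuriously.
print(low.mean() / high.mean())
```

This is the motivation for fitting a background model (white plus red noise components, as in Auchère et al. 2016) before assigning confidence levels to candidate periodicities.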
Automated recognition of stratigraphic marker shales from geophysical logs in iron ore deposits
NASA Astrophysics Data System (ADS)
Silversides, Katherine; Melkumyan, Arman; Wyman, Derek; Hatherly, Peter
2015-04-01
The mining of stratiform ore deposits requires a means of determining the location of stratigraphic boundaries. A variety of geophysical logs may provide the required data but, in the case of banded iron formation hosted iron ore deposits in the Hamersley Ranges of Western Australia, only one geophysical log type (natural gamma) is collected for this purpose. The information from these logs is currently processed by slow manual interpretation. In this paper we present an alternative method of automatically identifying recurring stratigraphic markers in natural gamma logs from multiple drill holes. Our approach is demonstrated using natural gamma geophysical logs that contain features corresponding to the presence of stratigraphically important marker shales. The host stratigraphic sequence is highly consistent throughout the Hamersley and the marker shales can therefore be used to identify the stratigraphic location of the banded iron formation (BIF) or BIF hosted ore. The marker shales are identified using Gaussian Processes (GP) trained by either manual or active learning methods and the results are compared to the existing geological interpretation. The manual method involves the user selecting the signatures for improving the library, whereas the active learning method uses the measure of uncertainty provided by the GP to select specific examples for the user to consider for addition. The results demonstrate that both GP methods can identify a feature, but the active learning approach has several benefits over the manual method. These benefits include greater accuracy in the identified signatures, faster library building, and an objective approach for selecting signatures that includes the full range of signatures across a deposit in the library. When using the active learning method, it was found that the current manual interpretation could be replaced in 78.4% of the holes with an accuracy of 95.7%.
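The active-learning loop described above hinges on the GP's predictive uncertainty: the user is asked to label the example the model is least sure about. A toy sketch of that selection step (the RBF kernel, synthetic "gamma" values and depth grid are illustrative, not the deposit data or the authors' implementation):

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between two 1-D sample sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    """Standard GP regression: predictive mean and variance at x_query."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_query)
    Kss = rbf(x_query, x_query)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    var = np.diag(Kss - Ks.T @ np.linalg.solve(K, Ks))
    return mean, var

# Synthetic 1-D "natural gamma" signal: a few labelled depths at the
# edges, then ask the user about the depth where the GP is least certain.
depths = np.linspace(0.0, 10.0, 101)
labelled = np.array([0.0, 1.0, 2.0, 9.0, 10.0])
gamma = np.sin(labelled)
_, var = gp_posterior(labelled, gamma, depths)
next_query = depths[np.argmax(var)]
print(next_query)   # falls inside the unlabelled 2-9 gap
```

Selecting by maximum predictive variance is what makes the library building objective: the examples added are exactly those the current signature library explains worst.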
ERIC Educational Resources Information Center
Gulikers, Judith T. M.; Bastiaens, Theo J.; Kirschner, Paul A.; Kester, Liesbeth
2006-01-01
This article examines the relationships between perceptions of authenticity and alignment on study approach and learning outcome. Senior students of a vocational training program performed an authentic assessment and filled in a questionnaire about the authenticity of various assessment characteristics and the alignment between the assessment and…
School Principals' Authentic Leadership and Teachers' Psychological Capital: Teachers' Perspectives
ERIC Educational Resources Information Center
Feng, Feng-I
2016-01-01
This study examined teachers' perceptions of principals' authentic leadership and the relationship of authentic leadership to teachers' psychological capital in Taiwan. A total of 1,429 elementary and secondary school teachers were surveyed. The results showed that teachers perceived their principals' authentic leadership as moderate and that the…
21 CFR 1311.125 - Requirements for establishing logical access control-Individual practitioner.
Code of Federal Regulations, 2010 CFR
2010-04-01
... substance prescriptions and who has obtained a two-factor authentication credential as provided in § 1311... his two-factor authentication credential to satisfy the logical access controls. The second individual... authentication factor required by the two-factor authentication protocol is lost, stolen, or compromised. Such...
31 CFR 363.21 - When may you require offline authentication and documentary evidence?
Code of Federal Regulations, 2010 CFR
2010-07-01
... authentication and documentary evidence? 363.21 Section 363.21 Money and Finance: Treasury Regulations Relating... TreasuryDirect § 363.21 When may you require offline authentication and documentary evidence? We may require offline authentication and documentary evidence at our option. [74 FR 19419, Apr. 29, 2009] ...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Requirements for obtaining an authentication... Prescriptions § 1311.110 Requirements for obtaining an authentication credential—Individual practitioners... credentialing office) may conduct identity proofing and authorize the issuance of the authentication credential...
Toward Developing Authentic Leadership: Team-Based Simulations
ERIC Educational Resources Information Center
Shapira-Lishchinsky, Orly
2014-01-01
Although there is a consensus that authentic leadership should be an essential component in educational leadership, no study to date has ever tried to find whether team-based simulations may promote authentic leadership. The purpose of this study was to identify whether principal trainees can develop authentic leadership through ethical decision…
Learning How to Lead: A Lifetime Journey
ERIC Educational Resources Information Center
Baugher, Shirley L.
2005-01-01
Much has been written about theories of leadership, leadership qualities, and the development of leadership. In this article, the author focuses on the work of Kevin Cashman, who proposed the following "Five Touchstones" that are crucial to authentic leadership: (1) Know Yourself Authentically; (2) Listen Authentically; (3) Express Authentically;…
Authenticity in the Bureau-Enterprise Culture: The Struggle for Authentic Meaning
ERIC Educational Resources Information Center
Woods, Philip A.
2007-01-01
This article emphasizes the extent to which conceptions of authenticity are forged through social interaction and socially mediated identities and how, in turn, authentic leadership involves the transformation of the organizational, social or cultural order in which leadership is situated. The overarching context for this exploration of authentic…
Teachers' Development Model to Authentic Assessment by Empowerment Evaluation Approach
ERIC Educational Resources Information Center
Charoenchai, Charin; Phuseeorn, Songsak; Phengsawat, Waro
2015-01-01
The purposes of this study were 1) to study teachers' authentic assessment, teachers' comprehension of authentic assessment, and teachers' needs for authentic assessment development; 2) to create a teacher development model; 3) to test the teacher development model; and 4) to evaluate the effectiveness of the teacher development model. The research is divided into 4…
Trustworthiness and Authenticity: Alternate Ways To Judge Authentic Assessments.
ERIC Educational Resources Information Center
Hipps, Jerome A.
New methods are needed to judge the quality of alternative student assessment, methods which complement the philosophy underlying authentic assessments. This paper examines assumptions underlying validity, reliability, and objectivity, and why they are not matched to authentic assessment, concentrating on the constructivist paradigm of E. Guba and…
[Causes for change in producing areas of geo-authentic herbs].
Liang, Fei; Li, Jian; Zhang, Wei; Zhang, Rui-Xian
2013-05-01
Geo-authentic herbs lay stress on their producing areas. The producing areas of most geo-authentic herbs have never changed since ancient times. However, many other geo-authentic herbs have experienced significant changes over their long history. There are two main causes for the change in producing areas of herbs: change of the natural environment and development of human society, which constrain each other and play a great role throughout the development process of geo-authentic herbs.
An Efficient Authenticated Key Transfer Scheme in Client-Server Networks
NASA Astrophysics Data System (ADS)
Shi, Runhua; Zhang, Shun
2017-10-01
In this paper, we present a novel authenticated key transfer scheme for client-server networks, which achieves the two security goals of remote user authentication and session key establishment between the remote user and the server. Notably, the proposed scheme can subtly provide two fully different modes of authentication, identity-based authentication and anonymous authentication, while the remote user holds only a single private key. Furthermore, our scheme needs to transmit only one round of messages from the remote user to the server, so it is very efficient in communication complexity. In addition, the most time-consuming computation in our scheme is elliptic curve scalar point multiplication, so the scheme is feasible even for mobile devices.
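The cost claim above rests on elliptic-curve scalar point multiplication, which is logarithmic in the scalar via double-and-add. A sketch over a small textbook curve (y² = x³ + 2x + 2 over F₁₇ with generator (5, 1) of order 19 is a standard teaching example, not the scheme's parameters):

```python
P_MOD, A = 17, 2          # toy curve y^2 = x^3 + 2x + 2 over F_17
INF = None                # point at infinity (group identity)

def add(p, q):
    """Group law: add two affine points (or the identity)."""
    if p is INF:
        return q
    if q is INF:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return INF                          # p + (-p) = identity
    if p == q:                              # tangent slope (doubling)
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:                                   # chord slope (addition)
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, p):
    """Double-and-add: O(log k) curve operations per multiplication."""
    result, addend = INF, p
    while k:
        if k & 1:
            result = add(result, addend)
        addend = add(addend, addend)
        k >>= 1
    return result

G = (5, 1)                        # generator of a subgroup of order 19
print(scalar_mult(2, G))          # doubling G
print(scalar_mult(19, G))         # order * G returns the identity (None)
```

Real schemes use standardized curves with ~256-bit moduli and constant-time ladders, but the loop structure, and hence the "one dominant operation" cost profile, is the same.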
Gjersoe, Nathalia L.; Newman, George E.; Chituc, Vladimir; Hood, Bruce
2014-01-01
The current studies examine how valuation of authentic items varies as a function of culture. We find that U.S. respondents value authentic items associated with individual persons (a sweater or an artwork) more than Indian respondents, but that both cultures value authentic objects not associated with persons (a dinosaur bone or a moon rock) equally. These differences cannot be attributed to more general cultural differences in the value assigned to authenticity. Rather, the results support the hypothesis that individualistic cultures place a greater value on objects associated with unique persons and in so doing, offer the first evidence for how valuation of certain authentic items may vary cross-culturally. PMID:24658437
NASA Astrophysics Data System (ADS)
Dong, Yumin; Xiao, Shufen; Ma, Hongyang; Chen, Libo
2016-12-01
Cloud computing and big data have become the developing engines of current information technology (IT). However, security protection is increasingly important for cloud computing and big data, and it is a problem that must be solved for cloud computing to develop further. The theft of identity authentication information remains a serious threat to the security of cloud computing: attackers intrude into cloud computing services through stolen identity authentication information, thereby threatening data security from multiple perspectives. Therefore, this study proposes a model for cloud computing protection and management based on quantum authentication, introduces the principle of quantum authentication, and deduces the quantum authentication process. In theory, quantum authentication technology can be applied to cloud computing for security protection. Because quantum states cannot be cloned, the approach is more secure and reliable than classical methods.
A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks
Chen, Huifang; Ge, Linlin; Xie, Lei
2015-01-01
Because a wireless ad hoc network (WANET) operates without infrastructure support, it is exposed to various attacks, and user authentication is the first safety barrier in such a network. Mutual trust is achieved by a protocol that enables the communicating parties to authenticate each other and exchange session keys at the same time. For a resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme for WANETs based on a self-certified public key system and elliptic curve cryptography. Using the proposed scheme, efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to common known attacks. In addition, performance analysis shows that our proposed scheme performs similarly to or better than some existing user authentication schemes. PMID:26184224
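Elliptic-curve and self-certified-key details aside, the session-key-agreement step has the familiar Diffie-Hellman shape. A toy finite-field sketch (small prime and hypothetical function names, not the paper's construction):

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters; real deployments use >= 2048-bit groups
# (or, as in the paper, elliptic curve groups).
p = 2**127 - 1          # a Mersenne prime, small enough to keep this readable
g = 3                   # public base

def keypair():
    """Generate a (private, public) pair: pk = g^sk mod p."""
    sk = secrets.randbelow(p - 2) + 1
    return sk, pow(g, sk, p)

def session_key(own_sk, peer_pk):
    """Derive a shared session key by hashing the DH shared secret."""
    shared = pow(peer_pk, own_sk, p)            # g^(a*b) mod p on both sides
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()
```

Both parties compute g^(ab) mod p from their own private key and the peer's public value, so the derived session keys match without the secret ever crossing the network.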
Secure ADS-B authentication system and method
NASA Technical Reports Server (NTRS)
Viggiano, Marc J (Inventor); Valovage, Edward M (Inventor); Samuelson, Kenneth B (Inventor); Hall, Dana L (Inventor)
2010-01-01
A secure system for authenticating the identity of ADS-B systems, including: an authenticator, including a unique-id generator and a transmitter that transmits the unique id to one or more ADS-B transmitters; one or more ADS-B transmitters, each including a receiver that receives the unique id, one or more secure processing stages that merge the unique id with the ADS-B transmitter's identification, data, and secret key to generate a secure code, and a transmitter that transmits a response containing the secure code and the ADS-B transmitter's data to the authenticator; the authenticator including means for independently determining each ADS-B transmitter's secret key, a receiver that receives each ADS-B transmitter's response, one or more secure processing stages that merge the unique id with the ADS-B transmitter's identification and data to generate a secure code, and comparison processing that compares the authenticator-generated secure code with the ADS-B transmitter-generated secure code and provides an authentication signal based on the comparison result.
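The patent does not spell out the "secure processing stages"; an HMAC over the challenge, transmitter identification, and data, keyed with the per-transmitter secret, is one natural reading of the merge-and-compare flow:

```python
import hashlib
import hmac
import os

def secure_code(secret_key, unique_id, transmitter_id, data):
    """One reading of the 'merge' stages: HMAC over challenge, id, and data."""
    return hmac.new(secret_key, unique_id + transmitter_id + data,
                    hashlib.sha256).digest()

# Authenticator side: issue a fresh unique id (challenge).
secret = os.urandom(32)            # per-transmitter secret key (shared)
challenge = os.urandom(16)         # unique id sent to the transmitter

# Transmitter side: respond with its identification, data, and secure code.
tx_id, tx_data = b"tx-identification", b"tx-surveillance-data"
response = (tx_id, tx_data, secure_code(secret, challenge, tx_id, tx_data))

# Authenticator side: recompute independently and compare.
rx_id, rx_data, rx_code = response
authentic = hmac.compare_digest(
    rx_code, secure_code(secret, challenge, rx_id, rx_data))
```

A spoofer who does not hold the secret key cannot produce a matching code for a fresh challenge, which is the comparison-based authentication signal the claim describes.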
NASA Technical Reports Server (NTRS)
Brenner, Richard; Lala, Jaynarayan H.; Nagle, Gail A.; Schor, Andrei; Turkovich, John
1994-01-01
This program demonstrated the integration of a number of technologies that can increase the availability and reliability of launch vehicles while lowering costs. Availability is increased with an advanced guidance algorithm that adapts trajectories in real-time. Reliability is increased with fault-tolerant computers and communication protocols. Costs are reduced by automatically generating code and documentation. This program was realized through the cooperative efforts of academia, industry, and government. NASA LaRC coordinated the effort, while Draper performed the integration. Georgia Institute of Technology supplied a weak Hamiltonian finite element method for optimal control problems. Martin Marietta used MATLAB to apply this method to a launch vehicle (FENOC). Draper supplied the fault-tolerant computing and software automation technology. The fault-tolerant technology includes sequential and parallel fault-tolerant processors (FTP & FTPP) and authentication protocols (AP) for communication. Fault-tolerant technology was incrementally incorporated. Development culminated with a heterogeneous network of workstations and fault-tolerant computers using AP. Draper's software automation system, ASTER, was used to specify a static guidance system based on FENOC, navigation, flight control (GN&C), models, and the interface to a user interface for mission control. ASTER generated Ada code for GN&C and C code for models. An algebraic transform engine (ATE) was developed to automatically translate MATLAB scripts into ASTER.
Reconciling Divisions in the Field of Authentic Education
ERIC Educational Resources Information Center
Sarid, Ariel
2015-01-01
The aim of this article is twofold: first, to identify and address three central divisions in the field of authentic education that introduce ambiguity and at times inconsistencies within the field of authentic education. These divisions concern a) the relationship between autonomy and authenticity; b) the division between the two basic attitudes…
ERIC Educational Resources Information Center
Ramezanzadeh, Akram; Adel, Seyyed Mohammad Reza; Zareian, Gholamreza
2016-01-01
This study probed the conceptualization of authenticity in teaching and its link to teachers' emotional life through critical emotional praxis because emotions are integral to discovering who we really are (McCarthy, E. D. 2009. "Emotional Performances as Dramas of Authenticity." In "Authenticity in Culture, Self, and Society,"…
Code of Federal Regulations, 2010 CFR
2010-07-01
... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Authentication and certification... Suspension and Revocation Hearings § 20.1303 Authentication and certification of extracts from shipping...) Authentication and certification must include a statement that the person acting has seen the original, compared...
An Examination of Teacher Authenticity in the College Classroom
ERIC Educational Resources Information Center
Johnson, Zac D.; LaBelle, Sara
2017-01-01
This study sought to generate a more robust understanding of teacher (in)authenticity. In other contexts, authenticity is regarded as a display of true self and has been positively linked to beneficial psychological (e.g., increased self-esteem) and social outcomes (e.g., higher relational satisfaction). However, what it means to be authentic in…
Commonalities and Specificities of Authentic Leadership in Ghana and New Zealand
ERIC Educational Resources Information Center
Owusu-Bempah, Justice; Addison, Ramzi; Fairweather, John
2014-01-01
The authentic leadership literature suggests that there are three critical elements that precede the bestowal of authentic leadership: first, the espoused values and actions of authentic leaders must be congruent; second, the expectation of the leaders and the followers must be congruent; and, third, the leaders must behave with high moral…
ERIC Educational Resources Information Center
Everett, Donna R.
This guide presents performance-based authentic assessment ideas, samples, and suggestions to help marketing teachers and students respond to changes and pressures from outside the classroom. It contains 21 activities, each accompanied by a method of authentic assessment. In most cases, the authentic assessment method is a scoring device. The…
ERIC Educational Resources Information Center
Stone, Anne H.
2012-01-01
Purpose: The purpose of this study was to identify what leadership coaches perceive to be the benefits of authenticity to their clients' success. Another purpose was to identify what barriers leadership coaches perceive as preventing their clients from developing authenticity. A final purpose of this study was to identify which strategies…
Authentic e-Learning in a Multicultural Context: Virtual Benchmarking Cases from Five Countries
ERIC Educational Resources Information Center
Leppisaari, Irja; Herrington, Jan; Vainio, Leena; Im, Yeonwook
2013-01-01
The implementation of authentic learning elements at education institutions in five countries, eight online courses in total, is examined in this paper. The International Virtual Benchmarking Project (2009-2010) applied the elements of authentic learning developed by Herrington and Oliver (2000) as criteria to evaluate authenticity. Twelve…
Developmental Changes in Judgments of Authentic Objects
ERIC Educational Resources Information Center
Frazier, Brandy N.; Gelman, Susan A.
2009-01-01
This study examined the development of an understanding of authenticity among 112 children (preschoolers, kindergarten, 1st graders, and 4th graders) and 119 college students. Participants were presented with pairs of photographs depicting authentic and non-authentic objects and asked to pick which one belongs in a museum and which one they would…
Using Horses to Teach Authentic Leadership Skills to At-Risk Youth
ERIC Educational Resources Information Center
Adams, Brittany Lee
2013-01-01
The primary purpose of this study was to determine the impact of an equine-facilitated authentic leadership development program on at-risk youth. Participants were asked to participate in two focus groups and a 3-day equine-facilitated authentic leadership development program based on Bill George's Model of Authentic Leadership. Participants were…
Emmerich, Astrid I.; Rigotti, Thomas
2017-01-01
This study investigates the role of context-specific authenticity at work for work-related outcomes (intrinsic motivation, work ability) and depressivity. Furthermore, reciprocal relations between work-related authenticity and healthy psychological functioning are investigated. Longitudinal data from 1,243 employees from 63 subsidiaries of a non-profit organization in the social sector were analyzed using multilevel structural equation modeling. Work-related authenticity at T1 predicted work ability and depressivity, but not intrinsic motivation, at T2, about 6 months later. Work-related authenticity at T2 was predicted by intrinsic motivation and depressivity, but not by work ability, at T1. We conclude that work-related authenticity and healthy psychological functioning mutually reinforce each other. Thus, enabling employees to be authentic is likely to increase their well-being and is a pivotal opportunity for organizations to foster health- and performance-related indicators like work ability and to prevent negative health indicators like depressivity. At the same time, the authenticity of employees can be fostered through workplace health promotion. PMID:28316581
Manticore and CS mode : parallelizable encryption with joint cipher-state authentication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree
2004-10-01
We describe a new mode of encryption with inexpensive authentication, which uses information from the internal state of the cipher to provide the authentication. Our algorithms have a number of benefits: (1) the encryption has properties similar to CBC mode, yet the encipherment and authentication can be parallelized and/or pipelined, (2) the authentication overhead is minimal, and (3) the authentication process remains resistant against some IV reuse. We offer a Manticore class of authenticated encryption algorithms based on cryptographic hash functions, which support variable block sizes up to twice the hash output length and variable key lengths. A proof of security is presented for the MTC4 and Pepper algorithms. We then generalize the construction to create the Cipher-State (CS) mode of encryption that uses the internal state of any round-based block cipher as an authenticator. We provide hardware and software performance estimates for all of our constructions and give a concrete example of the CS mode of encryption that uses AES as the encryption primitive and adds a small speed overhead (10-15%) compared to AES alone.
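MTC4, Pepper, and CS mode are not reproduced here; the toy sketch below only illustrates the general idea of deriving the authenticator from state the algorithm already maintains, using a hash in place of a block cipher:

```python
import hashlib

def _h(*parts):
    return hashlib.sha256(b"".join(parts)).digest()

def encrypt(key, nonce, plaintext):
    """Toy hash-based AE: keystream from (key, nonce, counter); the running
    state absorbs each plaintext block, and the tag is derived from it."""
    out, state = bytearray(), _h(key, nonce)
    for i in range(0, len(plaintext), 32):
        block = plaintext[i:i + 32]
        ks = _h(key, nonce, i.to_bytes(8, "big"))        # keystream block
        out += bytes(a ^ b for a, b in zip(block, ks))   # XOR encryption
        state = _h(state, block)                         # absorb into state
    return bytes(out), _h(key, state)                    # tag from the state

def decrypt(key, nonce, ciphertext, tag):
    pt, state = bytearray(), _h(key, nonce)
    for i in range(0, len(ciphertext), 32):
        block = ciphertext[i:i + 32]
        ks = _h(key, nonce, i.to_bytes(8, "big"))
        p = bytes(a ^ b for a, b in zip(block, ks))
        pt += p
        state = _h(state, p)
    if _h(key, state) != tag:
        raise ValueError("authentication failed")
    return bytes(pt)
```

A real CS-mode instantiation taps the round state of a block cipher such as AES rather than a separate hash chain; the sketch only shows why reusing internal state makes the authentication overhead small.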
A Key Establishment Protocol for RFID User in IPTV Environment
NASA Astrophysics Data System (ADS)
Jeong, Yoon-Su; Kim, Yong-Tae; Sohn, Jae-Min; Park, Gil-Cheol; Lee, Sang-Ho
In recent years, the usage of IPTV (Internet Protocol Television) has increased, driven by the technological convergence of broadcasting and telecommunication that delivers interactive applications and multimedia content over high-speed Internet connections. The critical point of IPTV security requirements is subscriber authentication: the IPTV service must be able to identify its subscribers in order to prohibit illegal access. Currently, IPTV services do not provide a sound authentication mechanism to verify the identity of their wireless users (or devices). This paper focuses on a lightweight authentication and key establishment protocol based on hash functions. The proposed approach provides effective authentication for a mobile user with an RFID tag, whose authentication information is communicated back and forth with the IPTV authentication server via the IPTV set-top box (STB). Specifically, the proposed protocol generates the user's authentication information as a bundle of two public keys derived by hashing the user's private keys and the RFID tag's session identifier, and adds 1 bit to this bundled information for subscriber information confidentiality before passing it to the authentication server.
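The abstract leaves the exact construction open; a hedged sketch of the described credential (hypothetical key names and sizes, not the paper's parameters) might look like:

```python
import hashlib
import os

def h(*parts):
    return hashlib.sha256(b"".join(parts)).digest()

# Hypothetical values; the paper does not fix key or identifier sizes.
user_sk1, user_sk2 = os.urandom(32), os.urandom(32)  # user's two private keys
session_id = os.urandom(16)                          # RFID tag's session identifier

# Bundle of two "public keys" derived by hashing the private keys with the
# tag's session identifier, plus the extra bit the protocol appends for
# subscriber information confidentiality.
bundle = h(user_sk1, session_id) + h(user_sk2, session_id)
message = bundle + bytes([1])   # passed to the authentication server via the STB
```

Because the hashes are one-way, the server-side verifier can check the bundle without ever learning the user's private keys.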
The influence of authentic leadership on safety climate in nursing.
Dirik, Hasan Fehmi; Seren Intepeler, Seyda
2017-07-01
This study analysed nurses' perceptions of authentic leadership and safety climate and examined the contribution of authentic leadership to the safety climate. It has been suggested and emphasised that authentic leadership should be used as guidance to ensure quality care and the safety of patients and health-care personnel. This predictive study was conducted with 350 nurses in three Turkish hospitals. The data were collected using the Authentic Leadership Questionnaire and the Safety Climate Survey and analysed using hierarchical regression analysis. The mean authentic leadership perception and safety climate scores of the nurses were 2.92 and 3.50, respectively. The percentage of problematic responses was found to be less than 10% for only four safety climate items. Hierarchical regression analysis revealed that authentic leadership significantly predicted the safety climate. Procedural and political improvements are required in terms of the safety climate in the institutions where the study was conducted, and authentic leadership increases positive perceptions of the safety climate. Exhibiting the characteristics of authentic leadership, or improving them and reflecting them onto personnel, can enhance the safety climate. Planning information-sharing meetings to raise personnel's awareness of the safety climate, together with systemic improvements, can contribute to creating safe care climates. © 2017 John Wiley & Sons Ltd.
Chang, I-Pin; Lee, Tian-Fu; Lin, Tsung-Hung; Liu, Chuan-Ming
2015-01-01
Key agreements that use only password authentication are convenient in communication networks, but these key agreement schemes often fail to resist possible attacks, and therefore provide poor security compared with some other authentication schemes. To increase security, many authentication and key agreement schemes use smartcard authentication in addition to passwords. Thus, two-factor authentication and key agreement schemes using smartcards and passwords are widely adopted in many applications. Vaidya et al. recently presented a two-factor authentication and key agreement scheme for wireless sensor networks (WSNs). Kim et al. observed that the Vaidya et al. scheme fails to resist gateway node bypassing and user impersonation attacks, and then proposed an improved scheme for WSNs. This study analyzes the weaknesses of the two-factor authentication and key agreement scheme of Kim et al., which include vulnerability to impersonation attacks, lost smartcard attacks and man-in-the-middle attacks, violation of session key security, and failure to protect user privacy. An efficient and secure authentication and key agreement scheme for WSNs based on the scheme of Kim et al. is then proposed. The proposed scheme not only solves the weaknesses of previous approaches, but also increases security requirements while maintaining low computational cost. PMID:26633396
Compact FPGA hardware architecture for public key encryption in embedded devices
Morales-Sandoval, Miguel; Cumplido, René; Feregrino-Uribe, Claudia; Algredo-Badillo, Ignacio
2018-01-01
Security is a crucial requirement in the envisioned applications of the Internet of Things (IoT), where most of the underlying computing platforms are embedded systems with reduced computing capabilities and energy constraints. In this paper we present the design and evaluation of a scalable low-area FPGA hardware architecture that serves as a building block to accelerate the costly operations of exponentiation and multiplication in GF(p), commonly required in security protocols relying on public key encryption, such as in key agreement, authentication and digital signature. The proposed design can process operands of different size using the same datapath, which exhibits a significant reduction in area without loss of efficiency if compared to representative state of the art designs. For example, our design uses 96% less standard logic than a similar design optimized for performance, and 46% less resources than other design optimized for area. Even using fewer area resources, our design still performs better than its embedded software counterparts (190x and 697x). PMID:29360824
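The accelerated operation, modular exponentiation in GF(p), corresponds in software to left-to-right square-and-multiply, which mirrors the sequential datapath a hardware design like this one iterates:

```python
def mod_exp(base, exp, p):
    """Left-to-right square-and-multiply: one modular squaring per exponent
    bit, plus one modular multiplication for each set bit."""
    result = 1
    for bit in bin(exp)[2:]:           # scan exponent bits MSB -> LSB
        result = (result * result) % p         # square
        if bit == "1":
            result = (result * base) % p       # conditional multiply
    return result
```

Python's built-in `pow(base, exp, p)` does the same job; spelling the loop out shows why the FPGA block only needs a modular multiplier and a small control path to cover both multiplication and exponentiation.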
Compact FPGA hardware architecture for public key encryption in embedded devices.
Rodríguez-Flores, Luis; Morales-Sandoval, Miguel; Cumplido, René; Feregrino-Uribe, Claudia; Algredo-Badillo, Ignacio
2018-01-01
Security is a crucial requirement in the envisioned applications of the Internet of Things (IoT), where most of the underlying computing platforms are embedded systems with reduced computing capabilities and energy constraints. In this paper we present the design and evaluation of a scalable low-area FPGA hardware architecture that serves as a building block to accelerate the costly operations of exponentiation and multiplication in GF(p), commonly required in security protocols relying on public key encryption, such as in key agreement, authentication and digital signature. The proposed design can process operands of different size using the same datapath, which exhibits a significant reduction in area without loss of efficiency if compared to representative state of the art designs. For example, our design uses 96% less standard logic than a similar design optimized for performance, and 46% less resources than other design optimized for area. Even using fewer area resources, our design still performs better than its embedded software counterparts (190x and 697x).
Tongue prints: A novel biometric and potential forensic tool.
Radhika, T; Jeddy, Nadeem; Nithya, S
2016-01-01
The tongue is a vital internal organ well encased within the oral cavity and protected from the environment. It has unique features that differ from individual to individual and even between identical twins. Its color, shape, and surface features are characteristic of every individual, and this serves as a tool for identification. Many modes of biometric systems have come into existence, such as fingerprint, iris scan, skin color, signature verification, voice recognition, and face recognition. The search for a new, secure method of personal identification has led to the use of the lingual impression, or tongue print, as a method of biometric authentication. Tongue characteristics exhibit sexual dimorphism, thus aiding in the identification of a person. Emerging as a novel biometric tool, tongue prints also hold the promise of a potential forensic tool. This review highlights the uniqueness of tongue prints and their superiority over other biometric identification systems. The various methods of tongue print collection and the classification of tongue features are also elucidated.
Canonical quantization of general relativity in discrete space-times.
Gambini, Rodolfo; Pullin, Jorge
2003-01-17
It has long been recognized that lattice gauge theory formulations, when applied to general relativity, conflict with the invariance of the theory under diffeomorphisms. We analyze discrete lattice general relativity and develop a canonical formalism that allows one to treat constrained theories in Lorentzian signature space-times. The presence of the lattice introduces a "dynamical gauge" fixing that makes the quantization of the theories conceptually clear, albeit computationally involved. The problem of a consistent algebra of constraints is automatically solved in our approach. The approach works successfully in other field theories as well, including topological theories. A simple cosmological application exhibits quantum elimination of the singularity at the big bang.
Results from the Crop Identification Technology Assessment for Remote Sensing (CITARS) project
NASA Technical Reports Server (NTRS)
Bauer, M. E. (Principal Investigator); Davis, B. J.; Bizzell, R. M.; Hall, F. G.; Feiveson, A. H.; Malila, W. A.; Rice, D. P.
1976-01-01
The author has identified the following significant results. It was found that several factors had a significant effect on crop identification performance: (1) crop maturity and site characteristics, (2) which of several different single date automatic data processing procedures was used for local recognition, (3) nonlocal recognition, both with and without preprocessing for the extension of recognition signatures, and (4) use of multidate data. It also was found that classification accuracy for field center pixels was not a reliable indicator of proportion estimation performance for whole areas, that bias was present in proportion estimates, and that training data and procedures strongly influenced crop identification performance.
Detection and identification of concealed weapons using matrix pencil
NASA Astrophysics Data System (ADS)
Adve, Raviraj S.; Thayaparan, Thayananthan
2011-06-01
The detection and identification of concealed weapons is an extremely hard problem due to the weak signature of the target buried within the much stronger signal from the human body. This paper furthers the automatic detection and identification of concealed weapons by proposing an effective approach to estimating the resonant frequencies in a measurement. The technique, based on Matrix Pencil, a scheme for model-based parameter estimation, also provides amplitude information, and hence a level of confidence in the results. Of specific interest is the fact that Matrix Pencil is based on a singular value decomposition, making the scheme robust against noise.
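The full Matrix Pencil method extracts several poles from the generalized eigenvalues of a Hankel-matrix pencil; the degenerate single-pole case below shows the underlying idea (for a signal x[n] = A z^n, the pole z is the ratio of successive samples):

```python
import cmath
import math

def single_pole(samples, dt):
    """Single-pole special case of the Matrix Pencil idea: estimate the
    resonant frequency (Hz) and damping factor (1/s) of x[n] = A * z**n
    from two successive complex samples spaced dt seconds apart."""
    z = samples[1] / samples[0]                 # pole estimate
    freq = cmath.phase(z) / (2 * math.pi * dt)  # resonant frequency
    damping = math.log(abs(z)) / dt             # damping (negative = decay)
    return freq, damping
```

With several poles present, one instead stacks the samples into Hankel matrices, truncates small singular values (the noise-robustness step the abstract mentions), and reads the poles off a generalized eigenvalue problem; this sketch is only the one-pole seed of that machinery.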
ERIC Educational Resources Information Center
Smith, Jane; Wiese, Patricia
2006-01-01
This article discusses the importance of authentic picture-storybook adaptations of multicultural folktales and describes an action research project through which a children's picture-book adaptation of a traditional tale can be authenticated using an inquiry-based process. In addition to modeling an actual authentication project using "The Golden…
21 CFR 20.3 - Certification and authentication of Food and Drug Administration records.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Certification and authentication of Food and Drug... authentication of Food and Drug Administration records. (a) Upon request, the Food and Drug Administration will... or for authentication of records shall be sent in writing to the Freedom of Information Staff (HFI-35...
Children's and Adolescents' Perception of the Authenticity of Smiles
ERIC Educational Resources Information Center
Thibault, Pascal; Gosselin, Pierre; Brunel, Marie-Lise; Hess, Ursula
2009-01-01
Recently, Thibault and colleagues described the Duchenne marker as a cultural dialect for the perception of smile authenticity. The current study had the goal to follow up on this finding and to investigate the cues that French Canadian children use to evaluate the authenticity of smiles from members of three ethnic groups. The authenticity of six…
Meeting EFL Learners Halfway by Using Locally Relevant Authentic Materials
ERIC Educational Resources Information Center
Thomas, Catherine
2014-01-01
The author defines and describes authentic materials and discusses their benefits--citing the Input Hypothesis and the Output Principle in support of such materials--as well as some challenges of using authentic materials. Five categories of authentic materials are presented, and sources for materials and ways to use them in the EFL classroom are…
Service Oriented Architecture Security Risks and their Mitigation
2012-10-01
this section can be mitigated by making use of suitable authentication, confidentiality, integrity, and authorisation standards such as Security...for authorisation. Machines/non-human users should be clearly identified and authenticated by the identity provision and authentication services... authentication, any security-related attributes for the subject, and the authorisation decisions given based on the security and privilege attributes
A Framework for Determining the Authenticity of Assessment Tasks: Applied to an Example in Law
ERIC Educational Resources Information Center
Burton, Kelley
2011-01-01
Authentic assessment tasks enhance engagement, retention and the aspirations of students. This paper explores the discipline-generic features of authentic assessment, which reflect what students need to achieve in the real world. Some assessment tasks are more authentic than others and this paper designs a proposed framework supported by the…
ERIC Educational Resources Information Center
Hardre, Patricia L.
2013-01-01
Authenticity is a key to using technology for instruction in ways that enhance learning and support learning transfer. Simply put, a representation is authentic when it shows learners clearly what a task, context, or experience will be like in real practice. More authentic representations help people learn and understand better. They support…
Authentic leadership: develop the leader within.
Yasinski, Lesia
2014-03-01
Great leadership usually starts with a willing heart, a positive attitude, and a desire to make a difference. Strong leadership is important in today's health care climate to ensure optimal patient outcomes and the fostering of future generations of knowledgeable, motivated, and enthusiastic perioperative nurses. This article will explore key elements necessary for the development of authentic leadership. While highlighting the role that personal development plays in leadership skills, this article will also discuss ways to cultivate authenticity in leadership. The following questions will be addressed: What is authentic leadership? How does one become an authentic leader?
[Brief introduction of geo-authentic herbs].
Liang, Fei; Li, Jian; Zhang, Wei; Zhang, Rui-Xian
2013-05-01
The science of geo-authentic herbs is a characteristic discipline of traditional Chinese medicine, established over thousands of years of clinical practice under the guidance of its profound theories. The term "geo-authentic product" was derived from an administrative division unit in ancient times and laid stress on the good quality of products from particular regions. In ancient records of traditional Chinese medicine, the term "geo-authentic product" is first found in the Concise Herbal Foundation Compilation of the Ming dynasty, and the term "geo-authentic herbs" is first found in Peony Pavilion of the late Ming dynasty. Ultimately, clinical effect is the fundamental evaluation standard of geo-authentic herbs.
Women outperform men in distinguishing between authentic and nonauthentic smiles.
Spies, Maren; Sevincer, A Timur
2017-11-28
Women tend to be more accurate in decoding facial expressions than men. We hypothesized that women's better performance in decoding facial expressions extends to distinguishing between authentic and nonauthentic smiles. We showed participants portrait photos of persons who smiled because either they saw a pleasant picture (authentic smile) or were instructed to smile by the experimenter (nonauthentic smile) and asked them to identify the smiles. Participants judged single photos of persons depicting either an authentic or a nonauthentic smile, and they judged adjacent photos of the same person depicting an authentic smile and a nonauthentic smile. Women outperformed men in identifying the smiles when judging the adjacent photos. We discuss implications for judging smile authenticity in real life and limitations for the observed sex difference.
A Secured Authentication Protocol for SIP Using Elliptic Curves Cryptography
NASA Astrophysics Data System (ADS)
Chen, Tien-Ho; Yeh, Hsiu-Lien; Liu, Pin-Chuan; Hsiang, Han-Chen; Shih, Wei-Kuan
Session Initiation Protocol (SIP) is a technology regularly used in Internet telephony, and Hypertext Transfer Protocol (HTTP) digest authentication is one of the major methods in the SIP authentication mechanism. In 2005, Yang et al. pointed out that HTTP digest authentication could not resist server spoofing and off-line password guessing attacks and proposed a secure authentication scheme based on the Diffie-Hellman concept. In 2009, Tsai proposed a nonce-based authentication protocol for SIP. In this paper, we demonstrate that their protocol cannot resist the password guessing attack and the insider attack. Furthermore, we propose an ECC-based authentication mechanism to solve these issues and present a security analysis of our protocol to show that it is suitable for applications with higher security requirements.
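The digest mechanism this record refers to is standard HTTP digest authentication (RFC 2617), which SIP reuses. A minimal sketch of how the digest response is computed follows; the field values are illustrative, and quality-of-protection extensions are omitted:

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

def digest_response(username: str, realm: str, password: str,
                    method: str, uri: str, nonce: str) -> str:
    """RFC 2617 digest response (without qop extensions):
    response = MD5(HA1 : nonce : HA2), where
      HA1 = MD5(username : realm : password)
      HA2 = MD5(method : uri)
    """
    ha1 = md5_hex(f"{username}:{realm}:{password}")
    ha2 = md5_hex(f"{method}:{uri}")
    return md5_hex(f"{ha1}:{nonce}:{ha2}")
```

Because the response is a deterministic hash of the password and publicly visible values, an eavesdropper who captures one exchange can mount the off-line guessing attack mentioned above by hashing candidate passwords until one reproduces the observed response.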
Relationship authenticity partially mediates the effects of attachment on relationship satisfaction.
Rasco, Danney; Warner, Rebecca M
2017-01-01
Individuals with anxious and avoidant attachment tend to experience less satisfaction in their relationships. Past research suggests the negative effects of attachment on relationship satisfaction may be partially mediated by self-disclosure and self-concealment; the present study evaluated relationship authenticity as a potential additional mediator. Confirmatory factor analysis indicated that relationship authenticity is distinct from self-disclosure and self-concealment. Relationship authenticity predicted additional variance in relationship satisfaction controlling for attachment, self-disclosure, and self-concealment. The results were consistent with relationship authenticity, along with self-disclosure and self-concealment, partially mediating the effects of attachment on relationship satisfaction. These findings suggest that relationship authenticity may play a unique role in understanding how attachment influences relationship satisfaction. Theoretical and clinical implications are discussed.
NASA Astrophysics Data System (ADS)
Hardyanti, R. C.; Hartono; Fianti
2018-03-01
Physics learning in the Curriculum of 2013 is closely related to the implementation of a scientific approach and authentic assessment. This study aims to analyze the implementation of the scientific approach and authentic assessment in physics learning, as well as the constraints on each. Data were collected through questionnaires, observations, interviews, and documentation, and were analyzed using percentage techniques and a qualitative descriptive approach. The results show that physics learning based on the scientific approach is implemented well, with a score of 84.60%, and that learning based on authentic assessment is also implemented well, with a score of 88%. Because both percentages fall short of 100%, there are evidently obstacles to implementation. The obstacles to the scientific approach include time, the heavy load of material, the input or ability of learners, the willingness of learners to ask questions, laboratory support, and students' ability to process data. The obstacles to authentic assessment include the limited time available to carry it out, the components of its criteria, the lack of discipline in handling the administration, the difficulty of changing habits from traditional to authentic assessment, and the difficulty of processing scores in accordance with the Curriculum of 2013 format.
Discovering your authentic leadership.
George, Bill; Sims, Peter; McLean, Andrew N; Mayer, Diana
2007-02-01
The ongoing problems in business leadership over the past five years have underscored the need for a new kind of leader in the twenty-first century: the authentic leader. Author Bill George, a Harvard Business School professor and the former chairman and CEO of Medtronic, and his colleagues, conducted the largest leadership development study ever undertaken. They interviewed 125 business leaders from different racial, religious, national, and socioeconomic backgrounds to understand how leaders become and remain authentic. Their interviews showed that you do not have to be born with any particular characteristics or traits to lead. You also do not have to be at the top of your organization. Anyone can learn to be an authentic leader. The journey begins with leaders understanding their life stories. Authentic leaders frame their stories in ways that allow them to see themselves not as passive observers but as individuals who learn from their experiences. These leaders make time to examine their experiences and to reflect on them, and in doing so they grow as individuals and as leaders. Authentic leaders also work hard at developing self-awareness through persistent and often courageous self-exploration. Denial can be the greatest hurdle that leaders face in becoming self-aware, but authentic leaders ask for, and listen to, honest feedback. They also use formal and informal support networks to help them stay grounded and lead integrated lives. The authors argue that achieving business results over a sustained period of time is the ultimate mark of authentic leadership. It may be possible to drive short-term outcomes without being authentic, but authentic leadership is the only way to create long-term results.
42 CFR 401.140 - Fees and charges.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for records; and (3) Certification or authentication of records. (b) Fee schedules. The fee schedule.... (3) Certification or authentication of records. Three dollars per certification or authentication. (4...
Code of Federal Regulations, 2010 CFR
2010-07-01
... that my password or other form of authentication has become compromised? 363.19 Section 363.19 Money... that my password or other form of authentication has become compromised? If you become aware that your password has become compromised, that any other form of authentication has been compromised, lost, stolen...
ERIC Educational Resources Information Center
Mattord, Herbert J.
2012-01-01
Organizations continue to rely on password-based authentication methods to control access to many Web-based systems. This research study developed a benchmarking instrument intended to assess authentication methods used in Web-based information systems (IS). It developed an Authentication Method System Index (AMSI) to analyze collected data from…
ERIC Educational Resources Information Center
Thompson, Merlin B.
2015-01-01
The problem with authenticity--the idea of being "true to one's self"--is that its somewhat checkered reputation garners a complete range of favorable and unfavorable reactions. In educational settings, authenticity is lauded as one of the top two traits students desire in their teachers. Yet, authenticity is criticized for its tendency…
Robust authentication through stochastic femtosecond laser filament induced scattering surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Haisu; Tzortzakis, Stelios, E-mail: stzortz@iesl.forth.gr; Materials Science and Technology Department, University of Crete, 71003 Heraklion
2016-05-23
We demonstrate a reliable authentication method based on femtosecond laser filament-induced scattering surfaces. The stochastic nature of the nonlinear laser fabrication yields uniquely robust authentication properties. This work provides a simple and viable solution for practical applications in product authentication, while also opening the way for incorporating such elements in transparent media and coupling them into integrated optical circuits.
Obfuscated authentication systems, devices, and methods
Armstrong, Robert C; Hutchinson, Robert L
2013-10-22
Embodiments of the present invention are directed toward authentication systems, devices, and methods. Obfuscated executable instructions may encode an authentication procedure and protect an authentication key. The obfuscated executable instructions may require communication with a remote certifying authority for operation. In this manner, security may be controlled by the certifying authority without regard to the security of the electronic device running the obfuscated executable instructions.
Applications of Multi-Channel Safety Authentication Protocols in Wireless Networks.
Chen, Young-Long; Liau, Ren-Hau; Chang, Liang-Yu
2016-01-01
People can use their web browser or mobile devices to access web services and applications built into these servers. Users have to input their identity and password to log in to the server, and these credentials may be appropriated by hackers when the network environment is not safe. A multiple secure authentication protocol can improve the security of the network environment: mobile devices can pass authentication messages through Wi-Fi or 3G networks to serve as a second communication channel. However, such protocols do not consider the number of messages transmitted, and excessive message transmission makes the messages easier for hackers to collect and decode. In this paper, we propose two schemes that allow the server to validate the user while reducing the number of messages by means of the XOR operation. Our schemes improve the security of the authentication protocol, and the experimental results show that the proposed protocols are more secure and effective. In applications of second authentication communication channels such as smart access control systems, identity identification, and E-wallets, our protocols can ensure the safety of people and property and achieve more effective security management mechanisms.
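The abstract does not give the schemes' details, but the general idea of using XOR to bind two channel nonces into a single verification message can be sketched as follows. This is a hypothetical illustration, not the authors' actual protocol; all names and the message layout are invented:

```python
import hashlib
import secrets

def h(data: bytes) -> bytes:
    """Hash used to derive the response from key material and nonces."""
    return hashlib.sha256(data).digest()

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def server_challenge():
    """Server issues one fresh nonce per channel (e.g., Wi-Fi and 3G)."""
    return secrets.token_bytes(32), secrets.token_bytes(32)

def user_response(key: bytes, n1: bytes, n2: bytes) -> bytes:
    """The user proves possession of the shared key with a single value
    that binds both channel nonces via XOR, instead of answering each
    channel separately -- fewer messages for hackers to collect."""
    return h(key + xor_bytes(n1, n2))

def server_verify(key: bytes, n1: bytes, n2: bytes, resp: bytes) -> bool:
    return resp == h(key + xor_bytes(n1, n2))
```

An attacker observing only one of the two channels sees a nonce but cannot reconstruct the XOR of both, which is the point of splitting the challenge across channels.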
Girls' relationship authenticity and self-esteem across adolescence.
Impett, Emily A; Sorsoli, Lynn; Schooler, Deborah; Henson, James M; Tolman, Deborah L
2008-05-01
Feminist psychologists have long posited that relationship authenticity (i.e., the congruence between what one thinks and feels and what one does and says in relational contexts) is integral to self-esteem and well-being. Guided by a feminist developmental framework, the authors investigated the role of relationship authenticity in promoting girls' self-esteem over the course of adolescence. Latent growth curve modeling was used to test the association between relationship authenticity and self-esteem with data from a 5-year, 3-wave longitudinal study of 183 adolescent girls. Results revealed that both relationship authenticity and self-esteem increased steadily in a linear fashion from the 8th to the 12th grade. Girls who scored high on the measure of relationship authenticity in the 8th grade experienced greater increases in self-esteem over the course of adolescence than girls who scored low on relationship authenticity. Further, girls who increased in authenticity also tended to increase in self-esteem over the course of adolescence. The importance of a feminist developmental framework for identifying and understanding salient dimensions of female adolescence is discussed. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
Frazier, Brandy N; Gelman, Susan A; Wilson, Alice; Hood, Bruce
2009-01-01
Authentic objects are those that have an historical link to a person, event, time, or place of some significance (e.g., original Picasso painting; gown worn by Princess Diana; your favorite baby blanket). The current study examines everyday beliefs about authentic objects, with three primary goals: to determine the scope of adults' evaluation of authentic objects, to examine such evaluation in two distinct cultural settings, and to determine whether a person's attachment history (i.e., whether or not they owned an attachment object as a child) predicts evaluation of authentic objects. We found that college students in the U.K. (N = 125) and U.S. (N = 119) consistently evaluate a broad range of authentic items as more valuable than matched control (inauthentic) objects, more desirable to keep, and more desirable to touch, though only non-personal authentic items were judged to be more appropriate for display in a museum. These patterns were remarkably similar across the two cultural contexts. Additionally, those who had an attachment object as a child evaluated objects more favorably, and in particular judged authentic objects to be more valuable. Altogether, these results demonstrate broad endorsement of "positive contagion" among college-educated adults.
Mishra, Raghavendra; Barnwal, Amit Kumar
2015-05-01
The Telecare medical information system (TMIS) provides effective healthcare delivery services by employing information and communication technologies. Privacy and security are always matters of great concern in TMIS. Recently, Chen et al. presented a password-based authentication scheme to address privacy and security; later, it was proved insecure against various active and passive attacks. To remedy the drawbacks of Chen et al.'s anonymous authentication scheme, several password-based authentication schemes have been proposed using public key cryptosystems. However, most of them do not provide pre-smart-card authentication, which leads to inefficient login and password change phases. We therefore present an improved anonymous smart-card-based authentication scheme for TMIS with pre-smart-card authentication. The proposed scheme protects user anonymity and satisfies all the desirable security attributes. Moreover, it offers efficient login and password change phases in which incorrect input is quickly detected and a user can freely change his password without server assistance. We demonstrate the validity of the proposed scheme using the widely accepted BAN (Burrows, Abadi, and Needham) logic, and the scheme is comparable to relevant schemes in terms of computational overhead.
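Pre-smart-card authentication means the card itself can reject a mistyped identity or password before any message reaches the server. A minimal sketch of that idea, assuming the card stores a hash-based verifier; this is an illustration of the concept, not Mishra and Barnwal's actual scheme:

```python
import hashlib

def h(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

class SmartCard:
    """Toy model of pre-smart-card (local) verification: the card holds a
    verifier derived from the identity and password, so incorrect input is
    detected on-card, before any login message is sent to the server."""

    def __init__(self, identity: str, password: str):
        self.verifier = h(identity + ":" + password)

    def local_check(self, identity: str, password: str) -> bool:
        return h(identity + ":" + password) == self.verifier

    def change_password(self, identity: str, old_pw: str, new_pw: str) -> bool:
        # The old password is verified locally; no server round trip needed.
        if not self.local_check(identity, old_pw):
            return False
        self.verifier = h(identity + ":" + new_pw)
        return True
```

A real scheme would salt and key this verifier with card secrets; the sketch only shows why the login and password change phases become efficient once the check happens locally.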
Spell, Rachelle M.; Guinan, Judith A.; Miller, Kristen R.; Beck, Christopher W.
2014-01-01
Incorporating authentic research experiences in introductory biology laboratory classes would greatly expand the number of students exposed to the excitement of discovery and the rigor of the scientific process. However, the essential components of an authentic research experience and the barriers to their implementation in laboratory classes are poorly defined. To guide future reform efforts in this area, we conducted a national survey of biology faculty members to determine 1) their definitions of authentic research experiences in laboratory classes, 2) the extent of authentic research experiences currently experienced in their laboratory classes, and 3) the barriers that prevent incorporation of authentic research experiences into these classes. Strikingly, the definitions of authentic research experiences differ among faculty members and tend to emphasize either the scientific process or the discovery of previously unknown data. The low level of authentic research experiences in introductory biology labs suggests that more development and support is needed to increase undergraduate exposure to research experiences. Faculty members did not cite several barriers commonly assumed to impair pedagogical reform; however, their responses suggest that expanded support for development of research experiences in laboratory classes could address the most common barrier. PMID:24591509
Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia
2018-01-01
Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, attribute-based signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional "encrypt-then-sign" or "sign-then-encrypt" strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and has recently been gaining attention. However, in many existing ABSC systems, the computation cost required of end users for signcryption and designcryption is linear in the complexity of the signing and encryption access policies. Moreover, previously proposed ABSC schemes rely on a single authority responsible for attribute management and key generation, whereas in reality different authorities typically monitor different attributes of a user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on ciphertext-policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven secure in the standard model and provides attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction balances the security goals with practical computational efficiency. PMID:29772840
Wang, Ting; Tan, Siow Ying; Mutilangi, William; Aykas, Didem P; Rodriguez-Saona, Luis E
2015-10-01
The objective of this study was to develop a simple and rapid method to differentiate whey protein types (WPC, WPI, and WPH) used for beverage manufacturing by combining the spectral signature collected from portable mid-infrared spectrometers and pattern recognition analysis. Whey protein powders from different suppliers are produced using a large number of processing and compositional variables, resulting in variation in composition, concentration, protein structure, and thus functionality. Whey protein powders including whey protein isolates, whey protein concentrates and whey protein hydrolysates were obtained from different suppliers and their spectra collected using portable mid-infrared spectrometers (single and triple reflection) by pressing the powder onto an Attenuated Total Reflectance (ATR) diamond crystal with a pressure clamp. Spectra were analyzed by soft independent modeling of class analogy (SIMCA) generating a classification model showing the ability to differentiate whey protein types by forming tight clusters with interclass distance values of >3, considered to be significantly different from each other. The major bands centered at 1640 and 1580 cm(-1) were responsible for separation and were associated with differences in amide I and amide II vibrations of proteins, respectively. Another important band in whey protein clustering was associated with carboxylate vibrations of acidic amino acids (∼1570 cm(-1)). The use of a portable mid-IR spectrometer combined with pattern recognition analysis showed potential for discriminating whey protein ingredients that can help to streamline the analytical procedure so that it is more applicable for field-based screening of ingredients. A rapid, simple and accurate method was developed to authenticate commercial whey protein products by using portable mid-infrared spectrometers combined with chemometrics, which could help ensure the functionality of whey protein ingredients in food applications. 
© 2015 Institute of Food Technologists®
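SIMCA fits a separate principal-component model per class and assigns a sample by its distance to each class model. As a much-simplified, hypothetical stand-in for that idea, the sketch below models each whey protein class by its mean spectrum and classifies a new spectrum by nearest distance; real SIMCA additionally uses PCA residuals and statistical class limits:

```python
import math

def mean_spectrum(spectra):
    """Average a list of spectra (equal-length lists of absorbances)."""
    n = len(spectra)
    return [sum(col) / n for col in zip(*spectra)]

def distance(a, b):
    """Euclidean distance between two spectra."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(spectrum, class_models):
    """class_models maps a label (e.g., 'WPC') to its mean spectrum;
    return the label of the nearest class model."""
    return min(class_models, key=lambda c: distance(spectrum, class_models[c]))
```

With training spectra for WPC, WPI, and WPH, `class_models` would hold one mean spectrum per whey protein type, and an unknown powder's ATR spectrum would be assigned to the closest class.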
Choi, Younsung; Nam, Junghyun; Lee, Donghoon; Kim, Jiye; Jung, Jaewook; Won, Dongho
2014-01-01
An anonymous user authentication scheme allows a user who wants to access a remote application server to achieve mutual authentication and session key establishment with the server in an anonymous manner. To enhance the security of such authentication schemes, recent research has combined the user's biometrics with a password. However, these authentication schemes are designed for a single-server environment, so when a user wants to access different application servers, the user has to register many times. To solve this problem, Chuang and Chen proposed an anonymous multiserver authenticated key agreement scheme using smart cards together with passwords and biometrics. Chuang and Chen claimed that their scheme not only supports multiple servers but also achieves various security requirements. However, we show that this scheme is vulnerable to a masquerade attack, a smart card attack, a user impersonation attack, and a DoS attack, and does not achieve perfect forward secrecy. We also propose a security-enhanced anonymous multiserver authenticated key agreement scheme that addresses all the weaknesses identified in Chuang and Chen's scheme.
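The general pattern underlying such schemes, mutual challenge-response authentication followed by session key derivation, can be sketched with a pre-shared key. This is a generic illustration, not Choi et al.'s multiserver scheme; the biometric, password, and smart card factors, as well as anonymity protections, are omitted:

```python
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    m = hashlib.sha256()
    for p in parts:
        m.update(p)
    return m.digest()

class Party:
    """A user or server holding a (supposedly shared) long-term key."""
    def __init__(self, key: bytes):
        self.key = key

def mutual_auth(user: Party, server: Party):
    """Both sides exchange nonces and keyed proofs; each verifies the
    other before deriving the session key. Returns the session key on
    success, or None if either side fails verification."""
    nu = secrets.token_bytes(16)                 # user -> server: nonce
    ns = secrets.token_bytes(16)                 # server -> user: nonce,
    s_proof = h(server.key, nu, ns, b"S")        # plus server's proof
    if s_proof != h(user.key, nu, ns, b"S"):     # user verifies server
        return None
    u_proof = h(user.key, nu, ns, b"U")          # user -> server: proof
    if u_proof != h(server.key, nu, ns, b"U"):   # server verifies user
        return None
    return h(user.key, nu, ns, b"SK")            # shared session key
```

Note that this pattern does not provide perfect forward secrecy (a leaked long-term key exposes past session keys), which is exactly the property the authors fault Chuang and Chen's scheme for lacking; schemes with forward secrecy typically add a Diffie-Hellman exchange.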