Yi, Faliu; Jeoung, Yousun; Moon, Inkyu
2017-05-20
In recent years, many studies have focused on authentication of two-dimensional (2D) images using double random phase encryption techniques. However, there has been little research on three-dimensional (3D) imaging systems, such as integral imaging, for 3D image authentication. We propose a 3D image authentication scheme based on a double random phase integral imaging method. All of the 2D elemental images captured through integral imaging are encrypted with a double random phase encoding algorithm, and only partial phase information is retained. All amplitude information and the remaining phase information in the encrypted elemental images are discarded. Nevertheless, we demonstrate that 3D images from integral imaging can be authenticated at different depths using a nonlinear correlation method. The proposed 3D image authentication algorithm provides enhanced information security because the 2D elemental images decrypted from the sparse phase cannot be easily recognized by the naked eye. Additionally, using sparse phase images without any amplitude information greatly reduces data storage costs and aids image compression and data transmission.
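The sparse-phase verification idea described above can be sketched numerically. In this minimal sketch, the mask generation, the 25% phase-retention rate, and the kth-law exponent are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encrypt(img, m1, m2):
    # Double random phase encoding: one phase mask in the spatial
    # domain, a second in the Fourier domain.
    return np.fft.ifft2(np.fft.fft2(img * m1) * m2)

def drpe_decrypt(enc, m1, m2):
    # Undo both phase masks (their moduli are 1, so conjugation inverts).
    return np.fft.ifft2(np.fft.fft2(enc) * np.conj(m2)) * np.conj(m1)

def ncc_peak(a, b, k=0.3):
    # kth-law nonlinear correlation: keep the phase, compress the
    # magnitude; return the peak-to-mean ratio of the correlation plane.
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    s = (np.abs(A) ** k) * np.exp(1j * np.angle(A)) \
      * (np.abs(B) ** k) * np.exp(-1j * np.angle(B))
    c = np.abs(np.fft.ifft2(s)) ** 2
    return c.max() / c.mean()

n = 64
img = rng.random((n, n))
m1 = np.exp(2j * np.pi * rng.random((n, n)))
m2 = np.exp(2j * np.pi * rng.random((n, n)))

enc = drpe_encrypt(img, m1, m2)
# Retain only a sparse subset of the encrypted phase; discard amplitude.
keep = rng.random((n, n)) < 0.25
sparse = np.where(keep, np.exp(1j * np.angle(enc)), 0)

rec = np.real(drpe_decrypt(sparse, m1, m2))  # noise-like to the eye
impostor = rng.random((n, n))
```

The decrypted image `rec` looks like noise, yet its nonlinear correlation peak against the true image stands well above its peak against an unrelated image, which is the authentication criterion.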
NASA Astrophysics Data System (ADS)
Lu, Dajiang; He, Wenqi; Liao, Meihua; Peng, Xiang
2017-02-01
A new method to eliminate the security risk of the well-known interference-based optical cryptosystem is proposed. In this method, which is suitable for security authentication applications, two phase-only masks are separately placed at different distances from the output plane, where a certification image (public image) can be obtained. To further increase the security and flexibility of this authentication system, we employ one more validation image (secret image), which can be observed at another output plane, to confirm the identity of the user. Only if the two correct masks are properly placed at their designated positions can one obtain the two significant images. Moreover, even if legal users exchange their masks (keys), the authentication process will fail and the authentication results will not reveal any information. Numerical simulations are performed to demonstrate the validity and security of the proposed method.
Two-level image authentication by two-step phase-shifting interferometry and compressive sensing
NASA Astrophysics Data System (ADS)
Zhang, Xue; Meng, Xiangfeng; Yin, Yongkai; Yang, Xiulun; Wang, Yurong; Li, Xianye; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi
2018-01-01
A two-level image authentication method is proposed; the method is based on two-step phase-shifting interferometry, double random phase encoding, and compressive sensing (CS) theory, by which the certification image can be encoded into two interferograms. Through discrete wavelet transform (DWT), sparseness processing, Arnold transform, and data compression, two compressed signals are generated and delivered to two different participants of the authentication system. Only the participant who possesses the first compressed signal can attempt the low-level authentication. Applying Orthogonal Matching Pursuit CS reconstruction, the inverse Arnold transform, the inverse DWT, two-step phase-shifting wavefront reconstruction, and the inverse Fresnel transform yields a remarkable peak at the central location of the nonlinear correlation coefficient distribution between the recovered image and the standard certification image. The other participant, who possesses the second compressed signal, is authorized to carry out the high-level authentication, in which both compressed signals are combined to reconstruct the original, meaningful certification image with a high correlation coefficient. Theoretical analysis and numerical simulations verify the feasibility of the proposed method.
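The Arnold transform step mentioned above is a simple, exactly invertible scrambler for square images. A minimal sketch follows; the iteration count and image size are arbitrary choices for illustration:

```python
import numpy as np

def arnold(img, iterations=1):
    # Arnold cat map scrambling for an N x N image:
    # (x, y) -> ((x + y) mod N, (x + 2y) mod N), a bijection on the grid.
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        scr = np.empty_like(out)
        scr[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scr
    return out

def inverse_arnold(img, iterations=1):
    # Inverse map: (x, y) -> ((2x - y) mod N, (y - x) mod N).
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        scr = np.empty_like(out)
        scr[(2 * x - y) % n, (y - x) % n] = out[x, y]
        out = scr
    return out

rng = np.random.default_rng(6)
img = rng.integers(0, 256, (8, 8))
scrambled = arnold(img, 5)
```

Because the map is a permutation of pixel positions, scrambling preserves the pixel histogram exactly and the inverse recovers the image bit for bit.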
Passive detection of copy-move forgery in digital images: state-of-the-art.
Al-Qershi, Osamah M; Khoo, Bee Ee
2013-09-10
Currently, digital images and videos are highly important because they have become the main carriers of information. However, the relative ease of tampering with images and videos makes their authenticity questionable. Digital image forensics addresses the problem of authenticating images or their origins. One main branch of image forensics is passive image forgery detection. Images can be forged using different techniques, and the most common forgery is copy-move, in which a region of an image is duplicated and placed elsewhere in the same image. Active techniques, such as watermarking, have been proposed to solve the image authenticity problem, but they have limitations because they require human intervention or specially equipped cameras. To overcome these limitations, several passive authentication methods have been proposed. In contrast to active methods, passive methods do not require any prior information about the image; instead, they exploit specific detectable changes that forgeries introduce into the image. In this paper, we describe the current state of the art in passive copy-move forgery detection. The key open issues in developing a robust copy-move forgery detector are then identified, and trends in tackling those issues are addressed.
Optical multiple-image authentication based on cascaded phase filtering structure
NASA Astrophysics Data System (ADS)
Wang, Q.; Alfalou, A.; Brosseau, C.
2016-10-01
In this study, we report on recent developments in optical image authentication algorithms. Compared with conventional optical encryption, optical image authentication offers greater security because such methods do not require full recovery of the plaintext during the decryption stage. Several recently proposed authentication systems are briefly introduced. We also propose a novel multiple-image authentication system, in which multiple original images are encoded into a single photon-limited encoded image by using a triple-plane phase retrieval algorithm and the photon counting imaging (PCI) technique. Even with the correct keys, one can only recover a noise-like image. To verify the authenticity of the multiple images, a nonlinear fractional correlation is employed to recognize the original information hidden in the decrypted results. The proposal can be implemented optically using a cascaded phase filtering configuration. Computer simulation results are presented to evaluate the performance and effectiveness of this proposal.
Spectroscopically Enhanced Method and System for Multi-Factor Biometric Authentication
NASA Astrophysics Data System (ADS)
Pishva, Davar
This paper proposes a spectroscopic method and system for preventing spoofing of biometric authentication. One of its focuses is to enhance biometric authentication with a spectroscopic method in a multifactor manner, such that a person's unique ‘spectral signatures’ or ‘spectral factors’ are recorded and compared in addition to a non-spectroscopic biometric signature, reducing the likelihood of an imposter being authenticated. Using the ‘spectral factors’ extracted from reflectance spectra of real fingers and employing cluster analysis, it shows how an authentic fingerprint image presented by a real finger can be distinguished from an authentic fingerprint image embossed on an artificial finger, or molded on a fingertip cover worn by an imposter. This paper also shows how to augment two widely used biometric systems (fingerprint and iris recognition devices) with spectral biometric capabilities in a practical manner, without creating much overhead or inconveniencing their users.
Integrated circuit authentication using photon-limited x-ray microscopy.
Markman, Adam; Javidi, Bahram
2016-07-15
A counterfeit integrated circuit (IC) may contain subtle changes to its circuit configuration. These changes may be observed when the IC is imaged using x-rays; however, the energy from the x-rays can potentially damage the IC. We have investigated a technique to authenticate ICs under photon-limited x-ray imaging. We modeled a lower-energy x-ray image by generating a photon-limited image from a real x-ray image using a weighted photon-counting method. We performed feature extraction on the image using the speeded-up robust features (SURF) algorithm. We then authenticated the IC by comparing the SURF features to a database of SURF features from authentic and counterfeit ICs. Our experimental results with real and counterfeit ICs using an x-ray microscope demonstrate that we can correctly authenticate an IC image captured using orders-of-magnitude lower-energy x-rays. To the best of our knowledge, this Letter is the first to use a photon-counting x-ray imaging model and relevant algorithms to authenticate ICs while avoiding potential damage.
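A rough numerical sketch of the photon-limited imaging model follows. The fixed photon budget and per-pixel Poisson arrivals are common modeling assumptions; the Letter's exact weighting may differ:

```python
import numpy as np

def photon_limited(img, n_photons, rng):
    # Photon-counting model: distribute a fixed photon budget over the
    # image, with per-pixel Poisson arrivals weighted by the pixel's
    # share of the total intensity.
    p = img / img.sum()
    return rng.poisson(n_photons * p)

rng = np.random.default_rng(1)
img = rng.random((32, 32)) + 0.1     # stand-in for a real x-ray image
sparse = photon_limited(img, 500, rng)
```

With a budget of 500 photons over 1024 pixels, most pixels record zero counts; feature extraction and matching must therefore work on a very sparse version of the scene.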
A multispectral photon-counting double random phase encoding scheme for image authentication.
Yi, Faliu; Moon, Inkyu; Lee, Yeon H
2014-05-20
In this paper, we propose a new method for color image-based authentication that combines multispectral photon-counting imaging (MPCI) and double random phase encoding (DRPE) schemes. The sparsely distributed information from MPCI and the stationary white noise signal from DRPE make intruder attacks difficult. In this authentication method, the original multispectral RGB color image is down-sampled into a Bayer image. The three types of color samples (red, green and blue color) in the Bayer image are encrypted with DRPE and the amplitude part of the resulting image is photon counted. The corresponding phase information that has nonzero amplitude after photon counting is then kept for decryption. Experimental results show that the retrieved images from the proposed method do not visually resemble their original counterparts. Nevertheless, the original color image can be efficiently verified with statistical nonlinear correlations. Our experimental results also show that different interpolation algorithms applied to Bayer images result in different verification effects for multispectral RGB color images.
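The Bayer down-sampling step in the pipeline above can be sketched as follows. An RGGB mosaic layout is assumed here for illustration; the paper does not fix the mosaic order:

```python
import numpy as np

def to_bayer(rgb):
    # Sample an RGB image onto an RGGB Bayer mosaic: one color value
    # per pixel, following the 2x2 pattern [R G; G B].
    h, w, _ = rgb.shape
    bayer = np.empty((h, w), dtype=rgb.dtype)
    bayer[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    bayer[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites (even rows)
    bayer[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites (odd rows)
    bayer[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return bayer

rng = np.random.default_rng(7)
rgb = rng.random((8, 8, 3))
bayer = to_bayer(rgb)
```

The mosaic carries one third of the original data; the choice of interpolation used to re-expand it to three channels is exactly what the paper reports as affecting verification quality.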
Analog Video Authentication and Seal Verification Equipment Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregory Lancaster
Under contract to the US Department of Energy in support of arms control treaty verification activities, the Savannah River National Laboratory, in conjunction with the Pacific Northwest National Laboratory, the Idaho National Laboratory, and Milagro Consulting, LLC, developed equipment for use within a chain of custody regime. This paper discusses two specific devices: the Authentication Through the Lens (ATL) analog video authentication system and a photographic multi-seal reader. Both of these devices have been demonstrated in a field trial, and the experience gained throughout is also discussed. Typically, cryptographic methods are used to prove the authenticity of digital images and video used in arms control chain of custody applications. However, in some applications analog cameras are used. Since cryptographic authentication methods will not work on analog video streams, a simple method of authenticating analog video was developed and tested. A photographic multi-seal reader was developed to image different types of visual unique identifiers for use in chain of custody and authentication activities. This seal reader is unique in its ability to image various types of seals, including the Cobra Seal, Reflective Particle Tags, and adhesive seals. Flicker comparison is used to compare before and after images collected with the seal reader in order to detect tampering and verify the integrity of the seal.
Discussion and a new method of optical cryptosystem based on interference
NASA Astrophysics Data System (ADS)
Lu, Dajiang; He, Wenqi; Liao, Meihua; Peng, Xiang
2017-02-01
A discussion and an objective security analysis of the well-known interference-based optical image encryption scheme are presented in this paper. A new method is also proposed to eliminate the security risk of the original cryptosystem. For practical application, we extend this new method into a hierarchical authentication scheme. In this authentication system, with a pre-generated and fixed random phase lock, different target images indicating different authentication levels are analytically encoded into corresponding phase-only masks (phase keys) and amplitude-only masks (amplitude keys). During authentication, a legal user can obtain a specified target image at the output plane if his/her phase key and amplitude key, which should be placed close against the fixed internal phase lock, are respectively illuminated by two coherent beams. By comparing the target image with all the standard certification images in the database, the system can verify not only the user's legality but also his/her identity level. Moreover, although the internal phase lock of this system is fixed, the crosstalk between different pairs of keys held by different users is low. Theoretical analysis and numerical simulation are both provided to demonstrate the validity of this method.
Fan, Desheng; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Pan, Xuemei; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi
2015-04-10
A multiple-image authentication method with a cascaded multilevel architecture in the Fresnel domain is proposed. A synthetic encoded complex amplitude is first fabricated: its real amplitude component is generated by iterative amplitude encoding, random sampling, and space multiplexing of the low-level certification images, while its phase component is constructed by iterative phase information encoding and multiplexing of the high-level certification images. The synthetic encoded complex amplitude is then iteratively encoded into two phase-type ciphertexts located in two different transform planes. During high-level authentication, when the two phase-type ciphertexts and the high-level decryption key are presented to the system and the Fresnel transform is carried out, a meaningful image with good quality and a high correlation coefficient with the original certification image can be recovered in the output plane. In low-level authentication with a low-level decryption key, no significant or meaningful information is retrieved, but a remarkable peak appears in the nonlinear correlation coefficient between the output image and the corresponding original certification image. The method therefore realizes different levels of access to the original certification images for different authority levels within the same cascaded multilevel architecture.
Image multiplexing and authentication based on double phase retrieval in Fresnel transform domain
NASA Astrophysics Data System (ADS)
Chang, Hsuan-Ting; Lin, Che-Hsian; Chen, Chien-Yue
2017-04-01
An image multiplexing and authentication method based on the double-phase retrieval algorithm (DPRA) with the manipulations of wavelength and position in the Fresnel transform (FrT) domain is proposed in this study. The DPRA generates two matched phase-only functions (POFs) in the different planes so that the corresponding image can be reconstructed at the output plane. Given a number of target images, all the sets of matched POFs are used to generate the phase-locked system through the phase modulation and synthesis to achieve the multiplexing purpose. To reconstruct a target image, the corresponding phase key and all the correct parameters in the FrT are required. Therefore, the authentication system with high-level security can be achieved. The computer simulation verifies the validity of the proposed method and also shows good resistance to the crosstalk among the reconstructed images.
Digital Camera with Apparatus for Authentication of Images Produced from an Image File
NASA Technical Reports Server (NTRS)
Friedman, Gary L. (Inventor)
1996-01-01
A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.
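The camera's hash-then-sign flow can be illustrated with textbook RSA. The tiny key below is purely illustrative; a real device would embed a full-size (e.g., 2048-bit) private key and use a standardized signature scheme such as RSA-PSS:

```python
import hashlib

# Toy RSA key, for illustration only.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))        # private exponent (Python 3.8+)

def image_hash(image_bytes):
    # Hash of the image file, reduced into the toy key's range.
    return int.from_bytes(hashlib.sha256(image_bytes).digest(), "big") % n

def sign(image_bytes):
    # Camera side: encrypt the image hash with the private key,
    # producing the digital signature stored alongside the image file.
    return pow(image_hash(image_bytes), d, n)

def verify(image_bytes, signature):
    # Verifier side: decrypt the signature with the public key (n, e)
    # and compare against a freshly computed hash of the image file.
    return pow(signature, e, n) == image_hash(image_bytes)

sig = sign(b"image file contents")
print(verify(b"image file contents", sig))   # True for the intact file
```

Because only the camera holds the private exponent, a matching hash proves both that the file is unaltered and that this particular camera produced the signature.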
Wang, Xiaogang; Chen, Wen; Chen, Xudong
2015-03-09
In this paper, we develop a new optical information authentication system based on compressed double-random-phase-encoded images and quick-response (QR) codes, where the parameters of the optical lightwave are used as keys for optical decryption and the QR code is a key for verification. An input image attached with a QR code is first optically encoded in a simplified double random phase encoding (DRPE) scheme without an interferometric setup. From the single encoded intensity pattern recorded by a CCD camera, a compressed double-random-phase-encoded image, i.e., the sparse phase distribution used for optical decryption, is generated by using an iterative phase retrieval technique with the QR code. We compare this technique with two other methods proposed in the literature: Fresnel-domain information authentication based on classical DRPE with holographic techniques, and information authentication based on DRPE and a phase retrieval algorithm. Simulation results show that QR codes are effective in improving the security and data sparsity of optical information encryption and authentication systems.
Medical Image Tamper Detection Based on Passive Image Authentication.
Ulutas, Guzin; Ustubioglu, Arda; Ustubioglu, Beste; Nabiyev, Vasif V.; Ulutas, Mustafa
2017-12-01
Telemedicine has gained popularity in recent years. Medical images can be transferred over the Internet to enable telediagnosis between medical staff and to make a patient's history accessible to medical staff from anywhere. Integrity protection of medical images is therefore a serious concern due to the broadcast nature of the Internet. Some watermarking techniques have been proposed to control the integrity of medical images. However, they require embedding extra information (a watermark) into the image before transmission, which decreases the visual quality of the medical image and can cause false diagnosis. The proposed method uses a passive image authentication mechanism to detect tampered regions in medical images. Structural texture information is obtained from the medical image using rotation-invariant local binary patterns (LBPROT) to make keypoint extraction more successful. Keypoints on the texture image are obtained with the scale invariant feature transform (SIFT). Tampered regions are detected by matching the keypoints. The method improves on keypoint-based passive image authentication mechanisms, which fail to detect tampering when a smooth region is used to cover an object, by applying LBPROT before keypoint extraction, because smooth regions also carry texture information. Experimental results show that the method detects tampered regions in medical images even if the forged image has undergone attacks (Gaussian blurring or additive white Gaussian noise) or the forged regions are scaled or rotated before pasting.
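The LBPROT texture step can be sketched as follows. This is a minimal 8-neighbour version with a minimal-rotation code mapping; the paper's sampling radius and parameters may differ:

```python
import numpy as np

def lbp_rot_invariant(img):
    # 8-neighbour local binary pattern: each interior pixel gets an
    # 8-bit code from thresholding its neighbours (in circular order)
    # against the centre, then each code is replaced by the minimum
    # over all circular bit rotations (LBPROT), making it invariant
    # to image rotation.
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    center = img[1:-1, 1:-1]
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offs):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        out |= (nb >= center).astype(np.uint8) << bit

    def rot_min(c):
        return min(((c >> r) | (c << (8 - r))) & 0xFF for r in range(8))

    table = np.array([rot_min(c) for c in range(256)], dtype=np.uint8)
    return table[out]

rng = np.random.default_rng(3)
texture = rng.integers(0, 256, (16, 16)).astype(np.uint8)
codes = lbp_rot_invariant(texture)
```

Rotating the input by 90 degrees circularly shifts every code by two bit positions, so the rotation-minimal codes, and hence their histogram, are unchanged.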
Lee, Jaekwon; Moon, Seunghwan; Lim, Juhun; Gwak, Min-Joo; Kim, Jae Gwan; Chung, Euiheon; Lee, Jong-Hyun
2017-04-22
A new authentication method employing a laser and a scanner is proposed to improve image contrast of the finger vein and to extract blood flow pattern for liveness detection. A micromirror reflects a laser beam and performs a uniform raster scan. Transmissive vein images were obtained, and compared with those of an LED. Blood flow patterns were also obtained based on speckle images in perfusion and occlusion. Curvature ratios of the finger vein and blood flow intensities were found to be nearly constant, regardless of the vein size, which validated the high repeatability of this scheme for identity authentication with anti-spoofing.
Seppänen, Tapio
2017-01-01
Fourier transform infrared (FTIR) microspectroscopy images contain information from the whole infrared spectrum used for microspectroscopic analyses. In combination with the FTIR image, visible light images are used to depict the area from which the FTIR spectral image was sampled. These two images are traditionally acquired as separate files. This paper proposes a histogram shifting-based data hiding technique to embed visible light images in FTIR spectral images, producing single entities. The primary objective is to improve data management efficiency. Secondary objectives are confidentiality, availability, and reliability. Since the integrity of biomedical data is vital, the proposed method applies reversible data hiding: after extraction of the embedded data, the FTIR image is reversed to its original state. Furthermore, the proposed method applies authentication tags generated with keyed Hash-Based Message Authentication Codes (HMAC) to detect tampered or corrupted areas of FTIR images. The experimental results show that the FTIR spectral images carrying the payload maintain good perceptual fidelity and that the payload can be reliably recovered even after bit-flipping or cropping attacks. It has also been shown that extraction successfully removes all modifications caused by the payload. Finally, the authentication tags successfully indicated tampered FTIR image areas.
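The reversible histogram-shifting embedding at the heart of the method can be sketched as follows. This single peak/zero pair, raster-order version is a simplification; the paper's block-wise processing and HMAC tagging are omitted:

```python
import numpy as np

def hs_embed(img, bits):
    # Histogram shifting: find the histogram peak p and the nearest
    # empty bin z above it, shift bins strictly between p and z up by
    # one to vacate p + 1, then embed one bit at each peak-valued
    # pixel (bit 0 -> p, bit 1 -> p + 1).
    hist = np.bincount(img.ravel(), minlength=256)
    p = int(hist.argmax())
    zeros = np.flatnonzero(hist == 0)
    z = int(zeros[zeros > p][0])          # assumes an empty bin above p
    out = img.copy()
    out[(out > p) & (out < z)] += 1
    idx = np.flatnonzero(img.ravel() == p)
    assert len(bits) <= len(idx), "payload exceeds embedding capacity"
    flat = out.ravel()
    flat[idx[:len(bits)]] += np.asarray(bits, dtype=out.dtype)
    return flat.reshape(img.shape), p, z

def hs_extract(marked, p, z, n_bits):
    # Read bits back from the p / p + 1 pixels, then undo the shift,
    # restoring the cover image exactly (reversible data hiding).
    flat = marked.ravel().copy()
    idx = np.flatnonzero((flat == p) | (flat == p + 1))
    bits = (flat[idx[:n_bits]] == p + 1).astype(int).tolist()
    flat[idx[:n_bits]] = p
    flat[(flat > p + 1) & (flat <= z)] -= 1
    return flat.reshape(marked.shape), bits

rng = np.random.default_rng(4)
cover = rng.integers(0, 100, (32, 32)).astype(np.uint8)
payload = [1, 0, 1, 1, 0, 0, 1, 0]
marked, p, z = hs_embed(cover, payload)
restored, recovered = hs_extract(marked, p, z, len(payload))
```

Every pixel moves by at most one gray level while marked, and extraction returns the cover image bit for bit, which is exactly the reversibility property the paper relies on for biomedical data.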
Multi-image encryption based on synchronization of chaotic lasers and iris authentication
NASA Astrophysics Data System (ADS)
Banerjee, Santo; Mukhopadhyay, Sumona; Rondoni, Lamberto
2012-07-01
A new technique for transmitting encrypted combinations of gray-scale and chromatic images using chaotic lasers derived from the Maxwell-Bloch equations is proposed. This novel scheme utilizes the general method of solution of a set of linear equations to transmit similarly sized heterogeneous images that combine monochrome and chromatic images. The chaos-encrypted gray-scale images are concatenated along the three color planes, resulting in color images. These are then transmitted over a secure channel along with a cover image, which is an iris scan. The entire cryptosystem is augmented with an iris-based authentication scheme; the secret messages are retrieved once authentication succeeds. The objectives of our work are briefly outlined as follows: (a) the biometric information is the iris, which is encrypted before transmission; (b) the iris is used for personal identification and for verifying message integrity; (c) the information transmitted securely consists of colored images resulting from a combination of gray images; (d) each of the transmitted images is encrypted through chaos-based cryptography; (e) these encrypted multiple images are then coupled with the iris through a linear combination of images before being communicated over the network. The several layers of encryption, together with the ergodicity and randomness of chaos, provide enough confusion and diffusion to guarantee secure communication, as demonstrated by exhaustive statistical tests. The result is vital from the perspective of opening a fundamentally new dimension in the multiplexing and simultaneous transmission of several monochromatic and chromatic images along with biometry-based authentication and cryptography.
Image authentication using distributed source coding.
Lin, Yao-Chung; Varodayan, David; Girod, Bernd
2012-01-01
We present a novel approach using distributed source coding for image authentication. The key idea is to provide a Slepian-Wolf encoded quantized image projection as authentication data. This version can be correctly decoded with the help of an authentic image as side information. Distributed source coding provides the desired robustness against legitimate variations while detecting illegitimate modification. The decoder incorporating expectation maximization algorithms can authenticate images which have undergone contrast, brightness, and affine warping adjustments. Our authentication system also offers tampering localization by using the sum-product algorithm.
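The Slepian-Wolf machinery is beyond a short sketch, but the core idea, quantized image projections serving as authentication data that decode correctly only against a legitimate image, can be caricatured as follows. The projection count, quantization step, and decision rule below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def auth_data(img, proj, step=1.0):
    # Authentication data: coarsely quantized pseudo-random projections
    # of the image (a toy stand-in for the Slepian-Wolf encoded sketch).
    return np.round(proj @ img.ravel() / step)

def authenticate(candidate, data, proj, step=1.0):
    # "Decode" using the candidate image as side information: a
    # legitimate image stays within half a quantization step of the
    # stored data on average; a tampered one drifts many steps away.
    y = proj @ candidate.ravel() / step
    return float(np.mean(np.abs(y - data))) < 0.5

img = rng.random((32, 32))
proj = rng.choice([-1.0, 1.0], size=(20, img.size))  # +/-1 projections
data = auth_data(img, proj)

tampered = img.copy()
tampered[8:24, 8:24] = 0.0       # simulate a pasted-over region
```

The toy version captures the robustness/sensitivity trade-off: the tolerance band absorbs small legitimate variations, while a replaced region perturbs every projection by far more than one quantization step.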
NASA Astrophysics Data System (ADS)
Salvemini, Filomena; Grazzi, Francesco; Kardjilov, Nikolay; Wieder, Frank; Manke, Ingo; Edge, David; Williams, Alan; Zoppi, Marco
2017-05-01
Non-invasive experimental methods play an important role in the field of cultural heritage. Benefiting from technical progress in recent years, neutron imaging has been demonstrated to effectively complement studies based on surface analysis, allowing for a non-invasive characterization of the whole three-dimensional volume. This study focuses on a kris and a kanjar, two weapons from ancient Asia, to show the potential of the combined use of X-ray and neutron imaging techniques for the characterization of manufacturing methods and the authentication of objects of cultural and historical interest.
Personal authentication using hand vein triangulation and knuckle shape.
Kumar, Ajay; Prathyusha, K Venkata
2009-09-01
This paper presents a new approach to authenticating individuals using triangulation of hand vein images and simultaneous extraction of knuckle shape information. The proposed method is fully automated and employs palm dorsal hand vein images acquired with low-cost, near-infrared, contactless imaging. The knuckle tips are used as key points for image normalization and extraction of the region of interest. The matching scores are generated in two parallel stages: (i) a hierarchical matching score from the four topologies of triangulation in the binarized vein structures, and (ii) a score from the geometrical features consisting of knuckle point perimeter distances in the acquired images. The weighted score-level combination of these two matching scores is used to authenticate individuals. The experimental results from the proposed system using contactless palm dorsal hand vein images are promising (equal error rate of 1.14%) and suggest a more user-friendly alternative for user identification.
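The two-stage score fusion and its equal-error-rate evaluation can be sketched generically. The weight and the toy score lists below are invented; the paper's weighting is tuned to its own data:

```python
import numpy as np

def fuse(vein_scores, knuckle_scores, w=0.6):
    # Weighted score-level combination of the two matchers.
    return w * np.asarray(vein_scores) + (1 - w) * np.asarray(knuckle_scores)

def equal_error_rate(genuine, impostor):
    # Sweep decision thresholds; the EER is where the false accept
    # rate (impostors passing) meets the false reject rate (genuine
    # users failing).
    genuine, impostor = np.asarray(genuine), np.asarray(impostor)
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])
    frr = np.array([(genuine < t).mean() for t in thresholds])
    i = int(np.argmin(np.abs(far - frr)))
    return (far[i] + frr[i]) / 2

# Toy, well-separated matcher scores: fusion keeps the classes apart.
genuine = fuse([0.90, 0.85, 0.95], [0.80, 0.90, 0.85])
impostor = fuse([0.10, 0.25, 0.15], [0.20, 0.10, 0.30])
```

With cleanly separated score distributions the fused EER is zero; the paper's 1.14% figure reflects the residual overlap on real data.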
A nuclear method to authenticate Buddha images
NASA Astrophysics Data System (ADS)
Khaweerat, S.; Ratanatongchai, W.; Channuie, J.; Wonglee, S.; Picha, R.; Promping, J.; Silva, K.; Liamsuwan, T.
2015-05-01
The value of Buddha images in Thailand varies dramatically depending on authentication and provenance. In general, people use their individual skills to make this judgment, which frequently leads to obscurity, deception, and illegal activities. Here, we propose two non-destructive techniques, neutron radiography (NR) and neutron activation autoradiography (NAAR), to reveal, respectively, the structural and elemental profiles of small Buddha images. For NR, a thermal neutron flux of 10⁵ n cm⁻² s⁻¹ was applied. NAAR needed a higher neutron flux of 10¹² n cm⁻² s⁻¹ to activate the samples. Results from NR and NAAR revealed unique characteristics of the samples. Similarity of the profiles played a key role in the classification of the samples. The results provided visual evidence to enhance the reliability of authenticity approval. The method can be further developed for routine practice, which would impact thousands of customers in Thailand.
New secure communication-layer standard for medical image management (ISCL)
NASA Astrophysics Data System (ADS)
Kita, Kouichi; Nohara, Takashi; Hosoba, Minoru; Yachida, Masuyoshi; Yamaguchi, Masahiro; Ohyama, Nagaaki
1999-07-01
This paper introduces a summary of the draft standard ISCL 1.00, which will be published officially by MEDIS-DC. ISCL is an abbreviation of Integrated Secure Communication Layer Protocols for Secure Medical Image Management Systems. ISCL is a security layer that manages security functions between the presentation layer and the TCP/IP layer. The ISCL mechanism depends on the basic functions of a smart IC card and a symmetric secret key mechanism. A symmetric key for each session is generated by the internal authentication function of a smart IC card with a random number. ISCL has three functions, which assure authentication, confidentiality, and integrity. Entity authentication is performed through a 3-path, 4-way method using the internal and external authentication functions of a smart IC card. The confidentiality algorithm and the MAC algorithm for integrity are selectable. ISCL protocols communicate through Message Blocks, which consist of a Message Header and Message Data. ISCL protocols are being evaluated by applying them to a regional collaboration system for image diagnosis and an on-line secure electronic storage system for medical images. These projects are supported by the Medical Information System Development Center and show that ISCL is useful for maintaining security.
NASA Astrophysics Data System (ADS)
Lee, Jasper C.; Ma, Kevin C.; Liu, Brent J.
2008-03-01
A Data Grid for medical images has been developed at the Image Processing and Informatics Laboratory, USC, to provide distribution and fault-tolerant storage of medical imaging studies across Internet2 and the public domain. Although back-up policies and grid certificates guarantee privacy and authenticity of grid access points, there is still no method to guarantee that sensitive DICOM images have not been altered or corrupted during transmission across a public domain. This paper takes steps toward achieving full image transfer security within the Data Grid by utilizing DICOM image authentication and a HIPAA-compliant auditing system. The 3-D lossless digital signature embedding procedure involves a private 64-byte signature that is embedded into each original DICOM image volume; on the receiving end, the signature can be extracted and verified following the DICOM transmission. This digital signature method has also been developed at the IPILab. The HIPAA-Compliant Auditing System (H-CAS) is required to monitor embedding and verification events, and allows monitoring of other grid activity as well. The H-CAS system federates the logs of transmission and authentication events at each grid access point and stores them in a HIPAA-compliant database. The auditing toolkit is installed at the local grid access point and utilizes Syslog [1], a client-server standard for log messaging over an IP network, to send messages to the H-CAS centralized database. By integrating digital image signatures and centralized logging capabilities, DICOM image integrity within the Medical Imaging and Informatics Data Grid can be monitored and guaranteed without any loss of image quality.
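The signature-embedding step can be sketched with plain LSB substitution. This is a simplified, non-lossless stand-in for illustration; the cited 3-D lossless scheme is more involved:

```python
import numpy as np

def embed_signature(volume, signature):
    # Write the signature bits into the least significant bits of the
    # first len(signature) * 8 voxels of the image volume.
    bits = np.unpackbits(np.frombuffer(signature, dtype=np.uint8))
    flat = volume.ravel().copy()
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(volume.shape)

def extract_signature(volume, n_bytes):
    # Recover the signature from the voxel LSBs on the receiving end.
    bits = volume.ravel()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

rng = np.random.default_rng(5)
volume = rng.integers(0, 256, (4, 16, 16)).astype(np.uint8)  # toy volume
signature = bytes(range(64))                                  # 64-byte signature
stego = embed_signature(volume, signature)
```

Each carrier voxel changes by at most one gray level, and the verifier recovers the 64-byte signature exactly; a lossless scheme would additionally restore the carrier voxels to their original values.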
Localized lossless authentication watermark (LAW)
NASA Astrophysics Data System (ADS)
Celik, Mehmet U.; Sharma, Gaurav; Tekalp, A. Murat; Saber, Eli S.
2003-06-01
A novel framework is proposed for lossless authentication watermarking of images which allows authentication and recovery of original images without any distortions. This overcomes a significant limitation of traditional authentication watermarks that irreversibly alter image data in the process of watermarking and authenticate the watermarked image rather than the original. In particular, authenticity is verified before full reconstruction of the original image, whose integrity is inferred from the reversibility of the watermarking procedure. This reduces computational requirements in situations when either the verification step fails or the zero-distortion reconstruction is not required. A particular instantiation of the framework is implemented using a hierarchical authentication scheme and the lossless generalized-LSB data embedding mechanism. The resulting algorithm, called localized lossless authentication watermark (LAW), can localize tampered regions of the image; has a low embedding distortion, which can be removed entirely if necessary; and supports public/private key authentication and recovery options. The effectiveness of the framework and the instantiation is demonstrated through examples.
Optimization of illuminating system to detect optical properties inside a finger
NASA Astrophysics Data System (ADS)
Sano, Emiko; Shikai, Masahiro; Shiratsuki, Akihide; Maeda, Takuji; Matsushita, Masahito; Sasakawa, Koichi
2007-01-01
Biometrics performs personal authentication using individual bodily features, including fingerprints, faces, etc. These technologies have been studied and developed for many years. In particular, fingerprint authentication has evolved over many years, and fingerprinting is currently one of the world's most established biometric authentication techniques. Not long ago this technique was only used for personal identification in criminal investigations and high-security facilities. In recent years, however, various biometric authentication techniques have appeared in everyday applications. While providing great convenience, they have also produced a number of technical issues concerning operation. Generally, fingerprint authentication comprises a number of component technologies: (1) sensing technology for detecting the fingerprint pattern; (2) image processing technology for converting the captured pattern into feature data that can be used for verification; (3) verification technology for comparing the feature data with a reference and determining whether it matches. Current fingerprint authentication issues, revealed in research results, originate with fingerprint sensing technology. Sensing methods for detecting a person's fingerprint pattern for image processing are particularly important because they impact overall fingerprint authentication performance. The current problems with sensing methods are as follows: some fingers' fingerprints are difficult for conventional sensors to detect, and fingerprint patterns are easily affected by the finger's surface condition, so that noise such as discontinuities and thin spots can appear in patterns obtained from wrinkled or sweaty fingers. To address these problems, we propose a novel fingerprint sensor based on new scientific knowledge.
A characteristic of this new method is that the obtained fingerprint patterns are not easily affected by the finger's surface condition, because the fingerprint pattern is detected inside the finger using transmitted light. We examined the optimization of the illumination system of this novel fingerprint sensor to obtain high-contrast fingerprint patterns over a wide area and to improve the image processing in step (2).
An authenticated image encryption scheme based on chaotic maps and memory cellular automata
NASA Astrophysics Data System (ADS)
Bakhshandeh, Atieh; Eslami, Ziba
2013-06-01
This paper introduces a new image encryption scheme based on chaotic maps, cellular automata and a permutation-diffusion architecture. In the permutation phase, a piecewise linear chaotic map is utilized to confuse the plain-image, and in the diffusion phase, we employ the Logistic map as well as reversible memory cellular automata to obtain an efficient and secure cryptosystem. The proposed method admits advantages such as a highly secure diffusion mechanism, computational efficiency and ease of implementation. A novel property of the proposed scheme is its authentication ability, which can detect whether the image has been tampered with during transmission. This is particularly important in applications where the image data, or part of it, contains highly sensitive information. Results of various analyses manifest the high security of this new method and its capability for practical image encryption.
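A toy sketch of the permutation-diffusion idea using only the Logistic map (the paper additionally uses a piecewise linear chaotic map and reversible memory cellular automata, both omitted here):

```python
import numpy as np

def logistic_sequence(x0: float, n: int, r: float = 3.99) -> np.ndarray:
    # Iterate the Logistic map x -> r*x*(1-x); chaotic for r near 4.
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def encrypt(img: np.ndarray, key: float = 0.3456) -> np.ndarray:
    flat = img.ravel().astype(np.uint8)
    seq = logistic_sequence(key, flat.size)
    perm = np.argsort(seq)                      # confusion: chaotic permutation
    keystream = (seq * 256).astype(np.uint8)    # diffusion keystream
    return (flat[perm] ^ keystream).reshape(img.shape)

def decrypt(cipher: np.ndarray, key: float = 0.3456) -> np.ndarray:
    flat = cipher.ravel()
    seq = logistic_sequence(key, flat.size)
    perm = np.argsort(seq)
    inv = np.empty_like(perm)
    inv[perm] = np.arange(perm.size)            # invert the permutation
    keystream = (seq * 256).astype(np.uint8)
    return (flat ^ keystream)[inv].reshape(cipher.shape)
```

The initial condition of the map plays the role of the secret key: a slightly different key yields an entirely different permutation and keystream.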
Khalil, Mohammed S.; Kurniawan, Fajri; Khan, Muhammad Khurram; Alginahi, Yasser M.
2014-01-01
This paper presents a novel watermarking method to facilitate the authentication and detection of image forgery in Quran images. A two-layer embedding scheme in the wavelet and spatial domains is introduced to enhance the sensitivity of the fragile watermark and to defend against attacks. A discrete wavelet transform is applied to decompose the host image into wavelet subbands prior to embedding the watermark in the wavelet domain. The watermarked wavelet coefficients are inverted back to the spatial domain, and the least significant bits are then utilized to hide another watermark. A chaotic map is utilized to scramble the watermark to make it secure against local attacks. The proposed method allows high watermark payloads while preserving good image quality. Experimental results confirm that the proposed methods are fragile and have superior tamper detection even when the tampered area is very small. PMID:25028681
Limitations and requirements of content-based multimedia authentication systems
NASA Astrophysics Data System (ADS)
Wu, Chai W.
2001-08-01
Recently, a number of authentication schemes have been proposed for multimedia data such as images and sound data. They include both label-based systems and semifragile watermarks. The main requirement for such authentication systems is that minor modifications, such as lossy compression, which do not alter the content of the data preserve its authenticity, whereas modifications which do modify the content render the data inauthentic. These schemes can be classified into two main classes depending on the model of image authentication they are based on. One of the purposes of this paper is to look at some of the advantages and disadvantages of these image authentication schemes and their relationship with fundamental limitations of the underlying model of image authentication. In particular, we study feature-based algorithms, which generate an authentication tag based on some inherent features in the image such as the location of edges. The main disadvantage of most proposed feature-based algorithms is that similar images generate similar features, and therefore it is possible for a forger to generate dissimilar images that have the same features. On the other hand, the class of hash-based algorithms utilizes a cryptographic hash function or a digital signature scheme to reduce the data and generate an authentication tag. It inherits the security of digital signatures to thwart forgery attacks. The main disadvantage of hash-based algorithms is that the image needs to be modified in order to be made authenticatable. The amount of modification is on the order of the noise the image can tolerate before it is rendered inauthentic. The other purpose of this paper is to propose a multimedia authentication scheme which combines some of the best features of both classes of algorithms. The proposed scheme utilizes cryptographic hash functions and digital signature schemes, and the data does not need to be modified in order to be made authenticatable. 
Several applications including the authentication of images on CD-ROM and handwritten documents will be discussed.
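The hash-based tag idea, in which the data itself is never modified, can be sketched as follows; HMAC stands in for the digital signature scheme:

```python
import hashlib
import hmac

def make_tag(image_bytes: bytes, key: bytes) -> bytes:
    # Hash-then-authenticate: the tag travels alongside the image, so
    # the image itself is never modified (unlike watermark embedding).
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

def is_authentic(image_bytes: bytes, key: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(make_tag(image_bytes, key), tag)
```

This inherits the forgery resistance of the underlying cryptographic primitive, at the cost of rejecting even content-preserving modifications such as recompression.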
Chica, Manuel
2012-11-01
A novel method for authenticating pollen grains in bright-field microscopic images is presented in this work. This new method has clear uses in many application fields, such as the bee-keeping sector, where laboratory experts need to check bee pollen samples for fraud against known local pollen types. Our system is based on image processing and one-class classification to reject unknown pollen grain objects. The latter classification technique allows us to tackle the major difficulty of the problem: many fraudulent pollen types are possible, and modeling all of them is impossible. Different one-class classification paradigms are compared to find the most suitable technique for solving the problem. In addition, feature selection algorithms are applied to reduce the complexity and increase the accuracy of the models. For each local pollen type, a one-class classifier is trained and aggregated into a multiclassifier model. This multiclassification scheme combines the output of all the one-class classifiers in a unique final response. The proposed method is validated by authenticating pollen grains belonging to different Spanish bee pollen types. The overall accuracy of the system in classifying fraudulent microscopic pollen grain objects is 92.3%. The system is able to rapidly reject pollen grains that belong to nonlocal pollen types, reducing laboratory work and effort. This authentication method has many possible applications in the microscopy research field. Copyright © 2012 Wiley Periodicals, Inc.
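The one-class-per-type aggregation can be sketched with a deliberately simple centroid-radius model standing in for the paper's one-class classifiers:

```python
import numpy as np

class OneClassCentroid:
    # Toy one-class model: accept a sample if it lies within the maximum
    # training radius of the class centroid. A stand-in for the one-class
    # paradigms compared in the paper.
    def fit(self, X: np.ndarray) -> "OneClassCentroid":
        self.center = X.mean(axis=0)
        self.radius = np.linalg.norm(X - self.center, axis=1).max()
        return self

    def accepts(self, x: np.ndarray) -> bool:
        return bool(np.linalg.norm(x - self.center) <= self.radius)

def is_local_pollen(models: list, x: np.ndarray) -> bool:
    # Multiclassifier aggregation: a grain is accepted as a known local
    # type if ANY per-type one-class model accepts it; otherwise it is
    # flagged as unknown/fraudulent.
    return any(m.accepts(x) for m in models)
```

Note that no model of fraudulent types is ever needed: rejection falls out of every local-type model declining the sample.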
Facelock: familiarity-based graphical authentication.
Jenkins, Rob; McLachlan, Jane L; Renaud, Karen
2014-01-01
Authentication codes such as passwords and PIN numbers are widely used to control access to resources. One major drawback of these codes is that they are difficult to remember. Account holders are often faced with a choice between forgetting a code, which can be inconvenient, or writing it down, which compromises security. In two studies, we test a new knowledge-based authentication method that does not impose memory load on the user. Psychological research on face recognition has revealed an important distinction between familiar and unfamiliar face perception: When a face is familiar to the observer, it can be identified across a wide range of images. However, when the face is unfamiliar, generalisation across images is poor. This contrast can be used as the basis for a personalised 'facelock', in which authentication succeeds or fails based on image-invariant recognition of faces that are familiar to the account holder. In Study 1, account holders authenticated easily by detecting familiar targets among other faces (97.5% success rate), even after a one-year delay (86.1% success rate). Zero-acquaintance attackers were reduced to guessing (<1% success rate). Even personal attackers who knew the account holder well were rarely able to authenticate (6.6% success rate). In Study 2, we found that shoulder-surfing attacks by strangers could be defeated by presenting different photos of the same target faces in observed and attacked grids (1.9% success rate). Our findings suggest that the contrast between familiar and unfamiliar face recognition may be useful for developers of graphical authentication systems.
[Inheritance and innovation of traditional Chinese medicinal authentication].
Zhao, Zhong-zhen; Chen, Hu-biao; Xiao, Pei-gen; Guo, Ping; Liang, Zhi-tao; Hung, Fanny; Wong, Lai-lai; Brand, Eric; Liu, Jing
2015-09-01
Chinese medicinal authentication is fundamental for the standardization and globalization of Chinese medicine. The discipline of authentication addresses difficult issues that have remained unresolved for thousands of years, and is essential for preserving safety. Chinese medicinal authentication has both scientific and traditional cultural connotations; the use of scientific methods to elucidate traditional experience-based differentiation carries the legacy of Chinese medicine forward, and offers immediate practical significance and long-term scientific value. In this paper, a path of inheritance and innovation is explored through the scientific exposition of Chinese medicinal authentication, featuring a review of specialized publications, the establishment of a Chinese medicine specimen center and Chinese medicinal image databases, the expansion of authentication technologies, and the formation of a cultural project dedicated to the Compendium of Materia Medica.
Cryptography Would Reveal Alterations In Photographs
NASA Technical Reports Server (NTRS)
Friedman, Gary L.
1995-01-01
Public-key decryption method proposed to guarantee authenticity of photographic images represented in form of digital files. In method, digital camera generates original data from image in standard public format; also produces coded signature to verify standard-format image data. Scheme also helps protect against other forms of lying, such as attaching false captions.
Wavelet-based reversible watermarking for authentication
NASA Astrophysics Data System (ADS)
Tian, Jun
2002-04-01
In the digital information age, digital content (audio, image, and video) can be easily copied, manipulated, and distributed. Copyright protection and content authentication of digital content have become urgent problems for content owners and distributors, and digital watermarking has provided a valuable solution. Based on their application scenarios, most digital watermarking methods can be divided into two categories: robust watermarking and fragile watermarking. As a special subset of fragile watermarks, a reversible watermark (also called a lossless, invertible, or erasable watermark) enables the recovery of the original, unwatermarked content after the watermarked content has been verified as authentic. Such reversibility is highly desired in sensitive imagery, such as military and medical data. In this paper we present a reversible watermarking method based on an integer wavelet transform. We look into the binary representation of each wavelet coefficient and embed an extra bit into each expandable wavelet coefficient. The location map of all expanded coefficients is coded by JBIG2 compression, and the coefficient values are losslessly compressed by arithmetic coding. Besides these two compressed bit streams, an SHA-256 hash of the original image is also embedded for authentication purposes.
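The expansion idea can be illustrated on a pixel pair; the paper applies it to integer-wavelet coefficients and handles non-expandable positions with a JBIG2-coded location map, both omitted in this sketch:

```python
def embed_pair(x: int, y: int, bit: int):
    # Expand the difference of the pair and append one payload bit;
    # the integer average is preserved, so the step is exactly invertible.
    avg, diff = (x + y) // 2, x - y
    diff2 = 2 * diff + bit
    return avg + (diff2 + 1) // 2, avg - diff2 // 2

def extract_pair(x2: int, y2: int):
    # Recover the payload bit and restore the original pair losslessly.
    avg, diff2 = (x2 + y2) // 2, x2 - y2
    bit, diff = diff2 & 1, diff2 >> 1
    return avg + (diff + 1) // 2, avg - diff // 2, bit
```

Because embedding is invertible, removing the watermark returns the exact unwatermarked values, which is the defining property of a reversible scheme.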
Security of fragile authentication watermarks with localization
NASA Astrophysics Data System (ADS)
Fridrich, Jessica
2002-04-01
In this paper, we study the security of fragile image authentication watermarks that can localize tampered areas. We start by comparing the goals, capabilities, and advantages of image authentication based on watermarking and cryptography. Then we point out some common security problems of current fragile authentication watermarks with localization and classify attacks on authentication watermarks into five categories. By investigating the attacks and vulnerabilities of current schemes, we propose a variation of the Wong scheme [18] that is fast, simple, cryptographically secure, and resistant to all known attacks, including the Holliman-Memon attack [9]. In the new scheme, a special symmetry structure in the logo is used to authenticate the block content, while the logo itself carries information about the block origin (block index, the image index or time stamp, author ID, etc.). Because the authentication of the content and its origin are separated, it is possible to easily identify swapped blocks between images and accurately detect cropped areas, while being able to accurately localize tampered pixels.
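The content-plus-origin binding can be sketched as a per-block MAC; the paper embeds this information into block LSBs via the logo structure, whereas this sketch simply keeps the tags alongside the image:

```python
import hashlib
import numpy as np

def block_macs(img: np.ndarray, key: bytes, image_id: bytes, bs: int = 16):
    # One tag per block, bound to the block content AND its origin
    # (image id, block index), so blocks swapped between images or moved
    # within an image are detected (Holliman-Memon style attacks).
    cols = img.shape[1] // bs
    macs = []
    for i, r in enumerate(range(0, img.shape[0], bs)):
        for j, c in enumerate(range(0, img.shape[1], bs)):
            idx = (i * cols + j).to_bytes(4, "big")
            block = img[r:r + bs, c:c + bs]
            macs.append(hashlib.sha256(key + image_id + idx
                                       + block.tobytes()).digest())
    return macs

def locate_tampering(img, key, image_id, macs, bs: int = 16):
    # Return indices of blocks whose recomputed tag no longer matches.
    return [k for k, m in enumerate(block_macs(img, key, image_id, bs))
            if m != macs[k]]
```

Because each tag names its block index and image id, a block copied to another position or another image fails verification even though its content is individually valid.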
Person Authentication Using Learned Parameters of Lifting Wavelet Filters
NASA Astrophysics Data System (ADS)
Niijima, Koichi
2006-10-01
This paper proposes a method for identifying persons using lifting wavelet parameters learned by kurtosis minimization. Our learning method uses desirable properties of kurtosis and of the wavelet coefficients of a facial image. Exploiting these properties, the lifting parameters are trained so as to minimize the kurtosis of the lifting wavelet coefficients computed for the facial image. Since this minimization is an ill-posed problem, it is solved with the aid of Tikhonov's regularization method. Our learning algorithm is applied to each of the faces to be identified to generate a feature vector whose components consist of the learned parameters. The constructed feature vectors are stored together with the corresponding faces in a feature vector database. Person authentication is performed by comparing the feature vector of a query face with those stored in the database. In numerical experiments, the lifting parameters are trained for each of the neutral faces of 132 persons (74 males and 58 females) in the AR face database. Person authentication is executed using the smile and anger faces of the same persons in the database.
Providing integrity and authenticity in DICOM images: a novel approach.
Kobayashi, Luiz Octavio Massato; Furuie, Sergio Shiguemi; Barreto, Paulo Sergio Licciardi Messeder
2009-07-01
The increasing adoption of information systems in healthcare has led to a scenario where patient information security is more and more being regarded as a critical issue. Allowing patient information to be in jeopardy may lead to irreparable damage, physically, morally, and socially to the patient, potentially shaking the credibility of the healthcare institution. Medical images play a crucial role in such context, given their importance in diagnosis, treatment, and research. Therefore, it is vital to take measures in order to prevent tampering and determine their provenance. This demands adoption of security mechanisms to assure information integrity and authenticity. There are a number of works done in this field, based on two major approaches: use of metadata and use of watermarking. However, there still are limitations for both approaches that must be properly addressed. This paper presents a new method using cryptographic means to improve trustworthiness of medical images, providing a stronger link between the image and the information on its integrity and authenticity, without compromising image quality to the end user. Use of Digital Imaging and Communications in Medicine structures is also an advantage for ease of development and deployment.
Reversible watermarking for authentication of DICOM images.
Zain, J M; Baldwin, L P; Clarke, M
2004-01-01
We propose a watermarking scheme that can recover the original image from the watermarked one. The purpose is to verify the integrity and authenticity of DICOM images. We used 800×600×8-bit ultrasound (US) images in our experiment. The SHA-256 of the whole image is embedded in the least significant bits of the RONI (Region of Non-Interest). If the image has not been altered, the watermark will be extracted and the original image will be recovered. The SHA-256 of the recovered image will be compared with the extracted watermark for authentication.
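A minimal sketch of the embed/verify flow, with the RONI simplified to the first 512 pixels in raster order; the recovery step that restores the exact original image is omitted:

```python
import hashlib
import numpy as np

RONI = 512                 # pixels treated as Region of Non-Interest here
DIGEST_BITS = 256          # SHA-256 produces 256 bits

def embed_auth(img: np.ndarray) -> np.ndarray:
    out = img.copy()
    flat = out.ravel()                     # view into out
    flat[:RONI] &= 0xFE                    # clear RONI LSBs first
    digest = hashlib.sha256(out.tobytes()).digest()  # hash with LSBs zeroed
    bits = np.unpackbits(np.frombuffer(digest, dtype=np.uint8))
    flat[:DIGEST_BITS] |= bits             # hide the hash in RONI LSBs
    return out

def verify_auth(wm: np.ndarray) -> bool:
    flat = wm.ravel()
    extracted = np.packbits((flat[:DIGEST_BITS] & 1).astype(np.uint8)).tobytes()
    clean = wm.copy()
    clean.ravel()[:RONI] &= 0xFE           # re-zero RONI LSBs before hashing
    return extracted == hashlib.sha256(clean.tobytes()).digest()
```

Restricting the embedding to a region of non-interest keeps the diagnostically relevant pixels untouched.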
Jiang, Nanfeng; Song, Weiran; Wang, Hui; Guo, Gongde; Liu, Yuanyuan
2018-05-23
As the expectation for a higher quality of life increases, consumers have higher demands for quality food. Food authentication is the technical means of ensuring food is what it says it is. A popular approach to food authentication is based on spectroscopy, which has been widely used for identifying and quantifying the chemical components of an object. This approach is non-destructive and effective but expensive. This paper presents a computer vision-based sensor system for food authentication, i.e., differentiating organic from non-organic apples. This sensor system consists of low-cost hardware and pattern recognition software. We use a flashlight to illuminate apples and capture their images through a diffraction grating. These diffraction images are then converted into a data matrix for classification by pattern recognition algorithms, including k-nearest neighbors (k-NN), support vector machine (SVM) and three partial least squares discriminant analysis (PLS-DA)-based methods. We carry out experiments on a reasonable collection of apple samples and employ appropriate pre-processing, achieving a best classification accuracy of 94%. Our studies conclude that this sensor system has the potential to provide a viable solution to empower consumers in food authentication.
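The k-NN step can be sketched directly on rows of the data matrix; the labels and data below are synthetic placeholders, not the paper's apple samples:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k: int = 3):
    # Plain k-nearest-neighbors by Euclidean distance on feature rows
    # (one row per diffraction image); majority vote over the k labels.
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    votes = y_train[nearest]
    return (votes.mean(axis=1) > 0.5).astype(int)
```

In practice each row would be a pre-processed diffraction image flattened into a feature vector.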
Medical Image Authentication Using DPT Watermarking: A Preliminary Attempt
NASA Astrophysics Data System (ADS)
Wong, M. L. Dennis; Goh, Antionette W.-T.; Chua, Hong Siang
Secure authentication of digital medical image content provides great value to the e-Health community and medical insurance industries. Fragile watermarking has been proposed to provide a mechanism for securely authenticating digital medical images. Transform-domain watermarking is typically slower than spatial-domain watermarking owing to the overhead of calculating coefficients. In this paper, we propose a new Discrete Pascal Transform based watermarking technique. Preliminary experimental results show authentication capability. Possible improvements on the proposed scheme are also presented before the conclusions.
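The Discrete Pascal Transform itself can be sketched as follows; a convenient property is that the transform matrix is involutory, so the same routine inverts it (the watermark embedding step is not shown):

```python
import numpy as np
from math import comb

def pascal_matrix(n: int) -> np.ndarray:
    # Discrete Pascal transform matrix: P[i, j] = (-1)^j * C(i, j),
    # lower triangular and involutory (P @ P is the identity).
    return np.array([[(-1) ** j * comb(i, j) for j in range(n)]
                     for i in range(n)])

def dpt2(block: np.ndarray) -> np.ndarray:
    # Separable 2-D transform of a square block; a DPT-based watermark
    # would modify these integer coefficients before transforming back.
    P = pascal_matrix(block.shape[0])
    return P @ block @ P.T
```

Because the transform is exact integer arithmetic, applying it twice returns the original block with no rounding loss, which is convenient for fragile watermarking.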
NASA Astrophysics Data System (ADS)
Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang
2017-01-01
A novel optical information verification and encryption method is proposed based on inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both of the two POMs need to be authenticated before being applied for decrypting. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved by the correct decryption key, which is also generated in encryption process. No holographic scheme is involved in the proposed optical verification and encryption system and there is also no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this new proposed method.
Development of a Raman chemical imaging detection method for authenticating skim milk powder
USDA-ARS?s Scientific Manuscript database
This research demonstrated that Raman chemical imaging coupled with a simple image classification algorithm can be used to detect multiple chemical adulterants in skim milk powder. Ammonium sulfate, dicyandiamide, melamine, and urea were mixed into the milk powder as chemical adulterants in the conc...
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2008-03-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, using a helical CT scanner for lung cancer mass screening. Functions to observe suspicious shadows in detail are provided in a computer-aided diagnosis workstation with these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security of image transmission, a biometric fingerprint authentication system, and a biometric face authentication system. Biometric face authentication used on site in telemedicine makes "encryption of file" and "success in login" effective. As a result, patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system, using the computer-aided diagnosis workstation and our telemedicine network system, can increase diagnostic speed and accuracy and improve the security of medical information.
Development of a Raman chemical image detection algorithm for authenticating dry milk
USDA-ARS?s Scientific Manuscript database
This research developed a Raman chemical imaging method for detecting multiple adulterants in skim milk powder. Ammonium sulfate, dicyandiamide, melamine, and urea were mixed into the milk powder as chemical adulterants in the concentration range of 0.1–5.0%. A Raman imaging system using a 785-nm la...
Dual domain watermarking for authentication and compression of cultural heritage images.
Zhao, Yang; Campisi, Patrizio; Kundur, Deepa
2004-03-01
This paper proposes an approach for the combined image authentication and compression of color images by making use of a digital watermarking and data hiding framework. The digital watermark is comprised of two components: a soft-authenticator watermark for authentication and tamper assessment of the given image, and a chrominance watermark employed to improve the efficiency of compression. The multipurpose watermark is designed by exploiting the orthogonality of various domains used for authentication, color decomposition and watermark insertion. The approach is implemented as a DCT-DWT dual domain algorithm and is applied for the protection and compression of cultural heritage imagery. Analysis is provided to characterize the behavior of the scheme under ideal conditions. Simulations and comparisons of the proposed approach with state-of-the-art existing work demonstrate the potential of the overall scheme.
NASA Astrophysics Data System (ADS)
Wang, Q.; Elbouz, M.; Alfalou, A.; Brosseau, C.
2017-06-01
We present a novel method to optimize the discrimination ability and noise robustness of composite filters. This method is based on iterative preprocessing of training images, which can extract boundary and detailed feature information from authentic training faces, thereby improving the peak-to-correlation energy (PCE) ratio of authentic faces and conferring immunity to intra-class variance and noise interference. By adding the training images directly, one can obtain a composite template with high discrimination ability and robustness for the face recognition task. The proposed composite correlation filter does not involve any of the complicated mathematical analysis and computation often required in the design of correlation algorithms. Simulation tests have been conducted to check the effectiveness and feasibility of our proposal. Moreover, to assess the robustness of composite filters using receiver operating characteristic (ROC) curves, we devise a new method of counting the true positive and false positive rates that involves the difference between the PCE and the threshold.
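The PCE figure of merit can be sketched as follows, using a plain frequency-domain correlation; the filter design itself is not reproduced here:

```python
import numpy as np

def correlate(filt: np.ndarray, probe: np.ndarray) -> np.ndarray:
    # Frequency-domain correlation of a (composite) filter with a probe.
    return np.fft.ifft2(np.fft.fft2(probe) * np.conj(np.fft.fft2(filt)))

def pce(corr_plane: np.ndarray) -> float:
    # Peak-to-correlation energy: squared correlation peak divided by
    # the total energy of the correlation plane.
    mag = np.abs(corr_plane)
    return float(mag.max() ** 2 / np.sum(mag ** 2))
```

A matching (authentic) probe concentrates energy into a sharp correlation peak, so its PCE exceeds that of a non-matching probe; thresholding this value gives the accept/reject decision.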
Optically secured information retrieval using two authenticated phase-only masks.
Wang, Xiaogang; Chen, Wen; Mei, Shengtao; Chen, Xudong
2015-10-23
We propose an algorithm for jointly designing two phase-only masks (POMs) that allow for the encryption and noise-free retrieval of triple images. The images required for optical retrieval are first stored in quick-response (QR) codes for noise-free retrieval and flexible readout. Two sparse POMs are respectively calculated from two different images used as references for authentication, based on a modified Gerchberg-Saxton algorithm (GSA) and pixel extraction, and are then used as support constraints in a modified double-phase retrieval algorithm (MPRA), together with the above-mentioned QR codes. No visible information about the target images or the reference images can be obtained from either of these authenticated POMs. This approach allows users to authenticate the two POMs used for image reconstruction without visual observation of the reference images. It also allows users convenient access and readout with mobile devices.
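The core Gerchberg-Saxton iteration behind such phase-retrieval designs can be sketched for a single mask; the paper's MPRA jointly designs two sparse masks with QR-code constraints, which this sketch does not attempt:

```python
import numpy as np

def gs_phase_only(target_amp: np.ndarray, n_iter: int = 100, seed: int = 1):
    # Gerchberg-Saxton-style retrieval of a single phase-only mask whose
    # Fourier magnitude approximates target_amp: alternate between the
    # target-magnitude constraint and the unit-amplitude constraint.
    rng = np.random.default_rng(seed)
    field = np.exp(2j * np.pi * rng.random(target_amp.shape))
    for _ in range(n_iter):
        F = np.fft.fft2(field)
        F = target_amp * np.exp(1j * np.angle(F))        # impose magnitude
        field = np.exp(1j * np.angle(np.fft.ifft2(F)))   # keep phase-only
    return field

def magnitude_error(field, target_amp) -> float:
    d = np.abs(np.fft.fft2(field)) - target_amp
    return float(np.linalg.norm(d) / np.linalg.norm(target_amp))
```

Each pass projects onto one constraint set at a time, so the magnitude error is non-increasing across iterations.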
Optically secured information retrieval using two authenticated phase-only masks
Wang, Xiaogang; Chen, Wen; Mei, Shengtao; Chen, Xudong
2015-01-01
We propose an algorithm for jointly designing two phase-only masks (POMs) that allow for the encryption and noise-free retrieval of triple images. The images required for optical retrieval are first stored in quick-response (QR) codes for noise-free retrieval and flexible readout. Two sparse POMs are respectively calculated from two different images used as references for authentication based on modified Gerchberg-Saxton algorithm (GSA) and pixel extraction, and are then used as support constraints in a modified double-phase retrieval algorithm (MPRA), together with the above-mentioned QR codes. No visible information about the target images or the reference images can be obtained from each of these authenticated POMs. This approach allows users to authenticate the two POMs used for image reconstruction without visual observation of the reference images. It also allows user to friendly access and readout with mobile devices. PMID:26494213
A biometric authentication model using hand gesture images.
Fong, Simon; Zhuang, Yan; Fister, Iztok; Fister, Iztok
2013-10-30
A novel hand biometric authentication method based on measurements of the user's stationary hand gestures from hand sign language is proposed. The hand gesture measurements can be sequentially acquired by a low-cost video camera. An additional level of contextual information can be associated with these hand signs for use in biometric authentication. As an analogue, instead of typing the password 'iloveu' as text, which is relatively vulnerable over a communication network, a signer can encode a biometric password as a sequence of hand signs: 'i', 'l', 'o', 'v', 'e', and 'u'. Features, which are inherently fuzzy in nature, are then extracted from the hand gesture images and recognized by a classification model that verifies whether the signer is who he claims to be by examining his hand shape and the postures used in making those signs. It is believed that everybody has slight but unique behavioral characteristics in sign language, as well as different hand shape compositions. Simple and efficient image processing algorithms are used in hand sign recognition, including intensity profiling, color histograms, and dimensionality analysis, coupled with several popular machine learning algorithms. Computer simulations investigating the efficacy of this novel biometric authentication model show up to 93.75% recognition accuracy.
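The histogram-style feature and classifier stages described above can be sketched as follows; this is a minimal illustration assuming a 16-bin per-channel color histogram and a 1-NN matcher (the `color_histogram` and `nearest_signer` helpers and their parameters are hypothetical, not the paper's exact pipeline).

```python
import numpy as np

def color_histogram(img: np.ndarray, bins: int = 16) -> np.ndarray:
    """Per-channel color histogram feature for a hand-gesture frame
    (H x W x 3, uint8), normalized so frames of any size are comparable."""
    feats = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    f = np.concatenate(feats).astype(float)
    return f / f.sum()

def nearest_signer(probe: np.ndarray, gallery: dict) -> str:
    """Toy 1-NN classifier over enrolled signers' feature vectors."""
    return min(gallery, key=lambda name: np.linalg.norm(gallery[name] - probe))
```

In practice the paper couples such features with several machine learning classifiers; the 1-NN matcher here just shows the enrollment/verification flow.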
Secure biometric image sensor and authentication scheme based on compressed sensing.
Suzuki, Hiroyuki; Suzuki, Masamichi; Urabe, Takuya; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki
2013-11-20
It is important to ensure the security of biometric authentication information, because its leakage causes serious risks, such as replay attacks using the stolen biometric data, and also because it is almost impossible to replace raw biometric information. In this paper, we propose a secure biometric authentication scheme that protects such information by employing an optical data ciphering technique based on compressed sensing. The proposed scheme is based on two-factor authentication, the biometric information being supplemented by secret information that is used as a random seed for a cipher key. In this scheme, a biometric image is optically encrypted at the time of image capture, and a pair of restored biometric images for enrollment and verification are verified in the authentication server. If any of the biometric information is exposed to risk, it can be reenrolled by changing the secret information. Through numerical experiments, we confirm that finger vein images can be restored from the compressed sensing measurement data. We also present results that verify the accuracy of the scheme.
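A minimal numerical sketch of the compressed-sensing ciphering idea: the user's secret seeds the random measurement matrix, so the same biometric yields matching measurements only under the same secret. The function names and the Gaussian matrix are illustrative assumptions; the paper's scheme performs the encoding optically at capture time.

```python
import numpy as np

def cs_encrypt(biometric: np.ndarray, secret_seed: int, m: int) -> np.ndarray:
    """Compress-and-encrypt a biometric vector with a secret-seeded
    random measurement matrix (y = Phi @ x). Only a holder of the seed
    can regenerate Phi, so the secret acts as the cipher key."""
    x = biometric.ravel().astype(float)
    rng = np.random.default_rng(secret_seed)      # secret information as key seed
    phi = rng.standard_normal((m, x.size)) / np.sqrt(m)
    return phi @ x

def verify(y_enrolled: np.ndarray, y_probe: np.ndarray, tol: float = 1e-6) -> bool:
    """Toy verification: compare measurement vectors directly."""
    return float(np.linalg.norm(y_enrolled - y_probe)) < tol
```

If the secret is compromised, re-enrollment with a new seed produces an entirely different measurement vector from the same biometric, mirroring the re-enrollment property described above.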
A multimodal biometric authentication system based on 2D and 3D palmprint features
NASA Astrophysics Data System (ADS)
Aggithaya, Vivek K.; Zhang, David; Luo, Nan
2008-03-01
This paper presents a new personal authentication system that simultaneously exploits 2D and 3D palmprint features. Here, we aim to improve the accuracy and robustness of existing palmprint authentication systems using 3D palmprint features. The proposed system uses an active stereo technique, structured light, to simultaneously capture a 3D image (range data) of the palm and a registered intensity image. A surface-curvature-based method is employed to extract features from the 3D palmprint, and a Gabor-feature-based competitive coding scheme is used for the 2D representation. We analyze these representations individually and then combine them with a score-level fusion technique. Our experiments on a database of 108 subjects achieve a significant improvement in performance (Equal Error Rate) with the integration of 3D features, as compared to the case when 2D palmprint features alone are employed.
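The score-level fusion step can be illustrated with a simple weighted sum; the equal weights and the 0.5 decision threshold below are assumptions for illustration, not the parameters reported in the paper.

```python
def fuse_scores(score_2d: float, score_3d: float, w: float = 0.5) -> float:
    """Weighted-sum score-level fusion of the 2D and 3D matcher scores
    (both assumed normalized to [0, 1], higher meaning a better match)."""
    return w * score_2d + (1.0 - w) * score_3d

def authenticate(score_2d: float, score_3d: float,
                 threshold: float = 0.5, w: float = 0.5) -> bool:
    """Accept the claimed identity if the fused score clears the threshold."""
    return fuse_scores(score_2d, score_3d, w) >= threshold
```

In a real system the weight `w` would be tuned on a development set to minimize the Equal Error Rate.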
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kakinuma, Ryutaru; Moriyama, Noriyuki
2009-02-01
Mass screening based on multi-helical CT images requires a considerable number of images to be read. It is this time-consuming step that makes the use of helical CT for mass screening impractical at present. Moreover, there is a shortage of doctors in Japan who can diagnose medical images. To overcome these problems, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis likelihood, all using a helical CT scanner for lung cancer mass screening. Functions to observe suspicious shadows in detail are provided in a computer-aided diagnosis workstation together with these screening algorithms. We have also developed a telemedicine network using a Web medical image conference system with improved security for image transmission, a biometric fingerprint authentication system, and a biometric face authentication system. Biometric face authentication used on-site in telemedicine makes file encryption and login verification effective, so patients' private information is protected. The screen of the Web medical image conference system can be shared by two or more conference terminals at the same time, and opinions can be exchanged using a camera and a microphone connected to the workstation. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new telemedicine network that can display suspected lesions three-dimensionally in a short time.
The results of this study indicate that our filmless radiological information system, using the computer-aided diagnosis workstation and our telemedicine network, can increase diagnostic speed and diagnostic accuracy and improve the security of medical information.
Digital image envelope: method and evaluation
NASA Astrophysics Data System (ADS)
Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang
2003-05-01
Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines (such as the DICOM standard) addressing data security continue to be published by organizing bodies in healthcare; however, no systematic method has been developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network. The reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
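A toy version of the digital-envelope flow might look like the following, with HMAC-SHA256 standing in for the public-key image signature and a hash-derived keystream standing in for header encryption. Both stand-ins are assumptions for brevity; the actual system embeds the envelope in the image itself and applies DICOM-specific processing.

```python
import hashlib
import hmac

def make_envelope(pixel_data: bytes, header: bytes, key: bytes) -> bytes:
    """Form a toy digital envelope: image signature + 'encrypted' header.
    HMAC-SHA256 stands in for the signature; XOR with a hash-derived
    keystream stands in for header encryption (illustration only)."""
    signature = hmac.new(key, pixel_data, hashlib.sha256).digest()
    stream = hashlib.sha256(key + b"hdr").digest() * (len(header) // 32 + 1)
    enc_header = bytes(a ^ b for a, b in zip(header, stream))
    return signature + enc_header

def verify_envelope(envelope: bytes, pixel_data: bytes, key: bytes) -> bool:
    """Receiving-site step: recompute the image signature from the
    received pixels and compare it with the one in the envelope."""
    received_sig = envelope[:32]
    fresh_sig = hmac.new(key, pixel_data, hashlib.sha256).digest()
    return hmac.compare_digest(received_sig, fresh_sig)
```

Any alteration of the pixel data changes the recomputed signature, so the two-signature comparison described above fails.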
Finger vein extraction using gradient normalization and principal curvature
NASA Astrophysics Data System (ADS)
Choi, Joon Hwan; Song, Wonseok; Kim, Taejeong; Lee, Seung-Rae; Kim, Hee Chan
2009-02-01
Finger vein authentication is a personal identification technology using finger vein images acquired by infrared imaging. It is one of the newest technologies in biometrics. Its main advantage over other biometrics is the low risk of forgery or theft, due to the fact that finger veins are not normally visible to others. Extracting finger vein patterns from infrared images is the most difficult part in finger vein authentication. Uneven illumination, varying tissues and bones, and changes in the physical conditions and the blood flow make the thickness and brightness of the same vein different in each acquisition. Accordingly, extracting finger veins at their accurate positions regardless of their thickness and brightness is necessary for accurate personal identification. For this purpose, we propose a new finger vein extraction method which is composed of gradient normalization, principal curvature calculation, and binarization. As local brightness variation has little effect on the curvature and as gradient normalization makes the curvature fairly uniform at vein pixels, our method effectively extracts finger vein patterns regardless of the vein thickness or brightness. In our experiment, the proposed method showed notable improvement as compared with the existing methods.
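The gradient-normalization-plus-principal-curvature pipeline can be sketched with NumPy as below; the Hessian-of-normalized-gradient formulation and the fixed binarization threshold are simplifying assumptions, not the authors' exact implementation.

```python
import numpy as np

def principal_curvature(img: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Sketch of vein extraction: normalize the gradient field, then take
    the larger eigenvalue of the Hessian of the normalized field as the
    principal curvature (ridge-like veins give large values regardless
    of their local brightness)."""
    gy, gx = np.gradient(img.astype(float))       # np.gradient: axis 0 then axis 1
    mag = np.sqrt(gx**2 + gy**2) + eps
    nx, ny = gx / mag, gy / mag                   # gradient normalization
    # Hessian entries from derivatives of the normalized gradient field
    hxy_a, hxx = np.gradient(nx)
    hyy, hxy_b = np.gradient(ny)
    hxy = 0.5 * (hxy_a + hxy_b)
    # larger eigenvalue of [[hxx, hxy], [hxy, hyy]] at each pixel
    tr, det = hxx + hyy, hxx * hyy - hxy**2
    disc = np.sqrt(np.maximum(tr**2 / 4 - det, 0.0))
    return tr / 2 + disc

def binarize(curv: np.ndarray, thresh: float) -> np.ndarray:
    """Final binarization step: keep pixels whose curvature exceeds the threshold."""
    return (curv > thresh).astype(np.uint8)
```

Because the gradient field is normalized before the curvature is computed, thick and thin (or bright and dim) veins produce comparable curvature responses, which is the core robustness argument of the method.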
NASA Astrophysics Data System (ADS)
Lohweg, Volker; Schaede, Johannes; Türke, Thomas
2006-02-01
The authenticity checking and inspection of bank notes is a highly labour-intensive process in which, traditionally, every note on every sheet is inspected manually. However, with the advent of more and more sophisticated security features, both visible and invisible, and the requirement of cost reduction in the printing process, it is clear that automation is required. As more and more printing techniques and new security features are established, total quality in security, authenticity, and bank note printing must be assured, which necessitates a broader sensorial concept. We propose a concept covering both authenticity checking and inspection methods for pattern recognition and classification of securities and banknotes, based on sensor fusion and fuzzy interpretation of data measures. In this approach, different methods of authenticity analysis and print flaw detection are combined, which can be used in vending or sorting machines as well as in printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines visible camera images with IR-spectral-sensitive sensors, acoustical sensors, and other measurements such as the temperature and pressure of printing machines.
Singh, Anushikha; Dutta, Malay Kishore
2017-12-01
The authentication and integrity verification of medical images is a critical and growing issue for patients in e-health services. Accurate identification of medical images and patient verification is an essential requirement to prevent errors in medical diagnosis. The proposed work presents an imperceptible watermarking system to address the security of medical fundus images for tele-ophthalmology applications and computer-aided automated diagnosis of retinal diseases. In the proposed work, the patient identity is embedded in the fundus image in the singular value decomposition domain with an adaptive quantization parameter to maintain perceptual transparency for a variety of fundus images, whether healthy or disease-affected. In the proposed method, insertion of the watermark does not affect the automatic image-processing diagnosis of retinal objects and pathologies, which ensures uncompromised computer-based diagnosis of the fundus image. The patient ID is correctly recovered from the watermarked fundus image for integrity verification at the diagnosis centre. The proposed watermarking system is tested on a comprehensive database of fundus images, and the results are convincing. The results indicate that the proposed watermarking method is imperceptible and does not affect computer-vision-based automated diagnosis of retinal diseases. Correct recovery of the patient ID from the watermarked fundus image makes the proposed watermarking system applicable for authentication of fundus images in computer-aided diagnosis and tele-ophthalmology applications. Copyright © 2017 Elsevier B.V. All rights reserved.
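One common way to embed an ID bit in the singular-value domain is quantization-index modulation on a block's largest singular value; the sketch below illustrates that general idea with a fixed quantization step `q` (the paper's scheme adapts the quantization parameter per image, which is not reproduced here).

```python
import numpy as np

def embed_bit_svd(block: np.ndarray, bit: int, q: float) -> np.ndarray:
    """Quantization-index modulation on the largest singular value of a
    square image block: round S[0] to an even or odd multiple of q
    (plus q/2 for a detection margin) to encode one watermark bit."""
    u, s, vt = np.linalg.svd(block.astype(float))
    k = np.floor(s[0] / q)
    if int(k) % 2 != bit:
        k += 1                                  # move to a bin of matching parity
    s[0] = k * q + q / 2
    return u @ np.diag(s) @ vt                  # re-compose the watermarked block

def extract_bit_svd(block: np.ndarray, q: float) -> int:
    """Recover the bit from the parity of the quantized singular value."""
    s = np.linalg.svd(block.astype(float), compute_uv=False)
    return int(np.floor(s[0] / q)) % 2
```

A small `q` keeps the perturbation, and hence the perceptual impact on the image, small, at the cost of less robustness; this is the imperceptibility trade-off the abstract refers to.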
Practical security and privacy attacks against biometric hashing using sparse recovery
NASA Astrophysics Data System (ADS)
Topcu, Berkay; Karabat, Cagatay; Azadmanesh, Matin; Erdogan, Hakan
2016-12-01
Biometric hashing is a cancelable biometric verification method that has received research interest recently. This method can be considered as a two-factor authentication method which combines a personal password (or secret key) with a biometric to obtain a secure binary template which is used for authentication. We present novel practical security and privacy attacks against biometric hashing when the attacker is assumed to know the user's password in order to quantify the additional protection due to biometrics when the password is compromised. We present four methods that can reconstruct a biometric feature and/or the image from a hash and one method which can find the closest biometric data (i.e., face image) from a database. Two of the reconstruction methods are based on 1-bit compressed sensing signal reconstruction for which the data acquisition scenario is very similar to biometric hashing. Previous literature introduced simple attack methods, but we show that we can achieve higher level of security threats using compressed sensing recovery techniques. In addition, we present privacy attacks which reconstruct a biometric image which resembles the original image. We quantify the performance of the attacks using detection error tradeoff curves and equal error rates under advanced attack scenarios. We show that conventional biometric hashing methods suffer from high security and privacy leaks under practical attacks, and we believe more advanced hash generation methods are necessary to avoid these attacks.
Providing integrity, authenticity, and confidentiality for header and pixel data of DICOM images.
Al-Haj, Ali
2015-04-01
Exchange of medical images over public networks is subject to different types of security threats. This has triggered persistent demands for secured telemedicine implementations that will provide confidentiality, authenticity, and integrity for the transmitted images. The medical image exchange standard (DICOM) offers mechanisms to provide confidentiality for the header data of the image but not for the pixel data. On the other hand, it offers mechanisms to achieve authenticity and integrity for the pixel data but not for the header data. In this paper, we propose a crypto-based algorithm that provides confidentiality, authenticity, and integrity for the pixel data, as well as for the header data. This is achieved by applying strong cryptographic primitives utilizing internally generated security data, such as encryption keys, hashing codes, and digital signatures. The security data are generated internally from the header and the pixel data, thus a strong bond is established between the DICOM data and the corresponding security data. The proposed algorithm has been evaluated extensively using DICOM images of different modalities. Simulation experiments show that confidentiality, authenticity, and integrity have been achieved, as reflected by the results we obtained for normalized correlation, entropy, PSNR, histogram analysis, and robustness.
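The idea of internally generated security data can be sketched as follows: the key and hash are derived from the header and pixel data themselves, binding the security data to the DICOM content. SHA-256 and HMAC are used here as generic stand-ins for the paper's cryptographic primitives, and the helper names are illustrative.

```python
import hashlib
import hmac

def derive_security_data(header: bytes, pixels: bytes):
    """Derive a content-bound key and hash internally from the DICOM
    header and pixel data, so the security data are tied to the image."""
    key = hashlib.sha256(b"key|" + header + pixels).digest()
    digest = hashlib.sha256(b"hash|" + header + pixels).digest()
    return key, digest

def integrity_tag(header: bytes, pixels: bytes) -> bytes:
    """HMAC over header + pixels using the internally derived key,
    covering both parts of the DICOM object at once."""
    key, _ = derive_security_data(header, pixels)
    return hmac.new(key, header + pixels, hashlib.sha256).digest()

def check_integrity(tag: bytes, header: bytes, pixels: bytes) -> bool:
    """Verify that neither header nor pixel data were altered."""
    return hmac.compare_digest(tag, integrity_tag(header, pixels))
```

Because the key itself depends on the content, tampering with either the header or the pixels invalidates the tag, which is the "strong bond" property the abstract describes.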
Authenticity preservation with histogram-based reversible data hiding and quadtree concepts.
Huang, Hsiang-Cheh; Fang, Wai-Chi
2011-01-01
With the widespread use of identification systems, establishing authenticity with sensors has become an important research issue. Among the schemes for making authenticity verification based on information security possible, reversible data hiding has attracted much attention during the past few years. With its characteristic of reversibility, the scheme is required to fulfill goals from two aspects. On the one hand, at the encoder, the secret information needs to be embedded into the original image by some algorithm, such that the output image resembles the input one as much as possible. On the other hand, at the decoder, both the secret information and the original image must be correctly extracted and recovered, and they should be identical to their embedding counterparts. Under the requirement of reversibility, for evaluating the performance of a data hiding algorithm, the output image quality, named imperceptibility, and the number of bits for embedding, called capacity, are the two key factors used to assess the effectiveness of the algorithm. Besides, the size of the side information needed to make decoding possible should also be evaluated. Here we consider using the characteristics of the original images to develop a method with better performance. In this paper, we propose an algorithm that provides more capacity than conventional algorithms, with similar output image quality after embedding and comparable side information produced. Simulation results demonstrate the applicability and better performance of our algorithm.
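The classic histogram-shifting embedder that underlies this family of reversible schemes can be sketched as below, with the peak and zero bins serving as the side information (the paper's quadtree extension is not shown). The sketch assumes an empty histogram bin exists to the right of the peak and that the payload fits within the peak-bin count, i.e. the capacity.

```python
import numpy as np

def hs_embed(img: np.ndarray, bits):
    """Histogram-shifting reversible embedding for a grayscale image.
    Pixels between the peak and zero bins are shifted up by one to free
    the bin peak+1; each peak-valued pixel then carries one bit."""
    hist = np.bincount(img.ravel(), minlength=256)
    peak = int(hist.argmax())
    zero = peak + 1 + int(np.flatnonzero(hist[peak + 1:] == 0)[0])
    flat = img.astype(np.int32).ravel().copy()
    flat[(flat > peak) & (flat < zero)] += 1          # free up bin peak+1
    idx = np.flatnonzero(flat == peak)[:len(bits)]
    flat[idx] += np.asarray(bits, dtype=np.int32)     # peak -> peak + bit
    return flat.reshape(img.shape).astype(np.uint8), peak, zero

def hs_extract(stego: np.ndarray, peak: int, zero: int, n_bits: int):
    """Recover the hidden bits and restore the original image exactly."""
    flat = stego.astype(np.int32).ravel().copy()
    carriers = np.flatnonzero((flat == peak) | (flat == peak + 1))[:n_bits]
    bits = (flat[carriers] - peak).tolist()
    flat[flat == peak + 1] = peak                     # undo embedding
    shift = (flat >= peak + 2) & (flat <= zero)
    flat[shift] -= 1                                  # undo shifting
    return bits, flat.reshape(stego.shape).astype(np.uint8)
```

Exact recovery of both payload and cover image is what distinguishes reversible hiding from ordinary watermarking, at the cost of capacity being limited by the peak-bin height.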
Dhir, L; Habib, N E; Monro, D M; Rakshit, S
2010-06-01
The purpose of this study was to investigate the effect of cataract surgery and pupil dilation on iris pattern recognition for personal authentication. Prospective non-comparative cohort study. Images of 15 subjects were captured before (enrolment), and 5, 10, and 15 min after instillation of mydriatics before routine cataract surgery. Images were captured again 2 weeks after cataract surgery. Enrolled and test images (after pupillary dilation and after cataract surgery) were segmented to extract the iris. This was then unwrapped onto a rectangular format for normalization, and a novel method using the Discrete Cosine Transform was applied to encode the image into binary bits. The numerical difference between two iris codes (Hamming distance, HD) was calculated. The HD between identification and enrolment codes was used as a score and was compared with a confidence threshold for specific equipment, giving a match or non-match result. The Correct Recognition Rate (CRR) and Equal Error Rate (EER) were calculated to analyse overall system performance. After cataract surgery, perfect identification and verification was achieved, with zero false acceptance rate, zero false rejection rate, and zero EER. After pupillary dilation, non-elastic deformation occurs, and a CRR of 86.67% and an EER of 9.33% were obtained. Conventional circle-based localization methods are inadequate. Matching reliability decreases considerably with increasing pupillary dilation. Cataract surgery has no effect on iris pattern recognition, whereas pupil dilation may be used to defeat an iris-based authentication system.
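Hamming-distance matching of binary iris codes reduces to a few lines; the 0.32 threshold below is a commonly quoted illustrative value, not the equipment-specific confidence threshold used in the study.

```python
import numpy as np

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Fractional Hamming distance between two equal-length binary iris
    codes: the proportion of disagreeing bits (0 = identical)."""
    return float(np.count_nonzero(code_a != code_b)) / code_a.size

def is_match(code_a: np.ndarray, code_b: np.ndarray,
             threshold: float = 0.32) -> bool:
    """Score the probe against the enrolled code and threshold it."""
    return hamming_distance(code_a, code_b) <= threshold
```

Pupil dilation deforms the iris texture non-elastically, pushing genuine comparisons above the threshold, which is how dilation degrades the CRR reported above.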
Multipurpose image watermarking algorithm based on multistage vector quantization.
Lu, Zhe-Ming; Xu, Dian-Guo; Sun, Sheng-He
2005-06-01
The rapid growth of digital multimedia and Internet technologies has made copyright protection, copy protection, and integrity verification three important issues in the digital world. To solve these problems, the digital watermarking technique has been presented and widely researched. Traditional watermarking algorithms are mostly based on discrete transform domains, such as the discrete cosine transform, discrete Fourier transform (DFT), and discrete wavelet transform (DWT). Most of these algorithms are good for only one purpose. Recently, some multipurpose digital watermarking methods have been presented, which can achieve the goal of content authentication and copyright protection simultaneously. However, they are based on DWT or DFT. Lately, several robust watermarking schemes based on vector quantization (VQ) have been presented, but they can only be used for copyright protection. In this paper, we present a novel multipurpose digital image watermarking method based on the multistage vector quantizer structure, which can be applied to image authentication and copyright protection. In the proposed method, the semi-fragile watermark and the robust watermark are embedded in different VQ stages using different techniques, and both of them can be extracted without the original image. Simulation results demonstrate the effectiveness of our algorithm in terms of robustness and fragility.
System and method for authentication
Duerksen, Gary L.; Miller, Seth A.
2015-12-29
Described are methods and systems for determining authenticity. For example, the method may include providing an object of authentication, capturing characteristic data from the object of authentication, deriving authentication data from the characteristic data of the object of authentication, and comparing the authentication data with an electronic database comprising reference authentication data to provide an authenticity score for the object of authentication. The reference authentication data may correspond to one or more reference objects of authentication other than the object of authentication.
Fooprateepsiri, Rerkchai; Kurutach, Werasak
2014-03-01
Face authentication is a biometric classification method that verifies the identity of a user based on an image of their face. Accuracy of the authentication is reduced when the pose, illumination, and expression of the training face images are different from those of the testing image. The methods in this paper are designed to improve the accuracy of a features-based face recognition system when the pose of the input images differs from that of the training images. First, an efficient 2D-to-3D integrated face reconstruction approach is introduced to reconstruct a personalized 3D face model from a single frontal face image with neutral expression and normal illumination. Second, realistic virtual faces with different poses are synthesized based on the personalized 3D face to characterize the face subspace. Finally, face recognition is conducted based on these representative virtual faces. Compared with other related works, this framework has the following advantages: (1) only one single frontal face is required for face recognition, which avoids burdensome enrollment work; and (2) the synthesized face samples provide the capability to conduct recognition under difficult conditions such as complex pose, illumination, and expression. From the experimental results, we conclude that the proposed method improves the accuracy of face recognition under varying pose, illumination, and expression. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Text image authenticating algorithm based on MD5-hash function and Henon map
NASA Astrophysics Data System (ADS)
Wei, Jinqiao; Wang, Ying; Ma, Xiaoxue
2017-07-01
In order to cater to the evidentiary requirements of text images, this paper proposes a fragile watermarking algorithm based on a Hash function and the Henon map. The algorithm divides a text image into blocks, identifies the flippable and non-flippable pixels of every block according to PSD, generates a watermark from the non-flippable pixels with MD5-Hash, encrypts the watermark with the Henon map, and selects the embedding blocks. The simulation results show that the algorithm, which has good tampering-localization ability, can be used to authenticate and provide forensic evidence of the authenticity and integrity of text images.
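The watermark-generation and encryption steps can be sketched as follows: MD5 of the non-flippable pixel block yields the watermark bits, and a keystream thresholded from the Henon map orbit encrypts them by XOR. The initial conditions, the sign threshold, and the 64-bit length are illustrative assumptions rather than the paper's parameters.

```python
import hashlib

import numpy as np

def henon_keystream(n: int, x0: float = 0.1, y0: float = 0.1,
                    a: float = 1.4, b: float = 0.3) -> np.ndarray:
    """Binary keystream from the chaotic Henon map
    x' = 1 - a*x^2 + y, y' = b*x; (x0, y0) act as the secret key."""
    x, y = x0, y0
    bits = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x, y = 1 - a * x * x + y, b * x
        bits[i] = 1 if x > 0 else 0          # threshold the orbit into bits
    return bits

def make_watermark(block: bytes, n_bits: int = 64) -> np.ndarray:
    """Watermark = leading bits of the MD5 hash of the non-flippable pixels."""
    digest = hashlib.md5(block).digest()
    raw = np.unpackbits(np.frombuffer(digest, dtype=np.uint8))
    return raw[:n_bits]

def encrypt_watermark(wm: np.ndarray, x0: float = 0.1, y0: float = 0.1) -> np.ndarray:
    """XOR the watermark with the Henon keystream (self-inverse)."""
    return wm ^ henon_keystream(wm.size, x0, y0)
```

Because XOR with the keystream is self-inverse, applying `encrypt_watermark` twice with the same initial conditions recovers the original watermark, which is what verification relies on.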
A QR Code Based Zero-Watermarking Scheme for Authentication of Medical Images in Teleradiology Cloud
Seenivasagam, V.; Velumani, R.
2013-01-01
Healthcare institutions adopt cloud-based archiving of medical images and patient records to share them efficiently. Controlled access to these records and authentication of images must be enforced to mitigate fraudulent activities and medical errors. This paper presents a zero-watermarking scheme implemented in the composite Contourlet Transform (CT)-Singular Value Decomposition (SVD) domain for unambiguous authentication of medical images. Further, a framework is proposed for accessing patient records based on the watermarking scheme. The patient identification details and a link to the patient data, encoded into a Quick Response (QR) code, serve as the watermark. In the proposed scheme, the medical image is not subjected to degradation due to watermarking. Patient authentication and authorized access to patient data are realized by combining a Secret Share with the Master Share constructed from invariant features of the medical image. Hu's invariant image moments are exploited in creating the Master Share. The proposed system is evaluated with Checkmark software and is found to be robust to both geometric and non-geometric attacks. PMID:23970943
Digital camera with apparatus for authentication of images produced from an image file
NASA Technical Reports Server (NTRS)
Friedman, Gary L. (Inventor)
1993-01-01
A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even one bit change in the image hash will cause the image hash to be totally different from the secure hash.
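The sign-and-verify flow can be sketched as below. Note the stand-in: HMAC-SHA256 plays the role of the camera's private-key encryption for brevity, whereas the patented design uses an asymmetric key pair so that anyone holding only the public key can verify.

```python
import hashlib
import hmac

def sign_image(image_bytes: bytes, camera_key: bytes) -> bytes:
    """In-camera step: hash the image file, then 'sign' the hash.
    HMAC-SHA256 stands in for private-key encryption of the hash here;
    a real implementation would use an asymmetric scheme such as RSA."""
    image_hash = hashlib.sha256(image_bytes).digest()
    return hmac.new(camera_key, image_hash, hashlib.sha256).digest()

def authenticate_image(image_bytes: bytes, signature: bytes,
                       camera_key: bytes) -> bool:
    """Verification step: recompute the hash from the received file and
    compare against the hash protected by the signature. Even a one-bit
    change in the file yields a completely different hash."""
    return hmac.compare_digest(sign_image(image_bytes, camera_key), signature)
```

Storing the signature alongside the image file, as the record describes, lets the pair be checked at any later time.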
Optical identity authentication technique based on compressive ghost imaging with QR code
NASA Astrophysics Data System (ADS)
Wenjie, Zhan; Leihong, Zhang; Xi, Zeng; Yi, Kang
2018-04-01
With the rapid development of computer technology, information security has attracted more and more attention. It is related not only to the information and property security of individuals and enterprises, but also to the security and social stability of a country. Identity authentication is the first line of defense in information security. In authentication systems, response time and security are the most important factors. An optical authentication technology based on compressive ghost imaging with QR codes is proposed in this paper. Authentication can be performed with a small number of samples, so the response time of the algorithm is short. At the same time, the algorithm can resist certain noise attacks, so it offers good security.
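The non-compressive baseline of ghost imaging, correlating random illumination patterns with single-pixel "bucket" values, can be sketched as follows; a compressive version would replace the final covariance average with an l1-minimization recovery from far fewer samples, which is omitted here.

```python
import numpy as np

def ghost_reconstruct(obj: np.ndarray, n_patterns: int, seed: int = 0) -> np.ndarray:
    """Correlation ghost imaging sketch: random speckle patterns
    'illuminate' the object, a bucket detector records the total
    transmitted intensity, and the image is recovered as the covariance
    of the patterns with the bucket values."""
    rng = np.random.default_rng(seed)
    acc = np.zeros_like(obj, dtype=float)
    mean_pat = np.zeros_like(obj, dtype=float)
    mean_bucket = 0.0
    for _ in range(n_patterns):
        pattern = rng.random(obj.shape)
        bucket = float((pattern * obj).sum())   # single-pixel measurement
        acc += bucket * pattern
        mean_pat += pattern
        mean_bucket += bucket
    return acc / n_patterns - (mean_bucket / n_patterns) * (mean_pat / n_patterns)
```

Pixels where the object transmits light correlate positively with the bucket signal, so they emerge brighter in the covariance image; embedding the payload in a QR code, as the paper does, adds error tolerance on top of this noisy reconstruction.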
A RONI Based Visible Watermarking Approach for Medical Image Authentication.
Thanki, Rohit; Borra, Surekha; Dwivedi, Vedvyas; Borisagar, Komal
2017-08-09
Nowadays, medical data in the form of image files are often exchanged between different hospitals for use in telemedicine and diagnosis. Visible watermarking, extensively used for Intellectual Property identification of such medical images, leads to serious issues if proper regions for watermark insertion are not identified. In this paper, Region of Non-Interest (RONI) based visible watermarking for medical image authentication is proposed. In this technique, the RONI of the cover medical image is first identified using a Human Visual System (HVS) model. A watermark logo is then visibly inserted into the RONI of the cover medical image to obtain the watermarked medical image. Finally, the watermarked medical image is compared with the original medical image to measure the imperceptibility and authenticity of the proposed scheme. The experimental results showed that the proposed scheme reduces computational complexity and improves the PSNR when compared to many existing schemes.
NASA Astrophysics Data System (ADS)
Mita, Akifumi; Okamoto, Atsushi; Funakoshi, Hisatoshi
2004-06-01
We have proposed an all-optical authentic memory based on a two-wave encryption method. In the recording process, the image data are encrypted into white noise by random phase masks applied to the input beam carrying the image data and to the reference beam. Only a reading beam with the phase-conjugate distribution of the reference beam can decrypt the encrypted data. If the encrypted data are read out with an incorrect phase distribution, the output is transformed into white noise. Moreover, during readout, reconstructions of the encrypted data interfere destructively, resulting in zero intensity. Our memory therefore has the merit that unlawful access can be detected easily by measuring the output beam intensity. In our encryption method, the random phase mask on the input plane plays two important roles: transforming the input image into white noise, and preventing the white noise from being decrypted back to the input image by blind deconvolution. Without this mask, if unauthorized users observe the output beam with a CCD during readout with a plane wave, they obtain exactly the intensity distribution of the Fourier transform of the input image, so the encrypted image can easily be decrypted by blind deconvolution. With the mask, even if unauthorized users observe the output beam in the same way, the encrypted image cannot be decrypted because the observed intensity distribution is randomly dispersed by the mask; the robustness is thus increased. In this report, we compare the correlation coefficients between the output image and the input image, which represent how close the output image is to white noise, with and without this mask. We show that the robustness of the encryption method is increased by the mask, as the correlation coefficient improves from 0.3 to 0.1.
Detecting Copy Move Forgery In Digital Images
NASA Astrophysics Data System (ADS)
Gupta, Ashima; Saxena, Nisheeth; Vasistha, S. K.
2012-03-01
Many image manipulation software packages are available today, and manipulation of digital images has become a serious problem. In many areas, such as medical imaging, digital forensics, journalism, and scientific publication, image forgery can be carried out very easily. Determining whether a digital image is original or doctored, and finding the marks of tampering, is a big challenge. Detection methods can be very useful in image forensics, where they can serve as proof of the authenticity of a digital image. In this paper we propose a method to detect region duplication forgery by dividing the image into overlapping blocks and then searching for duplicated regions in the image.
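The overlapping-block search can be sketched as follows. This is a toy, assumption-laden version: it reports only exact duplicates via hashing, whereas practical detectors sort robust block features to tolerate compression and noise, and the name `duplicated_blocks` is illustrative:

```python
# Copy-move detection by exhaustive overlapping-block comparison.
# Every b-by-b block is hashed; a block seen twice at distinct positions
# is reported as a candidate duplicated (copy-moved) region.
# Caveat: flat regions of a real image would flood this exact-match test.

def duplicated_blocks(img, b):
    h, w = len(img), len(img[0])
    seen = {}    # block content -> first position where it occurred
    pairs = []
    for r in range(h - b + 1):
        for c in range(w - b + 1):
            key = tuple(tuple(img[r + i][c + j] for j in range(b))
                        for i in range(b))
            if key in seen:
                pairs.append((seen[key], (r, c)))
            else:
                seen[key] = (r, c)
    return pairs
```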
NASA Astrophysics Data System (ADS)
Liu, Xiyao; Lou, Jieting; Wang, Yifan; Du, Jingyu; Zou, Beiji; Chen, Yan
2018-03-01
Authentication and copyright identification are two critical security issues for medical images. Although zero-watermarking schemes can provide durable, reliable and distortion-free protection for medical images, existing zero-watermarking schemes for medical images still face two problems. On one hand, they rarely consider distinguishability, which is critical because different medical images are sometimes similar to each other. On the other hand, their robustness against geometric attacks, such as cropping, rotation and flipping, is insufficient. In this study, a novel discriminative and robust zero-watermarking scheme (DRZW) is proposed to address these two problems. In DRZW, content-based features of medical images are first extracted with the completed local binary pattern (CLBP) operator to ensure distinguishability and robustness, especially against geometric attacks. Then, master shares and ownership shares are generated from the content-based features and the watermark according to (2,2) visual cryptography. Finally, the ownership shares are stored for authentication and copyright identification. For queried medical images, their content-based features are extracted and master shares are generated; their watermarks for authentication and copyright identification are recovered by stacking the generated master shares and the stored ownership shares. 200 medical images of 5 types were collected as the testing data, and our experimental results demonstrate that DRZW ensures both the accuracy and reliability of authentication and copyright identification. When the false positive rate is fixed at 1.00%, the average false negative rate of DRZW is only 1.75% under 20 common attacks with different parameters.
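The share-generation and stacking steps can be sketched with XOR-based (2,2) sharing, a common simplification in zero-watermarking work (classical visual cryptography instead expands pixels and stacks transparencies). The feature binarization below is a stand-in for the paper's CLBP features, and all names are illustrative:

```python
# Zero-watermarking via XOR-based (2,2) sharing (simplified sketch).
# master share  = binarized content features (nothing is embedded in the image)
# ownership share = master XOR watermark (stored by the owner)
# recovery      = master XOR ownership  -> watermark bits

def master_share(features, threshold):
    """Binarize content-based features (stand-in for CLBP features)."""
    return [1 if f >= threshold else 0 for f in features]

def make_ownership_share(master, watermark):
    return [m ^ w for m, w in zip(master, watermark)]

def recover_watermark(master, ownership):
    return [m ^ o for m, o in zip(master, ownership)]
```

Recovery survives any attack that leaves the binarized features unchanged, which is the property the CLBP features are chosen for.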
Forensic use of photo response non-uniformity of imaging sensors and a counter method.
Dirik, Ahmet Emir; Karaküçük, Ahmet
2014-01-13
Analogous to the use of bullet scratches in forensic science, the authenticity of a digital image can be verified through the noise characteristics of its imaging sensor. In particular, photo-response non-uniformity (PRNU) noise has been used in source camera identification (SCI). However, this technique can also be used maliciously to track or inculpate innocent people. To impede such tracking, PRNU noise must be suppressed significantly. Based on this motivation, we propose a counter-forensic method to deceive SCI. Experimental results show that it is possible to impede PRNU-based camera identification for various imaging sensors while preserving image quality.
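The PRNU idea that both SCI and the counter method build on can be sketched under loud assumptions: a 3-tap moving average stands in for wavelet denoising, 1-D "images" keep it short, and the fingerprint is the mean noise residual over a few images. Identification correlates a query's residual with each camera's fingerprint; the counter method's goal is to push that correlation toward zero:

```python
# Sketch of PRNU-based source camera identification (illustrative names).

def smooth(x):
    """Crude denoiser: 3-tap moving average (stand-in for wavelet denoising)."""
    n = len(x)
    return [(x[max(i - 1, 0)] + x[i] + x[min(i + 1, n - 1)]) / 3 for i in range(n)]

def residual(x):
    """Noise residual: image minus its denoised version."""
    s = smooth(x)
    return [a - b for a, b in zip(x, s)]

def fingerprint(images):
    """Sensor fingerprint: mean residual over images from the same camera."""
    res = [residual(im) for im in images]
    return [sum(r[i] for r in res) / len(images) for i in range(len(images[0]))]

def ncc(a, b):
    """Normalized cross-correlation; near 1 means 'same sensor'."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [x - mb for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = (sum(x * x for x in da) * sum(y * y for y in db)) ** 0.5
    return num / den if den else 0.0
```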
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru
2007-03-01
The multislice CT scanner has remarkably increased the speed at which chest CT images are acquired for mass screening. Mass screening based on multislice CT images requires a considerable number of images to be read, and it is this time-consuming step that currently makes the use of helical CT for mass screening impractical. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. Moreover, we have supported specialists by building the lung cancer screening algorithm into a mobile helical CT scanner for lung cancer mass screening in regions without a hospital. We have also developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using Virtual Private Network routers together with biometric fingerprint and face authentication systems for the safety of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies conducted to evaluate this new system.
Feature maps driven no-reference image quality prediction of authentically distorted images
NASA Astrophysics Data System (ADS)
Ghadiyaram, Deepti; Bovik, Alan C.
2015-03-01
Current blind image quality prediction models rely on benchmark databases comprised of singly and synthetically distorted images, thereby learning image features that are only adequate to predict human perceived visual quality on such inauthentic distortions. However, real world images often contain complex mixtures of multiple distortions. Rather than a) discounting the effect of these mixtures of distortions on an image's perceptual quality and considering only the dominant distortion or b) using features that are only proven to be efficient for singly distorted images, we deeply study the natural scene statistics of authentically distorted images, in different color spaces and transform domains. We propose a feature-maps-driven statistical approach which avoids any latent assumptions about the type of distortion(s) contained in an image, and focuses instead on modeling the remarkable consistencies in the scene statistics of real world images in the absence of distortions. We design a deep belief network that takes model-based statistical image features derived from a very large database of authentically distorted images as input and discovers good feature representations by generalizing over different distortion types, mixtures, and severities, which are later used to learn a regressor for quality prediction. We demonstrate the remarkable competence of our features for improving automatic perceptual quality prediction on a benchmark database and on the newly designed LIVE Authentic Image Quality Challenge Database and show that our approach of combining robust statistical features and the deep belief network dramatically outperforms the state-of-the-art.
Image authentication by means of fragile CGH watermarking
NASA Astrophysics Data System (ADS)
Schirripa Spagnolo, Giuseppe; Simonetti, Carla; Cozzella, Lorenzo
2005-09-01
In this paper we propose a fragile marking system based on Computer Generated Hologram (CGH) coding techniques, which is able to detect malicious tampering while tolerating some incidental distortions. A fragile watermark is a mark that is readily altered or destroyed when the host image is modified through a linear or nonlinear transformation. A fragile watermark monitors the integrity of the content of the image but not its numerical representation; the watermark is therefore designed so that integrity is proven if the content of the image has not been tampered with. Since digital images can be altered or manipulated with ease, the ability to detect changes to digital images is very important for many applications such as news reporting, medical archiving, and legal use. The proposed technique can be applied to color images as well as to gray-scale ones. Using CGH watermarking, the embedded mark can easily be recovered by means of a Fourier transform. Because of this, a host image could be tampered with and re-watermarked with the same holographic pattern; to prevent this possibility we have introduced an encryption method using asymmetric cryptography. The proposed scheme is based on knowledge of the original mark from the authentication
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Mori, Kiyoshi; Eguchi, Kenji; Kaneko, Masahiro; Kakinuma, Ryutarou; Moriyama, Noriyuki; Ohmatsu, Hironobu; Masuda, Hideo; Machida, Suguru; Sasagawa, Michizou
2006-03-01
The multi-helical CT scanner has remarkably increased the speed at which chest CT images are acquired for mass screening. Mass screening based on multi-helical CT images requires a considerable number of images to be read, and it is this time-consuming step that currently makes the use of helical CT for mass screening impractical. To overcome this problem, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images and a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification. We have also developed an electronic medical recording system and a prototype internet system for community health across two or more regions, using Virtual Private Network routers together with biometric fingerprint and face authentication systems for the safety of medical information. Based on these diagnostic assistance methods, we have now developed a new computer-aided workstation and database that can display suspected lesions three-dimensionally in a short time. This paper describes basic studies conducted to evaluate this new system. The results of this study indicate that our computer-aided diagnosis workstation and network system can increase diagnostic speed, diagnostic accuracy, and the safety of medical information.
Galbally, Javier; Marcel, Sébastien; Fierrez, Julian
2014-02-01
To ensure the actual presence of a real legitimate trait in contrast to a fake self-manufactured synthetic or reconstructed sample is a significant problem in biometric authentication, which requires the development of new and efficient protection measures. In this paper, we present a novel software-based fake detection method that can be used in multiple biometric systems to detect different types of fraudulent access attempts. The objective of the proposed system is to enhance the security of biometric recognition frameworks, by adding liveness assessment in a fast, user-friendly, and non-intrusive manner, through the use of image quality assessment. The proposed approach presents a very low degree of complexity, which makes it suitable for real-time applications, using 25 general image quality features extracted from one image (i.e., the same acquired for authentication purposes) to distinguish between legitimate and impostor samples. The experimental results, obtained on publicly available data sets of fingerprint, iris, and 2D face, show that the proposed method is highly competitive compared with other state-of-the-art approaches and that the analysis of the general image quality of real biometric samples reveals highly valuable information that may be very efficiently used to discriminate them from fake traits.
Maione, Camila; Barbosa, Rommel Melgaço
2018-01-24
Rice is one of the most important staple foods around the world. Authentication of rice is one of the most addressed concerns in the present literature, which includes recognition of its geographical origin and variety, certification of organic rice and many other issues. Good results have been achieved by multivariate data analysis and data mining techniques when combined with specific parameters for ascertaining authenticity and many other useful characteristics of rice, such as quality, yield and others. This paper brings a review of the recent research projects on discrimination and authentication of rice using multivariate data analysis and data mining techniques. We found that data obtained from image processing, molecular and atomic spectroscopy, elemental fingerprinting, genetic markers, molecular content and others are promising sources of information regarding geographical origin, variety and other aspects of rice, being widely used combined with multivariate data analysis techniques. Principal component analysis and linear discriminant analysis are the preferred methods, but several other data classification techniques such as support vector machines, artificial neural networks and others are also frequently present in some studies and show high performance for discrimination of rice.
System and method for authentication of goods
Kaish, Norman; Fraser, Jay; Durst, David I.
1999-01-01
An authentication system comprising a medium having a plurality of elements, the elements being distinctive, detectable, and disposed in an irregular pattern or having an intrinsic irregularity. Each element is characterized by a determinable attribute distinct from a two-dimensional coordinate representation of simple optical absorption or simple optical reflection intensity. An attribute and position of the plurality of elements, with respect to a positional reference, is detected. A processor generates an encrypted message including at least a portion of the attribute and position of the plurality of elements. The encrypted message is recorded in physical association with the medium. The elements are preferably dichroic fibers, and the attribute is preferably a polarization or dichroic axis, which may vary over the length of a fiber. The medium may then be authenticated against the encrypted message within a statistical tolerance, based on a vector mapping of the elements of the medium, without requiring a complete image of the medium and its elements to be recorded.
NASA Astrophysics Data System (ADS)
Bekkouche, S.; Chouarfia, A.
2011-06-01
Image watermarking can be defined as a technique that allows the insertion of imperceptible and indelible digital data into an image. Beyond its initial application, copyright protection, watermarking can be used in other fields, particularly the medical field, to help secure images shared on networks for telemedicine applications. In this report we study several watermarking methods and the results of their combination. The first is based on CDMA (Code Division Multiple Access) in the DWT and spatial domains, and its aim is to verify image authenticity. The second is reversible watermarking (least-significant-bit (LSB) embedding with cryptographic tools) and reversible contrast mapping (RCM); its objective is to check the integrity of the image and to keep the patient data confidential. The new watermarking scheme combines the LSB-based reversible method with cryptographic tools and the CDMA method in the spatial and DWT domains to verify three security properties of medical data and patient information: integrity, authenticity, and confidentiality. Finally, we compare these methods in terms of medical image quality. An in-depth study of the characteristics of medical images should help improve these methods, mitigate their limits, and optimize the results. Tests were performed on MRI images, and quality measurements were made on the watermarked images to verify that the technique does not lead to a wrong diagnosis. The robustness of the watermarked images against attacks was assessed using PSNR, SNR, MSE, and MAE, and the experimental results demonstrate that the proposed algorithm is more robust in the DWT domain than in the spatial domain.
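The LSB side of the combined scheme can be sketched as follows. This is a minimal stand-in for reversible watermarking (the original LSBs are kept as side information so the exact cover can be restored), not the paper's RCM or CDMA components, and the names are illustrative:

```python
# LSB embedding with exact-recovery side information (toy sketch).

def lsb_embed(pixels, bits):
    """Embed bits into the LSBs of the first len(bits) pixels."""
    saved = [p & 1 for p in pixels[:len(bits)]]   # original LSBs, for reversibility
    marked = [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]
    return marked, saved

def lsb_extract(marked, n):
    """Read back the n embedded bits."""
    return [p & 1 for p in marked[:n]]

def lsb_restore(marked, saved):
    """Undo the embedding exactly, using the saved LSBs."""
    return [(p & ~1) | b for p, b in zip(marked, saved)] + marked[len(saved):]
```

Changing only LSBs perturbs each pixel by at most 1 gray level, which is why quality metrics such as PSNR stay high on the watermarked image.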
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Tyler Barratt; Urrea, Jorge Mario
2012-06-01
The aim of the Authenticating Cache architecture is to ensure that machine instructions in a Read Only Memory (ROM) are legitimate from the time the ROM image is signed (immediately after compilation) to the time they are placed in the cache for the processor to consume. The proposed architecture allows the detection of ROM image modifications during distribution or when the image is loaded into memory. It also ensures that modified instructions will not execute in the processor, as the cache will not be loaded with a page that fails an integrity check. The authenticity of the instruction stream can also be verified in this architecture. The combination of integrity and authenticity assurance greatly improves the security profile of a system.
Blind identification of image manipulation type using mixed statistical moments
NASA Astrophysics Data System (ADS)
Jeong, Bo Gyu; Moon, Yong Ho; Eom, Il Kyu
2015-01-01
We present a blind identification of image manipulation types such as blurring, scaling, sharpening, and histogram equalization. Motivated by the fact that image manipulations can change the frequency characteristics of an image, we introduce three types of feature vectors composed of statistical moments. The proposed statistical moments are generated from separated wavelet histograms, the characteristic functions of the wavelet variance, and the characteristic functions of the spatial image. Our method can solve the n-class classification problem. Through experimental simulations, we demonstrate that our proposed method can achieve high performance in manipulation type detection. The average rate of the correctly identified manipulation types is as high as 99.22%, using 10,800 test images and six manipulation types including the authentic image.
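The histogram-moment features can be sketched as follows, assuming plain spatial histograms for brevity (the paper also draws moments from wavelet sub-band histograms and from characteristic functions), with illustrative names throughout:

```python
# Statistical moments of a normalized histogram as manipulation features.
# Manipulations such as blurring or histogram equalization reshape the
# histogram, which these moments capture; the classifier stage is omitted.

def histogram(values, nbins, lo, hi):
    """Normalized histogram of values over [lo, hi)."""
    counts = [0] * nbins
    width = (hi - lo) / nbins
    for v in values:
        i = min(int((v - lo) / width), nbins - 1)
        counts[i] += 1
    total = len(values)
    return [c / total for c in counts]

def moments(hist, n):
    """First n raw moments of the histogram over bin indices."""
    return [sum((i ** k) * p for i, p in enumerate(hist))
            for k in range(1, n + 1)]
```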
An optical authentication system based on imaging of excitation-selected lanthanide luminescence.
Carro-Temboury, Miguel R; Arppe, Riikka; Vosch, Tom; Sørensen, Thomas Just
2018-01-01
Secure data encryption relies heavily on one-way functions, and copy protection relies on features that are difficult to reproduce. We present an optical authentication system based on lanthanide luminescence from physical one-way functions or physical unclonable functions (PUFs). They cannot be reproduced and thus enable unbreakable encryption. Further, PUFs will prevent counterfeiting if tags with unique PUFs are grafted onto products. We have developed an authentication system that comprises a hardware reader, image analysis and authentication software, and physical keys that we demonstrate as an anticounterfeiting system. The physical keys are PUFs made from random patterns of taggants in polymer films on glass that can be imaged following selected excitation of particular lanthanide(III) ions doped into the individual taggants. This form of excitation-selected imaging ensures that by using at least two lanthanide(III) ion dopants, the random patterns cannot be copied, because the excitation selection will fail when using any other emitter. With the developed reader and software, the random patterns are read and digitized, which allows a digital pattern to be stored. This digital pattern or digital key can be used to authenticate the physical key in anticounterfeiting or to encrypt any message. The PUF key was produced with a staggering nominal encoding capacity of 7^3600. Although the encoding capacity of the realized authentication system reduces to 6 × 10^104, it is more than sufficient to completely preclude counterfeiting of products.
A network identity authentication system based on Fingerprint identification technology
NASA Astrophysics Data System (ADS)
Xia, Hong-Bin; Xu, Wen-Bo; Liu, Yuan
2005-10-01
Fingerprint verification is one of the most reliable personal identification methods. However, most automatic fingerprint identification systems (AFIS) do not run in an Internet/intranet environment, which today's growing electronic commerce requires. This paper describes the design and implementation of a prototype identity authentication system based on fingerprint biometrics that can run in an Internet environment. In our system, COM and ASP technology are used to integrate fingerprint technology with web database technology: the fingerprint image preprocessing algorithms are programmed into COM components deployed on the Internet Information Server. The system's design and structure are presented and the key points are discussed. The prototype system has been successfully tested and evaluated in our university's distance education applications in an Internet environment.
Personal Identification Using Fingernail Image Based on Correlation of Density Block
NASA Astrophysics Data System (ADS)
Noda, Mayumi; Saitoh, Fumihiko
This paper proposes an authentication method that applies block segmentation matching to fingernail images. The fingernail is treated as a new physical characteristic for biometric authentication. The proposed system is more acceptable than fingerprint authentication where psychological resistance and conformability are concerns. Since block segmentation matching handles occlusion of an object well, it is expected to be robust to partial changes of the fingernail and to enhance the differences between the fingernails of different persons. Experimental images with various fingernail lengths and painted manicure were used to evaluate system performance. The experimental results show that the proposed system achieves sufficient accuracy to identify individuals.
Edge detection techniques for iris recognition system
NASA Astrophysics Data System (ADS)
Tania, U. T.; Motakabber, S. M. A.; Ibrahimy, M. I.
2013-12-01
Nowadays security and authentication are major parts of our daily life. The iris is one of the most reliable parts of the human body for identification and authentication purposes. To develop an iris authentication algorithm for personal identification, this paper examines two edge detection techniques for an iris recognition system. Between the Sobel and the Canny edge detection techniques, the experimental results show that the Canny technique is better at detecting points in a digital image where the gray level changes, even when it changes slowly.
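For reference, the Sobel operator being compared amounts to two 3x3 gradient kernels; Canny adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding on top of such gradients. A minimal sketch:

```python
# Sobel gradient magnitude (border pixels left at zero for simplicity).

KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel

def sobel_magnitude(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            gx = sum(KX[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            gy = sum(KY[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            out[r][c] = (gx * gx + gy * gy) ** 0.5
    return out
```

On a vertical step edge the magnitude peaks along the boundary and is zero in flat regions, which is the behavior the comparison in the abstract probes.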
DICOM image secure communications with Internet protocols IPv6 and IPv4.
Zhang, Jianguo; Yu, Fenghai; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen
2007-01-01
Image-data transmission from one site to another through a public network is usually characterized in terms of privacy, authenticity, and integrity. In this paper, we first describe a general scenario of how an image is delivered from one site to another through a wide-area network (WAN) with the security features of data privacy, integrity, and authenticity. Second, we give a common implementation method for a digital imaging and communications in medicine (DICOM) image communication software library with IPv6/IPv4 for high-speed broadband Internet using open-source software. Third, we discuss the two major secure-transmission methods currently used in medical image data communication with privacy support: IP security (IPsec) and the secure sockets layer (SSL) or transport layer security (TLS). Fourth, we describe a test schema of multiple-modality DICOM image communications through TCP/IPv4 and TCP/IPv6 with different security methods, different security algorithms, and different operating systems, and we evaluate the test results. We found that there are tradeoffs between choosing IPsec- and SSL/TLS-based security implementations of the IPv6/IPv4 protocols. If the WAN uses only IPv6, as in high-speed broadband Internet, the choice is IPsec-based security; if the networks are IPv4 or a combination of IPv6 and IPv4, it is better to use SSL/TLS security. The Linux platform has more security algorithms implemented than the Windows (XP) platform and achieved better performance in most experiments of IPv6- and IPv4-based DICOM image communications. In teleradiology or enterprise PACS applications, the Linux operating system may be the better choice as the peer security gateway for both IPsec- and SSL/TLS-based secure DICOM communications across public networks.
Crypto-Watermarking of Transmitted Medical Images.
Al-Haj, Ali; Mohammad, Ahmad; Amer, Alaa'
2017-02-01
Telemedicine is a booming healthcare practice that has facilitated the exchange of medical data and expertise between healthcare entities. However, the widespread use of telemedicine applications requires a secured scheme to guarantee confidentiality and verify authenticity and integrity of exchanged medical data. In this paper, we describe a region-based, crypto-watermarking algorithm capable of providing confidentiality, authenticity, and integrity for medical images of different modalities. The proposed algorithm provides authenticity by embedding robust watermarks in images' region of non-interest using SVD in the DWT domain. Integrity is provided in two levels: strict integrity implemented by a cryptographic hash watermark, and content-based integrity implemented by a symmetric encryption-based tamper localization scheme. Confidentiality is achieved as a byproduct of hiding patient's data in the image. Performance of the algorithm was evaluated with respect to imperceptibility, robustness, capacity, and tamper localization, using different medical images. The results showed the effectiveness of the algorithm in providing security for telemedicine applications.
The Potential of Using Brain Images for Authentication
Zhou, Zongtan; Shen, Hui; Hu, Dewen
2014-01-01
Biometric recognition (also known as biometrics) refers to the automated recognition of individuals based on their biological or behavioral traits. Examples of biometric traits include fingerprint, palmprint, iris, and face. The brain is the most important and complex organ in the human body. Can it be used as a biometric trait? In this study, we analyze the uniqueness of the brain and try to use the brain for identity authentication. The proposed brain-based verification system operates in two stages: gray matter extraction and gray matter matching. A modified brain segmentation algorithm is implemented for extracting gray matter from an input brain image. Then, an alignment-based matching algorithm is developed for brain matching. Experimental results on two data sets show that the proposed brain recognition system meets the high accuracy requirement of identity authentication. Though currently the acquisition of the brain is still time consuming and expensive, brain images are highly unique and have the potential possibility for authentication in view of pattern recognition. PMID:25126604
The potential of using brain images for authentication.
Chen, Fanglin; Zhou, Zongtan; Shen, Hui; Hu, Dewen
2014-01-01
Biometric recognition (also known as biometrics) refers to the automated recognition of individuals based on their biological or behavioral traits. Examples of biometric traits include fingerprint, palmprint, iris, and face. The brain is the most important and complex organ in the human body. Can it be used as a biometric trait? In this study, we analyze the uniqueness of the brain and try to use the brain for identity authentication. The proposed brain-based verification system operates in two stages: gray matter extraction and gray matter matching. A modified brain segmentation algorithm is implemented for extracting gray matter from an input brain image. Then, an alignment-based matching algorithm is developed for brain matching. Experimental results on two data sets show that the proposed brain recognition system meets the high accuracy requirement of identity authentication. Though currently the acquisition of the brain is still time consuming and expensive, brain images are highly unique and have the potential possibility for authentication in view of pattern recognition.
Line-scan macro-scale Raman chemical imaging for authentication of powdered foods and ingredients
USDA-ARS?s Scientific Manuscript database
Adulteration and fraud for powdered foods and ingredients are rising food safety risks that threaten consumers’ health. In this study, a newly developed line-scan macro-scale Raman imaging system using a 5 W 785 nm line laser as excitation source was used to authenticate the food powders. The system...
Blind technique using blocking artifacts and entropy of histograms for image tampering detection
NASA Astrophysics Data System (ADS)
Manu, V. T.; Mehtre, B. M.
2017-06-01
The tremendous technological advancements of recent times have enabled people to create, edit, and circulate images more easily than ever before. As a result, ensuring the integrity and authenticity of images has become challenging. Malicious editing of images to deceive the viewer is referred to as image tampering. A widely used image tampering technique is image splicing or compositing, in which regions from different images are copied and pasted. In this paper, we propose a tamper detection method utilizing the blocking and blur artifacts which are the footprints of splicing. Images are classified as tampered or not based on the standard deviations of the entropy histograms and of block discrete cosine transforms. If an image is classified as tampered, we can detect the exact boundaries of the tampered area. Experimental results on publicly available image tampering datasets show that the proposed method outperforms existing methods in terms of accuracy.
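The entropy feature can be sketched as follows; the block-DCT side and the classifier are omitted, so this is only the first ingredient of such a method, with illustrative names:

```python
import math

# Shannon entropy of a block's gray-level histogram.
# Splicing tends to disturb the entropy statistics of local blocks,
# so the spread (standard deviation) of these values over an image
# is a usable tamper-detection feature.

def shannon_entropy(block):
    counts = {}
    for row in block:
        for v in row:
            counts[v] = counts.get(v, 0) + 1
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```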
Spread spectrum image steganography.
Marvel, L M; Boncelet, C R; Retter, C T
1999-01-01
In this paper, we present a new method of digital steganography, entitled spread spectrum image steganography (SSIS). Steganography, which means "covered writing" in Greek, is the science of communicating in a hidden manner. Following a discussion of steganographic communication theory and review of existing techniques, the new method, SSIS, is introduced. This system hides and recovers a message of substantial length within digital imagery while maintaining the original image size and dynamic range. The hidden message can be recovered using appropriate keys without any knowledge of the original image. Image restoration, error-control coding, and techniques similar to spread spectrum are described, and the performance of the system is illustrated. A message embedded by this method can be in the form of text, imagery, or any other digital signal. Applications for such a data-hiding scheme include in-band captioning, covert communication, image tamperproofing, authentication, embedded control, and revision tracking.
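A minimal sketch of the spread-spectrum principle behind SSIS, on a one-dimensional pixel sequence: each bit is spread over many keyed pseudo-noise chips and added at low gain, then recovered by correlation. SSIS proper adds image restoration and error-control coding, and the gain, chip length, and helper names here are assumptions:

```python
import random

def pn_sequence(key, n):
    """Keyed pseudo-noise sequence of +/-1 chips."""
    rng = random.Random(key)
    return [rng.choice((-1, 1)) for _ in range(n)]

def embed(cover, bits, key, gain=4):
    """Spread each bit over len(cover)//len(bits) chips, added at low gain."""
    chips = len(cover) // len(bits)
    pn = pn_sequence(key, chips * len(bits))
    stego = list(cover)
    for i, b in enumerate(bits):
        s = 1 if b else -1
        for j in range(chips):
            stego[i * chips + j] += gain * s * pn[i * chips + j]
    return stego

def extract(stego, cover_estimate, nbits, key):
    """Correlate the residual with the keyed PN sequence; the sign is the bit.
    SSIS estimates the cover with a restoration filter; here it is given."""
    chips = len(stego) // nbits
    pn = pn_sequence(key, chips * nbits)
    out = []
    for i in range(nbits):
        corr = sum((stego[i * chips + j] - cover_estimate[i * chips + j])
                   * pn[i * chips + j] for j in range(chips))
        out.append(1 if corr > 0 else 0)
    return out
```

Without the key, the embedded signal is statistically indistinguishable from low-level noise, which is what makes the communication covert.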
ERIC Educational Resources Information Center
Mattord, Herbert J.
2012-01-01
Organizations continue to rely on password-based authentication methods to control access to many Web-based systems. This research study developed a benchmarking instrument intended to assess authentication methods used in Web-based information systems (IS). It developed an Authentication Method System Index (AMSI) to analyze collected data from…
Optical authentication based on moiré effect of nonlinear gratings in phase space
NASA Astrophysics Data System (ADS)
Liao, Meihua; He, Wenqi; Wu, Jiachen; Lu, Dajiang; Liu, Xiaoli; Peng, Xiang
2015-12-01
An optical authentication scheme based on the moiré effect of nonlinear gratings in phase space is proposed. According to the phase function relationship of the moiré effect in phase space, an arbitrary authentication image can be encoded into two nonlinear gratings which serve as the authentication lock (AL) and the authentication key (AK). The AL is stored in the authentication system while the AK is assigned to the authorized user. The authentication procedure can be performed using an optoelectronic approach, while the design process is accomplished by a digital approach. Furthermore, this optical authentication scheme can be extended for multiple users with different security levels. The proposed scheme can not only verify the legality of a user identity, but can also discriminate and control the security levels of legal users. Theoretical analysis and simulation experiments are provided to verify the feasibility and effectiveness of the proposed scheme.
Perceptual quality prediction on authentically distorted images using a bag of features approach
Ghadiyaram, Deepti; Bovik, Alan C.
2017-01-01
Current top-performing blind perceptual image quality prediction models are generally trained on legacy databases of human quality opinion scores on synthetically distorted images. Therefore, they learn image features that effectively predict human visual quality judgments of inauthentic and usually isolated (single) distortions. However, real-world images usually contain complex composite mixtures of multiple distortions. We study the perceptually relevant natural scene statistics of such authentically distorted images in different color spaces and transform domains. We propose a “bag of feature maps” approach that avoids assumptions about the type of distortion(s) contained in an image and instead focuses on capturing consistencies—or departures therefrom—of the statistics of real-world images. Using a large database of authentically distorted images, human opinions of them, and bags of features computed on them, we train a regressor to conduct image quality prediction. We demonstrate the competence of the features toward improving automatic perceptual quality prediction by testing a learned algorithm using them on a benchmark legacy database as well as on a newly introduced distortion-realistic resource called the LIVE In the Wild Image Quality Challenge Database. We extensively evaluate the perceptual quality prediction model and algorithm and show that it is able to achieve good-quality prediction power that is better than other leading models. PMID:28129417
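Mean-subtracted contrast-normalized (MSCN) coefficients are a standard natural-scene-statistics ingredient in such blind quality models. The sketch below uses a uniform local window and trivial moment features as simplifying assumptions; real models use Gaussian-weighted windows, several color spaces and transform domains, and many more feature maps:

```python
import numpy as np

def mscn(img, win=3, C=1.0):
    """MSCN coefficients with a uniform local window (a simplification;
    NSS models typically use a Gaussian-weighted window)."""
    img = img.astype(np.float64)
    pad = win // 2
    p = np.pad(img, pad, mode='edge')
    h, w = img.shape
    mu = np.empty((h, w))
    sd = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            patch = p[y:y + win, x:x + win]
            mu[y, x] = patch.mean()
            sd[y, x] = patch.std()
    return (img - mu) / (sd + C)

def bag_of_stats(img):
    """Tiny 'bag of features': first two sample moments of the MSCN map."""
    m = mscn(img)
    return [float(m.mean()), float(m.var())]
```

Departures of these statistics from those of pristine natural images are what the regressor maps to a quality score.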
Xie, Shan Juan; Lu, Yu; Yoon, Sook; Yang, Jucheng; Park, Dong Sun
2015-01-01
Finger vein recognition has been considered one of the most promising biometrics for personal authentication. However, the composition and proportions of finger tissues (e.g., bone, muscle, ligament, water, fat, etc.) vary from person to person. This often causes poor quality in finger vein images, thereby degrading the performance of finger vein recognition systems (FVRSs). In this paper, the intrinsic factors of finger tissue causing poor quality of finger vein images are analyzed, and an intensity variation (IV) normalization method using guided filter based single scale retinex (GFSSR) is proposed for finger vein image enhancement. The experimental results on two public datasets demonstrate the effectiveness of the proposed method in enhancing both image quality and finger vein recognition accuracy. PMID:26184226
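The single-scale retinex at the heart of GFSSR divides out an estimate of the illumination, which in log space is a subtraction. In this sketch a plain box filter stands in for the guided filter, which is an assumption made to keep the example self-contained:

```python
import numpy as np

def box_filter(img, r=2):
    """Plain box filter standing in for the guided filter of GFSSR."""
    p = np.pad(img, r, mode='edge')
    h, w = img.shape
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            out[y, x] = p[y:y + 2 * r + 1, x:x + 2 * r + 1].mean()
    return out

def ssr(img, r=2):
    """Single-scale retinex: log(image) minus log(estimated illumination)."""
    i = img.astype(np.float64) + 1.0   # offset avoids log(0)
    return np.log(i) - np.log(box_filter(i, r))
```

The edge-preserving property of the true guided filter matters in practice: it keeps vein boundaries out of the illumination estimate, so they survive normalization.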
Ju, Seung-hwan; Seo, Hee-suk; Han, Sung-hyu; Ryou, Jae-cheol; Kwak, Jin
2013-01-01
The prevalence of computers and the development of the Internet have made information easy to access. As concern about user information security grows, so does interest in user authentication methods. The most common computer authentication method is the use of alphanumeric usernames and passwords. Password authentication is convenient but vulnerable: anyone who knows the password is authenticated. Fingerprint authentication is strong because only the user possesses the biometric on which it depends, but it has the disadvantage that the user cannot change the authentication key. In this study, we propose an authentication methodology that combines a numeric password with biometric fingerprint authentication. Information from the user's fingerprint is used to obtain secure authentication keys, while the numeric password allows the keys to be changed easily, giving the design flexibility.
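One plausible way to realize the combination the abstract describes is to derive the authentication key from both factors with a key derivation function; the paper's actual construction is not specified, so the PBKDF2 approach, the parameter values, and the names below are all assumptions:

```python
import hashlib

def derive_auth_key(password: str, fingerprint_template: bytes,
                    salt: bytes) -> bytes:
    """Bind a changeable numeric password to a fixed fingerprint template:
    re-keying needs only a new password, not re-enrollment of the finger."""
    material = password.encode() + fingerprint_template
    return hashlib.pbkdf2_hmac('sha256', material, salt, 100_000)
```

Changing the password yields an entirely different key even though the biometric template is immutable, which is exactly the flexibility the abstract claims.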
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kaneko, Masahiro; Kakinuma, Ryutaru; Moriyama, Noriyuki
2011-03-01
We have developed a teleradiology network system with a new information security solution that provides a web medical image conference system. In a teleradiology network, the security of the information network is a very important subject. We are studying the secret sharing scheme as a method to safely store and transmit the confidential medical information used in the teleradiology network system, since this information is exposed to the risk of damage and interception. Secret sharing is a method of dividing confidential medical information into two or more tallies; the information cannot be decoded from any single tally. Our method also has a RAID-like function: if a single tally fails, redundant data has already been copied to another tally. Because no single tally reveals anything, the tallies are preserved at separate Data Centers connected through the Internet. Therefore, even if one of the Data Centers is struck and its information is damaged, the confidential medical information can be decoded using the tallies preserved at the Data Centers that escaped damage. We can safely share the screen of a workstation displaying the medical images of a Data Center from two or more web conference terminals at the same time. Moreover, a real-time biometric face authentication system is connected to each Data Center; it analyzes the features of the face image captured by the camera within 20 seconds and defends the safety of the medical information. We propose a new information transmission method and a new information storage method with this new information security solution.
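The simplest secret sharing scheme with the "one tally reveals nothing" property is the n-of-n XOR split sketched below. The paper's scheme additionally adds RAID-style redundancy so a lost tally can be tolerated; that threshold behavior is omitted here, and the function names are illustrative:

```python
import os
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret: bytes, n: int = 3) -> list:
    """Split into n tallies; any n-1 of them are statistically independent
    of the secret, so a single stolen tally reveals nothing."""
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    shares.append(reduce(_xor, shares, secret))
    return shares

def combine(shares) -> bytes:
    """XOR of all tallies restores the secret."""
    return reduce(_xor, shares)
```

A deployment that must survive a destroyed Data Center would instead use a k-of-n threshold scheme such as Shamir's, where any k tallies suffice.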
A High Speed Finger-Print Optical Scanning Method
2000-01-01
Among biometric technologies for authentication, from the viewpoint of convenience and higher security, dactyloscopy is by far the best, much better than the... Among sensing technologies using static capacitance, thermal, or optical detection, optical detection has by far the most potential to meet the... at the present time due to the low resolution inherent in thermal imaging techniques. Besides, this method is easily influenced by environmental
Lunar-edge based on-orbit modulation transfer function (MTF) measurement
NASA Astrophysics Data System (ADS)
Cheng, Ying; Yi, Hongwei; Liu, Xinlong
2017-10-01
Modulation transfer function (MTF) is an important parameter for image quality evaluation of on-orbit optical imaging systems. Various methods have been proposed to determine the MTF of an imaging system based on images containing point, pulse, and edge features. In this paper, the edge of the moon is used as a high-contrast target to measure the on-orbit MTF of imaging systems with knife-edge methods. The proposed method is an extension of the ISO 12233 slanted-edge spatial frequency response test, except that the shape of the edge is a circular arc instead of a straight line. To get more accurate edge locations, and thus a more authentic edge spread function (ESF), we fit the lunar edge with a least-squares circular fitting method during sub-pixel edge detection. Finally, simulation results show that the MTF value at the Nyquist frequency calculated with our lunar-edge method is reliable and accurate, with an error of less than 2% compared with the theoretical MTF value.
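Once the ESF has been recovered from the fitted edge, the final step of any knife-edge method is the same: differentiate to the line spread function and take the normalized spectrum magnitude. A minimal sketch (ignoring the oversampling, binning, and windowing a real ISO 12233 implementation performs):

```python
import numpy as np

def mtf_from_esf(esf):
    """Knife-edge MTF: differentiate the edge spread function (ESF) to get
    the line spread function (LSF), then normalize its spectrum magnitude."""
    lsf = np.diff(esf)
    spec = np.abs(np.fft.rfft(lsf))
    return spec / spec[0]
```

For an ideal step edge the LSF is an impulse, so the MTF is flat at 1; any real optical system rolls off toward the Nyquist frequency.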
Review of passive-blind detection in digital video forgery based on sensing and imaging techniques
NASA Astrophysics Data System (ADS)
Tao, Junjie; Jia, Lili; You, Ying
2016-01-01
Advances in digital video compression and IP communication technologies have raised new issues and challenges concerning the integrity and authenticity of surveillance videos. It is critical that the system ensure that, once recorded, a video cannot be altered, so that the audit trail remains intact for evidential purposes. This paper gives an overview of passive techniques of digital video forensics, which are based on intrinsic fingerprints inherent in digital surveillance videos. We performed a thorough survey of the literature on video manipulation detection methods that accomplish blind authentication without referring to any auxiliary information. We present a review of various existing methods; much more work remains to be done in this field of video forensics based on video data analysis and observation of surveillance systems.
Evaluation of security algorithms used for security processing on DICOM images
NASA Astrophysics Data System (ADS)
Chen, Xiaomeng; Shuai, Jie; Zhang, Jianguo; Huang, H. K.
2005-04-01
In this paper, we developed a security approach to provide security measures and features for PACS image acquisition and teleradiology image transmission. The security processing of medical images was based on a public key infrastructure (PKI) and included digital signatures and data encryption to achieve confidentiality, privacy, authenticity, integrity, and non-repudiation. Many algorithms can be used in a PKI for data encryption and digital signatures. In this research, we selected several algorithms to perform security processing on different DICOM images in a PACS environment, evaluated their security processing performance, and examined how performance relates to image type, image size, and implementation method.
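The integrity and authenticity goals can be illustrated with a keyed digest over the image bytes. This is a deliberate simplification: the paper's PKI approach would sign the SHA-256 digest with the sender's private key (RSA/ECDSA) so that non-repudiation also holds, which a shared-key HMAC cannot provide:

```python
import hashlib
import hmac

def protect(image_bytes: bytes, key: bytes) -> str:
    """Integrity/authenticity tag over the DICOM image bytes. A PKI
    deployment would instead sign the digest with a private key."""
    return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, key: bytes, tag: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(protect(image_bytes, key), tag)
```

Any single-bit modification of the transmitted image invalidates the tag, which is the integrity property the evaluation measures.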
Trustworthiness and Authenticity: Alternate Ways To Judge Authentic Assessments.
ERIC Educational Resources Information Center
Hipps, Jerome A.
New methods are needed to judge the quality of alternative student assessment, methods which complement the philosophy underlying authentic assessments. This paper examines assumptions underlying validity, reliability, and objectivity, and why they are not matched to authentic assessment, concentrating on the constructivist paradigm of E. Guba and…
Winner, Taryn L; Lanzarotta, Adam; Sommer, André J
2016-06-01
An effective method for detecting and characterizing counterfeit finished dosage forms and packaging materials is described in this study. Using attenuated total internal reflection Fourier transform infrared spectroscopic imaging, suspect tablet coating and core formulations as well as multi-layered foil safety seals, bottle labels, and cigarette tear tapes were analyzed and compared directly with those of a stored authentic product. The approach was effective for obtaining molecular information from structures as small as 6 μm.
Chao, Hui-Mei; Hsu, Chin-Ming; Miaou, Shaou-Gang
2002-03-01
A data-hiding technique called the "bipolar multiple-number base" was developed to provide capabilities of authentication, integration, and confidentiality for an electronic patient record (EPR) transmitted among hospitals through the Internet. The proposed technique is capable of hiding those EPR related data such as diagnostic reports, electrocardiogram, and digital signatures from doctors or a hospital into a mark image. The mark image could be the mark of a hospital used to identify the origin of an EPR. Those digital signatures from doctors and a hospital could be applied for the EPR authentication. Thus, different types of medical data can be integrated into the same mark image. The confidentiality is ultimately achieved by decrypting the EPR related data and digital signatures with an exact copy of the original mark image. The experimental results validate the integrity and the invisibility of the hidden EPR related data. This newly developed technique allows all of the hidden data to be separated and restored perfectly by authorized users.
Comparative study of palm print authentication system using geometric features
NASA Astrophysics Data System (ADS)
Shreyas, Kamath K. M.; Rajeev, Srijith; Panetta, Karen; Agaian, Sos S.
2017-05-01
Biometrics, particularly palm print authentication, has been a stimulating research area due to its abundance of features. Stable features and effective matching are the most crucial steps for an authentication system. In conventional palm print authentication systems, matching is based on flexion creases, friction ridges, and minutiae points. Currently, contactless palm print imaging is an emerging technology. However, such systems tend to involve fluctuations in image quality and texture loss due to factors such as varying illumination conditions, occlusions, noise, pose, and ghosting. These variations decrease the performance of the authentication systems. Furthermore, real-time palm print authentication in large databases continues to be a challenging task. In order to effectively solve these problems, features which are invariant to these anomalies are required. This paper proposes a robust palm print matching framework by making a comparative study of different local geometric features such as Difference-of-Gaussian, Hessian, Hessian-Laplace, Harris-Laplace, and Multiscale Harris for feature detection. These detectors are coupled with the Scale Invariant Feature Transform (SIFT) descriptor to describe the identified features. Additionally, a two-stage refinement process is carried out to obtain the best stable matches. Computer simulations demonstrate that the accuracy of the system increases effectively, with an EER of 0.86% when the Harris-Laplace detector is used on the IITD database.
[Molecular authentication of Jinyinhua formula granule by using allele-specific PCR].
Jiang, Chao; Tu, Li-Chan; Yuan, Yuan; Huang, Lu-Qi; Gao, Wei; Jin, Yan
2017-07-01
Traditional authentication methods can hardly verify the authenticity of the herbs in traditional Chinese medicine (TCM) formula granules, because the granules have lost all morphological characteristics. In this study, a new allele-specific PCR method was established for authenticating Jinyinhua formula granules (made from Lonicerae Japonicae Flos) based on an SNP site in the trnL-trnF fragment. Genomic DNA was successfully extracted from Lonicerae Japonicae Flos and its formula granules by using an improved spin-column method, and PCR was then performed with the designed primer. A specific band of approximately 110 bp was obtained only from authentic Lonicerae Japonicae Flos and its formula granules, while no bands were found for fake mixed products. In addition, the PCR product was confirmed by BLAST to derive from the Lonicerae Japonicae Flos trnL-trnF sequence. DNA molecular authentication can therefore make up for the limitations of character identification and microscopic identification, and quickly verify the authenticity of the herbs in TCM formula granules, with enormous potential for market supervision and quality control. Copyright© by the Chinese Pharmaceutical Association.
[Application of rapid PCR to authenticate medicinal snakes].
Chen, Kang; Jiang, Chao; Yuan, Yuan; Huang, Lu-Qi; Li, Man
2014-10-01
To obtain an accurate, rapid, and efficient method for authenticating the medicinal snakes listed in the Chinese Pharmacopoeia (Zaocys dhumnades, Bungarus multicinctus, Agkistrodon acutus), a rapid PCR method for authenticating these snakes and their adulterants was established based on classic molecular authentication methods. DNA was extracted by alkaline lysis, and amplification with specific primers was carried out using a two-step PCR method; the denaturation and annealing temperatures and the cycle numbers were optimized. When 100x SYBR Green I was added to the PCR product, strong green fluorescence was visualized under 365 nm UV for authentic samples, whereas adulterants showed none. The whole process can be completed in 30-45 minutes. The established method provides technical support for on-site authentication of these snakes.
Authentication via wavefront-shaped optical responses
NASA Astrophysics Data System (ADS)
Eilers, Hergen; Anderson, Benjamin R.; Gunawidjaja, Ray
2018-02-01
Authentication/tamper-indication is required in a wide range of applications, including nuclear materials management and product counterfeit detection. State-of-the-art techniques include reflective particle tags, laser speckle authentication, and birefringent seals. Each of these passive techniques has its own advantages and disadvantages, including the need for complex image comparisons, limited flexibility, sensitivity to environmental conditions, and limited functionality. We have developed a new active approach to address some of these shortcomings. The use of an active characterization technique adds more flexibility and additional layers of security over current techniques. Our approach uses randomly distributed nanoparticles embedded in a polymer matrix (tag/seal) which is attached to the item to be secured. A spatial light modulator is used to adjust the wavefront of a laser which interacts with the tag/seal, and a detector is used to monitor this interaction. The interaction can occur in various ways, including transmittance, reflectance, fluorescence, and random lasing. For example, at the time of origination, the wavefront-shaped reflectance from a tag/seal can be adjusted to produce a specific pattern (symbol, words, etc.). Any tampering with the tag/seal would result in a disturbance of the random orientation of the nanoparticles and thus distort the reflectance pattern. A holographic waveplate could be inserted into the laser beam for verification; the absence or distortion of the original pattern would then indicate that tampering has occurred. We have tested the tamper-indicating ability of the tag/seal and authentication method using various attack methods, including mechanical, thermal, and chemical attacks, and have verified the robustness of our material and method.
NASA Astrophysics Data System (ADS)
Ahi, Kiarash; Shahbazmohamadi, Sina; Asadizanjani, Navid
2018-05-01
In this paper, a comprehensive set of techniques for quality control and authentication of packaged integrated circuits (IC) using terahertz (THz) time-domain spectroscopy (TDS) is developed. By material characterization, the presence of unexpected materials in counterfeit components is revealed. Blacktopping layers are detected using THz time-of-flight tomography, and thickness of hidden layers is measured. Sanded and contaminated components are detected by THz reflection-mode imaging. Differences between inside structures of counterfeit and authentic components are revealed through developing THz transmission imaging. For enabling accurate measurement of features by THz transmission imaging, a novel resolution enhancement technique (RET) has been developed. This RET is based on deconvolution of the THz image and the THz point spread function (PSF). The THz PSF is mathematically modeled through incorporating the spectrum of the THz imaging system, the axis of propagation of the beam, and the intensity extinction coefficient of the object into a Gaussian beam distribution. As a result of implementing this RET, the accuracy of the measurements on THz images has been improved from 2.4 mm to 0.1 mm and bond wires as small as 550 μm inside the packaging of the ICs are imaged.
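Deconvolving the THz image by the modeled Gaussian PSF is the core of the resolution enhancement technique. A minimal frequency-domain sketch is below; the Wiener-style regularization constant `k` and the function name are assumptions, and the paper's PSF model additionally incorporates the system spectrum and extinction coefficient:

```python
import numpy as np

def wiener_deconv(blurred, psf, k=0.01):
    """Frequency-domain deconvolution by the modeled PSF; the constant k
    regularizes frequencies where the PSF response is weak."""
    B = np.fft.fft2(blurred)
    H = np.fft.fft2(psf, s=blurred.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(B * W))
```

With a well-conditioned PSF and little noise, a small `k` recovers the scene almost exactly; larger `k` trades resolution for noise robustness.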
ERIC Educational Resources Information Center
Everett, Donna R.
This guide presents performance-based authentic assessment ideas, samples, and suggestions to help marketing teachers and students respond to changes and pressures from outside the classroom. It contains 21 activities, each accompanied by a method of authentic assessment. In most cases, the authentic assessment method is a scoring device. The…
Rock images classification by using deep convolution neural network
NASA Astrophysics Data System (ADS)
Cheng, Guojian; Guo, Wenhui
2017-08-01
Granularity analysis is one of the most essential issues in rock authentication under the microscope. To improve the efficiency and accuracy of traditional manual work, a convolutional neural network based method is proposed for granularity analysis of thin section images; it extracts features from image samples while building a classifier to recognize the granularity of input samples. 4800 samples from the Ordos basin are used for experiments in the HSV, YCbCr, and RGB colour spaces. On the test dataset, the correct rate in the RGB colour space is 98.5%, with similarly reliable results in the HSV and YCbCr colour spaces. The results show that the convolutional neural network can classify the rock images with high reliability.
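The building blocks of such a network, convolution, nonlinearity, and pooling, can be sketched in a few lines of numpy. This is only the forward pass of generic CNN primitives, not the paper's architecture, which is unspecified in the abstract:

```python
import numpy as np

def conv2d(x, k):
    """Valid-mode 2-D cross-correlation, as used in CNN convolution layers."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    """Rectified linear activation."""
    return np.maximum(x, 0)

def maxpool(x, s=2):
    """Non-overlapping s-by-s max pooling."""
    h, w = x.shape
    return x[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s).max(axis=(1, 3))
```

Stacking several such conv/relu/pool stages and ending with a fully connected classifier is the standard pattern for image classification tasks like this one.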
Securing palmprint authentication systems using spoof detection approach
NASA Astrophysics Data System (ADS)
Kanhangad, Vivek; Kumar, Abhishek
2013-12-01
Automated human authentication using features extracted from palmprint images has been studied extensively in the literature. Primary focus of the studies thus far has been the improvement of matching performance. As more biometric systems get deployed for wide range of applications, the threat of impostor attacks on these systems is on the rise. The most common among various types of attacks is the sensor level spoof attack using fake hands created using different materials. This paper investigates an approach for securing palmprint based biometric systems against spoof attacks that use photographs of the human hand for circumventing the system. The approach is based on the analysis of local texture patterns of acquired palmprint images for extracting discriminatory features. A trained binary classifier utilizes the discriminating information to determine if the input image is of real hand or a fake one. Experimental results, using 611 palmprint images corresponding to 100 subjects in the publicly available IITD palmprint image database, show that 1) palmprint authentication systems are highly vulnerable to spoof attacks and 2) the proposed spoof detection approach is effective for discriminating between real and fake image samples. In particular, the proposed approach achieves the best classification accuracy of 97.35%.
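Local binary patterns (LBP) are the archetypal "local texture pattern" feature for this kind of real-versus-spoof discrimination. The abstract does not name the exact descriptor, so the basic 8-neighbour LBP histogram below is an illustrative assumption:

```python
def lbp_histogram(img):
    """8-neighbour local binary patterns; the normalized histogram is the
    texture feature fed to a binary real/spoof classifier."""
    h, w = len(img), len(img[0])
    hist = [0] * 256
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offs):
                if img[y + dy][x + dx] >= c:
                    code |= 1 << bit
            hist[code] += 1
    total = (h - 2) * (w - 2)
    return [v / total for v in hist]
```

Printed photographs flatten the micro-texture of real skin, shifting this histogram enough for a trained classifier to separate the two classes.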
PREFACE: Anti-counterfeit Image Analysis Methods (A Special Session of ICSXII)
NASA Astrophysics Data System (ADS)
Javidi, B.; Fournel, T.
2007-06-01
The International Congress for Stereology is dedicated to theoretical and applied aspects of stochastic tools, image analysis and mathematical morphology. A special emphasis on `anti-counterfeit image analysis methods' has been given this year for the XIIth edition (ICSXII). Facing the economic and social threat of counterfeiting, this devoted session presents recent advances and original solutions in the field. A first group of methods are related to marks located either on the product (physical marks) or on the data (hidden information) to be protected. These methods concern laser fs 3D encoding and source separation for machine-readable identification, moiré and `guilloche' engraving for visual verification and watermarking. Machine-readable travel documents are well-suited examples introducing the second group of methods which are related to cryptography. Used in passports for data authentication and identification (of people), cryptography provides some powerful tools. Opto-digital processing allows some efficient implementations described in the papers and promising applications. We would like to thank the reviewers who have contributed to a session of high quality, and the authors for their fine and hard work. We would like to address some special thanks to the invited lecturers, namely Professor Roger Hersch and Dr Isaac Amidror for their survey of moiré methods, Prof. Serge Vaudenay for his survey of existing protocols concerning machine-readable travel documents, and Dr Elisabet Pérez-Cabré for her presentation on optical encryption for multifactor authentication. We also thank Professor Dominique Jeulin, President of the International Society for Stereology, Professor Michel Jourlin, President of the organizing committee of ICSXII, for their help and advice, and Mr Graham Douglas, the Publisher of Journal of Physics: Conference Series at IOP Publishing, for his efficiency. 
We hope that this collection of papers will be useful as a tool to further develop a very important field. Bahram Javidi University of Connecticut (USA) Thierry Fournel University of Saint-Etienne (France) Chairs of the special session on `Anti-counterfeit image analysis methods', July 2007
ERIC Educational Resources Information Center
Cross, Dawn M.
2017-01-01
Observation is the preferred method of assessing small children in school and play settings. Despite being taught this, based on statements during the interviews of this study, it appears little instruction is dedicated to the performance of this type of authentic assessment. The professional literature surveyed for this study finds that due to…
Human Cortical Activity Evoked by the Assignment of Authenticity when Viewing Works of Art
Huang, Mengfei; Bridge, Holly; Kemp, Martin J.; Parker, Andrew J.
2011-01-01
The expertise of others is a major social influence on our everyday decisions and actions. Many viewers of art, whether expert or naïve, are convinced that the full esthetic appreciation of an artwork depends upon the assurance that the work is genuine rather than fake. Rembrandt portraits provide an interesting image set for testing this idea, as there is a large number of them and recent scholarship has determined that quite a few fakes and copies exist. Use of this image set allowed us to separate the brain’s response to images of genuine and fake pictures from the brain’s response to external advice about the authenticity of the paintings. Using functional magnetic resonance imaging, viewing of artworks assigned as “copy,” rather than “authentic,” evoked stronger responses in frontopolar cortex (FPC), and right precuneus, regardless of whether the portrait was actually genuine. Advice about authenticity had no direct effect on the cortical visual areas responsive to the paintings, but there was a significant psycho-physiological interaction between the FPC and the lateral occipital area, which suggests that these visual areas may be modulated by FPC. We propose that the activation of brain networks rather than a single cortical area in this paradigm supports the art scholars’ view that esthetic judgments are multi-faceted and multi-dimensional in nature. PMID:22164139
NASA Astrophysics Data System (ADS)
Fathirad, Iraj; Devlin, John; Jiang, Frank
2012-09-01
Key exchange and authentication are two crucial elements of any network security mechanism. IPsec, SSL/TLS, PGP, and S/MIME are well-known security approaches that provide security services to the network, transport, and application layers; these protocols use different methods (based on their requirements) to establish keying materials and to authenticate the key negotiation and the participating parties. This paper studies and compares the authenticated key negotiation methods used in these protocols.
Obfuscated authentication systems, devices, and methods
Armstrong, Robert C; Hutchinson, Robert L
2013-10-22
Embodiments of the present invention are directed toward authentication systems, devices, and methods. Obfuscated executable instructions may encode an authentication procedure and protect an authentication key. The obfuscated executable instructions may require communication with a remote certifying authority for operation. In this manner, security may be controlled by the certifying authority without regard to the security of the electronic device running the obfuscated executable instructions.
Study on a Biometric Authentication Model based on ECG using a Fuzzy Neural Network
NASA Astrophysics Data System (ADS)
Kim, Ho J.; Lim, Joon S.
2018-03-01
Traditional authentication methods use numbers or graphic passwords and thus involve the risk of loss or theft. Various studies are underway regarding biometric authentication because it uses the unique biometric data of a human being. Biometric authentication using the ECG relies on signals that record the electrical stimuli of the heart; these signals are difficult to manipulate and can be measured unobtrusively with sensors attached to the skin. This study addresses biometric authentication using the neural network with weighted fuzzy membership functions (NEWFM). In the biometric authentication process, normalization and ensemble averaging are applied during preprocessing, features are extracted using Haar wavelets, and a registration process called “training” is performed in the fuzzy neural network. In the experiment, biometric authentication was performed on 73 subjects from the PhysioNet database. Between 10 and 40 ECG waveforms were tested for the registration process, and 15 was deemed the appropriate number of waveforms to register. One ECG waveform was used during the authentication stage to conduct the biometric authentication test. Upon testing the proposed biometric authentication method on the 73 subjects from the PhysioNet database, the TAR was 98.32% and the FAR was 5.84%.
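The Haar-wavelet feature-extraction step described in this abstract can be sketched in a few lines. This is an illustrative single-channel decomposition only, not the paper's exact NEWFM pipeline; the function name, level count, and feature layout are assumptions:

```python
def haar_features(signal, levels=2):
    """Simple multi-level Haar decomposition of a 1-D signal (e.g., one
    ECG waveform). Returns the final approximation coefficients followed
    by the detail coefficients of every level."""
    approx = [float(x) for x in signal]
    features = []
    for _ in range(levels):
        if len(approx) % 2:                      # pad odd-length signals
            approx.append(approx[-1])
        avg = [(approx[i] + approx[i + 1]) / 2 for i in range(0, len(approx), 2)]
        det = [(approx[i] - approx[i + 1]) / 2 for i in range(0, len(approx), 2)]
        features.extend(det)                     # details capture local shape
        approx = avg                             # recurse on the smoothed signal
    return approx + features
```

In a scheme like the one described, such coefficients would then feed the fuzzy neural network during registration ("training") and authentication.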
Tuckley, Kushal
2017-01-01
In telemedicine systems, critical medical data is shared over a public communication channel. This increases the risk of unauthorised access to patients' information and underlines the importance of secrecy and authentication for medical data. This paper presents two innovative variations of classical histogram shift methods to increase the hiding capacity. The first technique divides the image into nonoverlapping blocks and embeds the watermark individually using the histogram method. The second method separates the region of interest and embeds the watermark only in the region of noninterest, preserving the medical information intact; this makes the method suitable for critical medical cases. The high PSNR (above 45 dB) obtained for both techniques indicates the imperceptibility of the approaches. Experimental results illustrate the superiority of the proposed approaches when compared with other methods based on histogram shifting. These techniques improve embedding capacity by 5–15% depending on the image type, without affecting the quality of the watermarked image. Both techniques also enable lossless reconstruction of the watermark and the host medical image. The higher embedding capacity makes the proposed approaches attractive for medical image watermarking without compromising image quality. PMID:29104744
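The classical histogram-shifting primitive that both variations build on can be sketched as follows. This is a minimal single-peak version operating on a flat list of 8-bit pixel values, not the authors' block-based or ROI-aware schemes; the helper names and the one-peak/one-zero choice are assumptions:

```python
def hs_embed(pixels, bits):
    """Embed bits reversibly: shift the histogram between the peak bin and
    the next empty (zero) bin up by one, then encode each bit in a
    peak-valued pixel (peak -> bit 0, peak+1 -> bit 1)."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    peak = max(range(256), key=lambda v: hist[v])
    zero = min((v for v in range(peak + 1, 256) if hist[v] == 0), default=255)
    out, it = [], iter(bits)
    for p in pixels:
        if peak < p < zero:
            out.append(p + 1)            # shift to free the bin peak+1
        elif p == peak:
            out.append(p + next(it, 0))  # embed one bit per peak pixel
        else:
            out.append(p)
    return out, peak, zero

def hs_extract(pixels, peak, zero):
    """Recover the bits and the original image losslessly."""
    bits, orig = [], []
    for p in pixels:
        if p == peak:
            bits.append(0); orig.append(peak)
        elif p == peak + 1:
            bits.append(1); orig.append(peak)
        elif peak + 1 < p <= zero:
            orig.append(p - 1)           # undo the histogram shift
        else:
            orig.append(p)
    return bits, orig
```

The capacity equals the height of the peak bin, which is why the paper's block-wise variant (one peak per block) raises capacity.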
A Routing Path Construction Method for Key Dissemination Messages in Sensor Networks
Moon, Soo Young; Cho, Tae Ho
2014-01-01
Authentication is an important security mechanism for detecting forged messages in a sensor network. Each cluster head (CH) in dynamic key distribution schemes forwards a key dissemination message that contains encrypted authentication keys within its cluster to next-hop nodes for the purpose of authentication. The forwarding path of the key dissemination message strongly affects the number of nodes to which the authentication keys in the message are actually distributed. We propose a routing method for the key dissemination messages to increase the number of nodes that obtain the authentication keys. In the proposed method, each node selects next-hop nodes to which the key dissemination message will be forwarded based on secret key indexes, the distance to the sink node, and the energy consumption of its neighbor nodes. The experimental results show that the proposed method can increase by 50–70% the number of nodes to which authentication keys in each cluster are distributed compared to geographic and energy-aware routing (GEAR). In addition, the proposed method can detect false reports earlier by using the distributed authentication keys, and it consumes less energy than GEAR when the false traffic ratio (FTR) is ≥10%. PMID:25136649
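A toy version of the neighbor-scoring idea described above, combining shared key indexes, distance to the sink, and residual energy, might look like this. The weights and field names are illustrative assumptions, not values from the paper:

```python
def select_next_hops(neighbors, k=2):
    """Rank candidate next-hop nodes for a key dissemination message.
    Favors neighbors that share more authentication-key indexes, sit
    closer to the sink, and have more residual energy."""
    def score(n):
        # illustrative weights; a real deployment would tune these
        return 2.0 * n["shared_keys"] - 1.0 * n["dist_to_sink"] + 0.5 * n["energy"]
    return sorted(neighbors, key=score, reverse=True)[:k]
```

For example, a node with three shared key indexes and low distance outranks one with few shared keys, so the dissemination message reaches more nodes that can actually use the keys.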
Authentication of meat and meat products.
Ballin, N Z
2010-11-01
In recent years, interest in meat authenticity has increased. Many consumers are concerned about the meat they eat and accurate labelling is important to inform consumer choice. Authentication methods can be categorised into the areas where fraud is most likely to occur: meat origin, meat substitution, meat processing treatment and non-meat ingredient addition. Within each area the possibilities for fraud can be subcategorised as follows: meat origin-sex, meat cuts, breed, feed intake, slaughter age, wild versus farmed meat, organic versus conventional meat, and geographic origin; meat substitution-meat species, fat, and protein; meat processing treatment-irradiation, fresh versus thawed meat and meat preparation; non-meat ingredient addition-additives and water. Analytical methods used in authentication are as diverse as the authentication problems, and include a diverse range of equipment and techniques. This review is intended to provide an overview of the possible analytical methods available for meat and meat products authentication. In areas where no authentication methods have been published, possible strategies are suggested. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.
Research on simulated infrared image utility evaluation using deep representation
NASA Astrophysics Data System (ADS)
Zhang, Ruiheng; Mu, Chengpo; Yang, Yu; Xu, Lixin
2018-01-01
Infrared (IR) image simulation is an important data source for various target recognition systems. However, whether simulated IR images can be used as training data for classifiers depends on their fidelity and authenticity. For the evaluation of IR image features, a deep-representation-based algorithm is proposed. Unlike conventional methods, which usually adopt a priori knowledge or manually designed features, the proposed method can extract essential features and quantitatively evaluate the utility of simulated IR images. First, for data preparation, we employ our IR image simulation system to generate large amounts of IR images. Then, we present the evaluation model of simulated IR images, for which an end-to-end IR feature extraction and target detection model based on a deep convolutional neural network is designed. Finally, the experiments illustrate that our proposed method outperforms other verification algorithms in evaluating simulated IR images. Cross-validation, variable-proportion mixed-data validation, and simulation process contrast experiments are carried out to evaluate the utility and objectivity of the images generated by our simulation system. The optimum mixing ratio between simulated and real data is 0.2≤γ≤0.3, which is an effective data augmentation method for real IR images.
A Mean Wink at Authenticity: Chinese Images in Disney's "Mulan."
ERIC Educational Resources Information Center
Mo, Weimin; Shen, Wenju
2000-01-01
Offers a critique from two Chinese educators with regard to the historical, cultural, linguistic, and artistic authenticity of Disney's animated film "Mulan." Argues that the filmmakers robbed the original story of its soul and "ran over Chinese culture with the Disney bulldozer," imposing mainstream cultural beliefs and…
Finger-Vein Image Enhancement Using a Fuzzy-Based Fusion Method with Gabor and Retinex Filtering
Shin, Kwang Yong; Park, Young Ho; Nguyen, Dat Tien; Park, Kang Ryoung
2014-01-01
Because of the advantages of finger-vein recognition systems such as live detection and usage as bio-cryptography systems, they can be used to authenticate individual people. However, images of finger-vein patterns are typically unclear because of light scattering by the skin, optical blurring, and motion blurring, which can degrade the performance of finger-vein recognition systems. In response to these issues, a new enhancement method for finger-vein images is proposed. Our method is novel compared with previous approaches in four respects. First, the local and global features of the vein lines of an input image are amplified using Gabor filters in four directions and Retinex filtering, respectively. Second, the means and standard deviations in the local windows of the images produced after Gabor and Retinex filtering are used as inputs for the fuzzy rule and fuzzy membership function, respectively. Third, the optimal weights required to combine the two Gabor and Retinex filtered images are determined using a defuzzification method. Fourth, the use of a fuzzy-based method means that image enhancement does not require additional training data to determine the optimal weights. Experimental results using two finger-vein databases showed that the proposed method enhanced the accuracy of finger-vein recognition compared with previous methods. PMID:24549251
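The defuzzified weighting that combines the Gabor- and Retinex-filtered images can be illustrated with a much cruder stand-in: here a single global-contrast weight replaces the paper's fuzzy rules, membership functions, and local windows, so everything below is a simplifying assumption:

```python
def fuse(gabor_img, retinex_img):
    """Combine two filtered finger-vein images with a data-driven weight.
    Crude stand-in for defuzzification: weight each source by its
    overall contrast (standard deviation) instead of fuzzy rules."""
    def std(img):
        flat = [p for row in img for p in row]
        m = sum(flat) / len(flat)
        return (sum((p - m) ** 2 for p in flat) / len(flat)) ** 0.5

    sg, sr = std(gabor_img), std(retinex_img)
    w = sg / (sg + sr) if sg + sr else 0.5   # higher-contrast source dominates
    return [[w * g + (1 - w) * r for g, r in zip(rg, rr)]
            for rg, rr in zip(gabor_img, retinex_img)]
```

The appeal of the fuzzy formulation noted in the abstract is precisely that such weights come from fixed membership functions, so no extra training data is needed.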
Understanding the optics to aid microscopy image segmentation.
Yin, Zhaozheng; Li, Kang; Kanade, Takeo; Chen, Mei
2010-01-01
Image segmentation is essential for many automated microscopy image analysis systems. Rather than treating microscopy images as general natural images and rushing into the image processing warehouse for solutions, we propose to first study a microscope's optical properties to model its image formation process, using phase contrast microscopy as an exemplar. It turns out that the phase contrast imaging system can be relatively well explained by a linear imaging model. Using this model, we formulate a quadratic optimization function with sparseness and smoothness regularizations to restore the "authentic" phase contrast images that directly correspond to the specimen's optical path length, without phase contrast artifacts such as halo and shade-off. With artifacts removed, high quality segmentation can be achieved by simply thresholding the restored images. The imaging model and restoration method are quantitatively evaluated on two sequences with thousands of cells captured over several days.
A novel iris patterns matching algorithm of weighted polar frequency correlation
NASA Astrophysics Data System (ADS)
Zhao, Weijie; Jiang, Linhua
2014-11-01
Iris recognition is recognized as one of the most accurate techniques for biometric authentication. In this paper, we present a novel correlation method - Weighted Polar Frequency Correlation (WPFC) - to match and evaluate two iris images; it can also be used for evaluating the similarity of any two images. The WPFC method is completely different from conventional matching methods. For instance, John Daugman's classical method of iris recognition uses 2D Gabor wavelets to extract features of the iris image into a compact bit stream, and then matches two bit streams by Hamming distance. Our new method is based on correlation in the polar coordinate system in the frequency domain with regulated weights. It is motivated by the observation that the iris pattern carrying the most information for recognition is the fine structure at high frequencies, rather than the gross shape of the iris image. Therefore, we transform iris images into the frequency domain, assign different weights to the frequencies, and calculate the correlation of the two iris images in the frequency domain. We evaluate the iris images by summing the discrete correlation values with regulated weights and comparing the result with a preset threshold to tell whether the two iris images were captured from the same person or not. Experiments are carried out on both the CASIA database and self-obtained images. The results show that our method is functional and reliable, providing a new prospect for iris recognition systems.
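The core idea of correlating two images in the frequency domain with higher weights at high frequencies can be sketched as below. For brevity this works on small Cartesian grids with a naive DFT; the paper's polar resampling, tuned weight profile, and thresholding are omitted, and the weight function is an assumption:

```python
import cmath

def dft2(img):
    """Naive 2-D DFT of a small grayscale image (O(n^2 m^2), demo only)."""
    n, m = len(img), len(img[0])
    return [[sum(img[y][x] * cmath.exp(-2j * cmath.pi * (u * y / n + v * x / m))
                 for y in range(n) for x in range(m))
             for v in range(m)] for u in range(n)]

def wpfc_score(a, b):
    """Weighted frequency-domain correlation in [-1, 1]: emphasizes high
    frequencies, where fine iris texture lives (weights illustrative)."""
    A, B = dft2(a), dft2(b)
    n, m = len(a), len(a[0])
    num = den_a = den_b = 0.0
    for u in range(n):
        for v in range(m):
            fu, fv = min(u, n - u), min(v, m - v)      # wrapped frequency radius
            w = 1.0 + (fu * fu + fv * fv) ** 0.5       # heavier at high frequency
            num += w * (A[u][v] * B[u][v].conjugate()).real
            den_a += w * abs(A[u][v]) ** 2
            den_b += w * abs(B[u][v]) ** 2
    return num / ((den_a * den_b) ** 0.5 or 1.0)
```

Identical images score 1; in a full system the score would be compared against the preset threshold mentioned in the abstract.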
NASA Astrophysics Data System (ADS)
Klug Boonstra, S.; Swann, J.; Boonstra, D.; Manfredi, L.; Christensen, P. R.
2016-12-01
Recent research identifies the most effective learning as active, engaged learning in which students interact with phenomena, other students, and the teacher/leader to derive meaning and construct understanding of their surroundings. "Similarly, an engaging and effective science education goes well beyond the low-level factual recall that is emphasized in many science classes. It must develop the skills that students need to solve complex problems, work in teams, make and recognize evidence-based arguments, and interpret and communicate complex information" (emphasis added). Authentic science research projects provide active, engaged learning in which students interact with authentic science data in an authentic problem-solving context to derive meaning and construct understanding of the world. In formal (and many informal) settings, the teacher/leader is effectively the gatekeeper who determines the learning experiences in which the students will participate. From our experience of nearly a decade and a half of authentic science programming for 5th grade through early college students working with NASA Mars data, supporting and enabling the teacher is perhaps the most critical and foundational element for designing a successful authentic research experience. Yet a major barrier to this type of learning is teachers/leaders who too often are not equipped, or who lack the confidence, to succeed in facilitating authentic research projects. The Mars Student Imaging Project has implemented an iterative process of design, testing, and redesign that has identified and implemented critical teacher/leader-enabling elements, leading to increasingly successful adoptions within formal and informal educational settings - allowing more students to gain the benefits of an immersive research experience.
No-Reference Image Quality Assessment by Wide-Perceptual-Domain Scorer Ensemble Method.
Liu, Tsung-Jung; Liu, Kuan-Hsien
2018-03-01
A no-reference (NR) learning-based approach to assess image quality is presented in this paper. The devised features are extracted from wide perceptual domains, including brightness, contrast, color, distortion, and texture. These features are used to train a model (scorer) which can predict scores. Scorer selection algorithms are utilized to help simplify the proposed system. In the final stage, the ensemble method is used to combine the prediction results from the selected scorers. Two multiple-scale versions of the proposed approach are also presented along with the single-scale one; they turn out to have better performance than the original single-scale method. Because features are drawn from five different domains at multiple image scales, and the outputs (scores) from selected score-prediction models serve as features for multi-scale or cross-scale fusion (i.e., ensemble), the proposed NR image quality assessment models are robust to more than 24 image distortion types. They can also be used to evaluate images with authentic distortions. Extensive experiments on three well-known and representative databases confirm the performance robustness of our proposed model.
NASA Astrophysics Data System (ADS)
Wan, Qianwen; Panetta, Karen; Agaian, Sos
2017-05-01
Autonomous facial recognition systems are widely used in real-life applications, such as homeland border security, law enforcement identification and authentication, and video-based surveillance analysis. Issues like low image quality, non-uniform illumination, and variations in pose and facial expression can impair the performance of recognition systems. To address the non-uniform illumination challenge, we present a novel robust autonomous facial recognition system inspired by the human visual system, based on the so-called logarithmic image visualization technique. In this paper, the proposed method, for the first time, couples the logarithmic image visualization technique with the local binary pattern to perform discriminative feature extraction for facial recognition. The Yale database, the Yale-B database, and the ATT database are used for computer simulation accuracy and efficiency testing. The extensive computer simulation demonstrates the method's efficiency, accuracy, and robustness to illumination variation.
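The local-binary-pattern stage is a standard building block and easy to sketch; the logarithmic image visualization preprocessing is omitted here, and the clockwise neighbor ordering is just one common convention (an assumption):

```python
def lbp_image(img):
    """8-neighbour local binary pattern for the interior pixels of a
    grayscale image given as a list of rows. Each output code sets a bit
    for every neighbour that is >= the center pixel."""
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]    # clockwise from top-left
    h, w = len(img), len(img[0])
    out = []
    for y in range(1, h - 1):
        row = []
        for x in range(1, w - 1):
            c = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offs):
                if img[y + dy][x + dx] >= c:
                    code |= 1 << bit
            row.append(code)
        out.append(row)
    return out
```

Histograms of these codes over image blocks would then serve as the discriminative features fed to the recognizer.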
Kim, Daehee; Kim, Dongwan; An, Sunshin
2016-07-09
Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption.
Kim, Daehee; Kim, Dongwan; An, Sunshin
2016-01-01
Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption. PMID:27409616
Quality evaluation of no-reference MR images using multidirectional filters and image statistics.
Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik
2018-09-01
This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
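Fitting a histogram of features to a generalized Gaussian distribution reduces to estimating the shape parameter. A common moment-matching sketch is shown below; the grid-search range and granularity are assumptions, and the paper's multidirectional filtering is not reproduced:

```python
import math
import random

def ggd_shape(samples):
    """Estimate the generalized-Gaussian shape parameter beta by matching
    the sample ratio E[x^2] / E[|x|]^2 to its theoretical value
    Gamma(1/b) * Gamma(3/b) / Gamma(2/b)^2 via a coarse grid search."""
    m1 = sum(abs(x) for x in samples) / len(samples)
    m2 = sum(x * x for x in samples) / len(samples)
    target = m2 / (m1 * m1)

    def rho(b):
        return math.gamma(1 / b) * math.gamma(3 / b) / math.gamma(2 / b) ** 2

    # beta = 1 is Laplacian (rho = 2), beta = 2 is Gaussian (rho = pi/2)
    return min((0.2 + 0.01 * i for i in range(500)),
               key=lambda b: abs(rho(b) - target))
```

Gaussian-distributed features should yield a shape parameter near 2; heavier-tailed feature histograms (typical of some distortions) push the estimate lower, which is the cue such IQA methods exploit.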
An evaluation of authentication methods for smartphone based on users’ preferences
NASA Astrophysics Data System (ADS)
Sari, P. K.; Ratnasari, G. S.; Prasetio, A.
2016-04-01
This study discusses smartphone screen-lock preferences across several types of authentication methods. The purpose is to determine user behaviour based on perceived security and convenience, as well as preferences among different types of authentication methods. The variables used are the considerations for locking the screen and the types of authentication methods. The population consists of smartphone users, with a total sample of 400 respondents drawn by a nonprobability sampling method. The data analysis method used is descriptive analysis. The results show that convenience is still the major consideration for locking smartphone screens. The majority of users chose pattern unlock as the most convenient method, while fingerprint unlock is perceived as the most secure method and the one chosen for future use.
NASA Astrophysics Data System (ADS)
Dong, Yumin; Xiao, Shufen; Ma, Hongyang; Chen, Libo
2016-12-01
Cloud computing and big data have become the driving engine of current information technology (IT) as a result of its rapid development. However, security protection has become increasingly important for cloud computing and big data, and is a problem that must be solved for cloud computing to develop. The theft of identity authentication information remains a serious threat to the security of cloud computing. In this process, attackers intrude into cloud computing services through identity authentication information, thereby threatening the security of data from multiple perspectives. Therefore, this study proposes a model for cloud computing protection and management based on quantum authentication, introduces the principle of quantum authentication, and deduces the quantum authentication process. In theory, quantum authentication technology can be applied in cloud computing for security protection. This technology cannot be cloned; thus, it is more secure and reliable than classical methods.
The trustworthy digital camera: Restoring credibility to the photographic image
NASA Technical Reports Server (NTRS)
Friedman, Gary L.
1994-01-01
The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continue to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of 'electronic original' is no longer meaningful.
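The sign-and-verify pattern behind this idea can be illustrated with Python's standard library. Note the hedge: a real trustworthy camera would embed a public-key signature (e.g., RSA or ECDSA) computed in secure hardware, so that anyone can verify without the secret; the keyed HMAC below is only a stdlib stand-in for the same hash-then-authenticate flow:

```python
import hashlib
import hmac

def sign_image(image_bytes, key):
    """Hash the image, then authenticate the digest with a secret key.
    Stand-in for a camera's embedded public-key signature."""
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_image(image_bytes, key, tag):
    """True only if the image is bit-for-bit unaltered and the tag was
    produced with the same key (constant-time comparison)."""
    return hmac.compare_digest(sign_image(image_bytes, key), tag)
```

Any single-bit change to the image invalidates the tag, which is exactly the property that restores credibility to the "electronic original."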
Hylemetry versus Biometry: a new method to certificate the lithography authenticity
NASA Astrophysics Data System (ADS)
Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla
2011-06-01
When we buy an artwork, a certificate of authenticity contains specific details about the piece. Unfortunately, these certificates are often exchanged between similar artworks: the same document is supplied by the seller to certify originality, so the buyer receives a copy of a genuine certificate attesting that a non-original artwork is an original. A solution to this problem is a system that binds the certificate to a specific artwork. To do this it is necessary to find characteristics of the individual artwork that are unique, unrepeatable, and unchangeable. In this paper we propose a new lithography certification scheme based on the distribution of the color spots that compose the lithography itself. With the high-resolution acquisition media available today, it is possible to use analysis methods typical of speckle metrology. In the verification phase, it is only necessary to acquire the same portion of the lithography, extract the verification information, use the private key to recover the corresponding information from the certificate, and compare the two using a similarity threshold. Because the acquired image may be rotated or translated with respect to the original, image-correlation techniques from speckle metrology are applied to estimate and correct these errors, so that the extracted and acquired data can be compared under the best conditions, granting correct originality verification.
NASA Astrophysics Data System (ADS)
Pounder, Jean
2017-04-01
The goal of Project Based Learning (PBL) is to actively engage students through authentic, real-world study to increase content knowledge, understanding, and skills for everyday success. The essential design of PBL is very similar in nature to the scientific method and therefore easy to adapt to the science classroom. In my classroom, students use these essential elements when engaging in the study of the processes that affect the surface of a planet, such as weathering and erosion. Studying Mars is a hook for getting students to learn about the same processes that occur on Earth and to contrast the differences that occur on another planetary body. As part of the Mars Student Imaging Project (MSIP), students have the opportunity to engage and collaborate with NASA scientists at Arizona State University and get feedback on their work. They research and develop their own question or area of focus to study. They use images of Mars taken using the THEMIS camera onboard the Mars Odyssey satellite, which has been orbiting Mars since 2001. Students submit a proposal to the scientists at ASU and, if accepted, they are given the opportunity to use the THEMIS camera in orbit to photograph a new region on Mars that will hopefully contribute to their research. Students give a final presentation to the faculty, staff, community, and other students by presenting their work in a poster session and explaining their work to the audience.
User Authentication in Smartphones for Telehealth
Smith, Katherine A.; Zhou, Leming; Watzlaf, Valerie J. M.
2017-01-01
Many functions previously conducted on desktop computers are now performed on smartphones. Smartphones provide convenience, portability, and connectivity. When smartphones are used in the conduct of telehealth, sensitive data is invariably accessed, rendering the devices in need of user authentication to ensure data protection. User authentication of smartphones can help mitigate potential Health Insurance Portability and Accountability Act (HIPAA) breaches and keep sensitive patient information protected, while also facilitating the convenience of smartphones within everyday life and healthcare. This paper presents and examines several types of authentication methods available to smartphone users to help ensure security of sensitive data from attackers. The applications of these authentication methods in telehealth are discussed. PMID:29238444
User Authentication in Smartphones for Telehealth.
Smith, Katherine A; Zhou, Leming; Watzlaf, Valerie J M
2017-01-01
Many functions previously conducted on desktop computers are now performed on smartphones. Smartphones provide convenience, portability, and connectivity. When smartphones are used in the conduct of telehealth, sensitive data is invariably accessed, rendering the devices in need of user authentication to ensure data protection. User authentication of smartphones can help mitigate potential Health Insurance Portability and Accountability Act (HIPAA) breaches and keep sensitive patient information protected, while also facilitating the convenience of smartphones within everyday life and healthcare. This paper presents and examines several types of authentication methods available to smartphone users to help ensure security of sensitive data from attackers. The applications of these authentication methods in telehealth are discussed.
Counterfeit-resistant materials and a method and apparatus for authenticating materials
Ramsey, J. Michael; Klatt, Leon N.
2001-01-01
Fluorescent dichroic fibers randomly incorporated within a media provide an improved method for authentication and counterfeiting protection. The dichroism is provided by an alignment of fluorescent molecules along the length of the fibers. The fluorescent fibers provide an authentication mechanism of varying levels of capability. The authentication signature depends on four parameters; the x,y position, the dichroism and the local environment. The availability of so many non-deterministic variables makes production of counterfeit articles (e.g., currency, credit cards, etc.) essentially impossible. Counterfeit-resistant articles, an apparatus for authenticating articles, and a process for forming counterfeit-resistant media are also provided.
Counterfeit-resistant materials and a method and apparatus for authenticating materials
Ramsey, J. Michael; Klatt, Leon N.
2000-01-01
Fluorescent dichroic fibers randomly incorporated within a media provide an improved method for authentication and counterfeiting protection. The dichroism is provided by an alignment of fluorescent molecules along the length of the fibers. The fluorescent fibers provide an authentication mechanism of varying levels of capability. The authentication signature depends on four parameters; the x,y position, the dichroism and the local environment. The availability of so many non-deterministic variables makes production of counterfeit articles (e.g., currency, credit cards, etc.) essentially impossible. Counterfeit-resistant articles, an apparatus for authenticating articles, and a process for forming counterfeit-resistant media are also provided.
Predicting perceptual quality of images in realistic scenario using deep filter banks
NASA Astrophysics Data System (ADS)
Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang
2018-03-01
Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on the assumption that certain reliable statistical regularities hold for undistorted images and are corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, where complex, multiple, and interacting authentic distortions usually appear. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from the filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map the image representation to subjective perceptual quality scores. The experimental results on benchmark databases demonstrate the effectiveness and generalizability of the proposed model.
Live face detection based on the analysis of Fourier spectra
NASA Astrophysics Data System (ADS)
Li, Jiangwei; Wang, Yunhong; Tan, Tieniu; Jain, Anil K.
2004-08-01
Biometrics is a rapidly developing technology for identifying a person based on his or her physiological or behavioral characteristics. To ensure the correctness of authentication, the biometric system must be able to detect and reject the use of a copy of a biometric instead of the live biometric. This function is usually termed "liveness detection". This paper describes a new method for live face detection. Using structure and movement information of a live face, an effective live face detection algorithm is presented. Compared to existing approaches, which concentrate on the measurement of 3D depth information, this method is based on the analysis of Fourier spectra of a single face image or of face image sequences. Experimental results show that the proposed method has an encouraging performance.
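A single-image frequency descriptor of the kind analysed above can be sketched as the fraction of spectral energy outside a low-frequency disc: a recaptured face (a printed photo re-imaged by the camera) loses high frequencies, lowering the ratio. The cutoff radius and the box-blur model of recapture below are illustrative assumptions, not the paper's exact descriptor.

```python
import numpy as np

def high_freq_ratio(img, radius_frac=0.25):
    """Fraction of spectral energy outside a low-frequency disc.
    Recaptured photos tend to lose high frequencies, lowering this ratio."""
    F = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(F) ** 2
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2.0, xx - w / 2.0)
    cutoff = radius_frac * min(h, w)
    return float(power[r > cutoff].sum() / power.sum())

def box_blur(img):
    """3x3 mean filter (circular boundary): a crude stand-in for the
    low-pass effect of printing and re-photographing a face."""
    acc = np.zeros_like(img, dtype=float)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            acc += np.roll(np.roll(img, di, axis=0), dj, axis=1)
    return acc / 9.0

rng = np.random.default_rng(0)
live = rng.uniform(0, 255, (64, 64))      # sharp, detail-rich "live" image
recaptured = box_blur(live)               # blurred "photo of a photo"
```

A liveness decision would then threshold the ratio; the threshold itself would have to be calibrated on real live/recaptured data.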
The non-contact biometric identified bio signal measurement sensor and algorithms.
Kim, Chan-Il; Lee, Jong-Ha
2018-01-01
In recent years, wearable devices have been developed for effectively measuring biological data. However, these devices suffer from tissue allergy and noise problems. To solve these problems, biometric measurement based on non-contact methods, such as face image sequencing, has been developed. This makes it possible to measure biometric data without physical contact or side effects. However, it is impossible for a remote center to identify the person whose data are measured by these novel methods. In this paper, we propose a novel non-contact heart rate and blood pressure imaging system, Deep Health Eye. This system performs authentication at the same time as it measures bio signals, through a non-contact method. In the future, this system could become a convenient home bio signal monitoring system when combined with a smart mirror.
NASA Astrophysics Data System (ADS)
Komogortsev, Oleg V.; Karpov, Alexey; Holland, Corey D.
2012-06-01
The widespread use of computers throughout modern society introduces the necessity for usable and counterfeit-resistant authentication methods to ensure secure access to personal resources such as bank accounts, e-mail, and social media. Current authentication methods require tedious memorization of lengthy pass phrases, are often prone to shoulder-surfing, and may be easily replicated (either by counterfeiting parts of the human body or by guessing an authentication token based on readily available information). This paper describes preliminary work toward a counterfeit-resistant usable eye movement-based (CUE) authentication method. CUE does not require any passwords (improving the memorability aspect of the authentication system), and aims to provide high resistance to spoofing and shoulder-surfing by employing the combined biometric capabilities of two behavioral biometric traits: 1) oculomotor plant characteristics (OPC), which represent the internal, non-visible, anatomical structure of the eye; 2) complex eye movement patterns (CEM), which represent the strategies employed by the brain to guide visual attention. Both OPC and CEM are extracted from the eye movement signal provided by an eye tracking system. Preliminary results indicate that the fusion of OPC and CEM traits is capable of providing a 30% reduction in authentication error when compared to the authentication accuracy of the individual traits.
Design of a MEMS-based retina scanning system for biometric authentication
NASA Astrophysics Data System (ADS)
Woittennek, Franziska; Knobbe, Jens; Pügner, Tino; Schelinski, Uwe; Grüger, Heinrich
2014-05-01
There is an increasing need for reliable authentication for a number of applications such as e-commerce. Common authentication methods based on ownership (ID card) or knowledge factors (password, PIN) are often prone to manipulation and may therefore not be safe enough. Various inherence factor based methods like fingerprint, retinal pattern or voice identification are considered more secure. Retina scanning in particular offers both a low false rejection rate (FRR) and a low false acceptance rate (FAR) of about one in a million. Images of the retina with its characteristic pattern of blood vessels can be made with either a fundus camera or laser scanning methods. The present work describes the optical design of a new compact retina laser scanner which is based on MEMS (Micro-Electro-Mechanical System) technology. The use of a dual axis micro scanning mirror for laser beam deflection enables a more compact and robust design compared to classical systems. The scanner exhibits a full field of view of 10°, which corresponds to an area of 4 mm² on the retinal surface surrounding the optic disc. The system works in the near infrared and is designed for use under ambient light conditions, which implies a pupil diameter of 1.5 mm. Furthermore, it features a long eye relief of 30 mm so that it can be conveniently used by persons wearing glasses. The optical design requirements and the optical performance are discussed in terms of spot diagrams and ray fan plots.
Invariant domain watermarking using heaviside function of order alpha and fractional Gaussian field.
Abbasi, Almas; Woo, Chaw Seng; Ibrahim, Rabha Waell; Islam, Saeed
2015-01-01
Digital image watermarking is an important technique for the authentication of multimedia content and copyright protection. Conventional digital image watermarking techniques are often vulnerable to geometric distortions such as Rotation, Scaling, and Translation (RST). These distortions desynchronize the watermark information embedded in an image and thus disable watermark detection. To solve this problem, we propose an RST invariant domain watermarking technique based on fractional calculus. We have constructed a domain using the Heaviside function of order alpha (HFOA). The HFOA models the signal as a polynomial for watermark embedding. The watermark is embedded in all the coefficients of the image. We have also constructed a fractional variance formula using a fractional Gaussian field. A cross correlation method based on the fractional Gaussian field is used for watermark detection. Furthermore, the proposed method enables blind watermark detection, where the original image is not required during watermark detection, thereby making it more practical than non-blind watermarking techniques. Experimental results confirmed that the proposed technique has a high level of robustness. PMID:25884854
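Although the paper's detector operates in the HFOA domain with a fractional-variance statistic, the underlying cross-correlation test can be sketched generically: embed a pseudorandom ±1 pattern into all coefficients, then detect it by normalised correlation. The spatial-domain embedding, the strength `alpha`, and the decision thresholds below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(1)

def embed(img, w, alpha=5.0):
    """Spread-spectrum-style additive embedding into all coefficients."""
    return img + alpha * w

def correlation(img, w):
    """Normalised cross-correlation used as the detection statistic."""
    a = img - img.mean()
    b = w - w.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

host = rng.normal(128.0, 20.0, (64, 64))          # host image
wm = rng.choice([-1.0, 1.0], size=host.shape)     # pseudorandom watermark
other = rng.choice([-1.0, 1.0], size=host.shape)  # a different key
marked = embed(host, wm)
```

Detection is blind in the same sense as the paper's scheme: only the key (the pseudorandom pattern), not the original image, is needed to compute the statistic.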
An effective hand vein feature extraction method.
Li, Haigang; Zhang, Qian; Li, Chengdong
2015-01-01
As an authentication method developed in recent years, vein recognition technology offers the unique advantages of a biometric. This paper studies the specific procedure for the extraction of hand back vein characteristics. Because different hand positions occur during the collecting process, a suitable vein region orientation method is put forward, allowing the positioned area to be the same for all hand positions. In addition, to eliminate pseudo vein areas, the valley region shape extraction operator is improved and combined with multiple segmentation algorithms. The images are segmented step by step, making the vein texture appear clear and accurate. Lastly, the segmented images are filtered, eroded, and refined; this process filters out most of the pseudo vein information. Finally, a clear vein skeleton diagram is obtained, demonstrating the effectiveness of the algorithm. This paper presents a hand back vein region location method, which makes it possible to rotate and correct the image by working out the inclination degree of the contour at the side of the hand back.
Authentication, privacy, security can exploit brainwave by biomarker
NASA Astrophysics Data System (ADS)
Jenkins, Jeffrey; Sweet, Charles; Sweet, James; Noel, Steven; Szu, Harold
2014-05-01
We seek to augment the current Common Access Control (CAC) card and Personal Identification Number (PIN) verification systems with an additional layer of classified access biometrics. Among proven devices such as fingerprint readers and cameras that can sense the human eye's iris pattern, we introduced a number of users to a sequence of 'grandmother images', or emotionally evoked stimuli response images from other users, as well as one of their own, for the purpose of authentication. We performed testing and evaluation of the Authenticity Privacy and Security (APS) brainwave biometrics, similar to the internal organ of the human eye's iris, which cannot easily be altered. 'Aha' recognition through stimulus-response habituation can serve as a biomarker, similar to keystroke dynamics analysis for inter- and intra-key fluctuation time of a memorized PIN (FIST). Using a non-tethered Electroencephalogram (EEG) wireless smartphone/PC monitor interface, we explore the appropriate stimuli-response biomarker present in DTAB low frequency group waves. Prior to login, the user is shown a series of images on a computer display, having been primed to click the mouse when an image is presented. DTAB waves are collected with a wireless EEG and are sent via smartphone to a cloud-based processing infrastructure. There, we measure fluctuations in DTAB waves from a wireless, non-tethered, single-node EEG device between the Personal Graphic Image Number (PGIN) stimulus image and the response time from an individual's mental performance baseline. Towards that goal, we describe an infrastructure that supports distributed verification for web-based EEG authentication. The performance of machine learning on the relative Power Spectral Density EEG data may uncover features required for subsequent access to web or media content. Our approach provides a scalable framework wrapped into a robust Neuro-Informatics toolkit, viable for use in the biomedical and mental health communities, as well as numerous consumer applications.
Benefits and Limitations of DNA Barcoding and Metabarcoding in Herbal Product Authentication
Raclariu, Ancuta Cristina; Heinrich, Michael; Ichim, Mihael Cristin
2017-01-01
Introduction: Herbal medicines play an important role globally in the health care sector and in industrialised countries they are often considered as an alternative to mono-substance medicines. Current quality and authentication assessment methods rely mainly on morphology and analytical phytochemistry-based methods detailed in pharmacopoeias. Herbal products, however, are often highly processed with numerous ingredients, and even if these analytical methods are accurate for quality control of specific lead or marker compounds, they are of limited suitability for the authentication of biological ingredients. Objective: To review the benefits and limitations of DNA barcoding and metabarcoding in complementing current herbal product authentication. Method: Recent literature relating to DNA-based authentication of medicinal plants, herbal medicines and products is summarised to provide a basic understanding of how DNA barcoding and metabarcoding can be applied to this field. Results: Different methods of quality control and authentication have varying resolution and usefulness along the value chain of these products. DNA barcoding can be used for authenticating products based on single herbal ingredients and DNA metabarcoding for assessment of species diversity in processed products, and both methods should be used in combination with appropriate hyphenated chemical methods for quality control. Conclusions: DNA barcoding and metabarcoding have potential in the context of quality control of both well and poorly regulated supply systems. Standardisation of protocols for DNA barcoding and DNA sequence-based identification is necessary before DNA-based biological methods can be implemented as routine analytical approaches and approved by the competent authorities for use in regulated procedures. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd. PMID:28906059
Takalo, Jouni; Timonen, Jussi; Sampo, Jouni; Rantala, Maaria; Siltanen, Samuli; Lassas, Matti
2014-11-01
A novel method is presented for distinguishing postal stamp forgeries and counterfeit banknotes from genuine samples. The method is based on analyzing differences in paper fibre networks. The main tool is a curvelet-based algorithm for measuring the overall fibre orientation distribution and quantifying its anisotropy. Using a few appropriately chosen parameters makes it possible to distinguish forgeries from genuine originals as concentrated point clouds in a two- or three-dimensional parameter space. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Building an Authentic Leadership Image
ERIC Educational Resources Information Center
Criswell, Corey; Campbell, David
2008-01-01
Your image can be either an asset or a liability for you as a leader. Image building is neither superficial nor unimportant. It's not about creating a false image, but recognizing genuine aspects of yourself that should be coming across to other people--but aren't. Crafting your image requires you to gain a clear picture of the image people are…
PDE based scheme for multi-modal medical image watermarking.
Aherrahrou, N; Tairi, H
2015-11-25
This work deals with copyright protection of digital images, an issue that requires the protection of intellectual property rights. With a large number of medical images interchanged on the Internet every day, ensuring the integrity and authenticity of received images is a challenging task. Digital watermarking techniques have been proposed as a valid solution to this problem. It is worth mentioning that Region Of Interest (ROI)/Region Of Non-Interest (RONI) selection is a significant limitation from which most ROI/RONI-based watermarking schemes suffer, and that in turn limits their applicability. Generally, the ROI/RONI is defined by a radiologist or a computer-aided selection tool, which is not efficient for an institute or health care system where a large number of images must be processed. Therefore, developing an automatic ROI/RONI selection is a challenging task. The major aim of this work is to develop an automatic selection algorithm for the embedding region based on the so-called Partial Differential Equation (PDE) method, thus avoiding ROI/RONI selection problems including: (1) computational overhead, (2) time consumption, and (3) modality-dependent selection. The algorithm is evaluated in terms of imperceptibility, robustness, tamper localization and recovery using MRI, Ultrasound, CT and X-ray grey scale medical images. From the experiments that we have conducted on a database of 100 medical images of four modalities, it can be inferred that our method achieves high imperceptibility while showing good robustness against attacks. Furthermore, the experimental results confirm the effectiveness of the proposed algorithm in detecting and recovering various types of tampering. The highest PSNR value reached over the 100 images is 94.746 dB, while the lowest PSNR value is 60.1272 dB, which demonstrates the high imperceptibility of the proposed method.
Moreover, the Normalized Correlation (NC) between the original watermark and the corresponding extracted watermark is computed for all 100 images. We obtain NC values greater than or equal to 0.998, which indicates that the extracted watermark is very similar to the original watermark for all modalities. The key features of our proposed method are to (1) increase the robustness of the watermark against attacks; (2) provide more transparency of the embedded watermark; (3) provide more authenticity and integrity protection of the content of medical images; and (4) minimize ROI/RONI selection complexity.
Novel continuous authentication using biometrics
NASA Astrophysics Data System (ADS)
Dubey, Prakash; Patidar, Rinku; Mishra, Vikas; Norman, Jasmine; Mangayarkarasi, R.
2017-11-01
We explore whether a classifier can consistently verify clients who interact with the computer, using the camera and the behavior of users. In this paper we propose a new way of authenticating a user which captures many images of the user at random times and analyses the user's touch-biometric behavior. In this system, the touch behavior of a client/user recorded during an enrolment stage is stored in the database, and the mean-time behavior is checked over equal partitions of time. This touch behavior is able to accept or reject the user, making the use of biometrics more accurate. The planned workflow is as follows: the user is asked a single time to allow a picture to be taken before login. The system then takes images of the user automatically, without further permission, and stores them in the database. These images are compared with the existing image of the user, and acceptance or rejection depends on this comparison. The touch behavior, together with the number of touches made in equal amounts of time, continues to be stored. The touch behavior and the images finally perform authentication of the user automatically.
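The mean-time touch-behavior check can be sketched as a simple statistical rule: enroll a profile of inter-touch intervals, then accept a session only if its mean interval falls within a few standard deviations of the enrolled mean. The Gaussian model, the 3-sigma tolerance, and the interval values below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def enroll(intervals):
    """Build a simple touch-rhythm profile from enrolment data."""
    return float(np.mean(intervals)), float(np.std(intervals))

def verify(intervals, profile, k=3.0):
    """Accept if the session's mean inter-touch interval lies within
    k standard deviations of the enrolled mean."""
    mu, sigma = profile
    return abs(float(np.mean(intervals)) - mu) <= k * sigma

rng = np.random.default_rng(2)
enrolled = rng.normal(0.50, 0.05, 200)        # seconds between touches
profile = enroll(enrolled)

genuine_session = rng.normal(0.50, 0.05, 30)  # same rhythm as enrolment
impostor_session = rng.normal(0.90, 0.05, 30) # noticeably slower rhythm
```

In the full system this behavioral check would be fused with the face-image comparison; here only the rhythm test is shown.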
Benefits and Limitations of DNA Barcoding and Metabarcoding in Herbal Product Authentication.
Raclariu, Ancuta Cristina; Heinrich, Michael; Ichim, Mihael Cristin; de Boer, Hugo
2018-03-01
Herbal medicines play an important role globally in the health care sector and in industrialised countries they are often considered as an alternative to mono-substance medicines. Current quality and authentication assessment methods rely mainly on morphology and analytical phytochemistry-based methods detailed in pharmacopoeias. Herbal products, however, are often highly processed with numerous ingredients, and even if these analytical methods are accurate for quality control of specific lead or marker compounds, they are of limited suitability for the authentication of biological ingredients. To review the benefits and limitations of DNA barcoding and metabarcoding in complementing current herbal product authentication. Recent literature relating to DNA-based authentication of medicinal plants, herbal medicines and products is summarised to provide a basic understanding of how DNA barcoding and metabarcoding can be applied to this field. Different methods of quality control and authentication have varying resolution and usefulness along the value chain of these products. DNA barcoding can be used for authenticating products based on single herbal ingredients and DNA metabarcoding for assessment of species diversity in processed products, and both methods should be used in combination with appropriate hyphenated chemical methods for quality control. DNA barcoding and metabarcoding have potential in the context of quality control of both well and poorly regulated supply systems. Standardisation of protocols for DNA barcoding and DNA sequence-based identification is necessary before DNA-based biological methods can be implemented as routine analytical approaches and approved by the competent authorities for use in regulated procedures. © 2017 The Authors. Phytochemical Analysis Published by John Wiley & Sons Ltd.
Image-based electronic patient records for secured collaborative medical applications.
Zhang, Jianguo; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Yao, Yihong; Cai, Weihua; Jin, Jin; Zhang, Guozhen; Sun, Kun
2005-01-01
We developed a Web-based system to interactively display image-based electronic patient records (EPR) for secured intranet and Internet collaborative medical applications. The system consists of four major components: an EPR DICOM gateway (EPR-GW), an image-based EPR repository server (EPR-Server), a Web server and an EPR DICOM viewer (EPR-Viewer). In the EPR-GW and EPR-Viewer, security modules for Digital Signature and Authentication are integrated to perform security processing on the EPR data with integrity and authenticity. The privacy of EPR during data communication and exchange is provided by SSL/TLS-based secure communication. This work presents a new approach to creating and managing image-based EPR from actual patient records, and also a way to use Web technology and the DICOM standard to build an open architecture for collaborative medical applications.
RMB identification based on polarization parameters inversion imaging
NASA Astrophysics Data System (ADS)
Liu, Guoyan; Gao, Kun; Liu, Xuefeng; Ni, Guoqiang
2016-10-01
Social order is threatened by counterfeit money, and conventional anti-counterfeiting technology is too dated to reliably identify authenticity. The intrinsic difference between genuine and counterfeit notes lies in the paper tissue. In this paper a new technology for detecting RMB is introduced: the polarization parameter indirect microscopic imaging technique. A conventional reflection microscopic system is used as the basic optical system, with polarization-modulation optics inserted into it. The near-field structural characteristics are delivered by optical wave and material coupling. According to coupling and conduction physics, the changes of the optical wave parameters are calculated to obtain the image intensity curves. By analyzing the near-field polarization parameters at the nanoscale, indirect polarization parameter images of the fibers of the paper tissue are finally calculated in order to identify authenticity.
Alizadeh, Mojtaba; Zamani, Mazdak; Baharun, Sabariah; Abdul Manaf, Azizah; Sakurai, Kouichi; Anada, Hiroki; Keshavarz, Hassan; Ashraf Chaudhry, Shehzad; Khurram Khan, Muhammad
2015-01-01
Proxy Mobile IPv6 is a network-based localized mobility management protocol that supports mobility without mobile nodes' participation in mobility signaling. The details of the user authentication procedure are not specified in this standard; hence, many authentication schemes have been proposed for it. In 2013, Chuang et al. proposed an authentication method for PMIPv6, called SPAM. Although Chuang et al.'s scheme protects the network against some security attacks, it is still vulnerable to impersonation and password guessing attacks. In addition, we discuss other security drawbacks, such as the lack of a revocation procedure in case of a lost or stolen device, and anonymity issues of Chuang et al.'s scheme. We further propose an enhanced authentication method to mitigate the security issues of the SPAM method and evaluate our scheme using BAN logic. PMID:26580963
Digital authentication with copy-detection patterns
NASA Astrophysics Data System (ADS)
Picard, Justin
2004-06-01
Technologies for making high-quality copies of documents are getting more available, cheaper, and more efficient. As a result, the counterfeiting business engenders huge losses, ranging from 5% to 8% of worldwide sales of brand products, and endangers the reputation and value of the brands themselves. Moreover, the growth of the Internet drives the business of counterfeited documents (fake IDs, university diplomas, checks, and so on), which can be bought easily and anonymously from hundreds of companies on the Web. The incredible progress of digital imaging equipment has put in question the very possibility of verifying the authenticity of documents: how can we discern genuine documents from seemingly "perfect" copies? This paper proposes a solution based on creating digital images with specific properties, called copy-detection patterns (CDPs), that are printed on arbitrary documents, packages, etc. CDPs make optimal use of an "information loss principle": every time an image is printed or scanned, some information is lost about the original digital image. That principle applies even to the highest quality scanning, digital imaging, printing or photocopying equipment today, and will likely remain true tomorrow. By measuring the amount of information contained in a scanned CDP, the CDP detector can make a decision on the authenticity of the document.
Robust authentication through stochastic femtosecond laser filament induced scattering surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Haisu; Tzortzakis, Stelios, E-mail: stzortz@iesl.forth.gr; Materials Science and Technology Department, University of Crete, 71003 Heraklion
2016-05-23
We demonstrate a reliable authentication method based on femtosecond laser filament induced scattering surfaces. The stochastic nature of the nonlinear laser fabrication results in uniquely robust authentication properties. This work provides a simple and viable solution for practical applications in product authentication, while also opening the way for incorporating such elements in transparent media and coupling them into integrated optical circuits.
Image counter-forensics based on feature injection
NASA Astrophysics Data System (ADS)
Iuliani, M.; Rossetto, S.; Bianchi, T.; De Rosa, Alessia; Piva, A.; Barni, M.
2014-02-01
Starting from the concept that many image forensic tools are based on the detection of some features revealing a particular aspect of the history of an image, in this work we model the counter-forensic attack as the injection of a specific fake feature pointing to the same history of an authentic reference image. We propose a general attack strategy that does not rely on a specific detector structure. Given a source image x and a target image y, the adversary processes x in the pixel domain producing an attacked image ~x, perceptually similar to x, whose feature f(~x) is as close as possible to f(y) computed on y. Our proposed counter-forensic attack consists in the constrained minimization of the feature distance Φ(z) = ‖f(z) - f(y)‖ through iterative methods based on gradient descent. To solve the intrinsic limit due to the numerical estimation of the gradient on large images, we propose the application of a feature decomposition process, that allows the problem to be reduced into many subproblems on the blocks the image is partitioned into. The proposed strategy has been tested by attacking three different features and its performance has been compared to state-of-the-art counter-forensic methods.
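The minimization of Φ can be sketched on a single block with a toy feature (global mean and standard deviation) and the finite-difference gradient estimation the paper resorts to. The feature choice, step size, and stopping rule are illustrative assumptions, and the perceptual-similarity constraint is omitted for brevity.

```python
import numpy as np

def feature(z):
    """Toy feature vector: global mean and standard deviation."""
    return np.array([z.mean(), z.std()])

def phi(z, fy):
    """Squared feature distance to the target feature fy."""
    d = feature(z) - fy
    return float(d @ d)

def attack(x, y, steps=200, lr=2.0, eps=1e-4):
    """Finite-difference gradient descent on phi(z) = ||f(z) - f(y)||^2,
    mirroring the pixel-domain iterative scheme on one block."""
    z = x.astype(float).copy()
    fy = feature(y)
    for _ in range(steps):
        base = phi(z, fy)
        grad = np.zeros_like(z)
        for idx in np.ndindex(*z.shape):   # per-pixel numerical gradient
            z[idx] += eps
            grad[idx] = (phi(z, fy) - base) / eps
            z[idx] -= eps
        z -= lr * grad
    return z

rng = np.random.default_rng(4)
x = rng.normal(100.0, 10.0, (8, 8))   # source block
y = rng.normal(140.0, 25.0, (8, 8))   # target ("authentic") reference
z = attack(x, y)
```

The per-pixel numerical gradient is exactly why the paper introduces block decomposition: on a full-size image this loop would be prohibitive, but on small blocks it is cheap.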
Digital image modification detection using color information and its histograms.
Zhou, Haoyu; Shen, Yue; Zhu, Xinghui; Liu, Bo; Fu, Zigang; Fan, Na
2016-09-01
The rapid development of many open source and commercial image editing software packages makes the authenticity of digital images questionable. Copy-move forgery is one of the most widely used tampering techniques to create desirable objects or conceal undesirable objects in a scene. Existing techniques reported in the literature to detect such tampering aim to improve robustness against JPEG compression, blurring, noise, or other types of post-processing operations, which are frequently used with the intention of concealing tampering and reducing tampering clues. A robust method based on color moments and five other image descriptors is proposed in this paper. The method divides the image into fixed-size overlapping blocks. A clustering operation divides the entire search space into smaller pieces with similar color distributions. Blocks from the tampered regions will reside within the same cluster, since both the copied and the moved regions have similar color distributions. Five image descriptors are used to extract block features, which makes the method more robust to post-processing operations. An ensemble of deep compositional pattern-producing neural networks is trained with these extracted features. Similarity among feature vectors in clusters indicates possible forged regions. Experimental results show that the proposed method can detect copy-move forgery even if an image was distorted by gamma correction, additive white Gaussian noise, JPEG compression, or blurring. Copyright © 2016. Published by Elsevier Ireland Ltd.
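The block-matching core of such copy-move detectors can be sketched in a few lines: compute moment features for fixed-size overlapping blocks, sort the feature vectors so near-duplicates become neighbours (a simple stand-in for the paper's clustering and neural-network stages), and keep pairs that match closely but lie far apart spatially. The grayscale moments, tolerances, and block sizes are illustrative assumptions.

```python
import numpy as np

def block_features(img, b=8, step=4):
    """Intensity moments of fixed-size overlapping blocks (a grayscale
    stand-in for the paper's colour moments)."""
    out = []
    for i in range(0, img.shape[0] - b + 1, step):
        for j in range(0, img.shape[1] - b + 1, step):
            blk = img[i:i + b, j:j + b].astype(float)
            f = (blk.mean(), blk.std(), np.abs(blk - blk.mean()).mean())
            out.append(((i, j), f))
    return out

def find_copy_move(img, b=8, step=4, tol=1e-9, min_dist=12):
    """Sort blocks by feature vector so near-duplicates become
    neighbours, then keep pairs far enough apart spatially."""
    feats = sorted(block_features(img, b, step), key=lambda t: t[1])
    pairs = []
    for (p1, f1), (p2, f2) in zip(feats, feats[1:]):
        close = all(abs(u - v) < tol for u, v in zip(f1, f2))
        far = np.hypot(p1[0] - p2[0], p1[1] - p2[1]) >= min_dist
        if close and far:
            pairs.append((p1, p2))
    return pairs

rng = np.random.default_rng(5)
img = rng.uniform(0, 255, (64, 64))
img[40:56, 40:56] = img[0:16, 0:16]      # simulated copy-move forgery
matches = find_copy_move(img)
```

Each reported pair links a block in the source region to its clone; in a robust detector the exact-match tolerance would be loosened to survive compression and noise.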
NASA Astrophysics Data System (ADS)
Leni, Pierre-Emmanuel; Fougerolle, Yohan D.; Truchetet, Frédéric
2014-05-01
We propose a progressive transmission approach for an image authenticated using an overlapping subimage that can be removed to restore the original image. Our approach differs from most visible watermarking approaches that allow one to later remove the watermark, because the mark is not directly introduced in the two-dimensional image space. Instead, it is applied to an equivalent monovariate representation of the image. Specifically, the approach builds on our progressive transmission scheme that relies on a modified Kolmogorov spline network, and therefore inherits its advantages: resilience to packet losses during transmission and support for heterogeneous display environments. The marked image can be accessed at any intermediate resolution, and a key is needed to remove the mark to fully recover the original image without loss. Moreover, the key can be different for every resolution, and the images can be globally restored in case of packet losses during the transmission. Our contributions lie in the proposition of decomposing a mark (an overlapping image) and an image into monovariate functions following the Kolmogorov superposition theorem, and in the combination of these monovariate functions to provide a removable visible "watermarking" of images with the ability to restore the original image using a key.
Raman, Vijayasankar; Avula, Bharathi; Galal, Ahmed M; Wang, Yan-Hong; Khan, Ikhlas A
2013-01-01
Yohimbine is the major alkaloid found in the stem bark of yohimbe, Pausinystalia johimbe (Rubiaceae), an evergreen tree native to Africa. The objectives of the current study were to provide a detailed anatomy of yohimbe bark, as well as to determine the quantity of yohimbine in the raw yohimbe products sold online. Twelve commercial raw materials of yohimbe were analyzed by microscopic and ultra performance liquid chromatography-UV-MS methods. The study revealed that three samples were probably adulterated and four other samples contained various levels of impurities. Yohimbine was not detected in one sample, whereas its presence in other samples was found to be in the range 0.1-0.91%. The present work also provides a detailed anatomy of the stem bark of yohimbe, with light and scanning electron microscopy images, for proper identification and authentication.
Selectively Encrypted Pull-Up Based Watermarking of Biometric data
NASA Astrophysics Data System (ADS)
Shinde, S. A.; Patel, Kushal S.
2012-10-01
Biometric authentication systems are becoming increasingly popular due to their potential usage in information security. However, digital biometric data (e.g., a thumb impression) are themselves vulnerable to security attacks. Various methods are available to secure biometric data. In biometric watermarking the data are embedded in a container image and can be retrieved only if the secret key is available. This container image is encrypted for greater security against attack. As wireless devices rely on batteries as their power supply, they have limited computational capabilities; therefore, to reduce energy consumption we use selective encryption of the container image. The bit pull-up-based biometric watermarking scheme is based on amplitude modulation and bit priority, which reduces the retrieval error rate to a great extent. By using a selective encryption mechanism we expect greater time efficiency during both encryption and decryption. A significant reduction in error rate is expected from the bit pull-up method.
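The abstract does not give implementation details, but the general idea of embedding biometric bits into a container image at key-determined pixel positions can be sketched as follows. This is a minimal illustration, not the paper's actual bit pull-up algorithm; the key-derived pixel ordering and all names are assumptions.

```python
import random

def embed_bits(container, bits, secret_key):
    """Embed watermark bits into pixel LSBs at key-derived positions.

    `container` is a flat list of 8-bit pixel values; the secret key
    seeds a pseudo-random pixel ordering, so retrieval without the
    key fails.
    """
    marked = list(container)
    order = list(range(len(marked)))
    random.Random(secret_key).shuffle(order)
    for bit, pos in zip(bits, order):
        marked[pos] = (marked[pos] & ~1) | bit
    return marked

def extract_bits(marked, n_bits, secret_key):
    order = list(range(len(marked)))
    random.Random(secret_key).shuffle(order)
    return [marked[pos] & 1 for pos in order[:n_bits]]

# Toy example: hide a 16-bit template fragment in a 64-pixel container.
template_bits = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
rng = random.Random(1)
container = [rng.randrange(256) for _ in range(64)]
marked = embed_bits(container, template_bits, "k1")
assert extract_bits(marked, 16, "k1") == template_bits
```

Because only least significant bits change, each marked pixel differs from the original by at most 1, which keeps the container visually intact.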
Proteomics for the authentication of fish species.
Mazzeo, Maria Fiorella; Siciliano, Rosa Anna
2016-09-16
Assessment of seafood authenticity and origin, mainly in the case of processed products (fillets, sticks, baby food), represents the crucial point in preventing fraudulent deception, thus guaranteeing market transparency and consumer health. The most dangerous practice that jeopardizes fish safety is intentional or unintentional mislabeling, originating from the substitution of valuable fish species with inferior ones. Conventional analytical methods for fish authentication are becoming inadequate to comply with the strict regulations issued by the European Union and with the increase of mislabeling due to the introduction of new fish species on the market and market globalization. This evidence prompts the development of high-throughput approaches suitable for identifying unambiguous biomarkers of authenticity and screening a large number of samples with minimal time consumption. Proteomics provides suitable and powerful tools to investigate the main aspects of food quality and safety and has made an important contribution to biomarker discovery applied to food authentication. This report describes the most relevant methods developed to assess fish identity and offers a perspective on their potential in the evaluation of fish quality and safety, thus depicting the key role of proteomics in the authentication of fish species and processed products. The assessment of fishery product authenticity is a main issue in the quality control process, as deceptive practices could imply severe health risks. Proteomics-based methods could significantly contribute to detecting falsification and fraud, thus becoming a reliable operative first-line testing resource in food authentication.
Efficient model checking of network authentication protocol based on SPIN
NASA Astrophysics Data System (ADS)
Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan
2013-03-01
Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying protocols with model checking technology, this paper first proposes a universal formal description method for protocols. Combined with the model checker SPIN, the method can conveniently verify protocol properties. With some simplifying modeling strategies, several protocols can be modeled efficiently and the state space of the model reduced. Compared with the previous literature, this work achieves a higher degree of automation and better verification efficiency. Finally, based on the described method, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and useful for other authentication protocols.
Authenticity techniques for PACS images and records
NASA Astrophysics Data System (ADS)
Wong, Stephen T. C.; Abundo, Marco; Huang, H. K.
1995-05-01
Along with the digital radiology environment supported by picture archiving and communication systems (PACS) comes a new problem: How to establish trust in multimedia medical data that exist only in the easily altered memory of a computer. Trust is characterized in terms of integrity and privacy of digital data. Two major self-enforcing techniques can be used to assure the authenticity of electronic images and text -- key-based cryptography and digital time stamping. Key-based cryptography associates the content of an image with the originator using one or two distinct keys and prevents alteration of the document by anyone other than the originator. A digital time stamping algorithm generates a characteristic `digital fingerprint' for the original document using a mathematical hash function, and checks that it has not been modified. This paper discusses these cryptographic algorithms and their appropriateness for a PACS environment. It also presents experimental results of cryptographic algorithms on several imaging modalities.
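The digital time stamping idea described above, generating a characteristic "digital fingerprint" with a hash function and later checking that the document has not been modified, can be sketched in a few lines. SHA-256 is used here as a representative hash; the paper does not commit to a specific function.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a characteristic digital fingerprint of the document."""
    return hashlib.sha256(data).hexdigest()

original = b"DICOM pixel data ..."  # stand-in for an image byte stream
fp = fingerprint(original)

# Any later alteration, even a single added byte, changes the fingerprint.
tampered = original + b"\x00"
assert fingerprint(original) == fp
assert fingerprint(tampered) != fp
```

In a PACS setting the fingerprint would be registered with a time stamping service at acquisition time, so later recomputation proves both integrity and the time of existence.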
A Continuous Identity Authentication Scheme Based on Physiological and Behavioral Characteristics.
Wu, Guannan; Wang, Jian; Zhang, Yongrong; Jiang, Shuai
2018-01-10
Wearable devices have flourished over the past ten years providing great advantages to people and, recently, they have also been used for identity authentication. Most of the authentication methods adopt a one-time authentication manner which cannot provide continuous certification. To address this issue, we present a two-step authentication method based on an own-built fingertip sensor device which can capture motion data (e.g., acceleration and angular velocity) and physiological data (e.g., a photoplethysmography (PPG) signal) simultaneously. When the device is worn on the user's fingertip, it will automatically recognize whether the wearer is a legitimate user or not. More specifically, multisensor data is collected and analyzed to extract representative and intensive features. Then, human activity recognition is applied as the first step to enhance the practicability of the authentication system. After correctly discriminating the motion state, a one-class machine learning algorithm is applied for identity authentication as the second step. When a user wears the device, the authentication process is carried on automatically at set intervals. Analyses were conducted using data from 40 individuals across various operational scenarios. Extensive experiments were executed to examine the effectiveness of the proposed approach, which achieved an average accuracy rate of 98.5% and an F1-score of 86.67%. Our results suggest that the proposed scheme provides a feasible and practical solution for authentication.
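The second step, one-class authentication on extracted features, can be illustrated with a deliberately simple stand-in: a per-user profile consisting of the training mean plus a radius covering the legitimate samples. The paper's actual one-class learner and PPG/motion features are not reproduced; all values below are toy assumptions.

```python
import math

def train_profile(feature_vectors):
    """Build a one-class profile: per-dimension mean plus a radius
    covering the legitimate user's training samples."""
    dims = len(feature_vectors[0])
    mean = [sum(v[d] for v in feature_vectors) / len(feature_vectors)
            for d in range(dims)]
    radius = max(math.dist(v, mean) for v in feature_vectors)
    return mean, radius

def authenticate(profile, sample, slack=1.2):
    """Accept the wearer if the sample lies inside the profile ball."""
    mean, radius = profile
    return math.dist(sample, mean) <= radius * slack

# Toy features: the legitimate user's samples cluster near (1.0, 0.5).
train = [(1.0, 0.5), (1.1, 0.45), (0.95, 0.55), (1.05, 0.5)]
profile = train_profile(train)
assert authenticate(profile, (1.02, 0.5))      # genuine wearer
assert not authenticate(profile, (3.0, 2.0))   # impostor
```

Running this check at set intervals, as the scheme does, turns a one-time decision into continuous certification.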
How to Speak an Authentication Secret Securely from an Eavesdropper
NASA Astrophysics Data System (ADS)
O'Gorman, Lawrence; Brotman, Lynne; Sammon, Michael
When authenticating over the telephone or a mobile headset, the user cannot always ensure that no eavesdropper hears the password or authentication secret. We describe an eavesdropper-resistant, challenge-response authentication scheme for spoken authentication in which an attacker can hear the user's voiced responses. The scheme requires the user to memorize a small number of plaintext-ciphertext pairs. At authentication, these are challenged in random order and interspersed with camouflage elements. It is shown that the response can be made to appear random, so that eavesdroppers learn no information about the memorized secret. We describe the method along with parameter-value tradeoffs among security strength, authentication time, and memory effort. This scheme was designed for user authentication of wireless headsets used for hands-free communication by healthcare staff at a hospital.
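A rough sketch of such a session, memorized pairs challenged in random order among camouflage items, might look like this. The pair values, the decoy scheme, and the verification rule are illustrative assumptions, not the authors' exact parameters.

```python
import random
import secrets

# Hypothetical memorized secret: four plaintext -> ciphertext pairs.
secret_pairs = {"red": "7", "blue": "3", "oak": "9", "lamp": "2"}

def make_challenge_session(pairs, n_camouflage=4):
    """Mix real challenges with camouflage items in random order.
    Camouflage items accept any spoken digit, so an eavesdropper
    cannot tell which responses carried secret information."""
    session = [(word, True) for word in pairs]
    session += [(f"decoy{i}", False) for i in range(n_camouflage)]
    random.shuffle(session)
    return session

def verify(pairs, session, responses):
    """Only the responses to real challenges must match the secret."""
    return all(responses[i] == pairs[word]
               for i, (word, is_real) in enumerate(session) if is_real)

session = make_challenge_session(secret_pairs)
# A legitimate user answers real items from memory, decoys with any digit.
responses = [secret_pairs.get(word, secrets.choice("0123456789"))
             for word, _ in session]
assert verify(secret_pairs, session, responses)
```

Because decoy answers are arbitrary digits, a transcript of one session looks like a uniformly random digit string to a listener who does not know which challenges were real.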
Developing a multimodal biometric authentication system using soft computing methods.
Malcangi, Mario
2015-01-01
Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize its applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision making.
Sample Preparation for Mass Spectrometry Imaging of Plant Tissues: A Review
Dong, Yonghui; Li, Bin; Malitsky, Sergey; Rogachev, Ilana; Aharoni, Asaph; Kaftan, Filip; Svatoš, Aleš; Franceschi, Pietro
2016-01-01
Mass spectrometry imaging (MSI) is a mass spectrometry based molecular ion imaging technique. It provides the means for ascertaining the spatial distribution of a large variety of analytes directly on tissue sample surfaces without any labeling or staining agents. These advantages make it an attractive molecular histology tool in medical, pharmaceutical, and biological research. Likewise, MSI has started gaining popularity in plant sciences; yet, information regarding sample preparation methods for plant tissues is still limited. Sample preparation is a crucial step that is directly associated with the quality and authenticity of the imaging results; it therefore demands in-depth study based on the characteristics of plant samples. In this review, a sample preparation pipeline is discussed in detail and illustrated through selected practical examples. In particular, special concerns regarding sample preparation for plant imaging are critically evaluated. Finally, the applications of MSI techniques in plants are reviewed according to different classes of plant metabolites. PMID:26904042
Watermarking of ultrasound medical images in teleradiology using compressed watermark
Badshah, Gran; Liew, Siau-Chuin; Zain, Jasni Mohamad; Ali, Mushtaq
2016-01-01
The open accessibility of Internet-based medical images in teleradiology faces security threats due to the nonsecured communication media. This paper discusses spatial-domain watermarking of ultrasound medical images for content authentication, tamper detection, and lossless recovery. For this purpose, the image is divided into two main parts, the region of interest (ROI) and the region of noninterest (RONI). The defined ROI and its hash value are combined as the watermark, losslessly compressed, and embedded into the RONI part of the image at the pixels' least significant bits (LSBs). Lossless compression of the watermark and embedding at the pixels' LSBs preserve the image's diagnostic and perceptual qualities. Different lossless compression techniques, including Lempel-Ziv-Welch (LZW), were tested for watermark compression, and their performance was compared on bit reduction and compression ratio. LZW was found to be better than the others and was used to develop a tamper detection and recovery watermarking of medical images (TDARWMI) scheme for ROI authentication, tamper detection, localization, and lossless recovery. TDARWMI's performance was compared with and found to be better than that of other watermarking schemes. PMID:26839914
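The ROI-hash-compress-embed pipeline above can be sketched end to end. This is a hedged illustration: zlib stands in for the paper's LZW coder, SHA-256 for its hash, and the flat pixel list for a real RONI region.

```python
import hashlib
import zlib

def build_watermark(roi_bytes):
    """Watermark = ROI content plus its hash, losslessly compressed.
    (zlib stands in here for the LZW coder used in the paper.)"""
    payload = roi_bytes + hashlib.sha256(roi_bytes).digest()
    return zlib.compress(payload)

def embed_in_roni(roni_pixels, watermark):
    """Write the watermark bit stream into RONI pixel LSBs."""
    bits = [(byte >> k) & 1 for byte in watermark for k in range(8)]
    assert len(bits) <= len(roni_pixels), "RONI too small"
    out = list(roni_pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out, len(bits)

def extract_and_check(roni_pixels, n_bits):
    """Recover the ROI from the RONI LSBs and verify its hash."""
    bits = [p & 1 for p in roni_pixels[:n_bits]]
    data = bytes(sum(bits[i + k] << k for k in range(8))
                 for i in range(0, n_bits, 8))
    payload = zlib.decompress(data)
    roi, digest = payload[:-32], payload[-32:]
    return roi, hashlib.sha256(roi).digest() == digest

roi = b"ultrasound ROI block"
wm = build_watermark(roi)
roni = [128] * (len(wm) * 8)          # toy RONI: one pixel per bit
marked, n = embed_in_roni(roni, wm)
recovered, authentic = extract_and_check(marked, n)
assert authentic and recovered == roi
```

A failed hash check after extraction localizes tampering to the ROI, while the embedded copy allows lossless recovery, mirroring the TDARWMI goals.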
Fong, Simon; Zhuang, Yan
2012-01-01
User authentication has been widely used by biometric applications that work on unique bodily features, such as fingerprints, retina scans, and palm vessel recognition. This paper proposes a novel concept of biometric authentication that exploits a user's medical history. Although a medical history may not be absolutely unique to every individual, the chances of two persons sharing an exactly identical trail of medical and prognosis history are slim. Therefore, in addition to common biometric identification methods, medical history can be used as an ingredient for generating Q&A challenges upon user authentication. This concept is motivated by a recent advancement in smart-card technology whereby future identity cards will be able to carry patients' medical history like a mobile database. Privacy, however, may be a concern when medical history is used for authentication. Therefore, in this paper a new method is proposed for abstracting the medical data, using attribute value taxonomies, into a hierarchical data tree (h-Data). Questions can be abstracted to various levels of resolution (and hence sensitivity of private data) for use in the authentication process. The method is described and a case study is given in this paper.
Cross spectral, active and passive approach to face recognition for improved performance
NASA Astrophysics Data System (ADS)
Grudzien, A.; Kowalski, M.; Szustakowski, M.
2017-08-01
Biometrics is a technique for automatic recognition of a person based on physiological or behavioral characteristics. Since the characteristics used are unique, biometrics can create a direct link between a person and an identity, based on a variety of characteristics. The human face is one of the most important biometric modalities for automatic authentication. The most popular face recognition methods, which rely on processing visual-range information, remain imperfect. Thermal infrared imagery may be a promising alternative or complement to visible-range imaging for several reasons. This paper presents an approach that combines both methods.
ERIC Educational Resources Information Center
Abe, D.; And Others
This discussion shows that "authentic documents" are a basic tool for the acquisition of communicative competence in a second language. An authentic document is a sort of photograph of discourse produced at a given time and in a given place. Like a cliche, it has its own existence. Two reasons for choosing authentic documents in second language…
Secure password-based authenticated key exchange for web services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Fang; Meder, Samuel; Chevassut, Olivier
This paper discusses an implementation of an authenticated key-exchange method rendered on message primitives defined in the WS-Trust and WS-SecureConversation specifications. This IEEE-specified cryptographic method (AuthA) is proven-secure for password-based authentication and key exchange, while the WS-Trust and WS-SecureConversation are emerging Web Services Security specifications that extend the WS-Security specification. A prototype of the presented protocol is integrated in the WSRF-compliant Globus Toolkit V4. Further hardening of the implementation is expected to result in a version that will be shipped with future Globus Toolkit releases. This could help to address the current unavailability of decent shared-secret-based authentication options in the Web Services and Grid world. Future work will be to integrate One-Time-Password (OTP) features in the authentication protocol.
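The OTP integration mentioned as future work is not specified in the abstract, but a standard counter-based one-time password (HOTP, RFC 4226) illustrates the kind of shared-secret verification a service could perform; the look-ahead `window` for counter drift is an assumption.

```python
import hashlib
import hmac
import struct

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 counter-based one-time password (HOTP)."""
    msg = struct.pack(">Q", counter)
    mac = hmac.new(key, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify_otp(key, candidate, counter, window=3):
    """Accept the OTP if it matches any counter in a small window."""
    return any(hotp(key, counter + i) == candidate
               for i in range(window))

key = b"12345678901234567890"
assert hotp(key, 0) == "755224"   # RFC 4226 Appendix D test vector
assert verify_otp(key, hotp(key, 2), counter=0)
```

Since each password is valid only once, a captured value is useless for replay, which is the property that makes OTP attractive for shared-secret authentication over Web Services transports.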
A novel method for rotation invariant palm print image stitching
NASA Astrophysics Data System (ADS)
Rao, Shishir Paramathma; Panetta, Karen; Agaian, Sos S.
2017-05-01
Although not as popular as fingerprint biometrics, palm prints have garnered interest in scientific community for the rich amount of distinctive information available on the palm. In this paper, a novel method for touchless palm print stitching to increase the effective area is presented. The method is not only rotation invariant but also able to robustly handle many distortions of touchless systems like illumination variations, pose variations etc. The proposed method also can handle partial palmprints, which have a high chance of occurrence in a scene of crime, by stitching them together to produce a much larger-to-full size palmprint for authentication purpose. Experiment results are shown for IIT-D palmprint database, from which pseudo partial palmprints were generated by cropping and randomly rotating them. Furthermore, the quality of stitching algorithm is determined by extensive computer simulations and visual analysis of the stitched image. Experimental results also show that the stitching significantly increases the area of palm image for feature point detection and hence provides a way to increase the accuracy and reliability of detection.
A critical review on the applications of artificial neural networks in winemaking technology.
Moldes, O A; Mejuto, J C; Rial-Otero, R; Simal-Gandara, J
2017-09-02
Since their development in 1943, artificial neural networks have spread into applications in many fields. The last twenty years have brought their introduction into winemaking, where they have been applied for four basic purposes: authenticity assurance systems, electronic sensory devices, production optimization methods, and artificial vision in image treatment tools, with successful and promising results. This work reviews the most significant approaches for neural networks in winemaking technologies with the aim of producing a clear and useful review document.
Chaotic Image Encryption of Regions of Interest
NASA Astrophysics Data System (ADS)
Xiao, Di; Fu, Qingqing; Xiang, Tao; Zhang, Yushu
Since different regions of an image have different importance, only the important information of the image regions in which users are really interested needs to be encrypted and protected emphatically in some special multimedia applications. However, regions of interest (ROI) are usually irregular parts, such as the face and the eyes. Assuming the bulk data are transmitted without damage, we propose a chaotic image encryption algorithm for ROI. ROI with irregular shapes are chosen and detected arbitrarily. The chaos-based image encryption algorithm, with scrambling, S-box, and diffusion parts, is then used to encrypt the ROI. Next, the whole image is compressed with Huffman coding. Finally, a message authentication code (MAC) of the compressed image is generated based on chaotic maps. The simulation results show that the encryption algorithm has a good security level and can resist various attacks. Moreover, the compression method improves storage and transmission efficiency to some extent, and the MAC ensures the integrity of the transmitted data.
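A minimal sketch of chaos-based ROI encryption uses the logistic map as a keystream generator, with a hash-based MAC standing in for the paper's chaotic MAC. The parameters `r` and `x0` and the byte quantization are illustrative assumptions; the paper's scrambling, S-box, and diffusion stages are not reproduced.

```python
import hashlib

def logistic_keystream(x0, n, r=3.99):
    """Generate n key bytes from the logistic map x -> r*x*(1-x).
    The initial value x0 acts as the secret key."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) & 0xFF)
    return bytes(out)

def encrypt_roi(roi_bytes, x0):
    """XOR the ROI with the chaotic keystream (symmetric operation)."""
    ks = logistic_keystream(x0, len(roi_bytes))
    return bytes(a ^ b for a, b in zip(roi_bytes, ks))

def make_mac(data, x0):
    """Keyed MAC over the transmitted data (a hash stands in for the
    chaotic-map MAC of the paper)."""
    return hashlib.sha256(logistic_keystream(x0, 16) + data).hexdigest()

roi = b"face region pixels"
key = 0.3141592653589793
cipher = encrypt_roi(roi, key)
assert encrypt_roi(cipher, key) == roi   # XOR decryption round-trips
tag = make_mac(cipher, key)
assert make_mac(cipher + b"x", key) != tag   # tampering breaks the MAC
```

The sensitivity of the logistic map to `x0` means even a tiny key error yields a completely different keystream, which is what gives chaotic ciphers their key space.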
Research on user behavior authentication model based on stochastic Petri nets
NASA Astrophysics Data System (ADS)
Zhang, Chengyuan; Xu, Haishui
2017-08-01
A behavioural authentication model based on stochastic Petri nets is proposed to capture the randomness, uncertainty, and concurrency of user behaviour. The places, transitions, arcs, and tokens of the stochastic model are used to describe the various authentication and game relationships, yielding an effective graphical analysis method for the user behaviour authentication model; a corresponding proof verifies that the model is valuable.
An Efficient Semi-fragile Watermarking Scheme for Tamper Localization and Recovery
NASA Astrophysics Data System (ADS)
Hou, Xiang; Yang, Hui; Min, Lianquan
2018-03-01
To address the vulnerability of remote sensing images to tampering, a semi-fragile watermarking scheme is proposed. A binary random matrix is used as the authentication watermark, embedded by quantizing the maximum absolute value of directional sub-band coefficients. The average gray level of every non-overlapping 4×4 block is adopted as the recovery watermark, embedded in the least significant bits. Watermark detection can be done directly, without resorting to the original images. Experimental results showed the method is robust against incidental distortions to a certain extent, while remaining fragile to malicious manipulation, and achieves accurate localization and approximate recovery of tampered regions. This scheme can therefore effectively protect the security of remote sensing images.
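The recovery-watermark idea, storing each non-overlapping 4×4 block's average gray level and comparing it later to localize tampering, can be sketched as follows. LSB embedding of the means is omitted for brevity, and the image, threshold, and tamper pattern are toy assumptions.

```python
def block_means(img, size=4):
    """Average gray level of each non-overlapping size x size block,
    scanned row-major over blocks."""
    h, w = len(img), len(img[0])
    return [sum(img[r][c]
                for r in range(br, br + size)
                for c in range(bc, bc + size)) // (size * size)
            for br in range(0, h, size)
            for bc in range(0, w, size)]

# Recovery watermark: block means of the original 8x8 image.
img = [[100 + ((r + c) % 8) for c in range(8)] for r in range(8)]
recovery = block_means(img)

# Simulate malicious tampering of the top-left block, then localize it
# by comparing stored means against recomputed ones.
tampered = [row[:] for row in img]
for r in range(4):
    for c in range(4):
        tampered[r][c] = 255
damaged = [i for i, (a, b) in enumerate(zip(recovery, block_means(tampered)))
           if abs(a - b) > 2]
assert damaged == [0]   # only the first 4x4 block is flagged
```

Each flagged block can then be approximately restored by painting it with its stored mean, which is exactly the coarse recovery the scheme provides.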
Defining the questions: a research agenda for nontraditional authentication in arms control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hauck, Danielle K; Mac Arthur, Duncan W; Smith, Morag K
Many traditional authentication techniques have been based on hardware solutions. Thus authentication of measurement system hardware has been considered in terms of physical inspection and destructive analysis. Software authentication has implied hash function analysis or authentication tools such as Rose. Continuity of knowledge is maintained through TIDs and cameras. Although there is ongoing progress improving all of these authentication methods, there has been little discussion of the human factors involved in authentication. Issues of non-traditional authentication include sleight-of-hand substitutions, monitor perception vs. reality, and visual diversions. Since monitor confidence in a measurement system depends on the product of their confidences in each authentication element, it is important to investigate all authentication techniques, including the human factors. This paper will present an initial effort to identify the most important problems that traditional authentication approaches in safeguards have not addressed and are especially relevant to arms control verification. This will include a survey of the literature and direct engagement with nontraditional experts in areas like psychology and human factors. Based on the identification of problem areas, potential research areas will be identified and a possible research agenda will be developed.
Screening of adulterants in powdered foods and ingredients using line-scan Raman chemical imaging.
USDA-ARS?s Scientific Manuscript database
A newly developed line-scan Raman imaging system using a 785 nm line laser was used to authenticate powdered foods and ingredients. The system was used to collect hyperspectral Raman images in the range of 102–2865 wavenumber from three representative food powders mixed with selected adulterants eac...
32 CFR 161.7 - ID card life-cycle procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... provide two fingerprint biometric scans and a facial image, to assist with authenticating the applicant's... manner: (i) A digitized, full-face passport-type photograph will be captured for the facial image and stored in DEERS and shall have a plain white or off-white background. No flags, posters, or other images...
Masada, Sayaka
2016-07-01
Various herbal medicines have been developed and used in various parts of the world for thousands of years. Although locally grown indigenous plants were originally used for traditional herbal preparations, Western herbal products are now becoming popular in Japan with the increasing interest in health. At the same time, there are growing concerns about the substitution of ingredients and adulteration of herbal products, highlighting the need for the authentication of the origin of plants used in herbal products. This review describes studies on Cimicifuga and Vitex products developed in Europe and Japan, focusing on establishing analytical methods to evaluate the origins of material plants and finished products. These methods include a polymerase chain reaction-restriction fragment length polymorphism method and a multiplex amplification refractory mutation system method. A genome-based authentication method and liquid chromatography-mass spectrometry-based authentication for black cohosh products, and the identification of two characteristic diterpenes of agnus castus fruit and a shrub chaste tree fruit-specific triterpene derivative are also described.
Dawel, Amy; Palermo, Romina; O'Kearney, Richard; McKone, Elinor
2015-01-01
Much is known about development of the ability to label facial expressions of emotion (e.g., as happy or sad), but rather less is known about the emergence of more complex emotional face processing skills. The present study investigates one such advanced skill: the ability to tell if someone is genuinely feeling an emotion or just pretending (i.e., authenticity discrimination). Previous studies have shown that children can discriminate authenticity of happy faces, using expression intensity as an important cue, but have not tested the negative emotions of sadness or fear. Here, children aged 8-12 years (n = 85) and adults (n = 57) viewed pairs of faces in which one face showed a genuinely-felt emotional expression (happy, sad, or scared) and the other face showed a pretend version. For happy faces, children discriminated authenticity above chance, although they performed more poorly than adults. For sad faces, for which our pretend and genuine images were equal in intensity, adults could discriminate authenticity, but children could not. Neither age group could discriminate authenticity of the fear faces. Results also showed that children judged authenticity based on intensity information alone for all three expressions tested, while adults used a combination of intensity and other factor/s. In addition, novel results show that individual differences in empathy (both cognitive and affective) correlated with authenticity discrimination for happy faces in adults, but not children. Overall, our results indicate late maturity of skills needed to accurately determine the authenticity of emotions from facial information alone, and raise questions about how this might affect social interactions in late childhood and the teenage years.
Advances in high-resolution mass spectrometry based on metabolomics studies for food--a review.
Rubert, Josep; Zachariasova, Milena; Hajslova, Jana
2015-01-01
Food authenticity has become a necessity for global food policies, since food placed on the market must, without fail, be authentic. Verifying it has always been a challenge; in the past, minor components, also called markers, were mainly monitored by chromatographic methods in order to authenticate food. Nowadays, however, advanced analytical methods make food fingerprints attainable, and they have also been combined with chemometrics, which uses statistical methods to verify food and extract maximum information from chemical data. These sophisticated methods, based on different separation techniques or used stand-alone, have recently been coupled to high-resolution mass spectrometry (HRMS) in order to verify the authenticity of food. The new generation of HRMS detectors has seen significant advances in resolving power, sensitivity, robustness, extended dynamic range, easier mass calibration, and tandem mass capabilities, making HRMS more attractive and useful to the food metabolomics community and hence a reliable tool for food authenticity. The purpose of this review is to summarise and describe the most recent metabolomics approaches in the area of food metabolomics, and to discuss the strengths and drawbacks of HRMS analytical platforms combined with chemometrics.
Ai, Jinxia; Wang, Xuesong; Gao, Lijun; Xia, Wei; Li, Mingcheng; Yuan, Guangxin; Niu, Jiamu; Zhang, Lihua
2017-11-01
The use of Fetus cervi, which is derived from the embryo and placenta of Cervus nippon Temminck or Cervus elaphus Linnaeus, has been documented for a long time in China. There are abundant deer species worldwide; among all of them, only those recorded by the Chinese Pharmacopoeia (2010 edition) are authentic, the rest being adulterants/counterfeits. Identification of origin and authenticity therefore became key to the preparation of authentic products. The traditional SDS alkaline lysis and salting-out methods were modified to extract mtDNA and genomic DNA, respectively, from fresh and dry Fetus cervi as well as from the fetuses of other animals. A set of primers was designed by bioinformatics to target intra- and inter-species variation. The mtDNA and genomic DNA extracted from Fetus cervi using the two methods meet the requirements for authentication. Extraction of mtDNA by SDS alkaline lysis is more practical and accurate than extraction of genomic DNA by the salting-out method. The lengths and numbers of PCR-amplified segments differed between mtDNA from authentic Fetus cervi and from other animals' fetuses. The distinctive PCR fingerprint patterns can distinguish Fetus cervi from adulterant and counterfeit animal fetuses.
Kent, Alexander Dale [Los Alamos, NM]
2008-09-02
Methods and systems in a data/computer network for authenticating identifying data transmitted from a client to a server through use of a gateway interface system, where these are communicatively coupled to each other, are disclosed. An authentication packet transmitted from a client to a server of the data network is intercepted by the interface, wherein the authentication packet is encrypted with a one-time password for transmission from the client to the server. The one-time password associated with the authentication packet can be verified utilizing a one-time password token system. The authentication packet can then be modified for acceptance by the server, wherein the response packet generated by the server is thereafter intercepted, verified, and modified for transmission back to the client in a similar but reverse process.
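The gateway's verification step can be sketched with a counter-based HOTP token (RFC 4226). This is a minimal illustration under an assumption: the patent does not specify HOTP, and `hotp`, `gateway_verify`, and the look-ahead `window` are hypothetical names introduced here.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a one-time password from a shared secret and counter (HOTP, RFC 4226)."""
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def gateway_verify(secret: bytes, counter: int, presented: str, window: int = 2):
    """Gateway-side check: accept the OTP if it matches any counter in a small
    look-ahead window, returning the next counter to resynchronise to."""
    for c in range(counter, counter + window + 1):
        if hmac.compare_digest(hotp(secret, c), presented):
            return c + 1
    return None
```

The look-ahead window lets the gateway resynchronise when the client's counter has advanced past the server's, a standard concern for counter-based tokens.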
Chen, Chenglong; Ni, Jiangqun; Shen, Zhaoyi; Shi, Yun Qing
2017-06-01
Geometric transformations, such as resizing and rotation, are almost always needed when two or more images are spliced together to create convincing image forgeries. In recent years, researchers have developed many digital forensic techniques to identify these operations. Most previous works in this area focus on the analysis of images that have undergone single geometric transformations, e.g., resizing or rotation. In several recent works, researchers have addressed yet another practical and realistic situation: successive geometric transformations, e.g., repeated resizing, resizing-rotation, rotation-resizing, and repeated rotation. We will also concentrate on this topic in this paper. Specifically, we present an in-depth analysis in the frequency domain of the second-order statistics of the geometrically transformed images. We give an exact formulation of how the parameters of the first and second geometric transformations influence the appearance of periodic artifacts. The expected positions of characteristic resampling peaks are analytically derived. The theory developed here helps to address the gap left by previous works on this topic and is useful for image security and authentication, in particular, the forensics of geometric transformations in digital images. As an application of the developed theory, we present an effective method that allows one to distinguish between the aforementioned four different processing chains. The proposed method can further estimate all the geometric transformation parameters. This may provide useful clues for image forgery detection.
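The periodic artifacts described above can be demonstrated on a one-dimensional toy signal: with 2x linear interpolation, every other sample is an exact average of its neighbours, so the second-difference energy is periodic and produces a sharp spectral peak. This is an illustrative sketch, not the paper's 2D formulation; the function names are made up here.

```python
import numpy as np

def upsample_linear(x, factor=2):
    """Resample x by `factor` using linear interpolation."""
    n = x.size
    grid = np.arange((n - 1) * factor + 1) / factor
    return np.interp(grid, np.arange(n), x)

def resampling_spectrum(y):
    """Spectrum of the second-difference magnitude; interpolation makes
    this signal periodic, producing a characteristic peak."""
    d = np.abs(np.diff(y, n=2))
    d = d[: 2 * (d.size // 2)]        # even length so the Nyquist bin exists
    return np.abs(np.fft.rfft(d - d.mean()))

rng = np.random.default_rng(0)
x = rng.standard_normal(256)
spec = resampling_spectrum(upsample_linear(x, 2))
# for 2x linear interpolation the dominant peak sits at the Nyquist bin
```

The position of such peaks is exactly what shifts under a second geometric transformation, which is what lets the processing chain be identified.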
A Secured Authentication Protocol for SIP Using Elliptic Curves Cryptography
NASA Astrophysics Data System (ADS)
Chen, Tien-Ho; Yeh, Hsiu-Lien; Liu, Pin-Chuan; Hsiang, Han-Chen; Shih, Wei-Kuan
Session Initiation Protocol (SIP) is a technology regularly employed in Internet telephony, and HTTP digest authentication is one of the major methods for the SIP authentication mechanism. In 2005, Yang et al. pointed out that HTTP digest authentication could not resist server spoofing and off-line guessing attacks, and proposed a secure authentication scheme based on the Diffie-Hellman concept. In 2009, Tsai proposed a nonce-based authentication protocol for SIP. In this paper, we demonstrate that their protocol cannot resist password guessing and insider attacks. Furthermore, we propose an ECC-based authentication mechanism to solve these issues and present a security analysis of our protocol to show that it is suitable for applications with higher security requirements.
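The elliptic-curve operations underlying such a scheme can be illustrated on a textbook toy curve. This is not the authors' protocol; it only sketches an ECDH-style key agreement on the curve y² = x³ + 2x + 2 over GF(17), a standard teaching example whose group has prime order 19 with generator G = (5, 1). Real deployments use standardized curves such as P-256.

```python
# Toy elliptic-curve arithmetic over E: y^2 = x^3 + 2x + 2 (mod 17).
P_MOD, A = 17, 2
G = (5, 1)          # generator of the order-19 group; None is the point at infinity

def ec_add(p, q):
    """Add two points on the curve (None acts as the identity)."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                   # p + (-p) = identity
    if p == q:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, p):
    """Scalar multiplication k*p by double-and-add."""
    r = None
    while k:
        if k & 1:
            r = ec_add(r, p)
        p = ec_add(p, p)
        k >>= 1
    return r
```

An ECDH exchange then picks secret scalars a and b; both sides compute the same shared point a·(b·G) = b·(a·G), which can seed the session key used to protect the SIP handshake.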
Authentic Reading, Writing, and Discussion: An Exploratory Study of a Pen Pal Project
ERIC Educational Resources Information Center
Gambrell, Linda B.; Hughes, Elizabeth M.; Calvert, Leah; Malloy, Jacquelynn A.; Igo, Brent
2011-01-01
In this exploratory study, reading, writing, and discussion were examined within the context of a pen pal intervention focusing on authentic literacy tasks. The study employed a mixed-method design with a triangulation-convergence model to explore the relationship between authentic literacy tasks and the literacy motivation of elementary students…
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin
2010-04-01
The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location-based applications, and biometric authentication such as fingerprint verification and face and handwritten signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques are key factors for successful real-time identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes to use built-in self-test techniques to ensure that no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques use the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, achieving an optimum security level with effective processing time; that is, it ensures that the authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
NASA Astrophysics Data System (ADS)
Gharami, Snigdha; Dinakaran, M.
2017-11-01
Authenticating every aspect of electronic usage is challenging: from transactions to social interaction, the authenticity and availability of correct information must be safeguarded in various ways. Authentication and authorization follow one another, and the authentication process is computed over multiple layers of steps. In this paper we discuss various possibilities for modifying and combining authentication and authorization mechanisms. The idea is to approach authentication through mathematical calculations. We walk through various scenarios to find the system of information that best fits the moment of need, taking account of new approaches to authentication and authorization while working in a mathematical paradigm of information. The paper also considers quantum cryptography and discusses how it could help in the present scenario. The paper is divided into sections discussing various paradigms of authentication and how one can achieve it securely; it is part of a larger research effort in which the various constraints identified here will be analysed in extended work.
NASA Astrophysics Data System (ADS)
Sridevi, B.; Supriya, T. S.; Rajaram, S.
2013-01-01
The current generation of wireless networks has been designed predominantly to support voice and, more recently, data traffic. WiMAX is currently one of the hottest technologies in wireless networking. The main goal of mobile technologies is to provide seamless, cost-effective mobility, but this is hindered by authentication cost and handover delay, since on each handoff the Mobile Station (MS) has to repeat all steps of authentication. Pre-authentication is used to reduce handover delay and increase the speed of intra-ASN handover. The proposed pre-authentication method reduces authentication delay by having the MS pre-authenticated by a central authority called the Pre-Authentication Authority (PAA). The MS requests a Pre-Authentication Certificate (PAC) from the PAA before performing a handoff. The PAA verifies the identity of the MS and provides the PAC to the MS as well as to the neighboring target Base Stations (tBSs). An MS holding a time-bound PAC can skip the authentication process when recognized by the target BS during handoff. The scheme also prevents DoS (Denial of Service) and replay attacks, and it avoids wasteful, unnecessary key exchanges. The proposed work is simulated in NS2 and MATLAB.
Authenticated communication from quantum readout of PUFs
NASA Astrophysics Data System (ADS)
Škorić, Boris; Pinkse, Pepijn W. H.; Mosk, Allard P.
2017-08-01
Quantum readout of physical unclonable functions (PUFs) is a recently introduced method for remote authentication of objects. We present an extension of the protocol to enable the authentication of data: A verifier can check if received classical data were sent by the PUF holder. We call this modification QR-d or, in the case of the optical-PUF implementation, QSA-d. We discuss how QSA-d can be operated in a parallel way. We also present a protocol for authenticating quantum states.
Thermal imaging as a biometrics approach to facial signature authentication.
Guzman, A M; Goryawala, M; Wang, Jin; Barreto, A; Andrian, J; Rishe, N; Adjouadi, M
2013-01-01
A new thermal imaging framework with unique feature extraction and similarity measurements for face recognition is presented. The research premise is to design specialized algorithms that would extract vasculature information, create a thermal facial signature and identify the individual. The proposed algorithm is fully integrated and consolidates the critical steps of feature extraction through the use of morphological operators, registration using the Linear Image Registration Tool and matching through unique similarity measures designed for this task. The novel approach of developing a thermal signature template using four images taken at various instants of time ensured that unforeseen changes in the vasculature over time did not affect the biometric matching process, as the authentication process relied only on consistent thermal features. Thirteen subjects were used for testing the developed technique on an in-house thermal imaging system. The matching using the similarity measures showed an average accuracy of 88.46% for skeletonized signatures and 90.39% for anisotropically diffused signatures. The highly accurate results obtained in the matching process clearly demonstrate the ability of the thermal infrared system to extend in application to other thermal-imaging-based systems. Empirical results applying this approach to an existing database of thermal images prove this assertion.
Brinckmann, J A
2013-11-01
Pharmacopoeial monographs providing specifications for composition, identity, purity, quality, and strength of a botanical are developed based on analysis of presumably authenticated botanical reference materials. The specimens should represent the quality traditionally specified for the intended use, which may require different standards for medicinal versus food use. Development of quality standards monographs may occur through collaboration between a sponsor company or industry association and a pharmacopoeial expert committee. The sponsor may base proposed standards and methods on their own preferred botanical supply which may, or may not, be geo-authentic and/or correspond to qualities defined in traditional medicine formularies and pharmacopoeias. Geo-authentic botanicals are those with specific germplasm, cultivated or collected in their traditional production regions, of a specified biological age at maturity, with specific production techniques and processing methods. Consequences of developing new monographs that specify characteristics of an 'introduced' cultivated species or of a material obtained from one unique origin could lead to exclusion of geo-authentic herbs and may have therapeutic implications for clinical practice. In this review, specifications of selected medicinal plants with either a geo-authentic or geographical indication designation are discussed and compared against official pharmacopoeial standards for same genus and species regardless of origin. Copyright © 2012 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Smith, Gregory D.; Nunan, Elizabeth; Walker, Claire; Kushel, Dan
2009-01-01
Imaging of artwork is an important aspect of art conservation, technical art history, and art authentication. Many forms of near-infrared (NIR) imaging are used by conservators, archaeologists, forensic scientists, and technical art historians to examine the underdrawings of paintings, to detect damages and restorations, to enhance faded or…
Secure ADS-B authentication system and method
NASA Technical Reports Server (NTRS)
Viggiano, Marc J (Inventor); Valovage, Edward M (Inventor); Samuelson, Kenneth B (Inventor); Hall, Dana L (Inventor)
2010-01-01
A secure system for authenticating the identity of ADS-B systems, including: an authenticator, including a unique id generator and a transmitter transmitting the unique id to one or more ADS-B transmitters; one or more ADS-B transmitters, including a receiver receiving the unique id, one or more secure processing stages merging the unique id with the ADS-B transmitter's identification, data and secret key and generating a secure code identification and a transmitter transmitting a response containing the secure code and ADS-B transmitter's data to the authenticator; the authenticator including means for independently determining each ADS-B transmitter's secret key, a receiver receiving each ADS-B transmitter's response, one or more secure processing stages merging the unique id, ADS-B transmitter's identification and data and generating a secure code, and comparison processing comparing the authenticator-generated secure code and the ADS-B transmitter-generated secure code and providing an authentication signal based on the comparison result.
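A challenge-response of this general shape can be sketched with an HMAC standing in for the patent's secure processing stages. The function names, message layout, and field contents below are illustrative assumptions, not the patented design.

```python
import hashlib
import hmac
import os

def response_code(secret: bytes, unique_id: bytes, icao: str, data: bytes) -> bytes:
    """Transmitter side: bind the authenticator's challenge (unique id)
    to the aircraft identity and the ADS-B report contents."""
    return hmac.new(secret, unique_id + icao.encode() + data, hashlib.sha256).digest()

def authenticate(secret: bytes, unique_id: bytes, icao: str, data: bytes,
                 code: bytes) -> bool:
    """Authenticator side: independently recompute the secure code from the
    transmitter's secret key and compare in constant time."""
    return hmac.compare_digest(response_code(secret, unique_id, icao, data), code)

# one round: the authenticator issues a fresh unique id as the challenge
challenge = os.urandom(16)
code = response_code(b"shared-secret", challenge, "A1B2C3", b"pos=52.1,4.3 alt=35000")
```

Because the fresh unique id is folded into every response, a recorded response cannot be replayed against a later challenge.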
Filtering methods for broadcast authentication against PKC-based denial of service in WSN: a survey
NASA Astrophysics Data System (ADS)
Afianti, Farah; Wirawan, Iwan; Suryani, Titiek
2017-11-01
Broadcast authentication is used to distinguish legitimate packets sent by authorized users; a received packet can then be forwarded or used for further purposes. The use of digital signatures is one promising method, but it comes with high complexity, especially in the verification process. Adversaries exploit this by forcing nodes to verify large numbers of false packets. This kind of Denial of Service (DoS) attack against signature verification can be mitigated by using pre-authentication methods as a first layer to filter out false packets. The objective of the filter is not to replace the main signature but to complement the actual verification in the sensor node. This paper contributes a comparison of the computation, storage, and communication costs of several filters. The results show that the Pre-Authenticator and the DoS-Attack-Resistant scheme have lower overhead than the others, although they require a powerful sender. Moreover, key-chain approaches are promising because of their efficiency and effectiveness.
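A one-way key chain of the kind the survey highlights can be sketched in a few lines: the sender commits to K₀ = Hⁿ(seed) and later discloses keys in reverse order, so a receiver can authenticate each disclosed key with a handful of cheap hashes before attempting any expensive signature check. This is a generic TESLA-style construction, not a specific scheme from the survey; the names are illustrative.

```python
import hashlib

def make_chain(seed: bytes, n: int):
    """Generate a one-way key chain by repeated hashing. The returned list
    starts with the commitment K_0 = H^n(seed); keys are disclosed from
    index 1 upward, i.e. in the reverse order of their generation."""
    chain = [seed]
    for _ in range(n):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain[::-1]

def verify_key(commitment: bytes, key: bytes, interval: int) -> bool:
    """Receiver-side filter: hashing the disclosed key `interval` times
    must reach the public commitment K_0."""
    for _ in range(interval):
        key = hashlib.sha256(key).digest()
    return key == commitment
```

A forged packet with a bogus key fails this cheap hash check and never triggers a costly signature verification, which is exactly the DoS mitigation the filters aim at.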
Passive forensics for copy-move image forgery using a method based on DCT and SVD.
Zhao, Jie; Guo, Jichang
2013-12-10
As powerful image editing tools are widely used, the demand for identifying the authenticity of an image has much increased. Copy-move forgery is one of the most frequently used tampering techniques. Most existing techniques to expose this forgery need improved robustness against common post-processing operations and fail to precisely locate the tampered region, especially when there are large similar or flat regions in the image. In this paper, a robust method based on DCT and SVD is proposed to detect this specific artifact. Firstly, the suspicious image is divided into fixed-size overlapping blocks and 2D-DCT is applied to each block; the DCT coefficients are then quantized by a quantization matrix to obtain a more robust representation of each block. Secondly, each quantized block is divided into non-overlapping sub-blocks and SVD is applied to each sub-block; features are then extracted to reduce the dimension of each block using its largest singular value. Finally, the feature vectors are lexicographically sorted, and duplicated image blocks are matched by a predefined shift-frequency threshold. Experimental results demonstrate that the proposed method can effectively detect multiple copy-move forgeries and precisely locate the duplicated regions, even when an image has been distorted by Gaussian blurring, AWGN, JPEG compression, or their mixed operations. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
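The pipeline can be sketched in simplified form: quantized block DCT, per-sub-block largest singular values as features, and voting on the shift vector between identical-feature blocks. This sketch replaces the paper's lexicographic sort with a hash map and uses illustrative parameter choices (`q`, `min_votes`), so it is a demonstration of the idea rather than the published method.

```python
import numpy as np

def dct_mat(n):
    """Orthonormal DCT-II basis matrix."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    c = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    c[0] /= np.sqrt(2.0)
    return c

def block_feature(block, C, q=16.0):
    """Quantized 2D DCT, then the largest singular value of each quadrant."""
    d = np.round(C @ block @ C.T / q)
    h = d.shape[0] // 2
    quads = (d[:h, :h], d[:h, h:], d[h:, :h], d[h:, h:])
    return tuple(round(float(np.linalg.svd(s, compute_uv=False)[0]), 2) for s in quads)

def detect_copy_move(img, b=8, min_votes=8):
    """Return shift vectors supported by many identical-feature block pairs."""
    C = dct_mat(b)
    feats, votes = {}, {}
    H, W = img.shape
    for y in range(H - b + 1):
        for x in range(W - b + 1):
            f = block_feature(img[y:y + b, x:x + b], C)
            for (y0, x0) in feats.get(f, ()):
                s = (y - y0, x - x0)
                votes[s] = votes.get(s, 0) + 1
            feats.setdefault(f, []).append((y, x))
    return {s: v for s, v in votes.items() if v >= min_votes}

# demo forgery: copy a 16x16 patch within a random image, shift (20, 20)
rng = np.random.default_rng(1)
img = rng.uniform(0, 255, (48, 48))
img[24:40, 24:40] = img[4:20, 4:20]
shifts = detect_copy_move(img)
```

Voting on shift vectors is what suppresses chance feature collisions: isolated matching pairs scatter across many shifts, while a genuine copy-move region concentrates its votes on one.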
Intersubject Differences in False Nonmatch Rates for a Fingerprint-Based Authentication System
NASA Astrophysics Data System (ADS)
Breebaart, Jeroen; Akkermans, Ton; Kelkboom, Emile
2009-12-01
The intersubject dependencies of false nonmatch rates were investigated for a minutiae-based biometric authentication process using single enrollment and verification measurements. A large number of genuine comparison scores were subjected to statistical inference tests that indicated that the number of false nonmatches depends on the subject and finger under test. This result was also observed if subjects associated with failures to enroll were excluded from the test set. The majority of the population (about 90%) showed a false nonmatch rate that was considerably smaller than the average false nonmatch rate of the complete population. The remaining 10% could be characterized as "goats" due to their relatively high probability of a false nonmatch. The image quality reported by the template extraction module only weakly correlated with the genuine comparison scores. When multiple verification attempts were investigated, only a limited benefit was observed for "goats", since the conditional probability of a false nonmatch given earlier nonsuccessful attempts increased with the number of attempts. These observations suggest that (1) there is a need for improved identification of "goats" during enrollment (e.g., using dedicated signal-driven analysis and classification methods and/or multiple enrollment images) and (2) there should be alternative means of identity verification in the biometric system under test in case of two subsequent false nonmatches.
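The per-subject analysis can be reproduced in a few lines: compute each subject's false nonmatch rate from its genuine comparison scores, then flag "goats" whose rate is far above the population average. The decision threshold, the flagging factor, and the toy scores below are illustrative, not the paper's data.

```python
import numpy as np

def per_subject_fnmr(genuine_scores, threshold):
    """genuine_scores: {subject: list of genuine comparison scores}.
    A false nonmatch is a genuine score that falls below the threshold."""
    return {s: float(np.mean(np.asarray(v) < threshold))
            for s, v in genuine_scores.items()}

def find_goats(fnmr, factor=3.0):
    """Subjects whose false nonmatch rate greatly exceeds the average."""
    avg = float(np.mean(list(fnmr.values())))
    return sorted(s for s, r in fnmr.items() if avg > 0 and r > factor * avg)

scores = {"a": [0.90, 0.80, 0.95],
          "b": [0.40, 0.90, 0.30],
          "c": [0.85, 0.90, 0.70]}
fnmr = per_subject_fnmr(scores, threshold=0.5)
```

On this toy data, subject "b" fails two of three genuine attempts and dominates the population average, mirroring the 90/10 split the study reports.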
Fuzzy-cellular neural network for face recognition HCI Authentication
NASA Astrophysics Data System (ADS)
Hoomod, Haider K.; ali, Ahmed abd
2018-05-01
Because of the rapid development of mobile device technology and its ease of use and interaction with humans, mobile devices have become central to our communications. They can carry large amounts of personal and sensitive data, yet are often left unsecured; PIN locks are inconvenient to use and have therefore seen low adoption, while biometrics are more convenient and less susceptible to fraud and manipulation. In this paper we propose an authentication technique for mobile devices using face recognition based on cellular neural networks [1] and fuzzy rule control. The proposed system, implemented on Android, achieves good speed and recognition rates. Images were obtained in real time for 60 persons, each with 20 to 60 different face shots (about 3600 images in total); the results were FAR = 0, FRR = 1.66%, FER = 1.66, and accuracy = 98.34%.
Analysis of pork and poultry meat and bone meal mixture using hyperspectral imaging
NASA Astrophysics Data System (ADS)
Oh, Mirae; Lee, Hoonsoo; Torres, Irina; Garrido Varo, Ana; Pérez Marín, Dolores; Kim, Moon S.
2017-05-01
Meat and bone meal (MBM) has been banned as animal feed for ruminants since 2001 because it is the source of bovine spongiform encephalopathy (BSE). Moreover, many countries have banned the use of MBM as animal feed for not only ruminants but other farm animals as well, to prevent potential outbreak of BSE. Recently, the EU has introduced use of some MBM in feeds for different animal species, such as poultry MBM for swine feed and pork MBM for poultry feed, for economic reasons. In order to authenticate the MBM species origin, species-specific MBM identification methods are needed. Various spectroscopic and spectral imaging techniques have allowed rapid and non-destructive quality assessments of foods and animal feeds. The objective of this study was to develop rapid and accurate methods to differentiate pork MBM from poultry MBM using short-wave infrared (SWIR) hyperspectral imaging techniques. Results from a preliminary investigation of hyperspectral imaging for assessing pork and poultry MBM characteristics and quantitative analysis of poultry-pork MBM mixtures are presented in this paper.
Lossless Data Embedding—New Paradigm in Digital Watermarking
NASA Astrophysics Data System (ADS)
Fridrich, Jessica; Goljan, Miroslav; Du, Rui
2002-12-01
One common drawback of virtually all current data embedding methods is the fact that the original image is inevitably distorted due to data embedding itself. This distortion typically cannot be removed completely due to quantization, bit-replacement, or truncation at the grayscales 0 and 255. Although the distortion is often quite small and perceptual models are used to minimize its visibility, the distortion may not be acceptable for medical imagery (for legal reasons) or for military images inspected under nonstandard viewing conditions (after enhancement or extreme zoom). In this paper, we introduce a new paradigm for data embedding in images (lossless data embedding) that has the property that the distortion due to embedding can be completely removed from the watermarked image after the embedded data has been extracted. We present lossless embedding methods for the uncompressed formats (BMP, TIFF) and for the JPEG format. We also show how the concept of lossless data embedding can be used as a powerful tool to achieve a variety of nontrivial tasks, including lossless authentication using fragile watermarks, steganalysis of LSB embedding, and distortion-free robust watermarking.
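The reversibility property can be illustrated with difference expansion (Tian's scheme, a later classic of the same lossless-embedding paradigm, not the methods of this paper): a bit is embedded by expanding the difference of a pixel pair, and both the bit and the original pair are recovered exactly. Overflow handling, which real schemes must add to stay within the grayscale range, is omitted in this sketch.

```python
def embed_pair(x, y, bit):
    """Hide one bit in a pixel pair by expanding their difference."""
    l, h = (x + y) // 2, x - y
    h2 = 2 * h + bit                     # expanded difference carries the bit
    return l + (h2 + 1) // 2, l - h2 // 2

def extract_pair(x2, y2):
    """Recover the bit AND the original pixel pair exactly."""
    l, h2 = (x2 + y2) // 2, x2 - y2
    bit, h = h2 % 2, h2 // 2
    return l + (h + 1) // 2, l - h // 2, bit
```

Because the average l survives the transform unchanged and the parity of the expanded difference encodes the bit, extraction inverts embedding with no side channel, which is the defining property of lossless data embedding.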
Digital Authenticity and Integrity: Digital Cultural Heritage Documents as Research Resources
ERIC Educational Resources Information Center
Bradley, Rachael
2005-01-01
This article presents the results of a survey addressing methods of securing digital content and ensuring the content's authenticity and integrity, as well as the perceived importance of authenticity and integrity. The survey was sent to 40 digital repositories in the United States and Canada between June 30 and July 19, 2003. Twenty-two…
Standards for Cell Line Authentication and Beyond
Cole, Kenneth D.; Plant, Anne L.
2016-01-01
Different genomic technologies have been applied to cell line authentication, but only one method (short tandem repeat [STR] profiling) has been the subject of a comprehensive and definitive standard (ASN-0002). Here we discuss the power of this document and why standards such as this are so critical for establishing the consensus technical criteria and practices that can enable progress in the fields of research that use cell lines. We also examine other methods that could be used for authentication and discuss how a combination of methods could be used in a holistic fashion to assess various critical aspects of the quality of cell lines. PMID:27300367
Lossless data embedding for all image formats
NASA Astrophysics Data System (ADS)
Fridrich, Jessica; Goljan, Miroslav; Du, Rui
2002-04-01
Lossless data embedding has the property that the distortion due to embedding can be completely removed from the watermarked image without accessing any side channel. This can be a very important property whenever serious concerns over the image quality and artifacts visibility arise, such as for medical images, due to legal reasons, for military images or images used as evidence in court that may be viewed after enhancement and zooming. We formulate two general methodologies for lossless embedding that can be applied to images as well as any other digital objects, including video, audio, and other structures with redundancy. We use the general principles as guidelines for designing efficient, simple, and high-capacity lossless embedding methods for three most common image format paradigms - raw, uncompressed formats (BMP), lossy or transform formats (JPEG), and palette formats (GIF, PNG). We close the paper with examples of how the concept of lossless data embedding can be used as a powerful tool to achieve a variety of non-trivial tasks, including elegant lossless authentication using fragile watermarks. Note on terminology: some authors coined the terms erasable, removable, reversible, invertible, and distortion-free for the same concept.
Identification of ginseng root using quantitative X-ray microtomography.
Ye, Linlin; Xue, Yanling; Wang, Yudan; Qi, Juncheng; Xiao, Tiqiao
2017-07-01
The use of X-ray phase-contrast microtomography for the investigation of Chinese medicinal materials is advantageous for its nondestructive, in situ, and three-dimensional quantitative imaging properties. The X-ray phase-contrast microtomography quantitative imaging method was used to investigate the microstructure of ginseng, and the phase-retrieval method is also employed to process the experimental data. Four different ginseng samples were collected and investigated; these were classified according to their species, production area, and sample growth pattern. The quantitative internal characteristic microstructures of ginseng were extracted successfully. The size and position distributions of the calcium oxalate cluster crystals (COCCs), important secondary metabolites that accumulate in ginseng, are revealed by the three-dimensional quantitative imaging method. The volume and amount of the COCCs in different species of the ginseng are obtained by a quantitative analysis of the three-dimensional microstructures, which shows obvious difference among the four species of ginseng. This study is the first to provide evidence of the distribution characteristics of COCCs to identify four types of ginseng, with regard to species authentication and age identification, by X-ray phase-contrast microtomography quantitative imaging. This method is also expected to reveal important relationships between COCCs and the occurrence of the effective medicinal components of ginseng.
Promoting 21st Century Skills in the Classroom through the Use of Authentic Student Research
NASA Astrophysics Data System (ADS)
Klug, S. L.; Swann, J. L.; Manfredi, L.; Christensen, P. R.
2012-12-01
Preparing students for the workforce starts well before they start college. The Mars Student Imaging Project has incorporated 21st Century Skills into their project to help the students practice and sharpen these skills. Professional development for the educational facilitators helps the teachers become familiar with these skills. By augmenting the authentic research project with these 21st Century Skills, the students are able to achieve a higher level experience that mirrors the real-world workforce.
A covert authentication and security solution for GMOs.
Mueller, Siguna; Jafari, Farhad; Roth, Don
2016-09-21
Proliferation and expansion of security risks necessitate new measures to ensure authenticity and validation of GMOs. Watermarking and other cryptographic methods are available which conceal and recover the original signature, but in the process reveal the authentication information. In many scenarios watermarking and standard cryptographic methods are necessary but not sufficient, and new, more advanced cryptographic protocols are needed. Herein, we present a new cryptographic protocol that is applicable in broader settings and embeds the authentication string indistinguishably from a random element in the signature space, and the string is verified or denied without disclosing the actual signature. Results show that in a nucleotide string of 1000, the algorithm gives a correlation of 0.98 or higher between the distribution of the codons and that of E. coli, making the signature virtually invisible. This algorithm may be used to securely authenticate and validate GMOs without disclosing the actual signature. While this protocol uses watermarking, its novelty lies in the use of more complex cryptographic techniques, based on zero-knowledge proofs, to encode information.
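The zero-knowledge idea, proving possession of a secret without revealing it, can be illustrated with one round of Schnorr identification. This is a generic sketch with insecure toy parameters (the order-11 subgroup of Z₂₃*), not the paper's DNA-embedding protocol; real parameters use groups of at least 2048 bits.

```python
import secrets

# Toy Schnorr identification. G = 4 generates the order-11 subgroup of Z_23*.
P, Q, G = 23, 11, 4

def keygen():
    """Prover's long-term secret x and public key y = G^x mod P."""
    x = secrets.randbelow(Q - 1) + 1
    return x, pow(G, x, P)

def commit():
    """Prover picks a fresh nonce r and sends the commitment t = G^r."""
    r = secrets.randbelow(Q)
    return r, pow(G, r, P)

def respond(r, x, c):
    """Response s = r + c*x mod Q; reveals nothing about x on its own,
    because r is uniform and used once."""
    return (r + c * x) % Q

def verify(y, t, c, s):
    """Verifier checks G^s == t * y^c mod P without ever learning x."""
    return pow(G, s, P) == (t * pow(y, c, P)) % P
```

The check works because G^s = G^(r+cx) = G^r · (G^x)^c = t · y^c; an impostor who does not know x cannot answer a random challenge c correctly except by guessing.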
ERIC Educational Resources Information Center
Abdelhafez, Hanan A.; Abdallah, Mahmoud M. S.
2015-01-01
This paper reports a research study that sought to investigate Assiut University College of Education (AUCOE) EFL student teachers' awareness and use of online authentic materials on the basis of their actual language learning needs, and how this relates to their language learning motivation. To accomplish this, a mixed-method research methodology…
ERIC Educational Resources Information Center
Hsu, Yu-Chang; Ching, Yu-Hui
2012-01-01
This research applied a mixed-method design to explore how best to promote learning in authentic contexts in an online graduate course in instructional message design. The students used Twitter apps on their mobile devices to collect, share, and comment on authentic design examples found in their daily lives. The data sources included tweets…
Investigation of Prospective Teachers' Beliefs towards Authentic Assessment
ERIC Educational Resources Information Center
Kinay, Ismail
2018-01-01
The aim of this study is to examine prospective teachers' beliefs toward authentic assessment in relation to various variables. The survey method was used, and the sample comprised 612 prospective teachers, 368 (60.1%) of whom are female and 244 (39.9%) of whom are male. The "Authentic Assessment Belief…
Mu, Zhendong; Yin, Jinhai; Hu, Jianfeng
2018-01-01
In this paper, a person authentication system is presented that can effectively identify individuals by generating unique electroencephalogram (EEG) signal features in response to self-face and non-self-face photos. In order to achieve good stability, the sequence of self-face photos, including first-occurrence and non-first-occurrence positions, is taken into account in the serial presentation of visual stimuli. In addition, a Fisher linear classification method and an event-related potential technique are adopted for feature analysis, yielding remarkably better outcomes than most existing methods in the field. The results show that EEG-based person authentication via a brain-computer interface can be considered a suitable approach for a biometric authentication system.
HERMA-Heartbeat Microwave Authentication
NASA Technical Reports Server (NTRS)
Haque, Salman-ul Mohammed (Inventor); Chow, Edward (Inventor); McKee, Michael Ray (Inventor); Tkacenko, Andre (Inventor); Lux, James Paul (Inventor)
2018-01-01
Systems and methods for identifying and/or authenticating individuals utilizing microwave sensing modules are disclosed. A HEaRtbeat Microwave Authentication (HERMA) system can enable the active identification and/or authentication of a user by analyzing reflected RF signals that contain a person's unique characteristics related to their heartbeats. An illumination signal is transmitted towards a person where a reflected signal captures the motion of the skin and tissue (i.e. displacement) due to the person's heartbeats. The HERMA system can utilize existing transmitters in a mobile device (e.g. Wi-Fi, Bluetooth, Cellphone signals) as the illumination source with at least one external receive antenna. The received reflected signals can be pre-processed and analyzed to identify and/or authenticate a user.
Video multiple watermarking technique based on image interlacing using DWT.
Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M
2014-01-01
Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling both memory capacity and communications bandwidth requirements. In this paper, a robust video multiple-watermarking technique based on image interlacing is proposed to solve this problem. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
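The Arnold transform mentioned above is a standard image-scrambling map. A minimal pure-Python sketch, assuming a square image stored as a list of lists:

```python
def arnold_scramble(img, iterations=1):
    """Scramble a square image with Arnold's cat map:
    (x, y) -> ((x + y) mod n, (x + 2y) mod n)."""
    n = len(img)
    for _ in range(iterations):
        out = [[None] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
        img = out
    return img

def arnold_unscramble(img, iterations=1):
    """Apply the inverse map (x, y) -> ((2x - y) mod n, (y - x) mod n)."""
    n = len(img)
    for _ in range(iterations):
        out = [[None] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(2 * x - y) % n][(y - x) % n] = img[x][y]
        img = out
    return img
```

Because the map is a bijection on the n x n grid, iterating the inverse map the same number of times restores the watermark exactly.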
Detecting adulterants in milk powder using high-throughput Raman chemical imaging
USDA-ARS?s Scientific Manuscript database
This study used a line-scan high-throughput Raman imaging system to authenticate milk powder. A 5 W 785 nm line laser (240 mm long and 1 mm wide) was used as a Raman excitation source. The system was used to acquire hyperspectral Raman images in a wavenumber range of 103–2881 cm-1 from the skim milk...
Upper Elementary Math Lessons: Case Studies of Real Teaching
ERIC Educational Resources Information Center
Graeber, Anna O.; Valli, Linda; Newton, Kristie Jones
2011-01-01
Engaging students in worthwhile learning requires more than a knowledge of underlying principles of good teaching. It demands considerable practice as well as images of what good teaching in particular situations and for particular purposes might look like. This volume provides these images. These cases were written from authentic, unrehearsed…
NASA Astrophysics Data System (ADS)
Kajiya, E. A. M.; Campos, P. H. O. V.; Rizzutto, M. A.; Appoloni, C. R.; Lopes, F.
2014-02-01
This paper presents systematic studies and analyses that contributed to identifying a forgery of a work by the artist Emiliano Augusto Cavalcanti de Albuquerque e Melo, known as Di Cavalcanti. The combined use of several areas of expertise, such as brush stroke analysis ("pinacologia"), applied physics, and art history, resulted in an accurate diagnosis of the authenticity of the work entitled "Violeiro" (1950). We used non-destructive methods such as infrared, ultraviolet, visible, and tangential light imaging, combined with chemical analysis of the pigments by portable X-Ray Fluorescence (XRF) and graphic gesture analysis. Each applied method produced specific information that made possible the identification of the materials and techniques employed, and we concluded that this work is not consistent with the characteristic patterns of the artist Di Cavalcanti.
Real time biometric surveillance with gait recognition
NASA Astrophysics Data System (ADS)
Mohapatra, Subasish; Swain, Anisha; Das, Manaswini; Mohanty, Subhadarshini
2018-04-01
Biometric surveillance has become indispensable for every system in recent years. Biometric authentication, identification, and screening are widely used across various domains to prevent unauthorized access. Large amounts of data need to be updated, segregated, and safeguarded from malicious software and misuse. Biometrics are the intrinsic characteristics of each individual. Fingerprints, iris patterns, passwords, unique keys, and cards are commonly used for authentication, but these methods have various issues related to security and confidentiality, and such systems are not yet automated enough to provide safety and security. The gait recognition system is an alternative that overcomes the drawbacks of current biometric authentication systems. Gait recognition is comparatively new and has not yet been widely implemented in real-world scenarios. It is an unintrusive approach that requires no knowledge or cooperation of the subject. Gait is a unique behavioral characteristic of every human being that is hard to imitate: an individual's walking style, together with the orientation of the joints in the skeletal structure and the inclinations between them, imparts this unique characteristic. A person can alter external appearance but not skeletal structure. Gait-based systems are real-time and automatic, and can process even low-resolution images and video frames. In this paper, we propose a gait recognition system and compare its performance with conventional biometric identification systems.
Watermarking protocols for authentication and ownership protection based on timestamps and holograms
NASA Astrophysics Data System (ADS)
Dittmann, Jana; Steinebach, Martin; Croce Ferri, Lucilla
2002-04-01
Digital watermarking has become an accepted technology for enabling multimedia protection schemes. One problem here is the security of these schemes: without a suitable framework, watermarks can be replaced and manipulated. We discuss different protocols providing security against rightful-ownership attacks and other fraud attempts, and compare the characteristics of existing protocols for different media, such as direct embedding or seed-based approaches, and the required attributes of the watermarking technology, such as robustness or payload. We introduce two new media-independent protocol schemes for rightful-ownership authentication. With the first scheme we ensure the security of digital watermarks used for ownership protection with a combination of two watermarks: a first watermark from the copyright holder and a second watermark from a Trusted Third Party (TTP). It is based on hologram embedding, and the watermark consists of, e.g., a company logo. As an example we use digital images and specify the properties of the embedded additional security information. We identify the components necessary for the security protocol, such as the timestamp, PKI, and cryptographic algorithms. The second scheme is used for authentication. It is designed for invertible watermarking applications, which require high data integrity. We combine digital signature schemes and digital watermarking to provide publicly verifiable integrity; the original data can only be reproduced with a secret key. Both approaches provide solutions for copyright and authentication watermarking and are introduced for image data, but can easily be adopted for video and audio data as well.
Koppers, Lars; Wormer, Holger; Ickstadt, Katja
2017-08-01
The quality and authenticity of images is essential for data presentation, especially in the life sciences. Questionable images may often be a first indicator of questionable results, too. Therefore, a tool that uses mathematical methods to detect suspicious images in large image archives can be a helpful instrument for improving quality assurance in publications. As a first step towards a systematic screening tool, especially for journal editors and other staff members who are responsible for quality assurance, such as laboratory supervisors, we propose a basic classification of image manipulation. Based on this classification, we developed and explored some simple algorithms to detect copied areas in images. Using an artificial image and two examples of previously published modified images, we apply quantitative methods such as pixel-wise comparison, a nearest-neighbor algorithm, and a variance algorithm to detect copied-and-pasted areas or duplicated images. We show that our algorithms are able to detect some simple types of image alteration, such as copying and pasting background areas. The variance algorithm detects not only identical, but also very similar areas that differ only in brightness. Further types could, in principle, be implemented in a standardized scanning routine. We detected the copied areas in a proven case of image manipulation in Germany and showed the similarity of two images in a retracted paper from the Kato labs, which has been widely discussed on sites such as PubPeer and Retraction Watch.
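A pixel-wise duplicate-area search of the kind described can be sketched by hashing every small block and reporting collisions. This illustrative version finds exact copies only; the paper's variance algorithm additionally tolerates brightness shifts:

```python
def find_duplicated_blocks(img, block=3):
    """Report pairs of top-left coordinates of pixel-identical
    block x block regions (exact copy-paste detection)."""
    h, w = len(img), len(img[0])
    seen = {}   # block contents -> first location seen
    dups = []
    for r in range(h - block + 1):
        for c in range(w - block + 1):
            key = tuple(tuple(img[r + i][c:c + block]) for i in range(block))
            if key in seen:
                dups.append((seen[key], (r, c)))
            else:
                seen[key] = (r, c)
    return dups
```

On a real micrograph one would skip near-uniform background blocks first, since those collide trivially.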
USDA-ARS?s Scientific Manuscript database
Flow injection mass spectrometry (FIMS) and proton nuclear magnetic resonance spectrometry (1H-NMR), two metabolic fingerprinting methods, and DNA sequencing were used to identify and authenticate Actaea species. Initially, samples of Actaea racemosa L. from a single source were distinguished from ...
Possibility of spoof attack against robustness of multibiometric authentication systems
NASA Astrophysics Data System (ADS)
Hariri, Mahdi; Shokouhi, Shahriar Baradaran
2011-07-01
Multibiometric systems have recently been developed to overcome some weaknesses of single-biometric authentication systems, but the security of these systems against spoofing has not received enough attention. In this paper, we propose a novel practical method for simulating possible spoof attacks against a biometric authentication system. Using this method, we model matching scores ranging from standard to completely spoofed genuine samples. Sum, product, and Bayes fusion rules are applied for score-level combination. The security of multimodal authentication systems is examined and compared with that of single systems against various spoofing possibilities. Although the vulnerability of fused systems increases considerably under spoofing, their robustness is generally higher than that of single-matcher systems. We show, however, that the robustness of a combined system is not always higher than that of a single system against a spoof attack. We propose empirical methods for upgrading the security of multibiometric systems, covering how to organize and select biometric traits and matchers against various spoofing possibilities. These methods provide considerable robustness and present an appropriate rationale for using combined systems against spoof attacks.
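The sum, product, and Bayes rules for score-level fusion can be sketched in a few lines. The "bayes" branch below is a naive-Bayes posterior under an independence assumption, an illustrative choice rather than the paper's exact formulation:

```python
def fuse_scores(scores, rule="sum"):
    """Score-level fusion of per-matcher similarity scores in [0, 1]."""
    if rule == "sum":                      # sum rule: mean of the scores
        return sum(scores) / len(scores)
    if rule == "product":                  # product rule
        p = 1.0
        for s in scores:
            p *= s
        return p
    if rule == "bayes":                    # naive-Bayes posterior
        p_gen = p_imp = 1.0
        for s in scores:
            p_gen *= s                     # likelihood under "genuine"
            p_imp *= 1.0 - s               # likelihood under "impostor"
        return p_gen / (p_gen + p_imp)
    raise ValueError("unknown rule: " + rule)
```

Comparing the fused score against a decision threshold then gives the accept/reject outcome that the spoof-simulation study evaluates.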
An image adaptive, wavelet-based watermarking of digital images
NASA Astrophysics Data System (ADS)
Agreste, Santa; Andaloro, Guido; Prestipino, Daniela; Puccio, Luigia
2007-12-01
In digital management, multimedia content and data can easily be used illegally: copied, modified, and redistributed. Copyright protection, protection of the intellectual and material rights of authors, owners, buyers, and distributors, and the authenticity of content are crucial factors in solving this urgent and real problem. In this scenario, digital watermarking techniques are emerging as a valid solution. In this paper, we describe an algorithm, called WM2.0, for an invisible watermark: private, strong, wavelet-based, and developed for the protection and authentication of digital images. The use of the discrete wavelet transform (DWT) is motivated by its good time-frequency features and its close match with human visual system characteristics; these two combined elements are important in building an invisible and robust watermark. WM2.0 works on a dual scheme: watermark embedding and watermark detection. The watermark is embedded into the high-frequency DWT components of a specific sub-image and is calculated in correlation with the image features and statistical properties. Watermark detection applies a re-synchronization between the original and watermarked images. The correlation between the watermarked DWT coefficients and the watermark signal is calculated according to the Neyman-Pearson statistical criterion. Experimentation on a large set of different images has shown the watermark to be resistant against geometric, filtering, and StirMark attacks with a low false-alarm rate.
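The detection side of such a scheme reduces to a correlation test against a threshold chosen for a target false-alarm probability (the Neyman-Pearson approach). A minimal sketch, with an arbitrary threshold standing in for one derived from the coefficient statistics:

```python
def normalized_correlation(coeffs, watermark):
    """Mean of the element-wise product of coefficients and watermark."""
    return sum(c * w for c, w in zip(coeffs, watermark)) / len(coeffs)

def watermark_present(coeffs, watermark, threshold):
    """Neyman-Pearson-style detector: declare the watermark present when
    the correlation exceeds a threshold set for a target false-alarm rate."""
    return normalized_correlation(coeffs, watermark) > threshold
```

In practice the threshold is computed from the variance of the unmarked DWT coefficients so that the false-alarm probability stays below the chosen bound.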
Secure Oblivious Hiding, Authentication, Tamper Proofing, and Verification Techniques
2002-08-01
compressing the bit-planes. The algorithm always starts with inspecting the 5th LSB plane. For color images, all three color channels are compressed...use classical encryption engines, such as IDEA or DES. These algorithms have a fixed encryption block size, and, depending on the image dimensions, we...information can be stored either in a separate file, in the image header, or embedded in the image itself utilizing the modern concepts of steganography
Measurement methods and accuracy analysis of Chang'E-5 Panoramic Camera installation parameters
NASA Astrophysics Data System (ADS)
Yan, Wei; Ren, Xin; Liu, Jianjun; Tan, Xu; Wang, Wenrui; Chen, Wangli; Zhang, Xiaoxia; Li, Chunlai
2016-04-01
Chang'E-5 (CE-5) is a lunar probe for the third phase of the China Lunar Exploration Project (CLEP), whose main scientific objectives are to carry out lunar surface sampling and to return the samples to Earth. To achieve these goals, investigation of the lunar surface topography and geological structure within the sampling area is extremely important. The Panoramic Camera (PCAM) is one of the payloads mounted on the CE-5 lander. It consists of two optical systems installed on a camera rotating platform. PCAM acquires optical images of the sampling area as two-dimensional images, and a stereo image pair can be formed from the left and right PCAM images, from which the lunar terrain can be reconstructed photogrammetrically. The installation parameters of PCAM with respect to the CE-5 lander are critical for calculating the exterior orientation (EO) elements of PCAM images, which are used for lunar terrain reconstruction. In this paper, the types of PCAM installation parameters and the coordinate systems involved are defined. Measurement methods combining camera images and optical coordinate observations are studied for this work. Research contents such as the observation program and the specific solution methods for the installation parameters are then introduced. The parametric solution accuracy is analyzed using observations obtained from the PCAM scientific validation experiment, which tests the authenticity of the PCAM detection process, ground data processing methods, product quality, and so on. Analysis results show that the accuracy of the installation parameters affects the positional accuracy of corresponding image points of PCAM stereo images to within 1 pixel. The measurement methods and parameter accuracy studied in this paper therefore meet the needs of engineering and scientific applications. Keywords: Chang'E-5 Mission; Panoramic Camera; Installation Parameters; Total Station; Coordinate Conversion
Disambiguating authenticity: Interpretations of value and appeal.
O'Connor, Kieran; Carroll, Glenn R; Kovács, Balázs
2017-01-01
While shaping aesthetic judgment and choice, socially constructed authenticity takes on some very different meanings among observers, consumers, producers and critics. Using a theoretical framework positing four distinct meanings of socially constructed authenticity (type, moral, craft, and idiosyncratic), we aim to document empirically the unique appeal of each type. We develop predictions about the relationships between attributed authenticity and corresponding increases in the value ascribed to it through: (1) consumer value ratings, (2) willingness to pay, and (3) behavioral choice. We report empirical analyses from a research program of three multi-method studies using (1) archival data from voluntary consumer evaluations of restaurants in an online review system, (2) a university-based behavioral lab experiment, and (3) an online survey-based experiment. Evidence is consistent across the studies and suggests that perceptions of four distinct subtypes of socially constructed authenticity generate increased appeal and value even after controlling for option quality. Findings suggest additional directions for research on authenticity.
Bland, Andrew J; Topping, Annie; Tobbell, Jane
2014-07-01
High-fidelity patient simulation is a method of education increasingly utilised by educators of nursing to provide authentic learning experiences. Fidelity and authenticity, however, are not conceptually equivalent. Whilst fidelity is important when striving to replicate a life experience such as clinical practice, authenticity can be produced with low fidelity. A challenge for educators of undergraduate nursing is to ensure authentic representation of the clinical situation which is a core component for potential success. What is less clear is the relationship between fidelity and authenticity in the context of simulation based learning. Authenticity does not automatically follow fidelity and as a result, educators of nursing cannot assume that embracing the latest technology-based educational tools will in isolation provide a learning environment perceived authentic by the learner. As nursing education programmes increasingly adopt simulators that offer the possibility of representing authentic real world situations, there is an urgency to better articulate and understand the terms fidelity and authenticity. Without such understanding there is a real danger that simulation as a teaching and learning resource in nurse education will never reach its potential and be misunderstood, creating a potential barrier to learning. This paper examines current literature to promote discussion within nurse education, concluding that authenticity in the context of simulation-based learning is complex, relying on far more than engineered fidelity. Copyright © 2014 Elsevier Ltd. All rights reserved.
Security authentication using phase-encoded nanoparticle structures and polarized light.
Carnicer, Artur; Hassanfiroozi, Amir; Latorre-Carmona, Pedro; Huang, Yi-Pai; Javidi, Bahram
2015-01-15
Phase-encoded nanostructures such as quick response (QR) codes made of metallic nanoparticles are suggested for use in security and authentication applications. We present a polarimetric optical method able to authenticate random phase-encoded QR codes. The system is illuminated with polarized light, and the QR code is encoded using a phase-only random mask. Using classification algorithms, it is possible to validate the QR code from examination of the polarimetric signature of the speckle pattern. We used the Kolmogorov-Smirnov statistical test and Support Vector Machine algorithms to authenticate the phase-encoded QR codes from their polarimetric signatures.
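The two-sample Kolmogorov-Smirnov statistic used for comparing polarimetric speckle signatures is simply the maximum distance between two empirical CDFs; a small self-contained sketch:

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    distance between the two empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_s, x):
        # fraction of values <= x
        return bisect.bisect_right(sorted_s, x) / len(sorted_s)

    # the ECDF difference can only change at observed sample points
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)
```

A small statistic means the two intensity distributions are plausibly from the same (authentic) structure; a statistic near 1 flags a mismatch.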
A Secure and Efficient Handover Authentication Protocol for Wireless Networks
Wang, Weijia; Hu, Lei
2014-01-01
Handover authentication protocols are a promising access control technology in the fields of WLANs and mobile wireless sensor networks. In this paper, we first review an efficient handover authentication protocol, named PairHand, together with its known security attacks and improvements. Then, we present an improved key recovery attack using the linear combination method and reanalyze its feasibility on the improved PairHand protocol. Finally, we present a new handover authentication protocol, which not only achieves the same desirable efficiency features as PairHand, but also enjoys provable security in the random oracle model. PMID:24971471
Detection of pork and poultry meat and bone meals in animal feed using hyperspectral imaging
USDA-ARS?s Scientific Manuscript database
Animal feed with meat and bone meal (MBM) has been the source of bovine spongiform encephalopathy (BSE) in cattle and other livestock animals. Many countries have banned the use MBM as an animal feed ingredient. Spectral imaging techniques have shown potential for rapid assessment and authentication...
The "Tomahawk Chop": The Continuous Struggle of Unlearning "Indian" Stereotypes.
ERIC Educational Resources Information Center
Pewewardy, Cornel D.
"Indian" mascots of athletic teams can be offensive to Native Americans when they portray negative and stereotypical images. The notion of the "tomahawk chop" invented by Atlanta Braves fans and all the antics that go along with such images prevent millions of Americans from understanding the authentic Indian America, both long…
From Legion to Avaki: The Persistence of Vision
2006-01-01
person, but what component, is requesting an action. 5.3.1 Authentication Users authenticate themselves to a Legion grid with the login paradigm...password supplied during login is compared to the password in the state of the authentication object in order to permit or deny subsequent access to...In either case, the credential is protected by the security of the underlying operating system. Although login is the most commonly used method
Secure E-Business applications based on the European Citizen Card
NASA Astrophysics Data System (ADS)
Zipfel, Christian; Daum, Henning; Meister, Gisela
The introduction of ID cards enhanced with electronic authentication services opens up the possibility of using them for identification and authentication in e-business applications. To avoid incompatible national solutions, the specification of the European Citizen Card aims at defining interoperable services for such use cases. In particular, the specified device authentication methods can help eliminate security problems with current e-business and online banking applications.
Authenticity, Shinichi Suzuki, and "Beautiful Tone with Living Soul, Please"
ERIC Educational Resources Information Center
Thompson, Merlin
2016-01-01
While there is a great deal of scholarly inquiry into the Suzuki Method of music instruction, few resources examine how the various aesthetic and pedagogic themes associated with the Suzuki Method are grounded in Dr. Shinichi Suzuki's sense of self. Using the notion of authenticity--being true to oneself--as an investigative underpinning, I trace…
ERIC Educational Resources Information Center
Hoshiar, Mitra; Dunlap, Jody; Li, Jinyi; Friedel, Janice Nahra
2014-01-01
Online learning is rapidly becoming one of the most prevalent delivery methods of learning in institutions of higher education. It provides college students, especially adult students, an alternative, convenient, and cost-efficient method to earn their credentials, upgrade their skills and knowledge, and keep or upgrade their employment. But at…
ERIC Educational Resources Information Center
Zielinski, Dianne E.
2017-01-01
This study explored how faculty members implemented constructivist teaching methods after training. The student-centered teaching methods were interactions and collaborations, authentic learning and real-world experiences, linking material to previously learned information, and using technology in the classroom. Seven faculty members trained in…
Application of Ultrasound Phase-Shift Analysis to Authenticate Wooden Panel Paintings
Bravo, José M.; Sánchez-Pérez, Juan V.; Ferri, Marcelino; Redondo, Javier; Picó, Rubén
2014-01-01
Artworks are a valuable part of the world's cultural and historical heritage. Conservation and authentication of authorship are important aspects of protecting cultural patrimony. In this paper we present a novel application of a well-known method based on the phase-shift analysis of an ultrasonic signal, providing an integrated encoding system that enables authentication of the authorship of wooden panel paintings. The method has been evaluated against optical analysis and shows promising results. It provides an integrated fingerprint of the artwork and could be used to enrich the cataloging and protection of artworks. Other advantages that make the proposed technique particularly attractive are its robustness and its use of low-cost sensors. PMID:24803191
Carbon isotope ratios and isotopic correlations between components in fruit juices
NASA Astrophysics Data System (ADS)
Wierzchnicki, Ryszard
2013-04-01
Nowadays, food products are defined by their geographical origin and method of production, and by regulations concerning their authenticity. Important data for confirming the authenticity of a product are provided by isotopic methods of food control, which check crucial criteria characterizing the authenticity of the inspected product. European Union regulations clearly show the tendency toward applying isotopic methods for food authenticity control (wine, honey, juice), with the aim of protecting the European market from commercial fraud. Isotope ratio mass spectrometry (IRMS) is a very effective tool for distinguishing food products of various geographical origins. The basic problem in identifying a sample's origin is the lack of databases of the isotopic composition of components and of information about correlations in the data. The subject of this work was to study the isotopic correlations existing between the components of fruits. Chemical and instrumental methods for separating water, sugars, organic acids, and pulp from fruit were implemented, and IRMS was used to measure the isotopic composition of the samples. The final results for original samples of fruits (apple, strawberry, etc.) will be presented and discussed. Acknowledgement: This work was supported by the Polish Ministry of Science and Higher Education under grant NR12-0043-10/2010.
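Carbon isotope ratios in such work are conventionally reported in delta notation relative to a reference standard, e.g. delta-13C per mil against VPDB. A sketch of the conversion (the VPDB ratio below is an assumed literature value, not taken from this abstract):

```python
# 13C/12C ratio of the VPDB reference standard (assumed literature value)
R_VPDB = 0.0111802

def delta13c(r_sample, r_standard=R_VPDB):
    """delta-13C in per mil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0
```

Authenticity screening then compares the measured delta values of a juice's components (water, sugars, acids) against the correlation ranges expected for genuine fruit.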
Compact storage of medical images with patient information.
Acharya, R; Anand, D; Bhat, S; Niranjan, U C
2001-12-01
Digital watermarking is a technique for hiding specific identification data for copyright authentication. This technique is adapted here for interleaving patient information with medical images, to reduce storage and transmission overheads. The text data are encrypted before interleaving with the images to ensure greater security, and the graphical signals are compressed and subsequently interleaved with the image. Differential pulse-code modulation (DPCM) and adaptive delta modulation (ADM) techniques are employed for data compression and encryption, and results are tabulated for a specific example.
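Differential pulse-code modulation as used here stores differences between successive samples rather than the samples themselves. A minimal integer-valued sketch of the encode/decode pair:

```python
def dpcm_encode(samples):
    """Replace each sample with its difference from the previous one."""
    prev, out = 0, []
    for s in samples:
        out.append(s - prev)
        prev = s
    return out

def dpcm_decode(diffs):
    """Rebuild the samples by accumulating the differences."""
    prev, out = 0, []
    for d in diffs:
        prev += d
        out.append(prev)
    return out
```

Since biosignals change slowly between samples, the differences are small and compress well before being interleaved with the image.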
X-Ray Fluorescence Imaging of Ancient Artifacts
NASA Astrophysics Data System (ADS)
Thorne, Robert; Geil, Ethan; Hudson, Kathryn; Crowther, Charles
2011-03-01
Many archaeological artifacts feature inscribed and/or painted text or figures which, through erosion and aging, have become difficult or impossible to read with conventional methods. Often, however, the pigments in paints contain metallic elements, and traces may remain even after visible markings are gone. A promising non-destructive technique for revealing these remnants is X-ray fluorescence (XRF) imaging, in which a tightly focused beam of monochromatic synchrotron radiation is raster-scanned across a sample. At each pixel, an energy-dispersive detector records a fluorescence spectrum, which is then analyzed to determine element concentrations; in this way, a map of the various elements is made across a region of interest. We have successfully XRF-imaged ancient Greek, Roman, and Mayan artifacts, and in many cases the element maps have revealed significant new information, including previously invisible painted lines and traces of iron from tools used to carve stone tablets. X-ray imaging can be used to determine an object's provenance, including the region where it was produced and whether it is authentic or a copy.
Dai, Qiong; Cheng, Jun-Hu; Sun, Da-Wen; Zeng, Xin-An
2015-01-01
There is increased interest in applications of hyperspectral imaging (HSI) for assessing food quality, safety, and authenticity. HSI provides an abundance of spatial and spectral information from foods by combining spectroscopy and imaging, resulting in hundreds of contiguous wavebands for each spatial position of a food sample, a situation also known as the curse of dimensionality. It is desirable to employ feature selection algorithms to decrease the computational burden and increase predictive accuracy, which is especially relevant for the development of online applications. Recently, a variety of feature selection algorithms have been proposed, which can be categorized into three groups based on their searching strategy: complete search, heuristic search, and random search. This review introduces the fundamentals of each algorithm, illustrates its applications in hyperspectral data analysis in the food field, and discusses the advantages and disadvantages of these algorithms. It is hoped that this review will provide a guideline for feature selection and data processing in the future development of hyperspectral imaging techniques for foods.
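A heuristic-search feature (waveband) selector of the kind surveyed can be sketched as greedy forward selection, where `score` is any user-supplied predictive-quality metric; the function name and interface are illustrative:

```python
def forward_select(bands, score, k):
    """Greedy (heuristic-search) forward selection of k wavebands.
    score(subset) returns an estimate of predictive quality, e.g. a
    cross-validated accuracy for a model trained on those bands."""
    chosen, remaining = [], list(bands)
    while len(chosen) < k and remaining:
        # add the band that most improves the current subset's score
        best = max(remaining, key=lambda b: score(chosen + [b]))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

Complete search would instead enumerate all subsets (exponential cost), and random search would sample subsets stochastically, the trade-off the review discusses.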
Schwartze, J; Haarbrandt, B; Fortmeier, D; Haux, R; Seidel, C
2014-01-01
Integration of electronic signatures embedded in health care processes in Germany challenges health care service and supply facilities. The suitability of the signature level of an eligible authentication procedure has been confirmed for a large part of the documents in clinical practice; however, the concrete design of such a procedure remains unclear. The objective was to create a summary of usable user authentication systems suitable for clinical workflows. A systematic literature review was conducted across nine online bibliographic databases. Search keywords included authentication, access control, information systems, information security, and biometrics, with the terms user authentication, user identification, and login in the title or abstract. Searches were run between 7 and 12 September 2011; relevant conference proceedings were searched manually in February 2013, and a backward reference search of selected results was done. Only publications fully describing authentication systems in use or usable were included; algorithms and purely theoretical concepts were excluded. Three authors performed the selection independently. Data extraction and assessment: semi-structured extraction of system characteristics was done by the main author. Identified procedures were assessed for security and fulfillment of relevant laws and guidelines as well as for applicability, and suitability for clinical workflows was derived from these assessments using a weighted sum proposed by Bonneau. Of 7575 citations retrieved, 55 publications met our inclusion criteria. They describe 48 different authentication systems: 39 biometric and nine graphical password systems. Assessment of the authentication systems showed high error rates above European CENELEC standards and a lack of applicability of biometric systems. Graphical passwords did not add overall value compared to conventional passwords, while continuous authentication can add an additional layer of safety. Only a few systems are suitable, partially or entirely, for use in clinical processes.
Suitability strongly depends on national or institutional requirements. Four authentication systems appear to fulfill the requirements of authentication procedures for clinical workflows. Research is needed in the area of continuous authentication with biometric methods. A proper authentication system should combine all factors of authentication, implementing and connecting secure individual measures.
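The weighted-sum suitability scoring described (after Bonneau) can be sketched generically; the criteria and weights themselves are institution-specific and not given in the abstract:

```python
def suitability(scores, weights):
    """Normalized weighted sum of per-criterion assessment scores,
    each score typically in [0, 1] (e.g. security, applicability)."""
    if len(scores) != len(weights):
        raise ValueError("scores and weights must align")
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)
```

Ranking candidate authentication systems by this score is how the review derives workflow suitability from the individual security and applicability assessments.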
24 CFR 15.305 - Method of production of material or provision of testimony.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., to which the seal of the Department has been affixed in accordance with its authentication procedures. The authentication shall be evidence that the documents are true copies of documents in the Department...
An Updated Review of Meat Authenticity Methods and Applications.
Vlachos, Antonios; Arvanitoyannis, Ioannis S; Tserkezou, Persefoni
2016-05-18
Adulteration of foods is a serious economic problem concerning most foodstuffs, and meat products in particular. Since high-priced meats command premium prices, producers of meat-based products might be tempted to blend them with lower-cost meat; moreover, the labeled meat contents may not be met. Both types of adulteration are difficult to detect and lead to deterioration of product quality. For the consumer, it is of utmost importance to guarantee both authenticity and compliance with product labeling. The purpose of this article is to review the state of the art in meat authentication using analytical and immunochemical methods, with a focus on geographic origin and sensory characteristics. This review is also intended to provide an overview of the various currently applied statistical analyses (multivariate analysis (MVA), such as principal component analysis, discriminant analysis, cluster analysis, etc.) and their effectiveness for meat authenticity.
Jung, Jaewook; Moon, Jongho; Lee, Donghoon; Won, Dongho
2017-01-01
At present, users can utilize an authenticated key agreement protocol in a Wireless Sensor Network (WSN) to securely obtain desired information, and numerous studies have investigated authentication techniques to construct efficient, robust WSNs. Chang et al. recently presented an authenticated key agreement mechanism for WSNs and claimed that their authentication mechanism can both prevent various types of attacks and preserve security properties. However, we have discovered that Chang et al.'s method possesses some security weaknesses. First, their mechanism cannot guarantee protection against a password guessing attack, user impersonation attack or session key compromise. Second, the mechanism results in a high load on the gateway node because the gateway node must always maintain the verifier tables. Third, there is no session key verification process in the authentication phase. To this end, we describe how the previously stated weaknesses occur and propose a security-enhanced version for WSNs. We present a detailed analysis of the security and performance of our authenticated key agreement mechanism, which not only enhances security compared to that of related schemes, but also takes efficiency into consideration. PMID:28335572
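The missing session-key verification step the authors highlight can be illustrated with a generic key-confirmation exchange after key agreement. This is a hedged sketch, not the paper's protocol: the party identifiers and message format below are invented for illustration, and HMAC-SHA256 stands in for whatever MAC the scheme would actually use.

```python
import hashlib
import hmac
import os

def confirm_tag(session_key: bytes, party_id: bytes) -> bytes:
    """Key-confirmation MAC: proves possession of the agreed session key."""
    return hmac.new(session_key, b"confirm|" + party_id, hashlib.sha256).digest()

# Both sides should have derived the same session key during key agreement
sk_user = os.urandom(32)
sk_sensor = sk_user  # agreement succeeded in this toy run

# The user sends a tag bound to its identity; the peer recomputes and compares
tag_user = confirm_tag(sk_user, b"user-A")
ok = hmac.compare_digest(tag_user, confirm_tag(sk_sensor, b"user-A"))

# A party holding a different key cannot produce a matching tag
bad = hmac.compare_digest(tag_user, confirm_tag(os.urandom(32), b"user-A"))
```

Verifying the tag before any application data flows lets either side detect a failed or tampered agreement immediately, which is exactly the gap the review identifies in Chang et al.'s scheme.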
Pires, Nuno M M; Tao Dong; Berntzen, Lasse; Lonningdal, Torill
2017-07-01
This work focuses on the development of a sophisticated STR-typing technique to unequivocally verify the authenticity of urine samples before they are sent to laboratories. STR profiling was conducted with the CSF1PO, TPOX, TH01 Multiplex System coupled with a smartphone-based detection method. The method's promising capability to identify distinct STR profiles from the urine of different persons opens the possibility of conducting sample authenticity tests. On-site STR profiling could be realized with a self-contained autonomous device with an integrated PCR microchip, as shown here.
Efficient local representations for three-dimensional palmprint recognition
NASA Astrophysics Data System (ADS)
Yang, Bing; Wang, Xiaohua; Yao, Jinliang; Yang, Xin; Zhu, Wenhua
2013-10-01
Palmprints have been broadly used for personal authentication because they are highly accurate and incur low cost. Most previous works have focused on two-dimensional (2-D) palmprint recognition in the past decade. Unfortunately, 2-D palmprint recognition systems lose the shape information when capturing palmprint images. Moreover, such 2-D palmprint images can be easily forged or affected by noise. Hence, three-dimensional (3-D) palmprint recognition has been regarded as a promising way to further improve the performance of palmprint recognition systems. We have developed a simple, but efficient method for 3-D palmprint recognition by using local features. We first utilize shape index representation to describe the geometry of local regions in 3-D palmprint data. Then, we extract local binary pattern and Gabor wavelet features from the shape index image. The two types of complementary features are finally fused at a score level for further improvements. The experimental results on the Hong Kong Polytechnic 3-D palmprint database, which contains 8000 samples from 400 palms, illustrate the effectiveness of the proposed method.
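The score-level fusion step described above can be sketched as a weighted sum of min-max normalized matcher scores. This is a minimal illustration only: the weight, the normalization, and the toy scores are assumptions, not the paper's exact fusion rule.

```python
import numpy as np

def min_max_normalize(scores):
    """Map raw matcher scores to [0, 1] so heterogeneous scales become comparable."""
    s = np.asarray(scores, dtype=float)
    lo, hi = s.min(), s.max()
    return (s - lo) / (hi - lo) if hi > lo else np.zeros_like(s)

def fuse_scores(lbp_scores, gabor_scores, w=0.5):
    """Weighted-sum score-level fusion of two complementary feature matchers."""
    return w * min_max_normalize(lbp_scores) + (1 - w) * min_max_normalize(gabor_scores)

# Toy example: two matchers scoring the same 4 gallery candidates on different scales
lbp = [0.2, 0.9, 0.4, 0.1]
gabor = [10.0, 80.0, 30.0, 5.0]
fused = fuse_scores(lbp, gabor)
best = int(np.argmax(fused))  # index of the best-matching candidate
```

Score-level fusion is attractive here because the LBP and Gabor matchers need not share a feature space; only their output scores are combined.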
A text zero-watermarking method based on keyword dense interval
NASA Astrophysics Data System (ADS)
Yang, Fan; Zhu, Yuesheng; Jiang, Yifeng; Qing, Yin
2017-07-01
Digital watermarking has been recognized as a useful technology for the copyright protection and authentication of digital information. However, previous methods have rarely focused on the key content of the digital carrier. The idea of protecting key content is more targeted and applies to different kinds of digital information, including text, images and video. In this paper, we take text as the research object and propose a text zero-watermarking method that uses keyword dense intervals (KDI) as the key content. First, we construct the zero-watermarking model by introducing the concept of the KDI and giving a method for KDI extraction. Second, we design a detection model that includes secondary generation of the zero-watermark and a similarity-computing method for the keyword distribution. Experiments show that the proposed method performs better than other available methods, especially under sentence-transformation and synonym-substitution attacks.
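The KDI extraction step could look like the following sketch: find stretches of text where keyword occurrences cluster densely, then merge overlapping stretches. The window length and hit threshold are assumed parameters; the paper's exact KDI definition may differ.

```python
def keyword_dense_intervals(words, keywords, window=10, min_hits=3):
    """Return (start, end) word-index intervals where keyword density is high."""
    kw = set(keywords)
    hits = [i for i, w in enumerate(words) if w in kw]
    intervals = []
    for i in range(len(hits)):
        j = i
        # extend while the next keyword hit is still inside the window
        while j + 1 < len(hits) and hits[j + 1] - hits[i] < window:
            j += 1
        if j - i + 1 >= min_hits:
            intervals.append((hits[i], hits[j]))
    # merge overlapping intervals into maximal dense regions
    merged = []
    for s, e in intervals:
        if merged and s <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged

text = "copyright protection uses watermark and copyright law protects watermark owners fully".split()
dense = keyword_dense_intervals(text, {"copyright", "watermark", "protection"}, window=8, min_hits=3)
```

A zero-watermark built from such intervals changes little under attacks that rewrite individual sentences, since the keyword distribution, not the surface text, carries the signature.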
Osathanunkul, Maslin; Suwannapoom, Chatmongkon; Khamyong, Nuttaluck; Pintakum, Danupol; Lamphun, Santisuk Na; Triwitayakorn, Kanokporn; Osathanunkul, Kitisak; Madesis, Panagiotis
2016-01-01
Andrographis paniculata Nees is a medicinal plant with multiple pharmacological properties that has been used over many centuries as a household remedy. A. paniculata products sold on the markets are in processed forms, so they are difficult to authenticate; buying such herbal products therefore poses a high risk of acquiring counterfeit, substituted and/or adulterated products. A reliable method to authenticate products is thus needed. High resolution melting analysis coupled with DNA barcoding (Bar-HRM) was applied to detect adulteration in commercial herbal products. The rbcL barcode was selected for primer design for HRM analysis to produce a standard melting profile of the A. paniculata species. DNA of the tested commercial products was isolated, and their melting profiles were generated and compared with the standard A. paniculata profile. The melting profiles of the rbcL amplicons of three closely related herbal species (A. paniculata, Acanthus ebracteatus and Rhinacanthus nasutus) are clearly separated, so they can be distinguished by the developed method. The method was then used to authenticate commercial herbal products: the HRM curves of all 10 samples tested are similar to A. paniculata, indicating that all tested products contained the correct species as labeled. The method described in this study has proved useful in aiding identification and/or authentication of A. paniculata, allowing us to easily determine the A. paniculata species in herbal products on the markets even when they are in processed forms. We propose the use of DNA barcoding combined with high resolution melting analysis for authenticating Andrographis paniculata products. The developed method can be used regardless of the type of DNA template (fresh or dried tissue, leaf, or stem). The rbcL region was chosen for the analysis and worked well with our samples. We can easily determine the A. paniculata species in the herbal products tested.
Abbreviations used: bp: Base pair, Tm: Melting temperature.
Using cloud models of heartbeats as the entity identifier to secure mobile devices.
Fu, Donglai; Liu, Yanhua
2017-01-01
Mobile devices are extensively used to store private and often sensitive information, so it is important to protect them against unauthorised access. Authentication ensures that only authorised users can use a mobile device. However, traditional authentication methods, such as numerical or graphical passwords, are vulnerable to passive attacks; for example, an adversary can steal a password by snooping from a short distance. To avoid these problems, this study presents a biometric approach that uses cloud models of heartbeats as the entity identifier to secure mobile devices. Note that the terms cloud model and cloud here have nothing to do with cloud computing; the cloud model in this study is a cognitive model. In the proposed method, heartbeats are collected by two ECG electrodes connected to the mobile device. The backward normal cloud generator is used to generate ECG standard cloud models characterising the heartbeat template. When a user tries to access the mobile device, cloud models regenerated from fresh heartbeats are compared with the ECG standard cloud models to determine whether the current user may use the device. The authentication method was evaluated in terms of accuracy, authentication time and energy consumption. The proposed method gives an 86.04% true acceptance rate with a 2.73% false acceptance rate. One authentication can be completed in 6 s, and the processing consumes about 2000 mW of power.
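The backward normal cloud generator mentioned above estimates a cloud's numerical characteristics (Ex, En, He) from data samples. The sketch below uses the widely cited first-order absolute-moment estimator; the toy R-R interval values are invented for illustration, and the paper's actual ECG features may be different.

```python
import math

def backward_cloud(samples):
    """Estimate (Ex, En, He) of a normal cloud model from data samples."""
    n = len(samples)
    ex = sum(samples) / n                                # expectation
    mad = sum(abs(x - ex) for x in samples) / n          # mean absolute deviation
    en = math.sqrt(math.pi / 2.0) * mad                  # entropy
    var = sum((x - ex) ** 2 for x in samples) / (n - 1)  # sample variance
    he = math.sqrt(max(var - en ** 2, 0.0))              # hyper-entropy
    return ex, en, he

# Toy "heartbeat feature" samples (hypothetical R-R intervals in ms)
rr = [812, 798, 805, 820, 790, 808, 815, 801]
ex, en, he = backward_cloud(rr)
```

At authentication time, (Ex, En, He) regenerated from fresh heartbeats would be compared against the stored template's characteristics; the comparison rule itself is scheme-specific and not shown here.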
Code of Federal Regulations, 2011 CFR
2011-04-01
... 25 Indians 2 2011-04-01 2011-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 25 Indians 2 2010-04-01 2010-04-01 false For marketing purposes, what is the recommended method of identifying authentic Indian products? 309.8 Section 309.8 Indians INDIAN ARTS AND CRAFTS BOARD, DEPARTMENT OF THE INTERIOR PROTECTION OF INDIAN ARTS AND CRAFTS PRODUCTS § 309.8 For marketing purposes, what is the...
Tarot Images and Spiritual Education: The Three I's Model
ERIC Educational Resources Information Center
Semetsky, Inna
2011-01-01
The paper presents education as a process of human development toward becoming our authentic Selves and posits the Tarot hermeneutic as one of the means of holistic, spiritual education. As a system of images and symbols, Tarot encompasses the three I's represented by intuition, insight and imagination in contrast to the three R's of traditional…
NASA Astrophysics Data System (ADS)
Javidi, Bahram; Carnicer, Artur; Yamaguchi, Masahiro; Nomura, Takanori; Pérez-Cabré, Elisabet; Millán, María S.; Nishchal, Naveen K.; Torroba, Roberto; Fredy Barrera, John; He, Wenqi; Peng, Xiang; Stern, Adrian; Rivenson, Yair; Alfalou, A.; Brosseau, C.; Guo, Changliang; Sheridan, John T.; Situ, Guohai; Naruse, Makoto; Matsumoto, Tsutomu; Juvells, Ignasi; Tajahuerce, Enrique; Lancis, Jesús; Chen, Wen; Chen, Xudong; Pinkse, Pepijn W. H.; Mosk, Allard P.; Markman, Adam
2016-08-01
Information security and authentication are important challenges facing society. Recent attacks by hackers on the databases of large commercial and financial companies have demonstrated that more research and development of advanced approaches are necessary to deny unauthorized access to critical data. Free space optical technology has been investigated by many researchers in information security, encryption, and authentication. The main motivation for using optics and photonics for information security is that optical waveforms possess many complex degrees of freedom such as amplitude, phase, polarization, large bandwidth, nonlinear transformations, quantum properties of photons, and multiplexing that can be combined in many ways to make information encryption more secure and more difficult to attack. This roadmap article presents an overview of the potential, recent advances, and challenges of optical security and encryption using free space optics. The roadmap on optical security is comprised of six categories that together include 16 short sections written by authors who have made relevant contributions in this field. The first category of this roadmap describes novel encryption approaches, including secure optical sensing which summarizes double random phase encryption applications and flaws [Yamaguchi], the digital holographic encryption in free space optical technique which describes encryption using multidimensional digital holography [Nomura], simultaneous encryption of multiple signals [Pérez-Cabré], asymmetric methods based on information truncation [Nishchal], and dynamic encryption of video sequences [Torroba]. Asymmetric and one-way cryptosystems are analyzed by Peng. The second category is on compression for encryption. In their respective contributions, Alfalou and Stern propose similar goals involving compressed data and compressive sensing encryption. 
The very important area of cryptanalysis is the topic of the third category with two sections: Sheridan reviews phase retrieval algorithms to perform different attacks, whereas Situ discusses nonlinear optical encryption techniques and the development of a rigorous optical information security theory. The fourth category with two contributions reports how encryption could be implemented at the nano- or micro-scale. Naruse discusses the use of nanostructures in security applications and Carnicer proposes encoding information in a tightly focused beam. In the fifth category, encryption based on ghost imaging using single-pixel detectors is also considered. In particular, the authors [Chen, Tajahuerce] emphasize the need for more specialized hardware and image processing algorithms. Finally, in the sixth category, Mosk and Javidi analyze in their corresponding papers how quantum imaging can benefit optical encryption systems. Sources that use few photons make encryption systems much more difficult to attack, providing a secure method for authentication.
[Combined fat products: methodological possibilities for their identification].
Viktorova, E V; Kulakova, S N; Mikhaĭlov, N A
2006-01-01
Falsification of milk fat is currently a very topical problem. A number of methods for detecting the authenticity of milk fat, and for distinguishing it from combined fat products, were considered. Analysis of modern approaches to evaluating milk fat authenticity showed that the main method for determining the nature of a fat is gas chromatographic analysis. A computerized method for rapid identification of fat products is proposed for quickly determining whether an examined fat is natural milk fat or a combined fat product.
Nguyen, Huy Truong; Min, Jung-Eun; Long, Nguyen Phuoc; Thanh, Ma Chi; Le, Thi Hong Van; Lee, Jeongmi; Park, Jeong Hill; Kwon, Sung Won
2017-08-05
Agarwood, the resinous heartwood produced by some Aquilaria species such as Aquilaria crassna, Aquilaria malaccensis and Aquilaria sinensis, has been traditionally and widely used in medicine, incenses and especially perfumes. However, up to now, the authentication of agarwood has been largely based on morphological characteristics, a method which is prone to errors and lacks reproducibility. Hence, in this study, we applied metabolomics and a genetic approach to the authentication of two common agarwood chips, those produced by Aquilaria crassna and Aquilaria malaccensis. Primary metabolites, secondary metabolites and DNA markers of agarwood were authenticated by 1 H NMR metabolomics, GC-MS metabolomics and DNA-based techniques, respectively. The results indicated that agarwood chips could be classified accurately by all the methods illustrated in this study. Additionally, the pros and cons of each method are also discussed. To the best of our knowledge, our research is the first study detailing all the differences in the primary and secondary metabolites, as well as the DNA markers between the agarwood produced by these two species. Copyright © 2017 Elsevier B.V. All rights reserved.
[Development of indel markers for molecular authentication of Panax ginseng and P. quinquefolius].
Wang, Rong-Bo; Tian, Hui-Li; Wang, Hong-Tao; Li, Gui-Sheng
2018-04-01
Panax ginseng and P. quinquefolius are two important medicinal herbs. They are morphologically similar but have different pharmacological effects; botanical origin authentication of these two ginsengs is therefore of great importance for ensuring pharmaceutical efficacy and food safety. Based on the fact that intron positions in orthologous genes are highly conserved across plant species, intron length polymorphisms were exploited from unigenes of ginseng. Specific primers were designed for the two species based on the insertion/deletion sequences of cytochrome P450 and glyceraldehyde 3-phosphate dehydrogenase, and multiplex PCR was conducted for molecular authentication of P. ginseng and P. quinquefolius. The results showed that the developed multiplex PCR assay was effective for molecular authentication of P. ginseng and P. quinquefolius without strict PCR conditions or optimization of the reaction system. This study provides a preferred marker system for molecular authentication of ginseng, and the presented method can be employed in the origin authentication of other herbal preparations. Copyright © by the Chinese Pharmaceutical Association.
Fang, Wanping; Meinhardt, Lyndel W; Mischke, Sue; Bellato, Cláudia M; Motilal, Lambert; Zhang, Dapeng
2014-01-15
Cacao (Theobroma cacao L.), the source of cocoa, is an economically important tropical crop. One problem with the premium cacao market is contamination with off-types adulterating raw premium material. Accurate determination of the genetic identity of single cacao beans is essential for ensuring cocoa authentication. Using nanofluidic single nucleotide polymorphism (SNP) genotyping with 48 SNP markers, we generated SNP fingerprints for small quantities of DNA extracted from the seed coat of single cacao beans. On the basis of the SNP profiles, we identified an assumed adulterant variety, which was unambiguously distinguished from the authentic beans by multilocus matching. Assignment tests based on both Bayesian clustering analysis and allele frequency clearly separated all 30 authentic samples from the non-authentic samples. Distance-based principal coordinate analysis further supported these results. The nanofluidic SNP protocol, together with forensic statistical tools, is sufficiently robust to establish authentication and to verify gourmet cacao varieties. This method shows significant potential for practical application.
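The multilocus matching step is, at its core, a locus-by-locus comparison of genotype profiles. The sketch below is a simplified illustration: the marker names and genotypes are hypothetical, and real panels (like the 48-SNP panel above) would also weigh match probability by allele frequencies.

```python
def multilocus_match(profile_a, profile_b, max_mismatch=0):
    """Compare two SNP genotype profiles locus by locus.

    Returns (is_match, mismatching_markers). With max_mismatch=0 the
    profiles must agree at every marker to be declared a match.
    """
    assert profile_a.keys() == profile_b.keys(), "profiles must cover the same markers"
    mismatches = [m for m in profile_a if profile_a[m] != profile_b[m]]
    return len(mismatches) <= max_mismatch, mismatches

# Hypothetical 4-marker profiles (a real panel would use 48 SNPs)
authentic = {"snp1": "AA", "snp2": "AG", "snp3": "CC", "snp4": "GT"}
bean      = {"snp1": "AA", "snp2": "AG", "snp3": "CT", "snp4": "GT"}
is_match, diff = multilocus_match(authentic, bean)
```

A single discordant locus is enough to flag a bean for follow-up with the assignment tests described above, which is why exact matching is the usual default.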
A Novel GMM-Based Behavioral Modeling Approach for Smartwatch-Based Driver Authentication.
Yang, Ching-Han; Chang, Chin-Chun; Liang, Deron
2018-03-28
All drivers have their own distinct driving habits, and usually hold and operate the steering wheel differently in different driving scenarios. In this study, we propose a novel Gaussian mixture model (GMM)-based method that improves on the traditional GMM for modeling driving behavior. This new method can be applied to build a better driver authentication system based on the accelerometer and orientation sensor of a smartwatch. To demonstrate the feasibility of the proposed method, we created an experimental system that analyzes driving behavior using the built-in sensors of a smartwatch. The experimental results for driver authentication (an equal error rate (EER) of 4.62% in the simulated environment and 7.86% in the real-traffic environment) confirm the feasibility of this approach.
Salama, Nahla N; Wang, Shudong
2008-05-28
The present study employs a time-of-flight mass spectrometry method for the determination of ropivacaine and bupivacaine in authentic, pharmaceutical and spiked human plasma samples, as well as in the presence of their impurities, 2,6-dimethylaniline and the alkaline degradation product. The method is based on time-of-flight electrospray ionization mass spectrometry without preliminary chromatographic separation, and uses bupivacaine as the internal standard for ropivacaine and ropivacaine as the internal standard for bupivacaine. A linear relationship between drug concentrations and the peak intensity ratios of the ions of the analyzed substances is established. The method is linear from 23.8 to 2380.0 ng mL(-1) for both drugs. The correlation coefficient was >=0.996 in authentic and spiked human plasma. Average percentage recoveries in the range of 95.39%-102.75% were obtained. The method is accurate (%RE < 5%) and reproducible, with intra- and inter-assay precision (RSD < 8.0%). The quantification limit is 23.8 ng mL(-1) for both drugs. The method is not only highly sensitive and selective, but also simple and effective for the determination or identification of both drugs in authentic samples and biological fluids, and can be applied in purity testing, quality control and stability monitoring for the studied drugs.
SegAuth: A Segment-based Approach to Behavioral Biometric Authentication
Li, Yanyan; Xie, Mengjun; Bian, Jiang
2016-01-01
Many studies have been conducted to apply behavioral biometric authentication on/with mobile devices and they have shown promising results. However, the concern about the verification accuracy of behavioral biometrics is still common given the dynamic nature of behavioral biometrics. In this paper, we address the accuracy concern from a new perspective—behavior segments, that is, segments of a gesture instead of the whole gesture as the basic building block for behavioral biometric authentication. With this unique perspective, we propose a new behavioral biometric authentication method called SegAuth, which can be applied to various gesture or motion based authentication scenarios. SegAuth can achieve high accuracy by focusing on each user’s distinctive gesture segments that frequently appear across his or her gestures. In SegAuth, a time series derived from a gesture/motion is first partitioned into segments and then transformed into a set of string tokens in which the tokens representing distinctive, repetitive segments are associated with higher genuine probabilities than those tokens that are common across users. An overall genuine score calculated from all the tokens derived from a gesture is used to determine the user’s authenticity. We have assessed the effectiveness of SegAuth using 4 different datasets. Our experimental results demonstrate that SegAuth can achieve higher accuracy consistently than existing popular methods on the evaluation datasets. PMID:28573214
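The segment-to-token idea behind SegAuth can be sketched by quantizing per-segment slopes into symbols and scoring tokens by how distinctive they are for the claimed user. This is a simplified illustration under assumed parameters (fixed segment length, three symbols, Laplace smoothing); SegAuth's actual tokenization and probability model are more elaborate.

```python
def tokenize(series, seg_len=5):
    """Split a 1-D time series into fixed-length segments; map each to a shape symbol."""
    symbols = []
    for i in range(0, len(series) - seg_len + 1, seg_len):
        seg = series[i:i + seg_len]
        slope = (seg[-1] - seg[0]) / (seg_len - 1)
        symbols.append("U" if slope > 0.1 else ("D" if slope < -0.1 else "F"))
    return symbols

def genuine_score(tokens, user_counts, others_counts):
    """Laplace-smoothed average probability that each token came from the claimed user."""
    probs = [(user_counts.get(t, 0) + 1) /
             (user_counts.get(t, 0) + others_counts.get(t, 0) + 2)
             for t in tokens]
    return sum(probs) / len(probs)

# Toy gesture: rise, plateau, fall -> tokens U, F, D
gesture = [0, 1, 2, 3, 4, 4, 4, 4, 4, 4, 4, 3, 2, 1, 0]
toks = tokenize(gesture)
# Token counts from (hypothetical) enrollment data of this user vs. other users
score = genuine_score(toks, {"U": 9, "F": 1, "D": 8}, {"F": 9})
```

Tokens frequent in the user's own history but rare elsewhere ("U", "D" here) pull the score up, while common tokens ("F") contribute little, mirroring the paper's intuition that distinctive segments carry the authentication signal.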
Choi, Han Gyo; Ahn, Sung Hee
2016-02-01
The aim of this study was to examine the mediating effect of empowerment in the relationship of nurse managers' authentic leadership with nurses' organizational commitment and job satisfaction. The participants in this study were 273 registered nurses working in five university hospitals located in Seoul and Gyeonggi Province. The measurements included the Authentic Leadership Questionnaire, Conditions of Work Effectiveness Questionnaire-II, Organizational Commitment Questionnaire and Korea-Minnesota Satisfaction Questionnaire. Data were analyzed using t-test, ANOVA, Scheffé test, Pearson correlation coefficients, and simple and multiple regression techniques with the SPSS 18.0 program. Mediation analysis was performed according to the Baron and Kenny method and the Sobel test. There were significant correlations among authentic leadership, empowerment, organizational commitment and job satisfaction. Empowerment showed a perfect mediating effect in the relationship between authentic leadership and organizational commitment, and a partial mediating effect in the relationship between authentic leadership and job satisfaction. In this study, nurse managers' authentic leadership had significant influences on nurses' organizational commitment and job satisfaction via empowerment. Therefore, to enhance nurses' organizational commitment and job satisfaction, it is necessary to build effective strategies to enhance nurse managers' authentic leadership and to develop empowering education programs for nurses.
NASA Astrophysics Data System (ADS)
Muñoz-Franco, Granada; Criado, Ana María; García-Carmona, Antonio
2018-04-01
This article presents the results of a qualitative study aimed at determining the effectiveness of the camera obscura as a didactic tool to understand image formation (i.e., how it is possible to see objects and how their image is formed on the retina, and what the image formed on the retina is like compared to the object observed) in a context of scientific inquiry. The study involved 104 prospective primary teachers (PPTs) who were being trained in science teaching. To assess the effectiveness of this tool, an open questionnaire was applied before (pre-test) and after (post-test) the educational intervention. The data were analyzed by combining methods of inter- and intra-rater analysis. The results showed that more than half of the PPTs advanced in their ideas towards the desirable level of knowledge in relation to the phenomena studied. The conclusion reached is that the camera obscura, used in a context of scientific inquiry, is a useful tool for PPTs to improve their knowledge about image formation and experience in the first person an authentic scientific inquiry during their teacher training.
Optics for People Stuck in Traffic: License Plates.
ERIC Educational Resources Information Center
Chagnon, Paul
1995-01-01
Explains the theory behind the working of Scotchlite, a retrodirective material used for coating automotive license plates, and the Ensure Imaging System that allows law enforcement officers to verify the authenticity of the plate. (JRH)
Bajoub, Aadil; Bendini, Alessandra; Fernández-Gutiérrez, Alberto; Carrasco-Pancorbo, Alegría
2018-03-24
Over the last decades, olive oil quality and authenticity control has become an issue of great importance to consumers, suppliers, retailers, and regulators in both traditional and emerging olive oil producing countries, mainly due to the increasing worldwide popularity and the trade globalization of this product. Thus, in order to ensure olive oil authentication, various national and international laws and regulations have been adopted, although some of them are actually causing an enormous debate about the risk that they can represent for the harmonization of international olive oil trade standards. Within this context, this review was designed to provide a critical overview and comparative analysis of selected regulatory frameworks for olive oil authentication, with special emphasis on the quality and purity criteria considered by these regulation systems, their thresholds and the analytical methods employed for monitoring them. To complete the general overview, recent analytical advances to overcome drawbacks and limitations of the official methods to evaluate olive oil quality and to determine possible adulterations were reviewed. Furthermore, the latest trends on analytical approaches to assess the olive oil geographical and varietal origin traceability were also examined.
Data Embedding for Covert Communications, Digital Watermarking, and Information Augmentation
2000-03-01
Proposes an image authentication algorithm based on the fragility of messages embedded in digital images using LSB encoding [Walt95]. (The remainder of this record is table-of-contents residue from the report, listing sample data embedding techniques: spatial techniques, LSB encoding in intensity images, data embedding in the frequency domain, and effects of LSB encoding.)
Strict integrity control of biomedical images
NASA Astrophysics Data System (ADS)
Coatrieux, Gouenou; Maitre, Henri; Sankur, Bulent
2001-08-01
The control of the integrity and authentication of medical images is becoming ever more important within the Medical Information Systems (MIS). The intra- and interhospital exchange of images, such as in the PACS (Picture Archiving and Communication Systems), and the ease of copying, manipulation and distribution of images have brought forth the security aspects. In this paper we focus on the role of watermarking for MIS security and address the problem of integrity control of medical images. We discuss alternative schemes to extract verification signatures and compare their tamper detection performance.
Hong, Danfeng; Su, Jian; Hong, Qinggen; Pan, Zhenkuan; Wang, Guodong
2014-01-01
As palmprints are captured using non-contact devices, image blur is inevitably generated because of the defocused status. This degrades the recognition performance of the system. To solve this problem, we propose a stable-feature extraction method based on a Vese–Osher (VO) decomposition model to recognize blurred palmprints effectively. A Gaussian defocus degradation model is first established to simulate image blur. With different degrees of blurring, stable features are found to exist in the image which can be investigated by analyzing the blur theoretically. Then, a VO decomposition model is used to obtain structure and texture layers of the blurred palmprint images. The structure layer is stable for different degrees of blurring (this is a theoretical conclusion that needs to be further proved via experiment). Next, an algorithm based on weighted robustness histogram of oriented gradients (WRHOG) is designed to extract the stable features from the structure layer of the blurred palmprint image. Finally, a normalized correlation coefficient is introduced to measure the similarity in the palmprint features. We also designed and performed a series of experiments to show the benefits of the proposed method. The experimental results are used to demonstrate the theoretical conclusion that the structure layer is stable for different blurring scales. The WRHOG method also proves to be an advanced and robust method of distinguishing blurred palmprints. The recognition results obtained using the proposed method and data from two palmprint databases (PolyU and Blurred–PolyU) are stable and superior in comparison to previous high-performance methods (the equal error rate is only 0.132%). In addition, the authentication time is less than 1.3 s, which is fast enough to meet real-time demands. Therefore, the proposed method is a feasible way of implementing blurred palmprint recognition. PMID:24992328
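The normalized correlation coefficient used as the final similarity measure is, in essence, the Pearson correlation between two feature vectors; a minimal sketch (the toy vectors are invented, and real inputs would be the WRHOG feature vectors described above):

```python
import numpy as np

def normalized_correlation(f1, f2):
    """Normalized correlation coefficient between two feature vectors.

    Returns 1.0 for identically shaped vectors, -1.0 for perfectly
    anti-correlated ones, and 0.0 for degenerate (constant) input.
    """
    a = np.asarray(f1, dtype=float) - np.mean(f1)
    b = np.asarray(f2, dtype=float) - np.mean(f2)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

same = normalized_correlation([1, 2, 3, 4], [2, 4, 6, 8])  # identical shape
anti = normalized_correlation([1, 2, 3, 4], [4, 3, 2, 1])  # reversed shape
```

Because the measure is invariant to the mean and scale of the feature vectors, it tolerates the global intensity changes that defocus blur introduces, which is one reason correlation-based matching suits this setting.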
Osathanunkul, Maslin; Suwannapoom, Chatmongkon; Khamyong, Nuttaluck; Pintakum, Danupol; Lamphun, Santisuk Na; Triwitayakorn, Kanokporn; Osathanunkul, Kitisak; Madesis, Panagiotis
2016-01-01
Background: Andrographis paniculata Nees is a medicinal plant with multiple pharmacological properties. It has been used over many centuries as a household remedy. A. paniculata products sold on the markets come in processed forms, which makes them difficult to authenticate; buying such herbal products therefore carries a high risk of acquiring counterfeited, substituted and/or adulterated goods. Owing to these issues, a reliable method to authenticate products is needed. Materials and Methods: High resolution melting analysis coupled with DNA barcoding (Bar-HRM) was applied to detect adulteration in commercial herbal products. The rbcL barcode was selected for primer design in the HRM analysis to produce a standard melting profile of the A. paniculata species. DNA of the tested commercial products was isolated, and their melting profiles were then generated and compared with the A. paniculata standard. Results: The melting profiles of the rbcL amplicons of the three closely related herbal species (A. paniculata, Acanthus ebracteatus and Rhinacanthus nasutus) are clearly separated, so they can be distinguished by the developed method. The method was then used to authenticate commercial herbal products. The HRM curves of all 10 samples tested are similar to that of A. paniculata, indicating that all tested products contained the correct species as labeled. Conclusion: The method described in this study has proved useful in aiding identification and/or authentication of A. paniculata. This Bar-HRM analysis allowed us to easily determine the presence of A. paniculata in herbal products on the markets, even when they are in processed forms.
SUMMARY We propose the use of DNA barcoding combined with High Resolution Melting analysis for authenticating Andrographis paniculata products. The developed method can be used regardless of the type of DNA template (fresh or dried tissue, leaf, and stem). The rbcL region was chosen for the analysis and worked well with our samples. We can easily determine the A. paniculata species in the herbal products tested. Abbreviations used: bp: Base pair, Tm: Melting temperature PMID:27041863
Joint forensics and watermarking approach for video authentication
NASA Astrophysics Data System (ADS)
Thiemert, Stefan; Liu, Huajian; Steinebach, Martin; Croce-Ferri, Lucilla
2007-02-01
In our paper we discuss and compare the possibilities and shortcomings of both content-fragile watermarking and digital forensics, and analyze whether the combination of both techniques allows the identification of more manipulations than the sum of those identified by each technique on its own, due to synergetic effects. The first part of the paper discusses the theoretical possibilities offered by a combined approach, in which forensics and watermarking are considered as complementary tools for data authentication or are deeply combined together, in order to reduce their error rate and to enhance the detection efficiency. After this conceptual discussion the paper proposes some concrete examples in which the joint approach is applied to video authentication. Some specific forensics techniques are analyzed and extended to handle video data efficiently. The examples show possible extensions of passive-blind image forgery detection to video data, where the motion and time related characteristics of video are efficiently exploited.
Facilitating and securing offline e-medicine service through image steganography.
Kamal, A H M; Islam, M Mahfuzul
2014-06-01
E-medicine is a process of providing health care services to people using the Internet or any networking technology. In this Letter, a new idea is proposed to model the physical structure of the e-medicine system to better provide offline health care services. A smart card alone is used to authenticate the user. A unique technique is also suggested to verify the card owner's identity and to embed secret data in the card while providing patients' reports either at booths or at the e-medicine server system. The simulation results of the card authentication and embedding procedures justify the proposed implementation.
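The Letter does not spell out its embedding algorithm; as a hedged sketch, the classic least-significant-bit technique below shows how report data could be hidden in card image pixels (the pixel values and secret bits are made up for illustration):

```python
def embed_lsb(pixels, message_bits):
    # Write each secret bit into the least significant bit of a pixel value;
    # each pixel changes by at most 1, keeping the image visually unchanged.
    assert len(message_bits) <= len(pixels)
    stego = list(pixels)
    for i, bit in enumerate(message_bits):
        stego[i] = (stego[i] & ~1) | bit
    return stego

def extract_lsb(pixels, n_bits):
    # Read the hidden bits back out of the low-order bits.
    return [p & 1 for p in pixels[:n_bits]]

cover = [120, 121, 122, 123, 124, 125, 126, 127]   # toy grayscale pixels
secret = [1, 0, 1, 1, 0, 0, 1, 0]                  # e.g. bits of a patient ID
stego = embed_lsb(cover, secret)
print(extract_lsb(stego, 8) == secret)  # → True
```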
Approximation, Mad Men and the Death of JFK.
Bruzzi, Stella
2018-01-01
In this article I take the US television series Mad Men (2007-present) as an exemplary 'approximation', a term I adopt to signal the way in which certain texts construct a changeable, fluid 'truth' resulting from collisions, exchange and dialectical argument. Approximations are layered, their formal layerings mirroring a layered, multifaceted argument. Mad Men integrates and represents real historical events within a fictional setting, an act that suggests that an event or action can never be finished, fixed and closed to reassessment. Specifically, this article examines 'The Grown Ups', Episode 12 of Season 3, which charts the events of 22 November 1963, the day Kennedy was assassinated. Although we might be able to bring to mind the images and conspiracy theories that have been made available since (such as Abraham Zapruder's 8 mm home movie footage of the assassination), these images were not available at the time. Mad Men as a series always strives to represent its historical milieu as authentically as possible, so the characters re-enact 22 November 1963 as authentically as possible by watching only what was on television that day (the news bulletins, Walter Cronkite's announcement that Kennedy is dead). The contemporary backdrop to these events, including the resonances of '9/11' through Mad Men, informs and collides with the authenticity on the screen.
Experimental evaluation of fingerprint verification system based on double random phase encoding
NASA Astrophysics Data System (ADS)
Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi
2006-03-01
We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual reduces when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
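The position-difference estimation can be illustrated in one dimension: slide a core template over the captured profile and keep the offset with the largest correlation (the template and signal below are toy data, not real fingerprint profiles):

```python
def estimate_shift(template, signal):
    """Estimate the offset of `template` inside `signal` by maximizing the
    inner product at each alignment (a 1-D stand-in for 2-D core detection
    on fingerprint images)."""
    best_offset, best_score = 0, float("-inf")
    for off in range(len(signal) - len(template) + 1):
        score = sum(t * signal[off + j] for j, t in enumerate(template))
        if score > best_score:
            best_offset, best_score = off, score
    return best_offset

template = [0, 3, 9, 3, 0]                # idealized "core" profile
signal = [0, 0, 0, 0, 3, 9, 3, 0, 0, 0]   # the same profile shifted by 3
print(estimate_shift(template, signal))   # → 3
```

In the actual system the estimate is two-dimensional and, when the estimated difference exceeds the permissible level, the user is asked to input the fingerprint again.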
Weeks, Keith W; Clochesy, John M; Hutton, B Meriel; Moseley, Laurie
2013-03-01
Advancing the art and science of education practice requires a robust evaluation of the relationship between students' exposure to learning and assessment environments and the development of their cognitive competence (knowing that and why) and functional competence (know-how and skills). Healthcare education translation research requires specific education technology assessments and evaluations that consist of quantitative analyses of empirical data and qualitative evaluations of the lived student experience of the education journey and schemata construction (Weeks et al., 2013a). This paper focuses on the outcomes of UK PhD and USA post-doctorate experimental research. We evaluated the relationship between exposure to traditional didactic methods of education, prototypes of an authentic medication dosage calculation problem-solving (MDC-PS) environment and nursing students' construction of conceptual and calculation competence in medication dosage calculation problem-solving skills. Empirical outcomes from both UK and USA programmes of research identified highly significant differences in the construction of conceptual and calculation competence in MDC-PS following exposure to the authentic learning environment to that following exposure to traditional didactic transmission methods of education (p < 0.001). This research highlighted that for many students exposure to authentic learning environments is an essential first step in the development of conceptual and calculation competence and relevant schemata construction (internal representations of the relationship between the features of authentic dosage problems and calculation functions); and how authentic environments more ably support all cognitive (learning) styles in mathematics than traditional didactic methods of education. Functional competence evaluations are addressed in Macdonald et al. (2013) and Weeks et al. (2013e). Copyright © 2012. Published by Elsevier Ltd.
USDA-ARS?s Scientific Manuscript database
Milk is a vulnerable target for economically motivated adulteration. In this study, a line-scan high-throughput Raman imaging system was used to authenticate milk powder. A 5 W 785 nm line laser (240 mm long and 1 mm wide) was used as a Raman excitation source. The system was used to acquire hypersp...
Attacks on quantum key distribution protocols that employ non-ITS authentication
NASA Astrophysics Data System (ADS)
Pacher, C.; Abidin, A.; Lorünser, T.; Peev, M.; Ursin, R.; Zeilinger, A.; Larsson, J.-Å.
2016-01-01
We demonstrate how adversaries with large computing resources can break quantum key distribution (QKD) protocols which employ a particular message authentication code suggested previously. This authentication code, featuring low key consumption, is not information-theoretically secure (ITS), since for each message the eavesdropper has intercepted she is able to send a different message from a set of messages that she can calculate by finding collisions of a cryptographic hash function. However, when this authentication code was introduced, it was shown to prevent straightforward man-in-the-middle (MITM) attacks against QKD protocols. In this paper, we prove that the set of messages that collide with any given message under this authentication code contains, with high probability, a message that has a small Hamming distance to any other given message. Based on this fact, we present extended MITM attacks against different versions of BB84 QKD protocols using the addressed authentication code; for three protocols, we describe every single action taken by the adversary. For all protocols, the adversary can obtain complete knowledge of the key, and for most protocols her success probability in doing so approaches unity. Since the attacks work against all authentication methods which allow colliding messages to be calculated, the underlying building blocks of the presented attacks expose the potential pitfalls arising as a consequence of non-ITS authentication in QKD post-processing. We propose countermeasures that increase the eavesdropper's demand for computational power, and also prove necessary and sufficient conditions for upgrading the discussed authentication code to the ITS level.
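The core weakness can be demonstrated with a deliberately tiny stand-in for the non-ITS code (a 16-bit truncated MD5 masked by the key; this toy construction is ours, not the actual code analyzed in the paper): because collisions of the public hash can be computed offline without the key, a colliding forgery authenticates under every key.

```python
import hashlib

def h16(message):
    # Public 16-bit hash (deliberately tiny so collisions are easy to find).
    return hashlib.md5(message).digest()[:2]

def toy_mac(key, message):
    # Non-ITS tag: hash first, then mask with the shared key. Any two
    # messages colliding under h16 get identical tags under EVERY key.
    return bytes(a ^ b for a, b in zip(h16(message), key))

target = b"basis choices: + x + + x"
# Keyless offline search for a colliding message (at most ~2**16 tries
# are expected for a 16-bit hash).
forgery = next(
    m for m in (b"forged basis list %d" % i for i in range(1 << 20))
    if h16(m) == h16(target)
)
secret = b"\x5a\xc3"  # the legitimate parties' key, unknown to the attacker
print(toy_mac(secret, forgery) == toy_mac(secret, target))  # → True
```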
Hyperspectral imaging of polymer banknotes for building and analysis of spectral library
NASA Astrophysics Data System (ADS)
Lim, Hoong-Ta; Murukeshan, Vadakke Matham
2017-11-01
The use of counterfeit banknotes increases crime rates and cripples the economy. New countermeasures are required to stop counterfeiters who use advancing technologies with criminal intent. Many countries have started adopting polymer banknotes to replace paper notes, as polymer notes are more durable and of better quality. Research on authenticating such banknotes is of much interest to forensic investigators. Hyperspectral imaging can be employed to build a spectral library of polymer notes, which can then be used for classification to authenticate these notes. This has, however, not been widely reported and is a growing research interest in forensic identification. This paper focuses on the use of hyperspectral imaging on polymer notes to build spectral libraries, using a pushbroom hyperspectral imager which has been previously reported. As an initial study, a spectral library is built from three arbitrarily chosen regions of interest on five circulated genuine polymer notes. Principal component analysis is used for dimension reduction and to convert the information in the spectral library into principal components. A 99% confidence ellipse is formed around the cluster of principal component scores of each class and then used as the classification criterion. The potential of the adopted methodology is demonstrated by the classification of the imaged regions as training samples.
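The classification pipeline (PCA for dimension reduction, then a 99% confidence ellipse in score space) can be sketched as follows. The spectra are synthetic stand-ins for real hyperspectral data, and the ellipse test is implemented as a Mahalanobis-distance threshold at the chi-square 0.99 quantile with two degrees of freedom:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical training spectra for one region of a genuine note:
# 60 samples x 8 spectral bands, clustered around a nominal signature.
signature = np.array([0.9, 0.4, 0.7, 0.2, 0.8, 0.5, 0.3, 0.6])
train = signature + 0.02 * rng.standard_normal((60, 8))

# PCA via SVD on mean-centred data; keep the first two components.
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
components = vt[:2]
scores = (train - mean) @ components.T

# 99% confidence ellipse == squared Mahalanobis distance below the
# chi-square 0.99 quantile with 2 degrees of freedom (~9.21).
cov_inv = np.linalg.inv(np.cov(scores, rowvar=False))
CHI2_99_2DOF = 9.21

def inside_ellipse(spectrum):
    s = (spectrum - mean) @ components.T
    return s @ cov_inv @ s <= CHI2_99_2DOF

# The class centre lies inside its ellipse; a spectrum displaced far
# along the first component falls outside.
print(inside_ellipse(mean), inside_ellipse(mean + components[0]))  # → True False
```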
A Smart Spoofing Face Detector by Display Features Analysis.
Lai, ChinLun; Tai, ChiuYuan
2016-07-21
In this paper, a smart face liveness detector is proposed to prevent the biometric system from being "deceived" by a video or picture of a valid user that a counterfeiter captured with a high definition handheld device (e.g., an iPad with retina display). By analyzing the characteristics of the display platform and using an expert decision-making core, we can effectively detect whether a spoofing action comes from a fake face shown on a high definition display by verifying the chromaticity regions in the captured face. That is, a live or spoof face can be distinguished precisely by the designed optical image sensor. In sum, with the proposed method/system, a normal optical image sensor can be upgraded to a powerful version that detects spoofing actions. The experimental results show that the proposed detection system achieves a very high detection rate compared to existing methods and is thus practical to implement directly in authentication systems.
Fast perceptual image hash based on cascade algorithm
NASA Astrophysics Data System (ADS)
Ruchay, Alexey; Kober, Vitaly; Yavtushenko, Evgeniya
2017-09-01
In this paper, we propose a perceptual image hash algorithm based on a cascade algorithm, which can be applied to image authentication, retrieval, and indexing. Perceptual image hashing is used for image retrieval in the sense of human perception, tolerating distortions caused by compression, noise, common signal processing and geometrical modifications. The main disadvantage of perceptual hashing is its high time cost. The proposed cascade algorithm initializes image retrieval with short hashes, and a full hash is then applied to the filtered results. Computer simulation results show that the proposed hash algorithm yields good performance in terms of robustness, discriminability, and time expense.
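The cascade idea (cheap short hashes prune the database before the expensive full hash runs) can be sketched on toy flattened images. The average-hash and the thresholds used here are illustrative assumptions, not the authors' exact hash:

```python
def ahash(image, n_bits):
    # n_bits-bit average hash: threshold the first n_bits samples at the mean.
    avg = sum(image) / len(image)
    return tuple(p > avg for p in image[:n_bits])

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def cascade_retrieve(query, database, short_bits=8, full_bits=64, tau=4):
    # Stage 1: the cheap short hash prunes most of the database.
    qs = ahash(query, short_bits)
    survivors = [img for img in database if hamming(ahash(img, short_bits), qs) <= 1]
    # Stage 2: the expensive full hash is applied only to the survivors.
    qf = ahash(query, full_bits)
    return [img for img in survivors if hamming(ahash(img, full_bits), qf) <= tau]

db = [[(i * k) % 17 for i in range(64)] for k in (1, 3, 5)]  # toy "images"
query = list(db[1])  # identical probe; real probes would be mildly distorted
print(cascade_retrieve(query, db) == [db[1]])  # → True
```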
NASA Astrophysics Data System (ADS)
Zhao, Tieyu; Ran, Qiwen; Yuan, Lin; Chi, Yingying; Ma, Jing
2015-09-01
In this paper, a novel image encryption system with fingerprint used as a secret key is proposed based on the phase retrieval algorithm and RSA public key algorithm. In the system, the encryption keys include the fingerprint and the public key of RSA algorithm, while the decryption keys are the fingerprint and the private key of RSA algorithm. If the users share the fingerprint, then the system will meet the basic agreement of asymmetric cryptography. The system is also applicable for the information authentication. The fingerprint as secret key is used in both the encryption and decryption processes so that the receiver can identify the authenticity of the ciphertext by using the fingerprint in decryption process. Finally, the simulation results show the validity of the encryption scheme and the high robustness against attacks based on the phase retrieval technique.
Privacy Preserving Facial and Fingerprint Multi-biometric Authentication
NASA Astrophysics Data System (ADS)
Anzaku, Esla Timothy; Sohn, Hosik; Ro, Yong Man
The cases of identity theft can be mitigated by the adoption of secure authentication methods. Biohashing and its variants, which utilize secret keys and biometrics, are promising methods for secure authentication; however, their shortcoming is the degraded performance under the assumption that secret keys are compromised. In this paper, we extend the concept of Biohashing to multi-biometrics - facial and fingerprint traits. We chose these traits because they are widely used, although little research attention has been given to designing privacy preserving multi-biometric systems using them. Instead of just using a single modality (facial or fingerprint), we present a framework for using both modalities. The improved performance of the proposed method, using face and fingerprint, as opposed to either trait used in isolation, is evaluated using two chimerical bimodal databases formed from publicly available facial and fingerprint databases.
A digital memories based user authentication scheme with privacy preservation.
Liu, JunLiang; Lyu, Qiuyun; Wang, Qiuhua; Yu, Xiangxiang
2017-01-01
The traditional username/password or PIN based authentication scheme, which still remains the most popular form of authentication, has been proved insecure, unmemorable and vulnerable to guessing, dictionary attacks, key-loggers, shoulder-surfing and social engineering. Based on this, a large number of alternative methods have recently been proposed. However, most of them rely on users being able to accurately recall complex and unmemorable information or on extra hardware (such as a USB key), which makes authentication more difficult and confusing. In this paper, we propose a Digital Memories based user authentication scheme adopting homomorphic encryption and a public key encryption design which can protect users' privacy effectively, prevent tracking and provide multi-level security in an Internet & IoT environment. Also, we prove the superior reliability and security of our scheme compared to other schemes and present a performance analysis and promising evaluation results. PMID:29190659
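The paper's exact construction is not reproduced here, but the additive homomorphism such schemes rely on can be sketched with a toy Paillier cryptosystem (tiny illustrative parameters; real deployments use 2048-bit moduli and fresh random r): a server can combine encrypted match scores without ever seeing the underlying digital-memory answers.

```python
from math import lcm

def l_func(x, n):
    return (x - 1) // n

# Tiny Paillier keypair (illustrative sizes only).
p, q = 17, 19
n = p * q                  # public modulus
n2 = n * n
g = n + 1                  # standard generator choice
lam = lcm(p - 1, q - 1)    # private key
mu = pow(l_func(pow(g, lam, n2), n), -1, n)

def encrypt(m, r):
    # r must be coprime to n; it randomizes the ciphertext.
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (l_func(pow(c, lam, n2), n) * mu) % n

# Additive homomorphism: multiplying ciphertexts adds plaintexts.
c = (encrypt(20, 7) * encrypt(30, 11)) % n2
print(decrypt(c))  # → 50
```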
ESnet authentication services and trust federations
NASA Astrophysics Data System (ADS)
Muruganantham, Dhivakaran; Helm, Mike; Genovese, Tony
2005-01-01
ESnet provides authentication services and trust federation support for SciDAC projects, collaboratories, and other distributed computing applications. The ESnet ATF team operates the DOEGrids Certificate Authority, available to all DOE Office of Science programs, plus several custom CAs, including one for the National Fusion Collaboratory and one for NERSC. The secure hardware and software environment developed to support CAs is suitable for supporting additional custom authentication and authorization applications that your program might require. Seamless, secure interoperation across organizational and international boundaries is vital to collaborative science. We are fostering the development of international PKI federations by founding the TAGPMA, the American regional PMA, and the worldwide IGTF Policy Management Authority (PMA), as well as participating in European and Asian regional PMAs. We are investigating and prototyping distributed authentication technology that will allow us to support the "roaming scientist" (distributed wireless via eduroam), as well as more secure authentication methods (one-time password tokens).
Learning Global Leadership via Liberation Projects: An Interdisciplinary Application
ERIC Educational Resources Information Center
Nguyen, Shelbee
2014-01-01
Global leadership programs framed within singular cultural contexts do not promote authentic leadership. Unilateral methods may exclude individual multicultural experiences or identities, and further, encourage a one-size fits all approach to leadership. An interdisciplinary global leadership course aims to promote authentic unlearning and…
High Resolution Melting (HRM) applied to wine authenticity.
Pereira, Leonor; Gomes, Sónia; Castro, Cláudia; Eiras-Dias, José Eduardo; Brazão, João; Graça, António; Fernandes, José R; Martins-Lopes, Paula
2017-02-01
Wine authenticity methods are in increasing demand mainly in Denomination of Origin designations. The DNA-based methodologies are a reliable means of tracking food/wine varietal composition. The main aim of this work was the study of High Resolution Melting (HRM) application as a screening method for must and wine authenticity. Three sample types (leaf, must and wine) were used to validate the three developed HRM assays (Vv1-705bp; Vv2-375bp; and Vv3-119bp). The Vv1 HRM assay was only successful when applied to leaf and must samples. The Vv2 HRM assay successfully amplified all sample types, allowing genotype discrimination based on melting temperature values. The smallest amplicon, Vv3, produced a coincident melting curve shape in all sample types (leaf and wine) with corresponding genotypes. This study presents sensitive, rapid and efficient HRM assays applied for the first time to wine samples suitable for wine authenticity purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.
An authentic imaging probe to track cell fate from beginning to end.
Lee, Seung Koo; Mortensen, Luke J; Lin, Charles P; Tung, Ching-Hsuan
2014-10-17
Accurate tracing of cell viability is critical for optimizing delivery methods and evaluating the efficacy and safety of cell therapeutics. A nanoparticle-based cell tracker is developed to image cell fate from live to dead. The particle is fabricated from two types of optically quenched polyelectrolytes, a life indicator and a death indicator, through electrostatic interactions. On incubation with cells, the fabricated bifunctional nanoprobes are taken up efficiently and the first colour is produced by normal intracellular proteolysis, reflecting the healthy status of the cells. Depending on the number of coated layers, the signal can persist for several replication cycles. However, as the cells begin dying, the second colour appears quickly to reflect the new cell status. Using this chameleon-like cell tracker, live cells can be distinguished from apoptotic and necrotic cells instantly and definitively.
A new approach to pre-processing digital image for wavelet-based watermark
NASA Astrophysics Data System (ADS)
Agreste, Santa; Andaloro, Guido
2008-11-01
The growth of the Internet has increased the phenomenon of digital piracy of multimedia objects such as software, images, video, audio and text. It is therefore strategic to develop stable, computationally cheap methods and numerical algorithms that address these problems. We describe a digital watermarking algorithm for color image protection and authenticity: robust, not blind, and wavelet-based. The use of the Discrete Wavelet Transform is motivated by its good time-frequency features and good match with Human Visual System directives. These two combined elements are important for building an invisible and robust watermark. Moreover, our algorithm can work with any image, thanks to a pre-processing step that resizes the original image to fit the wavelet transform. The watermark signal is calculated in correlation with the image features and statistical properties. In the detection step we apply a re-synchronization between the original and watermarked image according to the Neyman-Pearson statistical criterion. In experiments on a large set of different images, the watermark proved resistant to geometric, filtering, and StirMark attacks with a low false alarm rate.
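A stripped-down version of the wavelet-domain, non-blind embed-and-detect loop can be sketched with a one-level Haar transform (our own minimal stand-in for the paper's DWT, pre-processing and Neyman-Pearson detector; the 8x8 image and the +/-1 watermark pattern are synthetic):

```python
import numpy as np

def haar2d(x):
    # One-level 2-D Haar transform: approximation plus three detail subbands.
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    return ((a + b + c + d) / 2, (a - b + c - d) / 2,
            (a + b - c - d) / 2, (a - b - c + d) / 2)

def ihaar2d(ll, lh, hl, hh):
    # Exact inverse of haar2d.
    x = np.empty((2 * ll.shape[0], 2 * ll.shape[1]))
    x[0::2, 0::2] = (ll + lh + hl + hh) / 2
    x[0::2, 1::2] = (ll - lh + hl - hh) / 2
    x[1::2, 0::2] = (ll + lh - hl - hh) / 2
    x[1::2, 1::2] = (ll - lh - hl + hh) / 2
    return x

rng = np.random.default_rng(42)
image = rng.uniform(0, 255, (8, 8))
watermark = rng.choice([-1.0, 1.0], (4, 4))  # key-dependent +/-1 pattern
alpha = 2.0                                  # embedding strength

ll, lh, hl, hh = haar2d(image)
marked = ihaar2d(ll, lh, hl, hh + alpha * watermark)

# Non-blind detection: subtract the original's coefficients and correlate
# the residual with the watermark pattern; ~1 means "watermark present".
residual = haar2d(marked)[3] - hh
score = np.sum(residual * watermark) / (alpha * watermark.size)
print(round(score, 6))  # → 1.0
```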
Wang, Jianji; Zheng, Nanning
2013-09-01
Fractal image compression (FIC) is an image coding technology based on the local similarity of image structure. It is widely used in many fields such as image retrieval, image denoising, image authentication, and encryption. FIC, however, suffers from high computational complexity in encoding. Although many schemes have been published to speed up encoding, they do not easily satisfy both the encoding time and the reconstructed image quality requirements. In this paper, a new FIC scheme is proposed based on the fact that the affine similarity between two blocks in FIC is equivalent to the absolute value of Pearson's correlation coefficient (APCC) between them. First, all blocks in the range and domain pools are chosen and classified using an APCC-based block classification method to increase the matching probability. Second, by sorting the domain blocks with respect to the APCCs between these domain blocks and a preset block in each class, the matching domain block for a range block can be searched in the selected domain set in which these APCCs are closer to the APCC between the range block and the preset block. Experimental results show that the proposed scheme can significantly speed up the encoding process in FIC while preserving the reconstructed image quality well.
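The equivalence the scheme exploits is concrete: two blocks are affinely similar exactly when the absolute Pearson correlation coefficient (APCC) between them is 1. A minimal sketch with toy 4-pixel blocks (not the paper's full classification machinery):

```python
def apcc(block_a, block_b):
    """Absolute value of Pearson's correlation coefficient between two
    flattened image blocks (the affine-similarity measure used in FIC)."""
    n = len(block_a)
    ma = sum(block_a) / n
    mb = sum(block_b) / n
    num = sum((a - ma) * (b - mb) for a, b in zip(block_a, block_b))
    den = (sum((a - ma) ** 2 for a in block_a)
           * sum((b - mb) ** 2 for b in block_b)) ** 0.5
    return abs(num / den) if den else 0.0

range_block = [10, 20, 30, 40]
domain_1 = [105, 90, 75, 60]   # affine image of range_block (-1.5x + 120)
domain_2 = [7, 1, 9, 2]        # unrelated block
print(apcc(range_block, domain_1) > apcc(range_block, domain_2))  # → True
```

In the full scheme, blocks are first classified by their APCC to a preset block, and the domain blocks in each class are sorted by that value, so the search only visits candidates whose APCC is near that of the range block.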
Line-scan system for continuous hand authentication
NASA Astrophysics Data System (ADS)
Liu, Xiaofeng; Kong, Lingsheng; Diao, Zhihui; Jia, Ping
2017-03-01
Increasing numbers of heavy machines and vehicles have come into service, giving rise to significant concern over protecting these high-security systems from misuse. Conventionally, authentication performed merely at the initial login may not be sufficient for detecting intruders throughout the operating session. To address this critical security flaw, a line-scan continuous hand authentication system with the appearance of an operating rod is proposed. Given that the operating rod is held throughout the operating period, it is a natural place to unobtrusively record personal characteristics for continuous monitoring. The ergonomics, in both physiological and psychological aspects, are fully considered. Under the shape constraints, a highly integrated line-scan sensor, a controller unit, and a gear motor with encoder are utilized. The system is suitable for both desktop and embedded platforms with a universal serial bus interface. The volume of the proposed system is smaller than 15% of current multispectral area-camera systems. Based on experiments on a database with 4000 images from 200 volunteers, a competitive equal error rate of 0.1179% is achieved, which is far more accurate than state-of-the-art continuous authentication systems using other modalities.
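The reported equal error rate can be understood operationally: sweep a decision threshold over genuine and impostor similarity scores until the false accept and false reject rates cross. A small sketch with made-up scores (not the paper's data):

```python
def equal_error_rate(genuine, impostor):
    """Return the average of FAR and FRR at the threshold where
    false accept rate and false reject rate are closest (the EER)."""
    best = (float("inf"), None)
    for t in sorted(set(genuine + impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)  # false accepts
        frr = sum(s < t for s in genuine) / len(genuine)     # false rejects
        best = min(best, (abs(far - frr), (far + frr) / 2))
    return best[1]

genuine = [0.70, 0.80, 0.90, 0.95]   # scores from enrolled operators
impostor = [0.10, 0.20, 0.30, 0.75]  # scores from intruders
print(equal_error_rate(genuine, impostor))  # → 0.25
```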
Kakio, Tomoko; Yoshida, Naoko; Macha, Susan; Moriguchi, Kazunobu; Hiroshima, Takashi; Ikeda, Yukihiro; Tsuboi, Hirohito; Kimura, Kazuko
2017-09-01
Analytical methods for the detection of substandard and falsified medical products (SFs) are important for public health and patient safety. How the physical and chemical properties of SFs can be most effectively applied to distinguish them from authentic products has not yet been sufficiently investigated. Here, we investigated the usefulness of two analytical methods, handheld Raman spectroscopy (handheld Raman) and X-ray computed tomography (X-ray CT), for detecting SFs among oral solid antihypertensive pharmaceutical products containing candesartan cilexetil as an active pharmaceutical ingredient (API). X-ray CT visualized at least two different types of falsified tablets, one containing many cracks and voids and the other containing aggregates with high electron density, suggesting the presence of heavy elements. Generic products that purported to contain amounts of API equivalent to the authentic products were discriminated from the authentic products by handheld Raman and by their different physical structure on X-ray CT. Investigating both the chemical properties (handheld Raman) and the physical properties (X-ray CT) promises accurate discrimination of SFs, even when their visual appearance is similar to that of authentic products. We present a decision tree for investigating the authenticity of samples purporting to be authentic commercial tablets. Our results indicate that the combination of visual observation, handheld Raman and X-ray CT is a powerful strategy for nondestructive discrimination of suspect samples.
Ali, Eaqub; Sultana, Sharmin; Hamid, Sharifah Bee Abd; Hossain, Motalib; Yehya, Wageeh A; Kader, Abdul; Bhargava, Suresh K
2018-06-13
Gelatin is a highly purified animal protein of pig, cow, and fish origins and is extensively used in food, pharmaceuticals, and personal care products. However, the acceptability of gelatin products greatly depends on the animal sources of the gelatin. Porcine and bovine gelatins have attractive features but limited acceptance because of religious prohibitions and potential zoonotic threats, whereas fish gelatin is welcomed in all religions and cultures. Thus, source authentication is a must for gelatin products, but it is greatly challenging because both protein and DNA biomarkers break down in processed gelatins. Several methods have therefore been proposed for gelatin identification, but a comprehensive and systematic document covering all of the techniques does not exist. This up-to-date review addresses this research gap and presents, in an accessible format, the major gelatin source authentication techniques, which are primarily nucleic acid and protein based. Rather than presenting these methods in dense paragraph form, the major methods are schematically depicted, and their comparative features are tabulated. Future technologies are forecasted, and challenges are outlined. Overall, this review can serve as a reference guide for the production and application of gelatin in academia and industry and as a platform for the development of improved methods for gelatin authentication.
Authentication Without Secrets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierson, Lyndon G.; Robertson, Perry J.
This work examines a new approach to authentication, the most fundamental security primitive underpinning all cyber security protections. Current Internet authentication techniques require the protection of one or more secret keys, along with integrity protection of the algorithms/computations designed to prove possession of the secret without actually revealing it. Protecting a secret requires physical barriers or encryption with yet another secret key. The reason to strive for "Authentication without Secret Keys" is that protecting secrets (even small ones kept only in a small corner of a component or device) is much harder than protecting the integrity of information that is not secret. Promising methods are examined for authentication of components, data, programs, network transactions, and/or individuals. The successful development of authentication without secret keys will enable far more tractable system security engineering for high-exposure, high-consequence systems by eliminating the need for brittle protection mechanisms to protect secret keys (such as are now protected in smart cards, etc.). This paper is a re-release of SAND2009-7032 with new figures and numerous edits.
USDA-ARS?s Scientific Manuscript database
This AOAC Standard Method Performance Requirements (SMPR) is for authentication of selected Vaccinium species in dietary ingredients and dietary supplements containing a single Vaccinium species using anthocyanin profiles. SMPRs describe the minimum recommended performance characteristics to be used...
Extended Password Recovery Attacks against APOP, SIP, and Digest Authentication
NASA Astrophysics Data System (ADS)
Sasaki, Yu; Wang, Lei; Ohta, Kazuo; Kunihiro, Noboru
In this paper, we propose password recovery attacks against challenge-response authentication protocols. Our attacks use a message difference for an MD5 collision attack proposed in IEICE 2008. First, we show how to efficiently find a message pair that collides with the above message difference. Second, we show that a password used in the authenticated post office protocol (APOP) can be recovered practically. We also show that the password recovery attack can be applied to the session initiation protocol (SIP) and digest authentication. Our attack can recover up to the first 31 password characters in a short time, and up to the first 60 characters faster than the naive search method. We have implemented our attack and confirmed that 31 characters can be successfully recovered.
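The challenge-response scheme attacked above can be made concrete. In APOP (RFC 1939), the client's response is the MD5 digest of the server's timestamp banner concatenated with the shared password; a minimal sketch, with an invented banner and password:

```python
import hashlib

def apop_digest(server_banner: str, password: str) -> str:
    """APOP response: hex MD5 over the server's timestamp banner
    concatenated with the shared password (RFC 1939)."""
    return hashlib.md5((server_banner + password).encode()).hexdigest()

# Invented banner and password, for illustration only:
banner = "<12345.67890@mail.example.org>"
response = apop_digest(banner, "secret")
print(len(response))  # 32 hex characters
```

Because the password is hashed together with server-chosen data and MD5 is collision-prone, a malicious server can craft banners whose collisions leak password characters one at a time, which is the basis of the recovery attack described.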
[Movies as a teaching resource for infectious diseases and clinical microbiology].
García-Sánchez, José Elías; Fresnadillo, María José; García-Sánchez, Enrique
2002-10-01
Since its inception, the cinema has constantly provided a reflection of infectious diseases because of their omnipresence in life and their importance to individuals and society. Few infectious diseases escape its eye, to the extent that the cinema constitutes an authentic treatise on these phenomena. The cinema is a very valuable educational resource, able to supplement classical teaching methods and to encourage critical thinking among students. The enormous flow of information, images, sounds, consequences, situations, and points of view that it provides should not be wasted and can be of great use, both in the spread of ideas and in training in infectious diseases and clinical microbiology.
Content fragile watermarking for H.264/AVC video authentication
NASA Astrophysics Data System (ADS)
Ait Sadi, K.; Guessoum, A.; Bouridane, A.; Khelifi, F.
2017-04-01
The advances in multimedia technologies and digital processing tools have brought with them new challenges for source and content authentication. To ensure the integrity of the H.264/AVC video stream, we introduce an approach based on a content-fragile video watermarking method that authenticates each group of pictures (GOP) within the video independently. The discrete cosine transform is exploited to generate the authentication data, which are treated as a fragile watermark embedded in the motion vectors. The technique uses robust visual features extracted from the video pertaining to the set of selected macroblocks (MBs) which hold the best partition mode in a tree-structured motion compensation process. An additional degree of security is offered by using the keyed function HMAC-SHA-256 and by randomly choosing candidates from the already selected MBs. Here, the watermark detection and verification processes are blind, whereas tampered-frame detection is not, since it needs the original frames within the tampered GOPs. The proposed scheme achieves accurate authentication with high fragility and fidelity whilst maintaining the original bitrate and perceptual quality. Furthermore, its ability to detect tampered frames under spatial, temporal and colour manipulations is confirmed.
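The keyed step can be sketched as per-GOP authentication data computed as an HMAC-SHA-256 tag over extracted features, truncated to a watermark payload. The feature bytes, key, and payload size below are hypothetical placeholders, not the paper's actual feature extraction:

```python
import hmac
import hashlib

def gop_watermark(features: bytes, key: bytes, n_bits: int = 64) -> str:
    """Keyed authentication data for one group of pictures (GOP):
    HMAC-SHA-256 over the extracted visual features, truncated to the
    payload size that fits the selected motion vectors."""
    tag = hmac.new(key, features, hashlib.sha256).digest()
    bits = "".join(f"{b:08b}" for b in tag)
    return bits[:n_bits]

key = b"session-key"                     # hypothetical shared key
features = b"macroblock-feature-vector"  # placeholder for extracted features
watermark = gop_watermark(features, key)
print(len(watermark))  # 64
```

Without the key, a forger cannot regenerate a valid tag for modified content, which is what makes the embedded watermark fragile to tampering.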
Nonintrusive multibiometrics on a mobile device: a comparison of fusion techniques
NASA Astrophysics Data System (ADS)
Allano, Lorene; Morris, Andrew C.; Sellahewa, Harin; Garcia-Salicetti, Sonia; Koreman, Jacques; Jassim, Sabah; Ly-Van, Bao; Wu, Dalei; Dorizzi, Bernadette
2006-04-01
In this article we test a number of score fusion methods for the purpose of multimodal biometric authentication. These tests were made for the SecurePhone project, whose aim is to develop a prototype mobile communication system enabling biometrically authenticated users to conclude legally binding m-contracts during a mobile phone call on a PDA. The three biometrics of voice, face and signature were selected because they are all traditional, non-intrusive and easy-to-use means of authentication which can readily be captured on a PDA. By combining multiple biometrics of relatively low security, it may be possible to obtain a combined level of security which is at least as high as that provided by a PIN or handwritten signature, traditionally used for user authentication. As the relative success of different fusion methods depends on the database used and the tests made, the database we used was recorded on a suitable PDA (the Qtek2020) and the test protocol was designed to reflect the intended application scenario, which is expected to use short text prompts. Not all of the fusion methods tested are original; they were selected for their suitability for implementation within the constraints imposed by the application. All of the methods tested are based on fusion of the match scores output by each modality. Though computationally simple, all four fusion methods tested showed very promising results, each obtaining a significant performance increase.
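Match-score fusion of this kind can be sketched as min-max normalization followed by a weighted sum. The scores and weights below are hypothetical, and the project's own fusion methods may differ in detail:

```python
def min_max_normalize(scores, lo, hi):
    """Map raw matcher scores to [0, 1] given the score range seen in training."""
    return [(s - lo) / (hi - lo) for s in scores]

def weighted_sum_fusion(modality_scores, weights):
    """Fuse one normalized score per modality with fixed weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * s for w, s in zip(modality_scores, weights))

# Hypothetical normalized scores for voice, face and signature:
fused = weighted_sum_fusion([0.8, 0.6, 0.9], [0.4, 0.3, 0.3])
```

A single threshold on the fused score then decides accept/reject, so a weak score in one modality can be compensated by strong scores in the others.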
The Mindful School: How To Assess Thoughtful Outcomes. K-College.
ERIC Educational Resources Information Center
Burke, Kay
Authentic assessment, as referred to in this book, encompasses meaningful tasks, positive interaction between teachers and students, methods that emphasize higher-order thinking skills, and strategies that allow students to plan, monitor, and evaluate their own learning. Most important, authentic assessment means helping students to apply and…
Laptop Computers in the Elementary Classroom: Authentic Instruction with At-Risk Students
ERIC Educational Resources Information Center
Kemker, Kate; Barron, Ann E.; Harmes, J. Christine
2007-01-01
This case study investigated the integration of laptop computers into an elementary classroom in a low socioeconomic status (SES) school. Specifically, the research examined classroom management techniques and aspects of authentic learning relative to the student projects and activities. A mixed methods approach included classroom observations,…
Han, Ruizhen; He, Yong; Liu, Fei
2012-01-01
This paper presents a feasibility study of a real-time, in-field pest classification system based on a Blackfin DSP and 3G wireless communication technology. The prototype system is composed of a remote on-line classification platform (ROCP), which uses a digital signal processor (DSP) as its core CPU, and a host control platform (HCP). The ROCP is in charge of acquiring the pest image, extracting image features and detecting the class of pest using an artificial neural network (ANN) classifier. At the same time, it sends the image data, encoded with JPEG 2000 in the DSP, to the HCP through the 3G network for further identification. Image transmission and communication are accomplished via a commercial base station, so the system works wherever base-station coverage is available, regardless of the distance between the ROCP and the HCP. In the HCP, the image data are decoded and the pest image is displayed in real time for further identification. Authentication and performance tests of the prototype system were conducted. The authentication test showed that the image data were transmitted correctly. Based on the performance test results on six classes of pests, the average accuracy is 82%. Considering the different poses of live pests and the varying field lighting conditions, this result is satisfactory. The proposed technique is well suited for on-line, in-field pest classification for precision agriculture. PMID:22736996
Simultaneous storage of medical images in the spatial and frequency domain: A comparative study
Nayak, Jagadish; Bhat, P Subbanna; Acharya U, Rajendra; UC, Niranjan
2004-01-01
Background Digital watermarking is a technique of hiding specific identification data for copyright authentication. This technique is adapted here for interleaving patient information with medical images, to reduce storage and transmission overheads. Methods The patient information is encrypted before interleaving with images to ensure greater security. The bio-signals are compressed and subsequently interleaved with the image. This interleaving is carried out in the spatial domain and in the frequency domain. The performance of interleaving in the spatial, Discrete Fourier Transform (DFT), Discrete Cosine Transform (DCT) and Discrete Wavelet Transform (DWT) coefficients is studied. Differential pulse code modulation (DPCM) is employed for data compression as well as encryption, and results are tabulated for a specific example. Results The results show that the process does not affect picture quality. This is attributed to the fact that changing the LSB of a pixel alters its brightness by only 1 part in 256. Spatial and DFT domain interleaving gave much lower %NRMSE than the DCT and DWT domains. Conclusion For spatial domain interleaving, the %NRMSE was less than 0.25% for 8-bit encoded pixel intensities. Among the frequency domain interleaving methods, DFT was found to be very efficient. PMID:15180899
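The spatial-domain interleaving rests on the LSB property quoted above: overwriting the least significant bit changes an 8-bit intensity by at most 1. A minimal sketch (the payload bits and pixel values are invented):

```python
def embed_lsb(pixels, payload_bits):
    """Interleave payload bits into the least significant bits of 8-bit
    pixel intensities; each change alters brightness by at most 1/256."""
    out = list(pixels)
    for i, bit in enumerate(payload_bits):
        out[i] = (out[i] & 0xFE) | bit
    return out

def extract_lsb(pixels, n_bits):
    """Recover the first n_bits payload bits from the pixel LSBs."""
    return [p & 1 for p in pixels[:n_bits]]

pixels = [120, 121, 200, 55, 17, 94]
bits = [1, 0, 1, 1]
stego = embed_lsb(pixels, bits)
assert extract_lsb(stego, 4) == bits
```

In the paper's scheme the payload would be the encrypted, compressed patient record rather than raw bits, but the per-pixel mechanics are the same.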
Liu, Yang; Wang, Xiao-Yue; Wei, Xue-Min; Gao, Zi-Tong; Han, Jian-Ping
2018-05-22
Species adulteration in herbal products (HPs) exposes consumers to health risks. Chemical and morphological methods have their own deficiencies when detecting species that contain the same active compounds in HPs. In this study, we developed a rapid identification method using a recombinase polymerase amplification assay combined with lateral flow strips (RPA-LFS) to detect two species, Ginkgo biloba and Sophora japonica (as adulterant), in Ginkgo biloba HPs. Among 36 Ginkgo biloba HP samples, 34 were found to contain Ginkgo biloba sequences, and 9 to contain Sophora japonica sequences. During the authentication process, the RPA-LFS assay showed higher specificity, sensitivity and efficiency than PCR-based methods. This initial application of the RPA-LFS technique to detect plant species in HPs demonstrates that the assay can be developed into an efficient tool for rapid on-site authentication of plant species in Ginkgo biloba HPs.
Robust Speaker Authentication Based on Combined Speech and Voiceprint Recognition
NASA Astrophysics Data System (ADS)
Malcangi, Mario
2009-08-01
Personal authentication is becoming increasingly important in many applications that have to protect proprietary data. Passwords and personal identification numbers (PINs) prove not to be robust enough to ensure that unauthorized people do not use them. Biometric authentication technology may offer a secure, convenient, accurate solution but sometimes fails due to its intrinsically fuzzy nature. This research aims to demonstrate that combining two basic speech processing methods, voiceprint identification and speech recognition, can provide a very high degree of robustness, especially if fuzzy decision logic is used.
NASA Astrophysics Data System (ADS)
Boling, M. E.
1989-09-01
Prototypes were assembled pursuant to recommendations made in report K/DSRD-96, Issues and Approaches for Electronic Document Approval and Transmittal Using Digital Signatures and Text Authentication, and to examine and discover the possibilities for integrating available hardware and software to provide cost-effective systems for digital signatures and text authentication. These prototypes show that on a LAN, a multitasking, windowed, mouse/keyboard menu-driven interface can be assembled to provide easy and quick access to bit-mapped images of documents, electronic forms and electronic mail messages, with a means to sign, encrypt, deliver, receive or retrieve and authenticate text and signatures. In addition they show that some of this same software may be used in a classified environment using host-to-terminal transactions to accomplish these same operations. Finally, a prototype was developed demonstrating that binary files may be signed electronically and sent by point-to-point communication and over ARPANET to remote locations where the authenticity of the code and signature may be verified. Related studies on the subject of electronic signatures and text authentication using public key encryption were done within the Department of Energy. These studies include timing studies of public key encryption software and hardware and testing of experimental user-generated host-resident software for public key encryption. This software used commercially available command-line source code. These studies are responsive to an initiative within the Office of the Secretary of Defense (OSD) for the protection of unclassified but sensitive data. It is notable that these related studies are all built around the same commercially available public key encryption products from the private sector and that the software selection was made independently by each study group.
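The sign/verify flow exercised by these prototypes can be illustrated with a toy RSA signature over a message hash. The textbook-sized parameters below are for illustration only; any real system would use a vetted public-key library with full-size keys:

```python
import hashlib

# Toy RSA parameters (textbook-sized; illustration only).
p, q = 61, 53
n = p * q              # 3233
e = 17                 # public exponent
d = 2753               # private exponent: inverse of e mod lcm(p-1, q-1)

def sign(message: bytes) -> int:
    """Sign the SHA-256 hash of the message (reduced mod n for this toy modulus)."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Recover the hash from the signature with the public key and compare."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"approve document 42")
assert verify(b"approve document 42", sig)                # authentic signature verifies
assert not verify(b"approve document 42", (sig + 1) % n)  # altered signature fails
```

Anyone holding the public pair (n, e) can authenticate the file and signature, which is what allows verification at remote locations without sharing the signing key.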
Li, Tao; Su, Chen
2018-06-02
Rhodiola is an increasingly widely used traditional Tibetan and Chinese medicine in China. The composition profiles of its bioactive compounds vary considerably among species, which makes accurate identification of authentic Rhodiola species crucial for the safe clinical application of Rhodiola. In this paper, a nondestructive, rapid, and efficient method for classifying Rhodiola was developed using Fourier transform near-infrared (FT-NIR) spectroscopy combined with chemometric analysis. A total of 160 batches of raw spectra were obtained by FT-NIR from four species: Rhodiola crenulata, Rhodiola fastigiata, Rhodiola kirilowii, and Rhodiola brevipetiolata. After excluding outliers, the performance of 3 sample-dividing methods, 12 spectral preprocessing methods, 2 wavelength selection methods, and 2 modeling evaluation methods was compared. The best-performing combination for authenticity identification was FT-NIR with sample set partitioning based on joint x-y distances (SPXY), standard normal variate transformation (SNV) + Norris-Williams (NW) + 2nd derivative, competitive adaptive reweighted sampling (CARS), and a kernel extreme learning machine (KELM). The accuracy (ACCU), sensitivity (SENS), and specificity (SPEC) of the optimal model were all 1, showing that this combination of FT-NIR and chemometric methods had the best authenticity-identification performance. The classification performance of the partial least squares discriminant analysis (PLS-DA) model was slightly lower than that of the KELM model, with ACCU = 0.97, SENS = 0.93, and SPEC = 0.98, respectively.
It can be concluded that FT-NIR combined with chemometric analysis has great potential for authenticity identification and classification of Rhodiola, providing a valuable reference for the safe and effective clinical application of Rhodiola. Copyright © 2018 Elsevier B.V. All rights reserved.
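The three reported figures of merit can be reproduced from binary confusion-matrix counts; the counts below are hypothetical, not the paper's data:

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy (ACCU), sensitivity (SENS) and specificity (SPEC)
    from binary confusion-matrix counts."""
    accu = (tp + tn) / (tp + tn + fp + fn)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return accu, sens, spec

# Hypothetical counts for one Rhodiola species versus the rest:
accu, sens, spec = classification_metrics(tp=28, tn=89, fp=2, fn=1)
```

For a multi-class model such as the four-species classifier here, these metrics are typically computed one-vs-rest per species and then averaged.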
A Fingerprint Encryption Scheme Based on Irreversible Function and Secure Authentication
Yang, Yijun; Yu, Jianping; Zhang, Peng; Wang, Shulan
2015-01-01
A fingerprint encryption scheme based on an irreversible function is designed in this paper. Since the fingerprint template includes almost all the information of a user's fingerprint, personal authentication can be determined by the fingerprint features alone. This paper proposes an irreversible transforming function (using the improved SHA1 algorithm) to transform the original minutiae extracted from the thinned fingerprint image. The Chinese remainder theorem is then used to obtain the biokey from the integration of the transformed minutiae and the private key. The results show that the scheme performs better in security and efficiency than other irreversible-function schemes. PMID:25873989
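The two building blocks, a one-way transform of the minutiae and a Chinese-remainder-theorem combination with a private key, can be sketched as follows. The helper names, toy moduli, and sample minutia are invented for illustration, and the paper's improved SHA1 variant is replaced here by plain SHA-1:

```python
import hashlib

def irreversible_transform(minutia: tuple) -> int:
    """One-way transform of a minutia (x, y, angle) via SHA-1;
    the original coordinates cannot be recovered from the digest."""
    data = ",".join(map(str, minutia)).encode()
    return int.from_bytes(hashlib.sha1(data).digest(), "big")

def crt_combine(residues, moduli):
    """Chinese remainder theorem: the unique x mod prod(moduli) with
    x = r_i (mod m_i) for pairwise-coprime moduli."""
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # pow(., -1, m): modular inverse
    return x % M

# Hypothetical: bind a transformed minutia to a private-key share.
m1, m2 = 10007, 10009          # coprime moduli (toy sizes)
t = irreversible_transform((103, 212, 45)) % m1
k = 1234 % m2                  # private-key share
biokey = crt_combine([t, k], [m1, m2])
assert biokey % m1 == t and biokey % m2 == k
```

The CRT residues can be recovered from the biokey, but the minutia coordinates themselves stay protected behind the one-way hash.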
Facilitating and securing offline e-medicine service through image steganography
Islam, M. Mahfuzul
2014-01-01
E-medicine is a process of providing health care services to people using the Internet or any networking technology. In this Letter, a new idea is proposed to model the physical structure of the e-medicine system to better provide offline health care services. Smart cards are used to authenticate each user individually. A unique technique is also suggested to verify the card owner's identity and to embed secret data in the card while providing patients' reports, either at booths or at the e-medicine server system. The simulation results of the card authentication and embedding procedures justify the proposed implementation. PMID:26609382
Ninth Grade Student Responses to Authentic Science Instruction
NASA Astrophysics Data System (ADS)
Ellison, Michael Steven
This mixed-methods case study documents an effort to implement authentic science and engineering instruction in one teacher's ninth grade science classrooms in a science-focused public school. The research framework and methodology is a derivative of work developed and reported by Newmann and others (Newmann & Associates, 1996). Based on a working definition of authenticity, data were collected for eight months on the authenticity in the experienced teacher's pedagogy and in student performance. Authenticity was defined as the degree to which a classroom lesson, an assessment task, or an example of student performance demonstrates construction of knowledge through use of the meaning-making processes of science and engineering, and has some value to students beyond demonstrating success in school (Wehlage et al., 1996). Instruments adapted for this study produced a rich description of the authenticity of the teacher's instruction and student performance. The pedagogical practices of the classroom teacher were measured as moderately authentic on average. However, the authenticity model revealed the teacher's strategy of interspersing relatively low-authenticity instructional units focused on building science knowledge with much higher-authenticity tasks requiring students to apply these concepts and skills. The authenticity of the construction-of-knowledge and science meaning-making components of authentic pedagogy was found to be greater than the authenticity of affordances for students to find value in classroom activities beyond demonstrating success in school. Instruction frequently included one aspect of value beyond school, connections to the world outside the classroom, but students were infrequently afforded the opportunity to present their classwork to audiences beyond the teacher.
When the science instruction in the case was measured to afford a greater level of authentic intellectual work, a higher level of authentic student performance on science classwork was also measured. In addition, direct observation measures of student behavioral engagement showed that behavioral engagement was generally high, but not associated with the authenticity of the pedagogy. Direct observation measures of student self-regulation found evidence that when instruction focused on core science and engineering concepts and made stronger connections to the student's world beyond the classroom, student self-regulated learning was greater, and included evidence of student ownership. In light of the alignment between the model of authenticity used in this study and the Next Generation Science Standards (NGSS), the results suggest that further research on the value beyond school component of the model could improve understanding of student engagement and performance in response to the implementation of the NGSS. In particular, it suggests a unique role environmental education can play in affording student success in K-12 science and a tool to measure that role.
Separation of high-resolution samples of overlapping latent fingerprints using relaxation labeling
NASA Astrophysics Data System (ADS)
Qian, Kun; Schott, Maik; Schöne, Werner; Hildebrandt, Mario
2012-06-01
The analysis of latent fingerprint patterns generally requires clearly recognizable friction ridge patterns. Currently, overlapping latent fingerprints pose a major problem for traditional crime scene investigation, because such fingerprints usually have very similar optical properties; consequently, distinguishing two or more overlapping fingerprints from each other is not trivially possible. While chemical imaging can be employed to separate overlapping fingerprints, the corresponding approaches require sophisticated acquisition methods and are not compatible with conventional forensic fingerprint data. A separation technique based purely on the local orientation of the ridge patterns of overlapping fingerprints was proposed by Chen et al. and quantitatively evaluated using off-the-shelf fingerprint matching software with mostly artificially composed overlapping fingerprint samples, a choice motivated by the scarce availability of authentic test samples. The work described in this paper adapts the approach of Chen et al. for application to authentic high-resolution fingerprint samples acquired by a contactless measurement device based on a Chromatic White Light (CWL) sensor. An evaluation of the work is also given, with an analysis of all adapted parameters. Additionally, the separability requirement proposed by Chen et al. is evaluated for practical feasibility. Our results show promising tendencies for the application of this approach to high-resolution data, yet the separability requirement still poses a further challenge.
Performance evaluation of wavelet-based face verification on a PDA recorded database
NASA Astrophysics Data System (ADS)
Sellahewa, Harin; Jassim, Sabah A.
2006-05-01
The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and with a touch-sensitive display panel. Besides convenience, such devices provide an adequately secure infrastructure for sensitive and financial transactions, by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios where communication from beyond enemy lines is essential to save soldiers' and civilians' lives. In areas of conflict or disaster, the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.
A cryptologic based trust center for medical images.
Wong, S T
1996-01-01
To investigate practical solutions that can integrate cryptographic techniques and picture archiving and communication systems (PACS) to improve the security of medical images. The PACS at the University of California San Francisco Medical Center consolidate images and associated data from various scanners into a centralized data archive and transmit them to remote display stations for review and consultation purposes. The purpose of this study is to investigate the model of a digital trust center that integrates cryptographic algorithms and protocols seamlessly into such a digital radiology environment to improve the security of medical images. The timing performance of encryption, decryption, and transmission of the cryptographic protocols over 81 volumetric PACS datasets has been measured. Lossless data compression is also applied before the encryption. The transmission performance is measured against three types of networks of different bandwidths: narrow-band Integrated Services Digital Network, Ethernet, and OC-3c Asynchronous Transfer Mode. The proposed digital trust center provides a cryptosystem solution to protect the confidentiality and to determine the authenticity of digital images in hospitals. The results of this study indicate that diagnostic images such as x-rays and magnetic resonance images could be routinely encrypted in PACS. However, applying encryption in teleradiology and PACS is a tradeoff between communications performance and security measures. Many people are uncertain about how to integrate cryptographic algorithms coherently into existing operations of the clinical enterprise. This paper describes a centralized cryptosystem architecture to ensure image data authenticity in a digital radiology department. The system performance has been evaluated in a hospital-integrated PACS environment.
Authentication of medicinal herbs using PCR-amplified ITS2 with specific primers.
Chiou, Shu-Jiau; Yen, Jui-Hung; Fang, Cheng-Li; Chen, Hui-Ling; Lin, Tsai-Yun
2007-10-01
Different parts of medicinal herbs have long been used as traditional Chinese drugs for treating many diseases, whereas materials of similar morphology and chemical fingerprints are often misidentified. Analysis of sequence variations in the internal transcribed spacer (ITS) of nuclear ribosomal DNA (rDNA) has become a valid method for authentication of medicinal herbs at the intergeneric and interspecific levels. DNA extracted from processed materials is usually severely degraded or contaminated by microorganisms, thus generating no or unexpected PCR products. The goal of this study is to apply ITS fragments selectively amplified with two designed primer sets for efficient and precise authentication of medicinal herbs. The designed primers led to an accurate PCR product of the specific region in ITS2, which was confirmed with DNA extracted from 55 processed medicinal herbs belonging to 48 families. Moreover, the selectively amplified ITS2 authenticated five sets of easily confused Chinese herbal materials. The designed primers were proven suitable for broad application in the authentication of herbal materials.
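Selective amplification can be mimicked in silico by locating the region a primer pair would bracket in a template sequence; the toy sequences below are invented and are not the paper's ITS2 primers:

```python
def find_amplicon(template: str, fwd: str, rev_comp: str):
    """Return the region a primer pair would amplify: from the
    forward-primer match through the downstream site matching the
    reverse complement of the reverse primer, or None if either
    primer fails to bind."""
    i = template.find(fwd)
    if i < 0:
        return None
    j = template.find(rev_comp, i + len(fwd))
    if j < 0:
        return None
    return template[i:j + len(rev_comp)]

# Invented toy template and primer sites, for illustration only:
template = "GGGATCGGACACACATTAGCCAA"
amplicon = find_amplicon(template, "ATCGGA", "TTAGCC")
```

Primer specificity works the same way in principle: sequences from non-target species lack one or both binding sites, so no product is generated.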
Disambiguating authenticity: Interpretations of value and appeal
O’Connor, Kieran; Carroll, Glenn R.; Kovács, Balázs
2017-01-01
While shaping aesthetic judgment and choice, socially constructed authenticity takes on some very different meanings among observers, consumers, producers and critics. Using a theoretical framework positing four distinct meanings of socially constructed authenticity (type, moral, craft, and idiosyncratic), we aim to document empirically the unique appeal of each type. We develop predictions about the relationships between attributed authenticity and corresponding increases in the value ascribed to it through: (1) consumer value ratings, (2) willingness to pay, and (3) behavioral choice. We report empirical analyses from a research program of three multi-method studies using (1) archival data from voluntary consumer evaluations of restaurants in an online review system, (2) a university-based behavioral lab experiment, and (3) an online survey-based experiment. Evidence is consistent across the studies and suggests that perceptions of four distinct subtypes of socially constructed authenticity generate increased appeal and value even after controlling for option quality. Findings suggest additional directions for research on authenticity. PMID:28650997
Quinlan, Elizabeth; Thomas, Roanne; Ahmed, Shahid; Fichtner, Pam; McMullen, Linda; Block, Janice
2014-01-01
The use of popular expressive arts as antidotes to the pathologies of the parallel processes of lifeworld colonization and cultural impoverishment has been under-theorized. This article enters the void with a project in which breast cancer survivors used collages and installations of everyday objects to solicit their authentic expression of the psycho-social impacts of lymphedema. The article enlists Jurgen Habermas' communicative action theory to explore the potential of these expressive arts to expand participants' meaningful engagement with their lifeworlds. The findings point to the unique non-linguistic discursivity of these non-institutional artistic forms as their liberating power to disclose silenced human needs: the images ‘spoke' for themselves for group members to recognize shared subjectivities. The authenticity claims inherent in the art forms fostered collective reflexivity and spontaneous, affective responses and compelled the group to create new collective understandings of the experience of living with lymphedema. The article contributes theoretical insights regarding the emancipatory potential of aesthetic-expressive rationality, an under-developed area of Habermasian theory of communicative action, and to the burgeoning literature on arts-based methods in social scientific research. PMID:25197263
ERIC Educational Resources Information Center
French, Debbie Ann
2016-01-01
In this dissertation, the researcher describes authentic scientific inquiry (ASI) within three stages of teacher preparation and development: (a) undergraduate STEM courses, (b) preservice secondary science education methods courses, and (c) inservice teacher professional development (PD). Incorporating ASI--pedagogy closely modeling the…
Engaging Life-Sciences Students with Mathematical Models: Does Authenticity Help?
ERIC Educational Resources Information Center
Poladian, Leon
2013-01-01
Compulsory mathematics service units for the life sciences present unique challenges: even students who learn some specific skills maintain a negative attitude to mathematics and do not see the relevance of the unit towards their degree. The focus on authentic content and the presentation and teaching of global or qualitative methods before…
ERIC Educational Resources Information Center
Yunus, Faridah
2014-01-01
Authentic assessment approach applies naturalistic observation method to gather and analyse data about children's development that are socio-culturally appropriate to plan for individual teaching and learning needs. This article discusses the process of adapting an authentic developmental instrument for children of 3-6 years old. The instrument…
Arabic Language Teachers and Islamic Education Teachers' Awareness of Authentic Assessment in Jordan
ERIC Educational Resources Information Center
Al-Basheer, Akram; Ashraah, Mamdouh; Alsmadi, Rana
2015-01-01
This study aimed at investigating Islamic Education teachers' and Arabic Language teachers' perceptions of authentic assessment in Jordan, and exploring the effects of factors related to teachers' specialization, gender and years of experience on their understanding of the implications of this kind of assessment. In this mixed-method research, a…
The Strategy Project: Promoting Self-Regulated Learning through an Authentic Assignment
ERIC Educational Resources Information Center
Steiner, Hillary H.
2016-01-01
Success in college requires the development of self-regulated learning strategies that move beyond high school skills. First-year students of all ability levels benefit when given instruction in how to use these strategies in an authentic context. This paper presents an instructional method that requires deliberate practice of self-regulated…
ERIC Educational Resources Information Center
Lepp, Margret; Zorn, CeCelia R.; Duffy, Patricia R.; Dickson, Rana J.
2003-01-01
A nursing course connected U.S. and Swedish sites via interactive videoconferencing and used reflective methods (journaling, drama, photo language) and off-air group discussion. Evaluation by five Swedish and seven U.S. students suggested how reflection moved students toward greater authenticity and professionalism in nursing practice. (Contains…
Next Generation Trusted Radiation Identification System (NG-TRIS).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flynn, Adam J.; Amai, Wendy A.; Merkle, Peter Benedict
2010-05-01
The original Trusted Radiation Identification System (TRIS) was developed from 1999-2001, featuring information barrier technology to collect gamma radiation template measurements useful for arms control regime operations. The first TRIS design relied upon a multichannel analyzer (MCA) that was external to the protected volume of the system enclosure, which was undesirable from a system security perspective. An internal complex programmable logic device (CPLD) contained data that were not subject to software authentication. Physical authentication of the TRIS instrument case was performed by a sensitive but slow eddy-current inspection method. This paper describes progress to date for the Next Generation TRIS (NG-TRIS), which improves the TRIS design. We have incorporated the MCA internal to the trusted system volume, achieved full authentication of CPLD data, and devised rapid methods to authenticate the system enclosure and weld seals of the NG-TRIS enclosure. For a complete discussion of the TRIS system and components upon which NG-TRIS is based, the reader is directed to the comprehensive user's manual and system reference of Seager, et al.
Novel Authentication of Monitoring Data Through the use of Secret and Public Cryptographic Keys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benz, Jacob M.; Tolk, Keith; Tanner, Jennifer E.
The Office of Nuclear Verification (ONV) is supporting the development of a piece of equipment to provide data authentication and protection for a suite of monitoring sensors as part of a larger effort to create an arms control technology toolkit. This device, currently called the Red Box, leverages the strengths of both secret and public cryptographic keys to authenticate, digitally sign, and pass along monitoring data, allowing for host review, and redaction if necessary, without loss of confidence in the authenticity of the data by the monitoring party. The design of the Red Box allows for the addition and removal of monitoring equipment, and the device can also verify that the data were collected by authentic monitoring equipment prior to signing the data and sending them to the host for review. The host will then forward the data to the monitor for review and inspection. This paper highlights the progress to date of the Red Box development and explains the novel method of leveraging both symmetric and asymmetric (secret and public key) cryptography to authenticate data within a warhead monitoring regime.
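The symmetric half of the hybrid approach the abstract describes can be sketched with stdlib HMAC; this is an illustrative toy, not the Red Box design itself, and the sensor key and message format are assumptions. The asymmetric counter-signing step would use a separate public-key library and is omitted here.

```python
import hmac
import hashlib

# Hypothetical shared secret between one sensor and the verifying device.
SENSOR_KEY = b"shared-secret-between-sensor-and-verifier"

def sensor_emit(reading):
    """Sensor attaches an HMAC-SHA256 tag to its reading (symmetric key)."""
    tag = hmac.new(SENSOR_KEY, reading, hashlib.sha256).digest()
    return reading, tag

def verify_reading(reading, tag):
    """Verifier recomputes the tag before counter-signing/forwarding the data."""
    expected = hmac.new(SENSOR_KEY, reading, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

reading, tag = sensor_emit(b"gamma-count:1234")
assert verify_reading(reading, tag)
assert not verify_reading(b"gamma-count:9999", tag)  # tampered data fails
```

In the scheme described above, a public-key signature over verified data would then let the host redact and forward it without the monitor losing confidence in its origin.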
Person authentication using brainwaves (EEG) and maximum a posteriori model adaptation.
Marcel, Sébastien; Millán, José Del R
2007-04-01
In this paper, we investigate the use of brain activity for person authentication. Previous studies have shown that the brain-wave pattern of every individual is unique and that the electroencephalogram (EEG) can be used for biometric identification. EEG-based biometry is an emerging research topic and we believe that it may open new research directions and applications in the future. However, very little work has been done in this area, and it has focused mainly on person identification rather than person authentication. Person authentication aims to accept or reject a person claiming an identity, i.e., it compares biometric data to one template, while the goal of person identification is to match the biometric data against all the records in a database. We propose the use of a statistical framework based on Gaussian Mixture Models and Maximum A Posteriori model adaptation, successfully applied to speaker and face authentication, which can deal with only one training session. We perform intensive experimental simulations using several strict train/test protocols to show the potential of our method. We also show that some mental tasks are more appropriate for person authentication than others.
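The authentication-versus-identification distinction the abstract draws can be made concrete with a toy sketch (not the paper's GMM/MAP method): a 1:1 comparison against one claimed template versus a 1:N search over all enrolled templates, using an invented Euclidean distance on feature vectors.

```python
import math

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy enrolled templates; real EEG features would be model parameters.
templates = {"alice": [0.1, 0.9], "bob": [0.8, 0.2]}

def authenticate(claimed_id, probe, threshold=0.3):
    """1:1 -- accept or reject the single claimed identity."""
    return dist(probe, templates[claimed_id]) <= threshold

def identify(probe):
    """1:N -- return the closest identity among all enrolled records."""
    return min(templates, key=lambda uid: dist(probe, templates[uid]))

assert authenticate("alice", [0.15, 0.85])       # correct claim accepted
assert not authenticate("bob", [0.15, 0.85])     # wrong claim rejected
assert identify([0.15, 0.85]) == "alice"         # database search
```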
NASA Astrophysics Data System (ADS)
Al-Mansoori, Saeed; Kunhu, Alavi
2013-10-01
This paper proposes a blind multi-watermarking scheme based on designing two back-to-back encoders. The first encoder is implemented to embed a robust watermark into remote sensing imagery by applying a Discrete Cosine Transform (DCT) approach. Such a watermark is used in many applications to protect the copyright of the image. The second encoder embeds a fragile watermark using the SHA-1 hash function; the purpose of the fragile watermark is to prove the authenticity of the image (i.e., that it is tamper-proof). The proposed technique was developed in response to new challenges in the piracy of remote sensing imagery ownership, which have led researchers to look for different means to secure the ownership of satellite imagery and prevent the illegal use of these resources. Therefore, the Emirates Institution for Advanced Science and Technology (EIAST) proposed utilizing an existing data security concept by embedding a digital signature, or "watermark", into DubaiSat-1 satellite imagery. In this study, DubaiSat-1 images with 2.5 meter resolution are used as a cover and a colored EIAST logo is used as a watermark. In order to evaluate the robustness of the proposed technique, several attacks are applied, such as JPEG compression, rotation, and synchronization attacks. Furthermore, tampering attacks are applied to prove image authenticity.
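The DCT-embedding idea can be illustrated with a minimal one-dimensional sketch; this is not EIAST's actual encoder, and the coefficient index, quantization step, and odd/even-parity embedding rule are assumptions chosen for the example. One bit is hidden in a mid-frequency DCT coefficient and recovered blindly after the inverse transform.

```python
import math

N = 8  # block length

def dct(x):
    """Orthonormal DCT-II of an N-sample block."""
    return [
        (math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N))
        * sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N)) for n in range(N))
        for k in range(N)
    ]

def idct(X):
    """Orthonormal DCT-III (inverse of the DCT-II above)."""
    return [
        sum(
            (math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N))
            * X[k] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
            for k in range(N)
        )
        for n in range(N)
    ]

def embed_bit(pixels, bit, coeff=3, step=8.0):
    """Force the parity of a quantized mid-frequency coefficient to carry one bit."""
    X = dct(pixels)
    q = round(X[coeff] / step)
    if q % 2 != bit:
        q += 1
    X[coeff] = q * step
    return idct(X)

def extract_bit(pixels, coeff=3, step=8.0):
    """Blind extraction: no original image needed, only coeff index and step."""
    return round(dct(pixels)[coeff] / step) % 2

block = [52, 55, 61, 66, 70, 61, 64, 73]  # one row of toy pixel values
assert extract_bit(embed_bit(block, 1)) == 1
assert extract_bit(embed_bit(block, 0)) == 0
```

A robust scheme like the one described would apply this per 8x8 block of the image and choose the step to survive JPEG quantization.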
Biometric Authentication for Gender Classification Techniques: A Review
NASA Astrophysics Data System (ADS)
Mathivanan, P.; Poornima, K.
2017-12-01
One of the challenging biometric authentication applications is gender identification and age classification, which captures gait from a far distance and analyzes physical information about the subject such as gender, race, and emotional state. It is found that most gender identification techniques have focused only on the frontal pose of different human subjects, the image size, and the type of database used in the process. The study also classifies different feature extraction processes, such as Principal Component Analysis (PCA) and Local Directional Pattern (LDP), that are used to extract the authentication features of a person. This paper aims to analyze different gender classification techniques in order to evaluate the strengths and weaknesses of existing gender identification algorithms, and thereby to help in developing a novel gender classification algorithm with lower computational cost and higher accuracy. An overview and classification of different gender identification techniques is first presented, and they are then compared with other existing human identification systems by means of their performance.
Application of visual cryptography for learning in optics and photonics
NASA Astrophysics Data System (ADS)
Mandal, Avikarsha; Wozniak, Peter; Vauderwange, Oliver; Curticapean, Dan
2016-09-01
In the age of data digitalization, important applications of optics- and photonics-based sensors and technology lie in the field of biometrics and image processing. Protecting user data in a safe and secure way is an essential task in this area. However, traditional cryptographic protocols rely heavily on computer-aided computation. Secure protocols that rely only on human interactions are usually simpler to understand, and in many scenarios their development is also important for ease of implementation and deployment. Visual cryptography (VC) is an encryption technique for images (or text) in which decryption is done by the human visual system. In this technique, an image is encrypted into a number of pieces (known as shares). When the printed shares are physically superimposed, the image can be decrypted with human vision. Modern digital watermarking technologies can be combined with VC for image copyright protection, where the shares can be watermarks (small identifications) embedded in the image. Similarly, VC can be used to improve the security of biometric authentication. This paper presents the design and implementation of a practical laboratory experiment based on the concept of VC for a course in media engineering. Specifically, our contribution deals with the integration of VC into different schemes for applications such as digital watermarking and biometric authentication in the field of optics and photonics. We describe the theoretical concepts and propose our infrastructure for the experiment. Finally, we evaluate the learning outcome of the experiment as performed by the students.
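A minimal (2, 2) visual cryptography sketch makes the share mechanics above concrete; the pixel-expansion patterns are the standard two-subpixel construction, and the rest (names, toy secret) is illustrative. Each secret bit (1 = black) expands to a subpixel pair per share; stacking transparencies corresponds to a pixelwise OR.

```python
import random

# The two possible subpixel patterns for a share pixel.
PATTERNS = [(0, 1), (1, 0)]

def make_shares(secret_bits, rng=None):
    """Split a row of secret bits into two random-looking shares."""
    rng = rng or random.Random(42)
    share1, share2 = [], []
    for bit in secret_bits:
        p = rng.choice(PATTERNS)
        share1.append(p)
        # White secret pixel: identical patterns (stack shows 1 black subpixel).
        # Black secret pixel: complementary patterns (stack shows 2 black subpixels).
        share2.append(p if bit == 0 else (1 - p[0], 1 - p[1]))
    return share1, share2

def stack(share1, share2):
    """Physical superposition of printed shares = OR of subpixels."""
    return [(a[0] | b[0], a[1] | b[1]) for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0]
s1, s2 = make_shares(secret)
# Fully black subpixel pairs reveal the black secret pixels to the eye.
decoded = [1 if pair == (1, 1) else 0 for pair in stack(s1, s2)]
assert decoded == secret
```

Each share alone is a uniformly random pattern, which is exactly why a single transparency leaks nothing about the secret image.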
CosmoQuest: Creative Engagement & Citizen Science Ignite Authentic Science
NASA Astrophysics Data System (ADS)
Cobb, W. H.; Noel-Storr, J.; Tweed, A.; Asplund, S.; Aiello, M. P.; Lebofsky, L. A.; Chilton, H.; Gay, P.
2016-12-01
The CosmoQuest Virtual Research Facility offers in-depth experiences to diverse audiences nationally and internationally through pioneering citizen science. An endeavor between universities, research institutes, and NASA centers, CosmoQuest brings together scientists, educators, researchers, programmers, and individuals of all ages to explore and make sense of our solar system and beyond. CosmoQuest creates pathways for engaging diverse audiences in authentic science, encouraging scientists to engage with learners, and learners to engage with scientists. Here is a sequence of activities developed by CosmoQuest, leveraging a NASA Discovery and New Frontiers Programs activity developed for the general STEAM community, that activates STEM learning. The Spark: Igniting Curiosity. Art and the Cosmic Connection uses the elements of art (shape, line, color, texture, value) to hone observation skills and inspire questions. Learners explore NASA image data from celestial bodies in our solar system (planets, asteroids, moons). They investigate their geology, analyzing features and engaging in scientific discourse arising from evidence while creating a beautiful piece of art. The Fuel: Making Connections. Crater Comparisons explores authentic NASA image data sets, engrossing learners at a deeper level. With skills learned in Art and the Cosmic Connection, learners analyze specific image sets with the feedback of mission team members. The Burn: Evolving Community. Become a Solar System Mapper invites learners to investigate and analyze NASA mission image data of Mars, Mercury, the Moon, and Vesta through CosmoQuest's citizen science projects. Learners make real-world connections while contributing to NASA science. Scaffolded by an educational framework that inspires 21st century learners, CosmoQuest engages people in analyzing and interpreting real NASA data, inspiring questions, defining problems, and realizing their potential to contribute to genuine scientific results.
Through social channels, CosmoQuest empowers and expands its community, including science and education-focused hangouts, virtual star parties, and diverse social media. CosmoQuest offers a hub for excellent resources throughout NASA and the larger astronomy community and fosters the conversations they inspire.
Communicating food safety, authenticity and consumer choice. Field experiences.
Syntesa, Heiner Lehr
2013-04-01
The paper reviews patented and non-patented technologies, methods and solutions in the area of food traceability. It pays special attention to the communication of food safety, authenticity and consumer choice. Twenty eight recent patents are reviewed in the areas of (secure) identification, product freshness indicators, meat traceability, (secure) transport of information along the supply chain, country/region/place of origin, automated authentication, supply chain management systems, consumer interaction systems. In addition, solutions and pilot projects are described in the areas of Halal traceability, traceability of bird's nests, cold chain management, general food traceability and other areas.
Palmprint authentication using multiple classifiers
NASA Astrophysics Data System (ADS)
Kumar, Ajay; Zhang, David
2004-08-01
This paper investigates performance improvement for palmprint authentication using multiple classifiers. Proposed methods for personal authentication using palmprints can be divided into three categories: appearance-, line-, and texture-based. A combination of these approaches can be used to achieve higher performance. We propose to simultaneously extract palmprint features from PCA, line detectors, and Gabor filters and to combine their corresponding matching scores. This paper also investigates the comparative performance of simple combination rules and a hybrid fusion strategy for achieving performance improvement. Our experimental results on a database of 100 users demonstrate the usefulness of such an approach over those based on individual classifiers.
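Score-level fusion of the kind the abstract describes can be sketched with a simple sum rule; the scores, weights, and matcher names below are invented for illustration and are not the paper's data.

```python
def min_max_normalize(scores):
    """Map one matcher's scores to [0, 1] so matchers are comparable."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def sum_rule(score_lists):
    """Average the normalized matching scores of the individual classifiers."""
    normalized = [min_max_normalize(s) for s in score_lists]
    return [sum(col) / len(col) for col in zip(*normalized)]

# Toy matching scores for 4 probe palmprints from three matchers
# (e.g., appearance/PCA, line-detector, and Gabor-texture based).
pca =   [0.90, 0.20, 0.40, 0.10]
line =  [0.70, 0.30, 0.60, 0.20]
gabor = [0.80, 0.10, 0.50, 0.30]

fused = sum_rule([pca, line, gabor])
assert fused.index(max(fused)) == 0  # the genuine match ranks first after fusion
```

The fused score is more stable than any single matcher because an outlier from one classifier is averaged against the other two.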
Naveena, Basappa M; Jagadeesh, Deepak S; Kamuni, Veeranna; Muthukumar, Muthupalani; Kulkarni, Vinayak V; Kiran, Mohan; Rapole, Srikanth
2018-02-01
Fraudulent mislabelling of processed meat products on a global scale that cannot be detected using conventional techniques necessitates sensitive, robust and accurate methods of meat authentication to ensure food safety and public health. In the present study, we developed an in-gel (two-dimensional gel electrophoresis, 2DE) and OFFGEL-based proteomic method for authenticating raw and cooked water buffalo (Bubalus bubalis), sheep (Ovis aries) and goat (Capra hircus) meat and their mixes. The matrix-assisted laser desorption/ionization time-of-flight mass spectrometric analysis of proteins separated using 2DE or OFFGEL electrophoresis delineated species-specific peptide biomarkers derived from myosin light chain 1 and 2 (MLC1 and MLC2) of buffalo-sheep-goat meat mixes in definite proportions of 98:1:1, 99:0.5:0.5 and 99.8:0.1:0.1 that were found stable enough to resist thermal processing. In-gel and OFFGEL-based proteomic approaches are efficient in authenticating meat mixes spiked at minimum 1.0% and 0.1% levels, respectively, in a triple meat mix for both raw and cooked samples. The study demonstrated that authentication of meat from a complex mix of three closely related species requires identification of more than one species-specific peptide due to the close similarity between their amino acid sequences. © 2017 Society of Chemical Industry.
High-quality JPEG compression history detection for fake uncompressed images
NASA Astrophysics Data System (ADS)
Zhang, Rong; Wang, Rang-Ding; Guo, Li-Jun; Jiang, Bao-Chuan
2017-05-01
Authenticity is one of the most important evaluation factors of images for photography competitions or journalism. Unusual compression history of an image often implies the illicit intent of its author. Our work aims at distinguishing real uncompressed images from fake uncompressed images that are saved in uncompressed formats but have been previously compressed. To detect the potential image JPEG compression, we analyze the JPEG compression artifacts based on the tetrolet covering, which corresponds to the local image geometrical structure. Since the compression can alter the structure information, the tetrolet covering indexes may be changed if a compression is performed on the test image. Such changes can provide valuable clues about the image compression history. To be specific, the test image is first compressed with different quality factors to generate a set of temporary images. Then, the test image is compared with each temporary image block-by-block to investigate whether the tetrolet covering index of each 4×4 block is different between them. The percentages of the changed tetrolet covering indexes corresponding to the quality factors (from low to high) are computed and used to form the p-curve, the local minimum of which may indicate the potential compression. Our experimental results demonstrate the advantage of our method to detect JPEG compressions of high quality, even the highest quality factors such as 98, 99, or 100 of the standard JPEG compression, from uncompressed-format images. At the same time, our detection algorithm can accurately identify the corresponding compression quality factor.
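The final decision step described above, locating a local minimum of the p-curve over trial quality factors, can be sketched directly; the curve values below are invented for illustration and are not from the paper's experiments.

```python
def local_minima(p_curve):
    """Indexes i where p[i] lies strictly below both neighbors."""
    return [
        i for i in range(1, len(p_curve) - 1)
        if p_curve[i] < p_curve[i - 1] and p_curve[i] < p_curve[i + 1]
    ]

# Trial recompression quality factors and the fraction of changed
# tetrolet-covering indexes observed at each (toy numbers).
quality_factors = [90, 92, 94, 96, 98, 100]
p_curve = [0.34, 0.31, 0.08, 0.27, 0.30, 0.33]  # pronounced dip at QF = 94

suspects = [quality_factors[i] for i in local_minima(p_curve)]
assert suspects == [94]  # the image was likely previously JPEG-compressed at QF 94
```

An empty `suspects` list would instead support the hypothesis that the image is genuinely uncompressed.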
Hackländer, T; Kleber, K; Schneider, H; Demabre, N; Cramer, B M
2004-08-01
To build an infrastructure that gives on-call radiologists and external users teleradiological access via the internet to the HTML-based image distribution system inside the hospital. In addition, no investment costs should arise on the user side, and the image data should be sent renamed (pseudonymized) using cryptographic techniques. A pure HTML-based system manages the image distribution inside the hospital, and an open source project extends this system through a secure gateway outside the firewall of the hospital. The gateway handles the communication between the external users and the HTML server within the network of the hospital. A second firewall is installed between the gateway and the external users and builds up a virtual private network (VPN). A connection between the gateway and an external user is only acknowledged if the computers involved authenticate each other via certificates and the external users authenticate via a multi-stage password system. All data are transferred encrypted. External users get access only to images that have previously been renamed to a pseudonym by automated processing. With an ADSL internet access, external users achieve an image load frequency of 0.4 CT images per second. More than 90% of the delay during image transfer results from security checks within the firewalls; data passing the gateway induce no measurable delay. The project goals were realized by means of an infrastructure that works vendor-independently with any HTML-based image distribution system. The requirements of data security were met using state-of-the-art web techniques. Adequate access and transfer speed led to widespread acceptance of the system by external users.
Method and system for source authentication in group communications
NASA Technical Reports Server (NTRS)
Roy-Chowdhury, Ayan (Inventor); Baras, John S. (Inventor)
2013-01-01
A method and system for authentication is provided. A central node for issuing certificates to a plurality of nodes associated with the central node in a network is also provided. The central node receives a first key from at least one node from among the plurality of nodes and generates a second key based on the received first key and generates a certificate for the at least one node. The generated certificate is transmitted to the at least one node.
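The key flow the abstract outlines, a central node deriving a second key from a node's submitted first key and issuing a certificate, can be sketched as follows. The KDF, certificate format, and use of HMAC here are assumptions for illustration, not the patented scheme (which would use proper public-key signatures).

```python
import hashlib
import hmac

# Hypothetical long-term secret held only by the central node.
CENTRAL_SECRET = b"central-node-master-secret"

def derive_second_key(first_key):
    """Derive the node's second key from its submitted first key."""
    return hmac.new(CENTRAL_SECRET, first_key, hashlib.sha256).digest()

def issue_certificate(node_id, first_key):
    """Bind node ID and derived key together under the central node's key."""
    second_key = derive_second_key(first_key)
    body = node_id + second_key
    signature = hmac.new(CENTRAL_SECRET, body, hashlib.sha256).hexdigest()
    return {"node_id": node_id, "key": second_key, "sig": signature}

cert = issue_certificate(b"node-17", b"node-17-submitted-key")
# The central node can later re-derive the binding and check the certificate.
expected = hmac.new(CENTRAL_SECRET, cert["node_id"] + cert["key"],
                    hashlib.sha256).hexdigest()
assert cert["sig"] == expected
```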
Authentication based on gestures with smartphone in hand
NASA Astrophysics Data System (ADS)
Varga, Juraj; Švanda, Dominik; Varchola, Marek; Zajac, Pavol
2017-08-01
We propose a new method of authentication for smartphones and similar devices based on gestures made by user with the device itself. The main advantage of our method is that it combines subtle biometric properties of the gesture (something you are) with a secret information that can be freely chosen by the user (something you know). Our prototype implementation shows that the scheme is feasible in practice. Further development, testing and fine tuning of parameters is required for deployment in the real world.
Hand biometric recognition based on fused hand geometry and vascular patterns.
Park, GiTae; Kim, Soowon
2013-02-28
A hand biometric authentication method based on measurements of the user's hand geometry and vascular pattern is proposed. To acquire the hand geometry, the thickness of the side view of the hand, the K-curvature with a hand-shaped chain code, the lengths and angles of the finger valleys, and the lengths and profiles of the fingers were used; for the vascular pattern, the direction-based vascular-pattern extraction method was used, and thus a new multimodal biometric approach is proposed. The proposed multimodal biometric system uses only one image to extract the feature points, so it can be configured for low-cost devices. Our multimodal biometric approach, combining hand geometry (the side view and the back of the hand) and vascular-pattern recognition, performs fusion at the score level. The results of our study showed that the equal error rate of the proposed system was 0.06%.
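An equal error rate (EER) like the 0.06% reported above is the operating point where the false accept and false reject rates coincide; the computation can be sketched with toy genuine and impostor score distributions (invented for the example).

```python
def far_frr(genuine, impostor, threshold):
    """False accept / false reject rates at one decision threshold."""
    frr = sum(g < threshold for g in genuine) / len(genuine)
    far = sum(i >= threshold for i in impostor) / len(impostor)
    return far, frr

def equal_error_rate(genuine, impostor):
    """Sweep thresholds and report the rate where FAR and FRR are closest."""
    best = None
    for t in sorted(set(genuine + impostor)):
        far, frr = far_frr(genuine, impostor, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2)
    return best[1]

genuine = [0.9, 0.85, 0.8, 0.75, 0.7, 0.65]   # scores of true matches
impostor = [0.5, 0.45, 0.4, 0.35, 0.3, 0.6]   # scores of impostor attempts
eer = equal_error_rate(genuine, impostor)
assert 0.0 <= eer <= 0.2  # well-separated distributions give a low EER
```

A lower EER means the genuine and impostor score distributions overlap less, so the system is simultaneously harder to fool and less likely to reject its owner.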
Rock Art: Connecting to the Past.
ERIC Educational Resources Information Center
Knipe, Marianne
2001-01-01
Presents an activity for fourth-grade students in which they learn about ancient art and create their own authentic-looking rock sculptures with pictograms, or painted images. Explains how the students create their own rocks and then paint a pictograph on the rocks with brown paint. (CMK)
ERIC Educational Resources Information Center
Travaille, Madelaine; Adams, Sandra D.
2006-01-01
Studying "Caenorhabditis elegans" ("C. elegans") live cultures provides excellent opportunities for authentic inquiry in a high school anatomy and physiology or other biology lab course. Using a digital dissection microscope, a student can photograph the organism during various stages of development and study and analyze the images. In this…
Picture Books Peek behind Cultural Curtains.
ERIC Educational Resources Information Center
Marantz, Sylvia; Marantz, Kenneth
2000-01-01
Discusses culture in picture books in three general categories: legends and histories; current life in particular areas; and the immigrant experience. Considers the translation of visual images, discusses authentic interpretations, and presents an annotated bibliography of picture books showing cultural diversity including African, Asian, Mexican,…
An implementation of wireless medical image transmission system on mobile devices.
Lee, SangBock; Lee, Taesoo; Jin, Gyehwan; Hong, Juhyun
2008-12-01
Advances in computing technology have been followed by rapid improvement of medical instrumentation and patient record management systems. Typical examples are the hospital information system (HIS) and the picture archiving and communication system (PACS), which computerized the management of medical records and images in hospitals. Because these systems are built and used within hospitals, doctors outside the hospital have problems accessing them immediately in emergent cases. To solve this problem, this paper addresses the realization of a system that transmits images acquired by medical imaging systems in the hospital to remote doctors' handheld PDAs over a CDMA cellular phone network. The system consists of a server and a PDA application. The server manages the accounts of doctors and patients and allocates patient images to each doctor; the PDA application displays patient images through a remote server connection. To authenticate the personal user, the remote data access (RDA) method was used for the PDA accessing the server database, and the file transfer protocol (FTP) was used to download patient images from the remote server. In laboratory experiments, it took ninety seconds to transmit thirty images of 832 x 488 resolution, 24-bit depth, and 0.37 Mb size each. This result shows that the developed system poses no problems for remote doctors receiving and reviewing patient images immediately in emergent cases.
Castonguay, Andree L; Gilchrist, Jenna D; Mack, Diane E; Sabiston, Catherine M
2013-06-01
This study explored body-related emotional experiences of pride in young adult males (n=138) and females (n=165). Data were collected using a relived emotion task and analyzed using inductive content analysis. Thirty-nine codes were identified and grouped into six categories (triggers, contexts, cognitive attributions, and affective, cognitive, and behavioral outcomes) for each of two themes (hubristic and authentic pride). Hubristic pride triggers included evaluating appearance/fitness as superior. Cognitions centered on feelings of superiority. Behaviors included strategies to show off. Triggers for authentic pride were personal improvements/maintenance in appearance and meeting or exceeding goals. Feeling accomplished was a cognitive outcome, and physical activity was a behavioral strategy. Contexts for the experience of both facets of pride primarily involved sports settings, swimming/beach, and clothes shopping. These findings provide theoretical support for models of pride as it applies to body image, and advances conceptual understanding of positive body image. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Kinay, Ismail; Bagçeci, Birsen
2016-01-01
The purpose of this study was to investigate the effect of authentic assessment, an approach used in Scientific Research Methods, on problem solving skills of prospective classroom teachers. The participant groups of the study consisted of sophomore prospective teachers who study at Dicle University in the Ziya Gökalp Education Faculty Classroom…
Perceptions of Biometric Experts on Whether or Not Biometric Modalities Will Combat Identity Fraud
ERIC Educational Resources Information Center
Edo, Galaxy Samson
2012-01-01
Electronic-authentication methods, no matter how sophisticated they are in preventing fraud, must be able to identify people to a reasonable degree of certainty before any credentials are assured (Personix, 2006). User authentication is different from identity verification, and both are separate but vital steps in the process of securing…
ERIC Educational Resources Information Center
Yardley, Sarah; Brosnan, Caragh; Richardson, Jane; Hays, Richard
2013-01-01
This paper addresses the question "what are the variables influencing social interactions and learning during Authentic Early Experience (AEE)?" AEE is a complex educational intervention for new medical students. Following critique of the existing literature, multiple qualitative methods were used to create a study framework conceptually…
NASA Astrophysics Data System (ADS)
Sugi, Slamet, Achmad; Martono, S.
2018-03-01
Teachers' performance in Temanggung in 2016 did not show maximal results, as indicated by several measures: low UN, UKG, and PKB scores. Individual performance varied, and achievement motivation could be seen in teachers' attitudes and behavior. The purpose of this research is to determine the effect of authentic leadership, organizational justice, and achievement motivation on teachers' performance. The objects of this research are authentic leadership, organizational justice, achievement motivation, and teachers' performance in Vocational High School Seventeen in Temanggung. A quantitative method was used; data were collected through questionnaires and analyzed using path analysis in SPSS 16. The results showed that authentic leadership, organizational justice, and achievement motivation had a significant effect on teachers' performance in Vocational High School Seventeen in Temanggung.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kaneko, Masahiro; Kakinuma, Ryutaro; Moriyama, Noriyuki
2010-03-01
Diagnostic MDCT imaging requires a considerable number of images to be read, and Japan has a shortage of doctors who can diagnose medical images. Against this background, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis. We have also developed a teleradiology network system based on a web medical image conference system, in which the security of the information network is a very important subject. Our teleradiology network system can hold web medical image conferences between medical institutions in remote locations. We completed a basic proof-of-concept experiment of the web medical image conference system with an information security solution: the screen of the conference system can be shared by two or more web conference terminals at the same time, and opinions can be exchanged using a camera and a microphone connected to the workstation that hosts the diagnostic assistance methods. Biometric face authentication used on site in teleradiology makes file encryption and login validation effective, and the privacy and information security technology of the solution ensures compliance with Japanese regulations, so patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new teleradiology network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system, using the computer-aided diagnosis workstation and the teleradiology network system, can increase diagnostic speed and accuracy and improve the security of medical information.
A novel secret sharing with two users based on joint transform correlator and compressive sensing
NASA Astrophysics Data System (ADS)
Zhao, Tieyu; Chi, Yingying
2018-05-01
Recently, the joint transform correlator (JTC) has been widely applied to image encryption and authentication. This paper presents a novel secret sharing scheme with two users based on JTC. Both users must be present during decryption, which gives the system high security and reliability. In the scheme, the two users encrypt the plaintext with their fingerprints, and they can decrypt only if both provide fingerprints that are successfully authenticated. The linear relationship between the plaintext and ciphertext is broken using compressive sensing, which can resist existing attacks on JTC. The results of the theoretical analysis and numerical simulation confirm the validity of the system.
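For illustration, the recognition step of a classical JTC can be simulated numerically with FFTs. The following numpy sketch shows only that generic correlation step, not the authors' fingerprint-based secret sharing or the compressive-sensing stage; image sizes and placement are arbitrary assumptions.

```python
import numpy as np

def jtc_correlate(reference, target):
    """Simulate the classical joint transform correlator: the two images
    sit side by side in one input plane; the Fourier intensity (joint
    power spectrum) is transformed again, and a strong off-axis peak in
    the output plane indicates a match."""
    ref = reference - reference.mean()   # zero-mean to suppress sidelobes
    tgt = target - target.mean()
    h, w = ref.shape
    plane = np.zeros((h, 3 * w))
    plane[:, :w] = ref                   # reference on the left
    plane[:, 2 * w:] = tgt               # target on the right
    jps = np.abs(np.fft.fft2(plane)) ** 2
    out = np.abs(np.fft.ifft2(jps))      # correlation plane
    out[0, 0] = 0.0                      # drop the zero-order (DC) peak
    return out.max()

rng = np.random.default_rng(0)
img = rng.random((32, 32))
unrelated = rng.random((32, 32))
# a matching input correlates far more strongly than an unrelated one
assert jtc_correlate(img, img) > jtc_correlate(img, unrelated)
```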
An Intelligent Fingerprint-Biometric Image Scrambling Scheme
NASA Astrophysics Data System (ADS)
Khan, Muhammad Khurram; Zhang, Jiashu
To obstruct attacks and to address the liveness and retransmission issues of biometric images, we have investigated challenge/response-based scrambled transmission of biometric images. We propose an intelligent biometric sensor with the computational power to receive challenges from the authentication server and to generate a response to each challenge together with the encrypted biometric image. We use the FRT for biometric image encryption and treat its scaling factors and random phase masks as additional secret keys. In addition, we generate the random phase masks chaotically with a chaotic map to further improve the encryption security. Experimental and simulation results show that the presented system is secure, robust, and deters the risks of attack on biometric image transmission.
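The chaotic generation of a random phase mask can be sketched with the logistic map; this is a generic illustration under assumed key values (x0, r), not the authors' exact construction, and the FRT encryption stage is not reproduced here.

```python
import numpy as np

def chaotic_phase_mask(shape, x0=0.3573, r=3.99):
    """Phase-only mask exp(j*2*pi*x_n) driven by the logistic map
    x_{n+1} = r * x_n * (1 - x_n); the seed x0 and parameter r act
    as secret keys shared between sensor and server."""
    n = shape[0] * shape[1]
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return np.exp(2j * np.pi * seq.reshape(shape))

mask = chaotic_phase_mask((64, 64))
assert mask.shape == (64, 64)
assert np.allclose(np.abs(mask), 1.0)            # phase-only: unit modulus
# same keys reproduce the same mask; a tiny key change does not
assert np.allclose(mask, chaotic_phase_mask((64, 64)))
assert not np.allclose(mask, chaotic_phase_mask((64, 64), x0=0.3574))
```

Sensitivity to the key is what makes a chaotic map attractive here: changing x0 in the fourth decimal place yields a completely different mask.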
Bosque-Sendra, Juan M; Cuadros-Rodríguez, Luis; Ruiz-Samblás, Cristina; de la Mata, A Paulina
2012-04-29
The characterization and authentication of fats and oils is a subject of great importance for market and health aspects. Identification and quantification of triacylglycerols in fats and oils can be excellent tools for detecting changes in their composition due to the mixtures of these products. Most of the triacylglycerol species present in either fats or oils could be analyzed and identified by chromatographic methods. However, the natural variability of these samples and the possible presence of adulterants require the application of chemometric pattern recognition methods to facilitate the interpretation of the obtained data. In view of the growing interest in this topic, this paper reviews the literature of the application of exploratory and unsupervised/supervised chemometric methods on chromatographic data, using triacylglycerol composition for the characterization and authentication of several foodstuffs such as olive oil, vegetable oils, animal fats, fish oils, milk and dairy products, cocoa and coffee. Copyright © 2012 Elsevier B.V. All rights reserved.
Personal authentication through dorsal hand vein patterns
NASA Astrophysics Data System (ADS)
Hsu, Chih-Bin; Hao, Shu-Sheng; Lee, Jen-Chun
2011-08-01
Biometric identification is an emerging technology that can solve security problems in our networked society. A reliable and robust personal verification approach using dorsal hand vein patterns is proposed in this paper. The approach has low computational and memory requirements and high recognition accuracy. In our work, a near-infrared charge-coupled device (CCD) camera is adopted as the input device for capturing dorsal hand vein images; it has the advantages of low cost and noncontact imaging. In the proposed approach, two finger peaks are automatically selected as datum points to define the region of interest (ROI) in the dorsal hand vein images. A modified two-directional two-dimensional principal component analysis, which performs an alternate two-dimensional PCA (2DPCA) in the column direction of images in the 2DPCA subspace, is proposed to exploit the correlation of vein features inside the ROI between images. The major advantage of the proposed method is that it requires fewer coefficients for efficient dorsal hand vein image representation and recognition. Experimental results on our large dorsal hand vein database show that the presented scheme achieves promising performance (false rejection rate: 0.97%; false acceptance rate: 0.05%) and is feasible for dorsal hand vein recognition.
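The row-direction 2DPCA step underlying this approach can be sketched in a few lines of numpy; the paper's modified variant additionally runs a second, column-direction 2DPCA in the resulting subspace, which is omitted here, and the image sizes below are arbitrary.

```python
import numpy as np

def two_dpca(images, k):
    """Standard 2DPCA: project each image A onto the top-k eigenvectors
    of the image covariance matrix G = E[(A - Abar)^T (A - Abar)],
    yielding a compact feature matrix per image."""
    mean = images.mean(axis=0)
    g = np.zeros((images.shape[2], images.shape[2]))
    for a in images:
        d = a - mean
        g += d.T @ d
    g /= len(images)
    vals, vecs = np.linalg.eigh(g)   # eigenvalues in ascending order
    proj = vecs[:, -k:]              # top-k eigenvectors as columns
    feats = np.array([a @ proj for a in images])
    return proj, feats

rng = np.random.default_rng(1)
imgs = rng.random((20, 16, 12))      # 20 ROI images, 16x12 pixels each
proj, feats = two_dpca(imgs, k=3)
assert proj.shape == (12, 3) and feats.shape == (20, 16, 3)
assert np.allclose(proj.T @ proj, np.eye(3))   # orthonormal projection
```

Each 16x12 image is reduced to a 16x3 feature matrix, which is the "fewer coefficients" advantage the abstract refers to.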
Automatic Blocked Roads Assessment after Earthquake Using High Resolution Satellite Imagery
NASA Astrophysics Data System (ADS)
Rastiveis, H.; Hosseini-Zirdoo, E.; Eslamizade, F.
2015-12-01
In 2010, an earthquake in the city of Port-au-Prince, Haiti, killed over 300,000 people; according to historical data, such an earthquake had not previously occurred in the area. The unpredictability of earthquakes necessitates comprehensive mitigation efforts to minimize deaths and injuries. Roads blocked by the debris of destroyed buildings increase the difficulty of rescue activities, so a damage map that specifies blocked and unblocked roads can be very helpful for a rescue team. In this paper, a novel method for producing a destruction map from a pre-event vector map and high-resolution WorldView-2 satellite images acquired after the earthquake is presented. For this purpose, image quality improvement and co-registration of image and map are first performed in a pre-processing step. Then, after extraction of texture descriptors from the post-quake image and SVM classification, different terrain types are detected in the image. Finally, based on the classification results, specifically the objects belonging to the "debris" class, damage analysis is performed to estimate the damage percentage; both the area and the shape of the "debris" objects are taken into account. This process is performed on all roads in the road layer. To evaluate the proposed method, a pre-event digital vector map and a post-event high-resolution WorldView-2 satellite image of Port-au-Prince, Haiti's capital, were used. The algorithm was executed on a 1200 x 800 m2 subset of the data, including 60 roads, and all roads were labelled correctly. Visual examination confirmed the ability of this method to assess damage to an urban road network after an earthquake.
Uses of software in digital image analysis: a forensic report
NASA Astrophysics Data System (ADS)
Sharma, Mukesh; Jha, Shailendra
2010-02-01
Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis, and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of analysis performed. In this paper, the authors explain these tasks in three categories: image compression, image enhancement and restoration, and measurement extraction, with the help of examples such as signature comparison, counterfeit currency comparison, and footwear sole impressions, using the software Canvas and CorelDRAW.
Do placebo based validation standards mimic real batch products behaviour? Case studies.
Bouabidi, A; Talbi, M; Bouklouze, A; El Karbane, M; Bourichi, H; El Guezzar, M; Ziemons, E; Hubert, Ph; Rozet, E
2011-06-01
Analytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results in routine application. Validation usually involves validation standards or quality control samples that are prepared in placebo or reconstituted matrix made of a mixture of all the ingredients composing the drug product except the active substance or the analyte under investigation. However, one of the main concerns with this approach is that it may miss an important source of variability that comes from the manufacturing process. The question that remains at the end of the validation step concerns the transferability of the quantitative performance from validation standards to real, authentic drug product samples. In this work, this topic is investigated through three case studies. Three analytical methods were validated using the commonly spiked placebo validation standards at several concentration levels as well as samples coming from authentic batches (tablets and syrups). The results showed that, depending on the type of response function used as the calibration curve, there were various degrees of difference in the accuracy of results obtained with the two types of samples. Nonetheless, the use of spiked placebo validation standards was shown to mimic relatively well the quantitative behaviour of the analytical methods with authentic batch samples. Adding these authentic batch samples to the validation design may help the analyst select and confirm the most fit-for-purpose calibration curve and thus increase the accuracy and reliability of the results generated by the method in routine application. Copyright © 2011 Elsevier B.V. All rights reserved.
Authentication of Piper betle L. folium and quantification of their antifungal-activity.
Wirasuta, I Made Agus Gelgel; Srinadi, I Gusti Ayu Made; Dwidasmara, Ida Bagus Gede; Ardiyanti, Ni Luh Putu Putri; Trisnadewi, I Gusti Ayu Arya; Paramita, Ni Luh Putu Vidya
2017-07-01
The TLC profiles of intra- and inter-day precision for Piper betle L. (PBL) folium methanol extract were studied for peak marker recognition and identification. Numerical chromatographic parameters (NCPs) of the peak markers, hierarchical clustering analysis (HCA), and principal component analysis (PCA) were applied to authenticate the PBL folium extract against folium extracts of other Piper species and to ensure the antifungal-activity quality of the PBL essential oil. The spotted extract was developed with a mobile phase of toluene:ethyl acetate (93:7, v/v). The eluted plate was viewed with the TLC-Visualizer and scanned under absorption and fluorescence detection modes, and in-situ UV spectra of each sample were recorded between 190 and 400 nm. The NCP profiles of intra- and inter-day precision offered multi-dimensional chromatogram fingerprints for better marker peak pattern recognition and identification. The r-value fingerprint data series generated with this method allowed more precise discrimination of PBL from other Piper species than the marker peak area fingerprint method. Cosine pair comparison proved a simple method for authenticating two different fingerprints, while Ward-linkage clustering and pairwise cross-correlation comparison were better chemometric methods for determining the consistency of peak area ratios between fingerprints. The first-component PCA loading values of the peak marker area fingerprints correlated linearly with both the biomarker concentration and the antifungal activity; this relationship could be used to control quality and pharmacological potency. This simple method was developed for the authentication and quantification of herbal medicine.
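The cosine pair comparison mentioned above reduces to the usual cosine similarity between two fingerprint vectors; a minimal sketch with hypothetical peak-area values:

```python
import numpy as np

def cosine_similarity(f1, f2):
    """Cosine comparison of two chromatographic fingerprint vectors
    (e.g. peak-area series): 1.0 means identical relative profiles."""
    return float(np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2)))

reference = np.array([4.1, 0.9, 2.7, 5.5, 1.2])   # hypothetical peak areas
sample = reference * 1.8                           # same profile, larger load
other = np.array([0.3, 5.0, 0.4, 1.1, 4.8])       # a different species profile
assert abs(cosine_similarity(reference, sample) - 1.0) < 1e-9
assert cosine_similarity(reference, other) < 0.9
```

Because cosine similarity is scale-invariant, a sample with the same peak pattern but a different loading amount still scores 1.0, which is exactly what makes it useful for comparing fingerprints.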
Gaillard, Laetitia; Guyon, Francois; Salagoïty, Marie-Hélène; Médina, Bernard
2013-12-01
A procedure to detect whether carbon dioxide has been added to French ciders has been developed. For this purpose, an optimised and simplified method is proposed to determine the (13)C/(12)C isotope ratio of carbon dioxide (δ(13)C) in ciders. Three critical steps were checked: (1) the influence of atmospheric CO2 remaining in the loaded vial, (2) the impact of the helium flush, and (3) the sampling speed. This study showed that atmospheric CO2 does not affect the measurement, that the helium flush can lead to isotopic fractionation, and that fractionation occurs only 5 h after bottle opening. The method, without any other preparation, consists in sampling 0.2 mL of cold (4 °C) cider into a vial that is placed in an ultrasonic bath for 10 min at room temperature to enhance de-carbonation. The headspace CO2 is then analysed using the Multiflow® link to an isotope ratio mass spectrometer. Each year, a data bank is built by fermenting authentic apple juices in order to control cider authenticity. Over a four-year span (2008-2011), the CO2 produced during the fermentation step was studied; this set of 61 authentic ciders, from various French production areas, was used to determine a δ(13)C value range of -22.59±0.92‰ for the CO2 bubbles of authentic ciders. 75 commercial ciders were analysed with this method. Most of the samples present a gas δ(13)C value in the expected range; nevertheless, some ciders have δ(13)C values outside the 3σ limit, revealing carbonation with technical CO2. This practice is not allowed for organic ciders, "Controlled Appellation of Origin" ciders, or ciders specifying natural carbonation on the label. Copyright © 2013 Elsevier Ltd. All rights reserved.
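The δ(13)C values above follow the standard delta notation, which can be computed directly; the VPDB standard ratio used below is a commonly quoted value and an assumption of this sketch, not taken from the paper.

```python
# delta notation: per-mil deviation of the sample 13C/12C ratio from the
# VPDB standard (R_STD ~ 0.011180 is a commonly quoted value)
R_STD = 0.011180

def delta13c(r_sample, r_std=R_STD):
    """delta13C in permil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_std - 1.0) * 1000.0

# a sample ratio equal to the standard gives delta = 0; a ratio depleted
# by ~2.26% gives roughly the -22.6 permil typical of authentic ciders
assert delta13c(R_STD) == 0.0
assert delta13c(R_STD * (1 - 0.0226)) < -22.0
```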
Cheng, Chunsong; Yuan, Qingxi; Zhou, Hua; Huang, Luqi
2016-02-01
Growth-year authentication has extraordinary significance for research on plant growth, structure, and development, and has a wide range of applications in the value assessment of economic crops. Panax ginseng is the most commonly used medicinal plant in Asian countries. The exact number of growth years is an important quality indicator that is difficult to obtain accurately under current technical conditions. A preliminary growth-year authentication theory was described in previous studies using a short-lived perennial medicinal plant (Paeonia lactiflora Pall.) as the research material. In this research, we focused on growth-year estimation in ginseng cultivars and attempted to develop an age estimation method for vascular plants based on mathematical simulation of root structure development. Micro data were obtained from 204 individuals of 3 different ginseng cultivars with a gradient series of ages and clear growth records. The outer diameter of the vascular cambium (b) and the radius of the cross section (r) were measured with an ordinary stereo microscope. We further designed and established two authentication models based on taproot structure development for growth-year authentication (P = β·M^(-α) and M = K·X1^(a1)·X2^(a2)). Moreover, the models were applied to identify the growth year of ginseng non-destructively using Micro-CT or DEI reconstruction. A potential method described recently analyzes the age of ginseng by telomere length and telomerase activity; however, different results have been reported in other species. We conclude that the microscopic methods presented here provide a more effective means of growth-year authentication. © 2016 Wiley Periodicals, Inc.
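Assuming the first model has the power-law form P = β·M^(−α), its parameters can be estimated by ordinary least squares in log space; the data below are synthetic and purely for illustration, not values from the study.

```python
import numpy as np

def fit_power_law(m, p):
    """Fit P = beta * M**(-alpha) by linear regression in log space:
    log P = log beta - alpha * log M."""
    slope, intercept = np.polyfit(np.log(m), np.log(p), 1)
    return -slope, np.exp(intercept)   # (alpha, beta)

# synthetic, noise-free data generated from known parameters
true_alpha, true_beta = 1.7, 4.2
m = np.linspace(0.5, 5.0, 40)
p = true_beta * m ** (-true_alpha)
alpha, beta = fit_power_law(m, p)
assert abs(alpha - true_alpha) < 1e-6
assert abs(beta - true_beta) < 1e-6
```

The second model, M = K·X1^(a1)·X2^(a2), would be fitted the same way with a two-regressor log-linear regression.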
Evaluation of the automatic optical authentication technologies for control systems of objects
NASA Astrophysics Data System (ADS)
Averkin, Vladimir V.; Volegov, Peter L.; Podgornov, Vladimir A.
2000-03-01
The report considers the evaluation of automatic optical authentication technologies for the automated integrated system of physical protection, control, and accounting of nuclear materials at RFNC-VNIITF, and for supporting the nuclear materials nonproliferation regime. The report presents the objectives and strategies of nuclear object authentication, the methodology of automatic optical authentication, and the results of the development of pattern recognition techniques carried out under ISTC project #772 to identify unique features of the surface structure of a controlled object and the effects of its random treatment. The current solutions for the following functional control tasks are described: confirmation of item authenticity (proof that the item has not been substituted by one of similar shape), control over unforeseen changes of item state, and control over unauthorized access to the item. The most important distinctive feature of all the techniques is that they do not comprehensively describe the properties of the controlled item, but uniquely identify it using the minimum necessary set of parameters, which together constitute the item's identification attributes. The main emphasis in the technical approach is on developing rather simple technological methods intended, for the first time, for use in systems of physical protection, control, and accounting of nuclear materials. The developed authentication devices and system are described.
Performance Analysis of Motion-Sensor Behavior for User Authentication on Smartphones
Shen, Chao; Yu, Tianwen; Yuan, Sheng; Li, Yunpeng; Guan, Xiaohong
2016-01-01
The growing trend of using smartphones as personal computing platforms to access and store private information has stressed the demand for secure and usable authentication mechanisms. This paper investigates the feasibility and applicability of using motion-sensor behavior data for user authentication on smartphones. For each sample of the passcode, sensory data from motion sensors are analyzed to extract descriptive and intensive features for accurate and fine-grained characterization of users’ passcode-input actions. One-class learning methods are applied to the feature space for performing user authentication. Analyses are conducted using data from 48 participants with 129,621 passcode samples across various operational scenarios and different types of smartphones. Extensive experiments are included to examine the efficacy of the proposed approach, which achieves a false-rejection rate of 6.85% and a false-acceptance rate of 5.01%. Additional experiments on usability with respect to passcode length, sensitivity with respect to training sample size, scalability with respect to number of users, and flexibility with respect to screen size were provided to further explore the effectiveness and practicability. The results suggest that sensory data could provide useful authentication information, and this level of performance approaches sufficiency for two-factor authentication on smartphones. Our dataset is publicly available to facilitate future research. PMID:27005626
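One-class learning of this kind can be illustrated with a deliberately simple model: enroll on the legitimate user's feature vectors, then accept a new sample only if its normalized distance to the enrollment profile is small. This is a stand-in sketch with synthetic data, not the one-class learners or motion-sensor features used in the paper.

```python
import numpy as np

def train_one_class(features):
    """Enroll a user: store the mean and per-feature standard deviation
    of the legitimate passcode-input feature vectors."""
    return features.mean(axis=0), features.std(axis=0) + 1e-9

def is_genuine(model, x, threshold=3.0):
    """Accept a sample if its mean normalized deviation from the
    enrollment profile stays under the threshold."""
    mean, std = model
    return float(np.abs((x - mean) / std).mean()) < threshold

rng = np.random.default_rng(2)
enrollment = rng.normal(0.0, 1.0, size=(200, 8))   # legitimate user's samples
model = train_one_class(enrollment)
genuine = rng.normal(0.0, 1.0, size=8)             # same behavior distribution
impostor = rng.normal(8.0, 1.0, size=8)            # clearly different behavior
assert is_genuine(model, genuine)
assert not is_genuine(model, impostor)
```

The threshold trades false rejections against false acceptances, which is exactly the FRR/FAR trade-off the paper reports.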
BOKP: A DNA Barcode Reference Library for Monitoring Herbal Drugs in the Korean Pharmacopeia
Liu, Jinxin; Shi, Linchun; Song, Jingyuan; Sun, Wei; Han, Jianping; Liu, Xia; Hou, Dianyun; Yao, Hui; Li, Mingyue; Chen, Shilin
2017-01-01
Herbal drug authentication is an important task in traditional medicine; however, it is challenged by the limitations of traditional authentication methods and the lack of trained experts. DNA barcoding is now prominent in almost all areas of the biological sciences and has already been added to the British and Chinese pharmacopeias for routine herbal drug authentication. However, DNA barcoding for the Korean pharmacopeia still requires significant improvement. Here, we present a DNA barcode reference library for herbal drugs in the Korean pharmacopeia and a species identification engine named KP-IDE to facilitate the adoption of this reference library for herbal drug authentication. Using taxonomy records, specimen records, sequence records, and reference records, KP-IDE can identify an unknown specimen. Currently, there are 6,777 taxonomy records, 1,054 specimen records, 30,744 sequence records (ITS2 and psbA-trnH), and 285 reference records. Moreover, 27 herbal drug materials were collected from the Seoul Yangnyeongsi herbal medicine market as examples of real herbal drug authentication. Our study demonstrates the prospects of the DNA barcode reference library for the Korean pharmacopeia and provides future directions for the use of DNA barcoding in authenticating herbal drugs listed in other modern pharmacopeias. PMID:29326593
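The core lookup in a barcode identification engine such as KP-IDE can be caricatured as nearest-reference matching by sequence identity; real engines use alignment-based search, and the sequences and species names below are entirely hypothetical.

```python
def identity(a, b):
    """Fraction of matching positions over the shared length
    (a crude stand-in for alignment-based similarity)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n

def identify(query, references):
    """Return the reference name whose barcode sequence is closest
    to the query by simple pairwise identity."""
    return max(references, key=lambda name: identity(query, references[name]))

# hypothetical mini reference library of barcode sequences
refs = {
    "species_A": "ATCGATCGATCG",
    "species_B": "TTGGAACCTTGG",
}
# a query with one mismatch against species_A is assigned to species_A
assert identify("ATCGATCGATGG", refs) == "species_A"
```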
Metabolite Profiling and Classification of DNA-Authenticated Licorice Botanicals
Simmler, Charlotte; Anderson, Jeffrey R.; Gauthier, Laura; Lankin, David C.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2015-01-01
Raw licorice roots represent heterogeneous materials obtained from mainly three Glycyrrhiza species. G. glabra, G. uralensis, and G. inflata exhibit marked metabolite differences in terms of flavanones (Fs), chalcones (Cs), and other phenolic constituents. The principal objective of this work was to develop complementary chemometric models for the metabolite profiling, classification, and quality control of authenticated licorice. A total of 51 commercial and macroscopically verified samples were DNA authenticated. Principal component analysis and canonical discriminant analysis were performed on 1H NMR spectra and area under the curve values obtained from UHPLC-UV chromatograms, respectively. The developed chemometric models enable the identification and classification of Glycyrrhiza species according to their composition in major Fs, Cs, and species specific phenolic compounds. Further key outcomes demonstrated that DNA authentication combined with chemometric analyses enabled the characterization of mixtures, hybrids, and species outliers. This study provides a new foundation for the botanical and chemical authentication, classification, and metabolomic characterization of crude licorice botanicals and derived materials. Collectively, the proposed methods offer a comprehensive approach for the quality control of licorice as one of the most widely used botanical dietary supplements. PMID:26244884
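The exploratory PCA stage of such chemometric models can be sketched via SVD on mean-centered spectra; the synthetic "species" below differ in one marker region, purely for illustration, and the discriminant-analysis stage is not reproduced.

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Principal component scores of mean-centered spectra via the
    reduced SVD: scores = U * S restricted to the leading components."""
    x = spectra - spectra.mean(axis=0)
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    return u[:, :n_components] * s[:n_components]

rng = np.random.default_rng(4)
# two synthetic "species" whose spectra differ in one marker region
a = rng.normal(0, 0.1, size=(10, 50)); a[:, 10:15] += 1.0
b = rng.normal(0, 0.1, size=(10, 50)); b[:, 30:35] += 1.0
scores = pca_scores(np.vstack([a, b]))
# the first component separates the two groups (up to an arbitrary sign)
assert (scores[:10, 0] > 0).all() != (scores[10:, 0] > 0).all()
```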
A Novel Physical Layer Assisted Authentication Scheme for Mobile Wireless Sensor Networks
Wang, Qiuhua
2017-01-01
Physical-layer authentication can address physical layer vulnerabilities and security threats in wireless sensor networks, and has been considered an effective complementary enhancement to existing upper-layer authentication mechanisms. In this paper, to advance the existing research and improve authentication performance, we propose a novel physical layer assisted authentication scheme for mobile wireless sensor networks. In our proposed scheme, we exploit the reciprocity and spatial uncorrelation of the wireless channel to verify the identities of the transmitting users and decide whether all data frames are from the same sender. A new method is developed for the legitimate users to compare their received signal strength (RSS) records without disclosing the information to the adversary. Our proposed scheme can detect spoofing attacks even in a highly dynamic environment. We evaluate the scheme through experiments in indoor and outdoor environments. The results show that our proposed scheme is efficient and achieves a higher detection rate while keeping a lower false alarm rate. PMID:28165423
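A drastically simplified version of RSS-based spoofing detection flags sudden jumps between consecutive RSS readings; this sketch with synthetic readings illustrates the idea only, not the paper's privacy-preserving record-comparison protocol, and the threshold is an arbitrary assumption.

```python
import numpy as np

def detect_spoofing(rss, threshold=5.0):
    """Flag a spoofing attack when consecutive received-signal-strength
    readings (in dBm) jump by more than the channel's natural variation
    would allow for a single stationary sender."""
    jumps = np.abs(np.diff(rss))
    return bool((jumps > threshold).any())

rng = np.random.default_rng(3)
legit = -60 + rng.normal(0, 0.5, 50)                     # one genuine sender
spoofed = np.concatenate([legit[:25], legit[25:] - 20])  # attacker takes over
assert not detect_spoofing(legit)
assert detect_spoofing(spoofed)
```

An attacker at a different location produces a markedly different RSS profile, which is the spatial-uncorrelation property the scheme relies on.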
NASA Astrophysics Data System (ADS)
Sukmawati, Zuhairoh, Faihatuz
2017-05-01
The purpose of this research was to develop an authentic assessment model based on showcase portfolios for learning mathematical problem solving. This research used the research and development (R & D) method, which consists of four stages: Phase I, conducting a preliminary study; Phase II, determining the purpose of development and preparing the initial model; Phase III, trial testing of the instruments for the initial draft model and the initial product. The respondents of this research were students of SMAN 8 and SMAN 20 Makassar. Data were collected through observation, interviews, documentation, a student questionnaire, and tests of mathematical problem-solving ability, and were analyzed with descriptive and inferential statistics. The results of this research are an authentic assessment model design based on showcase portfolios that involves: 1) steps in implementing the showcase-based authentic assessment, with assessment rubrics for the cognitive, affective, and skill aspects; 2) the students' average problem-solving ability, as scored using the authentic assessment based on showcase portfolios, was in the high category, and the students' responses were in the good category.
The construction of a public key infrastructure for healthcare information networks in Japan.
Sakamoto, N
2001-01-01
The digital signature is a key technology for the forthcoming Internet society, for electronic healthcare as well as electronic commerce. Efficient exchange of authorized, digitally signed information in healthcare information networks requires the construction of a public key infrastructure (PKI). To introduce a PKI to healthcare information networks in Japan, we proposed the development of a PKI-based user authentication system for user management, user authentication, and privilege management in healthcare information systems. In this paper, we describe the design of the user authentication system and its implementation. The system provides a certification authority service and a privilege management service, and comprises a user authentication client and user authentication servers. It is designed on the basis of an X.509 PKI and is implemented using OpenSSL and OpenLDAP. It was incorporated into the financial information management system for the national university hospitals and has been working successfully for about one year. The hospitals plan to use it as the user authentication method for their entire healthcare information systems. One implementation of the system is free to the national university hospitals with the permission of the Japanese Ministry of Education, Culture, Sports, Science and Technology. Another implementation is open to other healthcare institutes with the support of the Medical Information System Development Center (MEDIS-DC). We are moving toward a nationwide construction of a PKI for healthcare information networks based on it.
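The X.509 machinery such a system builds on ultimately rests on digital signatures. A toy, textbook-RSA illustration of signing and verifying follows; the tiny primes are chosen purely for readability, and a production PKI would instead use OpenSSL with 2048-bit or larger keys, certificates, and a padding scheme such as PSS:

```python
import hashlib

# Toy RSA parameters (illustration only; never use key sizes like this).
p, q = 61, 53
n = p * q                 # modulus: 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent
d = pow(e, -1, phi)       # private exponent (modular inverse of e)

def sign(message: bytes) -> int:
    """'Sign' the SHA-256 digest of the message with the private key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Verify by recovering the digest with the public key (n, e)."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

record = b"patient-id=42;order=chest-xray"   # hypothetical signed record
sig = sign(record)
print(verify(record, sig))              # True
print(verify(record, (sig + 1) % n))    # False: any altered signature fails
```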
Investigating the Effects of Authentic Childhood Games in Teaching English
ERIC Educational Resources Information Center
Hursen, Cigdem; Salaz, Dursun
2016-01-01
The purpose of this study is to investigate the effects of authentic childhood games in teaching English. The study group of the research consists of 43 5-year-old kindergarten students. The study was carried out in the first semester of 2014-2015 academic year. An experimental method was used in this study to state the effects of teaching English…
Jumhawan, Udi; Putri, Sastia Prama; Yusianto; Bamba, Takeshi; Fukusaki, Eiichiro
2015-11-01
Development of authenticity screening for Asian palm civet coffee, the world-renowned priciest coffee, was previously reported using metabolite profiling with gas chromatography/mass spectrometry (GC/MS). A major drawback of this approach, however, is the high cost of the instrument and its maintenance, so an alternative method is needed for quality and authenticity evaluation of civet coffee. A rapid, reliable, and cost-effective analysis employing a universal detector, GC coupled with flame ionization detection (GC/FID), and metabolite fingerprinting was established for discrimination analysis of 37 commercial and non-commercial coffee bean extracts. GC/FID provided higher sensitivity than GC/MS over a similar range of detected compounds. In combination with multivariate analysis, GC/FID successfully reproduced the quality prediction from GC/MS for differentiation of commercial civet coffee, regular coffee, and a coffee blend with 50 wt% civet coffee content, without prior metabolite details. Our study demonstrated that GC/FID-based metabolite fingerprinting can be an effective alternative method for coffee authenticity screening in industry. Copyright © 2015. Published by Elsevier B.V.
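The abstract does not specify the multivariate method used. A minimal stand-in for fingerprint-based discrimination is nearest-centroid classification; the peak areas, marker counts, and class names below are invented for illustration:

```python
import math

def centroid(samples):
    """Mean fingerprint of a list of equal-length peak-area vectors."""
    return [sum(col) / len(col) for col in zip(*samples)]

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(fingerprint, classes):
    """Assign a GC/FID peak-area fingerprint to the nearest class centroid."""
    return min(classes, key=lambda name: euclid(fingerprint, classes[name]))

# Hypothetical normalized peak areas for three marker compounds per sample.
training = {
    "civet":   centroid([[0.9, 0.2, 0.5], [1.0, 0.3, 0.4], [0.8, 0.2, 0.6]]),
    "regular": centroid([[0.3, 0.8, 0.1], [0.2, 0.9, 0.2], [0.4, 0.7, 0.1]]),
}

unknown = [0.85, 0.25, 0.5]
print(classify(unknown, training))  # civet
```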
Gerbig, Stefanie; Stern, Gerold; Brunn, Hubertus E; Düring, Rolf-Alexander; Spengler, Bernhard; Schulz, Sabine
2017-03-01
Direct analysis of fruit and vegetable surfaces is an important tool for in situ detection of food contaminants such as pesticides. We tested three different ways to prepare samples for the qualitative desorption electrospray ionization mass spectrometry (DESI-MS) analysis of 32 pesticides found on nine authentic fruits collected from food control. The best recovery rates for topically applied pesticides (88%) were found by analyzing the surface of a glass slide which had been rubbed against the surface of the food. Pesticide concentration in all samples was at or below the maximum residue level allowed. In addition to the high sensitivity of the method for qualitative analysis, quantitative or at least semi-quantitative information is needed in food control. We developed a DESI-MS method for the simultaneous determination of linear calibration curves of multiple pesticides of the same chemical class using normalization to one internal standard (ISTD). The method was first optimized for food extracts and subsequently evaluated for the quantification of pesticides in three authentic food extracts. Next, pesticides and the ISTD were applied directly onto food surfaces, and the corresponding calibration curves were obtained. The determination of linear calibration curves was still feasible, as demonstrated for three different food surfaces. This proof-of-principle method was used to simultaneously quantify two pesticides on an authentic sample, showing that the method developed could serve as a fast and simple preselective tool for disclosure of pesticide regulation violations. Graphical Abstract: Multiple pesticide residues were detected and quantified in situ from an authentic set of food items and extracts in a proof-of-principle study.
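The ISTD-normalized calibration described here amounts to fitting a line of response ratio (analyte signal / ISTD signal) against concentration and inverting it to quantify an unknown. A sketch with made-up numbers, not the paper's data:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical calibration: pesticide peak intensity normalized to the ISTD.
conc  = [0.1, 0.2, 0.5, 1.0]        # mg/kg, spiked standards
ratio = [0.21, 0.39, 1.02, 1.98]    # analyte signal / ISTD signal

slope, intercept = fit_line(conc, ratio)

def quantify(sample_ratio):
    """Invert the calibration curve to estimate concentration."""
    return (sample_ratio - intercept) / slope

print(round(quantify(0.80), 2))  # ~0.4 mg/kg
```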
Randomized trials published in some Chinese journals: how many are randomized?
Wu, Taixiang; Li, Youping; Bian, Zhaoxiang; Liu, Guanjian; Moher, David
2009-01-01
Background: The approximately 1100 medical journals now active in China are publishing a rapidly increasing number of research reports, including many studies identified by their authors as randomized controlled trials. It has been noticed that these reports mostly present positive results, and their quality and authenticity have consequently been called into question. We investigated the adequacy of randomization of clinical trials published in recent years in China to determine how many of them met acceptable standards for allocating participants to treatment groups. Methods: The China National Knowledge Infrastructure electronic database was searched for reports of randomized controlled trials on 20 common diseases published from January 1994 to June 2005. From this sample, a subset of trials that appeared to have used randomization methods was selected. Twenty-one investigators trained in the relevant knowledge, communication skills and quality control issues interviewed the original authors of these trials about the participant randomization methods and related quality-control features of their trials. Results: From an initial sample of 37,313 articles identified in the China National Knowledge Infrastructure database, we found 3137 apparent randomized controlled trials. Of these, 1452 were studies of conventional medicine (published in 411 journals) and 1685 were studies of traditional Chinese medicine (published in 352 journals). Interviews with the authors of 2235 of these reports revealed that only 207 studies adhered to accepted methodology for randomization and could on those grounds be deemed authentic randomized controlled trials (6.8%, 95% confidence interval 5.9–7.7). There was no statistically significant difference in the rate of authenticity between randomized controlled trials of traditional interventions and those of conventional interventions.
Randomized controlled trials conducted at hospitals affiliated with medical universities were more likely to be authentic than trials conducted at level 3 and level 2 hospitals (relative risk 1.58, 95% confidence interval 1.18–2.13, and relative risk 14.42, 95% confidence interval 9.40–22.10, respectively). The likelihood of authenticity was higher in level 3 hospitals than in level 2 hospitals (relative risk 9.32, 95% confidence interval 5.83–14.89). All randomized controlled trials of pre-market drug trials were authentic by our criteria. Of the trials conducted at university-affiliated hospitals, 56.3% were authentic (95% confidence interval 32.0–81.0). Conclusion: Most reports of randomized controlled trials published in some Chinese journals lacked an adequate description of randomization. Similarly, most so-called 'randomized controlled trials' were not truly randomized controlled trials, owing to a lack of adequate understanding of rigorous clinical trial design on the part of the authors. All randomized controlled trials of pre-market drug trials included in this research were authentic. Randomized controlled trials conducted by authors in high-level hospitals, especially hospitals affiliated with medical universities, had a higher rate of authenticity. That so many non-randomized trials were published as randomized controlled trials reflects the fact that peer review needs to be improved and that a good-practice guide for peer review, including how to verify the authenticity of a study, urgently needs to be developed. PMID:19573242
Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil
2016-11-17
Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification, resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blend preparations, leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. These quality control and management aims require several multivariate statistical tools: highlighting specificity requires ordination methods; checking authentication calls for classification and pattern recognition methods; traceability analysis implies network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. This chapter presents a review of different chemometric methods applied to the control of OO variability from measured metabolic and physical-chemical characteristics. The different chemometric methods are illustrated by study cases on monovarietal and blended OOs originating from different countries. Chemometric tools offer multiple ways for quantitative evaluation and qualitative control of the complex chemical variability of OO in relation to several intrinsic and extrinsic factors.
Secure method for biometric-based recognition with integrated cryptographic functions.
Chiou, Shin-Yan
2013-01-01
Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, certification in biometric systems need not achieve 100% accuracy. However, biometric data can only be compared directly through proximal access to the scanning device and cannot readily be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security, with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.
Authentic Teachers: Student Criteria Perceiving Authenticity of Teachers
ERIC Educational Resources Information Center
De Bruyckere, Pedro; Kirschner, Paul A.
2016-01-01
Authenticity is seen by many as a key for good learning and education. There is talk of authentic instruction, authentic learning, authentic problems, authentic assessment, authentic tools and authentic teachers. The problem is that while authenticity is an often-used adjective describing almost all aspects of teaching and learning, the concept…
An EEG-Based Person Authentication System with Open-Set Capability Combining Eye Blinking Signals
Wu, Qunjian; Zeng, Ying; Zhang, Chi; Tong, Li; Yan, Bin
2018-01-01
The electroencephalogram (EEG) signal represents a subject’s specific brain activity patterns and is considered as an ideal biometric given its superior forgery prevention. However, the accuracy and stability of the current EEG-based person authentication systems are still unsatisfactory in practical application. In this paper, a multi-task EEG-based person authentication system combining eye blinking is proposed, which can achieve high precision and robustness. Firstly, we design a novel EEG-based biometric evoked paradigm using self- or non-self-face rapid serial visual presentation (RSVP). The designed paradigm could obtain a distinct and stable biometric trait from EEG with a lower time cost. Secondly, the event-related potential (ERP) features and morphological features are extracted from EEG signals and eye blinking signals, respectively. Thirdly, convolutional neural network and back propagation neural network are severally designed to gain the score estimation of EEG features and eye blinking features. Finally, a score fusion technology based on least square method is proposed to get the final estimation score. The performance of multi-task authentication system is improved significantly compared to the system using EEG only, with an increasing average accuracy from 92.4% to 97.6%. Moreover, open-set authentication tests for additional imposters and permanence tests for users are conducted to simulate the practical scenarios, which have never been employed in previous EEG-based person authentication systems. A mean false accepted rate (FAR) of 3.90% and a mean false rejected rate (FRR) of 3.87% are accomplished in open-set authentication tests and permanence tests, respectively, which illustrate the open-set authentication and permanence capability of our systems. PMID:29364848
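Least-squares score fusion of the kind described can be sketched as solving the 2x2 normal equations for weights over the two modality scores; the per-trial scores and labels below are invented for illustration and are not the paper's data:

```python
def lsq_fusion_weights(scores, labels):
    """Solve the normal equations (S^T S) w = S^T y for two fusion weights."""
    a11 = sum(s[0] * s[0] for s in scores)
    a12 = sum(s[0] * s[1] for s in scores)
    a22 = sum(s[1] * s[1] for s in scores)
    b1 = sum(s[0] * y for s, y in zip(scores, labels))
    b2 = sum(s[1] * y for s, y in zip(scores, labels))
    det = a11 * a22 - a12 * a12
    return (a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det

# Hypothetical per-trial scores: (EEG network score, eye-blink network score),
# labeled 1 for the genuine user and 0 for impostors.
genuine  = [(0.9, 0.8), (0.8, 0.9), (0.85, 0.7)]
impostor = [(0.2, 0.3), (0.3, 0.1), (0.1, 0.2)]
scores = genuine + impostor
labels = [1, 1, 1, 0, 0, 0]

w1, w2 = lsq_fusion_weights(scores, labels)

def fused(s):
    """Final authentication score: weighted sum of the two modality scores."""
    return w1 * s[0] + w2 * s[1]

print(all(fused(s) > 0.5 for s in genuine))   # True
print(all(fused(s) < 0.5 for s in impostor))  # True
```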
Progress and challenges associated with halal authentication of consumer packaged goods.
Premanandh, Jagadeesan; Bin Salem, Samara
2017-11-01
Abusive business practices are increasingly evident in consumer packaged goods. Although consumers have the right to protect themselves against such practices, rapid urbanization and industrialization result in greater distances between producers and consumers, raising serious concerns about the supply chain. The operational complexities surrounding halal authentication pose serious challenges to the integrity of consumer packaged goods. This article attempts to address the progress and challenges associated with halal authentication. Advances in, and concerns about, the application of new rapid analytical methods for halal authentication are discussed. The significance of a zero-tolerance policy in consumer packaged foods and its impact on analytical testing are presented. The role of halal assurance systems and their challenges are also considered. In conclusion, consensus on the establishment of one standard approach, coupled with a sound traceability system and constant monitoring, would certainly improve and ensure the halalness of consumer packaged goods. © 2017 Society of Chemical Industry.
Security Considerations and Recommendations in Computer-Based Testing
Al-Saleem, Saleh M.; Ullah, Hanif
2014-01-01
Many organizations and institutions around the globe are moving, or planning to move, their paper-and-pencil-based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams, and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and, last but not least, test security. Security aspects may include, but are not limited to, the identification and authentication of the examinee, the risks associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper investigates the security considerations associated with CBT and provides some recommendations for the security of these kinds of tests. We also propose a palm-based biometric authentication system incorporated with a basic authentication system (username/password) in order to check the identity and authenticity of the examinee. PMID:25254250
OpenID connect as a security service in Cloud-based diagnostic imaging systems
NASA Astrophysics Data System (ADS)
Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter
2015-03-01
The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained by their own physical facilities. However, privacy and security concerns have consistently been regarded as the major obstacle to the adoption of cloud computing in healthcare domains. Furthermore, the traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. REST is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, which combines OpenID and OAuth, is an emerging REST-based federated identity solution. It is one of the most promising open standards to potentially become the de facto standard for securing cloud computing and mobile applications, and has been called the 'Kerberos of the Cloud'. We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow this technology to be incorporated within a distributed enterprise environment. The objective of this study is to offer solutions for secure radiology image sharing among a DI-r (Diagnostic Imaging Repository), heterogeneous PACS (Picture Archiving and Communication Systems), and mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying a DI-r and PACS to private or community clouds should achieve a security level equivalent to that of the traditional computing model.
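The first leg of the OpenID Connect authorization code flow is a redirect to the provider's authorization endpoint with the mandatory `openid` scope and a `state` value for CSRF protection. A minimal sketch of building that request; the issuer, client ID, and redirect URI are hypothetical, not from the paper:

```python
from urllib.parse import urlencode, urlparse, parse_qs
import secrets

# Hypothetical OpenID Connect provider for an imaging domain.
ISSUER = "https://idp.example-hospital.org"

def build_authorization_url(client_id, redirect_uri):
    """Build the authorization request that the user agent is redirected to.
    The returned state must be stored and checked against the callback."""
    state = secrets.token_urlsafe(16)
    params = {
        "response_type": "code",      # authorization code flow
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "openid profile",    # 'openid' is mandatory for OIDC
        "state": state,
    }
    return ISSUER + "/authorize?" + urlencode(params), state

url, state = build_authorization_url("pacs-viewer", "https://pacs.example.org/cb")
query = parse_qs(urlparse(url).query)
print(query["response_type"])  # ['code']
```

After the provider redirects back with a code, the client exchanges it at the token endpoint for an ID token (a signed JWT) and an access token; that exchange is omitted here.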
Karunathilaka, Sanjeewa R; Kia, Ali-Reza Fardin; Srigley, Cynthia; Chung, Jin Kyu; Mossoba, Magdi M
2016-10-01
A rapid tool for evaluating authenticity was developed and applied to the screening of extra virgin olive oil (EVOO) retail products by using Fourier-transform near infrared (FT-NIR) spectroscopy in combination with univariate and multivariate data analysis methods. Using disposable glass tubes, spectra for 62 reference EVOO, 10 edible oil adulterants, 20 blends consisting of EVOO spiked with adulterants, 88 retail EVOO products and other test samples were rapidly measured in the transmission mode without any sample preparation. The univariate conformity index (CI) and the multivariate supervised soft independent modeling of class analogy (SIMCA) classification tool were used to analyze the various olive oil products which were tested for authenticity against a library of reference EVOO. Better discrimination between the authentic EVOO and some commercial EVOO products was observed with SIMCA than with CI analysis. Approximately 61% of all EVOO commercial products were flagged by SIMCA analysis, suggesting that further analysis be performed to identify quality issues and/or potential adulterants. Due to its simplicity and speed, FT-NIR spectroscopy in combination with multivariate data analysis can be used as a complementary tool to conventional official methods of analysis to rapidly flag EVOO products that may not belong to the class of authentic EVOO. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
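The univariate conformity index (CI) screen can be sketched as follows: each variable of a test spectrum is compared with the reference-library mean in units of the library's standard deviation, and the sample is flagged if the maximum deviation exceeds a limit (3 is used here for illustration). The spectra below are invented, not the paper's calibrated library:

```python
import math

def conformity_index(sample, ref_spectra):
    """Maximum absolute deviation of a sample spectrum from the reference
    mean, scaled by the per-variable reference standard deviation."""
    n = len(ref_spectra)
    ci_max = 0.0
    for j in range(len(sample)):
        col = [s[j] for s in ref_spectra]
        mean = sum(col) / n
        std = math.sqrt(sum((v - mean) ** 2 for v in col) / (n - 1))
        ci_max = max(ci_max, abs(sample[j] - mean) / std)
    return ci_max

# Hypothetical absorbances of authentic EVOO at four NIR wavenumbers.
library = [[0.50, 0.31, 0.22, 0.40],
           [0.52, 0.30, 0.21, 0.41],
           [0.51, 0.32, 0.23, 0.39],
           [0.49, 0.31, 0.22, 0.42]]

authentic_test = [0.51, 0.31, 0.22, 0.405]
adulterated    = [0.58, 0.26, 0.30, 0.33]

print(conformity_index(authentic_test, library) <= 3)  # True: passes
print(conformity_index(adulterated, library) <= 3)     # False: flagged
```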
Broadband quantitative NQR for authentication of vitamins and dietary supplements
NASA Astrophysics Data System (ADS)
Chen, Cheng; Zhang, Fengchao; Bhunia, Swarup; Mandal, Soumyajit
2017-05-01
We describe hardware, pulse sequences, and algorithms for nuclear quadrupole resonance (NQR) spectroscopy of medicines and dietary supplements. Medicine and food safety is a pressing problem that has drawn increasing attention. NQR is an ideal technique for authenticating these substances because it is a non-invasive method for chemical identification. We have recently developed a broadband NQR front-end that can excite and detect 14N NQR signals over a wide frequency range; its operating frequency can be set rapidly in software, while sensitivity remains comparable to conventional narrowband front-ends over the entire range. This front-end improves the accuracy of authentication by enabling multiple-frequency experiments. We have also developed calibration and signal processing techniques to convert measured NQR signal amplitudes into nuclear spin densities, enabling the use of NQR as a quantitative technique. Experimental results from several samples are used to illustrate the proposed methods.
Song, Jiajia; Fang, Guozhen; Zhang, Yan; Deng, Qiliang; Wang, Shuo
2010-01-01
A fingerprint analysis method was developed for Ginkgo biloba leaves and was successfully used for quality evaluation of related health foods by HPLC with electrospray ionization MS. Fifteen samples of G. biloba leaves, collected from 15 different locations in China, were analyzed and identified in this study. By both peak analysis and similarity analysis of the fingerprint chromatograms, variation of constituents was easily observed in the leaves from different sources. By comparison with batches of authentic leaves, the authenticity and quality consistency of related health foods in different matrixes were effectively estimated. It is important to mention that studying a wide range of authentic leaves from various habitats made the quality evaluation of commercial products more convincing and reasonable. The fingerprint-based strategy of the developed method should provide improved QC of G. biloba leaves and products.
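Similarity analysis of fingerprint chromatograms is commonly done with a correlation or cosine measure between peak-area vectors; the abstract does not name the criterion used, so the following is a generic sketch with invented peak areas and marker compounds:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two chromatographic fingerprints
    (vectors of peak areas for the same set of marker peaks)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical peak areas for five marker constituents.
authentic_mean = [12.0, 8.5, 5.1, 3.2, 2.0]
health_food    = [11.5, 8.8, 4.9, 3.0, 2.1]   # consistent with reference
suspect        = [2.0, 1.1, 9.8, 0.4, 7.5]    # divergent profile

print(cosine_similarity(authentic_mean, health_food) > 0.99)  # True
print(cosine_similarity(authentic_mean, suspect) > 0.99)      # False
```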
Application of analytical methods in authentication and adulteration of honey.
Siddiqui, Amna Jabbar; Musharraf, Syed Ghulam; Choudhary, M Iqbal; Rahman, Atta-Ur-
2017-02-15
Honey is synthesized from flower nectar and has been famous for its tremendous therapeutic potential since ancient times. Many factors influence the basic properties of honey, including the nectar-providing plant species, the bee species, the geographic area, and harvesting conditions. Quality and composition of honey are also affected by many other factors, such as overfeeding of bees with sucrose, harvesting prior to maturity, and adulteration with sugar syrups. Due to the complex nature of honey, it is often challenging to authenticate purity and quality using common methods such as physicochemical parameters, and more specialized procedures need to be developed. This article reviews the literature (between 2000 and 2016) on the use of analytical techniques, mainly NMR spectroscopy, for authentication of honey, its botanical and geographical origin, and adulteration by sugar syrups. NMR is a powerful technique and can be used as a fingerprinting technique to compare various samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Halal authenticity issues in meat and meat products.
Nakyinsige, Khadijah; Man, Yaakob Bin Che; Sazili, Awis Qurni
2012-07-01
In recent years, Muslims have become increasingly concerned about the meat they eat. Proper product description is crucial for consumers to make informed choices and to ensure fair trade, particularly in the ever-growing halal food market. Globally, Muslim consumers are concerned about a number of issues concerning meat and meat products, such as pork substitution, undeclared blood plasma, use of prohibited ingredients, pork intestine casings and non-halal methods of slaughter. Analytical techniques which are appropriate and specific have been developed to deal with particular issues. The most suitable technique for any particular sample is often determined by the nature of the sample itself. This paper sets out to identify what makes meat halal, highlights the halal authenticity issues that occur in meat and meat products, and provides an overview of the possible analytical methods for halal authentication of meat and meat products. Copyright © 2012 Elsevier Ltd. All rights reserved.
War: Images of America. Social Studies Unit, Secondary Grades 7-12.
ERIC Educational Resources Information Center
Franklin, Edward; And Others
Designed to accompany an audiovisual filmstrip series devoted to presenting a visual history of life in America, this guide helps secondary school teachers supplement social studies materials dealing with wars over the past 200 years. Using authentic visuals including paintings, drawings, engravings, posters, photographs, songsheets, and cartoons,…
Dress: Images of America. Elementary Version.
ERIC Educational Resources Information Center
Franklin, Edward; And Others
Designed to accompany an audiovisual filmstrip series devoted to presenting a visual history of life in America, this guide contains an elementary school (grades 2-6) unit which traces the history of dress in America over the last century. Using authentic visuals including posters, paintings, advertising, documentary photography, movies, cartoons,…
Chang, C N; Inouye, H; Model, P; Beckwith, J
1980-01-01
An inner membrane preparation co-translationally cleaved both the alkaline phosphatase and bacteriophage f1 coat protein precursors to the mature proteins. Post-translational outer membrane proteolysis of pre-alkaline phosphatase generated a protein smaller than the authentic monomer. PMID:6991486
As Light Meets Matter: Art under Scrutiny.
ERIC Educational Resources Information Center
Del Federico, Eleonora; Diver, Steven; Konaklieva, Monika; Ludescher, Richard
2002-01-01
Presents a story on an investigation of the painter of an artwork (who is suspected to be Cezanne) that uses UV spectroscopy, IR spectrum, X-ray fluorescence, and luminescence images. Leaves the story open-ended as to whether the painting is authentic. Includes teaching notes and suggestions for classroom management. (YDS)
City: Images of America. Elementary Version.
ERIC Educational Resources Information Center
Franklin, Edward; And Others
Designed to accompany an audiovisual filmstrip series devoted to presenting a visual history of life in America, this guide contains an elementary social studies (grades 2-6) unit on the American city over the last century. Using authentic visuals including paintings, posters, advertising, documentary photography, and cartoons, the guide offers…
Pose Invariant Face Recognition Based on Hybrid Dominant Frequency Features
NASA Astrophysics Data System (ADS)
Wijaya, I. Gede Pasek Suta; Uchimura, Keiichi; Hu, Zhencheng
Face recognition is one of the most active research areas in pattern recognition, not only because the face is a biometric characteristic of the human being, but also because face recognition has many potential applications ranging from human-computer interaction to authentication, security, and surveillance. This paper presents an approach to pose-invariant human face image recognition. The proposed scheme is based on the analysis of discrete cosine transforms (DCT) and discrete wavelet transforms (DWT) of face images. From both the DCT and DWT domain coefficients, which describe the facial information, we build a compact and meaningful feature vector using simple statistical measures and quantization. This feature vector is called the hybrid dominant frequency features. We then apply a combination of the L2 and Lq metrics to classify the hybrid dominant frequency features to a person's class. The aim of the proposed system is to overcome the high memory requirement, the high computational load, and the retraining problems of previous methods. The proposed system is tested using several face databases, and the experimental results are compared to the well-known Eigenface method. The proposed method shows good performance, robustness, stability, and accuracy without requiring geometrical normalization. Furthermore, the proposed method has low computational cost, requires little memory space, and can overcome the retraining problem.
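A naive 2D DCT followed by a low-frequency crop illustrates the "dominant frequency" idea. This is a simplified stand-in for the paper's DCT/DWT hybrid with its statistical measures and quantization; the 4x4 patch is made up, and a real system would use a fast DCT over full images:

```python
import math

def dct2(block):
    """Naive 2D DCT-II of a square image block (fine for small N)."""
    n = len(block)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos(math.pi * (2 * x + 1) * u / (2 * n))
                          * math.cos(math.pi * (2 * y + 1) * v / (2 * n)))
            cu = math.sqrt(1 / n) if u == 0 else math.sqrt(2 / n)
            cv = math.sqrt(1 / n) if v == 0 else math.sqrt(2 / n)
            out[u][v] = cu * cv * s
    return out

def dominant_features(block, k=2):
    """Keep the top-left k x k low-frequency coefficients as a compact
    feature vector (most facial energy concentrates there)."""
    coeffs = dct2(block)
    return [coeffs[u][v] for u in range(k) for v in range(k)]

# Hypothetical 4x4 grayscale face patch.
patch = [[52, 55, 61, 66],
         [70, 61, 64, 73],
         [63, 59, 55, 90],
         [67, 61, 68, 104]]

features = dominant_features(patch, k=2)
print(len(features))  # 4
```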
Ear recognition from one sample per person.
Chen, Long; Mu, Zhichun; Zhang, Baoqing; Zhang, Yi
2015-01-01
Biometrics has the advantages of efficiency and convenience in identity authentication. As one of the most promising biometric-based methods, ear recognition has received broad attention and research. Previous studies have achieved remarkable performance with multiple samples per person (MSPP) in the gallery. However, most conventional methods are insufficient when only one sample per person (OSPP) is available in the gallery. To solve the OSPP problem by maximizing the use of a single sample, this paper proposes a hybrid multi-keypoint descriptor sparse representation-based classification (MKD-SRC) ear recognition approach based on 2D and 3D information. Because most 3D sensors capture 3D data along with the corresponding 2D data, it is sensible to use both types of information. First, the ear region is extracted from the profile. Second, keypoints are detected and described for both the 2D texture image and the 3D range image. Then, the hybrid MKD-SRC algorithm is used to complete the recognition with only OSPP in the gallery. Experimental results on a benchmark dataset demonstrate the feasibility and effectiveness of the proposed method in resolving the OSPP problem. A rank-one recognition rate of 96.4% is achieved for a gallery of 415 subjects, and the computation time is satisfactory compared to conventional methods.
NASA Astrophysics Data System (ADS)
Nikitin, P. V.; Savinov, A. N.; Bazhenov, R. I.; Sivandaev, S. V.
2018-05-01
The article describes a method of identifying a person in distance-learning systems based on keyboard rhythm. An algorithm for the organization of access control is proposed, which implements authentication, identification and verification of a person using the keyboard rhythm. Because biometric characteristics cannot exist apart from a particular person, authentication methods based on biometric personal parameters, including those based on keyboard rhythm, can provide improved accuracy, non-repudiation of authorship, and convenience for operators of automated systems in comparison with other methods of identity verification. Methods of permanent hidden keyboard monitoring allow detecting the substitution of a student and blocking the system.
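The keyboard-rhythm idea can be sketched in a few lines. This is a minimal illustration, not the article's algorithm: the feature choice (per-key hold times plus inter-key flight times) is standard in keystroke dynamics, and the acceptance threshold is an arbitrary assumption.

```python
import numpy as np

def keystroke_features(press_times, release_times):
    # Keyboard-rhythm features (illustrative): per-key hold times and the
    # flight times between successive keys form the user's timing profile.
    press = np.asarray(press_times)
    release = np.asarray(release_times)
    hold = release - press
    flight = press[1:] - release[:-1]
    return np.concatenate([hold, flight])

def verify(profile, sample, tol=0.08):
    # Accept when the sample's mean absolute timing deviation from the
    # enrolled profile is below a tolerance (seconds); tol is a guess.
    return float(np.mean(np.abs(profile - sample))) < tol

enrolled = keystroke_features([0.00, 0.31, 0.58], [0.09, 0.40, 0.66])
attempt  = keystroke_features([0.00, 0.33, 0.61], [0.10, 0.41, 0.70])
print(verify(enrolled, attempt))   # True: rhythm matches the profile
```

Hidden continuous monitoring would re-run such a check on a sliding window of keystrokes and block the session when the rhythm stops matching the enrolled student.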
2015-06-01
... examine how a computer forensic investigator/incident handler, without specialised computer-memory or software reverse-engineering skills, can successfully ... memory images and malware; this new series of reports will be directed at those who must analyse Linux malware-infected memory images. The skills ... (the excerpt continues with a sample Linux process listing, including /usr/lib/policykit-1-gnome/polkit-gnome-authentication-agent-1 and /usr/lib/pulseaudio/pulse/gconf-helper)
Simultaneous storage of medical images in the spatial and frequency domain: a comparative study.
Nayak, Jagadish; Bhat, P Subbanna; Acharya U, Rajendra; Uc, Niranjan
2004-06-05
Digital watermarking is a technique of hiding specific identification data for copyright authentication. This technique is adapted here for interleaving patient information with medical images, to reduce storage and transmission overheads. The patient information is encrypted before interleaving with the images to ensure greater security. The bio-signals are compressed and subsequently interleaved with the image. This interleaving is carried out in the spatial domain and in the frequency domain. The performance of interleaving in the spatial domain and in the Discrete Fourier Transform (DFT), Discrete Cosine Transform (DCT) and Discrete Wavelet Transform (DWT) coefficients is studied. Differential pulse code modulation (DPCM) is employed for data compression as well as encryption, and results are tabulated for a specific example. The results show that the process does not affect the picture quality, which is attributed to the fact that changing the LSB of a pixel changes its brightness by only 1 part in 256. Spatial- and DFT-domain interleaving gave much lower %NRMSE than the DCT and DWT domains; for spatial-domain interleaving, the %NRMSE was less than 0.25% for 8-bit encoded pixel intensity. Among the frequency-domain interleaving methods, DFT was found to be the most efficient.
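The spatial-domain interleaving step reduces to LSB embedding. A minimal sketch, assuming 8-bit pixels and a bit-stream payload (the encryption and DPCM compression stages are omitted):

```python
import numpy as np

def interleave_lsb(image, payload_bits):
    # Spatial-domain interleaving: overwrite the least significant bit of
    # successive pixels with the (already encrypted, compressed) data bits.
    # Changing an LSB alters brightness by at most 1 part in 256.
    flat = image.ravel().copy()
    assert len(payload_bits) <= flat.size
    flat[:len(payload_bits)] = (flat[:len(payload_bits)] & 0xFE) | payload_bits
    return flat.reshape(image.shape)

def extract_lsb(image, n_bits):
    # Recover the embedded bit-stream from the first n_bits pixels.
    return image.ravel()[:n_bits] & 1

img = np.arange(256, dtype=np.uint8).reshape(16, 16)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
stego = interleave_lsb(img, bits)
print(extract_lsb(stego, 8))  # [1 0 1 1 0 0 1 0]
```

Frequency-domain variants embed the same bits in DFT/DCT/DWT coefficients instead of raw pixels, trading capacity for robustness.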
Foodomics imaging by mass spectrometry and magnetic resonance.
Canela, Núria; Rodríguez, Miguel Ángel; Baiges, Isabel; Nadal, Pedro; Arola, Lluís
2016-07-01
This work explores the use of advanced imaging MS (IMS) and magnetic resonance imaging (MRI) techniques in food science and nutrition to evaluate food sensory characteristics, nutritional value and health benefits. Determining the chemical content and applying imaging tools to food metabolomics offer detailed information about food quality, safety, processing, storage and authenticity assessment. IMS and MRI are powerful analytical systems with an excellent capability for mapping the distribution of many molecules, and recent advances in these platforms are reviewed and discussed, showing the great potential of these techniques for small molecule-based food metabolomics research. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Lakshmi, C; Thenmozhi, K; Rayappan, John Bosco Balaguru; Amirtharajan, Rengarajan
2018-06-01
Digital Imaging and Communications in Medicine (DICOM) is one of the significant formats used worldwide for the representation of medical images. Undoubtedly, medical-image security plays a crucial role in telemedicine applications. Merging encryption and watermarking in medical-image protection paves the way for enhanced authentication and safer transmission over open channels. In this context, the present work on DICOM image encryption employs a fuzzy chaotic map for encryption and the Discrete Wavelet Transform (DWT) for watermarking. The proposed approach overcomes the limitation of the Arnold transform, one of the most utilised confusion mechanisms in image ciphering. Various metrics have substantiated the effectiveness of the proposed medical-image encryption algorithm. Copyright © 2018 Elsevier B.V. All rights reserved.
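For context, the Arnold transform whose limitation the paper addresses is the modular cat map (x, y) → (x + y, x + 2y) mod N. Its key weakness is periodicity: iterating it eventually restores the original image. A small sketch (the 4×4 size is chosen only to make the period visible):

```python
import numpy as np

def arnold_cat_map(img, iterations=1):
    # Arnold transform on an N×N image: (x, y) -> (x + y, x + 2y) mod N.
    # The map is a bijection, and it is periodic, which is the limitation
    # chaotic-map schemes such as the paper's aim to avoid.
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        nx, ny = (x + y) % n, (x + 2 * y) % n
        scrambled = np.empty_like(out)
        scrambled[nx, ny] = out[x, y]
        out = scrambled
    return out

img = np.arange(16).reshape(4, 4)
print(np.array_equal(arnold_cat_map(img, 3), img))  # True: period 3 for N=4
```

An attacker who knows N can simply iterate until the plaintext reappears, which is why confusion alone is paired with a keyed diffusion stage.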
Simple group password-based authenticated key agreements for the integrated EPR information system.
Lee, Tian-Fu; Chang, I-Pin; Wang, Ching-Cheng
2013-04-01
Security and privacy are important issues for electronic patient records (EPRs). The goal of EPRs is to share patients' medical histories, such as diagnosis records, reports and diagnostic image files, among hospitals over the Internet, so the security of the integrated EPR information system is essential: information transmitted over the Internet must remain secure and private. A group password-based authenticated key agreement (GPAKE) allows a group of users, such as doctors, nurses and patients, to establish a common session key using password authentication; the group can then communicate securely using this session key. Many GPAKE approaches employ a public key infrastructure (PKI) to achieve higher security. However, this not only increases users' overheads and requires extra equipment for storing long-term secret keys, but also requires maintaining the public key system. This investigation presents a simple group password-based authenticated key agreement (SGPAKE) protocol for the integrated EPR information system. The proposed SGPAKE protocol does not require the server's or users' public keys. Each user only remembers a weak password shared with a trusted server, from which a common session key can be obtained; all users can then communicate securely using this session key. The proposed SGPAKE protocol not only provides users with convenience, but also offers higher security.
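A toy run of the password-only idea, not the paper's actual protocol: a trusted server that shares a weak password with each user wraps one fresh group session key per user, so no PKI or stored long-term public keys are needed. The XOR key-wrapping, PBKDF2 derivation, and message flow are illustrative stand-ins, and a real protocol must also authenticate the server and resist offline guessing.

```python
import hashlib, os

def kdf(password, salt):
    # Password-derived key via PBKDF2 (iteration count kept low for the demo).
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 10_000)

def server_round(passwords):
    # Server picks a fresh session key and wraps it once per user under that
    # user's password-derived key (XOR wrapping is a simplification).
    salt = os.urandom(16)
    session_key = os.urandom(32)
    wrapped = {u: bytes(a ^ b for a, b in zip(session_key, kdf(pw, salt)))
               for u, pw in passwords.items()}
    return salt, wrapped

def user_unwrap(password, salt, wrapped_key):
    # Each user re-derives the same password key and unwraps the session key.
    return bytes(a ^ b for a, b in zip(wrapped_key, kdf(password, salt)))

users = {"doctor": "pw-doc", "nurse": "pw-nurse", "patient": "pw-pat"}
salt, wrapped = server_round(users)
keys = {u: user_unwrap(pw, salt, wrapped[u]) for u, pw in users.items()}
print(len(set(keys.values())))  # 1: everyone derives the same session key
```

The design point the abstract makes is visible here: each user's only long-term secret is a memorable password, yet the whole group ends up with one shared session key.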
Banknote authentication using chaotic elements technology
NASA Astrophysics Data System (ADS)
Ambadiyil, Sajan; P. S., Krishnendu; Mahadevan Pillai, V. P.; Prabhu, Radhakrishna
2017-10-01
The counterfeit banknote is a growing threat to society, since advancements in the field of computers, scanners and photocopiers have made the duplication process for banknotes much simpler. The fake-note detection systems developed so far have many drawbacks, such as high cost, poor accuracy, unavailability, lack of user-friendliness and lower effectiveness. One possible solution to this problem could be the use of a system uniquely linked to the banknote itself. In this paper, we present a unique identification and authentication process for banknotes using chaotic elements embedded in them. A chaotic element means that the physical elements are formed by a random process independent of human intervention. The chaotic elements used in this paper are the random distribution patterns of security fibres set into the paper pulp. A unique ID is generated from the fibre pattern obtained from a UV image of the note, which can be verified by any person who receives the banknote to decide whether it is authentic or not. A performance analysis of the system is also presented in this paper.
Eisenberg, David M; Harris, Eric S J; Littlefield, Bruce A; Cao, Shugeng; Craycroft, Jane A; Scholten, Robert; Bayliss, Peter; Fu, Yanling; Wang, Wenquan; Qiao, Yanjiang; Zhao, Zhongzhen; Chen, Hubiao; Liu, Yong; Kaptchuk, Ted; Hahn, William C; Wang, Xiaoxing; Roberts, Thomas; Shamu, Caroline E; Clardy, Jon
2011-01-01
While the popularity of and expenditures for herbal therapies (aka "ethnomedicines") have increased globally in recent years, their efficacy, safety, mechanisms of action, potential as novel therapeutic agents, cost-effectiveness, or lack thereof, remain poorly defined and controversial. Moreover, published clinical trials evaluating the efficacy of herbal therapies have rightfully been criticized, post hoc, for their lack of quality assurance and reproducibility of study materials, as well as a lack of demonstration of plausible mechanisms and dosing effects. In short, clinical botanical investigations have suffered from the lack of a cohesive research strategy which draws on the expertise of all relevant specialties. With this as background, US and Chinese co-investigators with expertise in Traditional Chinese Medicine (TCM), botany, chemistry and drug discovery, have jointly established a prototype library consisting of 202 authenticated medicinal plant and fungal species that collectively represent the therapeutic content of the majority of all commonly prescribed TCM herbal prescriptions. Currently housed at Harvard University, the library consists of duplicate or triplicate kilogram quantities of each authenticated and processed species, as well as "detanninized" extracts and sub-fractions of each mother extract. Each species has been collected at 2-3 sites, each separated geographically by hundreds of miles, with precise GPS documentation, and authenticated visually and chemically prior to testing for heavy metals and/or pesticides contamination. An explicit decision process has been developed whereby samples with the least contamination were selected to undergo ethanol extraction and HPLC sub-fractionation in preparation for high throughput screening across a broad array of biological targets including cancer biology targets. 
As envisioned, the subfractions in this artisan collection of authenticated medicinal plants will be tested for biological activity individually and in combinations (i.e., "complex mixtures") consistent with traditional ethnomedical practice. This manuscript summarizes the rationale, methods and preliminary "proof of principle" for the establishment of this prototype, authenticated medicinal plant library. It is hoped that these methods will foster scientific discoveries with therapeutic potential and enhance efforts to systematically evaluate commonly used herbal therapies worldwide. Copyright © 2010 Elsevier B.V. All rights reserved.
Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil
2016-01-01
Background. Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blends’ preparation leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. Methods. These quality control and management aims require the use of several multivariate statistical tools: specificity highlighting requires ordination methods; authentication checking calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. Results. This chapter presents a review of different chemometrics methods applied for the control of OO variability from metabolic and physical-chemical measured characteristics. The different chemometrics methods are illustrated by different study cases on monovarietal and blended OO originated from different countries. Conclusion. Chemometrics tools offer multiple ways for quantitative evaluations and qualitative control of complex chemical variability of OO in relation to several intrinsic and extrinsic factors. PMID:28231172
Optical security system for the protection of personal identification information.
Doh, Yang-Hoi; Yoon, Jong-Soo; Choi, Kyung-Hyun; Alam, Mohammad S
2005-02-10
A new optical security system for the protection of personal identification information is proposed. First, authentication of the encrypted personal information is carried out by primary recognition of a personal identification number (PIN) with the proposed multiplexed minimum average correlation energy phase-encrypted (MMACE_p) filter. The MMACE_p filter, synthesized with phase-encrypted training images, can increase the discrimination capability and prevent the leak of personal identification information. After the PIN is recognized, speedy authentication of personal information can be achieved through one-to-one optical correlation by means of the optical wavelet filter. The possibility of information counterfeiting can be significantly decreased with the double-identification process. Simulation results demonstrate the effectiveness of the proposed technique.
Kono, Miyuki; Miura, Naoto; Fujii, Takao; Ohmura, Koichiro; Yoshifuji, Hajime; Yukawa, Naoichiro; Imura, Yoshitaka; Nakashima, Ran; Ikeda, Takaharu; Umemura, Shin-ichiro; Miyatake, Takafumi; Mimori, Tsuneyo
2015-01-01
Objective To examine how connective tissue diseases affect finger-vein pattern authentication. Methods The finger-vein patterns of 68 patients with connective tissue diseases and 24 healthy volunteers were acquired. Captured as CCD (charge-coupled device) images by transmitting near-infrared light through the fingers, they were followed up once in each season for one year. The similarity of the follow-up patterns to the initial one was evaluated in terms of their normalized cross-correlation C. Results The mean C values calculated for patients tended to be lower than those calculated for healthy volunteers. In midwinter (February in Japan) they showed a statistically significant reduction both compared with patients in other seasons and compared with season-matched healthy controls, whereas the values calculated for healthy controls showed no significant seasonal changes. Values calculated for patients with systemic sclerosis (SSc) or mixed connective tissue disease (MCTD) showed major reductions in November and, especially, February. Patients with rheumatoid arthritis (RA) and patients with dermatomyositis or polymyositis (DM/PM) did not show statistically significant seasonal changes in C values. Conclusions Finger-vein patterns can be used throughout the year to identify patients with connective tissue diseases, but some attention is needed for patients with advanced disease such as SSc. PMID:26701644
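The similarity score C used above is the zero-mean normalized cross-correlation between two aligned vein images; a sketch (synthetic data, and the registration step that real systems need is omitted):

```python
import numpy as np

def normalized_cross_correlation(a, b):
    # Zero-mean normalized cross-correlation between two aligned images;
    # C = 1 for identical patterns, lower values as the pattern changes.
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

rng = np.random.default_rng(1)
base = rng.random((64, 64))                    # enrolled vein pattern
noisy = base + 0.05 * rng.random((64, 64))     # follow-up capture with drift
print(round(normalized_cross_correlation(base, base), 3))  # 1.0
print(normalized_cross_correlation(base, noisy) < 1.0)     # True
```

Seasonal effects such as cold-induced vasoconstriction would show up as exactly this kind of reduction in C between the enrolled and follow-up captures.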
Generating cancelable fingerprint templates.
Ratha, Nalini K; Chikkerur, Sharat; Connell, Jonathan H; Bolle, Ruud M
2007-04-01
Biometrics-based authentication systems offer obvious usability advantages over traditional password and token-based authentication schemes. However, biometrics raises several privacy concerns. A biometric is permanently associated with a user and cannot be changed. Hence, if a biometric identifier is compromised, it is lost forever and possibly for every application where the biometric is used. Moreover, if the same biometric is used in multiple applications, a user can potentially be tracked from one application to the next by cross-matching biometric databases. In this paper, we demonstrate several methods to generate multiple cancelable identifiers from fingerprint images to overcome these problems. In essence, a user can be given as many biometric identifiers as needed by issuing a new transformation "key." The identifiers can be cancelled and replaced when compromised. We empirically compare the performance of several algorithms such as Cartesian, polar, and surface folding transformations of the minutiae positions. It is demonstrated through multiple experiments that we can achieve revocability and prevent cross-matching of biometric databases. It is also shown that the transforms are noninvertible by demonstrating that it is computationally as hard to recover the original biometric identifier from a transformed version as by randomly guessing. Based on these empirical results and a theoretical analysis we conclude that feature-level cancelable biometric construction is practicable in large biometric deployments.
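The flavor of the polar transformation can be sketched as follows. This is an illustration of the idea only, not the paper's construction: the sector/ring counts, the fixed image centre, and the use of a seeded RNG as the "key" are all assumptions. The point is that a key-dependent, many-to-one remapping of minutiae yields a template that can be cancelled by reissuing the key.

```python
import numpy as np

def cancelable_polar_transform(minutiae, key):
    # Polar-style cancelable transform (illustrative): minutiae are mapped to
    # polar coordinates about the image centre, then key-derived per-(ring,
    # sector) shifts scramble the angular positions. Binning makes the map
    # many-to-one, which is what frustrates inversion by an attacker.
    rng = np.random.default_rng(key)
    centre = np.array([128.0, 128.0])
    d = minutiae - centre
    r = np.hypot(d[:, 0], d[:, 1])
    theta = np.arctan2(d[:, 1], d[:, 0])
    sector = ((theta + np.pi) / (2 * np.pi) * 8).astype(int) % 8
    ring = np.minimum((r / 32).astype(int), 3)
    shift = rng.integers(0, 8, size=(4, 8))      # per-(ring, sector) offsets
    new_sector = (sector + shift[ring, sector]) % 8
    new_theta = new_sector * (2 * np.pi / 8) - np.pi
    return np.column_stack([centre[0] + r * np.cos(new_theta),
                            centre[1] + r * np.sin(new_theta)])

pts = np.array([[40.0, 200.0], [100.0, 90.0], [180.0, 150.0]])
t1 = cancelable_polar_transform(pts, key=7)
t2 = cancelable_polar_transform(pts, key=8)   # reissued key -> new template
print(t1.shape)  # (3, 2)
```

If a template is compromised, issuing a new key produces an unrelated template from the same fingerprint, and different keys per application prevent cross-matching of databases.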
Biomarkers in Japanese Encephalitis: A Review
Kant Upadhyay, Ravi
2013-01-01
JE is a dreadful CNS disease caused by a flavivirus, with high mortality in various pediatric groups. JE disease is currently diagnosed by measuring the level of viral antigens and virus-neutralizing IgM antibodies in blood serum and CSF by ELISA. However, such routine methods cannot measure the various disease-identifying molecules or the structural and molecular changes that occur in tissues and cells. A few important biomarkers, such as cerebrospinal fluid, plasma, neuro-imaging, brain mapping, immunotyping, expression of nonstructural viral proteins, systematic mRNA profiling, DNA and protein microarrays, active caspase-3 activity, reactive oxygen and nitrogen species, levels of stress-associated signaling molecules, and proinflammatory cytokines, could be used to confirm the disease at an earlier stage. These biomarkers may also help to diagnose mutation-based, environment-specific alterations in JEV genotypes causing high pathogenesis, and have immense future applications in diagnostics. There is an utmost need for the development of new, more authentic, appropriate, and reliable physiological, immunological, biochemical, biophysical, molecular, and therapeutic biomarkers to confirm the disease early enough to start clinical aid to patients. Hence, the present review aims to discuss new emerging biomarkers that could facilitate more authentic and faster diagnosis of JE disease and its related disorders in the future. PMID:24455705
Food fraud and the perceived integrity of European food imports into China
Raley, M.; Dean, M.; Clark, B.; Stolz, H.; Home, R.; Chan, M. Y.; Zhong, Q.; Brereton, P.; Frewer, L. J.
2018-01-01
Background/Aims Persistent incidents of food fraud in China have resulted in low levels of consumer trust in the authenticity and safety of food that is domestically produced. We examined the relationship between the concerns of Chinese consumers regarding food fraud, and the role that demonstrating authenticity may play in relieving those concerns. Methods A two-stage mixed-methods research design was adopted. First, qualitative research (focus groups n = 7) was conducted in three Chinese cities, Beijing, Guangzhou and Chengdu, to explore concerns held by Chinese consumers in relation to food fraud. A subsequent quantitative survey (n = 850) tested hypotheses derived from the qualitative research and theoretical literature regarding the relationship between attitudinal measures (including risk perceptions, social trust, and perceptions of benefit associated with demonstrating authenticity), and behavioral intention to purchase “authentic” European products using structural equation modelling. Results Chinese consumers perceive food fraud to be a hazard that represents a food safety risk. Food hazard concern was identified to be geographically influenced. Consumers in Chengdu (tier 2 city) possessed higher levels of hazard concern compared to consumers in Beijing and Guangzhou (tier 1). Structural trust (i.e. trust in actors and the governance of the food supply chain) was not a significant predictor of attitude and intention to purchase authenticated food products. Consumers were shown to have developed ‘risk-relieving’ strategies to compensate for the lack of trust in Chinese food and the dissonance experienced as a consequence of food fraud. Indexical and iconic authenticity cues provided by food manufacturers and regulators were important elements of product evaluations, although geographical differences in their perceived importance were observed. 
Conclusions Targeted communication of authenticity assurance measures, including; regulations; enforcement; product testing; and actions taken by industry may improve Chinese consumer trust in the domestic food supply chain and reduce consumer concerns regarding the food safety risks associated with food fraud. To support product differentiation and retain prestige, European food manufactures operating within the Chinese market should recognise regional disparities in consumer risk perceptions regarding food fraud and the importance of personal risk mitigation strategies adopted by Chinese consumers to support the identification of authentic products. PMID:29791434
Real-Time Food Authentication Using a Miniature Mass Spectrometer.
Gerbig, Stefanie; Neese, Stephan; Penner, Alexander; Spengler, Bernhard; Schulz, Sabine
2017-10-17
Food adulteration is a threat to public health and the economy. In order to determine food adulteration efficiently, rapid and easy-to-use on-site analytical methods are needed. In this study, a miniaturized mass spectrometer in combination with three ambient ionization methods was used for food authentication. The chemical fingerprints of three milk types, five fish species, and two coffee types were measured using electrospray ionization, desorption electrospray ionization, and low temperature plasma ionization. Minimum sample preparation was needed for the analysis of liquid and solid food samples. Mass spectrometric data was processed using the laboratory-built software MS food classifier, which allows for the definition of specific food profiles from reference data sets using multivariate statistical methods and the subsequent classification of unknown data. Applicability of the obtained mass spectrometric fingerprints for food authentication was evaluated using different data processing methods, leave-10%-out cross-validation, and real-time classification of new data. Classification accuracy of 100% was achieved for the differentiation of milk types and fish species, and a classification accuracy of 96.4% was achieved for coffee types in cross-validation experiments. Measurement of two milk mixtures yielded correct classification of >94%. For real-time classification, the accuracies were comparable. Functionality of the software program and its performance is described. Processing time for a reference data set and a newly acquired spectrum was found to be 12 s and 2 s, respectively. These proof-of-principle experiments show that the combination of a miniaturized mass spectrometer, ambient ionization, and statistical analysis is suitable for on-site real-time food authentication.
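The classification step can be pictured with a deliberately minimal stand-in: each food class is summarized by its mean spectral fingerprint (centroid), and an unknown spectrum goes to the nearest centroid. The "MS food classifier" software described above uses richer multivariate statistics; the data, class names, and distance rule here are illustrative only.

```python
import numpy as np

def nearest_centroid_classify(train_X, train_y, x):
    # Minimal fingerprint classifier: summarize each class by its mean
    # spectrum and assign the unknown to the nearest class centroid.
    labels = sorted(set(train_y))
    cents = {c: train_X[np.array(train_y) == c].mean(axis=0) for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(x - cents[c]))

# Synthetic 50-channel "fingerprints" for two hypothetical milk types.
rng = np.random.default_rng(0)
cow = rng.normal(0.0, 0.1, size=(20, 50)) + np.linspace(0, 1, 50)
goat = rng.normal(0.0, 0.1, size=(20, 50)) + np.linspace(1, 0, 50)
X = np.vstack([cow, goat])
y = ["cow milk"] * 20 + ["goat milk"] * 20
print(nearest_centroid_classify(X, y, cow[0]))  # cow milk
```

Leave-10%-out cross-validation, as in the paper, would repeatedly hold out 10% of the reference spectra, rebuild the centroids on the rest, and score the held-out classifications.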
Jin, Chunhua; Xu, Chunxiang; Zhang, Xiaojun; Zhao, Jining
2015-03-01
Radio Frequency Identification (RFID) is an automatic identification technology that can be widely used in healthcare environments to locate and track staff, equipment and patients. However, potential security and privacy problems in RFID systems remain a challenge. In this paper, we design a mutual authentication protocol for RFID based on elliptic curve cryptography (ECC). We use a pre-computing method within the tag's communication, so that our protocol achieves better efficiency. In terms of security, our protocol provides confidentiality, unforgeability, mutual authentication, tag anonymity, availability and forward security. Our protocol also overcomes the weaknesses of existing protocols. Therefore, our protocol is suitable for healthcare environments.
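The paper's protocol is not reproduced here; as an illustration of why tag-side pre-computation helps, the sketch below runs a Schnorr-style challenge-response over a toy elliptic curve. The curve, keys, and message flow are all illustrative assumptions: the point is that the expensive random multiple R = rG is computed offline, leaving the constrained tag only cheap integer arithmetic during the online round.

```python
# Toy elliptic-curve group over GF(97): y^2 = x^3 + 2x + 3 (for illustration
# only; real ECC uses standardized curves over ~256-bit fields).
P_MOD, A = 97, 2
G = (3, 6)                               # base point on the toy curve

def inv(x):
    return pow(x, P_MOD - 2, P_MOD)      # modular inverse via Fermat

def ec_add(p, q):
    # Group law; None represents the point at infinity.
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P_MOD == 0:
        return None
    if p == q:
        lam = (3 * p[0] * p[0] + A) * inv(2 * p[1]) % P_MOD
    else:
        lam = (q[1] - p[1]) * inv(q[0] - p[0]) % P_MOD
    x = (lam * lam - p[0] - q[0]) % P_MOD
    return (x, (lam * (p[0] - x) - p[1]) % P_MOD)

def ec_mul(k, p):
    # Double-and-add scalar multiplication.
    r = None
    while k:
        if k & 1:
            r = ec_add(r, p)
        p = ec_add(p, p)
        k >>= 1
    return r

# Offline (pre-computation): the tag stores r and R = rG before the protocol.
s, r = 13, 5                             # tag's secret key, ephemeral value
S = ec_mul(s, G)                         # tag's public key, known to reader
R = ec_mul(r, G)

# Online: the reader sends a challenge c; the tag answers with one cheap sum.
c = 7
t = r + c * s                            # no scalar multiplication on the tag
ok = ec_mul(t, G) == ec_add(R, ec_mul(c, S))
print(ok)  # True: tG = R + cS, so the reader authenticates the tag
```

Mutual authentication would add a symmetric round in the other direction so the tag also verifies the reader.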
Infante, Carlos; Catanese, Gaetano; Ponce, Marian; Manchado, Manuel
2004-12-15
A novel procedure for the authentication of frigate tunas (Auxis thazard and Auxis rochei) in commercially canned products has been developed. Three mitochondrial regions were simultaneously amplified by multiplex-Polymerase Chain Reaction, one corresponding to the small rRNA 12S subunit as a positive amplification control and two species-specific fragments corresponding to cytochrome b for A. rochei and ATPase 6 for A. thazard, respectively. Testing of two different detection systems revealed the fluorescence-based approach as the most sensitive. The results demonstrate that this rapid, low-cost methodology is a reliable molecular tool for direct application in the authentication of canned products.
Calibration and testing of a Raman hyperspectral imaging system to reveal powdered food adulteration
Lohumi, Santosh; Lee, Hoonsoo; Kim, Moon S.; Qin, Jianwei; Kandpal, Lalit Mohan; Bae, Hyungjin; Rahman, Anisur
2018-01-01
The potential adulteration of foodstuffs has led to increasing concern regarding food safety and security, in particular for powdered food products, where cheap ground materials or hazardous chemicals can be added to increase the quantity of powder or to obtain the desired aesthetic quality. Due to the resulting potential health threat to consumers, the development of a fast, label-free, and non-invasive technique for the detection of adulteration over a wide range of food products is necessary. We therefore report the development of a rapid Raman hyperspectral imaging technique for the detection of food adulteration and for authenticity analysis. The Raman hyperspectral imaging system comprises a custom-designed laser illumination system, a sensing module, and a software interface. The laser illumination system generates a high-power 785 nm laser line, and the Gaussian-like intensity distribution of the laser beam is shaped by incorporating an engineered diffuser. The sensing module utilizes Rayleigh filters, an imaging spectrometer, and a detector to collect the Raman scattering signals along the laser line. Custom-built software acquires the Raman hyperspectral images and facilitates real-time visualization of Raman chemical images of scanned samples. The developed system was employed for the simultaneous detection of Sudan dye and Congo red dye adulteration in paprika powder, and of benzoyl peroxide and alloxan monohydrate adulteration in wheat flour, at six concentrations (w/w) from 0.05 to 1%. The collected Raman imaging data of the adulterated samples were analyzed to visualize and detect the adulterant concentrations by generating a binary image for each individual adulterant. The results based on the Raman chemical images of adulterants showed a strong correlation (R > 0.98) between the added and the pixel-based calculated concentrations of the adulterant materials. 
This developed Raman imaging system can thus be considered a powerful analytical technique for the quality and authenticity analysis of food products. PMID:29708973
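The binary-image step above reduces to thresholding a single-band chemical image and reading off the flagged-pixel fraction. A sketch on synthetic data (the threshold value and the noise model are assumptions, not the paper's calibration):

```python
import numpy as np

def adulterant_fraction(chem_image, threshold):
    # Pixels whose band intensity at an adulterant's Raman peak exceeds the
    # threshold are flagged; the flagged fraction is the pixel-based
    # concentration estimate that is correlated against the added amount.
    return float((chem_image > threshold).mean())

# Synthetic single-band "chemical image": background plus 1% adulterant pixels.
rng = np.random.default_rng(0)
img = rng.normal(0.1, 0.02, size=(100, 100))
idx = rng.choice(10_000, size=100, replace=False)
img.flat[idx] = 1.0                       # 100 of 10,000 pixels adulterated
print(adulterant_fraction(img, threshold=0.5))  # 0.01
```

Repeating this over a series of known spike levels gives the added-versus-estimated concentration line whose correlation (R > 0.98) the paper reports.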
Tipton, Stephen J; Forkey, Sara; Choi, Young B
2016-04-01
This paper examines various methods encompassing the authentication of users in accessing Electronic Medical Records (EMRs). From a methodological perspective, multiple authentication methods have been researched from both desktop and mobile accessibility perspectives. Each method is investigated at a high level, along with comparative analyses and real-world examples. The projected outcome of this examination is a better understanding of the sophistication required in protecting the vital privacy constraints of an individual's Protected Health Information (PHI). In understanding the implications of protecting healthcare data in today's technological world, the scope of this paper is to provide an overview of confidentiality as it pertains to information security. In addressing this topic, the three goals of information security are examined at a high level, with confidentiality as the primary focus. Expanding upon the goal of confidentiality, legal aspects of healthcare accessibility are considered, with a focus upon the Health Insurance Portability and Accountability Act of 1996 (HIPAA). With the primary focus of this examination being access to EMRs, the paper considers two types of accessibility of concern: access by a physician, or group of physicians; and access by an individual patient.
2018-01-01
Background Twenty-three years into democracy, concern is deepening regarding the slow progress of Occupational Therapy (OT) in South Africa, especially with regard to diversity and inclusion within OT. Methods This study explores authentic leadership development primarily among Black OT students attending a pilot Occupational Therapy Association of South Africa (OTASA) National Student Leadership Camp. It seeks to ascertain their perceptions on leadership and leadership development. This descriptive pilot study employs in-depth interviews and subsequent content analysis, with 12 OT students from six university OT programs in South Africa. Findings Four categories of participant perceptions on authentic leadership development emerged from the analysis: (1) perceptions about oneself as a leader based on personal narrative, self-awareness, self-control, and psychological capital; (2) perceptions about others, specifically current leaders, with regard to their moral crisis, including continuing inequality, insincerity, greed, and selfishness; (3) goals and aspirations for leadership development via student camps; and (4) effects of leadership on the system. Conclusions Recommendations for future practice include promotion of storytelling as a means of personal reflection for authentic leadership development and focused investment in camps for developing student leadership skills and building authentic leadership knowledge. PMID:29770106
Low photon count based digital holography for quadratic phase cryptography.
Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Ryle, James P; Healy, John J; Lee, Byung-Geun; Sheridan, John T
2017-07-15
Recently, the vulnerability of the linear canonical transform-based double random phase encryption system to attack has been demonstrated. To alleviate this, we present for the first time, to the best of our knowledge, a method for securing a two-dimensional scene using a quadratic phase encoding system operating in the photon-counted imaging (PCI) regime. Position-phase-shifting digital holography is applied to record the photon-limited encrypted complex samples. The reconstruction of the complex wavefront involves four sparse (undersampled) dataset intensity measurements (interferograms) at two different positions. Computer simulations validate that the photon-limited sparse-encrypted data has adequate information to authenticate the original data set. Finally, security analysis, employing iterative phase retrieval attacks, has been performed.
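As background to the encryption step referenced above, classic double random phase encoding (of which the quadratic phase system is a generalization) can be sketched in a few lines. This is a minimal FFT-based illustration, not the authors' photon-counted holographic implementation; the array sizes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def drpe_encrypt(img, mask1, mask2):
    """Double random phase encoding: one random phase mask applied in the
    spatial domain, a second one applied in the Fourier domain."""
    x = img * np.exp(2j * np.pi * mask1)
    return np.fft.ifft2(np.fft.fft2(x) * np.exp(2j * np.pi * mask2))

def drpe_decrypt(enc, mask1, mask2):
    """Undo each mask with its complex conjugate, in reverse order."""
    X = np.fft.fft2(enc) * np.exp(-2j * np.pi * mask2)
    return np.abs(np.fft.ifft2(X) * np.exp(-2j * np.pi * mask1))

img = rng.random((32, 32))       # stand-in for one 2D scene
m1, m2 = rng.random((32, 32)), rng.random((32, 32))
enc = drpe_encrypt(img, m1, m2)  # complex, noise-like ciphertext
dec = drpe_decrypt(enc, m1, m2)  # recovers the scene
```

Without both masks, the ciphertext is statistically noise-like, which is the property the photon-counted variant further restricts by recording only sparse samples.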
NASA Astrophysics Data System (ADS)
Yang, Yuan; Jia, Yingxue; Yang, Hui
2018-05-01
"The Belt and Road" initiative is a national strategy of China to promote modernization. The Silk Road was not only a channel of commercial exchange but also an artery of Sino-foreign cultural exchange. Putting culture first, taking China's shadow-puppet animation as a reference form and employing modern animation production techniques, the original work depicts two masters' pursuit of the dharma to show the communication and exchange of Buddhist culture along the ancient Silk Road; it seeks innovative methods of expression in digital media production and animation processing, and authentically conveys the art and humanistic styles of China, Japan and India.
2015-01-05
Wang. KinWrite: Handwriting-Based Authentication Using Kinect, Annual Network & Distributed System Security Symposium (NDSS), San Diego, CA, 2013 21...the large variation of different handwriting styles; neighboring characters within a word are usually connected, and we may need to segment a word
Food: Images of America. Social Studies Unit, Elementary Grades 2-6.
ERIC Educational Resources Information Center
Franklin, Edward; And Others
Designed to accompany an audiovisual filmstrip series devoted to presenting a visual history of life in America, this guide contains an elementary school (grades 2-6) unit on American food over the last century. Using authentic visuals including paintings, advertising, label art, documentary photography, and a movie still, the guide offers…
Is the Price Right?: Stereotypes, Co-Production Policy and Irish Television.
ERIC Educational Resources Information Center
Gibbons, Luke
The purpose of this paper is to examine how recent demand for greater realism in portrayals of Irish life in the television and film industries serves to authenticate existing stereotypes and romantic images which characterize "Irishness" in the popular imagination rather than refute or undermine them. Discussions of a political thriller…
Including the Child with Special Needs: Learning from Reggio Emilia
ERIC Educational Resources Information Center
Gilman, Sheryl
2007-01-01
Inclusive education aims toward integrating special needs students into all events of the typical classroom. For North American educators, the process of inclusion does not unfold naturally as in the routines of the Reggio Emilia approach. Reggio's powerful image of the child nourishes the authentic practice of maximizing each child's…
Photographic documentation, a practical guide for non professional forensic photography.
Ozkalipci, Onder; Volpellier, Muriel
2010-01-01
Forensic photography is essential for documentation of evidence of torture. Consent of the alleged victim should be sought in all cases. The article gives information about when and how to take pictures and of what, as well as about image authentication, audit trails, storage, faulty pictures and the kind of camera to use.
A novel biometric authentication approach using ECG and EMG signals.
Belgacem, Noureddine; Fournier, Régis; Nait-Ali, Amine; Bereksi-Reguig, Fethi
2015-05-01
Biometrics offers a secure alternative to traditional methods of identity verification, such as authentication systems based on a user name and password. Recently, it has been found that the electrocardiogram (ECG) signal, formed by five successive waves (P, Q, R, S and T), is unique to each individual. In fact, better than any other biometric measure, it delivers proof that the subject is alive, extra information that other biometrics cannot provide. The main purpose of this work is to present a low-cost method for online acquisition and processing of ECG signals for person authentication, and to study the possibility of providing additional information and retrieving personal data from an electrocardiogram signal to yield a reliable decision. This study explores the effectiveness of a novel biometric system resulting from the fusion of information and knowledge provided by ECG and EMG (electromyogram) physiological recordings. It is shown that biometrics based on these ECG/EMG signals offers a novel way to robustly authenticate subjects. Five ECG databases (MIT-BIH, ST-T, NSR, PTB and ECG-ID) and several ECG signals collected in-house from volunteers were exploited. A palm-based ECG biometric system was developed in which the signals are collected from the palm of the subject through a minimally intrusive one-lead ECG set-up. A total of 3750 ECG beats were used in this work. Feature extraction was performed on ECG signals using Fourier descriptors (spectral coefficients). An Optimum-Path Forest classifier was used to calculate the degree of similarity between individuals. The results obtained from the proposed approach look promising for individual authentication.
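The Fourier-descriptor feature extraction mentioned above can be illustrated with a toy sketch. The synthetic "beats", the nearest-template decision rule and the threshold are illustrative stand-ins (the paper itself applies an Optimum-Path Forest classifier to real ECG beats):

```python
import numpy as np

def fourier_descriptors(beat, n_coeff=10):
    """Magnitudes of the first n_coeff FFT harmonics (DC excluded),
    normalised so the descriptor is amplitude-invariant."""
    spec = np.abs(np.fft.rfft(beat))[1:n_coeff + 1]
    return spec / (np.linalg.norm(spec) + 1e-12)

def authenticate(probe_beat, templates, threshold=0.15):
    """Accept if the probe descriptor is close to any enrolled template
    (a simple nearest-template rule with an illustrative threshold)."""
    f = fourier_descriptors(probe_beat)
    return bool(min(np.linalg.norm(f - t) for t in templates) < threshold)

t = np.linspace(0, 1, 200, endpoint=False)
beat_a = np.sin(2 * np.pi * 3 * t) + 0.4 * np.sin(2 * np.pi * 7 * t)  # toy beat, subject A
beat_b = np.sin(2 * np.pi * 5 * t) + 0.8 * np.sin(2 * np.pi * 11 * t) # toy beat, subject B
templates = [fourier_descriptors(beat_a)]  # enrolment for subject A
```

Here subject A's beat matches the enrolled template while subject B's spectrally different beat is rejected.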
Cuadros-Rodríguez, Luis; Ruiz-Samblás, Cristina; Valverde-Som, Lucia; Pérez-Castaño, Estefanía; González-Casado, Antonio
2016-02-25
Fingerprinting methods describe a variety of analytical methods that provide analytical signals related to the composition of foodstuffs in a non-selective way, such as by collecting a spectrum or a chromatogram. Mathematical processing of the information in such fingerprints may allow the characterisation and/or authentication of foodstuffs. In this context, the particular meaning of 'fingerprinting', in conjunction with 'profiling', is different from the original meanings used in metabolomics. This fact has produced some confusion over the use of these terms in analytical papers. Researchers coming from the metabolomics field may use 'profiling' or 'fingerprinting' in a different way from researchers devoted to food science. The arrival of an eclectic discipline named 'foodomics' has not been enough to allay this terminological problem, since authors keep using the terms with both meanings. Thus, a first goal of this tutorial is to clarify the difference between the two terms. In addition, the chemical approaches to food authentication, i.e., chemical markers, component profiling and instrumental fingerprinting, are described. A new term, 'food identitation', has been introduced in order to complete the life cycle of the chemical-based food authentication process. Chromatographic fingerprinting is explained in detail, and some strategies that could be applied are clarified and discussed. In particular, the strategies for chromatographic signal acquisition and chromatographic data handling are unified in a single framework. Finally, an overview of the applications of chromatographic (GC and LC) fingerprints in food authentication using different chemometric techniques is included. Copyright © 2016 Elsevier B.V. All rights reserved.
Heart Electrical Actions as Biometric Indicia
NASA Technical Reports Server (NTRS)
Schipper, John F. (Inventor); Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor); Belousof, Eugene (Inventor)
2013-01-01
A method and associated system for use of statistical parameters based on peak amplitudes and/or time interval lengths and/or depolarization-repolarization vector angles and/or depolarization-repolarization vector lengths for PQRST electrical signals associated with heart waves, to identify a person. The statistical parameters, estimated to be at least 192, serve as biometric indicia, to authenticate, or to decline to authenticate, an asserted identity of a candidate person.
DNA Barcoding for the Identification and Authentication of Animal Species in Traditional Medicine.
Yang, Fan; Ding, Fei; Chen, Hong; He, Mingqi; Zhu, Shixin; Ma, Xin; Jiang, Li; Li, Haifeng
2018-01-01
Animal-based traditional medicine not only plays a significant role in therapeutic practices worldwide but also provides a potential compound library for drug discovery. However, persistent hunting and illegal trade markedly threaten numerous medicinal animal species, and increasing demand further provokes the emergence of various adulterants. As the conventional methods are difficult and time-consuming to detect processed products or identify animal species with similar morphology, developing novel authentication methods for animal-based traditional medicine represents an urgent need. During the last decade, DNA barcoding offers an accurate and efficient strategy that can identify existing species and discover unknown species via analysis of sequence variation in a standardized region of DNA. Recent studies have shown that DNA barcoding as well as minibarcoding and metabarcoding is capable of identifying animal species and discriminating the authentics from the adulterants in various types of traditional medicines, including raw materials, processed products, and complex preparations. These techniques can also be used to detect the unlabelled and threatened animal species in traditional medicine. Here, we review the recent progress of DNA barcoding for the identification and authentication of animal species used in traditional medicine, which provides a reference for quality control and trade supervision of animal-based traditional medicine.
Self-Assembled Resonance Energy Transfer Keys for Secure Communication over Classical Channels.
Nellore, Vishwa; Xi, Sam; Dwyer, Chris
2015-12-22
Modern authentication and communication protocols increasingly use physical keys in lieu of conventional software-based keys for security. This shift is primarily driven by the ability to derive a unique, unforgeable signature from a physical key. The sole demonstration of an unforgeable key, thus far, has been through quantum key distribution, which suffers from limited communication distances and expensive infrastructure requirements. Here, we show a method for creating unclonable keys by molecular self-assembly of resonance energy transfer (RET) devices. It is infeasible to clone the RET-key due to the inability to characterize the key using current technology, the large number of input-output combinations per key, and the variation of the key's response with time. However, the manufacturer can produce multiple identical devices, which enables inexpensive, secure authentication and communication over classical channels, and thus any distance. Through a detailed experimental survey of the nanoscale keys, we demonstrate that legitimate users are successfully authenticated 99.48% of the time and the false-positives are only 0.39%, over two attempts. We estimate that a legitimate user would have a computational advantage of more than 10^340 years over an attacker. Our method enables the discovery of physical key based multiparty authentication and communication schemes that are both practical and possess unprecedented security.
Kahl, Johannes; Busscher, Nicolaas; Mergardt, Gaby; Mäder, Paul; Torp, Torfinn; Ploeger, Angelika
2015-01-01
There is a need for authentication tools in order to verify the existing certification system. Recently, markers for analytical authentication of organic products were evaluated. Herein, crystallization with additives was described as an interesting fingerprint approach which needs further evidence, based on a standardized method and well-documented sample origin. The fingerprint of wheat cultivars from a controlled field trial is generated from structure analysis variables of crystal patterns. Method performance was tested on factors such as crystallization chamber, day of experiment and region of interest of the patterns. Two different organic treatments and two different treatments of the non-organic regime can be grouped together in each of three consecutive seasons. When the k-nearest-neighbor classification method was applied, approximately 84% of Runal samples and 95% of Titlis samples were classified correctly into organic and non-organic origin using cross-validation. Crystallization with additives offers an interesting complementary fingerprint method for organic wheat samples. When the method is applied to winter wheat from the DOK trial, organic and non-organic treated samples can be differentiated significantly based on pattern recognition. Therefore crystallization with additives seems to be a promising tool in organic wheat authentication. © 2014 Society of Chemical Industry.
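The k-nearest-neighbor classification with cross-validation reported above can be sketched on synthetic stand-in data. The feature values, cluster separation and leave-one-out scheme below are illustrative assumptions, not the DOK-trial measurements:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Majority vote among the k nearest training samples (Euclidean)."""
    idx = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    return np.bincount(y_train[idx]).argmax()

def loo_accuracy(X, y, k=3):
    """Leave-one-out cross-validation accuracy: each sample is classified
    by a model trained on all the others."""
    hits = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        hits += knn_predict(X[mask], y[mask], X[i], k) == y[i]
    return hits / len(X)

rng = np.random.default_rng(1)
# toy stand-ins for structure-analysis variables of crystal patterns
organic     = rng.normal(0.0, 0.3, size=(20, 4))
non_organic = rng.normal(1.5, 0.3, size=(20, 4))
X = np.vstack([organic, non_organic])
y = np.array([0] * 20 + [1] * 20)
acc = loo_accuracy(X, y)
```

On real fingerprint data the classes overlap more, which is why the reported rates are around 84-95% rather than perfect.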
Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions
Chiou, Shin-Yan
2013-01-01
Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, certification in biometric systems need not achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied. PMID:23762851
Face Liveness Detection Using Defocus
Kim, Sooyeon; Ban, Yuseok; Lee, Sangyoun
2015-01-01
In order to develop security systems for identity authentication, face recognition (FR) technology has been applied. One of the main problems of applying FR technology is that the systems are especially vulnerable to attacks with spoofing faces (e.g., 2D pictures). To defend from these attacks and to enhance the reliability of FR systems, many anti-spoofing approaches have been recently developed. In this paper, we propose a method for face liveness detection using the effect of defocus. From two images sequentially taken at different focuses, three features, focus, power histogram and gradient location and orientation histogram (GLOH), are extracted. Afterwards, we detect forged faces through the feature-level fusion approach. For reliable performance verification, we develop two databases with a handheld digital camera and a webcam. The proposed method achieves a 3.29% half total error rate (HTER) at a given depth of field (DoF) and can be extended to camera-equipped devices, like smartphones. PMID:25594594
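A common way to quantify the defocus effect exploited above is a focus measure such as the variance of a discrete Laplacian, which drops when an image is blurred. This is a generic sketch of such a focus feature, not the paper's full focus/power-histogram/GLOH pipeline; the box-blur stands in for optical defocus:

```python
import numpy as np

def focus_measure(img):
    """Variance of a discrete Laplacian: high for sharp (in-focus) content,
    low for defocused content."""
    lap = (-4 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return lap.var()

def box_blur(img, r=2):
    """Crude defocus simulation: a (2r+1) x (2r+1) mean filter."""
    acc = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            acc += np.roll(np.roll(img, dy, 0), dx, 1)
    return acc / (2 * r + 1) ** 2

rng = np.random.default_rng(2)
sharp = rng.random((64, 64))   # texture-rich stand-in for an in-focus patch
blurred = box_blur(sharp)      # the same patch, defocused
```

Comparing such a measure across two captures at different focus settings is one way to separate a 3D face (whose regions defocus differently) from a flat printed photo.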
Transmission and storage of medical images with patient information.
Acharya U, Rajendra; Subbanna Bhat, P; Kumar, Sathish; Min, Lim Choo
2003-07-01
Digital watermarking is a technique of hiding specific identification data for copyright authentication. This technique is adapted here for interleaving patient information with medical images, to reduce storage and transmission overheads. The text data is encrypted before interleaving with images to ensure greater security. The graphical signals are interleaved with the image. Two types of error control coding techniques are proposed to enhance the reliability of transmission and storage of medical images interleaved with patient information. Transmission and storage scenarios are simulated with and without error control coding, and a qualitative as well as quantitative interpretation of the reliability enhancement resulting from the use of commonly used error control codes, such as repetition and (7,4) Hamming codes, is provided.
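The (7,4) Hamming code named above is a textbook single-error-correcting code: 4 data bits become a 7-bit codeword, and any one flipped bit can be located and corrected. A minimal sketch (not the authors' implementation):

```python
import numpy as np

# Systematic generator and parity-check matrices for the (7,4) Hamming code
# (arithmetic mod 2). G = [I4 | P], H = [P^T | I3].
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(bits4):
    """4 data bits -> 7-bit codeword."""
    return (bits4 @ G) % 2

def decode(word7):
    """Correct up to one bit error, then return the 4 data bits."""
    syndrome = (H @ word7) % 2
    if syndrome.any():
        # the syndrome equals the column of H at the error position
        err = int(np.argmax((H.T == syndrome).all(axis=1)))
        word7 = word7.copy()
        word7[err] ^= 1
    return word7[:4]
```

A repetition code, by contrast, simply sends each bit several times and takes a majority vote, trading much more overhead for simpler decoding.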
Bouslimi, D; Coatrieux, G; Roux, Ch
2011-01-01
In this paper, we propose a new joint watermarking/encryption algorithm for the purpose of verifying the reliability of medical images in both the encrypted and the spatial domains. It combines a substitutive watermarking algorithm, quantization index modulation (QIM), with a block cipher algorithm, the Advanced Encryption Standard (AES), in CBC mode of operation. The proposed solution gives access to the outcomes of the image integrity check and of its origins even while the image is stored encrypted. Experimental results achieved on 8-bit encoded ultrasound images illustrate the overall performance of the proposed scheme. By making use of the AES block cipher in CBC mode, the proposed solution is compliant with, or transparent to, the DICOM standard.
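The quantization index modulation (QIM) component mentioned above embeds a bit by quantizing a sample onto one of two interleaved lattices; extraction picks the nearer lattice. This is a generic scalar-QIM sketch: the step size is an illustrative choice, and the AES-CBC encryption half of the joint scheme is omitted:

```python
import numpy as np

DELTA = 8  # quantization step: larger means more robust but more distortion

def qim_embed(x, bit):
    """Quantize x onto the lattice for `bit`: multiples of DELTA for bit 0,
    multiples of DELTA shifted by DELTA/2 for bit 1."""
    d = bit * DELTA / 2
    return np.round((x - d) / DELTA) * DELTA + d

def qim_extract(y):
    """Recover the bit as whichever lattice lies nearer to y."""
    d0 = np.abs(y - qim_embed(y, 0))
    d1 = np.abs(y - qim_embed(y, 1))
    return int(d1 < d0)
```

Extraction survives any perturbation smaller than DELTA/4, which is what lets such watermarks be read back after mild processing.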
Simulation of millimeter-wave body images and its application to biometric recognition
NASA Astrophysics Data System (ADS)
Moreno-Moreno, Miriam; Fierrez, Julian; Vera-Rodriguez, Ruben; Parron, Josep
2012-06-01
One of the emerging applications of the millimeter-wave imaging technology is its use in biometric recognition. This is mainly due to some properties of the millimeter-waves such as their ability to penetrate through clothing and other occlusions, their low obtrusiveness when collecting the image and the fact that they are harmless to health. In this work we first describe the generation of a database comprising 1200 synthetic images at 94 GHz obtained from the body of 50 people. Then we extract a small set of distance-based features from each image and select the best feature subsets for person recognition using the SFFS feature selection algorithm. Finally these features are used in body geometry authentication obtaining promising results.
Yan, Hui; Duan, Jin-ao; Qian, Da-wei; Su, Shu-lan; Song, Bing-sheng; He, Zi-qing
2011-04-01
To evaluate the relationship between inorganic elements and the genuineness and invigoration efficacy of this medicinal material, the inorganic elements in Angelica sinensis and its corresponding soil were analyzed qualitatively and quantitatively. The contents of 14 inorganic elements in 40 samples from 4 main habitats of Angelica sinensis in China were determined by ICP-AES. Between Angelica sinensis and its corresponding soil, significant positive correlations existed for each pair of Ca, Na and Ni. The enrichment coefficient of Mg by Angelica sinensis showed a certain peculiarity. The analysis showed that Zn, Cu, Mn and Mg were more distinctive indicators of the geo-authenticity of Angelica sinensis than the other elements. The results appear to confirm the traditional knowledge that Mingui is regarded as a geo-authentic crude drug, and the inorganic elements in Angelica sinensis may indeed be correlated with its geo-authenticity. These results can provide a scientific basis for understanding the geo-authentic nature of Angelica sinensis and its active material base.
Development and optimization of an efficient qPCR system for olive authentication in edible oils.
Alonso-Rebollo, Alba; Ramos-Gómez, Sonia; Busto, María D; Ortega, Natividad
2017-10-01
The applicability of qPCR in olive-oil authentication depends on the DNA obtained from the oils and the amplification primers. Therefore, four olive-specific amplification systems based on the trnL gene were designed (A-, B-, C- and D-trnL systems). The qPCR conditions, primer concentration and annealing temperature, were optimized. The systems were tested for efficiency and sensitivity to select the most suitable for olive oil authentication. The selected system (D-trnL) demonstrated specificity toward olive in contrast to other oleaginous species (canola, soybean, sunflower, maize, peanut and coconut) and showed high sensitivity in a broad linear dynamic range (LOD and LOQ: 500 ng to 0.0625 pg). This qPCR system enabled detection, with high sensitivity and specificity, of olive DNA isolated from oils processed in different ways, establishing it as an efficient method for the authentication of olive oil regardless of its category. Copyright © 2017 Elsevier Ltd. All rights reserved.
Privacy-Preserving Authentication Using a Double Pseudonym for Internet of Vehicles
Xu, Wenyu; Zhang, Jing; Xu, Yan; Liu, Lu
2018-01-01
The Internet of Vehicles (IoV) plays an important role in smart transportation to reduce drivers' risk of having an accident and help them manage small emergencies. Therefore, security and privacy issues of the messages broadcast by the tamper-proof device (TPD) to other vehicles and roadside units (RSUs) have become an important research subject in the field of smart transportation. Many authentication schemes have been proposed to tackle the challenges above, and most of them are heavy in computation and communication. In this paper, we propose a novel authentication scheme that utilizes the double pseudonym method to hide the real identity of vehicles and adopts dynamic update technology to periodically update the information (such as member secret, authentication key, internal pseudo-identity) stored in the tamper-proof device to prevent side-channel attacks. Because it does not use bilinear pairing, our scheme yields a better performance in terms of computation overhead and communication overhead, and is more suitable for application in the Internet of Vehicles. PMID:29735941
Chemical composition analysis and authentication of whisky.
Wiśniewska, Paulina; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek
2015-08-30
Whisky (whiskey) is one of the most popular spirit-based drinks, made from malted or saccharified grains that should mature for at least 3 years in wooden barrels. The high popularity of such products usually carries a potential risk of adulteration; thus authenticity assessment is one of the key elements of food product marketing. Authentication of whisky is based on comparing the composition of this alcohol with that of other spirit drinks. The present review summarizes information about the comparison of whisky with other alcoholic beverages, the identification of the type of whisky, the assessment of its quality and, finally, the authentication of whisky. The article also presents the various techniques used for analyzing whisky, such as gas and liquid chromatography with different types of detectors (FID, AED, UV-Vis), the electronic nose, atomic absorption spectroscopy and mass spectrometry. In some cases the application of chemometric methods is also described, namely PCA, DFA, LDA, ANOVA, SIMCA, PNN, k-NN and CA, as well as preparation techniques such as SPME or SPE. © 2014 Society of Chemical Industry.
Analytical methods used for the authentication of food of animal origin.
Abbas, Ouissam; Zadravec, Manuela; Baeten, Vincent; Mikuš, Tomislav; Lešić, Tina; Vulić, Ana; Prpić, Jelena; Jemeršić, Lorena; Pleadin, Jelka
2018-04-25
Adulteration can have serious consequences for human health and affects market growth by destroying consumer confidence. Therefore, authentication of food is important for food processors, retailers and consumers, but also for regulatory authorities. However, the complex nature of food and an increase in the types of adulterants make their detection difficult, so food authentication often poses a challenge. This review focuses on analytical approaches to the authentication of food of animal origin, with an emphasis on the determination of specific ingredients, geographical origin and adulteration by substitution. It provides a current overview of the application of targeted approaches, for cases in which the compound of interest is known, and non-targeted approaches for screening issues. The papers cited herein mainly concern milk, cheese, meat and honey. Moreover, the advantages, disadvantages and challenges regarding the use of both approaches in official food control and in the food industry are examined. Copyright © 2017 Elsevier Ltd. All rights reserved.
Hawthorne, Margaret; LaNoue, Marianna; Brenner, Jeffrey
2016-01-01
Abstract In the movement to improve the health of patients with multiple chronic conditions and vulnerabilities, while reducing the need for hospitalizations, care management programs have garnered wide attention and support. The qualitative data presented in this paper sheds new light on key components of successful chronic care management programs. By going beyond a task- and temporal-based framework, this analysis identifies and defines the importance of “authentic healing relationships” in driving individual and systemic change. Drawing on the voices of 30 former clients of the Camden Coalition of Healthcare Providers, the investigators use qualitative methods to identify and elaborate the core elements of the authentic healing relationship—security, genuineness, and continuity—a relationship that is linked to patient motivation and active health management. Although not readily found in the traditional health care delivery system, these authentic healing relationships present significant implications for addressing the persistent health-related needs of patients with frequent hospitalizations. (Population Health Management 2016;19:248–256) PMID:26565379
Teodoro, Janaína Aparecida Reis; Pereira, Hebert Vinicius; Sena, Marcelo Martins; Piccin, Evandro; Zacca, Jorge Jardim; Augusti, Rodinei
2017-12-15
A direct method based on the application of paper spray mass spectrometry (PS-MS) combined with a chemometric supervised method (partial least squares discriminant analysis, PLS-DA) was developed and applied to the discrimination of authentic and counterfeit samples of blended Scottish whiskies. The developed methodology employed negative ion mode MS and included 44 authentic whiskies from diverse brands and batches and 44 counterfeit samples of the same brands seized during operations of the Brazilian Federal Police, totaling 88 samples. An exploratory principal component analysis (PCA) model showed a reasonable discrimination of the counterfeit whiskies along PC2. In spite of the samples' heterogeneity, a robust, reliable and accurate PLS-DA model was generated and validated, which was able to correctly classify the samples with nearly a 100% success rate. The use of PS-MS also allowed the identification of the main marker compounds associated with each type of sample analyzed: authentic or counterfeit. Copyright © 2017 Elsevier Ltd. All rights reserved.
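The exploratory PCA step described above can be sketched with a plain SVD on synthetic "spectra". The data and the degree of separation are invented for illustration, and the validated supervised model in the paper (PLS-DA) is not shown:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project mean-centred rows of X onto the top principal components,
    obtained from the SVD of the centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(3)
# toy "mass spectra": authentic samples share one base profile, counterfeits another
base_auth = rng.random(50)
base_fake = base_auth + 0.5 * rng.random(50)
authentic   = base_auth + rng.normal(0, 0.05, size=(15, 50))
counterfeit = base_fake + rng.normal(0, 0.05, size=(15, 50))
scores = pca_scores(np.vstack([authentic, counterfeit]))
```

With this toy separation, the two groups fall apart along the first component; real seized samples are messier, which is why PCA there was only "reasonable" while PLS-DA achieved near-perfect classification.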
Ferreiro-González, Marta; Barbero, Gerardo F; Álvarez, José A; Ruiz, Antonio; Palma, Miguel; Ayuso, Jesús
2017-04-01
Adulteration of olive oil is not only a major economic fraud but can also have major health implications for consumers. In this study, a combination of visible spectroscopy with a novel multivariate curve resolution method (CR), principal component analysis (PCA) and linear discriminant analysis (LDA) is proposed for the authentication of virgin olive oil (VOO) samples. VOOs are well-known products with the typical properties of a two-component system due to the two main groups of compounds that contribute to the visible spectra (chlorophylls and carotenoids). Application of the proposed CR method to VOO samples provided the two pure-component spectra for the aforementioned families of compounds. A correlation study of the real spectra and the resolved component spectra was carried out for different types of oil samples (n=118). LDA using the correlation coefficients as variables to discriminate samples allowed the authentication of 95% of virgin olive oil samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Call progress time measurement in IP telephony
NASA Astrophysics Data System (ADS)
Khasnabish, Bhumip
1999-11-01
Usually a voice call is established through multiple stages in IP telephony. In the first stage, a phone number is dialed to reach a near-end or call-originating IP-telephony gateway. The next stages involve user identification, through delivering an m-digit user-id to the authentication and/or billing server, and then user authentication using an n-digit PIN. After that, provided authentication is successful, the caller receives a last-stage dial tone and is allowed to dial a destination phone number. In this paper, we present a very flexible method for measuring call progress time in IP telephony. The proposed technique can be used to measure the system response time at every stage. It is flexible in that it can easily be modified so that a new tone, a set of tones, or a `voice begin' event is used at every stage to detect the system's response. The proposed method has been implemented using scripts written in the Hammer visual basic language for testing with a few commercially available IP telephony gateways.
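The per-stage response-time measurement described above can be sketched generically. The stage names are placeholders, and the sleeps stand in for waiting on the gateway's actual tones or `voice begin' events:

```python
import time

class StageTimer:
    """Record the elapsed time of each call-setup stage (dial tone,
    user-id/PIN authentication, destination dial) as it completes."""
    def __init__(self):
        self.t0 = time.perf_counter()
        self.stages = {}

    def mark(self, name):
        """Close the current stage and start timing the next one."""
        now = time.perf_counter()
        self.stages[name] = now - self.t0
        self.t0 = now

timer = StageTimer()
time.sleep(0.01)                    # stand-in: waiting for the gateway's dial tone
timer.mark("near_end_dial_tone")
time.sleep(0.01)                    # stand-in: user-id + PIN authentication
timer.mark("authentication")
```

Swapping each sleep for a tone- or speech-detection callback gives a per-stage response-time log of the kind the abstract describes.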
Farabegoli, Federica; Pirini, Maurizio; Rotolo, Magda; Silvi, Marina; Testi, Silvia; Ghidini, Sergio; Zanardi, Emanuela; Remondini, Daniel; Bonaldo, Alessio; Parma, Luca; Badiani, Anna
2018-06-08
The authenticity of fish products has become an imperative issue for authorities involved in protecting consumers against fraudulent practices and in stabilizing the market. The present study aimed to provide a method for the authentication of European sea bass (Dicentrarchus labrax) according to the requirements for seafood labels (Regulation 1379/2013/EU). Data on biometric traits, fatty acid profile, elemental composition, and isotopic abundance of wild and reared (intensively, semi-intensively and extensively) specimens from 18 Southern European sources (n = 160) were collected, clustered into 6 sets of parameters, and then subjected to multivariate analysis. Correct allocation of specimens according to their production method, origin and stocking density was demonstrated with good accuracy (94%, 92% and 92%, respectively) using fatty acid profiles. Less satisfactory results were obtained using isotopic abundance, biometric traits, and elemental composition. The multivariate analysis also revealed that extensively reared specimens cannot be analytically discriminated from wild ones.
Li, Lin-Qiu; Baibado, Joewel T; Shen, Qing; Cheung, Hon-Yeung
2017-12-01
Plastron is a nutritive and superior functional food. Owing to its limited supply yet enormous demand, functional foods supposed to contain plastron may be forged with other substitutes. This paper reports a novel and simple method for determining the authenticity of plastron-derived functional foods, based on comparison of the amino acid (AA) profiles of plastron and its possible substitutes. By applying micellar electrokinetic chromatography (MEKC), 18 common AAs along with 2 special AAs, hydroxyproline (Hyp) and hydroxylysine (Hyl), were detected in all plastron samples. Since chicken, egg, fish, milk, pork, nail and hair lack Hyp and Hyl, plastron could easily be distinguished from them. For substitutes containing collagen, principal component analysis (PCA) was adopted and plastron was still successfully distinguished. When the proposed method was applied to authenticate turtle shell glue on the market, fake products were commonly found.
A new method of enhancing telecommand security: the application of GCM in TC protocol
NASA Astrophysics Data System (ADS)
Zhang, Lei; Tang, Chaojing; Zhang, Quan
2007-11-01
In recent times, security has grown into a topic of major importance for space missions. Many space agencies have been engaged in research on the selection of proper algorithms for ensuring Telecommand security in the space communication environment, especially with regard to privacy and authentication. Since space missions with high security levels need to ensure both privacy and authentication, Authenticated Encryption with Associated Data (AEAD) schemes must be integrated into normal Telecommand protocols. This paper provides an overview of the Galois/Counter Mode (GCM) of operation, one of the available two-pass AEAD schemes, together with some preliminary considerations and analyses of its possible application to the Telecommand frames specified by CCSDS.
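GCM combines counter-mode AES encryption with a GHASH-based authentication tag computed over both the ciphertext and the associated data. As a library-free illustration of that AEAD interface only (this is not GCM and not for real use), the sketch below uses a SHA-256 keystream for encryption and HMAC for the tag; in a Telecommand frame, the cleartext header would play the role of the associated data.

```python
import hashlib
import hmac

def _keystream(key, nonce, n):
    """Toy keystream from SHA-256 in counter mode (GCM itself runs AES
    in counter mode; this stand-in is for illustration only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def seal(key, nonce, plaintext, associated_data):
    """AEAD encryption: the data field is encrypted, and the tag
    authenticates both the ciphertext and the cleartext header."""
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, associated_data + nonce + ct, hashlib.sha256).digest()[:16]
    return ct, tag

def open_(key, nonce, ct, associated_data, tag):
    """AEAD decryption: verify the tag before releasing any plaintext."""
    expect = hmac.new(key, associated_data + nonce + ct, hashlib.sha256).digest()[:16]
    if not hmac.compare_digest(expect, tag):
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))

key, nonce = b"\x01" * 32, b"\x02" * 12
header = b"TC frame header"            # authenticated but not encrypted
ct, tag = seal(key, nonce, b"command data", header)
```

The key property shown is the one that matters for Telecommand: any modification of the header or ciphertext makes verification fail before decryption releases anything.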
Student’s critical thinking skills in authentic problem based learning
NASA Astrophysics Data System (ADS)
Yuliati, L.; Fauziah, R.; Hidayat, A.
2018-05-01
This study aims to determine students' critical thinking skills in authentic problem-based learning, especially on geometric optics. The study was conducted at a vocational school and used a quantitative descriptive method with open questions to measure critical thinking skills. The indicators of critical thinking skills measured in this study are: formulating problems, providing simple answers, applying formulas and procedures, analyzing information, making conclusions, and synthesizing ideas. The results showed a positive change in students' critical thinking skills, with an average N-Gain of 0.59 and an effect size of 3.73. Students' critical thinking skills need to be trained more intensively using authentic problems from daily life.
Meng, Xianjing; Yin, Yilong; Yang, Gongping; Xi, Xiaoming
2013-07-18
Retinal identification, based on the vascular patterns of the retina, provides the most secure and accurate means of authentication among biometrics and has primarily been used in combination with access control systems at high-security facilities. Recently, there has been much interest in retina identification. Because digital retina images always suffer from deformations, the Scale Invariant Feature Transform (SIFT), known for its distinctiveness and invariance to scale and rotation, has been introduced to retina-based identification. However, SIFT-based identification suffers from shortcomings such as difficult feature extraction and mismatching. To solve these problems, a novel preprocessing method based on the Improved Circular Gabor Transform (ICGF) is proposed. After further processing by an iterated spatial anisotropic smoothing method, the number of uninformative SIFT keypoints is decreased dramatically. Tested on the VARIA database and eight simulated retina databases combining rotation and scaling, the developed method presents promising results and shows robustness to rotations and scale changes.
ERIC Educational Resources Information Center
Munawaroh
2017-01-01
Vocational high schools must change students' mind-set so that they are sure they can become entrepreneurs, which can be better and nobler than becoming employees. This research aimed to determine the effect of teachers' ability in practicing the method of Authentic Problem Based Learning (APBL) and students' attitude on the development of…
Investigating Background Pictures for Picture Gesture Authentication
2017-06-01
The military relies heavily on computer systems. Without a strong method of authentication…
NASA Astrophysics Data System (ADS)
Pérez-Cabré, Elisabet; Millán, María S.; Javidi, Bahram
2006-09-01
Verification of a piece of information and/or authentication of a given object or person are common operations carried out by automatic security systems that can be applied, for instance, to control the entrance to restricted areas, access to public buildings, identification of cardholders, etc. Vulnerability of such security systems may depend on the ease of counterfeiting the information used as a piece of identification for verification and authentication. To protect data against tampering, the signature that identifies an object is usually encrypted to prevent easy recognition at human sight and easy reproduction using conventional imaging or scanning devices. To make counterfeiting even more difficult, we propose to combine data from the visible and near-infrared (NIR) spectral bands. By doing this, neither the visible content nor the NIR data by themselves are sufficient to allow signature recognition and thus identification of a given object; only the appropriate combination of both signals permits a satisfactory authentication. In addition, the resulting signature is encrypted following a fully phase encryption technique and the obtained complex-amplitude distribution is encoded on an ID tag. Spatial multiplexing of the encrypted signature allows us to build a distortion-invariant ID tag, so that remote authentication can be achieved even if the tag is captured under rotation or at different distances. We also explore the possibility of using partial information of the encrypted signature to simplify the ID tag design.
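The fully phase encryption step can be illustrated with the classical double random phase encoding (DRPE) scheme that this family of techniques builds on: the signature is mapped to a phase-only function and multiplied by random phase masks in the input and Fourier planes. This is a generic DRPE sketch under simplified FFT-based scalar-diffraction assumptions, not the authors' exact optical setup.

```python
import numpy as np

def drpe_encrypt(signature, rng):
    """Double random phase encoding of a fully phase signature
    (values in [0, 1) mapped to phase)."""
    f = np.exp(2j * np.pi * signature)                     # fully phase input
    p1 = np.exp(2j * np.pi * rng.random(signature.shape))  # input-plane mask
    p2 = np.exp(2j * np.pi * rng.random(signature.shape))  # Fourier-plane mask
    cipher = np.fft.ifft2(np.fft.fft2(f * p1) * p2)
    return cipher, (p1, p2)

def drpe_decrypt(cipher, masks):
    """Invert both masks and recover the phase-encoded signature."""
    p1, p2 = masks
    f = np.fft.ifft2(np.fft.fft2(cipher) / p2) / p1
    return np.mod(np.angle(f) / (2 * np.pi), 1.0)

rng = np.random.default_rng(1)
signature = rng.random((32, 32))           # normalized signature in [0, 1)
cipher, masks = drpe_encrypt(signature, rng)
recovered = drpe_decrypt(cipher, masks)
```

Without both masks the ciphertext is noise-like, which is why the abstract can discard easy recognition at human sight while still allowing exact recovery for the authorized receiver.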
An optical watermarking solution for color personal identification pictures
NASA Astrophysics Data System (ADS)
Tan, Yi-zhou; Liu, Hai-bo; Huang, Shui-hua; Sheng, Ben-jian; Pan, Zhong-ming
2009-11-01
This paper presents a new approach for embedding authentication information into images on printed materials, based on an optical projection technique. Our experimental setup consists of two parts: a common camera and an LCD projector, which projects a pattern onto a person's body (especially the face). The pattern, generated by a computer, acts as an illumination light source with a sinusoidal distribution and is also the watermark signal. For a color image, the watermark is embedded into the blue channel. While pictures are taken (256×256, 512×512 and 567×390 pixels, respectively), an invisible mark is embedded directly into the magnitude coefficients of the Discrete Fourier Transform (DFT) at the moment of exposure. Both optical and digital correlation are suitable for detecting this type of watermark. The decoded watermark is a set of concentric circles or sectors in the DFT domain (middle-frequency region) that is robust to photographing, printing and scanning. Unlawful people may modify or replace the original photograph to make fake passports, drivers' licenses and so on; experiments show that it is difficult to forge certificates in which a watermark was embedded by our projector-camera combination, based on this analogue watermarking method rather than a classical digital method.
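A minimal digital analogue of the DFT-domain mark can be sketched with NumPy: add energy along a mid-frequency ring of the magnitude spectrum and detect it by comparing ring energy to the background. The ring radius, strength, and detection statistic are illustrative choices, not the authors' parameters, and the optical projection step is not modeled.

```python
import numpy as np

def ring_mask(shape, radius, width=2.0):
    """Boolean mask selecting a mid-frequency ring in a centered spectrum."""
    h, w = shape
    y, x = np.ogrid[:h, :w]
    d = np.sqrt((y - h / 2) ** 2 + (x - w / 2) ** 2)
    return np.abs(d - radius) < width

def embed(img, radius=60, strength=500.0):
    """Add energy along a ring of the DFT magnitude, keeping the phase."""
    F = np.fft.fftshift(np.fft.fft2(img))
    mag, phase = np.abs(F), np.angle(F)
    mag[ring_mask(img.shape, radius)] += strength
    return np.real(np.fft.ifft2(np.fft.ifftshift(mag * np.exp(1j * phase))))

def detect(img, radius=60):
    """Detection statistic: mean ring magnitude over mean background magnitude."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    m = ring_mask(img.shape, radius)
    return mag[m].mean() / mag[~m].mean()

rng = np.random.default_rng(0)
photo = rng.uniform(0, 255, (128, 128))    # stand-in for the blue channel
marked = embed(photo)
```

A ring pattern is a natural choice here because its radius in the DFT magnitude survives rotation and, predictably, rescaling, which is what makes the mark robust to re-photographing, printing and scanning.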
Mishra, Priyanka; Shukla, Ashutosh K; Sundaresan, Velusamy
2018-01-01
Senna alexandrina (Fabaceae) is a globally recognized medicinal plant, valued for its laxative properties and as the only source of sennosides, and is a highly exported bulk herb from India. Its procurement is almost exclusively from limited cultivation, which leads to risks of deliberate or unintended adulteration. The raw materials on the market are in powdered or finished-product form, which makes authentication difficult. Here, DNA barcode tags based on chloroplast genes (rbcL and matK) and intergenic spacers (psbA-trnH and ITS) were developed for S. alexandrina along with allied species. The ability of the ITS1 region to discriminate among the Senna species led to the present proposal of ITS1 tags as a successful barcode. These tags were then coupled with high-resolution melting (HRM) curve analysis in a real-time PCR genotyping method to derive Bar-HRM (Barcoding-HRM) assays. Suitable HRM primer sets were designed through SNP detection and mutation scanning in genomic signatures of Senna species. The melting profiles of S. alexandrina and S. italica subsp. micrantha were almost identical, while the remaining five species were clearly separated and could be differentiated by the HRM method. The sensitivity of the method was used to authenticate market samples [Herbal Sample Assays (HSAs)]. HSA01 (an S. alexandrina crude drug sample from Bangalore) and HSA06 (an S. alexandrina crude drug sample from Tuticorin, Tamil Nadu, India) were found to be highly contaminated with S. italica subsp. micrantha. Species admixtures mixed in varying percentages were identified sensitively, with detection of contamination as low as 1%. The melting profiles of the PCR amplicons are clearly distinct, which enables authentic differentiation of the species by the HRM method.
This study reveals that DNA barcoding coupled with HRM is an efficient molecular tool to authenticate Senna herbal products in the market for quality control in the drug supply chain. CIMAP Communication Number: CIMAP/PUB/2017/31.
Reversible Watermarking Surviving JPEG Compression.
Zain, J; Clarke, M
2005-01-01
This paper discusses the properties of watermarking medical images and the possibility of such images being compressed by JPEG, with an overview of JPEG compression. We then propose a watermarking scheme that is reversible and robust to JPEG compression, the purpose being to verify the integrity and authenticity of medical images. We used 800×600×8-bit ultrasound (US) images in our experiment. The SHA-256 hash of the image is embedded in the least significant bits (LSBs) of an 8×8 block in the Region of Non-Interest (RONI). The image is then compressed using JPEG and decompressed using Photoshop 6.0. If the image has not been altered, the watermark extracted will match the SHA-256 hash of the original image. The results show that the embedded watermark is robust to JPEG compression down to image quality 60 (~91% compressed).
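The embedding arithmetic works out neatly: a SHA-256 digest is 256 bits, which fits an 8×8 block at 4 bits per pixel, so a simplified fragile version of the scheme writes the digest into the four least significant bits of a RONI block after zeroing them for hashing. This sketch is a toy integrity check, not the paper's reversible, JPEG-robust algorithm; the block location and bit depth are illustrative.

```python
import hashlib
import numpy as np

def embed_hash(img, row=0, col=0):
    """Embed SHA-256(image with the block's 4 LSBs zeroed) into the
    4 LSBs of the 8x8 RONI block at (row, col)."""
    out = img.copy()
    blk = out[row:row + 8, col:col + 8]
    blk &= 0xF0                              # zero bits that will hold the digest
    digest = np.frombuffer(hashlib.sha256(out.tobytes()).digest(), dtype=np.uint8)
    nibbles = np.empty(64, dtype=np.uint8)   # 32 bytes -> 64 four-bit values
    nibbles[0::2] = digest >> 4
    nibbles[1::2] = digest & 0x0F
    blk |= nibbles.reshape(8, 8)
    return out

def verify_hash(img, row=0, col=0):
    """True iff the digest stored in the block matches the image content."""
    work = img.copy()
    blk = work[row:row + 8, col:col + 8]
    nibbles = (blk & 0x0F).ravel()
    stored = ((nibbles[0::2] << 4) | nibbles[1::2]).astype(np.uint8).tobytes()
    blk &= 0xF0
    return stored == hashlib.sha256(work.tobytes()).digest()

rng = np.random.default_rng(0)
ultrasound = rng.integers(0, 256, (600, 800), dtype=np.uint8)  # stand-in for an 800x600 US image
stamped = embed_hash(ultrasound)
```

Any single-bit change outside the stored digest alters the hash, so verification fails on tampered images; surviving JPEG requantization, as the paper's scheme does, requires the more elaborate reversible construction.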
Tan, Chun Kiat; Ng, Jason Changwei; Xu, Xiaotian; Poh, Chueh Loo; Guan, Yong Liang; Sheah, Kenneth
2011-06-01
Teleradiology applications and universal availability of patient records using web-based technology are rapidly gaining importance. Consequently, digital medical image security has become an important issue when images and their pertinent patient information are transmitted across public networks, such as the Internet. Health mandates such as the Health Insurance Portability and Accountability Act require healthcare providers to adhere to security measures in order to protect sensitive patient information. This paper presents a fully reversible, dual-layer watermarking scheme with tamper detection capability for medical images. The scheme utilizes concepts of public-key cryptography and reversible data-hiding technique. The scheme was tested using medical images in DICOM format. The results show that the scheme is able to ensure image authenticity and integrity, and to locate tampered regions in the images.
Reducing Error Rates for Iris Image using higher Contrast in Normalization process
NASA Astrophysics Data System (ADS)
Aminu Ghali, Abdulrahman; Jamel, Sapiee; Abubakar Pindar, Zahraddeen; Hasssan Disina, Abdulkadir; Mat Daris, Mustafa
2017-08-01
An iris recognition system is one of the most secure and fastest means of identification and authentication. However, iris recognition suffers from blurring, low contrast and poor illumination due to low-quality images, which compromises the accuracy of the system. The acceptance or rejection of a verified user depends solely on the quality of the image; in many cases, an iris recognition system with low image contrast could falsely accept or reject a user. This paper therefore adopts the histogram equalization technique to address the problems of False Rejection Rate (FRR) and False Acceptance Rate (FAR) by enhancing the contrast of the iris image. The histogram equalization technique enhances image quality and neutralizes the low contrast of the image at the normalization stage. The experimental results show that histogram equalization reduces the FRR and FAR compared with existing techniques.
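Histogram equalization itself is standard: build the cumulative histogram and use it as a lookup table that stretches the occupied gray range onto the full 8-bit scale. A minimal NumPy version (assuming a non-constant 8-bit grayscale image; the synthetic low-contrast image is only a stand-in for a normalized iris strip) looks like this:

```python
import numpy as np

def equalize_hist(img):
    """Histogram equalization for an 8-bit grayscale image: map the
    cumulative distribution of gray levels onto [0, 255]."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                # first occupied gray level
    lut = np.round(np.clip(cdf - cdf_min, 0, None) * 255.0
                   / (cdf[-1] - cdf_min)).astype(np.uint8)
    return lut[img]

# A low-contrast stand-in image occupying only gray levels 100-150.
rng = np.random.default_rng(0)
low_contrast = rng.integers(100, 151, (64, 64), dtype=np.uint8)
enhanced = equalize_hist(low_contrast)
```

After equalization the darkest occupied level maps to 0 and the brightest to 255, which is the contrast stretch the paper applies at the normalization stage.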
Validation of a Low Dose Simulation Technique for Computed Tomography Images
Muenzel, Daniela; Koehler, Thomas; Brown, Kevin; Žabić, Stanislav; Fingerle, Alexander A.; Waldt, Simone; Bendik, Edgar; Zahel, Tina; Schneider, Armin; Dobritz, Martin; Rummeny, Ernst J.; Noël, Peter B.
2014-01-01
Purpose: Evaluation of a new software tool for generation of simulated low-dose computed tomography (CT) images from an original higher-dose scan. Materials and Methods: Original CT scan data (100 mAs, 80 mAs, 60 mAs, 40 mAs, 20 mAs, 10 mAs; 100 kV) of a swine were acquired (approved by the regional governmental commission for animal protection). Simulations of CT acquisition with a lower dose (simulated 10–80 mAs) were calculated using a low-dose simulation algorithm. The simulations were compared to the originals of the same dose level with regard to density values and image noise. Four radiologists assessed the realistic visual appearance of the simulated images. Results: Image characteristics of simulated low-dose scans were similar to the originals. Mean overall discrepancy of image noise and CT values was −1.2% (range −9% to 3.2%) and −0.2% (range −8.2% to 3.2%), respectively, p>0.05. Confidence intervals of discrepancies ranged between 0.9–10.2 HU (noise) and 1.9–13.4 HU (CT values), without significant differences (p>0.05). Subjective observer evaluation of image appearance showed no visually detectable difference. Conclusion: Simulated low-dose images showed excellent agreement with the originals concerning image noise, CT density values, and subjective assessment of the visual appearance of the simulated images. An authentic low-dose simulation opens up opportunities with regard to staff education, protocol optimization and the introduction of new techniques. PMID:25247422
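The paper's algorithm works in the raw projection data, but the core dose-noise relationship can be sketched simply in the image domain: quantum noise variance scales roughly with 1/mAs, so synthesizing a lower-dose image means adding zero-mean noise carrying the variance difference. This image-domain Gaussian model is a deliberate simplification of the validated algorithm, shown only to make the scaling concrete.

```python
import numpy as np

def simulate_low_dose(img, mas_orig, mas_target, sigma_orig, rng):
    """Add noise so the result has the noise level expected at mas_target.
    sigma_orig is the measured noise (HU) of the original scan; quantum
    noise variance is modeled as proportional to 1/mAs."""
    sigma_target = sigma_orig * np.sqrt(mas_orig / mas_target)
    sigma_add = np.sqrt(max(sigma_target ** 2 - sigma_orig ** 2, 0.0))
    return img + rng.normal(0.0, sigma_add, img.shape)

rng = np.random.default_rng(0)
phantom = np.zeros((256, 256))               # uniform water phantom, 0 HU
low_dose = simulate_low_dose(phantom, mas_orig=100, mas_target=20,
                             sigma_orig=10.0, rng=rng)
```

Going from 100 mAs to 20 mAs with 10 HU of original noise targets 10·√5 ≈ 22.4 HU, so 20 HU of noise is added in quadrature; simulating the same dose adds nothing.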
Introducing keytagging, a novel technique for the protection of medical image-based tests.
Rubio, Óscar J; Alesanco, Álvaro; García, José
2015-08-01
This paper introduces keytagging, a novel technique to protect medical image-based tests by implementing image authentication, integrity control and location of tampered areas, private captioning with role-based access control, traceability and copyright protection. It relies on the association of tags (binary data strings) with stable, semistable or volatile features of the image, whose access keys (called keytags) depend on both the image and the tag content. Unlike watermarking, this technique can associate information with the most stable features of the image without distortion. Thus, the method preserves the clinical content of the image without the need for assessment, prevents eavesdropping and collusion attacks, and obtains a substantial capacity-robustness tradeoff with simple operations. The evaluation of this technique, involving images of different sizes from various acquisition modalities and image modifications that are typical in the medical context, demonstrates that all the aforementioned security measures can be implemented simultaneously and that the algorithm presents good scalability. In addition, keytags can be protected with standard Cryptographic Message Syntax, and the keytagging process can easily be combined with JPEG2000 compression since both share the same wavelet transform. This reduces the delays for associating keytags and retrieving the corresponding tags to only ≈30 ms and ≈90 ms, respectively. As a result, keytags can be seamlessly integrated within DICOM, reducing delays and bandwidth when the image test is updated and shared in secure architectures where different users cooperate, e.g. physicians who interpret the test, clinicians caring for the patient, and researchers.
Using ICT to Foster (Pre) Reading and Writing Skills in Young Children
ERIC Educational Resources Information Center
Voogt, Joke; McKenney, Susan
2008-01-01
This study examines how technology can support the development of emergent reading and writing skills in four- to five-year-old children. The research was conducted with PictoPal, an intervention which features a software package that uses images and text in three main activity areas: reading, writing, and authentic applications. This article…
Curating a Public Self: Exploring Social Media Images of Women in the Outdoors
ERIC Educational Resources Information Center
Gray, Tonia; Norton, Christine; Breault-Hood, Joelle; Christie, Beth; Taylor, Nicole
2018-01-01
Two social media posts (Highland, 2015; Johnson, 2015) about the authenticity of women's experiences in the outdoors fueled an intense dialogue among the authors of this paper. These posts sparked healthy debate, and we asked ourselves, "Why does our apparel, our aesthetic appeal, our physicality, or even our motivation become subject to…
Betsey Holsbery's School: Place, Gender, and Memory
ERIC Educational Resources Information Center
Weiler, Kathleen
2014-01-01
Historical memory is constantly being reframed through images and objects presented as capturing the past. In the USA, the nineteenth-century country or one-room school has come to symbolize an authentic American experience and is seen as evidence of a lost, purer and simpler time. Central to the work of the rural school was the teacher, and in the…
Template protection and its implementation in 3D face recognition systems
NASA Astrophysics Data System (ADS)
Zhou, Xuebing
2007-04-01
As biometric recognition systems are widely applied in various application areas, security and privacy risks have recently attracted the attention of the biometric community. Template protection techniques prevent stored reference data from revealing private biometric information and enhance the security of biometric systems against attacks such as identity theft and cross-matching. This paper concentrates on a template protection algorithm that merges methods from cryptography, error correction coding and biometrics. The key component of the algorithm is the conversion of biometric templates into binary vectors. It is shown that the binary vectors should be robust, uniformly distributed, statistically independent and collision-free so that authentication performance can be optimized and information leakage can be avoided. Depending on the statistical character of the biometric template, different approaches for transforming biometric templates into compact binary vectors are presented. The proposed methods are integrated into a 3D face recognition system and tested on the 3D facial images of the FRGC database. It is shown that the resulting binary vectors provide an authentication performance similar to that of the original 3D face templates. A high security level is achieved with reasonable false acceptance and false rejection rates, based on an efficient statistical analysis. The algorithm estimates the statistical character of biometric templates from a number of biometric samples in the enrollment database. For the FRGC 3D face database, our tests show only a small difference in robustness and discriminative power between classification results obtained under the assumption of uniformly distributed templates and those obtained under the assumption of Gaussian-distributed templates.
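The conversion to binary vectors can be illustrated with the simplest route to the stated requirements: threshold each template dimension at the population median, which makes every bit close to uniformly distributed, then compare templates by normalized Hamming distance. The dimensionality, population, and noise level below are synthetic stand-ins for real 3D-face features.

```python
import numpy as np

def binarize(template, thresholds):
    """One bit per dimension: 1 where the feature exceeds the population
    median, giving approximately uniformly distributed bits."""
    return (template > thresholds).astype(np.uint8)

def hamming(a, b):
    """Normalized Hamming distance between two binary vectors."""
    return np.count_nonzero(a != b) / a.size

rng = np.random.default_rng(0)
population = rng.normal(size=(200, 64))        # synthetic 64-D templates
thresholds = np.median(population, axis=0)     # enrollment statistics

enrolled = binarize(population[0], thresholds)
genuine = binarize(population[0] + rng.normal(0, 0.05, 64), thresholds)  # same user, noisy
impostor = binarize(population[1], thresholds)
```

A genuine but noisy re-capture flips only the few bits whose features lie near their thresholds, while an unrelated template disagrees on roughly half the bits; the gap between those two distances is what the authentication decision exploits.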
Identification of chemical markers in Cordyceps sinensis by HPLC-MS/MS.
Hu, Hankun; Xiao, Ling; Zheng, Baogen; Wei, Xin; Ellis, Alexis; Liu, Yi-Ming
2015-10-01
Authentication and quality assessment of Cordyceps sinensis, a precious and pricey natural product that offers a variety of health benefits, is highly significant. To identify effective chemical markers, authentic C. sinensis was thoroughly screened using HPLC-MS/MS. In addition to many previously reported ingredients, two glycosides, cyclo-Ala-Leu-rhamnose and Phe-o-glucose, were detected for the first time in this material. Six of the ingredients detected, namely cordycepin, D-mannitol, Phe, Phe-o-glucose, cyclo-Gly-Pro, and cyclo-Ala-Leu-rhamnose, were selected as a collection of chemical markers. An HPLC-MS/MS method was developed to quantify them simultaneously with sensitivity and specificity. The method had limits of detection ranging from 0.008 μg mL(-1) for cordycepin to 0.75 μg mL(-1) for cyclo-Gly-Pro. Recoveries were between 96% and 103% in all tests. To evaluate the effectiveness of the proposed marker collection, five authentic C. sinensis samples and five samples of its substitutes were analyzed. Cordycepin, D-mannitol, and Phe were found in all samples, with contents ranging from 0.0076 to 0.029% (w/w) for cordycepin, 0.33 to 18.9% for mannitol, and 0.0013 to 0.642% for Phe. Interestingly, the two glycosides, Phe-o-glucose and cyclo-Ala-Leu-rhamnose, were detected only in authentic C. sinensis samples. These results indicate that the proposed protocol, based on HPLC-MS/MS quantification of the markers, may have great potential in the authentication and quality assessment of C. sinensis. Graphical abstract: Chemical markers of C. sinensis identified in this work.
NASA Astrophysics Data System (ADS)
Millette, Patricia M.
Authentic field geology research is an inquiry method that encourages students to interact more with their local environment and, by solving genuine puzzles, begin to increase their intuitive understanding of the nature and processes of science. The goal of the current study was to determine whether conducting authentic field research, and giving high school students the opportunity to present findings to adult audiences outside the school setting, 1) enhances students' understanding of the nature of science, and 2) affects students' views of themselves as researchers. To accomplish this, ninth-grade students from a public school in northern New England engaged in a community-initiated glacial geology problem, completed a field research investigation, and presented their findings at several professional conferences. Following the completion of this student-centered field research, I investigated its effects using a mixed-methods approach consisting of qualitative and quantitative data from two sources: selected questions from an open-response survey (VNOS-C), and interviews conducted with fifteen of the students, of different ages and genders. The findings show that conducting original field research seems to have a positive influence on these students' understanding of the nature of science as well as the processes of science. Many of the students reported feelings of accomplishment, acceptance of responsibility for the investigation, a sense of authentic contribution to the body of scientific knowledge in the world, and of becoming scientists. This type of authentic field investigation is significant because recent reforms in earth-science education stress the importance of students learning about the nature and processes of scientific knowledge along with science content.
HPTLC Fingerprint Analysis: A Quality Control for Authentication of Herbal Phytochemicals
NASA Astrophysics Data System (ADS)
Ram, Mauji; Abdin, M. Z.; Khan, M. A.; Jha, Prabhakar
Authentication and consistent quality are the basic requirements for Indian traditional medicine (TIM), Chinese traditional herbal medicine (TCHM), and their commercial products, regardless of the kind of research conducted to modernize them. The complexity of TIM and TCHM challenges the current official quality control mode, in which only a few biochemical markers are selected for identification and quantitative assay. Given the many unknown factors present in TIM and TCHM, it is impossible, and unnecessary, to pinpoint qualitatively and quantitatively every single component contained in an herbal drug. The chromatographic fingerprint is a rational option to meet the need for more effective and powerful quality assessment of TIM and TCHM. An optimized chromatographic fingerprint is not only an alternative analytical tool for authentication, but also an approach to express the various patterns of chemical-ingredient distribution in herbal drugs and to preserve such a "database" for further multifaceted sustainable studies. Analytical separation techniques such as high-performance liquid chromatography (HPLC), gas chromatography (GC) and mass spectrometry (MS) are among the most popular methods of choice for quality control of raw materials and finished herbal products. The fingerprint analysis approach using high-performance thin-layer chromatography (HPTLC) has become a most potent tool for quality control of herbal medicines because of its simplicity and reliability; it can serve as a tool for identification, authentication, and quality control of herbal drugs. In this chapter, attempts are made to expand the use of HPTLC and, at the same time, create interest among prospective researchers in herbal analysis. The developed method can be used as a quality control tool for rapid authentication of a wide variety of herbal samples, and several examples demonstrate the role of fingerprinting in quality control and assessment.
Kim, Ki-Wook; Han, Youn-Hee; Min, Sung-Gi
2017-09-21
Many Internet of Things (IoT) services utilize an IoT access network to connect small devices with remote servers. They can share an access network with standard communication technology, such as IEEE 802.11ah. However, an authentication and key management (AKM) mechanism for resource-constrained IoT devices using IEEE 802.11ah has not yet been proposed. We therefore propose a new AKM mechanism for an IoT access network, based on IEEE 802.11 key management with the IEEE 802.1X authentication mechanism. The proposed AKM mechanism does not require any pre-configured security information between the access network domain and the IoT service domain. It considers the resource constraints of IoT devices, allowing them to delegate the burden of AKM processes to a powerful agent. The agent has sufficient power to support various authentication methods for the access point, and it performs cryptographic functions for the IoT devices. Performance analysis shows that the proposed mechanism greatly reduces the computation costs, network costs, and memory usage of a resource-constrained IoT device compared to the existing IEEE 802.11 key management with the IEEE 802.1X authentication mechanism.
Romano, Paolo; Manniello, Assunta; Aresu, Ottavia; Armento, Massimiliano; Cesaro, Michela; Parodi, Barbara
2009-01-01
The Cell Line Data Base (CLDB) is a well-known reference information source on human and animal cell lines, with information on more than 6000 cell lines. Main biological features are coded according to controlled vocabularies derived from international lists and taxonomies. HyperCLDB (http://bioinformatics.istge.it/hypercldb/) is a hypertext version of CLDB that improves data accessibility by also allowing information retrieval through web spiders. Access to HyperCLDB is provided through indexes of biological characteristics, and navigation in the hypertext is supported by many internal links. HyperCLDB also includes links to external resources. Recently, interest has arisen in a reference nomenclature for cell lines, and CLDB was seen as an authoritative system. Furthermore, to overcome the cell line misidentification problem, molecular authentication methods have been proposed, such as fingerprinting, single-locus short tandem repeat (STR) profiling, and single nucleotide polymorphism validation. Since these data are distributed, a reference portal on authentication of human cell lines is needed. We present here the architecture and contents of CLDB, its recent enhancements, and perspectives. We also present a new related database, the Cell Line Integrated Molecular Authentication (CLIMA) database (http://bioinformatics.istge.it/clima/), which links authentication data to actual cell lines. PMID:18927105
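STR-profile authentication of the kind CLIMA links to is commonly scored with a shared-allele similarity such as the Tanabe score. A minimal sketch, using hypothetical allele values for illustration:

```python
def tanabe_score(profile_a, profile_b):
    """Tanabe similarity between two STR profiles.

    Profiles map STR locus names to sets of allele calls, e.g.
    {"TH01": {6, 9.3}}. Only loci typed in both profiles count.
    Scores above roughly 0.8 are commonly treated as a match.
    """
    shared = sum(len(profile_a[locus] & profile_b[locus])
                 for locus in profile_a if locus in profile_b)
    total = sum(len(profile_a[locus]) + len(profile_b[locus])
                for locus in profile_a if locus in profile_b)
    return 2 * shared / total if total else 0.0

# Hypothetical reference profile and a query with one allele dropout.
reference = {"TH01": {7}, "D5S818": {11, 12}, "D13S317": {12, 13.3}}
query     = {"TH01": {7}, "D5S818": {11, 12}, "D13S317": {12}}
print(round(tanabe_score(reference, query), 3))  # 0.889
```

Allele dropout (as in the query above) lowers the score but keeps a true match well above typical decision thresholds.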
Cryptographically secure biometrics
NASA Astrophysics Data System (ADS)
Stoianov, A.
2010-04-01
Biometric systems usually do not possess a cryptographic level of security: it has been deemed impossible to perform biometric authentication in the encrypted domain because of the natural variability of biometric samples and the intolerance of cryptography to even a single bit error. Encrypted biometric data need to be decrypted on authentication, which creates privacy and security risks. On the other hand, the known solutions called "Biometric Encryption (BE)" or "Fuzzy Extractors" can be cracked by various attacks, for example, by running a database of images offline against the stored helper data in order to obtain a false match. In this paper, we present a novel approach that combines Biometric Encryption with the classical Blum-Goldwasser cryptosystem. In the "Client - Service Provider (SP)" or the "Client - Database - SP" architecture, it is possible to keep the biometric data encrypted at all stages of storage and authentication, so that the SP never has access to unencrypted biometric data. It is shown that this approach is suitable for two of the most popular BE schemes, Fuzzy Commitment and Quantization Index Modulation (QIM). The approach has clear practical advantages over biometric systems using "homomorphic encryption". Future work will deal with applying the proposed solution to one-to-many biometric systems.
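Fuzzy Commitment, one of the two BE schemes mentioned, can be sketched with a toy 3x repetition code standing in for a real error-correcting code; the Blum-Goldwasser encryption layer proposed in the paper is omitted here, so this shows only the underlying commitment that it would wrap:

```python
import hashlib
import secrets

def encode(bits):   # repetition code: each key bit becomes 3 bits
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(bits):   # majority vote over each 3-bit group
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

def enroll(key_bits, biometric_bits):
    """Store helper data (codeword XOR biometric) and a hash of the key."""
    codeword = encode(key_bits)
    helper = [c ^ x for c, x in zip(codeword, biometric_bits)]
    return helper, hashlib.sha256(bytes(key_bits)).hexdigest()

def verify(helper, key_hash, biometric_bits):
    """Unmask with a fresh sample, error-correct, and compare key hashes."""
    codeword = [h ^ x for h, x in zip(helper, biometric_bits)]
    return hashlib.sha256(bytes(decode(codeword))).hexdigest() == key_hash

key = [secrets.randbelow(2) for _ in range(8)]
bio = [secrets.randbelow(2) for _ in range(24)]
helper, key_hash = enroll(key, bio)

noisy = bio[:]
noisy[5] ^= 1   # one flipped bit, within the code's correction capacity
print(verify(helper, key_hash, noisy))  # True
```

The natural variability of biometric samples is absorbed by the error-correcting code; only the key hash and the helper data are stored, never the biometric itself.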
Watermarked cardiac CT image segmentation using deformable models and the Hermite transform
NASA Astrophysics Data System (ADS)
Gomez-Coronel, Sandra L.; Moya-Albor, Ernesto; Escalante-Ramírez, Boris; Brieva, Jorge
2015-01-01
Medical image watermarking is an open research area and a solution for the protection of copyright and intellectual property. One of the main challenges is that the marked images should not differ perceptually from the originals, so that correct diagnosis and authentication remain possible. Furthermore, we also aim at watermarked images with very little numerical distortion, so that computer vision tasks such as segmentation of important anatomical structures are not impaired. We propose a preliminary watermarking application for cardiac CT images based on a perceptive approach that includes a brightness model to generate a perceptive mask and identify the image regions where watermark detection is difficult for the human eye. We propose a normalization scheme for the image in order to improve robustness against geometric attacks. We follow a spread spectrum technique to insert an alphanumeric code, such as the patient's information, within the watermark. The watermarking scheme is based on the Hermite transform as a bio-inspired image representation model. In order to evaluate the numerical integrity of the image data after watermarking, we perform a segmentation task based on deformable models. The segmentation technique is based on a vector-valued level-set method such that, given a curve in a specific image and subject to some constraints, the curve can evolve in order to detect objects. To drive the curve evolution, we simultaneously introduce image features such as the gray level and the steered Hermite coefficients as texture descriptors. Segmentation performance was assessed by means of the Dice index and the Hausdorff distance. We tested different mark sizes and different insertion schemes on images that were later segmented either automatically or manually by physicians.
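Spread-spectrum insertion and detection can be illustrated independently of the Hermite-transform machinery. The sketch below embeds message bits as pseudo-noise patterns added to a generic coefficient array; the embedding strength is exaggerated for the demo, and nothing here reproduces the paper's perceptive mask or transform domain:

```python
import numpy as np

def embed(coeffs, bits, alpha=0.2):
    """Add one signed pseudo-noise pattern per message bit."""
    marked = coeffs.astype(float).copy()
    for i, bit in enumerate(bits):
        # Each bit gets its own reproducible +/-1 pattern (seed offset is arbitrary).
        pn = np.random.default_rng(1000 + i).choice([-1.0, 1.0], size=coeffs.shape)
        marked += alpha * (1 if bit else -1) * pn
    return marked

def detect(marked, n_bits):
    """Recover each bit from the sign of the correlation with its pattern."""
    bits = []
    for i in range(n_bits):
        pn = np.random.default_rng(1000 + i).choice([-1.0, 1.0], size=marked.shape)
        bits.append(1 if np.dot(marked.ravel(), pn.ravel()) > 0 else 0)
    return bits

coeffs = np.random.default_rng(0).normal(size=(64, 64))
message = [1, 0, 1, 1, 0, 0, 1, 0]   # e.g. bits of an alphanumeric patient code
marked = embed(coeffs, message)
print(detect(marked, len(message)) == message)  # True
```

In a real scheme the strength alpha would be modulated per region by the perceptive mask, trading robustness against visibility.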
Analysis of the hand vein pattern for people recognition
NASA Astrophysics Data System (ADS)
Castro-Ortega, R.; Toxqui-Quitl, C.; Cristóbal, G.; Marcos, J. Victor; Padilla-Vivanco, A.; Hurtado Pérez, R.
2015-09-01
The shape of the hand vascular pattern contains useful and unique features that can be used for identifying and authenticating people, with applications in access control, medicine, and financial services. In this work, an optical system for acquiring images of the hand vascular pattern is implemented. It consists of a CCD camera with sensitivity in the IR and a light source emitting at 880 nm. The IR radiation interacts with the deoxyhemoglobin, hemoglobin, and water present in the blood of the veins, making it possible to see the vein pattern beneath the skin. Segmentation of the region of interest (ROI) is achieved using geometric moments to locate the centroid of the image. To enhance the vein pattern, we use histogram equalization and Contrast Limited Adaptive Histogram Equalization (CLAHE). In order to remove unnecessary information such as body hair and skin folds, a low-pass filter is implemented. A method based on geometric moments is used to obtain invariant descriptors of the input images. The classification task is performed using Artificial Neural Network (ANN) and K-Nearest Neighbors (K-NN) algorithms. Experimental results on our database of 912 images from 38 people, with 12 versions each, show a correct classification rate higher than 86.36% with the ANN.
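The centroid step of the ROI localization can be written directly from the raw geometric moments m00, m10, and m01. A minimal sketch of just that step:

```python
import numpy as np

def centroid(image):
    """Centroid (x_bar, y_bar) from raw geometric moments m00, m10, m01."""
    img = np.asarray(image, dtype=float)
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m00 = img.sum()            # total intensity (zeroth moment)
    m10 = (xs * img).sum()     # first moment along x
    m01 = (ys * img).sum()     # first moment along y
    return float(m10 / m00), float(m01 / m00)

# A single bright pixel at row 3, column 5 places the centroid there.
img = np.zeros((8, 8))
img[3, 5] = 1.0
print(centroid(img))  # (5.0, 3.0)
```

Higher-order moments, normalized and combined into invariants, give the rotation- and scale-invariant descriptors the abstract mentions for classification.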
Laser heat puncturing as a highly effective method of treating post-tuberculous cystalgia
NASA Astrophysics Data System (ADS)
Koultchavenia, Ekaterina V.
1999-07-01
Tuberculosis of the urinary bladder develops significantly less often in men, and recovery occurs significantly more often, than in women. In 39.1 percent of women with nephrotuberculosis and urinary bladder tuberculosis, the specific cystitis ends in the development of post-tuberculous cystalgia. One triggering mechanism of dysuria after urinary bladder tuberculosis in menopausal women is hormonal insufficiency. The laser heat puncturing method that we developed for treating this complication is highly effective, does not require additional medication, and can be performed both in hospital and on an outpatient basis.