Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina
2015-01-01
Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed the robustness of such multi-complexity measures for heart rate variability analysis, with an emphasis on the detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such an ensemble-based approach could also be effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have a significantly wider application scope, ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin
2010-04-01
The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location-based applications, and biometric authentication such as fingerprint verification and face and handwritten-signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device; however, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes the use of built-in self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, achieving an optimum security level with effective processing time, i.e. ensuring that the necessary authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
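The abstract does not give the signature construction. A minimal sketch of the idea, assuming a PBKDF2-derived seed from the PIN and an HMAC over the verification module's binary image (all function names and parameters here are hypothetical, not taken from the SecurePhone project):

```python
import hashlib
import hmac

def derive_seed(pin: str, salt: bytes) -> bytes:
    """Derive a per-user seed from the PIN (PBKDF2; parameters illustrative)."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def self_test_signature(seed: bytes, module_image: bytes) -> bytes:
    """Unique signature over the verification module's binary image."""
    return hmac.new(seed, module_image, hashlib.sha256).digest()

def integrity_ok(seed: bytes, module_image: bytes, expected: bytes) -> bool:
    """Built-in self-test: run before biometric matching is allowed to proceed."""
    return hmac.compare_digest(self_test_signature(seed, module_image), expected)
```

If the recomputed signature differs from the value recorded at provisioning time, the verification module is considered tampered with and the biometric match is refused.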
Privacy preserving, real-time and location secured biometrics for mCommerce authentication
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Al-Assam, Hisham; Jassim, Sabah; Lami, Ihsan A.
2011-06-01
Secure wireless connectivity between mobile devices and financial/commercial establishments is mature, and so is the security of remote authentication for mCommerce. However, current techniques are open to hacking, false misrepresentation, replay and other attacks, because of the lack of real-time, precise location information in the authentication process. This paper proposes a new technique that combines freshly generated real-time personal biometric data of the client with the present position of the mobile device used by the client to perform the mCommerce, so as to form a real-time biometric representation that authenticates any remote transaction. A fresh GPS fix generates the "time and location" used to stamp the freshly captured biometric data, producing a single, real-time biometric representation on the mobile device. A trusted Certification Authority (CA) acts as an independent authenticator of the client's claimed real-time location and his/her freshly provided biometric data, which eliminates the need for the user to enrol with many mCommerce service and application providers. The CA can also, independently from the client and at that instant of time, collect the client's mobile device "time and location" from the cellular network operator and compare it with the received information, together with the client's stored biometric information. Finally, to preserve the client's location privacy and to eliminate the possibility of cross-application client tracking, this paper proposes shielding the real location of the mobile device prior to submission to the CA or other authenticators.
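The paper's exact representation is not reproduced here; a minimal sketch of binding a hash of the fresh template to a GPS time/location fix under a device key (the field names and the MAC construction are assumptions for illustration):

```python
import hashlib
import hmac
import json
import time

def stamp_biometric(template: bytes, lat: float, lon: float, device_key: bytes) -> dict:
    """Bind a freshly captured template to a GPS time/location fix."""
    payload = {
        "bio_hash": hashlib.sha256(template).hexdigest(),
        "lat": round(lat, 5),
        "lon": round(lon, 5),
        "gps_time": int(time.time()),  # stand-in for the GPS fix timestamp
    }
    msg = json.dumps(payload, sort_keys=True).encode()
    payload["mac"] = hmac.new(device_key, msg, hashlib.sha256).hexdigest()
    return payload
```

The CA would recompute the MAC and cross-check the claimed fix against the cellular operator's independent time/location record.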
Using Biometric Measurement in Real-Time as a Sympathetic System in Computer Games
ERIC Educational Resources Information Center
Charij, Stephanie; Oikonomou, Andreas
2013-01-01
With the increasing potential for gaming hardware and peripherals to support biometrics, their application within the games industry for software and design should be considered. This paper assesses the ability to use a form of biometric measurement, heart rate, in real-time to improve the challenge and enjoyment of a game by catering it to…
2014-04-01
…must be done to determine current infrastructure and capabilities so that necessary updates and changes can be addressed up front. Mobile biometric… with existing satellite communications infrastructure. … State of Mobile Biometric Device Market: Fingerprint… is a wireless information system highlighted by real-time wireless data collection, mobile device independence, and wireless infrastructure independence.
Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary
2014-11-01
Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
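A sketch of the "time series as document, local segments as words" preprocessing, assuming a sliding window and a k-means codebook as a stand-in for the paper's dictionary construction (window, step and dictionary sizes are illustrative):

```python
import numpy as np
from sklearn.cluster import KMeans

def series_to_words(channel: np.ndarray, win: int = 32, step: int = 16) -> np.ndarray:
    """Slice one channel into overlapping local segments (the 'words')."""
    segs = np.array([channel[i:i + win]
                     for i in range(0, len(channel) - win + 1, step)])
    # z-normalise each segment so the codebook captures shape, not amplitude
    return (segs - segs.mean(axis=1, keepdims=True)) / (segs.std(axis=1, keepdims=True) + 1e-8)

def channel_histograms(channels: list, n_words: int = 64) -> np.ndarray:
    """Quantise segments with a shared k-means codebook; count words per channel."""
    all_segs = np.vstack([series_to_words(c) for c in channels])
    codebook = KMeans(n_clusters=n_words, n_init=10).fit(all_segs)
    return np.array([
        np.bincount(codebook.predict(series_to_words(c)), minlength=n_words)
        for c in channels
    ])  # one bag-of-words histogram per channel
```

The per-channel word histograms are what a first-layer local pLSA would consume before the global pLSA aggregates topics across channels.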
Biometric identification: a holistic perspective
NASA Astrophysics Data System (ADS)
Nadel, Lawrence D.
2007-04-01
Significant advances continue to be made in biometric technology. However, the global war on terrorism and our increasingly electronic society have created the societal need for large-scale, interoperable biometric capabilities that challenge the capabilities of current off-the-shelf technology. At the same time, there are concerns that large-scale implementation of biometrics will infringe our civil liberties and offer increased opportunities for identity theft. This paper looks beyond the basic science and engineering of biometric sensors and fundamental matching algorithms and offers approaches for achieving greater performance and acceptability of applications enabled with currently available biometric technologies. The discussion focuses on three primary biometric system aspects: performance and scalability, interoperability, and cost benefit. Significant improvements in system performance and scalability can be achieved through careful consideration of the following elements: biometric data quality, human factors, operational environment, workflow, multibiometric fusion, and integrated performance modeling. Application interoperability hinges upon some of the factors noted above as well as adherence to interface, data, and performance standards. However, there are times when the price of conforming to such standards can be a decrease in local system performance. The development of biometric performance-based cost-benefit models can help determine realistic requirements and acceptable designs.
21 CFR 1311.116 - Additional requirements for biometrics.
Code of Federal Regulations, 2011 CFR
2011-04-01
... controlled substances. (f) The biometric subsystem must store device ID data at enrollment (i.e., biometric registration) with the biometric data and verify the device ID at the time of authentication to the electronic prescription application. (g) The biometric subsystem must protect the biometric data (raw data or templates...
21 CFR 1311.116 - Additional requirements for biometrics.
Code of Federal Regulations, 2010 CFR
2010-04-01
... controlled substances. (f) The biometric subsystem must store device ID data at enrollment (i.e., biometric registration) with the biometric data and verify the device ID at the time of authentication to the electronic prescription application. (g) The biometric subsystem must protect the biometric data (raw data or templates...
SegAuth: A Segment-based Approach to Behavioral Biometric Authentication
Li, Yanyan; Xie, Mengjun; Bian, Jiang
2016-01-01
Many studies have been conducted to apply behavioral biometric authentication on/with mobile devices and they have shown promising results. However, the concern about the verification accuracy of behavioral biometrics is still common given the dynamic nature of behavioral biometrics. In this paper, we address the accuracy concern from a new perspective—behavior segments, that is, segments of a gesture instead of the whole gesture as the basic building block for behavioral biometric authentication. With this unique perspective, we propose a new behavioral biometric authentication method called SegAuth, which can be applied to various gesture or motion based authentication scenarios. SegAuth can achieve high accuracy by focusing on each user’s distinctive gesture segments that frequently appear across his or her gestures. In SegAuth, a time series derived from a gesture/motion is first partitioned into segments and then transformed into a set of string tokens in which the tokens representing distinctive, repetitive segments are associated with higher genuine probabilities than those tokens that are common across users. An overall genuine score calculated from all the tokens derived from a gesture is used to determine the user’s authenticity. We have assessed the effectiveness of SegAuth using 4 different datasets. Our experimental results demonstrate that SegAuth can achieve higher accuracy consistently than existing popular methods on the evaluation datasets. PMID:28573214
Disk space and load time requirements for eye movement biometric databases
NASA Astrophysics Data System (ADS)
Kasprowski, Pawel; Harezlak, Katarzyna
2016-06-01
Biometric identification is a very popular area of interest nowadays. Problems with the so-called physiological methods, like fingerprint or iris recognition, have resulted in increased attention to methods measuring behavioral patterns. Eye movement based biometric (EMB) identification is one of the interesting behavioral methods, and due to the intensive development of eye tracking devices it has become possible to define new methods for eye movement signal processing. Such methods should be supported by an efficient storage system used to collect eye movement data and provide it for further analysis. The aim of the research was to evaluate various setups enabling such a storage choice. Various aspects were taken into consideration, such as disk space usage and the time required for loading and saving the whole data set or its chosen parts.
Multi-factor challenge/response approach for remote biometric authentication
NASA Astrophysics Data System (ADS)
Al-Assam, Hisham; Jassim, Sabah A.
2011-06-01
Although biometric authentication is perceived to be more reliable than traditional authentication schemes, it becomes vulnerable to many attacks when it comes to remote authentication over open networks, and it raises serious privacy concerns. This paper proposes a biometric-based challenge-response approach to be used for remote authentication between two parties A and B over open networks. In the proposed approach, a remote authenticator system B (e.g. a bank) challenges its client A, who wants to authenticate himself/herself to the system, by sending a one-time public random challenge. The client A responds by employing the random challenge along with secret information obtained from a password and a token to produce a one-time cancellable representation of his/her freshly captured biometric sample. The one-time biometric representation, which is based on multiple factors, is then sent back to B for matching. Here, we argue that eavesdropping of the one-time random challenge and/or the resulting one-time biometric representation does not compromise the security of the system, and no information about the original biometric data is leaked. In addition to securing biometric templates, the proposed protocol offers a practical solution for the replay attack on biometric systems. Moreover, we propose a new scheme for generating password-based pseudo-random numbers/permutations to be used as a building block in the proposed approach. The proposed scheme is also designed to provide protection against repudiation. We illustrate the viability and effectiveness of the proposed approach by experimental results based on two biometric modalities: fingerprint and face biometrics.
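The abstract does not fix the transform itself; one common construction for a one-time cancellable representation is a random orthonormal projection seeded jointly by the challenge, the password and the token secret. A minimal sketch (names hypothetical):

```python
import hashlib
import numpy as np

def one_time_representation(features: np.ndarray, challenge: bytes,
                            password: str, token_secret: bytes) -> np.ndarray:
    """Produce a one-time cancellable template from a fresh feature vector."""
    seed_material = hashlib.sha256(challenge + password.encode() + token_secret).digest()
    rng = np.random.default_rng(int.from_bytes(seed_material[:8], "big"))
    # seeded random orthonormal projection: revocable and distance-preserving
    projection, _ = np.linalg.qr(rng.standard_normal((features.size, features.size)))
    return projection @ features
```

Because the seed changes with every challenge, an eavesdropped representation cannot be replayed against a future challenge, and revoking the password or token revokes the template.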
A lightweight approach for biometric template protection
NASA Astrophysics Data System (ADS)
Al-Assam, Hisham; Sellahewa, Harin; Jassim, Sabah
2009-05-01
Privacy and security are vital concerns for practical biometric systems. The concept of cancelable or revocable biometrics has been proposed as a solution for biometric template security. Revocable biometrics means that biometric templates are no longer fixed over time and can be revoked in the same way as lost or stolen credit cards are. In this paper, we describe a novel and efficient approach to biometric template protection that meets the revocability property. This scheme can be incorporated into any biometric verification scheme while maintaining, if not improving, the accuracy of the original biometric system. We demonstrate the results of applying such transforms to face biometric templates and compare the efficiency of our approach with that of the well-known random projection techniques. We also present the results of experimental work on recognition accuracy before and after applying the proposed transform on feature vectors that are generated by wavelet transforms. These results are based on experiments conducted on a number of well-known face image databases, e.g. the Yale and ORL databases.
Conditional adaptive Bayesian spectral analysis of nonstationary biomedical time series.
Bruce, Scott A; Hall, Martica H; Buysse, Daniel J; Krafty, Robert T
2018-03-01
Many studies of biomedical time series signals aim to measure the association between frequency-domain properties of time series and clinical and behavioral covariates. However, the time-varying dynamics of these associations are largely ignored due to a lack of methods that can assess the changing nature of the relationship through time. This article introduces a method for the simultaneous and automatic analysis of the association between the time-varying power spectrum and covariates, which we refer to as conditional adaptive Bayesian spectrum analysis (CABS). The procedure adaptively partitions the grid of time and covariate values into an unknown number of approximately stationary blocks and nonparametrically estimates local spectra within blocks through penalized splines. CABS is formulated in a fully Bayesian framework, in which the number and locations of partition points are random, and fit using reversible jump Markov chain Monte Carlo techniques. Estimation and inference averaged over the distribution of partitions allows for the accurate analysis of spectra with both smooth and abrupt changes. The proposed methodology is used to analyze the association between the time-varying spectrum of heart rate variability and self-reported sleep quality in a study of older adults serving as the primary caregiver for their ill spouse. © 2017, The International Biometric Society.
Cryptographically secure biometrics
NASA Astrophysics Data System (ADS)
Stoianov, A.
2010-04-01
Biometric systems usually do not possess a cryptographic level of security: it has been deemed impossible to perform a biometric authentication in the encrypted domain because of the natural variability of biometric samples and of the cryptographic intolerance even to a single bit error. Encrypted biometric data need to be decrypted on authentication, which creates privacy and security risks. On the other hand, the known solutions called "Biometric Encryption (BE)" or "Fuzzy Extractors" can be cracked by various attacks, for example, by running a database of images offline against the stored helper data in order to obtain a false match. In this paper, we present a novel approach which combines Biometric Encryption with the classical Blum-Goldwasser cryptosystem. In the "Client - Service Provider (SP)" or in the "Client - Database - SP" architecture it is possible to keep the biometric data encrypted at all stages of storage and authentication, so that the SP never has access to unencrypted biometric data. It is shown that this approach is suitable for two of the most popular BE schemes, Fuzzy Commitment and Quantized Index Modulation (QIM). The approach has clear practical advantages over biometric systems using "homomorphic encryption". Future work will deal with the application of the proposed solution to one-to-many biometric systems.
NASA Astrophysics Data System (ADS)
Grijpink, Jan
2004-06-01
Biometric systems can vary along at least twelve dimensions. We need to exploit this variety to manoeuvre biometrics into place in order to realise its social potential. Subsequently, two perspectives on biometrics are proposed, revealing that biometrics will probably be ineffective in combating identity fraud, organised crime and terrorism: (1) the value chain perspective explains the first barrier: our strong preference for large-scale biometric systems for general compulsory use. These biometric systems cause successful infringements to spread unnoticed. A biometric system will only function adequately if biometrics is indispensable for solving the dominant chain problem, and multi-chain use of biometrics takes it beyond the boundaries of good manageability. (2) the identity fraud perspective exposes the second barrier: our traditional approach to identity verification. We focus on identity documents, neglecting the person and the situation involved. Moreover, western legal cultures have made identity verification procedures known, transparent, uniform and predictable. Thus, we have developed a blind spot to identity fraud. Biometrics provides good potential for better checking of persons, but will probably be used to enhance identity documents. Biometrics will only pay off if it confronts the identity fraudster with less predictable verification processes and more risk of his identity fraud being spotted. Standardised large-scale applications of biometrics for general compulsory use without countervailing measures will probably produce the reverse. This contribution tentatively presents a few headlines for an overall biometrics strategy that could better resist identity fraud.
Biometric and Emotion Identification: An ECG Compression Based Method.
Brás, Susana; Ferreira, Jacqueline H T; Soares, Sandra C; Pinho, Armando J
2018-01-01
We present an innovative and robust solution to both biometric and emotion identification using the electrocardiogram (ECG). The ECG represents the electrical signal that comes from the contraction of the heart muscles, indirectly representing the flow of blood inside the heart, and it is known to convey a key that allows biometric identification. Moreover, due to its relationship with the nervous system, it also varies as a function of the emotional state. The use of information-theoretic data models, associated with data compression algorithms, allowed us to effectively compare ECG records and infer the person's identity, as well as the emotional state at the time of data collection. The proposed method does not require ECG wave delineation or alignment, which reduces preprocessing error. The method is divided into three steps: (1) conversion of the real-valued ECG record into a symbolic time series, using a quantization process; (2) conditional compression of the symbolic representation of the ECG, using the symbolic ECG records stored in the database as reference; (3) identification of the ECG record class, using a 1-NN (nearest neighbor) classifier. We obtained over 98% accuracy in biometric identification, whereas in emotion recognition we attained over 90%. Therefore, the method adequately identifies the person and his/her emotion. Also, the proposed method is flexible and may be adapted to different problems by the alteration of the templates used for training the model.
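A minimal sketch of the three steps, using zlib as a stand-in for the paper's information-theoretic (finite-context) compressor; the quantization level count and the cost normalisation are illustrative:

```python
import zlib
import numpy as np

def quantise(ecg: np.ndarray, levels: int = 16) -> bytes:
    """Step 1: map the real-valued ECG onto a small symbolic alphabet."""
    lo, hi = ecg.min(), ecg.max()
    symbols = np.clip(((ecg - lo) / (hi - lo + 1e-12) * levels).astype(int),
                      0, levels - 1)
    return bytes(symbols.tolist())

def conditional_cost(reference: bytes, target: bytes) -> float:
    """Step 2: approximate C(target | reference) with a generic compressor."""
    c_ref = len(zlib.compress(reference, 9))
    c_both = len(zlib.compress(reference + target, 9))
    return (c_both - c_ref) / len(target)

def identify(probe: np.ndarray, enrolled: dict) -> str:
    """Step 3: 1-NN over compression costs against each enrolled symbolic record."""
    probe_sym = quantise(probe)
    return min(enrolled, key=lambda name: conditional_cost(enrolled[name], probe_sym))
```

Here `enrolled` maps identity labels to previously quantised ECG records; the probe is attributed to the identity whose record compresses it most cheaply.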
NASA Astrophysics Data System (ADS)
Arndt, Craig M.
2004-08-01
Biometrics are a powerful technology for identifying humans both locally and at a distance. In order to perform identification or verification, biometric systems capture an image of some biometric of a user or subject. The image is then converted mathematically into a representation of the person called a template. Since every human in the world is different, each human will have different biometric images (different fingerprints, faces, etc.). This is what makes biometrics useful for identification. However, unlike a credit card number or a password, which can be given to a person and later revoked if compromised, a biometric is with the person for life. The problem then is to develop biometric templates which can be easily revoked and reissued, which are also unique to the user, and which can be easily used for identification and verification. In this paper we develop and present a method to generate a set of templates which are fully unique to the individual and also revocable. By using basis-set compression algorithms in an n-dimensional orthogonal space, we can represent a given biometric image in an infinite number of equally valued and unique ways. The verification and biometric matching system would be presented with a given template and a revocation code. The code then represents where in the sequence of n-dimensional vectors to start the recognition.
Strait, Robert S.; Pearson, Peter K.; Sengupta, Sailes K.
2000-01-01
A password system comprises a set of codewords spaced apart from one another by a Hamming distance (HD) that exceeds twice the variability that can be projected for a series of biometric measurements for a particular individual and that is less than the HD that can be encountered between two individuals. To enroll an individual, a biometric measurement is taken and exclusive-ORed with a random codeword to produce a "reference value." To verify the individual later, a biometric measurement is taken and exclusive-ORed with the reference value to reproduce the original random codeword or its approximation. If the reproduced value is not a codeword, the nearest codeword to it is found, and the bits that were corrected to produce the codeword are also toggled in the biometric measurement taken and the codeword generated during enrollment. The correction scheme can be implemented by any conventional error correction code, such as a Reed-Muller code R(m,n). In the implementation using a hand geometry device, an R(2,5) code has been used in this invention. Such a codeword and biometric measurement can then be used to see if the individual is an authorized user. Conventional Diffie-Hellman public key encryption schemes and hashing procedures can then be used to secure the communications lines carrying the biometric information and to secure the database of authorized users.
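A runnable toy version of the enroll/verify XOR scheme, with a 3x repetition code standing in for the patent's Reed-Muller R(2,5) code (in the actual system the codeword would not be stored or compared in the clear):

```python
import numpy as np

def rep3_encode(bits: np.ndarray) -> np.ndarray:
    """Toy 3x repetition code standing in for Reed-Muller R(2,5)."""
    return np.repeat(bits, 3)

def rep3_decode(code: np.ndarray) -> np.ndarray:
    """Majority-vote nearest-codeword decoding."""
    return (code.reshape(-1, 3).sum(axis=1) >= 2).astype(np.uint8)

def enroll(bio: np.ndarray, msg_bits: np.ndarray) -> np.ndarray:
    """Reference value = biometric XOR random codeword; reveals neither alone."""
    return bio ^ rep3_encode(msg_bits)

def verify(fresh_bio: np.ndarray, reference: np.ndarray, msg_bits: np.ndarray) -> bool:
    """Fresh measurement XOR reference ~ codeword plus bit noise; decode and compare."""
    return bool(np.array_equal(rep3_decode(fresh_bio ^ reference), msg_bits))

rng = np.random.default_rng(0)
bio = rng.integers(0, 2, 24, dtype=np.uint8)   # enrollment measurement
msg = rng.integers(0, 2, 8, dtype=np.uint8)    # random codeword message
ref = enroll(bio, msg)
noisy = bio.copy(); noisy[3] ^= 1              # one bit flips at verification
assert verify(noisy, ref, msg)                 # corrected by the code
```

The XOR of the fresh measurement with the reference equals the enrollment codeword plus the measurement noise, which the code corrects as long as the noise stays within its correction radius.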
Biometric templates selection and update using quality measures
NASA Astrophysics Data System (ADS)
Abboud, Ali J.; Jassim, Sabah A.
2012-06-01
To deal with severe variation in recording conditions, most biometric systems acquire multiple biometric samples at the enrolment stage for the same person, then extract their individual biometric feature vectors and store them in the gallery in the form of biometric template(s) labelled with the person's identity. The number of samples/templates and the choice of the most appropriate templates influence the performance of the system. The desired biometric template selection technique must aim to control the run time and storage requirements while improving the recognition accuracy of the biometric system. This paper is devoted to elaborating on and discussing a new two-stage approach for biometric template selection and update. This approach uses quality-based clustering, followed by a special criterion for the selection of an ultimate set of biometric templates from the various clusters. The approach is developed to adaptively select a specific number of templates for each individual; the number of biometric templates depends mainly on the performance of each individual (i.e. the gallery size should be optimised to meet the needs of each target individual). Experiments have been conducted on two face image databases, and their results demonstrate the effectiveness of the proposed quality-guided approach.
Super Bowl Surveillance: Facing Up to Biometrics
2001-05-01
Biometric facial recognition can provide significant benefits to society. At the same time, the rapid growth and improvement in the technology could… using facial recognition where it can produce positive benefits. Biometric facial recognition is by no means a perfect technology, and much technical…
2016-05-01
…Biometrics in Support of Operations; Biometrics-at-Sea: Business Rules for South Florida; United States… Intelligence Activities; Biometrics-Enabled Intelligence; USCG Biometrics-at-Sea: Business Rules for…; Defense Biometrics; United States Intelligence Activities; Active Army, …
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, R.; Grace, W.
1996-07-01
This is the final report of a one-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). We won a 1994 R&D 100 Award for inventing the Bartas Iris Verification System. The system has been delivered to a sponsor and is no longer available to us. This technology can verify the identity of a person for purposes of access control, national security, law enforcement, forensics, counter-terrorism, and medical, financial, or scholastic records. The technique is non-invasive, psychologically acceptable, works in real-time, and obtains more biometric data than any other biometric except DNA analysis. This project sought to develop a new, second-generation prototype instrument.
Selectively Encrypted Pull-Up Based Watermarking of Biometric data
NASA Astrophysics Data System (ADS)
Shinde, S. A.; Patel, Kushal S.
2012-10-01
Biometric authentication systems are becoming increasingly popular due to their potential usage in information security. However, digital biometric data (e.g. thumb impressions) are themselves vulnerable to security attacks. Various methods are available to secure biometric data. In biometric watermarking the data are embedded in an image container and are only retrieved if the secret key is available. This container image is encrypted for more security against attack. As wireless devices are equipped with batteries as their power supply, they have limited computational capabilities; therefore, to reduce energy consumption we use the method of selective encryption of the container image. The bit pull-up-based biometric watermarking scheme is based on amplitude modulation and bit priority, which reduces the retrieval error rate to a great extent. By using a selective encryption mechanism we expect more time efficiency in both encryption and decryption. A significant reduction in error rate is expected to be achieved by the bit pull-up method.
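A sketch of the selective-encryption idea: only part of the container image is XOR-enciphered, roughly halving the device's cryptographic workload. The SHA-256 counter keystream and the every-other-block-row selection pattern are illustrative stand-ins, not the paper's scheme and not production cryptography:

```python
import hashlib
import numpy as np

def keystream(key: bytes, n: int) -> np.ndarray:
    """Simple SHA-256 counter-mode keystream (illustration only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return np.frombuffer(out[:n], dtype=np.uint8)

def selective_encrypt(image: np.ndarray, key: bytes, block: int = 16) -> np.ndarray:
    """XOR-encrypt every other block row of an 8-bit grayscale container image."""
    out = image.copy()
    h, _ = image.shape
    for r in range(0, h, 2 * block):           # skip alternate rows: 'selective'
        strip = out[r:r + block, :]
        ks = keystream(key, strip.size).reshape(strip.shape)
        out[r:r + block, :] = strip ^ ks
    return out
```

Decryption applies the same function with the same key, since XOR is its own inverse; a real scheme would select exactly the blocks carrying watermark bits.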
Biometrics and international migration.
Redpath, Jillyanne
2007-01-01
This paper will focus on the impact of the rapid expansion in the use of biometric systems in migration management on the rights of individuals; it seeks to highlight legal issues for consideration in implementing such systems, taking as the starting point that the security interests of the state and the rights of the individual are not, and should not be, mutually exclusive. The first part of this paper briefly describes the type of biometric applications available, how biometric systems function, and those used in migration management. The second part examines the potential offered by biometrics for greater security in migration management, and focuses on developments in the use of biometrics as a result of September 11. The third part discusses the impact of the use of biometrics in the management of migration on the individual's right to privacy and ability to move freely and lawfully. The paper highlights the increasing need for domestic and international frameworks to govern the use of biometric applications in the migration/security context, and proposes a number of issues that such frameworks could address.
The biometric recognition on contactless multi-spectrum finger images
NASA Astrophysics Data System (ADS)
Kang, Wenxiong; Chen, Xiaopeng; Wu, Qiuxia
2015-01-01
This paper presents a novel multimodal biometric system based on contactless multi-spectrum finger images, which aims to deal with the limitations of unimodal biometrics. The chief merits of the system are the richness of the available texture and the ease of data access. We constructed a multi-spectrum instrument to simultaneously acquire three different types of biometrics from a finger: contactless fingerprint, finger vein, and knuckleprint. On the basis of the samples with these characteristics, a moderate-size database was built for the evaluation of our system. Considering the real-time requirements and the respective characteristics of the three biometrics, the block local binary patterns algorithm was used to extract and match features for the fingerprints and finger veins, while the Oriented FAST and Rotated BRIEF algorithm was applied for knuckleprints. Finally, score-level fusion was performed on the matching results from the aforementioned three types of biometrics. The experiments showed that our proposed multimodal biometric recognition system achieves an equal error rate of 0.109%, which is 88.9%, 94.6%, and 89.7% lower than the individual fingerprint, knuckleprint, and finger vein recognitions, respectively. Moreover, our proposed system satisfies the real-time requirements of the applications.
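The fusion rule is not detailed in the abstract; a weighted sum after min-max normalisation is the usual score-level baseline. The weights and score ranges below are assumptions for illustration, not values from the paper:

```python
def minmax(score: float, lo: float, hi: float) -> float:
    """Map a matcher's raw score into [0, 1] using its observed score range."""
    return (score - lo) / (hi - lo + 1e-12)

def fused_score(scores: dict, ranges: dict, weights: dict) -> float:
    """Weighted-sum fusion over fingerprint, finger-vein and knuckleprint scores."""
    return sum(weights[m] * minmax(scores[m], *ranges[m]) for m in scores)

s = fused_score(
    scores={"fingerprint": 71.0, "vein": 0.62, "knuckleprint": 33.0},
    ranges={"fingerprint": (0, 100), "vein": (0, 1), "knuckleprint": (0, 60)},
    weights={"fingerprint": 0.4, "vein": 0.35, "knuckleprint": 0.25},
)
```

The fused value is then thresholded once, instead of thresholding each modality separately, which is what drives the large EER reduction reported.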
Mei, Jiangyuan; Liu, Meizhu; Wang, Yuan-Fang; Gao, Huijun
2016-06-01
Multivariate time series (MTS) datasets broadly exist in numerous fields, including health care, multimedia, finance, and biometrics. How to classify MTS accurately has become a hot research topic since it is an important element in many computer vision and pattern recognition applications. In this paper, we propose a Mahalanobis distance-based dynamic time warping (DTW) measure for MTS classification. The Mahalanobis distance builds an accurate relationship between each variable and its corresponding category. It is utilized to calculate the local distance between vectors in MTS. Then we use DTW to align those MTS which are out of synchronization or with different lengths. After that, how to learn an accurate Mahalanobis distance function becomes another key problem. This paper establishes a LogDet divergence-based metric learning with triplet constraint model which can learn Mahalanobis matrix with high precision and robustness. Furthermore, the proposed method is applied on nine MTS datasets selected from the University of California, Irvine machine learning repository and Robert T. Olszewski's homepage, and the results demonstrate the improved performance of the proposed approach.
Gas discharge visualization: an imaging and modeling tool for medical biometrics.
Kostyuk, Nataliya; Cole, Phyadragren; Meghanathan, Natarajan; Isokpehi, Raphael D; Cohly, Hari H P
2011-01-01
The need for automated identification of a disease makes the issue of medical biometrics very current in our society. Not all biometric tools available provide real-time feedback. We introduce gas discharge visualization (GDV) technique as one of the biometric tools that have the potential to identify deviations from the normal functional state at early stages and in real time. GDV is a nonintrusive technique to capture the physiological and psychoemotional status of a person and the functional status of different organs and organ systems through the electrophotonic emissions of fingertips placed on the surface of an impulse analyzer. This paper first introduces biometrics and its different types and then specifically focuses on medical biometrics and the potential applications of GDV in medical biometrics. We also present our previous experience with GDV in the research regarding autism and the potential use of GDV in combination with computer science for the potential development of biological pattern/biomarker for different kinds of health abnormalities including cancer and mental diseases.
Yager, Neil; Dunstone, Ted
2010-02-01
It is commonly accepted that users of a biometric system may have differing degrees of accuracy within the system. Some people may have trouble authenticating, while others may be particularly vulnerable to impersonation. Goats, wolves, and lambs are labels commonly applied to these problem users. These user types are defined in terms of verification performance when users are matched against themselves (goats) or when matched against others (lambs and wolves). The relationship between a user's genuine and impostor match results suggests four new user groups: worms, doves, chameleons, and phantoms. We establish formal definitions for these animals and a statistical test for their existence. A thorough investigation is conducted using a broad range of biometric modalities, including 2D and 3D faces, fingerprints, iris, speech, and keystroke dynamics. Patterns that emerge from the results expose novel, important, and encouraging insights into the nature of biometric match results. A new framework for the evaluation of biometric systems based on the biometric menagerie, as opposed to collective statistics, is proposed.
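A toy labelling of the four new animals from a user's mean genuine and impostor match scores; the fixed thresholds here (e.g. cohort quartiles) stand in for the paper's formal statistical test:

```python
import numpy as np

def menagerie_label(genuine: np.ndarray, impostor: np.ndarray,
                    g_thresh: float, i_thresh: float) -> str:
    """Quadrant label from a user's mean genuine and impostor scores."""
    g, i = genuine.mean(), impostor.mean()
    if g < g_thresh and i > i_thresh:
        return "worm"       # matches self badly, others well: worst case
    if g > g_thresh and i < i_thresh:
        return "dove"       # matches self well, others badly: best case
    if g < g_thresh and i < i_thresh:
        return "phantom"    # low scores against everyone
    return "chameleon"      # high scores against everyone
```

Evaluating a system per animal group, rather than with collective statistics alone, is the framework shift the paper proposes.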
Modeling long-term human activeness using recurrent neural networks for biometric data.
Kim, Zae Myung; Oh, Hyungrai; Kim, Han-Gyu; Lim, Chae-Gyun; Oh, Kyo-Joong; Choi, Ho-Jin
2017-05-18
With the invention of fitness trackers, it has been possible to continuously monitor a user's biometric data such as heart rates, number of footsteps taken, and amount of calories burned. This paper names the time series of these three types of biometric data, the user's "activeness", and investigates the feasibility in modeling and predicting the long-term activeness of the user. The dataset used in this study consisted of several months of biometric time-series data gathered by seven users independently. Four recurrent neural network (RNN) architectures-as well as a deep neural network and a simple regression model-were proposed to investigate the performance on predicting the activeness of the user under various length-related hyper-parameter settings. In addition, the learned model was tested to predict the time period when the user's activeness falls below a certain threshold. A preliminary experimental result shows that each type of activeness data exhibited a short-term autocorrelation; and among the three types of data, the consumed calories and the number of footsteps were positively correlated, while the heart rate data showed almost no correlation with neither of them. It is probably due to this characteristic of the dataset that although the RNN models produced the best results on modeling the user's activeness, the difference was marginal; and other baseline models, especially the linear regression model, performed quite admirably as well. Further experimental results show that it is feasible to predict a user's future activeness with precision, for example, a trained RNN model could predict-with the precision of 84%-when the user would be less active within the next hour given the latest 15 min of his activeness data. This paper defines and investigates the notion of a user's "activeness", and shows that forecasting the long-term activeness of the user is indeed possible. Such information can be utilized by a health-related application to proactively
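A minimal sketch of one RNN variant, assuming a single-layer GRU in PyTorch that maps the latest 15 one-minute samples of (heart rate, steps, calories) to a next-hour activeness estimate; all sizes are assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

class ActivenessRNN(nn.Module):
    """GRU regressor over short windows of three biometric channels."""
    def __init__(self, n_features: int = 3, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        _, h = self.gru(x)        # x: (batch, time, features)
        return self.head(h[-1])   # h[-1]: final hidden state of the last layer

model = ActivenessRNN()
window = torch.randn(8, 15, 3)    # batch of 15-minute windows
prediction = model(window)        # shape (8, 1): next-hour activeness
```

Comparing the prediction against a fixed activeness threshold yields the "will the user be less active within the next hour" decision the paper evaluates.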
The research and application of multi-biometric acquisition embedded system
NASA Astrophysics Data System (ADS)
Deng, Shichao; Liu, Tiegen; Guo, Jingjing; Li, Xiuyan
2009-11-01
Identification technology based on multiple biometrics can greatly improve applicability, reliability and resistance to falsification. This paper presents a multi-biometric system based on an embedded platform, which includes: three capture daughter boards used to obtain different biometrics, one each for fingerprint, iris, and vein of the back of the hand; an FPGA (Field Programmable Gate Array) designed as a coprocessor, which configures the three daughter boards on request and provides a data path between the DSP (digital signal processor) and the daughter boards; and the DSP as the master processor, whose functions include controlling the biometric information acquisition, extracting features as required, and comparing the results with the local database or a data server through network communication. The advantages of this system are that it can acquire three different biometrics in real time and can flexibly extract complex features from the raw data of different biometrics according to different purposes and algorithms, while the network interface on the core board provides a solution for large data scales. Because this embedded system has high stability, reliability and flexibility and fits different data scales, it can satisfy the demands of multi-biometric recognition.
2012-03-13
…aspects associated with the use of fingerprinting. Another form of physical biometrics is facial recognition. Facial recognition, unlike other… originated back in the early 1960s. One of the leading pioneers in facial recognition biometrics was Woodrow W. Bledsoe, who developed a… identified match. There are several advantages associated with facial recognition: it is highly reliable, used extensively in security systems, and…
The biometric-based module of smart grid system
NASA Astrophysics Data System (ADS)
Engel, E.; Kovalev, I. V.; Ermoshkina, A.
2015-10-01
Within the Smart Grid concept, a flexible biometric-based module based on Principal Component Analysis (PCA) and a selective Neural Network is developed. For the formation of the selective Neural Network, the biometric-based module uses a method that includes three main stages: preliminary processing of the image, face localization, and face recognition. Experiments on the Yale face database show that (i) the selective Neural Network exhibits promising classification capability for face detection and recognition problems; and (ii) the proposed biometric-based module achieves near real-time face detection and recognition speed and competitive performance, as compared to some existing subspace-based methods.
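A sketch of the PCA front end that would feed such a selective neural network (the network itself is omitted); the component count is illustrative:

```python
import numpy as np

def fit_eigenfaces(faces: np.ndarray, k: int = 20):
    """faces: (n_samples, n_pixels) flattened, aligned face images."""
    mean = faces.mean(axis=0)
    centred = faces - mean
    # SVD of the centred data; rows of Vt are the principal components
    _, _, Vt = np.linalg.svd(centred, full_matrices=False)
    return mean, Vt[:k]

def project(face: np.ndarray, mean: np.ndarray, components: np.ndarray) -> np.ndarray:
    """Low-dimensional PCA features for the downstream classifier."""
    return components @ (face - mean)
```

Localized face crops are projected into this k-dimensional eigenface space, and the classifier operates on the projections rather than raw pixels, which is what enables near real-time speed.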
NASA Astrophysics Data System (ADS)
Hsu, Charles; Viazanko, Michael; O'Looney, Jimmy; Szu, Harold
2009-04-01
The Modularity Biometric System (MBS) is an approach to support AiTR of cooperative and/or non-cooperative standoff biometrics in area persistent surveillance. An advanced active and passive EO/IR and RF sensor suite is not considered here, nor do we consider the ROC (PD vs. FAR) versus the standoff POT in this paper. Our goal is to catch the "most wanted (MW)" two dozen, separating an ad hoc woman-MW class from a man-MW class, given their sparse front-face data basis, by means of various new instantaneous inputs called probing faces. We present an advanced algorithm: a mini-Max classifier, a sparse-sample realization of the Cramer-Rao Fisher bound of the Maximum Likelihood classifier, which minimizes the dispersion within the same woman classes and maximizes the separation among different man-woman classes, based on the simple feature space of MIT Pentland eigenfaces. The original aspect consists of a modular structured design approach at the system level with multi-level architectures, multiple computing paradigms, and adaptable/evolvable techniques, allowing a scalable structure in terms of biometric algorithms, identification quality, sensors, database complexity, database integration, and component heterogeneity. MBS consists of a number of biometric technologies including fingerprints, vein maps, and voice and face recognition with innovative DSP algorithms and their hardware implementations, such as Field Programmable Gate Arrays (FPGAs). Biometric technologies and the composed modularity biometric system are significant for governmental agencies, enterprises, banks and all other organizations to protect people or control access to critical resources.
Secure biometric image sensor and authentication scheme based on compressed sensing.
Suzuki, Hiroyuki; Suzuki, Masamichi; Urabe, Takuya; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki
2013-11-20
It is important to ensure the security of biometric authentication information, because its leakage causes serious risks, such as replay attacks using the stolen biometric data, and also because it is almost impossible to replace raw biometric information. In this paper, we propose a secure biometric authentication scheme that protects such information by employing an optical data ciphering technique based on compressed sensing. The proposed scheme is based on two-factor authentication, the biometric information being supplemented by secret information that is used as a random seed for a cipher key. In this scheme, a biometric image is optically encrypted at the time of image capture, and a pair of restored biometric images for enrollment and verification are verified in the authentication server. If any of the biometric information is exposed to risk, it can be reenrolled by changing the secret information. Through numerical experiments, we confirm that finger vein images can be restored from the compressed sensing measurement data. We also present results that verify the accuracy of the scheme.
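The optical ciphering can be modelled as a compressed-sensing measurement y = Φx, where the measurement matrix Φ is generated from the user's secret. A minimal sketch (server-side reconstruction requires a sparse solver, e.g. l1 minimisation, omitted here; the seeding scheme is an assumption):

```python
import numpy as np

def cs_capture(image: np.ndarray, secret_seed: int, m: int) -> np.ndarray:
    """Model optical CS encryption at capture: y = Phi @ x with secret-seeded Phi."""
    x = image.ravel().astype(float)
    rng = np.random.default_rng(secret_seed)        # seed derived from the secret
    phi = rng.standard_normal((m, x.size)) / np.sqrt(m)
    return phi @ x

measurement = cs_capture(np.random.rand(16, 16), secret_seed=2024, m=64)
```

Re-enrolment after a compromise amounts to choosing a new seed, which yields a completely different measurement matrix and hence a new protected template from the same finger vein.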
On Biometrics With Eye Movements.
Zhang, Youming; Juhola, Martti
2017-09-01
Eye movements are a relatively novel data source for biometric identification. As the video cameras applied to eye tracking become smaller and more efficient, this data source offers interesting opportunities for the development of eye movement biometrics. In this paper, we study primarily biometric identification, seen as a classification task with multiple classes, and secondarily biometric verification, considered as binary classification. Our research is based on saccadic eye movement signal measurements from 109 young subjects. In order to test the data measured, we use a procedure of biometric identification according to the one-versus-one (subject) principle. In a development from our previous research, which also involved biometric verification based on saccadic eye movements, we now apply another eye movement tracker device with a higher sampling frequency of 250 Hz. The results obtained are good, with correct identification rates of 80-90% at their best.
Emerging Biometric Modalities: Challenges and Opportunities
NASA Astrophysics Data System (ADS)
Gafurov, Davrondzhon
Recent advances in sensor technology and the widespread use of various electronics (computers, PDAs, mobile phones, etc.) provide new opportunities for capturing and analysing novel physiological and behavioural traits of human beings for biometric authentication. This paper presents an overview of several such types of human characteristics that have been proposed as alternatives to traditional types of biometrics. We refer to these characteristics as emerging biometrics. We survey various types of emerging modalities and techniques, and discuss their pros and cons. Emerging biometrics faces several limitations and challenges, which include subject population coverage (focusing mostly on adults); unavailability of benchmark databases; little research with respect to vulnerability/robustness against attacks; and some privacy concerns they may raise. In addition, the recognition performance of emerging modalities is generally less accurate compared to traditional biometrics. Despite all of this, emerging biometrics possess their own benefits and advantages compared to traditional biometrics, which makes them still attractive for research. First of all, emerging biometrics can always serve as a complementary source of identity information; they can also be suitable in applications where traditional biometrics are difficult or impossible to adopt, such as continuous or periodic re-verification of the user's identity.
Combining Cryptography with EEG Biometrics.
Damaševičius, Robertas; Maskeliūnas, Rytis; Kazanavičius, Egidijus; Woźniak, Marcin
2018-01-01
Cryptographic frameworks depend on key sharing for ensuring the security of data. While the keys in cryptographic frameworks must be correctly reproducible and not unequivocally connected to the identity of a user, in biometric frameworks this is different. Joining cryptography techniques with biometrics can solve these issues. We present a biometric authentication method based on the discrete logarithm problem and Bose-Chaudhuri-Hocquenghem (BCH) codes, perform its security analysis, and demonstrate its security characteristics. We evaluate a biometric cryptosystem using our own dataset of electroencephalography (EEG) data collected from 42 subjects. The experimental results show that the described biometric user authentication system is effective, achieving an Equal Error Rate (EER) of 0.024.
Biometrics: Accessibility challenge or opportunity?
Blanco-Gonzalo, Ramon; Lunerti, Chiara; Sanchez-Reillo, Raul; Guest, Richard Michael
2018-01-01
Biometric recognition is currently implemented in several authentication contexts, most recently in mobile devices where it is expected to complement or even replace traditional authentication modalities such as PIN (Personal Identification Number) or passwords. The assumed convenience characteristics of biometrics are transparency, reliability and ease-of-use, however, the question of whether biometric recognition is as intuitive and straightforward to use is open to debate. Can biometric systems make some tasks easier for people with accessibility concerns? To investigate this question, an accessibility evaluation of a mobile app was conducted where test subjects withdraw money from a fictitious ATM (Automated Teller Machine) scenario. The biometric authentication mechanisms used include face, voice, and fingerprint. Furthermore, we employed traditional modalities of PIN and pattern in order to check if biometric recognition is indeed a real improvement. The trial test subjects within this work were people with real-life accessibility concerns. A group of people without accessibility concerns also participated, providing a baseline performance. Experimental results are presented concerning performance, HCI (Human-Computer Interaction) and accessibility, grouped according to category of accessibility concern. Our results reveal links between individual modalities and user category establishing guidelines for future accessible biometric products.
8 CFR 103.17 - Biometric service fee.
Code of Federal Regulations, 2012 CFR
2012-01-01
... BENEFITS; BIOMETRIC REQUIREMENTS; AVAILABILITY OF RECORDS Biometric Requirements § 103.17 Biometric service... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Biometric service fee. 103.17 Section 103... biometric information at a DHS office, other designated collection site overseas, or a registered State or...
8 CFR 103.17 - Biometric service fee.
Code of Federal Regulations, 2013 CFR
2013-01-01
... BENEFITS; BIOMETRIC REQUIREMENTS; AVAILABILITY OF RECORDS Biometric Requirements § 103.17 Biometric service... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Biometric service fee. 103.17 Section 103... biometric information at a DHS office, other designated collection site overseas, or a registered State or...
8 CFR 103.17 - Biometric service fee.
Code of Federal Regulations, 2014 CFR
2014-01-01
... BENEFITS; BIOMETRIC REQUIREMENTS; AVAILABILITY OF RECORDS Biometric Requirements § 103.17 Biometric service... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Biometric service fee. 103.17 Section 103... biometric information at a DHS office, other designated collection site overseas, or a registered State or...
Review of modern biometric user authentication and their development prospects
NASA Astrophysics Data System (ADS)
Boriev, Z. V.; Sokolov, S. S.; Nyrkov, A. P.
2015-09-01
This article discusses the possibility of using biometric information technologies in management. It gives a brief overview of access control and time-and-attendance systems, analyses biometric user identification systems, and offers recommendations on the use of the various systems depending on the specific task.
A bimodal biometric identification system
NASA Astrophysics Data System (ADS)
Laghari, Mohammad S.; Khuwaja, Gulzar A.
2013-03-01
Biometrics consists of methods for uniquely recognizing humans based upon one or more intrinsic physical or behavioral traits. Physical traits are related to the shape of the body; behavioral traits are related to the behavior of a person. However, biometric authentication systems suffer from imprecision and difficulty in person recognition for a number of reasons, and no single biometric is expected to effectively satisfy the requirements of all verification and/or identification applications. Bimodal biometric systems are expected to be more reliable due to the presence of two pieces of evidence and also to be able to meet the severe performance requirements imposed by various applications. This paper presents a neural network based bimodal biometric identification system using human face and handwritten signature features.
United States Homeland Security and National Biometric Identification
2002-04-09
security number. Biometrics is the use of unique individual traits such as fingerprints, iris eye patterns, voice recognition, and facial recognition to...technology to control access onto their military bases using a Defense Manpower Management Command developed software application. FACIAL Facial recognition systems...installed facial recognition systems in conjunction with a series of 200 cameras to fight street crime and identify terrorists. The cameras, which are
A method of ECG template extraction for biometrics applications.
Zhou, Xiang; Lu, Yang; Chen, Meng; Bao, Shu-Di; Miao, Fen
2014-01-01
ECG has attracted widespread attention as one of the most important non-invasive physiological signals in healthcare-related biometrics, owing to characteristics such as ease of monitoring and individual uniqueness, as well as its important clinical value. This study proposes a dynamic threshold setting method to extract the most stable ECG waveform as the template for the subsequent ECG identification process. With the proposed method, the accuracy of ECG biometrics using dynamic time warping as the dissimilarity measure is significantly improved. Analysis results with a self-built electrocardiogram database show that the proposed method reduced the half total error rate of the ECG biometric system from 3.35% to 1.45%. Its average running time on an Android mobile terminal was around 0.06 seconds, demonstrating acceptable real-time performance.
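The matching stage described in this abstract is easy to illustrate. The following is a minimal sketch, assuming a dictionary of per-user template beats and an arbitrary acceptance threshold (names and values are hypothetical, not from the paper), of identification by dynamic time warping distance; the paper's dynamic threshold setting would replace the fixed threshold used here.

```python
# Minimal sketch (illustrative names/threshold, not from the paper): identify a
# query heartbeat by its dynamic time warping (DTW) distance to stored templates.
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def identify(query_beat, templates, threshold=2.5):
    """Return the enrolled identity closest under DTW, or None if the best
    distance exceeds the (tunable, here arbitrary) acceptance threshold."""
    best_id, best_d = None, np.inf
    for user_id, template in templates.items():
        d = dtw_distance(query_beat, template)
        if d < best_d:
            best_id, best_d = user_id, d
    return best_id if best_d <= threshold else None
```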
Can soft biometric traits assist user recognition?
NASA Astrophysics Data System (ADS)
Jain, Anil K.; Dass, Sarat C.; Nandakumar, Karthik
2004-08-01
Biometrics is rapidly gaining acceptance as the technology that can meet the ever-increasing need for security in critical applications. Biometric systems automatically recognize individuals based on their physiological and behavioral characteristics. Hence, the fundamental requirement of any biometric recognition system is a human trait having several desirable properties like universality, distinctiveness, permanence, collectability, acceptability, and resistance to circumvention. However, a human characteristic that possesses all these properties has not yet been identified. As a result, none of the existing biometric systems provide perfect recognition and there is scope for improving the performance of these systems. Although characteristics like gender, ethnicity, age, height, weight and eye color are not unique and reliable, they provide some information about the user. We refer to these characteristics as "soft" biometric traits and argue that these traits can complement the identity information provided by primary biometric identifiers like fingerprint and face. This paper presents the motivation for utilizing soft biometric information and analyzes how soft biometric traits can be automatically extracted and incorporated in the decision making process of the primary biometric system. Preliminary experiments were conducted on a fingerprint database of 160 users by synthetically generating soft biometric traits like gender, ethnicity, and height based on known statistics. The results show that the use of additional soft biometric user information significantly improves (by approximately 6%) the recognition performance of the fingerprint biometric system.
Gait biometrics under spoofing attacks: an experimental investigation
NASA Astrophysics Data System (ADS)
Hadid, Abdenour; Ghahramani, Mohammad; Kellokumpu, Vili; Feng, Xiaoyi; Bustard, John; Nixon, Mark
2015-11-01
Gait is a relatively new biometric modality which has a precious advantage over other modalities, such as iris and voice, in that it can be easily captured from a distance. Although it has recently become a topic of great interest in biometric research, there has been little investigation into gait spoofing attacks, where a person tries to imitate the clothing or walking style of someone else. We recently analyzed for the first time the effects of spoofing attacks on silhouette-based gait biometric systems and showed that it was indeed possible to spoof gait biometric systems by clothing impersonation and the deliberate selection of a target that has a similar build to the attacker. To gain deeper insight into the performance of current gait biometric systems under spoofing attacks, we provide a thorough investigation of how clothing can be used to spoof a target and evaluate the performance of two state-of-the-art recognition methods on a gait spoofing database recorded at the University of Southampton. Furthermore, we describe and evaluate an initial solution for coping with gait spoofing attacks. The obtained results are very promising and point out interesting findings which can be used for future investigations.
Compressed ECG biometric: a fast, secured and efficient method for identification of CVD patient.
Sufi, Fahim; Khalil, Ibrahim; Mahmood, Abdun
2011-12-01
Adoption of compression technology is often required for wireless cardiovascular monitoring, due to the enormous size of electrocardiography (ECG) signals and the limited bandwidth of the Internet. However, with present research on ECG-based biometric techniques, compressed ECG must be decompressed before performing human identification. This additional decompression step creates a significant processing delay for the identification task. This becomes an obvious burden on a system if it needs to be done for trillions of compressed ECGs per hour by a hospital. Even though a hospital might be able to afford an expensive infrastructure to handle such processing, for small intermediate nodes in a multihop network, identification preceded by decompression is daunting. In this paper, we report a technique by which a person can be identified directly from his/her compressed ECG. This technique completely obviates the decompression step and therefore makes biometric identification far less demanding for the smaller nodes in a multihop network. The biometric template created by this new technique is smaller than existing ECG-based biometric templates as well as templates of other biometrics like face, finger, retina, etc. (up to 8302 times smaller than a face template and 9 times smaller than the existing ECG-based biometric template). The smaller template substantially reduces the one-to-many matching time for biometric recognition, resulting in a faster biometric authentication mechanism.
Modular Biometric Monitoring System
NASA Technical Reports Server (NTRS)
Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor)
2017-01-01
A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to control communication of data, via the bus, with each of the plurality of data acquisition modules.
Recent Self-Reported Cannabis Use Is Associated With the Biometrics of Delta-9-Tetrahydrocannabinol.
Smith, Matthew J; Alden, Eva C; Herrold, Amy A; Roberts, Andrea; Stern, Dan; Jones, Joseph; Barnes, Allan; O'Connor, Kailyn P; Huestis, Marilyn A; Breiter, Hans C
2018-05-01
Research typically characterizes cannabis use by self-report of cannabis intake frequency. In an effort to better understand relationships between measures of cannabis use, we evaluated if Δ-9-tetrahydrocannabinol (THC) and metabolite concentrations (biometrics) were associated with a calibrated timeline followback (TLFB) assessment of cannabis use. Participants were 35 young adult male cannabis users who completed a calibrated TLFB measure of cannabis use over the past 30 days, including time of last use. The calibration required participants handling four plastic bags of a cannabis substitute (0.25, 0.5, 1.0, and 3.5 grams) to quantify cannabis consumed. Participants provided blood and urine samples for analysis of THC and metabolites, at two independent laboratories. Participants abstained from cannabis use on the day of sample collection. We tested Pearson correlations between the calibrated TLFB measures and cannabis biometrics. Strong correlations were seen between urine and blood biometrics (all r > .73, all p < .001). TLFB measures of times of use and grams of cannabis consumed were significantly related to each biometric, including urine 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH) and blood THC, 11-hydroxy-THC (11-OH-THC), THCCOOH, THCCOOH-glucuronide (times of use: r > .48-.61, all p < .05; grams: r > .40-.49, all p < .05). This study extends prior work to show TLFB methods significantly relate to an extended array of cannabis biometrics. The calibration of cannabis intake in grams was associated with each biometric, although the simple TLFB measure of times of use produced the strongest relationships with all five biometrics. These findings suggest that combined self-report and biometric data together convey the complexity of cannabis use, but allow that either the use of calibrated TLFB measures or biometrics may be sufficient for assessment of cannabis use in research.
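For concreteness, the core of the reported analysis reduces to a Pearson correlation per biometric; a minimal sketch with placeholder numbers (not the study's data) might look like:

```python
# Placeholder numbers, not the study's data: correlate a calibrated TLFB
# measure (times of use in 30 days) with one biometric (urine THCCOOH).
from scipy.stats import pearsonr

times_of_use = [3, 10, 1, 7, 15, 4]
urine_thccooh_ng_ml = [22.0, 95.1, 8.4, 60.3, 140.2, 31.7]

r, p = pearsonr(times_of_use, urine_thccooh_ng_ml)
print(f"r = {r:.2f}, p = {p:.3f}")
```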
NASA Astrophysics Data System (ADS)
Vielhauer, Claus; Croce Ferri, Lucilla
2003-06-01
Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signature and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) with the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and we illustrate the watermark embedding, retrieval and dispute protocols, analyzing their prerequisites, advantages and disadvantages in relation to security requirements.
Wong, Raymond
2013-01-01
Voice is one kind of physiological characteristic that differs for each individual person. Due to this uniqueness, voice classification has found useful applications in classifying speakers' gender, mother tongue or ethnicity (accent), emotional state, identity verification, verbal command control, and so forth. In this paper, we adopt a new preprocessing method named Statistical Feature Extraction (SFX) for extracting important features for training a classification model, based on piecewise transformation treating an audio waveform as a time series. Using SFX we can faithfully remodel the statistical characteristics of the time series; together with spectral analysis, a substantial number of features are extracted in combination. An ensemble is utilized in selecting only the influential features to be used in classification model induction. We focus on comparing the effects of various popular data mining algorithms on multiple datasets. Our experiment consists of classification tests over four typical categories of human voice data, namely Female and Male, Emotional Speech, Speaker Identification, and Language Recognition. The experiments yield encouraging results supporting the fact that heuristically choosing significant features from both time and frequency domains indeed produces better performance in voice classification than traditional signal processing techniques alone, like wavelets and LPC-to-CC. PMID:24288684
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis and post-processing are driven by different users.
- JPL Global Time Series/Velocities - researchers studying the reference frame, combining with VLBI/SLR/DORIS
- JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, and ground water studies
- ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused
- ARIA data system designed to integrate GPS and InSAR - GPS tropospheric delay used for correcting InSAR; Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR
- Zhen Liu's talk tomorrow covers InSAR time series analysis
BossPro: a biometrics-based obfuscation scheme for software protection
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham
2013-05-01
This paper proposes to integrate biometric-based key generation into an obfuscated interpretation algorithm to protect authentication application software from illegitimate use or reverse-engineering. This is especially necessary for mCommerce because application programmes on mobile devices, such as Smartphones and Tablet-PCs, are typically open to misuse by hackers. Therefore, the scheme proposed in this paper ensures that a correct interpretation / execution of the obfuscated program code of the authentication application requires a valid biometric-generated key of the actual person to be authenticated, in real-time. Without this key, the real semantics of the program cannot be understood by an attacker even if he/she gains access to this application code. Furthermore, the security provided by this scheme can be a vital aspect in protecting any application running on mobile devices that are increasingly used to perform business/financial or other security related applications, but are easily lost or stolen. The scheme starts by creating a personalised copy of any application based on the biometric key generated during an enrolment process with the authenticator, as well as a nonce created at the time of communication between the client and the authenticator. The obfuscated code is then shipped to the client's mobile device and integrated with real-time biometric data extracted from the client to form the unlocking key during execution. The novelty of this scheme is achieved by the close binding of this application program to the biometric key of the client, thus making the application unusable for others. Trials and experimental results on biometric key generation, based on clients' faces, and an implemented scheme prototype, based on the Android emulator, prove the concept and novelty of this proposed scheme.
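As a much-simplified sketch of the binding idea, and not the paper's actual algorithm: a key derived from (already stabilized) biometric features unmasks program data at run time, so a wrong biometric yields wrong program semantics. The feature encoding and plain hash below are stand-ins; practical biometric key generation requires error-tolerant schemes.

```python
# Much-simplified sketch of the binding idea, not the paper's algorithm.
import hashlib

def biometric_key(stable_features: bytes) -> bytes:
    # Stand-in: real schemes must tolerate noise in the biometric features.
    return hashlib.sha256(stable_features).digest()

def unmask(obfuscated: bytes, key: bytes) -> bytes:
    # XOR-unmask program constants with the biometric-derived key.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(obfuscated))

key = biometric_key(b"quantized-face-features")      # hypothetical encoding
constants = unmask(bytes([0x12, 0x9a, 0x40]), key)   # garbage under a wrong key
```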
An Intelligent Fingerprint-Biometric Image Scrambling Scheme
NASA Astrophysics Data System (ADS)
Khan, Muhammad Khurram; Zhang, Jiashu
To obstruct attacks and to address the liveness and retransmission issues of biometric images, we have researched challenge/response-based scrambled transmission of biometric images. We propose an intelligent biometric sensor, which has the computational power to receive challenges from the authentication server and generate a response to the challenge with the encrypted biometric image. We utilized the FRT for biometric image encryption and used its scaling factors and random phase mask as additional secret keys. In addition, we generated the random phase masks chaotically with a chaotic map to further improve the encryption security. Experimental and simulation results have shown that the presented system is secure, robust, and deters the risks of attacks on biometric image transmission.
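One ingredient, the chaotically generated random phase mask, can be sketched briefly. The logistic map and its parameters below are assumptions for illustration; the paper's specific map and FRT stage are not reproduced here (a plain FFT stands in for the transform).

```python
# Illustrative only: a random phase mask from a logistic-map chaotic sequence,
# keyed by its initial condition and control parameter.
import numpy as np

def chaotic_phase_mask(shape, x0=0.3141, r=3.99):
    """Iterate x <- r*x*(1-x) and map the sequence to unit-modulus phases."""
    n = int(np.prod(shape))
    seq = np.empty(n)
    x = x0                      # (x0, r) act as the secret key
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return np.exp(2j * np.pi * seq).reshape(shape)

image = np.random.rand(64, 64)                  # stand-in for a biometric image
masked = image * chaotic_phase_mask(image.shape)
encrypted = np.fft.fft2(masked)                 # FFT as a stand-in for the FRT
```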
NIST biometric evaluations and developments
NASA Astrophysics Data System (ADS)
Garris, Michael D.; Wilson, Charles L.
2005-05-01
This paper presents an R&D framework used by the National Institute of Standards and Technology (NIST) for biometric technology testing and evaluation. The focus of this paper is on fingerprint-based verification and identification. Since 9/11 the NIST Image Group has been mandated by Congress to run a program for biometric technology assessment and biometric systems certification. Four essential areas of activity are discussed: 1) developing test datasets; 2) conducting performance assessment; 3) technology development; and 4) standards participation. A description of activities and accomplishments is provided for each of these areas. In the process, methods of performance testing are described and results from specific biometric technology evaluations are presented. This framework is anticipated to have broad applicability to other technology and application domains.
Bridging the gap: from biometrics to forensics
Jain, Anil K.; Ross, Arun
2015-01-01
Biometric recognition, or simply biometrics, refers to automated recognition of individuals based on their behavioural and biological characteristics. The success of fingerprints in forensic science and law enforcement applications, coupled with growing concerns related to border control, financial fraud and cyber security, has generated a huge interest in using fingerprints, as well as other biological traits, for automated person recognition. It is, therefore, not surprising to see biometrics permeating various segments of our society. Applications include smartphone security, mobile payment, border crossing, national civil registry and access to restricted facilities. Despite these successful deployments in various fields, there are several existing challenges and new opportunities for person recognition using biometrics. In particular, when biometric data is acquired in an unconstrained environment or if the subject is uncooperative, the quality of the ensuing biometric data may not be amenable for automated person recognition. This is particularly true in crime-scene investigations, where the biological evidence gleaned from a scene may be of poor quality. In this article, we first discuss how biometrics evolved from forensic science and how its focus is shifting back to its origin in order to address some challenging problems. Next, we enumerate the similarities and differences between biometrics and forensics. We then present some applications where the principles of biometrics are being successfully leveraged into forensics in order to solve critical problems in the law enforcement domain. Finally, we discuss new collaborative opportunities for researchers in biometrics and forensics, in order to address hitherto unsolved problems that can benefit society at large. PMID:26101280
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
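In the spirit of this approach, each series can be reduced to a vector of summary features drawn from different methodological families and compared in feature space. The toy sketch below uses a four-feature subset, nothing like the paper's full battery of over 9000 methods.

```python
# Toy sketch in the spirit of highly comparative analysis: reduce each series
# to a small feature vector from different methodological families.
import numpy as np

def feature_vector(x):
    x = np.asarray(x, dtype=float)
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]                      # autocorrelation
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
    centroid = (spec * np.arange(len(spec))).sum() / spec.sum()  # spectral centroid
    return np.array([x.mean(), x.std(), lag1, centroid])

rng = np.random.default_rng(1)
noise = rng.standard_normal(512)
sine = np.sin(np.linspace(0, 32 * np.pi, 512))
print(feature_vector(noise), feature_vector(sine))  # well separated in feature space
```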
The link between national security and biometrics
NASA Astrophysics Data System (ADS)
Etter, Delores M.
2005-03-01
National security today requires identification of people, things and activities. Biometrics plays an important role in the identification of people, and indirectly, in the identification of things and activities. Therefore, the development of technology and systems that provide faster and more accurate biometric identification is critical to the defense of our country. In addition, the development of a broad range of biometrics is necessary to provide the range of options needed to address flexible and adaptive adversaries. This paper will discuss the importance of a number of critical areas in the development of an environment to support biometrics, including research and development, biometric education, standards, pilot projects, and privacy assurance.
Biometrics, identification and surveillance.
Lyon, David
2008-11-01
Governing by identity describes the emerging regime of a globalizing, mobile world. Governance depends on identification but identification increasingly depends on biometrics. This 'solution' to difficulties of verification is described and some technical weaknesses are discussed. The role of biometrics in classification systems is also considered and is shown to contain possible prejudice in relation to racialized criteria of identity. Lastly, the culture of biometric identification is shown to be limited to abstract data, artificially separated from the lived experience of the body including the orientation to others. It is proposed that creators of national ID systems in particular address these crucial deficiencies in their attempt to provide new modes of verification.
21 CFR 1311.116 - Additional requirements for biometrics.
Code of Federal Regulations, 2012 CFR
2012-04-01
... biometric as described in § 1311.115, it must comply with the following requirements. (b) The biometric subsystem must operate at a false match rate of 0.001 or lower. (c) The biometric subsystem must use... paragraph (h) of this section. (d) The biometric subsystem must conform to Personal Identity Verification...
21 CFR 1311.116 - Additional requirements for biometrics.
Code of Federal Regulations, 2014 CFR
2014-04-01
... biometric as described in § 1311.115, it must comply with the following requirements. (b) The biometric subsystem must operate at a false match rate of 0.001 or lower. (c) The biometric subsystem must use... paragraph (h) of this section. (d) The biometric subsystem must conform to Personal Identity Verification...
A novel approach to transformed biometrics using successive projections
NASA Astrophysics Data System (ADS)
Gopi, E. S.
2010-02-01
Unlike user-created passwords, the number of biometric traits available for creating accounts with different organizations is limited. Transformed biometrics attempts to solve this problem by transforming the biometric into another form that is unique to the particular organization. This makes different transformed biometrics, all derived from the same underlying biometric, available to different organizations, and helps in making transactions foolproof. In this article, a novel approach to transformed biometrics using a successive projection technique is suggested. In the proposed technique, the user can register with up to 5*4^(n-1) organizations if the length of the biometric password is 'n'.
Infrared sensing of non-observable human biometrics
NASA Astrophysics Data System (ADS)
Willmore, Michael R.
2005-05-01
Interest in and growth of biometric recognition technologies surged after 9/11. Once a technology mainly used for identity verification in law enforcement, biometrics are now being considered as a secure means of providing identity assurance in security related applications. Biometric recognition in law enforcement must, by necessity, use attributes of human uniqueness that are both observable and vulnerable to compromise. Privacy and protection of an individual's identity is not assured during criminal activity. However, a security system must rely on identity assurance for access control to physical or logical spaces while not being vulnerable to compromise and while protecting the privacy of an individual. The solution resides in the use of non-observable attributes of human uniqueness to perform the biometric recognition process. This discussion will begin by presenting some key perspectives about biometric recognition and the characteristic differences between observable and non-observable biometric attributes. An introduction to the design, development, and testing of the Thermo-ID system will follow. The Thermo-ID system is an emerging biometric recognition technology that uses non-observable patterns of infrared energy naturally emanating from within the human body. As with all biometric systems, the infrared patterns recorded and compared within the Thermo-ID system are unique and individually distinguishable, permitting a link to be confirmed between an individual and a claimed or previously established identity. The non-observable characteristics of infrared patterns of human uniqueness ensure both the privacy and protection of an individual using this type of biometric recognition system.
An efficient visualization method for analyzing biometric data
NASA Astrophysics Data System (ADS)
Rahmes, Mark; McGonagle, Mike; Yates, J. Harlan; Henning, Ronda; Hackett, Jay
2013-05-01
We introduce a novel application for biometric data analysis. This technology can be used as part of a unique and systematic approach designed to augment existing processing chains. Our system provides image quality control and analysis capabilities. We show how analysis and efficient visualization are used as part of an automated process. The goal of this system is to provide a unified platform for the analysis of biometric images that reduces manual effort and increases the likelihood of a match being brought to an examiner's attention from either a manual or lights-out application. We discuss the functionality of FeatureSCOPE™, which provides an efficient tool for feature analysis and quality control of extracted biometric features. Biometric databases must be checked for accuracy across a large volume of data attributes. Our solution accelerates review of features by a factor of up to 100 times. Qualitative results and cost reduction are demonstrated through efficient parallel visual review for quality control. Our process automatically sorts and filters features for examination, and packs these into a condensed view. An analyst can then rapidly page through screens of features, flagging and annotating outliers as necessary.
Cancelable biometrics realization with multispace random projections.
Teoh, Andrew Beng Jin; Yuang, Chong Tze
2007-10-01
Biometric characteristics cannot be changed; therefore, the loss of privacy is permanent if they are ever compromised. This paper presents a two-factor cancelable formulation, where the biometric data are distorted in a revocable but non-reversible manner by first transforming the raw biometric data into a fixed-length feature vector and then projecting the feature vector onto a sequence of random subspaces that were derived from a user-specific pseudorandom number (PRN). This process is revocable and makes replacing biometrics as easy as replacing PRNs. The formulation has been verified under a number of scenarios (normal, stolen PRN, and compromised biometrics scenarios) using 2400 Facial Recognition Technology face images. The diversity property is also examined.
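A minimal sketch of this two-factor construction, assuming a fixed-length feature vector and using a seeded NumPy generator to stand in for the user-specific PRN, follows; revocation then amounts to issuing a new seed, which yields an entirely different template from the same biometric.

```python
# Minimal sketch: project a fixed-length feature vector onto a random subspace
# derived from a user-specific PRN (here, a seed). Revocation = a new seed.
import numpy as np

def cancelable_template(features, prn_seed, out_dim=32):
    rng = np.random.default_rng(prn_seed)       # PRN is the second factor
    # Gaussian random projection, scaled to roughly preserve vector norms;
    # non-invertible when out_dim < len(features)
    P = rng.standard_normal((out_dim, features.size)) / np.sqrt(out_dim)
    return P @ features

raw = np.random.rand(128)                        # stand-in face feature vector
t_old = cancelable_template(raw, prn_seed=1234)
t_new = cancelable_template(raw, prn_seed=5678)  # reissued after compromise
```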
Promising developments and biometric testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holmes, J.P.
1993-04-01
Biometric identity research and development activities are being conducted in universities, government, and private industry. This paper discusses some of the factors that limit the performance of biometric identity devices, looks at some new developments, and speculates on future developments.
Soft Biometrics; Human Identification Using Comparative Descriptions.
Reid, Daniel A; Nixon, Mark S; Stevenage, Sarah V
2014-06-01
Soft biometrics are a new form of biometric identification which uses physical or behavioral traits that can be naturally described by humans. Unlike other biometric approaches, this allows identification based solely on verbal descriptions, bridging the semantic gap between biometrics and human description. To permit soft biometric identification the description must be accurate, yet conventional human descriptions comprising absolute labels and estimations are often unreliable. A novel method of obtaining human descriptions will be introduced which utilizes comparative categorical labels to describe differences between subjects. This innovative approach has been shown to address many problems associated with absolute categorical labels - most critically, the descriptions contain more objective information and have increased discriminatory capabilities. Relative measurements of the subjects' traits can be inferred from comparative human descriptions using the Elo rating system. The resulting soft biometric signatures have been demonstrated to be robust and to allow accurate recognition of subjects. Relative measurements can also be obtained from other forms of human representation. This is demonstrated using a support vector machine to determine relative measurements from gait biometric signatures - allowing retrieval of subjects from video footage by using human comparisons, bridging the semantic gap.
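The Elo step is straightforward to sketch. The update constant and the encoding of comparative labels as win/draw/loss outcomes below are assumptions for illustration:

```python
# Illustrative Elo update for comparative labels ("A is taller than B", etc.).
def elo_update(r_a, r_b, outcome, k=32.0):
    """outcome: 1.0 if A judged 'more' of the trait, 0.0 if 'less', 0.5 if 'same'."""
    expected_a = 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))
    r_a += k * (outcome - expected_a)
    r_b += k * ((1.0 - outcome) - (1.0 - expected_a))
    return r_a, r_b

ratings = {"A": 1500.0, "B": 1500.0, "C": 1500.0}
# e.g. annotators judged A taller than B, B and C the same, A taller than C
for a, b, outcome in [("A", "B", 1.0), ("B", "C", 0.5), ("A", "C", 1.0)]:
    ratings[a], ratings[b] = elo_update(ratings[a], ratings[b], outcome)
# the final ratings are relative trait measurements inferred from comparisons
```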
Security analysis for biometric data in ID documents
NASA Astrophysics Data System (ADS)
Schimke, Sascha; Kiltz, Stefan; Vielhauer, Claus; Kalker, Ton
2005-03-01
In this paper we analyze chances and challenges with respect to the security of using biometrics in ID documents. We identify goals for ID documents, set by national and international authorities, and discuss the degree of security obtainable with the inclusion of biometrics in documents like passports. Starting from classical techniques for manual authentication of ID card holders, we expand our view towards automatic methods based on biometrics. We do so by reviewing different human biometric attributes by modality, as well as by discussing possible techniques for storing and handling the particular biometric data on the document. Further, we explore possible vulnerabilities of potential biometric passport systems. Based on the findings of that discussion, we expand upon two exemplary approaches for including digital biometric data in the context of ID documents and present potential risks and attack scenarios along with technical aspects such as capacity and robustness.
On the Design of Forgiving Biometric Security Systems
NASA Astrophysics Data System (ADS)
Phan, Raphael C.-W.; Whitley, John N.; Parish, David J.
This work aims to highlight the fundamental issue surrounding biometric security systems: it's all very nice until a biometric is forged, but what do we do after that? Granted, biometric systems are by physical nature supposedly much harder to forge than other factors of authentication, since biometrics on a human body are by right unique to the particular human person. Yet it is also this physical nature that makes it much more catastrophic when a forgery does occur, because it implies that this uniqueness has been forged as well, threatening the human individuality; and since crime has by convention relied on identifying suspects by biometric characteristics, loss of this biometric uniqueness has devastating consequences on the freedom and basic human rights of the victimized individual. This uniqueness-forgery implication also raises the adversary's motivation to forge, since a successful forgery opens up many more impersonation opportunities where biometric systems are used, i.e. physical presence at crime scenes, identification and access to security systems and premises, and access to financial accounts, and hence the ability to use the victim's finances. Depending on the gains, a desperate, highly motivated adversary may even resort to directly obtaining the victim's biometric parts by force, e.g. severing the parts from the victim's body; this poses a risk and threat not just to the individual's uniqueness claim but also to personal safety and well-being. One may then wonder if it is worth putting one's assets, property and safety into the hands of biometrics-based systems when the consequences of biometric forgery far outweigh the consequences of system compromises when no biometrics are used.
Atlantic Bluefin Tuna (Thunnus thynnus) Biometrics and Condition.
Rodriguez-Marin, Enrique; Ortiz, Mauricio; Ortiz de Urbina, José María; Quelle, Pablo; Walter, John; Abid, Noureddine; Addis, Piero; Alot, Enrique; Andrushchenko, Irene; Deguara, Simeon; Di Natale, Antonio; Gatt, Mark; Golet, Walter; Karakulak, Saadet; Kimoto, Ai; Macias, David; Saber, Samar; Santos, Miguel Neves; Zarrad, Rafik
2015-01-01
The compiled data for this study represents the first Atlantic and Mediterranean-wide effort to pool all available biometric data for Atlantic bluefin tuna (Thunnus thynnus) with the collaboration of many countries and scientific groups. Biometric relationships were based on an extensive sampling (over 140,000 fish sampled), covering most of the fishing areas for this species in the North Atlantic Ocean and Mediterranean Sea. Sensitivity analyses were carried out to evaluate the representativeness of sampling and explore the most adequate procedure to fit the weight-length relationship (WLR). The selected model for the WLRs by stock included standardized data series (common measurement types) weighted by the inverse variability. There was little difference between annual stock-specific round weight-straight fork length relationships, with an overall difference of 6% in weight. The predicted weight by month was estimated as an additional component in the exponent of the weight-length function. The analyses of monthly variations of fish condition by stock, maturity state and geographic area reflect annual cycles of spawning and feeding behavior. We update and improve upon the biometric relationships for bluefin currently used by the International Commission for the Conservation of Atlantic Tunas, by incorporating substantially larger datasets than ever previously compiled, providing complete documentation of sources and employing robust statistical fitting. WLRs and other conversion factors estimated in this study differ from the ones used in previous bluefin stock assessments.
Mental State Assessment and Validation Using Personalized Physiological Biometrics
Patel, Aashish N.; Howard, Michael D.; Roach, Shane M.; Jones, Aaron P.; Bryant, Natalie B.; Robinson, Charles S. H.; Clark, Vincent P.; Pilly, Praveen K.
2018-01-01
Mental state monitoring is a critical component of current and future human-machine interfaces, including semi-autonomous driving and flying, air traffic control, decision aids, training systems, and will soon be integrated into ubiquitous products like cell phones and laptops. Current mental state assessment approaches supply quantitative measures, but their only frame of reference is generic population-level ranges. What is needed are physiological biometrics that are validated in the context of task performance of individuals. Using curated intake experiments, we are able to generate personalized models of three key biometrics as useful indicators of mental state; namely, mental fatigue, stress, and attention. We demonstrate improvements to existing approaches through the introduction of new features. Furthermore, addressing the current limitations in assessing the efficacy of biometrics for individual subjects, we propose and employ a multi-level validation scheme for the biometric models by means of k-fold cross-validation for discrete classification and regression testing for continuous prediction. The paper not only provides a unified pipeline for extracting a comprehensive mental state evaluation from a parsimonious set of sensors (only EEG and ECG), but also demonstrates the use of validation techniques in the absence of empirical data. Furthermore, as an example of the application of these models to novel situations, we evaluate the significance of correlations of personalized biometrics to the dynamic fluctuations of accuracy and reaction time on an unrelated threat detection task using a permutation test. Our results provide a path toward integrating biometrics into augmented human-machine interfaces in a judicious way that can help to maximize task performance.
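As a hedged sketch of the validation scheme named above, k-fold cross-validation of a per-subject classifier might look as follows; feature extraction from EEG/ECG is out of scope here, so the feature matrix and labels are random placeholders.

```python
# Hedged sketch: k-fold cross-validation of a per-subject mental-state model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X = np.random.rand(200, 16)        # 200 windows x 16 EEG/ECG features (placeholder)
y = np.random.randint(0, 2, 200)   # e.g. fatigued (1) vs. alert (0) labels

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation
print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```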
Multimodal biometrics for identity documents (MBioID).
Dessimoz, Damien; Richiardi, Jonas; Champod, Christophe; Drygajlo, Andrzej
2007-04-11
The MBioID initiative has been set up to address the following germane question: what biometric technologies could be deployed in identity documents in the foreseeable future, and how? This research effort proposes to look at current and future practices and systems of establishing and using biometric identity documents (IDs) and evaluate their effectiveness in large-scale developments. The first objective of the MBioID project is to present a review document establishing the current state-of-the-art related to the use of multimodal biometrics in an ID application. This research report gives the main definitions, properties and framework of use related to biometrics, an overview of the main standards developed in the biometric industry and standardisation organisations to ensure interoperability, as well as some of the legal framework and issues associated with biometrics, such as privacy and personal data protection. The state-of-the-art in terms of technological development is also summarised for a range of single biometric modalities (2D and 3D face, fingerprint, iris, on-line signature and speech), chosen according to ICAO recommendations and availability, and for various multimodal approaches. This paper gives a summary of the main elements of that report. The second objective of the MBioID project is to propose relevant acquisition and evaluation protocols for a large-scale deployment of biometric IDs. Combined with the protocols, a multimodal database will be acquired in a realistic way, in order to be as close as possible to a real biometric ID deployment. In this paper, the issues and solutions related to the acquisition setup are briefly presented.
Teubner, Diana; Paulus, Martin; Veith, Michael; Klein, Roland
2015-02-01
Piscifaunal health depends upon the state and quality of the aquatic environment. Variations in physical condition of fish may therefore be attributed to changes in environmental quality. Based on time series of up to 20 years of biometric data of bream from multiple sampling sites of the German environmental specimen bank (ESB), this study assessed whether changes in biometric parameters are able to indicate long-term alterations in fish health and environmental quality. Evaluated biometric parameters of fish health comprised length and weight of individuals of a defined age class, the condition factor, lipid content and hepatosomatic index (HSI). Although there are negative trends of the HSI, the overall development of health parameters can be interpreted as positive. This seems to suggest that health parameters conclusively mirror the long-term improvement of water quality in the selected rivers. However, the applicability of the condition factor as well as lipid content as indicators for fish health remained subject to restrictions. Altogether, the results from the ESB confirmed the high value of biometric parameters for monitoring of long-term changes in state and quality of aquatic ecosystems.
Transfer learning for bimodal biometrics recognition
NASA Astrophysics Data System (ADS)
Dan, Zhiping; Sun, Shuifa; Chen, Yanfei; Gan, Haitao
2013-10-01
Biometrics recognition aims to identify and predict new personal identities based on existing knowledge. As the use of multiple biometric traits of an individual enables more information to be used for recognition, it has been shown that multi-biometrics can produce higher accuracy than single biometrics. However, a common problem with traditional machine learning is that the training and test data should be in the same feature space and have the same underlying distribution. If the distributions or features differ between training and future data, model performance often drops. In this paper, we propose a transfer learning method for face recognition on bimodal biometrics. The training and test samples of bimodal biometric images are composed of visible light face images and infrared face images. Our algorithm transfers knowledge across feature spaces, relaxing the assumptions of a common feature space and a common underlying distribution by automatically learning a mapping between two different but somewhat similar sets of face images. Experiments on the face images show that the accuracy of face recognition is greatly improved by the proposed method compared with previous methods, demonstrating its effectiveness and robustness.
Voice Biometrics as a Way to Self-service Password Reset
NASA Astrophysics Data System (ADS)
Hohgräfe, Bernd; Jacobi, Sebastian
Password resets are time consuming. Especially when urgent jobs need to be done, it is cumbersome to inform the user helpdesk, to identify oneself and then to wait for response. It is easy to enter a wrong password multiple times, which leads to the blocking of the application. Voice biometrics is an easy and secure way for individuals to reset their own password. Read more about how you can ease the burden of your user helpdesk and how voice biometric password resets benefit your expense situation without harming your security.
The Effect of Decomposition on the Efficacy of Biometrics for Positive Identification.
Sauerwein, Kelly; Saul, Tiffany B; Steadman, Dawnie Wolfe; Boehnen, Chris B
2017-11-01
Biometrics, unique measurable physiological and behavioral characteristics, are used to identify individuals in a variety of scenarios, including forensic investigations. However, data on the longevity of these indicators are incomplete. This study demonstrated that iris and fingerprint biometric data can be obtained up to four days postmortem in warmer seasons and 50+ days in the winter. It has been generally believed, but never studied, that iris recognition is only obtainable within the first 24 hours after death. However, this study showed that irises remain viable for longer (2-34 days) depending upon the environmental conditions. Temperature, precipitation, insects, and scavenger activity were the primary factors affecting the retention of biometrics in decomposing human remains. While this study is an initial step in determining the utility of physiological biometrics across postmortem time, biometric research has the potential to make important contributions to human identification and the law enforcement, military, and medicolegal communities. © 2017 American Academy of Forensic Sciences.
Privacy-protected biometric templates: acoustic ear identification
NASA Astrophysics Data System (ADS)
Tuyls, Pim T.; Verbitskiy, Evgeny; Ignatenko, Tanya; Schobben, Daniel; Akkermans, Ton H.
2004-08-01
Unique biometric identifiers offer a very convenient way for human identification and authentication. In contrast to passwords, they hence have the advantage that they cannot be forgotten or lost. In order to set up a biometric identification/authentication system, reference data have to be stored in a central database. As biometric identifiers are unique to a human being, the derived templates comprise unique, sensitive and therefore private information about a person. This is why many people are reluctant to accept a system based on biometric identification. Consequently, the stored templates have to be handled with care and protected against misuse [1, 2, 3, 4, 5, 6]. It is clear that techniques from cryptography can be used to achieve privacy. However, biometric data are noisy, and cryptographic functions are by construction very sensitive to small changes in their input; hence one cannot apply those crypto techniques straightforwardly. In this paper we show the feasibility of the techniques developed in [5], [6] by applying them to experimental biometric data. As biometric identifier we have chosen the shape of the inner ear canal, which is obtained by measuring the headphone-to-ear-canal transfer functions (HpTFs), which are known to be person dependent [7].
Regenerating time series from ordinal networks.
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
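A simplified sketch of this pipeline follows, with an illustrative embedding dimension and without the mapping from patterns back to amplitude values:

```python
# Simplified sketch: build an ordinal (permutation-transition) network from a
# series, then regenerate a symbolic series by a random walk on the network.
import numpy as np
from collections import defaultdict

def ordinal_patterns(x, d=3):
    """Sequence of ordinal patterns (argsort permutations) for embedding dim d."""
    return [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]

def transition_counts(patterns):
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(patterns[:-1], patterns[1:]):
        counts[a][b] += 1
    return counts

def random_walk(counts, start, steps, rng):
    node, walk = start, [start]
    for _ in range(steps):
        nbrs = list(counts[node])
        if not nbrs:                       # dead end: no observed transition
            break
        w = np.array([counts[node][n] for n in nbrs], dtype=float)
        node = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
        walk.append(node)
    return walk

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)
patterns = ordinal_patterns(x)
surrogate = random_walk(transition_counts(patterns), patterns[0], 500, rng)
```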
Biometric Communication Research for Television.
ERIC Educational Resources Information Center
Malik, M. F.
Biometric communication research is defined as research dealing with the information impact of a film or television show, photographic picture, painting, exhibition, display, or any literary or functional texts or verbal stimuli on human beings, both as individuals and in groups (mass audiences). Biometric communication research consists of a…
Entropy Measurement for Biometric Verification Systems.
Lim, Meng-Hui; Yuen, Pong C
2016-05-01
Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intrauser variations in the biometric data. This is important to preserve a reasonable acceptance rate of genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases the impostor's difficulty of obtaining a system-acceptable measurement, thus resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accept multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic and a benchmark face dataset justifies the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.
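A toy rendering of the guessing-entropy idea: when the verifier accepts any candidate within Hamming distance t of the enrolled template, a sensible attacker tries candidates in decreasing order of the probability mass covered by each guess's acceptance ball. The sketch below computes the expected guess count on a small synthetic template space; the population distribution, the fixed (non-adaptive) guessing order, and all parameters are illustrative assumptions, not the paper's model.

```python
import numpy as np
from itertools import product

n, t = 8, 1                     # template length, acceptance radius (illustrative)
rng = np.random.default_rng(2)

space = np.array(list(product([0, 1], repeat=n)))      # all 2^n possible templates
p = rng.dirichlet(np.ones(len(space)))                 # toy population distribution

# mass captured by a guess g: total p(x) over targets x within Hamming distance t of g
dist = (space[:, None, :] != space[None, :, :]).sum(-1)
mass = (p[None, :] * (dist <= t)).sum(axis=1)

order = np.argsort(mass)[::-1]  # greedy attacker: richest acceptance ball first
# for each target x, the attacker succeeds at the first guess whose ball contains x
first_hit = np.array([np.argmax(dist[order, j] <= t) + 1 for j in range(len(space))])
guessing_entropy = (p * first_hit).sum()
print(f"expected number of guesses: {guessing_entropy:.2f} (out of {len(space)})")
```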
NASA Astrophysics Data System (ADS)
Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi
2012-10-01
In this Letter, we propose a framework to transform a complex network to a time series. The transformation from complex networks to time series is realized by the classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed to periodic time series, small-world networks to noisy periodic time series, and random networks to random time series. We also show that these relationships are analytically held by using the circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
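The transformation can be imitated with classical multidimensional scaling applied to shortest-path distances on the graph: double-center the squared distance matrix, eigendecompose, and read the leading coordinate of the nodes in order as a "time series". The sketch below does this for a ring lattice, which indeed yields a periodic-looking series; it is a schematic reconstruction under these assumptions, not the authors' code.

```python
import numpy as np
from collections import deque

def ring_lattice(n, k=2):
    """Adjacency of a ring where each node links to its k nearest neighbours per side."""
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(1, k + 1):
            A[i, (i + j) % n] = A[(i + j) % n, i] = 1
    return A

def shortest_paths(A):
    """All-pairs hop distances by BFS (unweighted, connected graph)."""
    n = len(A)
    D = np.full((n, n), np.inf)
    for s in range(n):
        D[s, s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in np.flatnonzero(A[u]):
                if D[s, v] == np.inf:
                    D[s, v] = D[s, u] + 1
                    q.append(v)
    return D

def classical_mds_series(D):
    """Leading classical-MDS coordinate of each node, in node order."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                   # double-centred Gram matrix
    w, V = np.linalg.eigh(B)
    return V[:, -1] * np.sqrt(max(w[-1], 0.0))    # top eigenpair

series = classical_mds_series(shortest_paths(ring_lattice(60)))
print("first 5 values:", np.round(series[:5], 3))  # ring -> sinusoid-like series
```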
Biometrics for electronic health records.
Flores Zuniga, Alejandro Enrique; Win, Khin Than; Susilo, Willy
2010-10-01
Securing electronic health records, in scenarios in which the provision of care services is shared among multiple actors, can become a complex and costly activity. Correct identification of patients and physicians, protection of privacy and confidentiality, assignment of access permissions for healthcare providers and resolution of conflicts arise as the main points of concern in the development of interconnected health information networks. Biometric technologies have been proposed as a possible technological solution for these issues due to their ability to provide a mechanism for unique verification of an individual's identity. This paper presents an analysis of the benefits as well as the disadvantages offered by biometric technology. A comparison between this technology and more traditional identification methods is used to determine the key benefits and flaws of the use of biometrics in health information systems. The comparison has been made considering the viability of the technologies for medical environments, global security needs, the contemplation of a shared care environment and the costs involved in the implementation and maintenance of such technologies. This paper also discusses alternative uses for biometric technologies in health care environments. The outcome of this analysis lies in the fact that even though biometric technologies offer several advantages over traditional methods of identification, they are still in the early stages of providing a suitable solution for a health care environment.
Duality between Time Series and Networks
Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.
2011-01-01
Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
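A concrete instance of a time-series-to-network map with an approximate inverse is a quantile-based transition network: values binned into Q quantiles become nodes, transitions between consecutive samples become weighted edges, and the inverse runs a Markov walk over the transition matrix while drawing a value inside each visited bin. The sketch below assumes this quantile construction and illustrative parameters; it is not necessarily the paper's exact map.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)

Q = 10
edges = np.quantile(x, np.linspace(0, 1, Q + 1))
labels = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, Q - 1)  # node per sample

# time series -> network: row-stochastic transition matrix between quantile nodes
W = np.zeros((Q, Q))
for a, b in zip(labels, labels[1:]):
    W[a, b] += 1
P = W / W.sum(axis=1, keepdims=True)

# network -> time series (approximate inverse): Markov walk, sampling within bins
state, out = labels[0], []
for _ in range(2000):
    out.append(rng.uniform(edges[state], edges[state + 1]))  # a value from that bin
    state = rng.choice(Q, p=P[state])
y = np.array(out)
print("original vs regenerated std:", x.std().round(3), y.std().round(3))
```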
Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions
Chiou, Shin-Yan
2013-01-01
Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the ratio for certification in biometric systems needs not to achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied. PMID:23762851
Biometric template transformation: a security analysis
NASA Astrophysics Data System (ADS)
Nagar, Abhishek; Nandakumar, Karthik; Jain, Anil K.
2010-01-01
One of the critical steps in designing a secure biometric system is protecting the templates of the users that are stored either in a central database or on smart cards. If a biometric template is compromised, it leads to serious security and privacy threats because unlike passwords, it is not possible for a legitimate user to revoke his biometric identifiers and switch to another set of uncompromised identifiers. One methodology for biometric template protection is the template transformation approach, where the template, consisting of the features extracted from the biometric trait, is transformed using parameters derived from a user specific password or key. Only the transformed template is stored and matching is performed directly in the transformed domain. In this paper, we formally investigate the security strength of template transformation techniques and define six metrics that facilitate a holistic security evaluation. Furthermore, we analyze the security of two well-known template transformation techniques, namely, Biohashing and cancelable fingerprint templates based on the proposed metrics. Our analysis indicates that both these schemes are vulnerable to intrusion and linkage attacks because it is relatively easy to obtain either a close approximation of the original template (Biohashing) or a pre-image of the transformed template (cancelable fingerprints). We argue that the security strength of template transformation techniques must also consider the computational complexity of obtaining a complete pre-image of the transformed template in addition to the complexity of recovering the original biometric template.
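For orientation, Biohashing (one of the two schemes analyzed) can be sketched as a user-keyed random projection followed by binarization: a matrix seeded by the user's token is orthonormalized, the feature vector is projected onto it, and the signs give the transformed template. The sketch below is a minimal version under those assumptions; the paper's six security metrics would then be evaluated over such transforms.

```python
import numpy as np

def biohash(features, user_seed, m=32):
    """Project features onto a seeded orthonormal random basis, then binarize."""
    rng = np.random.default_rng(user_seed)    # the seed plays the role of the user token
    R = rng.standard_normal((len(features), m))
    Qb, _ = np.linalg.qr(R)                   # orthonormal columns
    return (features @ Qb > 0).astype(np.uint8)

rng = np.random.default_rng(4)
enrolled = rng.standard_normal(64)                 # extracted feature vector
query = enrolled + 0.3 * rng.standard_normal(64)   # same user, noisy capture
other = rng.standard_normal(64)                    # a different user

h_e, h_q, h_o = (biohash(v, user_seed=1234) for v in (enrolled, query, other))
print("genuine Hamming distance :", int((h_e != h_q).sum()))
print("impostor Hamming distance:", int((h_e != h_o).sum()))
```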
A Systems Approach to Biometrics in the Military Domain.
Wilson, Lauren; Gahan, Michelle; Lennard, Chris; Robertson, James
2018-02-21
Forensic biometrics is the application of forensic science principles to physical and behavioral characteristics. Forensic biometrics is a secondary sub-system in the forensic science "system of systems," which describes forensic science as a sub-system in the larger criminal justice, law enforcement, intelligence, and military system. The purpose of this paper is to discuss biometrics in the military domain and integration into the wider forensic science system of systems. The holistic system thinking methodology was applied to the U.S. biometric system to map it to the system of systems framework. The U.S. biometric system is used as a case study to help guide other countries to develop military biometric systems that are integrated and interoperable at the whole-of-government level. The aim is to provide the system of systems framework for agencies to consider for proactive design of biometric systems. © 2018 American Academy of Forensic Sciences.
Logistic Map for Cancellable Biometrics
NASA Astrophysics Data System (ADS)
Supriya, V. G., Dr; Manjunatha, Ramachandra, Dr
2017-08-01
This paper presents the design and implementation of a secure biometric template protection system that transforms the biometric template using binary chaotic signals and three different key streams to obtain another form of the template. Its efficiency is demonstrated by the results, and its security is investigated through analyses including key space analysis, information entropy and key sensitivity analysis.
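A minimal sketch of the kind of transform outlined here, assuming the simplest possible design: a logistic map iterated from a key-dependent seed produces a chaotic binary keystream that is XOR-ed with the binary template, so issuing a new key cancels the old template. The paper's combination of three key streams is not reproduced.

```python
import numpy as np

def chaotic_keystream(seed, n, r=3.99):
    """Binary stream from the logistic map x <- r*x*(1-x), thresholded at 0.5."""
    x, bits = seed, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        bits[i] = x > 0.5
    return bits

rng = np.random.default_rng(5)
template = rng.integers(0, 2, 256).astype(np.uint8)

protected = template ^ chaotic_keystream(0.3141, template.size)  # stored form
reissued  = template ^ chaotic_keystream(0.2718, template.size)  # after key change

print("recovered ok:", np.array_equal(protected ^ chaotic_keystream(0.3141, 256), template))
print("old vs new protected template Hamming distance:", int((protected != reissued).sum()))
```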
Anatomy of biometric passports.
Malčík, Dominik; Drahanský, Martin
2012-01-01
Travelling is becoming available for more and more people. Millions of people are on the move every day. That is why better control over global human transfer and a more reliable identity check are desired. A recent trend in the field of personal identification documents is to use RFID (Radio Frequency Identification) technology and biometrics, especially (but not only) in passports. This paper provides an insight into the electronic passport (also called e-passport or ePassport) implementation chosen in the Czech Republic. Such a summary is needed for further studies of biometric passport implementation security and biometric passport analysis. A separate description of the Czech solution is a prerequisite for a planned analysis, because of the uniqueness of each implementation. (Each country can choose the implementation details within a range specified by the ICAO (International Civil Aviation Organisation); moreover, specific security mechanisms are optional and can be omitted).
Biometrics can help protect and safeguard.
Oakes, Shaun
2017-06-01
Shaun Oakes, managing director at ievo, a north-east England-based manufacturer of biometric fingerprint readers, argues that growing use of biometrics technology can improve security and afford better protection to premises, valuable items, and people, across an ever-busier NHS.
Handwriting: Feature Correlation Analysis for Biometric Hashes
NASA Astrophysics Data System (ADS)
Vielhauer, Claus; Steinmetz, Ralf
2004-12-01
In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.
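The three components of the methodology can be pictured per feature as in the sketch below, which assumes a toy dataset of repeated feature vectors per writer and straightforward definitions (within-writer standard deviation for intrapersonal scatter, histogram entropy of writer means for interpersonal entropy, and their correlation across features); the paper's exact definitions may differ.

```python
import numpy as np

rng = np.random.default_rng(6)
n_users, n_samples, n_feat = 20, 10, 12
centers = 3.0 * rng.standard_normal((n_users, n_feat))           # per-writer feature means
data = centers[:, None, :] + rng.standard_normal((n_users, n_samples, n_feat))

# intrapersonal scatter: average within-writer standard deviation, per feature
scatter = data.std(axis=1).mean(axis=0)

# interpersonal entropy: entropy of each feature's writer means over a coarse grid
def entropy(values, bins=8):
    p, _ = np.histogram(values, bins=bins)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

user_means = data.mean(axis=1)
inter_entropy = np.array([entropy(user_means[:, f]) for f in range(n_feat)])

# correlation between the two measures: good features have low scatter, high entropy
corr = np.corrcoef(scatter, inter_entropy)[0, 1]
print("per-feature scatter :", np.round(scatter[:4], 2))
print("per-feature entropy :", np.round(inter_entropy[:4], 2))
print("scatter-entropy corr:", round(corr, 3))
```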
Multiple Indicator Stationary Time Series Models.
ERIC Educational Resources Information Center
Sivo, Stephen A.
2001-01-01
Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…
8 CFR 103.16 - Collection, use and storage of biometric information.
Code of Federal Regulations, 2012 CFR
2012-01-01
... REGULATIONS IMMIGRATION BENEFITS; BIOMETRIC REQUIREMENTS; AVAILABILITY OF RECORDS Biometric Requirements § 103.16 Collection, use and storage of biometric information. (a) Use of biometric information. Any... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Collection, use and storage of biometric...
8 CFR 103.16 - Collection, use and storage of biometric information.
Code of Federal Regulations, 2014 CFR
2014-01-01
... REGULATIONS IMMIGRATION BENEFITS; BIOMETRIC REQUIREMENTS; AVAILABILITY OF RECORDS Biometric Requirements § 103.16 Collection, use and storage of biometric information. (a) Use of biometric information. Any... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Collection, use and storage of biometric...
8 CFR 103.16 - Collection, use and storage of biometric information.
Code of Federal Regulations, 2013 CFR
2013-01-01
... REGULATIONS IMMIGRATION BENEFITS; BIOMETRIC REQUIREMENTS; AVAILABILITY OF RECORDS Biometric Requirements § 103.16 Collection, use and storage of biometric information. (a) Use of biometric information. Any... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Collection, use and storage of biometric...
Validating a biometric authentication system: sample size requirements.
Dass, Sarat C; Zhu, Yongfang; Jain, Anil K
2006-12-01
Authentication systems based on biometric features (e.g., fingerprint impressions, iris scans, human face images, etc.) are increasingly gaining widespread use and popularity. Often, vendors and owners of these commercial biometric systems claim impressive performance that is estimated based on some proprietary data. In such situations, there is a need to independently validate the claimed performance levels. System performance is typically evaluated by collecting biometric templates from n different subjects, and for convenience, acquiring multiple instances of the biometric for each of the n subjects. Very little work has been done in 1) constructing confidence regions based on the ROC curve for validating the claimed performance levels and 2) determining the required number of biometric samples needed to establish confidence regions of prespecified width for the ROC curve. To simplify the analyses that address these two problems, several previous studies have assumed that multiple acquisitions of the biometric entity are statistically independent. This assumption is too restrictive and is generally not valid. We have developed a validation technique based on multivariate copula models for correlated biometric acquisitions. Based on the same model, we also determine the minimum number of samples required to achieve confidence bands of desired width for the ROC curve. We illustrate the estimation of the confidence bands as well as the required number of biometric samples using a fingerprint matching system that is applied on samples collected from a small population.
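As a simpler stand-in for the copula machinery that is the paper's actual contribution, subject-level bootstrap resampling also respects the correlation among a subject's repeated acquisitions and yields pointwise confidence bands for the ROC curve. A sketch under that assumption, with synthetic correlated scores:

```python
import numpy as np

rng = np.random.default_rng(15)
n_sub, n_acq = 50, 4

# correlated genuine scores: a subject effect shared by that subject's acquisitions
subj_effect = rng.normal(2.0, 0.7, n_sub)[:, None]
genuine = subj_effect + rng.normal(0.0, 0.5, (n_sub, n_acq))
impostor = rng.normal(0.0, 1.0, (n_sub, n_acq))

def roc(gen, imp, grid):
    thr = grid[:, None]
    return (imp.ravel() >= thr).mean(1), (gen.ravel() >= thr).mean(1)  # FAR, GAR per threshold

grid = np.linspace(-3.0, 5.0, 150)
boot = []
for _ in range(500):                 # resample whole subjects, keeping acquisitions together
    idx = rng.integers(0, n_sub, n_sub)
    boot.append(roc(genuine[idx], impostor[idx], grid)[1])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)

far, gar = roc(genuine, impostor, grid)
j = np.argmin(np.abs(far - 0.10))    # threshold where the point estimate has FAR ~ 10%
print(f"GAR at FAR~10%: {gar[j]:.3f}, 95% band: [{lo[j]:.3f}, {hi[j]:.3f}]")
```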
Fourier domain asymmetric cryptosystem for privacy protected multimodal biometric security
NASA Astrophysics Data System (ADS)
Choudhury, Debesh
2016-04-01
We propose a Fourier domain asymmetric cryptosystem for multimodal biometric security. One modality of biometrics (such as face) is used as the plaintext, which is encrypted by another modality of biometrics (such as fingerprint). A private key is synthesized from the encrypted biometric signature by complex spatial Fourier processing. The encrypted biometric signature is further encrypted by other biometric modalities, and the corresponding private keys are synthesized. The resulting biometric signature is privacy protected since the encryption keys are provided by the human, and hence those are private keys. Moreover, the decryption keys are synthesized using those private encryption keys. The encrypted signatures are decrypted using the synthesized private keys and inverse complex spatial Fourier processing. Computer simulations demonstrate the feasibility of the technique proposed.
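The flavor of such Fourier-domain schemes can be conveyed with a classical phase-encoding sketch: the plaintext modality is multiplied in the frequency domain by a unit-magnitude phase mask derived from the key modality, and decryption applies the conjugate mask. This is a generic illustration under simplifying assumptions, not the paper's full asymmetric multi-key cascade:

```python
import numpy as np

rng = np.random.default_rng(7)
face = rng.random((64, 64))               # stand-in for the plaintext biometric image
finger = rng.random((64, 64))             # stand-in for the key biometric image

phase_key = np.exp(2j * np.pi * finger)   # phase mask synthesized from the key modality

cipher = np.fft.ifft2(np.fft.fft2(face) * phase_key)                    # encrypt
recovered = np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phase_key)).real  # decrypt

print("max reconstruction error:", float(np.abs(recovered - face).max()))
```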
Novel continuous authentication using biometrics
NASA Astrophysics Data System (ADS)
Dubey, Prakash; Patidar, Rinku; Mishra, Vikas; Norman, Jasmine; Mangayarkarasi, R.
2017-11-01
We explore whether a classifier can consistently verify clients interacting with the computer using the camera and the users' behavior. In this paper we propose a new way of authenticating a user, which captures many images of the user at random times and analyses the user's touch-biometric behavior. In this system, the touch behavior of a client/user recorded during an enrollment stage is stored in the database, and the mean behavior over equal partitions of time is checked against it. This touch behavior allows the system to accept or reject the user, making the use of biometrics more accurate. The planned workflow is as follows: the user is asked once to allow a picture to be taken before login; the system then takes images of the user automatically, without prompting, and stores them in the database; these images are compared with the existing image of the user, and acceptance or rejection depends on this comparison. The touch behavior continues to be recorded, together with the number of touches made in equal intervals of time, and the touch behavior and images together finally authenticate the user automatically.
Transfer Function Control for Biometric Monitoring System
NASA Technical Reports Server (NTRS)
Chmiel, Alan J. (Inventor); Grodinsky, Carlos M. (Inventor); Humphreys, Bradley T. (Inventor)
2015-01-01
A modular apparatus for acquiring biometric data may include circuitry operative to receive an input signal indicative of a biometric condition, the circuitry being configured to process the input signal according to a transfer function thereof and to provide a corresponding processed input signal. A controller is configured to provide at least one control signal to the circuitry to programmatically modify the transfer function of the modular system to facilitate acquisition of the biometric data.
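In software terms, a controller that programmatically modifies the acquisition chain's transfer function can be pictured as a digital filter whose coefficient is rewritten by a control signal mid-stream. A minimal sketch with a first-order low-pass filter (the coefficient values and switch point are illustrative assumptions, not the patent's circuitry):

```python
import numpy as np

def one_pole_lowpass(x, alpha):
    """y[n] = a[n]*x[n] + (1 - a[n])*y[n-1]; the coefficient sets the transfer function."""
    y, acc = np.empty_like(x), 0.0
    for i, v in enumerate(x):
        a = alpha[i]
        acc = a * v + (1.0 - a) * acc
        y[i] = acc
    return y

rng = np.random.default_rng(8)
t = np.linspace(0, 2, 2000)
biosignal = np.sin(2 * np.pi * 1.2 * t) + 0.5 * rng.standard_normal(t.size)  # noisy input

# the "controller" issues a control signal: smooth heavily first, then track faster
alpha = np.where(np.arange(t.size) < 1000, 0.02, 0.2)
processed = one_pole_lowpass(biosignal, alpha)
print("signal std before/after filtering:", biosignal.std().round(3), processed.std().round(3))
```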
NASA Astrophysics Data System (ADS)
El-Saba, Aed; Alsharif, Salim; Jagapathi, Rajendarreddy
2011-04-01
Fingerprint recognition is one of the first techniques used for automatically identifying people, and today it is still one of the most popular and effective biometric techniques. With this increase in fingerprint biometric use, issues related to accuracy, security and processing time are major challenges facing fingerprint recognition systems. Previous work has shown that polarization enhancement/encoding of fingerprint patterns increases the accuracy and security of fingerprint systems without burdening the processing time. This is mainly because polarization enhancement/encoding is inherently a hardware process and does not have a detrimental time-delay effect on the overall process. Unpolarized images, however, possess a high visual contrast and, when fused (without digital enhancement) properly with polarized ones, are shown to increase the recognition accuracy and security of the biometric system without any significant processing time delay.
Hand Grasping Synergies As Biometrics.
Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana
2017-01-01
Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. This includes iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement that integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies: postural synergies. In this proof of concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security.
Alignment and bit extraction for secure fingerprint biometrics
NASA Astrophysics Data System (ADS)
Nagar, A.; Rane, S.; Vetro, A.
2010-01-01
Security of biometric templates stored in a system is important because a stolen template can compromise system security as well as user privacy. Therefore, a number of secure biometrics schemes have been proposed that facilitate matching of feature templates without the need for a stored biometric sample. However, most of these schemes suffer from poor matching performance owing to the difficulty of designing biometric features that remain robust over repeated biometric measurements. This paper describes a scheme to extract binary features from fingerprints using minutia points and fingerprint ridges. The features are amenable to direct matching based on binary Hamming distance, but are especially suitable for use in secure biometric cryptosystems that use standard error correcting codes. Given all binary features, a method for retaining only the most discriminable features is presented which improves the Genuine Accept Rate (GAR) from 82% to 90% at a False Accept Rate (FAR) of 0.1% on a well-known public database. Additionally, incorporating singular points such as a core or delta feature is shown to improve the matching tradeoff.
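The matching stage described here reduces to Hamming-distance comparison of binary templates, with GAR and FAR read off at a chosen threshold. In the skeletal sketch below, the minutiae- and ridge-based feature extraction is replaced by synthetic bit vectors, so the numbers are illustrative rather than the paper's:

```python
import numpy as np

rng = np.random.default_rng(9)
n_users, n_bits = 200, 128
enrolled = rng.integers(0, 2, (n_users, n_bits)).astype(np.uint8)

# genuine queries: enrolled template with ~10% of the bits flipped by measurement noise
noise = rng.random((n_users, n_bits)) < 0.10
genuine = enrolled ^ noise.astype(np.uint8)
impostor = rng.integers(0, 2, (n_users, n_bits)).astype(np.uint8)

d_gen = (enrolled != genuine).sum(axis=1)
d_imp = (enrolled != impostor).sum(axis=1)

threshold = 25                                # accept if Hamming distance <= threshold
gar = float((d_gen <= threshold).mean())      # genuine accept rate
far = float((d_imp <= threshold).mean())      # false accept rate
print(f"GAR = {gar:.2%}, FAR = {far:.2%} at threshold {threshold}/{n_bits}")
```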
Agudelo, Juliana; Privman, Vladimir; Halámek, Jan
2017-07-05
We consider a new concept of biometric-based cybersecurity systems for active authentication by continuous tracking, which utilizes biochemical processing of metabolites present in skin secretions. Skin secretions contain a large number of metabolites and small molecules that can be targeted for analysis. Here we argue that amino acids found in sweat can be exploited for the establishment of an amino acid profile capable of identifying an individual user of a mobile or wearable device. Individual and combinations of amino acids processed by biocatalytic cascades yield physical (optical or electronic) signals, providing a time-series of several outputs that, in their entirety, should suffice to authenticate a specific user based on standard statistical criteria. Initial results, motivated by biometrics, indicate that single amino acid levels can provide analog signals that vary according to the individual donor, albeit with limited resolution versus noise. However, some such assays offer digital separation (into well-defined ranges of values) according to groups such as age, biological sex, race, and physiological state of the individual. Multi-input biocatalytic cascades that handle several amino acid signals to yield a single digital-type output, as well as continuous-tracking time-series data rather than a single-instance sample, should enable active authentication at the level of an individual. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Bordes, Roy N.
1998-12-01
The purpose of this presentation is to enlighten the reader on the advancements that have been made in the field of biometrics technology as it relates to government and industrial-type applications. The term 'biometrics' is defined as, 'Any technology that uses electronically scanned graphical information for identification purposes.' Biometric technology was for a long time in the experimental stages, with many BETA test projects that were really not applicable to industrial markets. During the course of this presentation, we will show that biometrics applications do work, can develop positive returns on investment, but from a security standpoint have some major application problems that still need to be overcome. We will also address which biometric technologies have a better future in the security world than others.
Extracting forensic evidence from biometric devices
NASA Astrophysics Data System (ADS)
Geradts, Zeno J.; Ruifrok, Arnout C.
2003-08-01
Over the past few years, both large multinationals and governments have begun to contribute to even larger projects on biometric devices. Terrorist attacks in America and in other countries have highlighted the need for better identification systems for people as well as improved systems for controlling access to buildings. Another reason for investment in Research and Development in Biometric Devices is the massive growth in internet-based systems -- whether for e-commerce, e-government or internal processes within organizations. The interface between the system and the user is routinely abused, as people have to remember many complex passwords and handle tokens of various types. In this paper an overview is given of the information that is important to know before an examination of such systems can be done in a forensically sound way. When evaluating forensic evidence from biometric devices, the forensic examiner should consider the possibility of tampering with the biometric systems or the possibility of unauthorized access before drawing conclusions.
Biometric recognition via fixation density maps
NASA Astrophysics Data System (ADS)
Rigas, Ioannis; Komogortsev, Oleg V.
2014-05-01
This work introduces and evaluates a novel eye movement-driven biometric approach that employs eye fixation density maps for person identification. The proposed feature offers a dynamic representation of the biometric identity, storing rich information regarding the behavioral and physical eye movement characteristics of the individuals. The innate ability of fixation density maps to capture the spatial layout of the eye movements, in conjunction with their probabilistic nature, makes them a particularly suitable option as an eye movement biometric trait in cases when free-viewing stimuli are presented. In order to demonstrate the effectiveness of the proposed approach, the method is evaluated on three different datasets containing a wide gamut of stimulus types, such as static images, video and text segments. The obtained results indicate a minimum EER (Equal Error Rate) of 18.3%, revealing the perspectives on the utilization of fixation density maps as an enhancing biometric cue during identification scenarios in dynamic visual environments.
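The representation itself is straightforward to compute: bin fixation coordinates on a spatial grid, smooth, and normalize so maps can be compared probabilistically across recordings. The sketch below, using a separable Gaussian blur and histogram intersection as the similarity, is a generic reconstruction of such a feature, not the paper's pipeline:

```python
import numpy as np

def density_map(fix_xy, grid=(32, 32), sigma=1.5):
    """Probabilistic fixation density map over a unit-square viewing area."""
    H, _, _ = np.histogram2d(fix_xy[:, 0], fix_xy[:, 1], bins=grid, range=[[0, 1], [0, 1]])
    k = np.exp(-0.5 * (np.arange(-4, 5) / sigma) ** 2)      # separable Gaussian kernel
    k /= k.sum()
    H = np.apply_along_axis(lambda r: np.convolve(r, k, "same"), 0, H)
    H = np.apply_along_axis(lambda r: np.convolve(r, k, "same"), 1, H)
    return H / H.sum()

def similarity(P, Q):
    return np.minimum(P, Q).sum()          # histogram intersection, in [0, 1]

rng = np.random.default_rng(10)
user_a  = np.clip(rng.normal([0.3, 0.4], 0.08, (300, 2)), 0, 1)  # one viewer's fixations
user_a2 = np.clip(rng.normal([0.3, 0.4], 0.08, (300, 2)), 0, 1)  # same viewer, new session
user_b  = np.clip(rng.normal([0.7, 0.6], 0.08, (300, 2)), 0, 1)  # a different viewer

Pa, Pa2, Pb = map(density_map, (user_a, user_a2, user_b))
print("same-user similarity :", round(similarity(Pa, Pa2), 3))
print("cross-user similarity:", round(similarity(Pa, Pb), 3))
```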
NASA Astrophysics Data System (ADS)
Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay
2015-12-01
In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates, thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
2010 Biometrics Conference Held in Arlington, Virginia on January 20-21, 2010
2010-01-21
Fragmentary proceedings material; the recoverable content concerns plans to test mobile biometric collection devices during future biometric field exercises (providing NOVARIS officials connectivity), collection under extreme outdoor mobile conditions with non-cooperative users and field-collected samples of mixed quality, real-time access to match results, and capture technologies including mobile 10-print slap capture, robust face/iris capture, and contactless fingerprints (Human Factors Behavioral Sciences Division).
A method for profiling biometric changes during disaccommodation.
Alderson, Alison; Davies, Leon N; Mallen, Edward A H; Sheppard, Amy L
2012-05-01
To demonstrate the application of low-coherence reflectometry to the study of biometric changes during disaccommodation responses in human eyes after cessation of a near task and to evaluate the effect of contact lenses on low-coherence reflectometry biometric measurements. Ocular biometric parameters of crystalline lens thickness (LT) and anterior chamber depth (ACD) were measured with the LenStar device during and immediately after a 5 D accommodative task in 10 participants. In a separate trial, accommodation responses were recorded with a Shin-Nippon WAM-5500 optometer in a subset of two participants. Biometric data were interleaved to form a profile of post-task anterior segment changes. In a further experiment, the effect of soft contact lenses on LenStar measurements was evaluated in 15 participants. In 10 adult participants, increased LT and reduced ACD were seen during the 5 D task. Post-task, during fixation of a 0 D target, a profile of the change in LT and ACD against time was observed. In the two participants with accommodation data (one a sufferer of nearwork-induced transient myopia and the other a non-sufferer), the post-task changes in refraction compared favorably with the interleaved LenStar biometry data. The insertion of soft contact lenses did not have a significant effect on LenStar measures of ACD or LT (mean change: -0.007 mm, p = 0.265 and +0.001 mm, p = 0.875, respectively). With the addition of a relatively simple stimulus modification, the LenStar instrument can be used to produce a profile of post-task changes in LT and ACD. The spatial and temporal resolution of the system is sufficient for the investigation of nearwork-induced transient myopia from a biometric viewpoint. LenStar measurements of ACD and LT remain valid after the fitting of soft contact lenses.
Biometrics IRB best practices and data protection
NASA Astrophysics Data System (ADS)
Boehnen, Christopher; Bolme, David; Flynn, Patrick
2015-05-01
The collection of data from human subjects for biometrics research in the United States requires the development of a data collection protocol that is reviewed by a Human Subjects Institutional Review Board (IRB). The IRB reviews the protocol for risks and approves it if it meets the criteria for approval specified in the relevant Federal regulations (45 CFR 46). Many other countries operate similar mechanisms for the protection of human subjects. IRBs review protocols for safety, confidentiality, and for minimization of risk associated with identity disclosure. Since biometric measurements are potentially identifying, IRB scrutiny of biometrics data collection protocols can be expected to be thorough. This paper discusses the intricacies of IRB best practices within the worldwide biometrics community. This is important because research decisions involving human subjects are made at a local level and do not set a precedent for decisions made by another IRB board. In many cases, what one board approves is not approved by another board, resulting in significant inconsistencies that prove detrimental to both researchers and human subjects. Furthermore, the level of biometrics expertise may be low on IRBs, which can contribute to the unevenness of reviews. This publication will suggest possible best practices for designing and seeking IRB approval for human subjects research involving biometrics measurements. The views expressed are the opinions of the authors.
Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew
2014-01-01
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
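The plotting idea can be approximated with an ordinary polar chart: map the within-cycle position (e.g., month) to angle and the measurement to radius, so successive years overlay as concentric wrap-around loops. A minimal matplotlib sketch on synthetic monthly data (the RRose/WATS software itself is not reproduced):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(11)
years, months = 10, 12
rates = (100 + 10 * np.sin(2 * np.pi * np.arange(years * months) / 12)
         + rng.standard_normal(years * months))      # synthetic seasonal monthly series

theta = 2 * np.pi * (np.arange(years * months) % months) / months
ax = plt.subplot(projection="polar")
for y in range(years):                               # one wrap-around loop per year
    sl = slice(y * months, (y + 1) * months)
    ax.plot(np.append(theta[sl], theta[sl][0]),      # close each yearly loop
            np.append(rates[sl], rates[sl][0]), alpha=0.6)
ax.set_xticks(theta[:months])
ax.set_xticklabels(["J", "F", "M", "A", "M", "J", "J", "A", "S", "O", "N", "D"])
plt.title("Wrap-around view of a monthly series")
plt.show()
```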
Biometrics Enabling Capability Increment 1 (BEC Inc 1)
2016-03-01
2016 Major Automated Information System Annual Report: Biometrics Enabling Capability Increment 1 (BEC Inc 1), Defense Acquisition Management. Date assigned: July 15, 2015. Program name: Biometrics Enabling Capability Increment 1 (BEC Inc 1). No Original Estimate has been established for the program; the program description of the Biometrics Enabling Capability is truncated in the source.
On Hunting Animals of the Biometric Menagerie for Online Signature
Houmani, Nesma; Garcia-Salicetti, Sonia
2016-01-01
Individuals behave differently with regard to biometric authentication systems. This fact was formalized in the literature by the concept of the Biometric Menagerie, which defines and labels user groups with animal names in order to reflect their characteristics with respect to biometric systems. This concept was illustrated for face, fingerprint, iris, and speech modalities. The present study extends the Biometric Menagerie to online signatures, by proposing a novel methodology that ties specific quality measures for signatures to categories of the Biometric Menagerie. Such measures are combined for automatically retrieving writer categories of the extended version of the Biometric Menagerie. Performance analysis with different types of classifiers shows the pertinence of our approach on the well-known MCYT-100 database. PMID:27054836
Carbon Nanotube Embedded Nanostructure for Biometrics.
Park, Juhyuk; Youn, Jae Ryoun; Song, Young Seok
2017-12-27
Low electrical energy loss is very important for minimizing the decay of transferred energy intensity due to impedance mismatch. This issue has been dealt with by adding an impedance-matching layer at the interface between two media. A strategy was proposed to improve the charge transfer from the human body to a biometric device by using an impedance-matching nanostructure. Nanocomposite pattern arrays were fabricated with shape memory polymer and carbon nanotubes. The shape recovery ability of the nanopatterns enhanced the durability and sustainability of the structure. It was found that the composite nanopatterns improved the current transfer twofold compared with the nonpatterned composite sample. The underlying mechanism of the enhanced charge transport was understood by carrying out a numerical simulation. We anticipate that this study can provide a new pathway for developing advanced biometric devices with high sensitivity to biological information.
Visibility Graph Based Time Series Analysis
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
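The underlying natural visibility criterion is simple: samples (i, x_i) and (j, x_j) are linked whenever every intermediate sample lies strictly below the straight line joining them. A direct O(n²) sketch of that construction on a synthetic series (the paper's segment-wise network-of-networks layering is not reproduced):

```python
import numpy as np

def visibility_graph(x):
    """Adjacency matrix of the natural visibility graph of a scalar series."""
    n = len(x)
    A = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            # visibility: every intermediate point is under the chord from i to j
            visible = np.all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i))
            A[i, j] = A[j, i] = visible
    return A

rng = np.random.default_rng(12)
series = rng.standard_normal(200).cumsum()   # a random-walk stand-in for market data
A = visibility_graph(series)
degrees = A.sum(axis=1)
print("mean degree:", degrees.mean().round(2), "max degree:", int(degrees.max()))
```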
Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay
2013-12-01
Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations. © 2013, The International Biometric Society.
75 FR 39323 - Amendment to the Biometric Visa Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-08
... DEPARTMENT OF STATE [Public Notice: 7047] Amendment to the Biometric Visa Program AGENCY: Department of State. ACTION: Notice of Amendment to the Biometric Visa Program. This public notice announces an amendment to the Biometric Visa Program. Section 303 of the Enhanced Border Security and Visa...
Unobtrusive Multimodal Biometric Authentication: The HUMABIO Project Concept
NASA Astrophysics Data System (ADS)
Damousis, Ioannis G.; Tzovaras, Dimitrios; Bekiaris, Evangelos
2008-12-01
Human Monitoring and Authentication using Biodynamic Indicators and Behavioural Analysis (HUMABIO) (2007) is an EU Specific Targeted Research Project (STREP) where new types of biometrics are combined with state-of-the-art sensorial technologies in order to enhance security in a wide spectrum of applications. The project aims to develop a modular, robust, multimodal biometrics security authentication and monitoring system which utilizes a biodynamic physiological profile, unique for each individual, and advancements of the state of the art in behavioural and other biometrics, such as face, speech, gait recognition, and seat-based anthropometrics. Several shortcomings in biometric authentication will be addressed in the course of HUMABIO, which will provide the basis for improving existing sensors, developing new algorithms, and designing applications, towards creating new, unobtrusive biometric authentication procedures in security sensitive, controlled environments. This paper presents the concept of this project, describes its unobtrusive authentication demonstrator, and reports some preliminary results.
Biometric recognition via texture features of eye movement trajectories in a visual searching task.
Li, Chunyong; Xue, Jiguo; Quan, Cheng; Yue, Jingwei; Zhang, Chenggang
2018-01-01
Biometric recognition technology based on eye-movement dynamics has been in development for more than ten years. Different visual tasks, feature extraction and feature recognition methods are proposed to improve the performance of eye movement biometric system. However, the correct identification and verification rates, especially in long-term experiments, as well as the effects of visual tasks and eye trackers' temporal and spatial resolution are still the foremost considerations in eye movement biometrics. With a focus on these issues, we proposed a new visual searching task for eye movement data collection and a new class of eye movement features for biometric recognition. In order to demonstrate the improvement of this visual searching task being used in eye movement biometrics, three other eye movement feature extraction methods were also tested on our eye movement datasets. Compared with the original results, all three methods yielded better results as expected. In addition, the biometric performance of these four feature extraction methods was also compared using the equal error rate (EER) and Rank-1 identification rate (Rank-1 IR), and the texture features introduced in this paper were ultimately shown to offer some advantages with regard to long-term stability and robustness over time and spatial precision. Finally, the results of different combinations of these methods with a score-level fusion method indicated that multi-biometric methods perform better in most cases.
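The two headline metrics are operationally simple: sweeping a threshold over genuine and impostor similarity scores, the EER is where the false acceptance and false rejection rates cross, while Rank-1 IR is the fraction of probes whose best-scoring gallery entry is the correct identity. A small sketch of the EER computation on synthetic score distributions (the score values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(13)
genuine = rng.normal(0.70, 0.10, 1000)    # similarity scores, same-user comparisons
impostor = rng.normal(0.45, 0.10, 5000)   # similarity scores, cross-user comparisons

thresholds = np.linspace(0, 1, 1001)
far = np.array([(impostor >= t).mean() for t in thresholds])  # false accept rate
frr = np.array([(genuine < t).mean() for t in thresholds])    # false reject rate

i = np.argmin(np.abs(far - frr))          # operating point where the two rates cross
print(f"EER ~ {(far[i] + frr[i]) / 2:.2%} at threshold {thresholds[i]:.3f}")
```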
Biometric Authentication Using the PPG: A Long-Term Feasibility Study.
Sancho, Jorge; Alesanco, Álvaro; García, José
2018-05-11
The photoplethysmogram (PPG) is a biomedical signal that can be used to estimate volumetric blood flow changes in the peripheral circulation. During the past few years, several works have been published in order to assess the potential for PPGs to be used in biometric authentication systems, but results are inconclusive. In this paper we perform an analysis of the feasibility of using the PPG as a realistic biometric alternative in the long term. Several feature extractors (based on the time domain and the Karhunen–Loève transform) and matching metrics (Manhattan and Euclidean distances) have been tested using four different PPG databases (PRRB, MIMIC-II, Berry, and Nonin). We show that the false match rate (FMR) and false non-match rate (FNMR) values remain constant in different time instances for a selected threshold, which is essential for using the PPG for biometric authentication purposes. On the other hand, obtained equal error rate (EER) values for signals recorded during the same session range from 1.0% for high-quality signals recorded in controlled conditions to 8% for those recorded in conditions closer to real-world scenarios. Moreover, in certain scenarios, EER values rise up to 23.2% for signals recorded over different days, signaling that performance degradation could take place with time.
Crop biometric maps: the key to prediction.
Rovira-Más, Francisco; Sáiz-Rubio, Verónica
2013-09-23
The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular "identity." This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed.
Fu, Patricia Lin; Bradley, Kent L; Viswanathan, Sheila; Chan, June M; Stampfer, Meir
2016-07-01
To evaluate changes in employees' biometrics over time relative to outcome-based incentive thresholds. Retrospective cohort analysis of biometric screening participants (n = 26 388) at a large employer, primarily in the Western United States, with an office, retail, and distribution workforce. The intervention was a voluntary outcome-based biometric screening program, incentivized with health insurance premium discounts; the measures were body mass index (BMI), cholesterol, blood glucose, blood pressure, and nicotine. Participants were followed from their first year of participation, evaluating changes in these measures. On average, participants who did not meet the incentive threshold at baseline decreased their BMI (1%), glucose (8%), blood pressure (systolic 9%, diastolic 8%), and total cholesterol (8%) by year 2, with improvements generally sustained or continued during each additional year of participation. Overall, individuals at high health risk who participated in a financially incentivized biometric assessment program improved their health indices over time. Further research is needed to understand the key determinants that drive the health improvement indicated here.
Quantum Biometrics with Retinal Photon Counting
NASA Astrophysics Data System (ADS)
Loulakis, M.; Blatsios, G.; Vrettou, C. S.; Kominis, I. K.
2017-10-01
It is known that the eye's scotopic photodetectors, rhodopsin molecules, and their associated phototransduction mechanism leading to light perception, are efficient single-photon counters. We here use the photon-counting principles of human rod vision to propose a secure quantum biometric identification based on the quantum-statistical properties of retinal photon detection. The photon path along the human eye until its detection by rod cells is modeled as a filter having a specific transmission coefficient. Precisely determining its value from the photodetection statistics registered by the conscious observer is a quantum parameter estimation problem that leads to a quantum secure identification method. The probabilities for false-positive and false-negative identification of this biometric technique can readily approach 10^-10 and 10^-4, respectively. The security of the biometric method can be further quantified by the physics of quantum measurements. An impostor must be able to perform quantum thermometry and quantum magnetometry with energy resolution better than 10^-9 ħ, in order to foil the device by noninvasively monitoring the biometric activity of a user.
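As a rough illustration of the statistics involved (my own toy model, not the authors' protocol), suppose each dim flash delivers n photons, the eye-specific filter transmits each with probability alpha, and the observer reports "seen" when at least K photons are transduced; identification then amounts to testing the claimed alpha against the record of responses.

import numpy as np
from scipy.stats import binom

def p_seen(n, alpha, K=6):
    # Probability that at least K of n incident photons are transduced,
    # given per-photon transmission alpha (K=6 is an assumed threshold).
    return 1.0 - binom.cdf(K - 1, n, alpha)

def log_likelihood(responses, n, alpha, K=6):
    p = p_seen(n, alpha, K)
    r = np.asarray(responses, dtype=float)
    return np.sum(r * np.log(p) + (1 - r) * np.log(1 - p))

# Accept the identity claim if the claimed alpha explains the
# "seen"/"not seen" record far better than a generic impostor alpha:
rng = np.random.default_rng(1)
responses = rng.random(200) < p_seen(60, 0.20)
print(log_likelihood(responses, 60, 0.20) - log_likelihood(responses, 60, 0.10))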
Volatility of linear and nonlinear time series
NASA Astrophysics Data System (ADS)
Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo
2005-07-01
Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, u_i, can be detected and quantified by studying the correlations in the magnitude series |u_i|, the "volatility." However, the origin for this empirical observation still remains unclear and the exact relation between the correlations in u_i and the correlations in |u_i| is still unknown. Here we develop analytical relations between the scaling exponent of linear series u_i and its magnitude series |u_i|. Moreover, we find that nonlinear time series exhibit stronger (or the same) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (that represents the magnitude series) with an uncorrelated time series [that represents the sign series sgn(u_i)]. We apply our techniques on daily deep ocean temperature records from the equatorial Pacific, the region of the El Niño phenomenon, and find: (i) long-range correlations from several days to several years with 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) broad multifractal spectrum.
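A minimal numerical sketch of that multiplicative construction (all parameters are illustrative assumptions): a long-range correlated Gaussian series, generated by Fourier-filtering white noise, supplies the magnitudes, and an independent random ±1 series supplies the signs.

import numpy as np

def fourier_filtered_noise(n, beta, rng):
    # Gaussian noise with power spectrum ~ 1/f^beta, built by shaping
    # random phases with a power-law amplitude in Fourier space.
    f = np.fft.rfftfreq(n)
    amp = np.where(f > 0, f ** (-beta / 2.0), 0.0)
    phases = rng.uniform(0, 2 * np.pi, len(f))
    x = np.fft.irfft(amp * np.exp(1j * phases), n)
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(0)
magnitude = np.abs(fourier_filtered_noise(2 ** 14, 0.8, rng))
sign = rng.choice([-1.0, 1.0], size=magnitude.size)  # uncorrelated sign series
nonlinear_series = magnitude * sign  # correlated |u_i|, uncorrelated sgn(u_i)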
A survey of keystroke dynamics biometrics.
Teh, Pin Shen; Teoh, Andrew Beng Jin; Yue, Shigang
2013-01-01
Research on keystroke dynamics biometrics has been increasing, especially in the last decade. The main motivation behind this effort is due to the fact that keystroke dynamics biometrics is economical and can be easily integrated into the existing computer security systems with minimal alteration and user intervention. Numerous studies have been conducted in terms of data acquisition devices, feature representations, classification methods, experimental protocols, and evaluations. However, an up-to-date extensive survey and evaluation is not yet available. The objective of this paper is to provide an insightful survey and comparison on keystroke dynamics biometrics research performed throughout the last three decades, as well as offering suggestions and possible future research directions.
Network structure of multivariate time series.
Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito
2015-10-21
Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows information on a high-dimensional dynamical system to be extracted through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
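One common way to realize such a mapping in this literature is the horizontal visibility graph, built per channel so that each channel becomes one layer of a multiplex network over the same time-index nodes; the sketch below is a generic illustration of that idea, not necessarily the exact construction used in the paper.

import numpy as np

def horizontal_visibility_edges(x):
    # Nodes i < j are linked when every sample strictly between them lies
    # below both endpoints: x[k] < min(x[i], x[j]) for all i < k < j.
    edges = []
    n = len(x)
    for i in range(n - 1):
        edges.append((i, i + 1))        # neighbours always see each other
        blocker = x[i + 1]
        for j in range(i + 2, n):
            if blocker < min(x[i], x[j]):
                edges.append((i, j))
            blocker = max(blocker, x[j])
            if blocker >= x[i]:         # nothing beyond j can see node i
                break
    return edges

# One visibility layer per channel yields the multiplex network:
series = np.random.default_rng(0).normal(size=(3, 200))   # 3 channels
layers = [horizontal_visibility_edges(ch) for ch in series]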
Societal and ethical implications of anti-spoofing technologies in biometrics.
Rebera, Andrew P; Bonfanti, Matteo E; Venier, Silvia
2014-03-01
Biometric identification is thought to be less vulnerable to fraud and forgery than are traditional forms of identification. However, biometric identification is not without vulnerabilities. In a 'spoofing attack', an artificial replica of an individual's biometric trait is used to induce a system to falsely infer that individual's presence. Techniques such as liveness detection and multi-modality, as well as the development of new and emerging modalities, are intended to secure biometric identification systems against such threats. Unlike biometrics in general, the societal and ethical issues raised by spoofing and anti-spoofing techniques have not received much attention. This paper examines these issues.
A robust probabilistic collaborative representation based classification for multimodal biometrics
NASA Astrophysics Data System (ADS)
Zhang, Jing; Liu, Huanxi; Ding, Derui; Xiao, Jianli
2018-04-01
Most traditional biometric recognition systems perform recognition with a single biometric indicator. These systems suffer from noisy data, interclass variations, unacceptable error rates, forged identities, and so on. Because of these inherent problems, attempts to enhance the performance of unimodal biometric systems based on a single feature have limited validity; multimodal biometrics is therefore investigated to reduce some of these defects. This paper proposes a new multimodal biometric recognition approach that fuses faces and fingerprints. For more recognizable features, the proposed method extracts block local binary pattern features for all modalities, and then combines them into a single framework. For better classification, it employs the robust probabilistic collaborative representation based classifier to recognize individuals. Experimental results indicate that the proposed method improves the recognition accuracy compared to unimodal biometrics.
Multimodal biometric approach for cancelable face template generation
NASA Astrophysics Data System (ADS)
Paul, Padma Polash; Gavrilova, Marina
2012-06-01
Due to the rapid growth of biometric technology, template protection becomes crucial to secure the integrity of the biometric security system and prevent unauthorized access. Cancelable biometrics is emerging as one of the best solutions to secure biometric identification and verification systems. We present a novel technique for a robust cancelable template generation algorithm that takes advantage of multimodal biometrics using feature-level fusion. Feature-level fusion of different facial features is applied to generate the cancelable template. A proposed algorithm based on multi-fold random projection and a fuzzy commitment scheme is used for this purpose. In cancelable template generation, one of the main difficulties is preserving the interclass variance of the features. We have found that interclass variations of the features that are lost during multi-fold random projection can be recovered using fusion of different feature subsets and projection into a new feature domain. By applying the multimodal technique at the feature level, we enhance the interclass variability and hence improve the performance of the system. We have tested the system with classifier fusion for different feature subsets and different cancelable template fusions. Experiments have shown that the cancelable template improves the performance of the biometric system compared with the original template.
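A minimal sketch of the random-projection ingredient (my own illustration, with assumed dimensions, not the authors' algorithm): a user- or application-specific seed draws the projection matrix, so a compromised template can be revoked simply by re-issuing with a new seed, and the raw biometric is never stored.

import numpy as np

def cancelable_template(features, seed, out_dim=64):
    # Seeded Gaussian random projection: same seed + same biometric
    # -> same template; a new seed -> a fresh, unlinkable template.
    rng = np.random.default_rng(seed)
    P = rng.normal(0.0, 1.0 / np.sqrt(out_dim), size=(out_dim, features.shape[0]))
    return P @ features

face_features = np.random.default_rng(7).normal(size=256)  # stand-in features
t1 = cancelable_template(face_features, seed=1234)
t2 = cancelable_template(face_features, seed=5678)  # revoked and re-issued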
Corneal topography measurements for biometric applications
NASA Astrophysics Data System (ADS)
Lewis, Nathan D.
The term biometrics is used to describe the process of analyzing biological and behavioral traits that are unique to an individual in order to confirm or determine his or her identity. Many biometric modalities are currently being researched and implemented, including fingerprints, hand and facial geometry, iris recognition, vein structure recognition, gait, voice recognition, etc. This project explores the possibility of using corneal topography measurements as a trait for biometric identification. Two new corneal topographers were developed for this study. The first was designed to function as an operator-free device that allows a user to approach the device and have his or her corneal topography measured. Human subject topography data were collected with this device and compared to measurements made with the commercially available Keratron Piccolo topographer (Optikon, Rome, Italy). A third topographer that departs from the standard Placido disk technology allows for arbitrary pattern illumination through the use of LCD monitors. This topographer was built and tested to be used in future research studies. Topography data were collected from 59 subjects and modeled using Zernike polynomials, which provide a simple method of compressing topography data and comparing one topographical measurement with a database for biometric identification. The data were analyzed to determine the biometric error rates associated with corneal topography measurements. Reasonably accurate results, with simultaneous false match and false non-match rates between three and eight percent, were achieved.
Analyzing handwriting biometrics in metadata context
NASA Astrophysics Data System (ADS)
Scheidat, Tobias; Wolf, Franziska; Vielhauer, Claus
2006-02-01
In this article, methods for user recognition by online handwriting are experimentally analyzed using a combination of demographic data of users in relation to their handwriting habits. Online handwriting as a biometric method is characterized by high variations of characteristics, which influence the reliability and security of this method. These variations have not been researched in detail so far. Especially in cross-cultural applications it is urgent to reveal the impact of personal background on security aspects in biometrics. Metadata represent the background of writers by introducing cultural, biological and conditional (changing) aspects such as first language, country of origin, gender, handedness, and experience, which influence handwriting and language skills. The goal is the revelation of intercultural impacts on handwriting in order to achieve higher security in biometric systems. In our experiments, in order to achieve a relatively high coverage, 48 different handwriting tasks accomplished by 47 users from three countries (Germany, India and Italy) have been investigated with respect to the relations between metadata and biometric recognition performance. For this purpose, hypotheses have been formulated and evaluated using the measurement of well-known recognition error rates from biometrics. The evaluation addressed both system reliability and security threats posed by skilled forgeries. For the latter purpose, a novel forgery type is introduced, which applies the personal metadata to security aspects and includes new methods of security tests. Finally, we formulate recommendations for specific user groups and handwriting samples.
Forgery quality and its implications for behavioral biometric security.
Ballard, Lucas; Lopresti, Daniel; Monrose, Fabian
2007-10-01
Biometric security is a topic of rapidly growing importance in the areas of user authentication and cryptographic key generation. In this paper, we describe our steps toward developing evaluation methodologies for behavioral biometrics that take into account threat models that have been largely ignored. We argue that the pervasive assumption that forgers are minimally motivated (or, even worse, naive) is too optimistic and even dangerous. Taking handwriting as a case in point, we show through a series of experiments that some users are significantly better forgers than others, that such forgers can be trained in a relatively straightforward fashion to pose an even greater threat, that certain users are easy targets for forgers, and that most humans are a relatively poor judge of handwriting authenticity, and hence, their unaided instincts cannot be trusted. Additionally, to overcome current labor-intensive hurdles in performing more accurate assessments of system security, we present a generative attack model based on concatenative synthesis that can provide a rapid indication of the security afforded by the system. We show that our generative attacks match or exceed the effectiveness of forgeries rendered by the skilled humans we have encountered.
Association mining of dependency between time series
NASA Astrophysics Data System (ADS)
Hafez, Alaaeldin
2001-03-01
Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and extend data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series where two time series sequences could be completely different (in values, shapes, etc.), but still react to the same conditions in a dependent way. We propose the Dependence Mining Technique for predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information on frequent trend patterns); and (c) use trend pattern vectors to predict future time series sequences.
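A toy version of phase (a), under my own symbol convention (U/D/F for up, down, flat): each consecutive difference in the series is mapped to a trend symbol, and the resulting strings are what the frequent-pattern phase would then mine.

import numpy as np

def trend_sequence(x, flat_tol=0.0):
    # Map each step of the series to 'U'p, 'D'own, or 'F'lat (within tolerance).
    diffs = np.diff(np.asarray(x, dtype=float))
    return ''.join('F' if abs(d) <= flat_tol else ('U' if d > 0 else 'D')
                   for d in diffs)

print(trend_sequence([10, 12, 12, 9, 11], flat_tol=0.5))  # -> 'UFDU'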
New biometric modalities using internal physical characteristics
NASA Astrophysics Data System (ADS)
Mortenson, Juliana (Brooks)
2010-04-01
Biometrics is described as the science of identifying people based on physical characteristics such as their fingerprints, facial features, hand geometry, iris patterns, palm prints, or speech recognition. Notably, all of these physical characteristics are visible or detectable from the exterior of the body. These external characteristics can be lifted, photographed, copied or recorded for unauthorized access to a biometric system. Individual humans are unique internally, however, just as they are unique externally. New biometric modalities have been developed which identify people based on their unique internal characteristics. For example, "Boneprints™" use acoustic fields to scan the unique bone density pattern of a thumb pressed on a small acoustic sensor. Thanks to advances in piezoelectric materials, the acoustic sensor can be placed in virtually any device such as a steering wheel, door handle, or keyboard. Similarly, "Imp-Prints™" measure the electrical impedance patterns of a hand to identify or verify a person's identity. Small impedance sensors can be easily embedded in devices such as smart cards, handles, or wall mounts. These internal biometric modalities rely on physical characteristics which are not visible or photographable, providing an added level of security. In addition, both the acoustic and impedance methods can be combined with physiologic measurements such as acoustic Doppler or impedance plethysmography, respectively. Added verification that the biometric pattern came from a living person can be obtained. These new biometric modalities have the potential to allay user concerns over protection of privacy, while providing a higher level of security.
Smoothing of climate time series revisited
NASA Astrophysics Data System (ADS)
Mann, Michael E.
2008-08-01
We present an easily implemented method for smoothing climate time series, generalizing upon an approach previously described by Mann (2004). The method adaptively weights the three lowest order time series boundary constraints to optimize the fit with the raw time series. We apply the method to the instrumental global mean temperature series from 1850-2007 and to various surrogate global mean temperature series from 1850-2100 derived from the CMIP3 multimodel intercomparison project. These applications demonstrate that the adaptive method systematically outperforms certain widely used default smoothing methods, and is more likely to yield accurate assessments of long-term warming trends.
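A hedged sketch of the adaptive idea (heavily simplified; the three padding rules below follow the common "minimum norm / minimum slope / minimum roughness" boundary-constraint convention and are my reading, not the paper's code): smooth the series under each of three boundary treatments and keep, or weight toward, the variant that best fits the raw series.

import numpy as np

def smooth(x, width):
    # Simple moving average as a stand-in lowpass filter.
    return np.convolve(x, np.ones(width) / width, mode='same')

def boundary_variants(x, width):
    pad = width
    # Minimum norm: pad with the boundary mean; minimum slope: reflect;
    # minimum roughness: reflect and flip about the endpoint value.
    mean_pad = np.r_[np.full(pad, x[:pad].mean()), x, np.full(pad, x[-pad:].mean())]
    refl_pad = np.r_[x[pad:0:-1], x, x[-2:-pad - 2:-1]]
    flip_pad = np.r_[2 * x[0] - x[pad:0:-1], x, 2 * x[-1] - x[-2:-pad - 2:-1]]
    return [smooth(p, width)[pad:-pad] for p in (mean_pad, refl_pad, flip_pad)]

x = np.cumsum(np.random.default_rng(0).normal(size=158))  # 1850-2007 stand-in
variants = boundary_variants(x, width=15)
best = min(variants, key=lambda s: np.mean((s - x) ** 2))  # adaptive choice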
A review of subsequence time series clustering.
Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah
2014-01-01
Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.
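As a concrete, hedged illustration of the basic setting (not drawn from the review itself): sliding windows cut a long series into subsequences, and any clustering algorithm can then be applied; k-means is used below purely as an example.

import numpy as np

def sliding_windows(x, w, step=1):
    # Extract overlapping length-w subsequences from a 1-D series.
    x = np.asarray(x, dtype=float)
    return np.stack([x[i:i + w] for i in range(0, len(x) - w + 1, step)])

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.stack([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

x = np.sin(np.linspace(0, 40, 1000))  # toy series
labels, centers = kmeans(sliding_windows(x, w=50, step=5), k=3)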
Compressed domain ECG biometric with two-lead features
NASA Astrophysics Data System (ADS)
Lee, Wan-Jou; Chang, Wen-Whei
2016-07-01
This study presents a new method to combine ECG biometrics with data compression within a common JPEG2000 framework. We target the two-lead ECG configuration that is routinely used in long-term heart monitoring. Incorporation of compressed-domain biometric techniques enables faster person identification, as it bypasses full decompression. Experiments on public ECG databases demonstrate the validity of the proposed method for biometric identification, with high accuracies on both healthy and diseased subjects.
Biometrics in support of special forces medical operations.
Kershner, Michael R
2012-01-01
Recommendations are given on ways in which the ODA can leverage biometrics in medical operations to improve security, improve relations with indigenous personnel, and contribute to the larger theater biometrics program.
Adaptive time-variant models for fuzzy-time-series forecasting.
Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching
2010-12-01
A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
Time series with tailored nonlinearities
NASA Astrophysics Data System (ADS)
Räth, C.; Laut, I.
2015-10-01
It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
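A minimal numerical sketch of this idea (the specific phase constraint below is a toy choice of mine, not the authors'): start from an uncorrelated Gaussian series, impose a simple deterministic relation between adjacent Fourier phases, and invert; the power spectrum, and hence the linear two-point correlations, are left untouched.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4096)               # linear, uncorrelated Gaussian series
X = np.fft.rfft(x)
amplitude = np.abs(X)                   # kept fixed: spectrum is unchanged
phase = np.angle(X)
phase[1:] = 0.5 * (phase[1:] + phase[:-1])  # toy coupling of adjacent phases
y = np.fft.irfft(amplitude * np.exp(1j * phase), len(x))  # tailored series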
Body, biometrics and identity.
Mordini, Emilio; Massari, Sonia
2008-11-01
According to a popular aphorism, biometrics are turning the human body into a passport or a password. As usual, aphorisms say more than they intend. Taking the dictum seriously, we would be two: ourself and our body. Who are we, if we are not our body? And what is our body without us? The endless history of identification systems teaches that identification is not a trivial fact but always involves a web of economic interests, political relations, symbolic networks, narratives and meanings. Certainly there are reasons for the ethical and political concerns surrounding biometrics but these reasons are probably quite different from those usually alleged.
Multimodal biometric system using rank-level fusion approach.
Monwar, Md Maruf; Gavrilova, Marina L
2009-08-01
In many real-world applications, unimodal biometric systems often face significant limitations due to sensitivity to noise, intraclass variability, data quality, nonuniversality, and other factors. Attempting to improve the performance of individual matchers in such situations may not prove to be highly effective. Multibiometric systems seek to alleviate some of these problems by providing multiple pieces of evidence of the same identity. These systems help achieve an increase in performance that may not be possible using a single-biometric indicator. This paper presents an effective fusion scheme that combines information presented by multiple domain experts based on the rank-level fusion integration method. The developed multimodal biometric system possesses a number of unique qualities: it utilizes principal component analysis and Fisher's linear discriminant methods for identity authentication by the individual matchers (face, ear, and signature), and it consolidates the results obtained from the different biometric matchers using a novel rank-level fusion method. The ranks of individual matchers are combined using the highest rank, Borda count, and logistic regression approaches. The results indicate that fusion of individual modalities can improve the overall performance of the biometric system, even in the presence of low quality data. Insights on multibiometric design using rank-level fusion and its performance on a variety of biometric databases are discussed in the concluding section.
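Of the three combination rules named above, the Borda count is the simplest to state; the sketch below is a generic illustration (identities and rankings are invented), assuming each matcher returns a full candidate list ordered best-first.

def borda_fusion(rankings):
    # Each matcher awards n - rank points to each identity; the fused
    # ranking orders identities by total points, best first.
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for rank, identity in enumerate(ranking):
            scores[identity] = scores.get(identity, 0) + (n - rank)
    return sorted(scores, key=scores.get, reverse=True)

face = ['alice', 'bob', 'carol']
ear = ['bob', 'alice', 'carol']
signature = ['alice', 'carol', 'bob']
print(borda_fusion([face, ear, signature]))  # consensus ranking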
Practical security and privacy attacks against biometric hashing using sparse recovery
NASA Astrophysics Data System (ADS)
Topcu, Berkay; Karabat, Cagatay; Azadmanesh, Matin; Erdogan, Hakan
2016-12-01
Biometric hashing is a cancelable biometric verification method that has received research interest recently. This method can be considered a two-factor authentication method which combines a personal password (or secret key) with a biometric to obtain a secure binary template which is used for authentication. We present novel practical security and privacy attacks against biometric hashing when the attacker is assumed to know the user's password, in order to quantify the additional protection due to biometrics when the password is compromised. We present four methods that can reconstruct a biometric feature and/or the image from a hash and one method which can find the closest biometric data (i.e., face image) from a database. Two of the reconstruction methods are based on 1-bit compressed sensing signal reconstruction, for which the data acquisition scenario is very similar to biometric hashing. Previous literature introduced simple attack methods, but we show that compressed sensing recovery techniques enable a higher level of security threat. In addition, we present privacy attacks which reconstruct a biometric image that resembles the original image. We quantify the performance of the attacks using detection error tradeoff curves and equal error rates under advanced attack scenarios. We show that conventional biometric hashing methods suffer from high security and privacy leaks under practical attacks, and we believe more advanced hash generation methods are necessary to avoid these attacks.
Biometrics: A Look at Facial Recognition
a facial recognition system in the city’s Oceanfront tourist area. The system has been tested and has recently been fully implemented. Senator...Kenneth W. Stolle, the Chairman of the Virginia State Crime Commission, established a Facial Recognition Technology Sub-Committee to examine the issue of... facial recognition technology. This briefing begins by defining biometrics and discussing examples of the technology. It then explains how biometrics
Clustering of financial time series
NASA Astrophysics Data System (ADS)
D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo
2013-05-01
This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, requires the definition of suitable distance measures. To this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances, which takes into account the information about the volatility structure of time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp version.
Extreme value analysis in biometrics.
Hüsler, Jürg
2009-04-01
We review some approaches of extreme value analysis in the context of biometrical applications. The classical extreme value analysis is based on iid random variables. Two different general methods are applied, which will be discussed together with biometrical examples. Different estimation, testing, goodness-of-fit procedures for applications are discussed. Furthermore, some non-classical situations are considered where the data are possibly dependent, where a non-stationary behavior is observed in the data or where the observations are not univariate. A few open problems are also stated.
Iris recognition as a biometric method after cataract surgery
Roizenblatt, Roberto; Schor, Paulo; Dante, Fabio; Roizenblatt, Jaime; Belfort, Rubens
2004-01-01
Background Biometric methods are security technologies, which use human characteristics for personal identification. Iris recognition systems use iris textures as unique identifiers. This paper presents an analysis of the verification of iris identities after intra-ocular procedures, when individuals were enrolled before the surgery. Methods Fifty-five eyes from fifty-five patients had their irises enrolled before a cataract surgery was performed. They had their irises verified three times before and three times after the procedure, and the Hamming (mathematical) distance of each identification trial was determined, in a controlled ideal biometric environment. The mathematical difference between the iris code before and after the surgery was also compared to a subjective evaluation of the iris anatomy alteration by an experienced surgeon. Results A correlation between visible subjective iris texture alteration and mathematical difference was verified. We found only six cases in which the eye was no more recognizable, but these eyes were later reenrolled. The main anatomical changes that were found in the new impostor eyes are described. Conclusions Cataract surgeries change iris textures in such a way that iris recognition systems, which perform mathematical comparisons of textural biometric features, are able to detect these changes and sometimes even discard a pre-enrolled iris considering it an impostor. In our study, re-enrollment proved to be a feasible procedure. PMID:14748929
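For reference, the fractional Hamming distance used in such iris-code comparisons is straightforward to state; the sketch below is a standard textbook formulation with illustrative numbers, not the study's code, and counts disagreeing bits only among those valid in both occlusion masks.

import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    # Fraction of disagreeing bits among bits valid in both masks.
    valid = mask_a & mask_b
    disagreements = (code_a ^ code_b) & valid
    return disagreements.sum() / max(valid.sum(), 1)

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 2048, dtype=np.uint8)
probe = enrolled.copy()
flip = rng.random(2048) < 0.05   # 5% of bits altered, e.g. post-surgery
probe[flip] ^= 1
mask = np.ones(2048, dtype=np.uint8)
# Daugman-style systems commonly accept below roughly 0.32 (rule of thumb):
print(hamming_distance(enrolled, probe, mask, mask))  # ~0.05 here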
Heart Electrical Actions as Biometric Indicia
NASA Technical Reports Server (NTRS)
Schipper, John F. (Inventor); Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor); Belousof, Eugene (Inventor)
2013-01-01
A method and associated system for use of statistical parameters based on peak amplitudes and/or time interval lengths and/or depolarization-repolarization vector angles and/or depolarization-repolarization vector lengths for PQRST electrical signals associated with heart waves, to identify a person. The statistical parameters, estimated to be at least 192, serve as biometric indicia, to authenticate, or to decline to authenticate, an asserted identity of a candidate person.
A biometric access personal optical storage device
NASA Astrophysics Data System (ADS)
Davies, David H.; Ray, Steve; Gurkowski, Mark; Lee, Lane
2007-01-01
A portable USB2.0 personal storage device that uses built-in encryption and allows data access through biometric scanning of a finger print is described. Biometric image derived templates are stored on the removable 32 mm write once (WO) media. The encrypted templates travel with the disc and allow access to the data providing the biometric feature (e.g. the finger itself) is present. The device also allows for export and import of the templates under secure key exchange protocols. The storage system is built around the small form factor optical engine that uses a tilt arm rotary actuator and front surface media.
Data Acquisition for Modular Biometric Monitoring System
NASA Technical Reports Server (NTRS)
Grodsinsky, Carlos M. (Inventor); Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor)
2014-01-01
A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to collect data asynchronously, via the bus, from the memory of the plurality of data acquisition modules according to a relative fullness of the memory of the plurality of data acquisition modules.
Drosou, A; Ioannidis, D; Moustakas, K; Tzovaras, D
2011-03-01
Unobtrusive Authentication Using ACTIvity-Related and Soft BIOmetrics (ACTIBIO) is an EU Specific Targeted Research Project (STREP) where new types of biometrics are combined with state-of-the-art unobtrusive technologies in order to enhance security in a wide spectrum of applications. The project aims to develop a modular, robust, multimodal biometrics security authentication and monitoring system, which uses a biodynamic physiological profile, unique for each individual, and advancements of the state of the art in unobtrusive behavioral and other biometrics, such as face, gait recognition, and seat-based anthropometrics. Several shortcomings of existing biometric recognition systems are addressed within this project, which have helped in improving existing sensors, in developing new algorithms, and in designing applications, towards creating new, unobtrusive, biometric authentication procedures in security-sensitive, Ambient Intelligence environments. This paper presents the concept of the ACTIBIO project and describes its unobtrusive authentication demonstrator in a real scenario by focusing on the vision-based biometric recognition modalities.
Feature Selection for Nonstationary Data: Application to Human Recognition Using Medical Biometrics.
Komeili, Majid; Louis, Wael; Armanfard, Narges; Hatzinakos, Dimitrios
2018-05-01
Electrocardiogram (ECG) and transient evoked otoacoustic emission (TEOAE) are among the physiological signals that have attracted significant interest in the biometric community due to their inherent robustness to replay and falsification attacks. However, they are time-dependent signals, and this makes them hard to deal with in an across-session human recognition scenario where only one session is available for enrollment. This paper presents a novel feature selection method to address this issue. It is based on an auxiliary dataset with multiple sessions, where it selects a subset of features that are more persistent across different sessions. It uses local information in terms of sample margins while enforcing an across-session measure. This makes it a perfect fit for the aforementioned biometric recognition problem. Comprehensive experiments on ECG and TEOAE variability due to time lapse and body posture are performed. The performance of the proposed method is compared against seven state-of-the-art feature selection algorithms as well as another six approaches in the area of ECG and TEOAE biometric recognition. Experimental results demonstrate that the proposed method performs noticeably better than other algorithms.
Joint sparse representation for robust multimodal biometrics recognition.
Shekhar, Sumit; Patel, Vishal M; Nasrabadi, Nasser M; Chellappa, Rama
2014-01-01
Traditional biometric recognition systems rely on a single biometric signature for authentication. While the advantage of using multiple sources of information for establishing the identity has been widely recognized, computational models for multimodal biometrics recognition have only recently received attention. We propose a multimodal sparse representation method, which represents the test data by a sparse linear combination of training data, while constraining the observations from different modalities of the test subject to share their sparse representations. Thus, we simultaneously take into account correlations as well as coupling information among biometric modalities. A multimodal quality measure is also proposed to weigh each modality as it gets fused. Furthermore, we also kernelize the algorithm to handle nonlinearity in data. The optimization problem is solved using an efficient alternative direction method. Various experiments show that the proposed method compares favorably with competing fusion-based methods.
Biometric identification based on feature fusion with PCA and SVM
NASA Astrophysics Data System (ADS)
Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina
2018-04-01
Biometric identification is gaining ground compared to traditional identification methods. Many biometric measurements may be used for secure human identification. The most reliable among them is the iris pattern because of its uniqueness, stability, unforgeability and inalterability over time. The approach presented in this paper is a fusion of different feature descriptor methods such as HOG, LIOP, LBP, used for extracting iris texture information. The classifiers obtained through the SVM and PCA methods demonstrate the effectiveness of our system applied to one and both irises. The performances measured are highly accurate and foreshadow a fusion system with a rate of identification approaching 100% on the UPOL database.
A definitional framework for the human/biometric sensor interaction model
NASA Astrophysics Data System (ADS)
Elliott, Stephen J.; Kukula, Eric P.
2010-04-01
Existing definitions for biometric testing and evaluation do not fully explain errors in a biometric system. This paper provides a definitional framework for the Human Biometric-Sensor Interaction (HBSI) model. This paper proposes six new definitions based around two classifications of presentations, erroneous and correct. The new terms are: defective interaction (DI), concealed interaction (CI), false interaction (FI), failure to detect (FTD), failure to extract (FTX), and successfully acquired samples (SAS). As with all definitions, the new terms require a modification to the general biometric model developed by Mansfield and Wayman [1].
Analyzing personalized policies for online biometric verification.
Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M
2014-01-01
Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
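The likelihood-ratio rule itself is compact; the following is a deliberately simplified, hedged sketch with independent Gaussian score models and invented numbers (the paper builds a much richer joint model of all 12 similarity scores).

import numpy as np
from scipy.stats import norm

def log_likelihood_ratio(scores, genuine_params, impostor_params):
    # Sum of per-score log-likelihood ratios under independence (a toy
    # assumption here; correlations matter in the real joint model).
    llr = 0.0
    for s, (mg, sg), (mi, si) in zip(scores, genuine_params, impostor_params):
        llr += norm.logpdf(s, mg, sg) - norm.logpdf(s, mi, si)
    return llr

# Two acquired similarity scores (e.g. one finger, one iris), all numbers assumed:
scores = [0.82, 0.77]
genuine = [(0.80, 0.10), (0.75, 0.12)]    # per-score genuine mean/std
impostor = [(0.30, 0.15), (0.25, 0.15)]   # per-score impostor mean/std
accept = log_likelihood_ratio(scores, genuine, impostor) > 0.0  # threshold sets FAR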
Wu, Qunjian; Yan, Bin; Zeng, Ying; Zhang, Chi; Tong, Li
2018-05-03
The electroencephalogram (EEG) signal represents a subject's specific brain activity patterns and is considered an ideal biometric given its superior invisibility, non-clonality, and non-coercion. In order to enhance its applicability in identity authentication, a novel EEG-based identity authentication method is proposed based on self- or non-self-face rapid serial visual presentation. In contrast to previous studies that extracted EEG features from the rest state or motor imagery, the designed paradigm can obtain a distinct and stable biometric trait with a lower time cost. Channel selection was applied to select specific channels for each user to enhance system portability and improve discriminability between users and imposters. Two different imposter scenarios were designed to test system security, which demonstrate the capability of anti-deception. Fifteen users and thirty imposters participated in the experiment. The mean authentication accuracy values for the two scenarios were 91.31% and 91.61%, with a 6 s time cost, which illustrates the precision and real-time capability of the system. Furthermore, in order to estimate the repeatability and stability of our paradigm, another data acquisition session was conducted for each user. Using the classification models generated from the previous sessions, a mean false rejection rate of 7.27% was achieved, which demonstrates the robustness of our paradigm. Experimental results reveal that the proposed paradigm and methods are effective for EEG-based identity authentication.
Time averaging, ageing and delay analysis of financial time series
NASA Astrophysics Data System (ADS)
Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf
2017-06-01
We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
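The central quantity is simple to compute; below is a minimal sketch of the time averaged MSD for a single trajectory, applied to a geometric Brownian motion stand-in (parameters invented) of the kind the authors use as their analytical benchmark.

import numpy as np

def tamsd(x, lag):
    # Time averaged mean squared displacement at a given lag:
    # average of (x(t + lag) - x(t))^2 over the sliding time window.
    x = np.asarray(x, dtype=float)
    disp = x[lag:] - x[:-lag]
    return np.mean(disp ** 2)

# Geometric Brownian motion as a stand-in for an index trajectory:
rng = np.random.default_rng(0)
log_returns = rng.normal(0.0002, 0.01, 10_000)
price = 100 * np.exp(np.cumsum(log_returns))
print([tamsd(price, d) for d in (1, 10, 100)])  # growth with lag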
Iris biometric system design using multispectral imaging
NASA Astrophysics Data System (ADS)
Widhianto, Benedictus Yohanes Bagus Y. B.; Nasution, Aulia M. T.
2016-11-01
An identity recognition system is a vital component that cannot be separated from daily life, and iris recognition is one of the biometrics with the best accuracy, reaching 99%. Usually, iris biometric systems use infrared illumination to reduce the discomfort caused by radiation when the eye is given direct light, while the eumelanin that forms the iris exhibits its strongest fluorescent radiation under visible light. This research detects the iris at wavelengths of 850 nm, 560 nm, and 590 nm, with a detection algorithm based on the Daugman algorithm, using Gabor wavelet feature extraction and Hamming-distance feature matching. The results are analyzed to identify how large the differences are, to improve the accuracy of the multispectral biometric system, and to serve as a detector of the authenticity of the iris. The results obtained from the analysis of the wavelengths 850 nm, 560 nm, and 590 nm show accuracies of 99.35%, 97.5%, and 64.5%, respectively, with matching scores of 0.26, 0.23, and 0.37.
Biometric recognition using 3D ear shape.
Yan, Ping; Bowyer, Kevin W
2007-08-01
Previous works have shown that the ear is a promising candidate for biometric identification. However, in prior work, the preprocessing of ear images has had manual steps and algorithms have not necessarily handled problems caused by hair and earrings. We present a complete system for ear biometrics, including automated segmentation of the ear in a profile view image and 3D shape matching for recognition. We evaluated this system with the largest experimental study to date in ear biometrics, achieving a rank-one recognition rate of 97.8 percent for an identification scenario and an equal error rate of 1.2 percent for a verification scenario on a database of 415 subjects and 1,386 total probes.
Intraocular pressure and ocular biometric parameters changes in migraine.
Koban, Yaran; Ozlece, Hatice Kose; Bilgin, Gorkem; Koc, Mustafa; Cagatay, Halil Huseyin; Durgunlu, Emre I; Burcu, Ayse
2016-05-31
The aim of this study was to assess intraocular pressure and ocular biometric parameters in migraine patients during acute migraine attacks and to compare them with the painless period and with healthy controls, using the new AL-Scan optical biometer. In this prospective, case-control study, the axial length, corneal curvature radius, anterior chamber depth, central corneal thickness, and pupil size of 40 migraine patients during acute migraine attacks and the painless period, and of 40 age- and sex-matched healthy subjects, were measured using an AL-Scan optical biometer (Nidek Co., Gamagori, Japan). All patients underwent a complete ophthalmic examination before the measurements. IOP and biometer measurements were taken at the same time of day (10:00-12:00) in order to minimize the effects of diurnal variation. There was no statistically significant difference in intraocular pressure between the migraine patients during acute migraine attacks (15.07 mmHg), the painless period (14.10 mmHg), and the controls (15.73 ± 0.81 mmHg). Also, the ocular biometric parameters did not vary significantly during the acute migraine attacks. Further studies are needed to evaluate the etiopathologic relationship between intraocular pressure, ocular biometric parameters, and acute migraine attacks.
A user authentication scheme using physiological and behavioral biometrics for multitouch devices.
Koong, Chorng-Shiuh; Yang, Tzu-I; Tseng, Chien-Chao
2014-01-01
With the rapid growth of mobile networks, tablets and smart phones have become keys of sorts to access personal secured services in our daily life. People use these devices to manage personal finances, shop on the Internet, and even pay at vending machines. They also help us stay connected with friends and business partners through social network applications, which are widely used as personal identifications in both real and virtual societies. However, these devices use an inherently weak authentication mechanism, based upon passwords and PINs that are rarely changed. Although forcing users to change their password periodically can enhance the security level, it may also be considered an annoyance by users. Biometric technologies are straightforward because of the simple authentication process. However, most traditional biometric methodologies require diverse equipment to acquire biometric information, which may be expensive and not portable. This paper proposes a multibiometric user authentication scheme with both physiological and behavioral biometrics. Only simple rotations with fingers on multitouch devices are required to enhance the security level, without annoyance for users. In addition, the user credential is replaceable, to prevent privacy leakage.
Biometric Attendance and Big Data Analysis for Optimizing Work Processes.
Verma, Neetu; Xavier, Teenu; Agrawal, Deepak
2016-01-01
Although biometric attendance management is available, large healthcare organizations have difficulty with the big-data analysis needed to optimize work processes. The aim of this project was to assess the implementation of a biometric attendance system and its utility following big-data analysis. In this prospective study, the implementation of the biometric system was evaluated over a 3-month period at our institution. Software integration with other existing systems for data analysis was also evaluated. Implementation of the biometric system was successfully completed over a two-month period, with enrollment of 10,000 employees into the system. However, generating reports and taking action for this large number of staff was a challenge. For this purpose, software was developed to capture the duty roster of each employee, integrate it with the biometric system, and add an SMS gateway. This helped automate the process of sending an SMS to each employee who had not signed in. Standalone biometric systems have limited functionality in large organizations unless they are meshed with the employee duty roster.
Entropic Analysis of Electromyography Time Series
NASA Astrophysics Data System (ADS)
Kaufman, Miron; Sung, Paul
2005-03-01
We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. In a typical EMG measurement, the voltage is measured every millisecond. We observed back-muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of time series from the relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back-muscle activity is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
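A minimal Python sketch of the windowed entropy-versus-time analysis described above; the window length, step size, and histogram bin count are illustrative assumptions, not values reported by the authors:

    import numpy as np

    def shannon_entropy(window, n_bins=32):
        # Shannon entropy (bits) of the amplitude distribution in one window
        counts, _ = np.histogram(window, bins=n_bins)
        p = counts / counts.sum()
        p = p[p > 0]  # drop empty bins to avoid log(0)
        return -np.sum(p * np.log2(p))

    def entropy_time_dependence(x, win=1000, step=500):
        # entropy of successive windows -> an entropy-vs-time curve
        starts = range(0, len(x) - win + 1, step)
        return np.array([shannon_entropy(x[s:s + win]) for s in starts])

    # example: 60 s of surrogate "EMG" sampled at 1 kHz (60,000 points)
    rng = np.random.default_rng(0)
    emg = rng.normal(size=60_000) * np.linspace(1.0, 0.5, 60_000)  # mock fatigue
    print(entropy_time_dependence(emg)[:5])

Plotting such a curve against window start time is one way to expose the crossover from diffusive to long-range-correlated behavior that the authors report.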
3D Biometrics for Hindfoot Alignment Using Weightbearing CT.
Lintz, François; Welck, Matthew; Bernasconi, Alessio; Thornton, James; Cullen, Nicholas P; Singh, Dishan; Goldberg, Andy
2017-06-01
Hindfoot alignment on 2D radiographs can present anatomical and operator-related bias. In this study, software designed for weightbearing computed tomography (WBCT) was used to calculate a new 3D biometric tool: the Foot and Ankle Offset (FAO). We described the distribution of the FAO in a series of data sets from clinically normal, varus, and valgus cases, hypothesizing that FAO values would be significantly different in the 3 groups. In this retrospective cohort study, 135 data sets (57 normal, 38 varus, 40 valgus) from WBCT (PedCAT; CurveBeam LLC, Warrington, PA) were obtained from a specialized foot and ankle unit. 3D coordinates of specific anatomical landmarks (the weightbearing points of the calcaneus and of the first and fifth metatarsal heads, and the highest and centermost point on the talar dome) were collected. These data were processed with the TALAS system (CurveBeam), which produced an FAO value for each case. Intraobserver and interobserver reliability were also assessed. In normal cases, the mean FAO was 2.3% ± 2.9%, whereas in varus and valgus cases the mean was -11.6% ± 6.9% and 11.4% ± 5.7%, respectively, with a statistically significant difference among groups (P < .001). The distribution of the normal population was Gaussian. The inter- and intraobserver reliabilities were 0.99 ± 0.00 and 0.97 ± 0.02, respectively. This pilot study suggests that the FAO is an efficient tool for measuring hindfoot alignment using WBCT. Previously published research in this field has looked at WBCT by adapting 2D biometrics. The present study introduces the concept of 3D biometrics and describes an efficient, semiautomatic tool for measuring hindfoot alignment. Level III, retrospective comparative study.
The ocular biometric differences of diabetic patients.
Kocatürk, Tolga; Zengin, Mehmet Özgür; Cakmak, Harun; Evliçoglu, Gökhan Evren; Dündar, Sema Oruç; Omürlü, Imran Kurt; Unübol, Mustafa; Güney, Engin
2014-01-01
To investigate differences in ocular biometric and keratometric characteristics of diabetic patients, using a noncontact optical low-coherence reflectometer (OLCR) (Lenstar LS 900, Haag-Streit). The eyes of 170 patients were included in this study, comprising 81 diabetic and 89 nondiabetic subjects. Optical biometric measurements of diabetic and nondiabetic patients (between the ages of 25 and 85 years) who presented to the ophthalmology clinic were recorded from March to June 2013. Detailed ophthalmologic examinations were performed for every subject. Biometric measurements were taken using the noncontact OLCR device. Patient age ranged from 29 to 83 years. Subgroup analyses were done in diabetic patients according to their HbA1c levels. The minimum HbA1c value was 5.3, the maximum 12.4, and the mean 7.56 ± 1.48. The median duration of diabetes was 5 years (25th-75th percentile, 3.00-11.75). Diabetic patients were found to have thicker lenses and shallower anterior chambers in both eyes compared to nondiabetic control subjects. There were no statistical differences between the groups in central corneal thickness, axial length, or keratometric values in either eye. These biometric measurements, taken after regulation of blood glucose, may be useful for eyeglass prescription, refractive surgery calculations, lens selection, and cataract surgery planning.
Genetics, biometrics and the informatization of the body.
van der Ploeg, Irma
2007-01-01
"Genetics" is a term covering a wide set of theories, practices, and technologies, only some of which overlap with the practices and technologies of biometrics. In this paper some current technological developments relating to biometric applications of genetics will be highlighted. Next, the author will elaborate the notion of the informatization of the body, by means of a brief philosophical detour on the dualisms of language and reality, words and things. In the subsequent sections she will then draw out some of the questions relevant to the purposes of Biometrics Identification Technology Ethics (BITE), and discuss the ethical problems associated with the informatization of the body. There are, however some problems and limitations to the currently dominant ethical discourse to deal with all things ethical in relation to information technology in general, and biometrics or genetics in particular. The final section will discuss some of these meta-problems.
Homogenising time series: Beliefs, dogmas and facts
NASA Astrophysics Data System (ADS)
Domonkos, P.
2010-09-01
For obtaining reliable information about climate change and climate variability, the use of high-quality data series is essential, and one basic tool of quality improvement is the statistical homogenisation of observed time series. In recent decades a large number of homogenisation methods have been developed, but the real effects of their application on time series are still not entirely known. The ongoing COST HOME project (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics approach well the characteristics of real networks of observed time series. This dataset offers a much better opportunity than ever to test the wide variety of homogenisation methods and analyse the real effects of selected theoretical recommendations. The author believes that several old theoretical rules have to be re-evaluated. Some examples of the open questions: (a) Can statistically detected change-points be accepted only with the confirmation of metadata information? (b) Do semi-hierarchic algorithms for detecting multiple change-points in time series function effectively in practice? (c) Is it good to limit the spatial comparison of candidate series to up to five other series in the neighbourhood? Empirical results - those from the COST benchmark, and other experiments too - show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities seem like part of the climatic variability, thus the pure application of the classic theory that change-points of observed time series can be found and corrected one-by-one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in raw time series. The developers and users of homogenisation methods have to bear in mind that
Quantifying memory in complex physiological time-series.
Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R
2013-01-01
In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.
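The exact inverse-statistics procedure is not given in the abstract; the following Python sketch shows one plausible operationalization of "memory for rare events": collect the waiting times between increments that exceed a high quantile, and compare their statistics between groups (the quantile level and the event definition are assumptions of this sketch):

    import numpy as np

    def rare_event_waiting_times(x, q=0.95):
        # indices of "rare" fluctuations, defined here as |increment| above
        # the q-th quantile; waiting times are the gaps between those events
        inc = np.diff(x)
        thr = np.quantile(np.abs(inc), q)
        events = np.flatnonzero(np.abs(inc) >= thr)
        return np.diff(events)

    rng = np.random.default_rng(1)
    rr = 0.8 + 0.05 * rng.standard_normal(5000)   # mock RR-interval series
    w = rare_event_waiting_times(rr)
    print(w.mean(), np.quantile(w, 0.9))

A slowly decaying waiting-time distribution would correspond to the prolonged memory the authors observe in uncontrolled asthma and decompensated cirrhosis.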
Unveiling the Biometric Potential of Finger-Based ECG Signals
Lourenço, André; Silva, Hugo; Fred, Ana
2011-01-01
The ECG signal has been shown to contain relevant information for human identification. Even though results validate the potential of these signals, the data acquisition methods and apparatus explored so far compromise user acceptability by requiring acquisition of the ECG at the chest. In this paper, we propose a finger-based ECG biometric system that uses signals collected at the fingers through a minimally intrusive 1-lead ECG setup, using dry Ag/AgCl electrodes (without gel) as the interface with the skin. The collected signal is significantly noisier than ECG acquired at the chest, motivating the application of feature extraction and signal processing techniques to the problem. Time-domain ECG signal processing is performed, comprising the usual steps of filtering, peak detection, heartbeat waveform segmentation, and amplitude normalization, plus an additional step of time normalization. Through a simple minimum-distance criterion between the test patterns and the enrollment database, results have revealed this to be a promising technique for biometric applications. PMID:21837235
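A compact Python sketch of the described time-domain pipeline (filtering, R-peak detection, heartbeat segmentation, amplitude normalization, minimum-distance matching); the sampling rate, filter band, and window lengths are assumptions for illustration:

    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    FS = 500  # assumed sampling rate (Hz)

    def heartbeats(ecg, pre=0.25, post=0.45):
        # band-pass filter, detect R peaks, cut fixed windows, normalize
        b, a = butter(3, [1, 40], btype="bandpass", fs=FS)
        x = filtfilt(b, a, ecg)
        peaks, _ = find_peaks(x, distance=int(0.4 * FS),
                              height=np.percentile(x, 95))
        n0, n1 = int(pre * FS), int(post * FS)
        beats = np.array([x[p - n0:p + n1] for p in peaks
                          if p - n0 >= 0 and p + n1 <= len(x)])
        beats -= beats.mean(axis=1, keepdims=True)          # amplitude...
        beats /= np.abs(beats).max(axis=1, keepdims=True)   # ...normalization
        return beats  # (time normalization would resample each beat here)

    def identify(test_beat, enrollment):
        # enrollment: dict mapping subject id -> mean template beat
        return min(enrollment,
                   key=lambda s: np.linalg.norm(test_beat - enrollment[s]))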
Biometric Borders and Counterterrorism
2010-12-01
Spectroscopically Enhanced Method and System for Multi-Factor Biometric Authentication
NASA Astrophysics Data System (ADS)
Pishva, Davar
This paper proposes a spectroscopic method and system for preventing the spoofing of biometric authentication. One of its aims is to enhance biometric authentication with a spectroscopic method in a multifactor manner, such that a person's unique ‘spectral signatures’ or ‘spectral factors’ are recorded and compared in addition to a non-spectroscopic biometric signature, to reduce the likelihood of an impostor being authenticated. By using the ‘spectral factors’ extracted from reflectance spectra of real fingers and employing cluster analysis, it shows how an authentic fingerprint image presented by a real finger can be distinguished from an authentic fingerprint image embossed on an artificial finger, or molded on a fingertip cover worn by an impostor. This paper also shows how to augment two widely used biometric systems (fingerprint and iris recognition devices) with spectral biometrics capabilities in a practical manner, without creating much overhead or inconveniencing users.
A cancelable biometric scheme based on multi-lead ECGs.
Peng-Tzu Chen; Shun-Chi Wu; Jui-Hsuan Hsieh
2017-07-01
Biometric technologies offer great advantages over other recognition methods, but there are concerns that they may compromise the privacy of individuals. In this paper, an electrocardiogram (ECG)-based cancelable biometric scheme is proposed to relieve such concerns. In this scheme, distinct biometric templates for a given beat bundle are constructed via "subspace collapsing." To determine the identity of any unknown beat bundle, the multiple signal classification (MUSIC) algorithm, incorporating a "suppression and poll" strategy, is adopted. Unlike existing cancelable biometric schemes, knowledge of the distortion transform is not required for recognition. Experiments with real ECGs from 285 subjects are presented to illustrate the efficacy of the proposed scheme. The best recognition rate, 97.58%, was achieved under the test condition N_train = 10 and N_test = 10.
Risk-Based Neuro-Grid Architecture for Multimodal Biometrics
NASA Astrophysics Data System (ADS)
Venkataraman, Sitalakshmi; Kulkarni, Siddhivinayak
Recent research indicates that multimodal biometrics is the way forward for a highly reliable adoption of biometric identification systems in various applications, such as banks, businesses, government and even home environments. However, such systems would require large distributed datasets with multiple computational realms spanning organisational boundaries and individual privacies.
Homogenising time series: beliefs, dogmas and facts
NASA Astrophysics Data System (ADS)
Domonkos, P.
2011-06-01
In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not known sufficiently. The ongoing COST action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than earlier. As a part of the COST activity, a benchmark dataset was built whose characteristics approach well the characteristics of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to natural changes caused by climatic variability, thus the pure application of the classic theory that change-points of observed time series can be found and corrected one-by-one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as that of time series comparisons within homogenisation procedures, are discussed briefly in the study.
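The COST HOME methods are considerably more elaborate, but the core detection step can be illustrated with a classic single-break detector, the standard normal homogeneity test (SNHT), applied to a candidate-minus-reference difference series (the shift size and series length below are arbitrary):

    import numpy as np

    def snht(x):
        # SNHT statistic: T(k) = k*z1^2 + (n-k)*z2^2 over all split points k;
        # a large maximum suggests a break at the maximizing k
        z = (x - x.mean()) / x.std(ddof=1)
        n = len(z)
        T = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                      for k in range(1, n)])
        k = int(np.argmax(T)) + 1
        return k, T[k - 1]

    rng = np.random.default_rng(2)
    diff = rng.normal(0, 1, 200)
    diff[120:] += 0.5          # artificial inhomogeneity
    print(snht(diff))          # break position near 120

Small shifts like the 0.5 above are exactly the ones that blend into climatic variability, which is why one-by-one correction of detected change-points is insufficient in practice.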
Yang, Licai; Shen, Jun; Bao, Shudi; Wei, Shoushui
2013-10-01
To address the problems of identification performance and algorithmic complexity, we propose a piecewise linear representation and dynamic time warping (PLR-DTW) method for ECG biometric identification. First, we detect R peaks to obtain the heartbeats after denoising preprocessing. Then we use the PLR method to keep the important information of an ECG signal segment while reducing the data dimension at the same time. An improved DTW method is used for similarity measurements between the test data and the templates. The performance evaluation was carried out on two ECG databases: PTB and MIT-BIH. The results showed that, compared to the discrete wavelet transform method, the proposed PLR-DTW method achieved an accuracy rate higher by nearly 8% and saved about 30% of the operation time, demonstrating that the proposed method can provide better performance.
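A Python sketch of the two ingredients named in the title; the PLR below uses equal-width segments as a stand-in (the paper keeps salient points), and the DTW is the classic dynamic program rather than the authors' improved variant:

    import numpy as np

    def plr(x, n_segments=20):
        # crude piecewise linear representation: keep segment endpoints only
        idx = np.linspace(0, len(x) - 1, n_segments + 1).astype(int)
        return x[idx]

    def dtw(a, b):
        # classic O(len(a)*len(b)) dynamic time warping distance
        D = np.full((len(a) + 1, len(b) + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[-1, -1]

    def identify(test_beat, templates):
        # templates: dict mapping subject id -> template heartbeat
        r = plr(test_beat)
        return min(templates, key=lambda s: dtw(r, plr(templates[s])))

Reducing each heartbeat to roughly 20 points before DTW is what buys the reported reduction in operation time, since DTW cost is quadratic in sequence length.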
Ocean time-series near Bermuda: Hydrostation S and the US JGOFS Bermuda Atlantic time-series study
NASA Technical Reports Server (NTRS)
Michaels, Anthony F.; Knap, Anthony H.
1992-01-01
Bermuda is the site of two ocean time-series programs. At Hydrostation S, the ongoing biweekly profiles of temperature, salinity and oxygen now span 37 years. This is one of the longest open-ocean time-series data sets and provides a view of decadal scale variability in ocean processes. In 1988, the U.S. JGOFS Bermuda Atlantic Time-series Study began a wide range of measurements at a frequency of 14-18 cruises each year to understand temporal variability in ocean biogeochemistry. On each cruise, the data range from chemical analyses of discrete water samples to data from electronic packages of hydrographic and optics sensors. In addition, a range of biological and geochemical rate measurements are conducted that integrate over time-periods of minutes to days. This sampling strategy yields a reasonable resolution of the major seasonal patterns and of decadal scale variability. The Sargasso Sea also has a variety of episodic production events on scales of days to weeks and these are only poorly resolved. In addition, there is a substantial amount of mesoscale variability in this region and some of the perceived temporal patterns are caused by the intersection of the biweekly sampling with the natural spatial variability. In the Bermuda time-series programs, we have added a series of additional cruises to begin to assess these other sources of variation and their impacts on the interpretation of the main time-series record. However, the adequate resolution of higher frequency temporal patterns will probably require the introduction of new sampling strategies and some emerging technologies such as biogeochemical moorings and autonomous underwater vehicles.
Modeling associations between latent event processes governing time series of pulsing hormones.
Liu, Huayu; Carlson, Nichole E; Grunwald, Gary K; Polotsky, Alex J
2017-10-31
This work is motivated by a desire to quantify relationships between two time series of pulsing hormone concentrations. The locations of pulses are not directly observed and may be considered latent event processes. The latent event processes of pulsing hormones are often associated. It is this joint relationship we model. Current approaches to jointly modeling pulsing hormone data generally assume that a pulse in one hormone is coupled with a pulse in another hormone (one-to-one association). However, pulse coupling is often imperfect. Existing joint models are not flexible enough for imperfect systems. In this article, we develop a more flexible class of pulse association models that incorporate parameters quantifying imperfect pulse associations. We propose a novel use of the Cox process model as a model of how pulse events co-occur in time. We embed the Cox process model into a hormone concentration model. Hormone concentration is the observed data. Spatial birth and death Markov chain Monte Carlo is used for estimation. Simulations show the joint model works well for quantifying both perfect and imperfect associations and offers estimation improvements over single hormone analyses. We apply this model to luteinizing hormone (LH) and follicle stimulating hormone (FSH), two reproductive hormones. Use of our joint model results in an ability to investigate novel hypotheses regarding associations between LH and FSH secretion in obese and non-obese women. © 2017, The International Biometric Society.
Optimization of a Biometric System Based on Acoustic Images
Izquierdo Fuente, Alberto; Del Val Puente, Lara; Villacorta Calvo, Juan J.; Raboso Mateos, Mariano
2014-01-01
On the basis of an acoustic biometric system that captures 16 acoustic images of a person at 4 frequencies and 4 positions, a study was carried out to improve the performance of the system. In a first stage, an analysis was carried out to determine which images provide more information to the system, showing that a set of 12 images allows the system to obtain results equivalent to using all 16 images. Finally, optimization techniques were used to obtain the set of weights associated with each acoustic image that maximizes the performance of the biometric system. These results significantly improve the performance of the preliminary system while reducing the acquisition time and computational burden, since the number of acoustic images was reduced. PMID:24616643
Multivariate Time Series Decomposition into Oscillation Components.
Matsuda, Takeru; Komaki, Fumiyasu
2017-08-01
Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop Gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.
Forecasting Enrollments with Fuzzy Time Series.
ERIC Educational Resources Information Center
Song, Qiang; Chissom, Brad S.
The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…
Zeroing Biometrics: Collecting Biometrics Before the Shooting Starts
2012-04-01
structure, prison, or a border crossing. Facial recognition is the least radical of the modes of biometric collection and may offer the most promise for the future. Facial recognition is the ability to recognize an individual from a photo or other visual representation. ... Unfortunately, facial recognition programs are more susceptible to acts of disguise than a human observer. In the field the use of photos
The rationale for chemical time-series sampling has its roots in the same fundamental relationships as govern well hydraulics. Samples of ground water are collected as a function of increasing time of pumpage. The most efficient pattern of collection consists of logarithmically s...
Remote secure proof of identity using biometrics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, S. K.; Pearson, P.; Strait, R.S.
1997-06-10
Biometric measurements derived from finger- or voiceprints, hand geometry, retinal vessel patterns, iris texture characteristics, etc., can serve as identifiers of individuals. In each case, the measurements can be coded into a statistically unique bit-string for each individual. While in electronic commerce and other electronic transactions the proof of identity of an individual is provided by the use of either public key cryptography or biometric data, more secure applications can be achieved by employing both. However, the former requires the use of exact bit patterns. An error correction procedure allows us to successfully combine the use of both to provide a general procedure for remote secure proof of identity using a generic biometric device. One such procedure has been demonstrated using a device based on hand geometry.
Animal biometrics: quantifying and detecting phenotypic appearance.
Kühl, Hjalmar S; Burghardt, Tilo
2013-07-01
Animal biometrics is an emerging field that develops quantified approaches for representing and detecting the phenotypic appearance of species, individuals, behaviors, and morphological traits. It operates at the intersection between pattern recognition, ecology, and information sciences, producing computerized systems for phenotypic measurement and interpretation. Animal biometrics can benefit a wide range of disciplines, including biogeography, population ecology, and behavioral research. Currently, real-world applications are gaining momentum, augmenting the quantity and quality of ecological data collection and processing. However, to advance animal biometrics will require integration of methodologies among the scientific disciplines involved. Such efforts will be worthwhile because the great potential of this approach rests with the formal abstraction of phenomics, to create tractable interfaces between different organizational levels of life. Copyright © 2013 Elsevier Ltd. All rights reserved.
EEG biometric identification: a thorough exploration of the time-frequency domain
NASA Astrophysics Data System (ADS)
DelPozo-Banos, Marcos; Travieso, Carlos M.; Weidemann, Christoph T.; Alonso, Jesús B.
2015-10-01
Objective. Although interest in using electroencephalogram (EEG) activity for subject identification has grown in recent years, the state of the art still lacks a comprehensive exploration of the discriminant information within it. This work aims to fill this gap, and in particular, it focuses on the time-frequency representation of the EEG. Approach. We executed qualitative and quantitative analyses of six publicly available data sets following a sequential experimentation approach. This approach was divided into three blocks analysing the configuration of the power spectrum density, the representation of the data and the properties of the discriminant information. A total of ten experiments were applied. Main results. Results show that EEG information below 40 Hz is unique enough to discriminate across subjects (a maximum of 100 subjects were evaluated here), regardless of the recorded cognitive task or the sensor location. Moreover, the discriminative power of rhythms follows a W-like shape between 1 and 40 Hz, with the central peak located at the posterior rhythm (around 10 Hz). This information is maximized with segments of around 2 s, and it proved to be moderately constant across montages and time. Significance. Therefore, we characterize how EEG activity differs across individuals and detail the optimal conditions to detect subject-specific information. This work helps to clarify the results of previous studies and to solve some unanswered questions. Ultimately, it will serve as a guide for the design of future biometric systems.
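The feature extraction implied by these findings (log power spectrum below 40 Hz from roughly 2 s segments) can be sketched as follows; the sampling rate and Welch parameters are assumptions:

    import numpy as np
    from scipy.signal import welch

    FS = 128  # assumed EEG sampling rate (Hz)

    def psd_features(segment, fmax=40.0):
        # log-PSD below fmax as a subject-discriminative feature vector;
        # nperseg=FS gives 1 Hz frequency resolution
        f, pxx = welch(segment, fs=FS, nperseg=FS)
        return np.log(pxx[f <= fmax] + 1e-12)

    rng = np.random.default_rng(3)
    segment = rng.normal(size=2 * FS)   # one 2-second segment
    print(psd_features(segment).shape)  # ~41 features (0-40 Hz)

Feeding such vectors to any standard classifier would be one straightforward way to exploit the subject-specific W-shaped rhythm profile reported above.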
Privacy Enhancements for Inexact Biometric Templates
NASA Astrophysics Data System (ADS)
Ratha, Nalini; Chikkerur, Sharat; Connell, Jonathan; Bolle, Ruud
Traditional authentication schemes utilize tokens or depend on some secret knowledge possessed by the user for verifying his or her identity. Although these techniques are widely used, they have several limitations. Both token- and knowledge-based approaches cannot differentiate between an authorized user and an impersonator having access to the tokens or passwords. Biometrics-based authentication schemes overcome these limitations while offering usability advantages in the area of password management. However, despite its obvious advantages, the use of biometrics raises several security and privacy concerns.
On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature
NASA Astrophysics Data System (ADS)
Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar
Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. Recent developments in the biometrics area have led to smaller, faster and cheaper systems, which in turn has increased the number of possible application areas for biometric identity verification. Biometric data, being derived from human bodies (and especially when used to identify or verify those bodies), is considered personally identifiable information (PII). The collection, use and disclosure of biometric data, image or template, invokes rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance and profiling. Biometric data transmitted across networks and stored in various databases by others can also be stolen, copied, or otherwise misused in ways that can materially affect the individual involved. As biometric systems are vulnerable to replay, database and brute-force attacks, such potential attacks must be analysed before they are massively deployed in security systems. Along with security, the privacy of the users is also an important factor, as the construction of lines in palmprints contains personal characteristics, a person can be recognised from face images, and fake signatures can be practised by carefully watching the signature images available in the database. We propose a cryptographic approach to encrypt the images of palmprints, faces, and signatures by an advanced Hill cipher technique for hiding the information in the images. It also provides security to these images from being attacked by the above-mentioned attacks. So, during the feature extraction, the
Learning time series for intelligent monitoring
NASA Technical Reports Server (NTRS)
Manganaris, Stefanos; Fisher, Doug
1994-01-01
We address the problem of classifying time series according to their morphological features in the time domain. In a supervised machine-learning framework, we induce a classification procedure from a set of preclassified examples. For each class, we infer a model that captures its morphological features using Bayesian model induction and the minimum message length approach to assign priors. In the performance task, we classify a time series into one of the learned classes when there is enough evidence to support that decision. Time series with sufficiently novel features, belonging to classes not present in the training set, are recognized as such. We report results from experiments in a monitoring domain of interest to NASA.
Directionality volatility in electroencephalogram time series
NASA Astrophysics Data System (ADS)
Mansor, Mahayaudin M.; Green, David A.; Metcalfe, Andrew V.
2016-06-01
We compare time series of electroencephalograms (EEGs) from healthy volunteers with EEGs from subjects diagnosed with epilepsy. The EEG time series from the healthy group are recorded during awake state with their eyes open and eyes closed, and the records from subjects with epilepsy are taken from three different recording regions of pre-surgical diagnosis: hippocampal, epileptogenic and seizure zone. The comparisons for these 5 categories are in terms of deviations from linear time series models with constant variance Gaussian white noise error inputs. One feature investigated is directionality, and how this can be modelled by either non-linear threshold autoregressive models or non-Gaussian errors. A second feature is volatility, which is modelled by Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) processes. Other features include the proportion of variability accounted for by time series models, and the skewness and the kurtosis of the residuals. The results suggest these comparisons may have diagnostic potential for epilepsy and provide early warning of seizures.
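As an illustration of the volatility part, a GARCH(1,1) model with an autoregressive mean can be fitted to a differenced EEG-like series using the third-party arch package (the package choice and model orders are assumptions of this sketch, not necessarily what the authors used):

    import numpy as np
    from arch import arch_model  # pip install arch

    rng = np.random.default_rng(4)
    x = rng.standard_normal(2048)  # stand-in for a differenced EEG trace

    am = arch_model(x, mean="AR", lags=2, vol="GARCH", p=1, q=1)
    res = am.fit(disp="off")
    print(res.params)                    # omega, alpha[1], beta[1], ...
    vol = res.conditional_volatility     # time-varying volatility estimate

Significant alpha/beta estimates on the residuals of a linear fit would signal the conditional heteroskedasticity that helps distinguish the epileptic recordings in this comparison.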
Evaluating the effectiveness of biometric access control systems
NASA Astrophysics Data System (ADS)
Lively, Valerie M.
2005-05-01
This paper describes the contribution of the National Safe Skies Alliance (Safe Skies) in operational testing of biometric access control systems under the guidance of the Transportation Security Administration (TSA). Safe Skies has been conducting operational tests of biometric access control systems on behalf of the TSA for approximately four years. The majority of this testing has occurred at McGhee Tyson Airport (TYS) in Knoxville, Tennessee. Twelve separate biometric devices (eight fingerprint devices, plus facial, iris, hand geometry, and combined fingerprint-and-iris devices) have been tested to date. Tests were conducted at a TYS administrative door and at different airports to evaluate the access control devices under normal, abnormal, and attempt-to-defeat conditions.
Clustering Financial Time Series by Network Community Analysis
NASA Astrophysics Data System (ADS)
Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio
In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.
NASA Technical Reports Server (NTRS)
Bebis, George
2013-01-01
Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
Biometrics between opacity and transparency.
Gutwirth, Serge
2007-01-01
The overall aim of the democratic constitutional state is to protect a social order in which the individual liberty of the citizen is a major concern. As a consequence, the democratic constitutional state should simultaneously and paradoxically guarantee both a high level of individual freedom and an order in which such freedom is made possible and guaranteed. Biometrics provide a strong and expressive example both of the necessity to address the issues of opacity and transparency and of the complexity of that process. Indeed, the large-scale use of biometrics does not only question the position of the individual in society; it also alters the architecture, or nature, of this society as such.
Time Series Model Identification by Estimating Information.
1982-11-01
Retinal biometrics based on Iterative Closest Point algorithm.
Hatanaka, Yuji; Tajima, Mikiya; Kawasaki, Ryo; Saito, Koko; Ogohara, Kazunori; Muramatsu, Chisako; Sunayama, Wataru; Fujita, Hiroshi
2017-07-01
The pattern of blood vessels in the eye is unique to each person because it rarely changes over time. Therefore, it is well known that retinal blood vessels are useful for biometrics. This paper describes a biometrics method using the Jaccard similarity coefficient (JSC) based on blood vessel regions in retinal image pairs. The retinal image pairs were roughly matched using the centers of their optic discs. The image pairs were then aligned using the Iterative Closest Point algorithm based on detailed blood vessel skeletons. For registration, a perspective transform was applied to the retinal images. Finally, the pairs were classified as either correct or incorrect using the JSC of the blood vessel regions in the image pairs. The proposed method was applied to temporal retinal images, which were obtained in 2009 (695 images) and 2013 (87 images). The 87 images acquired in 2013 were all from persons already examined in 2009. The accuracy of the proposed method reached 100%.
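Once two vessel masks have been aligned (here assumed already done by the ICP and perspective-transform steps), the verification decision reduces to a Jaccard similarity on binary images; the acceptance threshold below is illustrative, not the paper's operating point:

    import numpy as np

    def jaccard(mask_a, mask_b):
        # Jaccard similarity coefficient of two aligned binary vessel masks
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        union = np.logical_or(a, b).sum()
        return np.logical_and(a, b).sum() / union if union else 0.0

    def same_person(mask_a, mask_b, threshold=0.5):
        # verification decision on the aligned pair
        return jaccard(mask_a, mask_b) >= threshold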
Scale-dependent intrinsic entropies of complex time series.
Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E
2016-04-13
Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease. © 2016 The Author(s).
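For reference, a compact (slightly simplified) implementation of the classic MSE baseline that the proposed EMD-based detrending extends; note that this sketch re-scales the tolerance per coarse-grained series, whereas the canonical MSE fixes it from the scale-1 series:

    import numpy as np
    from numpy.lib.stride_tricks import sliding_window_view

    def sample_entropy(x, m=2, r=0.2):
        # SampEn(m, r): -log P(match at m+1 | match at m), Chebyshev distance
        tol = r * np.std(x)
        def matches(mm):
            t = sliding_window_view(x, mm)
            d = np.max(np.abs(t[:, None] - t[None, :]), axis=-1)
            return (np.sum(d <= tol) - len(t)) / 2  # exclude self-matches
        B, A = matches(m), matches(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    def mse(x, scales=range(1, 11)):
        # coarse-grain by non-overlapping means, then SampEn at each scale
        out = []
        for s in scales:
            n = len(x) // s
            out.append(sample_entropy(x[:n * s].reshape(n, s).mean(axis=1)))
        return np.array(out)

The authors' contribution is to replace the plain coarse-graining with empirical-mode-decomposition detrending before this entropy step.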
An Energy-Based Similarity Measure for Time Series
NASA Astrophysics Data System (ADS)
Boudraa, Abdel-Ouahab; Cexus, Jean-Christophe; Groussat, Mathieu; Brunagel, Pierre
2007-12-01
A new similarity measure for time series analysis, called SimilB, based on the cross-ΨB-energy operator (2004), is introduced. ΨB is a nonlinear measure which quantifies the interaction between two time series. Compared to the Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series by using their first and second derivatives. SimilB is well suited for both nonstationary and stationary time series, and particularly those presenting discontinuities. Some new properties of ΨB are presented. In particular, we show that ΨB as a similarity measure is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset, and compared to the CC and the ED measures.
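The published operator and the exact SimilB formula are not reproduced in the abstract; the sketch below uses one plausible discrete symmetric cross-energy operator, x'y' - 0.5(x y'' + y x''), and a correlation-style normalization, purely to convey the idea:

    import numpy as np

    def cross_energy(x, y):
        # symmetric Teager-Kaiser-type cross energy via numerical derivatives
        dx, dy = np.gradient(x), np.gradient(y)
        ddx, ddy = np.gradient(dx), np.gradient(dy)
        return dx * dy - 0.5 * (x * ddy + y * ddx)

    def similb_like(x, y):
        # normalized interaction measure (illustrative, not the paper's
        # formula); self-energies can be negative, so guard the square root
        num = np.sum(cross_energy(x, y))
        den2 = np.sum(cross_energy(x, x)) * np.sum(cross_energy(y, y))
        return num / np.sqrt(den2) if den2 > 0 else 0.0

Because first and second derivatives enter directly, such a measure reacts to agreement in slope and curvature, which is what lets it handle discontinuities better than the ED or the CC.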
Biometric characteristics of eyes with central serous chorioretinopathy.
Oh, Jong-Hyun; Oh, Jaeryung; Togloom, Ariunaa; Kim, Seong-Woo; Huh, Kuhl
2014-03-13
To investigate the biometric characteristics of eyes with idiopathic central serous chorioretinopathy (CSC). Medical records of 52 consecutive patients with unilateral CSC were reviewed. Central serous chorioretinopathy was diagnosed using spectral-domain optical coherence tomography (SD-OCT) and fluorescein angiography. Data collected for comparison with fellow eyes were refractive error, biometric measurements using partial coherence interferometry, and SD-OCT parameters. Mean time from subjective symptom onset to initial visit was 8.3 ± 12.29 weeks. Mean axial length (AL) was shorter in CSC eyes than in fellow eyes by 0.24 ± 0.379 mm (P < 0.001), and mean anterior chamber depth (ACD) was shallower in CSC eyes than in fellow eyes by 0.03 ± 0.088 mm (P = 0.021). Central serous chorioretinopathy eyes also had thicker subfoveal choroidal thickness (CT) than fellow eyes by 34.0 ± 45.93 μm (P < 0.001). Differences in spherical equivalents between CSC and fellow eyes correlated with AL differences (r = -0.690, P < 0.001) and CT differences (r = 0.473, P = 0.001). On multiple linear regression analysis, the differences in ACD between CSC and fellow eyes were significantly correlated with AL differences (P = 0.032) and symptom duration (P = 0.019). Biometric characteristics such as AL and ACD were different between eyes with CSC and fellow eyes. Variations in biometry, which correlated with CT differences, might be related to differences in refractive errors between eyes.
Detecting chaos in irregularly sampled time series.
Kulp, C W
2013-09-01
Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum, computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series; the LSP, however, is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
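SciPy exposes the LSP directly, so the core ingredient of such an algorithm can be tried in a few lines; the frequency grid and test signal below are arbitrary choices:

    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(5)
    t = np.sort(rng.uniform(0, 100, 800))              # irregular sample times
    x = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.normal(size=len(t))

    omegas = np.linspace(0.05, 4 * np.pi, 2000)        # angular frequencies
    pgram = lombscargle(t, x - x.mean(), omegas)
    print(omegas[np.argmax(pgram)] / (2 * np.pi))      # ~0.5 Hz peak

Analyzing the decay of this spectrum's peaks, rather than the DFT's, is what allows the modified algorithm to classify irregularly sampled series as chaotic or not.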
Phase walk analysis of leptokurtic time series.
Schreiber, Korbinian; Modest, Heike I; Räth, Christoph
2018-06-01
The Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are now applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. Hereby, we investigate the properties of leptokurtic time series and their influence on the Fourier phases of time series. The phase walk analysis is applied to measured and simulated intermittent time series, whose probability density distributions are approximated by power laws. We use the day-to-day returns of the Dow Jones industrial average, a synthetic time series with tailored nonlinearities mimicking the power-law behavior of the Dow Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Due to the drastically decreased computing time as compared to embedding space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can very accurately be derived and parameterized, which allows for much more precise tests on nonlinearities.
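A stripped-down Python sketch of the idea: treat unwrapped Fourier phase increments as steps of a walk and test the walk's excursion against phase-randomized surrogates (the excursion statistic and the simplified surrogate generation are assumptions of this sketch, not the paper's exact procedure):

    import numpy as np

    def phase_walk_excursion(x):
        # walk over de-meaned unwrapped phase increments; i.i.d. uniform
        # phases (the linear null) give an unbiased random walk
        phases = np.angle(np.fft.rfft(x))[1:]   # drop the DC term
        steps = np.diff(np.unwrap(phases))
        return np.max(np.abs(np.cumsum(steps - steps.mean())))

    def surrogate_pvalue(x, n=999, seed=6):
        rng = np.random.default_rng(seed)
        stat, X = phase_walk_excursion(x), np.fft.rfft(x)
        hits = 0
        for _ in range(n):
            ph = rng.uniform(0, 2 * np.pi, len(X)); ph[0] = 0.0
            s = np.fft.irfft(np.abs(X) * np.exp(1j * ph), n=len(x))
            hits += phase_walk_excursion(s) >= stat
        return (hits + 1) / (n + 1)

Because each surrogate evaluation is a single FFT plus an O(n) walk, very large surrogate ensembles are cheap, which is the computational advantage claimed over embedding-space tests.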
FATS: Feature Analysis for Time Series
NASA Astrophysics Data System (ADS)
Nun, Isadora; Protopapas, Pavlos; Sim, Brandon; Zhu, Ming; Dave, Rahul; Castro, Nicolas; Pichara, Karim
2017-11-01
FATS facilitates and standardizes feature extraction for time series data; it quickly and efficiently calculates a compilation of many existing light curve features. Users can characterize or analyze an astronomical photometric database, though this library is not necessarily restricted to the astronomical domain and can also be applied to any kind of time series data.
Use of Biometrics within Sub-Saharan Refugee Communities
2013-12-01
Biometrics typically comprises fingerprint patterns, iris pattern recognition, and facial recognition as a means of establishing an individual's identity. Biometrics creates and ... authentication because it identifies an individual based on mathematical analysis of the random pattern visible within the iris. Facial recognition is
Time-dependent limited penetrable visibility graph analysis of nonstationary time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong
2017-06-01
Recent years have witnessed the development of visibility graph theory, which allows us to analyze a time series from the perspective of complex networks. In this paper we develop a novel time-dependent limited penetrable visibility graph (TDLPVG). Two examples using nonstationary time series from RR intervals and gas-liquid flows are provided to demonstrate the effectiveness of our approach. The results of the first example suggest that our TDLPVG method allows characterizing the time-varying behaviors and classifying the heart states of healthy subjects and of those with congestive heart failure and atrial fibrillation from RR interval time series. For the second example, we infer TDLPVGs from gas-liquid flow signals and, interestingly, find that the deviation of the node degree of TDLPVGs effectively uncovers the time-varying dynamical flow behaviors of gas-liquid slug and bubble flow patterns. All these results render the TDLPVG method particularly powerful for characterizing the time-varying features underlying realistic complex systems from time series.
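The time-dependent construction is not fully specified in the abstract, but the underlying limited penetrable visibility criterion is simple to state and code (L is the penetrable distance; a TDLPVG would apply this within a sliding window):

    import numpy as np

    def lpvg_edges(y, L=1):
        # link i and j if at most L intermediate samples rise above the
        # straight sight-line between (i, y[i]) and (j, y[j])
        edges = []
        for i in range(len(y) - 1):
            for j in range(i + 1, len(y)):
                blocked = sum(
                    y[k] >= y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                    for k in range(i + 1, j))
                if blocked <= L:
                    edges.append((i, j))
        return edges

    y = np.random.default_rng(7).normal(size=50)
    degrees = np.bincount(np.ravel(lpvg_edges(y, L=1)), minlength=len(y))
    print(degrees.std())   # degree deviation, the statistic highlighted above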
Tongue prints: A novel biometric and potential forensic tool.
Radhika, T; Jeddy, Nadeem; Nithya, S
2016-01-01
The tongue is a vital internal organ well encased within the oral cavity and protected from the environment. It has unique features which differ from individual to individual and even between identical twins. Its color, shape, and surface features are characteristic of every individual, and this serves as a tool for identification. Many modes of biometric systems have come into existence, such as fingerprint, iris scan, skin color, signature verification, voice recognition, and face recognition. The search for a new, secure personal identification method has led to the use of the lingual impression, or tongue print, as a method of biometric authentication. Tongue characteristics exhibit sexual dimorphism, thus aiding in the identification of a person. Emerging as a novel biometric tool, tongue prints also hold the promise of a potential forensic tool. This review highlights the uniqueness of tongue prints and their superiority over other biometric identification systems. The various methods of tongue print collection and the classification of tongue features are also elucidated.
Efficient Algorithms for Segmentation of Item-Set Time Series
NASA Astrophysics Data System (ADS)
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
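A small Python sketch of the dynamic-programming scheme outlined above, using the union of a segment's item sets as its measure function and a symmetric-difference-based segment difference (the paper defines several alternatives; this is one simple choice):

    import numpy as np

    def seg_diff(itemsets, i, j):
        # difference between the segment's item set (union measure) and the
        # item sets of its time points, via symmetric set differences
        union = set().union(*itemsets[i:j])
        return sum(len(union ^ s) for s in itemsets[i:j])

    def optimal_segmentation(itemsets, k):
        # O(k n^2) dynamic program minimizing total segment difference
        n = len(itemsets)
        cost = [[seg_diff(itemsets, i, j) for j in range(n + 1)]
                for i in range(n)]
        D = np.full((k + 1, n + 1), np.inf)
        back = np.zeros((k + 1, n + 1), dtype=int)
        D[0, 0] = 0.0
        for seg in range(1, k + 1):
            for j in range(1, n + 1):
                for i in range(seg - 1, j):
                    c = D[seg - 1, i] + cost[i][j]
                    if c < D[seg, j]:
                        D[seg, j], back[seg, j] = c, i
        bounds, j = [], n
        for seg in range(k, 0, -1):
            bounds.append(j); j = back[seg, j]
        return sorted(bounds)   # right boundaries of the k segments

    data = [{"a", "b"}, {"a"}, {"a", "b"}, {"c"}, {"c", "d"}, {"d"}]
    print(optimal_segmentation(data, k=2))   # -> [3, 6]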
NASA Technical Reports Server (NTRS)
Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)
2015-01-01
Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
Image quality guided approach for adaptive modelling of biometric intra-class variations
NASA Astrophysics Data System (ADS)
Abboud, Ali J.; Jassim, Sabah A.
2010-04-01
The high intra-class variability of acquired biometric data can be attributed to several factors, such as the quality of the acquisition sensor (e.g., thermal), environmental conditions (e.g., lighting), and behavioural factors (e.g., changes in face pose). Such fuzziness of biometric data can cause a large difference between acquired and stored biometric data, which eventually leads to reduced performance. Many systems store multiple templates in order to account for such variations in the biometric data during the enrolment stage. The number and typicality of these templates affect system performance more than any other factor. In this paper, a novel offline approach is proposed for the systematic modelling of intra-class variability and typicality in biometric data by regularly selecting new templates from a set of available biometric images. Our proposed technique is a two-stage algorithm whereby, in the first stage, image samples are clustered in terms of their image quality profile vectors rather than their biometric feature vectors, and in the second stage a per-cluster template is selected from a small number of samples in each cluster to create the final template set. Experiments were conducted on five face image databases, and their results demonstrate the effectiveness of the proposed quality-guided approach.
Defense Biometric and Forensic Office Research, Development, Test and Evaluation Strategy
2015-01-06
investments in biometric and forensic RDT&E. From refining biometric modalities to exploring 'game-changing' forensic technologies such as rapid DNA to the ... ASD(R&E), is to identify, fund, manage and transition projects that support biometric and/or forensic requirements. In the second role, the DBFO ... forensic stakeholders cannot fund, to the COIs for consideration. Increase contacts with ASD(R&E) divisions/laboratories focused on basic research
Sensor-Generated Time Series Events: A Definition Language
Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan
2012-01-01
There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an event definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device called a posturograph is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.
Remote Sensing Time Series Product Tool
NASA Technical Reports Server (NTRS)
Predos, Don; Ryan, Robert E.; Ross, Kenton W.
2006-01-01
The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; the TSPT, however, makes the process simple and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its high temporal revisit rate (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata is used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds. The TSPT Application Programming Interface provides finer-grained control of product generation for experienced users.
Study on a Biometric Authentication Model based on ECG using a Fuzzy Neural Network
NASA Astrophysics Data System (ADS)
Kim, Ho J.; Lim, Joon S.
2018-03-01
Traditional authentication methods use numbers or graphic passwords and thus involve the risk of loss or theft. Various studies are underway regarding biometric authentication because it uses the unique biometric data of a human being. Biometric authentication technology using the ECG relies on signals that record electrical stimuli from the heart. The ECG is difficult to manipulate, and it has the advantage of enabling unrestrained measurement from sensors attached to the skin. This study concerns biometric authentication methods using the neural network with weighted fuzzy membership functions (NEWFM). In the biometric authentication process, normalization and ensemble averaging are applied during preprocessing, characteristics are extracted using Haar wavelets, and a registration process called "training" is performed in the fuzzy neural network. In the experiment, biometric authentication was performed on 73 subjects from the Physionet Database. Between 10 and 40 ECG waveforms were tested for use in the registration process, and 15 ECG waveforms were deemed the appropriate number for registration. One ECG waveform was used during the authentication stage to conduct the biometric authentication test. Upon testing the proposed biometric authentication method on the 73 subjects from the Physionet Database, the TAR was 98.32% and the FAR was 5.84%.
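For readers who want to see the shape of this preprocessing pipeline, here is a rough Python sketch of the normalization, ensemble-averaging, and Haar-wavelet feature extraction stages. It assumes the PyWavelets package; the NEWFM classifier itself is not reproduced, and the beat length and decomposition level are arbitrary illustrative values.

```python
import numpy as np
import pywt  # PyWavelets

def ecg_features(beats, level=4):
    """Preprocessing sketch: z-score normalize each beat, ensemble-average
    the beats, then keep the coarse Haar wavelet coefficients as features."""
    beats = np.asarray(beats, dtype=float)
    beats = (beats - beats.mean(axis=1, keepdims=True)) / beats.std(axis=1, keepdims=True)
    template = beats.mean(axis=0)                  # ensemble average
    coeffs = pywt.wavedec(template, 'haar', level=level)
    return np.concatenate([coeffs[0], coeffs[1]])  # approximation + coarsest detail

# Toy example: 15 noisy copies of a synthetic "beat" (as in the registration step).
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 128)
beat = np.exp(-((t - 0.5) / 0.05) ** 2)            # crude R-wave-like bump
features = ecg_features(beat + 0.05 * rng.standard_normal((15, 128)))
print(features.shape)
```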
Yang, Meixue; Liu, Bin; Zhao, Miaomiao; Li, Fan; Wang, Guoqing; Zhou, Fengfeng
2013-01-01
Although electrocardiogram (ECG) fluctuates over time and physical activity, some of its intrinsic measurements serve well as biometric features. Considering its constant availability and difficulty in being faked, the ECG signal is becoming a promising factor for biometric authentication. The majority of the currently available algorithms only work well on healthy participants. A novel normalization and interpolation algorithm is proposed to convert an ECG signal into multiple template cycles, which are comparable between any two ECGs, no matter the sampling rates or health status. The overall accuracies reach 100% and 90.11% for healthy participants and cardiovascular disease (CVD) patients, respectively.
Extending nonlinear analysis to short ecological time series.
Hsieh, Chih-hao; Anderson, Christian; Sugihara, George
2008-01-01
Nonlinearity is important and ubiquitous in ecology. Though detectable in principle, nonlinear behavior is often difficult to characterize, analyze, and incorporate mechanistically into models of ecosystem function. One obvious reason is that quantitative nonlinear analysis tools are data intensive (require long time series), and time series in ecology are generally short. Here we demonstrate a useful method that circumvents data limitation and reduces sampling error by combining ecologically similar multispecies time series into one long time series. With this technique, individual ecological time series containing as few as 20 data points can be mined for such important information as (1) significantly improved forecast ability, (2) the presence and location of nonlinearity, and (3) the effective dimensionality (the number of relevant variables) of an ecological system.
Financial Time-series Analysis: a Brief Overview
NASA Astrophysics Data System (ADS)
Chakraborti, A.; Patriarca, M.; Santhanam, M. S.
Prices of commodities or assets produce what are called time series. Different kinds of financial time series have been recorded and studied for decades. Nowadays, all transactions on a financial market are recorded, leading to a huge amount of data available, either freely on the Internet or commercially. Financial time-series analysis is of great interest to practitioners as well as to theoreticians, for making inferences and predictions. Furthermore, the stochastic uncertainties inherent in financial time series and the theory needed to deal with them make the subject especially interesting not only to economists, but also to statisticians and physicists [1]. While it would be a formidable task to make an exhaustive review of the topic, with this review we try to give a flavor of some of its aspects.
Biometric identification based on novel frequency domain facial asymmetry measures
NASA Astrophysics Data System (ADS)
Mitra, Sinjini; Savvides, Marios; Vijaya Kumar, B. V. K.
2005-03-01
In the modern world, the ever-growing need to ensure a system's security has spurred the growth of the newly emerging technology of biometric identification. The present paper introduces a novel set of facial biometrics based on quantified facial asymmetry measures in the frequency domain. In particular, we show that these biometrics work well for face images showing expression variations and have the potential to do so in the presence of illumination variations as well. A comparison of the recognition rates with those obtained from spatial domain asymmetry measures based on raw intensity values suggests that the frequency domain representation is more robust to intra-personal distortions and is a novel approach for performing biometric identification. In addition, some feature analysis based on statistical methods comparing the asymmetry measures across different individuals and across different expressions is presented.
Multiple-stage pure phase encoding with biometric information
NASA Astrophysics Data System (ADS)
Chen, Wen
2018-01-01
In recent years, many optical systems have been developed for securing information, and optical encryption/encoding has attracted more and more attention due to its marked advantages, such as parallel processing and multi-dimensional characteristics. In this paper, an optical security method is presented based on pure phase encoding with biometric information. Biometric information (such as a fingerprint) is employed as the security key rather than the plaintext used in conventional optical security systems, and multiple-stage phase-encoding-based optical systems are designed for generating several phase-only masks with biometric information. Subsequently, the extracted phase-only masks are further used in an optical setup for encoding an input image (i.e., plaintext). Numerical simulations are conducted to illustrate the validity, and the results demonstrate that high flexibility and high security can be achieved.
Estimation of coupling between time-delay systems from time series
NASA Astrophysics Data System (ADS)
Prokhorov, M. D.; Ponomarenko, V. I.
2005-07-01
We propose a method for estimation of coupling between the systems governed by scalar time-delay differential equations of the Mackey-Glass type from the observed time series data. The method allows one to detect the presence of certain types of linear coupling between two time-delay systems, to define the type, strength, and direction of coupling, and to recover the model equations of coupled time-delay systems from chaotic time series corrupted by noise. We verify our method using both numerical and experimental data.
Reconstruction of ensembles of coupled time-delay systems from time series.
Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P
2014-06-01
We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.
Time series smoother for effect detection.
You, Cheng; Lin, Dennis K J; Young, S Stanley
2018-01-01
In environmental epidemiology, it is often encountered that multiple time series data with a long-term trend, including seasonality, cannot be fully adjusted by the observed covariates. The long-term trend is difficult to separate from abnormal short-term signals of interest. This paper addresses how to estimate the long-term trend in order to recover short-term signals. Our case study demonstrates that the current spline smoothing methods can result in significant positive and negative cross-correlations from the same dataset, depending on how the smoothing parameters are chosen. To circumvent this dilemma, three classes of time series smoothers are proposed to detrend time series data. These smoothers do not require fine tuning of parameters and can be applied to recover short-term signals. The properties of these smoothers are shown with both a case study using a factorial design and a simulation study using datasets generated from the original dataset. General guidelines are provided on how to discover short-term signals from time series with a long-term trend. The benefit of this research is that a problem is identified and characteristics of possible solutions are determined.
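As a toy illustration of the detrending idea (not one of the paper's three smoother classes), the sketch below removes a long-term seasonal trend with a running median, a smoother that needs no fine parameter tuning, and recovers a short-term signal from the residual.

```python
import numpy as np

def running_median_detrend(y, window=61):
    """Estimate the long-term trend with a running median and return the
    short-term residual along with the trend estimate."""
    y = np.asarray(y, dtype=float)
    half = window // 2
    padded = np.pad(y, half, mode='edge')
    trend = np.array([np.median(padded[i:i + window]) for i in range(len(y))])
    return y - trend, trend

# Synthetic series: seasonal long-term trend plus a brief short-term signal.
n = 730
t = np.arange(n)
y = 10 + 3 * np.sin(2 * np.pi * t / 365)   # seasonality
y[400:410] += 4                             # short abnormal signal
signal, trend = running_median_detrend(y)
print(signal[395:415].round(1))             # the brief bump survives detrending
```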
Time Series Model Identification and Prediction Variance Horizon.
1980-06-01
stationary time series Y(t). In terms of the correlation function ρ(v), the three time series memory types are defined by: no memory, ρ(v) = 0 for all v; short memory, Σ_{v=1}^∞ |ρ(v)| < ∞; long memory, Σ_{v=1}^∞ |ρ(v)| = ∞. Within short memory time series there are three types whose classification in terms of correlation functions is ... Parzen, E. (1974) "Some Recent Advances in Time Series Modeling", IEEE Transactions on Automatic Control, Vol. AC-19, No. 6, December, 723-730. Parzen, E. (1976) "An
Large-scale evaluation of multimodal biometric authentication using state-of-the-art systems.
Snelick, Robert; Uludag, Umut; Mink, Alan; Indovina, Michael; Jain, Anil
2005-03-01
We examine the performance of multimodal biometric authentication systems using state-of-the-art Commercial Off-the-Shelf (COTS) fingerprint and face biometric systems on a population approaching 1,000 individuals. The majority of prior studies of multimodal biometrics have been limited to relatively low accuracy non-COTS systems and populations of a few hundred users. Our work is the first to demonstrate that multimodal fingerprint and face biometric systems can achieve significant accuracy gains over either biometric alone, even when using highly accurate COTS systems on a relatively large-scale population. In addition to examining well-known multimodal methods, we introduce new methods of normalization and fusion that further improve the accuracy.
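The classic building blocks referenced in this literature, score normalization followed by fusion, are easy to sketch. The following Python fragment uses min-max normalization and a weighted sum rule; these are standard baseline methods, not necessarily the new ones introduced in the paper, and the scores shown are invented.

```python
import numpy as np

def minmax_norm(scores):
    """Map raw matcher scores to [0, 1] (classic min-max normalization)."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def sum_fusion(face_scores, finger_scores, w=0.5):
    """Sum-rule fusion of two normalized modalities with weight w on face."""
    return w * minmax_norm(face_scores) + (1 - w) * minmax_norm(finger_scores)

# Toy genuine/impostor scores from two hypothetical COTS matchers.
face   = np.array([0.82, 0.40, 0.75, 0.30])
finger = np.array([310., 120., 280., 90.])   # different native scale
print(sum_fusion(face, finger))
```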
Time series modeling in traffic safety research.
Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue
2018-08-01
The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making.
Similarity Search in Large Collections of Biometric Data
2009-10-01
instantaneous identification of a person by converting the biometric into a digital form and then comparing it against a computerized database. They can ... combined to get reliable results. Exact matches in biometric collections have very little meaning and only a relative ordering of database objects with ... running several indices for different aspects of the data, e.g. facial features, fingerprints and palmprints of a person, together. The system then
Time series of the northeast Pacific
NASA Astrophysics Data System (ADS)
Peña, M. Angelica; Bograd, Steven J.
2007-10-01
In July 2006, the North Pacific Marine Science Organization (PICES) and Fisheries & Oceans Canada sponsored the symposium “Time Series of the Northeast Pacific: A symposium to mark the 50th anniversary of Line P”. The symposium, which celebrated 50 years of oceanography along Line P and at Ocean Station Papa (OSP), explored the scientific value of the Line P and other long oceanographic time series of the northeast Pacific (NEP). Overviews of the principal NEP time-series were presented, which facilitated regional comparisons and promoted interaction and exchange of information among investigators working in the NEP. More than 80 scientists from 8 countries attended the symposium. This introductory essay is a brief overview of the symposium and the 10 papers that were selected for this special issue of Progress in Oceanography.
Electronic Biometric Transmission Specification. Version 1.2
2006-11-08
Electronic Biometric Transmission Specification, DIN: DOD_BTF_TS_EBTS_Nov06_01.02.00. Revision History ... contains: • the ORI • a Greenwich Mean (a.k.a. Zulu or UTC) date/time stamp • a code for the software used at the point of collection/transmission ... long names and would generally include the tribe name. Subfield 1, Item 1: Character Type AS, Characters 1 to 50, Special Characters: any 7-bit non
On the feasibility of interoperable schemes in hand biometrics.
Morales, Aythami; González, Ester; Ferrer, Miguel A
2012-01-01
Personal recognition through hand-based biometrics has attracted the interest of many researchers in the last twenty years. A significant number of proposals based on different procedures and acquisition devices have been published in the literature. However, comparisons between devices and their interoperability have not been thoroughly studied. This paper tries to fill this gap by proposing procedures to improve the interoperability among different hand biometric schemes. The experiments were conducted on a database made up of 8,320 hand images acquired from six different hand biometric schemes, including a flat scanner, webcams at different wavelengths, high quality cameras, and contactless devices. Acquisitions on both sides of the hand were included. Our experiment includes four feature extraction methods which determine the best performance among the different scenarios for two of the most popular hand biometrics: hand shape and palm print. We propose smoothing techniques at the image and feature levels to reduce interdevice variability. Results suggest that comparative hand shape offers better performance in terms of interoperability than palm prints, but palm prints can be more effective when using similar sensors.
Printable, scannable biometric templates for secure documents and materials
NASA Astrophysics Data System (ADS)
Cambier, James L.; Musgrave, Clyde
2000-04-01
Biometric technology has been widely acknowledged as an effective means for enhancing private and public security through applications in physical access control, computer and computer network access control, medical records protection, banking security, public identification programs, and others. Nearly all of these applications involve use of a biometric token to control access to a physical entity or private information. There are also unique benefits to be derived from attaching a biometric template to a physical entity such as a document, package, laboratory sample, etc. Such an association allows fast, reliable, and highly accurate association of an individual person's identity to the physical entity, and can be used to enhance security, convenience, and privacy in many types of transactions. Examples include authentication of documents, tracking of laboratory samples in a testing environment, monitoring the movement of physical evidence within the criminal justice system, and authenticating the identity of both sending and receiving parties in shipment of high value parcels. A system is described which combines a biometric technology based on iris recognition with a printing and scanning technology for high-density bar codes.
Correlation between fetal mild ventriculomegaly and biometric parameters.
Fishel-Bartal, Michal; Shai, Daniel; Shina, Avi; Achiron, Reuven; Katorza, Eldad
2017-09-19
The aim of this study was to assess the correlation between fetal lateral ventricle width and biometric measurements. A prospective study on 335 fetuses, 101 fetuses with isolated mild ventriculomegaly and a control group of 234 fetuses with a normal US examination. All fetuses underwent a detailed brain ultrasound scan and a full biometric evaluation. To further compare biometric parameters, we matched, according to gestational week and gender, 91 fetuses from the study group to 91 fetuses from the control group. The mean gestational week during the exam was significantly different between the groups (29.6 weeks in the study group versus 28.3 in the control group, p = .001). The mean maternal age, obstetrical history, mode of conception, or fetal gender did not differ between the groups. After matching according to gestational age and fetal gender, the mean gestational week between the matched groups did not differ and was 29 + 5 weeks in both groups. The study group had significantly larger head circumference (p = .009), biparietal diameter (p < .001), femur length (p = .023), and estimated fetal weight (p = .024) compared with the control group. Isolated mild ventriculomegaly could be related to other larger fetal biometric measurements and does not necessarily mean a pathological condition.
A biometric authentication model using hand gesture images.
Fong, Simon; Zhuang, Yan; Fister, Iztok; Fister, Iztok
2013-10-30
A novel hand biometric authentication method based on measurements of the user's stationary hand gestures of hand sign language is proposed. The measurements of hand gestures can be acquired sequentially by a low-cost video camera. These hand signs may also carry an additional level of contextual information for use in biometric authentication. As an analogue, instead of typing a password 'iloveu' in text, which is relatively vulnerable over a communication network, a signer can encode a biometric password using a sequence of hand signs, 'i', 'l', 'o', 'v', 'e', and 'u'. Features, which are inherently fuzzy in nature, are then extracted from the hand gesture images and recognized by a classification model that tells whether the signer is who he claims to be, by examining his hand shape and the postures used in making those signs. It is believed that everybody has certain slight but unique behavioral characteristics in sign language, as well as in hand shape composition. Simple and efficient image processing algorithms are used in hand sign recognition, including intensity profiling, color histograms and dimensionality analysis, coupled with several popular machine learning algorithms. A computer simulation is conducted to investigate the efficacy of this novel biometric authentication model, which shows up to 93.75% recognition accuracy.
Privacy Preserving Facial and Fingerprint Multi-biometric Authentication
NASA Astrophysics Data System (ADS)
Anzaku, Esla Timothy; Sohn, Hosik; Ro, Yong Man
The cases of identity theft can be mitigated by the adoption of secure authentication methods. Biohashing and its variants, which utilize secret keys and biometrics, are promising methods for secure authentication; however, their shortcoming is degraded performance under the assumption that secret keys are compromised. In this paper, we extend the concept of Biohashing to multi-biometrics - facial and fingerprint traits. We chose these traits because they are widely used; however, little research attention has been given to designing privacy preserving multi-biometric systems using them. Instead of just using a single modality (facial or fingerprint), we present a framework for using both modalities. The improved performance of the proposed method, using face and fingerprint, as against either facial or fingerprint trait used in isolation, is evaluated using two chimerical bimodal databases formed from publicly available facial and fingerprint databases.
Mathematical and information maintenance of biometric systems
NASA Astrophysics Data System (ADS)
Boriev, Z.; Sokolov, S.; Nyrkov, A.; Nekrasova, A.
2016-04-01
This article describes different mathematical methods for processing biometric data. A brief overview of methods for person recognition by means of a signature is given. Mathematical solutions for a dynamic authentication method are considered. Recommendations on the use of certain mathematical methods, depending on the specific task, are provided. Based on the analysis of available software and the choice made in favor of wavelet analysis, a brief rationale for its use in the development of software for biometric personal identification is given, with a view to practical application.
ERIC Educational Resources Information Center
Gale, Doug
2006-01-01
Authentication is based on something one knows (e.g., a password), something one has (e.g., a driver's license), or something one is (e.g., a fingerprint). The last of these refers to the use of biometrics for authentication. With the blink of an eye, the touch of a finger, or the uttering of a pass-phrase, colleges and schools can now get deadly…
Developing a multimodal biometric authentication system using soft computing methods.
Malcangi, Mario
2015-01-01
Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize its applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision.
Atmospheric turbulence and sensor system effects on biometric algorithm performance
NASA Astrophysics Data System (ADS)
Espinola, Richard L.; Leonard, Kevin R.; Byrd, Kenneth A.; Potvin, Guy
2015-05-01
Biometric technologies composed of electro-optical/infrared (EO/IR) sensor systems and advanced matching algorithms are being used in various force protection/security and tactical surveillance applications. To date, most of these sensor systems have been widely used in controlled conditions with varying success (e.g., short range, uniform illumination, cooperative subjects). However the limiting conditions of such systems have yet to be fully studied for long range applications and degraded imaging environments. Biometric technologies used for long range applications will invariably suffer from the effects of atmospheric turbulence degradation. Atmospheric turbulence causes blur, distortion and intensity fluctuations that can severely degrade image quality of electro-optic and thermal imaging systems and, for the case of biometrics technology, translate to poor matching algorithm performance. In this paper, we evaluate the effects of atmospheric turbulence and sensor resolution on biometric matching algorithm performance. We use a subset of the Facial Recognition Technology (FERET) database and a commercial algorithm to analyze facial recognition performance on turbulence degraded facial images. The goal of this work is to understand the feasibility of long-range facial recognition in degraded imaging conditions, and the utility of camera parameter trade studies to enable the design of the next generation biometrics sensor systems.
Nonlinear parametric model for Granger causality of time series
NASA Astrophysics Data System (ADS)
Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano
2006-06-01
The notion of Granger causality between two time series examines if the prediction of one series could be improved by incorporating information of the other. In particular, if the prediction error of the first time series is reduced by including measurements from the second time series, then the second time series is said to have a causal influence on the first one. We propose a radial basis function approach to nonlinear Granger causality. The proposed model is not constrained to be additive in variables from the two time series and can approximate any function of these variables, still being suitable to evaluate causality. Usefulness of this measure of causality is shown in two applications. In the first application, a physiological one, we consider time series of heart rate and blood pressure in congestive heart failure patients and patients affected by sepsis: we find that sepsis patients, unlike congestive heart failure patients, show symmetric causal relationships between the two time series. In the second application, we consider the feedback loop in a model of excitatory and inhibitory neurons: we find that in this system causality measures the combined influence of couplings and membrane time constants.
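The nonlinear radial-basis-function model is the paper's contribution; the underlying variance-ratio idea is easiest to see in the linear case. Below is a minimal sketch of that linear baseline, with toy data in which y drives x; the lag order and coefficients are illustrative assumptions.

```python
import numpy as np

def granger_logratio(x, y, p=2):
    """Linear Granger sketch: compare least-squares prediction of x from its
    own p lags (restricted) vs. own lags plus p lags of y (full). A positive
    log variance ratio suggests y carries predictive info about x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    Xr = np.array([x[t - p:t] for t in range(p, n)])           # own lags
    Xf = np.hstack([Xr, np.array([y[t - p:t] for t in range(p, n)])])
    target = x[p:]
    res_r = target - Xr @ np.linalg.lstsq(Xr, target, rcond=None)[0]
    res_f = target - Xf @ np.linalg.lstsq(Xf, target, rcond=None)[0]
    return np.log(res_r.var() / res_f.var())

# Toy pair in which y drives x with a one-step delay.
rng = np.random.default_rng(1)
y = rng.standard_normal(1000)
x = np.empty(1000)
x[0] = 0.0
x[1:] = 0.8 * y[:-1] + 0.2 * rng.standard_normal(999)
print(granger_logratio(x, y), granger_logratio(y, x))   # large vs. near zero
```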
Pearson correlation estimation for irregularly sampled time series
NASA Astrophysics Data System (ADS)
Rehfeld, K.; Marwan, N.; Heitzig, J.; Kurths, J.
2012-04-01
Many applications in the geosciences call for the joint and objective analysis of irregular time series. For automated processing, robust measures of linear and nonlinear association are needed. Up to now, the standard approach would have been to reconstruct the time series on a regular grid, using linear or spline interpolation. Interpolation, however, comes with systematic side-effects, as it increases the auto-correlation in the time series. We have searched for the best method to estimate Pearson correlation for irregular time series, i.e. the one with the lowest estimation bias and variance. We adapted a kernel-based approach, using Gaussian weights. Pearson correlation is calculated, in principle, as a mean over products of previously centralized observations. In the regularly sampled case, observations in both time series were observed at the same time and thus the allocation of measurement values into pairs of products is straightforward. In the irregularly sampled case, however, measurements were not necessarily observed at the same time. Now, the key idea of the kernel-based method is to calculate weighted means of products, with the weight depending on the time separation between the observations. If the lagged correlation function is desired, the weights depend on the absolute difference between observation time separation and the estimation lag. To assess the applicability of the approach we used extensive simulations to determine the extent of interpolation side-effects with increasing irregularity of time series. We compared different approaches, based on (linear) interpolation, the Lomb-Scargle Fourier Transform, the sinc kernel and the Gaussian kernel. We investigated the role of kernel bandwidth and signal-to-noise ratio in the simulations. We found that the Gaussian kernel approach offers significant advantages and low root-mean-square errors for regular, slightly irregular and very irregular time series. We therefore conclude that it is a good choice for estimating correlation in irregularly sampled time series.
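A minimal sketch of the kernel idea: Pearson correlation as a weighted mean of products of centralized observations, with Gaussian weights on the pairwise time separations. The bandwidth h and the toy signals are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def gaussian_kernel_corr(tx, x, ty, y, h):
    """Kernel-based Pearson correlation sketch for irregularly sampled
    series: weighted mean of products of standardized observations, with
    Gaussian weights on the time separation between samples."""
    x = (np.asarray(x) - np.mean(x)) / np.std(x)
    y = (np.asarray(y) - np.mean(y)) / np.std(y)
    dt = np.subtract.outer(tx, ty)                # all pairwise time gaps
    w = np.exp(-0.5 * (dt / h) ** 2)              # Gaussian kernel weights
    return np.sum(w * np.outer(x, y)) / np.sum(w)

# Two noisy copies of the same signal, observed at different random times.
rng = np.random.default_rng(2)
tx, ty = np.sort(rng.uniform(0, 100, 200)), np.sort(rng.uniform(0, 100, 200))
x = np.sin(0.3 * tx) + 0.1 * rng.standard_normal(200)
y = np.sin(0.3 * ty) + 0.1 * rng.standard_normal(200)
print(gaussian_kernel_corr(tx, x, ty, y, h=1.0))  # close to 1
```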
Galbally, Javier; Marcel, Sébastien; Fierrez, Julian
2014-02-01
To ensure the actual presence of a real legitimate trait in contrast to a fake self-manufactured synthetic or reconstructed sample is a significant problem in biometric authentication, which requires the development of new and efficient protection measures. In this paper, we present a novel software-based fake detection method that can be used in multiple biometric systems to detect different types of fraudulent access attempts. The objective of the proposed system is to enhance the security of biometric recognition frameworks, by adding liveness assessment in a fast, user-friendly, and non-intrusive manner, through the use of image quality assessment. The proposed approach presents a very low degree of complexity, which makes it suitable for real-time applications, using 25 general image quality features extracted from one image (i.e., the same acquired for authentication purposes) to distinguish between legitimate and impostor samples. The experimental results, obtained on publicly available data sets of fingerprint, iris, and 2D face, show that the proposed method is highly competitive compared with other state-of-the-art approaches and that the analysis of the general image quality of real biometric samples reveals highly valuable information that may be very efficiently used to discriminate them from fake traits.
Characterizing time series via complexity-entropy curves
NASA Astrophysics Data System (ADS)
Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.
2017-06-01
The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q entropy) and after considering the proper generalization of the statistical complexity (q complexity), we build up a parametric curve (the q -complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
Forbidden patterns in financial time series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano
2008-03-01
The existence of forbidden patterns, i.e., certain missing sequences in a given time series, is a recently proposed instrument of potential application in the study of time series. Forbidden patterns are related to the permutation entropy, which has the basic properties of classic chaos indicators, such as the Lyapunov exponent or Kolmogorov entropy, thus allowing one to separate deterministic (usually chaotic) from random series; however, it requires fewer values of the series to be calculated, and it is suitable for use with small datasets. In this paper, the appearance of forbidden patterns is studied in different economic indicators such as stock indices (Dow Jones Industrial Average and Nasdaq Composite), NYSE stocks (IBM and Boeing), and others (ten-year Bond interest rate), to find evidence of deterministic behavior in their evolution. Moreover, the rate of appearance of the forbidden patterns is calculated, and some considerations about the underlying dynamics are suggested.
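Counting forbidden ordinal patterns is straightforward; the sketch below contrasts a chaotic logistic-map series with white noise. The pattern length d = 4 is an illustrative choice, not taken from the paper.

```python
import numpy as np
from math import factorial

def forbidden_pattern_count(series, d=4):
    """Count ordinal patterns of length d that never occur. Deterministic
    (e.g. chaotic) series leave many patterns forbidden; a random series of
    sufficient length visits all d! of them."""
    x = np.asarray(series)
    seen = {tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)}
    return factorial(d) - len(seen)

# Logistic map (deterministic chaos) vs. white noise, same length.
rng = np.random.default_rng(3)
z = np.empty(5000)
z[0] = 0.4
for i in range(1, 5000):
    z[i] = 4.0 * z[i - 1] * (1.0 - z[i - 1])
print(forbidden_pattern_count(z), forbidden_pattern_count(rng.random(5000)))
```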
Analysis of Nonstationary Time Series for Biological Rhythms Research.
Leise, Tanya L
2017-06-01
This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.
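A small sketch of the wavelet approach to a nonstationary rhythm, assuming the PyWavelets package: a Morlet continuous wavelet transform is computed, and the scale of maximal power at each time point yields an instantaneous period estimate. The scale range and the toy drifting-period signal are illustrative.

```python
import numpy as np
import pywt  # PyWavelets

def period_ridge(x, dt, scales):
    """Wavelet sketch for nonstationary rhythms: Morlet CWT power per
    (scale, time); the scale of maximal power at each time point gives an
    instantaneous period estimate (1/frequency)."""
    coeffs, freqs = pywt.cwt(x, scales, 'morl', sampling_period=dt)
    power = np.abs(coeffs) ** 2
    return 1.0 / freqs[np.argmax(power, axis=0)]

# Toy circadian-like record whose period drifts from 24 h toward 26 h.
dt = 0.5                                    # hours between samples
t = np.arange(0, 240, dt)
period = np.linspace(24.0, 26.0, t.size)
x = np.sin(2 * np.pi * np.cumsum(dt / period))
est = period_ridge(x, dt, scales=np.arange(20, 80))
print(est[100], est[-100])                  # shorter period early, longer late
```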
Demographic Analysis from Biometric Data: Achievements, Challenges, and New Frontiers.
Sun, Yunlian; Zhang, Man; Sun, Zhenan; Tan, Tieniu
2018-02-01
Biometrics is the technique of automatically recognizing individuals based on their biological or behavioral characteristics. Various biometric traits have been introduced and widely investigated, including fingerprint, iris, face, voice, palmprint, gait and so forth. Apart from identity, biometric data may convey various other personal information, covering affect, age, gender, race, accent, handedness, height, weight, etc. Among these, analysis of demographics (age, gender, and race) has received tremendous attention owing to its wide real-world applications, with significant efforts devoted and great progress achieved. This survey first presents biometric demographic analysis from the standpoint of human perception, then provides a comprehensive overview of state-of-the-art advances in automated estimation from both academia and industry. Despite these advances, a number of challenging issues continue to inhibit its full potential. We then discuss these open problems, and finally provide an outlook into the future of this very active field of research by sharing some promising opportunities.
Optical and biometric relationships of the isolated pig crystalline lens.
Vilupuru, A S; Glasser, A
2001-07-01
To investigate the interrelationships between optical and biometric properties of the porcine crystalline lens, to compare these findings with similar relationships found for the human lens, and to attempt to fit the data to a geometric model of the optical and biometric properties of the pig lens. Weight, focal length, spherical aberration, surface curvatures, thickness and diameters of 20 isolated pig lenses were measured and the equivalent refractive index was calculated. These parameters were compared and used to geometrically model the pig lens. Linear relationships were identified between many of the lens biometric and optical properties. The existence of these relationships allowed a simple geometrical model of the pig lens to be calculated which offers predictions of the optical properties. The linear relationships found and the agreement observed between measured and modeled results suggest that the pig lens conforms to a predictable, preset developmental pattern and that the optical and biometric properties are predictably interrelated.
Complex network approach to fractional time series
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manshour, Pouya
In order to extract correlation information inherited in stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with Hurst dependent fitting parameter. Further, we take into account other topological properties such as maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
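The horizontal visibility construction is simple enough to state in a few lines. In the sketch below, two time points are linked if every value strictly between them lies below both; the quadratic implementation favors clarity over speed.

```python
import numpy as np

def hvg_degrees(series):
    """Horizontal visibility graph sketch: nodes are time points; i and j
    are linked if every value strictly between them is lower than both.
    Returns the degree of each node (O(n^2) for clarity, not speed)."""
    x = np.asarray(series)
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if j == i + 1 or x[i + 1:j].max() < min(x[i], x[j]):
                deg[i] += 1
                deg[j] += 1
            if x[j] >= x[i]:   # a bar at least as tall blocks all later views
                break
    return deg

rng = np.random.default_rng(4)
deg = hvg_degrees(rng.standard_normal(2000))
print(deg.mean())   # uncorrelated noise gives a mean degree near 4
```

The mean degree of 4 for uncorrelated series is a known exact result for horizontal visibility graphs, which makes it a convenient sanity check for the implementation.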
Advanced spectral methods for climatic time series
Ghil, M.; Allen, M.R.; Dettinger, M.D.; Ide, K.; Kondrashov, D.; Mann, M.E.; Robertson, A.W.; Saunders, A.; Tian, Y.; Varadi, F.; Yiou, P.
2002-01-01
The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict climatic variability. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this review we describe the connections between time series analysis and nonlinear dynamics, discuss signal- to-noise enhancement, and present some of the novel methods for spectral analysis. The various steps, as well as the advantages and disadvantages of these methods, are illustrated by their application to an important climatic time series, the Southern Oscillation Index. This index captures major features of interannual climate variability and is used extensively in its prediction. Regional and global sea surface temperature data sets are used to illustrate multivariate spectral methods. Open questions and further prospects conclude the review.
Optical coherence tomography used for internal biometrics
NASA Astrophysics Data System (ADS)
Chang, Shoude; Sherif, Sherif; Mao, Youxin; Flueraru, Costel
2007-06-01
Traditional biometric technologies used for security and person identification essentially deal with fingerprints, hand geometry and face images. However, because all these technologies use external features of the human body, they can be easily fooled and tampered with by distorting, modifying or counterfeiting these features. Nowadays, internal biometrics, which detects the internal ID features of an object, is becoming increasingly important. Being capable of exploring under-skin structure, an optical coherence tomography (OCT) system can be used as a powerful tool for internal biometrics. We have applied fiber-optic and full-field OCT systems to detect the multiple-layer 2D images and 3D profile of fingerprints, which eventually results in higher discrimination than the traditional 2D recognition methods. More importantly, OCT-based fingerprint recognition has the ability to easily distinguish artificial fingerprint dummies by analyzing the extracted layered surfaces. Experiments show that our OCT systems successfully detected a dummy, made of plasticine, which was used to bypass a commercially available fingerprint scanning system with a false accept rate (FAR) of 100%.
Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip
2016-06-28
The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a global constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series through minimizing dynamic time warping (DTW) distance between two time series. The DTW matching path theory with global constraints is proved theoretically for our degree-pruning strategy, which is helpful to reduce the time complexity and computational cost. Our approach can achieve the optimal solution between two time series. An approximate method to the central time series of multiple time series [called as m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach by considering hierarchically merging strategy. As illustrated by the experimental results, our approaches provide better within-group sum of squares and robustness than other relevant algorithms.
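The degree-pruning scheme itself is not reproduced here, but the constrained-DTW dynamic program it builds on is compact. A sketch with an optional Sakoe-Chiba-style band as the global constraint; the band half-width is an illustrative parameter.

```python
import numpy as np

def dtw_distance(a, b, window=None):
    """Classic DTW dynamic program with an optional global constraint
    (Sakoe-Chiba band of half-width `window`), as used when averaging
    series toward a common central template."""
    n, m = len(a), len(b)
    w = max(window or max(n, m), abs(n - m))   # band must cover the diagonal
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two series tracing the same shape at different speeds.
a = np.sin(np.linspace(0, 2 * np.pi, 50))
b = np.sin(np.linspace(0, 2 * np.pi, 70))
print(dtw_distance(a, b), dtw_distance(a, b, window=10))
```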
The non-contact biometric identified bio signal measurement sensor and algorithms.
Kim, Chan-Il; Lee, Jong-Ha
2018-01-01
Recently, wearable devices have been developed for effectively measuring biological data. However, these devices suffer from tissue allergy and noise problems. To solve these problems, biometric measurement based on non-contact methods, such as face image sequencing, has been developed. This makes it possible to measure biometric data without any operation or side effects. However, it is impossible for a remote center to identify the person whose data are measured by these novel methods. In this paper, we propose a novel non-contact heart rate and blood pressure imaging system, Deep Health Eye. This system performs authentication at the same time as measuring bio signals, through a non-contact method. In the future, this system could become a convenient home bio-signal monitoring system when combined with a smart mirror.
Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum
2012-01-01
Biometrics verification can be efficiently used for intrusion detection and intruder identification in video surveillance systems. Biometrics techniques can be largely divided into traditional and the so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as face features, eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate. But it is very difficult to get traditional biometric data from a distance and without personal cooperation. Soft biometrics, although featuring less accuracy, can be used much more freely though. Recently, much research has been done on human identification using soft biometrics data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing. PMID:22919273
Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance.
Liu, Yongli; Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao
2018-01-01
Clustering time series data is of great significance since it could extract meaningful statistics and other characteristics. Especially in biomedical engineering, outstanding clustering algorithms for time series may help improve the health level of people. Considering data scale and time shifts of time series, in this paper, we introduce two incremental fuzzy clustering algorithms based on a Dynamic Time Warping (DTW) distance. By adopting Single-Pass and Online processing patterns, our algorithms can handle large-scale time series data by splitting it into a set of chunks which are processed sequentially. Besides, our algorithms select DTW to measure the distance of pair-wise time series and encourage higher clustering accuracy because DTW could determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches could yield high quality clusters and were better than all the competitors in terms of clustering accuracy.
Tongue prints in biometric authentication: A pilot study
Jeddy, Nadeem; Radhika, T; Nithya, S
2017-01-01
Background and Objectives: Biometric authentication is an important process for the identification and verification of individuals for security purposes. There are many biometric systems that are currently in use and also being researched. Tongue print is a new biometric authentication tool that is unique and cannot be easily forged because no two tongue prints are similar. The present study aims to evaluate the common morphological features of the tongue and its variations in males and females. The usefulness of alginate impression and dental cast in obtaining the lingual impression was also evaluated. Materials and Methods: The study sample included twenty participants. The participants were subjected to visual examination following which digital photographs of the dorsal surface of the tongue were taken. Alginate impressions of the tongue were made, and casts were prepared using dental stone. The photographs and the casts were analyzed by two observers separately for the surface morphology including shape, presence or absence of fissures and its pattern of distribution. Three reference points were considered to determine the shape of the tongue. Results: The most common morphological feature on the dorsum of the tongue was the presence of central fissures. Multiple vertical fissures were observed in males whereas single vertical fissure was a common finding in females. The fissures were predominantly shallow in males and deep in females. The tongue was predominantly U shaped in males and females. V-shaped tongue was observed in 25% of females. Conclusion: Tongue prints are useful in biometric authentication. The methodology used in the study is simple, easy and can be adopted by dentists on a regular basis. However, large-scale studies are required to validate the results and also identify other features of the tongue that can be used in forensics and biometric authentication process. PMID:28479712
Stochastic nature of series of waiting times.
Anvari, Mehrnaz; Aghamohammadi, Cina; Dashti-Naserabadi, H; Salehi, E; Behjat, E; Qorbani, M; Nezhad, M Khazaei; Zirak, M; Hadjihosseini, Ali; Peinke, Joachim; Tabar, M Reza Rahimi
2013-06-01
Although fluctuations in the waiting time series have been studied for a long time, some important issues such as its long-range memory and its stochastic features in the presence of nonstationarity have so far remained unstudied. Here we find that the "waiting times" series for a given increment level have long-range correlations with Hurst exponents belonging to the interval 1/2 < H < 1.
A novel biometric authentication approach using ECG and EMG signals.
Belgacem, Noureddine; Fournier, Régis; Nait-Ali, Amine; Bereksi-Reguig, Fethi
2015-05-01
Security biometrics is a secure alternative to traditional methods of identity verification of individuals, such as authentication systems based on user name and password. Recently, it has been found that the electrocardiogram (ECG) signal formed by five successive waves (P, Q, R, S and T) is unique to each individual. In fact, unlike other biometric measures, it delivers proof that the subject is alive, extra information which other biometrics cannot provide. The main purpose of this work is to present a low-cost method for online acquisition and processing of ECG signals for person authentication, and to study the possibility of providing additional information and retrieving personal data from an electrocardiogram signal to yield a reliable decision. This study explores the effectiveness of a novel biometric system resulting from the fusion of information and knowledge provided by ECG and EMG (Electromyogram) physiological recordings. It is shown that biometrics based on these ECG/EMG signals offers a novel way to robustly authenticate subjects. Five ECG databases (MIT-BIH, ST-T, NSR, PTB and ECG-ID) and several ECG signals collected in-house from volunteers were exploited. A palm-based ECG biometric system was developed where the signals are collected from the palm of the subject through a minimally intrusive one-lead ECG set-up. A total of 3750 ECG beats were used in this work. Feature extraction was performed on ECG signals using Fourier descriptors (spectral coefficients). An Optimum-Path Forest classifier was used to calculate the degree of similarity between individuals. The results obtained from the proposed approach look promising for individual authentication.
NASA Astrophysics Data System (ADS)
Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui
2014-07-01
The linear regression parameters between two time series can be different under different lengths of observation period. If we study the whole period by the sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research that presents a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequency of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
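As a sketch of how such a scheme can be realized, the following Python fragment fits a sliding-window regression, labels each window by the intervals its parameters fall in, and accumulates transitions between adjacent windows into a weighted directed edge list. The window size, bin counts and quantile-based intervals are our illustrative assumptions, not the authors' settings.

```python
import numpy as np
from collections import defaultdict

def regression_pattern_network(x, y, window=20, n_bins=3):
    """Map sliding-window regression patterns of y ~ x to a weighted digraph.

    Pattern = (slope bin, intercept bin); edge weight = transition frequency.
    Binning and window size are illustrative assumptions.
    """
    slopes, intercepts = [], []
    for start in range(len(x) - window + 1):
        xs, ys = x[start:start + window], y[start:start + window]
        b, a = np.polyfit(xs, ys, 1)          # slope, intercept
        slopes.append(b)
        intercepts.append(a)
    # Discretize parameters into equal-frequency intervals (pattern labels).
    s_bins = np.quantile(slopes, np.linspace(0, 1, n_bins + 1)[1:-1])
    i_bins = np.quantile(intercepts, np.linspace(0, 1, n_bins + 1)[1:-1])
    patterns = [(int(np.searchsorted(s_bins, b)), int(np.searchsorted(i_bins, a)))
                for b, a in zip(slopes, intercepts)]
    # Edges = transitions between the patterns of adjacent windows.
    edges = defaultdict(int)
    for p, q in zip(patterns, patterns[1:]):
        edges[(p, q)] += 1
    return edges

rng = np.random.default_rng(0)
x = rng.standard_normal(300).cumsum()
y = 0.5 * x + rng.standard_normal(300)
net = regression_pattern_network(x, y)
print(sorted(net.items(), key=lambda kv: -kv[1])[:5])  # most frequent transitions
```

From the resulting edge list, out-degrees and betweenness can be computed with any graph library to reproduce the timeline statistics described above.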
Multimodal biometric method that combines veins, prints, and shape of a finger
NASA Astrophysics Data System (ADS)
Kang, Byung Jun; Park, Kang Ryoung; Yoo, Jang-Hee; Kim, Jeong Nyeo
2011-01-01
Multimodal biometrics provides high recognition accuracy and population coverage by using various biometric features. A single finger contains finger veins, fingerprints, and finger geometry features; by using multimodal biometrics, information on these multiple features can be simultaneously obtained in a short time and their fusion can outperform the use of a single feature. This paper proposes a new finger recognition method based on the score-level fusion of finger veins, fingerprints, and finger geometry features. This research is novel in the following four ways. First, the performances of the finger-vein and fingerprint recognition are improved by using a method based on a local derivative pattern. Second, the accuracy of the finger geometry recognition is greatly increased by combining a Fourier descriptor with principal component analysis. Third, a fuzzy score normalization method is introduced; its performance is better than the conventional Z-score normalization method. Fourth, finger-vein, fingerprint, and finger geometry recognitions are combined by using three support vector machines and a weighted SUM rule. Experimental results showed that the equal error rate of the proposed method was 0.254%, which was lower than those of the other methods.
A biometric authentication model using hand gesture images
2013-01-01
A novel hand biometric authentication method based on measurements of the user's stationary hand gestures from hand sign language is proposed. The hand gestures can be acquired sequentially by a low-cost video camera. There could also be another level of contextual information, associated with these hand signs, to be used in biometric authentication. As an analogue, instead of typing a password 'iloveu' in text, which is relatively vulnerable over a communication network, a signer can encode a biometric password using a sequence of hand signs: 'i', 'l', 'o', 'v', 'e', and 'u'. Features, which are inherently fuzzy in nature, are then extracted from the hand gesture images and recognized by a classification model that tells whether the signer is who he claims to be, by examining his hand shape and the postures used in making those signs. It is believed that everybody has slight but unique behavioral characteristics in sign language, as well as different hand shape compositions. Simple and efficient image processing algorithms are used in hand sign recognition, including intensity profiling, color histogram and dimensionality analysis, coupled with several popular machine learning algorithms. Computer simulation is conducted to investigate the efficacy of this novel biometric authentication model, which shows up to 93.75% recognition accuracy. PMID:24172288
GNSS Network time series analysis
NASA Astrophysics Data System (ADS)
Normand, M.; Balodis, J.; Janpaule, I.; Haritonova, D.
2012-12-01
Time series of GNSS station results of both the EUPOS®-Riga and LatPos networks have been developed at the Institute of Geodesy and Geoinformation (University of Latvia) using Bernese v.5.0 software. The base stations were selected among the EPN and IGS stations in the surroundings of Latvia at distances of up to 700 km. The resulting time series are analysed and coordinate velocity vectors have been determined. A background map of tectonic faults helps to interpret the behaviour of the GNSS station coordinate velocity vectors in their proper environment. Outlying situations were recognized, and questions remain about the nature of some of them. The dependence on various influences has been tested.
Delay differential analysis of time series.
Lainscsek, Claudia; Sejnowski, Terrence J
2015-03-01
Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time.
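The functional-embedding idea can be illustrated with a minimal sketch: assume a two-delay candidate DDE with linear terms and one cross term, estimate its coefficients by least squares against a numerical derivative, and use the residual as a detection/classification feature. The model form, delays and test signal below are illustrative assumptions, not the models selected in the paper.

```python
import numpy as np

def fit_dde(x, dt, tau1, tau2):
    """Least-squares fit of a small DDE model
       x'(t) ~ a1*x(t-tau1) + a2*x(t-tau2) + a3*x(t-tau1)*x(t-tau2).
    Delays tau1, tau2 are in samples; returns coefficients and RMS residual.
    """
    d = np.gradient(x, dt)                       # numerical derivative
    m = max(tau1, tau2)
    X1 = x[m - tau1:len(x) - tau1]               # x(t - tau1)
    X2 = x[m - tau2:len(x) - tau2]               # x(t - tau2)
    A = np.column_stack([X1, X2, X1 * X2])
    target = d[m:]
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    rms = np.sqrt(np.mean((A @ coef - target) ** 2))
    return coef, rms

# Score candidate delay pairs on a noisy sine; the lowest-error pair (and its
# coefficients) can serve as a feature for detection or classification.
dt = 0.01
t = np.arange(0, 20, dt)
x = np.sin(2 * np.pi * t) + 0.05 * np.random.default_rng(1).standard_normal(len(t))
for taus in [(5, 13), (10, 25), (20, 50)]:
    coef, rms = fit_dde(x, dt, *taus)
    print(taus, "rms error:", round(rms, 4))
```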
A Collaborative Evaluation Framework for Biometric Connected Devices in Healthcare.
Farnia, Troskah; Jaulent, Marie Christine; Marchand, Guillaume; Yasini, Mobin
2017-01-01
A large number of biometric connected devices are currently available, with a variety of designs. Healthcare users cannot easily choose the reliable ones that correspond best to their healthcare problems. The existing evaluation methods do not consider aspects of connectivity and healthcare usage at the same time. In this study, a collaborative evaluation framework for biometric connected devices in healthcare usage is proposed. This framework contains six dimensions: medical validity, technical reliability, usability, ergonomy, legal compliance, and accuracy of measurements. In a first step, these dimensions were assessed through a self-administered questionnaire answered by the stakeholders (patients, health professionals, payers, and manufacturers). A case study was then carried out in a second step to test the framework in a project of telemonitoring for heart failure patients. The results are in favor of the efficiency of the proposed framework as a decision-making tool in healthcare usage.
Characterizing artifacts in RR stress test time series.
Astudillo-Salinas, Fabian; Palacio-Baus, Kenneth; Solano-Quinde, Lizandro; Medina, Ruben; Wong, Sara
2016-08-01
Electrocardiographic stress test records contain many artifacts. In this paper we explore a simple method to characterize the amount of artifacts present in unprocessed RR stress test time series. Four time series classes were defined: Very good lead, Good lead, Low quality lead and Useless lead. Sixty-five 8-lead ECG stress test records were analyzed. First, the RR time series were annotated by two experts. The automatic methodology is based on dividing the RR time series into non-overlapping windows. Each window is marked as noisy whenever it exceeds an established standard deviation threshold (SDT). Series are classified according to the percentage of windows that exceed a given value, based upon the first manual annotation. Different SDTs were explored. Results show that an SDT close to 20% (as a percentage of the mean) provides the best results. The coincidence between the annotators' classifications is 70.77%, whereas the coincidence between the second annotator and the best-matching automatic method is larger than 63%. Leads classified as Very good leads and Good leads could be combined to improve automatic heartbeat labeling.
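A minimal sketch of the windowing scheme follows; the window length and the class cut-offs are our own assumptions (the abstract fixes only the ~20% SDT).

```python
import numpy as np

def classify_rr_lead(rr, window=30, sd_threshold=0.20,
                     noisy_limits=(0.05, 0.20, 0.50)):
    """Label an RR series by the fraction of noisy windows.

    A window is noisy when its standard deviation exceeds sd_threshold
    times the series mean (the value the paper found to work best);
    the class cut-offs in noisy_limits are illustrative assumptions.
    """
    sdt = sd_threshold * np.mean(rr)
    n_win = len(rr) // window
    windows = rr[:n_win * window].reshape(n_win, window)
    noisy_frac = np.mean(windows.std(axis=1) > sdt)
    labels = ["Very good lead", "Good lead", "Low quality lead", "Useless lead"]
    idx = int(np.searchsorted(noisy_limits, noisy_frac, side="right"))
    return labels[idx], noisy_frac

rng = np.random.default_rng(2)
rr = 0.8 + 0.02 * rng.standard_normal(600)      # clean RR intervals (s)
rr[200:260] += 0.4 * rng.standard_normal(60)    # simulated artifact burst
print(classify_rr_lead(rr))
```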
Time series, periodograms, and significance
NASA Astrophysics Data System (ADS)
Hernandez, G.
1999-05-01
The geophysical literature shows a wide and conflicting usage of methods employed to extract meaningful information on coherent oscillations from measurements. This makes it difficult, if not impossible, to relate the findings reported by different authors. Therefore, we have undertaken a critical investigation of the tests and methodology used for determining the presence of statistically significant coherent oscillations in periodograms derived from time series. Statistical significance tests are only valid when performed on the independent frequencies present in a measurement. Both the number of possible independent frequencies in a periodogram and the significance tests are determined by the number of degrees of freedom, which is the number of true independent measurements, present in the time series, rather than the number of sample points in the measurement. The number of degrees of freedom is an intrinsic property of the data, and it must be determined from the serial coherence of the time series. As part of this investigation, a detailed study has been performed which clearly illustrates the deleterious effects that the apparently innocent and commonly used processes of filtering, de-trending, and tapering of data have on periodogram analysis and the consequent difficulties in the interpretation of the statistical significance thus derived. For the sake of clarity, a specific example of actual field measurements containing unevenly-spaced measurements, gaps, etc., as well as synthetic examples, have been used to illustrate the periodogram approach, and pitfalls, leading to the (statistical) significance tests for the presence of coherent oscillations. Among the insights of this investigation are: (1) the concept of a time series being (statistically) band limited by its own serial coherence and thus having a critical sampling rate which defines one of the necessary requirements for the proper statistical design of an experiment; (2) the design of a critical
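For a white-noise null, the standard false-alarm level for the largest of M independent periodogram ordinates is z = -ln(1 - (1 - alpha)^(1/M)). The sketch below applies it, with the caveat the authors stress: M must reflect the series' true degrees of freedom (its serial coherence), not the raw number of FFT bins; the default used here is only a placeholder assumption.

```python
import numpy as np

def periodogram_with_threshold(x, dt, alpha=0.01, n_indep=None):
    """One-sided periodogram plus a (1 - alpha) false-alarm power level.

    Under a white-noise null the normalized ordinates are exponentially
    distributed, so the threshold for the maximum of M independent
    ordinates is z = -ln(1 - (1 - alpha)**(1/M)), scaled by the mean
    noise power.  n_indep should be set from the data's degrees of
    freedom; the bin count is used below only as a stand-in default.
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), dt)
    m = n_indep if n_indep is not None else len(freqs) - 1
    z = -np.log(1.0 - (1.0 - alpha) ** (1.0 / m))
    threshold = z * np.mean(power[1:])
    return freqs, power, threshold

t = np.arange(512) * 1.0
x = np.sin(2 * np.pi * 0.1 * t) + np.random.default_rng(3).standard_normal(512)
freqs, power, thr = periodogram_with_threshold(x, 1.0)
print("peaks above threshold at f =", freqs[power > thr])
```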
NASA Astrophysics Data System (ADS)
Ricci, R.; Chollet, G.; Crispino, M. V.; Jassim, S.; Koreman, J.; Olivar-Dimas, M.; Garcia-Salicetti, S.; Soria-Rodriguez, P.
2006-05-01
This article presents an overview of the SecurePhone project, with an account of the first results obtained. SecurePhone's primary aim is to realise a mobile phone prototype - the 'SecurePhone' - in which biometric authentication enables users to carry out secure, dependable transactions over a mobile network. The SecurePhone is based on a commercial PDA-phone, supplemented with specific software modules and a customised SIM card. It integrates in a single environment a number of advanced features: access to cryptographic keys through strong multimodal biometric authentication; appending and verification of digital signatures; and real-time exchange and interactive modification of e-signed documents and voice recordings. SecurePhone's 'biometric recogniser' is based on original research. A fused combination of three different biometric methods - speaker, face and handwritten signature verification - is exploited, with no need for dedicated hardware components. The adoption of non-intrusive, psychologically neutral biometric techniques is expected to mitigate the rejection problems that often inhibit the social use of biometrics, and to speed up the spread of e-signature technology. Successful biometric authentication grants access to SecurePhone's built-in e-signature services through a user-friendly interface. Special emphasis is accorded to the definition of a trustworthy security chain model covering all aspects of system operation. The SecurePhone is expected to boost m-commerce and open new scenarios for m-business and m-work, by changing the way people interact and by improving trust and confidence in information technologies, often considered intimidating and difficult to use. Exploitation plans will also explore other application domains (physical and logical access control, secured mobile communications).
A novel weight determination method for time series data aggregation
NASA Astrophysics Data System (ADS)
Xu, Paiheng; Zhang, Rong; Deng, Yong
2017-09-01
Aggregation in time series is of great importance in time series smoothing, prediction and other analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
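A compact sketch of the two operators and their linear combination follows; the decay factor and the mixing coefficient are illustrative assumptions, and the exponential decay stands in for the full IOWA weighting scheme.

```python
import numpy as np

def visibility_degrees(x):
    """Degree of each point in the natural visibility graph of series x."""
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            between = np.arange(i + 1, j)
            # j is visible from i if every point between lies below the line i-j
            line = x[j] + (x[i] - x[j]) * (j - between) / (j - i)
            if between.size == 0 or np.all(x[between] < line):
                deg[i] += 1
                deg[j] += 1
    return deg

def combined_weights(x, decay=0.9, mix=0.5):
    """Linear combination of time-decay (IOWA-style) weights and
    visibility-graph-degree (VGA) weights; decay and mix are illustrative."""
    n = len(x)
    w_iowa = decay ** np.arange(n - 1, -1, -1)   # newer points weigh more
    w_iowa /= w_iowa.sum()
    deg = visibility_degrees(x).astype(float)
    w_vga = deg / deg.sum()                      # important vertices weigh more
    return mix * w_iowa + (1 - mix) * w_vga

x = np.array([4.0, 7.0, 5.0, 6.0, 9.0, 3.0, 8.0])
w = combined_weights(x)
print("weights:", np.round(w, 3), " aggregate:", round(float(w @ x), 3))
```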
Entropy of electromyography time series
NASA Astrophysics Data System (ADS)
Kaufman, Miron; Zurcher, Ulrich; Sung, Paul S.
2007-12-01
A nonlinear analysis based on Renyi entropy is applied to electromyography (EMG) time series from back muscles. The time dependence of the entropy of the EMG signal exhibits a crossover from a subdiffusive regime at short times to a plateau at longer times. We argue that this behavior characterizes complex biological systems. The plateau value of the entropy can be used to differentiate between healthy and low back pain individuals.
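A minimal sketch of this kind of analysis, assuming a histogram estimator of the order-q Renyi entropy H_q = ln(Σ p_i^q)/(1 - q) evaluated over windows of increasing length; the crossover-to-plateau behavior would appear as H leveling off at large windows. The surrogate signal and binning are illustrative, not EMG data.

```python
import numpy as np

def renyi_entropy(samples, q=2.0, bins=32):
    """Histogram estimate of the order-q Renyi entropy H_q = ln(sum p^q)/(1-q)."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    if q == 1.0:
        return float(-np.sum(p * np.log(p)))     # Shannon limit
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

def entropy_vs_time(signal, q=2.0, n_points=20):
    """Renyi entropy of the amplitude distribution over growing windows,
    exposing a short-time growth regime followed by a plateau."""
    lengths = np.unique(
        np.logspace(1.5, np.log10(len(signal)), n_points).astype(int))
    return lengths, [renyi_entropy(signal[:L], q) for L in lengths]

rng = np.random.default_rng(4)
emg_like = 0.01 * np.cumsum(rng.standard_normal(5000)) + rng.standard_normal(5000)
for L, H in zip(*entropy_vs_time(emg_like)):
    print(f"window={L:5d}  H2={H:.3f}")
```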
Martín-Serrano, María José; Roman-Ortiz, Carmen; Villa-Sáez, M Luz; Labrador-Castellanos, M Purificación; Blanco-Carrasco, Rosario; Lozano-Ballesteros, Felicidad; Pedraza-Martín, Carmen; José-Herrero, M Teresa San; López-Ropero, Ana M; Tenías Burillo, José María
2014-01-01
To estimate in patients awaiting cataract surgery the concordance and interchangeability of axial eye length measurements performed with the aid of various biometric methods (optical or ultrasonic) by different operators (nurses) at different times during the period prior to surgery. We selected 182 consecutive eyes from 91 patients. Ocular axial length was measured with the aid of 2 methods (IOLMaster® and Ocuscan®) by 9 randomly allocated technicians at 2 different times during the waiting period. The concordance between measurements was evaluated by means of the intraclass correlation coefficient (ICC); the interchangeability of the results was assessed with Bland-Altman plots and Passing-Bablok regression. The measurements were consistent between biometric methods (ICC 0.975, 95% confidence interval [CI] 0.968 to 0.980) and measurement dates (ICC 0.996, 95% CI 0.995 to 0.997). Interobserver agreement was more heterogeneous (ICC range 0.844 to 0.998). No systematic errors were observed among the various biometric methods and measurement dates. Because measurement of axial length in phakic patients may be technician-dependent, the technician's experience should be noted in the protocols of ophthalmology services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schulz, Douglas A.
2007-10-08
A biometric system suitable for validating user identity using only mouse movements and no specialized equipment is presented. Mouse curves (mouse movements with little or no pause between them) are individually classified and used to develop classification histograms, which are representative of an individual's typical mouse use. These classification histograms can then be compared to validate identity. This classification approach is suitable for providing continuous identity validation during an entire user session.
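A sketch of the histogram idea under our own assumptions about the curve features (net direction and path length; the report's actual curve classifier is not specified here): movements are split into curves at pauses, each curve is binned, and two users' histograms are compared with cosine similarity as an identity score.

```python
import numpy as np

def curve_histogram(events, pause=0.3, n_dir=8, n_len=4, len_edges=(50, 150, 400)):
    """Build a user's mouse-curve classification histogram.

    events: array of (t, x, y) rows.  Movements separated by pauses longer
    than `pause` seconds are split into curves; each curve is classified by
    its net direction (n_dir sectors) and path length (n_len bins).  This
    binning scheme is an illustrative stand-in for the paper's classifier.
    """
    t, xy = events[:, 0], events[:, 1:]
    breaks = np.where(np.diff(t) > pause)[0] + 1
    hist = np.zeros((n_dir, n_len))
    for curve in np.split(xy, breaks):
        if len(curve) < 3:
            continue
        steps = np.diff(curve, axis=0)
        length = np.sum(np.hypot(steps[:, 0], steps[:, 1]))
        dx, dy = curve[-1] - curve[0]
        sector = int((np.arctan2(dy, dx) + np.pi) / (2 * np.pi) * n_dir) % n_dir
        hist[sector, int(np.searchsorted(len_edges, length))] += 1
    return hist / max(hist.sum(), 1)

def histogram_similarity(h1, h2):
    """Cosine similarity between two classification histograms (identity score)."""
    return float(np.sum(h1 * h2) /
                 (np.linalg.norm(h1) * np.linalg.norm(h2) + 1e-12))

rng = np.random.default_rng(12)
t = np.arange(400) * 0.05
t[100:] += 1.0; t[200:] += 1.0; t[300:] += 1.0   # three long pauses
xy = np.cumsum(5.0 * rng.standard_normal((400, 2)), axis=0)
h_user = curve_histogram(np.column_stack([t, xy]))
print("similarity to self:", histogram_similarity(h_user, h_user))
```

Continuous validation then amounts to recomputing the session histogram as new curves arrive and thresholding its similarity to the enrolled one.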
Recurrent Neural Network Applications for Astronomical Time Series
NASA Astrophysics Data System (ADS)
Protopapas, Pavlos
2017-06-01
The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize for irregular time series. In this talk, I will describe two Recurrent Neural Network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), for predicting irregular time series. Feature engineering along with non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations using the error estimates from astronomical light curves. In addition, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to correctly set hyperparameters for a stable and performant solution: since tuning is a common obstacle for ESNs, we optimize ESN hyperparameters using Bayesian optimization with Gaussian Process priors. This automates the tuning procedure, enabling users to employ the power of RNNs without needing an in-depth understanding of it.
Visualizing frequent patterns in large multivariate time series
NASA Astrophysics Data System (ADS)
Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.
2011-01-01
The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.
Robust extrema features for time-series data analysis.
Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N
2013-06-01
The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for analysis of time-series data have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" from either domain knowledge or intuition, we explicitly optimize the filter based on training time series to optimize robustness of the extracted extrema features. We demonstrate further that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time series subsequence matching establish the merits of the proposed algorithm.
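The pipeline can be sketched as filter-then-threshold. In the sketch below, a fixed Gaussian kernel stands in for the paper's trained, robustness-optimized filter (obtaining that filter requires solving the eigenvalue problem described above), and the prominence threshold is an illustrative assumption.

```python
import numpy as np

def robust_extrema(x, sigma=3.0, min_prominence=0.5):
    """Filter-then-threshold extrema extraction.

    Smooths with a fixed Gaussian kernel (a stand-in for an optimized
    filter), finds local extrema of the smoothed series, then keeps an
    extremum only if its amplitude differs from the previously kept
    extremum by at least min_prominence.
    """
    half = int(4 * sigma)
    k = np.exp(-0.5 * (np.arange(-half, half + 1) / sigma) ** 2)
    s = np.convolve(x, k / k.sum(), mode="same")
    d = np.diff(s)
    idx = np.where(np.sign(d[:-1]) * np.sign(d[1:]) < 0)[0] + 1  # local extrema
    kept = []
    for i in idx:
        if not kept or abs(s[i] - s[kept[-1]]) >= min_prominence:
            kept.append(i)
    return np.array(kept), s

rng = np.random.default_rng(5)
x = np.sin(np.linspace(0, 6 * np.pi, 600)) + 0.2 * rng.standard_normal(600)
peaks, smoothed = robust_extrema(x)
print("extrema kept at indices:", peaks[:10])
```

The kept indices and their smoothed amplitudes form the extrema feature sequence that subsequence matching would then encode and compare.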
Face biometrics with renewable templates
NASA Astrophysics Data System (ADS)
van der Veen, Michiel; Kevenaar, Tom; Schrijen, Geert-Jan; Akkermans, Ton H.; Zuo, Fei
2006-02-01
In recent literature, privacy protection technologies for biometric templates were proposed. Among these is the so-called helper-data system (HDS) based on reliable component selection. In this paper we integrate this approach with face biometrics such that we achieve a system in which the templates are privacy protected, and multiple templates can be derived from the same facial image for the purpose of template renewability. Extracting binary feature vectors forms an essential step in this process. Using the FERET and Caltech databases, we show that this quantization step does not significantly degrade the classification performance compared to, for example, traditional correlation-based classifiers. The binary feature vectors are integrated in the HDS leading to a privacy protected facial recognition algorithm with acceptable FAR and FRR, provided that the intra-class variation is sufficiently small. This suggests that a controlled enrollment procedure with a sufficient number of enrollment measurements is required.
Hand biometric recognition based on fused hand geometry and vascular patterns.
Park, GiTae; Kim, Soowon
2013-02-28
A hand biometric authentication method based on measurements of the user's hand geometry and vascular pattern is proposed. To acquire the hand geometry, the thickness of the side view of the hand, the K-curvature with a hand-shaped chain code, the lengths and angles of the finger valleys, and the lengths and profiles of the fingers were used; for the vascular pattern, the direction-based vascular-pattern extraction method was used. Thus, a new multimodal biometric approach is proposed. The proposed multimodal biometric system uses only one image to extract the feature points, so it can be configured for low-cost devices. Our multimodal approach combines the hand-geometry (the side view of the hand and the back of the hand) and vascular-pattern recognition results at the score level. The results of our study showed that the equal error rate of the proposed system was 0.06%. PMID:23449119
Analysis and generation of groundwater concentration time series
NASA Astrophysics Data System (ADS)
Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae
2018-01-01
Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.
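A simplified generator along the lines described: a deterministic trend plus amplitude-modulated AR(1) noise whose parameter varies in time. The specific trend, modulation and phi(t) profiles below are illustrative assumptions, not the regression terms fitted in the paper.

```python
import numpy as np

def generate_concentration_series(n=1000, seed=6):
    """Surrogate concentration series: deterministic trend plus an
    amplitude-modulated AR(1) noise with a time-varying parameter.

    The trend, amplitude modulation and phi(t) profiles are illustrative
    choices, not the authors' fitted regression terms.
    """
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 1.0, n)
    trend = 1.0 - np.exp(-5.0 * t)               # preasymptotic rise
    phi = 0.5 + 0.4 * t                          # AR(1) parameter grows in time
    amp = 0.05 * (1.0 + 2.0 * t)                 # fluctuation amplitude grows too
    eps = np.zeros(n)
    for k in range(1, n):
        eps[k] = phi[k] * eps[k - 1] + rng.standard_normal()
    return trend + amp * eps / np.std(eps)

series = generate_concentration_series()
print(series[:5].round(3))
```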
Transition Icons for Time-Series Visualization and Exploratory Analysis.
Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa
2018-03-01
The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets-postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
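The core encoding step can be sketched as follows: z-normalize, reduce by piecewise aggregate approximation, symbolize with Gaussian equiprobable breakpoints, and compile a normalized transition-frequency matrix, which is the quantity a transition icon renders. Icon layout and alignment are omitted here, and the word length, alphabet size and surrogate data are illustrative assumptions.

```python
import numpy as np

def sax_symbols(x, word_len=64, alphabet=4):
    """Z-normalize, piecewise-aggregate to word_len segments, then map to
    `alphabet` symbols using Gaussian equiprobable breakpoints."""
    x = (x - np.mean(x)) / (np.std(x) + 1e-12)
    paa = np.array([seg.mean() for seg in np.array_split(x, word_len)])
    breakpoints = {3: [-0.43, 0.43], 4: [-0.67, 0.0, 0.67]}[alphabet]
    return np.searchsorted(breakpoints, paa)

def transition_frequencies(symbols, alphabet=4):
    """Normalized symbol-to-symbol transition matrix: the 'bag of patterns'
    behind one transition icon."""
    m = np.zeros((alphabet, alphabet))
    for a, b in zip(symbols[:-1], symbols[1:]):
        m[a, b] += 1
    return m / max(m.sum(), 1)

rng = np.random.default_rng(7)
pain_scores = np.clip(5 + np.cumsum(rng.normal(0, 0.5, 500)), 0, 10)
icon = transition_frequencies(sax_symbols(pain_scores))
print(np.round(icon, 3))
```

Group comparison then reduces to aligning and contrasting these matrices, e.g., one per treatment arm or demographic stratum.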
Multiscale structure of time series revealed by the monotony spectrum.
Vamoş, Călin
2017-03-01
Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
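A minimal sketch of the spectrum, assuming repeated window-3 moving averages play the role of the 'successive averagings'; each pass yields one (mean local time scale, mean amplitude) point, and maxima of amplitude versus scale indicate the dominant time scales.

```python
import numpy as np

def monotonic_segments(x):
    """Mean amplitude and mean duration of the monotonic segments of x."""
    sign = np.sign(np.diff(x))
    sign[sign == 0] = 1                          # treat flat steps as continuing
    turn = np.where(sign[1:] != sign[:-1])[0] + 1
    bounds = np.concatenate([[0], turn, [len(x) - 1]])
    amps = np.abs(np.diff(x[bounds]))            # segment amplitude swings
    durs = np.diff(bounds)                       # segment durations (local scales)
    return amps.mean(), durs.mean()

def monotony_spectrum(x, n_averagings=10, window=3):
    """(mean amplitude, mean local time scale) pairs under successive
    moving-average smoothings of the series."""
    spectrum = []
    for _ in range(n_averagings):
        spectrum.append(monotonic_segments(x))
        kernel = np.ones(window) / window
        x = np.convolve(x, kernel, mode="valid")
    return np.array(spectrum)

rng = np.random.default_rng(8)
x = np.sin(np.linspace(0, 8 * np.pi, 2000)) + 0.3 * rng.standard_normal(2000)
for amp, dur in monotony_spectrum(x):
    print(f"mean scale={dur:7.2f}  mean amplitude={amp:.3f}")
```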
Reference charts for fetal biometric parameters in twin pregnancies according to chorionicity.
Araujo Júnior, Edward; Ruano, Rodrigo; Javadian, Pouya; Martins, Wellington P; Elito, Julio; Pires, Claudio Rodrigues; Zanforlin Filho, Sebastião Marques
2014-04-01
The objective of this article is to determine reference values for fetal biometric parameters in twin pregnancies and to compare these values between monochorionic and dichorionic pregnancies. A retrospective cross-sectional study was conducted among 157 monochorionic and 176 dichorionic twin pregnancies between 14 and 38 weeks of gestation. Biometric measurements included the biparietal diameter (BPD), abdominal circumference (AC), femur length (FL) and estimated fetal weight (EFW). To evaluate the correlation between biometric parameters and gestational age, polynomial regression models were created, with adjustments using the coefficient of determination (R(2)). Comparison between monochorionic and dichorionic pregnancies was performed using analysis of covariance. The mean BPD, AC, FL and EFW for the dichorionic pregnancies were 56.16 mm, 191.1 mm, 41.08 mm and 816.1 g, respectively. The mean BPD, AC, FL and EFW for the monochorionic pregnancies were 57.14 mm, 184.2 mm, 39.29 mm and 723.4 g, respectively. There was a statistically significant difference between monochorionic and dichorionic pregnancies for all the biometric parameters (BPD p = 0.012; AC p = 0.047; FL p = 0.007; EFW p = 0.011). Reference curves of biometric parameters in twin pregnancies were determined. Biometric parameters were statistically different between monochorionic and dichorionic pregnancies.
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of the numbers of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure that, in connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in input time series.
Data imputation analysis for Cosmic Rays time series
NASA Astrophysics Data System (ADS)
Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.
2017-05-01
The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since data loss is due to mechanical and human failure or technical problems and to the different periods of operation of GCR stations. The aim of this study was to perform multiple dataset imputation in order to reconstruct the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% of missing data compared to the observed ROME series, with 50 replicates, using the CLMX station as a proxy for allocating these scenarios. Three different methods for monthly dataset imputation were selected: AMELIA II, which runs the bootstrap Expectation Maximization algorithm; MICE, which runs an algorithm via Multivariate Imputation by Chained Equations; and MTSDI, an Expectation Maximization algorithm-based method for imputation of missing values in multivariate normal time series. The synthetic time series were compared with the observed ROME series and evaluated using several skill measures, such as RMSE, NRMSE, Agreement Index, R, R2, F-test and t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increases in the number of gaps generate loss of quality of the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% of missing data for imputation of monthly averages, no more than this. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed reconstructing 43 time series.
Algorithm for Compressing Time-Series Data
NASA Technical Reports Server (NTRS)
Hawkins, S. Edward, III; Darlington, Edward Hugo
2012-01-01
An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
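The block-wise fitting is easy to reproduce with NumPy's Chebyshev utilities. In this sketch the block size and degree are fixed illustrative choices, whereas the flight algorithm would choose them against an error budget, relying on the equal-error and min-max properties noted above.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def compress_block(block, degree=8):
    """Fit one fitting interval with a Chebyshev series; the coefficient
    vector is the compressed representation of the block."""
    t = np.linspace(-1.0, 1.0, len(block))       # map the interval onto [-1, 1]
    return C.chebfit(t, block, degree)

def decompress_block(coeffs, n):
    """Evaluate the fitted Chebyshev series back on n sample points."""
    t = np.linspace(-1.0, 1.0, n)
    return C.chebval(t, coeffs)

# Example: ~7x compression of a smooth stream split into 64-sample blocks
# (9 coefficients kept per 64 samples).
stream = np.sin(np.linspace(0, 20, 1024)) * np.exp(-np.linspace(0, 2, 1024))
blocks = stream.reshape(-1, 64)
coeffs = [compress_block(b) for b in blocks]
recon = np.concatenate([decompress_block(c, 64) for c in coeffs])
print("max abs reconstruction error:", float(np.max(np.abs(recon - stream))))
```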
A proposed simple method for measurement in the anterior chamber angle: biometric gonioscopy.
Congdon, N G; Spaeth, G L; Augsburger, J; Klancnik, J; Patel, K; Hunter, D G
1999-11-01
To design a system of gonioscopy that will allow greater interobserver reliability and more clearly defined screening cutoffs for angle closure than current systems while being simple to teach and technologically appropriate for use in rural Asia, where the prevalence of angle-closure glaucoma is highest. Clinic-based validation and interobserver reliability trial. Study 1: 21 patients 18 years of age and older recruited from a university-based specialty glaucoma clinic; study 2: 32 patients 18 years of age and older recruited from the same clinic. In study 1, all participants underwent conventional gonioscopy by an experienced observer (GLS) using the Spaeth system and in the same eye also underwent Scheimpflug photography, ultrasonographic measurement of anterior chamber depth and axial length, automatic refraction, and biometric gonioscopy with measurement of the distance from iris insertion to Schwalbe's line using a reticule based in the slit-lamp ocular. In study 2, all participants underwent both conventional gonioscopy and biometric gonioscopy by an experienced gonioscopist (NGC) and a medical student with no previous training in gonioscopy (JK). Study 1: The association between biometric gonioscopy and conventional gonioscopy, Scheimpflug photography, and other factors known to correlate with the configuration of the angle. Study 2: Interobserver agreement using biometric gonioscopy compared to that obtained with conventional gonioscopy. In study 1, there was an independent, monotonic, statistically significant relationship between biometric gonioscopy and both Spaeth angle (P = 0.001, t test) and Spaeth insertion (P = 0.008, t test) grades. Biometric gonioscopy correctly identified six of six patients with occludable angles according to Spaeth criteria. Biometric gonioscopic grade was also significantly associated with the anterior chamber angle as measured by Scheimpflug photography (P = 0.005, t test). In study 2, the intraclass correlation coefficient
Layered Ensemble Architecture for Time Series Forecasting.
Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin
2016-01-01
Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This indicates LEA's emphasis on accuracy of the networks. The proposed architecture has been tested extensively on time series data of the NN3 and NN5 competitions. It has also been tested on several standard benchmark time series data. In terms of forecasting accuracy, our experimental results have revealed clearly that LEA is better than other ensemble and nonensemble methods.
Time series regression studies in environmental epidemiology.
Bhaskaran, Krishnan; Gasparrini, Antonio; Hajat, Shakoor; Smeeth, Liam; Armstrong, Ben
2013-08-01
Time series regression studies have been widely used in environmental epidemiology, notably in investigating the short-term associations between exposures such as air pollution, weather variables or pollen, and health outcomes such as mortality, myocardial infarction or disease-specific hospital admissions. Typically, for both exposure and outcome, data are available at regular time intervals (e.g. daily pollution levels and daily mortality counts) and the aim is to explore short-term associations between them. In this article, we describe the general features of time series data, and we outline the analysis process, beginning with descriptive analysis, then focusing on issues in time series regression that differ from other regression methods: modelling short-term fluctuations in the presence of seasonal and long-term patterns, dealing with time varying confounding factors and modelling delayed ('lagged') associations between exposure and outcome. We finish with advice on model checking and sensitivity analysis, and some common extensions to the basic model.
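A self-contained sketch of such a model: simulate daily counts and fit a Poisson regression (log link) by iteratively reweighted least squares, with harmonic terms controlling seasonality and a one-day lagged exposure. The data and model below are synthetic illustrations, not the article's case study, and a real analysis would add splines for long-term trend and confounders.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Poisson regression (log link) via iteratively reweighted least squares.
    Assumes column 0 of X is the intercept."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(np.mean(y) + 1e-12)         # stable starting point
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(np.clip(eta, -20, 20))       # guard against overflow
        z = eta + (y - mu) / mu                  # working response
        W = mu                                   # Poisson variance = mean
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

rng = np.random.default_rng(10)
days = np.arange(2 * 365)
pollution = 10 + 3 * np.sin(2 * np.pi * days / 365) + rng.standard_normal(len(days))
lag1 = np.roll(pollution, 1)                     # yesterday's exposure
# Simulated daily deaths: seasonal baseline plus a lag-1 pollution effect
y = rng.poisson(np.exp(2.0 + 0.3 * np.cos(2 * np.pi * days / 365) + 0.02 * lag1))
# Design: intercept, annual harmonics (seasonality control), lagged exposure
X = np.column_stack([np.ones(len(days)),
                     np.sin(2 * np.pi * days / 365),
                     np.cos(2 * np.pi * days / 365),
                     lag1])
beta = poisson_irls(X[1:], y[1:])                # drop day 0 (no valid lag)
print("estimated lag-1 exposure effect (true 0.02):", round(beta[3], 4))
```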
Exploratory Causal Analysis in Bivariate Time Series Data
NASA Astrophysics Data System (ADS)
McCracken, James M.
Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, so data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data.
A window-based time series feature extraction method.
Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife
2017-10-01
This study proposes a robust similarity score-based time series feature extraction method termed Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly low computational complexity, thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with shapelet transform and fast shapelet transform (which constitutes an accelerated variant of the shapelet transform). The results indicate that WTC achieves a slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has the potential to enable medical experts to explore definitive common trends in novel datasets.
NCI: DCTD: Biometric Research Program
The Biometric Research Program (BRP) is the statistical and biomathematical component of the Division of Cancer Treatment, Diagnosis and Centers (DCTDC). Its members provide statistical leadership for the national and international research programs of the division in developmental therapeutics, developmental diagnostics, diagnostic imaging and clinical trials.
NCI: DCTD: Biometric Research Branch
The Biometric Research Branch (BRB) is the statistical and biomathematical component of the Division of Cancer Treatment, Diagnosis and Centers (DCTDC). Its members provide statistical leadership for the national and international research programs of the division in developmental therapeutics, developmental diagnostics, diagnostic imaging and clinical trials.
Human body as a set of biometric features identified by means of optoelectronics
NASA Astrophysics Data System (ADS)
Podbielska, Halina; Bauer, Joanna
2005-09-01
The human body possesses many unique, singular features that are impossible to copy or forge. Nowadays, establishing and ensuring public security requires specially designed devices and systems. Biometrics is a field of science and technology that exploits human body characteristics for people recognition, identifying the most characteristic and unique ones in order to design and construct systems capable of recognizing people. This paper gives an overview of achievements in biometrics. The verification and identification process is explained, along with the way biometric recognition systems are evaluated. The human biometrics most frequently used in practice are presented briefly, including fingerprints, facial imaging (including thermal characteristics), hand geometry and iris patterns.
A Framework for Analyzing Biometric Template Aging and Renewal Prediction
2009-03-01
databases has sufficient data to support template aging over an extended period of time. Another assumption is that there is significant variance to...mentioned above for enrollment also apply to verification. When combining enrollment and verification, there is a significant amount of variance that... significant advancement in the biometrics body of knowledge. This research presents the CTARP Framework, a novel foundational framework for methods of
Unconstrained and contactless hand geometry biometrics.
de-Santos-Sierra, Alberto; Sánchez-Ávila, Carmen; Del Pozo, Gonzalo Bailador; Guerra-Casanova, Javier
2011-01-01
This paper presents a hand biometric system for contact-less, platform-free scenarios, proposing innovative methods in feature extraction, template creation and template matching. The evaluation of the proposed method considers both the use of three contact-less publicly available hand databases, and the comparison of its performance to two competitive pattern recognition techniques existing in the literature: namely support vector machines (SVM) and k-nearest neighbour (k-NN). Results highlight the fact that the proposed method outperforms existing approaches in the literature in terms of computational cost, accuracy in human identification, number of extracted features and number of samples for template creation. The proposed method is a suitable solution for human identification in contact-less scenarios based on hand biometrics, providing a feasible solution for devices with limited hardware requirements like mobile devices. PMID:22346634
Simulation of time series by distorted Gaussian processes
NASA Technical Reports Server (NTRS)
Greenhall, C. A.
1977-01-01
A distorted stationary Gaussian process can be used to provide computer-generated imitations of experimental time series. A method of analyzing a source time series and synthesizing an imitation is shown, and an example using X-band radiometer data is given.
Lew, Susie Q; Sikka, Neal; Thompson, Clinton; Cherian, Teena; Magnus, Manya
2017-01-01
We examined participant uptake and utilization of remote monitoring devices, and the relationship between remote biometric monitoring (RBM) of weight (Wt) and blood pressure (BP) and self-monitoring requirements. Participants on peritoneal dialysis (PD) (n = 269) took part in a Telehealth pilot study, of whom 253 used remote monitoring of BP and 255 of Wt. Blood pressure and Wt readings were transmitted in real time to a Telehealth call center and then forwarded to the PD nurses for real-time review. Uptake of RBM was substantial, with 89.7% accepting RBM, generating 74,266 BP and 52,880 Wt measurements over the study period. We found no significant correlates of RBM uptake with regard to gender, marital, educational, socio-economic or employment status, or baseline experience with computers; frequency of use of BP RBM by Black participants was less than that of non-Black participants, as was Wt RBM, and participants over 55 years old were more likely to use the Wt RBM than their younger counterparts. Having any review of a breach by a nurse was associated with reduced odds of a subsequent BP breach after adjusting for sex, age, and race. Remote biometric monitoring was associated with adherence to the self-monitoring BP and Wt requirements of PD. Remote biometric monitoring was feasible, allowing for increased communication between patients and PD clinical staff, with real-time patient data for providers to act on to potentially improve adherence and outcomes.
The use of biometrics in the Personal Health Record (PHR).
Bonney, Wilfred
2011-01-01
The emergence of the Personal Health Record (PHR) has made individual health information more readily accessible to a wide range of users including patients, consumers, practitioners, and healthcare providers. However, increased accessibility of PHR threatens the confidentiality, privacy, and security of personalized health information. Therefore, a need for robust and reliable forms of authentication is of prime concern. The concept of biometric authentication is now highly visible to healthcare providers as a technology to prevent unauthorized access to individual health information. Implementing biometric authentication mechanisms to protect PHR facilitates access control and secure exchange of health information. In this paper, a literature review is used to explore the key benefits, technical barriers, challenges, and ethical implications for using biometric authentication in PHR.
Wang, Yi; Wan, Jianwu; Guo, Jun; Cheung, Yiu-Ming; Yuen, Pong C
2018-07-01
Similarity search is essential to many important applications and often involves searching at scale on high-dimensional data based on their similarity to a query. In biometric applications, recent vulnerability studies have shown that adversarial machine learning can compromise biometric recognition systems by exploiting the biometric similarity information. Existing methods for biometric privacy protection are in general based on pairwise matching of secured biometric templates and have inherent limitations in search efficiency and scalability. In this paper, we propose an inference-based framework for privacy-preserving similarity search in Hamming space. Our approach builds on an obfuscated distance measure that can conceal Hamming distance in a dynamic interval. Such a mechanism enables us to systematically design statistically reliable methods for retrieving the most likely candidates without knowing the exact distance values. We further propose to apply Montgomery multiplication for generating search indexes that can withstand adversarial similarity analysis, and show that information leakage in randomized Montgomery domains can be made negligibly small. Our experiments on public biometric datasets demonstrate that the inference-based approach can achieve a search accuracy close to the best performance possible with secure computation methods, but the associated cost is reduced by orders of magnitude compared to cryptographic primitives.
Long-range correlations in time series generated by time-fractional diffusion: A numerical study
NASA Astrophysics Data System (ADS)
Barbieri, Davide; Vivoli, Alessandro
2005-09-01
Time series models showing power law tails in autocorrelation functions are common in econometrics. A special non-Markovian model for such kind of time series is provided by the random walk introduced by Gorenflo et al. as a discretization of time fractional diffusion. The time series so obtained are analyzed here from a numerical point of view in terms of autocorrelations and covariance matrices.
An analysis of random projection for changeable and privacy-preserving biometric verification.
Wang, Yongjin; Plataniotis, Konstantinos N
2010-10-01
Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
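As a rough illustration of the kind of transform this record analyzes, the sketch below applies a seeded Gaussian random projection to feature vectors; the dimensions, seed handling, and verification step are assumptions made for the example, not details taken from the paper.

```python
import numpy as np

def rp_template(x, k, user_seed):
    """Project feature vector x to k dimensions with an i.i.d. Gaussian
    random matrix; the seed stands in for the user-specific key that
    makes the template changeable (illustrative, not the paper's API)."""
    rng = np.random.default_rng(user_seed)
    R = rng.normal(0.0, 1.0, size=(k, x.size)) / np.sqrt(k)
    return R @ x

rng = np.random.default_rng(0)
a = rng.normal(size=1024)              # e.g. a vectorised face image
b = a + 0.1 * rng.normal(size=1024)    # noisy sample of the same user
# Distances are approximately preserved, so matching can run in the
# transformed domain; a fresh seed revokes and reissues the template.
print(np.linalg.norm(a - b),
      np.linalg.norm(rp_template(a, 128, 7) - rp_template(b, 128, 7)))
```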
A novel feature ranking algorithm for biometric recognition with PPG signals.
Reşit Kavsaoğlu, A; Polat, Kemal; Recep Bozkurt, M
2014-06-01
This study describes the application of the photoplethysmography (PPG) signal, and of time-domain features acquired from its first and second derivatives, to biometric identification. For this purpose, a total of 40 features has been extracted and a feature-ranking algorithm is proposed. This algorithm calculates the contribution of each feature to biometric recognition and ranks the features from largest contribution to smallest. The Euclidean distance and absolute distance formulas are used to quantify each feature's contribution. The efficiency of the proposed algorithm is demonstrated by applying a k-NN (k-nearest neighbor) classifier to the ranked features. During application, 15-period PPG signals recorded at two different times from each of thirty healthy subjects were acquired with a PPG data acquisition card. The PPG signals recorded first were evaluated as the 1st configuration, the PPG signals recorded later at a different time as the 2nd configuration, and the combination of both as the 3rd configuration. When the results were evaluated for the k-NN classifier model created along with the proposed algorithm, identification rates of 90.44% for the 1st configuration, 94.44% for the 2nd configuration, and 87.22% for the 3rd configuration were attained. These results show that both the proposed algorithm and the PPG-based biometric identification model built on it are very promising for contactless recognition of people. Copyright © 2014 Elsevier Ltd. All rights reserved.
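A minimal sketch of a distance-based feature ranking of this kind is given below; the per-feature scoring rule and the k-NN evaluation are plausible assumptions made for illustration rather than the paper's exact formulas.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def rank_by_separation(X, y):
    """Score each feature by the mean Euclidean separation of the
    per-subject means in that feature (one reading of 'contribution')."""
    classes = np.unique(y)
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        mu = np.array([X[y == c, j].mean() for c in classes])
        scores[j] = np.abs(mu[:, None] - mu[None, :]).mean()
    return np.argsort(scores)[::-1]       # largest contribution first

def knn_accuracy(Xtr, ytr, Xte, yte, order, m, k=1):
    """Classify with the m best-ranked features, as in the evaluation."""
    clf = KNeighborsClassifier(n_neighbors=k).fit(Xtr[:, order[:m]], ytr)
    return clf.score(Xte[:, order[:m]], yte)
```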
Inter-observer variability in fetal biometric measurements.
Kilani, Rami; Aleyadeh, Wesam; Atieleh, Luay Abu; Al Suleimat, Abdul Mane; Khadra, Maysa; Hawamdeh, Hassan M
2018-02-01
To evaluate inter-observer variability and reproducibility of ultrasound measurements for fetal biometric parameters. A prospective cohort study was implemented in two tertiary care hospitals in Amman, Jordan; Prince Hamza Hospital and Albashir Hospital. 192 women with a singleton pregnancy at a gestational age of 18-36 weeks participated in the study. Transabdominal scans for fetal biometric parameter measurement were performed on study participants from November 2014 to March 2015. Women who agreed to participate in the study were administered two ultrasound scans for head circumference, abdominal circumference and femur length. The correlation coefficient was calculated. Bland-Altman plots were used to analyze the degree of measurement agreement between observers. Limits of agreement ± 2 SD for the differences in fetal biometry measurements as proportions of the mean of the measurements were derived. The main outcome measure was the reproducibility of fetal biometric measurements by different observers. High inter-observer intraclass correlation coefficients (ICC) were found for femur length (0.990) and abdominal circumference (0.996), and Bland-Altman plots showed high degrees of agreement. The highest degrees of agreement were noted in the measurement of abdominal circumference followed by head circumference; the lowest degree of agreement was found for femur length measurement. A paired-sample t-test showed that the mean difference between duplicate measurements was not significant (P > 0.05). Fetal biometric parameter measurements may be reproducible by different operators in the clinical setting with similar results. Fetal head circumference, abdominal circumference and femur length were highly reproducible. Large organized studies are needed to ensure accurate fetal measurements due to the important clinical implications of inaccurate measurements. Copyright © 2018. Published by Elsevier B.V.
Trend time-series modeling and forecasting with neural networks.
Qi, Min; Zhang, G Peter
2008-05-01
Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
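The differencing strategy the abstract recommends is easy to reproduce in a toy setting; the lag order and network size below are arbitrary choices made only for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def embed(x, lags):
    """Build (lagged inputs, next value) pairs from a 1-D series."""
    X = np.column_stack([x[i:len(x) - lags + i] for i in range(lags)])
    return X, x[lags:]

t = np.arange(300)
y = 0.05 * t + np.sin(0.2 * t) + 0.1 * np.random.default_rng(1).normal(size=300)

dy = np.diff(y)                      # differencing removes the trend
X, target = embed(dy, lags=4)
nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                  random_state=0).fit(X[:-50], target[:-50])
# Forecast the differences, then cumulate back onto the last level.
yhat = y[-51] + np.cumsum(nn.predict(X[-50:]))
```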
Wavelet analysis and scaling properties of time series
NASA Astrophysics Data System (ADS)
Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.
2005-10-01
We propose a wavelet based method for the characterization of the scaling behavior of nonstationary time series. It makes use of the built-in ability of the wavelets for capturing the trends in a data set, in variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present and earlier approaches of detrending for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.
Non-parametric characterization of long-term rainfall time series
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Pandey, Brij Kishor
2018-03-01
The statistical study of rainfall time series is one of the approaches for efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological systems forecasting. In the present study, eventual statistics was applied for the long-term (1851-2006) rainfall time series under seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series. The observed trend using the above-mentioned approach has been ascertained using the innovative trend analysis method. Innovative trend analysis has been found to be a strong tool to detect the general trend of rainfall time series. The sequential Mann-Kendall test has also been carried out to examine nonlinear trends of the series. The partial sum of cumulative deviation test is also found to be suitable to detect the nonlinear trend. Innovative trend analysis, the sequential Mann-Kendall test and the partial cumulative deviation test have the potential to detect the general as well as nonlinear trend for the rainfall time series. Annual rainfall analysis suggests that the maximum increase in mean rainfall is 11.53% for West Peninsular India, whereas the maximum decrease in mean rainfall is 7.8% for the North Mountainous Indian region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we have performed the von Neumann ratio test and cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis has been applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones shows a higher departure from homogeneity, and singular spectrum analysis gives results coherent with this finding.
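For reference, a bare-bones Mann-Kendall test (without tie correction) looks like the following; the input file name is hypothetical.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Classical Mann-Kendall trend test; returns z-score and p-value."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # no-ties variance
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2.0 * (1.0 - norm.cdf(abs(z)))

rain = np.loadtxt("annual_rainfall.txt")          # hypothetical input
z, p = mann_kendall(rain)
print(f"trend {'up' if z > 0 else 'down'}, p = {p:.3f}")
```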
Biometrics in Government, Post-9/11: Advancing Science, Enhancing Operations
2008-08-01
responsibilities include advising the President in policy formulation and budget development on all questions in which S&T are important elements; articulating the...group of approximately 30 individuals from government, industry, and academia were in a hotel conference room in Orlando, Fla., at a Biometric...of a set of usability guidelines for biometric systems that enhance performance (throughput and quality), improve user satisfaction and acceptance
Time Series Decomposition into Oscillation Components and Phase Estimation.
Matsuda, Takeru; Komaki, Fumiyasu
2017-02-01
Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished by this model like the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes' method, the amplitudes and the frequencies of oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and detecting the phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.
Individual Biometric Identification Using Multi-Cycle Electrocardiographic Waveform Patterns.
Lee, Wonki; Kim, Seulgee; Kim, Daeeun
2018-03-28
The electrocardiogram (ECG) waveform conveys information regarding the electrical property of the heart. The patterns vary depending on the individual heart characteristics, so ECG features can potentially be used for biometric recognition. This study presents a new method using the entire ECG waveform pattern for matching and demonstrates that the approach can potentially be employed for individual biometric identification. Multi-cycle ECG signals were assessed using an ECG measuring circuit, with three electrodes patched on the wrists or fingers to allow various measurement configurations. For biometric identification, four-fold cross-validation was used in the experiments to assess how the results of a statistical analysis generalize to an independent data set. Four different pattern matching algorithms, i.e., cosine similarity, cross correlation, city block distance, and Euclidean distance, were tested to compare individual identification performances with a single channel of ECG signal (3-wire ECG). To evaluate the pattern matching for biometric identification, the ECG recordings for each subject were partitioned into training and test sets. The suggested method obtained a maximum performance of 89.9% accuracy with two heartbeats of ECG signals measured on the wrist and 93.3% accuracy with three heartbeats for 55 subjects. The performance rate with ECG signals measured on the fingers improved up to 99.3% with two heartbeats and 100% with three heartbeats of signals for 20 subjects.
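The four matchers named in the abstract are standard; a compact sketch of how they might be applied to aligned multi-cycle templates follows (the gallery structure and decision rule are assumptions made for the example).

```python
import numpy as np

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def cross_correlation(a, b):
    return np.max(np.correlate(a - a.mean(), b - b.mean(), mode="full"))

def city_block(a, b):
    return np.abs(a - b).sum()

def euclidean(a, b):
    return np.linalg.norm(a - b)

def identify(probe, gallery):
    """Return the enrolled subject whose template best matches the
    probe (similarity maximised here; distances would be minimised)."""
    return max(gallery, key=lambda s: cosine_similarity(probe, gallery[s]))
```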
NASA Astrophysics Data System (ADS)
Kelkboom, Emile J. C.; Breebaart, Jeroen; Buhan, Ileana; Veldhuis, Raymond N. J.
2010-04-01
Template protection techniques are used within biometric systems in order to protect the stored biometric template against privacy and security threats. A great portion of template protection techniques are based on extracting a key from or binding a key to a biometric sample. The achieved protection depends on the size of the key and its closeness to being random. In the literature it can be observed that there is a large variation on the reported key lengths at similar classification performance of the same template protection system, even when based on the same biometric modality and database. In this work we determine the analytical relationship between the system performance and the theoretical maximum key size given a biometric source modeled by parallel Gaussian channels. We consider the case where the source capacity is evenly distributed across all channels and the channels are independent. We also determine the effect of the parameters such as the source capacity, the number of enrolment and verification samples, and the operating point selection on the maximum key size. We show that a trade-off exists between the privacy protection of the biometric system and its convenience for its users.
Kim, Hanvit; Minh Phuong Nguyen; Se Young Chun
2017-07-01
Biometrics such as ECG provides a convenient and powerful security tool to verify or identify an individual. However, one important drawback of biometrics is that it is irrevocable. In other words, biometrics cannot be re-used practically once it is compromised. Cancelable biometrics has been investigated to overcome this drawback. In this paper, we propose a cancelable ECG biometrics by deriving a generalized likelihood ratio test (GLRT) detector from a composite hypothesis testing in randomly projected domain. Since it is common to observe performance degradation for cancelable biometrics, we also propose a guided filtering (GF) with irreversible guide signal that is a non-invertibly transformed signal of ECG authentication template. We evaluated our proposed method using ECG-ID database with 89 subjects. Conventional Euclidean detector with original ECG template yielded 93.9% PD1 (detection probability at 1% FAR) while Euclidean detector with 10% compressed ECG (1/10 of the original data size) yielded 90.8% PD1. Our proposed GLRT detector with 10% compressed ECG yielded 91.4%, which is better than Euclidean with the same compressed ECG. GF with our proposed irreversible ECG template further improved the performance of our GLRT with 10% compressed ECG up to 94.3%, which is higher than Euclidean detector with original ECG. Lastly, we showed that our proposed cancelable ECG biometrics practically met cancelable biometrics criteria such as efficiency, re-usability, diversity and non-invertibility.
Deconvolution of mixing time series on a graph
Blocker, Alexander W.; Airoldi, Edoardo M.
2013-01-01
In many applications we are interested in making inference on latent time series from indirect measurements, which are often low-dimensional projections resulting from mixing or aggregation. Positron emission tomography, super-resolution, and network traffic monitoring are some examples. Inference in such settings requires solving a sequence of ill-posed inverse problems, y_t = A x_t, where the projection mechanism provides information on A. We consider problems in which A specifies mixing on a graph of time series that are bursty and sparse. We develop a multilevel state-space model for mixing time series and an efficient approach to inference. A simple model is used to calibrate regularization parameters that lead to efficient inference in the multilevel state-space model. We apply this method to the problem of estimating point-to-point traffic flows on a network from aggregate measurements. Our solution outperforms existing methods for this problem, and our two-stage approach suggests an efficient inference strategy for multilevel models of multivariate time series. PMID:25309135
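Each step of such a problem is an ill-posed linear inversion; a toy version with an L1 penalty (to reflect the bursty, sparse flows) is sketched below. The regularisation weight is illustrative; the paper calibrates its regularization with a simpler companion model.

```python
import numpy as np
from sklearn.linear_model import Lasso

def deconvolve_step(A, y_t, alpha=0.1):
    """Recover a sparse non-negative x_t from y_t = A @ x_t."""
    model = Lasso(alpha=alpha, positive=True, max_iter=10000)
    model.fit(A, y_t)
    return model.coef_

# Toy network-tomography case: 3 aggregate links, 4 latent flows.
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)
x_true = np.array([5.0, 0.0, 0.0, 2.0])
print(deconvolve_step(A, A @ x_true))   # approximately recovers x_true
```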
RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.
Stránský, V; Thinová, L
2017-11-01
In the year 2010 a continual radon measurement was established at Mladeč Caves in the Czech Republic using a continual radon monitor RADIM3A. In order to model the radon time series in the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal integrated autoregressive moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes which occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
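With statsmodels, a SARIMAX fit of the kind described reduces to a few lines; the file name, column names, and daily seasonal order below are assumptions, while the (5,1,3) order comes from the abstract.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical hourly file; the study uses delayed atmospheric
# parameters as exogenous regressors.
df = pd.read_csv("radon_hourly.csv", index_col=0, parse_dates=True)
exog = df[["temperature", "pressure"]].shift(1).bfill()   # delayed inputs

model = SARIMAX(df["radon"], exog=exog,
                order=(5, 1, 3),               # regARIMA(5,1,3) in the text
                seasonal_order=(1, 0, 1, 24))  # assumed daily seasonality
fit = model.fit(disp=False)
residuals = fit.resid    # series to compare against seismic catalogues
```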
An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.
Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei
2017-01-10
In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer's forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
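One way to realise a threshold-adaptive template match on a weighted Euclidean distance is sketched below; the per-user threshold rule is an illustrative stand-in for the paper's TATM update, not a reproduction of it.

```python
import numpy as np

def weighted_euclidean(a, b, w):
    return np.sqrt(np.sum(w * (a - b) ** 2))

def enroll(samples, w, margin=2.0):
    """Average the enrollment measurements into a template and set the
    acceptance threshold from their spread (margin is illustrative)."""
    template = samples.mean(axis=0)
    d = np.array([weighted_euclidean(s, template, w) for s in samples])
    return template, d.mean() + margin * d.std()

def verify(probe, template, threshold, w):
    return weighted_euclidean(probe, template, w) <= threshold
```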
On the implementation of IP protection using biometrics based information hiding and firewall
NASA Astrophysics Data System (ADS)
Basu, Abhishek; Nandy, Kingshuk; Banerjee, Avishek; Giri, Supratick; Sarkar, Souvik; Sarkar, Subir Kumar
2016-02-01
System-on-chip-based design style creates a revolution in the very large scale integration industry with design efficiency, operating speed and development time. To support this process, reuse and exchange of components in electronic form, called intellectual property (IP), are essential. This, however, increases the possibility of infringement of the design's IP, so copyright protection of IP against piracy is the most important concern for IP vendors. The existing solutions for IP protection still fall short in terms of security, flexibility and cost. This paper proposes an information-hiding-based solution for IP protection by embedding biometric copyright information and a firewall inside an IP in the form of a finite state machine with a unique configuration. The scheme first introduces a biometric signature-based copyright as ownership proof. Second, the firewall interrupts the normal functionality of the IP at the end of the user time period. The experimental outcomes of a field-programmable-gate-array implementation illustrate the efficiency of the proposed method.
Nonstationary time series prediction combined with slow feature analysis
NASA Astrophysics Data System (ADS)
Wang, G.; Chen, X.
2015-07-01
Almost all climate time series have some degree of nonstationarity due to external driving forces perturbing the observed system. Therefore, these external driving forces should be taken into account when constructing the climate dynamics. This paper presents a new technique of obtaining the driving forces of a time series from the slow feature analysis (SFA) approach, and then introduces them into a predictive model to predict nonstationary time series. The basic theory of the technique is to consider the driving forces as state variables and to incorporate them into the predictive model. Experiments using a modified logistic time series and winter ozone data in Arosa, Switzerland, were conducted to test the model. The results showed improved prediction skills.
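In the linear case, extracting the slowest feature amounts to whitening the (typically time-delay embedded) signal and taking the direction whose temporal derivative has minimal variance; a compact sketch under these assumptions:

```python
import numpy as np

def slowest_feature(X):
    """Linear SFA on a (T x d) signal: whiten, then minimise the
    variance of the temporal derivative; the output is read as the
    driving-force estimate in schemes like the one above."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ (Vt.T / s) * np.sqrt(len(X))     # whitened: unit covariance
    dZ = np.diff(Z, axis=0)
    evals, evecs = np.linalg.eigh(dZ.T @ dZ / len(dZ))
    return Z @ evecs[:, 0]                    # slowest-varying component
```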
Time Series Econometrics for the 21st Century
ERIC Educational Resources Information Center
Hansen, Bruce E.
2017-01-01
The field of econometrics largely started with time series analysis because many early datasets were time-series macroeconomic data. As the field developed, more cross-sectional and longitudinal datasets were collected, which today dominate the majority of academic empirical research. In nonacademic (private sector, central bank, and governmental)…
Semi-autonomous remote sensing time series generation tool
NASA Astrophysics Data System (ADS)
Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher
2017-10-01
High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Due to limitations of satellite architecture and frequent cloud cover, daily data of high spatial resolution is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion is a practical alternative. However, it is not an easy process, since it involves multiple steps and requires multiple tools. In this paper, a framework for a Geo Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created, one to perform all the pre-processing steps on various satellite data and the other to perform data fusion to generate time series. The two frameworks can be used individually to perform specific tasks or combined to perform both processes in one go. The tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a good interface that provides many functionalities to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.
The examination of headache activity using time-series research designs.
Houle, Timothy T; Remble, Thomas A; Houle, Thomas A
2005-05-01
The majority of research conducted on headache has utilized cross-sectional designs, which preclude the examination of dynamic factors and principally rely on group-level effects. The present article describes the application of an individual-oriented process model using time-series analytical techniques. Blending a time-series approach with an interactive process model allows consideration of the relationships of intra-individual dynamic processes, while not precluding the researcher from examining inter-individual differences. The authors explore the nature of time-series data and present two necessary assumptions underlying the time-series approach. The concept of shock and its contribution to headache activity is also presented. The time-series approach is not without its problems, and two such problems are specifically reported: autocorrelation and the distribution of daily observations. The article concludes with the presentation of several analytical techniques suited to examining the time-series interactive process model.
Time Series Analysis of Insar Data: Methods and Trends
NASA Technical Reports Server (NTRS)
Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique
2015-01-01
Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared across algorithms. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
Interpretation of a compositional time series
NASA Astrophysics Data System (ADS)
Tolosana-Delgado, R.; van den Boogaart, K. G.
2012-04-01
Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), then linear operations are afflicted by several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on a raw scale, but after a log-ratio transformation (Aitchison, 1986: The statistical analysis of compositional data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows one to apply any sort of multivariate analysis to a log-ratio transformed composition, as long as this transformation is invertible. This principle is fully applicable to time series analysis. We discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows the results to be expressed as D(D - 1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA.
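The centred log-ratio (clr) is one common invertible log-ratio transformation; a minimal sketch of the round trip the abstract describes:

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform of a composition (parts sum to 1)."""
    g = np.exp(np.log(x).mean(axis=-1, keepdims=True))   # geometric mean
    return np.log(x / g)

def clr_inverse(z):
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

comp = np.array([0.6, 0.3, 0.1])   # e.g. seasonal precipitation shares
z = clr(comp)                      # model/forecast in coordinate space...
print(clr_inverse(z))              # ...then back-transform to a composition
```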
Modeling and prototyping of biometric systems using dataflow programming
NASA Astrophysics Data System (ADS)
Minakova, N.; Petrov, I.
2018-01-01
The development of biometric systems is a labor-intensive process, so the creation and analysis of supporting approaches and techniques is a pressing task. This article presents a technique for modeling and prototyping biometric systems based on dataflow programming. The technique includes three main stages: the development of functional blocks, the creation of a dataflow graph and the generation of a prototype. A specially developed software modeling environment that implements this technique is described. As an example of the use of this technique, the implementation of an iris localization subsystem is demonstrated. A modification of dataflow programming is suggested to solve the problem of the undefined order of block activation. The main advantage of the presented technique is the ability to visually display and design the model of the biometric system, the rapid creation of a working prototype and the reuse of previously developed functional blocks.
Seet, Li-Fong; Narayanaswamy, Arun; Finger, Sharon N; Htoon, Hla M; Nongpiur, Monisha E; Toh, Li Zhen; Ho, Henrietta; Perera, Shamira A; Wong, Tina T
2016-11-01
This study aimed to evaluate differences in iris gene expression profiles between primary angle closure glaucoma (PACG) and primary open angle glaucoma (POAG) and their interaction with biometric characteristics. Prospective study. Thirty-five subjects with PACG and thirty-three subjects with POAG who required trabeculectomy were enrolled at the Singapore National Eye Centre, Singapore. Iris specimens, obtained by iridectomy, were analysed by real-time polymerase chain reaction for expression of type I collagen, vascular endothelial growth factor (VEGF)-A, -B and -C, as well as VEGF receptors (VEGFRs) 1 and 2. Anterior segment optical coherence tomography (ASOCT) imaging for biometric parameters, including anterior chamber depth (ACD), anterior chamber volume (ACV) and lens vault (LV), was also performed pre-operatively. Relative mRNA levels between PACG and POAG irises, biometric measurements, discriminant analyses using genes and biometric parameters. COL1A1, VEGFB, VEGFC and VEGFR2 mRNA expression was higher in PACG compared to POAG irises. LV, ACD and ACV were significantly different between the two subgroups. Discriminant analyses based on gene expression, biometric parameters or a combination of both gene expression and biometrics (LV and ACV), correctly classified 94.1%, 85.3% and 94.1% of the original PACG and POAG cases, respectively. The discriminant function combining genes and biometrics demonstrated the highest accuracy in cross-validated classification of the two glaucoma subtypes. Distinct iris gene expression supports the pathophysiological differences that exist between PACG and POAG. Biometric parameters can combine with iris gene expression to more accurately define PACG from POAG. © 2016 The Authors. Clinical & Experimental Ophthalmology published by John Wiley & Sons Australia, Ltd on behalf of Royal Australian and New Zealand College of Ophthalmologists.
Teoh, Andrew B J; Goh, Alwyn; Ngo, David C L
2006-12-01
Biometric analysis for identity verification is becoming a widespread reality. Such implementations necessitate large-scale capture and storage of biometric data, which raises serious issues in terms of data privacy and (if such data is compromised) identity theft. These problems stem from the essential permanence of biometric data, which (unlike secret passwords or physical tokens) cannot be refreshed or reissued if compromised. Our previously presented biometric-hash framework prescribes the integration of external (password or token-derived) randomness with user-specific biometrics, resulting in bitstring outputs with security characteristics (i.e., noninvertibility) comparable to cryptographic ciphers or hashes. The resultant BioHashes are hence cancellable, i.e., straightforwardly revoked and reissued (via refreshed password or reissued token) if compromised. BioHashing furthermore enhances recognition effectiveness, which is explained in this paper as arising from the Random Multispace Quantization (RMQ) of biometric and external random inputs.
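A BioHash-style construction can be sketched in a few lines: project onto token-derived orthonormal random directions and binarise. The zero threshold assumes zero-mean features and is an illustrative choice, not necessarily the paper's quantizer.

```python
import numpy as np

def biohash(x, token_seed, m):
    """Token-seeded random multispace quantization sketch: m inner
    products with orthonormal random directions, thresholded to bits."""
    rng = np.random.default_rng(token_seed)
    Q, _ = np.linalg.qr(rng.normal(size=(x.size, m)))  # orthonormal basis
    return (x @ Q > 0).astype(np.uint8)

# A compromised BioHash is revoked by reissuing the token (new seed);
# matching compares bitstrings, e.g. via Hamming distance.
code = biohash(np.random.default_rng(3).normal(size=256), token_seed=42, m=64)
```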
Comparison of Fingerprint and Iris Biometric Authentication for Control of Digital Signatures
Zuckerman, Alan E.; Moon, Kenneth A.; Eaddy, Kenneth
2002-01-01
Biometric authentication systems can be used to control digital signature of medical documents. This pilot study evaluated the use of two different fingerprint technologies and one iris technology to control creation of digital signatures on a central server using public-private key pairs stored on the server. Documents and signatures were stored in XML for portability. Key pairs and authentication certificates were generated during biometric enrollment. Usability and user acceptance were guarded, and limitations of biometric systems prevented use of the system with all test subjects. The system detected alterations in the data content and provided future signer re-authentication for non-repudiation.
Self-affinity in the dengue fever time series
NASA Astrophysics Data System (ADS)
Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.
2016-06-01
Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
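A bare-bones DFA, of the kind used here to obtain the alpha exponent, can be written as follows (the input file name is hypothetical):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: the slope of log F(n) vs log n
    estimates the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        rms = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear fit
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]
    # ~0.5 uncorrelated, >0.5 persistent, <0.5 anti-persistent

cases = np.loadtxt("weekly_dengue_cases.txt")   # hypothetical input
print(dfa(cases, scales=[4, 8, 16, 32, 64]))
```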
Association of Wage With Employee Participation in Health Assessments and Biometric Screening.
Sherman, Bruce W; Addy, Carol
2018-02-01
To understand differences in health risk assessment (HRA) and biometric screening participation rates among benefits-enrolled employees in association with wage category. Cross-sectional analysis of employee eligibility file and health benefits (wellness and claims) data. Data from self-insured employers participating in the RightOpt private exchange (Conduent HR Services) during 2014. Active employees from 4 companies continuously enrolled in health insurance for which wage data were available. Measures included HRA and biometric screening participation rates and wage status, with employee age, sex, employer, job tenure, household income, geographic location, and health benefits deductible as a percentage of total wages serving as covariates. Employees were separated into 5 groups based on wage status. Logistic regression analysis incorporated other measures as covariates to adjust for differences between groups, with HRA and biometric screening participation rates determined as binary outcomes. Participation rates for HRA and biometric screening were 90% and 87%, respectively, in the highest wage category, decreasing to 67% and 60%, respectively, among the lowest wage category. Employee wage status is associated with significant differences in HRA and biometric participation rates. Generalizing the results generated by modest participation in these offerings to entire populations may risk misinterpretation of results based on variable participation rates across wage categories.
Developing consistent time series landsat data products
USDA-ARS?s Scientific Manuscript database
The Landsat series of satellites has provided a continuous earth observation data record since the early 1970s. There are increasing demands for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...
Time-series modeling of long-term weight self-monitoring data.
Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka
2015-08-01
Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. The analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data present several challenges that complicate the analysis. In particular, irregular sampling, missing data, and the existence of periodic (e.g. diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of weight time series data, missing data and its link to individuals' behavior, periodic patterns and weight series segmentation. Being able to understand behavior through weight data and give relevant feedback is desired to lead to positive intervention on health behaviors.
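In practice the first hurdle is regularising the irregular, gappy record; a pandas sketch with assumed file and column names:

```python
import pandas as pd

# Hypothetical file with one timestamped entry per self-weighing.
w = (pd.read_csv("weights.csv", parse_dates=["time"])
       .set_index("time")["kg"]
       .resample("D").mean())          # regularise to a daily grid

missing = w.isna()                     # missingness itself reflects behaviour
w_filled = w.interpolate(limit=7)      # bridge short gaps only
weekly = w_filled.groupby(w_filled.index.dayofweek).mean()  # weekly pattern
trend = w_filled.rolling(28, min_periods=14).mean()         # slow trend
```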
NASA Astrophysics Data System (ADS)
Burger, Benjamin; Meimon, Serge C.; Petit, Cyril; Nguyen, Minh Chau
2015-02-01
This communication presents the results obtained in decreasing the response time of electrowetting-based real-time focus correctors (liquid lenses). In order to provide a compact iris biometric system demonstrator, we have achieved a response time at 90% of 7.5 ms for a change in focus from 0 diopter to 10 diopter with a liquid lens having an aperture of 1.9 mm. We have used a hydrodynamic fluid reorganization model to predict the features of these fast liquid lenses and evaluated the sensitivity of the response time to the different design parameters.
Nonstationary time series prediction combined with slow feature analysis
NASA Astrophysics Data System (ADS)
Wang, G.; Chen, X.
2015-01-01
Almost all climate time series have some degree of nonstationarity due to external driving forces perturbing the observed system. Therefore, these external driving forces should be taken into account when reconstructing the climate dynamics. This paper presents a new technique in which the driving force of a time series is obtained using the Slow Feature Analysis (SFA) approach and then introduced into a predictive model to predict non-stationary time series. In essence, the main idea of the technique is to consider the driving forces as state variables and incorporate them into the prediction model. To test the method, experiments using a modified logistic time series and winter ozone data in Arosa, Switzerland, were conducted. The results showed improved and effective prediction skill.
Fast Algorithms for Mining Co-evolving Time Series
2011-09-01
Keogh et al., 2001, 2004] and (b) forecasting, like an autoregressive integrated moving average model (ARIMA) and related methods [Box et al., 1994...computing hardware? We develop models to mine time series with missing values, to extract compact representation from time sequences, to segment the...sequences, and to do forecasting. For large scale data, we propose algorithms for learning time series models, in particular, including Linear Dynamical
European securitization and biometric identification: the uses of genetic profiling.
Johnson, Paul; Williams, Robin
2007-01-01
The recent loss of confidence in textual and verbal methods for validating the identity claims of individual subjects has resulted in growing interest in the use of biometric technologies to establish corporeal uniqueness. Once established, this foundational certainty allows changing biographies and shifting category memberships to be anchored to unchanging bodily surfaces, forms or features. One significant source for this growth has been the "securitization" agendas of nation states that attempt the greater control and monitoring of population movement across geographical borders. Among the wide variety of available biometric schemes, DNA profiling is regarded as a key method for discerning and recording embodied individuality. This paper discusses the current limitations on the use of DNA profiling in civil identification practices and speculates on future uses of the technology with regard to its interoperability with other biometric databasing systems.
Characterizing Time Series Data Diversity for Wind Forecasting: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Chartan, Erol Kevin; Feng, Cong
Wind forecasting plays an important role in integrating variable and uncertain wind power into the power grid. Various forecasting models have been developed to improve the forecasting accuracy. However, it is challenging to accurately compare the true forecasting performances from different methods and forecasters due to the lack of diversity in forecasting test datasets. This paper proposes a time series characteristic analysis approach to visualize and quantify wind time series diversity. The developed method first calculates six time series characteristic indices from various perspectives. Then the principal component analysis is performed to reduce the data dimension while preserving the important information. The diversity of the time series dataset is visualized by the geometric distribution of the newly constructed principal component space. The volume of the 3-dimensional (3D) convex polytope (or the length of the 1D number axis, or the area of the 2D convex polygon) is used to quantify the time series data diversity. The method is tested with five datasets with various degrees of diversity.
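The pipeline reduces to: compute per-series indices, project with PCA, and take the convex-hull volume. The six indices below are illustrative stand-ins for the paper's set.

```python
import numpy as np
from scipy.spatial import ConvexHull

def characteristics(x):
    """Six simple indices (illustrative stand-ins for the paper's set)."""
    d = np.diff(x)
    return np.array([x.mean(), x.std(), np.abs(d).mean(),
                     ((d[:-1] * d[1:]) < 0).mean(),      # turning-point rate
                     np.corrcoef(x[:-1], x[1:])[0, 1],   # lag-1 autocorrelation
                     x.max() - x.min()])

def diversity_volume(series_list, dims=3):
    C = np.array([characteristics(s) for s in series_list])
    C = (C - C.mean(axis=0)) / C.std(axis=0)    # standardise the indices
    _, _, Vt = np.linalg.svd(C, full_matrices=False)
    P = C @ Vt[:dims].T                         # leading principal components
    return ConvexHull(P).volume                 # larger volume = more diverse
```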
Characterization of time series via Rényi complexity-entropy curves
NASA Astrophysics Data System (ADS)
Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.
2018-05-01
One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
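To make the construction concrete, the sketch below computes the entropy coordinate of such a curve: the normalised Rényi entropy of the Bandt-Pompe ordinal-pattern distribution as the parameter varies. The paired generalized statistical complexity coordinate is omitted for brevity, and the signal is an illustrative stochastic example.

```python
import numpy as np
from itertools import permutations

def ordinal_distribution(x, d=4):
    """Bandt-Pompe distribution of ordinal patterns of length d."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def renyi_entropy(p, alpha):
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))              # Shannon limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

x = np.random.default_rng(0).normal(size=5000)     # stochastic test signal
p = ordinal_distribution(x)
for a in (0.5, 1.0, 2.0, 5.0):                     # sweep the parameter
    print(a, renyi_entropy(p, a) / np.log(len(p))) # normalised entropy
```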
Quantifying Selection with Pool-Seq Time Series Data.
Taus, Thomas; Futschik, Andreas; Schlötterer, Christian
2017-11-01
Allele frequency time series data constitute a powerful resource for unraveling mechanisms of adaptation, because the temporal dimension captures important information about evolutionary forces. In particular, Evolve and Resequence (E&R), the whole-genome sequencing of replicated experimentally evolving populations, is becoming increasingly popular. Based on computer simulations several studies proposed experimental parameters to optimize the identification of the selection targets. No such recommendations are available for the underlying parameters selection strength and dominance. Here, we introduce a highly accurate method to estimate selection parameters from replicated time series data, which is fast enough to be applied on a genome scale. Using this new method, we evaluate how experimental parameters can be optimized to obtain the most reliable estimates for selection parameters. We show that the effective population size (Ne) and the number of replicates have the largest impact. Because the number of time points and sequencing coverage had only a minor effect, we suggest that time series analysis is feasible without major increase in sequencing costs. We anticipate that time series analysis will become routine in E&R studies. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 8 Aliens and Nationality 1 2011-01-01 2011-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...
Online Conditional Outlier Detection in Nonstationary Time Series.
Liu, Siqi; Wright, Adam; Hauskrecht, Milos
2017-05-01
The objective of this work is to develop methods for detecting outliers in time series data. Such methods can become the key component of various monitoring and alerting systems, where an outlier may correspond to some adverse condition that needs human attention. However, real-world time series are often affected by various sources of variability present in the environment that may influence the quality of detection; they may (1) explain some of the changes in the signal that would otherwise lead to false positive detections, as well as (2) reduce the sensitivity of the detection algorithm, leading to an increase in false negatives. To alleviate these problems, we propose a new two-layer outlier detection approach that first tries to model and account for the nonstationarity and periodic variation in the time series, and then tries to use other observable variables in the environment to explain any additional signal variation. Our experiments on several data sets in different domains show that our method provides more accurate modeling of the time series, and that it is able to significantly improve outlier detection performance.
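A simplified two-layer scheme along these lines: layer one removes a periodic profile, layer two regresses the residual on environment variables, and what remains unexplained is scored. The specific components chosen are assumptions made for the sketch.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def two_layer_outliers(y, hour_of_day, env, z_thresh=3.0):
    """Layer 1 removes the diurnal profile; layer 2 explains the rest
    with observable environment variables; large final residuals are
    flagged (a simplified reading, with illustrative components)."""
    profile = np.array([y[hour_of_day == h].mean() for h in range(24)])
    r1 = y - profile[hour_of_day]                  # deseasonalised residual
    r2 = r1 - LinearRegression().fit(env, r1).predict(env)
    z = (r2 - r2.mean()) / r2.std()
    return np.abs(z) > z_thresh                    # boolean outlier flags
```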
Transformation-cost time-series method for analyzing irregularly sampled data
NASA Astrophysics Data System (ADS)
Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen
2015-06-01
Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations—with associated costs—to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
Higher-Order Hurst Signatures: Dynamical Information in Time Series
NASA Astrophysics Data System (ADS)
Ferenbaugh, Willis
2005-10-01
Understanding and comparing time series from different systems requires characteristic measures of the dynamics embedded in the series. The Hurst exponent is a second-order dynamical measure of a time series which grew up within the blossoming fractal world of Mandelbrot. This characteristic measure is directly related to the behavior of the autocorrelation, the power-spectrum, and other second-order things. And as with these other measures, the Hurst exponent captures and quantifies some but not all of the intrinsic nature of a series. The more elusive characteristics live in the phase spectrum and the higher-order spectra. This research is a continuing quest to (more) fully characterize the dynamical information in time series produced by plasma experiments or models. The goal is to supplement the series information which can be represented by a Hurst exponent, and we would like to develop supplemental techniques in analogy with Hurst's original R/S analysis. These techniques should be another way to plumb the higher-order dynamics.
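For comparison with any higher-order extension, the classical rescaled-range (R/S) estimate itself is short:

```python
import numpy as np

def hurst_rs(x, scales):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent."""
    rs = []
    for n in scales:
        vals = []
        for i in range(len(x) // n):
            seg = x[i * n:(i + 1) * n]
            z = np.cumsum(seg - seg.mean())
            r = z.max() - z.min()                  # range of the profile
            s = seg.std()
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    return np.polyfit(np.log(scales), np.log(rs), 1)[0]  # slope ~ H

x = np.random.default_rng(2).normal(size=4096)
print(hurst_rs(x, [8, 16, 32, 64, 128, 256]))      # ~0.5 for white noise
```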
A multidisciplinary database for geophysical time series management
NASA Astrophysics Data System (ADS)
Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.
2013-12-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. The standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series on a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.
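The heartbeat idea in the Loaders layer reduces to a staleness check. A minimal sketch (the loader names and the 600 s timeout are hypothetical, not from the paper):

```python
import time

# Each loader periodically updates its timestamp; the monitor flags
# loaders whose last beat is older than a timeout.
last_beat = {"ascii_loader": time.time(),
             "odbc_loader": time.time() - 900}   # stale example

def stale_loaders(beats, timeout_s=600):
    now = time.time()
    return [name for name, t in beats.items() if now - t > timeout_s]

print(stale_loaders(last_beat))   # -> ['odbc_loader']
```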
Noise analysis of GPS time series in Taiwan
NASA Astrophysics Data System (ADS)
Lee, You-Chia; Chang, Wu-Lung
2017-04-01
The Global Positioning System (GPS) is widely used in studies of plate tectonics and crustal deformation. Most studies have treated GPS time series as containing only time-independent noise (white noise), but the time-dependent noises (flicker noise, random walk noise) identified over nearly twenty years of observations are also important to the precision of the data. The rate uncertainties of stations will be underestimated if the GPS time series are assumed to contain only time-independent noise. Studying the noise properties of GPS time series is therefore necessary in order to assess the precision and reliability of velocity estimates. Our GPS time series come from over 500 stations around Taiwan, with time spans from 2.5 years up to 20 years. The GPS stations include different monument types such as deep drill braced, roof, metal tripod, and concrete pier; the most common type in Taiwan is the metal tripod. We investigated the noise properties of continuous GPS time series by using the spectral index and amplitude of the power-law noise. In the processing we first remove the data outliers, then estimate the linear trend, size of offsets, and seasonal signals, and finally estimate the amplitudes of the power-law and white noise simultaneously. Our preliminary results show that the noise amplitudes of the north component are smaller than those of the other two components, and the largest amplitudes are in the vertical. We also find that the amplitudes of the white and power-law noises are positively correlated in all three components. Comparisons of noise amplitudes across monument types in Taiwan reveal that the deep drill braced monuments have smaller data uncertainties and are therefore more stable than other monuments.
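A rough way to see where a residual series sits between white noise (spectral index ~0), flicker noise (~-1) and random walk (~-2) is a log-log fit to its periodogram. This sketch is only a first-pass diagnostic; studies like this one estimate power-law and white-noise amplitudes simultaneously, typically by maximum likelihood, which is more robust:

```python
import numpy as np

# Quick periodogram-slope estimate of the spectral index of a daily
# position residual series (illustrative diagnostic only).
def spectral_index(residuals, dt_days=1.0):
    n = len(residuals)
    freqs = np.fft.rfftfreq(n, d=dt_days)[1:]                       # drop DC
    power = np.abs(np.fft.rfft(residuals - residuals.mean()))[1:] ** 2
    slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
    return slope   # ~0 white, ~-1 flicker, ~-2 random walk
```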
Modeling Time Series Data for Supervised Learning
ERIC Educational Resources Information Center
Baydogan, Mustafa Gokce
2012-01-01
Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…
Analysis of Time-Series Quasi-Experiments. Final Report.
ERIC Educational Resources Information Center
Glass, Gene V.; Maguire, Thomas O.
The objective of this project was to investigate the adequacy of statistical models developed by G. E. P. Box and G. C. Tiao for the analysis of time-series quasi-experiments: (1) The basic model developed by Box and Tiao is applied to actual time-series experiment data from two separate experiments, one in psychology and one in educational…
A Computer Evolution in Teaching Undergraduate Time Series
ERIC Educational Resources Information Center
Hodgess, Erin M.
2004-01-01
In teaching undergraduate time series courses, we have used a mixture of various statistical packages. We have finally been able to teach all of the applied concepts within one statistical package: R. This article describes the process that we use to conduct a thorough analysis of a time series. An example with a data set is provided. We compare…
Code of Federal Regulations, 2014 CFR
2014-01-01
... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Requirements for biometric identifiers from... Requirements for biometric identifiers from aliens on departure from the United States. (a)(1) The Secretary of... designated port of entry, to provide fingerprints, photograph(s) or other specified biometric identifiers...
GNSS Network Time Series Analysis
NASA Astrophysics Data System (ADS)
Balodis, J.; Janpaule, I.; Haritonova, D.; Normand, M.; Silabriedis, G.; Zarinjsh, A.; Zvirgzds, J.
2012-04-01
Time series of GNSS station results from both the EUPOS®-RIGA and LATPOS networks have been developed at the Institute of Geodesy and Geoinformation (University of Latvia) using Bernese v.5.0 software. The base stations were selected among the EPN and IGS stations in the surroundings of Latvia. The base station selection varied between daily solutions. Most frequently 5-8 base stations were selected from the set {BOR1, JOEN, JOZE, MDVJ, METS, POLV, PULK, RIGA, TORA, VAAS, VISO, VLNS}. The rejection of "bad" base stations was performed by the Bernese software depending on the quality of each station's data on each day; this is why the base station selection differed from day to day. The resulting time series are analysed. Questions arose about the nature of some outlying situations. A seasonal effect in the behaviour of the network has been identified in the analysis of distance and elevation changes between stations. The dependence on various influences has been recognised.
Empirical method to measure stochasticity and multifractality in nonlinear time series
NASA Astrophysics Data System (ADS)
Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping
2013-12-01
An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, the emergent markets are far from efficient. Differences about the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.
A new approach for willingness test in biometric systems
NASA Astrophysics Data System (ADS)
Yang, Kai; Du, Yingzi; Zhou, Zhi
2011-06-01
Biometrics identifies/verifies a person using his/her physiological or behavioral characteristics. It is becoming an important ally for law enforcement and homeland security. However, there are some safety and privacy concerns: biometric-based systems can be accessed when users are under threat, reluctant, or even unconscious. In this paper, we introduce a new method which can identify a person and detect his/her willingness. Our experimental results show that the new approach can enhance security by checking the consent signature while achieving very high recognition accuracy.
Evaluation of Scaling Invariance Embedded in Short Time Series
Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping
2014-01-01
Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidential interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate Shannon entropy from limited records. PMID:25549356
Clinical time series prediction: towards a hierarchical dynamical system framework
Liu, Zitao; Hauskrecht, Milos
2014-01-01
Objective: Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effect of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods: Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results: We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion: A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance.
Clinical time series prediction: Toward a hierarchical dynamical system framework.
Liu, Zitao; Hauskrecht, Milos
2015-09-01
Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient condition, the dynamics of a disease, the effect of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
Characterizing time series: when Granger causality triggers complex networks
NASA Astrophysics Data System (ADS)
Ge, Tian; Cui, Yindong; Lin, Wei; Kurths, Jürgen; Liu, Chong
2012-08-01
In this paper, we propose a new approach to characterize time series with noise perturbations in both the time and frequency domains by combining Granger causality and complex networks. We construct directed and weighted complex networks from time series and use representative network measures to describe their physical and topological properties. Through analyzing the typical dynamical behaviors of some physical models and the MIT-BIH (Massachusetts Institute of Technology-Beth Israel Hospital) human electrocardiogram data sets, we show that the proposed approach is able to capture and characterize various dynamics and has much potential for analyzing real-world time series of rather short length.
Filter-based multiscale entropy analysis of complex physiological time series.
Xu, Yuesheng; Zhao, Liang
2013-08-01
Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
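The reinterpretation is easy to state in code: standard MSE coarse-graining is a boxcar (piecewise-constant) filter followed by downsampling, and FME replaces that filter with one adapted to the signal. A minimal sketch, with a compact and only approximate sample-entropy estimator (not an optimized implementation):

```python
import numpy as np

# MSE-as-filtering sketch: coarse-graining = boxcar filter + downsampling;
# swapping coarse_grain for another filter gives FME in spirit.
def coarse_grain(x, scale):
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)   # boxcar filter

def sample_entropy(x, m=2, r_factor=0.15):
    r = r_factor * np.std(x)
    def matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
        return np.sum(d <= r) - len(templ)                # drop self-matches
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

x = np.random.default_rng(1).normal(size=400)
mse_curve = [sample_entropy(coarse_grain(x, s)) for s in range(1, 6)]
```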
Measurements of spatial population synchrony: influence of time series transformations.
Chevalier, Mathieu; Laffaille, Pascal; Ferdy, Jean-Baptiste; Grenouillet, Gaël
2015-09-01
Two mechanisms have been proposed to explain spatial population synchrony: dispersal among populations, and the spatial correlation of density-independent factors (the "Moran effect"). To identify which of these two mechanisms is driving spatial population synchrony, time series transformations (TSTs) of abundance data have been used to remove the signature of one mechanism and highlight the effect of the other. However, several issues with TSTs remain, and to date no consensus has emerged about how population time series should be handled in synchrony studies. Here, by using 3131 time series involving 34 fish species found in French rivers, we computed several metrics commonly used in synchrony studies to determine whether a large-scale climatic factor (temperature) influenced fish population dynamics at the regional scale, and to test the effect of three commonly used TSTs (detrending, prewhitening and a combination of both) on these metrics. We also tested whether the influence of TSTs on time series and population synchrony levels was related to the features of the time series, using both empirical and simulated time series. For several species, and regardless of the TST used, we found evidence of a Moran effect on freshwater fish populations. However, these results were globally biased downward by TSTs, which reduced our ability to detect significant signals. Depending on the species and the features of the time series, we found that TSTs could lead to contradictory results, regardless of the metric considered. Finally, we suggest guidelines on how population time series should be processed in synchrony studies.
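The two TSTs named in the abstract are simple to write down. This sketch assumes a linear trend and an AR(1) prewhitening filter, which is one common convention among several:

```python
import numpy as np

# Detrending: remove a least-squares linear trend.
def detrend(y):
    t = np.arange(len(y))
    return y - np.polyval(np.polyfit(t, y, 1), t)

# Prewhitening: keep the residuals of a lag-1 autoregression.
def prewhiten(y):
    r = np.corrcoef(y[:-1], y[1:])[0, 1]     # lag-1 autocorrelation
    return y[1:] - r * y[:-1]
```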
Continuous Biometric Authentication for Authorized Aircraft Personnel: A Proposed Design
2003-06-01
Today, there is no way to ensure that the personnel working within the cockpit of an aircraft in flight are authorized to be there.
An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices
Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei
2017-01-01
In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of each volunteer's forearm is measured by a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
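A sketch of the matching step: a weighted Euclidean distance between a 21-sample S21 probe and an enrolled template. The inverse-variance weights and the adaptive threshold rule below are illustrative assumptions, not the published definitions:

```python
import numpy as np

# TATM-style verification sketch: weights favor samples that were stable
# across the 30 enrollment groups; the threshold adapts to that spread.
def tatm_verify(probe, template, template_std, k=2.0):
    w = 1.0 / (template_std ** 2 + 1e-9)
    dist = np.sqrt(np.sum(w * (probe - template) ** 2))
    threshold = k * np.sqrt(np.sum(w * template_std ** 2))
    return dist <= threshold, dist

# enrollment: template = per-sample mean of the 30 groups,
#             template_std = per-sample standard deviation
```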
Neural network versus classical time series forecasting models
NASA Astrophysics Data System (ADS)
Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam
2017-05-01
Artificial neural networks (ANN) have an advantage in time series forecasting, as they have the potential to solve complex forecasting problems. This is because ANN is a data-driven approach that can be trained to map past values of a time series. In this study, the forecast performance of a neural network and a classical time series forecasting method, namely the seasonal autoregressive integrated moving average model, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using the mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when Box-Cox transformation was used for data preprocessing.
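For concreteness, the preprocessing and evaluation steps look like this. The price values are hypothetical, and scipy's boxcox is a stand-in for whatever implementation the authors used:

```python
import numpy as np
from scipy import stats

# The three accuracy measures named in the abstract.
def accuracy(actual, forecast):
    err = actual - forecast
    mad = np.mean(np.abs(err))                    # mean absolute deviation
    rmse = np.sqrt(np.mean(err ** 2))             # root mean square error
    mape = 100.0 * np.mean(np.abs(err / actual))  # mean absolute % error
    return mad, rmse, mape

prices = np.array([1280.5, 1295.0, 1310.2, 1305.7, 1322.9])  # hypothetical gold prices
transformed, lam = stats.boxcox(prices)   # Box-Cox preprocessing before training
```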
Signatures of ecological processes in microbial community time series.
Faust, Karoline; Bauchinger, Franziska; Laroche, Béatrice; de Buyl, Sophie; Lahti, Leo; Washburne, Alex D; Gonze, Didier; Widder, Stefanie
2018-06-28
Growth rates, interactions between community members, stochasticity, and immigration are important drivers of microbial community dynamics. In sequencing data analysis, such as network construction and community model parameterization, we make implicit assumptions about the nature of these drivers and thereby restrict model outcome. Despite apparent risk of methodological bias, the validity of the assumptions is rarely tested, as comprehensive procedures are lacking. Here, we propose a classification scheme to determine the processes that gave rise to the observed time series and to enable better model selection. We implemented a three-step classification scheme in R that first determines whether dependence between successive time steps (temporal structure) is present in the time series and then assesses with a recently developed neutrality test whether interactions between species are required for the dynamics. If the first and second tests confirm the presence of temporal structure and interactions, then parameters for interaction models are estimated. To quantify the importance of temporal structure, we compute the noise-type profile of the community, which ranges from black in case of strong dependency to white in the absence of any dependency. We applied this scheme to simulated time series generated with the Dirichlet-multinomial (DM) distribution, Hubbell's neutral model, the generalized Lotka-Volterra model and its discrete variant (the Ricker model), and a self-organized instability model, as well as to human stool microbiota time series. The noise-type profiles for all but DM data clearly indicated distinctive structures. The neutrality test correctly classified all but DM and neutral time series as non-neutral. The procedure reliably identified time series for which interaction inference was suitable. Both tests were required, as we demonstrated that all structured time series, including those generated with the neutral model, achieved a moderate to high
New surveillance technologies and their publics: A case of biometrics.
Martin, Aaron K; Donovan, Kevin P
2015-10-01
Before a newly-elected government abandoned the project in 2010, for at least eight years the British state actively sought to introduce a mandatory national identification scheme for which the science and technology of biometrics was central. Throughout the effort, government representatives attempted to portray biometrics as a technology that was easily understandable and readily accepted by the public. However, neither task was straightforward. Instead, particular publics emerged that showed biometric technology was rarely well understood and often disagreeable. In contrast to some traditional conceptualizations of the relationship between public understanding and science, it was often those entities that best understood the technology that found it least acceptable, rather than those populations that lacked knowledge. This paper analyzes the discourses that pervaded the case in order to untangle how various publics are formed and exhibit differing, conflicting understandings of a novel technology. © The Author(s) 2014.
Testing for nonlinearity in non-stationary physiological time series.
Guarín, Diego; Delgado, Edilson; Orozco, Álvaro
2011-01-01
Testing for nonlinearity is one of the most important preprocessing steps in nonlinear time series analysis. Typically, this is done by means of the linear surrogate data methods. But it is a known fact that the validity of the results heavily depends on the stationarity of the time series. Since most physiological signals are non-stationary, it is easy to falsely detect nonlinearity using the linear surrogate data methods. In this document, we propose a methodology to extend the procedure for generating constrained surrogate time series in order to assess nonlinearity in non-stationary data. The method is based on the band-phase-randomized surrogates, which consist (contrary to the linear surrogate data methods) in randomizing only a portion of the Fourier phases in the high-frequency domain. Analysis of simulated time series showed that, in comparison to the linear surrogate data method, our method is able to discriminate between linear stationary, linear non-stationary and nonlinear time series. Applying our methodology to heart rate variability (HRV) records of five healthy patients, we found that nonlinear correlations are present in these non-stationary physiological signals.
Permutation entropy of finite-length white-noise time series.
Little, Douglas J; Kane, Deb M
2016-08-01
Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ² distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
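A sketch of the workflow the abstract describes: compute the PE of a candidate series, form a likelihood-ratio statistic, and compare it against the χ² null with D!-1 degrees of freedom. The statistic below is our assumption of the standard multinomial likelihood-ratio link (related to the KLE), not necessarily the paper's exact formulation:

```python
import math
import numpy as np
from scipy import stats

def permutation_entropy(x, D=3):
    """Normalized PE over N = len(x) - D + 1 ordinal patterns."""
    patterns = [tuple(np.argsort(x[i:i + D])) for i in range(len(x) - D + 1)]
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / math.log(math.factorial(D))

rng = np.random.default_rng(0)
x = rng.normal(size=1000)                      # white-noise test series
D, N = 3, len(x) - 3 + 1
pe = permutation_entropy(x, D)
# G = 2*N*ln(D!)*(1 - PE) is asymptotically chi^2 with D!-1 dof under the null
G = 2 * N * math.log(math.factorial(D)) * (1 - pe)
p_value = stats.chi2.sf(G, df=math.factorial(D) - 1)
```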
Time-Series Analysis: A Cautionary Tale
NASA Technical Reports Server (NTRS)
Damadeo, Robert
2015-01-01
Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
Multiresolution analysis of Bursa Malaysia KLCI time series
NASA Astrophysics Data System (ADS)
Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed
2017-05-01
In general, a time series is simply a sequence of numbers collected at regular intervals over a period of time. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time- as well as frequency-domain analysis. Afterwards, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis with the assistance of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT) is used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets, to address how the multiresolution approach improves fitting and forecasting results.
Report of the Defense Science Board Task Force on Defense Biometrics
2007-03-01
certificates, crypto variables, and encoded biometric indices. The Department of Defense has invested prestige and resources in its Common Access Card (CAC...in turn, could be used to unlock an otherwise secret key or crypto variable which would support the remote authentication. A new key variable...The PSA for biometrics should commission development of appropriate threat model(s) and assign responsibility for maintaining currency of the model
A biometric method to secure telemedicine systems.
Zhang, G H; Poon, Carmen C Y; Li, Ye; Zhang, Y T
2009-01-01
Security and privacy are among the most crucial issues for data transmission in telemedicine systems. This paper proposes a solution for securing wireless data transmission in telemedicine systems, i.e. within a body sensor network (BSN), between the BSN and the server, as well as between the server and the professionals who have access to the server. A unique feature of this solution is the generation of random keys from physiological data (i.e. a biometric approach) for securing communication at all three levels. In the performance analysis, the inter-pulse interval of the photoplethysmogram is used as an example to generate these biometric keys to protect wireless data transmission. The results of statistical analysis and computational complexity suggest that this type of key is random enough to make telemedicine systems resistant to attacks.
Time series momentum and contrarian effects in the Chinese stock market
NASA Astrophysics Data System (ADS)
Shi, Huai-Long; Zhou, Wei-Xing
2017-10-01
This paper concentrates on the time series momentum or contrarian effects in the Chinese stock market. We evaluate the performance of the time series momentum strategy applied to major stock indices in mainland China and explore the relation between the performance of time series momentum strategies and some firm-specific characteristics. Our findings indicate that there is a time series momentum effect in the short run and a contrarian effect in the long run in the Chinese stock market. The performances of the time series momentum and contrarian strategies are highly dependent on the look-back and holding periods and firm-specific characteristics.
Playful biometrics: controversial technology through the lens of play.
Ellerbrok, Ariane
2011-01-01
This article considers the role of play in the context of technological emergence and expansion, particularly as it relates to recently emerging surveillance technologies. As a case study, I consider the trajectory of automated face recognition—a biometric technology of numerous applications, from its more controversial manifestations under the rubric of national security to a clearly emerging orientation toward play. This shift toward “playful” biometrics—or from a technology traditionally coded as “hard” to one now increasingly coded as “soft”—is critical insofar as it renders problematic the traditional modes of critique that have, up until this point, challenged the expansion of biometric systems into increasingly ubiquitous realms of everyday life. In response to this dynamic, I propose theorizing the expansion of face recognition specifically in relation to “play,” a step that allows us to broaden the critical space around newly emerging playful biometrics, as well as playful surveillance more generally. In addition, play may also have relevance for theorizing other forms of controversial technology, particularly given its potential role in processes of obfuscation, normalization, and marginalization.
Alternative predictors in chaotic time series
NASA Astrophysics Data System (ADS)
Alves, P. R. L.; Duarte, L. G. S.; da Mota, L. A. C. P.
2017-06-01
In the scheme of reconstruction, non-polynomial predictors improve the forecast from chaotic time series. Algebraic manipulation in the Maple environment is the basis for obtaining accurate predictors. Beyond the different prediction times, the optional arguments of the computational routines optimize the running and the analysis of global mappings.
Online Conditional Outlier Detection in Nonstationary Time Series
Liu, Siqi; Wright, Adam; Hauskrecht, Milos
2017-01-01
The objective of this work is to develop methods for detecting outliers in time series data. Such methods can become the key component of various monitoring and alerting systems, where an outlier may correspond to some adverse condition that needs human attention. However, real-world time series are often affected by various sources of variability present in the environment that may influence the quality of detection; they may (1) explain some of the changes in the signal that would otherwise lead to false positive detections, as well as (2) reduce the sensitivity of the detection algorithm, leading to an increase in false negatives. To alleviate these problems, we propose a new two-layer outlier detection approach that first tries to model and account for the nonstationarity and periodic variation in the time series, and then tries to use other observable variables in the environment to explain any additional signal variation. Our experiments on several data sets in different domains show that our method provides more accurate modeling of the time series, and that it is able to significantly improve outlier detection performance. PMID:29644345
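A minimal two-layer sketch under strong simplifying assumptions (a fixed known period, a linear covariate model, and a robust z-score as the outlier rule; the paper's actual layers are more elaborate):

```python
import numpy as np

# Layer 1 removes a periodic profile; layer 2 regresses the remainder on
# observable environment variables; what is left is scored for outliers.
def detect(y, period, covariates, k=4.0):
    idx = np.arange(len(y)) % period
    seasonal = np.array([y[idx == i].mean() for i in range(period)])[idx]
    r1 = y - seasonal                                    # layer 1 residual
    X = np.column_stack([np.ones(len(y)), covariates])
    beta, *_ = np.linalg.lstsq(X, r1, rcond=None)
    r2 = r1 - X @ beta                                   # layer 2 residual
    med = np.median(r2)
    mad = 1.4826 * np.median(np.abs(r2 - med))           # robust spread
    return np.abs(r2 - med) > k * mad                    # outlier flags
```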
Biometric Fusion Demonstration System Scientific Report
2004-03-01
verification and facial recognition, searching watchlist databases comprised of full or partial facial images or voice recordings. Multiple-biometric...
A perturbative approach for enhancing the performance of time series forecasting.
de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C
2017-04-01
This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, then it can be used to improve the accuracy of the initial model, and a new predictive model is adjusted using the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing up the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods and with ten other results found in the literature. The results show that not only is the performance of the initial model significantly improved, but also that the proposed method outperforms the results previously published. Copyright © 2017 Elsevier Ltd. All rights reserved.
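The loop structure is simple. The sketch below uses an autoregressive model and the Ljung-Box test as stand-ins for the paper's predictive models and its white-noise check (both substitutions are ours, as are the lags, max_stages and alpha values):

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.stats.diagnostic import acorr_ljungbox

# Perturbative cascade sketch: keep fitting models to the residual series
# until it is statistically indistinguishable from white noise.
def perturbative_fit(y, lags=5, max_stages=5, alpha=0.05):
    models, residual = [], np.asarray(y, dtype=float)
    for _ in range(max_stages):
        fit = AutoReg(residual, lags=lags).fit()
        models.append(fit)
        residual = residual[lags:] - fit.fittedvalues   # next stage models this
        pval = acorr_ljungbox(residual, lags=[10], return_df=True).lb_pvalue.iloc[0]
        if pval > alpha:        # residual consistent with white noise: stop
            break
    return models               # final forecast = sum of the stage forecasts
```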
Drunk driving detection based on classification of multivariate time series.
Li, Zhenlong; Jin, Xue; Zhao, Xiaohua
2015-09-01
This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
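The feature-extraction stage can be sketched as bottom-up merging of tiny segments until no merge is cheap enough; each surviving segment then contributes its slope and duration as features for the SVM. The merge threshold below is a hypothetical value:

```python
import numpy as np

# Bottom-up piecewise-linear segmentation sketch for one signal
# (e.g. lateral position); merge cost = residual of a line fit.
def fit_cost(t, y):
    if len(t) < 3:
        return 0.0
    coef = np.polyfit(t, y, 1)
    return float(np.sum((np.polyval(coef, t) - y) ** 2))

def bottom_up(t, y, max_cost=0.5):
    segs = [(i, i + 2) for i in range(0, len(t) - 2, 2)]   # tiny initial segments
    while len(segs) > 1:
        costs = [fit_cost(t[a:c], y[a:c])
                 for (a, _), (_, c) in zip(segs, segs[1:])]
        k = int(np.argmin(costs))
        if costs[k] > max_cost:                            # no cheap merge left
            break
        segs[k] = (segs[k][0], segs[k + 1][1]); del segs[k + 1]
    # per-segment features, as used for the classifier: slope and duration
    feats = [(np.polyfit(t[a:b], y[a:b], 1)[0], t[b - 1] - t[a]) for a, b in segs]
    return segs, feats
```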
Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.
Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi
2015-02-01
We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
Evaluation of scaling invariance embedded in short time series.
Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping
2014-01-01
Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidential interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate Shannon entropy from limited records.
Modelling of Biometric Identification System with Given Parameters Using Colored Petri Nets
NASA Astrophysics Data System (ADS)
Petrosyan, G.; Ter-Vardanyan, L.; Gaboutchian, A.
2017-05-01
Biometric identification systems with given parameters are modelled here using Colored Petri Nets, a modelling language developed for systems in which communication, synchronization and distributed resources play an important role. Colored Petri Nets combine the strengths of Classical Petri Nets with the power of a high-level programming language, and they have formal, intuitive and graphical presentations. A graphical CPN model consists of a set of interacting modules which include a network of places, transitions and arcs. The mathematical representation has a well-defined syntax and semantics, and defines the system's behavioural properties. One of the best-known features used in biometrics is the human fingerprint pattern. During the last decade other human features have become of interest, such as iris-based or face recognition. The objective of this paper is to introduce the fundamental concepts of Petri Nets in relation to tooth shape analysis. The functioning of biometric identification systems has two phases: a data enrollment phase and an identification phase. During the data enrollment phase, images of teeth are added to a database; each record contains enrollment data as a noisy version of the biometrical data corresponding to the individual. During the identification phase, an unknown individual is observed again and compared to the enrollment data in the database, from which the system identifies the individual. The purpose of modeling a biometric identification system by means of Petri Nets is to reveal the following aspects of the functioning model: the efficiency of the model, the behavior of the model, mistakes and accidents in the model, and the feasibility of simplifying the model or substituting its separate components with more effective ones without interfering with system functioning. The results of modeling and evaluating the biometric identification system are presented and discussed.
Zhang, Guang-He; Poon, Carmen C Y; Zhang, Yuan-Ting
2012-01-01
A wireless body sensor network (WBSN), a key building block for m-Health, operates under extremely stringent resource constraints, and thus lightweight security methods are preferred. To minimize resource consumption, utilizing information already available to a WBSN, particularly information common to the different sensor nodes of a WBSN, for security purposes becomes an attractive solution. In this paper, we tested the randomness and distinctiveness of 128-bit biometric binary sequences (BSs) generated from the interpulse intervals (IPIs) of 20 healthy subjects as well as 30 patients suffering from myocardial infarction and 34 subjects with other cardiovascular diseases. The encoding time of a biometric BS on a WBSN node is on average 23 ms and the memory occupation is 204 bytes for any given IPI sequence. The results from five U.S. National Institute of Standards and Technology statistical tests suggest that random biometric BSs can be generated from both healthy subjects and cardiovascular patients and can potentially be used as authentication identifiers for securing WBSNs. Ultimately, it is preferred that these biometric BSs can be used as encryption keys such that key distribution over the WBSN can be avoided.
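One common way to turn IPIs into binary sequences, shown here as a sketch (the abstract does not specify the paper's exact encoder, so this construction is an assumption): quantize each interval and keep its low-order bits, which vary most between heartbeats.

```python
import numpy as np

# IPI-to-bits sketch; needs at least 32 IPIs for a 128-bit sequence
# when bits_per_ipi = 4.
def ipi_bits(ipis_ms, bits_per_ipi=4, n_bits=128):
    q = np.round(ipis_ms).astype(int) % (1 << bits_per_ipi)  # low-order bits
    bits = ((q[:, None] >> np.arange(bits_per_ipi)) & 1).ravel()
    return bits[:n_bits]
```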
Using SAR satellite data time series for regional glacier mapping
NASA Astrophysics Data System (ADS)
Winsvold, Solveig H.; Kääb, Andreas; Nuth, Christopher; Andreassen, Liss M.; van Pelt, Ward J. J.; Schellenberger, Thomas
2018-03-01
With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. On Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows potential for tracking transient snow lines with SAR backscatter time series and correlates with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and how they change in time. The third application scenario investigates the firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals with both the modeled firn air content and modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the area extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential in assisting the process of mapping glaciers outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amount of multi-sensor time series data.
Non-linear forecasting in high-frequency financial time series
NASA Astrophysics Data System (ADS)
Strozzi, F.; Zaldívar, J. M.
2005-08-01
A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis which states that no profitable information about future movements can be obtained by studying the past prices series. In our (off-line) analysis positive gain may be obtained in all those series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
Pseudo-random bit generator based on lag time series
NASA Astrophysics Data System (ADS)
García-Martínez, M.; Campos-Cantón, E.
2014-12-01
In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we introduce a delay in the generation of the time series. When these new series are mapped as xn against xn+1, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
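A stripped-down illustration of the delay idea with a single logistic-map orbit (the actual generator combines two lag series with positive and negative bifurcation parameters; the seed, parameter and lag below are arbitrary):

```python
import numpy as np

# Lag-based bit generator sketch: the delayed comparison hides the
# underlying map when plotting x_n against x_{n+1}.
def prbg(x0=0.41, r=3.99, lag=7, n_bits=1024):
    xs, x = [], x0
    for _ in range(n_bits + lag + 1):
        x = r * x * (1 - x)          # logistic map iteration
        xs.append(x)
    return np.array([1 if xs[i + lag] > xs[i] else 0 for i in range(n_bits)])
```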
miniSEED: The Backbone Data Format for Seismological Time Series
NASA Astrophysics Data System (ADS)
Ahern, T. K.; Benson, R. B.; Trabant, C. M.
2017-12-01
In 1987, the International Federation of Digital Seismograph Networks (FDSN) adopted the Standard for the Exchange of Earthquake Data (SEED) format to be used for data archiving and exchange of seismological time series data. Since that time, the format has evolved to accommodate new capabilities and features. For example, a notable change in 1992 allowed the format, which includes both the comprehensive metadata and the time series samples, to be used in two additional forms: 1) a container for metadata only, called "dataless SEED", and 2) a stand-alone structure for time series, called "miniSEED". While specifically designed for seismological data and related metadata, this format has proven to be a useful format for a wide variety of geophysical time series data. Many FDSN data centers now store temperature, pressure, infrasound, tilt and other time series measurements in this internationally used format. Since April 2016, members of the FDSN have been in discussions to design a next-generation miniSEED format to accommodate current and future needs, to further generalize the format, and to address a number of historical problems or limitations. We believe the correct approach is to simplify the header, allow for arbitrary header additions, expand the current identifiers, and allow for anticipated future identifiers which are currently unknown. We also believe the primary goal of the format is efficient archiving, selection and exchange of time series data. By focusing on these goals we avoid trying to generalize the format too broadly into specialized areas such as efficient, low-latency delivery, or including unbounded non-time-series data. Our presentation will provide an overview of this format and highlight its most valuable characteristics for time series data from any geophysical domain or beyond.
Privacy and Biometric Passports
Vakalis, Ioannis
2011-01-01
This work deals with privacy implications and threats that can emerge with the large-scale use of electronic biometric documents, such as the recently introduced electronic passport (e-Passport). A brief introduction to privacy and personal data protection is followed by a presentation of the technical characteristics of the e-Passport. The description includes the digital data structure and the communication and reading mechanisms of the e-Passport, indicating the possible points and methods of attack. PMID:21380483
Biometric identity management for standard mobile medical networks.
Egner, Alexandru; Soceanu, Alexandru; Moldoveanu, Florica
2012-01-01
The explosion of healthcare costs over the last decade has prompted the ICT industry to respond with solutions for reducing costs while improving healthcare quality. The ISO/IEEE 11073 family of standards recently released is the first step towards interoperability of mobile medical devices used in patient environments. The standards do not, however, tackle security problems such as identity management or the secure exchange of medical data. This paper proposes an enhancement of the ISO/IEEE 11073-20601 protocol with an identity management system based on biometrics. The paper describes a novel biometric-based authentication process, together with the biometric key generation algorithm. The proposed extension of the ISO/IEEE 11073-20601 is also presented.
False-nearest-neighbors algorithm and noise-corrupted time series
NASA Astrophysics Data System (ADS)
Rhodes, Carl; Morari, Manfred
1997-05-01
The false-nearest-neighbors (FNN) algorithm was originally developed to determine the embedding dimension for autonomous time series. For noise-free computer-generated time series, the algorithm does a good job in predicting the embedding dimension. However, the problem of predicting the embedding dimension when the time-series data are corrupted by noise was not fully examined in the original studies of the FNN algorithm. Here it is shown that with large data sets, even small amounts of noise can lead to incorrect prediction of the embedding dimension. Surprisingly, as the length of the time series analyzed by FNN grows larger, the cause of incorrect prediction becomes more pronounced. An analysis of the effect of noise on the FNN algorithm and a solution for dealing with the effects of noise are given here. Some results on the theoretically correct choice of the FNN threshold are also presented.
Biometrics encryption combining palmprint with two-layer error correction codes
NASA Astrophysics Data System (ADS)
Li, Hengjian; Qiu, Jian; Dong, Jiwen; Feng, Guang
2017-07-01
To bridge the gap between the fuzziness of biometrics and the exactitude of cryptography, a novel biometrics encryption method based on combining palmprint features with two-layer error correction codes is proposed. First, randomly generated original keys are encoded by convolutional and cyclic two-layer coding. The first layer uses a convolutional code to correct burst errors; the second layer uses a cyclic code to correct random errors. Then, palmprint features are extracted from the palmprint images and fused with the encoded keys by an XOR operation. The resulting information is stored in a smart card. Finally, the original keys are extracted by XORing the information in the smart card with the user's palmprint features and then decoding with the convolutional and cyclic two-layer code. The experimental results and security analysis show that the method can recover the original keys completely. The proposed method is more secure than a single password factor, and has higher accuracy than a single biometric factor.
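The XOR key-binding structure works with any linear code. This sketch substitutes a 3x repetition code for the paper's convolutional+cyclic two-layer code, so one flipped feature bit per 3-bit block is corrected; the key size and feature bits are arbitrary stand-ins:

```python
import numpy as np

# Fuzzy-commitment-style sketch with a repetition code standing in
# for the two-layer convolutional+cyclic code.
def encode(key_bits):
    return np.repeat(key_bits, 3)                 # 3x repetition code

def decode(codeword):
    return (codeword.reshape(-1, 3).sum(axis=1) >= 2).astype(int)  # majority vote

rng = np.random.default_rng(7)
key = rng.integers(0, 2, 16)
palm_enroll = rng.integers(0, 2, 48)              # stand-in palmprint feature bits
stored = encode(key) ^ palm_enroll                # what goes on the smart card

palm_query = palm_enroll.copy()
palm_query[5] ^= 1                                # one noisy feature bit at query
recovered = decode(stored ^ palm_query)           # XOR then error-correct
assert np.array_equal(recovered, key)
```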
Scalable Prediction of Energy Consumption using Incremental Time Series Clustering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simmhan, Yogesh; Noor, Muhammad Usman
2013-10-09
Time series datasets are a canonical form of high-velocity Big Data, often generated by pervasive sensors such as those found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.
Practical Pocket PC Application w/Biometric Security
NASA Technical Reports Server (NTRS)
Logan, Julian
2004-01-01
I work in the Flight Software Engineering Branch, where we provide design and development of embedded real-time software applications for flight and supporting ground systems to support the NASA Aeronautics and Space Programs. In addition, this branch evaluates, develops and implements new technologies for embedded real-time systems, and maintains a laboratory for applications of embedded technology. The majority of microchips that are used in modern society have been programmed using embedded technology. These small chips can be found in microwaves, calculators, home security systems, cell phones and more. My assignment this summer entails working with an iPAQ HP 5500 Pocket PC. This top-of-the-line hand-held device is one of the first mobile PC's to introduce biometric security capabilities. Biometric security, in this case a fingerprint authentication system, is on the edge of technology as far as securing information. The benefits of fingerprint authentication are enormous. The most significant of them are that it is extremely difficult to reproduce someone else's fingerprint, and it is equally difficult to lose or forget your own fingerprint as opposed to a password or pin number. One of my goals for this summer is to integrate this technology with another Pocket PC application. The second task for the summer is to develop a simple application that provides an Astronaut EVA (Extravehicular Activity) Log Book capability. The Astronaut EVA Log Book is what an astronaut would use to report the status of field missions, crew physical health, successes, future plans, etc. My goal is to develop a user interface into which these data fields can be entered and stored. The applications that I am developing are created using eMbedded Visual C++ 4.0 with the Pocket PC 2003 Software Development Kit provided by Microsoft.
A data mining framework for time series estimation.
Hu, Xiao; Xu, Peng; Wu, Shaozhi; Asgari, Shadnaz; Bergsneider, Marvin
2010-04-01
Time series estimation techniques are usually employed in biomedical research to derive variables less accessible from a set of related and more accessible variables. These techniques are traditionally built from systems modeling approaches including simulation, blind deconvolution, and state estimation. In this work, we define the target time series (TTS) and its related time series (RTS) as the output and input of a time series estimation process, respectively. We then propose a novel data mining framework for time series estimation when TTS and RTS represent different sets of observed variables from the same dynamic system. This is made possible by mining a database of instances of TTS, its simultaneously recorded RTS, and the input/output dynamic models between them. The key mining strategy is to formulate a mapping function for each TTS-RTS pair in the database that translates a feature vector extracted from RTS to the dissimilarity between the true TTS and its estimate from the dynamic model associated with the same TTS-RTS pair. At run time, a feature vector is extracted from an inquiry RTS and supplied to the mapping function associated with each TTS-RTS pair to calculate a dissimilarity measure. An optimal TTS-RTS pair is then selected by analyzing these dissimilarity measures. The associated input/output model of the selected TTS-RTS pair is then used to simulate the TTS given the inquiry RTS as an input. An exemplary implementation was built to address a biomedical problem of noninvasive intracranial pressure assessment. The performance of the proposed method was superior to that of a simple training-free approach of finding the optimal TTS-RTS pair by a conventional similarity-based search on RTS features.
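The pair-selection strategy can be sketched as follows in Python; the RTS features, the k-NN "mapping function", and the synthetic database are all illustrative assumptions, and the input/output models are left as placeholders.

    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    def features(rts):                        # toy RTS feature vector
        return np.array([rts.mean(), rts.std(), np.abs(np.diff(rts)).mean()])

    class Entry:
        """One TTS-RTS pair: a mapping function plus its input/output model."""
        def __init__(self, feats, dissims, io_model=None):
            self.mapper = KNeighborsRegressor(n_neighbors=3).fit(feats, dissims)
            self.io_model = io_model          # would simulate TTS from RTS

    def select_entry(database, inquiry_rts):
        f = features(inquiry_rts).reshape(1, -1)
        scores = [e.mapper.predict(f)[0] for e in database]
        return int(np.argmin(scores))         # pair with lowest predicted dissimilarity

    rng = np.random.default_rng(0)
    db = [Entry(rng.normal(size=(20, 3)), rng.uniform(0, 1, 20)) for _ in range(3)]
    print(select_entry(db, rng.normal(size=200)))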
Using Time-Series Regression to Predict Academic Library Circulations.
ERIC Educational Resources Information Center
Brooks, Terrence A.
1984-01-01
Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), monthly average (naive forecasting). In tests of forecasting accuracy, dummy regression method and monthly mean method exhibited smallest average…
Biometric Analysis - A Reliable Indicator for Diagnosing Taurodontism using Panoramic Radiographs.
Hegde, Veda; Anegundi, Rajesh Trayambhak; Pravinchandra, K R
2013-08-01
Taurodontism is a clinical entity with a morpho-anatomical change in the shape of the tooth, which was thought to be absent in modern man. Taurodontism is mostly observed as an isolated trait or a component of a syndrome. Various techniques have been devised to diagnose taurodontism. The aim of this study was to analyse whether a biometric analysis was useful in diagnosing taurodontism in radiographs which appeared to be normal on cursory observation. This study was carried out in our institution by using radiographs which were taken for routine procedures. In this retrospective study, panoramic radiographs were obtained from the dental records of children aged between 9-14 years who did not have any abnormality on cursory observation. Biometric analyses were carried out on permanent mandibular first molar(s) by using a novel biometric method. The values were tabulated and analysed. The Fisher exact probability test, Chi-square test and Chi-square test with Yates correction were used for statistical analysis of the data. Cursory observation did not yield any case of taurodontism. In contrast, the biometric analysis yielded a statistically significant number of cases of taurodontism. However, there was no statistically significant difference in the number of cases with taurodontism between the genders or across the age group considered. Thus, taurodontism was diagnosed on a biometric analysis which was otherwise missed on cursory observation. It is therefore necessary, from the clinical point of view, to diagnose even the mildest form of taurodontism by using metric analysis rather than just relying on a visual radiographic assessment, as its occurrence has many clinical implications and diagnostic importance.
JWST NIRCam Time Series Observations
NASA Technical Reports Server (NTRS)
Greene, Tom; Schlawin, E.
2017-01-01
We explain how to make time-series observations with the Near-Infrared Camera (NIRCam) science instrument of the James Webb Space Telescope. Both photometric and spectroscopic observations are described. We present the basic capabilities and performance of NIRCam and show examples of how to set its observing parameters using the Space Telescope Science Institute's Astronomer's Proposal Tool (APT).
Biometrics based key management of double random phase encoding scheme using error control codes
NASA Astrophysics Data System (ADS)
Saini, Nirmala; Sinha, Aloka
2013-08-01
In this paper, an optical security system is proposed in which the key of the double random phase encoding technique is linked to the biometrics of the user to make it user specific. The error in recognition due to biometric variation is corrected by encoding the key using a BCH code. A user-specific shuffling key is used to increase the separation between the genuine and impostor Hamming distance distributions. This shuffling key is then further secured using RSA public-key encryption to enhance the security of the system. An XOR operation is performed between the encoded key and the feature vector obtained from the biometrics. The RSA-encoded shuffling key and the data obtained from the XOR operation are stored in a token. The main advantage of the present technique is that key retrieval is possible only in the simultaneous presence of the token and the biometrics of the user, which not only authenticates the presence of the original input but also secures the key of the system. Computational experiments showed the effectiveness of the proposed technique for key retrieval in the decryption process by using the live biometrics of the user.
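A Python sketch of the token construction and key retrieval, with a repetition code standing in for BCH and the cryptography package (an assumed dependency) providing RSA-OAEP; the feature size, key size, and noiseless live sample are simplifying assumptions.

    import numpy as np
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    rng = np.random.default_rng(0)
    features = rng.integers(0, 2, 64, dtype=np.uint8)   # binary biometric features
    key = rng.integers(0, 2, 16, dtype=np.uint8)
    encoded = np.repeat(key, 4)                          # toy ECC in place of BCH

    seed = 12345                                         # user-specific shuffling key
    perm = np.random.default_rng(seed).permutation(64)   # shuffling widens impostor distances
    bound = encoded ^ features[perm]                     # XOR binding

    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    enc_seed = priv.public_key().encrypt(seed.to_bytes(4, "big"), oaep)
    token = (bound, enc_seed)                            # contents stored in the token

    # Key retrieval: decrypt the seed, re-shuffle live features, XOR, decode.
    seed2 = int.from_bytes(priv.decrypt(enc_seed, oaep), "big")
    perm2 = np.random.default_rng(seed2).permutation(64)
    noisy = bound ^ features[perm2]                      # live sample assumed error-free here
    recovered = (noisy.reshape(-1, 4).sum(axis=1) > 2).astype(np.uint8)
    assert np.array_equal(recovered, key)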
Time series analysis of InSAR data: Methods and trends
NASA Astrophysics Data System (ADS)
Osmanoğlu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cabral-Cano, Enrique
2016-05-01
Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
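A simplified one-dimensional illustration in Python: interferometric phase is only measured modulo 2π, and temporal unwrapping recovers the displacement signal provided consecutive samples differ by less than half a cycle (the series below is synthetic).

    import numpy as np

    # Synthetic line-of-sight displacement, expressed as interferometric phase.
    t = np.linspace(0, 1, 100)
    true_phase = 12 * t + 3 * np.sin(2 * np.pi * t)   # exceeds one cycle overall

    wrapped = np.angle(np.exp(1j * true_phase))       # what InSAR actually measures
    unwrapped = np.unwrap(wrapped)                    # temporal unwrapping

    # Unwrapping recovers the signal up to a constant 2*pi*k offset.
    offset = true_phase[0] - unwrapped[0]
    assert np.allclose(unwrapped + offset, true_phase, atol=1e-6)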
CauseMap: fast inference of causality from complex time series.
Maher, M Cyrus; Hernandez, Ryan D
2015-01-01
Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens' Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from an opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from time series of, e.g., a single individual, it may be a valuable tool for personalized medicine. We implement CCM in Julia, a high-performance language for scientific computing.
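A minimal Python rendering of the CCM test on a toy coupled system where x drives y but not vice versa; the embedding dimension, delay, and coupled logistic map are illustrative choices rather than CauseMap's defaults.

    import numpy as np

    def embed(x, E=3, tau=1):
        n = len(x) - (E - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(E)])

    def cross_map_skill(source, target, E=3, tau=1):
        # Shadow manifold of `target`; its E+1 nearest neighbors estimate `source`.
        M = embed(target, E, tau)
        src = source[(E - 1) * tau :]
        est = np.empty(len(M))
        for i, p in enumerate(M):
            d = np.linalg.norm(M - p, axis=1)
            d[i] = np.inf                              # exclude self-match
            nn = np.argsort(d)[: E + 1]
            w = np.exp(-d[nn] / max(d[nn][0], 1e-12))  # standard CCM weights
            est[i] = np.sum(w * src[nn]) / w.sum()
        return np.corrcoef(est, src)[0, 1]             # cross-map skill

    # Coupled logistic maps: x evolves autonomously, x influences y.
    n = 500
    x, y = np.empty(n), np.empty(n)
    x[0], y[0] = 0.4, 0.2
    for t in range(n - 1):
        x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
        y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.2 * x[t])

    print("x->y:", cross_map_skill(x, y))   # high: y's manifold encodes x
    print("y->x:", cross_map_skill(y, x))   # lower: x is unaffected by y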
Providing web-based tools for time series access and analysis
NASA Astrophysics Data System (ADS)
Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane
2014-05-01
Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data have to be found, downloaded, processed and even converted into the correct data format prior to executing time series analysis tools. Data have to be prepared for use in different existing software packages. Several packages like TIMESAT (Jönsson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data is then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) are visualized in the web portal and can be downloaded for further usage. As a first use case, we built up a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible so that users can focus on the interpretation of the results. References: Jönsson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. Computers and Geosciences 30
Transforming Security Screening With Biometrics
2003-04-09
prompted the Defense Advanced Research Projects Agency to experiment with facial recognition technology for identification of known terrorists. While DoD...screening of individuals. Facial recognition technology has been tested to some degree for accessing highly sensitive military areas, but not for...the military can implement facial recognition to screen personnel requesting access to bases and stations, DoD is not likely to use biometrics to
NASA Astrophysics Data System (ADS)
Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron
2009-10-01
A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we retrospectively identified (2003-8) two groups of patients from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures, and the experimental group had confirmed right temporal lobe epilepsy through a seizure-free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either sleep or the period preceding seizure onset. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow gamma statistics, i.e., p(ξ) = Γ^{-1}(k) ξ^{k} exp(−ξ). This probability function has well-known properties, and its Shannon entropy can be expressed in terms of the Γ-function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn calculated for the generated time series for different values of k mimics the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in the statistics of heart rate fluctuations.
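The ApEn computation itself is short; below is an illustrative Python version applied to gamma-distributed surrogates (m = 2 and r = 0.2·SD are common defaults, not necessarily the study's settings).

    import numpy as np

    def apen(x, m=2, r_frac=0.2):
        # Approximate entropy (Pincus): phi(m) - phi(m+1), self-matches included.
        r = r_frac * np.std(x)
        def phi(mm):
            emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
            c = [(np.max(np.abs(emb - v), axis=1) <= r).mean() for v in emb]
            return np.mean(np.log(c))
        return phi(m) - phi(m + 1)

    # Gamma-distributed surrogates: ApEn varies with the shape parameter k.
    rng = np.random.default_rng(0)
    for k in (1, 4, 16):
        xi = rng.gamma(shape=k, size=1000)
        print(k, round(apen(xi), 3))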
PRESEE: an MDL/MML algorithm to time-series stream segmenting.
Xu, Kaikuo; Jiang, Yexi; Tang, Mingjie; Yuan, Changan; Tang, Changjie
2013-01-01
Time-series stream is one of the most common data types in the data mining field. It is prevalent in fields such as stock market, ecology, and medical care. Segmentation is a key step to accelerate the processing speed of time-series stream mining. Previous segmentation algorithms mainly focused on ameliorating precision without paying much attention to efficiency. Moreover, the performance of these algorithms depends heavily on parameters, which are hard for users to set. In this paper, we propose PRESEE (parameter-free, real-time, and scalable time-series stream segmenting algorithm), which greatly improves the efficiency of time-series stream segmenting. PRESEE is based on both MDL (minimum description length) and MML (minimum message length) methods, which can segment the data automatically. To evaluate the performance of PRESEE, we conduct several experiments on time-series streams of different types and compare it with the state-of-the-art algorithm. The empirical results show that PRESEE is very efficient for real-time stream datasets, improving segmenting speed nearly ten times. The novelty of this algorithm is further demonstrated by the application of PRESEE in segmenting real-time stream datasets from the ChinaFLUX sensor network data stream.
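An MDL-flavored bottom-up sketch in Python: adjacent segments are merged while the total description length (residual coding cost plus per-segment model bits) decreases; the cost function is a simplified stand-in for PRESEE's MDL/MML criteria.

    import numpy as np

    def seg_cost(y):
        # Description length proxy: Gaussian residual bits + model bits.
        t = np.arange(len(y))
        a, b = np.polyfit(t, y, 1)
        var = max((y - (a * t + b)).var(), 1e-12)
        return 0.5 * len(y) * np.log(var) + np.log(len(y))

    def segment(y, init=10):
        bounds = list(range(0, len(y), init)) + [len(y)]
        while len(bounds) > 2:
            costs = [seg_cost(y[bounds[i]:bounds[i + 2]])
                     - seg_cost(y[bounds[i]:bounds[i + 1]])
                     - seg_cost(y[bounds[i + 1]:bounds[i + 2]])
                     for i in range(len(bounds) - 2)]
            i = int(np.argmin(costs))
            if costs[i] >= 0:          # no merge shrinks the description length
                break
            del bounds[i + 1]          # merge the best adjacent pair
        return bounds

    y = np.concatenate([np.linspace(0, 5, 100), np.linspace(5, 0, 100)]) \
        + np.random.default_rng(0).normal(0, 0.2, 200)
    print(segment(y))                  # boundaries near 0, 100, 200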
Local normalization: Uncovering correlations in non-stationary financial time series
NASA Astrophysics Data System (ADS)
Schäfer, Rudi; Guhr, Thomas
2010-09-01
The measurement of correlations between financial time series is of vital importance for risk management. In this paper we address an estimation error that stems from the non-stationarity of the time series. We put forward a method to rid the time series of local trends and variable volatility, while preserving cross-correlations. We test this method in a Monte Carlo simulation, and apply it to empirical data for the S&P 500 stocks.
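A sketch of the local-normalization step in Python: subtract a sliding local mean and divide by the local standard deviation, so slow trends and changing volatility drop out while short-range co-movements survive (the 13-point window is an illustrative choice).

    import numpy as np

    def local_normalize(x, w=13):
        # Sliding-window mean/std with edge padding; output aligns with x.
        pad = w // 2
        xp = np.pad(x, pad, mode="edge")
        win = np.lib.stride_tricks.sliding_window_view(xp, w)
        mu, sd = win.mean(axis=1), win.std(axis=1)
        return (x - mu) / np.where(sd > 0, sd, 1.0)

    rng = np.random.default_rng(0)
    x = np.cumsum(rng.normal(0, 1, 1000)) + np.linspace(0, 50, 1000)  # trending series
    print(local_normalize(x)[:5])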
Fuzzy time-series based on Fibonacci sequence for stock price forecasting
NASA Astrophysics Data System (ADS)
Chen, Tai-Liang; Cheng, Ching-Hsue; Jong Teoh, Hia
2007-07-01
Time-series models have been utilized to make reasonably accurate predictions in the areas of stock price movements, academic enrollments, weather, etc. To promote the forecasting performance of fuzzy time-series models, this paper proposes a new model, which incorporates the concept of the Fibonacci sequence, the framework of Song and Chissom's model and the weighted method of Yu's model. This paper employs a 5-year period of TSMC (Taiwan Semiconductor Manufacturing Company) stock price data and a 13-year period of TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) stock index data as experimental datasets. By comparing our forecasting performance with that of Chen's (Forecasting enrollments based on fuzzy time-series. Fuzzy Sets Syst. 81 (1996) 311-319), Yu's (Weighted fuzzy time-series models for TAIEX forecasting. Physica A 349 (2004) 609-624) and Huarng's (The application of neural networks to forecast fuzzy time series. Physica A 336 (2006) 481-491) models, we conclude that the proposed model surpasses these conventional fuzzy time-series models in accuracy.
Ocular biometric characteristics of cataract patients in western China.
Huang, Qing; Huang, Yongzhi; Luo, Qu; Fan, Wei
2018-04-17
We aimed to measure ocular biometric characteristics in older cataract patients from western China. Ocular biometry records were retrospectively analyzed for 6933 patients with cataracts (6933 eyes) at least 50 years old who were treated at West China Hospital of Sichuan University. Partial coherence laser interferometry gave the following population averages: axial length (AL), 24.32 ± 2.42 mm; anterior chamber depth (ACD), 3.08 ± 0.47 mm; keratometric power (K), 44.23 ± 1.66 diopters; and corneal astigmatism (CA), 1.00 ± 0.92 diopters. The percentage of individuals with AL > 26.5 mm was 13.66%, while the percentage with CA > 1.0 diopters was 35.54%. Mean AL and ACD showed a decreasing trend with increasing age (P < 0.001). AL correlated positively with ACD (Spearman coefficient, 0.542) and CA (0.111), but negatively with K (-0.411) (all P < 0.01). K also correlated negatively with ACD (-0.078, P < 0.01). These results show, for the first time, that older cataract patients from western China have ocular biometric characteristics similar to those of other populations. The high prevalence of severe axial myopia warrants further investigation.
Ocular biometric measurements in cataract surgery candidates in Portugal.
Ferreira, Tiago B; Hoffer, Kenneth J; Ribeiro, Filomena; Ribeiro, Paulo; O'Neill, João G
2017-01-01
Describe the ocular biometric parameters and their associations in a population of cataract surgery candidates. A cross-sectional study of 13,012 eyes of 6,506 patients was performed. Biometric parameters of the eyes were measured by optical low-coherence reflectometry. The axial length (AL), mean keratometry (K) and astigmatism, anterior chamber depth (ACD) (epithelium to lens), lens thickness (LT), and Corneal Diameter (CD) were evaluated. The mean age was 69 ± 10 years (44-99 years). Mean AL, Km, and ACD were 23.87 ± 1.55 mm (19.8-31.92 mm), 43.91 ± 1.71 D (40.61-51.14 D), and 3.25 ± 0.44 mm (2.04-5.28 mm), respectively. The mean LT was 4.32 ± 0.49 mm (2.73-5.77 mm) and the mean CD was 12.02 ± 0.46 mm (10.50-14.15 mm). The mean corneal astigmatism was 1.08 ± 0.84 D (0.00-7.58 D) and 43.5% of eyes had astigmatism ≥ 1.00 D. Male patients had longer AL and ACDs (p < .001) and flatter corneas (p < .001). In regression models considering age, gender, Km, ACD, LT, and CD, a longer AL was associated with being male and having higher ACD, LT and CD. These data represent normative biometric values for the Portuguese population. The greatest predictor of ocular biometrics was gender. There was no significant correlation between age and AL, ACD, or Km. These results may be relevant in the evaluation of refractive error and in the calculation of intraocular lens power.
Aggregated Indexing of Biomedical Time Series Data
Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A.T.
2016-01-01
Remote and wearable medical sensing has the potential to create very large and high-dimensional datasets. Medical time series databases must be able to efficiently store, index, and mine these datasets to enable medical professionals to effectively analyze data collected from their patients. Conventional high-dimensional indexing methods are a two-stage process. First, a superset of the true matches is efficiently extracted from the database. Second, supersets are pruned by comparing each of their objects to the query object and rejecting any objects falling outside a predetermined radius. This pruning stage heavily dominates the computational complexity of most conventional search algorithms. Therefore, indexing algorithms can be significantly improved by reducing the amount of pruning. This paper presents an online algorithm to aggregate biomedical time series data to significantly reduce the search space (index size) without compromising the quality of search results. This algorithm is built on the observation that biomedical time series signals are composed of cyclical and often similar patterns. It takes in a stream of segments and groups them into highly concentrated collections. Locality Sensitive Hashing (LSH) is used to reduce the overall complexity of the algorithm, allowing it to run online. The output of this aggregation is used to populate an index. The proposed algorithm yields logarithmic growth of the index (with respect to the total number of objects) while keeping sensitivity and specificity simultaneously above 98%. Both memory and runtime complexities of time series search are improved when using aggregated indexes. In addition, data mining tasks, such as clustering, exhibit runtimes that are orders of magnitude faster when run on aggregated indexes. PMID:27617298
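A Python sketch of random-hyperplane LSH used to group similar segments into aggregates, shrinking the index; the segment length, number of hyperplanes, and the quasi-periodic signal are illustrative assumptions.

    import numpy as np

    def lsh_key(segment, planes):
        return tuple((planes @ segment > 0).astype(int))   # sign pattern = bucket

    rng = np.random.default_rng(0)
    planes = rng.normal(size=(8, 50))                      # 8 hyperplanes, 50-sample segments

    # Quasi-periodic signal (like ECG beats): similar cycles hash together.
    t = np.arange(5000)
    signal = np.sin(2 * np.pi * t / 50) + 0.05 * rng.normal(size=t.size)
    segments = signal.reshape(-1, 50)                      # one segment per cycle

    buckets = {}
    for seg in segments:
        buckets.setdefault(lsh_key(seg, planes), []).append(seg)

    # Each bucket is represented in the index by its centroid only.
    index = {k: np.mean(v, axis=0) for k, v in buckets.items()}
    print(len(segments), "segments ->", len(index), "aggregates")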
Analyses of Inhomogeneities in Radiosonde Temperature and Humidity Time Series.
NASA Astrophysics Data System (ADS)
Zhai, Panmao; Eskridge, Robert E.
1996-04-01
Twice-daily radiosonde data from selected stations in the United States (1948 to 1990) and China (1958 to 1990) were sorted into time series. These stations have one sounding taken in darkness and the other in sunlight. The analysis shows that the 0000 and 1200 UTC time series are highly correlated. Therefore, the Easterling and Peterson technique was tested on the 0000 and 1200 UTC time series to detect inhomogeneities and to estimate the size of the biases. Discontinuities were detected using the difference series created from the 0000 and 1200 UTC time series. To establish that a detected bias was significant, a t test was performed to confirm that the change occurs in the daytime series but not in the nighttime series. Both U.S. and Chinese radiosonde temperature and humidity data include inhomogeneities caused by changes in radiosonde sensors and observation times. The U.S. humidity data have inhomogeneities that were caused by instrument changes and the censoring of data: the practice of reporting relative humidity as 19% when it is lower than 20% or the temperature is below −40°C is called censoring. This combination of procedural and instrument changes makes the detection of biases and adjustment of the data very difficult. In the Chinese temperatures, there are inhomogeneities related to a change in the radiation correction procedure. Test results demonstrate that a modified Easterling and Peterson method is suitable for detecting and adjusting time series radiosonde data. Accurate station histories are very desirable, since they can confirm that detected inhomogeneities are related to instrument or procedural changes; adjustments can then be made to the data with some confidence.
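A synthetic Python illustration of the detection logic: a sensor change injects a daytime-only bias, which shows up as a shift in the 0000−1200 UTC difference series and is confirmed by t tests applied to the day and night series separately.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, change = 720, 360
    night = 10 + rng.normal(0, 1.0, n)                 # nighttime series
    day = night + 2.0 + rng.normal(0, 1.0, n)          # daytime series
    day[change:] += 0.8                                # daytime-only sensor bias

    diff = day - night                                 # difference series
    t, p = stats.ttest_ind(diff[:change], diff[change:])
    print(f"difference-series shift: t={t:.1f}, p={p:.2g}")

    # A true bias appears in the daytime series but not the nighttime series.
    print(stats.ttest_ind(day[:change], day[change:]).pvalue,
          stats.ttest_ind(night[:change], night[change:]).pvalue)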
Self-organising mixture autoregressive model for non-stationary time series modelling.
Ni, He; Yin, Hujun
2008-12-01
Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In this way, a globally non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented, and the results show that the proposed SOMAR network is effective and superior to other similar approaches.
Body identification, biometrics and medicine: ethical and social considerations.
Mordini, Emilio; Ottolini, Corinna
2007-01-01
Identity is important when it is weak. This apparent paradox is the core of the current debate on identity. Traditionally, verification of identity has been based upon authentication of attributed and biographical characteristics. After small-scale societies and large-scale industrial societies, globalization represents the third period of personal identification. The human body lies at the heart of all strategies for identity management. The tension between the human body and personal identity is critical in the health care sector, which is second only to the financial sector in terms of the number of biometric users. Many hospitals and healthcare organizations are in the process of deploying biometric security architectures. Secure identification is critical in the health care system: to control logical access to centralized archives of digitized patients' data, to limit physical access to buildings and hospital wards, and to authenticate medical and social support personnel. There is also an increasing need to identify patients with a high degree of certainty. Finally, there is the risk that biometric authentication devices can inadvertently reveal health information. All these issues require careful ethical and political scrutiny.
Okpala, Charles Odilichukwu R; Bono, Gioacchino
2016-03-15
The practicality of seafood biometrics cannot be overemphasized, particularly for competent authorities of the shrimp industry. However, there is a paucity of literature on the relationship between biometric and physicochemical indices of freshly harvested shrimp. This work therefore investigated the relationship between biometric (standard length (SL), total weight (TW) and condition factor (CF)) and physicochemical (moisture content, pH, titratable acidity, water activity, water retention index, colour values and fracturability) characteristics of freshly harvested Pacific white shrimp (Litopenaeus vannamei) obtained from three different farms. The relationships between these parameters were determined using correlation and regression analyses. No significant correlation (P > 0.05) was found between the biometric and physicochemical indices of the sampled L. vannamei specimens. Possibly the lack of post-mortem and physical change(s) at the day of harvest, together with the absence of a temporal variable, may have limited the degree of any significant correlation between the biometric and physicochemical data points measured in this study. Although the TWs of freshly harvested L. vannamei shrimp were similar (P > 0.05), SL and CF differed significantly (P < 0.05) with minimal explained variance. Moreover, some biometric and physicochemical variables were independently correlated (P < 0.05). The data indicated that no significant correlation existed between biometric and physicochemical characteristics of freshly harvested L. vannamei shrimp; across the farms studied, however, the biometric data were comparable. To the best of our knowledge, this is the first study to investigate the biometric and physicochemical properties of freshly harvested shrimp using a comparative approach, which is also applicable to other economically important aquaculture species. Overall, this work provides useful information for competent authorities/stakeholders of the fishery industry and
Gaze Estimation for Off-Angle Iris Recognition Based on the Biometric Eye Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karakaya, Mahmut; Barstow, Del R; Santos-Villalobos, Hector J
Iris recognition is among the most accurate biometrics. However, its accuracy relies on controlled, high-quality capture data and is negatively affected by several factors such as angle, occlusion, and dilation. Non-ideal iris recognition is a new research focus in biometrics. In this paper, we present a gaze estimation method designed for use in an off-angle iris recognition framework based on the ANONYMIZED biometric eye model. Gaze estimation is an important prerequisite step to correct an off-angle iris image. To achieve an accurate frontal reconstruction of an off-angle iris image, we first need to estimate the eye gaze direction from elliptical features of the iris image. Typically, additional information such as well-controlled light sources, head-mounted equipment, and multiple cameras is not available. Our approach utilizes only the iris and pupil boundary segmentation, allowing it to be applicable to all iris capture hardware. We compare the boundaries with a look-up-table generated by using our biologically inspired biometric eye model and find the closest feature point in the look-up-table to estimate the gaze. Based on the results from real images, the proposed method shows effectiveness in gaze estimation accuracy for our biometric eye model, with an average error of approximately 3.5 degrees over a 50 degree range.
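The look-up-table step reduces to nearest-neighbor matching; the Python sketch below uses a plain circular-pupil foreshortening model (cosine of the gaze angle) as a stand-in for the full biometric eye model.

    import numpy as np

    # Precompute apparent pupil-ellipse axis ratios over a range of gaze angles,
    # then match a measured ratio to its nearest table entry.
    angles = np.linspace(0, 50, 501)                 # gaze angle in degrees
    lut_ratio = np.cos(np.radians(angles))           # simple foreshortening model

    def estimate_gaze(measured_ratio):
        return angles[np.argmin(np.abs(lut_ratio - measured_ratio))]

    # A pupil imaged 30 degrees off-axis appears with axis ratio cos(30 deg).
    print(estimate_gaze(np.cos(np.radians(30.0))))   # ~30.0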
Indispensable finite time corrections for Fokker-Planck equations from time series data.
Ragwitz, M; Kantz, H
2001-12-17
The reconstruction of Fokker-Planck equations from observed time series data suffers strongly from finite sampling rates. We show that previously published results are degraded considerably by such effects. We present correction terms which yield a robust estimation of the diffusion terms, together with a novel method for one-dimensional problems. We apply these methods to time series data of local surface wind velocities, where the dependence of the diffusion constant on the state variable shows a different behavior than previously suggested.
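For context, the drift and diffusion terms are usually estimated from conditional moments of increments; the naive Python estimator below works on a synthetic Ornstein-Uhlenbeck path, and it is exactly this kind of estimate that degrades as the sampling interval grows (the paper's correction terms are not reproduced here).

    import numpy as np

    rng = np.random.default_rng(0)
    dt, n = 0.01, 100_000
    x = np.empty(n); x[0] = 0.0
    for i in range(n - 1):                           # OU process: dx = -x dt + dW
        x[i + 1] = x[i] - x[i] * dt + np.sqrt(dt) * rng.normal()

    dx = np.diff(x)
    bins = np.linspace(-2, 2, 21)
    idx = np.digitize(x[:-1], bins)
    for b in (5, 10, 15):                            # a few interior bins
        sel = idx == b
        d1 = dx[sel].mean() / dt                     # drift estimate, true D1(x) = -x
        d2 = (dx[sel] ** 2).mean() / (2 * dt)        # diffusion estimate, true D2 = 0.5
        print(f"x~{bins[b - 1]:+.1f}: D1={d1:+.2f}, D2={d2:.2f}")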
Nonlinear time-series-based adaptive control applications
NASA Technical Reports Server (NTRS)
Mohler, R. R.; Rajkumar, V.; Zakrzewski, R. R.
1991-01-01
A control design methodology based on a nonlinear time-series reference model is presented. It is indicated by highly nonlinear simulations that such designs successfully stabilize troublesome aircraft maneuvers undergoing large changes in angle of attack as well as large electric power transients due to line faults. In both applications, the nonlinear controller was significantly better than the corresponding linear adaptive controller. For the electric power network, a flexible AC transmission system with series capacitor power feedback control is studied. A bilinear autoregressive moving average reference model is identified from system data, and the feedback control is manipulated according to a desired reference state. The control is optimized according to a predictive one-step quadratic performance index. A similar algorithm is derived for control of rapid changes in aircraft angle of attack over a normally unstable flight regime. In the latter case, however, a generalization of a bilinear time-series model reference includes quadratic and cubic terms in angle of attack.
Space Object Classification Using Fused Features of Time Series Data
NASA Astrophysics Data System (ADS)
Jia, B.; Pham, K. D.; Blasch, E.; Shen, D.; Wang, Z.; Chen, G.
In this paper, a fused feature vector consisting of raw time series and texture feature information is proposed for space object classification. The time series data includes historical orbit trajectories and asteroid light curves. The texture feature is derived from recurrence plots using Gabor filters for both unsupervised learning and supervised learning algorithms. The simulation results show that the classification algorithms using the fused feature vector achieve better performance than those using raw time series or texture features only.
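A Python sketch of the texture-feature front end: compute a recurrence plot from a delay-embedded series (the subsequent Gabor filtering is omitted; the embedding parameters, threshold, and light-curve stand-in are illustrative).

    import numpy as np

    def recurrence_plot(x, E=3, tau=4, eps=0.2):
        # Delay-embed, then threshold pairwise distances at eps * max distance.
        n = len(x) - (E - 1) * tau
        emb = np.column_stack([x[i * tau : i * tau + n] for i in range(E)])
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        return (d < eps * d.max()).astype(np.uint8)

    t = np.linspace(0, 20 * np.pi, 800)
    light_curve = np.sin(t) + 0.3 * np.sin(2.7 * t)   # rotating-object stand-in
    rp = recurrence_plot(light_curve)
    print(rp.shape, rp.mean().round(2))               # size and recurrence rate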
Relating the large-scale structure of time series and visibility networks.
Rodríguez, Miguel A
2017-06-01
The structure of time series is usually characterized by means of correlations. A new proposal based on visibility networks has been considered recently. Visibility networks are complex networks mapped from surfaces or time series using visibility properties. The structures of time series and visibility networks are closely related, as shown by means of fractional time series in recent works. In these works, a simple relationship is shown between the Hurst exponent H of fractional time series and the exponent γ of the distribution of edges of the corresponding visibility network, which exhibits a power law. To check and generalize these results, in this paper we delve into this idea of connected structures by defining both structures more properly. In addition to the exponents used before, H and γ, which take into account local properties, we consider two more exponents that, as we will show, characterize global properties. These are the exponent α for time series, which gives the scaling of the variance with the size as var ∼ T^{2α}, and the exponent κ of the corresponding network, which gives the scaling of the averaged maximum of the number of edges, ⟨k_M⟩ ∼ N^{κ}. With this representation, a more precise connection between the structures of general time series and their associated visibility networks is achieved. Similarities and differences are more clearly established, and new scaling forms of complex networks appear in agreement with their respective classes of time series.
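For concreteness, a naive Python construction of the natural visibility graph (each sample is a node; two samples are linked if the straight line between them stays above every intermediate sample); the O(n^3) loop is fine for a short illustrative series.

    import numpy as np

    def visibility_edges(y):
        n = len(y)
        edges = []
        for a in range(n):
            for b in range(a + 1, n):
                # Visible if every intermediate point lies below the chord a-b.
                if all(y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                       for c in range(a + 1, b)):
                    edges.append((a, b))
        return edges

    rng = np.random.default_rng(0)
    y = np.cumsum(rng.normal(size=200))               # fractional-like test series
    edges = visibility_edges(y)
    degree = np.bincount(np.array(edges).ravel(), minlength=len(y))
    print("max degree:", degree.max(), " mean degree:", degree.mean().round(2))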
The Prediction of Teacher Turnover Employing Time Series Analysis.
ERIC Educational Resources Information Center
Costa, Crist H.
The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
Sir Ronald A. Fisher and the International Biometric Society.
Billard, Lynne
2014-06-01
The year 2012 marks the 50th anniversary of the death of Sir Ronald A. Fisher, one of the two Fathers of Statistics and a Founder of the International Biometric Society (the "Society"). To celebrate the extraordinary genius of Fisher and the far-sighted vision of Fisher and Chester Bliss in organizing and promoting the formation of the Society, this article looks at the origins and growth of the Society, some of the key players and events, and especially the roles played by Fisher himself as the first President. A fresh look at Fisher, the man rather than the scientific genius, is also presented.
Detection of "noisy" chaos in a time series
NASA Technical Reports Server (NTRS)
Chon, K. H.; Kanters, J. K.; Cohen, R. J.; Holstein-Rathlou, N. H.
1997-01-01
Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both the internal dynamics of the system and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series, and whether this determinism has chaotic attributes. The method relies on fitting a nonlinear autoregressive model to the time series, followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations and applied to heart rate variability data.
The Value of Interrupted Time-Series Experiments for Community Intervention Research
Biglan, Anthony; Ary, Dennis; Wagenaar, Alexander C.
2015-01-01
Greater use of interrupted time-series experiments is advocated for community intervention research. Time-series designs enable the development of knowledge about the effects of community interventions and policies in circumstances in which randomized controlled trials are too expensive, premature, or simply impractical. The multiple baseline time-series design typically involves two or more communities that are repeatedly assessed, with the intervention introduced into one community at a time. It is particularly well suited to initial evaluations of community interventions and the refinement of those interventions. This paper describes the main features of multiple baseline designs and related repeated-measures time-series experiments, discusses the threats to internal validity in multiple baseline designs, and outlines techniques for statistical analyses of time-series data. Examples are given of the use of multiple baseline designs in evaluating community interventions and policy changes. PMID:11507793
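A common statistical treatment of a single interrupted series is segmented regression with level-change and trend-change terms at the intervention point; below is a synthetic Python example using statsmodels (the design and effect sizes are invented).

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n, t0 = 48, 24                                   # 48 monthly observations
    t = np.arange(n)
    post = (t >= t0).astype(float)                   # intervention indicator
    y = 50 - 0.1 * t - 6 * post - 0.4 * post * (t - t0) + rng.normal(0, 2, n)

    X = sm.add_constant(np.column_stack([t, post, post * (t - t0)]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)    # [baseline, pre-trend, level change, trend change]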
Inference of scale-free networks from gene expression time series.
Daisuke, Tominaga; Horton, Paul
2006-04-01
Quantitative time-series observation of gene expression is becoming possible, for example by cell array technology. However, there are no practical methods with which to infer network structures using only observed time-series data. As most computational models of biological networks for continuous time-series data have a high degree of freedom, it is almost impossible to infer the correct structures. On the other hand, it has been reported that some kinds of biological networks, such as gene networks and metabolic pathways, may have scale-free properties. We hypothesize that the architecture of inferred biological network models can be restricted to scale-free networks. We developed an inference algorithm for biological networks using only time-series data by introducing such a restriction. We adopt the S-system as the network model, and a distributed genetic algorithm to optimize models to fit its simulated results to observed time series data. We have tested our algorithm on a case study (simulated data). We compared optimization under no restriction, which allows for a fully connected network, and under the restriction that the total number of links must equal that expected from a scale free network. The restriction reduced both false positive and false negative estimation of the links and also the differences between model simulation and the given time-series data.
Recurrent Neural Networks for Multivariate Time Series with Missing Values.
Che, Zhengping; Purushotham, Sanjay; Cho, Kyunghyun; Sontag, David; Liu, Yan
2018-04-17
Multivariate time series data in practical applications, such as health care, geoscience, and biology, are characterized by a variety of missing values. In time series prediction and other related tasks, it has been noted that missing values and their missing patterns are often correlated with the target labels, a.k.a., informative missingness. There is very limited work on exploiting the missing patterns for effective imputation and improving prediction performance. In this paper, we develop novel deep learning models, namely GRU-D, as one of the early attempts. GRU-D is based on Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. It takes two representations of missing patterns, i.e., masking and time interval, and effectively incorporates them into a deep model architecture so that it not only captures the long-term temporal dependencies in time series, but also utilizes the missing patterns to achieve better prediction results. Experiments of time series classification tasks on real-world clinical datasets (MIMIC-III, PhysioNet) and synthetic datasets demonstrate that our models achieve state-of-the-art performance and provide useful insights for better understanding and utilization of missing values in time series analysis.
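The two missing-pattern representations are easy to construct; the Python sketch below builds the mask and the time-interval matrix delta for a toy irregularly sampled record, following the paper's definition (the interval resets after an observation and otherwise accumulates).

    import numpy as np

    x = np.array([[0.8,    np.nan],
                  [np.nan, 36.6],
                  [1.1,    np.nan],
                  [np.nan, 37.1],
                  [np.nan, np.nan],
                  [0.9,    np.nan]])                  # (time steps, variables)
    timestamps = np.array([0.0, 1.0, 2.5, 3.0, 4.5, 5.0])

    mask = ~np.isnan(x)                               # m_t: observed or not
    delta = np.zeros_like(x, dtype=float)             # delta_t: time since last seen
    for t in range(1, len(x)):
        gap = timestamps[t] - timestamps[t - 1]
        delta[t] = np.where(mask[t - 1], gap, delta[t - 1] + gap)

    print(mask.astype(int))
    print(delta)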
Characterising experimental time series using local intrinsic dimension
NASA Astrophysics Data System (ADS)
Buzug, Thorsten M.; von Stamm, Jens; Pfister, Gerd
1995-02-01
Experimental strange attractors are analysed with the averaged local intrinsic dimension proposed by A. Passamante et al. [Phys. Rev. A 39 (1989) 3640], which is based on singular value decomposition of local trajectory matrices. The results are compared to the Kaplan-Yorke dimension and the correlation dimension. The attractors, reconstructed with Takens' delay-time coordinates from scalar velocity time series, are measured in the hydrodynamic Taylor-Couette system. A period-doubling route towards chaos obtained from a very short Taylor-Couette cylinder yields a sequence of experimental time series to which the local intrinsic dimension is applied.
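In outline, the measure delay-embeds the series, takes an SVD of each local trajectory matrix, and counts the singular values needed to capture most of the local variance; the Python sketch below uses illustrative embedding parameters and a 95% variance threshold.

    import numpy as np

    def local_intrinsic_dim(x, E=5, tau=2, k=20, var_frac=0.95):
        n = len(x) - (E - 1) * tau
        emb = np.column_stack([x[i * tau : i * tau + n] for i in range(E)])
        dims = []
        for i in range(0, n, 50):                    # subsample reference points
            d = np.linalg.norm(emb - emb[i], axis=1)
            nbrs = emb[np.argsort(d)[1 : k + 1]]     # local trajectory matrix
            s = np.linalg.svd(nbrs - nbrs.mean(axis=0), compute_uv=False)
            energy = np.cumsum(s**2) / np.sum(s**2)
            dims.append(int(np.searchsorted(energy, var_frac)) + 1)
        return np.mean(dims)                         # averaged local dimension

    t = np.arange(4000) * 0.05
    x = np.sin(t) + 0.5 * np.sin(3.1 * t)            # quasi-periodic test signal
    print(local_intrinsic_dim(x))                    # low dimension expected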
Ragan, Elizabeth J; Johnson, Courtney; Milton, Jacqueline N; Gill, Christopher J
2016-11-02
One of the greatest public health challenges in low- and middle-income countries (LMICs) is identifying people over time and space. Recent years have seen an explosion of interest in developing electronic approaches to addressing this problem, with mobile technology at the forefront of these efforts. We investigate the possibility of biometrics as a simple, cost-efficient, and portable solution. Common biometric approaches include fingerprinting, iris scanning and facial recognition, but all are less than ideal due to complexity, infringement on privacy, cost, or portability. Ear biometrics, however, proved to be a unique and viable solution. We developed an identification algorithm, then conducted a cross-sectional study in which we photographed left and right ears from 25 consenting adults. We then conducted re-identification and statistical analyses to determine the accuracy and replicability of our approach. Through principal component analysis, we found the curve of the ear helix to be the most reliable anatomical structure and the basis for re-identification. Although an individual ear allowed for a high re-identification rate (88.3%), when both left and right ears were paired together, our rate of re-identification amidst the pool of potential matches was 100%. The results of this study have implications for future efforts towards building a biometrics solution for patient identification in LMICs. We provide a conceptual platform for further investigation into the development of an ear biometrics identification mobile application.
Improving the recognition of fingerprint biometric system using enhanced image fusion
NASA Astrophysics Data System (ADS)
Alsharif, Salim; El-Saba, Aed; Stripathi, Reshma
2010-04-01
Fingerprint recognition systems have been widely used by financial institutions, law enforcement, border control, and visa issuing, to mention just a few applications. Biometric identifiers can be counterfeited, but they are considered more reliable and secure than traditional ID cards or personal password methods. Fingerprint pattern fusion improves the performance of a fingerprint recognition system in terms of accuracy and security. This paper presents digital enhancement and fusion approaches that improve the biometric performance of the fingerprint recognition system. It is a two-step approach. In the first step, raw fingerprint images are enhanced using high-frequency-emphasis filtering (HFEF). The second step is a simple linear fusion process between the raw images and the HFEF ones. It is shown that the proposed approach increases the verification and identification performance of the fingerprint biometric recognition system, where any improvement is justified using the correlation performance metrics of the matching algorithm.
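A Python sketch of the two steps on a synthetic ridge pattern: Gaussian high-frequency-emphasis filtering in the Fourier domain, then a simple linear fusion with the raw image (the filter gains, sigma, and 0.5/0.5 fusion weights are illustrative choices).

    import numpy as np

    def hfef(img, a=0.5, b=1.5, sigma=30.0):
        # High-frequency emphasis: H = a + b * (Gaussian high-pass).
        r, c = img.shape
        yy, xx = np.indices((r, c))
        d2 = (yy - r / 2) ** 2 + (xx - c / 2) ** 2
        H = a + b * (1 - np.exp(-d2 / (2 * sigma**2)))
        F = np.fft.fftshift(np.fft.fft2(img))
        out = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
        return (out - out.min()) / (np.ptp(out) + 1e-12)   # rescale to [0, 1]

    rng = np.random.default_rng(0)
    yy, xx = np.indices((128, 128))
    ridges = 0.5 + 0.5 * np.sin(0.4 * xx + 0.1 * yy)       # toy ridge pattern
    img = ridges + 0.2 * rng.normal(size=ridges.shape)

    enhanced = hfef(img)
    fused = 0.5 * img / img.max() + 0.5 * enhanced         # simple linear fusion
    print(fused.shape, fused.min().round(2), fused.max().round(2))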
Time series analysis of temporal networks
NASA Astrophysics Data System (ADS)
Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh
2016-01-01
A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies, only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. We, in this paper, show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them, and, using a standard forecast model for time series, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as number of active nodes, average degree, clustering coefficient, etc., and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
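The forecasting step reduces to fitting a standard model to each property series; here is a Python example with statsmodels on a synthetic "active nodes per snapshot" series (the (2, 1, 1) order is an illustrative choice, not the paper's fitted model).

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    n = 200
    active_nodes = (120 + 30 * np.sin(np.arange(n) / 10)
                    + np.cumsum(rng.normal(0, 1, n))      # memory component
                    + rng.normal(0, 5, n))                # snapshot noise

    model = ARIMA(active_nodes, order=(2, 1, 1)).fit()
    forecast = model.forecast(steps=10)   # property values for the next 10 snapshots
    print(forecast.round(1))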
Heritability of refractive error and ocular biometrics: the Genes in Myopia (GEM) twin study.
Dirani, Mohamed; Chamberlain, Matthew; Shekar, Sri N; Islam, Amirul F M; Garoufalis, Pam; Chen, Christine Y; Guymer, Robyn H; Baird, Paul N
2006-11-01
A classic twin study was undertaken to assess the contribution of genes and environment to the development of refractive errors and ocular biometrics in a twin population. A total of 1224 twins (345 monozygotic [MZ] and 267 dizygotic [DZ] twin pairs) aged between 18 and 88 years were examined. All twins completed a questionnaire consisting of a medical history, education, and zygosity. Objective refraction was measured in all twins, and biometric measurements were obtained using partial coherence interferometry. Intrapair correlations for spherical equivalent and ocular biometrics were significantly higher in the MZ than in the DZ twin pairs (P < 0.05), when refraction was considered as a continuous variable. A significant gender difference in the variation of spherical equivalent and ocular biometrics was found (P < 0.05). A genetic model specifying an additive, dominant, and unique environmental factor that was sex limited was the best fit for all measured variables. Heritability of spherical equivalents of 88% and 75% were found in the men and women, respectively, whereas, that of axial length was 94% and 92%, respectively. Additive genetic effects accounted for a greater proportion of the variance in spherical equivalent, whereas the variance in ocular biometrics, particularly axial length was explained mostly by dominant genetic effects. Genetic factors, both additive and dominant, play a significant role in refractive error (myopia and hypermetropia) as well as in ocular biometrics, particularly axial length. The sex limitation ADE model (additive genetic, nonadditive genetic, and environmental components) provided the best-fit genetic model for all parameters.
Early biometric lag in the prediction of small for gestational age neonates and preeclampsia.
Schwartz, Nadav; Pessel, Cara; Coletta, Jaclyn; Krieger, Abba M; Timor-Tritsch, Ilan E
2011-01-01
An early fetal growth lag may be a marker of future complications. We sought to determine the utility of early biometric variables in predicting adverse pregnancy outcomes. In this retrospective cohort study, the crown-rump length at 11 to 14 weeks and the head circumference, biparietal diameter, abdominal circumference, femur length, humerus length, transverse cerebellar diameter, and estimated fetal weight at 18 to 24 weeks were converted to an estimated gestational age using published regression formulas. Sonographic fetal growth (difference between each biometric gestational age and the crown-rump length gestational age) minus expected fetal growth (number of days elapsed between the two scans) yielded the biometric growth lag. These lags were tested as predictors of small for gestational age (SGA) neonates (≤10th percentile) and preeclampsia. A total of 245 patients were included. Thirty-two (13.1%) delivered an SGA neonate, and 43 (17.6%) had the composite outcome. The head circumference, biparietal diameter, abdominal circumference, and estimated fetal weight lags were identified as significant predictors of SGA neonates after adjusted analyses (P < .05). The addition of either the estimated fetal weight or abdominal circumference lag to maternal characteristics alone significantly improved the performance of the predictive model, achieving areas under the curve of 0.72 and 0.74, respectively. No significant association was found between the biometric lag variables and the development of preeclampsia. Routinely available biometric data can be used to improve the prediction of adverse outcomes such as SGA. These biometric lags should be considered in efforts to develop screening algorithms for adverse outcomes.
Visual analytics techniques for large multi-attribute time series data
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.
2008-01-01
Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often business people want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data loading job performance. DBAs need to detect the outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: The color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in a long time series data and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications to mine enterprise data warehouse and customer credit card fraud data to illustrate the wide applicability and usefulness of these techniques.
Dynamical analysis and visualization of tornadoes time series.
Lopes, António M; Tenreiro Machado, J A
2015-01-01
In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the events size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, it is adopted the concept of circular time and the collective behavior of tornadoes analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
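The frequency-domain step can be reproduced in a few lines of Python: build a Dirac-impulse series, take the amplitude spectrum, and fit a power law on log-log axes (the event times and Pareto-distributed sizes are synthetic; a memoryless series like this yields an exponent near zero, whereas real tornado series show nontrivial scaling).

    import numpy as np

    rng = np.random.default_rng(0)
    n = 2**14
    series = np.zeros(n)
    events = rng.choice(n, size=600, replace=False)       # event times
    series[events] = rng.pareto(1.5, size=600) + 1        # impulse amplitude ~ size

    spec = np.abs(np.fft.rfft(series))[1:]                # drop the DC term
    freq = np.fft.rfftfreq(n)[1:]
    beta, logc = np.polyfit(np.log(freq), np.log(spec), 1)
    print(f"power-law exponent estimate: {beta:.2f}")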
"Observation Obscurer" - Time Series Viewer, Editor and Processor
NASA Astrophysics Data System (ADS)
Andronov, I. L.
The program is described, which contains a set of subroutines suitable for fast viewing and interactive filtering and processing of regularly and irregularly spaced time series. Being a 32-bit DOS application, it may be used as a default fast viewer/editor of time series in any computer shell ("commander") or in Windows. It allows viewing the data in "time" or "phase" mode; removing ("obscuring") or filtering outstanding bad points; making scale transformations and smoothing using several methods (e.g., mean with phase binning, determination of the statistically optimal number of phase bins, and the "running parabola" fit (Andronov, 1997, As. Ap. Suppl., 125, 207)); and performing time series analysis with methods such as correlation, autocorrelation, and histogram analysis, and determination of extrema. Some features have been developed specially for variable star observers, e.g., the barycentric correction and the creation and fast analysis of "O-C" diagrams. The manual for "hot keys" is presented. The computer code was compiled with 32-bit Free Pascal (www.freepascal.org).
Modelling road accidents: An approach using structural time series
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-09-01
In this paper, the trend of road accidents in Malaysia from 2001 to 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals of each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. To check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model for road accidents is the local level model with a seasonal component.
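A minimal sketch of fitting a local level plus seasonal structural model, assuming statsmodels' UnobservedComponents (the paper's software is not stated) and hypothetical monthly counts; the held-out final year mirrors the validation step described above.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

# Hypothetical monthly accident counts, 2001-2011, for fitting.
idx = pd.date_range("2001-01", periods=132, freq="MS")
rng = np.random.default_rng(1)
y = pd.Series(3000 + 2.0 * np.arange(132)
              + 150 * np.sin(2 * np.pi * np.arange(132) / 12)
              + rng.normal(0, 50, 132), index=idx)

# Local level with a 12-month seasonal component, as in the paper's best model.
model = UnobservedComponents(y, level="local level", seasonal=12)
res = model.fit(disp=False)
print(res.aic)                      # compare candidate specifications by AIC
forecast = res.forecast(steps=12)   # predict the held-out year for validation
```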
Temporal stability of visual search-driven biometrics
NASA Astrophysics Data System (ADS)
Yoon, Hong-Jun; Carmichael, Tandy R.; Tourassi, Georgia
2015-03-01
Previously, we have shown the potential of using an individual's visual search pattern as a possible biometric. That study focused on viewing images displaying dot-patterns with different spatial relationships to determine which pattern can be more effective in establishing the identity of an individual. In this follow-up study we investigated the temporal stability of this biometric. We performed an experiment with 16 individuals asked to search for a predetermined feature of a random-dot pattern as we tracked their eye movements. Each participant completed four testing sessions consisting of two dot patterns repeated twice. One dot pattern displayed concentric circles shifted to the left or right side of the screen overlaid with visual noise, and participants were asked which side the circles were centered on. The second dot-pattern displayed a number of circles (between 0 and 4) scattered on the screen overlaid with visual noise, and participants were asked how many circles they could identify. Each session contained 5 untracked tutorial questions and 50 tracked test questions (200 total tracked questions per participant). To create each participant's "fingerprint", we constructed a Hidden Markov Model (HMM) from the gaze data representing the underlying visual search and cognitive process. The accuracy of the derived HMM models was evaluated using cross-validation for various time-dependent train-test conditions. Subject identification accuracy ranged from 17.6% to 41.8% for all conditions, which is significantly higher than random guessing (1/16 = 6.25%). The results suggest that visual search pattern is a promising, temporally stable personalized fingerprint of perceptual organization.
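A minimal sketch of the identification scheme described above, assuming the hmmlearn library and (x, y) fixation coordinates as observations (the study's exact features and HMM structure are not specified here): one model is fit per subject, and a test sequence is attributed to the subject whose model scores it highest.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

def fit_subject_model(fixations: np.ndarray, n_states: int = 4) -> GaussianHMM:
    """Fit one HMM per subject on a sequence of (x, y) gaze samples."""
    model = GaussianHMM(n_components=n_states, covariance_type="diag",
                        n_iter=100, random_state=0)
    model.fit(fixations)
    return model

def identify(models: dict, test_seq: np.ndarray) -> str:
    # Claim the identity whose model assigns the highest log-likelihood.
    return max(models, key=lambda sid: models[sid].score(test_seq))

# Hypothetical usage with per-subject training sequences:
rng = np.random.default_rng(0)
train = {f"s{i}": rng.random((200, 2)) for i in range(3)}
models = {sid: fit_subject_model(seq) for sid, seq in train.items()}
print(identify(models, rng.random((50, 2))))
```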
Multiscale Poincaré plots for visualizing the structure of heartbeat time series.
Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L
2016-02-09
Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
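The two steps of the MSP method, coarse-graining followed by Poincaré plotting, are simple to sketch; below is a minimal numpy/matplotlib version on a synthetic white-noise RR series (the optional frequency coloring is omitted). Consistent with the abstract, the white-noise cloud shrinks as the scale grows.

```python
import numpy as np
import matplotlib.pyplot as plt

def coarse_grain(rr: np.ndarray, scale: int) -> np.ndarray:
    """Non-overlapping window averages, as in multiscale analyses."""
    n = len(rr) // scale
    return rr[:n * scale].reshape(n, scale).mean(axis=1)

rr = np.random.default_rng(0).normal(0.8, 0.05, 3000)  # hypothetical RR series (s)

fig, axes = plt.subplots(1, 4, figsize=(12, 3), sharex=True, sharey=True)
for ax, scale in zip(axes, (1, 2, 4, 8)):
    cg = coarse_grain(rr, scale)
    ax.plot(cg[:-1], cg[1:], ".", markersize=2)  # Poincaré plot: RR(n) vs RR(n+1)
    ax.set_title(f"scale {scale}")
    ax.set_xlabel("RR(n)")
axes[0].set_ylabel("RR(n+1)")
plt.tight_layout()
plt.show()
```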
Time series patterns and language support in DBMS
NASA Astrophysics Data System (ADS)
Telnarova, Zdenka
2017-07-01
This contribution focuses on the pattern type Time Series as a semantically rich representation of data. Some examples of implementing this pattern type in traditional database management systems are briefly presented. There are many approaches to manipulating and querying patterns. A crucial issue is a systematic approach to pattern management and a specific pattern query language that takes the semantics of patterns into consideration. The query language SQL-TS for manipulating patterns is demonstrated on time series data.
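For flavor, here is a schematic pattern query in the style of SQL-TS, adapted from examples in the SQL-TS literature; the `quotes(name, day, price)` table is hypothetical, and the query is shown as a Python string because no mainstream DBMS executes SQL-TS directly.

```python
# Schematic SQL-TS query: CLUSTER BY partitions the table, SEQUENCE BY
# orders tuples, and the AS clause binds pattern variables over consecutive
# rows (X, then a maximal run *Y of falling prices, then a rebound Z).
sqlts_query = """
SELECT X.name, X.day, Z.day
FROM quotes
CLUSTER BY name
SEQUENCE BY day
AS (X, *Y, Z)
WHERE Y.price < PREV(Y.price)
  AND Z.price > X.price
"""
```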
ERIC Educational Resources Information Center
Carpenter, Darrell R.
2011-01-01
Biometric technology is rapidly gaining popularity as an access control mechanism in the workplace. In some instances, systems relying on biometric technology have not been well received by employees. One reason for resistance may be perceived privacy issues associated with biometrics. This research draws on previous organizational information…
Estimation of Parameters from Discrete Random Nonstationary Time Series
NASA Astrophysics Data System (ADS)
Takayasu, H.; Nakamura, T.
For the analysis of nonstationary stochastic time series we introduce a formulation to estimate the underlying time-dependent parameters. This method is designed for random events with small numbers that are out of the applicability range of the normal distribution. The method is demonstrated for numerical data generated by a known system, and applied to time series of traffic accidents, batting average of a baseball player and sales volume of home electronics.
Documentation of a spreadsheet for time-series analysis and drawdown estimation
Halford, Keith J.
2006-01-01
Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models, referred to as synthetic water levels, have been developed for estimating unpumped water levels during aquifer tests. These models sum multiple time series, such as barometric pressure, tidal potential, and background water levels, to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times the prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent, so collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored, and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. The number of observations in a drawdown time series can be reduced by averaging across user-defined periods.
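A toy version of the synthetic-water-level idea, reduced to a linear least-squares fit of component amplitudes plus an offset (the spreadsheet also fits phases; a phase shift could be approximated by adding lagged copies of each series as extra columns). All series here are synthetic.

```python
import numpy as np

# Hypothetical hourly series during a period unaffected by the aquifer test:
rng = np.random.default_rng(0)
t = np.arange(500)
baro = np.sin(2 * np.pi * t / 24)          # barometric pressure (standardized)
tide = np.sin(2 * np.pi * t / 12.42)       # tidal potential
background = 0.001 * t                     # regional water-level trend
observed = 0.3*baro + 0.05*tide + background + 5.0 + rng.normal(0, 0.01, 500)

# Adjust the amplitude of each component series (plus an offset) so the
# synthetic water level matches the observed record, minimizing the sum of
# squared differences.
A = np.column_stack([baro, tide, background, np.ones_like(t, dtype=float)])
coef, *_ = np.linalg.lstsq(A, observed, rcond=None)

synthetic = A @ coef
drawdown = observed - synthetic   # ~0 when unpumped; deviations = drawdown
```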
InSAR Deformation Time Series Processed On-Demand in the Cloud
NASA Astrophysics Data System (ADS)
Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.
2017-12-01
During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is generation of a deformation time series product, a series of images representing ground displacements over time, which can be computed from a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product is difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time-consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data are acquired in the stack, an updated time series product can be generated with minimal additional processing. This presentation will focus on the development techniques and enabling technologies that were used in developing the time series product.
Time series, correlation matrices and random matrix models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vinayak; Seligman, Thomas H.
2014-01-08
In this set of five lectures the authors present techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons, random matrices play an important role in describing a null hypothesis, or minimum-information hypothesis, for the description of a quantum system or subsystem. In the former case we consider various forms of correlation matrices of time series associated with the classical observables of some system. Because such series are necessarily finite, noise is inevitably introduced, and this finite-time influence leads to a random or stochastic component in the time series. As a consequence, empirical correlation matrices have a random component, and corresponding ensembles are used. In the latter case we use random matrices to describe high-temperature environments, uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.
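A minimal illustration of the null-hypothesis idea (a sketch, not the lectures' material): for independent finite time series, the eigenvalues of the empirical correlation matrix follow the Marchenko-Pastur law, so eigenvalues of real data falling outside those bounds signal genuine correlation structure.

```python
import numpy as np

N, T = 50, 400                      # number of series, series length
rng = np.random.default_rng(0)
X = rng.normal(size=(N, T))         # independent time series: the null model

C = np.corrcoef(X)                  # N x N empirical correlation matrix
eigs = np.linalg.eigvalsh(C)

# Marchenko-Pastur bounds for the null hypothesis (q = N/T); eigenvalues
# outside [lmin, lmax] would indicate real correlation structure.
q = N / T
lmin, lmax = (1 - np.sqrt(q))**2, (1 + np.sqrt(q))**2
print(f"empirical range: [{eigs.min():.2f}, {eigs.max():.2f}]")
print(f"MP bounds:       [{lmin:.2f}, {lmax:.2f}]")
```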
A univariate model of river water nitrate time series
NASA Astrophysics Data System (ADS)
Worrall, F.; Burt, T. P.
1999-01-01
Four time series were taken from three catchments in the North and South of England. The sites chosen included two in predominantly agricultural catchments, one at the tidal limit and one downstream of a sewage treatment works. A time series model was constructed for each of these series as a means of decomposing the elements controlling river water nitrate concentrations and to assess whether this approach could provide a simple management tool for protecting water abstractions. Autoregressive (AR) modelling of the detrended and deseasoned time series showed a "memory effect". This memory effect expressed itself as an increase in the winter-summer difference in nitrate levels that was dependent upon the nitrate concentration 12 or 6 months previously. Autoregressive moving average (ARMA) modelling showed that one of the series contained seasonal, non-stationary elements that appeared as an increasing trend in the winter-summer difference. The ARMA model was used to predict nitrate levels and predictions were tested against data held back from the model construction process - predictions gave average percentage errors of less than 10%. Empirical modelling can therefore provide a simple, efficient method for constructing management models for downstream water abstraction.
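A sketch of the AR step with statsmodels' AutoReg (an assumption; the paper's software is not stated), probing lags of 6 and 12 months on a hypothetical series that is already detrended and deseasoned, and holding back the final year for validation as the authors did.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

# Hypothetical monthly nitrate series, already detrended and deseasoned.
rng = np.random.default_rng(0)
idx = pd.date_range("1985-01", periods=156, freq="MS")
y = pd.Series(rng.normal(0, 1, 156), index=idx)

# AR terms at 6 and 12 months probe the "memory effect" the authors report.
res = AutoReg(y.iloc[:-12], lags=[6, 12]).fit()
print(res.params)

# Out-of-sample prediction of the held-back final year.
forecast = res.predict(start=len(y) - 12, end=len(y) - 1)
```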
Biometric Identifiers and Border Security: 9/11 Commission Recommendations and Related Issues
2005-02-07
…joints, and knuckles, has been used for about 30 years to control access to secure facilities such as nuclear power plants. Facial recognition analyzes… To this end, however, DOS has also begun phasing in the use of facial recognition technologies with visa and passport photos, but these technologies are… have approved interoperable biometric standards, and the baseline biometric will be facial recognition. Member states will also have the option…
Aviation Security: Biometric Technology and Risk Based Security Aviation Passenger Screening Program
2012-12-01
Since 9/11, the Transportation Security Administration (TSA)… POE: Point of Entry; RBS: Risk-Based Security; SENTRI: Secure Electronic Network for Travelers Rapid Inspection; SFPD: Secure Flight Passenger… The Committee on Biometrics provides the origins of biometrics; the term "biometrics" is derived from the Greek words "bio" (life) and "metrics" (to measure).
A reference system for animal biometrics: application to the northern leopard frog
Petrovska-Delacretaz, D.; Edwards, A.; Chiasson, J.; Chollet, G.; Pilliod, D.S.
2014-01-01
Reference systems and public databases are available for human biometrics, but to our knowledge nothing is available for animal biometrics. This is surprising because animals are not required to give their agreement to be in a database. This paper proposes a reference system and database for the northern leopard frog (Lithobates pipiens). Both are available for reproducible experiments. Results of both open set and closed set experiments are given.
Vilupuru, Abhiram S.; Glasser, Adrian
2010-01-01
Experiments were undertaken to understand the relationship between dynamic accommodative refractive and biometric (lens thickness (LT), anterior chamber depth (ACD) and anterior segment length (ASL=ACD+LT)) changes during Edinger–Westphal stimulated accommodation in rhesus monkeys. Experiments were conducted on three rhesus monkeys (aged 11·5, 4·75 and 4·75 years) which had undergone prior, bilateral, complete iridectomies and implantation of a stimulating electrode in the Edinger–Westphal (EW) nucleus. Accommodative refractive responses were first measured dynamically with video-based infrared photorefraction and then ocular biometric responses were measured dynamically with continuous ultrasound biometry (CUB) during EW stimulation. The same stimulus amplitudes were used for the refractive and biometric measurements to allow them to be compared. Main sequence relationships (ratio of peak velocity to amplitude) were calculated. Dynamic accommodative refractive changes are linearly correlated with the biometric changes and accommodative biometric changes in ACD, ASL and LT show systematic linear correlations with increasing accommodative amplitudes. The relationships are relatively similar for the eyes of the different monkeys. Dynamic analysis showed that main sequence relationships for both biometry and refraction are linear. Although accommodative refractive changes in the eye occur primarily due to changes in lens surface curvature, the refractive changes are well correlated with A-scan measured accommodative biometric changes. Accommodative changes in ACD, LT and ASL are all well correlated over the full extent of the accommodative response. PMID:15721617
FALSE DETERMINATIONS OF CHAOS IN SHORT NOISY TIME SERIES. (R828745)
A method (NEMG) proposed in 1992 for diagnosing chaos in noisy time series with 50 or fewer observations entails fitting the time series with an empirical function which predicts an observation in the series from previous observations, and then estimating the rate of divergence…
Joint Feature Extraction and Classifier Design for ECG-Based Biometric Recognition.
Gutta, Sandeep; Cheng, Qi
2016-03-01
Traditional biometric recognition systems often utilize physiological traits such as fingerprint, face, iris, etc. Recent years have seen a growing interest in electrocardiogram (ECG)-based biometric recognition techniques, especially in the field of clinical medicine. In existing ECG-based biometric recognition methods, feature extraction and classifier design are usually performed separately. In this paper, a multitask learning approach is proposed, in which feature extraction and classifier design are carried out simultaneously. Weights are assigned to the features within the kernel of each task. We decompose the matrix consisting of all the feature weights into sparse and low-rank components. The sparse component determines the features that are relevant to identify each individual, and the low-rank component determines the common feature subspace that is relevant to identify all the subjects. A fast optimization algorithm is developed, which requires only the first-order information. The performance of the proposed approach is demonstrated through experiments using the MIT-BIH Normal Sinus Rhythm database.
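The sparse-plus-low-rank split of the weight matrix can be illustrated generically; the sketch below is not the authors' fast first-order algorithm but a simple alternating scheme, where each step is the exact proximal update for one block of the objective ||W - L - S||_F^2 / 2 + lam*||S||_1 + mu*||L||_*. The regularization weights and test matrix are hypothetical.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding: proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def svt(x, t):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(x, full_matrices=False)
    return U @ np.diag(soft(s, t)) @ Vt

def sparse_low_rank(W, lam=0.1, mu=0.5, n_iter=200):
    """Alternating exact minimization over L (low rank) and S (sparse)."""
    S = np.zeros_like(W)
    L = np.zeros_like(W)
    for _ in range(n_iter):
        L = svt(W - S, mu)    # best L with S fixed
        S = soft(W - L, lam)  # best S with L fixed
    return L, S

# Hypothetical tasks-by-features weight matrix from multitask training:
rng = np.random.default_rng(0)
W = rng.normal(size=(20, 5)) @ rng.normal(size=(5, 100))  # low-rank part
W[rng.random(W.shape) < 0.02] += 5.0                      # sparse spikes
L, S = sparse_low_rank(W)
```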
Symplectic geometry spectrum regression for prediction of noisy time series
NASA Astrophysics Data System (ADS)
Xie, Hong-Bo; Dokos, Socrates; Sivakumar, Bellie; Mengersen, Kerrie
2016-05-01
We present the symplectic geometry spectrum regression (SGSR) technique as well as a regularized method based on SGSR for prediction of nonlinear time series. The main tool of analysis is the symplectic geometry spectrum analysis, which decomposes a time series into the sum of a small number of independent and interpretable components. The key to successful regularization is to damp higher-order symplectic geometry spectrum components. The effectiveness of SGSR and its superiority over local approximation using ordinary least squares are demonstrated through prediction of two noisy synthetic chaotic time series (the Lorenz and Rössler series), and then tested for prediction of three real-world data sets (Mississippi River flow data, and electromyographic and mechanomyographic signals recorded from the human body).
Time-series analysis of delta13C from tree rings. I. Time trends and autocorrelation.
Monserud, R A; Marshall, J D
2001-09-01
Univariate time-series analyses were conducted on stable carbon isotope ratios obtained from tree-ring cellulose. We looked for the presence and structure of autocorrelation. Significant autocorrelation violates the statistical independence assumption and biases hypothesis tests; its presence would indicate the existence of lagged physiological effects that persist for longer than the current year. We analyzed data from 28 trees (60-85 years old; mean = 73 years) of western white pine (Pinus monticola Dougl.), ponderosa pine (Pinus ponderosa Laws.), and Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. glauca) growing in northern Idaho. Material was obtained by the stem analysis method from rings laid down in the upper portion of the crown throughout each tree's life. The sampling protocol minimized variation caused by changing light regimes within each tree. Autoregressive moving average (ARMA) models were used to describe the autocorrelation structure over time. Three time series were analyzed for each tree: the stable carbon isotope ratio (delta(13)C); discrimination (delta); and the difference between ambient and internal CO(2) concentrations (c(a) - c(i)). Converting from ring cellulose to whole-leaf tissue did not affect the analysis, because the conversion effect was almost completely removed by the detrending that precedes time-series analysis. A simple linear or quadratic model adequately described the time trend. The residuals from the trend had a constant mean and variance, thus ensuring stationarity, a requirement for autocorrelation analysis. The trend over time for c(a) - c(i) was particularly strong (R(2) = 0.29-0.84). ARMA analyses of the residuals from these trends indicated that two-thirds of the individual tree series contained significant autocorrelation, whereas the remaining third were random (white noise) over time. We were unable to distinguish between individuals with and without significant autocorrelation.
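A minimal sketch of the workflow for one tree, under stated assumptions (synthetic delta13C values; statsmodels' Ljung-Box test used as the autocorrelation check rather than full ARMA identification): fit a quadratic trend, then test the residuals for significant autocorrelation.

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

# Hypothetical annual delta13C series from one tree (73 rings).
rng = np.random.default_rng(0)
years = np.arange(73)
d13c = -25 + 0.01 * years - 0.0002 * years**2 + rng.normal(0, 0.3, 73)

# Quadratic detrending, which the paper found adequate.
coeffs = np.polyfit(years, d13c, deg=2)
resid = d13c - np.polyval(coeffs, years)

# Ljung-Box test: small p-values indicate significant autocorrelation,
# i.e., lagged effects persisting beyond the current year.
print(acorr_ljungbox(resid, lags=[1, 2, 3]))
```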
Spectral analysis for GNSS coordinate time series using chirp Fourier transform
NASA Astrophysics Data System (ADS)
Feng, Shengtao; Bo, Wanju; Ma, Qingzun; Wang, Zifan
2017-12-01
Spectral analysis for global navigation satellite system (GNSS) coordinate time series provides a principal tool for understanding the intrinsic mechanisms that affect tectonic movements. Spectral analysis methods such as the fast Fourier transform, the Lomb-Scargle spectrum, the evolutionary power spectrum, the wavelet power spectrum, etc. are used to find periodic characteristics in time series. Among these, the chirp Fourier transform (CFT), which imposes less stringent requirements, is tested with synthetic and actual GNSS coordinate time series, demonstrating the accuracy and efficiency of the method. With the series length limited only to even numbers, CFT provides a convenient tool for windowed spectral analysis. The results for ideal synthetic data prove CFT accurate and efficient, while the results for actual data show that CFT can be used to derive periodic information from GNSS coordinate time series.
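As an illustration of the zoomed-spectrum idea, the closely related chirp z-transform, as implemented in scipy.signal.czt (SciPy >= 1.8), can evaluate the spectrum of a daily coordinate series on a finely spaced band around the annual and semiannual frequencies; this is a stand-in sketch, not the paper's CFT code, and the data are synthetic.

```python
import numpy as np
from scipy.signal import czt

# Hypothetical daily GNSS up-component with annual + semiannual terms (mm).
fs = 1.0                      # samples per day
n = 3650                      # ten years
t = np.arange(n)
x = (3.0 * np.sin(2 * np.pi * t / 365.25)
     + 1.0 * np.sin(2 * np.pi * t / 182.63)
     + np.random.default_rng(0).normal(0, 0.5, n))

# Zoom the spectrum onto 0.5-3 cycles/year with m finely spaced bins.
f1, f2 = 0.5 / 365.25, 3.0 / 365.25   # band edges, cycles per day
m = 512
w = np.exp(-2j * np.pi * (f2 - f1) / (m * fs))  # ratio between spectral bins
a = np.exp(2j * np.pi * f1 / fs)                # starting point on unit circle
spectrum = np.abs(czt(x, m=m, w=w, a=a)) / n
freqs = f1 + (f2 - f1) * np.arange(m) / m       # cycles per day
peak = freqs[np.argmax(spectrum)]
print(f"dominant period = {1/peak:.1f} days")   # expect roughly 365
```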
Ozone Time Series From GOMOS and SAGE II Measurements
NASA Astrophysics Data System (ADS)
Kyrola, E. T.; Laine, M.; Tukiainen, S.; Sofieva, V.; Zawodny, J. M.; Thomason, L. W.
2011-12-01
Satellite measurements are essential for monitoring changes in the global stratospheric ozone distribution. Both the natural variation and anthropogenic change are strongly dependent on altitude. Stratospheric ozone has been measured from space with good vertical resolution since 1985 by the SAGE II solar occultation instrument. The advantage of the occultation measurement principle is the self-calibration, which is essential to ensuring stable time series. SAGE II measurements in 1985-2005 have been a valuable data set in investigations of trends in the vertical distribution of ozone. This time series can now be extended by the GOMOS measurements started in 2002. GOMOS is a stellar occultation instrument and offers, therefore, a natural continuation of SAGE II measurements. In this paper we study how well GOMOS and SAGE II measurements agree with each other in the period 2002-2005 when both instruments were measuring. We detail how the different spatial and temporal sampling of these two instruments affect the conformity of measurements. We study also how the retrieval specifics like absorption cross sections and assumed aerosol modeling affect the results. Various combined time series are constructed using different estimators and latitude-time grids. We also show preliminary results from a novel time series analysis based on Markov chain Monte Carlo approach.
Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time
NASA Astrophysics Data System (ADS)
Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.
2017-12-01
We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We characterize statistically such time-series by a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used for implementation of novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time for different assumptions in knowledge of parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.
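As a baseline for the SMC machinery the abstract builds on, here is a minimal bootstrap particle filter for a latent AR(1) state observed through a stochastic-volatility-style nonlinearity; note the innovations here are uncorrelated in time, unlike the paper's setting, so this sketches only the standard filter that the proposed methods extend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Latent AR(1) state observed through a nonlinear (volatility-like) map:
#   x_t = phi * x_{t-1} + u_t,   u_t ~ N(0, su^2)
#   y_t = exp(x_t / 2) * v_t,    v_t ~ N(0, 1)
phi, su, T = 0.95, 0.3, 300
x = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t-1] + su * rng.normal()
    y[t] = np.exp(x[t] / 2) * rng.normal()

# Bootstrap SMC: propagate with the transition density, weight with the
# observation likelihood, then resample.
N = 2000
particles = rng.normal(0, 1, N)
est = np.zeros(T)
for t in range(1, T):
    particles = phi * particles + su * rng.normal(size=N)   # propagate
    sd = np.exp(particles / 2)
    logw = -0.5 * (y[t] / sd) ** 2 - np.log(sd)             # N(0, sd^2) log-lik
    w = np.exp(logw - logw.max()); w /= w.sum()
    est[t] = np.dot(w, particles)                           # filtered mean
    particles = particles[rng.choice(N, N, p=w)]            # resample
```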
Stochastic modeling of hourly rainfall times series in Campania (Italy)
NASA Astrophysics Data System (ADS)
Giorgio, M.; Greco, R.
2009-04-01
Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlations between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which would allow larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field, often carried out by means of autoregressive models, such as AR, ARMA, ARX, and ARMAX (e.g., Salas [1992]). Such models give their best results when applied to autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behavior of intermittent time series, as point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool for implementing an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil protection network.
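A toy generator in the alternating-renewal spirit, as a much-simplified stand-in for DRIP: exponential dry and wet durations alternate, and each storm is a rectangular pulse of exponentially distributed intensity. All parameters are hypothetical, not calibrated values from the study.

```python
import numpy as np

def alternating_renewal_rainfall(n_hours: int, mean_dry=60.0, mean_wet=8.0,
                                 mean_intensity=2.0, seed=0) -> np.ndarray:
    """Toy hourly rainfall: dry spells and storms alternate as a renewal
    process; each storm is a rectangular pulse with random intensity (mm/h)."""
    rng = np.random.default_rng(seed)
    rain = np.zeros(n_hours)
    t = 0
    while t < n_hours:
        t += int(rng.exponential(mean_dry))            # dry interval
        dur = max(1, int(rng.exponential(mean_wet)))   # storm duration
        rain[t:t + dur] = rng.exponential(mean_intensity)
        t += dur
    return rain

series = alternating_renewal_rainfall(24 * 365)  # one synthetic year
```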
Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines
del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J.; Raboso, Mariano
2015-06-17
Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on a Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering; segmentation, based on a Gaussian Mixture Model (GMM), to separate the person from the background; masking, to reduce the dimensions of the images; and binarization, to reduce the size of each image. An analysis of the classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements. PMID:26091392
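A generic sketch of the final classification step with scikit-learn, under stated assumptions: the spatial filtering and GMM segmentation stages are omitted, preprocessing is reduced to a crude binarization, and the acoustic images are random stand-ins, so the reported score is only illustrative of the API.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in for preprocessed acoustic images: 40 images of
# 16x16 pixels from 4 subjects, flattened to feature vectors.
rng = np.random.default_rng(0)
X = rng.random((40, 16 * 16))
y = np.repeat(np.arange(4), 10)

# Binarization (a crude stand-in for the paper's preprocessing chain),
# followed by a linear SVM classifier.
Xb = (X > 0.5).astype(float)
clf = make_pipeline(StandardScaler(), LinearSVC())
print(cross_val_score(clf, Xb, y, cv=5).mean())
```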
Generalized Riemann hypothesis and stochastic time series
NASA Astrophysics Data System (ADS)
Mussardo, Giuseppe; LeClair, André
2018-06-01
Using the Dirichlet theorem on the equidistribution of residue classes modulo q and the Lemke Oliver–Soundararajan conjecture on the distribution of pairs of residues on consecutive primes, we show that the domain of convergence of the infinite product of Dirichlet L-functions of non-principal characters can be extended from Re(s) > 1 down to Re(s) > 1/2, without encountering any zeros before reaching this critical line. The possibility of doing so can be traced back to a universal diffusive random walk behavior of a series C_N over the primes which underlies the convergence of the infinite product of the Dirichlet functions. The series C_N presents several aspects in common with stochastic time series, and its control requires addressing a problem similar to the single Brownian trajectory problem in statistical mechanics. In the case of the Dirichlet functions of non-principal characters, we show that this problem can be solved in terms of a self-averaging procedure based on an ensemble of block variables computed on extended intervals of primes. Those intervals, called inertial intervals, ensure the ergodicity and stationarity of the time series underlying the quantity C_N. The infinity of primes also ensures the absence of the rare events that would have been responsible for a scaling behavior different from the universal law of random walks.
Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.
Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J
2016-02-01
It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular-sampling, missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
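A minimal implementation of the forbidden-pattern count for regularly sampled data (the paper's irregular-sampling experiments are not reproduced here): symbolize windows into Bandt-Pompe ordinal patterns and count patterns that never occur. The fully chaotic logistic map, being deterministic, exhibits a forbidden order-3 pattern, while white noise does not.

```python
import numpy as np
from itertools import permutations

def ordinal_pattern_counts(x: np.ndarray, order: int = 3) -> dict:
    """Count Bandt-Pompe ordinal patterns of the given order."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        counts[tuple(np.argsort(x[i:i + order]))] += 1
    return counts

def n_forbidden(x: np.ndarray, order: int = 3) -> int:
    """Number of patterns that never occur; >0 suggests determinism."""
    return sum(1 for c in ordinal_pattern_counts(x, order).values() if c == 0)

# Deterministic series (logistic map, r = 4) versus white noise.
x = np.empty(5000); x[0] = 0.4
for i in range(1, 5000):
    x[i] = 4.0 * x[i-1] * (1.0 - x[i-1])
print(n_forbidden(x), n_forbidden(np.random.default_rng(0).random(5000)))
```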
Bat wing biometrics: using collagen-elastin bundles in bat wings as a unique individual identifier.
Amelon, Sybill K; Hooper, Sarah E; Womack, Kathryn M
2017-05-29
The ability to recognize individuals within an animal population is fundamental to conservation and management. Identification of individual bats has relied on artificial marking techniques that may negatively affect the survival and alter the behavior of individuals. Biometric systems use biological characteristics to identify individuals. The field of animal biometrics has expanded to include recognition of individuals based upon various morphologies and phenotypic variations including pelage patterns, tail flukes, and whisker arrangement. Biometric systems use 4 biologic measurement criteria: universality, distinctiveness, permanence, and collectability. Additionally, the system should not violate assumptions of capture-recapture methods that include no increased mortality or alterations of behavior. We evaluated whether individual bats could be uniquely identified based upon the collagen-elastin bundles that are visible with gross examination of their wings. We examined little brown bats (Myotis lucifugus), northern long-eared bats (M. septentrionalis), big brown bats (Eptesicus fuscus), and tricolored bats (Perimyotis subflavus) to determine whether the "wing prints" from the bundle network would satisfy the biologic measurement criteria. We evaluated 1,212 photographs from 230 individual bats comparing week 0 photos with those taken at weeks 3 or 6 and were able to confirm identity of individuals over time. Two blinded evaluators were able to successfully match 170 individuals in hand to photographs taken at weeks 0, 3, and 6. This study suggests that bats can be successfully re-identified using photographs taken at previous times. We suggest further evaluation of this methodology for use in a standardized system that can be shared among bat conservationists. PMID:29674784
A simple and fast representation space for classifying complex time series
NASA Astrophysics Data System (ADS)
Zunino, Luciano; Olivares, Felipe; Bariviera, Aurelio F.; Rosso, Osvaldo A.
2017-03-01
In the context of time series analysis considerable effort has been directed towards the implementation of efficient discriminating statistical quantifiers. Very recently, a simple and fast representation space has been introduced, namely the number of turning points versus the Abbe value. It is able to separate time series from stationary and non-stationary processes with long-range dependences. In this work we show that this bidimensional approach is useful for distinguishing complex time series: different sets of financial and physiological data are efficiently discriminated. Additionally, a multiscale generalization that takes into account the multiple time scales often involved in complex systems has also been proposed. This multiscale analysis is essential to reach a higher discriminative power between physiological time series in health and disease.
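Both coordinates of the representation space are one-liners; the sketch below computes them for white noise and a random walk (turning-point fraction near 2/3 and Abbe value near 1 for white noise; both drop for strongly correlated series), using standard definitions rather than the authors' code.

```python
import numpy as np

def turning_points_fraction(x: np.ndarray) -> float:
    """Fraction of interior points that are local maxima or minima."""
    d = np.diff(x)
    return np.mean(d[:-1] * d[1:] < 0)

def abbe_value(x: np.ndarray) -> float:
    """Abbe value: half the mean squared successive difference over the
    variance (about 1 for white noise, toward 0 for smooth series)."""
    return 0.5 * np.mean(np.diff(x)**2) / np.var(x)

rng = np.random.default_rng(0)
white = rng.normal(size=4000)
walk = np.cumsum(white)               # strongly correlated series
for name, s in [("white noise", white), ("random walk", walk)]:
    print(name, turning_points_fraction(s), abbe_value(s))
```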
Complex dynamic in ecological time series
Peter Turchin; Andrew D. Taylor
1992-01-01
Although the possibility of complex dynamical behaviors (limit cycles, quasiperiodic oscillations, and aperiodic chaos) has been recognized theoretically, most ecologists are skeptical of their importance in nature. In this paper we develop a methodology for reconstructing endogenous (or deterministic) dynamics from ecological time series. Our method consists of fitting...
Biogeochemistry from Gliders at the Hawaii Ocean Time-series
NASA Astrophysics Data System (ADS)
Nicholson, D. P.; Barone, B.; Karl, D. M.
2016-02-01
At the Hawaii Ocean Time-series (HOT), autonomous underwater gliders equipped with biogeochemical sensors observe the oceans for months at a time, sampling spatiotemporal scales missed by the ship-based programs. Over the last decade, glider data augmented by a foundation of time-series observations have shed light on biogeochemical dynamics occurring spatially at meso- and submesoscales and temporally on scales from diel to annual. We present insights gained from the synergy between glider observations, time-series measurements and remote sensing in the subtropical North Pacific. We focus on diel variability observed in dissolved oxygen and bio-optics and approaches to autonomously quantify net community production and gross primary production (GPP) as developed during the 2012 Hawaii Ocean Experiment - DYnamics of Light And Nutrients (HOE-DYLAN). Glider-based GPP measurements were extended to explore the relationship between GPP and mesoscale context over multiple years of Seaglider deployments.
Melbourne, Launice; Murnick, Jonathan; Chang, Taeun; Glass, Penny; Massaro, An N
2015-10-01
This study aims to evaluate individual regional brain biometrics and their association with developmental outcome in extremely low-birth-weight (ELBW) infants. This is a retrospective study evaluating term-equivalent magnetic resonance imaging (TE-MRI) from 27 ELBW infants with known developmental outcomes beyond 12 months corrected age. Regional biometric measurements were performed by a pediatric neuroradiologist blinded to outcome data. Measures included biparietal width, transcerebellar diameter (TCD), deep gray matter area (DGMA), ventricular dilatation, corpus callosum, and interhemispheric distance. The relationships between regional biometrics and Bayley-II developmental scores were evaluated with linear regression models. The study cohort had a mean±standard deviation birth weight of 684±150 g and gestational age of 24.6±2 weeks; 48% were male. DGMA was significantly associated with both cognitive and motor outcomes. Significant associations were also observed between TCD and the corpus callosum splenium with cognitive and motor outcomes, respectively. Other biometric measures were not associated with outcome (p>0.05). DGMA<10.26 cm2 was highly specific for poor motor and cognitive outcome. TE-MRI biometrics reflecting impaired deep gray matter, callosal, and cerebellar size are associated with worse early childhood cognitive and motor outcomes. DGMA may be the most robust single biometric measure to predict adverse developmental outcome in preterm survivors.
2015-05-01
Director, Operational Test and Evaluation. Department of Defense (DOD) Automated Biometric Identification System (ABIS) Version 1.2, Initial Operational Test and Evaluation Report, May 2015. This report covers the initial operational test and evaluation of the DOD Automated Biometric Identification System (ABIS) Version 1.2.
Improvements to surrogate data methods for nonstationary time series.
Lucio, J H; Valdés, R; Rodríguez, L R
2012-05-01
The method of surrogate data has been extensively applied to hypothesis testing of system linearity, when only one realization of the system, a time series, is known. Normally, surrogate data should preserve the linear stochastic structure and the amplitude distribution of the original series. Classical surrogate data methods (such as random permutation, amplitude adjusted Fourier transform, or iterative amplitude adjusted Fourier transform) are successful at preserving one or both of these features in stationary cases. However, they always produce stationary surrogates, hence existing nonstationarity could be interpreted as dynamic nonlinearity. Certain modifications have been proposed that additionally preserve some nonstationarity, at the expense of reproducing a great deal of nonlinearity. However, even those methods generally fail to preserve the trend (i.e., global nonstationarity in the mean) of the original series. This is the case of time series with unit roots in their autoregressive structure. Additionally, those methods, based on Fourier transform, either need first and last values in the original series to match, or they need to select a piece of the original series with matching ends. These conditions are often inapplicable and the resulting surrogates are adversely affected by the well-known artefact problem. In this study, we propose a simple technique that, applied within existing Fourier-transform-based methods, generates surrogate data that jointly preserve the aforementioned characteristics of the original series, including (even strong) trends. Moreover, our technique avoids the negative effects of end mismatch. Several artificial and real, stationary and nonstationary, linear and nonlinear time series are examined, in order to demonstrate the advantages of the methods. Corresponding surrogate data are produced with the classical and with the proposed methods, and the results are compared.
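For reference, the classical Fourier-transform surrogate that the paper improves upon can be sketched in a few lines: randomize the phases while keeping the amplitude spectrum, which preserves the linear correlation structure and destroys any nonlinearity. This is the plain phase-randomized variant; the amplitude-adjusted (AAFT/IAAFT) refinements and the paper's trend-preserving, end-matching modifications are not implemented here.

```python
import numpy as np

def ft_surrogate(x: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Phase-randomized (FT) surrogate: same amplitude spectrum, random phases."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0                      # keep the mean (DC component real)
    if len(x) % 2 == 0:
        phases[-1] = 0.0                 # Nyquist bin must stay real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

# Hypothetical usage: a noisy sine and one of its surrogates.
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * np.arange(512) / 32) + rng.normal(0, 0.2, 512)
s = ft_surrogate(x, rng)
```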
Use of Time-Series, ARIMA Designs to Assess Program Efficacy.
ERIC Educational Resources Information Center
Braden, Jeffery P.; And Others
1990-01-01
Illustrates use of time-series designs for determining efficacy of interventions with fictitious data describing drug-abuse prevention program. Discusses problems and procedures associated with time-series data analysis using Auto Regressive Integrated Moving Averages (ARIMA) models. Example illustrates application of ARIMA analysis for…