Building Simple Hidden Markov Models. Classroom Notes
ERIC Educational Resources Information Center
Ching, Wai-Ki; Ng, Michael K.
2004-01-01
Hidden Markov models (HMMs) are widely used in bioinformatics, speech recognition and many other areas. This note presents HMMs via the framework of classical Markov chain models. A simple example is given to illustrate the model. An estimation method for the transition probabilities of the hidden states is also discussed.
Phase transitions in Hidden Markov Models
NASA Astrophysics Data System (ADS)
Bechhoefer, John; Lathouwers, Emma
In Hidden Markov Models (HMMs), a Markov process is not directly accessible. In the simplest case, a two-state Markov model "emits" one of two "symbols" at each time step. We can think of these symbols as noisy measurements of the underlying state. With some probability, the symbol implies that the system is in one state when it is actually in the other. The ability to judge which state the system is in sets the efficiency of a Maxwell demon that observes state fluctuations in order to extract heat from a coupled reservoir. The state-inference problem is to infer the underlying state from such noisy measurements at each time step. We show that there can be a phase transition in such measurements: for measurement error rates below a certain threshold, the inferred state always matches the observation. For higher error rates, there can be continuous or discontinuous transitions to situations where keeping a memory of past observations improves the state estimate. We can partly understand this behavior by mapping the HMM onto a 1d random-field Ising model at zero temperature. We also present more recent work that explores a larger parameter space and more states. Research funded by NSERC, Canada.
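The filtering picture described above can be sketched in a few lines. This is an illustrative toy, not the authors' calculation; the switching probability `a` and symbol error rate `eps` are assumed values.

```python
def filter_states(symbols, a, eps):
    """Forward-filtered P(state = 1 | observations so far), one value per step."""
    p1 = 0.5  # symmetric prior over the two states
    posteriors = []
    for y in symbols:
        # predict: one step of the symmetric two-state Markov chain
        p1 = p1 * (1 - a) + (1 - p1) * a
        # update: each symbol reports the true state, flipped with probability eps
        like1 = (1 - eps) if y == 1 else eps
        like0 = eps if y == 1 else (1 - eps)
        p1 = p1 * like1 / (p1 * like1 + (1 - p1) * like0)
        posteriors.append(p1)
    return posteriors

# Assumed parameters: switching probability a = 0.1, error rate eps = 0.2.
post = filter_states([1, 1, 0, 1, 1], a=0.1, eps=0.2)
```

For small `eps` the posterior simply tracks the latest symbol; as `eps` grows, the memory carried in `p1` is what improves on the raw observation.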
Zipf exponent of trajectory distribution in the hidden Markov model
NASA Astrophysics Data System (ADS)
Bochkarev, V. V.; Lerner, E. Yu
2014-03-01
This paper is the first step in generalizing the previously obtained full classification of the asymptotic behavior of the probability of Markov chain trajectories to the case of hidden Markov models. The main goal is to study the power (Zipf) and nonpower asymptotics of the frequency list of trajectories of hidden Markov models and to obtain explicit formulae for the exponent of the power asymptotics. We consider several simple classes of hidden Markov models. We prove that the asymptotics for a hidden Markov model and for the corresponding Markov chain can be essentially different.
Estimating Neuronal Ageing with Hidden Markov Models
NASA Astrophysics Data System (ADS)
Wang, Bing; Pham, Tuan D.
2011-06-01
Neuronal degeneration is widely observed in normal ageing, while neurodegenerative diseases such as Alzheimer's disease accelerate neuronal degeneration, which can be regarded as faster ageing. Early intervention in such diseases could benefit subjects with the potential for a positive clinical outcome; therefore, early detection of disease-related brain structural alteration is required. In this paper, we propose a computational approach for modelling MRI-based structural alteration with ageing using a hidden Markov model. The proposed hidden Markov model-based brain structural model encodes the intracortical tissue/fluid distribution using the discrete wavelet transform and vector quantization. Further, it captures gray matter volume loss, which can reflect subtle intracortical changes with ageing. Experiments were carried out on healthy subjects to validate the model's accuracy and robustness. Results show that it can predict brain age with a prediction error of 1.98 years without training data, which improves on other age prediction methods.
Hidden Markov Model Analysis of Multichromophore Photobleaching
Messina, Troy C.; Kim, Hiyun; Giurleo, Jason T.; Talaga, David S.
2007-01-01
The interpretation of single-molecule measurements is greatly complicated by the presence of multiple fluorescent labels. However, many molecular systems of interest consist of multiple interacting components. We investigate this issue using multiply labeled dextran polymers that we intentionally photobleach to the background on a single-molecule basis. Hidden Markov models allow for unsupervised analysis of the data to determine the number of fluorescent subunits involved in the fluorescence intermittency of the 6-carboxy-tetramethylrhodamine labels by counting the discrete steps in fluorescence intensity. The Bayes information criterion allows us to distinguish between hidden Markov models that differ by the number of states, that is, the number of fluorescent molecules. We determine information-theoretical limits and show via Monte Carlo simulations that the hidden Markov model analysis approaches these theoretical limits. This technique has resolving power of one fluorescing unit up to as many as 30 fluorescent dyes with the appropriate choice of dye and adequate detection capability. We discuss the general utility of this method for determining aggregation-state distributions as could appear in many biologically important systems and its adaptability to general photometric experiments. PMID:16913765
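The model-selection step described above can be illustrated with the standard BIC formula; the log-likelihood values, observation count, and symbol count below are hypothetical placeholders, not fit results from the paper.

```python
import math

def bic(log_likelihood, n_params, n_obs):
    """Bayes information criterion: smaller is better."""
    return -2.0 * log_likelihood + n_params * math.log(n_obs)

def n_hmm_params(n_states, n_symbols):
    """Free parameters of a discrete HMM (rows of A, B, and pi each sum to 1)."""
    return n_states * (n_states - 1) + n_states * (n_symbols - 1) + (n_states - 1)

# Hypothetical maximized log-likelihoods for 2- vs 3-state fits to
# 1000 observations quantized into 4 intensity levels (placeholders only).
fits = {2: -1410.0, 3: -1395.0}
scores = {k: bic(ll, n_hmm_params(k, 4), 1000) for k, ll in fits.items()}
best = min(scores, key=scores.get)  # here the extra states do not pay for themselves
```

The penalty term is what lets models differing only in the number of states, i.e. the number of fluorescent molecules, be compared fairly.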
Active Inference for Binary Symmetric Hidden Markov Models
NASA Astrophysics Data System (ADS)
Allahverdyan, Armen E.; Galstyan, Aram
2015-10-01
We consider the active maximum a posteriori (MAP) inference problem for hidden Markov models (HMMs), where, given an initial MAP estimate of the hidden sequence, we select certain states in the sequence to label in order to improve the estimation accuracy of the remaining states. We focus on the binary symmetric HMM and employ its known mapping to the 1d Ising model in random fields. From the statistical physics viewpoint, the active MAP inference problem reduces to analyzing the ground state of the 1d Ising model under modified external fields. We develop an analytical approach and obtain a closed-form solution that relates the expected error reduction to model parameters under the specified active inference scheme. We then use this solution to determine the optimal active inference scheme in terms of error reduction, and examine the relation of those schemes to heuristic principles of uncertainty reduction and solution unicity.
Hidden Markov models for stochastic thermodynamics
NASA Astrophysics Data System (ADS)
Bechhoefer, John
2015-07-01
The formalism of state estimation and hidden Markov models can simplify and clarify the discussion of stochastic thermodynamics in the presence of feedback and measurement errors. After reviewing the basic formalism, we use it to shed light on a recent discussion of phase transitions in the optimized response of an information engine, for which measurement noise serves as a control parameter. The HMM formalism also shows that the value of additional information displays a maximum at intermediate signal-to-noise ratios. Finally, we discuss how systems open to information flow can apparently violate causality; the HMM formalism can quantify the performance gains due to such violations.
Learning Heterogeneous Hidden Markov Random Fields
Liu, Jie; Zhang, Chunming; Burnside, Elizabeth; Page, David
2014-01-01
Hidden Markov random fields (HMRFs) are conventionally assumed to be homogeneous in the sense that the potential functions are invariant across different sites. However in some biological applications, it is desirable to make HMRFs heterogeneous, especially when there exists some background knowledge about how the potential functions vary. We formally define heterogeneous HMRFs and propose an EM algorithm whose M-step combines a contrastive divergence learner with a kernel smoothing step to incorporate the background knowledge. Simulations show that our algorithm is effective for learning heterogeneous HMRFs and outperforms alternative binning methods. We learn a heterogeneous HMRF in a real-world study. PMID:25404989
Multiple alignment using hidden Markov models
Eddy, S.R.
1995-12-31
A simulated annealing method is described for training hidden Markov models and producing multiple sequence alignments from initially unaligned protein or DNA sequences. Simulated annealing in turn uses a dynamic programming algorithm for correctly sampling suboptimal multiple alignments according to their probability and a Boltzmann temperature factor. The quality of simulated annealing alignments is evaluated on structural alignments of ten different protein families, and compared to the performance of other HMM training methods and the ClustalW program. Simulated annealing is better able to find near-global optima in the multiple alignment probability landscape than the other tested HMM training methods. Neither ClustalW nor simulated annealing produces consistently better alignments than the other. Examination of the specific cases in which ClustalW outperforms simulated annealing, and vice versa, provides insight into the strengths and weaknesses of current hidden Markov model approaches.
Monitoring volcano activity through Hidden Markov Model
NASA Astrophysics Data System (ADS)
Cassisi, C.; Montalto, P.; Prestifilippo, M.; Aliotta, M.; Cannata, A.; Patanè, D.
2013-12-01
During 2011-2013, Mt. Etna was mainly characterized by cyclic occurrences of lava fountains, totaling 38 episodes. During this time interval the states of Etna volcano (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN), whose automatic recognition is very useful for monitoring purposes, turned out to be strongly related to the trend of the RMS (Root Mean Square) of the seismic signal recorded by stations close to the summit area. Since the behavior of the RMS time series is considered stochastic, we can model the system generating its values, assumed to be a Markov process, by using hidden Markov models (HMMs). HMMs are a powerful tool for modeling any time-varying series. HMM analysis seeks to recover the sequence of hidden states from the observed emissions. In our framework, the observed emissions are characters generated by the SAX (Symbolic Aggregate approXimation) technique, which maps RMS time series values to discrete symbolic emissions. The experiments show how it is possible to infer volcano states by means of HMMs and SAX.
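The SAX symbolization step can be sketched as follows; the alphabet size, breakpoints, and input values are illustrative assumptions, not the study's actual configuration.

```python
import statistics

BREAKPOINTS = [-0.43, 0.43]  # Gaussian cut points for an equiprobable 3-letter alphabet

def sax_symbolize(series, alphabet="abc"):
    """Z-normalize a series and map each value to a letter (SAX-style)."""
    mu = statistics.fmean(series)
    sigma = statistics.pstdev(series)
    letters = []
    for x in series:
        z = (x - mu) / sigma if sigma > 0 else 0.0
        idx = sum(z > b for b in BREAKPOINTS)  # number of breakpoints below z
        letters.append(alphabet[idx])
    return "".join(letters)

# Illustrative RMS-like values: low tremor, a burst, then low again.
word = sax_symbolize([1.0, 1.1, 5.0, 9.0, 9.2, 1.2])  # -> "aabcca"
```

The resulting character string is the discrete emission sequence that an HMM can then decode into volcano states.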
The infinite hidden Markov random field model.
Chatzis, Sotirios P; Tsechpenakis, Gabriel
2010-06-01
Hidden Markov random field (HMRF) models are widely used for image segmentation, as they appear naturally in problems where a spatially constrained clustering scheme is asked for. A major limitation of HMRF models concerns the automatic selection of the proper number of their states, i.e., the number of region clusters derived by the image segmentation procedure. Existing methods, including likelihood- or entropy-based criteria, and reversible Markov chain Monte Carlo methods, usually tend to yield noisy model size estimates while imposing heavy computational requirements. Recently, Dirichlet process (DP, infinite) mixture models have emerged as a cornerstone of nonparametric Bayesian statistics and as promising candidates for clustering applications where the number of clusters is unknown a priori; infinite mixture models based on the original DP or spatially constrained variants of it have been applied in unsupervised image segmentation applications showing promising results. Under this motivation, to resolve the aforementioned issues of HMRF models, in this paper, we introduce a nonparametric Bayesian formulation for the HMRF model, the infinite HMRF model, formulated on the basis of a joint Dirichlet process mixture (DPM) and Markov random field (MRF) construction. We derive an efficient variational Bayesian inference algorithm for the proposed model, and we experimentally demonstrate its advantages over competing methodologies.
Defect Detection Using Hidden Markov Random Fields
Dogandzic, Aleksandar; Eua-anant, Nawanat; Zhang Benhong
2005-04-09
We derive an approximate maximum a posteriori (MAP) method for detecting NDE defect signals using hidden Markov random fields (HMRFs). In the proposed HMRF framework, a set of spatially distributed NDE measurements is assumed to form a noisy realization of an underlying random field that has a simple structure with Markovian dependence. Here, the random field describes the defect signals to be estimated or detected. The HMRF models incorporate measurement locations into the statistical analysis, which is important in scenarios where the same defect affects measurements at multiple locations. We also discuss initialization of the proposed HMRF detector and apply it to simulated eddy-current data and experimental ultrasonic C-scan data from an inspection of a cylindrical Ti 6-4 billet.
Mixture Hidden Markov Models in Finance Research
NASA Astrophysics Data System (ADS)
Dias, José G.; Vermunt, Jeroen K.; Ramos, Sofia
Finite mixture models have proven to be a powerful framework whenever unobserved heterogeneity cannot be ignored. We introduce into finance research the Mixture Hidden Markov Model (MHMM), which takes into account time and space heterogeneity simultaneously. This approach is flexible in the sense that it can deal with the specific features of financial time series data, such as asymmetry, kurtosis, and unobserved heterogeneity. The methodology is applied to model simultaneously 12 time series of Asian stock market indexes. Because we selected a heterogeneous sample of countries including both developed and emerging countries, we expect that heterogeneity in market returns due to country idiosyncrasies will show up in the results. The best-fitting model was the one with two clusters at the country level, with different dynamics between the two regimes.
Plume mapping via hidden Markov methods.
Farrell, J A; Pang, Shuo; Li, Wei
2003-01-01
This paper addresses the problem of mapping likely locations of a chemical source using an autonomous vehicle operating in a fluid flow. The paper reviews biological plume-tracing concepts, reviews previous strategies for vehicle-based plume tracing, and presents a new plume mapping approach based on hidden Markov methods (HMM). HMM provide efficient algorithms for predicting the likelihood of odor detection versus position, the likelihood of source location versus position, the most likely path taken by the odor to a given location, and the path between two points most likely to result in odor detection. All four are useful for solving the odor source localization problem using an autonomous vehicle. The vehicle is assumed to be capable of detecting above threshold chemical concentration and sensing the fluid flow velocity at the vehicle location. The fluid flow is assumed to vary with space and time, and to have a high Reynolds number (Re>10). PMID:18238238
Facies Reconstruction by hidden Markov models
NASA Astrophysics Data System (ADS)
Panzeri, M.; Della Rossa, E.; Dovera, L.; Riva, M.; Guadagnini, A.
2012-04-01
The inherent heterogeneity of natural aquifer complex systems can be properly described by a doubly stochastic composite medium approach, where distributions of geomaterials (facies) and attributes, e.g., hydraulic conductivity and porosity, can be uncertain. We focus on the reconstruction of the spatial distribution of facies within a porous medium. The key contribution of our work is to provide a methodology for evaluating the unknown facies distribution while maintaining the spatial correlation between the geological bodies. The latter is considered to be known a priori. The geostatistical model for the spatial distribution of facies is defined in the framework of multiple-point geostatistics, relying on transition probabilities (Stien and Kolbjornsen, 2011). Specifically, we model the facies distribution over the domain by employing the notion of Hidden Markov Model. The hidden states of the system are provided by the value of the indicator function at each cell of the grid, while the petrophysical properties of the soil (e.g., the permeability) are considered as known. In this context, the key issue is the assessment of the spatial architecture of the geological bodies within the domain of interest upon maximizing the probability associated with a given permeability distribution. This objective is achieved through the Viterbi algorithm. This algorithm was initially introduced for signal denoising problems (e.g., Rabiner, 1989) and has been extended here to a two-dimensional system, following the approach proposed by Li et al. (2000) according to the following steps: (1) the parameters of the transitional probabilities of the facies distribution are estimated from a given training image; (2) the facies distribution maximizing the probability of occurrence considering the probability of (i) facies distribution, (ii) conductivity distribution and (iii) their joint conditional probability is then reconstructed. We demonstrate the reliability and advantage of
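A minimal one-dimensional Viterbi recursion (the paper extends the algorithm to two dimensions) can be sketched as follows; the facies names, observation symbols, and probabilities are invented for illustration.

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable hidden-state path; log-probabilities avoid underflow."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = []
    for y in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            prev, score = max(
                ((r, V[-1][r] + math.log(trans_p[r][s])) for r in states),
                key=lambda t: t[1],
            )
            col[s] = score + math.log(emit_p[s][y])
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    state = max(V[-1], key=V[-1].get)  # best final state, then trace back
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return path[::-1]

# Two invented facies, two permeability classes: states persist, emissions inform.
path = viterbi(
    ["lo", "lo", "hi", "hi"], ["sand", "shale"],
    start_p={"sand": 0.5, "shale": 0.5},
    trans_p={"sand": {"sand": 0.9, "shale": 0.1}, "shale": {"sand": 0.1, "shale": 0.9}},
    emit_p={"sand": {"hi": 0.8, "lo": 0.2}, "shale": {"hi": 0.2, "lo": 0.8}},
)
```

Persistent transition probabilities are what encode the spatial correlation of geological bodies: the decoded path switches facies only where the observations insist on it.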
Hidden Markov models in automatic speech recognition
NASA Astrophysics Data System (ADS)
Wrzoskowicz, Adam
1993-11-01
This article describes a method for constructing an automatic speech recognition system based on hidden Markov models (HMMs). The author discusses the basic concepts of HMM theory and the application of these models to the analysis and recognition of speech signals. The author provides algorithms which make it possible to train the ASR system and recognize signals on the basis of distinct stochastic models of selected speech sound classes. The author describes the specific components of the system and the procedures used to model and recognize speech. The author discusses problems associated with the choice of optimal signal detection and parameterization characteristics and their effect on the performance of the system. The author presents different options for the choice of speech signal segments and their consequences for the ASR process. The author gives special attention to the use of lexical, syntactic, and semantic information for the purpose of improving the quality and efficiency of the system. The author also describes an ASR system developed by the Speech Acoustics Laboratory of the IBPT PAS. The author discusses the results of experiments on the effect of noise on the performance of the ASR system and describes methods of constructing HMMs designed to operate in a noisy environment. The author also describes a language for human-robot communications which was defined as a complex multilevel network from an HMM of speech sounds geared towards Polish inflections. The author also added mandatory lexical and syntactic rules to the system for its communications vocabulary.
Systolic Architectures For Hidden Markov Models
NASA Astrophysics Data System (ADS)
Hwang, J. N.; Vlontzos, J. A.; Kung, S. Y.
1988-10-01
This paper proposes a unidirectional ring systolic architecture for implementing hidden Markov models (HMMs). This array architecture maximizes the strength of VLSI in terms of intensive and pipelined computing and yet circumvents the limitation on communication. Both the scoring and learning phases of an HMM are formulated as a consecutive matrix-vector multiplication problem, which can be executed in a fully pipelined fashion (100% utilization efficiency) by using a unidirectional ring systolic architecture. By appropriately scheduling the algorithm, which combines the operations of the backward evaluation procedure and the reestimation algorithm at the same time, we can use this systolic HMM most efficiently. The systolic HMM can also be easily adapted to the left-to-right HMM by using bidirectional semi-global links with significant time saving. This architecture can also incorporate the scaling scheme with little extra effort in the computations of the forward and backward evaluation variables to prevent the frequently encountered numerical underflow problems. We also discuss a possible implementation of this proposed architecture using the Inmos transputer (T-800) as the building block.
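The scoring phase described above is a chain of matrix-vector products, which is what makes it amenable to a pipelined systolic array; the sketch below also shows the per-step scaling that prevents the numerical underflow mentioned in the paper. This is a plain sequential illustration, not a systolic implementation, and the toy model is an assumption.

```python
import math

def forward_score(obs, A, B, pi):
    """Log-likelihood of an observation sequence via the scaled forward pass."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    log_prob = 0.0
    for t in range(len(obs)):
        if t > 0:
            # one matrix-vector product per time step: alpha <- diag(b_y) A^T alpha
            alpha = [B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n))
                     for j in range(n)]
        c = sum(alpha)           # scale factor; accumulating its log avoids underflow
        log_prob += math.log(c)
        alpha = [a / c for a in alpha]
    return log_prob

# Toy check: each state deterministically emits its own symbol, so any
# length-3 sequence has probability (1/2)**3 under this uniform chain.
lp = forward_score([0, 1, 0],
                   A=[[0.5, 0.5], [0.5, 0.5]],
                   B=[[1.0, 0.0], [0.0, 1.0]],
                   pi=[0.5, 0.5])
prob = math.exp(lp)
```

Because each step touches `alpha` only through one matrix-vector product, the recursion maps naturally onto a ring of processing elements, one state per element.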
Spatiotemporal pattern recognition using hidden Markov models
NASA Astrophysics Data System (ADS)
Fielding, Kenneth H.; Ruck, Dennis W.; Rogers, Steven K.; Welsh, Byron M.; Oxley, Mark E.
1993-10-01
A spatio-temporal method for identifying objects contained in an image sequence is presented. The Hidden Markov Model (HMM) technique is used as the classification algorithm, making classification decisions based on a spatio-temporal sequence of observed object features. A five class problem is considered. Classification accuracies of 100% and 99.7% are obtained for sequences of images generated over two separate regions of viewing positions. HMMs trained on image sequences of the objects moving in opposite directions showed a 98.1% successful classification rate by class and direction of movement. The HMM technique proved robust to image corruption with additive correlated noise and had a higher accuracy than a single look nearest neighbor method. A real image sequence of one of the objects used was successfully recognized with the HMMs trained on synthetic data. This study shows the temporal changes that observed feature vectors undergo due to object motion hold information that can yield superior classification accuracy when compared to single frame techniques.
Stochastic motif extraction using hidden Markov model
Fujiwara, Yukiko; Asogawa, Minoru; Konagaya, Akihiko
1994-12-31
In this paper, we study the application of an HMM (hidden Markov model) to the problem of representing protein sequences by a stochastic motif. A stochastic protein motif represents the small segments of protein sequences that have a certain function or structure. The stochastic motif, represented by an HMM, has conditional probabilities to deal with the stochastic nature of the motif. This HMM directly reflects the characteristics of the motif, such as a protein periodical structure or grouping. In order to obtain the optimal HMM, we developed the "iterative duplication method" for HMM topology learning. It starts from a small fully-connected network and iterates the network generation and parameter optimization until it achieves sufficient discrimination accuracy. Using this method, we obtained an HMM for a leucine zipper motif. Compared to a symbolic pattern representation with an accuracy of 14.8 percent, the HMM achieved 79.3 percent in prediction. Additionally, the method can obtain an HMM for various types of zinc finger motifs, and it might separate the mixed data. We demonstrated that this approach is applicable to the validation of protein databases; a constructed HMM has indicated that one protein sequence annotated as a "leucine-zipper like sequence" in the database is quite different from other leucine-zipper sequences in terms of likelihood, and we found this discrimination is plausible.
Time series segmentation with shifting means hidden markov models
NASA Astrophysics Data System (ADS)
Kehagias, Ath.; Fortin, V.
2006-08-01
We present a new family of hidden Markov models and apply these to the segmentation of hydrological and environmental time series. The proposed hidden Markov models have a discrete state space and their structure is inspired from the shifting means models introduced by Chernoff and Zacks and by Salas and Boes. An estimation method inspired from the EM algorithm is proposed, and we show that it can accurately identify multiple change-points in a time series. We also show that the solution obtained using this algorithm can serve as a starting point for a Monte-Carlo Markov chain Bayesian estimation method, thus reducing the computing time needed for the Markov chain to converge to a stationary distribution.
MODELING PAVEMENT DETERIORATION PROCESSES BY POISSON HIDDEN MARKOV MODELS
NASA Astrophysics Data System (ADS)
Nam, Le Thanh; Kaito, Kiyoyuki; Kobayashi, Kiyoshi; Okizuka, Ryosuke
In pavement management, it is important to estimate the lifecycle cost, which is composed of the expenses for repairing local damages, including potholes, and for repairing and rehabilitating the surface and base layers of pavements, including overlays. In this study, a model is produced under the assumption that the deterioration process of pavement is a complex one that includes local damages, which occur frequently, and the deterioration of the surface and base layers, which progresses slowly. The variation in pavement soundness is expressed by a Markov deterioration model, and a Poisson hidden Markov deterioration model, in which the frequency of local damage depends on the distribution of pavement soundness, is formulated. In addition, the authors suggest a model estimation method using the Markov Chain Monte Carlo (MCMC) method, and attempt to demonstrate the applicability of the proposed Poisson hidden Markov deterioration model by studying concrete application cases.
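The core idea, counts of local damage whose Poisson rate depends on a hidden soundness state, can be sketched as follows; the state names, rates, and counts are illustrative assumptions, not values from the study.

```python
import math

RATES = {"good": 0.2, "fair": 1.0, "poor": 3.0}  # assumed expected damages per period

def poisson_pmf(k, lam):
    """Probability of observing k damage events under rate lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def path_loglik(counts, states):
    """Joint log-likelihood of observed damage counts given a soundness path."""
    return sum(math.log(poisson_pmf(k, RATES[s])) for k, s in zip(counts, states))

# A rising damage count is far more likely under a deteriorating state path
# than under a path that stays in good condition throughout.
deteriorating = path_loglik([0, 1, 3, 4], ["good", "fair", "poor", "poor"])
stays_good = path_loglik([0, 1, 3, 4], ["good", "good", "good", "good"])
```

Comparing such path likelihoods is the building block that MCMC estimation of the hidden deterioration process works with.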
Unsupervised Segmentation of Hidden Semi-Markov Non Stationary Chains
NASA Astrophysics Data System (ADS)
Lapuyade-Lahorgue, Jérôme; Pieczynski, Wojciech
2006-11-01
In the classical hidden Markov chain (HMC) model we have a hidden chain X, which is a Markov chain, and an observed chain Y. HMCs are widely used; however, in some situations they have to be replaced by the more general "hidden semi-Markov chains" (HSMC), which are particular "triplet Markov chains" (TMC) T = (X, U, Y), where the auxiliary chain U models the semi-Markovianity of X. On the other hand, non-stationary classical HMCs can also be modeled by a stationary triplet Markov chain with, as a consequence, the possibility of parameter estimation. The aim of this paper is to use both properties simultaneously. We consider a non-stationary HSMC and model it as a TMC T = (X, U1, U2, Y), where U1 models the semi-Markovianity and U2 models the non-stationarity. The TMC T being itself stationary, all parameters can be estimated by the general "Iterative Conditional Estimation" (ICE) method, which leads to unsupervised segmentation. We present some experiments showing the interest of the new model and related processing in the area of image segmentation.
Multivariate longitudinal data analysis with mixed effects hidden Markov models.
Raffa, Jesse D; Dubin, Joel A
2015-09-01
Multiple longitudinal responses are often collected as a means to capture relevant features of the true outcome of interest, which is often hidden and not directly measurable. We outline an approach which models these multivariate longitudinal responses as generated from a hidden disease process. We propose a class of models which uses a hidden Markov model with separate but correlated random effects between multiple longitudinal responses. This approach was motivated by a smoking cessation clinical trial, where a bivariate longitudinal response involving both a continuous and a binomial response was collected for each participant to monitor smoking behavior. A Bayesian method using Markov chain Monte Carlo is used. Comparison of separate univariate response models to the bivariate response models was undertaken. Our methods are demonstrated on the smoking cessation clinical trial dataset, and properties of our approach are examined through extensive simulation studies. PMID:25761965
Multiple testing for neuroimaging via hidden Markov random field.
Shu, Hai; Nan, Bin; Koeppe, Robert
2015-09-01
Traditional voxel-level multiple testing procedures in neuroimaging, mostly p-value based, often ignore the spatial correlations among neighboring voxels and thus suffer from substantial loss of power. We extend the local-significance-index based procedure originally developed for the hidden Markov chain models, which aims to minimize the false nondiscovery rate subject to a constraint on the false discovery rate, to three-dimensional neuroimaging data using a hidden Markov random field model. A generalized expectation-maximization algorithm for maximizing the penalized likelihood is proposed for estimating the model parameters. Extensive simulations show that the proposed approach is more powerful than conventional false discovery rate procedures. We apply the method to the comparison between mild cognitive impairment, a disease status with increased risk of developing Alzheimer's or another dementia, and normal controls in the FDG-PET imaging study of the Alzheimer's Disease Neuroimaging Initiative. PMID:26012881
A Hidden Markov Approach to Modeling Interevent Earthquake Times
NASA Astrophysics Data System (ADS)
Chambers, D.; Ebel, J. E.; Kafka, A. L.; Baglivo, J.
2003-12-01
A hidden Markov process, in which the interevent time distribution is a mixture of exponential distributions with different rates, is explored as a model for seismicity that does not follow a Poisson process. In a general hidden Markov model, one assumes that a system can be in any of a finite number k of states and there is a random variable of interest whose distribution depends on the state in which the system resides. The system moves probabilistically among the states according to a Markov chain; that is, given the history of visited states up to the present, the conditional probability that the next state is a specified one depends only on the present state. Thus the transition probabilities are specified by a k by k stochastic matrix. Furthermore, it is assumed that the actual states are unobserved (hidden) and that only the values of the random variable are seen. From these values, one wishes to estimate the sequence of states, the transition probability matrix, and any parameters used in the state-specific distributions. The hidden Markov process was applied to a data set of 110 interevent times for earthquakes in New England from 1975 to 2000. Using the Baum-Welch method (Baum et al., Ann. Math. Statist. 41, 164-171), we estimate the transition probabilities, find the most likely sequence of states, and estimate the k means of the exponential distributions. Using k=2 states, we found the data were fit well by a mixture of two exponential distributions, with means of approximately 5 days and 95 days. The steady state model indicates that after approximately one fourth of the earthquakes, the waiting time until the next event had the first exponential distribution and three fourths of the time it had the second. Three and four state models were also fit to the data; the data were inconsistent with a three state model but were well fit by a four state model.
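A simulation consistent with the fitted two-state model above (exponential means of roughly 5 and 95 days, with about one quarter of events in the fast state) can be sketched as follows; the transition matrix is just one of many consistent with that stationary occupancy and is an assumption.

```python
import random

random.seed(0)  # reproducible illustration

MEANS = {"fast": 5.0, "slow": 95.0}             # days, as estimated in the abstract
TRANS = {"fast": {"fast": 0.25, "slow": 0.75},  # identical rows: the simplest
         "slow": {"fast": 0.25, "slow": 0.75}}  # matrix with occupancy (1/4, 3/4)

def simulate(n, state="slow"):
    """Draw n interevent times from the two-state exponential-mixture HMM."""
    times = []
    for _ in range(n):
        state = random.choices(list(TRANS[state]),
                               weights=list(TRANS[state].values()))[0]
        times.append(random.expovariate(1.0 / MEANS[state]))
    return times

waits = simulate(10000)
mean_wait = sum(waits) / len(waits)  # should sit near 0.25*5 + 0.75*95 = 72.5 days
```

Fitting such synthetic data with Baum-Welch is a useful sanity check that the estimated means and occupancies can actually be recovered from ~100 interevent times.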
Hidden Markov Models: The Best Models for Forager Movements?
Joo, Rocio; Bertrand, Sophie; Tam, Jorge; Fablet, Ronan
2013-01-01
One major challenge in the emerging field of movement ecology is the inference of behavioural modes from movement patterns. This has been mainly addressed through Hidden Markov models (HMMs). We propose here to evaluate two sets of alternative and state-of-the-art modelling approaches. First, we consider hidden semi-Markov models (HSMMs). They may better represent the behavioural dynamics of foragers since they explicitly model the duration of the behavioural modes. Second, we consider discriminative models which state the inference of behavioural modes as a classification issue, and may take better advantage of multivariate and nonlinear combinations of movement pattern descriptors. For this work, we use a dataset of >200 trips from human foragers, Peruvian fishermen targeting anchovy. Their movements were recorded through a Vessel Monitoring System (∼1 record per hour), while their behavioural modes (fishing, searching and cruising) were reported by on-board observers. We compare the efficiency of hidden Markov, hidden semi-Markov, and three discriminative models (random forests, artificial neural networks and support vector machines) for inferring the fishermen behavioural modes, using a cross-validation procedure. HSMMs show the highest accuracy (80%), significantly outperforming HMMs and discriminative models. Simulations show that, with data of higher temporal resolution, HSMMs reach nearly 100% accuracy. Our results demonstrate to what extent the sequential nature of movement is critical for accurately inferring behavioural modes from a trajectory, and we strongly recommend the use of HSMMs for such purposes. In addition, this work opens perspectives on the use of hybrid HSMM-discriminative models, where a discriminative setting for the observation process of HSMMs could greatly improve inference performance. PMID:24058400
Infinite Factorial Unbounded-State Hidden Markov Model.
Valera, Isabel; Ruiz, Francisco J R; Perez-Cruz, Fernando
2016-09-01
There are many scenarios in artificial intelligence, signal processing or medicine in which a temporal sequence consists of several unknown overlapping independent causes, and we are interested in accurately recovering those canonical causes. Factorial hidden Markov models (FHMMs) present the versatility to provide a good fit to these scenarios. However, in some scenarios, the number of causes or the number of states of the FHMM cannot be known or limited a priori. In this paper, we propose an infinite factorial unbounded-state hidden Markov model (IFUHMM), in which the number of parallel hidden Markov models (HMMs) and states in each HMM are potentially unbounded. We rely on a Bayesian nonparametric (BNP) prior over integer-valued matrices, in which the columns represent the Markov chains, the rows the time indexes, and the integers the state for each chain and time instant. First, we extend the existing infinite factorial binary-state HMM to allow for any number of states. Then, we modify this model to allow for an unbounded number of states and derive an MCMC-based inference algorithm that properly deals with the trade-off between the unbounded number of states and chains. We illustrate the performance of our proposed models in the power disaggregation problem. PMID:26571511
A coupled hidden Markov model for disease interactions.
Sherlock, Chris; Xifara, Tatiana; Telfer, Sandra; Begon, Mike
2013-08-01
To investigate interactions between parasite species in a host, a population of field voles was studied longitudinally, with presence or absence of six different parasites measured repeatedly. Although trapping sessions were regular, a different set of voles was caught at each session, leading to incomplete profiles for all subjects. We use a discrete time hidden Markov model for each disease with transition probabilities dependent on covariates via a set of logistic regressions. For each disease the hidden states for each of the other diseases at a given time point form part of the covariate set for the Markov transition probabilities from that time point. This allows us to gauge the influence of each parasite species on the transition probabilities for each of the other parasite species. Inference is performed via a Gibbs sampler, which cycles through each of the diseases, first using an adaptive Metropolis-Hastings step to sample from the conditional posterior of the covariate parameters for that particular disease given the hidden states for all other diseases and then sampling from the hidden states for that disease given the parameters. We find evidence for interactions between several pairs of parasites and of an acquired immune response for two of the parasites. PMID:24223436
Efficient Parallel Learning of Hidden Markov Chain Models on SMPs
NASA Astrophysics Data System (ADS)
Li, Lei; Fu, Bin; Faloutsos, Christos
Quad-core CPUs have become a common configuration for today's desktop computers. The increasing number of processors on a single chip opens new opportunities for parallel computing. Our goal is to make use of multi-core as well as multi-processor architectures to speed up large-scale data mining algorithms. In this paper, we present a general parallel learning framework, Cut-And-Stitch, for training hidden Markov chain models. In particular, we propose two model-specific variants: CAS-LDS for learning linear dynamical systems (LDS) and CAS-HMM for learning hidden Markov models (HMMs). Our main contribution is a novel method to handle the data dependencies due to the chain structure of hidden variables, so as to parallelize the EM-based parameter learning algorithm. We implement CAS-LDS and CAS-HMM using OpenMP on two supercomputers and a quad-core commercial desktop. The experimental results show that parallel algorithms using Cut-And-Stitch achieve comparable accuracy and almost linear speedups over the traditional serial version.
AIRWAY LABELING USING A HIDDEN MARKOV TREE MODEL
Ross, James C.; Díaz, Alejandro A.; Okajima, Yuka; Wassermann, Demian; Washko, George R.; Dy, Jennifer; San José Estépar, Raúl
2014-01-01
We present a novel airway labeling algorithm based on a Hidden Markov Tree Model (HMTM). We obtain a collection of discrete points along the segmented airway tree using particle sampling [1] and establish topology using Kruskal’s minimum spanning tree algorithm. Following this, our HMTM algorithm probabilistically assigns labels to each point. While alternative methods label airway branches out to the segmental level, we describe a general method and demonstrate its performance out to the subsubsegmental level (two generations further than previously published approaches). We present results on a collection of 25 computed tomography (CT) datasets taken from a Chronic Obstructive Pulmonary Disease (COPD) study. PMID:25436039
Hidden Markov Modeling for Weigh-In-Motion Estimation
Abercrombie, Robert K; Ferragut, Erik M; Boone, Shane
2012-01-01
This paper describes a hidden Markov model for reducing the weight measurement error that arises from the complex oscillations of a moving vehicle, treated as a system of discrete masses. At present, oscillations are reduced by requiring a smooth, flat, level approach and a constant, slow speed in a straight line. The model instead uses this inherent variability to help determine the true total weight and individual axle weights of a vehicle. The weight distribution dynamics of a generic moving vehicle were simulated. The model estimation converged to within 1% of the true mass for simulated data. The computational demands of this method, while much greater than those of simple averages, amount to only seconds of run time on a desktop computer.
Improved Hidden-Markov-Model Method Of Detecting Faults
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J.
1994-01-01
Method of automated, continuous monitoring to detect faults in complicated dynamic system based on hidden-Markov-model (HMM) approach. Simpler than another, recently proposed HMM method, but retains advantages of that method, including low susceptibility to false alarms, no need for mathematical model of dynamics of system under normal or faulty conditions, and ability to detect subtle changes in characteristics of monitored signals. Examples of systems monitored by use of this method include motors, turbines, and pumps critical in their applications; chemical-processing plants; powerplants; and biomedical systems.
Self-Organizing Hidden Markov Model Map (SOHMMM).
Ferles, Christos; Stafylopatis, Andreas
2013-12-01
A hybrid approach combining the Self-Organizing Map (SOM) and the Hidden Markov Model (HMM) is presented. The Self-Organizing Hidden Markov Model Map (SOHMMM) establishes a cross-section between the theoretic foundations and algorithmic realizations of its constituents. The respective architectures and learning methodologies are fused in an attempt to meet the increasing requirements imposed by the properties of deoxyribonucleic acid (DNA), ribonucleic acid (RNA), and protein chain molecules. The fusion and synergy of the SOM unsupervised training and the HMM dynamic programming algorithms bring forth a novel on-line gradient descent unsupervised learning algorithm, which is fully integrated into the SOHMMM. Since the SOHMMM carries out probabilistic sequence analysis with little or no prior knowledge, it can have a variety of applications in clustering, dimensionality reduction and visualization of large-scale sequence spaces, and also, in sequence discrimination, search and classification. Two series of experiments based on artificial sequence data and splice junction gene sequences demonstrate the SOHMMM's characteristics and capabilities. PMID:24001407
Trajectory classification using switched dynamical hidden Markov models.
Nascimento, Jacinto C; Figueiredo, Mario; Marques, Jorge S
2010-05-01
This paper proposes an approach for recognizing human activities (more specifically, pedestrian trajectories) in video sequences, in a surveillance context. A system for automatic processing of video information for surveillance purposes should be capable of detecting, recognizing, and collecting statistics of human activity, reducing human intervention as much as possible. In the method described in this paper, human trajectories are modeled as a concatenation of segments produced by a set of low level dynamical models. These low level models are estimated in an unsupervised fashion, based on a finite mixture formulation, using the expectation-maximization (EM) algorithm; the number of models is automatically obtained using a minimum message length (MML) criterion. This leads to a parsimonious set of models tuned to the complexity of the scene. We describe the switching among the low-level dynamic models by a hidden Markov chain; thus, the complete model is termed a switched dynamical hidden Markov model (SD-HMM). The performance of the proposed method is illustrated with real data from two different scenarios: a shopping center and a university campus. A set of human activities in both scenarios is successfully recognized by the proposed system. These experiments show the ability of our approach to properly describe trajectories with sudden changes.
Behavior Detection using Confidence Intervals of Hidden Markov Models
Griffin, Christopher H
2009-01-01
Markov models are commonly used to analyze real-world problems. Their combination of discrete states and stochastic transitions suits applications with both deterministic and stochastic components. Hidden Markov Models (HMMs) are a class of Markov model commonly used in pattern recognition. Currently, HMMs recognize patterns using a maximum likelihood approach. One major drawback of this approach is that data observations are mapped to HMMs without considering the number of data samples available. Another problem is that this approach is only useful for choosing between HMMs. It does not provide a criterion for determining whether or not a given HMM adequately matches the data stream. In this work, we recognize complex behaviors using HMMs and confidence intervals. The certainty of a data match increases with the number of data samples considered. Receiver Operating Characteristic curves are used to find the optimal threshold for either accepting or rejecting an HMM description. We present one example using a family of HMMs to show the utility of the proposed approach. A second example, using models extracted from a database of consumer purchases, provides additional evidence that this approach can perform better than existing techniques.
ENSO informed Drought Forecasting Using Nonhomogeneous Hidden Markov Chain Model
NASA Astrophysics Data System (ADS)
Kwon, H.; Yoo, J.; Kim, T.
2013-12-01
The study aims at developing a new scheme to investigate the potential use of ENSO (El Niño/Southern Oscillation) for drought forecasting. In this regard, the objective of this study is to extend a previously developed nonhomogeneous hidden Markov chain model (NHMM) to identify climate states associated with drought that can potentially be used to forecast drought conditions using climate information. As the target variable for forecasting, the SPI (standardized precipitation index) is mainly utilized. This study collected monthly precipitation data over 56 stations covering more than 30 years, and K-means cluster analysis using drought properties was applied to partition regions into mutually exclusive clusters. In this study, six main clusters were distinguished through the regionalization procedure. For each cluster, the NHMM was applied to estimate the transition probability of hidden states as well as drought conditions informed by large-scale climate indices (e.g. SOI, Nino1.2, Nino3, Nino3.4, MJO and PDO). The NHMM coupled with large-scale climate information shows promise as a technique for forecasting drought scenarios. A more detailed explanation of large-scale climate patterns associated with the identified hidden states will be provided with anomaly composites of SSTs and SLPs. Acknowledgement: This research was supported by a grant (11CTIPC02) from the Construction Technology Innovation Program (CTIP) funded by the Ministry of Land, Transport and Maritime Affairs of the Korean government.
Decoding coalescent hidden Markov models in linear time
Harris, Kelley; Sheehan, Sara; Kamm, John A.; Song, Yun S.
2014-01-01
In many areas of computational biology, hidden Markov models (HMMs) have been used to model local genomic features. In particular, coalescent HMMs have been used to infer ancient population sizes, migration rates, divergence times, and other parameters such as mutation and recombination rates. As more loci, sequences, and hidden states are added to the model, however, the runtime of coalescent HMMs can quickly become prohibitive. Here we present a new algorithm for reducing the runtime of coalescent HMMs from quadratic in the number of hidden time states to linear, without making any additional approximations. Our algorithm can be incorporated into various coalescent HMMs, including the popular method PSMC for inferring variable effective population sizes. Here we implement this algorithm to speed up our demographic inference method diCal, which is equivalent to PSMC when applied to a sample of two haplotypes. We demonstrate that the linear-time method can reconstruct a population size change history more accurately than the quadratic-time method, given similar computation resources. We also apply the method to data from the 1000 Genomes project, inferring a high-resolution history of size changes in the European population. PMID:25340178
Hidden Markov Models for Detecting Aseismic Events in Southern California
NASA Astrophysics Data System (ADS)
Granat, R.
2004-12-01
We employ a hidden Markov model (HMM) to segment surface displacement time series collected by the Southern California Integrated Geodetic Network (SCIGN). These segmented time series are then used to detect regional events by observing the number of simultaneous mode changes across the network; if a large number of stations change at the same time, that indicates an event. The hidden Markov model approach assumes that the observed data have been generated by an unobservable dynamical statistical process. The process is of a particular form such that each observation is coincident with the system being in a particular discrete state, which is interpreted as a behavioral mode. The dynamics of the model are constructed so that the next state depends directly only on the current state -- it is a first order Markov process. The model is completely described by a set of parameters: the initial state probabilities, the first order Markov chain state-to-state transition probabilities, and the probability distribution of observable outputs associated with each state. The result of this approach is that our segmentation decisions are based entirely on statistical changes in the behavior of the observed daily displacements. In general, finding the optimal model parameters to fit the data is a difficult problem. We present an innovative model fitting method that is unsupervised (i.e., it requires no labeled training data) and uses a regularized version of the expectation-maximization (EM) algorithm to ensure that model solutions are both robust with respect to initial conditions and of high quality. We demonstrate the reliability of the method as compared to standard model fitting methods and show that it results in lower noise in the mode change correlation signal used to detect regional events. We compare candidate events detected by this method to the seismic record and observe that most are not correlated with a significant seismic event. Our analysis
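Segmenting a time series into behavioral modes under a fitted HMM is typically done with the Viterbi algorithm, which finds the most likely state sequence. A minimal log-space sketch follows; the Gaussian emission log-likelihoods in the usage example are illustrative, not the SCIGN setup.

```python
import numpy as np

def viterbi(log_pi, log_A, log_B):
    """Most likely hidden-state sequence for an HMM, in log space.
    log_B[t, j] = log p(observation at time t | state j)."""
    T, k = log_B.shape
    delta = np.zeros((T, k))              # best log-score ending in each state
    psi = np.zeros((T, k), dtype=int)     # backpointers
    delta[0] = log_pi + log_B[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # scores[i, j]: i -> j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[t]
    states = np.zeros(T, dtype=int)
    states[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):        # backtrack
        states[t] = psi[t + 1, states[t + 1]]
    return states
```

With two well-separated emission means and sticky self-transitions, the decoder recovers the obvious two-segment partition of a toy observation sequence.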
Hidden Markov models for fault detection in dynamic systems
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J. (Inventor)
1995-01-01
The invention is a system failure monitoring method and apparatus which learns the symptom-fault mapping directly from training data. The invention first estimates the state of the system at discrete intervals in time. A feature vector x of dimension k is estimated from sets of successive windows of sensor data. A pattern recognition component then models the instantaneous estimate of the posterior class probability given the features, p(w_i | x), 1 ≤ i ≤ m. Finally, a hidden Markov model is used to take advantage of temporal context and estimate class probabilities conditioned on recent past history. In this hierarchical pattern of information flow, the time series data is transformed and mapped into a categorical representation (the fault classes) and integrated over time to enable robust decision-making.
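The temporal-context step can be sketched generically: instantaneous classifier posteriors are folded into a Markov model over fault classes by a predict-correct recursion. This is a hedged illustration of the idea, not the patented implementation; dividing each posterior by the class prior to obtain a scaled likelihood is a common heuristic assumed here.

```python
import numpy as np

def temporal_smooth(inst_post, A, prior):
    """Combine instantaneous class posteriors with a Markov model over
    classes. inst_post: T x m array of per-window classifier outputs;
    A: m x m class transition matrix; prior: class prior probabilities."""
    p = prior.copy()
    smoothed = []
    for q in inst_post:
        pred = A.T @ p                  # one-step prediction of class probs
        p = pred * (q / prior)          # fold in instantaneous evidence
        p /= p.sum()
        smoothed.append(p.copy())
    return np.array(smoothed)
```

The effect is that a single noisy, contradictory window does not flip the decision: the smoothed posterior stays with the class supported by recent history.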
Combining Wavelet Transform and Hidden Markov Models for ECG Segmentation
NASA Astrophysics Data System (ADS)
Andreão, Rodrigo Varejão; Boudy, Jérôme
2006-12-01
This work aims at providing new insights on the electrocardiogram (ECG) segmentation problem using wavelets. The wavelet transform was originally combined with a hidden Markov model (HMM) framework in order to carry out beat segmentation and classification. A group of five continuous wavelet functions commonly used in ECG analysis has been implemented and compared using the same framework. All experiments were carried out on the QT database, which is composed of a representative number of ambulatory recordings of several individuals and is supplied with manual labels made by a physician. Our main contribution relies on the consistent set of experiments performed. Moreover, the results obtained in terms of beat segmentation and premature ventricular beat (PVC) detection are comparable to other works reported in the literature, independently of the type of wavelet. Finally, through an original concept of combining two wavelet functions in the segmentation stage, we achieve our best performances.
Vision-based road detection by hidden Markov model
NASA Astrophysics Data System (ADS)
Wang, Yanqing; Chen, Deyun; Tao, Liyuan; Shi, Chaoxia
2009-07-01
A novel vision-based road detection method is proposed in this paper to realize visually guided navigation for ground mobile vehicles (GMVs). The original image captured by a single camera is first segmented into road and nonroad regions using Otsu's adaptive threshold segmentation algorithm. Subsequently, the Canny edges extracted from the grey image are filtered within the road region so that the road boundary can be recognized accurately among the disturbances caused by other edges present in the image. To improve the performance of road detection, the dynamics of the GMV and a Hidden Markov Model (HMM) are used to associate the possible road boundaries at successive time steps. The proposed method is robust against strong shadows, surface dilapidation and illumination variations. It has been tested on a real GMV and performed well in real road environments.
Hidden Markov model using Dirichlet process for de-identification.
Chen, Tao; Cullen, Richard M; Godwin, Marshall
2015-12-01
For the 2014 i2b2/UTHealth de-identification challenge, we introduced a new non-parametric Bayesian hidden Markov model using a Dirichlet process (HMM-DP). The model intends to reduce task-specific feature engineering and to generalize well to new data. In the challenge we developed a variational method to learn the model and an efficient approximation algorithm for prediction. To accommodate out-of-vocabulary words, we designed a number of feature functions to model such words. The results show the model is capable of understanding local context cues to make correct predictions without manual feature engineering and performs as accurately as state-of-the-art conditional random field models in a number of categories. To incorporate long-range and cross-document context cues, we developed a skip-chain conditional random field model to align the results produced by HMM-DP, which further improved the performance. PMID:26407642
Understanding eye movements in face recognition using hidden Markov models.
Chuk, Tim; Chan, Antoni B; Hsiao, Janet H
2014-09-16
We use a hidden Markov model (HMM) based approach to analyze eye movement data in face recognition. HMMs are statistical models specialized in handling time-series data. We conducted a face recognition task with Asian participants and modeled each participant's eye movement pattern with an HMM, which summarized the participant's scan paths in face recognition with both regions of interest and the transition probabilities among them. By clustering these HMMs, we showed that participants' eye movements could be categorized into holistic or analytic patterns, demonstrating significant individual differences even within the same culture. Participants with the analytic pattern had longer response times, but did not differ significantly in recognition accuracy from those with the holistic pattern. We also found that correct and wrong recognitions were associated with distinctive eye movement patterns; the difference between the two patterns lies in the transitions rather than in the locations of the fixations alone.
Hidden Markov models for fault detection in dynamic systems
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J. (Inventor)
1993-01-01
The invention is a system failure monitoring method and apparatus which learns the symptom-fault mapping directly from training data. The invention first estimates the state of the system at discrete intervals in time. A feature vector x of dimension k is estimated from sets of successive windows of sensor data. A pattern recognition component then models the instantaneous estimate of the posterior class probability given the features, p(w_i | x), 1 ≤ i ≤ m. Finally, a hidden Markov model is used to take advantage of temporal context and estimate class probabilities conditioned on recent past history. In this hierarchical pattern of information flow, the time series data is transformed and mapped into a categorical representation (the fault classes) and integrated over time to enable robust decision-making.
Hidden Markov models for evolution and comparative genomics analysis.
Bykova, Nadezda A; Favorov, Alexander V; Mironov, Andrey A
2013-01-01
The problem of reconstruction of ancestral states given a phylogeny and data from extant species arises in a wide range of biological studies. The continuous-time Markov model for the discrete states evolution is generally used for the reconstruction of ancestral states. We modify this model to account for a case when the states of the extant species are uncertain. This situation appears, for example, if the states for extant species are predicted by some program and thus are known only with some level of reliability; this is common in the bioinformatics field. The main idea is formulation of the problem as a hidden Markov model on a tree (tree HMM, tHMM), where the basic continuous-time Markov model is expanded with the introduction of emission probabilities of observed data (e.g. prediction scores) for each underlying discrete state. Our tHMM decoding algorithm allows us to predict states at the ancestral nodes as well as to refine states at the leaves on the basis of quantitative comparative genomics. The test on the simulated data shows that the tHMM approach applied to the continuous variable reflecting the probabilities of the states (i.e. prediction score) appears to be more accurate than the reconstruction from the discrete states assignment defined by the best score threshold. We provide examples of applying our model to the evolutionary analysis of N-terminal signal peptides and transcription factor binding sites in bacteria. The program is freely available at http://bioinf.fbb.msu.ru/~nadya/tHMM and via web-service at http://bioinf.fbb.msu.ru/treehmmweb.
Volatility: A hidden Markov process in financial time series
NASA Astrophysics Data System (ADS)
Eisler, Zoltán; Perelló, Josep; Masoliver, Jaume
2007-11-01
Volatility characterizes the amplitude of price return fluctuations. It is a central quantity in finance, closely related to the risk of holding a certain asset. Despite its popularity on trading floors, volatility is unobservable and only the price is known. Diffusion theory has many common points with the research on volatility, the key of the analogy being that volatility is a time-dependent diffusion coefficient of the random walk for the price return. We present a formal procedure to extract volatility from price data by assuming that it is described by a hidden Markov process which together with the price forms a two-dimensional diffusion process. We derive a maximum-likelihood estimate of the volatility path valid for a wide class of two-dimensional diffusion processes. The exponential Ornstein-Uhlenbeck (expOU) stochastic volatility model performs remarkably well in inferring the hidden state of volatility. The formalism is applied to the Dow Jones index. The main results are that (i) the distribution of estimated volatility is lognormal, which is consistent with the expOU model, (ii) the estimated volatility is related to trading volume by a power law of the form σ ∝ V^0.55, and (iii) future returns are proportional to the current volatility, which suggests some degree of predictability for the size of future returns.
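The reported volatility-volume relation σ ∝ V^0.55 is a straight line in log-log coordinates, so its exponent can be estimated by ordinary least squares on the logged series. A minimal sketch on synthetic data (not the Dow Jones series used by the authors):

```python
import numpy as np

def powerlaw_exponent(sigma, volume):
    """Least-squares slope of log(sigma) vs log(volume), i.e. the
    exponent a in sigma ∝ V^a."""
    slope, _intercept = np.polyfit(np.log(volume), np.log(sigma), 1)
    return slope
```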
Identifying Seismicity Levels via Poisson Hidden Markov Models
NASA Astrophysics Data System (ADS)
Orfanogiannaki, K.; Karlis, D.; Papadopoulos, G. A.
2010-08-01
Poisson Hidden Markov models (PHMMs) are introduced to model temporal seismicity changes. In a PHMM the unobserved sequence of states is a finite-state Markov chain, and the distribution of the observation at any time is Poisson with a rate depending only on the current state of the chain. Thus, PHMMs allow a region to have a varying seismicity rate. We applied the PHMM to model earthquake frequencies in the seismogenic area of Killini, Ionian Sea, Greece, between 1990 and 2006. Simulations of data from the assumed model showed that it describes the true data quite well. The earthquake catalogue is dominated by main shocks occurring in 1993, 1997 and 2002. The time plot of PHMM seismicity states not only reproduces the three seismicity clusters but also quantifies the seismicity level and underlines the degree of serial dependence of the events at any point in time. Foreshock activity becomes quite evident before the three sequences, with a gradual transition to states of cascade seismicity. Traditional analysis, based on the determination of highly significant changes of seismicity rates, failed to recognize foreshocks before the 1997 main shock due to the low number of events preceding that main shock. The PHMM performs better than traditional analysis because the transition from one state to another depends not only on the total number of events involved but also on the current state of the system. Therefore, the PHMM recognizes significant changes of seismicity soon after they start, which is of particular importance for real-time recognition of foreshock activity and other seismicity changes.
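A PHMM's likelihood can be evaluated with the scaled forward recursion, scoring each count under state-dependent Poisson rates. The sketch below uses illustrative parameters, not the fitted Killini model:

```python
import numpy as np
from math import lgamma

def pois_logpmf(n, lam):
    # log of the Poisson pmf: n*log(lam) - lam - log(n!)
    return n * np.log(lam) - lam - lgamma(n + 1)

def poisson_hmm_loglik(counts, A, pi, rates):
    """Log-likelihood of a count sequence under a Poisson HMM with
    transition matrix A, initial distribution pi, and per-state rates,
    via the scaled forward algorithm."""
    B = np.exp([[pois_logpmf(c, r) for r in rates] for c in counts])
    alpha = pi * B[0]
    s = alpha.sum()
    ll = np.log(s)
    alpha = alpha / s
    for t in range(1, len(counts)):
        alpha = (alpha @ A) * B[t]
        s = alpha.sum()
        ll += np.log(s)
        alpha = alpha / s
    return ll
```

For a single-state chain this reduces to summing independent Poisson log-probabilities, which gives a quick sanity check of the recursion.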
Lee, Lee-Min; Jean, Fu-Rong
2016-08-01
Hidden Markov models have been widely applied to systems with sequential data. However, the conditional independence of the state outputs limits the output of a hidden Markov model to a piecewise constant random sequence, which is not a good approximation for many real processes. In this paper, a high-order hidden Markov model for piecewise linear processes is proposed to better approximate the behavior of a real process. A parameter estimation method based on the expectation-maximization algorithm was derived for the proposed model. Experiments on speech recognition of noisy Mandarin digits were conducted to examine the effectiveness of the proposed method. Experimental results show that the proposed method can reduce the recognition error rate compared to a baseline hidden Markov model. PMID:27586781
A hidden Markov model for multimodal biometrics score fusion
NASA Astrophysics Data System (ADS)
Zheng, Yufeng
2011-05-01
There is strong evidence that multimodal biometric score fusion can significantly improve human identification performance. Score-level fusion usually involves score normalization, score fusion, and a fusion decision. There are several types of score fusion methods: direct combination of scores, classifier-based fusion, and density-based fusion. Real applications require greater reliability in determining or verifying a person's identity. The goal of this research is to improve the accuracy and robustness of human identification by using multimodal biometric score fusion. Accuracy means a high verification rate if tested on a closed dataset, or a high genuine accept rate under a low false accept rate if tested on an open dataset. Robustness means that the fusion performance is stable across variant biometric scores. We propose a hidden Markov model (HMM) for multiple score fusion, where the biometric scores include multimodal scores and multi-matcher scores. The state probability density functions in the HMM are estimated by a Gaussian mixture model. The proposed HMM for multiple score fusion is accurate for identification, and flexible and reliable with biometrics. The proposed HMM method is tested on three NIST-BSSR1 multimodal databases and on three face-score databases. The results show the HMM method is an excellent and reliable score fusion method.
Hidden Markov chain modeling for epileptic networks identification.
Le Cam, Steven; Louis-Dorr, Valérie; Maillard, Louis
2013-01-01
Partial epileptic seizures are often considered to be caused by an abnormal balance between inhibitory and excitatory interneuron connections within a focal brain area. These abnormal balances are likely to result in a loss of functional connectivity between remote brain structures, while functional connectivity within the incriminated zone is enhanced. The identification of the epileptic networks underlying these hypersynchronies is expected to contribute to a better understanding of the brain mechanisms responsible for the development of seizures. To this end, threshold strategies are commonly applied, based on synchrony measurements computed from recordings of electrophysiologic brain activity. However, such methods are reported to be prone to errors and false alarms. In this paper, we propose a hidden Markov chain model of the synchrony states with the aim of developing a reliable machine learning method for epileptic network inference. The method is applied to a real stereo-EEG recording, demonstrating results consistent with the clinical evaluations and with current knowledge of temporal lobe epilepsy. PMID:24110697
Identification and classification of conopeptides using profile Hidden Markov Models.
Laht, Silja; Koua, Dominique; Kaplinski, Lauris; Lisacek, Frédérique; Stöcklin, Reto; Remm, Maido
2012-03-01
Conopeptides are small toxins produced by predatory marine snails of the genus Conus. They are studied with increasing intensity due to their potential in neurosciences and pharmacology. The number of existing conopeptides is estimated to be 1 million, but only about 1000 have been described to date. Thanks to new high-throughput sequencing technologies, the number of known conopeptides is likely to increase exponentially in the near future. There is therefore a need for a fast and accurate computational method for the identification and classification of novel conopeptides in large data sets. We built 62 profile Hidden Markov Models (pHMMs) for the prediction and classification of all described conopeptide superfamilies and families, based on the different parts of the corresponding protein sequences. These models showed very high specificity in the detection of new peptides: 56 of the 62 models do not give a single false positive in a test against the entire UniProtKB/Swiss-Prot protein sequence database. Our study demonstrates the usefulness of mature peptide models for automatic classification, with an accuracy of 96% for the mature peptide models and 100% for the pro- and signal peptide models. Our conopeptide profile HMMs can be used for finding and annotating new conopeptides in large datasets generated by transcriptome or genome sequencing. To our knowledge, this is the first time this kind of computational method has been applied to predict all known conopeptide superfamilies and some conopeptide families. PMID:22244925
Optical character recognition of handwritten Arabic using hidden Markov models
Aulama, Mohannad M.; Natsheh, Asem M.; Abandah, Gheith A.; Olama, Mohammed M.
2011-01-01
The problem of optical character recognition (OCR) of handwritten Arabic has not yet received a satisfactory solution. In this paper, an Arabic OCR algorithm is developed based on Hidden Markov Models (HMMs) combined with the Viterbi algorithm, which results in improved and more robust recognition of characters at the sub-word level. Integrating the HMMs represents another step in the overall OCR trends currently being researched in the literature. The proposed approach exploits the structure of characters in the Arabic language, in addition to their extracted features, to achieve improved recognition rates. Useful statistical information about the Arabic language is first extracted and then used to estimate the probabilistic parameters of the HMM. A new custom implementation of the HMM is developed in this study, where the transition matrix is built from a large collected corpus and the emission matrix is built from the extracted character features. The recognition process is driven by the Viterbi algorithm, which computes the most probable sequence of sub-words. The model recognizes the sub-word unit of Arabic text, tying the recognition rate to the overall structure of the Arabic language rather than to the worst recognition rate of any single character. Numerical results show a potentially large recognition improvement from using the proposed algorithms.
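The Viterbi decoding this abstract relies on picks the single most probable hidden-state path given the observations. A minimal log-space sketch is shown below; the two-state transition and emission matrices are toy values, not the corpus-derived Arabic estimates described in the paper.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most probable hidden-state path for a discrete observation sequence.

    pi: (N,) initial state probabilities
    A:  (N, N) transition matrix, A[i, j] = P(state j | state i)
    B:  (N, M) emission matrix, B[i, k] = P(symbol k | state i)
    """
    N, T = A.shape[0], len(obs)
    delta = np.zeros((T, N))            # best log-prob of a path ending in each state
    psi = np.zeros((T, N), dtype=int)   # backpointers
    with np.errstate(divide="ignore"):  # log(0) = -inf is fine here
        logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
    delta[0] = logpi + logB[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA   # (N, N): from-state x to-state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + logB[:, obs[t]]
    # backtrack from the best final state
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1][path[t + 1]]
    return path

# toy two-state, two-symbol example (hypothetical numbers)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 0, 1, 1], pi, A, B))  # → [0 0 1 1]
```

Working in log space avoids the numerical underflow that multiplying many small probabilities would cause on long sub-word sequences.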
A Network of SCOP Hidden Markov Models and Its Analysis
2011-01-01
Background The Structural Classification of Proteins (SCOP) database uses a large number of hidden Markov models (HMMs) to represent families and superfamilies composed of proteins that presumably share the same evolutionary origin. However, how the HMMs are related to one another has not been examined before. Results In this work, taking into account the processes used to build the HMMs, we propose a working hypothesis to examine the relationships between HMMs and the families and superfamilies that they represent. Specifically, we perform an all-against-all HMM comparison using the HHsearch program (similar to BLAST) and construct a network where the nodes are HMMs and the edges connect similar HMMs. We hypothesize that the HMMs in a connected component belong to the same family or superfamily more often than expected under a random network connection model. Results show a pattern consistent with this working hypothesis. Moreover, the HMM network possesses features distinctly different from the previously documented biological networks, exemplified by the exceptionally high clustering coefficient and the large number of connected components. Conclusions The current finding may provide guidance in devising computational methods to reduce the degree of overlaps between the HMMs representing the same superfamilies, which may in turn enable more efficient large-scale sequence searches against the database of HMMs. PMID:21635719
Recognition of surgical skills using hidden Markov models
NASA Astrophysics Data System (ADS)
Speidel, Stefanie; Zentek, Tom; Sudra, Gunther; Gehrig, Tobias; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger
2009-02-01
Minimally invasive surgery is a highly complex medical discipline and can be regarded as a major breakthrough in surgical technique. A minimally invasive intervention requires enhanced motor skills to deal with difficulties like the complex hand-eye coordination and restricted mobility. To alleviate these constraints we propose to enhance the surgeon's capabilities by providing a context-aware assistance using augmented reality techniques. To recognize and analyze the current situation for context-aware assistance, we need intraoperative sensor data and a model of the intervention. Characteristics of a situation are the performed activity, the used instruments, the surgical objects and the anatomical structures. Important information about the surgical activity can be acquired by recognizing the surgical gesture performed. Surgical gestures in minimally invasive surgery like cutting, knot-tying or suturing are here referred to as surgical skills. We use the motion data from the endoscopic instruments to classify and analyze the performed skill and even use it for skill evaluation in a training scenario. The system uses Hidden Markov Models (HMM) to model and recognize a specific surgical skill like knot-tying or suturing with an average recognition rate of 92%.
Efficient inference of hidden Markov models from large observation sequences
NASA Astrophysics Data System (ADS)
Priest, Benjamin W.; Cybenko, George
2016-05-01
The hidden Markov model (HMM) is widely used to model time series data. However, the conventional Baum-Welch algorithm is known to perform poorly when applied to long observation sequences. The literature contains several alternatives that seek to improve the memory or time complexity of the algorithm. However, for an HMM with N states and an observation sequence of length T, these alternatives require at best O(N) space and O(N²T) time. Given the preponderance of applications that increasingly deal with massive amounts of data, an alternative whose time is O(T)+poly(N) is desired. Recent research presents an alternative to the Baum-Welch algorithm that relies on nonnegative matrix factorization. This document examines the space complexity of this alternative approach and proposes further optimizations using approaches adopted from the matrix sketching literature. The result is a streaming algorithm whose space complexity is constant and whose time complexity is linear with respect to the size of the observation sequence. The paper also presents a batch algorithm that allows for even further improved space complexity at the expense of an additional pass over the observation sequence.
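The memory issue discussed above is easiest to see in the forward recursion itself: the scaled forward pass keeps only the current state-probability vector, so space is O(N) regardless of T, while time is O(N²T). A sketch with toy parameters follows; this is the textbook recursion, not the paper's sketching-based estimator.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence via the scaled forward pass.

    Only the current forward vector alpha is retained, so memory stays
    O(N) however long the sequence is.
    """
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()                 # rescale to avoid numerical underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())    # accumulate the scaling factors
        alpha /= alpha.sum()
    return loglik

# degenerate toy HMM where every sequence of length T has likelihood 0.5**T
pi = np.array([0.5, 0.5])
A = np.full((2, 2), 0.5)
B = np.full((2, 2), 0.5)
print(forward_loglik([0, 1, 0, 1], pi, A, B))  # = 4 * log(0.5) ≈ -2.77
```

Baum-Welch needs the backward pass (and hence O(NT) storage in the naive formulation); the streaming methods the abstract describes trade that structure away to get constant space.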
Supervised learning of hidden Markov models for sequence discrimination
Mamitsuka, Hiroshi
1997-12-01
We present two supervised learning algorithms for hidden Markov models (HMMs) for sequence discrimination. When a class of sequences is modeled with an HMM, conventional learning algorithms train the HMM with training examples belonging to the class, i.e. positive examples alone, whereas both of our methods allow the use of negative as well as positive examples. One of our algorithms minimizes a distance between a target likelihood of a given training sequence and the actual likelihood of the sequence obtained from a given HMM, using an additive type of parameter update based on gradient-descent learning. The other algorithm maximizes a criterion representing the ratio of the likelihood of a positive example to the likelihood of the total example, using a multiplicative type of parameter update that is more efficient in actual computation time than the additive type. We compare our two methods with two conventional methods using cross-validation on actual motif classification experiments. Experimental results show that, in terms of the average number of classification errors, our two methods outperform the two conventional algorithms. 14 refs., 4 figs., 1 tab.
A clustering approach for estimating parameters of a profile hidden Markov model.
Aghdam, Rosa; Pezeshk, Hamid; Malekpour, Seyed Amir; Shemehsavar, Soudabeh; Eslahchi, Changiz
2013-01-01
A profile hidden Markov model (PHMM) is a standard form of hidden Markov model used for modeling protein and DNA sequence families based on multiple alignment. In this paper, we implement the Baum-Welch algorithm and a Bayesian Markov chain Monte Carlo (BMCMC) method for estimating the parameters of a small artificial PHMM. To improve the prediction accuracy of the parameter estimates, we classify the training data using the weighted values of sequences in the PHMM and then apply an algorithm for estimating the parameters of the PHMM. The results show that the BMCMC method performs better than maximum likelihood estimation. PMID:23865165
Nonparametric model validations for hidden Markov models with applications in financial econometrics
Zhao, Zhibiao
2011-01-01
We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601
Group association test using a hidden Markov model.
Cheng, Yichen; Dai, James Y; Kooperberg, Charles
2016-04-01
In the genomic era, group association tests are of great interest. Due to the overwhelming number of individual genomic features, the power of testing for association of a single genomic feature at a time is often very small, as are the effect sizes for most features. Many methods have been proposed to test association of a trait with a group of features within a functional unit as a whole, e.g. all SNPs in a gene, yet few of these methods account for the fact that generally a substantial proportion of the features are not associated with the trait. In this paper, we propose to model the association for each feature in the group as a mixture of features with no association and features with non-zero associations to explicitly account for the possibility that a fraction of features may not be associated with the trait while other features in the group are. The feature-level associations are first estimated by generalized linear models; the sequence of these estimated associations is then modeled by a hidden Markov chain. To test for global association, we develop a modified likelihood ratio test based on a log-likelihood function that ignores higher order dependency plus a penalty term. We derive the asymptotic distribution of the likelihood ratio test under the null hypothesis. Furthermore, we obtain the posterior probability of association for each feature, which provides evidence of feature-level association and is useful for potential follow-up studies. In simulations and data application, we show that our proposed method performs well when compared with existing group association tests especially when there are only few features associated with the outcome. PMID:26420797
Redefining CpG islands using hidden Markov models.
Wu, Hao; Caffo, Brian; Jaffee, Harris A; Irizarry, Rafael A; Feinberg, Andrew P
2010-07-01
The DNA of most vertebrates is depleted in CpG dinucleotides: a C followed by a G in the 5' to 3' direction. CpGs are the target for DNA methylation, a chemical modification of cytosine (C) heritable during cell division and the most well-characterized epigenetic mechanism. The remaining CpGs tend to cluster in regions referred to as CpG islands (CGI). Knowing CGI locations is important because they mark functionally relevant epigenetic loci in development and disease. For various mammals, including human, a readily available and widely used list of CGI is available from the UCSC Genome Browser. This list was derived using algorithms that search for regions satisfying a definition of CGI proposed by Gardiner-Garden and Frommer more than 20 years ago. Recent findings, enabled by advances in technology that permit direct measurement of epigenetic endpoints at a whole-genome scale, motivate the need to adapt the current CGI definition. In this paper, we propose a procedure, guided by hidden Markov models, that permits an extensible approach to detecting CGI. The main advantage of our approach over others is that it summarizes the evidence for CGI status as probability scores. This provides flexibility in the definition of a CGI and facilitates the creation of CGI lists for other species. The utility of this approach is demonstrated by generating the first CGI lists for invertebrates, and by the fact that we can create CGI lists that substantially increase overlap with recently discovered epigenetic marks. A CGI list and the probability scores, as a function of genome location, for each species are available at http://www.rafalab.org. PMID:20212320
Ensemble hidden Markov models with application to landmine detection
NASA Astrophysics Data System (ADS)
Hamdi, Anis; Frigui, Hichem
2015-12-01
We introduce an ensemble learning method for temporal data that uses a mixture of hidden Markov models (HMM). We hypothesize that the data are generated by K models, each of which reflects a particular trend in the data. The proposed approach, called ensemble HMM (eHMM), is based on clustering within the log-likelihood space and has two main steps. First, one HMM is fit to each of the N individual training sequences. For each fitted model, we evaluate the log-likelihood of each sequence. This results in an N-by-N log-likelihood distance matrix that will be partitioned into K groups using a relational clustering algorithm. In the second step, we learn the parameters of one HMM per cluster. We propose using and optimizing various training approaches for the different K groups depending on their size and homogeneity. In particular, we investigate the maximum likelihood (ML), the minimum classification error (MCE), and the variational Bayesian (VB) training approaches. Finally, to test a new sequence, its likelihood is computed in all the models and a final confidence value is assigned by combining the models' outputs using an artificial neural network. We propose both discrete and continuous versions of the eHMM. Our approach was evaluated on a real-world application for landmine detection using ground-penetrating radar (GPR). Results show that both the continuous and discrete eHMM can identify meaningful and coherent HMM mixture components that describe different properties of the data. Each HMM mixture component models a group of data that share common attributes. These attributes are reflected in the mixture model's parameters. The results indicate that the proposed method outperforms the baseline HMM that uses one model for each class in the data.
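The first step of the eHMM, building the N-by-N log-likelihood matrix, can be sketched as follows. The models here are hand-written (pi, A, B) triples rather than HMMs fitted to individual sequences, and the symmetrized distance is one plausible choice; the abstract does not spell out the exact definition used.

```python
import numpy as np

def loglik(obs, model):
    """Scaled forward-pass log-likelihood of a discrete sequence under an HMM."""
    pi, A, B = model
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

def loglik_distance_matrix(models, seqs):
    """D[i, j]: disagreement between models i and j on their own sequences."""
    N = len(seqs)
    # L[i, j] = log-likelihood of sequence j under model i
    L = np.array([[loglik(s, m) for s in seqs] for m in models])
    D = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            D[i, j] = abs(L[i, i] - L[j, i]) + abs(L[j, j] - L[i, j])
    return D

# two identical toy models should sit at distance zero from each other
m = (np.array([0.5, 0.5]),
     np.array([[0.7, 0.3], [0.4, 0.6]]),
     np.array([[0.9, 0.1], [0.2, 0.8]]))
D = loglik_distance_matrix([m, m], [[0, 0, 1], [1, 0, 1]])
print(D)
```

A relational clustering algorithm would then partition D into the K groups, after which one HMM per group is retrained as the abstract describes.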
Wang, Hongyan; Zhou, Xiaobo
2013-04-01
By altering the electrostatic charge of histones or providing binding sites to protein recognition molecules, chromatin marks have been proposed to regulate gene expression, a property that has motivated researchers to link these marks to cis-regulatory elements. With the help of next-generation sequencing technologies, we can now correlate one specific chromatin mark with regulatory elements (e.g. enhancers or promoters) and also build tools, such as hidden Markov models, to gain insight into mark combinations. However, hidden Markov models are limited by their generative character and by the assumption that a current observation depends only on the current hidden state in the chain. Here, we employed two graphical probabilistic models, namely the linear conditional random field model and the multivariate hidden Markov model, to mark gene regions with different states based on the recurrent and spatially coherent character of these eight marks. Both models revealed chromatin states that may correspond to enhancers and promoters, transcribed regions, transcriptional elongation, and low-signal regions. We also found that the linear conditional random field model was more effective than the hidden Markov model in recognizing regulatory elements, such as promoter-, enhancer-, and transcriptional elongation-associated regions, making it the better choice for this task. PMID:23237214
Post processing with first- and second-order hidden Markov models
NASA Astrophysics Data System (ADS)
Taghva, Kazem; Poudel, Srijana; Malreddy, Spandana
2013-01-01
In this paper, we present the implementation and evaluation of first order and second order Hidden Markov Models to identify and correct OCR errors in the post processing of books. Our experiments show that the first order model approximately corrects 10% of the errors with 100% precision, while the second order model corrects a higher percentage of errors with much lower precision.
Tracking Problem Solving by Multivariate Pattern Analysis and Hidden Markov Model Algorithms
ERIC Educational Resources Information Center
Anderson, John R.
2012-01-01
Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application…
Comparison of the Beta and the Hidden Markov Models of Trust in Dynamic Environments
NASA Astrophysics Data System (ADS)
Moe, Marie E. G.; Helvik, Bjarne E.; Knapskog, Svein J.
Computational trust and reputation models are used to aid the decision-making process in complex dynamic environments, where we are unable to obtain perfect information about the interaction partners. In this paper we present a comparison of our proposed hidden Markov trust model with the Beta reputation system. The hidden Markov trust model takes the time between observations into account; it also distinguishes between system states and uses methods previously applied to intrusion detection to predict which state an agent is in. We show that the hidden Markov trust model performs better at detecting changes in the behavior of agents, due to its richer set of model features. This means that our trust model may be more realistic in dynamic environments. However, the increased model complexity also leads to bigger challenges in estimating parameter values for the model. We also show that the hidden Markov trust model can be parameterized so that it responds similarly to the Beta reputation system.
Estimation of the occurrence rate of strong earthquakes based on hidden semi-Markov models
NASA Astrophysics Data System (ADS)
Votsi, I.; Limnios, N.; Tsaklidis, G.; Papadimitriou, E.
2012-04-01
The present paper aims at the application of hidden semi-Markov models (HSMMs) in an attempt to reveal key features of earthquake generation associated with the actual stress field, which is not accessible to direct observation. The models generalize hidden Markov models by allowing the hidden process to form a semi-Markov chain. Considering that the states of the models correspond to levels of the actual stress field, the stress field level at the occurrence time of each strong event is revealed. The dataset concerns a well-catalogued, seismically active region incorporating a variety of tectonic styles. More specifically, the models are applied to Greece and its surrounding lands, using a complete data sample of strong (M ≥ 6.5) earthquakes that occurred in the study area from 1845 to the present. The earthquakes are grouped according to their magnitudes, and the cases of two and three magnitude ranges, with a corresponding number of states, are examined. The parameters of the HSMMs are estimated and their confidence intervals are calculated based on their asymptotic behavior. The rate of earthquake occurrence is introduced through the proposed HSMMs and its maximum likelihood estimator is calculated. The asymptotic properties of the estimator are studied, including uniform strong consistency and asymptotic normality, and a confidence interval for the proposed estimator is given. We assume the state spaces of both the observable and the hidden process to be finite, the hidden Markov chain to be homogeneous and stationary, and the observations to be conditionally independent. The hidden states at the occurrence time of each strong event are revealed, and the rate of occurrence of an anticipated earthquake is estimated on the basis of the proposed HSMMs. Moreover, the mean time to the first occurrence of a strong anticipated earthquake is estimated and its confidence interval is calculated.
Cheng, Jade Yu; Mailund, Thomas
2015-08-01
With full genome data from several closely related species now readily available, we have the ultimate data for demographic inference. Exploiting these full genomes, however, requires models that can explicitly model recombination along alignments of full chromosomal length. Over the last decade a class of models, based on the sequential Markov coalescence model combined with hidden Markov models, has been developed and used to make inferences in simple demographic scenarios. To move forward to more complex demographic modelling we need better and more automated ways of specifying these models, as well as efficient optimisation algorithms for inferring the parameters of complex and often high-dimensional models. In this paper we present a framework for building such coalescence hidden Markov models for pairwise alignments and present results from using heuristic optimisation algorithms for parameter estimation. We show that we can build more complex demographic models than in our previous frameworks and that we obtain more accurate parameter estimates using heuristic optimisation algorithms than with our previous gradient-based approaches. Our new framework provides a flexible way of constructing coalescence hidden Markov models almost automatically. While estimating parameters in more complex models remains challenging, we show that heuristic optimisation algorithms still achieve fairly good accuracy.
A TWO-STATE MIXED HIDDEN MARKOV MODEL FOR RISKY TEENAGE DRIVING BEHAVIOR
Jackson, John C.; Albert, Paul S.; Zhang, Zhiwei
2016-01-01
This paper proposes a joint model for longitudinal binary and count outcomes. We apply the model to a unique longitudinal study of teen driving where risky driving behavior and the occurrence of crashes or near crashes are measured prospectively over the first 18 months of licensure. Of scientific interest is relating the two processes and predicting crash and near crash outcomes. We propose a two-state mixed hidden Markov model whereby the hidden state characterizes the mean for the joint longitudinal crash/near crash outcomes and elevated g-force events which are a proxy for risky driving. Heterogeneity is introduced in both the conditional model for the count outcomes and the hidden process using a shared random effect. An estimation procedure is presented using the forward–backward algorithm along with adaptive Gaussian quadrature to perform numerical integration. The estimation procedure readily yields hidden state probabilities as well as providing for a broad class of predictors.
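The hidden-state probabilities that the forward-backward algorithm yields can be sketched for a plain discrete-emission HMM; the paper's mixed model additionally integrates over a shared random effect via adaptive Gaussian quadrature, which is omitted here, and the toy parameters below are hypothetical.

```python
import numpy as np

def state_posteriors(obs, pi, A, B):
    """P(state at time t | full observation sequence), via scaled forward-backward."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    c = np.zeros(T)                       # scaling constants
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):                 # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta = np.ones((T, N))
    for t in range(T - 2, -1, -1):        # backward pass, reusing the scalings
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta                  # state posteriors
    return gamma / gamma.sum(axis=1, keepdims=True)

# toy two-state model (hypothetical numbers)
pi = np.array([0.6, 0.4])
A = np.array([[0.8, 0.2], [0.3, 0.7]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
post = state_posteriors([0, 1, 1, 0], pi, A, B)
print(post)
```

The rows of the returned matrix are exactly the "hidden state probabilities" the estimation procedure is said to yield, here read off per time step after one forward and one backward sweep.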
NASA Astrophysics Data System (ADS)
Vaglica, Gabriella; Lillo, Fabrizio; Mantegna, Rosario N.
2010-07-01
Large trades in a financial market are usually split into smaller parts and traded incrementally over extended periods of time. We address these large trades as hidden orders. In order to identify and characterize hidden orders, we fit hidden Markov models to the time series of the sign of the tick-by-tick inventory variation of market members of the Spanish Stock Exchange. Our methodology probabilistically detects trading sequences, which are characterized by a significant majority of buy or sell transactions. We interpret these patches of sequential buying or selling transactions as proxies of the traded hidden orders. We find that the time, volume and number of transaction size distributions of these patches are fat tailed. Long patches are characterized by a large fraction of market orders and a low participation rate, while short patches have a large fraction of limit orders and a high participation rate. We observe the existence of a buy-sell asymmetry in the number, average length, average fraction of market orders and average participation rate of the detected patches. The detected asymmetry is clearly dependent on the local market trend. We also compare the hidden Markov model patches with those obtained with the segmentation method used in Vaglica et al (2008 Phys. Rev. E 77 036110), and we conclude that the former ones can be interpreted as a partition of the latter ones.
Markov Chain Monte Carlo Sampling Methods for 1D Seismic and EM Data Inversion
2008-09-22
This software provides several Markov chain Monte Carlo sampling methods for the Bayesian model developed for inverting 1D marine seismic and controlled source electromagnetic (CSEM) data. The current software can be used for individual inversion of seismic AVO and CSEM data and for joint inversion of both seismic and EM data sets. The structure of the software is very general and flexible, and it allows users to incorporate their own forward simulation codes and rock physics model codes easily into this software. Although the software was developed using the C and C++ computer languages, the user-supplied codes can be written in C, C++, or various versions of Fortran. The software provides clear interfaces for users to plug in their own codes. The output of this software is in a format that the free R software CODA can directly read to build MCMC objects.
A hidden Markov model for space-time precipitation
Zucchini, W.; Guttorp, P.
1991-08-01
Stochastic models for precipitation events in space and time over mesoscale spatial areas have important applications in hydrology, both as input to runoff models and as parts of general circulation models (GCMs) of global climate. A family of multivariate models for the occurrence/nonoccurrence of precipitation at N sites is constructed by assuming a different probability of events at the sites for each of a number of unobservable climate states. The climate process is assumed to follow a Markov chain. Simple formulae for first- and second-order parameter functions are derived, and used to find starting values for a numerical maximization of the likelihood. The method is illustrated by applying it to data for one site in Washington and to data for a network in the Great Plains.
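The generative story of this model, a hidden climate chain driving conditionally independent per-site occurrence probabilities, can be sketched as below. All parameters are hypothetical placeholders; the fitted Washington and Great Plains values are not reproduced in the abstract.

```python
import numpy as np

def simulate_precip(T, n_sites, A, pi, p_event, rng):
    """Simulate occurrence/nonoccurrence of precipitation at n_sites.

    A hidden climate state follows a Markov chain with transition matrix A
    and initial distribution pi; p_event[c, s] is the probability of a
    precipitation event at site s when the climate is in state c.
    """
    states = np.zeros(T, dtype=int)
    states[0] = rng.choice(len(pi), p=pi)
    for t in range(1, T):
        states[t] = rng.choice(len(pi), p=A[states[t - 1]])
    # given the state sequence, sites are conditionally independent Bernoullis
    occ = (rng.random((T, n_sites)) < p_event[states]).astype(int)
    return states, occ

# hypothetical two-state, four-site parameterization
pi = np.array([0.7, 0.3])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
p_event = np.array([[0.05, 0.10, 0.08, 0.02],   # "dry" climate state
                    [0.60, 0.70, 0.65, 0.50]])  # "wet" climate state
states, occ = simulate_precip(30, 4, A, pi, p_event, np.random.default_rng(1))
print(occ[:5])
```

Simulations like this are what make the model usable as precipitation input to runoff models: the hidden chain induces the spatial and temporal clustering of wet days that independent per-site coin flips would miss.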
A method of hidden Markov model optimization for use with geophysical data sets
NASA Technical Reports Server (NTRS)
Granat, R. A.
2003-01-01
Geophysics research has been faced with a growing need for automated techniques with which to process large quantities of data. A successful tool must meet a number of requirements: it should be consistent, require minimal parameter tuning, and produce scientifically meaningful results in reasonable time. We introduce a hidden Markov model (HMM)-based method for analysis of geophysical data sets that attempts to address these issues.
NASA Astrophysics Data System (ADS)
Choi, Yeontaek; Sim, Seungwoo; Lee, Sang-Hee
2014-06-01
The locomotion behavior of Caenorhabditis elegans has been extensively studied to understand the relationship between changes in the organism's neural activity and its biomechanics. However, this understanding has not yet been achieved, because the worm responds in complicated ways to environmental factors, especially chemical stress. Constructing a mathematical model is helpful for understanding the locomotion behavior under various surrounding conditions. In the present study, we built three hidden Markov models for the crawling behavior of C. elegans in a controlled environment with no chemical treatment and in environments polluted by formaldehyde, toluene, and benzene (0.1 ppm and 0.5 ppm for each case). The organism's crawling activity was recorded using a digital camcorder for 20 min at a rate of 24 frames per second. All shape patterns were quantified by branch length similarity entropy and classified into five groups using a self-organizing map. To evaluate and establish the hidden Markov models, we compared correlation coefficients between the simulated behavior (i.e. temporal pattern sequences) generated by the models and the actual crawling behavior. The comparison showed that the hidden Markov models successfully characterize the crawling behavior. In addition, we briefly discuss the possibility of using the models together with the entropy to develop bio-monitoring systems for determining water quality.
Hidden Markov models and other machine learning approaches in computational molecular biology
Baldi, P.
1995-12-31
This tutorial was one of eight tutorials selected for presentation at the Third International Conference on Intelligent Systems for Molecular Biology, held in the United Kingdom from July 16 to 19, 1995. Computational tools are increasingly needed to process massive amounts of data, to organize and classify sequences, to detect weak similarities, to separate coding from non-coding regions, and to reconstruct the underlying evolutionary history. The fundamental problem in machine learning is the same as in scientific reasoning in general, as well as in statistical modeling: to come up with a good model for the data. In this tutorial four classes of models are reviewed: hidden Markov models, artificial neural networks, belief networks, and stochastic grammars. When dealing with DNA and protein primary sequences, hidden Markov models are among the most flexible and powerful tools for alignments and database searches. In this tutorial, attention is focused on the theory of hidden Markov models and how to apply them to problems in molecular biology.
Detecting critical state before phase transition of complex systems by hidden Markov model
NASA Astrophysics Data System (ADS)
Liu, Rui; Chen, Pei; Li, Yongjun; Chen, Luonan
Identifying the critical state or pre-transition state just before the occurrence of a phase transition is a challenging task, because the state of the system may show little apparent change before this critical transition during gradual parameter variations. Such phase-transition dynamics are generally composed of three stages, i.e., the before-transition state, the pre-transition state, and the after-transition state, which can be considered as three different Markov processes. Thus, based on this dynamical feature, we present a novel computational method, based on the hidden Markov model (HMM), to detect the switching point between the two Markov processes, from the before-transition state (a stationary Markov process) to the pre-transition state (a time-varying Markov process), thereby identifying the pre-transition state or early-warning signals of the phase transition. To validate the effectiveness of the method, we apply it to detect signals of imminent phase transitions of complex systems in simulated datasets, and further identify the pre-transition states as well as their critical modules for three real datasets, i.e., acute lung injury triggered by phosgene inhalation, MCF-7 human breast cancer caused by heregulin, and HCV-induced dysplasia and hepatocellular carcinoma.
Efficient Learning of Continuous-Time Hidden Markov Models for Disease Progression
Liu, Yu-Ying; Li, Shuang; Li, Fuxin; Song, Le; Rehg, James M.
2016-01-01
The Continuous-Time Hidden Markov Model (CT-HMM) is an attractive approach to modeling disease progression due to its ability to describe noisy observations arriving irregularly in time. However, the lack of an efficient parameter learning algorithm for CT-HMM restricts its use to very small models or requires unrealistic constraints on the state transitions. In this paper, we present the first complete characterization of efficient EM-based learning methods for CT-HMM models. We demonstrate that the learning problem consists of two challenges: the estimation of posterior state probabilities and the computation of end-state conditioned statistics. We solve the first challenge by reformulating the estimation problem in terms of an equivalent discrete time-inhomogeneous hidden Markov model. The second challenge is addressed by adapting three approaches from the continuous time Markov chain literature to the CT-HMM domain. We demonstrate the use of CT-HMMs with more than 100 states to visualize and predict disease progression using a glaucoma dataset and an Alzheimer’s disease dataset. PMID:27019571
Complex Sequencing Rules of Birdsong Can be Explained by Simple Hidden Markov Processes
Katahira, Kentaro; Suzuki, Kenta; Okanoya, Kazuo; Okada, Masato
2011-01-01
Complex sequencing rules observed in birdsongs provide an opportunity to investigate the neural mechanism for generating complex sequential behaviors. To relate the findings from studying birdsongs to other sequential behaviors such as human speech and musical performance, it is crucial to characterize the statistical properties of the sequencing rules in birdsongs. However, these properties have not yet been fully addressed. In this study, we investigate the statistical properties of the complex birdsong of the Bengalese finch (Lonchura striata var. domestica). Based on manually annotated syllable labels, we first show that there are significant higher-order context dependencies in Bengalese finch songs; that is, which syllable appears next depends on more than one previous syllable. We then analyze acoustic features of the song and show that the higher-order context dependencies can be explained using first-order hidden state transition dynamics with redundant hidden states. This model corresponds to a hidden Markov model (HMM), a well-known statistical model with a wide range of applications in time series modeling. The song annotation produced by these models with first-order hidden state dynamics agreed well with manual annotation; the score was comparable to that of a second-order HMM, and surpassed the zeroth-order model (the Gaussian mixture model, GMM), which does not use context information. Our results imply that a hierarchical representation with hidden state dynamics may underlie the neural implementation for generating complex behavioral sequences with higher-order dependencies. PMID:21915345
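The key idea, that redundant hidden states turn an apparently second-order observation sequence into a first-order hidden process, can be illustrated with a toy HMM in which two hidden states B1 and B2 both emit the same syllable 'b' but encode different contexts. The probabilities below are illustrative, not the paper's fitted values:

```python
import math

# Two hidden states B1 and B2 both emit syllable 'b': which syllable
# follows 'b' looks second-order in the observations but is first-order
# in the hidden chain. Viterbi decoding recovers the context.
STATES = ["A", "B1", "B2", "C"]
EMIT = {"A": {"a": 1.0}, "B1": {"b": 1.0}, "B2": {"b": 1.0}, "C": {"c": 1.0}}
TRANS = {"A": {"B1": 0.95, "A": 0.05},
         "B1": {"C": 0.95, "B1": 0.05},
         "C": {"B2": 0.95, "C": 0.05},
         "B2": {"A": 0.95, "B2": 0.05}}
INIT = {"A": 0.97, "B1": 0.01, "B2": 0.01, "C": 0.01}
TINY = 1e-12  # stand-in for zero probability, to stay in log space

def viterbi(obs):
    """Most likely hidden state path for an observed syllable sequence."""
    lp = {s: math.log(INIT[s]) + math.log(EMIT[s].get(obs[0], TINY))
          for s in STATES}
    back = []
    for sym in obs[1:]:
        nxt, ptr = {}, {}
        for s in STATES:
            prev = max(lp, key=lambda p: lp[p] + math.log(TRANS[p].get(s, TINY)))
            nxt[s] = (lp[prev] + math.log(TRANS[prev].get(s, TINY))
                      + math.log(EMIT[s].get(sym, TINY)))
            ptr[s] = prev
        lp, _ = nxt, back.append(ptr)
    path = [max(lp, key=lp.get)]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

print(viterbi(["a", "b", "c", "b", "a"]))  # the two 'b's decode differently
```

Here the decoded path distinguishes a 'b' that follows 'a' from a 'b' that follows 'c', which is exactly the extra capacity a first-order HMM with redundant states gains over a zeroth-order GMM.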
Noé, Frank; Wu, Hao; Prinz, Jan-Hendrik; Plattner, Nuria
2013-11-14
Markov state models (MSMs) have been successful in computing metastable states, slow relaxation timescales and associated structural changes, and stationary or kinetic experimental observables of complex molecules from large amounts of molecular dynamics simulation data. However, MSMs approximate the true dynamics by assuming a Markov chain on a clustered discretization of the state space. This approximation is difficult to make for high-dimensional biomolecular systems, and the quality and reproducibility of MSMs has therefore been limited. Here, we discard the assumption that the dynamics are Markovian on the discrete clusters. Instead, we only assume that the full phase-space molecular dynamics is Markovian and that a projection of this full dynamics is observed on the discrete states, leading to the concept of projected Markov models (PMMs). Robust estimation methods for PMMs are not yet available, but we derive a practically feasible approximation via hidden Markov models (HMMs). It is shown how various molecular observables of interest that are often computed from MSMs can be computed from HMMs/PMMs. The new framework is applicable to both simulation and single-molecule experimental data. We demonstrate its versatility by applications to educative model systems, a 1 ms Anton MD simulation of the bovine pancreatic trypsin inhibitor protein, and an optical tweezer force probe trajectory of an RNA hairpin. PMID:24320261
Segmentation of brain tumors in 4D MR images using the hidden Markov model.
Solomon, Jeffrey; Butman, John A; Sood, Arun
2006-12-01
Tumor size is an objective measure that is used to evaluate the effectiveness of anticancer agents. Responses to therapy are categorized as complete response, partial response, stable disease and progressive disease. Implicit in this scheme is the change in the tumor over time; however, most tumor segmentation algorithms do not use temporal information. Here we introduce an automated method using probabilistic reasoning over both space and time to segment brain tumors from 4D spatio-temporal MRI data. The 3D expectation-maximization method is extended using the hidden Markov model to infer tumor classification based on previous and subsequent segmentation results. Spatial coherence via a Markov Random Field was included in the 3D spatial model. Simulated images as well as patient images from three independent sources were used to validate this method. The sensitivity and specificity of tumor segmentation using this spatio-temporal model is improved over commonly used spatial or temporal models alone. PMID:17050032
Bayesian Clustering Using Hidden Markov Random Fields in Spatial Population Genetics
François, Olivier; Ancelet, Sophie; Guillot, Gilles
2006-01-01
We introduce a new Bayesian clustering algorithm for studying population structure using individually geo-referenced multilocus data sets. The algorithm is based on the concept of hidden Markov random field, which models the spatial dependencies at the cluster membership level. We argue that (i) a Markov chain Monte Carlo procedure can implement the algorithm efficiently, (ii) it can detect significant geographical discontinuities in allele frequencies and regulate the number of clusters, (iii) it can check whether the clusters obtained without the use of spatial priors are robust to the hypothesis of discontinuous geographical variation in allele frequencies, and (iv) it can reduce the number of loci required to obtain accurate assignments. We illustrate and discuss the implementation issues with the Scandinavian brown bear and the human CEPH diversity panel data set. PMID:16888334
A path-independent method for barrier option pricing in hidden Markov models
NASA Astrophysics Data System (ADS)
Rashidi Ranjbar, Hedieh; Seifi, Abbas
2015-12-01
This paper presents a method for barrier option pricing under a Black-Scholes model with Markov switching. We extend the option pricing method of Buffington and Elliott to price continuously monitored barrier options under a Black-Scholes model with regime switching. We use a regime switching random Esscher transform in order to determine an equivalent martingale pricing measure, and then solve the resulting multidimensional integral for pricing barrier options. We have calculated prices for down-and-out call options under a two-state hidden Markov model using two different Monte-Carlo simulation approaches and the proposed method. A comparison of the results shows that our method is faster than Monte-Carlo simulation methods.
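The Monte-Carlo benchmark the authors compare against can be sketched as follows: simulate Black-Scholes paths whose volatility is driven by a two-state Markov chain, zero the payoff on any path that touches the barrier, and discount the average payoff. All parameter values here are illustrative:

```python
import math
import random

# Monte-Carlo pricing of a down-and-out call under a two-state
# regime-switching Black-Scholes model. Illustrative parameters only.
random.seed(7)

S0, K, BARRIER = 100.0, 100.0, 80.0
R, T, STEPS = 0.02, 1.0, 252
SIGMA = [0.15, 0.35]                    # volatility in regime 0 / regime 1
SWITCH = [[0.98, 0.02], [0.05, 0.95]]   # per-step regime transition probs

def price(n_paths=5000):
    dt = T / STEPS
    total = 0.0
    for _ in range(n_paths):
        s, regime, alive = S0, 0, True
        for _ in range(STEPS):
            vol = SIGMA[regime]
            s *= math.exp((R - 0.5 * vol * vol) * dt
                          + vol * math.sqrt(dt) * random.gauss(0.0, 1.0))
            if s <= BARRIER:            # knocked out: option dies
                alive = False
                break
            if random.random() > SWITCH[regime][regime]:
                regime = 1 - regime     # hidden regime switches
        if alive:
            total += max(s - K, 0.0)
    return math.exp(-R * T) * total / n_paths

print(round(price(), 2))
```

The path-independent method of the paper avoids simulating these step-by-step trajectories altogether, which is why it is reported to be faster than such Monte-Carlo approaches.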
NASA Astrophysics Data System (ADS)
Dong, Ming; He, David
2007-07-01
Diagnostics and prognostics are two important aspects of a condition-based maintenance (CBM) program. However, these two tasks are often performed separately. For example, data might be collected and analysed separately for diagnosis and prognosis. This practice increases the cost and reduces the efficiency of CBM and may affect the accuracy of the diagnostic and prognostic results. In this paper, a statistical modelling methodology for performing both diagnosis and prognosis in a unified framework is presented. The methodology is developed based on segmental hidden semi-Markov models (HSMMs). An HSMM is a hidden Markov model (HMM) with temporal structure. Unlike an HMM, an HSMM does not follow the unrealistic Markov chain assumption and therefore provides more powerful modelling and analysis capability for real problems. In addition, an HSMM allows modelling of the time duration of the hidden states and is therefore capable of prognosis. To facilitate the computation in the proposed HSMM-based diagnostics and prognostics, new forward-backward variables are defined and a modified forward-backward algorithm is developed. Existing state duration estimation methods are inefficient because they require huge storage and computational loads. Therefore, a new approach is proposed for training HSMMs in which state duration probabilities are estimated on the lattice (or trellis) of observations and states. The model parameters are estimated through the modified forward-backward training algorithm. The estimated state duration probability distributions combined with state-changing point detection can be used to predict the remaining useful life of a system. The evaluation of the proposed methodology was carried out through a real-world application: health monitoring of hydraulic pumps. In the tests, the recognition rates for all states are greater than 96%. For each individual pump, the recognition rate is increased by 29.3% in comparison with HMMs. Because of the temporal
Memetic Approaches for Optimizing Hidden Markov Models: A Case Study in Time Series Prediction
NASA Astrophysics Data System (ADS)
Bui, Lam Thu; Barlow, Michael
We propose a methodology for employing memetics (local search) within the framework of evolutionary algorithms to optimize the parameters of hidden Markov models. With this proposal, the rate and frequency of using local search are automatically changed over time, either at the population or the individual level. At the population level, we allow the rate of using local search to decay over time to zero (at the final generation). At the individual level, each individual is equipped with information about when it will do local search and for how long. This information evolves over time alongside the main elements of the chromosome representing the individual.
Alignment of multiple proteins with an ensemble of Hidden Markov Models
Song, Yinglei; Qu, Junfeng; Hura, Gurdeep S.
2011-01-01
In this paper, we developed a new method that progressively constructs and updates a set of alignments by adding sequences in a certain order to each of the existing alignments. Each of the existing alignments is modelled with a profile hidden Markov model (HMM), and an added sequence is aligned to each of these profile HMMs. We introduced an integer parameter for the number of profile HMMs. The profile HMMs are then updated based on the alignments with the leading scores. Our experiments on BAliBASE showed that our approach can efficiently explore the alignment space and significantly improve alignment accuracy. PMID:20376922
(abstract) Modeling Protein Families and Human Genes: Hidden Markov Models and a Little Beyond
NASA Technical Reports Server (NTRS)
Baldi, Pierre
1994-01-01
We will first give a brief overview of Hidden Markov Models (HMMs) and their use in Computational Molecular Biology. In particular, we will describe a detailed application of HMMs to the G-Protein-Coupled-Receptor Superfamily. We will also describe a number of analytical results on HMMs that can be used in discrimination tests and database mining. We will then discuss the limitations of HMMs and some new directions of research. We will conclude with some recent results on the application of HMMs to human gene modeling and parsing.
NASA Astrophysics Data System (ADS)
Granat, R. A.; Clayton, R.; Kedar, S.; Kaneko, Y.
2003-12-01
We employ a robust hidden Markov model (HMM) based technique to perform statistical pattern analysis of suspected seismic and aseismic events in the poorly explored period band of minutes to hours. The technique allows us to classify known events and provides a statistical basis for finding and cataloging similar events represented elsewhere in the observations. In this work, we focus on data collected by the Southern California TriNet system. The hidden Markov model (HMM) approach assumes that the observed data have been generated by an unobservable dynamical statistical process. The process is of a particular form such that each observation is coincident with the system being in a particular discrete state. The dynamics of the model are constructed so that the next state depends directly only on the current state -- it is a first-order Markov process. The model is completely described by a set of parameters: the initial state probabilities, the first-order Markov chain state-to-state transition probabilities, and the probability distribution of observable outputs associated with each state. Application of the model to data involves optimizing these model parameters with respect to some function of the observations, typically the likelihood of the observations given the model. Our work focused on the fact that this objective function has a number of local maxima that is exponential in the model size (the number of states). This means not only that it is very difficult to discover the global maximum, but also that results can vary widely between applications of the model. For some domains which employ HMMs for such purposes, such as speech processing, sufficient a priori information about the system is available to avoid this problem. However, for seismic data in general such a priori information is not available. Our approach involves analytical location of sub-optimal local maxima; once the locations of these maxima have been found, then we can employ a
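The objective function discussed above, the likelihood of the observations given the model, is computed with the forward algorithm. The sketch below evaluates it (in scaled form, returning the log-likelihood) for a few random initializations of a two-state model; the differing scores hint at the multi-modal surface the passage describes. All parameters are illustrative:

```python
import math
import random

# Scaled forward algorithm: log P(observations | model) for a discrete
# HMM. Evaluating it at several random parameter settings shows that
# different starting points score very differently -- the local-maxima
# problem that makes HMM training sensitive to initialization.

def log_likelihood(obs, init, trans, emit):
    n = len(init)
    alpha = [init[s] * emit[s][obs[0]] for s in range(n)]
    ll = 0.0
    for sym in obs[1:]:
        norm = sum(alpha)               # rescale to avoid underflow
        ll += math.log(norm)
        alpha = [a / norm for a in alpha]
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][sym]
                 for s in range(n)]
    return ll + math.log(sum(alpha))

random.seed(0)
obs = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0]

def rand_row():
    a = random.random()
    return [a, 1 - a]

for trial in range(3):
    init = rand_row()
    trans = [rand_row(), rand_row()]
    emit = [rand_row(), rand_row()]
    print(trial, round(log_likelihood(obs, init, trans, emit), 3))
```

Training (e.g. Baum-Welch) climbs this surface from whichever initialization it is given, which is why restart strategies or, as in this work, analytical location of sub-optimal maxima matter.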
Statistical Inference in Hidden Markov Models Using k-Segment Constraints
Titsias, Michalis K.; Holmes, Christopher C.; Yau, Christopher
2016-01-01
Hidden Markov models (HMMs) are one of the most widely used statistical methods for analyzing sequence data. However, the reporting of output from HMMs has largely been restricted to the presentation of the most-probable (MAP) hidden state sequence, found via the Viterbi algorithm, or the sequence of most probable marginals using the forward–backward algorithm. In this article, we expand the amount of information we could obtain from the posterior distribution of an HMM by introducing linear-time dynamic programming recursions that, conditional on a user-specified constraint in the number of segments, allow us to (i) find MAP sequences, (ii) compute posterior probabilities, and (iii) simulate sample paths. We collectively call these recursions k-segment algorithms and illustrate their utility using simulated and real examples. We also highlight the prospective and retrospective use of k-segment constraints for fitting HMMs or exploring existing model fits. Supplementary materials for this article are available online. PMID:27226674
Algorithms for Hidden Markov Models Restricted to Occurrences of Regular Expressions
Tataru, Paula; Sand, Andreas; Hobolth, Asger; Mailund, Thomas; Pedersen, Christian N. S.
2013-01-01
Hidden Markov Models (HMMs) are widely used probabilistic models, particularly for annotating sequential data with an underlying hidden structure. Patterns in the annotation are often more relevant to study than the hidden structure itself. A typical HMM analysis consists of annotating the observed data using a decoding algorithm and analyzing the annotation to study patterns of interest. For example, given an HMM modeling genes in DNA sequences, the focus is on occurrences of genes in the annotation. In this paper, we define a pattern through a regular expression and present a restriction of three classical algorithms to take the number of occurrences of the pattern in the hidden sequence into account. We present a new algorithm to compute the distribution of the number of pattern occurrences, and we extend the two most widely used existing decoding algorithms to employ information from this distribution. We show experimentally that the expectation of the distribution of the number of pattern occurrences gives a highly accurate estimate, while the typical procedure can be biased in the sense that the identified number of pattern occurrences does not correspond to the true number. We furthermore show that using this distribution in the decoding algorithms improves the predictive power of the model. PMID:24833225
Automated brain tumor segmentation using spatial accuracy-weighted hidden Markov Random Field.
Nie, Jingxin; Xue, Zhong; Liu, Tianming; Young, Geoffrey S; Setayesh, Kian; Guo, Lei; Wong, Stephen T C
2009-09-01
A variety of algorithms have been proposed for brain tumor segmentation from multi-channel sequences; however, most of them require isotropic or pseudo-isotropic resolution of the MR images. Although co-registration and interpolation of low-resolution sequences, such as T2-weighted images, onto the space of the high-resolution image, such as a T1-weighted image, can be performed prior to the segmentation, the results are usually limited by partial volume effects due to interpolation of low-resolution images. To improve the quality of tumor segmentation in clinical applications where low-resolution sequences are commonly used together with high-resolution images, we propose an algorithm based on a Spatial accuracy-weighted Hidden Markov random field and Expectation maximization (SHE) approach for both automated tumor and enhanced-tumor segmentation. SHE incorporates the spatial interpolation accuracy of low-resolution images into the optimization procedure of the Hidden Markov Random Field (HMRF) to segment tumor using multi-channel MR images with different resolutions, e.g., high-resolution T1-weighted and low-resolution T2-weighted images. In experiments, we evaluated this algorithm using a set of simulated multi-channel brain MR images with known ground-truth tissue segmentation and also applied it to a dataset of MR images obtained during clinical trials of brain tumor chemotherapy. The results show that more accurate tumor segmentation results can be obtained compared with conventional multi-channel segmentation algorithms. PMID:19446435
Zhang, Yu-Chen; Zhang, Shao-Wu; Liu, Lian; Liu, Hui; Zhang, Lin; Cui, Xiaodong; Huang, Yufei; Meng, Jia
2015-01-01
With the development of new sequencing technology, the entire N6-methyl-adenosine (m6A) RNA methylome can now be profiled without bias using the methylated RNA immunoprecipitation sequencing technique (MeRIP-Seq), making it possible to detect differential methylation states of RNA between two conditions, for example, between normal and cancerous tissue. However, as an affinity-based method, MeRIP-Seq has not yet provided base-pair resolution; that is, a single methylation site determined from MeRIP-Seq data can in practice contain multiple RNA methylation residues, some of which can be regulated by different enzymes and thus differentially methylated between two conditions. Since existing peak-based methods cannot effectively differentiate multiple methylation residues located within a single methylation site, we propose a hidden Markov model (HMM) based approach to address this issue. Specifically, the detected RNA methylation site is further divided into multiple adjacent small bins and then scanned at higher resolution using a hidden Markov model that captures the dependency between spatially adjacent bins for improved accuracy. We tested the proposed algorithm on both simulated and real data. The results suggest that the proposed algorithm clearly outperforms the existing peak-based approach on simulated systems and detects differential methylation regions with higher statistical significance on the real dataset. PMID:26301253
2014-01-01
Background Logos are commonly used in molecular biology to provide a compact graphical representation of the conservation pattern of a set of sequences. They render the information contained in sequence alignments or profile hidden Markov models by drawing a stack of letters for each position, where the height of the stack corresponds to the conservation at that position, and the height of each letter within a stack depends on the frequency of that letter at that position. Results We present a new tool and web server, called Skylign, which provides a unified framework for creating logos for both sequence alignments and profile hidden Markov models. In addition to static image files, Skylign creates a novel interactive logo plot for inclusion in web pages. These interactive logos enable scrolling, zooming, and inspection of underlying values. Skylign can avoid sampling bias in sequence alignments by down-weighting redundant sequences and by combining observed counts with informed priors. It also simplifies the representation of gap parameters, and can optionally scale letter heights based on alternate calculations of the conservation of a position. Conclusion Skylign is available as a website, a scriptable web service with a RESTful interface, and as a software package for download. Skylign’s interactive logos are easily incorporated into a web page with just a few lines of HTML markup. Skylign may be found at http://skylign.org. PMID:24410852
Sun, Shuying; Yu, Xiaoqing
2016-03-01
DNA methylation is an epigenetic event that plays an important role in regulating gene expression. It is important to study DNA methylation, especially differential methylation patterns between two groups of samples (e.g. patients vs. normal individuals). With next-generation sequencing technologies, it is now possible to identify differential methylation patterns by considering methylation at the single CG site level across an entire genome. However, it is challenging to analyze large and complex NGS data. To address this difficult question, we have developed a new statistical method using a hidden Markov model and Fisher's exact test (HMM-Fisher) to identify differentially methylated cytosines and regions. We first use a hidden Markov chain to model the methylation signals and infer the methylation state of each individual sample as Not methylated (N), Partly methylated (P), or Fully methylated (F). We then use Fisher's exact test to identify differentially methylated CG sites. We demonstrate the HMM-Fisher method and compare it with commonly cited methods using both simulated data and real sequencing data. The results show that HMM-Fisher outperforms the currently available methods with which we compared it. HMM-Fisher is efficient and robust in identifying heterogeneous DM regions. PMID:26854292
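The second stage of HMM-Fisher can be illustrated concretely: at each CG site, tabulate how many samples in each group are inferred methylated versus not, and apply Fisher's exact test. The sketch below implements the two-sided test for a 2x2 table directly from the hypergeometric distribution; the counts are hypothetical:

```python
import math

# Two-sided Fisher's exact test for a 2x2 contingency table
# [[a, b], [c, d]], computed from the hypergeometric pmf: sum the
# probabilities of all tables (with the same margins) that are at most
# as likely as the observed one.

def fisher_exact(a, b, c, d):
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def pmf(x):
        return (math.comb(col1, x) * math.comb(n - col1, row1 - x)
                / math.comb(n, row1))
    p_obs = pmf(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # small epsilon guards against float ties at equal probabilities
    return sum(pmf(x) for x in range(lo, hi + 1) if pmf(x) <= p_obs + 1e-12)

# hypothetical site: group 1 has 9 of 10 samples methylated,
# group 2 has 2 of 10
p = fisher_exact(9, 1, 2, 8)
print(round(p, 4))  # -> 0.0055
```

In HMM-Fisher the row counts would come from the per-sample methylation states (N/P/F) inferred by the hidden Markov chain, and sites with small p-values are flagged as differentially methylated.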
Automated brain tumor segmentation using spatial accuracy-weighted hidden Markov Random Field.
Nie, Jingxin; Xue, Zhong; Liu, Tianming; Young, Geoffrey S; Setayesh, Kian; Guo, Lei; Wong, Stephen T C
2009-09-01
A variety of algorithms have been proposed for brain tumor segmentation from multi-channel sequences; however, most of them require isotropic or pseudo-isotropic resolution of the MR images. Although co-registration and interpolation of low-resolution sequences, such as T2-weighted images, onto the space of a high-resolution image, such as a T1-weighted image, can be performed prior to segmentation, the results are usually limited by partial volume effects due to interpolation of the low-resolution images. To improve the quality of tumor segmentation in clinical applications, where low-resolution sequences are commonly used together with high-resolution images, we propose an algorithm based on a Spatial accuracy-weighted Hidden Markov random field and Expectation maximization (SHE) approach for automated segmentation of both tumor and enhanced tumor. SHE incorporates the spatial interpolation accuracy of the low-resolution images into the optimization procedure of the Hidden Markov Random Field (HMRF) to segment tumor from multi-channel MR images with different resolutions, e.g., high-resolution T1-weighted and low-resolution T2-weighted images. In experiments, we evaluated this algorithm on a set of simulated multi-channel brain MR images with known ground-truth tissue segmentation and also applied it to a dataset of MR images obtained during clinical trials of brain tumor chemotherapy. The results show that more accurate tumor segmentation can be obtained compared with conventional multi-channel segmentation algorithms. PMID:19446435
Robertson, Colin; Sawford, Kate; Gunawardana, Walimunige S. N.; Nelson, Trisalyn A.; Nathoo, Farouk; Stephen, Craig
2011-01-01
Surveillance systems tracking health patterns in animals have potential for early warning of infectious disease in humans, yet many challenges remain before this can be realized. Specifically, there remains the challenge of detecting early warning signals for diseases that are not known or are not part of routine surveillance for named diseases. This paper reports on the development of a hidden Markov model for the analysis of frontline veterinary sentinel surveillance data from Sri Lanka. Field veterinarians collected data on syndromes and diagnoses using mobile phones. A model for submission patterns accounts for both sentinel-related and disease-related variability. Models for commonly reported cattle diagnoses were estimated separately. Region-specific weekly average prevalence was estimated for each diagnosis and partitioned into normal and abnormal periods. Visualization of state probabilities was used to indicate areas and times of unusual disease prevalence. The analysis suggests that hidden Markov modelling is a useful approach for surveillance datasets from novel populations and/or with little historical baseline data. PMID:21949763
NASA Astrophysics Data System (ADS)
Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea; Montalto, Placido; Patanè, Domenico; Privitera, Eugenio
2016-07-01
From January 2011 to December 2015, Mt. Etna was mainly characterized by cyclic eruptive behavior, with more than 40 lava fountains from the New South-East Crater. Using the RMS (root mean square) of the seismic signal recorded by stations close to the summit area, automatic recognition of the different states of volcanic activity (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN) has been applied for monitoring purposes. Since the values of the RMS time series calculated on the seismic signal are generated by a stochastic process, we can model the system generating the sampled values, assumed to be a Markov process, using Hidden Markov Models (HMMs). HMM analysis seeks to recover the sequence of hidden states from the observations. In our framework, the observations are characters generated by the Symbolic Aggregate approXimation (SAX) technique, which maps RMS time series values to symbols of a pre-defined alphabet. The main advantages of the proposed framework, based on HMMs and SAX, with respect to other automatic systems applied to seismic signals at Mt. Etna, are the use of multiple stations and static thresholds to characterize the volcano states well. Its application to a large seismic dataset from Etna shows that the volcano states can be inferred from the data. The experimental results show that, in most cases, we detected lava fountains in advance.
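The SAX discretisation step described above (z-normalise the series, piecewise-aggregate it, then map each segment mean to an equiprobable symbol) can be sketched in a few lines of Python. This is a minimal illustration only: the alphabet size, segment count, and input values are arbitrary assumptions, not the monitoring system's settings.

```python
from statistics import NormalDist, mean, stdev

def sax(series, n_segments, alphabet='abcd'):
    """Symbolic Aggregate approXimation: z-normalise, apply Piecewise
    Aggregate Approximation (PAA), then discretise segment means using
    equiprobable breakpoints of the standard normal distribution."""
    mu, sd = mean(series), stdev(series)
    z = [(x - mu) / sd for x in series]
    # PAA: the mean of each equal-width segment of the normalised series.
    seg = len(z) // n_segments
    paa = [mean(z[i * seg:(i + 1) * seg]) for i in range(n_segments)]
    # Breakpoints split N(0, 1) into len(alphabet) equiprobable regions.
    a = len(alphabet)
    breaks = [NormalDist().inv_cdf(k / a) for k in range(1, a)]

    def symbol(v):
        for bp, ch in zip(breaks, alphabet):
            if v < bp:
                return ch
        return alphabet[-1]

    return ''.join(symbol(v) for v in paa)

# Toy RMS-like series: a quiet phase followed by a high-amplitude phase.
word = sax([1, 1, 2, 2, 8, 9, 9, 8], n_segments=4)
```

The resulting symbol string is what an HMM would consume as its observation sequence.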
Hidden Markov Model analysis of force/torque information in telemanipulation
Hannaford, B.; Lee, P.
1991-10-01
A new model is developed for prediction and analysis of sensor information recorded during robotic performance of tasks by telemanipulation. The model uses the Hidden Markov Model (stochastic functions of Markov nets; HMM) to describe the task structure, the operator or intelligent controller's goal structure, and the sensor signals such as forces and torques arising from interaction with the environment. The Markov process portion encodes the task sequence/subgoal structure, and the observation densities associated with each subgoal state encode the expected sensor signals associated with carrying out that subgoal. Methodology is described for construction of the model parameters based on engineering knowledge of the task. The Viterbi algorithm is used for model based analysis of force signals measured during experimental teleoperation and achieves excellent segmentation of the data into subgoal phases. The Baum-Welch algorithm is used to identify the most likely HMM from a given experiment. The HMM yields a structured, knowledge-based model with explicit uncertainties and mature, optimal identification algorithms.
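The Viterbi decoding used above for segmenting force signals into subgoal phases can be sketched in plain Python. The two-state 'approach'/'contact' model and all probabilities below are invented for illustration; they are not the parameters identified in the experiments.

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden state sequence (log-space for numerical stability)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # Best predecessor state for s at time t.
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t - 1][prev] + math.log(trans_p[prev][s])
                       + math.log(emit_p[s][obs[t]]))
            back[t][s] = prev
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Illustrative two-subgoal model: 'approach' mostly emits low force readings,
# 'contact' mostly emits high ones.
states = ('approach', 'contact')
start = {'approach': 0.9, 'contact': 0.1}
trans = {'approach': {'approach': 0.8, 'contact': 0.2},
         'contact':  {'approach': 0.1, 'contact': 0.9}}
emit = {'approach': {'low': 0.9, 'high': 0.1},
        'contact':  {'low': 0.2, 'high': 0.8}}

path = viterbi(('low', 'low', 'high', 'high'), states, start, trans, emit)
```

Decoding a quantised force trace this way yields the subgoal-phase segmentation directly.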
Unsupervised SAR images change detection with hidden Markov chains on a sliding window
NASA Astrophysics Data System (ADS)
Bouyahia, Zied; Benyoussef, Lamia; Derrode, Stéphane
2007-10-01
This work deals with unsupervised change detection in bi-date Synthetic Aperture Radar (SAR) images. Whatever the indicator of change used, e.g. log-ratio or Kullback-Leibler divergence, we have observed poor-quality change maps for some events when using the Hidden Markov Chain (HMC) model we focus on in this work. The main reason comes from the stationarity assumption involved in this model (and in most Markovian models, such as Hidden Markov Random Fields), which cannot be justified in most observed scenes: changed areas are not necessarily stationary in the image. Beyond the few non-stationary Markov models proposed in the literature, the aim of this paper is to describe a pragmatic solution that tackles non-stationarity by using a sliding-window strategy. In this algorithm, the criterion image is scanned pixel by pixel, and a classical HMC model is applied only to the neighboring pixels. By moving the window through the image, the process is able to produce a change map which better exhibits non-stationary changes than the classical HMC applied directly to the whole criterion image. Special care is devoted to the estimation of the number of classes in each window, which can vary from one (no change) to three (positive change, negative change and no change), by using the corrected Akaike Information Criterion (AICc) suited to small samples. The quality of the proposed approach is assessed with speckle-simulated images into which simulated changes are introduced. The windowed strategy is also evaluated with a pair of RADARSAT images bracketing the Nyiragongo volcano eruption event in January 2002. The available ground truth confirms the effectiveness of the proposed approach compared to a classical HMC-based strategy.
Classification method for disease risk mapping based on discrete hidden Markov random fields.
Charras-Garrido, Myriam; Abrial, David; Goër, Jocelyn De; Dachian, Sergueï; Peyrard, Nathalie
2012-04-01
Risk mapping in epidemiology enables areas with a low or high risk of disease contamination to be localized and provides a measure of risk differences between these regions. Risk mapping models for pooled data currently used by epidemiologists focus on the estimated risk for each geographical unit. They are based on a Poisson log-linear mixed model with a latent intrinsic continuous hidden Markov random field (HMRF) generally corresponding to a Gaussian autoregressive spatial smoothing. Risk classification, which is necessary to draw clearly delimited risk zones (in which protection measures may be applied), generally must be performed separately. We propose a method for direct classified risk mapping based on a Poisson log-linear mixed model with a latent discrete HMRF. The discrete hidden field (HF) corresponds to the assignment of each spatial unit to a risk class. The risk values attached to the classes are parameters and are estimated. When mapping risk using HMRFs, the conditional distribution of the observed field is modeled with a Poisson rather than a Gaussian distribution as in image segmentation. Moreover, abrupt changes in risk levels are rare in disease maps. The spatial hidden model should favor smoothed out risks, but conventional discrete Markov random fields (e.g. the Potts model) do not impose this. We therefore propose new potential functions for the HF that take into account class ordering. We use a Monte Carlo version of the expectation-maximization algorithm to estimate parameters and determine risk classes. We illustrate the method's behavior on simulated and real data sets. Our method appears particularly well adapted to localize high-risk regions and estimate the corresponding risk levels.
NASA Astrophysics Data System (ADS)
Turner, Sean; Galelli, Stefano; Wilcox, Karen
2015-04-01
Water reservoir systems are often affected by recurring large-scale ocean-atmospheric anomalies, known as teleconnections, that cause prolonged periods of climatological drought. Accurate forecasts of these events -- at lead times in the order of weeks and months -- may enable reservoir operators to take more effective release decisions to improve the performance of their systems. In practice this might mean a more reliable water supply system, a more profitable hydropower plant or a more sustainable environmental release policy. To this end, climate indices, which represent the oscillation of the ocean-atmospheric system, might be gainfully employed within reservoir operating models that adapt the reservoir operation as a function of the climate condition. This study develops a Stochastic Dynamic Programming (SDP) approach that can incorporate climate indices using a Hidden Markov Model. The model simulates the climatic regime as a hidden state following a Markov chain, with the state transitions driven by variation in climatic indices, such as the Southern Oscillation Index. Time series analysis of recorded streamflow data reveals the parameters of separate autoregressive models that describe the inflow to the reservoir under three representative climate states ("normal", "wet", "dry"). These models then define inflow transition probabilities for use in a classic SDP approach. The key advantage of the Hidden Markov Model is that it allows conditioning the operating policy not only on the reservoir storage and the antecedent inflow, but also on the climate condition, thus potentially allowing adaptability to a broader range of climate conditions. In practice, the reservoir operator would effect a water release tailored to a specific climate state based on available teleconnection data and forecasts. The approach is demonstrated on the operation of a realistic, stylised water reservoir with carry-over capacity in South-East Australia. Here teleconnections relating
ERIC Educational Resources Information Center
Stifter, Cynthia A.; Rovine, Michael
2015-01-01
The present longitudinal study examined mother-infant interaction during the administration of immunizations at 2 and 6 months of age using hidden Markov modelling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a…
Statistical Mechanics of Transcription-Factor Binding Site Discovery Using Hidden Markov Models
Mehta, Pankaj; Schwab, David J.; Sengupta, Anirvan M.
2011-01-01
Hidden Markov Models (HMMs) are a commonly used tool for inference of transcription factor (TF) binding sites from DNA sequence data. We exploit the mathematical equivalence between HMMs for TF binding and the “inverse” statistical mechanics of hard rods in a one-dimensional disordered potential to investigate learning in HMMs. We derive analytic expressions for the Fisher information, a commonly employed measure of confidence in learned parameters, in the biologically relevant limit where the density of binding sites is low. We then use techniques from statistical mechanics to derive a scaling principle relating the specificity (binding energy) of a TF to the minimum amount of training data necessary to learn it. PMID:22851788
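The Fisher information invoked above is the standard statistical quantity, stated here for context rather than taken from the paper: for an HMM with parameters $\theta$ and observed sequence $O$,

```latex
I_{ij}(\theta) \;=\; -\,\mathbb{E}\!\left[\frac{\partial^2 \log P(O \mid \theta)}{\partial \theta_i \,\partial \theta_j}\right],
```

whose inverse lower-bounds the covariance of any unbiased estimator of $\theta$ (the Cramér-Rao bound). This is why it serves as a measure of confidence in learned parameters: directions in parameter space with small Fisher information require more training data to pin down, which is the intuition behind the paper's scaling principle relating binding energy to the minimum amount of training data.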
Giehr, Pascal; Kyriakopoulos, Charalampos; Ficz, Gabriella; Wolf, Verena; Walter, Jörn
2016-01-01
DNA methylation and demethylation are opposing processes that when in balance create stable patterns of epigenetic memory. The control of DNA methylation pattern formation by replication dependent and independent demethylation processes has been suggested to be influenced by Tet mediated oxidation of 5mC. Several alternative mechanisms have been proposed suggesting that 5hmC influences either replication dependent maintenance of DNA methylation or replication independent processes of active demethylation. Using high resolution hairpin oxidative bisulfite sequencing data, we precisely determine the amount of 5mC and 5hmC and model the contribution of 5hmC to processes of demethylation in mouse ESCs. We develop an extended hidden Markov model capable of accurately describing the regional contribution of 5hmC to demethylation dynamics. Our analysis shows that 5hmC has a strong impact on replication dependent demethylation, mainly by impairing methylation maintenance. PMID:27224554
Tracking Problem Solving by Multivariate Pattern Analysis and Hidden Markov Model Algorithms
Anderson, John R.
2011-01-01
Multivariate pattern analysis can be combined with hidden Markov model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first “mind reading” application involves using fMRI activity to track what students are doing as they solve a sequence of algebra problems. The methodology achieves considerable accuracy at determining both what problem-solving step the students are taking and whether they are performing that step correctly. The second “model discovery” application involves using statistical model evaluation to determine how many substates are involved in performing a step of algebraic problem solving. This research indicates that different steps involve different numbers of substates and these substates are associated with different fluency in algebra problem solving. PMID:21820455
Laser Doppler vibrometry measurements of the carotid pulse: biometrics using hidden Markov models
NASA Astrophysics Data System (ADS)
Kaplan, Alan D.; O'Sullivan, Joseph A.; Sirevaag, Erik J.; Rohrbaugh, John W.
2009-05-01
Small movements of the skin overlying the carotid artery, arising from pressure pulse changes in the carotid during the cardiac cycle, can be detected using the method of Laser Doppler Vibrometry (LDV). Based on the premise that there is a high degree of individuality in cardiovascular function, the pulse-related movements were modeled for biometric use. Short time variations in the signal due to physiological factors are described and these variations are shown to be informative for identity verification and recognition. Hidden Markov models (HMMs) are used to exploit the dependence between the pulse signals over successive cardiac cycles. The resulting biometric classification performance confirms that the LDV signal contains information that is unique to the individual.
Prestat, Emmanuel; David, Maude M.; Hultman, Jenni; Taş, Neslihan; Lamendella, Regina; Dvornik, Jill; Mackelprang, Rachel; Myrold, David D.; Jumpponen, Ari; Tringe, Susannah G.; Holman, Elizabeth; Mavromatis, Konstantinos; Jansson, Janet K.
2014-09-26
A new functional gene database, FOAM (Functional Ontology Assignments for Metagenomes), was developed to screen environmental metagenomic sequence datasets. FOAM provides a new functional ontology dedicated to classify gene functions relevant to environmental microorganisms based on Hidden Markov Models (HMMs). Sets of aligned protein sequences (i.e. ‘profiles’) were tailored to a large group of target KEGG Orthologs (KOs) from which HMMs were trained. The alignments were checked and curated to make them specific to the targeted KO. Within this process, sequence profiles were enriched with the most abundant sequences available to maximize the yield of accurate classifier models. An associated functional ontology was built to describe the functional groups and hierarchy. FOAM allows the user to select the target search space before HMM-based comparison steps and to easily organize the results into different functional categories and subcategories. FOAM is publicly available at http://portal.nersc.gov/project/m1317/FOAM/.
3D+t brain MRI segmentation using robust 4D Hidden Markov Chain.
Lavigne, François; Collet, Christophe; Armspach, Jean-Paul
2014-01-01
In recent years many automatic methods have been developed to help physicians diagnose brain disorders, but the problem remains complex. In this paper we propose a method to segment brain structures on two 3D multi-modal MR images taken at different times (longitudinal acquisition). A bias field correction is performed with an adaptation of the Hidden Markov Chain (HMC) allowing us to take into account the temporal correlation in addition to spatial neighbourhood information. To improve the robustness of the segmentation of the principal brain structures and to detect Multiple Sclerosis Lesions as outliers the Trimmed Likelihood Estimator (TLE) is used during the process. The method is validated on 3D+t brain MR images. PMID:25571045
McCallum, Kenneth Jordan; Wang, Ji-Ping
2013-07-01
Copy number variations (CNVs) are a significant source of genetic variation and have been found frequently associated with diseases such as cancers and autism. High-throughput sequencing data are increasingly being used to detect and quantify CNVs; however, the distributional properties of the data are not fully understood. A hidden Markov model (HMM) is proposed using inhomogeneous emission distributions based on negative binomial regression to account for the sequencing biases. The model is tested on the whole genome sequencing data and simulated data sets. An algorithm for CNV detection is implemented in the R package CNVfinder. The model based on negative binomial regression is shown to provide a good fit to the data and provides competitive performance compared with methods based on normalization of read counts. PMID:23428932
An adaptive Hidden Markov model for activity recognition based on a wearable multi-sensor device.
Li, Zhen; Wei, Zhiqiang; Yue, Yaofeng; Wang, Hao; Jia, Wenyan; Burke, Lora E; Baranowski, Thomas; Sun, Mingui
2015-05-01
Human activity recognition is important in the study of personal health, wellness and lifestyle. In order to acquire human activity information from the personal space, many wearable multi-sensor devices have been developed. In this paper, a novel technique for automatic activity recognition based on multi-sensor data is presented. In order to utilize these data efficiently and overcome the big data problem, an offline adaptive Hidden Markov Model (HMM) is proposed. A sensor selection scheme is implemented based on an improved Viterbi algorithm. A new method is proposed that incorporates personal experience into the HMM as a priori information. Experiments are conducted using the eButton, a personal wearable computer consisting of multiple sensors. Our comparative study with the standard HMM and other alternative methods in processing the eButton data has shown that our method is more robust and efficient, providing a useful tool to evaluate human activity and lifestyle.
Integrating hidden Markov model and PRAAT: a toolbox for robust automatic speech transcription
NASA Astrophysics Data System (ADS)
Kabir, A.; Barker, J.; Giurgiu, M.
2010-09-01
An automatic time-aligned phone transcription toolbox for English speech corpora has been developed. The toolbox is especially useful for generating robust automatic transcriptions, and it can produce phone-level transcriptions using speaker-independent as well as speaker-dependent models without manual intervention. The system is based on the standard Hidden Markov Model (HMM) approach and was successfully tested on a large audiovisual speech corpus, the GRID corpus. One of the most powerful features of the toolbox is its increased flexibility in speech processing, allowing the speech community to import automatic transcriptions generated by the HMM Toolkit (HTK) into a popular transcription software, PRAAT, and vice versa. The toolbox has been evaluated through statistical analysis on GRID data, which shows that the automatic transcription deviates by an average of 20 ms with respect to manual transcription.
NASA Astrophysics Data System (ADS)
Attaluri, Pavan K.; Chen, Zhengxin; Weerakoon, Aruna M.; Lu, Guoqing
Multiple criteria decision making (MCDM) has significant impact in bioinformatics. In the research reported here, we explore the integration of decision trees (DT) and Hidden Markov Models (HMM) for subtype prediction of human influenza A virus. Infection with influenza viruses continues to be an important public health problem. Viral strains of subtypes H3N2 and H1N1 circulate in humans at least twice annually. Subtype detection depends mainly on antigenic assays, which are time-consuming and not fully accurate. We have developed a Web system for accurate subtype detection of human influenza virus sequences. A preliminary experiment showed that this system is easy to use and powerful in identifying human influenza subtypes. Our next step is to examine the informative positions at the protein level and extend the current functionality to detect more subtypes. The web functions can be accessed at http://glee.ist.unomaha.edu/.
Autoregressive hidden Markov models for the early detection of neonatal sepsis.
Stanculescu, Ioan; Williams, Christopher K I; Freer, Yvonne
2014-09-01
Late onset neonatal sepsis is one of the major clinical concerns when premature babies receive intensive care. Current practice relies on slow laboratory testing of blood cultures for diagnosis. A valuable research question is whether sepsis can be reliably detected before the blood sample is taken. This paper investigates the extent to which physiological events observed in the patient's monitoring traces could be used for the early detection of neonatal sepsis. We model the distribution of these events with an autoregressive hidden Markov model (AR-HMM). Both learning and inference carefully use domain knowledge to extract the baby's true physiology from the monitoring data. Our model can produce real-time predictions about the onset of the infection and also handles missing data. We evaluate the effectiveness of the AR-HMM for sepsis detection on a dataset collected from the Neonatal Intensive Care Unit at the Royal Infirmary of Edinburgh.
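The AR-HMM described above pairs each hidden regime with its own autoregressive observation model. A minimal sketch of the regime-conditional AR(1) Gaussian log-likelihood follows; all parameters and data here are invented for illustration, not those learned from the monitoring traces.

```python
from math import log, pi

def ar1_loglik(y, c, phi, sigma):
    """Log-likelihood of a series under one AR(1) Gaussian regime:
    y[t] ~ N(c + phi * y[t-1], sigma^2), conditioning on y[0]."""
    ll = 0.0
    for t in range(1, len(y)):
        mu = c + phi * y[t - 1]
        ll += -0.5 * log(2 * pi * sigma ** 2) - (y[t] - mu) ** 2 / (2 * sigma ** 2)
    return ll

# Toy physiological trace hovering near 1.0.
y = [1.0, 0.9, 1.1, 1.0, 0.95]
# Regime A: strongly persistent AR(1); Regime B: white noise around zero.
ll_a = ar1_loglik(y, c=0.0, phi=1.0, sigma=0.2)
ll_b = ar1_loglik(y, c=0.0, phi=0.0, sigma=0.2)
```

In a full AR-HMM these per-regime likelihoods would feed the forward-backward recursions, so the model can weigh which hidden regime best explains each stretch of the trace (here regime A fits far better).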
Hand Gesture Spotting Based on 3D Dynamic Features Using Hidden Markov Models
NASA Astrophysics Data System (ADS)
Elmezain, Mahmoud; Al-Hamadi, Ayoub; Michaelis, Bernd
In this paper, we propose an automatic system that handles hand gesture spotting and recognition simultaneously in stereo color image sequences, without any time delay, based on Hidden Markov Models (HMMs). Color and a 3D depth map are used to segment hand regions. The hand trajectory is then determined using the mean-shift algorithm and a Kalman filter to generate 3D dynamic features. Furthermore, the k-means clustering algorithm is employed to build the HMM codewords. To spot meaningful gestures accurately, a non-gesture model is proposed, which provides a confidence limit for the likelihoods calculated by the other gesture models. The confidence measures are used as an adaptive threshold for spotting meaningful gestures. Experimental results show that the proposed system can successfully recognize isolated gestures with 98.33% reliability and meaningful gestures with 94.35% reliability for the numbers 0-9.
Fast Bayesian Inference of Copy Number Variants using Hidden Markov Models with Wavelet Compression.
Wiedenhoeft, John; Brugel, Eric; Schliep, Alexander
2016-05-01
By integrating Haar wavelets with Hidden Markov Models, we achieve drastically reduced running times for Bayesian inference using Forward-Backward Gibbs sampling. We show that this improves detection of genomic copy number variants (CNV) in array CGH experiments compared to the state-of-the-art, including standard Gibbs sampling. The method concentrates computational effort on chromosomal segments which are difficult to call, by dynamically and adaptively recomputing consecutive blocks of observations likely to share a copy number. This makes routine diagnostic use and re-analysis of legacy data collections feasible; to this end, we also propose an effective automatic prior. An open source software implementation of our method is available at http://schlieplab.org/Software/HaMMLET/ (DOI: 10.5281/zenodo.46262). This paper was selected for oral presentation at RECOMB 2016, and an abstract is published in the conference proceedings. PMID:27177143
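The compression idea above rests on the Haar wavelet transform, which represents blocks of near-identical observations with only a few nonzero coefficients. A minimal orthonormal Haar transform in Python is sketched below; the input signal is a toy block-constant example, not array CGH data.

```python
from math import sqrt

def haar(x):
    """Full orthonormal Haar transform of a length-2^k list.
    Repeatedly replaces the trend with pairwise scaled sums and
    prepends pairwise scaled differences (detail coefficients)."""
    coeffs = []
    trend = list(x)
    while len(trend) > 1:
        pairs = list(zip(trend[0::2], trend[1::2]))
        detail = [(a - b) / sqrt(2) for a, b in pairs]
        trend = [(a + b) / sqrt(2) for a, b in pairs]
        coeffs = detail + coeffs
    # First coefficient is the scaled overall mean, the rest are details.
    return trend + coeffs

# Two constant blocks: only two coefficients survive, the rest are zero.
c = haar([4, 4, 4, 4, 8, 8, 8, 8])
```

That sparsity is what lets consecutive observations sharing a copy number be grouped into blocks, concentrating sampling effort on segments that are genuinely hard to call.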
Medical Inpatient Journey Modeling and Clustering: A Bayesian Hidden Markov Model Based Approach
Huang, Zhengxing; Dong, Wei; Wang, Fei; Duan, Huilong
2015-01-01
Modeling and clustering medical inpatient journeys is useful to healthcare organizations for a number of reasons, including reorganizing inpatient journeys in a form that is more convenient to understand and browse. In this study, we present a probabilistic model-based approach to model and cluster medical inpatient journeys. Specifically, we exploit a Bayesian Hidden Markov Model based approach to transform medical inpatient journeys into a probabilistic space, which can be seen as a richer representation of the inpatient journeys to be clustered. Then, using hierarchical clustering on the matrix of similarities, inpatient journeys can be clustered into different categories with respect to their clinical and temporal characteristics. We evaluated the proposed approach on a real clinical data set pertaining to the unstable angina treatment process. The experimental results reveal that our method can identify and model latent treatment topics underlying personalized inpatient journeys, and yields impressive clustering quality. PMID:26958200
Multi-aspect target discrimination using hidden Markov models and neural networks.
Robinson, Marc; Azimi-Sadjadi, Mahmood R; Salazar, Jaime
2005-03-01
This paper presents a new multi-aspect pattern classification method using hidden Markov models (HMMs). Models are defined for each class, with the probability found by each model determining class membership. Each HMM is enhanced by the use of a multilayer perceptron (MLP) network to generate emission probabilities. This hybrid system uses the MLP to find the probability of a state for an unknown pattern and the HMM to model the process underlying the state transitions. A new batch gradient descent-based method is introduced for optimal estimation of the transition and emission probabilities. A prediction method used in conjunction with the HMM is also presented that attempts to improve the computation of transition probabilities by using the previous states to predict the next state. This method exploits the correlation information between consecutive aspects. These algorithms are then implemented and benchmarked on a multi-aspect underwater target classification problem using a realistic sonar data set collected in different bottom conditions.
Non-intrusive gesture recognition system combining with face detection based on Hidden Markov Model
NASA Astrophysics Data System (ADS)
Jin, Jing; Wang, Yuanqing; Xu, Liujing; Cao, Liqun; Han, Lei; Zhou, Biye; Li, Minggao
2014-11-01
A non-intrusive gesture recognition human-machine interaction system is proposed in this paper. To solve the hand positioning problem, which is a difficulty in current algorithms, face detection is used as a pre-processing step to narrow the search area and find the user's hand quickly and accurately. A hidden Markov model (HMM) is used for gesture recognition: a certain number of basic gesture units are trained as HMM models. At the same time, an improved 8-direction feature vector is proposed and used to quantify characteristics in order to improve the detection accuracy. The proposed system can be applied in interactive devices without special training for users, such as household interactive televisions.
A computationally efficient approach for hidden-Markov model-augmented fingerprint-based positioning
NASA Astrophysics Data System (ADS)
Roth, John; Tummala, Murali; McEachen, John
2016-09-01
This paper presents a computationally efficient approach for mobile subscriber position estimation in wireless networks. A method of data scaling assisted by timing adjust is introduced in fingerprint-based location estimation under a framework which allows for minimising computational cost. The proposed method maintains a level of accuracy comparable to the traditional case where no data scaling is used and is evaluated in a simulated environment under varying channel conditions. The proposed scheme is studied when it is augmented by a hidden-Markov model that matches the internal parameters to the prevailing channel conditions, thus minimising computational cost while maximising accuracy. Furthermore, the timing adjust quantity, available in modern wireless signalling messages, is shown to further reduce computational cost and increase accuracy when available. The results may be seen as a significant step towards integrating advanced position-based modelling with power-sensitive mobile devices.
HMM-DM: identifying differentially methylated regions using a hidden Markov model.
Yu, Xiaoqing; Sun, Shuying
2016-03-01
DNA methylation is an epigenetic modification involved in organism development and cellular differentiation. Identifying differential methylation can help to study genomic regions associated with diseases. Differential methylation studies at single-CG resolution have become possible with bisulfite sequencing (BS) technology. However, there is still a lack of efficient statistical methods for identifying differentially methylated (DM) regions in BS data. We have developed a new approach named HMM-DM to detect DM regions between two biological conditions using BS data. This new approach first uses a hidden Markov model (HMM) to identify DM CG sites, accounting for spatial correlation across CG sites and variation across samples, and then summarizes the identified sites into regions. We demonstrate through a simulation study that our approach has superior performance compared to BSmooth. We also illustrate the application of HMM-DM using a real breast cancer dataset. PMID:26887041
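The two-stage idea, decode per-site states with an HMM and then summarize runs of identical states into regions, can be illustrated with a toy Viterbi decoder. This is not the published HMM-DM code: the three states (hypo-, equally, hyper-methylated), the sticky transition matrix, and the per-site emission scores below are all made-up assumptions.

```python
import numpy as np

def viterbi(log_emis, log_trans, log_init):
    """Most probable state path for given per-site log emission scores."""
    T, K = log_emis.shape
    dp = np.full((T, K), -np.inf)
    ptr = np.zeros((T, K), dtype=int)
    dp[0] = log_init + log_emis[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_trans   # K x K predecessor scores
        ptr[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_emis[t]
    path = [int(dp[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(ptr[t][path[-1]]))
    return path[::-1]

def states_to_regions(states):
    """Collapse a per-site state path into (start, end, state) regions."""
    regions, start = [], 0
    for i in range(1, len(states) + 1):
        if i == len(states) or states[i] != states[start]:
            regions.append((start, i - 1, states[start]))
            start = i
    return regions

# Toy per-site emission probabilities for states (hypo=0, equal=1, hyper=2).
log_emis = np.log(np.array(
    [[.8, .1, .1]] * 3 + [[.1, .8, .1]] * 4 + [[.1, .1, .8]] * 3))
log_trans = np.log(np.array([[.9, .05, .05], [.05, .9, .05], [.05, .05, .9]]))
log_init = np.log(np.full(3, 1 / 3))

path = viterbi(log_emis, log_trans, log_init)
regions = states_to_regions(path)   # three contiguous methylation regions
```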
NASA Astrophysics Data System (ADS)
Jiang, Huiming; Chen, Jin; Dong, Guangming
2016-05-01
Hidden Markov models (HMMs) have been widely applied in bearing performance degradation assessment. As with any machine learning-based model, accuracy depends on the sensitivity of the features used to estimate the degradation performance of bearings. It is a big challenge to extract effective features that are not influenced by qualities or attributes uncorrelated with the bearing degradation condition. In this paper, a bearing performance degradation assessment method based on HMM and nuisance attribute projection (NAP) is proposed. NAP can filter out the effect of nuisance attributes in the feature space through projection. The new feature space projected by NAP is more sensitive to bearing health changes and barely influenced by other interferences occurring in the operating condition. To verify the effectiveness of the proposed method, two different experimental databases are utilized. The results show that the combination of HMM and NAP can effectively improve the accuracy and robustness of the bearing performance degradation assessment system.
Damage evaluation by a guided wave-hidden Markov model based method
NASA Astrophysics Data System (ADS)
Mei, Hanfei; Yuan, Shenfang; Qiu, Lei; Zhang, Jinjin
2016-02-01
Guided wave based structural health monitoring has shown great potential in aerospace applications. However, one of the key challenges of practical engineering applications is the accurate interpretation of guided wave signals under time-varying environmental and operational conditions. This paper presents a guided wave-hidden Markov model based method to improve the damage evaluation reliability of real aircraft structures under time-varying conditions. In the proposed approach, an HMM-based unweighted moving average trend estimation method, which can capture the trend of damage propagation from the posterior probability obtained by HMM modeling, is used to achieve a probabilistic evaluation of the structural damage. To validate the developed method, experiments are performed on a hole-edge crack specimen under fatigue loading conditions and on a real aircraft wing spar under changing structural boundary conditions. Experimental results show the advantage of the proposed method.
Hidden Markov models and neural networks for fault detection in dynamic systems
NASA Technical Reports Server (NTRS)
Smyth, Padhraic
1994-01-01
As shown in this viewgraph presentation, neural networks combined with hidden Markov models (HMMs) can provide excellent detection and false-alarm-rate performance in fault detection applications, and modified models allow for novelty detection. Key contributions of neural network models are: (1) excellent nonparametric discrimination capability; (2) good estimation of posterior state probabilities, even in high dimensions, allowing them to be embedded within an overall probabilistic model (HMM); and (3) simple implementation compared to other nonparametric models. The neural network/HMM monitoring model is currently being integrated with the new Deep Space Network (DSN) antenna controller software and will be monitoring a new DSN 34-m antenna (DSS-24) on-line by July 1994.
Switched Fault Diagnosis Approach for Industrial Processes based on Hidden Markov Model
NASA Astrophysics Data System (ADS)
Wang, Lin; Yang, Chunjie; Sun, Youxian; Pan, Yijun; An, Ruqiao
2015-11-01
Traditional fault diagnosis methods based on hidden Markov model (HMM) use a unified method for feature extraction, such as principal component analysis (PCA), kernel principal component analysis (KPCA) and independent component analysis (ICA). However, every method has its own limitations. For example, PCA cannot extract nonlinear relationships among process variables. So it is inappropriate to extract all features of variables by only one method, especially when data characteristics are very complex. This article proposes a switched feature extraction procedure using PCA and KPCA based on nonlinearity measure. By the proposed method, we are able to choose the most suitable feature extraction method, which could improve the accuracy of fault diagnosis. A simulation from the Tennessee Eastman (TE) process demonstrates that the proposed approach is superior to the traditional one based on HMM and could achieve more accurate classification of various process faults.
Grinding Wheel Condition Monitoring with Hidden Markov Model-Based Clustering Methods
Liao, T. W.; Hua, G; Qu, Jun; Blau, Peter Julian
2006-01-01
The hidden Markov model (HMM) is well known for sequence modeling and has been used for condition monitoring. However, HMM-based clustering methods have been developed only recently. This article proposes an HMM-based clustering method for monitoring the condition of grinding wheels used in grinding operations. The proposed method first extracts features from signals based on discrete wavelet decomposition, using a moving-window approach. It then generates a distance (dissimilarity) matrix using HMMs. Based on this distance matrix, several hierarchical and partitioning-based clustering algorithms are applied to obtain clustering results. The proposed methodology was tested with feature sequences extracted from acoustic emission signals. The results show that clustering accuracy depends upon cutting condition: a higher material removal rate seems to produce more discriminatory signals/features than a lower one. The effects of window size, wavelet decomposition level, wavelet basis, clustering algorithm, and data normalization were also studied.
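The model-based dissimilarity step can be sketched as follows. As a hedged simplification, each sequence's HMM is replaced here by a first-order Markov chain over quantised symbols, fit by transition counting; the symmetrised per-transition log-likelihood differences then form the distance matrix that a hierarchical clustering routine (e.g. SciPy's `linkage`) would consume. Everything below is illustrative, not the article's implementation.

```python
import numpy as np

def fit_chain(seq, K, alpha=1.0):
    """Markov-chain stand-in for a per-sequence HMM (Laplace-smoothed)."""
    counts = np.full((K, K), alpha)
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def loglik(trans, seq):
    return sum(np.log(trans[a, b]) for a, b in zip(seq[:-1], seq[1:]))

def distance_matrix(seqs, K):
    models = [fit_chain(s, K) for s in seqs]
    n = len(seqs)
    # L[i, j]: per-transition log-likelihood of sequence j under model i.
    L = np.array([[loglik(m, s) / (len(s) - 1) for s in seqs] for m in models])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # How much worse each sequence fits the other's model:
            # symmetric and zero on the diagonal by construction.
            D[i, j] = 0.5 * ((L[i, i] - L[j, i]) + (L[j, j] - L[i, j]))
    return D

# Two "sticky" symbol sequences and two "alternating" ones.
seqs = [[0] * 10 + [1] * 10, [1] * 10 + [0] * 10, [0, 1] * 10, [1, 0] * 10]
D = distance_matrix(seqs, K=2)
```

With a real signal, the symbol sequences would come from quantised wavelet features per window, and `D` would be handed to hierarchical or partitioning-based clustering as in the article.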
Hidden Markov model approach to skill learning and its application to telerobotics
Yang, J. (Robotics Inst.; Univ. of Akron, OH, Dept. of Electrical Engineering); Xu, Y. (Robotics Inst.); Chen, C. S. (Dept. of Electrical Engineering)
1994-10-01
In this paper, the authors discuss how human skill can be represented as a parametric model using a hidden Markov model (HMM), and how an HMM-based skill model can be used to learn human skill. An HMM is well suited to characterizing the doubly stochastic process (measurable actions and immeasurable mental states) involved in skill learning. The authors formulated the learning problem as a multidimensional HMM and developed a testbed for a variety of skill learning applications. Based on "the most likely performance" criterion, the best action sequence can be selected from all previously measured action data by modeling the skill as an HMM. The proposed method has been implemented in the teleoperation control of a space station robot system, and some important implementation issues are discussed. The method allows a robot to learn human skill in certain tasks and to improve its motion performance.
Global-constrained hidden Markov model applied on wireless capsule endoscopy video segmentation
NASA Astrophysics Data System (ADS)
Wan, Yiwen; Duraisamy, Prakash; Alam, Mohammad S.; Buckles, Bill
2012-06-01
Accurate analysis of wireless capsule endoscopy (WCE) videos is vital but tedious. Automatic image analysis can expedite this task. Video segmentation of WCE into the four parts of the gastrointestinal tract is one way to assist a physician. The segmentation approach described in this paper integrates pattern recognition with statistical analysis. Initially, a support vector machine is applied to classify video frames into four classes, using a combination of multiple color and texture features as the feature vector. A Poisson cumulative distribution, whose parameter depends on the length of segments, models prior knowledge. This a priori knowledge, together with inter-frame differences, serves as the global constraint driven by the underlying observation of each WCE video, which is fitted by a Gaussian distribution to constrain the transition probabilities of the hidden Markov model. Experimental results demonstrate the effectiveness of the approach.
Fast Bayesian Inference of Copy Number Variants using Hidden Markov Models with Wavelet Compression
Wiedenhoeft, John; Brugel, Eric; Schliep, Alexander
2016-01-01
By integrating Haar wavelets with Hidden Markov Models, we achieve drastically reduced running times for Bayesian inference using Forward-Backward Gibbs sampling. We show that this improves detection of genomic copy number variants (CNV) in array CGH experiments compared to the state-of-the-art, including standard Gibbs sampling. The method concentrates computational effort on chromosomal segments which are difficult to call, by dynamically and adaptively recomputing consecutive blocks of observations likely to share a copy number. This makes routine diagnostic use and re-analysis of legacy data collections feasible; to this end, we also propose an effective automatic prior. An open source software implementation of our method is available at http://schlieplab.org/Software/HaMMLET/ (DOI: 10.5281/zenodo.46262). This paper was selected for oral presentation at RECOMB 2016, and an abstract is published in the conference proceedings. PMID:27177143
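The wavelet-compression intuition, that thresholding small Haar detail coefficients leaves a blocky signal whose long constant runs can be treated as single units, can be sketched independently of the HaMMLET implementation. The signal, threshold, and random seed below are illustrative assumptions, not anything from the paper.

```python
import numpy as np

def haar(x):
    """Full Haar wavelet decomposition of a length-2^k signal."""
    x = x.astype(float).copy()
    coeffs = []
    while len(x) > 1:
        avg = (x[0::2] + x[1::2]) / np.sqrt(2)
        det = (x[0::2] - x[1::2]) / np.sqrt(2)
        coeffs.append(det)
        x = avg
    coeffs.append(x)                    # final approximation coefficient
    return coeffs

def ihaar(coeffs):
    """Inverse of haar(): reconstruct the signal from its coefficients."""
    x = coeffs[-1]
    for det in reversed(coeffs[:-1]):
        out = np.empty(2 * len(x))
        out[0::2] = (x + det) / np.sqrt(2)
        out[1::2] = (x - det) / np.sqrt(2)
        x = out
    return x

rng = np.random.default_rng(1)
# Noisy piecewise-constant signal: one copy-number-like jump at index 32.
signal = np.concatenate([np.zeros(32), np.full(32, 4.0)])
signal += 0.1 * rng.normal(size=64)

coeffs = haar(signal)
# Zero out small (noise-level) detail coefficients; keep the approximation.
thresh = [np.where(np.abs(c) > 1.0, c, 0.0) for c in coeffs[:-1]] + [coeffs[-1]]
denoised = ihaar(thresh)
# The reconstruction collapses to a few constant blocks.
blocks = len(np.unique(np.round(denoised, 6)))
```

The HMM then only needs to treat each surviving block as a single (weighted) observation, which is where the running-time reduction comes from.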
Identifying bubble collapse in a hydrothermal system using hidden Markov models
NASA Astrophysics Data System (ADS)
Dawson, Phillip B.; Benítez, M. C.; Lowenstern, Jacob B.; Chouet, Bernard A.
2012-01-01
Beginning in July 2003 and lasting through September 2003, the Norris Geyser Basin in Yellowstone National Park exhibited an unusual increase in ground temperature and hydrothermal activity. Using hidden Markov model theory, we identify over five million high-frequency (>15 Hz) seismic events observed at a temporary seismic station deployed in the basin in response to the increase in hydrothermal activity. The source of these seismic events is constrained to within ˜100 m of the station, and produced ˜3500-5500 events per hour with mean durations of ˜0.35-0.45 s. The seismic event rate, air temperature, hydrologic temperatures, and surficial water flow of the geyser basin exhibited a marked diurnal pattern that was closely associated with solar thermal radiance. We interpret the source of the seismicity to be due to the collapse of small steam bubbles in the hydrothermal system, with the rate of collapse being controlled by surficial temperatures and daytime evaporation rates.
Implementing EM and Viterbi algorithms for Hidden Markov Model in linear memory
Churbanov, Alexander; Winters-Hilt, Stephen
2008-01-01
Background: The Baum-Welch learning procedure for Hidden Markov Models (HMMs) provides a powerful tool for tailoring HMM topologies to data for use in knowledge discovery and clustering. A linear-memory procedure recently proposed by Miklós, I. and Meyer, I.M. describes a memory-sparse version of the Baum-Welch algorithm with modifications to the original probabilistic table topologies to make memory use independent of sequence length (and linearly dependent on state number). The original description of the technique has some errors that we amend. We then compare the corrected implementation on a variety of data sets with conventional and checkpointing implementations. Results: We provide a correct recurrence relation for the emission parameter estimate and extend it to parameter estimates of the Normal distribution. To accelerate estimation of the prior state probabilities, and to decrease memory use, we reverse the originally proposed forward sweep. We describe the different scaling strategies necessary in all real implementations of the algorithm to prevent underflow. We also describe our approach to a linear-memory implementation of the Viterbi decoding algorithm (with linearity in the sequence length, while memory use is approximately independent of state number). We demonstrate the use of the linear-memory implementation on an extended Duration Hidden Markov Model (DHMM) and on an HMM with a spike detection topology. Comparing the various implementations of the Baum-Welch procedure, we find that the checkpointing algorithm produces the best overall tradeoff between memory use and speed. In cases where sequence length is very large (for Baum-Welch) or state number is very large (for Viterbi), the linear-memory methods outlined may offer some utility. Conclusion: Our performance-optimized Java implementations of the Baum-Welch algorithm are available at . The described method and implementations will aid sequence alignment, gene structure prediction, HMM
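One of the scaling strategies the abstract alludes to, renormalising the forward variables at every step and accumulating the logs of the scale factors, can be sketched for a discrete HMM. The model parameters below are made-up; the point is that the naive forward pass underflows to zero on a long sequence, while the scaled version returns the same quantity as a finite log-likelihood.

```python
import numpy as np

def forward_naive(obs, init, trans, emis):
    """Unscaled forward pass: returns P(obs), which underflows for long obs."""
    alpha = init * emis[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emis[:, o]
    return alpha.sum()

def forward_scaled(obs, init, trans, emis):
    """Scaled forward pass: returns log P(obs) without underflow."""
    alpha = init * emis[:, obs[0]]
    c = alpha.sum()
    alpha /= c
    logp = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ trans) * emis[:, o]
        c = alpha.sum()
        alpha /= c                       # scale factor keeps alpha well-conditioned
        logp += np.log(c)                # accumulate log of scale factors
    return logp

init = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3], [0.4, 0.6]])
emis = np.array([[0.9, 0.1], [0.2, 0.8]])   # rows: states, cols: symbols

short = [0, 1, 0]
long_obs = [0, 1] * 5000                 # 10,000 symbols
lp_short = forward_scaled(short, init, trans, emis)
lp_long = forward_scaled(long_obs, init, trans, emis)
```

On the short sequence the two routines agree (`exp(lp_short)` equals the naive probability); on the long one only the scaled version survives.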
Modeling carbachol-induced hippocampal network synchronization using hidden Markov models
NASA Astrophysics Data System (ADS)
Dragomir, Andrei; Akay, Yasemin M.; Akay, Metin
2010-10-01
In this work we studied the neural state transitions undergone by the hippocampal neural network using a hidden Markov model (HMM) framework. We first employed a measure based on the Lempel-Ziv (LZ) estimator to characterize the changes in the hippocampal oscillation patterns in terms of their complexity. These oscillations correspond to different modes of hippocampal network synchronization induced by the cholinergic agonist carbachol in the CA1 region of mouse hippocampus. HMMs are then used to model the dynamics of the LZ-derived complexity signals as first-order Markov chains. Consequently, the signals corresponding to our oscillation recordings can be segmented into a sequence of statistically discriminated hidden states. The segmentation is used for detecting transitions in neural synchronization modes in data recorded from wild-type and triple transgenic (3xTG) mouse models of Alzheimer's disease (AD). Our data suggest that the transition from the low-frequency (delta range) continuous oscillation mode into high-frequency (theta range) oscillation, exhibiting repeated burst-type patterns, always occurs through a mode resembling a mixture of the two patterns, continuous with burst. The relatively random patterns of oscillation during this mode may reflect the fact that the neuronal network undergoes re-organization. Further insight into the time durations of these modes (retrieved via the HMM segmentation of the LZ-derived signals) reveals that the mixed mode lasts significantly longer (p < 10^-4) in 3xTG AD mice. These findings, coupled with the documented cholinergic neurotransmission deficits in the 3xTG mouse model, may be highly relevant for the case of AD.
Doan, Tan N; Kong, David C M; Marshall, Caroline; Kirkpatrick, Carl M J; McBryde, Emma S
2015-01-01
Little is known about the transmission dynamics of Acinetobacter baumannii in hospitals, despite such information being critical for designing effective infection control measures. In the absence of comprehensive epidemiological data, mathematical modelling is an attractive approach to understanding the transmission process. The statistical challenge in estimating transmission parameters from infection data arises from the fact that most patients are colonised asymptomatically, and therefore the transmission process is not fully observed. Hidden Markov models (HMMs) can overcome this problem. We developed a continuous-time structured HMM to characterise the transmission dynamics, and to quantify the relative importance of different acquisition sources of A. baumannii, in intensive care units (ICUs) in three hospitals in Melbourne, Australia. The hidden states were the total number of patients colonised with A. baumannii (both detected and undetected). The model input was monthly incidence data of the number of detected colonised patients (observations). A Bayesian framework with a Markov chain Monte Carlo algorithm was used for parameter estimation. We estimated that 96-98% of acquisition in Hospitals 1 and 3 was due to cross-transmission between patients, whereas most colonisation in Hospital 2 was due to other sources (sporadic acquisition). On average, it takes 20 and 31 days for each susceptible individual in Hospital 1 and Hospital 3, respectively, to become colonised as a result of cross-transmission, whereas it takes 17 days to observe one new colonisation from sporadic acquisition in Hospital 2. The basic reproduction ratio (R0) for Hospitals 1, 2 and 3 was 1.5, 0.02 and 1.6, respectively. Our study is the first to characterise the transmission dynamics of A. baumannii using mathematical modelling. We showed that HMMs can be applied to sparse hospital infection data to estimate transmission parameters despite unobserved events and imperfect detection of the organism.
Fuzzy hidden Markov chains segmentation for volume determination and quantitation in PET
NASA Astrophysics Data System (ADS)
Hatt, M.; Lamare, F.; Boussion, N.; Turzo, A.; Collet, C.; Salzenstein, F.; Roux, C.; Jarritt, P.; Carson, K.; Cheze-LeRest, C.; Visvikis, D.
2007-07-01
Accurate volume of interest (VOI) estimation in PET is crucial in different oncology applications, such as response-to-therapy evaluation and radiotherapy treatment planning. The objective of our study was to compare the performance of the proposed algorithm for automatic lesion volume delineation, namely fuzzy hidden Markov chains (FHMC), with that of the threshold-based techniques that are the current state of the art in clinical practice. Like the classical hidden Markov chain (HMC) algorithm, FHMC takes into account noise, voxel intensity and spatial correlation in order to classify a voxel as background or functional VOI. However, the novelty of the fuzzy model consists of the inclusion of an estimation of imprecision, which should subsequently lead to a better modelling of the 'fuzzy' nature of the object-of-interest boundaries in emission tomography data. The performance of the algorithms has been assessed on both simulated and acquired datasets of the IEC phantom, covering a large range of spherical lesion sizes (from 10 to 37 mm), contrast ratios (4:1 and 8:1) and image noise levels. Both lesion activity recovery and VOI determination tasks were assessed in reconstructed images using two different voxel sizes (8 mm3 and 64 mm3). In order to account for both the functional volume location and its size, the concept of % classification errors was introduced in the evaluation of volume segmentation using the simulated datasets. Results reveal that FHMC performs substantially better than the threshold-based methodology for functional volume determination or activity concentration recovery, considering a contrast ratio of 4:1 and lesion sizes of <28 mm. Furthermore, differences between the classification and volume estimation errors evaluated were smaller for the segmented volumes provided by the FHMC algorithm. Finally, the performance of the automatic algorithms was less susceptible to image noise levels in comparison to the threshold-based techniques. The analysis of both
NASA Astrophysics Data System (ADS)
Liu, Qinming; Dong, Ming; Lv, Wenyuan; Geng, Xiuli; Li, Yupeng
2015-12-01
Health prognosis for equipment is considered a key process of the condition-based maintenance strategy. This paper presents an integrated framework for multi-sensor equipment diagnosis and prognosis based on an adaptive hidden semi-Markov model (AHSMM). Unlike in a hidden semi-Markov model (HSMM), the basic algorithms in an AHSMM are first modified in order to decrease computational and space complexity. Then, the maximum likelihood linear regression transformation method is used to train the output and duration distributions to re-estimate all unknown parameters. The AHSMM is used to identify the hidden degradation state and to obtain the transition probabilities among health states and durations. Finally, through the proposed hazard rate equations, one can predict the remaining useful life of equipment with multi-sensor information. Our main results are verified in real-world applications: monitoring hydraulic pumps from Caterpillar Inc. The results show that the proposed methods are more effective for health prognosis of multi-sensor monitored equipment.
Local Autoencoding for Parameter Estimation in a Hidden Potts-Markov Random Field.
Song, Sanming; Si, Bailu; Herrmann, J Michael; Feng, Xisheng
2016-05-01
A local-autoencoding (LAE) method is proposed for the parameter estimation in a Hidden Potts-Markov random field model. Due to sampling cost, Markov chain Monte Carlo methods are rarely used in real-time applications. Like other heuristic methods, LAE is based on a conditional independence assumption. It adapts, however, the parameters in a block-by-block style with a simple Hebbian learning rule. Experiments with given label fields show that the LAE is able to converge in far less time than required for a scan. It is also possible to derive an estimate for LAE based on a Cramer–Rao bound that is similar to the classical maximum pseudolikelihood method. As a general algorithm, LAE can be used to estimate the parameters in anisotropic label fields. Furthermore, LAE is not limited to the classical Potts model and can be applied to other types of Potts models by simple label field transformations and straightforward learning rule extensions. Experimental results on image segmentations demonstrate the efficiency and generality of the LAE algorithm. PMID:27019491
Estimating parameters of hidden Markov models based on marked individuals: use of robust design data
Kendall, William L.; White, Gary C.; Hines, James E.; Langtimm, Catherine A.; Yoshizaki, Jun
2012-01-01
Development and use of multistate mark-recapture models, which provide estimates of parameters of Markov processes in the face of imperfect detection, have become common over the last twenty years. Recently, estimating parameters of hidden Markov models, where the state of an individual can be uncertain even when it is detected, has received attention. Previous work has shown that ignoring state uncertainty biases estimates of survival and state transition probabilities, thereby reducing the power to detect effects. Efforts to adjust for state uncertainty have included special cases and a general framework for a single sample per period of interest. We provide a flexible framework for adjusting for state uncertainty in multistate models, while utilizing multiple sampling occasions per period of interest to increase precision and remove parameter redundancy. These models also produce direct estimates of state structure for each primary period, even for the case where there is just one sampling occasion. We apply our model to expected value data, and to data from a study of Florida manatees, to provide examples of the improvement in precision due to secondary capture occasions. We also provide user-friendly software to implement these models. This general framework could also be used by practitioners to consider constrained models of particular interest, or model the relationship between within-primary period parameters (e.g., state structure) and between-primary period parameters (e.g., state transition probabilities).
Syed, Sheyum; Müllner, Fiona E.; Selvin, Paul R.; Sigworth, Fred J.
2010-01-01
Unbiased interpretation of noisy single molecular motor recordings remains a challenging task. To address this issue, we have developed robust algorithms based on hidden Markov models (HMMs) of motor proteins. The basic algorithm, called variable-stepsize HMM (VS-HMM), was introduced in the previous article. It improves on currently available Markov-model based techniques by allowing for arbitrary distributions of step sizes, and shows excellent convergence properties for the characterization of staircase motor timecourses in the presence of large measurement noise. In this article, we extend the VS-HMM framework for better performance with experimental data. The extended algorithm, variable-stepsize integrating-detector HMM (VSI-HMM) better models the data-acquisition process, and accounts for random baseline drifts. Further, as an extension, maximum a posteriori estimation is provided. When used as a blind step detector, the VSI-HMM outperforms conventional step detectors. The fidelity of the VSI-HMM is tested with simulations and is applied to in vitro myosin V data where a small 10 nm population of steps is identified. It is also applied to an in vivo recording of melanosome motion, where strong evidence is found for repeated, bidirectional steps smaller than 8 nm in size, implying that multiple motors simultaneously carry the cargo. PMID:21112294
Kendall, William L; White, Gary C; Hines, James E; Langtimm, Catherine A; Yoshizaki, Jun
2012-04-01
Development and use of multistate mark-recapture models, which provide estimates of parameters of Markov processes in the face of imperfect detection, have become common over the last 20 years. Recently, estimating parameters of hidden Markov models, where the state of an individual can be uncertain even when it is detected, has received attention. Previous work has shown that ignoring state uncertainty biases estimates of survival and state transition probabilities, thereby reducing the power to detect effects. Efforts to adjust for state uncertainty have included special cases and a general framework for a single sample per period of interest. We provide a flexible framework for adjusting for state uncertainty in multistate models, while utilizing multiple sampling occasions per period of interest to increase precision and remove parameter redundancy. These models also produce direct estimates of state structure for each primary period, even for the case where there is just one sampling occasion. We apply our model to expected-value data, and to data from a study of Florida manatees, to provide examples of the improvement in precision due to secondary capture occasions. We have also implemented these models in program MARK. This general framework could also be used by practitioners to consider constrained models of particular interest, or to model the relationship between within-primary-period parameters (e.g., state structure) and between-primary-period parameters (e.g., state transition probabilities). PMID:22690641
Hidden Markov induced Dynamic Bayesian Network for recovering time evolving gene regulatory networks
NASA Astrophysics Data System (ADS)
Zhu, Shijia; Wang, Yadong
2015-12-01
Dynamic Bayesian Networks (DBNs) have been widely used to recover gene regulatory relationships from time-series data in computational systems biology. Their standard assumption is 'stationarity', and therefore several research efforts have recently been proposed to relax this restriction. However, those methods suffer from three challenges: long running time, low accuracy and reliance on parameter settings. To address these problems, we propose a novel non-stationary DBN model by extending each hidden node of a Hidden Markov Model into a DBN (called HMDBN), which properly handles the underlying time-evolving networks. Correspondingly, an improved structural EM algorithm is proposed to learn the HMDBN. It dramatically reduces the search space, thereby substantially improving computational efficiency. Additionally, we derived a novel generalized Bayesian Information Criterion under the non-stationary assumption (called BWBIC), which can help significantly improve reconstruction accuracy and largely reduce over-fitting. Moreover, the re-estimation formulas for all parameters of our model are derived, enabling us to avoid reliance on parameter settings. Compared to the state-of-the-art methods, the experimental evaluation of our proposed method on both synthetic and real biological data demonstrates consistently high prediction accuracy and significantly improved computational efficiency, even with no prior knowledge or parameter settings.
Detecting Gait Phases from RGB-D Images Based on Hidden Markov Model
Heravi, Hamed; Ebrahimi, Afshin; Olyaee, Ehsan
2016-01-01
Gait contains important information about the status of the human body and physiological signs. In many medical applications, it is important to monitor and accurately analyze a patient's gait. Since walking exhibits reproducible patterns across several phases, separating these phases is useful for gait analysis. In this study, a method based on image processing for extracting the phases of human gait from RGB-Depth images is presented. The sequence of depth images from the front view is processed to extract the lower-body depth profile and distance features. The feature vector extracted from each image serves as the observation vector of a hidden Markov model, and the phases of gait are treated as the hidden states of the model. After training the model on randomly selected sample images, the gait phases can be estimated using the model. The results confirm the expected 60-40% split between the two major gait phases, and the mid-stance phase is recognized with 85% precision. PMID:27563572
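The decoding step this abstract relies on, recovering gait phases (hidden states) from per-frame feature observations, is typically done with the Viterbi algorithm. The sketch below is a minimal, self-contained illustration; the two phase states, the "low"/"high" observation alphabet, and all probabilities are invented for the example and are not taken from the paper.

```python
import math

def viterbi(obs, states, log_start, log_trans, log_emit):
    """Most likely hidden-state sequence for a discrete-emission HMM."""
    # V[t][s] = best log-probability of any state path ending in s at time t
    V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        prev, col, ptr = V[-1], {}, {}
        for s in states:
            p, q = max((prev[q] + log_trans[q][s], q) for q in states)
            col[s], ptr[s] = p + log_emit[s][o], q
        V.append(col)
        back.append(ptr)
    # trace back from the best final state
    path = [max(V[-1], key=V[-1].get)]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# Hypothetical two-phase gait model: "stance" tends to emit a low depth
# feature, "swing" a high one (all numbers invented for illustration).
states = ["stance", "swing"]
log_start = {"stance": math.log(0.6), "swing": math.log(0.4)}
log_trans = {"stance": {"stance": math.log(0.7), "swing": math.log(0.3)},
             "swing":  {"stance": math.log(0.3), "swing": math.log(0.7)}}
log_emit = {"stance": {"low": math.log(0.9), "high": math.log(0.1)},
            "swing":  {"low": math.log(0.1), "high": math.log(0.9)}}
print(viterbi(["low", "low", "high", "high"],
              states, log_start, log_trans, log_emit))
# -> ['stance', 'stance', 'swing', 'swing']
```

Working in log space avoids numerical underflow on long observation sequences.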
A Hidden Markov Model for Urban-Scale Traffic Estimation Using Floating Car Data.
Wang, Xiaomeng; Peng, Ling; Chi, Tianhe; Li, Mengzhu; Yao, Xiaojing; Shao, Jing
2015-01-01
Urban-scale traffic monitoring plays a vital role in reducing traffic congestion. Owing to its low cost and wide coverage, floating car data (FCD) serves as a novel approach to collecting traffic data. However, on arterial roads in most urban environments the available probe data are predominantly sparse. To overcome this data sparseness, this paper proposes a hidden Markov model (HMM)-based traffic estimation model, in which the traffic condition on a road segment is treated as a hidden state that can be estimated from the conditions of road segments with similar traffic characteristics. An algorithm based on clustering and pattern mining, rather than on adjacency relationships, is proposed to find clusters of road segments with similar traffic characteristics. A multi-clustering strategy is adopted to achieve a trade-off between clustering accuracy and coverage. Finally, the proposed model is designed and implemented on the basis of a real-time algorithm. Results of experiments based on real FCD confirm the applicability, accuracy, and efficiency of the model. In addition, the results indicate that the model is practicable for traffic estimation on urban arterials and works well even when more than 70% of the probe data are missing. PMID:26710073
Michalopoulos, Kostas; Zervakis, Michalis; Deiber, Marie-Pierre; Bourbakis, Nikolaos
2016-09-01
We present a novel synergistic methodology for the spatio-temporal analysis of single Electroencephalogram (EEG) trials. The methodology combines the Local Global Graph (LG graph), which characterizes the structural features of the EEG topography and serves as a global descriptor for robust comparison of dominant topographies (microstates), with Hidden Markov Models (HMMs), which model the topographic sequence in a unique way. In particular, the LG graph descriptor defines similarity and distance measures that can be successfully used for the difficult comparison of the extracted LG graphs in the presence of noise. In addition, hidden states represent periods of stationary distribution of topographies that constitute the equivalent of the microstates in the model. The transitions between the different microstates and the resulting syntactic patterns can reveal differences in the processing of the input stimulus between different pathologies. We train the HMM to learn the transitions between the different microstates and to express the syntactic patterns that appear in the single trials in a compact and efficient way. We applied this methodology to single trials from normal subjects and patients with Progressive Mild Cognitive Impairment (PMCI) to discriminate between these two groups. The classification results show that this approach is capable of efficiently discriminating between control and Progressive MCI single trials. Results indicate that HMMs provide physiologically meaningful results that can be used in the syntactic analysis of Event Related Potentials. PMID:27255799
NASA Astrophysics Data System (ADS)
Andriyas, S.; McKee, M.
2014-12-01
Anticipating farmers' irrigation decisions can improve the efficiency of canal operations in on-demand irrigation systems. Although multiple factors are considered during irrigation decision-making, for any given farmer one factor might play a major role. Identifying the biophysical factor that led a farmer to decide to irrigate is difficult because those factors vary greatly during the growing season. Analyzing the irrigation decisions of a group of farmers for a single crop helps to simplify the problem. We developed a hidden Markov model (HMM) to analyze irrigation decisions and to explore the factor, and the level of that factor, at which the majority of farmers decide to irrigate. The model requires observed variables as inputs along with the hidden states. The chosen model inputs were relatively easily measured or estimated biophysical data, including such factors (i.e., variables believed to affect irrigation decision-making) as cumulative evapotranspiration, soil moisture depletion, soil stress coefficient, and canal flows. Irrigation decision series were the hidden states of the model. The data come from the Canal B region of the Lower Sevier River Basin, near Delta, Utah. The main crops of the region are alfalfa, barley, and corn. A portion of the data was used to build and test the model's capability to identify the factor, and the level of that factor, at which a farmer decides to irrigate for future irrigation events. Both group and individual behavior can be studied using HMMs. The study showed that the farmers cannot be classified into fixed classes based on their irrigation decisions, but vary in their behavior from irrigation to irrigation across all years and crops. HMMs can be used to analyze what factor and, subsequently, what level of that factor the farmer most likely based the irrigation decision on. The study shows that the HMM is a capable tool to study a process
Neuwald, Andrew F; Liu, Jun S
2004-01-01
Background Certain protein families are highly conserved across distantly related organisms and belong to large and functionally diverse superfamilies. The patterns of conservation present in these protein sequences presumably are due to selective constraints maintaining important but unknown structural mechanisms with some constraints specific to each family and others shared by a larger subset or by the entire superfamily. To exploit these patterns as a source of functional information, we recently devised a statistically based approach called contrast hierarchical alignment and interaction network (CHAIN) analysis, which infers the strengths of various categories of selective constraints from co-conserved patterns in a multiple alignment. The power of this approach strongly depends on the quality of the multiple alignments, which thus motivated development of theoretical concepts and strategies to improve alignment of conserved motifs within large sets of distantly related sequences. Results Here we describe a hidden Markov model (HMM), an algebraic system, and Markov chain Monte Carlo (MCMC) sampling strategies for alignment of multiple sequence motifs. The MCMC sampling strategies are useful both for alignment optimization and for adjusting position specific background amino acid frequencies for alignment uncertainties. Associated statistical formulations provide an objective measure of alignment quality as well as automatic gap penalty optimization. Improved alignments obtained in this way are compared with PSI-BLAST based alignments within the context of CHAIN analysis of three protein families: Giα subunits, prolyl oligopeptidases, and transitional endoplasmic reticulum (p97) AAA+ ATPases. Conclusion While not entirely replacing PSI-BLAST based alignments, which likewise may be optimized for CHAIN analysis using this approach, these motif-based methods often more accurately align very distantly related sequences and thus can provide a better measure of
Fuzzy hidden Markov chains segmentation for volume determination and quantitation in PET
Hatt, Mathieu; Lamare, Frédéric; Boussion, Nicolas; Roux, Christian; Turzo, Alexandre; Cheze-Lerest, Catherine; Jarritt, Peter; Carson, Kathryn; Salzenstein, Fabien; Collet, Christophe; Visvikis, Dimitris
2007-01-01
Accurate volume of interest (VOI) estimation in PET is crucial in different oncology applications such as response-to-therapy evaluation and radiotherapy treatment planning. The objective of our study was to compare the performance of the proposed algorithm for automatic lesion volume delineation, namely Fuzzy Hidden Markov Chains (FHMC), with that of the threshold-based techniques that are the current state of the art in clinical practice. Like the classical Hidden Markov Chain (HMC) algorithm, FHMC takes into account noise, voxel intensity and spatial correlation in order to classify a voxel as background or functional VOI. The novelty of the fuzzy model, however, is the inclusion of an estimation of imprecision, which should lead to better modelling of the "fuzzy" nature of the boundaries of the object of interest in emission tomography data. The performance of the algorithms was assessed on both simulated and acquired datasets of the IEC phantom, covering a large range of spherical lesion sizes (from 10 to 37 mm), contrast ratios (4:1 and 8:1) and image noise levels. Both lesion activity recovery and VOI determination tasks were assessed in reconstructed images using two different voxel sizes (8 mm³ and 64 mm³). In order to account for both the functional volume location and its size, the concept of % classification errors was introduced in the evaluation of volume segmentation using the simulated datasets. Results reveal that FHMC performs substantially better than the threshold-based methodology for functional volume determination or activity concentration recovery at a contrast ratio of 4:1 and lesion sizes of <28 mm. Furthermore, differences between classification and volume estimation errors were smaller for the segmented volumes provided by the FHMC algorithm. Finally, the performance of the automatic algorithms was less susceptible to image noise levels than that of the threshold-based techniques. The analysis of both
On the use of hidden Markov models for gaze pattern modeling
NASA Astrophysics Data System (ADS)
Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph
2016-05-01
Some of the conventional metrics derived from gaze patterns (on computer screens) to study visual attention, engagement and fatigue are saccade counts, the nearest neighbor index (NNI) and the duration of dwells/fixations. Each of these metrics has drawbacks in modeling the behavior of gaze patterns; one such drawback stems from the fact that some portions of the screen are less important than others. This is addressed by computing eye-gaze metrics for important areas of interest (AOI) on the screen. There are challenges in developing accurate AOI-based metrics: firstly, the definition of an AOI is always fuzzy; secondly, the AOI may change adaptively over time. Hence, there is a need for eye-gaze metrics that are aware of the AOI in the field of view and, at the same time, able to automatically select the AOI based on the nature of the gazes. In this paper, we propose a novel way of computing the NNI based on continuous hidden Markov models (HMM) that model the gazes as 2D Gaussian observations (x-y coordinates of the gaze), with the mean at the center of the AOI and a covariance related to the concentration of gazes. The proposed modeling allows us to accurately compute the NNI metric in the presence of multiple, undefined AOI on the screen, even with intermittent casual gazing, which is modeled as random gazes on the screen.
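A minimal sketch of the emission side of such a model: each AOI is represented as an isotropic 2D Gaussian, and each gaze point is assigned to the AOI with the highest log-density. The AOI names, centres, spreads, and coordinates below are invented for illustration; the paper's full HMM would additionally carry a transition matrix and a random-gaze state for casual gazing.

```python
import math

def gauss2d_logpdf(x, y, mx, my, sigma):
    """Log-density of an isotropic 2D Gaussian centred at (mx, my)."""
    return (-math.log(2 * math.pi * sigma ** 2)
            - ((x - mx) ** 2 + (y - my) ** 2) / (2 * sigma ** 2))

def assign_gazes(gazes, aois):
    """Label each gaze point with the AOI whose Gaussian scores it highest."""
    return [max(aois, key=lambda a: gauss2d_logpdf(x, y, *aois[a]))
            for x, y in gazes]

# Hypothetical AOIs as (centre_x, centre_y, sigma), in screen pixels.
aois = {"left_panel": (100.0, 200.0, 30.0),
        "right_panel": (500.0, 200.0, 30.0)}
print(assign_gazes([(110, 190), (490, 210), (105, 205)], aois))
# -> ['left_panel', 'left_panel' ... actually see test below
```

With maximum-likelihood assignment in place, AOI-aware metrics such as the NNI can be computed per cluster of gazes rather than over the whole screen.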
Sun, Jun; Palade, Vasile; Wu, Xiaojun; Fang, Wei
2014-01-01
Hidden Markov Models (HMMs) are powerful tools for multiple sequence alignment (MSA), which is known to be an NP-complete and important problem in bioinformatics. Learning HMMs is a difficult task, and many meta-heuristic methods, including particle swarm optimization (PSO), have been used for that. In this paper, a new variant of PSO, called the random drift particle swarm optimization (RDPSO) algorithm, is proposed to be used for HMM learning tasks in MSA problems. The proposed RDPSO algorithm, inspired by the free electron model in metal conductors in an external electric field, employs a novel set of evolution equations that can enhance the global search ability of the algorithm. Moreover, in order to further enhance the algorithmic performance of the RDPSO, we incorporate a diversity control method into the algorithm and, thus, propose an RDPSO with diversity-guided search (RDPSO-DGS). The performances of the RDPSO, RDPSO-DGS and other algorithms are tested and compared by learning HMMs for MSA on two well-known benchmark data sets. The experimental results show that the HMMs learned by the RDPSO and RDPSO-DGS are able to generate better alignments for the benchmark data sets than other most commonly used HMM learning methods, such as the Baum-Welch and other PSO algorithms. The performance comparison with well-known MSA programs, such as ClustalW and MAFFT, also shows that the proposed methods have advantages in multiple sequence alignment.
A jumping profile Hidden Markov Model and applications to recombination sites in HIV and HCV genomes
Schultz, Anne-Kathrin; Zhang, Ming; Leitner, Thomas; Kuiken, Carla; Korber, Bette; Morgenstern, Burkhard; Stanke, Mario
2006-01-01
Background Jumping alignments have recently been proposed as a strategy to search a given multiple sequence alignment A against a database. Instead of comparing a database sequence S to the multiple alignment or profile as a whole, S is compared and aligned to individual sequences from A. Within this alignment, S can jump between different sequences from A, so different parts of S can be aligned to different sequences from the input multiple alignment. This approach is particularly useful for dealing with recombination events. Results We developed a jumping profile Hidden Markov Model (jpHMM), a probabilistic generalization of the jumping-alignment approach. Given a partition of the aligned input sequence family into known sequence subtypes, our model can jump between states corresponding to these different subtypes, depending on which subtype is locally most similar to a database sequence. Jumps between different subtypes are indicative of intersubtype recombinations. We applied our method to a large set of genome sequences from human immunodeficiency virus (HIV) and hepatitis C virus (HCV) as well as to simulated recombined genome sequences. Conclusion Our results demonstrate that jumps in our jumping profile HMM often correspond to recombination breakpoints; our approach can therefore be used to detect recombinations in genomic sequences. The recombination breakpoints identified by jpHMM were found to be significantly more accurate than breakpoints defined by traditional methods based on comparing single representative sequences. PMID:16716226
Hame, Yrjo; Angelini, Elsa D; Hoffman, Eric A; Barr, R Graham; Laine, Andrew F
2014-07-01
The extent of pulmonary emphysema is commonly estimated from CT scans by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols, and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the presented model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was applied on a longitudinal data set with 87 subjects and a total of 365 scans acquired with varying imaging protocols. The resulting emphysema estimates had very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. The generated emphysema delineations promise advantages for regional analysis of emphysema extent and progression.
Martinez-Murcia, Francisco J; Górriz, Juan M; Ramírez, Javier; Ortiz, Andres
2016-11-01
The usage of biomedical imaging in the diagnosis of dementia is increasingly widespread. A number of works explore the possibilities of computational techniques and algorithms in what is called computed aided diagnosis. Our work presents an automatic parametrization of the brain structure by means of a path generation algorithm based on hidden Markov models (HMMs). The path is traced using information of intensity and spatial orientation in each node, adapting to the structure of the brain. Each path is itself a useful way to characterize the distribution of the tissue inside the magnetic resonance imaging (MRI) image by, for example, extracting the intensity levels at each node or generating statistical information of the tissue distribution. Additionally, a further processing consisting of a modification of the grey level co-occurrence matrix (GLCM) can be used to characterize the textural changes that occur throughout the path, yielding more meaningful values that could be associated to Alzheimer's disease (AD), as well as providing a significant feature reduction. This methodology achieves moderate performance, up to 80.3% of accuracy using a single path in differential diagnosis involving Alzheimer-affected subjects versus controls belonging to the Alzheimer's disease neuroimaging initiative (ADNI).
Hypovigilance Detection for UCAV Operators Based on a Hidden Markov Model
Kwon, Namyeon; Shin, Yongwook; Ryo, Chuh Yeop; Park, Jonghun
2014-01-01
With the advance of military technology, the number of unmanned combat aerial vehicles (UCAVs) has rapidly increased. However, it has been reported that the accident rate of UCAVs is much higher than that of manned combat aerial vehicles. One of the main reasons for the high accident rate of UCAVs is the hypovigilance problem which refers to the decrease in vigilance levels of UCAV operators while maneuvering. In this paper, we propose hypovigilance detection models for UCAV operators based on EEG signal to minimize the number of occurrences of hypovigilance. To enable detection, we have applied hidden Markov models (HMMs), two of which are used to indicate the operators' dual states, normal vigilance and hypovigilance, and, for each operator, the HMMs are trained as a detection model. To evaluate the efficacy and effectiveness of the proposed models, we conducted two experiments on the real-world data obtained by using EEG-signal acquisition devices, and they yielded satisfactory results. By utilizing the proposed detection models, the problem of hypovigilance of UCAV operators and the problem of high accident rate of UCAVs can be addressed. PMID:24963338
Application of hidden Markov models to biological data mining: a case study
NASA Astrophysics Data System (ADS)
Yin, Michael M.; Wang, Jason T.
2000-04-01
In this paper we present an example of biological data mining: the detection of splicing junction acceptors in eukaryotic genes. Identification or prediction of transcribed sequences from within genomic DNA has been a major rate-limiting step in the pursuit of genes. Programs currently available are far from being powerful enough to elucidate the gene structure completely. Here we develop a hidden Markov model (HMM) to represent the degeneracy features of splicing junction acceptor sites in eukaryotic genes. The HMM system is fully trained using an expectation maximization (EM) algorithm and the system performance is evaluated using the 10-way cross-validation method. Experimental results show that our HMM system can correctly classify more than 94% of the candidate sequences (including true and false acceptor sites) into right categories. About 90% of the true acceptor sites and 96% of the false acceptor sites in the test data are classified correctly. These results are very promising considering that only the local information in DNA is used. The proposed model will be a very important component of an effective and accurate gene structure detection system currently being developed in our lab.
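The classification step in an HMM system of this kind, scoring a candidate sequence under competing models and picking the more likely one, can be sketched with the forward algorithm. The toy single-state models and emission probabilities below are invented for illustration and are not the paper's trained parameters; long sequences would need log-space scaling to avoid underflow.

```python
import math

def forward_loglik(obs, states, start, trans, emit):
    """Log-likelihood of an observation sequence under a discrete HMM."""
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[q] * trans[q][s] for q in states) * emit[s][o]
                 for s in states}
    return math.log(sum(alpha.values()))

def classify(seq, models):
    """Pick the model under which the sequence is most likely."""
    return max(models, key=lambda m: forward_loglik(seq, *models[m]))

# Two hypothetical single-state models with different base preferences;
# all probabilities are invented, not trained values from the paper.
one_state = (["s"], {"s": 1.0}, {"s": {"s": 1.0}})
true_site = (*one_state, {"s": {"A": 0.5, "C": 0.1, "G": 0.3, "T": 0.1}})
false_site = (*one_state, {"s": {"A": 0.1, "C": 0.3, "G": 0.1, "T": 0.5}})
print(classify("AAG", {"true_site": true_site, "false_site": false_site}))
# -> true_site
```

In a real acceptor-site detector, each model would have many states trained by EM (Baum-Welch) on labelled true and false site windows.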
Bayesian hidden Markov models to identify RNA-protein interaction sites in PAR-CLIP.
Yun, Jonghyun; Wang, Tao; Xiao, Guanghua
2014-06-01
The photoactivatable ribonucleoside enhanced cross-linking immunoprecipitation (PAR-CLIP) has been increasingly used for the global mapping of RNA-protein interaction sites. There are two key features of the PAR-CLIP experiments: The sequence read tags are likely to form an enriched peak around each RNA-protein interaction site; and the cross-linking procedure is likely to introduce a specific mutation in each sequence read tag at the interaction site. Several ad hoc methods have been developed to identify the RNA-protein interaction sites using either sequence read counts or mutation counts alone; however, rigorous statistical methods for analyzing PAR-CLIP are still lacking. In this article, we propose an integrative model to establish a joint distribution of observed read and mutation counts. To pinpoint the interaction sites at single base-pair resolution, we developed a novel modeling approach that adopts non-homogeneous hidden Markov models to incorporate the nucleotide sequence at each genomic location. Both simulation studies and data application showed that our method outperforms the ad hoc methods, and provides reliable inferences for the RNA-protein binding sites from PAR-CLIP data. PMID:24571656
Combining hidden Markov models for comparing the dynamics of multiple sleep electroencephalograms.
Langrock, Roland; Swihart, Bruce J; Caffo, Brian S; Punjabi, Naresh M; Crainiceanu, Ciprian M
2013-08-30
In this manuscript, we consider methods for the analysis of populations of electroencephalogram signals during sleep for the study of sleep disorders using hidden Markov models (HMMs). Notably, we propose an easily implemented method for simultaneously modeling multiple time series that involve large amounts of data. We apply these methods to study sleep-disordered breathing (SDB) in the Sleep Heart Health Study (SHHS), a landmark study of SDB and cardiovascular consequences. We use the entire, longitudinally collected, SHHS cohort to develop HMM population parameters, which we then apply to obtain subject-specific Markovian predictions. From these predictions, we create several indices of interest, such as transition frequencies between latent states. Our HMM analysis of electroencephalogram signals uncovers interesting findings regarding differences in brain activity during sleep between those with and without SDB. These findings include stability of the percent time spent in HMM latent states across matched diseased and non-diseased groups and differences in the rate of transitioning. PMID:23348835
NASA Astrophysics Data System (ADS)
Nishiura, Takanobu; Nakamura, Satoshi
2003-10-01
Humans communicate with each other through speech by focusing on the target speech among environmental sounds in real acoustic environments. We can easily identify the target sound from other environmental sounds. For hands-free speech recognition, the identification of the target speech from environmental sounds is imperative. This mechanism may also be important for a self-moving robot to sense the acoustic environments and communicate with humans. Therefore, this paper first proposes hidden Markov model (HMM)-based environmental sound source identification. Environmental sounds are modeled by three states of HMMs and evaluated using 92 kinds of environmental sounds. The identification accuracy was 95.4%. This paper also proposes a new HMM composition method that composes speech HMMs and an HMM of categorized environmental sounds for robust environmental sound-added speech recognition. As a result of the evaluation experiments, we confirmed that the proposed HMM composition outperforms the conventional HMM composition with speech HMMs and a noise (environmental sound) HMM trained using noise periods prior to the target speech in a captured signal. [Work supported by Ministry of Public Management, Home Affairs, Posts and Telecommunications of Japan.]
Characterization of the crawling activity of Caenorhabditis elegans using a Hidden Markov model.
Lee, Sang-Hee; Kang, Seung-Ho
2015-12-01
The locomotion behavior of Caenorhabditis elegans has been studied extensively to understand the respective roles of neural control and biomechanics as well as the interaction between them. Constructing a mathematical model is helpful to understand the locomotion behavior in various surrounding conditions that are difficult to realize in experiments. In this study, we built three hidden Markov models (HMMs) for the crawling behavior of C. elegans in a controlled environment with no chemical treatment and in a formaldehyde-treated environment (0.1 and 0.5 ppm). The organism's crawling activity was recorded using a digital camcorder for 20 min at a rate of 24 frames per second. All shape patterns were quantified by branch length similarity (BLS) entropy and classified into four groups using the self-organizing map (SOM). Comparison of the simulated behavior generated by HMMs and the actual crawling behavior demonstrated that the HMM coupled with the SOM was successful in characterizing the crawling behavior. In addition, we briefly discussed the possibility of using the HMM together with BLS entropy to develop bio-monitoring systems to determine water quality. PMID:26319806
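Generating simulated behavior from a trained HMM, as done above for comparison against the real crawling data, amounts to alternately sampling an emission and a state transition. The two states and four shape-class symbols below are invented for illustration (the paper classifies BLS-entropy shape patterns into four SOM-derived groups).

```python
import random

def sample_hmm(n, start, trans, emit, seed=0):
    """Draw a synthetic (state, symbol) sequence of length n from a discrete HMM."""
    rng = random.Random(seed)

    def draw(dist):
        # inverse-CDF sampling from a {outcome: probability} dict
        r, acc = rng.random(), 0.0
        for outcome, p in dist.items():
            acc += p
            if r < acc:
                return outcome
        return outcome  # guard against floating-point round-off

    state, seq = draw(start), []
    for _ in range(n):
        seq.append((state, draw(emit[state])))
        state = draw(trans[state])
    return seq

# Hypothetical two-state model emitting four SOM-style shape classes.
start = {"active": 0.5, "quiet": 0.5}
trans = {"active": {"active": 0.8, "quiet": 0.2},
         "quiet":  {"active": 0.3, "quiet": 0.7}}
emit = {"active": {"C1": 0.4, "C2": 0.3, "C3": 0.2, "C4": 0.1},
        "quiet":  {"C1": 0.1, "C2": 0.2, "C3": 0.3, "C4": 0.4}}
synthetic = sample_hmm(50, start, trans, emit)
print(len(synthetic))  # -> 50
```

Fixing the seed makes runs reproducible; comparing statistics of such synthetic symbol streams against observed ones is the validation strategy the abstract describes.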
High range resolution radar target identification using the Prony model and hidden Markov models
NASA Astrophysics Data System (ADS)
Dewitt, Mark R.
1992-12-01
Fully polarized Xpatch signatures are transformed to two left circularly polarized signals. These two signals are then filtered by a linear FM pulse compression ('chirp') transfer function, corrupted by AWGN, and filtered by a filter matched to the 'chirp' transfer function. The bandwidth of the 'chirp' radar is about 750 MHz. Range profile feature extraction is performed using the TLS Prony Model parameter estimation technique developed at Ohio State University. Using the Prony Model, each scattering center is described by a polarization ellipse, relative energy, frequency response, and range. This representation of the target is vector quantized using a K-means clustering algorithm. Sequences of vector quantized scattering centers as well as sequences of vector quantized range profiles are used to synthesize target-specific Hidden Markov Models (HMMs). The identification decision is made by determining which HMM has the highest probability of generating the unknown sequence. The data consist of synthesized Xpatch signatures of two targets which have been difficult to separate with other RTI algorithms. The RTI algorithm developed is clearly able to separate these two targets over a 10 by 10 degree (1 degree granularity) aspect angle window off the nose for SNRs as low as 0 dB. The classification rate is 100 percent for SNRs of 5 - 20 dB, 95 percent for an SNR of 0 dB, and it drops rapidly for SNRs lower than 0 dB.
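The K-means codebook step can be illustrated in miniature. A minimal 1-D sketch follows; real scattering-center features are multidimensional, and the data values below are invented stand-ins:

```python
import random

def kmeans_1d(data, k, iters=100, seed=0):
    """Tiny 1-D K-means for vector-quantization codebook design."""
    rng = random.Random(seed)
    centers = rng.sample(data, k)
    for _ in range(iters):
        # assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for x in data:
            j = min(range(k), key=lambda c: (x - centers[c]) ** 2)
            clusters[j].append(x)
        # move each center to the mean of its cluster
        new_centers = [sum(c) / len(c) if c else centers[j]
                       for j, c in enumerate(clusters)]
        if new_centers == centers:
            break
        centers = new_centers
    return sorted(centers)

def quantize(x, centers):
    """Index of the nearest codebook entry."""
    return min(range(len(centers)), key=lambda j: (x - centers[j]) ** 2)

features = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]  # toy scattering-center energies
codebook = kmeans_1d(features, k=2)
```

Each observation is then replaced by its nearest codebook index, and the resulting index sequences are what the per-target HMMs are trained on.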
Combining Monte Carlo and mean-field-like methods for inference in hidden Markov random fields.
Forbes, Florence; Fort, Gersende
2007-03-01
Issues involving missing data are typical settings where exact inference is not tractable as soon as nontrivial interactions occur between the missing variables. Approximations are required, and most of them are based either on simulation methods or on deterministic variational methods. While variational methods provide fast and reasonable approximate estimates in many scenarios, simulation methods offer more consideration of important theoretical issues such as accuracy of the approximation and convergence of the algorithms, but at a much higher computational cost. In this work, we propose a new class of algorithms that combine the main features and advantages of both simulation and deterministic methods and consider applications to inference in hidden Markov random fields (HMRFs). These algorithms can be viewed as stochastic perturbations of variational expectation maximization (VEM) algorithms, which are not tractable for HMRFs. We focus more specifically on one of these perturbations and prove its (almost sure) convergence to the same limit set as that of VEM. In addition, experiments on synthetic and real-world images show that the algorithm performance is very close to, and sometimes better than, that of other existing simulation-based and variational EM-like algorithms.
Enhancing speech recognition using improved particle swarm optimization based hidden Markov model.
Selvaraj, Lokesh; Ganesan, Balakrishnan
2014-01-01
Enhancing speech recognition is the primary intention of this work. In this paper a novel speech recognition method based on vector quantization and improved particle swarm optimization (IPSO) is suggested. The suggested methodology contains four stages, namely, (i) denoising, (ii) feature mining, (iii) vector quantization, and (iv) an IPSO-based hidden Markov model (HMM) technique (IP-HMM). At first, the speech signals are denoised using a median filter. Next, characteristics such as peak, pitch spectrum, Mel-frequency cepstral coefficients (MFCC), mean, standard deviation, and minimum and maximum of the signal are extracted from the denoised signal. Following that, to accomplish the training process, the extracted characteristics are given to genetic algorithm based codebook generation in vector quantization. The initial populations are created by selecting random code vectors from the training set for the codebooks for the genetic algorithm process, and IP-HMM performs the recognition. The novelty at this stage lies in the crossover genetic operation. The proposed speech recognition technique offers 97.14% accuracy. PMID:25478588
An enhanced informed watermarking scheme using the posterior hidden Markov model.
Wang, Chuntao
2014-01-01
Designing a practical watermarking scheme with high robustness, feasible imperceptibility, and large capacity remains one of the most important research topics in robust watermarking. This paper presents a posterior hidden Markov model (HMM)-based informed image watermarking scheme, which well enhances the practicability of the prior-HMM-based informed watermarking with favorable robustness, imperceptibility, and capacity. To make the encoder and decoder use the (nearly) identical posterior HMM, each cover image at the encoder and each received image at the decoder are attacked with JPEG compression at an equivalently small quality factor (QF). The attacked images are then employed to estimate HMM parameter sets for the encoder and decoder, respectively. Numerical simulations show that a small QF of 5 is an optimum setting for practical use. Based on this posterior HMM, we develop an enhanced posterior-HMM-based informed watermarking scheme. Extensive experimental simulations show that the proposed scheme is comparable to its prior counterpart in which the HMM is estimated with the original image, but it avoids the transmission of the prior HMM from the encoder to the decoder. This thus well enhances the practical application of HMM-based informed watermarking systems. Also, it is demonstrated that the proposed scheme has robustness comparable to the state-of-the-art with significantly reduced computation time. PMID:24574883
Rotation-invariant image retrieval using hidden Markov tree for remote sensing data
NASA Astrophysics Data System (ADS)
Miao, Congcong; Zhao, Yindi
2014-11-01
The rapid increase in quantity of available remote sensing data brought an urgent need for intelligent retrieval techniques for remote sensing images. As one of the basic visual characteristics and important information sources of remote sensing images, texture is widely used in the scheme of remote sensing image retrieval. Since many images or regions with identical texture features usually show the diversity of direction, the consideration of rotation-invariance in the description of texture features is of significance both theoretically and practically. To address these issues, we develop a rotation-invariant image retrieval method based on the texture features of remote sensing images. We use the steerable pyramid transform to get the multi-scale and multi-orientation representation of texture images. Then we employ the hidden Markov tree (HMT) model, which provides a good tool to describe texture feature, to capture the dependencies across scales and orientations, by which the statistical properties of the transform domain coefficients can be obtained. Utilizing the inherent tree structure of the HMT and its fast training and likelihood computation algorithms, we can extract the rotation-invariant features of texture images. Similarity between the query image and each candidate image in the database can be measured by computing the Kullback-Leibler distance between the corresponding models. We evaluate the retrieval effectiveness of the algorithm with Brodatz texture database and remote sensing images. The experimental results show that this method has satisfactory performance in image retrieval and less sensitivity to texture rotation.
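The Kullback-Leibler distance used above for similarity ranking can be sketched for discrete distributions. Since KL divergence is asymmetric, retrieval systems often use a symmetrized form; the histograms below are invented stand-ins for HMT model statistics:

```python
import math

def kl_divergence(p, q):
    """D(p || q) for discrete distributions; q must be nonzero where p is."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def symmetric_kl(p, q):
    """Symmetrized KL, a common retrieval (dis)similarity score."""
    return kl_divergence(p, q) + kl_divergence(q, p)

query    = [0.7, 0.2, 0.1]   # toy feature histogram of the query image
match    = [0.6, 0.3, 0.1]   # similar texture
nonmatch = [0.1, 0.2, 0.7]   # dissimilar texture
```

Candidate images are then ranked by ascending symmetric KL score against the query model, so that visually similar textures sort to the top.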
Identifying bubble collapse in a hydrothermal system using hidden Markov models
Dawson, P.B.; Benitez, M.C.; Lowenstern, J. B.; Chouet, B.A.
2012-01-01
Beginning in July 2003 and lasting through September 2003, the Norris Geyser Basin in Yellowstone National Park exhibited an unusual increase in ground temperature and hydrothermal activity. Using hidden Markov model theory, we identify over five million high-frequency (>15 Hz) seismic events observed at a temporary seismic station deployed in the basin in response to the increase in hydrothermal activity. The source of these seismic events is constrained to within ~100 m of the station, and produced ~3500-5500 events per hour with mean durations of ~0.35-0.45 s. The seismic event rate, air temperature, hydrologic temperatures, and surficial water flow of the geyser basin exhibited a marked diurnal pattern that was closely associated with solar thermal radiance. We interpret the source of the seismicity to be the collapse of small steam bubbles in the hydrothermal system, with the rate of collapse being controlled by surficial temperatures and daytime evaporation rates. Copyright 2012 by the American Geophysical Union.
Segmentation of cone-beam CT using a hidden Markov random field with informative priors
NASA Astrophysics Data System (ADS)
Moores, M.; Hargrave, C.; Harden, F.; Mengersen, K.
2014-03-01
Cone-beam computed tomography (CBCT) has enormous potential to improve the accuracy of treatment delivery in image-guided radiotherapy (IGRT). To assist radiotherapists in interpreting these images, we use a Bayesian statistical model to label each voxel according to its tissue type. The rich sources of prior information in IGRT are incorporated into a hidden Markov random field model of the 3D image lattice. Tissue densities in the reference CT scan are estimated using inverse regression and then rescaled to approximate the corresponding CBCT intensity values. The treatment planning contours are combined with published studies of physiological variability to produce a spatial prior distribution for changes in the size, shape and position of the tumour volume and organs at risk. The voxel labels are estimated using iterated conditional modes. The accuracy of the method has been evaluated using 27 CBCT scans of an electron density phantom. The mean voxel-wise misclassification rate was 6.2%, with Dice similarity coefficient of 0.73 for liver, muscle, breast and adipose tissue. By incorporating prior information, we are able to successfully segment CBCT images. This could be a viable approach for automated, online image analysis in radiotherapy.
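Iterated conditional modes can be illustrated on a 1-D toy version of the labeling problem: each site greedily takes the label minimizing a data-misfit term plus a Potts smoothness penalty from disagreeing neighbors. The 3-D lattice and the informative priors of the paper are omitted; the class means `mu` and coupling `beta` below are invented:

```python
def icm_labels(obs, mu, beta=0.3, iters=20):
    """Iterated conditional modes on a 1-D chain: minimize squared
    distance to the class mean mu[k] plus beta per disagreeing
    neighbor (a Potts prior)."""
    # initialize with the per-site maximum-likelihood label
    labels = [min(range(len(mu)), key=lambda k: (x - mu[k]) ** 2) for x in obs]
    for _ in range(iters):
        changed = False
        for i, x in enumerate(obs):
            def cost(k):
                c = (x - mu[k]) ** 2
                if i > 0 and labels[i - 1] != k:
                    c += beta
                if i < len(obs) - 1 and labels[i + 1] != k:
                    c += beta
                return c
            best = min(range(len(mu)), key=cost)
            if best != labels[i]:
                labels[i], changed = best, True
        if not changed:
            break
    return labels

# A noisy voxel row: the 0.55 sample alone would be labeled class 1,
# but the smoothness prior pulls it back to class 0.
row = [0.0, 0.05, 0.55, 0.0, 0.1]
seg = icm_labels(row, mu=[0.0, 1.0], beta=0.3)
```

ICM converges to a local mode of the posterior rather than the global optimum, which is why a good initialization (here, the prior information from the planning CT) matters.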
Extracting duration information in a picture category decoding task using hidden Markov Models
NASA Astrophysics Data System (ADS)
Pfeiffer, Tim; Heinze, Nicolai; Frysch, Robert; Deouell, Leon Y.; Schoenfeld, Mircea A.; Knight, Robert T.; Rose, Georg
2016-04-01
Objective. Adapting classifiers for the purpose of brain signal decoding is a major challenge in brain-computer-interface (BCI) research. In a previous study we showed in principle that hidden Markov models (HMM) are a suitable alternative to the well-studied static classifiers. However, since we investigated a rather straightforward task, advantages from modeling of the signal could not be assessed. Approach. Here, we investigate a more complex data set in order to find out to what extent HMMs, as a dynamic classifier, can provide useful additional information. We show for a visual decoding problem that besides category information, HMMs can simultaneously decode picture duration without an additional training required. This decoding is based on a strong correlation that we found between picture duration and the behavior of the Viterbi paths. Main results. Decoding accuracies of up to 80% could be obtained for category and duration decoding with a single classifier trained on category information only. Significance. The extraction of multiple types of information using a single classifier enables the processing of more complex problems, while preserving good training results even on small databases. Therefore, it provides a convenient framework for online real-life BCI utilizations.
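The Viterbi-path behavior that carries the duration information can be sketched directly: the dwell time in a decoded state is simply the run length of that state along the most likely path. The two-state model below is a toy, not the study's classifier:

```python
import math

def viterbi(pi, A, B, obs):
    """Most likely hidden-state path, computed in the log domain."""
    n = len(pi)
    delta = [math.log(pi[i]) + math.log(B[i][obs[0]]) for i in range(n)]
    back = []
    for o in obs[1:]:
        prev, ptr, delta = delta, [], []
        for i in range(n):
            j = max(range(n), key=lambda j: prev[j] + math.log(A[j][i]))
            delta.append(prev[j] + math.log(A[j][i]) + math.log(B[i][o]))
            ptr.append(j)
        back.append(ptr)
    # backtrack from the best final state
    path = [max(range(n), key=lambda i: delta[i])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    path.reverse()
    return path

def run_lengths(path):
    """Dwell times: lengths of constant-state runs along the path."""
    runs = []
    for s in path:
        if runs and runs[-1][0] == s:
            runs[-1][1] += 1
        else:
            runs.append([s, 1])
    return [(s, d) for s, d in runs]

pi = [0.5, 0.5]
A = [[0.8, 0.2], [0.2, 0.8]]
B = [[0.9, 0.1], [0.1, 0.9]]
path = viterbi(pi, A, B, [0, 0, 0, 1, 1, 1])
```

Reading off run lengths of the decoded path is one way a category classifier can yield duration estimates without any extra training, in the spirit of the correlation the abstract describes.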
Zacher, Benedikt; Lidschreiber, Michael; Cramer, Patrick; Gagneur, Julien; Tresch, Achim
2014-01-01
DNA replication, transcription and repair involve the recruitment of protein complexes that change their composition as they progress along the genome in a directed or strand-specific manner. Chromatin immunoprecipitation in conjunction with hidden Markov models (HMMs) has been instrumental in understanding these processes, as they segment the genome into discrete states that can be related to DNA-associated protein complexes. However, current HMM-based approaches are not able to assign forward or reverse direction to states or properly integrate strand-specific (e.g., RNA expression) with non-strand-specific (e.g., ChIP) data, which is indispensable to accurately characterize directed processes. To overcome these limitations, we introduce bidirectional HMMs which infer directed genomic states from occupancy profiles de novo. Application to RNA polymerase II-associated factors in yeast and chromatin modifications in human T cells recovers the majority of transcribed loci, reveals gene-specific variations in the yeast transcription cycle and indicates the existence of directed chromatin state patterns at transcribed, but not at repressed, regions in the human genome. In yeast, we identify 32 new transcribed loci, a regulated initiation–elongation transition, the absence of elongation factors Ctk1 and Paf1 from a class of genes, a distinct transcription mechanism for highly expressed genes and novel DNA sequence motifs associated with transcription termination. We anticipate bidirectional HMMs to significantly improve the analyses of genome-associated directed processes. PMID:25527639
Hidden semi-Markov models reveal multiphasic movement of the endangered Florida panther.
van de Kerk, Madelon; Onorato, David P; Criffield, Marc A; Bolker, Benjamin M; Augustine, Ben C; McKinley, Scott A; Oli, Madan K
2015-03-01
Animals must move to find food and mates, and to avoid predators; movement thus influences survival and reproduction, and ultimately determines fitness. Precise description of movement and understanding of spatial and temporal patterns, as well as relationships with intrinsic and extrinsic factors, is important for both theoretical and applied reasons. We applied hidden semi-Markov models (HSMMs) to hourly global positioning system (GPS) location data to understand movement patterns of the endangered Florida panther (Puma concolor coryi) and to discern factors influencing these patterns. Three distinct movement modes were identified: (1) resting mode, characterized by short step lengths and turning angles around 180°; (2) moderately active (or intermediate) mode, characterized by intermediate step lengths and variable turning angles; and (3) traveling mode, characterized by long step lengths and turning angles around 0°. Males and females, and females with and without kittens, exhibited distinctly different movement patterns. Using the Viterbi algorithm, we show that differences in movement patterns of male and female Florida panthers were a consequence of sex-specific differences in diurnal patterns of state occupancy and sex-specific differences in state-specific movement parameters, whereas the differences between females with and without dependent kittens were caused solely by variation in state occupancy. Our study demonstrates the use of HSMM methodology to precisely describe movement and to dissect differences in movement patterns according to sex and reproductive status. PMID:25251870
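The defining difference from a plain HMM, explicit dwell-time distributions, can be sketched with a small HSMM sampler: on entering a state a duration is drawn, then the chain jumps using a transition matrix with no self-loops. The exponential-based dwell draw and all numbers below are invented stand-ins, not the paper's fitted duration distributions:

```python
import random

def sample_hsmm(pi, A, mean_dur, steps, seed=0):
    """Sample a state sequence from a hidden semi-Markov model: draw a
    dwell time on entering each state, then transition (no self-loops)."""
    rng = random.Random(seed)
    state = rng.choices(range(len(pi)), weights=pi)[0]
    seq = []
    while len(seq) < steps:
        dwell = 1 + int(rng.expovariate(1.0 / mean_dur[state]))
        seq.extend([state] * dwell)
        state = rng.choices(range(len(pi)), weights=A[state])[0]
    return seq[:steps]

# Three invented movement modes: 0 resting, 1 intermediate, 2 traveling.
pi = [0.5, 0.3, 0.2]
A = [[0.0, 0.7, 0.3],   # zero diagonal: dwell is modeled explicitly
     [0.5, 0.0, 0.5],
     [0.3, 0.7, 0.0]]
mean_dur = [6.0, 3.0, 4.0]   # mean hours spent per visit to each mode
seq = sample_hsmm(pi, A, mean_dur, steps=200)
```

Because durations are drawn explicitly rather than implied by self-transition probabilities, an HSMM can capture non-geometric dwell times such as long, regular resting bouts.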
Wissel, Tobias; Pfeiffer, Tim; Frysch, Robert; Knight, Robert T.; Chang, Edward F.; Hinrichs, Hermann; Rieger, Jochem W.; Rose, Georg
2013-01-01
Objective. Support Vector Machines (SVMs) have developed into a gold standard for accurate classification in brain-computer interfaces (BCI). The choice of the most appropriate classifier for a particular application depends on several characteristics in addition to decoding accuracy. Here we investigate the implementation of Hidden Markov Models (HMMs) for online BCIs and discuss strategies to improve their performance. Approach. We compare the SVM, serving as a reference, and HMMs for classifying discrete finger movements obtained from the electrocorticograms of four subjects doing a finger tapping experiment. The classifier decisions are based on a subset of low-frequency time domain and high gamma oscillation features. Main results. We show that decoding performance differences between the two approaches are due to the way features are extracted and selected, and are less dependent on the classifier. An additional gain in HMM performance of up to 6% was obtained by introducing model constraints. Comparable accuracies of up to 90% were achieved with both SVM and HMM, with the high gamma cortical response providing the most important decoding information for both techniques. Significance. We discuss technical HMM characteristics and adaptations in the context of the presented data as well as for general BCI applications. Our findings suggest that HMMs and their characteristics are promising for efficient online brain-computer interfaces. PMID:24045504
Griffin, William A.; Li, Xun
2016-01-01
Sequential affect dynamics generated during the interaction of intimate dyads, such as married couples, are associated with a cascade of effects—some good and some bad—on each partner, close family members, and other social contacts. Although the effects are well documented, the probabilistic structures associated with micro-social processes connected to the varied outcomes remain enigmatic. Using extant data we developed a method of classifying and subsequently generating couple dynamics using a Hierarchical Dirichlet Process Hidden semi-Markov Model (HDP-HSMM). Our findings indicate that several key aspects of existing models of marital interaction are inadequate: affect state emissions and their durations, along with the expected variability differences between distressed and nondistressed couples are present but highly nuanced; and most surprisingly, heterogeneity among highly satisfied couples necessitate that they be divided into subgroups. We review how this unsupervised learning technique generates plausible dyadic sequences that are sensitive to relationship quality and provide a natural mechanism for computational models of behavioral and affective micro-social processes. PMID:27187319
Prestat, Emmanuel; David, Maude M.; Hultman, Jenni; Taş, Neslihan; Lamendella, Regina; Dvornik, Jill; Mackelprang, Rachel; Myrold, David D.; Jumpponen, Ari; Tringe, Susannah G.; et al
2014-09-26
A new functional gene database, FOAM (Functional Ontology Assignments for Metagenomes), was developed to screen environmental metagenomic sequence datasets. FOAM provides a new functional ontology dedicated to classifying gene functions relevant to environmental microorganisms based on Hidden Markov Models (HMMs). Sets of aligned protein sequences (i.e. ‘profiles’) were tailored to a large group of target KEGG Orthologs (KOs) from which HMMs were trained. The alignments were checked and curated to make them specific to the targeted KO. Within this process, sequence profiles were enriched with the most abundant sequences available to maximize the yield of accurate classifier models. An associated functional ontology was built to describe the functional groups and hierarchy. FOAM allows the user to select the target search space before HMM-based comparison steps and to easily organize the results into different functional categories and subcategories. FOAM is publicly available at http://portal.nersc.gov/project/m1317/FOAM/.
Protein modeling with hybrid Hidden Markov Model/Neural Network architectures
Baldi, P.; Chauvin, Y.
1995-12-31
Hidden Markov Models (HMMs) are useful in a number of tasks in computational molecular biology, and in particular to model and align protein families. We argue that HMMs are somewhat optimal within a certain modeling hierarchy. Single first-order HMMs, however, have two potential limitations: a large number of unstructured parameters, and a built-in inability to deal with long-range dependencies. Hybrid HMM/Neural Network (NN) architectures attempt to overcome these limitations. In hybrid HMM/NN, the HMM parameters are computed by a NN. This provides a reparametrization that allows for flexible control of model complexity, and incorporation of constraints. The approach is tested on the immunoglobulin family. A hybrid model is trained, and a multiple alignment derived, with less than a fourth of the number of parameters used with previous single HMMs. To capture dependencies, however, one must resort to a larger hybrid model class, where the data is modeled by multiple HMMs. The parameters of the HMMs, and their modulation as a function of input or context, are again calculated by a NN.
A Hidden Markov Model for avalanche forecasting on Chowkibal-Tangdhar road axis in Indian Himalayas
NASA Astrophysics Data System (ADS)
Joshi, Jagdish Chandra; Srivastava, Sunita
2014-12-01
A numerical avalanche prediction scheme using a Hidden Markov Model (HMM) has been developed for the Chowkibal-Tangdhar road axis in J&K, India. The model forecast is in the form of different levels of avalanche danger (no, low, medium, and high) with a lead time of two days. Snow and meteorological data (maximum temperature, minimum temperature, fresh snow, fresh snow duration, standing snow) of the past 12 winters (1992-2008) have been used to derive the model input variables (average temperature, fresh snow in 24 hrs, snowfall intensity, standing snow, Snow Temperature Index (STI) of the top layer, and STI of the buried layer). As in HMMs, there are two sequences: a state sequence and a state-dependent observation sequence; in the present model, different levels of avalanche danger are considered as different states of the model and the Avalanche Activity Index (AAI) of a day, derived from the model input variables, as an observation. Validation of the model with independent data of two winters (2008-2009, 2009-2010) gives 80% accuracy for both day-1 and day-2. Comparison of various forecasting quality measures and the Heidke Skill Score of the HMM and a neural network (NN) model indicates better forecasting skill of the HMM.
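A two-day lead forecast is, at its core, propagation of today's state distribution through the transition matrix. A minimal sketch follows; the four danger levels match the scheme above, but the matrix values and "today" vector are invented, not the model's estimates:

```python
def predict_state_probs(p, A, steps):
    """Propagate a state distribution `steps` transitions ahead (p times A^steps)."""
    n = len(p)
    for _ in range(steps):
        p = [sum(p[j] * A[j][i] for j in range(n)) for i in range(n)]
    return p

# Danger levels: 0 = no, 1 = low, 2 = medium, 3 = high (toy matrix).
A = [[0.7, 0.2, 0.1, 0.0],
     [0.3, 0.4, 0.2, 0.1],
     [0.1, 0.3, 0.4, 0.2],
     [0.0, 0.2, 0.3, 0.5]]
today = [0.0, 1.0, 0.0, 0.0]          # currently at "low" danger
day2 = predict_state_probs(today, A, steps=2)
```

The forecast danger level for day 2 is then the most probable entry of `day2` (or a threshold on the cumulative probability of the higher danger states, depending on the decision rule).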
Song, Changyue; Liu, Kaibo; Zhang, Xi; Chen, Lili; Xian, Xiaochen
2016-07-01
Obstructive sleep apnea (OSA) syndrome is a common sleep disorder suffered by an increasing number of people worldwide. As an alternative to polysomnography (PSG) for OSA diagnosis, the automatic OSA detection methods used in the current practice mainly concentrate on feature extraction and classifier selection based on collected physiological signals. However, one common limitation in these methods is that the temporal dependence of signals are usually ignored, which may result in critical information loss for OSA diagnosis. In this study, we propose a novel OSA detection approach based on ECG signals by considering temporal dependence within segmented signals. A discriminative hidden Markov model (HMM) and corresponding parameter estimation algorithms are provided. In addition, subject-specific transition probabilities within the model are employed to characterize the subject-to-subject differences of potential OSA patients. To validate our approach, 70 recordings obtained from the Physionet Apnea-ECG database were used. Accuracies of 97.1% for per-recording classification and 86.2% for per-segment OSA detection with satisfactory sensitivity and specificity were achieved. Compared with other existing methods that simply ignore the temporal dependence of signals, the proposed HMM-based detection approach delivers more satisfactory detection performance and could be extended to other disease diagnosis applications. PMID:26560867
Jones, Jonathan-Lee; Essa, Ehab; Xie, Xianghua
2015-08-01
We present a novel method to segment the lymph vessel wall in confocal microscopy images using Optimal Surface Segmentation (OSS) and hidden Markov Models (HMM). OSS is used to perform a pre-segmentation on the images, to act as the initial state for the HMM. We utilize a steerable filter to determine edge-based filters for both of these segmentations, and use these features to build Gaussian probability distributions for both the vessel walls and the background. From this we infer the emission probability for the HMM, and the transition probability is learned using a Baum-Welch algorithm. We transform the segmentation problem into one of cost minimization, with each node in the graph corresponding to one state, and the weight for each node being defined using its emission probability. We define the inter-relations between neighboring nodes using the transition probability. Having constructed the problem, it is solved using the Viterbi algorithm, allowing the vessel to be reconstructed. The optimal solution can be found in polynomial time. We present qualitative and quantitative analysis to show the performance of the proposed method. PMID:26736778
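Here the transition probabilities are learned with Baum-Welch; when a labeling is already available (as the OSS pre-segmentation supplies an initial state), maximum-likelihood transition estimates reduce to normalized counts. A minimal sketch, with an invented toy labeling:

```python
def estimate_transitions(states, n_states):
    """MLE transition matrix from a labeled state sequence:
    A[i][j] = count(i -> j) / count(i -> anything)."""
    counts = [[0] * n_states for _ in range(n_states)]
    for s, t in zip(states, states[1:]):
        counts[s][t] += 1
    A = []
    for row in counts:
        total = sum(row)
        # uniform fallback for states never visited
        A.append([c / total if total else 1.0 / n_states for c in row])
    return A

# Toy wall (1) / background (0) labeling along a contour.
A = estimate_transitions([0, 0, 0, 1, 1, 0, 0, 1], n_states=2)
```

Baum-Welch generalizes exactly this counting to the case where the states are unobserved, replacing hard counts with expected counts from the forward-backward pass.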
Luo, Yuxuan; Feng, Jianjiang; Xu, Miao; Zhou, Jie; Min, James K; Xiong, Guanglei
2015-08-01
Computed tomography angiography (CTA) allows for not only diagnosis of coronary artery disease (CAD) with high spatial resolution but also monitoring the remodeling of vessel walls in the progression of CAD. Alignment of coronary arteries in CTA images acquired at different times (with a 3-7 years interval) is required to visualize and analyze the geometric and structural changes quantitatively. Previous work in image registration primarily focused on large anatomical structures and leads to suboptimal results when applying to registration of coronary arteries. In this paper, we develop a novel method to directly align the straightened coronary arteries in the cylindrical coordinate system guided by the extracted centerlines. By using a Hidden Markov Model (HMM), image intensity information from CTA and geometric information of extracted coronary arteries are combined to align coronary arteries. After registration, the pathological features in two straightened coronary arteries can be directly visualized side by side by synchronizing the corresponding cross-sectional slices and circumferential rotation angles. By evaluating with manually labeled landmarks, the average distance error is 1.6 mm. PMID:26736676
Identifying spatiotemporal migration patterns of non-volcanic tremors using hidden Markov models
NASA Astrophysics Data System (ADS)
Zhuang, J.; Wang, T.; Obara, K.; Tsuruoka, H.
2015-12-01
Tremor activity has recently been detected in various tectonic areas worldwide; it is spatially segmented and temporally recurrent. We design a hidden Markov model (HMM) to investigate this phenomenon, in which each state represents a distinct segment of tremor sources. We systematically analyze the tremor data from the Tokai region in southwest Japan using this model and find that tremors in this region concentrate around several distinct centers. We find that: (1) the system is classified into three classes of states: background (quiescent), quasi-quiescent, and active; (2) the region can be separated into two subsystems, the southwest and northeast parts, with most of the active transitions occurring among the states within each subsystem and the other transitions leading mainly to the quiescent/quasi-quiescent states; and (3) tremor activity lasts longer in the northeast part than in the southwest part. The success of this analysis indicates the power of HMMs in revealing the underlying physical process that drives non-volcanic tremors. Figure: migration pattern for the 8-state HMM. Top panel: observed distances, with the center μi of each state overlaid as a red line and ±σi shown as green lines on the left-hand side of the panel; middle panel: the tracked most likely state sequence of the 8-state HMM; bottom panel: the estimated probability of the data being in each state, with blank representing the probability of being in State 1 (the null state).
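The "estimated probability of the data being in each state" shown in the bottom panel is the smoothing posterior of the HMM, which can be computed with the forward-backward algorithm. The sketch below is a generic illustration with an invented two-state model (quiescent vs. active) and discrete emissions, not the authors' 8-state model.

```python
def forward_backward(obs, n_states, pi, A, B):
    """Smoothing posteriors P(state_t = i | all observations) for a
    discrete-emission HMM (unscaled; fine for short sequences)."""
    T = len(obs)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(n_states)]]
    for t in range(1, T):
        alpha.append([B[i][obs[t]] * sum(alpha[t - 1][j] * A[j][i]
                                         for j in range(n_states))
                      for i in range(n_states)])
    beta = [[1.0] * n_states for _ in range(T)]
    for t in range(T - 2, -1, -1):
        beta[t] = [sum(A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j]
                       for j in range(n_states))
                   for i in range(n_states)]
    post = []
    for t in range(T):
        w = [alpha[t][i] * beta[t][i] for i in range(n_states)]
        z = sum(w)
        post.append([x / z for x in w])
    return post

# Invented 2-state model: state 0 "quiescent" mostly emits symbol 0,
# state 1 "active" mostly emits symbol 1.
A = [[0.95, 0.05], [0.10, 0.90]]
B = [[0.9, 0.1], [0.2, 0.8]]
post = forward_backward([0, 0, 1, 1, 1, 0], 2, [0.5, 0.5], A, B)
print([round(p[1], 2) for p in post])  # probability of the active state per step
```

For long sequences a scaled or log-space implementation is needed to avoid numerical underflow; the unscaled version keeps the recursions easy to read.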
Glas, Julia; Dümcke, Sebastian; Zacher, Benedikt; Poron, Don; Gagneur, Julien; Tresch, Achim
2016-03-18
Hidden Markov models (HMMs) have been extensively used to dissect the genome into functionally distinct regions using data such as RNA expression or DNA binding measurements. However, it is a challenge to disentangle processes occurring on complementary strands of the same genomic region. We present the double-stranded HMM (dsHMM), a model for the strand-specific analysis of genomic processes. We applied dsHMM to yeast using strand-specific transcription data, nucleosome data, and protein binding data for a set of 11 factors associated with the regulation of transcription. The resulting annotation recovers the mRNA transcription cycle (initiation, elongation, termination) while correctly predicting the strand-specificity and directionality of the transcription process. We find that pre-initiation complex formation is an essentially undirected process, giving rise to a large number of bidirectional promoters and to pervasive antisense transcription. Notably, 12% of all transcriptionally active positions showed simultaneous activity on both strands. Furthermore, dsHMM reveals that antisense transcription is specifically suppressed by Nrd1, a yeast termination factor. PMID:26578558
Automatic sleep staging based on ECG signals using hidden Markov models.
Ying Chen; Xin Zhu; Wenxi Chen
2015-08-01
This study investigates the feasibility of automatic sleep staging using features derived solely from the electrocardiography (ECG) signal, within the framework of hidden Markov models (HMMs). The mean and SD of the heart rate (HR) computed over each 30-second epoch served as features. The two feature sequences were first detrended by ensemble empirical mode decomposition (EEMD), combined into a two-dimensional feature vector, and then converted into code vectors by a vector quantization (VQ) method. The output VQ indexes were used to estimate the parameters of the HMMs. The proposed model was tested and evaluated on a group of healthy individuals using leave-one-out cross-validation, and the automatic sleep staging results were compared with PSG-based staging. Results showed accuracies of 82.2%, 76.0%, 76.1% and 85.5% for deep, light, REM and wake sleep, respectively. The findings show that the HR-based HMM approach is feasible for automatic sleep staging and can pave the way for a more efficient, robust, and simple sleep staging system suitable for home use. PMID:26736316
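The feature pipeline above (continuous features → VQ codebook → discrete symbols for an HMM) can be illustrated with a toy k-means codebook. The (mean HR, SD HR) values and the codebook size are invented, and the EEMD detrending step is omitted here.

```python
import random

def train_codebook(vectors, k, iters=20, seed=0):
    """Toy vector quantization: a k-means codebook over 2-D feature vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(vectors, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vectors:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(v, centroids[c])))
            clusters[i].append(v)
        for c, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster goes empty
                centroids[c] = tuple(sum(col) / len(members) for col in zip(*members))
    return centroids

def quantize(v, centroids):
    """Map a feature vector to the index of its nearest codebook entry."""
    return min(range(len(centroids)),
               key=lambda c: sum((a - b) ** 2 for a, b in zip(v, centroids[c])))

# Hypothetical (mean HR, SD HR) features per 30-second epoch; values made up.
feats = [(60, 2), (62, 3), (61, 2), (80, 8), (82, 7), (81, 9)]
book = train_codebook(feats, k=2)
codes = [quantize(v, book) for v in feats]
print(codes)  # two distinct symbols, one per cluster
```

The resulting discrete symbol stream is what a discrete-emission HMM is then trained on.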
Snoring detection using a piezo snoring sensor based on hidden Markov models.
Lee, Hyo-Ki; Lee, Jeon; Kim, Hojoong; Ha, Jin-Young; Lee, Kyoung-Joung
2013-05-01
This study presents a snoring detection method based on hidden Markov models (HMMs) using a piezo snoring sensor. Snoring is a major symptom of obstructive sleep apnea (OSA). In most sleep studies, snoring is detected with a microphone. Since these studies analyze the acoustic properties of snoring, they must acquire data at high sampling rates, so a large amount of data must be processed. Recently, several sleep studies have monitored snoring using a piezo snoring sensor; however, an automatic method for snoring detection using such a sensor has not been reported in the literature. This study proposes an HMM-based method to detect snoring using this sensor, which is attached to the neck. Data from 21 patients with OSA were gathered for the training and test sets. The short-time Fourier transform and short-time energy were computed as inputs to the HMMs, and the data were classified as snoring, noise, or silence according to their class HMMs. As a result, the sensitivity and positive predictive values for snoring detection were 93.3% and 99.1%, respectively. The results demonstrate that the method yields a simple, portable and user-friendly detection tool that provides an alternative to the microphone-based method.
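Classification "according to their class HMMs", as above, typically means training one HMM per class and assigning a sequence to the class whose model gives the highest likelihood. A minimal sketch with invented two-class models over a binary symbol stream (not the paper's actual features or parameters):

```python
import math

def log_likelihood(obs, pi, A, B):
    """Log P(obs | model) via the scaled forward algorithm (discrete emissions)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    z = sum(alpha)
    ll = math.log(z)
    alpha = [a / z for a in alpha]
    for y in obs[1:]:
        alpha = [B[i][y] * sum(alpha[j] * A[j][i] for j in range(n))
                 for i in range(n)]
        z = sum(alpha)
        ll += math.log(z)
        alpha = [a / z for a in alpha]
    return ll

def classify(obs, models):
    """Assign obs to the class whose HMM gives the highest likelihood."""
    return max(models, key=lambda name: log_likelihood(obs, *models[name]))

# Invented two-class models over a binary symbol stream
# (1 = high short-time energy); parameters are illustrative, not learned.
models = {
    "snoring": ([0.5, 0.5], [[0.8, 0.2], [0.2, 0.8]], [[0.9, 0.1], [0.1, 0.9]]),
    "silence": ([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[0.95, 0.05], [0.5, 0.5]]),
}
print(classify([1, 1, 0, 1, 1, 1], models))
```

Per-step rescaling of the forward variables keeps the computation stable on long sequences while accumulating the log-likelihood exactly.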
Extracting duration information in a picture category decoding task using hidden Markov Models
Pfeiffer, Tim; Heinze, Nicolai; Frysch, Robert; Deouell, Leon Y; Schoenfeld, Mircea A; Knight, Robert T; Rose, Georg
2016-01-01
Objective: Adapting classifiers for the purpose of brain signal decoding is a major challenge in brain-computer interface (BCI) research. In a previous study we showed in principle that hidden Markov models (HMMs) are a suitable alternative to the well-studied static classifiers. However, since we investigated a rather straightforward task, the advantages of modeling the signal could not be assessed. Approach: Here, we investigate a more complex data set in order to find out to what extent HMMs, as a dynamic classifier, can provide useful additional information. We show, for a visual decoding problem, that besides category information, HMMs can simultaneously decode picture duration without additional training. This decoding is based on a strong correlation that we found between picture duration and the behavior of the Viterbi paths. Main results: Decoding accuracies of up to 80% could be obtained for category and duration decoding with a single classifier trained on category information only. Significance: The extraction of multiple types of information using a single classifier enables the processing of more complex problems, while preserving good training results even on small databases. It therefore provides a convenient framework for online, real-life BCI applications. PMID:26859831
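Duration decoding from the Viterbi path, as described above, amounts to reading off the run lengths of the decoded state sequence. A minimal sketch with a made-up path:

```python
from itertools import groupby

def state_durations(path):
    """Collapse a decoded state sequence into (state, run length) segments."""
    return [(state, sum(1 for _ in run)) for state, run in groupby(path)]

# Hypothetical Viterbi path: "picture" state for 4 time steps, then "blank".
path = ["pic", "pic", "pic", "pic", "blank", "blank"]
print(state_durations(path))  # [('pic', 4), ('blank', 2)]
```

Multiplying each run length by the classifier's time-step size converts the runs into stimulus durations.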
Temporal structure analysis of broadcast tennis video using hidden Markov models
NASA Astrophysics Data System (ADS)
Kijak, Ewa; Oisel, Lionel; Gros, Patrick
2003-01-01
This work aims at recovering the temporal structure of a broadcast tennis video from an analysis of the raw footage. Our method relies on a statistical model of the interleaving of shots in order to group shots into predefined classes representing structural elements of a tennis video. This stochastic modeling is performed in the global framework of hidden Markov models (HMMs), with shots and transitions as the fundamental units. In a first step, color and motion attributes of segmented shots are used to map shots into two classes: game (view of the full tennis court) and non-game (medium and close-up views, and commercials). In a second step, a trained HMM is used to analyze the temporal interleaving of shots. This analysis results in the identification of more complex structures, such as missed first serves, short rallies that could be aces or serves, long rallies, breaks that signal the end of a game, and replays that highlight interesting points. These higher-level structures can be used either to create summaries or to allow non-linear browsing of the video.
Accelerated 3D MERGE Carotid Imaging using Compressed Sensing with a Hidden Markov Tree Model
Makhijani, Mahender K.; Balu, Niranjan; Yamada, Kiyofumi; Yuan, Chun; Nayak, Krishna S.
2012-01-01
Purpose: To determine the potential for accelerated 3D carotid magnetic resonance imaging (MRI) using wavelet-based compressed sensing (CS) with a hidden Markov tree (HMT) model. Materials and Methods: We retrospectively applied HMT model-based CS and conventional CS to 3D carotid MRI data with 0.7 mm isotropic resolution from six subjects with known carotid stenosis (12 carotids). We applied a wavelet-tree model learned from a training database of carotid images to improve CS reconstruction. Quantitative endpoints such as lumen area, wall area, mean and maximum wall thickness, plaque calcification, and necrotic core area were measured and compared using Bland-Altman analysis, along with image quality. Results: Rate-4.5 acceleration with HMT model-based CS provided image quality comparable to that of rate-3 acceleration with conventional CS and fully sampled reference reconstructions. Morphological measurements made on rate-4.5 HMT model-based CS reconstructions were in good agreement with measurements made on fully sampled reference images. There was no significant bias or correlation between mean and difference of measurements when comparing rate-4.5 HMT model-based CS with fully sampled reference images. Conclusion: HMT model-based CS can potentially be used to accelerate clinical carotid MRI by a factor of 4.5 without impacting diagnostic quality or quantitative endpoints. PMID:22826159
Conesa, D; Martínez-Beneito, M A; Amorós, R; López-Quílez, A
2015-04-01
Considerable effort has been devoted to the development of statistical algorithms for the automated monitoring of influenza surveillance data. In this article, we introduce a framework of models for the early detection of the onset of an influenza epidemic that is applicable to different kinds of surveillance data. In particular, the process of observed cases is modelled via a Bayesian hierarchical Poisson model in which the intensity parameter is a function of the incidence rate. The key point is to model this incidence rate as normally distributed, with both parameters (mean and variance) modelled differently depending on whether the system is in an epidemic or non-epidemic phase. To do so, we propose a hidden Markov model in which the transition between the two phases is modelled as a function of the epidemic state of the previous week. Different options for modelling the rates are described, including modelling the mean at each phase as an autoregressive process of order 0, 1 or 2. Bayesian inference provides the probability of being in an epidemic state at any given moment. The methodology is applied to various influenza data sets. The results indicate that our methods outperform previous approaches in terms of sensitivity, specificity and timeliness.
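The epidemic/non-epidemic phase detection described above can be illustrated with a stripped-down version: a two-state HMM with Poisson emissions and a recursive (forward) filter that outputs the probability of being in the epidemic state each week. All parameter values are invented; the paper's full model is hierarchical and Bayesian, with autoregressive phase means.

```python
import math

def poisson_logpmf(k, lam):
    """Log P(k) for a Poisson distribution with mean lam."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def filter_epidemic(counts, lam=(5.0, 20.0), stay=(0.95, 0.90)):
    """P(epidemic | counts so far) for a 2-state HMM with Poisson emissions.

    lam: mean weekly counts in the (non-epidemic, epidemic) phase;
    stay: probability of remaining in each phase from week to week.
    All values are illustrative, not estimated from data."""
    A = [[stay[0], 1 - stay[0]], [1 - stay[1], stay[1]]]
    p = [0.99, 0.01]  # prior: almost surely non-epidemic at the start
    out = []
    for y in counts:
        pred = [sum(p[j] * A[j][i] for j in range(2)) for i in range(2)]
        w = [pred[i] * math.exp(poisson_logpmf(y, lam[i])) for i in range(2)]
        z = sum(w)
        p = [x / z for x in w]
        out.append(p[1])
    return out

probs = filter_epidemic([4, 6, 5, 18, 22, 25])
print([round(x, 3) for x in probs])  # jumps once counts leave the baseline range
```

Because filtering is recursive, the epidemic probability can be updated week by week as new surveillance counts arrive, which is what makes this usable for prospective outbreak detection.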
NASA Astrophysics Data System (ADS)
Hossen, Jakir; Jacobs, Eddie L.; Chari, Srikant
2015-07-01
Linear pyroelectric array sensors have enabled useful classification of objects such as humans and animals with relatively low-cost hardware in border and perimeter security applications. Ongoing research has sought to improve the performance of these sensors through signal processing algorithms. In the research presented here, we introduce the use of hidden Markov tree (HMT) models for object recognition in images generated by linear pyroelectric sensors. HMTs are trained to statistically model the wavelet features of individual objects through an expectation-maximization learning process. Human-versus-animal classification of a test object is made by evaluating its wavelet features against the trained HMTs using the maximum-likelihood criterion. The classification performance of this approach is compared to two other techniques: a classifier based on texture, shape, and spectral component features (TSSF) and a speeded-up robust features (SURF) classifier. The evaluation indicates that, among the three techniques, the wavelet-based HMT model works well, is robust, and has improved classification performance compared with the SURF-based algorithm in equivalent computation time. When compared with the TSSF-based classifier, the HMT model has slightly degraded performance but almost an order of magnitude improvement in computation time, enabling real-time implementation.
NASA Astrophysics Data System (ADS)
Zhou, Haitao; Chen, Jin; Dong, Guangming; Wang, Ran
2016-05-01
Many existing signal processing methods select a predefined basis function in advance. This selection of basis functions relies on a priori knowledge about the target signal, which is often unavailable in engineering applications. Dictionary learning provides a promising direction: basis atoms are learned from the data itself, with the objective of finding the underlying structure embedded in the signal. As a special case of dictionary learning, shift-invariant dictionary learning (SIDL) reconstructs an input signal using basis atoms in all possible time shifts. This shift-invariance is well suited to extracting periodic impulses, which are a typical symptom of mechanical fault signals. After the basis atoms are learned, a signal can be decomposed into a collection of latent components, each reconstructed from one basis atom and its corresponding time shifts. In this paper, SIDL is introduced as an adaptive feature extraction technique, and an approach based on SIDL and a hidden Markov model (HMM) is proposed for machinery fault diagnosis. SIDL-based feature extraction is applied to both a simulated signal and an experimental signal with a specific notch size; this experiment shows that SIDL can successfully extract the double impulses in bearing signals. A second experiment considers artificial faults of different bearing fault types: feature extraction based on SIDL is performed on each signal, and an HMM is then used to identify the fault type. The results show that the proposed SIDL-HMM approach performs well in bearing fault diagnosis.
A hidden Markov random field model for genome-wide association studies.
Li, Hongzhe; Wei, Zhi; Maris, John
2010-01-01
Genome-wide association studies (GWAS) are increasingly utilized for identifying novel susceptibility variants for complex traits, but there is little consensus on analysis methods for such data. The most commonly used methods include single-SNP (single nucleotide polymorphism) analysis or haplotype analysis with Bonferroni correction for multiple comparisons. Since the SNPs in typical GWAS are often in linkage disequilibrium (LD), at least locally, Bonferroni correction for multiple comparisons often leads to conservative error control and therefore lower statistical power. In this paper, we propose a hidden Markov random field (HMRF) model for GWAS analysis based on a weighted LD graph built from prior LD information among the SNPs, together with an efficient iterative conditional mode algorithm for estimating the model parameters. This model effectively utilizes the LD information in calculating the posterior probability that an SNP is associated with the disease. These posterior probabilities can then be used to define a false discovery controlling procedure for selecting the disease-associated SNPs. Simulation studies demonstrate the potential gain in power over single-SNP analysis. The proposed method is especially effective in identifying SNPs with borderline significance at the single-marker level that are nonetheless in high LD with significant SNPs. In addition, by simultaneously considering the SNPs in LD, the proposed method can also help reduce the number of false identifications of disease-associated SNPs. We demonstrate the application of the proposed HMRF model using data from a case-control GWAS of neuroblastoma and identify one new SNP that is potentially associated with neuroblastoma.
A transition-constrained discrete hidden Markov model for automatic sleep staging
2012-01-01
Background: Approximately one-third of the human lifespan is spent sleeping. To diagnose sleep problems, all-night polysomnographic (PSG) recordings, including electroencephalograms (EEGs), electrooculograms (EOGs) and electromyograms (EMGs), are usually acquired from the patient and scored by a well-trained expert according to the Rechtschaffen & Kales (R&K) rules. Visual sleep scoring is a time-consuming and subjective process, so the development of an automatic sleep scoring method is desirable. Method: EEG, EOG and EMG signals from twenty subjects were measured. In addition to selecting sleep characteristics based on the 1968 R&K rules, features utilized in other research were collected. Thirteen features were used, drawn from temporal and spectral analyses of the EEG, EOG and EMG signals, and a total of 158 hours of sleep data were recorded. Ten subjects were used to train the discrete hidden Markov model (DHMM), and the remaining ten were used to test the trained DHMM; 2-fold cross-validation was performed in this experiment. Results: Overall agreement between the expert and the presented results is 85.29%. With the exception of S1, the sensitivity of each stage was more than 81%. The most accurately classified stage was SWS (94.9%), and the least accurately classified stage was S1 (<34%). In the majority of cases, S1 was classified as Wake (21%), S2 (33%) or REM sleep (12%), consistent with previous studies. However, the total time of S1 in the 20 all-night sleep recordings was less than 4%. Conclusion: The results of the experiments demonstrate that the proposed method significantly enhances the recognition rate when compared with prior studies. PMID:22908930
NASA Astrophysics Data System (ADS)
Suvorova, S.; Sun, L.; Melatos, A.; Moran, W.; Evans, R. J.
2016-06-01
Gravitational wave searches for continuous-wave signals from neutron stars are especially challenging when the star's spin frequency is unknown a priori from electromagnetic observations and wanders stochastically under the action of internal (e.g., superfluid or magnetospheric) or external (e.g., accretion) torques. It is shown that frequency tracking by hidden Markov model (HMM) methods can be combined with existing maximum-likelihood coherent matched filters like the F-statistic to surmount some of the challenges raised by spin wandering. Specifically, it is found that, for an isolated, biaxial rotor whose spin frequency walks randomly, HMM tracking of the F-statistic output from coherent segments with duration T_drift = 10 d over a total observation time of T_obs = 1 yr can detect signals with wave strains h0 > 2×10^-26 at a noise level characteristic of the Advanced Laser Interferometer Gravitational Wave Observatory (Advanced LIGO). For a biaxial rotor with randomly walking spin in a binary orbit, whose orbital period and semimajor axis are known approximately from electromagnetic observations, HMM tracking of the Bessel-weighted F-statistic output can detect signals with h0 > 8×10^-26. An efficient, recursive HMM solver based on the Viterbi algorithm is demonstrated, which requires ~10^3 CPU hours for a typical, broadband (0.5-kHz) search for the low-mass X-ray binary Scorpius X-1, including generation of the relevant F-statistic input. In a "realistic" observational scenario, Viterbi tracking successfully detects 41 out of 50 synthetic signals without spin wandering in stage I of the Scorpius X-1 Mock Data Challenge convened by the LIGO Scientific Collaboration, down to a wave strain of h0 = 1.1×10^-25, recovering the frequency with a root-mean-square accuracy of ≤ 4.3×10^-3 Hz.
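The HMM frequency tracking above can be sketched as a Viterbi pass over frequency bins in which the hidden frequency may move by at most one bin between coherent segments (the random-walk spin-wandering assumption). The detection-statistic values and jump probabilities below are synthetic; a real search would use the F-statistic over a fine frequency grid.

```python
def track_frequency(stat, stay_logp=-0.7, jump_logp=-1.6):
    """Viterbi tracking of a slowly wandering frequency across coherent segments.

    stat[t][k] is the detection statistic (treated as a log-likelihood) of
    frequency bin k in segment t; stay_logp/jump_logp are the log probabilities
    of staying in a bin or moving to an adjacent bin (illustrative values)."""
    n_bins = len(stat[0])
    score = list(stat[0])  # best path score ending in each bin
    back = []              # backpointers per segment
    for t in range(1, len(stat)):
        prev = score
        score, bp = [], []
        for k in range(n_bins):
            cands = [(prev[k] + stay_logp, k)]
            if k > 0:
                cands.append((prev[k - 1] + jump_logp, k - 1))
            if k < n_bins - 1:
                cands.append((prev[k + 1] + jump_logp, k + 1))
            best, j = max(cands)
            score.append(best + stat[t][k])
            bp.append(j)
        back.append(bp)
    k = max(range(n_bins), key=lambda i: score[i])
    path = [k]
    for bp in reversed(back):
        k = bp[k]
        path.append(k)
    return path[::-1]

# Synthetic statistic: a peak drifting from bin 1 to bin 3 over four segments.
stat = [[0, 5, 0, 0], [0, 5, 1, 0], [0, 1, 5, 0], [0, 0, 1, 5]]
print(track_frequency(stat))  # the path follows the drifting peak
```

Because each bin only connects to its neighbors, the cost of the recursion grows linearly in the number of bins per segment, which is what makes broadband searches over many frequency bins tractable.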
Stanke, Mario; Schöffmann, Oliver; Morgenstern, Burkhard; Waack, Stephan
2006-01-01
Background: In order to improve gene prediction, extrinsic evidence on the gene structure can be collected from various sources of information such as genome-genome comparisons and EST and protein alignments. However, such evidence is typically incomplete and uncertain: it is usually insufficient to recover the complete structure of all genes, and it is often unreliable. Extrinsic evidence is therefore most valuable when it is balanced with sequence-intrinsic evidence. Results: We present a fairly general method for the integration of external information. Our method is based on the evaluation of hints to potentially protein-coding regions by means of a Generalized Hidden Markov Model (GHMM) that takes both intrinsic and extrinsic information into account. We used this method to extend the ab initio gene prediction program AUGUSTUS into a versatile tool that we call AUGUSTUS+. In this study, we focus on hints derived from matches to an EST or protein database, but our approach can be used to include arbitrary user-defined hints. Our method is only moderately affected by the length of a database match, and it exploits the information that can be derived from the absence of such matches. As a special case, AUGUSTUS+ can predict genes under user-defined constraints, e.g. if the positions of certain exons are known. With hints from EST and protein databases, our new approach was able to predict 89% of the exons in human chromosome 22 correctly. Conclusion: Sensitive probabilistic modeling of extrinsic evidence such as sequence database matches can increase gene prediction accuracy. When a match of a sequence interval to an EST or protein sequence is used, it should be treated as compound information rather than as information about individual positions. PMID:16469098
NASA Astrophysics Data System (ADS)
Kodera, Y.; Sakai, S.
2012-12-01
A method for the automatic processing of seismic waves is needed, since there are limits to manually picking out earthquake events from seismograms. However, there is no practical method to automatically detect the arrival times of P and S waves in seismograms. One typical example of previously proposed methods is automatic detection using an AR model (e.g. Kitagawa et al., 2004). This approach appears ineffective for seismograms contaminated with spike noise, because it cannot distinguish non-stationary signals generated by earthquakes from those generated by noise. This difficulty arises because the detection system lacks information on the time-series variation of seismic waves. We expect that an automatic detection system that includes such information will be more effective for noisy seismograms. We therefore adapt the hidden Markov model (HMM) to construct seismic wave models and establish a new automatic detection method. HMMs have been widely used in many fields such as speech recognition (e.g. Bishop, 2006). With HMMs, P- or S-waveform models that include envelopes can be constructed directly and semi-automatically from large sets of observed P- and S-wave data, and these waveform models are expected to become more robust as the quantity of observational data increases. We constructed seismic wave models based on HMMs from seismograms observed in Ashio, Japan, and used these models to automatically detect the arrival times of earthquake events in Ashio. Results show that automatic detection based on HMMs is more effective for noisy seismograms than detection based on the AR model.
Modeling strategic use of human computer interfaces with novel hidden Markov models.
Mariano, Laura J; Poore, Joshua C; Krum, David M; Schwartz, Jana L; Coskren, William D; Jones, Eric M
2015-01-01
Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game's functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic task pursuit.
Automated Detection and Classification of Rockfall Induced Seismic Signals with Hidden-Markov-Models
NASA Astrophysics Data System (ADS)
Zeckra, M.; Hovius, N.; Burtin, A.; Hammer, C.
2015-12-01
Originally introduced in speech recognition, hidden Markov models are applied in many fields of pattern recognition. In seismology, this technique has recently been introduced to improve common detection algorithms, such as the STA/LTA ratio or cross-correlation methods. So far mainly used for the monitoring of volcanic activity, the technique is applied here, in one of its first such applications, to seismic signals induced by geomorphologic processes. Using an array of eight broadband seismometers deployed around the steep, highly erosive Illgraben catchment (Switzerland), we studied a sequence of landslides triggered over a period of several days in winter. A preliminary manual classification led us to identify three main seismic signal classes, which served as a starting point for automated HMM detection and classification: (1) rockslide signals, including a failure source and debris mobilization along the slope, (2) rockfall signals from the remobilization of debris along the unstable slope, and (3) single cracking signals from the affected cliff, observed before the rockslide events. Besides its ability to classify the whole dataset automatically, the HMM approach reflects the origin and interactions of the three signal classes, which helps us understand this geomorphic crisis and the possible triggering mechanisms of slope processes. The temporal distribution of crack events (duration > 5 s, frequency band 2-8 Hz) follows an inverse Omori law, pointing to the accelerating behaviour of the failure mechanism and the method's interest for warning purposes in rockslide risk assessment. Thanks to a dense seismic array and independent weather observations in the landslide area, this dataset also provides information about the triggering mechanisms, which exhibit a tight link between rainfall and fluctuations of the freezing level.
Automatic detection of alpine rockslides in continuous seismic data using hidden Markov models
NASA Astrophysics Data System (ADS)
Dammeier, Franziska; Moore, Jeffrey R.; Hammer, Conny; Haslinger, Florian; Loew, Simon
2016-02-01
Data from continuously recording permanent seismic networks can contain information about rockslide occurrence and timing complementary to eyewitness observations and can thus aid in the construction of robust event catalogs. However, detecting infrequent rockslide signals within large volumes of continuous seismic waveform data remains challenging and often requires demanding manual intervention. We adapted an automatic classification method using hidden Markov models to detect rockslide signals in seismic data from two stations in central Switzerland. We first processed 21 known rockslides, with event volumes spanning 3 orders of magnitude and station-event distances varying by 1 order of magnitude, which resulted in 13 and 19 successfully classified events at the two stations. Retraining the models to incorporate seismic noise from the day of the event improved the respective results to 16 and 19 successful classifications. The missed events generally had low signal-to-noise ratios and small to medium volumes. We then processed nearly 14 years of continuous seismic data from the same two stations to detect previously unknown events. After postprocessing, we classified 30 new events as rockslides, of which we could verify three through independent observation. In particular, the largest new event, with an estimated volume of 500,000 m3, was not generally known within the Swiss landslide community, highlighting the importance of regional seismic data analysis even in densely populated mountainous regions. Our method can easily be implemented as part of existing earthquake monitoring systems, and with an average event detection rate of about two per month, manual verification would not significantly increase operational workload.
Sourty, Marion; Thoraval, Laurent; Roquet, Daniel; Armspach, Jean-Paul; Foucher, Jack; Blanc, Frédéric
2016-01-01
Exploring time-varying connectivity networks in neurodegenerative disorders is a recent field of research in functional MRI. Dementia with Lewy bodies (DLB) represents 20% of the neurodegenerative forms of dementia. Fluctuations of cognition and vigilance are the key symptoms of DLB. To date, no dynamic functional connectivity (DFC) investigations of this disorder have been performed. In this paper, we refer to the concept of connectivity state as a piecewise stationary configuration of functional connectivity between brain networks. From this concept, we propose a new method for group-level as well as for subject-level studies to compare and characterize connectivity state changes between a set of resting-state networks (RSNs). Dynamic Bayesian networks, statistical and graph theory-based models, enable one to learn dependencies between interacting state-based processes. Product hidden Markov models (PHMM), an instance of dynamic Bayesian networks, are introduced here to capture both statistical and temporal aspects of DFC of a set of RSNs. This analysis was based on sliding-window cross-correlations between seven RSNs extracted from a group independent component analysis performed on 20 healthy elderly subjects and 16 patients with DLB. Statistical models of DFC differed in patients compared to healthy subjects for the occipito-parieto-frontal network, the medial occipital network and the right fronto-parietal network. In addition, pairwise comparisons of DFC of RSNs revealed a decrease of dependency between these two visual networks (occipito-parieto-frontal and medial occipital networks) and the right fronto-parietal control network. The analysis of DFC state changes thus pointed out networks related to the cognitive functions that are known to be impaired in DLB: visual processing as well as attentional and executive functions. Besides this context, product HMM applied to RSNs cross-correlations offers a promising new approach to investigate structural and
Sourty, Marion; Thoraval, Laurent; Roquet, Daniel; Armspach, Jean-Paul; Foucher, Jack; Blanc, Frédéric
2016-01-01
Exploring time-varying connectivity networks in neurodegenerative disorders is a recent field of research in functional MRI. Dementia with Lewy bodies (DLB) represents 20% of the neurodegenerative forms of dementia. Fluctuations of cognition and vigilance are the key symptoms of DLB. To date, no dynamic functional connectivity (DFC) investigations of this disorder have been performed. In this paper, we refer to the concept of connectivity state as a piecewise stationary configuration of functional connectivity between brain networks. From this concept, we propose a new method for group-level as well as for subject-level studies to compare and characterize connectivity state changes between a set of resting-state networks (RSNs). Dynamic Bayesian networks, statistical and graph theory-based models, enable one to learn dependencies between interacting state-based processes. Product hidden Markov models (PHMM), an instance of dynamic Bayesian networks, are introduced here to capture both statistical and temporal aspects of DFC of a set of RSNs. This analysis was based on sliding-window cross-correlations between seven RSNs extracted from a group independent component analysis performed on 20 healthy elderly subjects and 16 patients with DLB. Statistical models of DFC differed in patients compared to healthy subjects for the occipito-parieto-frontal network, the medial occipital network and the right fronto-parietal network. In addition, pairwise comparisons of DFC of RSNs revealed a decrease of dependency between these two visual networks (occipito-parieto-frontal and medial occipital networks) and the right fronto-parietal control network. The analysis of DFC state changes thus pointed out networks related to the cognitive functions that are known to be impaired in DLB: visual processing as well as attentional and executive functions. Besides this context, product HMM applied to RSNs cross-correlations offers a promising new approach to investigate structural and
Modeling strategic use of human computer interfaces with novel hidden Markov models
Mariano, Laura J.; Poore, Joshua C.; Krum, David M.; Schwartz, Jana L.; Coskren, William D.; Jones, Eric M.
2015-01-01
Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game's functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic task pursuit. PMID
NASA Astrophysics Data System (ADS)
Yoo, Jiyoung; Kwon, Hyun-Han; So, Byung-Jin; Rajagopalan, Balaji; Kim, Tae-Woong
2015-04-01
This study proposed a hidden Markov chain model-based drought analysis (HMM-DA) tool to understand the beginning and ending of meteorological drought and to further characterize typhoon-induced drought busters (TDB) by exploring spatiotemporal drought patterns in South Korea. It was found that typhoons have played a dominant role in ending drought events (EDE) during the typhoon season (July-September) over the last four decades (1974-2013). The percentage of EDEs terminated by TDBs was about 43-90%, mainly along coastal regions of South Korea. Furthermore, the TDBs, occurring mainly during summer, play a positive role in managing extreme droughts during the subsequent autumn and spring seasons. The HMM-DA models the temporal dependencies between drought states using a Markov chain, thereby capturing the dependencies between droughts and typhoons and enabling better performance in modeling spatiotemporal drought attributes than traditional methods.
Kaya, Yılmaz
2015-09-01
This paper proposes a novel approach to detecting epileptic seizures in electroencephalography (EEG) signals, one of the most common tools for the diagnosis of epilepsy, based on the 1-dimensional local binary pattern (1D-LBP) and grey relational analysis (GRA) methods. The main aim of this paper is to evaluate and validate a computer-based quantitative EEG analysis method, grounded in grey systems theory, intended to support decision-makers. In this study, 1D-LBP, which utilizes all data points, was employed to extract features from raw EEG signals, and the Fisher score (FS) was employed to select the representative features, which can also be regarded as hidden patterns. GRA was then performed to classify the EEG signals using these Fisher-scored features. Experimental results on a public validation dataset showed that the proposed approach has high accuracy in identifying epileptic EEG signals: for the epileptic EEG combinations A-E, B-E, C-E, D-E, and A-D, accuracies of 100, 96, 100, 99, and 100% were achieved, respectively. This work also presents an attempt to develop a new general-purpose hidden-pattern determination scheme that can be utilized for different categories of time-varying signals.
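The 1D-LBP feature extraction described above can be sketched in a few lines. Definitions of 1D-LBP vary across the literature, so the neighbourhood radius and bit ordering below are illustrative assumptions, not the paper's exact scheme, and the input signal is synthetic rather than real EEG:

```python
import numpy as np

def lbp_1d(signal, radius=4):
    """1D local binary pattern: compare each sample's neighbours with the
    centre sample and pack the resulting comparison bits into an integer code."""
    weights = 1 << np.arange(2 * radius)[::-1]  # bit weights, most significant first
    codes = []
    for i in range(radius, len(signal) - radius):
        # Neighbourhood = radius samples on each side, centre excluded
        neigh = np.r_[signal[i - radius:i], signal[i + 1:i + 1 + radius]]
        bits = (neigh >= signal[i]).astype(int)
        codes.append(int(bits @ weights))
    return np.array(codes)

# Illustrative signal; a real pipeline would use raw EEG samples.
sig = np.sin(np.linspace(0, 8 * np.pi, 256))
codes = lbp_1d(sig)
hist = np.bincount(codes, minlength=256)  # 2*radius = 8 bits -> 256 possible codes
```

The histogram of codes (rather than the raw code sequence) is the kind of fixed-length feature vector that a selection step such as the Fisher score can then rank.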
Profile Hidden Markov Models for the Detection of Viruses within Metagenomic Sequence Data
Skewes-Cox, Peter; Sharpton, Thomas J.; Pollard, Katherine S.; DeRisi, Joseph L.
2014-01-01
Rapid, sensitive, and specific virus detection is an important component of clinical diagnostics. Massively parallel sequencing enables new diagnostic opportunities that complement traditional serological and PCR based techniques. While massively parallel sequencing promises the benefits of being more comprehensive and less biased than traditional approaches, it presents new analytical challenges, especially with respect to detection of pathogen sequences in metagenomic contexts. To a first approximation, the initial detection of viruses can be achieved simply through alignment of sequence reads or assembled contigs to a reference database of pathogen genomes with tools such as BLAST. However, recognition of highly divergent viral sequences is problematic, and may be further complicated by the inherently high mutation rates of some viral types, especially RNA viruses. In these cases, increased sensitivity may be achieved by leveraging position-specific information during the alignment process. Here, we constructed HMMER3-compatible profile hidden Markov models (profile HMMs) from all the virally annotated proteins in RefSeq in an automated fashion using a custom-built bioinformatic pipeline. We then tested the ability of these viral profile HMMs (“vFams”) to accurately classify sequences as viral or non-viral. Cross-validation experiments with full-length gene sequences showed that the vFams were able to recall 91% of left-out viral test sequences without erroneously classifying any non-viral sequences into viral protein clusters. Thorough reanalysis of previously published metagenomic datasets with a set of the best-performing vFams showed that they were more sensitive than BLAST for detecting sequences originating from more distant relatives of known viruses. To facilitate the use of the vFams for rapid detection of remote viral homologs in metagenomic data, we provide two sets of vFams, comprising more than 4,000 vFams each, in the HMMER3 format. We also
2012-01-01
Background Hidden Markov Models (HMMs) are a powerful tool for protein domain identification. The Pfam database notably provides a large collection of HMMs which are widely used for the annotation of proteins in newly sequenced organisms. In Pfam, each domain family is represented by a curated multiple sequence alignment from which a profile HMM is built. In spite of their high specificity, HMMs may lack sensitivity when searching for domains in divergent organisms. This is particularly the case for species with a biased amino-acid composition, such as P. falciparum, the main causal agent of human malaria. In this context, fitting HMMs to the specificities of the target proteome can help identify additional domains. Results Using P. falciparum as an example, we compare approaches that have been proposed for this problem, and present two alternative methods. Because previous attempts rely strongly on known domain occurrences in the target species or its close relatives, they mainly improve the detection of domains which belong to already identified families. Our methods learn global correction rules that adjust amino-acid distributions associated with the match states of HMMs. These rules are applied to all match states of the whole HMM library, thus enabling the detection of domains from previously absent families. Additionally, we propose a procedure to estimate the proportion of false positives among the newly discovered domains. Starting with the Pfam standard library, we build several new libraries with the different HMM-fitting approaches. These libraries are first used to detect new domain occurrences with low E-values. Second, by applying the Co-Occurrence Domain Discovery (CODD) procedure we have recently proposed, the libraries are further used to identify likely occurrences among potential domains with higher E-values. Conclusion We show that the new approaches allow identification of several domain families previously absent in the P. falciparum proteome
NASA Astrophysics Data System (ADS)
Tan, Wei Lun; Yusof, Fadhilah; Yusop, Zulkifli
2016-04-01
This study models the northeast monsoon rainfall with a homogeneous hidden Markov model (HMM), using 40 rainfall stations in Peninsular Malaysia for the period 1975 to 2008. A six-hidden-state HMM was selected based on the Bayesian information criterion (BIC), with every hidden state showing distinct rainfall characteristics. Three of the states were found to correspond to wet conditions, while the remaining three corresponded to dry conditions. The six hidden states were also found to correspond to associated atmospheric composites. Relationships with the El Niño-Southern Oscillation (ENSO) and sea surface temperatures (SST) in the Pacific Ocean were found at interannual time scales: the wet (dry) states were well correlated with the Niño 3.4 index, which was used to characterize the intensity of an ENSO event. The model is thus able to relate the rainfall characteristics to the large-scale atmospheric circulation; the monsoon rainfall in Peninsular Malaysia is well correlated with ENSO.
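Model selection by BIC, as used above to settle on six hidden states, amounts to penalizing each candidate HMM's maximized log-likelihood by its free parameter count. A minimal sketch, with invented log-likelihoods and observation counts rather than the study's actual fits:

```python
import numpy as np

def bic(log_lik, n_params, n_obs):
    """Bayesian information criterion; lower is better."""
    return n_params * np.log(n_obs) - 2.0 * log_lik

def hmm_n_params(k, emis_per_state=2):
    # K*(K-1) free transition probs, K-1 initial probs, plus emission params
    return k * (k - 1) + (k - 1) + k * emis_per_state

n_obs = 120_000  # illustrative number of station-day observations
loglik = {2: -151000.0, 3: -149200.0, 4: -148500.0,   # hypothetical maximized
          5: -148100.0, 6: -147600.0, 7: -147550.0,   # log-likelihoods per
          8: -147540.0}                                # candidate state count K
scores = {k: bic(ll, hmm_n_params(k), n_obs) for k, ll in loglik.items()}
best = min(scores, key=scores.get)
```

With these illustrative numbers, the likelihood gains beyond K = 6 are too small to offset the parameter penalty, so the criterion selects six states.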
Ghil, M.; Kravtsov, S.; Robertson, A. W.; Smyth, P.
2008-10-14
This project was a continuation of previous work under DOE CCPP funding, in which we had developed a twin approach of probabilistic network (PN) models (sometimes called dynamic Bayesian networks) and intermediate-complexity coupled ocean-atmosphere models (ICMs) to identify the predictable modes of climate variability and to investigate their impacts on the regional scale. We had developed a family of PNs (similar to Hidden Markov Models) to simulate historical records of daily rainfall, and used them to downscale GCM seasonal predictions. Using an idealized atmospheric model, we had established a novel mechanism through which ocean-induced sea-surface temperature (SST) anomalies might influence large-scale atmospheric circulation patterns on interannual and longer time scales; we had found similar patterns in a hybrid coupled ocean-atmosphere-sea-ice model. The goal of this continuation project was to build on these ICM results and PN model development to address prediction of rainfall and temperature statistics at the local scale, associated with global climate variability and change, and to investigate the impact of the latter on coupled ocean-atmosphere modes. Our main results from the grant consist of extensive further development of the hidden Markov models for rainfall simulation and downscaling, together with the development of associated software; new intermediate coupled models; a new methodology of inverse modeling for linking ICMs with observations and GCM results; and observational studies of decadal and multi-decadal natural climate variability, informed by ICM results.
Yang, Sejung; Lee, Byung-Uk
2015-01-01
In certain image acquisition processes, such as fluorescence microscopy or astronomy, only a limited number of photons can be collected due to various physical constraints. The resulting images suffer from signal-dependent noise, which can be modeled as a Poisson distribution, and a low signal-to-noise ratio. However, the majority of research on noise reduction algorithms focuses on signal-independent Gaussian noise. In this paper, we model noise as a combination of Poisson and Gaussian probability distributions to construct a more accurate model, and we adopt the contourlet transform, which provides a sparse representation of the directional components in images. We also apply hidden Markov models within a framework that neatly describes the spatial and interscale dependencies of the transform coefficients of natural images. An effective denoising algorithm for Poisson-Gaussian noise is proposed using the contourlet transform, hidden Markov models, and noise estimation in the transform domain. We supplement the algorithm with cycle spinning and Wiener filtering for further improvement. We finally show experimental results with simulations and fluorescence microscopy images which demonstrate the improved performance of the proposed approach. PMID:26352138
NASA Astrophysics Data System (ADS)
Faisan, Sylvain; Thoraval, Laurent; Armspach, Jean-Paul; Heitz, Fabrice; Foucher, Jack
2005-04-01
This paper presents a novel, completely unsupervised fMRI brain mapping approach that addresses the three problems of hemodynamic response function (HRF) shape variability, neural event timing, and fMRI response linearity. To make it robust, the method takes into account spatial and temporal information directly into the core of the activation detection process. In practice, activation detection is formulated in terms of temporal alignment between the sequence of hemodynamic response onsets (HROs) detected in the fMRI signal at υ and in the spatial neighbourhood of υ, and the sequence of "off-on" transitions observed in the input blocked stimulation paradigm (when considering epoch-related fMRI data), or the sequence of stimuli of the event-based paradigm (when considering event-related fMRI data). This multiple event sequence alignment problem, which comes under multisensor data fusion, is solved within the probabilistic framework of hidden Markov multiple event sequence models (HMMESMs), a special class of hidden Markov models. Results obtained on real and synthetic data compete with those obtained with the popular statistical parametric mapping (SPM) approach, but without necessitating any prior definition of the expected activation patterns, the HMMESM mapping approach being completely unsupervised.
NASA Astrophysics Data System (ADS)
Young, Dylan
Particle tracking offers significant insight into the molecular mechanics that govern the behavior of living cells. The analysis of molecular trajectories that transition between different motive states, such as diffusive, driven and tethered modes, is of considerable importance, with even single trajectories containing significant amounts of information about a molecule's environment and its interactions with cellular structures such as the cell cytoskeleton, membrane or extracellular matrix. Hidden Markov models (HMM) have been widely adopted to perform the segmentation of such complex tracks; however, robust methods for failure detection are required when HMMs are applied to individual particle tracks and limited data sets. Here, we show that extensive analysis of hidden Markov model outputs using data derived from multi-state Brownian dynamics simulations can be used both for the optimization of likelihood models and to generate custom failure tests based on a modified Bayesian Information Criterion. In the first instance, these failure tests can be applied to assess the quality of the HMM results. In addition, they provide critical information for the successful design of particle tracking experiments where trajectories containing multiple mobile states are expected.
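Segmenting a track into motive states with an HMM typically means Viterbi decoding of per-step observations. The sketch below assumes two states whose step sizes are Gaussian-distributed; the transition and emission parameters are hypothetical, not the likelihood model optimized in the study:

```python
import numpy as np

def viterbi_gaussian(obs, log_A, log_pi, means, stds):
    """Viterbi decoding for an HMM with 1-D Gaussian emissions (log domain)."""
    K, T = len(log_pi), len(obs)
    log_b = np.array([-0.5 * ((obs - means[k]) / stds[k]) ** 2
                      - np.log(stds[k] * np.sqrt(2 * np.pi)) for k in range(K)])
    delta = log_pi + log_b[:, 0]
    back = np.zeros((K, T), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_A           # scores[i, j]: from i to j
        back[:, t] = np.argmax(scores, axis=0)    # best predecessor of each j
        delta = scores[back[:, t], np.arange(K)] + log_b[:, t]
    path = np.empty(T, dtype=int)
    path[-1] = int(np.argmax(delta))
    for t in range(T - 1, 0, -1):                 # backtrack the best path
        path[t - 1] = back[path[t], t]
    return path

rng = np.random.default_rng(0)
# Synthetic step sizes: 100 "diffusive" steps followed by 100 "driven" steps
obs = np.concatenate([rng.normal(0.5, 0.2, 100), rng.normal(2.0, 0.4, 100)])
log_A = np.log(np.array([[0.95, 0.05], [0.05, 0.95]]))  # sticky states
path = viterbi_gaussian(obs, log_A, np.log([0.5, 0.5]),
                        means=[0.5, 2.0], stds=[0.2, 0.4])
```

With well-separated emission distributions the decoded path recovers the switch point; failure tests of the kind discussed above matter precisely when the states overlap and this recovery degrades.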
NASA Astrophysics Data System (ADS)
Wang, Hui; Wellmann, Florian
2016-04-01
It is generally accepted that 3D geological models inferred from observed data contain a certain amount of uncertainty. Uncertainty quantification and stochastic sampling methods are essential for gaining insight into the geological variability of subsurface structures. In the community of deterministic or traditional modelling techniques, classical geostatistical methods using boreholes (hard data sets) are still the most widely accepted, although they suffer from certain drawbacks. Modern geophysical measurements provide regional data sets in 2D or 3D space, either directly from sensors or indirectly from inverse problem solving using observed signals (soft data sets). We propose a stochastic modelling framework to extract subsurface heterogeneity from multiple and complementary types of data. In the presented work, subsurface heterogeneity is considered the "hidden link" among multiple spatial data sets as well as inversion results. Hidden Markov random field models are employed to perform the 3D segmentation that represents this "hidden link". Finite Gaussian mixture models are adopted to characterize the statistical parameters of the multiple data sets. The uncertainties are quantified via a Gibbs sampling process under the Bayesian inferential framework. The proposed modelling framework is validated using two numerical examples, and the model behavior and convergence are examined. It is shown that the presented stochastic modelling framework is a promising tool for 3D data fusion in the geological modelling and geophysics communities.
Benoit, Julia S; Chan, Wenyaw; Luo, Sheng; Yeh, Hung-Wen; Doody, Rachelle
2016-04-30
Understanding the dynamic disease process is vital in early detection, diagnosis, and measuring progression. Continuous-time Markov chain (CTMC) methods have been used to estimate state-change intensities, but challenges arise when stages are potentially misclassified. We present an analytical likelihood approach where the hidden state is modeled as a three-state CTMC, allowing for some observed states to be misclassified. Covariate effects of the hidden process and misclassification probabilities of the hidden state are estimated without information from a 'gold standard' for comparison. Parameter estimates are obtained using a modified expectation-maximization (EM) algorithm, and identifiability of CTMC estimation is addressed. Simulation studies and an application studying Alzheimer's disease caregiver stress levels are presented. The method was highly sensitive to detecting true misclassification and did not falsely identify error in the absence of misclassification. In conclusion, we have developed a robust longitudinal method for analyzing categorical outcome data when classification of disease severity stage is uncertain and the purpose is to study the process's transition behavior without a gold standard. PMID:26782946
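For a CTMC, the transition probability matrix over an interval t is the matrix exponential of the intensity (generator) matrix, and misclassification can be layered on top as an emission matrix over the hidden state. A minimal numerical sketch, with a generator and error rates invented for illustration rather than taken from the study:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical generator for a 3-state severity process (states 0, 1, 2);
# rows sum to zero, off-diagonal entries are transition intensities.
Q = np.array([[-0.20,  0.15,  0.05],
              [ 0.10, -0.30,  0.20],
              [ 0.00,  0.05, -0.05]])

P = expm(Q * 2.0)  # transition probability matrix over t = 2 time units

# Hypothetical misclassification matrix: P(observed state | true state),
# rows indexed by the true (hidden) state.
E = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.1, 0.9]])

# P(observe j at time t | start in i) marginalizes over the hidden true state
P_obs = P @ E
```

The likelihood approach in the abstract builds on exactly this structure: products of `P` terms over observation gaps, with each factor weighted by the appropriate `E` entries, maximized via EM.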
Super-Resolution Using Hidden Markov Model and Bayesian Detection Estimation Framework
NASA Astrophysics Data System (ADS)
Humblot, Fabrice; Mohammad-Djafari, Ali
2006-12-01
This paper presents a new method for super-resolution (SR) reconstruction of a high-resolution (HR) image from several low-resolution (LR) images. The HR image is assumed to be composed of homogeneous regions. Thus, the a priori distribution of the pixels is modeled by a finite mixture model (FMM) and a Potts Markov model (PMM) for the labels. The whole a priori model is then a hierarchical Markov model. The LR images are assumed to be obtained from the HR image by lowpass filtering, arbitrary translation, decimation, and finally corruption by random noise. The problem is then put in a Bayesian detection and estimation framework, and appropriate algorithms are developed based on Markov chain Monte Carlo (MCMC) Gibbs sampling. In the end, we obtain not only an estimate of the HR image but also an estimate of the classification labels, which leads to a segmentation result.
Wolf, Christian
2010-03-01
We present a new method for blind document bleed-through removal based on separate Markov Random Field (MRF) regularization for the recto and for the verso side, where separate priors are derived from the full graph. The segmentation algorithm is based on Bayesian Maximum a Posteriori (MAP) estimation. The advantages of this separate approach are the adaptation of the prior to the contents creation process (e.g., superimposing two handwritten pages), and the improvement of the estimation of the recto pixels through an estimation of the verso pixels covered by recto pixels; moreover, the formulation as a binary labeling problem with two hidden labels per pixel naturally leads to an efficient optimization method based on the minimum cut/maximum flow in a graph. The proposed method is evaluated on scanned document images from the 18th century, showing an improvement of character recognition results compared to other restoration methods.
Lagona, Francesco; Jdanov, Dmitri; Shkolnikova, Maria
2014-01-01
Longitudinal data are often segmented by unobserved time-varying factors, which introduce latent heterogeneity at the observation level, in addition to heterogeneity across subjects. We account for this latent structure by a linear mixed hidden Markov model. It integrates subject-specific random effects and Markovian sequences of time-varying effects in the linear predictor. We propose an expectation-maximization algorithm for maximum likelihood estimation, based on data augmentation. It reduces to the iterative maximization of the expected value of a complete likelihood function, derived from an augmented dataset with case weights, alternated with weight updating. In a case study of the Survey on Stress Aging and Health in Russia, the model is exploited to estimate the influence of the observed covariates under unobserved time-varying factors, which affect the cardiovascular activity of each subject during the observation period. PMID:24889355
Fiske, Ian J.; Royle, J. Andrew; Gross, Kevin
2014-01-01
Ecologists and wildlife biologists increasingly use latent variable models to study patterns of species occurrence when detection is imperfect. These models have recently been generalized to accommodate both a more expansive description of state than simple presence or absence, and Markovian dynamics in the latent state over successive sampling seasons. In this paper, we write these multi-season, multi-state models as hidden Markov models to find both maximum likelihood estimates of model parameters and finite-sample estimators of the trajectory of the latent state over time. These estimators are especially useful for characterizing population trends in species of conservation concern. We also develop parametric bootstrap procedures that allow formal inference about latent trend. We examine model behavior through simulation, and we apply the model to data from the North American Amphibian Monitoring Program.
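Maximum likelihood estimation for such hidden Markov formulations rests on the forward algorithm, which computes the likelihood of a detection history by marginalizing over all latent state sequences. A minimal two-state sketch with invented occupancy, colonization, and detection parameters (the actual models are multi-state and multi-season):

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with a scaled forward pass to avoid underflow."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
        s = alpha.sum()
        loglik += np.log(s)
        alpha /= s
    return loglik

# Hypothetical 2-state model: hidden {unoccupied, occupied},
# observed {not detected, detected}.
pi = np.array([0.4, 0.6])
A = np.array([[0.8, 0.2],    # colonization probability 0.2
              [0.3, 0.7]])   # extinction probability 0.3
B = np.array([[1.0, 0.0],    # never detect at an unoccupied site
              [0.4, 0.6]])   # detection probability 0.6 when occupied
ll = forward_loglik([1, 0, 1, 1, 0], pi, A, B)
```

Maximizing this quantity over the parameters, summed across sites, gives the maximum likelihood estimates the abstract refers to; the companion backward pass yields the finite-sample latent-state trajectory estimators.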
NASA Astrophysics Data System (ADS)
Yusof, Fadhilah; Kane, Ibrahim Lawal; Yusop, Zulkifli
2015-02-01
Precarious circumstances related to rainfall can arise from very intense rainfall or from the persistence of rainfall over a long period of time. Such events may exceed the capacity of sewer systems, resulting in landslides or flooding. One conventional way of measuring the risk associated with persistence in rain is through studies of long-term persistence and volatility persistence. This work investigates the persistence level of Kuantan daily rainfall using a hybrid of the autoregressive fractionally integrated moving average (ARFIMA) and hidden Markov model (HMM). The results show that rainfall variability quickly returns to its usual level rather than exhibiting lasting periods of extreme wetness; hence, relatively stable rainfall behavior is observed in Kuantan. This enhances understanding of the process for the successful development and implementation of water resource tools to assess engineering and environmental problems such as flood control.
NASA Astrophysics Data System (ADS)
Lisiecki, L. E.; Ahn, S.; Khider, D.; Lawrence, C.
2015-12-01
Stratigraphic alignment is the primary way in which long marine climate records are placed on a common age model. We previously presented a probabilistic pairwise alignment algorithm, HMM-Match, which uses hidden Markov models to estimate alignment uncertainty, and applied it to the alignment of benthic δ18O records to the "LR04" global benthic stack of Lisiecki and Raymo (2005) (Lin et al., 2014). However, since the LR04 stack is deterministic, the algorithm does not account for uncertainty in the stack. Here we address this limitation by developing a probabilistic stack, HMM-Stack. In this model the stack is a probabilistic inhomogeneous hidden Markov model, a.k.a. profile HMM, represented by a probabilistic model that "emits" each of the input records (Durbin et al., 1998). The unknown parameters of this model are learned from a set of input records using the expectation maximization (EM) algorithm. Because the multiple alignment of these records is unknown and uncertain, the expected contribution of each input point to each point in the stack is determined probabilistically. For each time step in HMM-Stack, δ18O values are described by a Gaussian probability distribution; the available δ18O records (N=180) are employed to estimate the mean and variance of δ18O at each time point. The mean of HMM-Stack follows the predicted pattern of glacial cycles, with increased amplitude after the Pliocene-Pleistocene boundary and larger, longer cycles after the mid-Pleistocene transition. Furthermore, the δ18O variance increases with age, producing a substantial loss in the signal-to-noise ratio. Not surprisingly, uncertainty in alignment, and thus in estimated age, also increases substantially in the older portion of the stack.
A F Pimentel, Marco; Santos, Mauro D; Springer, David B; Clifford, Gari D
2015-08-01
Accurate heart beat detection in signals acquired from intensive care unit (ICU) patients is necessary for establishing both normality and detecting abnormal events. Detection is normally performed by analysing the electrocardiogram (ECG) signal, and alarms are triggered when parameters derived from this signal exceed preset or variable thresholds. However, due to noisy and missing data, these alarms are frequently deemed to be false positives, and therefore ignored by clinical staff. The fusion of features derived from other signals, such as the arterial blood pressure (ABP) or the photoplethysmogram (PPG), has the potential to reduce such false alarms. In order to leverage the highly correlated temporal nature of the physiological signals, a hidden semi-Markov model (HSMM) approach, which uses the intra- and inter-beat depolarization interval, was designed to detect heart beats in such data. Features based on the wavelet transform, signal gradient and signal quality indices were extracted from the ECG and ABP waveforms for use in the HSMM framework. The presented method achieved an overall score of 89.13% on the hidden/test data set provided by the Physionet/Computing in Cardiology Challenge 2014: Robust Detection of Heart Beats in Multimodal Data. PMID:26218536
NASA Astrophysics Data System (ADS)
Pal, Indrani; Robertson, Andrew W.; Lall, Upmanu; Cane, Mark A.
2015-02-01
A multiscale-modeling framework for daily rainfall is considered and diagnostic results are presented for an application to the winter season in Northwest India. The daily rainfall process is considered to follow a hidden Markov model (HMM), with the hidden states assumed to be an unknown random function of slowly varying climatic modulation of the winter jet stream and moisture transport dynamics. The data used are from 14 stations over the Satluj River basin in winter (December-January-February-March). The period considered is 1977/78-2005/06. The HMM identifies four discrete weather states, which are used to describe daily rainfall variability over the study region. Each state was found to be associated with a distinct atmospheric circulation pattern, with the driest and drier states, States 1 and 2 respectively, characterized by a lack of synoptic wave activity. In contrast, the wetter and wettest states, States 3 and 4 respectively, are characterized by a zonally oriented wave train extending across Eurasia between 20N and 40N, identified with `western disturbances' (WD). The occurrence of State 4 is strongly conditioned by the El Nino and Indian Ocean Dipole (IOD) phenomena in winter, as demonstrated using large-scale correlation maps based on mean sea level pressure and sea surface temperature. This suggests a tendency toward a higher frequency of wet days and more intense WD activity in winter during El Nino and positive IOD years. These findings, derived from daily rainfall station records, help clarify the sequence of Northern Hemisphere mid-latitude storms bringing winter rainfall over Northwest India, and their association with potentially predictable low frequency modes on seasonal time scales and longer.
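Once an HMM's parameters are in hand, the daily sequence of hidden weather states is typically recovered with the Viterbi algorithm. The sketch below is a generic log-space Viterbi for discrete emissions (in the test, symbol 0 stands for a dry day and 1 for a wet day); the toy parameter values are invented for illustration and are not taken from the paper.

```python
import math

def viterbi(obs, pi, A, B):
    """Most likely hidden state path for a discrete-emission HMM.

    pi[i]: initial probability; A[i][j]: transition probability;
    B[i][k]: probability of emitting symbol k from state i.
    Runs in log space so long daily series do not underflow.
    """
    S = len(pi)
    log = lambda p: math.log(p) if p > 0 else float("-inf")
    d = [log(pi[i]) + log(B[i][obs[0]]) for i in range(S)]
    back = []
    for o in obs[1:]:
        # For each destination state, record the best predecessor.
        bp = [max(range(S), key=lambda i: d[i] + log(A[i][j])) for j in range(S)]
        d = [d[bp[j]] + log(A[bp[j]][j]) + log(B[j][o]) for j in range(S)]
        back.append(bp)
    path = [max(range(S), key=lambda j: d[j])]
    for bp in reversed(back):
        path.append(bp[path[-1]])
    return path[::-1]
```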
NASA Astrophysics Data System (ADS)
Kang, Seung-Ho; Lee, Sang-Hee; Chon, Tae-Soo
2012-02-01
In recent decades, the behavior of Caenorhabditis elegans (C. elegans) has been extensively studied to understand the respective roles of neural control and biomechanics. Thus far, however, only a few studies on the simulation modeling of C. elegans swimming behavior have been conducted because it is mathematically difficult to describe its complicated behavior. In this study, we built two hidden Markov models (HMMs), corresponding to the movements of C. elegans in a controlled environment with no chemical treatment and in a formaldehyde-treated environment (0.1 ppm), respectively. The movement was characterized by a series of shape patterns of the organism, taken every 0.25 s for 40 min. All shape patterns were quantified by branch length similarity (BLS) entropy and classified into seven patterns by using the self-organizing map (SOM) and the k-means clustering algorithm. The HMM coupled with the SOM was successful in accurately explaining the organism's behavior. In addition, we briefly discussed the possibility of using the HMM together with BLS entropy to develop bio-monitoring systems for real-time applications to determine water quality.
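After the SOM/k-means step reduces each frame to one of seven discrete shape patterns, training an HMM over those symbols requires, among other things, a transition matrix estimate. A minimal maximum-likelihood estimate from a labeled symbol sequence might look like the following; the function is a generic sketch, not the authors' code.

```python
from collections import Counter

def transition_matrix(symbols, n_states):
    """Maximum-likelihood transition matrix from a sequence of discrete
    pattern labels (e.g., SOM/k-means cluster indices)."""
    counts = Counter(zip(symbols, symbols[1:]))  # adjacent-pair counts
    A = []
    for i in range(n_states):
        total = sum(counts[(i, j)] for j in range(n_states))
        # Fall back to a uniform row for states never observed as a source.
        A.append([counts[(i, j)] / total if total else 1.0 / n_states
                  for j in range(n_states)])
    return A
```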
NASA Astrophysics Data System (ADS)
Gómez-Losada, Álvaro; Pires, José Carlos M.; Pino-Mejías, Rafael
2016-02-01
Urban area air pollution results from local air pollutants (from different sources) and horizontal transport (background pollution). Understanding urban air pollution background (lowest) concentration profiles is key in population exposure assessment and epidemiological studies. To this end, air pollution registered at background monitoring sites is studied, but background pollution levels are given as the average of the air pollutant concentrations measured at these sites over long periods of time. This short communication shows how a metric based on Hidden Markov Models (HMMs) can characterise the air pollutant background concentration profiles. HMMs were applied to daily average concentrations of CO, NO2, PM10 and SO2 at thirteen urban monitoring sites from three cities from 2010 to 2013. Using the proposed metric, the mean values of background and ambient air pollution registered at these sites for these primary pollutants were estimated and the ratio of ambient to background air pollution and the difference between them were studied. The ratio indicator for the studied air pollutants during the four-year study sets the background air pollution at 48%-69% of the ambient air pollution, while the difference between these values ranges from 101 to 193 μg/m3, 7-12 μg/m3, 11-13 μg/m3 and 2-3 μg/m3 for CO, NO2, PM10 and SO2, respectively.
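The metric itself is not spelled out in the abstract, but one plausible reading is that the HMM state with the lowest mean concentration is taken as the background level and compared against the overall (ambient) mean. The sketch below computes the ratio and difference from decoded states under exactly that assumption; names and logic are illustrative only.

```python
def background_metrics(conc, states):
    """Background vs. ambient pollution from HMM-decoded states.

    Assumes the state with the lowest mean concentration represents
    background pollution; ambient is the overall mean. Returns
    (background, ambient, ratio, difference).
    """
    by_state = {}
    for c, s in zip(conc, states):
        by_state.setdefault(s, []).append(c)
    background = min(sum(v) / len(v) for v in by_state.values())
    ambient = sum(conc) / len(conc)
    return background, ambient, background / ambient, ambient - background
```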
Zarrabi, Nawid; Ernst, Stefan; Verhalen, Brandy; Wilkens, Stephan; Börsch, Michael
2013-01-01
Single-molecule Förster resonance energy transfer (smFRET) has become a powerful tool for observing conformational dynamics of biological macromolecules. Analyzing smFRET time trajectories allows identification of the state transitions occurring on reaction pathways of molecular machines. Previously, we have developed a smFRET approach to monitor movements of the two nucleotide binding domains (NBDs) of P-glycoprotein (Pgp) during ATP-hydrolysis-driven drug transport in solution. One limitation of this initial work was that single-molecule photon bursts were analyzed by visual inspection with manual assignment of individual FRET levels. Here, a fully automated analysis of Pgp smFRET data using hidden Markov models (HMMs) with up to nine conformational states is applied. We propose new estimators for HMMs to integrate the information of fluctuating intensities in confocal smFRET measurements of freely diffusing, lipid-bilayer-bound membrane proteins in solution. HMM analysis strongly supports that, under conditions of steady-state turnover, conformational states with short NBD distances and short dwell times are more populated than under conditions without nucleotide or transport substrate present. PMID:23891547
NASA Astrophysics Data System (ADS)
Luk, B. L.; Liu, K. P.; Tong, F.; Man, K. F.
2010-05-01
The impact-acoustics method utilizes the information contained in the acoustic signals generated by tapping a structure with a small metal object. It offers a convenient and cost-efficient way to inspect tile-wall bonding integrity. However, surface irregularities cause abnormal multiple bounces in practical inspections, and the spectral characteristics of those bounces are easily confused with the signals obtained from different bonding qualities. As a result, classic frequency-domain feature-based classification methods deteriorate. Another crucial difficulty is the additive noise present in practical environments, which may also cause feature mismatch and false judgment. To solve these problems, the work described in this paper develops a robust inspection method that applies a model-based strategy, combining wavelet-domain features with hidden Markov modeling. It derives a bonding-integrity recognition approach with enhanced immunity to surface roughness as well as environmental noise. With the help of specially designed artificial sample slabs, experiments have been carried out with impact-acoustic signals contaminated by real environmental noise acquired under practical inspection conditions. The results are compared with those of the classic method to demonstrate the effectiveness of the proposed approach.
Parida, Bikram K; Panda, Prasanna K; Misra, Namrata; Mishra, Barada K
2015-02-01
Modeling the three-dimensional (3D) structures of proteins assumes great significance because of its manifold applications in biomolecular research. Toward this goal, we present MaxMod, a graphical user interface (GUI) of the MODELLER program that combines profile hidden Markov model (profile HMM) method with Clustal Omega program to significantly improve the selection of homologous templates and target-template alignment for construction of accurate 3D protein models. MaxMod distinguishes itself from other existing GUIs of MODELLER software by implementing effortless modeling of proteins using templates that bear modified residues. Additionally, it provides various features such as loop optimization, express modeling (a feature where protein model can be generated directly from its sequence, without any further user intervention) and automatic update of PDB database, thus enhancing the user-friendly control of computational tasks. We find that HMM-based MaxMod performs better than other modeling packages in terms of execution time and model quality. MaxMod is freely available as a downloadable standalone tool for academic and non-commercial purpose at http://www.immt.res.in/maxmod/. PMID:25636267
Taborri, Juri; Scalona, Emilia; Palermo, Eduardo; Rossi, Stefano; Cappa, Paolo
2015-01-01
Gait-phase recognition is a necessary functionality to drive robotic rehabilitation devices for lower limbs. Hidden Markov Models (HMMs) represent a viable solution, but they need subject-specific training, making data processing very time-consuming. Here, we validated an inter-subject procedure to avoid the intra-subject one in two, four and six gait-phase models in pediatric subjects. The inter-subject procedure consists in the identification of a standardized parameter set to adapt the model to measurements. We tested the inter-subject procedure both on scalar and distributed classifiers. Ten healthy children and ten hemiplegic children, each equipped with two Inertial Measurement Units placed on shank and foot, were recruited. The sagittal component of angular velocity was recorded by gyroscopes while subjects performed four walking trials on a treadmill. The goodness of classifiers was evaluated with the Receiver Operating Characteristic. The results provided a goodness from good to optimum for all examined classifiers (0 < G < 0.6), with the best performance for the distributed classifier in two-phase recognition (G = 0.02). Differences were found among gait partitioning models, while no differences were found between training procedures with the exception of the shank classifier. Our results raise the possibility of avoiding subject-specific training in HMM for gait-phase recognition and its implementation to control exoskeletons for the pediatric population. PMID:26404309
NASA Astrophysics Data System (ADS)
Beyreuther, Moritz; Carniel, Roberto; Wassermann, Joachim
2008-10-01
A possible interaction of (volcano-)tectonic earthquakes with the continuous seismic noise recorded on the volcanic island of Tenerife was recently suggested. The zone close to Las Canadas caldera has also recently shown an unusually high number of near (< 25 km), possibly volcano-tectonic, earthquakes, indicating signs of reawakening of the volcano and putting high pressure on risk analysts. For both tasks, consistent earthquake catalogues provide valuable information, so there is strong demand for automatic detection and classification methodologies to generate such catalogues. We therefore adopt methodologies from speech recognition, where statistical models called Hidden Markov Models (HMMs) are widely used for spotting words in continuous audio data. In this study, HMMs are used to detect and classify volcano-tectonic and/or tectonic earthquakes in continuous seismic data. The HMM detection and classification is then evaluated and discussed for one month of continuous seismic data at a single seismic station. As stochastic models, HMMs make it possible to attach a confidence measure to each classification, essentially quantifying how "sure" the algorithm is when classifying a certain earthquake. This provides helpful information for the seismological analyst when cataloguing earthquakes. Combined with the confidence measure, the HMM detection and classification can provide earthquake statistics precise enough both for further evidence on the interaction between seismic noise and (volcano-)tectonic earthquakes and for incorporation into an automatic early-warning system.
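A confidence measure of the kind described can be obtained from the forward-backward algorithm: the per-step posterior probability of the assigned state quantifies how "sure" the model is. A self-contained sketch follows (scaled to avoid underflow on long records); the toy parameters in the test are illustrative and not derived from the Tenerife data.

```python
def posterior_confidence(obs, pi, A, B):
    """Forward-backward smoothing for a discrete-emission HMM.

    Returns, for each time step, the posterior probability of every state
    given the whole record; the posterior of the assigned class serves as
    a per-event confidence measure.
    """
    S = len(pi)
    alpha, scales = [], []
    for t, o in enumerate(obs):  # forward pass, normalized at each step
        if t == 0:
            a = [pi[i] * B[i][o] for i in range(S)]
        else:
            a = [B[j][o] * sum(alpha[-1][i] * A[i][j] for i in range(S))
                 for j in range(S)]
        c = sum(a)
        scales.append(c)
        alpha.append([x / c for x in a])
    T = len(obs)
    beta = [[1.0] * S for _ in range(T)]
    for t in range(T - 2, -1, -1):  # backward pass, reusing forward scales
        beta[t] = [sum(A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j]
                       for j in range(S)) / scales[t + 1] for i in range(S)]
    post = []
    for t in range(T):  # per-step normalization cancels any scaling constants
        g = [alpha[t][i] * beta[t][i] for i in range(S)]
        z = sum(g)
        post.append([x / z for x in g])
    return post
```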
NASA Astrophysics Data System (ADS)
Bhatti, Sohail Masood; Khan, Muhammad Salman; Wuth, Jorge; Huenupan, Fernando; Curilem, Millaray; Franco, Luis; Yoma, Nestor Becerra
2016-09-01
In this paper we propose an automatic volcano event detection system based on a Hidden Markov Model (HMM) with state and event duration models. Since different volcanic events have different durations, the state and whole-event durations learnt from the training data are enforced on the corresponding state and event duration models within the HMM. Seismic signals from the Llaima volcano are used to train the system. Two types of events are employed in this study, Long Period (LP) and Volcano-Tectonic (VT). Experiments show that standard HMMs can detect the volcano events with high accuracy but generate false positives. The results presented in this paper show that incorporating duration modeling can reduce the false positive rate in event detection by as much as 31% at a true positive accuracy of 94%. Further evaluation of the false positives indicates that the false alarms generated by the system were mostly potential events according to the signal-to-noise ratio criteria recommended by a volcano expert.
NASA Astrophysics Data System (ADS)
Hossen, Jakir; Jacobs, Eddie L.; Chari, Srikant
2014-03-01
In this paper, we propose a real-time human-versus-animal classification technique using a pyro-electric sensor array and Hidden Markov Models (HMMs). The technique starts with a variational energy functional level-set segmentation to separate the object from the background. After segmentation, we convert the segmented object to a signal by considering column-wise pixel values and then finding the wavelet coefficients of the signal. HMMs are trained to statistically model the wavelet features of individuals through an expectation-maximization learning process. Human-versus-animal classifications are made by evaluating a set of new wavelet feature data against the trained HMMs using the maximum-likelihood criterion. Human and animal data acquired using a pyro-electric sensor in different terrains are used for performance evaluation of the algorithms. The computationally efficient SURF-feature-based approach developed in our previous research fails on the distorted images produced when the object moves very fast or when the temperature difference between target and background is insufficient to accurately profile the object. We show that wavelet-based HMMs work well for handling some of the distorted profiles in the data set. Further, the HMM achieves an improved classification rate over the SURF algorithm at almost the same computational cost.
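The maximum-likelihood decision rule described above amounts to scoring the wavelet-feature sequence under each class-specific HMM with the forward algorithm and picking the class with the higher log-likelihood. A hedged sketch follows; the single-state models, symbols and probabilities in the test are invented for illustration, not the trained models of the paper.

```python
import math

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for discrete emissions."""
    S, ll, a = len(pi), 0.0, None
    for t, o in enumerate(obs):
        if t == 0:
            a = [pi[i] * B[i][o] for i in range(S)]
        else:
            a = [B[j][o] * sum(a[i] * A[i][j] for i in range(S)) for j in range(S)]
        c = sum(a)
        ll += math.log(c)      # accumulate log of the scaling constants
        a = [x / c for x in a]
    return ll

def classify(obs, models):
    """Score the feature sequence under each class HMM (pi, A, B) and
    return the name of the model with the highest log-likelihood."""
    return max(models, key=lambda name: log_likelihood(obs, *models[name]))
```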
A novel seizure detection algorithm informed by hidden Markov model event states
NASA Astrophysics Data System (ADS)
Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian
2016-06-01
Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h-1 versus 0.058 h-1). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (unequivocal epileptic onset, UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce false positive rate relative to current industry standards.
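Predictive state assignment of the kind described, with each event state modeled as a Gaussian, reduces (for a diagonal covariance) to an argmax over per-state log-densities. The sketch below uses hypothetical "seizure"/"interictal" state names and hand-set parameters purely for illustration; the paper's states are learned nonparametrically from data, not fixed by hand.

```python
import math

def assign_state(x, states):
    """Assign a feature vector to the event state with the highest
    log-density; each state is a diagonal-covariance Gaussian given
    as (mean vector, variance vector)."""
    def logpdf(x, mean, var):
        return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
                   for xi, m, v in zip(x, mean, var))
    return max(states, key=lambda name: logpdf(x, *states[name]))
```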
Phillips, Joe Scutt; Patterson, Toby A; Leroy, Bruno; Pilling, Graham M; Nicol, Simon J
2015-07-01
Analysis of complex time-series data from studies of ecological systems requires quantitative tools for objective description and classification. These tools must take into account largely ignored problems of bias in manual classification, autocorrelation, and noise. Here we describe a method using existing estimation techniques for multivariate-normal hidden Markov models (HMMs) to develop such a classification. We use high-resolution behavioral data from bio-loggers attached to free-roaming pelagic tuna as an example. Observed patterns are assumed to be generated by an unseen Markov process that switches between several multivariate-normal distributions. Our approach is assessed in two parts. The first uses simulation experiments, in which the ability of the HMM to estimate known parameter values is examined using artificial time series of data consistent with hypotheses about pelagic predator foraging ecology. The second is the application to time series of continuous vertical movement data from yellowfin and bigeye tuna taken from tuna tagging experiments. These data were compressed into summary metrics capturing the variation of patterns in diving behavior and formed into a multivariate time series used to estimate an HMM. Each observation was associated with covariate information incorporating the effect of day and night on behavioral switching. Known parameter values were well recovered by the HMMs in our simulation experiments, resulting in mean correct classification rates of 90-97%, although some variance-covariance parameters were estimated less accurately. HMMs with two distinct behavioral states were selected for every time series of real tuna data, predicting a shallow warm state, which was similar across all individuals, and a deep colder state, which was more variable. Marked diurnal behavioral switching was predicted, consistent with many previous empirical studies on tuna. HMMs provide easily interpretable models for the objective classification of
Dwyer, Michael G; Bergsland, Niels; Zivadinov, Robert
2014-04-15
SIENA and similar techniques have demonstrated the utility of performing "direct" measurements as opposed to post-hoc comparison of cross-sectional data for the measurement of whole brain (WB) atrophy over time. However, gray matter (GM) and white matter (WM) atrophy are now widely recognized as important components of neurological disease progression, and are being actively evaluated as secondary endpoints in clinical trials. Direct measures of GM/WM change with advantages similar to SIENA have been lacking. We created a robust and easily-implemented method for direct longitudinal analysis of GM/WM atrophy, SIENAX multi-time-point (SIENAX-MTP). We built on the basic halfway-registration and mask composition components of SIENA to improve the raw output of FMRIB's FAST tissue segmentation tool. In addition, we created LFAST, a modified version of FAST incorporating a 4th dimension in its hidden Markov random field model in order to directly represent time. The method was validated by scan-rescan, simulation, comparison with SIENA, and two clinical effect size comparisons. All validation approaches demonstrated improved longitudinal precision with the proposed SIENAX-MTP method compared to SIENAX. For GM, simulation showed better correlation with experimental volume changes (r=0.992 vs. 0.941), scan-rescan showed lower standard deviations (3.8% vs. 8.4%), correlation with SIENA was more robust (r=0.70 vs. 0.53), and effect sizes were improved by up to 68%. Statistical power estimates indicated a potential drop of 55% in the number of subjects required to detect the same treatment effect with SIENAX-MTP vs. SIENAX. The proposed direct GM/WM method significantly improves on the standard SIENAX technique by trading a small amount of bias for a large reduction in variance, and may provide more precise data and additional statistical power in longitudinal studies.
Dwyer, Michael G; Bergsland, Niels; Saluste, Erik; Sharma, Jitendra; Jaisani, Zeenat; Durfee, Jacqueline; Abdelrahman, Nadir; Minagar, Alireza; Hoque, Romy; Munschauer, Frederick E; Zivadinov, Robert
2008-10-01
The perfusion/diffusion 'mismatch model' in acute ischemic stroke provides the potential to more accurately understand the consequences of thrombolytic therapy on an individual patient basis. Few methods exist to quantify mismatch extent (ischemic penumbra) and none have shown a robust ability to predict infarcted tissue outcome. Hidden Markov random field (HMRF) approaches have been used successfully in many other applications. The aim of the study was to develop a method for rapid and reliable identification and quantification of perfusion/diffusion mismatch using an HMRF approach. An HMRF model was used in combination with automated contralateral identification to segment normal tissue from non-infarcted tissue with perfusion abnormality. The infarct was used as a seed point to initialize segmentation, along with the contralateral mirror tissue. The two seeds were then allowed to compete for ownership of all unclassified tissue. In addition, a novel method was presented for quantifying tissue salvageability by weighting the volume with the degree of hypoperfusion, allowing the penumbra voxels to contribute unequal potential damage estimates. Simulated and in vivo datasets were processed and compared with results from a conventional thresholding approach. Both simulated and in vivo experiments demonstrated a dramatic improvement in accuracy with the proposed technique. For the simulated dataset, the mean absolute error decreased from 171.9% with conventional thresholding to 2.9% for the delay-weighted HMRF approach. For the in vivo dataset, the mean absolute error decreased from 564.6% for thresholding to 34.2% for the delay-weighted HMRF approach. The described method represents a significant improvement over thresholding techniques.
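The two-seed competition can be illustrated, in a much-simplified 1-D form, as iterative region growing in which each unlabeled voxel joins the neighboring region whose mean intensity it matches best. This sketches only the competition idea, without the Markov random field prior, delay weighting or contralateral-mirror identification of the actual method; all names are illustrative.

```python
def compete(values, labels):
    """Grow labeled seed regions over unlabeled (None) entries of a 1-D
    profile: each boundary cell joins the adjacent region whose current
    mean value is closest to its own."""
    labels = list(labels)
    changed = True
    while changed:
        changed = False
        groups = {}
        for v, l in zip(values, labels):
            if l is not None:
                groups.setdefault(l, []).append(v)
        means = {l: sum(v) / len(v) for l, v in groups.items()}
        for i, l in enumerate(labels):
            if l is None:
                neigh = {labels[j] for j in (i - 1, i + 1)
                         if 0 <= j < len(labels) and labels[j] is not None}
                if neigh:  # join the best-matching adjacent region
                    labels[i] = min(neigh, key=lambda n: abs(values[i] - means[n]))
                    changed = True
    return labels
```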
Sgourakis, Nikolaos G; Bagos, Pantelis G; Papasaikas, Panagiotis K; Hamodrakas, Stavros J
2005-01-01
Background: G-protein coupled receptors (GPCRs) comprise the largest group of eukaryotic cell surface receptors, with great pharmacological interest. A broad range of native ligands interact with and activate GPCRs, leading to signal transduction within cells. Most of these responses are mediated through the interaction of GPCRs with heterotrimeric GTP-binding proteins (G-proteins). Due to the information explosion in biological sequence databases, the development of software algorithms that can predict properties of GPCRs is important. Experimental data reported in the literature suggest that heterotrimeric G-proteins interact with parts of the activated receptor at the transmembrane helix-intracellular loop interface. Utilizing this information and membrane topology information, we have developed an intensive exploratory approach to generate a refined library of statistical models (Hidden Markov Models) that predict the coupling preference of GPCRs to heterotrimeric G-proteins. The method predicts the coupling preferences of GPCRs to Gs, Gi/o and Gq/11, but not G12/13, subfamilies. Results: Using a dataset of 282 GPCR sequences of known coupling preference to G-proteins and adopting a five-fold cross-validation procedure, the method yielded an 89.7% correct classification rate. In a validation set comprising all receptor sequences that are species homologues of GPCRs with known coupling preferences, excluding the sequences used to train the models, our method yields a correct classification rate of 91.0%. Furthermore, promiscuous coupling properties were correctly predicted for 6 of the 24 GPCRs that are known to interact with more than one subfamily of G-proteins. Conclusion: Our method demonstrates a high correct classification rate. Unlike previously published methods performing the same task, it does not require any transmembrane topology prediction in a preceding step. A web-server for the prediction of GPCRs coupling specificity to G-proteins available for non
Hogden, J.
1996-11-05
The goal of the proposed research is to test a statistical model of speech recognition that incorporates the knowledge that speech is produced by relatively slow motions of the tongue, lips, and other speech articulators. This model is called Maximum Likelihood Continuity Mapping (Malcom). Many speech researchers believe that by using constraints imposed by articulator motions, we can improve or replace the current hidden Markov model based speech recognition algorithms. Unfortunately, previous efforts to incorporate information about articulation into speech recognition algorithms have suffered because (1) slight inaccuracies in our knowledge or the formulation of our knowledge about articulation may decrease recognition performance, (2) small changes in the assumptions underlying models of speech production can lead to large changes in the speech derived from the models, and (3) collecting measurements of human articulator positions in sufficient quantity for training a speech recognition algorithm is still impractical. The most interesting (and in fact, unique) quality of Malcom is that, even though Malcom makes use of a mapping between acoustics and articulation, Malcom can be trained to recognize speech using only acoustic data. By learning the mapping between acoustics and articulation using only acoustic data, Malcom avoids the difficulties involved in collecting articulator position measurements and does not require an articulatory synthesizer model to estimate the mapping between vocal tract shapes and speech acoustics. Preliminary experiments that demonstrate that Malcom can learn the mapping between acoustics and articulation are discussed. Potential applications of Malcom aside from speech recognition are also discussed. Finally, specific deliverables resulting from the proposed research are described.
NASA Astrophysics Data System (ADS)
Knapmeyer-Endrun, B.; Hammer, C.
2014-12-01
The seismometers that the Apollo astronauts deployed on the Moon provide the only recordings of seismic events from any extra-terrestrial body so far. These lunar events are significantly different from ones recorded on Earth, in terms of both signal shape and source processes. Thus they are a valuable test case for any experiment in planetary seismology. In this study, we analyze Apollo 16 data with a single-station event detection and classification algorithm in view of NASA's upcoming InSight mission to Mars. InSight, scheduled for launch in early 2016, aims to investigate Mars' internal structure by deploying a seismometer on its surface. As the mission does not feature any orbiter, continuous data will be relayed to Earth at a reduced rate. Full-range data will only be available by requesting specific time-windows within a few days after the receipt of the original transmission. We apply a recently introduced algorithm based on hidden Markov models that requires only a single example waveform of each event class for training appropriate models. After constructing the prototypes we detect and classify impacts and deep and shallow moonquakes. Initial results for 1972 (year of station installation with 8 months of data) indicate a high detection rate of over 95% for impacts, of which more than 80% are classified correctly. Deep moonquakes, which occur in large numbers but often show only very weak signals, are detected with less certainty (~70%). As only one weak shallow moonquake is covered, results for this event class are not statistically significant. Daily adjustments of the background noise model help to reduce false alarms, which are mainly erroneous deep moonquake detections, by about 25%. The algorithm enables us to classify events that were previously listed in the catalog without classification, and, through the combined use of long period and short period data, identify some unlisted local impacts as well as at least two yet unreported
Otterpohl, J R; Haynes, J D; Emmert-Streib, F; Vetter, G; Pawelzik, K
2000-01-01
When studying animal perception, one normally has the chance of localizing perceptual events in time, that is, via behavioural responses time-locked to the stimuli. With multistable stimuli, however, perceptual changes occur despite stationary stimulation. Here, the challenge is to infer these not directly observable perceptual states indirectly from the behavioural data. This estimation is complicated by the fact that an animal's performance is contaminated by errors. We propose a two-step approach to overcome this difficulty: First, one sets up a generative, stochastic model of the behavioural time series based on the relevant parameters, including the probability of errors. Second, one performs a model-based maximum-likelihood estimation on the data in order to extract the non-observable perceptual state transitions. We illustrate this methodology for data from experiments on perception of bistable apparent motion in pigeons. The observed behavioural time series is analysed and explained by a combination of a Markovian perceptual dynamics with a renewal process that governs the motor response. We propose a hidden Markov model in which non-observable states represent both the perceptual states and the states of the renewal process of the motor dynamics, while the observable states account for overt pecking performance. Showing that this constitutes an appropriate phenomenological model of the time series of observable pecking events, we use it subsequently to obtain an estimate of the internal (and thus covert) perceptual reversals. These may directly correspond to changes in the activity of mutually inhibitory populations of motion selective neurones tuned to orthogonal directions.
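The covert-state estimation described above rests on standard HMM decoding. As an illustration only (toy parameters, not the pigeon model fitted in the study), a minimal Viterbi decoder recovers the most probable hidden two-state path from a sequence of noisy observed symbols:

```python
# Minimal Viterbi decoder: recover the most likely hidden state path
# from a sequence of observed symbols. All numbers are illustrative.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden state sequence for `obs`."""
    # best[t][s] = probability of the best path ending in state s at time t
    best = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        best.append({})
        back.append({})
        for s in states:
            prev, p = max(
                ((r, best[t - 1][r] * trans_p[r][s]) for r in states),
                key=lambda x: x[1],
            )
            best[t][s] = p * emit_p[s][obs[t]]
            back[t][s] = prev
    # backtrack from the best final state
    last = max(states, key=lambda s: best[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Two hidden percepts, two observable responses; an illustrative error rate.
states = ("A", "B")
start = {"A": 0.5, "B": 0.5}
trans = {"A": {"A": 0.9, "B": 0.1}, "B": {"A": 0.1, "B": 0.9}}
emit = {"A": {"a": 0.8, "b": 0.2}, "B": {"a": 0.2, "b": 0.8}}

print(viterbi("aaabbb", states, start, trans, emit))  # ['A', 'A', 'A', 'B', 'B', 'B']
```

With sticky transitions (0.9 self-probability), isolated observation errors are absorbed rather than interpreted as perceptual reversals, which is exactly why decoding beats reading the responses at face value.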
2014-01-01
Background In many applications, a family of nucleotide or protein sequences classified into several subfamilies has to be modeled. Profile Hidden Markov Models (pHMMs) are widely used for this task, modeling each subfamily separately by one pHMM. However, a major drawback of this approach is the difficulty of dealing with subfamilies composed of very few sequences. One of the most crucial bioinformatical tasks affected by the problem of small-size subfamilies is the subtyping of human immunodeficiency virus type 1 (HIV-1) sequences, as there are HIV-1 subtypes for which only a small number of sequences is known. Results To deal with small samples for particular subfamilies of HIV-1, we introduce a novel model-based information sharing protocol. It estimates the emission probabilities of the pHMM modeling a particular subfamily not only based on the nucleotide frequencies of the respective subfamily but also incorporating the nucleotide frequencies of all available subfamilies. To this end, the underlying probabilistic model mimics the pattern of commonality and variation between the subtypes with regard to the biological characteristics of HIV. In order to implement the proposed protocol, we make use of an existing HMM architecture and its associated inference engine. Conclusions We apply the modified algorithm to classify HIV-1 sequence data in the form of partial HIV-1 sequences and semi-artificial recombinants. Thereby, we demonstrate that the performance of pHMMs can be significantly improved by the proposed technique. Moreover, we show that our algorithm performs significantly better than Simplot and Bootscanning. PMID:24946781
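The core of such an information-sharing scheme can be caricatured as a convex combination of subfamily-specific and pooled emission frequencies. A minimal sketch under a strong simplifying assumption, namely a single fixed mixing weight `lam` (the paper's model is more elaborate and data-driven):

```python
# Sketch of information sharing for emission estimates: blend a subfamily's
# own nucleotide frequencies with the frequencies pooled over all
# subfamilies. `lam` is a hypothetical fixed weight, chosen for illustration.

def shared_emissions(counts_by_subfamily, target, lam=0.3):
    """Blend target-subfamily frequencies with pooled frequencies."""
    alphabet = "ACGT"
    own = counts_by_subfamily[target]
    own_total = sum(own.values())
    pooled = {a: sum(c.get(a, 0) for c in counts_by_subfamily.values())
              for a in alphabet}
    pooled_total = sum(pooled.values())
    return {
        a: (1 - lam) * own.get(a, 0) / own_total
           + lam * pooled[a] / pooled_total
        for a in alphabet
    }

# A subfamily with few sequences borrows probability mass from the pool,
# so a zero count no longer yields a zero emission probability.
counts = {
    "B": {"A": 8, "C": 0, "G": 1, "T": 1},        # small sample: no C observed
    "C": {"A": 40, "C": 30, "G": 20, "T": 10},    # well-sampled subfamily
}
probs = shared_emissions(counts, "B")
print(probs["C"] > 0)  # True: the zero count is smoothed by sharing
```

This is the same motivation as pseudocount smoothing in profile HMMs, except that the "prior" mass comes from related subfamilies rather than from a flat background.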
NASA Astrophysics Data System (ADS)
Williams, R. M.; Ray, L. E.
2012-12-01
This paper presents methods to automatically classify ground penetrating radar (GPR) images of crevasses on ice sheets for use with a completely autonomous robotic system. We use a combination of support vector machines (SVM) and hidden Markov models (HMM) with appropriate un-biased processing that is suitable for real-time analysis and detection. We tested and evaluated three processing schemes on 96 examples of Antarctic GPR imagery from 2010 and 104 examples of Greenland imagery from 2011, collected by our robot and a Pisten Bully tractor. The Antarctic and Greenland data were collected in the shear zone near McMurdo Station and between Thule Air Base and Summit Station, respectively. Using a modified cross-validation technique, we correctly classified 86 of the Antarctic examples and 90 of the Greenland examples with a radial basis kernel SVM trained and evaluated on down-sampled and texture-mapped GPR images of crevasses, compared to a 60% classification rate using raw data. In order to reduce false positives, we use the SVM classification results as pre-screener flags that mark locations in the GPR files to evaluate with two Gaussian HMMs, assessing the results with a similar modified cross-validation technique. The combined SVM pre-screener and HMM confirmation method retains all the correct classifications made by the SVM, and reduces the false positive rate to 4%. This method also reduces the computational burden in classifying GPR traces because the HMM is only evaluated on select pre-screened traces. Our experiments demonstrate the promise, robustness and reliability of real-time crevasse detection and classification with robotic GPR surveys.
Häme, Yrjö; Pollari, Mika
2012-01-01
A novel liver tumor segmentation method for CT images is presented. The aim of this work was to reduce the manual labor and time required in the treatment planning of radiofrequency ablation (RFA), by reliably providing accurate, automated tumor segmentations. The developed method is semi-automatic, requiring only minimal user interaction. The segmentation is based on non-parametric intensity distribution estimation and a hidden Markov measure field model, with application of a spherical shape prior. A post-processing operation is also presented to remove the overflow to adjacent tissue. In addition to the conventional approach of using a single image as input data, an approach using images from multiple contrast phases was developed. The accuracy of the method was validated with two sets of patient data, and artificially generated samples. The patient data included preoperative RFA images and a public data set from "3D Liver Tumor Segmentation Challenge 2008". The method achieved very high accuracy with the RFA data, and outperformed other methods evaluated with the public data set, achieving an average overlap error of 30.3%, which represents an improvement of 2.3 percentage points over the previously best-performing semi-automatic method. The average volume difference was 23.5%, and the average, the RMS, and the maximum surface distance errors were 1.87, 2.43, and 8.09 mm, respectively. The method produced good results even for tumors with very low contrast and ambiguous borders, and the performance remained high with noisy image data.
Restrepo-Montoya, Daniel; Becerra, David; Carvajal-Patiño, Juan G.; Mongui, Alvaro; Niño, Luis F.; Patarroyo, Manuel E.; Patarroyo, Manuel A.
2011-01-01
Background This study describes a bioinformatics approach designed to identify Plasmodium vivax proteins potentially involved in reticulocyte invasion. Specifically, different protein training sets were built and tuned based on different biological parameters, such as experimental evidence of secretion and/or involvement in invasion-related processes. A profile-based sequence method supported by hidden Markov models (HMMs) was then used to build classifiers to search for biologically-related proteins. The transcriptional profile of the P. vivax intra-erythrocyte developmental cycle was then screened using these classifiers. Results A bioinformatics methodology for identifying potentially secreted P. vivax proteins was designed using sequence redundancy reduction and probabilistic profiles. This methodology led to identifying a set of 45 proteins that are potentially secreted during the P. vivax intra-erythrocyte development cycle and could be involved in cell invasion. Thirteen of the 45 proteins have already been described as vaccine candidates; there is experimental evidence of protein expression for 7 of the 32 remaining ones, while no previous studies of expression, function or immunology have been carried out for the additional 25. Conclusions The results support the idea that probabilistic techniques like profile HMMs improve similarity searches. Also, different adjustments such as sequence redundancy reduction using Pisces or Cd-Hit allowed data clustering based on rational reproducible measurements. This kind of approach for selecting proteins with specific functions is highly important for supporting large-scale analyses that could aid in the identification of genes encoding potential new target antigens for vaccine development and drug design. The present study has led to targeting 32 proteins for further testing regarding their ability to induce protective immune responses against P. vivax malaria. PMID:21984903
NASA Astrophysics Data System (ADS)
Reves-Sohn, R.; Humphris, S.; Canales, J.
2005-12-01
The TAG hydrothermal mound is a dynamic structure that is continuously growing via mineral deposition, collapsing from gravitational instabilities and anhydrite dissolution, and shaking from frequent seismic activity on the adjacent normal faults. As a result, the sub-surface fluid circulation patterns beneath the mound are continually re-organizing in response to events that close and open flow paths. These characteristics are clearly evident in time series exit-fluid temperature data acquired from June 2003 through July 2004 as part of the Seismicity and Fluid Flow of TAG (STAG) experiment. Twenty-one temperature probes were deployed in actively venting cracks across the TAG mound, and temperature measurements were acquired at each site every ~10 minutes. A key insight for understanding the exit-fluid temperature data is that the measurements can be modeled as Markov chains, where each measurement is a random variable drawn from a finite set of probability distributions associated with the hidden states of the system (i.e., Hidden Markov Models). The Markov chain changes states in response to events that can affect multiple probes, but not necessarily in the same way. For example, an event may cause temperatures at one probe to rapidly increase while temperatures at another probe rapidly decrease. The data from many probes can be explained with a two-state Markov chain, with one state corresponding to "crack open" and the second state corresponding to "crack closed", but still other probes require three or more states, possibly in a nested structure. These stochastic models are deepening our understanding of shallow circulation patterns beneath the TAG mound, and we hope to use them to condition subsurface flow models incorporating the relevant physics of permeable flow in fractures and heat flow.
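The two-state "crack open"/"crack closed" picture can be made concrete with the forward algorithm, which scores how well a symbol sequence fits a given chain. This is a toy sketch, not the STAG analysis: the parameters are illustrative, and the temperature series is assumed to be discretized into "u" (rising) and "d" (falling) moves:

```python
# Forward algorithm for a two-state HMM: total probability of an observed
# symbol sequence. States and probabilities are illustrative placeholders.

def forward_likelihood(obs, start_p, trans_p, emit_p):
    """Total probability of `obs` under the HMM (forward algorithm)."""
    states = list(start_p)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {
            s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s] for r in states)
            for s in states
        }
    return sum(alpha.values())

start = {"open": 0.5, "closed": 0.5}
trans = {"open": {"open": 0.95, "closed": 0.05},
         "closed": {"open": 0.05, "closed": 0.95}}
emit = {"open": {"u": 0.7, "d": 0.3}, "closed": {"u": 0.2, "d": 0.8}}

# Under these sticky transitions, a persistent run of rising temperatures
# is far more likely than an alternating sequence.
print(forward_likelihood("uuuu", start, trans, emit) >
      forward_likelihood("udud", start, trans, emit))  # True
```

Comparing such likelihoods across candidate models (two states versus three or more) is one standard way to decide how many hidden states a probe's record actually requires.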
2010-01-01
Background The Medium-chain Dehydrogenases/Reductases (MDR) form a protein superfamily whose size and complexity defeat traditional means of subclassification; it currently has over 15,000 members in the databases, the pairwise sequence identity is typically around 25%, there are members from all kingdoms of life, the chain-lengths vary as does the oligomericity, and its members participate in a multitude of biological processes. There are profile hidden Markov models (HMMs) available for detecting MDR superfamily members, but none for determining which MDR family each protein belongs to. The current torrential influx of new sequence data enables elucidation of more and more protein families, and at an increasingly fine granularity. However, gathering good quality training data usually requires manual attention by experts and has therefore been the rate-limiting step for expanding the number of available models. Results We have developed an automated algorithm for HMM refinement that produces stable and reliable models for protein families. This algorithm uses relationships found in data to generate confident seed sets. Using this algorithm we have produced HMMs for 86 distinct MDR families and 34 of their subfamilies which can be used in automated annotation of new sequences. We find that MDR forms with 2 Zn2+ ions in general are dehydrogenases, while MDR forms with no Zn2+ in general are reductases. Furthermore, in Bacteria MDRs without Zn2+ are more frequent than those with Zn2+, while the opposite is true for eukaryotic MDRs, indicating that Zn2+ has been recruited into the MDR superfamily after the initial life kingdom separations. We have also developed a web site http://mdr-enzymes.org that provides textual and numeric search against various characterised MDR family properties, as well as sequence scan functions for reliable classification of novel MDR sequences. Conclusions Our method of refinement can be readily applied to create stable and reliable HMMs
Herath, Dulip L; Abeyratne, Udantha R; Hukins, Craig
2015-12-01
Obstructive sleep apnea (OSA) is a breathing disorder that can cause serious medical consequences. It is caused by full (apnea) or partial (hypopnea) obstructions of the upper airway during sleep. The gold standard for diagnosis of OSA is the polysomnography (PSG). The main measure for OSA diagnosis is the apnea-hypopnea index (AHI). However, the AHI is a time averaged summary measure of vast amounts of information gathered in an overnight PSG study. It cannot capture the dynamic characteristics associated with apnea/hypopnea events and their overnight distribution. The dynamic characteristics of apnea/hypopnea events are affected by the structural and functional characteristics of the upper airway. The upper airway characteristics also affect the upper airway collapsibility. These effects are manifested in snoring sounds generated from the vibrations of upper airway structures which are then modified by the upper airway geometric and physical characteristics. Hence, it is highly likely that the acoustical behavior of snoring is affected by the upper airway structural and functional characteristics. In the current work, we propose a novel method to model the intra-snore episode behavior of the acoustic characteristics of snoring sounds which can indirectly describe the instantaneous and temporal dynamics of the upper airway. We model the intra-snore episode acoustical behavior by using hidden Markov models (HMMs) with Mel frequency cepstral coefficients. Assuming significant differences in the anatomical and physiological upper airway configurations between low-AHI and high-AHI subjects, we defined different snorer groups with respect to AHI thresholds 15 and 30 and also developed HMM-based classifiers to classify snore episodes into those groups. We also define a measure called instantaneous apneaness score (IAS) in terms of the log-likelihoods produced by respective HMMs. IAS indicates the degree of class membership of each episode to one of the predefined groups
Mannini, Andrea; Sabatini, Angelo Maria
2012-09-01
In this paper we present a classifier based on a hidden Markov model (HMM) that was applied to a gait treadmill dataset for gait phase detection and walking/jogging discrimination. The gait events foot strike, foot flat, heel off, toe off were detected using a uni-axial gyroscope that measured the foot instep angular velocity in the sagittal plane. Walking/jogging activities were discriminated by processing gyroscope data from each detected stride. Supervised learning of the classifier was undertaken using reference data from an optical motion analysis system. Remarkably good generalization properties were achieved across tested subjects and gait speeds. Sensitivity and specificity of gait phase detection exceeded 94% and 98%, respectively, with timing errors that were less than 20 ms, on average; the accuracy of walking/jogging discrimination was approximately 99%.
Lamiable, A; Thevenet, P; Tufféry, P
2016-08-01
Hidden Markov model-derived structural alphabets are a probabilistic framework in which the complete conformational space of a peptidic chain is described in terms of probability distributions that can be sampled to identify conformations of largest probabilities. Here, we assess how three strategies to sample sub-optimal conformations (Viterbi k-best, forward backtrack, and a taboo sampling approach) can lead to the efficient generation of peptide conformations. We show that the diversity of sampling is essential to compensate for biases introduced in the estimates of the probabilities, and we find that only the forward backtrack and taboo sampling strategies can efficiently generate native or near-native models. Finally, we find that such approaches are as efficient as former protocols while being one order of magnitude faster, opening the door to the large-scale de novo modeling of peptides and mini-proteins. © 2016 Wiley Periodicals, Inc. PMID:27317417
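Of the sampling strategies mentioned, forward backtrack (forward-filtering, backward-sampling) is the easiest to sketch for a discrete HMM: run the forward pass once, then draw a state path from the exact posterior by sampling backwards. This toy version uses a two-state chain with illustrative parameters; the structural-alphabet HMMs in the paper are of course far larger:

```python
# Forward-filtering, backward-sampling (the "forward backtrack" idea):
# after one forward pass, each call draws a full hidden-state path from the
# exact posterior, yielding diverse sub-optimal paths, not just the Viterbi one.
import random

def sample_path(obs, start_p, trans_p, emit_p, rng=random.Random(0)):
    states = list(start_p)
    # forward pass: alphas[t][s] = P(obs[0..t], state_t = s)
    alphas = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    for o in obs[1:]:
        prev = alphas[-1]
        alphas.append({
            s: emit_p[s][o] * sum(prev[r] * trans_p[r][s] for r in states)
            for s in states
        })

    def draw(weights):
        total = sum(weights.values())
        x = rng.random() * total
        for s, w in weights.items():
            x -= w
            if x <= 0:
                return s
        return s

    # backward sampling: final state from the filtered marginal, then
    # each earlier state conditioned on the one already drawn after it.
    path = [draw(alphas[-1])]
    for t in range(len(obs) - 2, -1, -1):
        w = {r: alphas[t][r] * trans_p[r][path[-1]] for r in states}
        path.append(draw(w))
    return path[::-1]

start = {"H": 0.5, "L": 0.5}
trans = {"H": {"H": 0.8, "L": 0.2}, "L": {"H": 0.2, "L": 0.8}}
emit = {"H": {"x": 0.9, "y": 0.1}, "L": {"x": 0.1, "y": 0.9}}

# Repeated draws explore several distinct, plausible paths.
paths = {tuple(sample_path("xxyy", start, trans, emit)) for _ in range(200)}
```

Because each draw is an independent exact posterior sample, the diversity the paper highlights comes for free, unlike k-best lists, which enumerate near-duplicates of the optimum.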
Lipinski, Kamil A; Puchta, Olga; Surendranath, Vineeth; Kudla, Marek; Golik, Pawel
2011-10-01
Pentatricopeptide repeat (PPR) proteins are the largest known RNA-binding protein family, and are found in all eukaryotes, being particularly abundant in higher plants. PPR proteins localize mostly to mitochondria and chloroplasts, and many were shown to modulate organellar genome expression on the posttranscriptional level. Although the genomes of land plants encode hundreds of PPR proteins, only a few have been identified in Fungi and Metazoa. As the current PPR motif profiles are built mainly on the basis of the predominant plant sequences, they are unlikely to be optimal for detecting fungal and animal members of the family, and many putative PPR proteins in these genomes may remain undetected. In order to verify this hypothesis, we designed a hidden Markov model-based bioinformatic tool called Supervised Clustering-based Iterative Phylogenetic Hidden Markov Model algorithm for the Evaluation of tandem Repeat motif families (SCIPHER) using sequence data from orthologous clusters from available yeast genomes. This approach allowed us to assign 12 new proteins in Saccharomyces cerevisiae to the PPR family. Similarly, in other yeast species, we obtained a 5-fold increase in the detection of PPR motifs, compared with the previous tools. All the newly identified S. cerevisiae PPR proteins localize in the mitochondrion and are a part of the RNA processing interaction network. Furthermore, the yeast PPR proteins seem to undergo an accelerated divergent evolution. Analysis of single and double amino acid substitutions in the Dmr1 protein of S. cerevisiae suggests that cooperative interactions between motifs and pseudoreversion could be the force driving this rapid evolution.
Guihenneuc-Jouyaux, C; Richardson, S; Longini, I M
2000-09-01
Multistate models have been increasingly used to model the natural history of many diseases as well as to characterize the follow-up of patients under varied clinical protocols. This modeling makes it possible to describe disease evolution, estimate transition rates, and evaluate therapy effects on progression. In many cases, the staging is defined on the basis of a discretization of the values of continuous markers (CD4 cell count for the HIV application) that are subject to great variability, due mainly to short time-scale noise (intraindividual variability) and measurement errors. This led us to formulate a Bayesian hierarchical model where, at a first level, a disease process (a Markov model on the true states, which are unobserved) is introduced and, at a second level, the measurement process making the link between the true states and the observed marker values is modeled. This hierarchical formulation allows joint estimation of the parameters of both processes. Estimation of the quantities of interest is performed via stochastic algorithms of the family of Markov chain Monte Carlo methods. The flexibility of this approach is illustrated by analyzing the CD4 data on HIV patients of the Concorde clinical trial. PMID:10985209
Borgy, Benjamin; Reboud, Xavier; Peyrard, Nathalie; Sabbadin, Régis; Gaba, Sabrina
2015-01-01
Predicting the population dynamics of annual plants is a challenge due to their hidden seed banks in the field. However, such predictions are highly valuable for determining management strategies, specifically in agricultural landscapes. In agroecosystems, most weed seeds survive during unfavourable seasons and persist for several years in the seed bank. This causes difficulties in making accurate predictions of weed population dynamics and life history traits (LHT). Consequently, it is very difficult to identify management strategies that limit both weed populations and species diversity. In this article, we present a method of assessing weed population dynamics from both standing plant time series data and an unknown seed bank. We use a Hidden Markov Model (HMM) to obtain estimates from over 3,080 botanical records for three major LHT: seed survival in the soil, plant establishment (including post-emergence mortality), and seed production of 18 common weed species. Maximum likelihood and Bayesian approaches were used in a complementary fashion to estimate LHT values. The results showed that the LHT provided by the HMM enabled fairly accurate estimates of weed populations in different crops. There was a positive correlation between estimated germination rates and an index of specialisation to the crop type (IndVal). The relationships among the estimated LHTs, and between the LHTs and the ecological characteristics of weeds, provided insights into weed strategies. For example, a common strategy to cope with agricultural practices in several weeds was to produce fewer seeds and increase germination rates. This knowledge, especially of LHT for each type of crop, should provide valuable information for developing sustainable weed management strategies. PMID:26427023
Dean, Ben; Freeman, Robin; Kirk, Holly; Leonard, Kerry; Phillips, Richard A.; Perrins, Chris M.; Guilford, Tim
2013-01-01
The use of miniature data loggers is rapidly increasing our understanding of the movements and habitat preferences of pelagic seabirds. However, objectively interpreting behavioural information from the large volumes of highly detailed data collected by such devices can be challenging. We combined three biologging technologies—global positioning system (GPS), saltwater immersion and time–depth recorders—to build a detailed picture of the at-sea behaviour of the Manx shearwater (Puffinus puffinus) during the breeding season. We used a hidden Markov model to explore discrete states within the combined GPS and immersion data, and found that behaviour could be organized into three principal activities representing (i) sustained direct flight, (ii) sitting on the sea surface, and (iii) foraging, comprising tortuous flight interspersed with periods of immersion. The additional logger data verified that the foraging activity corresponded well to the occurrence of diving. Applying this approach to a large tracking dataset revealed that birds from two different colonies foraged in local waters that were exclusive, but overlapped in one key area: the Irish Sea Front (ISF). We show that the allocation of time to each activity differed between colonies, with birds breeding furthest from the ISF spending the greatest proportion of time engaged in direct flight and the smallest proportion of time engaged in foraging activity. This type of analysis has considerable potential for application in future biologging studies and in other taxa. PMID:23034356
Miyatake, Satoko; Koshimizu, Eriko; Fujita, Atsushi; Fukai, Ryoko; Imagawa, Eri; Ohba, Chihiro; Kuki, Ichiro; Nukui, Megumi; Araki, Atsushi; Makita, Yoshio; Ogata, Tsutomu; Nakashima, Mitsuko; Tsurusaki, Yoshinori; Miyake, Noriko; Saitsu, Hirotomo; Matsumoto, Naomichi
2015-04-01
Whole-exome sequencing (WES) is becoming a standard tool for detecting nucleotide changes, and determining whether WES data can be used for the detection of copy-number variations (CNVs) is of interest. To date, several algorithms have been developed for such analyses, although verification is needed to establish if they fit well for the appropriate purpose, depending on the characteristics of each algorithm. Here, we performed WES CNV analysis using the eXome Hidden Markov Model (XHMM). We validated its performance using 27 rare CNVs previously identified by microarray as positive controls, finding a detection rate of 59% overall, rising to 89% for CNVs spanning three or more targets. XHMM can be effectively used, especially for the detection of >200 kb CNVs. XHMM may be useful for deletion breakpoint detection. Next, we applied XHMM to genetically unsolved patients, demonstrating successful identification of pathogenic CNVs: 1.5-1.9-Mb deletions involving NSD1 in patients with unknown overgrowth syndrome, leading to the diagnosis of Sotos syndrome, and a 6.4-Mb duplication involving MECP2 in affected brothers with late-onset spasm and progressive cerebral/cerebellar atrophy, confirming the clinical suspicion of MECP2 duplication syndrome. The possibility of an 'exome-first' approach to clinical genetic investigation may be considered to save the cost of multiple investigations. PMID:25608832
Ru, Jingyu; Wu, Chengdong; Jia, Zixi; Yang, Yufang; Zhang, Yunzhou; Hu, Nan
2015-01-01
Localization as a technique to solve the complex and challenging problems besetting line-of-sight (LOS) and non-line-of-sight (NLOS) transmissions has recently attracted considerable attention in the wireless sensor network field. This paper proposes a strategy for eliminating NLOS localization errors during calculation of the location of mobile terminals (MTs) in unfamiliar indoor environments. To improve the hidden Markov model (HMM), we propose two modified algorithms, namely, the modified HMM (M-HMM) and the replacement modified HMM (RM-HMM). Further, a hybrid localization algorithm that combines the HMM with an interacting multiple model (IMM) is proposed to represent the velocity of mobile nodes. This velocity model is divided into a high-speed and a low-speed model, meaning the nodes move at different speeds while following the same mobility pattern. Each moving node continually switches its state based on its probability. Consequently, to improve precision, each moving node uses the IMM model to integrate the results from the HMM and its modified forms. Simulation experiments show that our proposed algorithms perform well in both distance estimation and coordinate calculation, with localization accuracy increasing in the order M-HMM, RM-HMM, HMM + IMM. The simulations also show that the three algorithms are accurate, stable, and robust. PMID:26091395

NASA Astrophysics Data System (ADS)
Jeon, Wonju; Kang, Seung-Ho; Leem, Joo-Baek; Lee, Sang-Hee
2013-05-01
Fish that swim in schools benefit from increased vigilance, and improved predator recognition and assessment. Fish school size varies according to species and environmental conditions. In this study, we present a Hidden Markov Model (HMM) that we use to characterize fish schooling behavior in different sized schools, and explore how school size affects schooling behavior. We recorded the schooling behavior of Medaka (Oryzias latipes) and goldfish (Carassius auratus) using different numbers of individual fish (10-40), in a circular aquarium. Eight to ten 3 s video clips were extracted from the recordings for each group size. Schooling behavior was characterized by three variables: linear speed, angular speed, and Pearson coefficient. The values of the variables were categorized into two events each for linear and angular speed (high and low), and three events for the Pearson coefficient (high, medium, and low). Schooling behavior was then described as a sequence of 12 events (2×2×3), which was input to an HMM as data for training the model. Comparisons of model output with observations of actual schooling behavior demonstrated that the HMM was successful in characterizing fish schooling behavior. We briefly discuss possible applications of the HMM for recognition of fish species in a school, and for developing bio-monitoring systems to determine water quality.
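The 2×2×3 symbol encoding described above can be sketched as a simple mapping from the three schooling variables to one of 12 discrete events. The threshold values below are hypothetical placeholders, not those used in the study.

```python
def encode_symbol(linear_speed, angular_speed, pearson,
                  lin_thresh=1.0, ang_thresh=1.0, p_low=0.3, p_high=0.7):
    """Map (linear speed, angular speed, Pearson coefficient) to one of
    12 = 2 x 2 x 3 discrete symbols.  Thresholds are illustrative only."""
    lin = 1 if linear_speed >= lin_thresh else 0       # high / low
    ang = 1 if angular_speed >= ang_thresh else 0      # high / low
    if pearson >= p_high:
        p = 2                                          # high
    elif pearson >= p_low:
        p = 1                                          # medium
    else:
        p = 0                                          # low
    return lin * 6 + ang * 3 + p                       # symbol index 0..11
```

A time series of such symbols is then the discrete observation sequence used to train the HMM.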
NASA Astrophysics Data System (ADS)
Power, Sarah D.; Falk, Tiago H.; Chau, Tom
2010-04-01
Near-infrared spectroscopy (NIRS) has recently been investigated as a non-invasive brain-computer interface (BCI). In particular, previous research has shown that NIRS signals recorded from the motor cortex during left- and right-hand imagery can be distinguished, providing a basis for a two-choice NIRS-BCI. In this study, we investigated the feasibility of an alternative two-choice NIRS-BCI paradigm based on the classification of prefrontal activity due to two cognitive tasks, specifically mental arithmetic and music imagery. Deploying a dual-wavelength frequency domain near-infrared spectrometer, we interrogated nine sites around the frontopolar locations (International 10-20 System) while ten able-bodied adults performed mental arithmetic and music imagery within a synchronous shape-matching paradigm. With the 18 filtered AC signals, we created task- and subject-specific maximum likelihood classifiers using hidden Markov models. Mental arithmetic and music imagery were classified with an average accuracy of 77.2% ± 7.0 across participants, with all participants significantly exceeding chance accuracies. The results suggest the potential of a two-choice NIRS-BCI based on cognitive rather than motor tasks.
Ioannidou, Zoi S.; Theodoropoulou, Margarita C.; Papandreou, Nikos C.; Willis, Judith H.; Hamodrakas, Stavros J.
2014-01-01
The arthropod cuticle is a composite, bipartite system, made of chitin filaments embedded in a proteinaceous matrix. The physical properties of cuticle are determined by the structure and the interactions of its two major components, cuticular proteins (CPs) and chitin. The proteinaceous matrix consists mainly of structural cuticular proteins. The majority of the structural proteins that have been described to date belong to the CPR family, and they are identified by the conserved R&R region (Rebers and Riddiford Consensus). Two major subfamilies of the CPR family, RR-1 and RR-2, have also been identified from conservation at the sequence level and some correlation with the cuticle type. Recently, several novel families, also containing characteristic conserved regions, have been described. The package HMMER v3.0 [http://hmmer.janelia.org/] was used to build characteristic profile Hidden Markov Models based on the characteristic regions for 8 of these families (CPF, CPAP3, CPAP1, CPCFC, CPLCA, CPLCG, CPLCW, Tweedle). In brief, these families can be described as having: CPF (a conserved region with 44 amino acids); CPAP1 and CPAP3 (analogous to peritrophins, with 1 and 3 chitin-binding domains, respectively); CPCFC (2 or 3 C-x(5)-C repeats); and four of the five low complexity (LC) families, each with characteristic domains. Using these models, as well as the models previously created for the two major subfamilies of the CPR family, RR-1 and RR-2 (Karouzou et al., 2007), we developed CutProtFam-Pred, an on-line tool (http://bioinformatics.biol.uoa.gr/CutProtFam-Pred) that allows one to query sequences from proteomes or translated transcriptomes for the accurate detection and classification of putative structural cuticular proteins. The tool has been applied successfully to diverse arthropod proteomes including a crustacean (Daphnia pulex) and a chelicerate (Tetranychus urticae), but at this taxonomic distance only CPRs and CPAPs were recovered. PMID:24978609
Yamamoto, Toshiyuki; Shimojima, Keiko; Ondo, Yumiko; Imai, Katsumi; Chong, Pin Fee; Kira, Ryutaro; Amemiya, Mitsuhiro; Saito, Akira; Okamoto, Nobuhiko
2016-01-01
Next-generation sequencing (NGS) is widely used for the detection of disease-causing nucleotide variants. The challenges associated with detecting copy number variants (CNVs) using NGS analysis have been reported previously. Disease-related exome panels such as Illumina TruSight One are more cost-effective than whole-exome sequencing (WES) because of their selective target regions (~21% of the WES). In this study, CNVs were analyzed using data extracted through a disease-related exome panel analysis and the eXome Hidden Markov Model (XHMM). Samples from 61 patients with undiagnosed developmental delays and 52 healthy parents were included in this study. In the preliminary study to validate the constructed XHMM system (microarray-first approach), 34 patients who had previously been analyzed by chromosomal microarray testing were used. Among the five CNVs larger than 200 kb that were considered non-pathogenic and were used as positive controls, four CNVs were successfully detected. The system was subsequently used to analyze different samples from 27 patients (NGS-first approach); 2 of these patients were successfully diagnosed as having pathogenic CNVs (an unbalanced translocation der(5)t(5;14) and a 16p11.2 duplication). These diagnoses were re-confirmed by chromosomal microarray testing and/or fluorescence in situ hybridization. The NGS-first approach generated no false-negative or false-positive results for pathogenic CNVs, indicating its high sensitivity and specificity in detecting pathogenic CNVs. The results of this study show the possible clinical utility of pathogenic CNV screening using disease-related exome panel analysis and XHMM. PMID:27579173
El Yazid Boudaren, Mohamed; Monfrini, Emmanuel; Pieczynski, Wojciech; Aïssani, Amar
2014-11-01
Hidden Markov chains have been shown to be inadequate for data modeling under some complex conditions. In this work, we address the problem of statistical modeling of phenomena involving two heterogeneous system states. Such phenomena may arise in biology or communications, among other fields. Namely, we consider that a sequence of meaningful words is to be searched within a whole observation that also contains arbitrary one-by-one symbols. Moreover, a word may be interrupted at some site to be carried on later. Applying plain hidden Markov chains to such data, while ignoring their specificity, yields unsatisfactory results. The Phasic triplet Markov chain, proposed in this paper, overcomes this difficulty by means of an auxiliary underlying process in accordance with the triplet Markov chains theory. Related Bayesian restoration techniques and parameter estimation procedures according to the new model are then described. Finally, to assess the performance of the proposed model against the conventional hidden Markov chain model, experiments are conducted on synthetic and real data. PMID:26353069
NASA Astrophysics Data System (ADS)
Chen, Jinsong; Hubbard, Susan S.; Williams, Kenneth H.
2013-10-01
Although mechanistic reaction networks have been developed to quantify the biogeochemical evolution of subsurface systems associated with bioremediation, it is difficult in practice to quantify the onset and distribution of these transitions at the field scale using commonly collected wellbore datasets. As an alternative approach to the mechanistic methods, we develop a data-driven, statistical model to identify biogeochemical transitions using various time-lapse aqueous geochemical data (e.g., Fe(II), sulfate, sulfide, acetate, and uranium concentrations) and induced polarization (IP) data. We assume that the biogeochemical transitions can be classified as several dominant states that correspond to redox transitions and test the method at a uranium-contaminated site. The relationships between the geophysical observations and geochemical time series vary depending upon the unknown underlying redox status, which is modeled as a hidden Markov random field. We estimate unknown parameters by maximizing the joint likelihood function using the expectation-maximization algorithm. The case study results show that, when considered together, aqueous geochemical data and IP imaginary conductivity provide a key diagnostic signature of biogeochemical stages. The developed method provides useful information for evaluating the effectiveness of bioremediation, such as the probability of being in specific redox stages following biostimulation where desirable pathways (e.g., uranium removal) are more highly favored. The use of geophysical data in the approach advances the possibility of using noninvasive methods to monitor critical biogeochemical system stages and transitions remotely and over field relevant scales (e.g., from square meters to several hectares).
Entropy Computation in Partially Observed Markov Chains
NASA Astrophysics Data System (ADS)
Desbouvries, François
2006-11-01
Let X = {X_n}_{n∈ℕ} be a hidden process and Y = {Y_n}_{n∈ℕ} be an observed process. We assume that (X, Y) is a (pairwise) Markov Chain (PMC). PMC are more general than Hidden Markov Chains (HMC) and yet enable the development of efficient parameter estimation and Bayesian restoration algorithms. In this paper we propose a fast (i.e., O(N)) algorithm for computing the entropy of {X_n}_{n=0}^{N} given an observation sequence {y_n}_{n=0}^{N}.
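For the special case of an ordinary HMC, the trajectory entropy given the observations can be computed in linear time from forward-backward quantities, because the posterior over trajectories is itself a Markov chain and the chain rule of entropy applies. The sketch below illustrates this idea on a generic HMM; it is not the paper's more general PMC algorithm, and for simplicity it assumes all model probabilities are strictly positive.

```python
import numpy as np

def posterior_entropy(A, B, pi, obs):
    """Entropy H(X_0..X_N | y_0..y_N) of the hidden trajectory of an HMM,
    in O(N K^2) time.  A: K x K transitions, B: K x M emissions, pi: initial
    distribution, obs: list of observed symbol indices.  Assumes strictly
    positive probabilities (no log-of-zero handling)."""
    N, K = len(obs), len(pi)
    alpha = np.zeros((N, K))                       # scaled forward variables
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    for n in range(1, N):
        alpha[n] = (alpha[n - 1] @ A) * B[:, obs[n]]
        alpha[n] /= alpha[n].sum()
    beta = np.ones((N, K))                         # scaled backward variables
    for n in range(N - 2, -1, -1):
        beta[n] = A @ (B[:, obs[n + 1]] * beta[n + 1])
        beta[n] /= beta[n].sum()
    gamma0 = alpha[0] * beta[0]
    gamma0 /= gamma0.sum()                         # p(x_0 | y)
    H = -np.sum(gamma0 * np.log(gamma0))
    for n in range(1, N):
        # Pairwise smoothed probabilities p(x_{n-1} = i, x_n = j | y).
        xi = alpha[n - 1][:, None] * A * (B[:, obs[n]] * beta[n])[None, :]
        xi /= xi.sum()
        cond = xi / xi.sum(axis=1, keepdims=True)  # p(x_n = j | x_{n-1} = i, y)
        H -= np.sum(xi * np.log(cond))             # adds H(X_n | X_{n-1}, y)
    return H
```

On small examples this agrees with the brute-force entropy computed by enumerating all K^(N+1) trajectories.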
NASA Astrophysics Data System (ADS)
Al-Ghraibah, Amani
error of approximately 3/4 of a GOES class. We also consider thresholding the regressed flare size for the experiment containing both flaring and non-flaring regions and find a TPR of 0.69 and a TNR of 0.86 for flare prediction, consistent with our previous studies of flare prediction using the same magnetic complexity features. The results for both of these size regression experiments are consistent across a wide range of predictive time windows, indicating that the magnetic complexity features may be persistent in appearance long before flare activity. This conjecture is supported by our larger error rates of some 40 hours in the time-to-flare regression problem. The magnetic complexity features considered here appear to have discriminative potential for flare size, but their persistence in time makes them less discriminative for the time-to-flare problem. We also study the prediction of solar flare size and time-to-flare using two temporal features, namely the Δ- and Δ-Δ-features; the same average size and time-to-flare regression errors are found when these temporal features are used in size and time-to-flare prediction. In the third topic, we study the temporal evolution of active region magnetic fields using Hidden Markov Models (HMMs), one of the established temporal analysis methods in the literature. We extracted 38 features describing the complexity of the photospheric magnetic field. These features are converted into a sequence of symbols using a k-nearest neighbor search. Before prediction, we study several parameters, such as the length of the training window W_train, which denotes the number of history images used to train the flare and non-flare HMMs, and the number of hidden states Q. In the training phase, the model parameters of the HMM of each category are optimized so as to best describe the training symbol sequences. In the testing phase, we use the best flare and non-flare models to predict/classify active regions as a flaring or non-flaring region
Kyzer, S; Gordon, P H
1993-08-01
For the patient with an unresectable carcinoma of the rectum, establishment of a "hidden" colostomy rather than formal colostomy, provides a better interim quality of life. When necessary, the "hidden" colostomy can readily be converted to a formal colostomy without the need for a laparotomy or general anesthetic. We conclude that surgeons should remember this technique when the appropriate situation occurs. PMID:7688147
On multitarget pairwise-Markov models
NASA Astrophysics Data System (ADS)
Mahler, Ronald
2015-05-01
Single- and multi-target tracking are both typically based on strong independence assumptions regarding both the target states and sensor measurements. In particular, both are theoretically based on the hidden Markov chain (HMC) model. That is, the target process is a Markov chain that is observed by an independent observation process. Since HMC assumptions are invalid in many practical applications, the pairwise Markov chain (PMC) model has been proposed as a way to weaken those assumptions. In this paper it is shown that the PMC model can be directly generalized to multitarget problems. Since the resulting tracking filters are computationally intractable, the paper investigates generalizations of the cardinalized probability hypothesis density (CPHD) filter to applications with PMC models.
Semi-Markov Arnason-Schwarz models.
King, Ruth; Langrock, Roland
2016-06-01
We consider multi-state capture-recapture-recovery data where observed individuals are recorded in a set of possible discrete states. Traditionally, the Arnason-Schwarz model has been fitted to such data where the state process is modeled as a first-order Markov chain, though second-order models have also been proposed and fitted to data. However, low-order Markov models may not accurately represent the underlying biology. For example, specifying a (time-independent) first-order Markov process involves the assumption that the dwell time in each state (i.e., the duration of a stay in a given state) has a geometric distribution, and hence that the modal dwell time is one. Specifying time-dependent or higher-order processes provides additional flexibility, but at the expense of a potentially significant number of additional model parameters. We extend the Arnason-Schwarz model by specifying a semi-Markov model for the state process, where the dwell-time distribution is specified more generally, using, for example, a shifted Poisson or negative binomial distribution. A state expansion technique is applied in order to represent the resulting semi-Markov Arnason-Schwarz model in terms of a simpler and computationally tractable hidden Markov model. Semi-Markov Arnason-Schwarz models come with only a very modest increase in the number of parameters, yet permit a significantly more flexible state process. Model selection can be performed using standard procedures, and in particular via the use of information criteria. The semi-Markov approach allows for important biological inference to be drawn on the underlying state process, for example, on the times spent in the different states. The feasibility of the approach is demonstrated in a simulation study, before being applied to real data corresponding to house finches where the states correspond to the presence or absence of conjunctivitis. PMID:26584064
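The dwell-time point above is easy to verify numerically: a geometric dwell-time pmf is strictly decreasing, so its mode is always 1, while a shifted Poisson can peak at longer stays. A minimal sketch with illustrative parameter values (not fitted to any data):

```python
import math

def geometric_dwell_pmf(self_trans, d):
    """P(dwell time = d) for a time-independent first-order Markov state
    with self-transition probability `self_trans` (d = 1, 2, ...)."""
    return (1 - self_trans) * self_trans ** (d - 1)

def shifted_poisson_dwell_pmf(lam, d):
    """Shifted Poisson dwell time: d - 1 ~ Poisson(lam), so d >= 1."""
    return math.exp(-lam) * lam ** (d - 1) / math.factorial(d - 1)

# The geometric pmf is maximal at d = 1 for any self-transition probability,
# whereas a shifted Poisson with lam = 5 peaks around d = 5 or 6 — hence the
# extra flexibility of the semi-Markov dwell-time specification.
```

This is exactly the modal-dwell-time restriction that the semi-Markov Arnason-Schwarz model relaxes.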
Stochastic thermodynamics of hidden pumps
NASA Astrophysics Data System (ADS)
Esposito, Massimiliano; Parrondo, Juan M. R.
2015-05-01
We show that a reversible pumping mechanism operating between two states of a kinetic network can give rise to Poisson transitions between these two states. An external observer, for whom the pumping mechanism is not accessible, will observe a Markov chain satisfying local detailed balance with an emerging effective force induced by the hidden pump. Due to the reversibility of the pump, the actual entropy production turns out to be lower than the coarse-grained entropy production estimated from the flows and affinities of the resulting Markov chain. Moreover, in the presence of a large time-scale separation between the fast pumping dynamics and the slow network dynamics, a finite current with zero dissipation may be produced. We make use of these general results to build a synthetase-like kinetic scheme able to reversibly produce high free-energy molecules at a finite rate and a rotatory motor achieving 100% efficiency at finite speed.
Detecting targets hidden in random forests
NASA Astrophysics Data System (ADS)
Kouritzin, Michael A.; Luo, Dandan; Newton, Fraser; Wu, Biao
2009-05-01
Military tanks, cargo or troop carriers, missile carriers, and rocket launchers often hide from detection in forests, which complicates the problem of locating such hidden targets. An electro-optic camera mounted on a surveillance aircraft or unmanned aerial vehicle captures images of forests with possible hidden targets, e.g., rocket launchers. We consider random forests with longitudinal and latitudinal correlations. Specifically, foliage coverage is encoded with a binary representation (i.e., foliage or no foliage), and is correlated in adjacent regions. We address the detection of camouflaged targets hidden in random forests by building memory into the observations. In particular, we propose an efficient algorithm to generate random forests, ground, and camouflage of hidden targets with two-dimensional correlations. The observations are a sequence of snapshots consisting of foliage-obscured ground or target. Theoretically, detection is possible because there are subtle differences in the correlations of the ground and the camouflage of the rocket launcher. However, these differences are well beyond human perception. To detect the presence of hidden targets automatically, we develop a Markov representation for these sequences and modify the classical filtering equations to allow the Markov chain observation. Particle filters are used to estimate the position of the targets in combination with a novel random weighting technique. Finally, we present positive proof-of-concept simulations.
Variational Infinite Hidden Conditional Random Fields.
Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja; Ghahramani, Zoubin
2015-09-01
Hidden conditional random fields (HCRFs) are discriminative latent variable models which have been shown to successfully learn the hidden structure of a given classification problem. An infinite hidden conditional random field is an HCRF with a countably infinite number of hidden states, which rids us not only of the need to specify a fixed number of available hidden states a priori, but also of the problem of overfitting. Markov chain Monte Carlo (MCMC) sampling algorithms are often employed for inference in such models. However, convergence of such algorithms is rather difficult to verify, and as the complexity of the task at hand increases, the computational cost of such algorithms often becomes prohibitive. These limitations can be overcome by variational techniques. In this paper, we present a generalized framework for infinite HCRF models, and a novel variational inference approach on a model based on coupled Dirichlet Process Mixtures, the HCRF-DPM. We show that the variational HCRF-DPM is able to converge to a correct number of represented hidden states, and performs as well as the best parametric HCRFs, chosen via cross-validation, for the difficult tasks of recognizing instances of agreement, disagreement, and pain in audiovisual sequences. PMID:26353136
Stein, R.S.; Yeats, R.S.
1989-06-01
Seismologists generally look for earthquakes to happen along visible fault lines, e.g., the San Andreas fault. The authors maintain that another source of dangerous quakes has been overlooked: the release of stress along a fault that is hidden under a fold in the earth's crust. The paper describes the differences between an earthquake that occurs on a visible fault and one that occurs under an anticline, and warns that Los Angeles' greatest earthquake threat may come from a small quake originating under downtown Los Angeles rather than from a larger earthquake occurring 50 miles away at the San Andreas fault.
NASA Astrophysics Data System (ADS)
Ji, C.-R.
2014-10-01
With the acceptance of QCD as the fundamental theory of strong interactions, one of the basic problems in the analysis of nuclear phenomena became how to consistently account for the effects of the underlying quark/gluon structure of nucleons and nuclei. Besides providing more detailed understanding of conventional nuclear physics, QCD may also point to novel phenomena accessible by new or upgraded nuclear experimental facilities. We discuss a few interesting applications of QCD to nuclear physics with an emphasis on the hidden color degrees of freedom.
Bayesian Smoothing Algorithms in Partially Observed Markov Chains
NASA Astrophysics Data System (ADS)
Ait-el-Fquih, Boujemaa; Desbouvries, François
2006-11-01
Let x = {x_n}_{n∈ℕ} be a hidden process, y = {y_n}_{n∈ℕ} an observed process and r = {r_n}_{n∈ℕ} some auxiliary process. We assume that t = {t_n}_{n∈ℕ} with t_n = (x_n, r_n, y_{n-1}) is a (Triplet) Markov Chain (TMC). TMC are more general than Hidden Markov Chains (HMC) and yet enable the development of efficient restoration and parameter estimation algorithms. This paper is devoted to Bayesian smoothing algorithms for TMC. We first propose twelve algorithms for general TMC. In the Gaussian case, these smoothers reduce to a set of algorithms which include, among other solutions, extensions to TMC of classical Kalman-like smoothing algorithms (originally designed for HMC) such as the RTS algorithms, the Two-Filter algorithms or the Bryson and Frazier algorithm.
Assessment of optimized Markov models in protein fold classification.
Lampros, Christos; Simos, Thomas; Exarchos, Themis P; Exarchos, Konstantinos P; Papaloukas, Costas; Fotiadis, Dimitrios I
2014-08-01
Protein fold classification is a challenging task strongly associated with the determination of proteins' structure. In this work, we tested an optimization strategy on a Markov chain and a recently introduced Hidden Markov Model (HMM) with reduced state-space topology. The proteins with unknown structure were scored against both these models. Then the derived scores were optimized following a local optimization method. The Protein Data Bank (PDB) and the annotation of the Structural Classification of Proteins (SCOP) database were used for the evaluation of the proposed methodology. The results demonstrated that the fold classification accuracy of the optimized HMM was substantially higher than that of the Markov chain or the reduced state-space HMM approaches. The proposed methodology achieved an accuracy of 41.4% on fold classification, while Sequence Alignment and Modeling (SAM), which was used for comparison, reached an accuracy of 38%. PMID:25152041
Hidden Markov models for estimating animal mortality from anthropogenic hazards
Carcasses searches are a common method for studying the risk of anthropogenic hazards to wildlife, including non-target poisoning and collisions with anthropogenic structures. Typically, numbers of carcasses found must be corrected for scavenging rates and imperfect detection. ...
Modelling proteins' hidden conformations to predict antibiotic resistance
NASA Astrophysics Data System (ADS)
Hart, Kathryn M.; Ho, Chris M. W.; Dutta, Supratik; Gross, Michael L.; Bowman, Gregory R.
2016-10-01
TEM β-lactamase confers bacteria with resistance to many antibiotics and rapidly evolves activity against new drugs. However, functional changes are not easily explained by differences in crystal structures. We employ Markov state models to identify hidden conformations and explore their role in determining TEM's specificity. We integrate these models with existing drug-design tools to create a new technique, called Boltzmann docking, which better predicts TEM specificity by accounting for conformational heterogeneity. Using our MSMs, we identify hidden states whose populations correlate with activity against cefotaxime. To experimentally detect our predicted hidden states, we use rapid mass spectrometric footprinting and confirm our models' prediction that increased cefotaxime activity correlates with reduced Ω-loop flexibility. Finally, we design novel variants to stabilize the hidden cefotaximase states, and find their populations predict activity against cefotaxime in vitro and in vivo. Therefore, we expect this framework to have numerous applications in drug and protein design.
NASA Technical Reports Server (NTRS)
Smith, R. M.
1991-01-01
Numerous applications in the area of computer system analysis can be effectively studied with Markov reward models. These models describe the behavior of the system with a continuous-time Markov chain, where a reward rate is associated with each state. In a reliability/availability model, up states may have reward rate 1 and down states reward rate 0. In a queueing model, the number of jobs of a certain type in a given state may be the reward rate attached to that state. In a combined model of performance and reliability, the reward rate of a state may be the computational capacity, or a related performance measure. Expected steady-state reward rate and expected instantaneous reward rate are clearly useful measures of the Markov reward model. More generally, the distribution of accumulated reward or time-averaged reward over a finite time interval may be determined from the solution of the Markov reward model. This information is of great practical significance in situations where the workload can be well characterized (deterministically, or by continuous functions, e.g., distributions). The design process in the development of a computer system is an expensive and long term endeavor. For aerospace applications the reliability of the computer system is essential, as is the ability to complete critical workloads in a well defined real time interval. Consequently, effective modeling of such systems must take into account both performance and reliability. This fact motivates our use of Markov reward models to aid in the development and evaluation of fault tolerant computer systems.
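The expected steady-state reward rate can be computed directly from the generator of the continuous-time Markov chain: solve πQ = 0 with Σπ = 1 and take the reward-weighted average. A minimal sketch using an illustrative two-state availability model (failure rate 0.01, repair rate 1; these numbers are placeholders, not from the report):

```python
import numpy as np

def steady_state_reward(Q, rewards):
    """Expected steady-state reward rate of a continuous-time Markov chain
    with generator Q: solve pi Q = 0, sum(pi) = 1, return pi . rewards."""
    n = Q.shape[0]
    # Replace one balance equation with the normalization constraint.
    A = np.vstack([Q.T[:-1], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    pi = np.linalg.solve(A, b)
    return pi @ rewards
```

For the two-state example, reward rate 1 in the up state and 0 in the down state makes the expected steady-state reward the steady-state availability, μ/(λ+μ).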
Hidden Markov Models and Neural Networks for Fault Detection in Dynamic Systems
NASA Technical Reports Server (NTRS)
Smyth, Padhraic
1994-01-01
None given. (From conclusion): Neural networks combined with hidden Markov models (HMMs) can provide excellent detection and false-alarm performance in fault detection applications. Modified models allow for novelty detection. The conclusion also covers key contributions of the neural network model and the status of its applications.
Generator estimation of Markov jump processes
NASA Astrophysics Data System (ADS)
Metzner, P.; Dittmer, E.; Jahnke, T.; Schütte, Ch.
2007-11-01
Estimating the generator of a continuous-time Markov jump process based on incomplete data is a problem which arises in various applications ranging from machine learning to molecular dynamics. Several methods have been devised for this purpose: a quadratic programming approach (cf. [D.T. Crommelin, E. Vanden-Eijnden, Fitting timeseries by continuous-time Markov chains: a quadratic programming approach, J. Comp. Phys. 217 (2006) 782-805]), a resolvent method (cf. [T. Müller, Modellierung von Proteinevolution, PhD thesis, Heidelberg, 2001]), and various implementations of an expectation-maximization algorithm ([S. Asmussen, O. Nerman, M. Olsson, Fitting phase-type distributions via the EM algorithm, Scand. J. Stat. 23 (1996) 419-441; I. Holmes, G.M. Rubin, An expectation maximization algorithm for training hidden substitution models, J. Mol. Biol. 317 (2002) 753-764; U. Nodelman, C.R. Shelton, D. Koller, Expectation maximization and complex duration distributions for continuous time Bayesian networks, in: Proceedings of the twenty-first conference on uncertainty in AI (UAI), 2005, pp. 421-430; M. Bladt, M. Sørensen, Statistical inference for discretely observed Markov jump processes, J.R. Statist. Soc. B 67 (2005) 395-410]). Some of these methods, however, seem to be known only in a particular research community, and have later been reinvented in a different context. The purpose of this paper is to compile a catalogue of existing approaches, to compare the strengths and weaknesses, and to test their performance in a series of numerical examples. These examples include carefully chosen model problems and an application to a time series from molecular dynamics.
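For contrast with the incomplete-data methods catalogued above, the fully observed case has a closed-form maximum-likelihood estimator: the estimated rate from i to j is the number of observed i→j jumps divided by the total time spent in i. A hedged sketch; the generator below is invented and is not one of the paper's benchmark problems:

```python
import numpy as np

# Complete-data maximum-likelihood generator estimate: G_ij = N_ij / R_i,
# where N_ij counts observed i -> j jumps and R_i is the total holding time
# in state i. (The surveyed methods address the harder incomplete-data case.)
rng = np.random.default_rng(0)
G_true = np.array([[-1.0,  0.7,  0.3],
                   [ 0.4, -0.9,  0.5],
                   [ 0.2,  0.8, -1.0]])
n_states = 3

# Simulate a fully observed trajectory (Gillespie-style sampling).
N = np.zeros((n_states, n_states))   # jump counts
R = np.zeros(n_states)               # holding times
state = 0
for _ in range(20_000):
    rate = -G_true[state, state]
    R[state] += rng.exponential(1.0 / rate)
    p = G_true[state].clip(min=0) / rate   # jump distribution over targets
    nxt = rng.choice(n_states, p=p / p.sum())
    N[state, nxt] += 1
    state = nxt

G_hat = N / R[:, None]                       # off-diagonal rate estimates
np.fill_diagonal(G_hat, -G_hat.sum(axis=1))  # generator rows sum to zero
print(np.round(G_hat, 2))
```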
NASA Astrophysics Data System (ADS)
Volchenkov, Dima; Dawin, Jean René
A system for using dice to compose music randomly is known as the musical dice game. The discrete-time MIDI models of 804 pieces of classical music written by 29 composers have been encoded into transition matrices and studied by means of Markov chains. Contrary to human languages, entropy dominates over redundancy in the musical dice games based on compositions of classical music. The maximum complexity is achieved on blocks consisting of just a few notes (8 notes for the musical dice games generated over Bach's compositions). First-passage times to notes can be used to resolve tonality and to characterize a composer.
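The first-passage times mentioned at the end can be computed from a transition matrix by solving a small linear system: with target note t, m_t = 0 and m_i = 1 + Σ_{j≠t} P_ij m_j. A sketch with an invented three-"note" matrix, not taken from the MIDI corpus:

```python
import numpy as np

# Mean first-passage times to a target state of a Markov chain: drop the
# target's row and column and solve (I - P_del) m = 1.
# The 3x3 transition matrix below is illustrative only.
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

def mean_first_passage(P, target):
    n = P.shape[0]
    keep = [i for i in range(n) if i != target]
    A = np.eye(n - 1) - P[np.ix_(keep, keep)]
    m_sub = np.linalg.solve(A, np.ones(n - 1))
    m = np.zeros(n)            # m[target] stays 0 by convention
    m[keep] = m_sub
    return m

m = mean_first_passage(P, target=0)
print(np.round(m, 3))
```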
Fuzzy Markov random fields versus chains for multispectral image segmentation.
Salzenstein, Fabien; Collet, Christophe
2006-11-01
This paper deals with a comparison of recent statistical models based on fuzzy Markov random fields and chains for multispectral image segmentation. The fuzzy scheme takes into account discrete and continuous classes which model the imprecision of the hidden data. In this framework, we assume the dependence between bands and we express the general model for the covariance matrix. A fuzzy Markov chain model is developed in an unsupervised way. This method is compared with the fuzzy Markovian field model previously proposed by one of the authors. The segmentation task is processed with Bayesian tools, such as the well-known MPM (Mode of Posterior Marginals) criterion. Our goal is to compare the robustness and rapidity for both methods (fuzzy Markov fields versus fuzzy Markov chains). Indeed, such fuzzy-based procedures seem to be a good answer, e.g., for astronomical observations when the patterns present diffuse structures. Moreover, these approaches allow us to process missing data in one or several spectral bands which correspond to specific situations in astronomy. To validate both models, we perform and compare the segmentation on synthetic images and raw multispectral astronomical data. PMID:17063681
Constructing 1/ω^α noise from reversible Markov chains
NASA Astrophysics Data System (ADS)
Erland, Sveinung; Greenwood, Priscilla E.
2007-09-01
This paper gives sufficient conditions for the output of 1/ω^α noise from reversible Markov chains on finite state spaces. We construct several examples exhibiting this behavior in a specified range of frequencies. We apply simple representations of the covariance function and the spectral density in terms of the eigendecomposition of the probability transition matrix. The results extend to hidden Markov chains. We generalize the results for aggregations of AR(1) processes of C. W. J. Granger [J. Econometrics 14, 227 (1980)]. Given the eigenvalue function, there is a variety of ways to assign values to the states such that the 1/ω^α condition is satisfied. We show that a random walk on a certain state space is complementary to the point process model of 1/ω noise of B. Kaulakys and T. Meskauskas [Phys. Rev. E 58, 7013 (1998)]. Passing to a continuous state space, we construct 1/ω^α noise which also has a long memory.
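The eigendecomposition representation of the covariance function can be illustrated on a toy reversible chain: symmetrizing P by the stationary distribution yields eigenvalues λ_k and spectral weights c_k with C(t) = Σ_k c_k λ_k^t, where the λ = 1 term cancels against the squared mean. The chain and observable below are invented, and this only checks the identity, not the 1/ω^α construction itself:

```python
import numpy as np

# Check the spectral form of the autocovariance of f(X_t) for a reversible
# (birth-death) chain against direct computation via matrix powers.
P = np.array([[0.6, 0.4, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.6, 0.4]])
pi = np.array([0.25, 0.5, 0.25])   # stationary; satisfies detailed balance
f = np.array([1.0, -0.5, 2.0])     # arbitrary observable on the states
mean = pi @ f

# S = D^(1/2) P D^(-1/2) is symmetric under detailed balance.
d = np.sqrt(pi)
S = d[:, None] * P / d[None, :]
lam, U = np.linalg.eigh(S)
c = (U.T @ (d * f)) ** 2           # spectral weights

def cov_spectral(t):
    return c @ lam**t - mean**2

def cov_direct(t):
    Pt = np.linalg.matrix_power(P, t)
    return (pi * f) @ Pt @ f - mean**2

print(round(cov_spectral(1), 6))
```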
On Factor Maps that Send Markov Measures to Gibbs Measures
NASA Astrophysics Data System (ADS)
Yoo, Jisang
2010-12-01
Let X and Y be mixing shifts of finite type. Let π be a factor map from X to Y that is fiber-mixing, i.e., given x, x̄ ∈ X with π(x) = π(x̄) = y ∈ Y, there is z ∈ π⁻¹(y) that is left asymptotic to x and right asymptotic to x̄. We show that any Markov measure on X projects to a Gibbs measure on Y under π (for a Hölder continuous potential). In other words, all hidden Markov chains (i.e., sofic measures) realized by π are Gibbs measures. In 2003, Chazottes and Ugalde gave a sufficient condition for a sofic measure to be a Gibbs measure. Our sufficient condition generalizes their condition and is invariant under conjugacy and time reversal. We provide examples demonstrating our result.
Predictive Rate-Distortion for Infinite-Order Markov Processes
NASA Astrophysics Data System (ADS)
Marzen, Sarah E.; Crutchfield, James P.
2016-06-01
Predictive rate-distortion analysis suffers from the curse of dimensionality: clustering arbitrarily long pasts to retain information about arbitrarily long futures requires resources that typically grow exponentially with length. The challenge is compounded for infinite-order Markov processes, since conditioning on finite sequences cannot capture all of their past dependencies. Spectral arguments confirm a popular intuition: algorithms that cluster finite-length sequences fail dramatically when the underlying process has long-range temporal correlations and can fail even for processes generated by finite-memory hidden Markov models. We circumvent the curse of dimensionality in rate-distortion analysis of finite- and infinite-order processes by casting predictive rate-distortion objective functions in terms of the forward- and reverse-time causal states of computational mechanics. Examples demonstrate that the resulting algorithms yield substantial improvements.
Nonlocal Order Parameters for the 1D Hubbard Model
NASA Astrophysics Data System (ADS)
Montorsi, Arianna; Roncaglia, Marco
2012-12-01
We characterize the Mott-insulator and Luther-Emery phases of the 1D Hubbard model through correlators that measure the parity of spin and charge strings along the chain. These nonlocal quantities order in the corresponding gapped phases and vanish at the critical point Uc=0, thus configuring as hidden order parameters. The Mott insulator consists of bound doublon-holon pairs, which in the Luther-Emery phase turn into electron pairs with opposite spins, both unbinding at Uc. The behavior of the parity correlators is captured by an effective free spinless fermion model.
Decentralized learning in Markov games.
Vrancx, Peter; Verbeeck, Katja; Nowé, Ann
2008-08-01
Learning automata (LA) were recently shown to be valuable tools for designing multiagent reinforcement learning algorithms. One of the principal contributions of the LA theory is that a set of decentralized independent LA is able to control a finite Markov chain with unknown transition probabilities and rewards. In this paper, we propose to extend this algorithm to Markov games--a straightforward extension of single-agent Markov decision problems to distributed multiagent decision problems. We show that under the same ergodic assumptions of the original theorem, the extended algorithm will converge to a pure equilibrium point between agent policies.
Metrics for Labeled Markov Systems
NASA Technical Reports Server (NTRS)
Desharnais, Josee; Jagadeesan, Radha; Gupta, Vineet; Panangaden, Prakash
1999-01-01
Partial Labeled Markov Chains are simultaneously generalizations of process algebra and of traditional Markov chains. They provide a foundation for interacting discrete probabilistic systems, the interaction being synchronization on labels as in process algebra. Existing notions of process equivalence are too sensitive to the exact probabilities of various transitions. This paper addresses contextual reasoning principles for more robust notions of "approximate" equivalence between concurrent interacting probabilistic systems. Our results are as follows. We develop a family of metrics between partial labeled Markov chains to formalize the notion of distance between processes. We show that processes at distance zero are bisimilar. We describe a decision procedure to compute the distance between two processes. We show that reasoning about approximate equivalence can be done compositionally, since process combinators do not increase distance. Finally, we introduce an asymptotic metric to capture asymptotic properties of Markov chains, and show that parallel composition does not increase asymptotic distance.
Raberto, Marco; Rapallo, Fabio; Scalas, Enrico
2011-01-01
In this paper, we outline a model of graph (or network) dynamics based on two ingredients. The first ingredient is a Markov chain on the space of possible graphs. The second ingredient is a semi-Markov counting process of renewal type. The model consists in subordinating the Markov chain to the semi-Markov counting process. In simple words, this means that the chain transitions occur at random time instants called epochs. The model is quite rich and its possible connections with algebraic geometry are briefly discussed. Moreover, for the sake of simplicity, we focus on the space of undirected graphs with a fixed number of nodes. However, in an example, we present an interbank market model where it is meaningful to use directed graphs or even weighted graphs. PMID:21887245
Infinite hidden conditional random fields for human behavior analysis.
Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja
2013-01-01
Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF (iHCRF), which is a nonparametric model based on hierarchical Dirichlet processes and is capable of automatically learning the optimal number of hidden states for a classification task. We show how we learn the model hyperparameters with an effective Markov-chain Monte Carlo sampling technique, and we explain the process that underlies our iHCRF model with the Restaurant Franchise Rating Agencies analogy. We show that the iHCRF is able to converge to a correct number of represented hidden states, and outperforms the best finite HCRFs--chosen via cross-validation--for the difficult tasks of recognizing instances of agreement, disagreement, and pain. Moreover, the iHCRF manages to achieve this performance in significantly less total training, validation, and testing time. PMID:24808217
How hidden are hidden processes? A primer on crypticity and entropy convergence
NASA Astrophysics Data System (ADS)
Mahoney, John R.; Ellison, Christopher J.; James, Ryan G.; Crutchfield, James P.
2011-09-01
We investigate a stationary process's crypticity—a measure of the difference between its hidden state information and its observed information—using the causal states of computational mechanics. Here, we motivate crypticity and cryptic order as physically meaningful quantities that monitor how hidden a hidden process is. This is done by recasting previous results on the convergence of block entropy and block-state entropy in a geometric setting, one that is more intuitive and that leads to a number of new results. For example, we connect crypticity to how an observer synchronizes to a process. We show that the block-causal-state entropy is a convex function of block length. We give a complete analysis of spin chains. We present a classification scheme that surveys stationary processes in terms of their possible cryptic and Markov orders. We illustrate related entropy convergence behaviors using a new form of foliated information diagram. Finally, along the way, we provide a variety of interpretations of crypticity and cryptic order to establish their naturalness and pervasiveness. This is also a first step in developing applications in spatially extended and network dynamical systems. PMID:21974675
Estimating demographic parameters using hidden process dynamic models.
Gimenez, Olivier; Lebreton, Jean-Dominique; Gaillard, Jean-Michel; Choquet, Rémi; Pradel, Roger
2012-12-01
Structured population models are widely used in plant and animal demographic studies to assess population dynamics. In matrix population models, populations are described with discrete classes of individuals (age, life history stage or size). To calibrate these models, longitudinal data are collected at the individual level to estimate demographic parameters. However, several sources of uncertainty can complicate parameter estimation, such as imperfect detection of individuals inherent to monitoring in the wild and uncertainty in assigning a state to an individual. Here, we show how recent statistical models can help overcome these issues. We focus on hidden process models that run two time series in parallel, one capturing the dynamics of the true states and the other consisting of observations arising from these underlying possibly unknown states. In a first case study, we illustrate hidden Markov models with an example of how to accommodate state uncertainty using Frequentist theory and maximum likelihood estimation. In a second case study, we illustrate state-space models with an example of how to estimate lifetime reproductive success despite imperfect detection, using a Bayesian framework and Markov Chain Monte Carlo simulation. Hidden process models are a promising tool as they allow population biologists to cope with process variation while simultaneously accounting for observation error. PMID:22373775
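The hidden Markov machinery used in the first case study rests on the forward algorithm, which computes the likelihood of an observation sequence while summing over all possible hidden state sequences. A minimal sketch with invented transition and emission probabilities, not the capture-recapture parameterization of the paper:

```python
import numpy as np

# Forward algorithm: log-likelihood of an observation sequence under a
# 2-state HMM, with per-step rescaling to avoid numerical underflow.
pi0 = np.array([0.6, 0.4])        # initial hidden-state distribution
A = np.array([[0.9, 0.1],         # hidden-state transition matrix
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],         # emission matrix: B[s, o] = P(obs o | state s)
              [0.3, 0.7]])
obs = [0, 0, 1, 0, 1, 1]          # observed symbols

alpha = pi0 * B[:, obs[0]]
c = alpha.sum()
alpha /= c
log_lik = np.log(c)
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]   # propagate hidden state, weight by emission
    c = alpha.sum()
    alpha /= c
    log_lik += np.log(c)
print(round(log_lik, 4))
```

Maximum-likelihood estimation then amounts to maximizing this quantity over the model parameters.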
Hidden circuits and argumentation
NASA Astrophysics Data System (ADS)
Leinonen, Risto; Kesonen, Mikko H. P.; Hirvonen, Pekka E.
2016-11-01
Despite the relevance of DC circuits in everyday life and in schools, they have been shown to cause numerous learning difficulties at various school levels. In this article, we present a flexible method for teaching DC circuits at the lower secondary level. The method is labelled hidden circuits, and the essential idea is to hide the actual wiring of DC circuits while making their behaviour evident to pupils. Pupils are expected to work out the wiring of the circuit, which should enhance their learning of DC circuits. We present two possible ways to utilise hidden circuits in a classroom. First, they can be used to test and enhance pupils' conceptual understanding when pupils are expected to identify which of the offered circuit diagram options corresponds to the actual circuit shown. This method encourages pupils to evaluate circuits holistically rather than locally, and in doing so it reveals pupils' learning difficulties. Second, hidden circuits can be used to develop pupils' argumentation skills with the aid of an argumentation sheet that illustrates the main elements of an argument. Based on the findings of our co-operating teachers and our own experiences, hidden circuits offer a flexible and motivating way to supplement the teaching of DC circuits.
A Markov switching model for annual hydrologic time series
NASA Astrophysics Data System (ADS)
Akıntuğ, B.; Rasmussen, P. F.
2005-09-01
This paper investigates the properties of Markov switching (MS) models (also known as hidden Markov models) for generating annual time series. This type of model has been used in a number of recent studies in the water resources literature. The model considered here assumes that climate is switching between M states and that the state sequence can be described by a Markov chain. Observations are assumed to be drawn from a normal distribution whose parameters depend on the state variable. We present the stochastic properties of this class of models along with procedures for model identification and parameter estimation. Although, at first glance, MS models appear to be quite different from ARMA models, we show that it is possible to find an ARMA model that has the same autocorrelation function and the same marginal distribution as any given MS model. Hence, despite the difference in model structure, there are strong similarities between MS and ARMA models. MS and ARMA models are applied to the time series of mean annual discharge of the Niagara River. Although it is difficult to draw any general conclusion from a single case study, it appears that MS models (and ARMA models derived from MS models) generally have stronger autocorrelation at higher lags than ARMA models estimated by conventional maximum likelihood. This may be an important property if the purpose of the study is the analysis of multiyear droughts.
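To make the model structure concrete, a Markov-switching series can be generated in a few lines: a hidden two-state climate chain, with each year's value drawn from a normal distribution whose parameters depend on the state. All numbers below are illustrative, not fitted to the Niagara River record:

```python
import numpy as np

# Markov-switching (hidden Markov) generator for an annual series.
rng = np.random.default_rng(42)
P = np.array([[0.9, 0.1],            # hidden-state transition matrix
              [0.3, 0.7]])
mu = np.array([5000.0, 7000.0])      # state-dependent means
sigma = np.array([400.0, 600.0])     # state-dependent standard deviations

def simulate(n_years, state=0):
    series = np.empty(n_years)
    for t in range(n_years):
        series[t] = rng.normal(mu[state], sigma[state])
        state = rng.choice(2, p=P[state])   # climate switches via the chain
    return series

flows = simulate(10_000)
print(round(flows.mean(), 1))
```

State persistence is what produces the autocorrelation at higher lags discussed in the abstract; here the stationary state probabilities are (0.75, 0.25), so the long-run mean is about 5500.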
On Markov parameters in system identification
NASA Technical Reports Server (NTRS)
Phan, Minh; Juang, Jer-Nan; Longman, Richard W.
1991-01-01
A detailed discussion of Markov parameters in system identification is given. Different forms of input-output representation of linear discrete-time systems are reviewed and discussed. Interpretation of sampled response data as Markov parameters is presented. Relations between the state-space model and particular linear difference models via the Markov parameters are formulated. A generalization of Markov parameters to observer and Kalman filter Markov parameters for system identification is explained. These extended Markov parameters play an important role in providing not only a state-space realization, but also an observer/Kalman filter for the system of interest.
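The interpretation of sampled response data as Markov parameters can be checked directly: for a discrete-time state-space model (A, B, C, D), the Markov parameters are h_0 = D and h_k = C A^(k-1) B, and they coincide with the unit-pulse response. A sketch with an invented SISO system:

```python
import numpy as np

# Markov parameters of a discrete-time state-space model, cross-checked
# against simulation of the model driven by a unit pulse.
A = np.array([[0.5, 0.1],
              [0.0, 0.8]])
B = np.array([[1.0], [0.5]])
C = np.array([[1.0, 2.0]])
D = np.array([[0.0]])

def markov_parameters(A, B, C, D, n):
    h = [D]
    Ak = np.eye(A.shape[0])
    for _ in range(1, n):
        h.append(C @ Ak @ B)   # h_k = C A^(k-1) B
        Ak = A @ Ak
    return np.array(h).ravel()

def pulse_response(A, B, C, D, n):
    x = np.zeros((A.shape[0], 1))
    out = []
    for k in range(n):
        u = 1.0 if k == 0 else 0.0
        out.append((C @ x + D * u).item())
        x = A @ x + B * u
    return np.array(out)

h = markov_parameters(A, B, C, D, 8)
print(np.round(h, 4))
```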
A new approach to simulating stream isotope dynamics using Markov switching autoregressive models
NASA Astrophysics Data System (ADS)
Birkel, Christian; Paroli, Roberta; Spezia, Luigi; Dunn, Sarah M.; Tetzlaff, Doerthe; Soulsby, Chris
2012-09-01
In this study we applied Markov switching autoregressive models (MSARMs) as a proof-of-concept to analyze the temporal dynamics and statistical characteristics of the time series of two conservative water isotopes, deuterium (δ2H) and oxygen-18 (δ18O), in daily stream water samples over two years in a small catchment in eastern Scotland. MSARMs enabled us to explicitly account for the identified non-linear, non-Normal and non-stationary isotope dynamics of both time series. The hidden states of the Markov chain could also be associated with meteorological and hydrological drivers identifying the short (event) and longer-term (inter-event) transport mechanisms for both isotopes. Inference was based on the Bayesian approach performed through Markov Chain Monte Carlo algorithms, which also allowed us to deal with a high rate of missing values (17%). Although it is usually assumed that both isotopes are conservative and exhibit similar dynamics, δ18O showed somewhat different time series characteristics. Both isotopes were best modelled with two hidden states, but δ18O demanded autoregressions of the first order, whereas δ2H of the second. Moreover, both the dynamics of observations and the hidden states of the two isotopes were explained by two different sets of covariates. Consequently use of the two tracers for transit time modelling and hydrograph separation may result in different interpretations on the functioning of a catchment system.
Markov Analysis of Sleep Dynamics
NASA Astrophysics Data System (ADS)
Kim, J. W.; Lee, J.-S.; Robinson, P. A.; Jeong, D.-U.
2009-05-01
A new approach, based on a Markov transition matrix, is proposed to explain frequent sleep and wake transitions during sleep. The matrix is determined by analyzing hypnograms of 113 obstructive sleep apnea patients. Our approach shows that the statistics of sleep can be constructed via a single Markov process and that durations of all states have modified exponential distributions, in contrast to recent reports of a scale-free form for the wake stage and an exponential form for the sleep stage. Hypnograms of the same subjects, but treated with Continuous Positive Airway Pressure, are analyzed and compared quantitatively with the pretreatment ones, suggesting potential clinical applications.
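The transition matrix at the heart of this analysis is obtained by counting epoch-to-epoch stage transitions in a hypnogram and normalizing each row. A sketch with an invented three-stage sequence; real hypnograms score more stages:

```python
import numpy as np

# Estimate a sleep-stage transition matrix from a hypnogram-like sequence.
stages = ["W", "N", "R"]                 # wake, NREM, REM (illustrative)
idx = {s: i for i, s in enumerate(stages)}
hypnogram = list("WWWNNNNNRRNNNNRRRWNNNNRRW")

counts = np.zeros((3, 3))
for a, b in zip(hypnogram, hypnogram[1:]):
    counts[idx[a], idx[b]] += 1          # count epoch-to-epoch transitions
P = counts / counts.sum(axis=1, keepdims=True)   # row-normalize
print(np.round(P, 2))
```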
On a Result for Finite Markov Chains
ERIC Educational Resources Information Center
Kulathinal, Sangita; Ghosh, Lagnojita
2006-01-01
In an undergraduate course on stochastic processes, Markov chains are discussed in great detail. Textbooks on stochastic processes provide interesting properties of finite Markov chains. This note discusses one such property regarding the number of steps in which a state is reachable or accessible from another state in a finite Markov chain with M…
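The property in question, the number of steps in which one state is accessible from another, reduces to a shortest path in the directed graph with an edge i → j whenever P_ij > 0. A sketch with an invented four-state chain:

```python
import numpy as np
from collections import deque

# Minimum number of steps in which state j is accessible from state i:
# breadth-first search on the chain's transition graph.
P = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0, 0.0]])

def min_steps(P, i, j):
    n = P.shape[0]
    dist = {i: 0}
    queue = deque([i])
    while queue:
        u = queue.popleft()
        for v in range(n):
            if P[u, v] > 0 and v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist.get(j)   # None if j is not accessible from i

print(min_steps(P, 0, 3))
```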
Relative survival multistate Markov model.
Huszti, Ella; Abrahamowicz, Michal; Alioum, Ahmadou; Binquet, Christine; Quantin, Catherine
2012-02-10
Prognostic studies often have to deal with two important challenges: (i) separating effects of predictions on different 'competing' events and (ii) uncertainty about cause of death. Multistate Markov models permit multivariable analyses of competing risks of, for example, mortality versus disease recurrence. On the other hand, relative survival methods help estimate disease-specific mortality risks even in the absence of data on causes of death. In this paper, we propose a new Markov relative survival (MRS) model that attempts to combine these two methodologies. Our MRS model extends the existing multistate Markov piecewise constant intensities model to relative survival modeling. The intensity of transitions leading to death in the MRS model is modeled as the sum of an estimable excess hazard of mortality from the disease of interest and an 'offset' defined as the expected hazard of all-cause 'natural' mortality obtained from relevant life-tables. We evaluate the new MRS model through simulations, with a design based on registry-based prognostic studies of colon cancer. Simulation results show almost unbiased estimates of prognostic factor effects for the MRS model. We also applied the new MRS model to reassess the role of prognostic factors for mortality in a study of colorectal cancer. The MRS model considerably reduces the bias observed with the conventional Markov model that does not permit accounting for unknown causes of death, especially if the 'true' effects of a prognostic factor on the two types of mortality differ substantially.
Benchmarks and models for 1-D radiation transport in stochastic participating media
Miller, D S
2000-08-21
Benchmark calculations for radiation transport coupled to a material temperature equation in a 1-D slab and 1-D spherical geometry binary random media are presented. The mixing statistics are taken to be homogeneous Markov statistics in the 1-D slab but only approximately Markov statistics in the 1-D sphere. The material chunk sizes are described by Poisson distribution functions. The material opacities are first taken to be constant and then allowed to vary as a strong function of material temperature. Benchmark values and variances for time evolution of the ensemble average of material temperature energy density and radiation transmission are computed via a Monte Carlo type method. These benchmarks are used as a basis for comparison with three other approximate methods of solution. One of these approximate methods is simple atomic mix. The second approximate model is an adaptation of what is commonly called the Levermore-Pomraning model and which is referred to here as the standard model. It is shown that recasting the temperature coupling as a type of effective scattering can be useful in formulating the third approximate model, an adaptation of a model due to Su and Pomraning which attempts to account for the effects of scattering in a stochastic context. This last adaptation shows consistent improvement over both the atomic mix and standard models when used in the 1-D slab geometry but shows limited improvement in the 1-D spherical geometry. Benchmark values are also computed for radiation transmission from the 1-D sphere without material heating present. This is to evaluate the performance of the standard model on this geometry--something which has never been done before. All of the various tests demonstrate the importance of stochastic structure on the solution. Also demonstrated are the range of usefulness and limitations of a simple atomic mix formulation.
Markov random field surface reconstruction.
Paulsen, Rasmus R; Baerentzen, Jakob Andreas; Larsen, Rasmus
2010-01-01
A method for implicit surface reconstruction is proposed. The novelty in this paper is the adaptation of Markov Random Field regularization of a distance field. The Markov Random Field formulation allows us to integrate both knowledge about the type of surface we wish to reconstruct (the prior) and knowledge about data (the observation model) in an orthogonal fashion. Local models that account for both scene-specific knowledge and physical properties of the scanning device are described. Furthermore, how the optimal distance field can be computed is demonstrated using conjugate gradients, sparse Cholesky factorization, and a multiscale iterative optimization scheme. The method is demonstrated on a set of scanned human heads and, both in terms of accuracy and the ability to close holes, the proposed method is shown to have similar or superior performance when compared to current state-of-the-art algorithms.
Markov Tracking for Agent Coordination
NASA Technical Reports Server (NTRS)
Washington, Richard; Lau, Sonie (Technical Monitor)
1998-01-01
Partially observable Markov decision processes (POMDPs) are an attractive representation for agent behavior, since they capture uncertainty in both the agent's state and its actions. However, finding an optimal policy for POMDPs in general is computationally difficult. In this paper we present Markov Tracking, a restricted problem of coordinating actions with an agent or process represented as a POMDP. Because the actions coordinate with the agent rather than influence its behavior, the optimal solution to this problem can be computed locally and quickly. We also demonstrate the use of the technique on sequential POMDPs, which can be used to model a behavior that follows a linear, acyclic trajectory through a series of states. By imposing a "windowing" restriction that limits the number of possible alternatives considered at any moment to a fixed size, a coordinating action can be calculated in constant time, making this approach amenable to coordination with complex agents.
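Tracking an agent modeled as a POMDP rests on the standard belief-state update, b'(s') ∝ O(s', o) Σ_s T_a(s, s') b(s), which is cheap to compute locally. A sketch with invented transition and observation matrices:

```python
import numpy as np

# POMDP belief-state update: predict through the transition model, then
# weight by the observation likelihood and renormalize.
T = np.array([[[0.7, 0.3],           # T[a][s, s']: transition given action a
               [0.2, 0.8]]])
O = np.array([[0.9, 0.1],            # O[s', o]: observation likelihood
              [0.2, 0.8]])

def belief_update(b, a, o):
    b_pred = b @ T[a]                # predict the next hidden state
    b_new = b_pred * O[:, o]         # weight by the observation likelihood
    return b_new / b_new.sum()       # renormalize to a distribution

b = np.array([0.5, 0.5])
b = belief_update(b, a=0, o=1)
print(np.round(b, 3))
```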
Learning atomic human actions using variable-length Markov models.
Liang, Yu-Ming; Shih, Sheng-Wen; Shih, Arthur Chun-Chieh; Liao, Hong-Yuan Mark; Lin, Cheng-Chung
2009-02-01
Visual analysis of human behavior has generated considerable interest in the field of computer vision because of its wide spectrum of potential applications. Human behavior can be segmented into atomic actions, each of which indicates a basic and complete movement. Learning and recognizing atomic human actions are essential to human behavior analysis. In this paper, we propose a framework for handling this task using variable-length Markov models (VLMMs). The framework is comprised of the following two modules: a posture labeling module and a VLMM atomic action learning and recognition module. First, a posture template selection algorithm, based on a modified shape context matching technique, is developed. The selected posture templates form a codebook that is used to convert input posture sequences into discrete symbol sequences for subsequent processing. Then, the VLMM technique is applied to learn the training symbol sequences of atomic actions. Finally, the constructed VLMMs are transformed into hidden Markov models (HMMs) for recognizing input atomic actions. This approach combines the advantages of the excellent learning function of a VLMM and the fault-tolerant recognition ability of an HMM. Experiments on realistic data demonstrate the efficacy of the proposed system.
NASA Astrophysics Data System (ADS)
Hassan, Kazi; Allen, Deonie; Haynes, Heather
2016-04-01
This paper considers 1D hydraulic model data on the effect of high flow clusters and sequencing on sediment transport. Using observed flow gauge data from the River Caldew, England, a novel stochastic modelling approach was developed in order to create alternative 50 year flow sequences. Whilst the observed probability density of gauge data was preserved in all sequences, the order in which those flows occurred was varied using the output from a Hidden Markov Model (HMM) with generalised Pareto distribution (GP). In total, one hundred 50 year synthetic flow series were generated and used as the inflow boundary conditions for individual flow series model runs using the 1D sediment transport model HEC-RAS. The model routed graded sediment through the case study river reach to define the long-term morphological changes. Comparison of individual simulations provided a detailed understanding of the sensitivity of channel capacity to flow sequence. Specifically, each 50 year synthetic flow sequence was analysed using a 3-month, 6-month or 12-month rolling window approach and classified for clusters in peak discharge. As a cluster is described as a temporal grouping of flow events above a specified threshold, the threshold condition used herein is considered as a morphologically active channel forming discharge event. Thus, clusters were identified for peak discharges in excess of 10%, 20%, 50%, 100% and 150% of the 1 year Return Period (RP) event. The window of above-peak flows also required cluster definition and was tested for timeframes 1, 2, 10 and 30 days. Subsequently, clusters could be described in terms of the number of events, maximum peak flow discharge, cumulative flow discharge and skewness (i.e. a description of the flow sequence). The model output for each cluster was analysed for the cumulative flow volume and cumulative sediment transport (mass). This was then compared to the total sediment transport of a single flow event of equivalent flow volume
Markov and semi-Markov processes as a failure rate
NASA Astrophysics Data System (ADS)
Grabski, Franciszek
2016-06-01
In this paper the reliability function is defined by a stochastic failure rate process with non-negative and right-continuous trajectories. Equations for the conditional reliability functions of an object are derived under the assumption that the failure rate is a semi-Markov process with an at most countable state space. A corresponding theorem is presented. Linear systems of equations for the appropriate Laplace transforms allow one to find the reliability functions for the alternating, Poisson and Furry-Yule failure rate processes.
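For context, the reliability function of an object with a random failure rate process $\{\lambda(t)\}$ is commonly defined by the standard formula (stated here as background, not quoted from the paper):

```latex
R(t) = \mathbb{E}\!\left[\exp\!\left(-\int_0^t \lambda(u)\,\mathrm{d}u\right)\right]
```

where the expectation is taken over trajectories of the failure rate process. When $\lambda$ is semi-Markov, conditioning on its initial state yields the system of equations for the conditional reliability functions mentioned in the abstract, which is then solved via Laplace transforms.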
Ancestry inference in complex admixtures via variable-length Markov chain linkage models.
Rodriguez, Jesse M; Bercovici, Sivan; Elmore, Megan; Batzoglou, Serafim
2013-03-01
Inferring the ancestral origin of chromosomal segments in admixed individuals is key for genetic applications, ranging from analyzing population demographics and history, to mapping disease genes. Previous methods addressed ancestry inference by using either weak models of linkage disequilibrium, or large models that make explicit use of ancestral haplotypes. In this paper we introduce ALLOY, an efficient method that incorporates generalized, but highly expressive, linkage disequilibrium models. ALLOY applies a factorial hidden Markov model to capture the parallel process producing the maternal and paternal admixed haplotypes, and models the background linkage disequilibrium in the ancestral populations via an inhomogeneous variable-length Markov chain. We test ALLOY in a broad range of scenarios ranging from recent to ancient admixtures with up to four ancestral populations. We show that ALLOY outperforms the previous state of the art, and is robust to uncertainties in model parameters. PMID:23421795
NASA Astrophysics Data System (ADS)
Nguyen, Tuyen Van; Liu, Yuedan; Jung, Il-Hyo; Chon, Tae-Soo; Lee, Sang-Hee
Revealing the biological responses of organisms to environmental stressors is a critical issue in contemporary ecological sciences. Markov processes in behavioral data were unraveled by utilizing the hidden Markov model (HMM). Individual organisms of daphnia (Daphnia magna) and zebrafish (Danio rerio) were exposed to diazinon at low concentrations. The transition probability matrix (TPM) and the emission probability matrix (EPM) were estimated by training with the HMM and were verified before and after the treatments with 10^-6 tolerance in 10^3 iterations. Structured properties in behavioral changes were accordingly revealed to characterize dynamic processes in movement patterns. Parameters and sequences produced through the HMM training could be a suitable means of monitoring toxic chemicals in the environment.
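The likelihood computation at the heart of such HMM training (iterated to a small tolerance, as in the abstract above) is the forward algorithm. A minimal scaled-forward sketch, with illustrative TPM/EPM values rather than the paper's estimates:

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of an observation sequence under an HMM with
    initial distribution pi, transition matrix A (TPM) and
    emission matrix B (EPM), via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()                  # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # predict, then weight by emission
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

A = np.array([[0.7, 0.3], [0.4, 0.6]])   # TPM (illustrative)
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # EPM (illustrative)
pi = np.array([0.6, 0.4])
ll = forward_loglik([0, 1, 0], pi, A, B)
```

Baum-Welch training repeats this forward pass (plus a backward pass) and re-estimates A and B until the log-likelihood improvement falls below the chosen tolerance.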
Hidden attractors in dynamical systems
NASA Astrophysics Data System (ADS)
Dudkowski, Dawid; Jafari, Sajad; Kapitaniak, Tomasz; Kuznetsov, Nikolay V.; Leonov, Gennady A.; Prasad, Awadhesh
2016-06-01
Complex dynamical systems, ranging from the climate and ecosystems to financial markets and engineering applications, typically have many coexisting attractors. This property of a system is called multistability. The final state, i.e., the attractor on which the multistable system evolves, strongly depends on the initial conditions. Additionally, such systems are very sensitive to noise and system parameters, so a sudden shift to a contrasting regime may occur. To understand the dynamics of these systems one has to identify all possible attractors and their basins of attraction. Recently, it has been shown that multistability is connected with the occurrence of unpredictable attractors, which have been called hidden attractors. The basins of attraction of hidden attractors do not touch unstable fixed points (if any exist) and are located far away from such points. Numerical localization of hidden attractors is not straightforward, since there are no transient processes leading to them from the neighborhoods of unstable fixed points, and one has to use special analytical-numerical procedures. From the viewpoint of applications, the identification of hidden attractors is a major issue. Knowledge about the emergence and properties of hidden attractors can increase the likelihood that a system will remain on its most desirable attractor and reduce the risk of a sudden jump to undesired behavior. We review the most representative examples of hidden attractors, discuss their theoretical properties and experimental observations, and describe numerical methods which allow identification of hidden attractors.
Computationally tractable stochastic image modeling based on symmetric Markov mesh random fields.
Yousefi, Siamak; Kehtarnavaz, Nasser; Cao, Yan
2013-06-01
In this paper, the properties of a new class of causal Markov random fields, named symmetric Markov mesh random field, are initially discussed. It is shown that the symmetric Markov mesh random fields from the upper corners are equivalent to the symmetric Markov mesh random fields from the lower corners. Based on this new random field, a symmetric, corner-independent, and isotropic image model is then derived which incorporates the dependency of a pixel on all its neighbors. The introduced image model comprises the product of several local 1D density and 2D joint density functions of pixels in an image thus making it computationally tractable and practically feasible by allowing the use of histogram and joint histogram approximations to estimate the model parameters. An image restoration application is also presented to confirm the effectiveness of the model developed. The experimental results demonstrate that this new model provides an improved tool for image modeling purposes compared to the conventional Markov random field models.
Using Games to Teach Markov Chains
ERIC Educational Resources Information Center
Johnson, Roger W.
2003-01-01
Games are promoted as examples for classroom discussion of stationary Markov chains. In a game context Markov chain terminology and results are made concrete, interesting, and entertaining. Game length for several-player games such as "Hi Ho! Cherry-O" and "Chutes and Ladders" is investigated and new, simple formulas are given. Slight…
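Expected game length for such absorbing-chain games follows from the standard fundamental-matrix result: if Q is the transient-to-transient block of the transition matrix, the expected number of turns from each transient state is the corresponding row sum of N = (I - Q)^-1. A toy three-state board (not the actual "Chutes and Ladders" chain) illustrates:

```python
import numpy as np

# Toy board game: states 0, 1, 2 are transient, state 3 is absorbing ("finish").
P = np.array([
    [0.0, 0.5, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])
Q = P[:3, :3]                      # transitions among transient states only
N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix of the absorbing chain
expected_turns = N.sum(axis=1)     # row sums = expected turns to absorption
```

Here a game started in state 0 lasts 3 turns on average, which can be checked by hand from the first-step equations e_i = 1 + sum_j Q_ij e_j.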
Child Abuse: The Hidden Bruises
AACAP Facts for Families Guide; Updated November 2014. The statistics on physical child abuse are alarming. It is estimated hundreds of thousands ...
Hidden Magnetic Portals Around Earth
A NASA-sponsored researcher at the University of Iowa has developed a way for spacecraft to hunt down hidden magnetic portals in the vicinity of Earth. These gateways link the magnetic field of our...
Hidden Statistics of Schroedinger Equation
NASA Technical Reports Server (NTRS)
Zak, Michail
2011-01-01
Work was carried out to determine the mathematical origin of randomness in quantum mechanics and to create a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of the hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability, while allowing one to measure its state variables using classical methods.
Hidden modes in open disordered media: analytical, numerical, and experimental results
NASA Astrophysics Data System (ADS)
Bliokh, Yury P.; Freilikher, Valentin; Shi, Z.; Genack, A. Z.; Nori, Franco
2015-11-01
We explore numerically, analytically, and experimentally the relationship between quasi-normal modes (QNMs) and transmission resonance (TR) peaks in the transmission spectrum of one-dimensional (1D) and quasi-1D open disordered systems. It is shown that for weak disorder there exist two types of eigenstates: ordinary QNMs, which are associated with a TR, and hidden QNMs, which do not exhibit peaks in transmission or within the sample. The distinctive feature of the hidden modes is that, unlike ordinary ones, their lifetimes remain constant over a wide range of disorder strengths. In this range, the averaged ratio N_res/N_mod of the number of transmission peaks N_res to the number of QNMs N_mod is insensitive to the type and degree of disorder and is close to the value √(2/5), which we derive analytically in the weak-scattering approximation. The physical nature of the hidden modes is illustrated in simple examples with a few scatterers. The analogy between ordinary and hidden QNMs and the segregation of superradiant states and trapped modes is discussed. When the coupling to the environment is tuned by external edge reflectors, the superradiance transition is reproduced. Hidden modes have also been found in microwave measurements in quasi-1D open disordered samples. The microwave measurements and modal analysis of transmission in the crossover to localization in quasi-1D systems give a ratio N_res/N_mod close to √(2/5). In diffusive quasi-1D samples, however, N_res/N_mod falls as the effective number of transmission eigenchannels M increases. Once N_mod is divided by M, however, the ratio N_res/N_mod is close to the ratio found in 1D.
Hidden Stages of Cognition Revealed in Patterns of Brain Activation.
Anderson, John R; Pyke, Aryn A; Fincham, Jon M
2016-09-01
To advance cognitive theory, researchers must be able to parse the performance of a task into its significant mental stages. In this article, we describe a new method that uses functional MRI brain activation to identify when participants are engaged in different cognitive stages on individual trials. The method combines multivoxel pattern analysis to identify cognitive stages and hidden semi-Markov models to identify their durations. This method, applied to a problem-solving task, identified four distinct stages: encoding, planning, solving, and responding. We examined whether these stages corresponded to their ascribed functions by testing whether they are affected by appropriate factors. Planning-stage duration increased as the method for solving the problem became less obvious, whereas solving-stage duration increased as the number of calculations to produce the answer increased. Responding-stage duration increased with the difficulty of the motor actions required to produce the answer. PMID:27440808
NASA Astrophysics Data System (ADS)
Sawada, Hiroyuki
Recently, the engineering design environment in Japan has been changing in various ways. Manufacturing companies are being challenged to design and bring out products that meet the diverse demands of customers and are competitive against those produced by emerging countries (1). In order to maintain and strengthen the competitiveness of Japanese companies, it is necessary to create new added values as well as conventional ones. It is well known that design at the early stages has a great influence on the final design solution. Therefore, design support tools for upstream design are necessary for creating new added values. We have established a research society for 1D-CAE (1-Dimensional Computer Aided Engineering) (2), a general term for ideas, methodologies and tools applicable to upstream design support, and discuss the concept and definition of 1D-CAE. This paper reports our discussion of 1D-CAE.
MRFalign: protein homology detection through alignment of Markov random fields.
Ma, Jianzhu; Wang, Sheng; Wang, Zhiyong; Xu, Jinbo
2014-03-01
Sequence-based protein homology detection has been extensively studied and so far the most sensitive method is based upon comparison of protein sequence profiles, which are derived from multiple sequence alignment (MSA) of sequence homologs in a protein family. A sequence profile is usually represented as a position-specific scoring matrix (PSSM) or an HMM (Hidden Markov Model) and accordingly PSSM-PSSM or HMM-HMM comparison is used for homolog detection. This paper presents a new homology detection method MRFalign, consisting of three key components: 1) a Markov Random Fields (MRF) representation of a protein family; 2) a scoring function measuring similarity of two MRFs; and 3) an efficient ADMM (Alternating Direction Method of Multipliers) algorithm aligning two MRFs. Compared to HMM that can only model very short-range residue correlation, MRFs can model long-range residue interaction pattern and thus, encode information for the global 3D structure of a protein family. Consequently, MRF-MRF comparison for remote homology detection shall be much more sensitive than HMM-HMM or PSSM-PSSM comparison. Experiments confirm that MRFalign outperforms several popular HMM or PSSM-based methods in terms of both alignment accuracy and remote homology detection and that MRFalign works particularly well for mainly beta proteins. For example, tested on the benchmark SCOP40 (8353 proteins) for homology detection, PSSM-PSSM and HMM-HMM succeed on 48% and 52% of proteins, respectively, at superfamily level, and on 15% and 27% of proteins, respectively, at fold level. In contrast, MRFalign succeeds on 57.3% and 42.5% of proteins at superfamily and fold level, respectively. This study implies that long-range residue interaction patterns are very helpful for sequence-based homology detection. The software is available for download at http://raptorx.uchicago.edu/download/. A summary of this paper appears in the proceedings of the RECOMB 2014 conference, April 2-5. PMID:24675572
DESIGN PACKAGE 1D SYSTEM SAFETY ANALYSIS
L.R. Eisler
1995-02-02
The purpose of this analysis is to systematically identify and evaluate hazards related to the Yucca Mountain Project Exploratory Studies Facility (ESF) Design Package 1D, Surface Facilities (for a list of design items included in the package 1D system safety analysis see section 3). This process is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach was used since a radiological System Safety analysis is not required. The risk assessment in this analysis characterizes the accident scenarios associated with the Design Package 1D structures/systems/components in terms of relative risk and includes recommendations for mitigating all identified risks. The priority for recommending and implementing mitigation control features is: (1) incorporate measures to reduce risks and hazards into the structure/system/component (S/S/C) design, (2) add safety devices and capabilities to the designs that reduce risk, (3) provide devices that detect and warn personnel of hazardous conditions, and (4) develop procedures and conduct training to increase worker awareness of potential hazards, on methods to reduce exposure to hazards, and on the actions required to avoid accidents or correct hazardous conditions. The scope of this analysis is limited to the Design Package 1D structures/systems/components (S/S/Cs) during normal operations, excluding hazards occurring during maintenance and "off-normal" operations.
Hidden photons in connection to dark matter
Andreas, Sarah; Ringwald, Andreas; Goodsell, Mark D.
2013-11-07
Light extra U(1) gauge bosons, so-called hidden photons, which reside in a hidden sector, have attracted much attention since they are a well-motivated feature of many scenarios beyond the Standard Model and could furthermore mediate the interaction with hidden-sector dark matter. We review limits on hidden photons from past electron beam dump experiments, including two new limits from such experiments at KEK and Orsay. In addition, we study the possibility of having dark matter in the hidden sector. A simple toy model and different supersymmetric realisations are shown to provide viable dark matter candidates in the hidden sector that are in agreement with recent direct detection limits.
Exact significance test for Markov order
NASA Astrophysics Data System (ADS)
Pethel, S. D.; Hahs, D. W.
2014-02-01
We describe an exact significance test of the null hypothesis that a Markov chain is nth order. The procedure utilizes surrogate data to yield an exact test statistic distribution valid for any sample size. Surrogate data are generated using a novel algorithm that guarantees, per shot, a uniform sampling from the set of sequences that exactly match the nth order properties of the observed data. Using the test, the Markov order of Tel Aviv rainfall data is examined.
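The paper's surrogate algorithm preserves nth-order properties exactly; the simplest instance (n = 0) reduces to permutation surrogates, which preserve symbol counts exactly while destroying transition structure. A hedged sketch of that special case (the statistic and function names are illustrative, not the authors'):

```python
import random
from collections import Counter

def transition_stat(seq):
    """Chi-square-like deviation of adjacent-pair counts from the
    independence (order-0) expectation."""
    pairs = Counter(zip(seq, seq[1:]))
    sym = Counter(seq)
    n = len(seq)
    stat = 0.0
    for (a, b), c in pairs.items():
        expected = sym[a] * sym[b] / n
        stat += (c - expected) ** 2 / expected
    return stat

def surrogate_p_value(seq, n_surr=200, seed=0):
    """Permutation surrogates preserve symbol counts exactly;
    small p-values reject the null hypothesis of order 0."""
    rng = random.Random(seed)
    observed = transition_stat(seq)
    count = 0
    for _ in range(n_surr):
        s = list(seq)
        rng.shuffle(s)                        # exact order-0 surrogate
        if transition_stat(s) >= observed:
            count += 1
    return (count + 1) / (n_surr + 1)         # exact p-value, any sample size

p = surrogate_p_value("ab" * 30)              # strongly first-order sequence
```

Preserving nth-order properties for n ≥ 1, as in the paper, requires a more careful surrogate generator (uniform sampling over sequences with identical transition counts) rather than plain shuffling.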
Lu, Ji; Pan, Junhao; Zhang, Qiang; Dubé, Laurette; Ip, Edward H.
2015-01-01
With intensively collected longitudinal data, recent advances in the Experience Sampling Method (ESM) benefit social science empirical research, but also pose important methodological challenges. As traditional statistical models are not generally well-equipped to analyze a system of variables that contain feedback loops, this paper proposes the utility of an extended hidden Markov model for the reciprocal relationship between momentary emotion and eating behavior. The paper revisits an ESM data set (Lu, Huet & Dube, 2011) that observed 160 participants' food consumption and momentary emotions six times per day over 10 days. Focusing on the feedback loop between mood and meal healthiness decisions, the proposed Reciprocal Markov Model (RMM) can accommodate both hidden states ("general" emotional states: positive vs. negative) and observed states (meal: healthier, the same or less healthy than usual) without presuming independence between observations or smooth trajectories of mood or behavior changes. The results of the RMM analyses illustrate the reciprocal chains of meal consumption and mood, as well as the effect of contextual factors that moderate the interrelationship between eating and emotion. A simulation experiment that generated data consistent with the empirical study further demonstrated that the procedure is promising in terms of recovering the parameters. PMID:26717120
Algorithms for Discovery of Multiple Markov Boundaries
Statnikov, Alexander; Lytkin, Nikita I.; Lemeire, Jan; Aliferis, Constantin F.
2013-01-01
Algorithms for Markov boundary discovery from data constitute an important recent development in machine learning, primarily because they offer a principled solution to the variable/feature selection problem and give insight on local causal structure. Over the last decade many sound algorithms have been proposed to identify a single Markov boundary of the response variable. Even though faithful distributions and, more broadly, distributions that satisfy the intersection property always have a single Markov boundary, other distributions/data sets may have multiple Markov boundaries of the response variable. The latter distributions/data sets are common in practical data-analytic applications, and there are several reasons why it is important to induce multiple Markov boundaries from such data. However, there are currently no sound and efficient algorithms that can accomplish this task. This paper describes a family of algorithms TIE* that can discover all Markov boundaries in a distribution. The broad applicability as well as efficiency of the new algorithmic family is demonstrated in an extensive benchmarking study that involved comparison with 26 state-of-the-art algorithms/variants in 15 data sets from a diversity of application domains. PMID:25285052
Hidden symmetries in jammed systems
NASA Astrophysics Data System (ADS)
Morse, Peter K.; Corwin, Eric I.
2016-07-01
There are deep, but hidden, geometric structures within jammed systems, associated with hidden symmetries. These can be revealed by repeated transformations under which these structures lead to fixed points. The geometric structures can be found in the Voronoi tessellation of space defined by the packing. In this paper we examine two iterative processes: maximum inscribed sphere (MIS) inversion and a real-space coarsening scheme. Under repeated iterations of the MIS inversion process we find invariant systems in which every particle is equal to the maximum inscribed sphere within its Voronoi cell. Using a real-space coarsening scheme we reveal behavior in geometric order parameters which is length-scale invariant.
MORPH: probabilistic alignment combined with hidden Markov models of cis-regulatory modules.
Sinha, Saurabh; He, Xin
2007-11-01
The discovery and analysis of cis-regulatory modules (CRMs) in metazoan genomes is crucial for understanding the transcriptional control of development and many other biological processes. Cross-species sequence comparison holds much promise for improving computational prediction of CRMs, for elucidating their binding site composition, and for understanding how they evolve. Current methods for analyzing orthologous CRMs from multiple species rely upon sequence alignments produced by off-the-shelf alignment algorithms, which do not exploit the presence of binding sites in the sequences. We present here a unified probabilistic framework, called MORPH, that integrates the alignment task with binding site predictions, allowing more robust CRM analysis in two species. The framework sums over all possible alignments of two sequences, thus accounting for alignment ambiguities in a natural way. We perform extensive tests on orthologous CRMs from two moderately diverged species Drosophila melanogaster and D. mojavensis, to demonstrate the advantages of the new approach. We show that it can overcome certain computational artifacts of traditional alignment tools and provide a different, likely more accurate, picture of cis-regulatory evolution than that obtained from existing methods. The burgeoning field of cis-regulatory evolution, which is amply supported by the availability of many related genomes, is currently thwarted by the lack of accurate alignments of regulatory regions. Our work will fill in this void and enable more reliable analysis of CRM evolution.
PMID:17997594
A DIRICHLET PROCESS MIXTURE OF HIDDEN MARKOV MODELS FOR PROTEIN STRUCTURE PREDICTION1
Lennox, Kristin P.; Dahl, David B.; Vannucci, Marina; Day, Ryan; Tsai, Jerry W.
2010-01-01
By providing new insights into the distribution of a protein's torsion angles, recent statistical models for these data have pointed the way to more efficient methods for protein structure prediction. Most current approaches have concentrated on bivariate models at a single sequence position. There is, however, considerable value in simultaneously modeling angle pairs at multiple sequence positions in a protein. One area of application for such models is structure prediction for the highly variable loop and turn regions. Such modeling is difficult because the number of known protein structures available to estimate these torsion angle distributions is typically small. Furthermore, the data are "sparse" in that not all proteins have angle pairs at each sequence position. We propose a new semiparametric model for the joint distributions of angle pairs at multiple sequence positions. Our model accommodates sparse data by leveraging known information about the behavior of protein secondary structure. We demonstrate our technique by predicting the torsion angles in a loop from the globin fold family. Our results show that a template-based approach can now be successfully extended to modeling the notoriously difficult loop and turn regions. PMID:21031154
ERIC Educational Resources Information Center
Li, Dingcheng
2011-01-01
Coreference resolution (CR) and entity relation detection (ERD) aim at finding predefined relations between pairs of entities in text. CR focuses on resolving identity relations while ERD focuses on detecting non-identity relations. Both CR and ERD are important as they can potentially improve other natural language processing (NLP) related tasks…
The Characterization of Phonetic Variation in American English Schwa Using Hidden Markov Models
ERIC Educational Resources Information Center
Lilley, Jason
2012-01-01
The discovery and characterization of a phonetic segment's variants and the prediction of their distribution are two of the chief goals of phonology. In this dissertation, I develop a new, mostly automatic technique for discovering and classifying contextual variation. The focus is on a set of sounds in English that undergoes considerable…
Signal Processing Based on Hidden Markov Models for Extracting Small Channel Currents
NASA Astrophysics Data System (ADS)
Krishnamurthy, Vikram; Chung, Shin-Ho
The measurement of ionic currents flowing through single channels in cell membranes has been made possible by the giga-seal patch-clamp technique (Neher and Sakmann, 1976; Hamill et al., 1981). A tight seal between the rim of the electrode tip and the cell membrane drastically reduces the leakage current and extraneous background noise, enabling the resolution of the discrete changes in conductance that occur when single channels open or close. Although the noise from a small patch is much less than that from a whole-cell membrane, signals of interest are often obscured by the noise. Even if the signal frequently emerges from the noise, low-amplitude events such as small subconductance states can remain below the noise level and there may be little evidence of their presence. It is desirable, therefore, to have a method to measure and characterize not only relatively large ionic currents but also much smaller current fluctuations that are obscured by noise.
ERIC Educational Resources Information Center
Boyer, Kristy Elizabeth; Phillips, Robert; Ingram, Amy; Ha, Eun Young; Wallis, Michael; Vouk, Mladen; Lester, James
2011-01-01
Identifying effective tutorial dialogue strategies is a key issue for intelligent tutoring systems research. Human-human tutoring offers a valuable model for identifying effective tutorial strategies, but extracting them is a challenge because of the richness of human dialogue. This article addresses that challenge through a machine learning…
New seismic events identified in the Apollo lunar data by application of a Hidden Markov Model
NASA Astrophysics Data System (ADS)
Knapmeyer-Endrun, B.; Hammer, C.
2015-10-01
The Apollo astronauts installed seismic stations on the Moon during Apollo missions 11, 12, 14, 15 and 16. The stations consisted of a three-component long-period seismometer (eigenperiod 15 s) and a vertical short-period sensor (eigenperiod 1 s). To this day, the Apollo seismic network provides the only confirmed recordings of seismic events from any extraterrestrial body. The recorded event waveforms differ significantly from what had been expected based on Earth data, mainly in their long-duration body-wave codas, caused by strong near-surface scattering and weak attenuation due to the lack of fluids. The main lunar event types are deep moonquakes, impacts, and the rare shallow moonquakes.
An adaptive Hidden Markov Model for activity recognition based on a wearable multi-sensor device
Technology Transfer Automated Retrieval System (TEKTRAN)
Human activity recognition is important in the study of personal health, wellness and lifestyle. In order to acquire human activity information from the personal space, many wearable multi-sensor devices have been developed. In this paper, a novel technique for automatic activity recognition based o...
Protein Kinase Classification with 2866 Hidden Markov Models and One Support Vector Machine
NASA Technical Reports Server (NTRS)
Weber, Ryan; New, Michael H.; Fonda, Mark (Technical Monitor)
2002-01-01
The main application considered in this paper is predicting true kinases from randomly permuted kinases that share the same length and amino acid distributions as the true kinases. Numerous methods already exist for this classification task, such as HMMs, motif matchers, and sequence comparison algorithms. We build on some of these efforts by creating a vector from the output of thousands of structure-based HMMs, created offline from Pfam-A seed alignments using SAM-T99, which must then be combined into an overall classification for the protein. We then use a Support Vector Machine with polynomial and chi-squared kernels to classify this large ensemble Pfam vector. In particular, the chi-squared-kernel SVM performs better in some respects than the HMMs and the BLAST pairwise comparisons when predicting true from false kinases, but no one algorithm is best for all purposes or in all instances, so we consider the particular strengths and weaknesses of each.
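The chi-squared kernel mentioned above is a standard choice for histogram-like feature vectors such as a vector of per-model HMM scores. A minimal sketch of the exponential chi-squared form (the paper may use a different variant; values are illustrative):

```python
import numpy as np

def chi2_kernel(x, y, gamma=1.0):
    """Exponential chi-squared kernel for nonnegative feature vectors:
    k(x, y) = exp(-gamma * sum_i (x_i - y_i)^2 / (x_i + y_i))."""
    num = (x - y) ** 2
    den = np.maximum(x + y, 1e-12)   # guard empty bins; num is 0 there anyway
    return np.exp(-gamma * (num / den).sum())

x = np.array([0.2, 0.5, 0.3])
y = np.array([0.1, 0.6, 0.3])
k_same = chi2_kernel(x, x)   # identical inputs give kernel value 1.0
k_diff = chi2_kernel(x, y)
```

Any SVM implementation that accepts a precomputed Gram matrix can use this kernel directly; the chi-squared distance tends to suit nonnegative, histogram-shaped inputs better than a plain RBF.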
Dynamical symmetries of Markov processes with multiplicative white noise
NASA Astrophysics Data System (ADS)
Aron, Camille; Barci, Daniel G.; Cugliandolo, Leticia F.; González Arenas, Zochil; Lozano, Gustavo S.
2016-05-01
We analyse various properties of stochastic Markov processes with multiplicative white noise. We take a single-variable problem as a simple example, and we later extend the analysis to the Landau-Lifshitz-Gilbert equation for the stochastic dynamics of a magnetic moment. In particular, we focus on the non-equilibrium transfer of angular momentum to the magnetization from a spin-polarised current of electrons, a technique which is widely used in the context of spintronics to manipulate magnetic moments. We unveil two hidden dynamical symmetries of the generating functionals of these Markovian multiplicative white-noise processes. One symmetry only holds in equilibrium and we use it to prove generic relations such as the fluctuation-dissipation theorems. Out of equilibrium, we take profit of the symmetry-breaking terms to prove fluctuation theorems. The other symmetry yields strong dynamical relations between correlation and response functions which can notably simplify the numerical analysis of these problems. Our construction allows us to clarify some misconceptions on multiplicative white-noise stochastic processes that can be found in the literature. In particular, we show that a first-order differential equation with multiplicative white noise can be transformed into an additive-noise equation, but that the latter keeps a non-trivial memory of the discretisation prescription used to define the former.
Centrosome Positioning in 1D Cell Migration
NASA Astrophysics Data System (ADS)
Adlerz, Katrina; Aranda-Espinoza, Helim
During cell migration, the positioning of the centrosome and nucleus defines a cell's polarity. For a cell migrating on a two-dimensional substrate the centrosome is positioned in front of the nucleus. Under one-dimensional confinement, however, the centrosome is positioned behind the nucleus in 60% of cells. It is known that the centrosome is positioned by CDC42 and dynein for cells moving on a 2D substrate in a wound-healing assay. It is currently unknown, however, whether this is also true for cells moving under 1D confinement, where the centrosome position is often reversed. Therefore, centrosome positioning was studied in cells migrating under 1D confinement, which mimics cells migrating through 3D matrices. Fibronectin lines 3 to 5 μm wide were stamped onto a glass substrate, and cells with fluorescently labeled nuclei and centrosomes migrated on the lines. Our results show that when a cell changes direction the centrosome position is maintained: when the centrosome is between the nucleus and the cell's trailing edge and the cell changes direction, the centrosome is translocated across the nucleus to the back of the cell again. A dynein inhibitor did influence centrosome positioning during 1D migration and direction changes.
Zero finite-temperature charge stiffness within the half-filled 1D Hubbard model
Carmelo, J.M.P.; Gu, Shi-Jian; Sacramento, P.D.
2013-12-15
Even though the one-dimensional (1D) Hubbard model is solvable by the Bethe ansatz, at half-filling its finite-temperature T>0 transport properties remain poorly understood. In this paper we combine that solution with symmetry to show that within that prominent T=0 1D insulator the charge stiffness D(T) vanishes for T>0 and finite values of the on-site repulsion U in the thermodynamic limit. This result is exact and clarifies a long-standing open problem. It rules out that at half-filling the model is an ideal conductor in the thermodynamic limit. Whether at finite T and U>0 it is an ideal insulator or a normal resistor remains an open question. That at half-filling the charge stiffness is finite at U=0 and vanishes for U>0 is found to result from a general transition from a conductor to an insulator or resistor occurring at U=U{sub c}=0 for all finite temperatures T>0. (At T=0 such a transition is the quantum metal to Mott–Hubbard-insulator transition.) The interplay of the η-spin SU(2) symmetry with the hidden U(1) symmetry beyond SO(4) is found to play a central role in the unusual finite-temperature charge transport properties of the 1D half-filled Hubbard model. -- Highlights: •The charge stiffness of the half-filled 1D Hubbard model is evaluated. •Its value is controlled by the model symmetry operator algebras. •We find that there is no charge ballistic transport at finite temperatures T>0. •The hidden U(1) symmetry controls the U=0 phase transition for T>0.
Semi-Markov adjunction to the Computer-Aided Markov Evaluator (CAME)
NASA Technical Reports Server (NTRS)
Rosch, Gene; Hutchins, Monica A.; Leong, Frank J.; Babcock, Philip S., IV
1988-01-01
The rule-based Computer-Aided Markov Evaluator (CAME) program was expanded in its ability to incorporate the effect of fault-handling processes into the construction of a reliability model. The fault-handling processes are modeled as semi-Markov events, and CAME constructs an appropriate semi-Markov model. To solve the model, the program outputs it in a form which can be directly solved with the Semi-Markov Unreliability Range Evaluator (SURE) program. As a means of evaluating the alterations made to the CAME program, the program is used to model the reliability of portions of the Integrated Airframe/Propulsion Control System Architecture (IAPSA 2) reference configuration. The reliability predictions are compared with a previous analysis. The results bear out the feasibility of utilizing CAME to generate appropriate semi-Markov models to model fault-handling processes.
Preschoolers Search for Hidden Objects
ERIC Educational Resources Information Center
Haddad, Jeffrey M.; Chen, Yuping; Keen, Rachel
2011-01-01
The issue of whether young children use spatio-temporal information (e.g., movement of objects through time and space) and/or contact-mechanical information (e.g., interaction between objects) to search for a hidden object was investigated. To determine whether one cue can have priority over the other, a dynamic event that put these cues into…
Handling target obscuration through Markov chain observations
NASA Astrophysics Data System (ADS)
Kouritzin, Michael A.; Wu, Biao
2008-04-01
Target Obscuration, including foliage or building obscuration of ground targets and landscape or horizon obscuration of airborne targets, plagues many real world filtering problems. In particular, ground moving target identification Doppler radar, mounted on a surveillance aircraft or unattended airborne vehicle, is used to detect motion consistent with targets of interest. However, these targets try to obscure themselves (at least partially) by, for example, traveling along the edge of a forest or around buildings. This has the effect of creating random blockages in the Doppler radar image that move dynamically and somewhat randomly through this image. Herein, we address tracking problems with target obscuration by building memory into the observations, eschewing the usual corrupted, distorted partial measurement assumptions of filtering in favor of dynamic Markov chain assumptions. In particular, we assume the observations are a Markov chain whose transition probabilities depend upon the signal. The state of the observation Markov chain attempts to depict the current obscuration and the Markov chain dynamics are used to handle the evolution of the partially obscured radar image. Modifications of the classical filtering equations that allow observation memory (in the form of a Markov chain) are given. We use particle filters to estimate the position of the moving targets. Moreover, positive proof-of-concept simulations are included.
Parsing Social Network Survey Data from Hidden Populations Using Stochastic Context-Free Grammars
Poon, Art F. Y.; Brouwer, Kimberly C.; Strathdee, Steffanie A.; Firestone-Cruz, Michelle; Lozada, Remedios M.; Kosakovsky Pond, Sergei L.; Heckathorn, Douglas D.; Frost, Simon D. W.
2009-01-01
Background Human populations are structured by social networks, in which individuals tend to form relationships based on shared attributes. Certain attributes that are ambiguous, stigmatized or illegal can create a 'hidden' population, so-called because its members are difficult to identify. Many hidden populations are also at an elevated risk of exposure to infectious diseases. Consequently, public health agencies are presently adopting modern survey techniques that traverse social networks in hidden populations by soliciting individuals to recruit their peers, e.g., respondent-driven sampling (RDS). The concomitant accumulation of network-based epidemiological data, however, is rapidly outpacing the development of computational methods for analysis. Moreover, current analytical models rely on unrealistic assumptions, e.g., that the traversal of social networks can be modeled by a Markov chain rather than a branching process. Methodology/Principal Findings Here, we develop a new methodology based on stochastic context-free grammars (SCFGs), which are well-suited to modeling the tree-like structure of the RDS recruitment process. We apply this methodology to an RDS case study of injection drug users (IDUs) in Tijuana, México, a hidden population at high risk of blood-borne and sexually-transmitted infections (i.e., HIV, hepatitis C virus, syphilis). Survey data were encoded as text strings that were parsed using our custom implementation of the inside-outside algorithm in a publicly-available software package (HyPhy), which uses either expectation maximization or direct optimization methods and permits constraints on model parameters for hypothesis testing. We identified significant latent variability in the recruitment process that violates assumptions of Markov chain-based methods for RDS analysis: firstly, IDUs tended to emulate the recruitment behavior of their own recruiter; and secondly, the recruitment of like peers (homophily) was dependent on the number of
Markov speckle for efficient random bit generation.
Horstmeyer, Roarke; Chen, Richard Y; Judkewitz, Benjamin; Yang, Changhuei
2012-11-19
Optical speckle is commonly observed in measurements using coherent radiation. While lacking experimental validation, previous work has often assumed that speckle's random spatial pattern follows a Markov process. Here, we present a derivation and experimental confirmation of conditions under which this assumption holds true. We demonstrate that a detected speckle field can be designed to obey the first-order Markov property by using a Cauchy attenuation mask to modulate scattered light. Creating Markov speckle enables the development of more accurate and efficient image post-processing algorithms, with applications including improved de-noising, segmentation and super-resolution. To show its versatility, we use the Cauchy mask to maximize the entropy of a detected speckle field with fixed average speckle size, allowing cryptographic applications to extract a maximum number of useful random bits from speckle images.
Policy Transfer via Markov Logic Networks
NASA Astrophysics Data System (ADS)
Torrey, Lisa; Shavlik, Jude
We propose using a statistical-relational model, the Markov Logic Network, for knowledge transfer in reinforcement learning. Our goal is to extract relational knowledge from a source task and use it to speed up learning in a related target task. We show that Markov Logic Networks are effective models for capturing both source-task Q-functions and source-task policies. We apply them via demonstration, which involves using them for decision making in an initial stage of the target task before continuing to learn. Through experiments in the RoboCup simulated-soccer domain, we show that transfer via Markov Logic Networks can significantly improve early performance in complex tasks, and that transferring policies is more effective than transferring Q-functions.
Markov chains for testing redundant software
NASA Technical Reports Server (NTRS)
White, Allan L.; Sjogren, Jon A.
1988-01-01
A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
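The estimation step described above, obtaining transition probabilities between the error states of a Markov chain from observed runs, amounts to counting one-step transitions. A minimal maximum-likelihood sketch, where the error-state labels and the sample sequence are purely illustrative:

```python
from collections import Counter

def estimate_transition_probabilities(states):
    """Maximum-likelihood estimate of Markov-chain transition probabilities
    from an observed state sequence, by counting one-step transitions."""
    pair_counts = Counter(zip(states, states[1:]))
    visit_counts = Counter(states[:-1])
    return {(i, j): n / visit_counts[i] for (i, j), n in pair_counts.items()}

# Illustrative error-state sequence: 0 = no error observed, 1 = error observed.
observed = [0, 0, 1, 0, 0, 0, 1, 1, 0, 0]
probs = estimate_transition_probabilities(observed)
```

In the experiment described above, confidence intervals around each estimated probability would then feed the reliability model; that step is omitted here.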
How to estimate the heat production of a 'hidden' reservoir in Earth's mantle
NASA Astrophysics Data System (ADS)
Korenaga, J.
2008-12-01
The possibility of a hidden geochemical reservoir in the deep mantle has long been debated in geophysics and geochemistry, because of its bearings on the structure of the core-mantle boundary region, the origin of hotspots, the style of mantle convection, the history of the geomagnetic field, and the thermal evolution of Earth. The geochemical nature of a hidden reservoir, however, has been estimated based on composition models for the bulk silicate Earth, although these models preclude, in principle, the presence of such a reservoir. Here we present a new self-consistent framework to estimate the neodymium and samarium concentration of a hidden reservoir and also constrain the heat production of the bulk silicate Earth, based on the notion of early global differentiation. Our geochemical inference is formulated as a nonlinear inverse problem, and the permissible solution space, delineated by Markov chain Monte Carlo simulations, indicates that an early enriched reservoir may occupy ~13% of the mantle with internal heat production of ~6 TW. If a hidden reservoir corresponds to the D" layer instead, its heat production would be only ~4 TW. The heat production of the bulk silicate Earth is estimated to be 18.9±3.8 TW, which is virtually independent of the likely reservoir size.
A 1-D dusty plasma photonic crystal
Mitu, M. L.; Ticoş, C. M.; Toader, D.; Banu, N.; Scurtu, A.
2013-09-21
It is demonstrated numerically that a 1-D plasma crystal made of micron size cylindrical dust particles can, in principle, work as a photonic crystal for terahertz waves. The dust rods are parallel to each other and arranged in a linear string forming a periodic structure of dielectric-plasma regions. The dispersion equation is found by solving the wave equation with the boundary conditions at the dust-plasma interface and taking into account the dielectric permittivity of the dust material and plasma. The wavelength of the electromagnetic waves is in the range of a few hundred microns, close to the interparticle separation distance. The band gaps of the 1-D plasma crystal are numerically found for different types of dust materials, separation distances between the dust rods and rod diameters. The distance between levitated dust rods forming a string in rf plasma is shown experimentally to vary over a relatively wide range, from 650 μm to about 1350 μm, depending on the rf power fed into the discharge.
1D fast coded aperture camera.
Haw, Magnus; Bellan, Paul
2015-04-01
A fast (100 MHz) 1D coded aperture visible light camera has been developed as a prototype for imaging plasma experiments in the EUV/X-ray bands. The system uses printed patterns on transparency sheets as the masked aperture and an 80 channel photodiode array (9 V reverse bias) as the detector. In the low signal limit, the system has demonstrated 40-fold increase in throughput and a signal-to-noise gain of ≈7 over that of a pinhole camera of equivalent parameters. In its present iteration, the camera can only image visible light; however, the only modifications needed to make the system EUV/X-ray sensitive are to acquire appropriate EUV/X-ray photodiodes and to machine a metal masked aperture. PMID:25933861
1D-VAR Retrieval Using Superchannels
NASA Technical Reports Server (NTRS)
Liu, Xu; Zhou, Daniel; Larar, Allen; Smith, William L.; Schluessel, Peter; Mango, Stephen; SaintGermain, Karen
2008-01-01
Since modern ultra-spectral remote sensors have thousands of channels, it is difficult to include all of them in a 1D-Var retrieval system. We will describe a physical inversion algorithm, which includes all available channels for the atmospheric temperature, moisture, cloud, and surface parameter retrievals. Both the forward model and the inversion algorithm compress the channel radiances into superchannels. These superchannels are obtained by projecting the radiance spectra onto a set of pre-calculated eigenvectors. The forward model provides both superchannel properties and Jacobians in EOF space directly. For ultra-spectral sensors such as the Infrared Atmospheric Sounding Interferometer (IASI) and the NPOESS Airborne Sounder Testbed Interferometer (NAST), a compression ratio of more than 80 can be achieved, leading to a significant reduction in the computations involved in an inversion process. Results will be shown applying the algorithm to real IASI and NAST data.
NASA Astrophysics Data System (ADS)
Birkel, C.; Paroli, R.; Spezia, L.; Tetzlaff, D.; Soulsby, C.
2012-12-01
In this paper we present a novel model framework using the class of Markov Switching Autoregressive Models (MSARMs) to examine catchments as complex stochastic systems that exhibit non-stationary, non-linear and non-Normal rainfall-runoff and solute dynamics. MSARMs are pairs of stochastic processes, one observed and one unobserved, or hidden. We model the unobserved process as a finite-state Markov chain and assume that the observed process, given the hidden Markov chain, is conditionally autoregressive, meaning that the current observation depends on its recent past (system memory). The model is fully embedded in a Bayesian analysis based on Markov chain Monte Carlo (MCMC) algorithms for model selection and uncertainty assessment, whereby the autoregressive order and the dimension of the hidden Markov chain state-space are essentially self-selected. The hidden states of the Markov chain represent unobserved levels of variability in the observed process that may result from complex interactions of hydroclimatic variability on the one hand and catchment characteristics affecting water and solute storage on the other. To deal with non-stationarity, additional meteorological and hydrological time series, along with a periodic component, can be included in the MSARMs as covariates. This extension allows identification of potential underlying drivers of temporal rainfall-runoff and solute dynamics. We applied the MSARM framework to streamflow and conservative tracer (deuterium and oxygen-18) time series from an intensively monitored 2.3 km² experimental catchment in eastern Scotland. Statistical time series analysis, in the form of MSARMs, suggested that the streamflow and isotope tracer time series are not controlled by simple linear rules. MSARMs showed that the dependence of current observations on past inputs, often observed by transport models in the form of long-tailed travel time and residence time distributions, can be efficiently explained by
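The pairing described above, a hidden Markov chain selecting the regime of a conditionally autoregressive observed process, can be illustrated by simulation. A minimal two-regime AR(1) sketch; all parameter values are made-up illustrations, not fitted values from the study.

```python
import random

def simulate_msar(n, trans, ar, noise_sd, seed=0):
    """Simulate a two-regime Markov-switching AR(1) process: a hidden
    two-state Markov chain selects, at each step, the AR coefficient and
    innovation scale of the observed series."""
    rng = random.Random(seed)
    state, y = 0, 0.0
    hidden, observed = [], []
    for _ in range(n):
        # Advance the hidden chain: trans[i][0] is P(next state = 0 | state i).
        state = 0 if rng.random() < trans[state][0] else 1
        y = ar[state] * y + rng.gauss(0.0, noise_sd[state])
        hidden.append(state)
        observed.append(y)
    return hidden, observed

# Illustrative regimes: a quick-decaying, low-noise regime (e.g. baseflow-like)
# and a persistent, high-noise regime (e.g. storm-response-like).
hidden, flow = simulate_msar(500, [[0.95, 0.05], [0.10, 0.90]],
                             ar=[0.3, 0.9], noise_sd=[0.1, 0.5])
```

Inference in the paper runs the other way, recovering the hidden regimes and the autoregressive order from the observed series via MCMC, which this forward simulation does not attempt.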
On quantum algorithms for noncommutative hidden subgroups
Ettinger, M.; Hoeyer, P.
1998-12-01
Quantum algorithms for factoring and discrete logarithm have previously been generalized to finding hidden subgroups of finite Abelian groups. This paper explores the possibility of extending this general viewpoint to finding hidden subgroups of noncommutative groups. The authors present a quantum algorithm for the special case of dihedral groups which determines the hidden subgroup in a linear number of calls to the input function. They also explore the difficulties of developing an algorithm to process the data to explicitly calculate a generating set for the subgroup. A general framework for the noncommutative hidden subgroup problem is discussed and they indicate future research directions.
Quantum computation and hidden variables
NASA Astrophysics Data System (ADS)
Aristov, V. V.; Nikulov, A. V.
2008-03-01
Many physicists limit themselves to an instrumentalist description of quantum phenomena and ignore the problems of the foundation and interpretation of quantum mechanics. This instrumentalist approach results in "specialization barbarism" and mass delusion concerning the problem of how a quantum computer can be made. The idea of quantum computation can be described within the limits of the quantum formalism. But in order to understand how this idea can be put into practice, one should confront the question "What could the quantum formalism describe?", in spite of the absence of a universally recognized answer. Only by facing this question and the undecided problem of quantum foundations can one see in which quantum systems superposition and EPR correlation could be expected. Because of this "specialization barbarism", many authors are sure that Bell proved the full impossibility of any hidden-variables interpretation. It is therefore important to emphasize that in reality Bell restricted the validity limits of the no-hidden-variables proof and showed that a two-state quantum system can be described by hidden variables. The latter means that no experimental result obtained on a two-state quantum system can prove the existence of superposition and the violation of realism. One should not assume, before unambiguous experimental evidence, that any two-state quantum system is a quantum bit. No experimental evidence of superposition of macroscopically distinct quantum states, or of a quantum bit based on a superconductor structure, has been obtained to date. Moreover, the same experimental results cannot be described within the limits of the quantum formalism.
Solving the "Hidden Line" Problem
NASA Technical Reports Server (NTRS)
1984-01-01
David Hedgley Jr., a mathematician at Dryden Flight Research Center, has developed an accurate computer program that considers whether a line in a graphic model of a three dimensional object should or should not be visible. The Hidden Line Computer Code, program automatically removes superfluous lines and permits the computer to display an object from specific viewpoints, just as the human eye would see it. Users include Rowland Institute for Science in Cambridge, MA, several departments of Lockheed Georgia Co., and Nebraska Public Power District (NPPD).
Markov random field and Gaussian mixture for segmented MRI-based partial volume correction in PET.
Bousse, Alexandre; Pedemonte, Stefano; Thomas, Benjamin A; Erlandsson, Kjell; Ourselin, Sébastien; Arridge, Simon; Hutton, Brian F
2012-10-21
In this paper we propose a segmented magnetic resonance imaging (MRI) prior-based maximum penalized likelihood deconvolution technique for positron emission tomography (PET) images. The model assumes the existence of activity classes that behave like a hidden Markov random field (MRF) driven by the segmented MRI. We utilize a mean field approximation to compute the likelihood of the MRF. We tested our method on both simulated and clinical data (brain PET) and compared our results with PET images corrected with the re-blurred Van Cittert (VC) algorithm, the simplified Guven (SG) algorithm and the region-based voxel-wise (RBV) technique. We demonstrated that our algorithm outperforms the VC algorithm, and outperforms the SG and RBV corrections when the segmented MRI is inconsistent with the PET image (e.g. mis-segmentation, lesions, etc.).
Identifying bubble collapse in a hydrothermal system using hidden Markov models
Dawson, Phillip B.; Benitez, M.C.; Lowenstern, Jacob B.; Chouet, Bernard A.
2012-01-01
Beginning in July 2003 and lasting through September 2003, the Norris Geyser Basin in Yellowstone National Park exhibited an unusual increase in ground temperature and hydrothermal activity. Using hidden Markov model theory, we identify over five million high-frequency (>15 Hz) seismic events observed at a temporary seismic station deployed in the basin in response to the increase in hydrothermal activity. The source of these seismic events is constrained to within ~100 m of the station, and produced ~3500–5500 events per hour with mean durations of ~0.35–0.45 s. The seismic event rate, air temperature, hydrologic temperatures, and surficial water flow of the geyser basin exhibited a marked diurnal pattern that was closely associated with solar thermal radiance. We interpret the source of the seismicity to be due to the collapse of small steam bubbles in the hydrothermal system, with the rate of collapse being controlled by surficial temperatures and daytime evaporation rates.
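Decoding the most likely hidden-state sequence in an HMM, which underlies event identification like that described above, is typically done with the Viterbi algorithm. A minimal log-domain sketch over a hypothetical two-state (quiet/event) model; all state names, emission symbols, and probabilities are illustrative, not the study's actual model.

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state path for a discrete-emission HMM,
    computed in the log domain for numerical stability."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # Best predecessor for state s at this step.
            lp, prev = max(
                (V[-2][p] + math.log(trans_p[p][s]) + math.log(emit_p[s][o]), p)
                for p in states
            )
            V[-1][s] = lp
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Toy two-state model: 'quiet' vs 'event', emitting 'low'/'high' amplitudes.
decoded = viterbi(
    ['low', 'high', 'high', 'low'],
    ('quiet', 'event'),
    {'quiet': 0.8, 'event': 0.2},
    {'quiet': {'quiet': 0.9, 'event': 0.1}, 'event': {'quiet': 0.2, 'event': 0.8}},
    {'quiet': {'low': 0.9, 'high': 0.1}, 'event': {'low': 0.2, 'high': 0.8}},
)
```

The practical systems described in these abstracts operate on continuous feature vectors rather than discrete symbols, but the decoding principle is the same.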
Hidden scale invariance of metals
NASA Astrophysics Data System (ADS)
Hummel, Felix; Kresse, Georg; Dyre, Jeppe C.; Pedersen, Ulf R.
2015-11-01
Density functional theory (DFT) calculations of 58 liquid elements at their triple point show that most metals exhibit near proportionality between the thermal fluctuations of the virial and the potential energy in the isochoric ensemble. This demonstrates a general "hidden" scale invariance of metals making the condensed part of the thermodynamic phase diagram effectively one dimensional with respect to structure and dynamics. DFT computed density scaling exponents, related to the Grüneisen parameter, are in good agreement with experimental values for the 16 elements where reliable data were available. Hidden scale invariance is demonstrated in detail for magnesium by showing invariance of structure and dynamics. Computed melting curves of period three metals follow curves with invariance (isomorphs). The experimental structure factor of magnesium is predicted by assuming scale invariant inverse power-law (IPL) pair interactions. However, crystal packings of several transition metals (V, Cr, Mn, Fe, Nb, Mo, Ta, W, and Hg), most post-transition metals (Ga, In, Sn, and Tl), and the metalloids Si and Ge cannot be explained by the IPL assumption. The virial-energy correlation coefficients of iron and phosphorous are shown to increase at elevated pressures. Finally, we discuss how scale invariance explains the Grüneisen equation of state and a number of well-known empirical melting and freezing rules.
Dissolved gas - the hidden saboteur
Magorien, V.G.
1993-12-31
Almost all hydraulic power components, to properly perform their tasks, rely on one basic physical property: the incompressibility of the working fluid. Unfortunately, a frequently overlooked fluid property which frustrates this requirement is its ability to absorb, i.e., dissolve, store and give off gas. The gas is most often, but not always, air. This property is a complex one because it is a function not only of the fluid's chemical make-up but of temperature, pressure, exposed area, depth and time. In its relationship to aircraft landing gear, where energy is absorbed hydraulically, this multi-faceted fluid property can be detrimental in two ways: dynamically, i.e., loss of energy-absorption ability, and statically, i.e., improper aircraft attitude on the ground. The purpose of this paper is to bring awareness to this property by presenting: (1) examples of these manifestations with some empirical and practical solutions to them, (2) illustrations of this normally 'hidden saboteur' at work, (3) Henry's Dissolved Gas Law, (4) room-temperature, saturated values of dissolved gas for a number of different working fluids, (5) a description of the instrument used to obtain them, (6) some 'missing elements' of the Dissolved Gas Law pertaining to absorption, (7) how static and dynamic conditions affect gas absorption and (8) some recommended solutions to prevent becoming a victim of this 'hidden saboteur'.
1D-1D Coulomb drag in a 6 Million Mobility Bi-layer Heterostructure
NASA Astrophysics Data System (ADS)
Bilodeau, Simon; Laroche, Dominique; Xia, Jian-Sheng; Lilly, Mike; Reno, John; Pfeiffer, Loren; West, Ken; Gervais, Guillaume
We report Coulomb drag measurements in vertically-coupled quantum wires. The wires are fabricated in GaAs/AlGaAs bilayer heterostructures grown from two different MBE chambers: one at Sandia National Laboratories (1.2M mobility), and the other at Princeton University (6M mobility). The previously observed positive and negative drag signals are seen in both types of devices, demonstrating the robustness of the result. However, attempts to determine the temperature dependence of the drag signal in the 1D regime proved challenging in the higher mobility heterostructure (Princeton), in part because of difficulties in aligning the wires within the same transverse subband configuration. Nevertheless, this work, performed at the Microkelvin laboratory of the University of Florida, is an important proof-of-concept for future investigations of the temperature dependence of the 1D-1D drag signal down to a few mK. Such an experiment could confirm the Luttinger charge density wave interlocking predicted to occur in the wires. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL8500.
A critical appraisal of Markov state models
NASA Astrophysics Data System (ADS)
Schütte, Ch.; Sarich, M.
2015-09-01
Markov State Modelling as a concept for a coarse-grained description of the essential kinetics of a molecular system in equilibrium has gained a lot of attention recently. The last 10 years have seen ever-increasing publication activity on how to construct Markov State Models (MSMs) for very different molecular systems, ranging from peptides to proteins, from RNA to DNA, and from molecular sensors to molecular aggregation. Simultaneously, the accompanying theory behind MSM building and approximation quality has been developed well beyond the concepts and ideas used in practical applications. This article reviews the main theoretical results, provides links to crucial new developments, outlines the full power of MSM building today, and discusses the essential limitations still to be overcome.
Equivalent Markov processes under gauge group.
Caruso, M; Jarne, C
2015-11-01
We have studied Markov processes on a denumerable state space in continuous time. We found that all these processes are connected via gauge transformations. We have used this result before as a method to solve equations, including a case, in a previous work, in which the sample space is time-dependent [Phys. Rev. E 90, 022125 (2014)]. We found a general solution through dilation of the state space, although the prior probability distribution of the states defined in this new space takes smaller values than in the initial problem. The gauge (local) group of dilations modifies the distribution on the dilated space to restore the original process. In this work, we show how Markov processes in general can be linked via gauge (local) transformations, and we present some illustrative examples of this result.
The cutoff phenomenon in finite Markov chains.
Diaconis, P
1996-01-01
Natural mixing processes modeled by Markov chains often show a sharp cutoff in their convergence to long-time behavior. This paper presents problems where the cutoff can be proved (card shuffling, the Ehrenfests' urn). It shows that chains with polynomial growth (drunkard's walk) do not show cutoffs. The best general understanding of such cutoffs (high multiplicity of second eigenvalues due to symmetry) is explored. Examples are given where the symmetry is broken but the cutoff phenomenon persists. PMID:11607633
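The Ehrenfests' urn cutoff mentioned above can be observed numerically: the total-variation distance to stationarity stays near 1 for a while, then falls off sharply. A minimal sketch using the lazy version of the urn, with illustrative parameters:

```python
from math import comb

# Total-variation distance to stationarity for the lazy Ehrenfest urn
# with n balls, started with all balls in one urn.  The sharp drop of
# the distance is the cutoff phenomenon; n and the horizon are
# illustrative choices, not values from the paper.

n = 20

def step(p):
    """One step of the lazy Ehrenfest chain on states 0..n."""
    q = [0.0] * (n + 1)
    for k in range(n + 1):
        q[k] += 0.5 * p[k]                      # laziness avoids periodicity
        if k > 0:
            q[k - 1] += 0.5 * (k / n) * p[k]    # a ball leaves the urn
        if k < n:
            q[k + 1] += 0.5 * ((n - k) / n) * p[k]  # a ball enters
    return q

pi = [comb(n, k) / 2 ** n for k in range(n + 1)]  # Binomial(n, 1/2) stationary law
p = [0.0] * (n + 1)
p[0] = 1.0
tv = []
for t in range(200):
    tv.append(0.5 * sum(abs(a - b) for a, b in zip(p, pi)))
    p = step(p)
# tv stays near 1, then drops abruptly around (n/2) log n steps
```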
Numerical methods in Markov chain modeling
NASA Technical Reports Server (NTRS)
Philippe, Bernard; Saad, Youcef; Stewart, William J.
1989-01-01
Several methods for computing stationary probability distributions of Markov chains are described and compared. The main linear algebra problem consists of computing an eigenvector of a sparse, usually nonsymmetric, matrix associated with a known eigenvalue. It can also be cast as a problem of solving a homogeneous singular linear system. Several methods based on combinations of Krylov subspace techniques are presented. The performance of these methods on some realistic problems is compared.
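The simplest of these approaches, the power method applied to the transition matrix, can be sketched as follows; the 3-state chain is an arbitrary example, not one of the paper's test problems:

```python
# Power iteration for the stationary distribution: repeatedly apply the
# transition matrix on the left until pi P ≈ pi (the eigenvector for the
# known eigenvalue 1).  The 3-state chain below is invented.

P = [[0.5, 0.3, 0.2],
     [0.1, 0.8, 0.1],
     [0.2, 0.3, 0.5]]

pi = [1 / 3, 1 / 3, 1 / 3]          # any probability vector works as a start
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# pi now satisfies pi P ≈ pi and sums to 1
```

The Krylov methods the paper compares accelerate exactly this fixed-point computation for large sparse matrices.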
Markov-switching model for nonstationary runoff conditioned on El Niño information
NASA Astrophysics Data System (ADS)
Gelati, E.; Madsen, H.; Rosbjerg, D.
2010-02-01
We define a Markov-modulated autoregressive model with exogenous input (MARX) to generate runoff scenarios using climatic information. Runoff parameterization is assumed to be conditioned on a hidden climate state following a Markov chain, where state transition probabilities are functions of the climatic input. MARX allows stochastic modeling of nonstationary runoff, as runoff anomalies are described by a mixture of autoregressive models with exogenous input, each one corresponding to a climate state. We apply MARX to inflow time series of the Daule Peripa reservoir (Ecuador). El Niño-Southern Oscillation (ENSO) information is used to condition runoff parameterization. Among the investigated ENSO indexes, the NINO 1+2 sea surface temperature anomalies and the trans-Niño index perform best as predictors. In the perspective of reservoir optimization at various time scales, MARX produces realistic long-term scenarios and short-term forecasts, especially when intense El Niño events occur. Low predictive ability is found for negative runoff anomalies, as no climatic index correlating properly with negative inflow anomalies has yet been identified.
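A minimal sketch of the Markov-modulated AR idea follows. All parameters are invented, the state-transition matrix is held constant (MARX makes it a function of the climatic input), and the exogenous input is a synthetic warm event:

```python
import random

# Toy Markov-modulated AR(1) with exogenous input: a hidden climate
# state follows a two-state Markov chain, and each state selects its own
# AR coefficient, input gain, and noise level.  Every number below is
# illustrative; the real MARX model also makes the transition
# probabilities depend on the climatic input.

random.seed(1)

trans = [[0.9, 0.1],   # state 0: "neutral"
         [0.3, 0.7]]   # state 1: "El Nino"
ar    = [0.6, 0.3]     # AR coefficient per state
gain  = [0.1, 1.2]     # sensitivity to the exogenous climate index
sigma = [0.2, 0.5]     # noise level per state

def simulate(climate_index):
    s, x, out = 0, 0.0, []
    for u in climate_index:
        s = 0 if random.random() < trans[s][0] else 1   # hidden state moves
        x = ar[s] * x + gain[s] * u + random.gauss(0.0, sigma[s])
        out.append(x)
    return out

# 50 neutral steps followed by a synthetic warm event
anomalies = simulate([0.0] * 50 + [2.0] * 50)
```

During the warm event the mixture shifts toward the high-gain state, producing the larger, more persistent runoff anomalies the abstract describes.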
PyEMMA 2: A Software Package for Estimation, Validation, and Analysis of Markov Models.
Scherer, Martin K; Trendelkamp-Schroer, Benjamin; Paul, Fabian; Pérez-Hernández, Guillermo; Hoffmann, Moritz; Plattner, Nuria; Wehmeyer, Christoph; Prinz, Jan-Hendrik; Noé, Frank
2015-11-10
Markov (state) models (MSMs) and related models of molecular kinetics have recently received a surge of interest as they can systematically reconcile simulation data from either a few long or many short simulations and allow us to analyze the essential metastable structures, thermodynamics, and kinetics of the molecular system under investigation. However, the estimation, validation, and analysis of such models is far from trivial and involves sophisticated and often numerically sensitive methods. In this work we present the open-source Python package PyEMMA ( http://pyemma.org ) that provides accurate and efficient algorithms for kinetic model construction. PyEMMA can read all common molecular dynamics data formats, helps in the selection of input features, provides easy access to dimension reduction algorithms such as principal component analysis (PCA) and time-lagged independent component analysis (TICA) and clustering algorithms such as k-means, and contains estimators for MSMs, hidden Markov models, and several other models. Systematic model validation and error calculation methods are provided. PyEMMA offers a wealth of analysis functions such that the user can conveniently compute molecular observables of interest. We have derived a systematic and accurate way to coarse-grain MSMs to few states and to illustrate the structures of the metastable states of the system. Plotting functions to produce a manuscript-ready presentation of the results are available. In this work, we demonstrate the features of the software and show new methodological concepts and results produced by PyEMMA.
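The core MSM estimation step, counting transitions at a chosen lag time and row-normalizing, can be sketched from scratch; this is generic illustrative code, not the PyEMMA API:

```python
# Maximum-likelihood MSM estimation from a discretized trajectory:
# count transitions between states at lag `lag`, then row-normalize the
# count matrix into a transition matrix.  Toy data, not a PyEMMA call.

def estimate_msm(dtraj, n_states, lag):
    counts = [[0] * n_states for _ in range(n_states)]
    for t in range(len(dtraj) - lag):
        counts[dtraj[t]][dtraj[t + lag]] += 1
    T = []
    for row in counts:
        total = sum(row)
        T.append([c / total if total else 0.0 for c in row])
    return T

# toy discretized trajectory over two metastable states
dtraj = [0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1, 1]
T = estimate_msm(dtraj, n_states=2, lag=1)
```

Validation in practice checks, among other things, that implied timescales are constant in the lag time; the package automates such tests.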
Hidden Variable Theories and Quantum Nonlocality
ERIC Educational Resources Information Center
Boozer, A. D.
2009-01-01
We clarify the meaning of Bell's theorem and its implications for the construction of hidden variable theories by considering an example system consisting of two entangled spin-1/2 particles. Using this example, we present a simplified version of Bell's theorem and describe several hidden variable theories that agree with the predictions of…
Smooth non-extremal D1-D5-P solutions as charged gravitational instantons
NASA Astrophysics Data System (ADS)
Chakrabarty, Bidisha; Rocha, Jorge V.; Virmani, Amitabh
2016-08-01
We present an alternative and more direct construction of the non-supersymmetric D1-D5-P supergravity solutions found by Jejjala, Madden, Ross and Titchener. We show that these solutions — with all three charges and both rotations turned on — can be viewed as a charged version of the Myers-Perry instanton. We present an inverse scattering construction of the Myers-Perry instanton metric in Euclidean five-dimensional gravity. The angular momentum bounds in this construction turn out to be precisely the ones necessary for the smooth microstate geometries. We add charges on the Myers-Perry instanton using appropriate SO(4, 4) hidden symmetry transformations. The full construction can be viewed as an extension and simplification of a previous work by Katsimpouri, Kleinschmidt and Virmani.
Heating up the Galaxy with hidden photons
Dubovsky, Sergei; Hernández-Chifflet, Guzmán
2015-12-29
We elaborate on the dynamics of ionized interstellar medium in the presence of hidden photon dark matter. Our main focus is the ultra-light regime, where the hidden photon mass is smaller than the plasma frequency in the Milky Way. We point out that as a result of the Galactic plasma shielding direct detection of ultra-light photons in this mass range is especially challenging. However, we demonstrate that ultra-light hidden photon dark matter provides a powerful heating source for the ionized interstellar medium. This results in a strong bound on the kinetic mixing between hidden and regular photons all the way down to hidden photon masses of order 10⁻²⁰ eV.
Jung, Minsoo
2015-01-01
When there is no sampling frame for a certain group, or the group is concerned that making its membership public would bring social stigma, we say the population is hidden. Such populations are difficult to approach with standard survey methods because the response rate is low and members are not entirely honest in their responses when probability sampling is used. The only alternative known to address the problems of earlier methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this thesis, the dependence of RDS's chain-referral sampling on the initial seeds tends to diminish as the sample gets bigger, and the sample stabilizes as the waves progress. Therefore, the final sample can be completely independent of the initial seeds if a sufficient sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that improves upon both key informant sampling and ethnographic surveys, and it needs to be utilized for various cases domestically as well. PMID:26107223
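The Markov-chain basis of RDS can be illustrated by modeling referrals as a random walk on a toy social network: the long-run visit frequencies are proportional to each member's degree and independent of the seed. The network below is invented:

```python
import random

# Chain referral modeled as a random walk on a small (invented) social
# network.  Two long referral chains started from different seeds visit
# members with nearly identical, degree-proportional frequencies — the
# property that lets RDS approximate probability sampling.

random.seed(0)
network = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}

def referral_chain(seed, waves):
    visits = {v: 0 for v in network}
    current = seed
    for _ in range(waves):
        current = random.choice(network[current])  # referral to a random contact
        visits[current] += 1
    return visits

a = referral_chain(seed=0, waves=100000)
b = referral_chain(seed=4, waves=100000)
# both runs approach the same degree-proportional distribution
```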
Bartolucci, Francesco; Farcomeni, Alessio
2015-03-01
Mixed latent Markov (MLM) models represent an important tool of analysis of longitudinal data when response variables are affected by time-fixed and time-varying unobserved heterogeneity, in which the latter is accounted for by a hidden Markov chain. In order to avoid bias when using a model of this type in the presence of informative drop-out, we propose an event-history (EH) extension of the latent Markov approach that may be used with multivariate longitudinal data, in which one or more outcomes of a different nature are observed at each time occasion. The EH component of the resulting model is referred to the interval-censored drop-out, and bias in MLM modeling is avoided by correlated random effects, included in the different model components, which follow common latent distributions. In order to perform maximum likelihood estimation of the proposed model by the expectation-maximization algorithm, we extend the usual forward-backward recursions of Baum and Welch. The algorithm has the same complexity as the one adopted in cases of non-informative drop-out. We illustrate the proposed approach through simulations and an application based on data coming from a medical study about primary biliary cirrhosis in which there are two outcomes of interest, one continuous and the other binary. PMID:25227970
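For reference, the standard forward recursion that the extended Baum-Welch step builds on can be sketched for a plain hidden Markov chain; the paper's version adds the event-history component, and all parameter values below are illustrative:

```python
# Forward recursion for a two-state hidden Markov chain with binary
# emissions: alpha_t(s) accumulates the probability of the observations
# up to time t together with hidden state s.  All numbers are invented.

A = [[0.8, 0.2],          # hidden-state transition matrix
     [0.3, 0.7]]
B = [[0.9, 0.1],          # emission probabilities per hidden state
     [0.2, 0.8]]
init = [0.5, 0.5]         # initial hidden-state distribution

def forward_likelihood(obs):
    alpha = [init[s] * B[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(2)) * B[t][o]
                 for t in range(2)]
    return sum(alpha)     # marginal likelihood of the observation sequence

p = forward_likelihood([0, 0, 1, 1, 0])
```

The backward pass runs the analogous recursion in reverse; together they give the posteriors needed in the E-step, at the same cost as in the non-informative drop-out case.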
NASA Technical Reports Server (NTRS)
English, Thomas
2005-01-01
A standard tool of reliability analysis used at NASA-JSC is the event tree. An event tree is simply a probability tree, with the probabilities determining the next step through the tree specified at each node. The nodal probabilities are determined by a reliability study of the physical system at work for a particular node. The reliability study performed at a node is typically referred to as a fault tree analysis, with the potential of a fault tree existing for each node on the event tree. When examining an event tree it is obvious why the event tree/fault tree approach has been adopted. Typical event trees are quite complex in nature, and the event tree/fault tree approach provides a systematic and organized approach to reliability analysis. The purpose of this study was twofold. Firstly, we wanted to explore the possibility that a semi-Markov process can create dependencies between sojourn times (the times it takes to transition from one state to the next) that can decrease the uncertainty when estimating times to failure. Using a generalized semi-Markov model, we studied a four-element reliability model and were able to demonstrate such sojourn time dependencies. Secondly, we wanted to study the use of semi-Markov processes to introduce a time variable into the event tree diagrams that are commonly developed in PRA (Probabilistic Risk Assessment) analyses. Event tree end states which change with time are more representative of failure scenarios than are the usual static probability-derived end states.
Weinshank, R L; Zgombick, J M; Macchi, M J; Branchek, T A; Hartig, P R
1992-01-01
The serotonin 1D (5-HT1D) receptor is a pharmacologically defined binding site and functional receptor site. Observed variations in the properties of 5-HT1D receptors in different tissues have led to the speculation that multiple receptor proteins with slightly different properties may exist. We report here the cloning, deduced amino acid sequences, pharmacological properties, and second-messenger coupling of a pair of human 5-HT1D receptor genes, which we have designated 5-HT1D alpha and 5-HT1D beta due to their strong similarities in sequence, pharmacological properties, and second-messenger coupling. Both genes are free of introns in their coding regions, are expressed in the human cerebral cortex, and can couple to inhibition of adenylate cyclase activity. The pharmacological binding properties of these two human receptors are very similar, and match closely the pharmacological properties of human, bovine, and guinea pig 5-HT1D sites. Both receptors exhibit high-affinity binding of sumatriptan, a new anti-migraine medication, and thus are candidates for the pharmacological site of action of this drug. PMID:1565658
A classification of hidden-variable properties
NASA Astrophysics Data System (ADS)
Brandenburger, Adam; Yanofsky, Noson
2008-10-01
Hidden variables are extra components added to try to banish counterintuitive features of quantum mechanics. We start with a quantum-mechanical model and describe various properties that can be asked of a hidden-variable model. We present six such properties and a Venn diagram of how they are related. With two existence theorems and three no-go theorems (EPR, Bell and Kochen-Specker), we show which properties of empirically equivalent hidden-variable models are possible and which are not. Formally, our treatment relies only on classical probability models, and physical phenomena are used only to motivate which models to choose.
NASA Astrophysics Data System (ADS)
Korenaga, Jun
2009-11-01
The possibility of a hidden geochemical reservoir in the deep mantle has long been debated in geophysics and geochemistry because of its bearing on the structure of the core-mantle boundary region, the origin of hotspots, the style of mantle convection, the history of the geomagnetic field, and the thermal evolution of Earth. The presence of such a hidden reservoir, however, may invalidate existing models for the composition of the bulk silicate Earth, because these models invariably assume that major chemical differentiation in the mantle follows the compositional trend exhibited by upper-mantle rocks. This article presents a new method to estimate the composition of the bulk silicate Earth by explicitly taking into account the possibility of a hidden reservoir. This geochemical inference is formulated as a nonlinear inverse problem, for which an efficient Markov chain Monte Carlo algorithm is developed. Inversion results indicate that the formation of a hidden reservoir, if any, took place at low pressures, probably within the first 10 Myr of the history of the solar system, and that the reservoir was subsequently lost from the Earth by impact erosion. The global mass balance of the bulk silicate Earth is revisited with the inversion results, and the depletion of highly incompatible elements in the present-day Earth is suggested to be moderate.
Hybrid Discrete-Continuous Markov Decision Processes
NASA Technical Reports Server (NTRS)
Feng, Zhengzhu; Dearden, Richard; Meuleau, Nicholas; Washington, Rich
2003-01-01
This paper proposes a Markov decision process (MDP) model that features both discrete and continuous state variables. We extend previous work by Boyan and Littman on the mono-dimensional time-dependent MDP to multiple dimensions. We present the principle of lazy discretization, and piecewise constant and linear approximations of the model. Having to deal with several continuous dimensions raises several new problems that require new solutions. In the (piecewise) linear case, we use techniques from partially- observable MDPs (POMDPS) to represent value functions as sets of linear functions attached to different partitions of the state space.
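The purely discrete baseline that the hybrid model extends is solved by standard value iteration; a sketch on an invented two-state, two-action MDP:

```python
# Value iteration for a discrete MDP: repeatedly apply the Bellman
# optimality backup until the value function converges.  The two-state,
# two-action problem below is invented for illustration.

# P[s][a] = list of (next_state, probability); R[s][a] = immediate reward
P = {0: {0: [(0, 0.9), (1, 0.1)], 1: [(1, 1.0)]},
     1: {0: [(0, 0.8), (1, 0.2)], 1: [(1, 1.0)]}}
R = {0: {0: 0.0, 1: 1.0},
     1: {0: 5.0, 1: 0.5}}
gamma = 0.9                       # discount factor

V = {0: 0.0, 1: 0.0}
for _ in range(1000):
    V = {s: max(R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
                for a in P[s])
         for s in P}
```

The hybrid model replaces the finite max-backup over states with piecewise constant or linear value functions over the continuous dimensions.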
Power spectral density of Markov texture fields
NASA Technical Reports Server (NTRS)
Shanmugan, K. S.; Holtzman, J. C.
1984-01-01
Texture is an important image characteristic. A variety of spatial-domain techniques have been proposed for extracting and utilizing textural features to segment and classify images. For the most part, these spatial-domain techniques are ad hoc in nature. A Markov random field model for image texture is discussed. A frequency-domain description of image texture is derived in terms of the power spectral density. This model is used for designing optimum frequency-domain filters for enhancing, restoring and segmenting images based on their textural properties.
Markov counting models for correlated binary responses.
Crawford, Forrest W; Zelterman, Daniel
2015-07-01
We propose a class of continuous-time Markov counting processes for analyzing correlated binary data and establish a correspondence between these models and sums of exchangeable Bernoulli random variables. Our approach generalizes many previous models for correlated outcomes, admits easily interpretable parameterizations, allows different cluster sizes, and incorporates ascertainment bias in a natural way. We demonstrate several new models for dependent outcomes and provide algorithms for computing maximum likelihood estimates. We show how to incorporate cluster-specific covariates in a regression setting and demonstrate improved fits to well-known datasets from familial disease epidemiology and developmental toxicology. PMID:25792624
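The correspondence with sums of exchangeable Bernoulli variables can be illustrated with one convenient mixing choice (a beta distribution, which is not the paper's model): a shared within-cluster success probability produces the over-dispersion that independent binomial outcomes cannot:

```python
import random

# Exchangeable Bernoulli outcomes via a shared random success
# probability: every member of a cluster draws with the same p ~ Beta(2, 2),
# so cluster totals are over-dispersed relative to an independent
# Binomial(n, 1/2).  The beta mixing choice is illustrative only.

random.seed(5)
n, clusters = 10, 20000

def cluster_total():
    p = random.betavariate(2, 2)          # shared within-cluster probability
    return sum(random.random() < p for _ in range(n))

totals = [cluster_total() for _ in range(clusters)]
mean = sum(totals) / clusters
var = sum((t - mean) ** 2 for t in totals) / clusters
binom_var = n * 0.5 * 0.5                 # variance under independence
# var clearly exceeds binom_var: the signature of within-cluster correlation
```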
Markov Chain Analysis of Musical Dice Games
NASA Astrophysics Data System (ADS)
Volchenkov, D.; Dawin, J. R.
2012-07-01
A system for using dice to compose music randomly is known as a musical dice game. The discrete-time MIDI models of 804 pieces of classical music written by 29 composers have been encoded into transition matrices and studied as Markov chains. Contrary to human languages, entropy dominates over redundancy in the musical dice games based on compositions of classical music. The maximum complexity is achieved on blocks consisting of just a few notes (8 notes, for the musical dice games generated over Bach's compositions). First passage times to notes can be used to resolve tonality and characterize a composer.
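The mean first-passage time to a target note can be computed directly from a transition matrix; a sketch with an invented 3-note matrix:

```python
# Mean first-passage times m[i] to a target note, from the linear system
# m[i] = 1 + sum over j != target of P[i][j] * m[j], solved here by
# fixed-point iteration.  The 3-note transition matrix is invented, not
# taken from the paper's MIDI corpus.

P = [[0.2, 0.5, 0.3],
     [0.4, 0.4, 0.2],
     [0.3, 0.3, 0.4]]
target = 0

m = [0.0, 0.0, 0.0]
for _ in range(10000):
    m = [1.0 + sum(P[i][j] * m[j] for j in range(3) if j != target)
         for i in range(3)]
# m[target] is the mean return time to the target note; the other
# entries are mean hitting times from the remaining notes
```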
Assessment of improved root growth representation in a 1-D, field scale crop model
NASA Astrophysics Data System (ADS)
Miltin Mboh, Cho; Gaiser, Thomas; Ewert, Frank
2015-04-01
Many 1-D, field-scale crop models over-simplify root growth. Over-simplifying this "hidden half" of the crop may have significant consequences for simulated root water and nutrient uptake, with a corresponding effect on the simulated crop yields. Poor representation of root growth in crop models may therefore constitute a major source of uncertainty propagation. In this study we assess the effect of an improved representation of root growth in a model solution of the model framework SIMPLACE (Scientific Impact assessment and Modeling PLatform for Advanced Crop and Ecosystem management) compared to conventional 1-D approaches. The LINTUL5 crop growth model is coupled to the Hillflow soil water balance model within the SIMPLACE modeling framework (Gaiser et al., 2013). Root water uptake scenarios in the soil hydrological simulator Hillflow (Bronstert, 1995), together with an improved representation of root growth, are compared to scenarios in which root growth is simplified. The improvement is achieved by integrating root growth solutions from R-SWMS (Javaux et al., 2008) into the SIMPLACE model solution. R-SWMS is a three-dimensional model for simultaneous modeling of root growth, soil water fluxes, and solute transport and uptake. These scenarios are tested by comparing how well the simulated water contents match the observed soil water dynamics. The impacts of the scenarios on above-ground biomass and wheat grain yield are also assessed.
Fibroid Tumors in Women: A Hidden Epidemic?
Past Issues / Spring 2007. Dr. Cynthia Morton is seeking women who have fibroid tumors for a "sister study".
Nonstrange and strange pentaquarks with hidden charm
NASA Astrophysics Data System (ADS)
Anisovich, V. V.; Matveev, M. A.; Nyiri, J.; Sarantsev, A. V.; Semenova, A. N.
2015-11-01
Nonstrange and strange pentaquarks with hidden charm are considered as diquark-diquark-antiquark composite systems. Spin and isospin content of such exotic states is discussed and masses are evaluated.
Markov transitions and the propagation of chaos
Gottlieb, A.
1998-12-01
The propagation of chaos is a central concept of kinetic theory that serves to relate the equations of Boltzmann and Vlasov to the dynamics of many-particle systems. Propagation of chaos means that molecular chaos, i.e., the stochastic independence of two random particles in a many-particle system, persists in time, as the number of particles tends to infinity. We establish a necessary and sufficient condition for a family of general n-particle Markov processes to propagate chaos. This condition is expressed in terms of the Markov transition functions associated to the n-particle processes, and it amounts to saying that chaos of random initial states propagates if it propagates for pure initial states. Our proof of this result relies on the weak convergence approach to the study of chaos due to Sznitman and Tanaka. We assume that the space in which the particles live is homeomorphic to a complete and separable metric space so that we may invoke Prohorov's theorem in our proof. We also show that, if the particles can be in only finitely many states, then molecular chaos implies that the specific entropies in the n-particle distributions converge to the entropy of the limiting single-particle distribution.
Stochastic seismic tomography by interacting Markov chains
NASA Astrophysics Data System (ADS)
Bottero, Alexis; Gesret, Alexandrine; Romary, Thomas; Noble, Mark; Maisons, Christophe
2016-10-01
Markov chain Monte Carlo sampling methods are widely used for non-linear Bayesian inversion, where no analytical expression for the forward relation between data and model parameters is available. Contrary to linear(ized) approaches, they naturally allow the uncertainties on the inferred model to be evaluated. Nevertheless, their use is problematic in high-dimensional model spaces, especially when the computational cost of the forward problem is significant and/or the a posteriori distribution is multimodal. In this case, the chain can stay stuck in one of the modes and hence fail to provide an exhaustive sampling of the distribution of interest. We present here a still relatively unknown algorithm that allows interaction between several Markov chains at different temperatures. These interactions (based on importance resampling) ensure a robust sampling of any posterior distribution and thus provide a way to efficiently tackle complex, fully non-linear inverse problems. The algorithm is easy to implement and is well adapted to run on parallel supercomputers. In this paper, the algorithm is first introduced and applied to a synthetic multimodal distribution in order to demonstrate its robustness and efficiency compared to a simulated annealing method. It is then applied in the framework of first-arrival traveltime seismic tomography on real data recorded in the context of hydraulic fracturing. To carry out this study, a wavelet-based adaptive model parametrization has been used. This allows the a priori information provided by sonic logs to be integrated and the dimension of the problem to be optimally reduced.
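The paper's interacting chains exchange information by importance resampling; the closely related replica-exchange (parallel tempering) idea can be sketched on a bimodal 1-D target that a single chain tends to get stuck in. All parameters below are illustrative:

```python
import math, random

# Parallel tempering sketch (a relative of the paper's importance-
# resampling interactions): several Metropolis chains run at different
# temperatures and occasionally swap states, so the cold chain escapes
# local modes.  The bimodal target and all tuning values are invented.

random.seed(2)

def log_target(x):
    """Two well-separated Gaussian modes, computed stably."""
    a, b = -(x - 4) ** 2, -(x + 4) ** 2
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

temps = [1.0, 4.0, 16.0]               # temps[0] is the chain we keep
chains = [0.0, 0.0, 0.0]
samples = []
for step in range(20000):
    for k, T in enumerate(temps):      # one Metropolis move per chain
        prop = chains[k] + random.gauss(0.0, 1.0)
        if math.log(random.random()) < (log_target(prop) - log_target(chains[k])) / T:
            chains[k] = prop
    k = random.randrange(len(temps) - 1)   # attempt a neighbor swap
    a = (1 / temps[k] - 1 / temps[k + 1]) * (log_target(chains[k + 1]) - log_target(chains[k]))
    if math.log(random.random()) < a:
        chains[k], chains[k + 1] = chains[k + 1], chains[k]
    samples.append(chains[0])
# the cold chain visits both modes instead of staying stuck in one
```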
Equilibrium Control Policies for Markov Chains
Malikopoulos, Andreas
2011-01-01
The average cost criterion has held great intuitive appeal and has attracted considerable attention. It is widely employed when controlling dynamic systems that evolve stochastically over time by means of formulating an optimization problem to achieve long-term goals efficiently. The average cost criterion is especially appealing when the decision-making process is long compared to other timescales involved, and there is no compelling motivation to select short-term optimization. This paper addresses the problem of controlling a Markov chain so as to minimize the average cost per unit time. Our approach treats the problem as a dual constrained optimization problem. We derive conditions guaranteeing that a saddle point exists for the new dual problem, and we show that this saddle point is an equilibrium control policy for each state of the Markov chain. For practical situations with constraints consistent with those we study here, our results imply that recognition of such saddle points may be of value in deriving an optimal control policy in real time.
Markov Chain Monte Carlo and Irreversibility
NASA Astrophysics Data System (ADS)
Ottobre, Michela
2016-06-01
Markov Chain Monte Carlo (MCMC) methods are statistical methods designed to sample from a given measure π by constructing a Markov chain that has π as invariant measure and that converges to π. Most MCMC algorithms make use of chains that satisfy the detailed balance condition with respect to π; such chains are therefore reversible. On the other hand, recent work [18, 21, 28, 29] has stressed several advantages of using irreversible processes for sampling. Roughly speaking, irreversible diffusions converge to equilibrium faster (and lead to smaller asymptotic variance as well). In this paper we discuss some of the recent progress in the study of nonreversible MCMC methods. In particular: i) we explain some of the difficulties that arise in the analysis of nonreversible processes and we discuss some analytical methods to approach the study of continuous-time irreversible diffusions; ii) most of the rigorous results on irreversible diffusions are available for continuous-time processes; however, for computational purposes one needs to discretize such dynamics. It is well known that the resulting discretized chain will not, in general, retain all the good properties of the process that it is obtained from. In particular, if we want to preserve the invariance of the target measure, the chain might no longer be reversible. Therefore iii) we conclude by presenting an MCMC algorithm, the SOL-HMC algorithm [23], which results from a nonreversible discretization of a nonreversible dynamics.
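The canonical detailed-balance sampler that the text contrasts with irreversible variants is random-walk Metropolis; a minimal sketch targeting a standard normal, with illustrative tuning:

```python
import math, random

# Random-walk Metropolis for a standard normal target (unnormalized).
# Accepting with probability min(1, pi(prop)/pi(x)) under a symmetric
# proposal enforces detailed balance, so this chain is reversible —
# exactly the baseline the nonreversible methods aim to improve on.

random.seed(3)

def log_pi(x):
    return -0.5 * x * x

x, xs = 0.0, []
for _ in range(50000):
    prop = x + random.gauss(0.0, 1.0)      # symmetric proposal
    if math.log(random.random()) < log_pi(prop) - log_pi(x):
        x = prop
    xs.append(x)

mean = sum(xs) / len(xs)
var = sum(v * v for v in xs) / len(xs) - mean ** 2
# the empirical mean and variance approach 0 and 1
```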
Unmixing hyperspectral images using Markov random fields
Eches, Olivier; Dobigeon, Nicolas; Tourneret, Jean-Yves
2011-03-14
This paper proposes a new spectral unmixing strategy based on the normal compositional model that exploits the spatial correlations between the image pixels. The pure materials (referred to as endmembers) contained in the image are assumed to be available (they can be obtained by using an appropriate endmember extraction algorithm), while the corresponding fractions (referred to as abundances) are estimated by the proposed algorithm. Due to physical constraints, the abundances have to satisfy positivity and sum-to-one constraints. The image is divided into homogeneous distinct regions having the same statistical properties for the abundance coefficients. The spatial dependencies within each class are modeled thanks to Potts-Markov random fields. Within a Bayesian framework, prior distributions for the abundances and the associated hyperparameters are introduced. A reparametrization of the abundance coefficients is proposed to handle the physical constraints (positivity and sum-to-one) inherent to hyperspectral imagery. The parameters (abundances), hyperparameters (abundance mean and variance for each class) and the classification map indicating the classes of all pixels in the image are inferred from the resulting joint posterior distribution. To overcome the complexity of the joint posterior distribution, Markov chain Monte Carlo methods are used to generate samples asymptotically distributed according to the joint posterior of interest. Simulations conducted on synthetic and real data are presented to illustrate the performance of the proposed algorithm.
A Markov model of the Indus script.
Rao, Rajesh P N; Yadav, Nisha; Vahia, Mayank N; Joglekar, Hrishikesh; Adhikari, R; Mahadevan, Iravatham
2009-08-18
Although no historical information exists about the Indus civilization (flourished ca. 2600-1900 B.C.), archaeologists have uncovered about 3,800 short samples of a script that was used throughout the civilization. The script remains undeciphered, despite a large number of attempts and claimed decipherments over the past 80 years. Here, we propose the use of probabilistic models to analyze the structure of the Indus script. The goal is to reveal, through probabilistic analysis, syntactic patterns that could point the way to eventual decipherment. We illustrate the approach using a simple Markov chain model to capture sequential dependencies between signs in the Indus script. The trained model allows new sample texts to be generated, revealing recurring patterns of signs that could potentially form functional subunits of a possible underlying language. The model also provides a quantitative way of testing whether a particular string belongs to the putative language as captured by the Markov model. Application of this test to Indus seals found in Mesopotamia and other sites in West Asia reveals that the script may have been used to express different content in these regions. Finally, we show how missing, ambiguous, or unreadable signs on damaged objects can be filled in with most likely predictions from the model. Taken together, our results indicate that the Indus script exhibits rich syntactic structure and the ability to represent diverse content, both of which are suggestive of a linguistic writing system rather than a nonlinguistic symbol system. PMID:19666571
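The simple Markov chain analysis described above can be sketched in a few lines; the toy corpus, sign names, and add-one smoothing below are illustrative assumptions, not the authors' actual data or estimator:

```python
import math
from collections import defaultdict

# Toy sketch of a first-order (bigram) Markov sign model: estimate
# transition counts, score strings, and fill in a damaged sign.
# The corpus below is invented, not actual Indus sign sequences.
corpus = [["A", "B", "C"], ["A", "B", "D"], ["B", "C", "A"], ["A", "B", "C"]]
signs = {s for text in corpus for s in text}

counts = defaultdict(lambda: defaultdict(int))
for text in corpus:
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1

def prob(a, b):
    """Add-one smoothed transition probability P(b | a)."""
    total = sum(counts[a].values()) + len(signs)
    return (counts[a][b] + 1) / total

def loglik(text):
    """Log-likelihood of a string's transitions under the model."""
    return sum(math.log(prob(a, b)) for a, b in zip(text, text[1:]))

# Fill in a damaged sign: the most likely successor of "A"
best = max(signs, key=lambda s: prob("A", s))
print(best)  # "B": it follows "A" in three of the four toy texts
```

The same `loglik` score supports the membership test mentioned in the abstract: strings with transitions never seen in training score much lower than typical corpus strings.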
A Chaotic System with Different Families of Hidden Attractors
NASA Astrophysics Data System (ADS)
Pham, Viet-Thanh; Volos, Christos; Jafari, Sajad; Vaidyanathan, Sundarapandian; Kapitaniak, Tomasz; Wang, Xiong
The presence of hidden attractors in dynamical systems has received considerable attention recently, both in theory and in applications. A novel three-dimensional autonomous chaotic system with hidden attractors is introduced in this paper. Remarkably, this chaotic system can exhibit two different families of hidden attractors: hidden attractors with an infinite number of equilibrium points and hidden attractors without equilibrium. The dynamical behavior of the system is investigated through mathematical analysis, numerical simulations and circuit implementation.
Brady 1D seismic velocity model ambient noise prelim
Mellors, Robert J.
2013-10-25
Preliminary 1D seismic velocity model derived from ambient noise correlation. Twenty-eight Green's functions, filtered between 4 and 10 Hz, were calculated for Vp, Vs, and Qs. A 1D model was estimated for each path; the final model is the median of the individual models. Resolution is best for the top 1 km and poorly constrained with increasing depth.
Kaplan, Alan D; O'Sullivan, Joseph A; Sirevaag, Erik J; Kristjansson, Sean D; Lai, Po-Hsiang; Rohrbaugh, John W
2010-01-01
A laser Doppler vibrometer (LDV) is used to sense movements of the skin overlying the carotid artery. Fluctuations in carotid artery diameter due to variations in the underlying blood pressure are sensed at the surface of the skin. Portions of the LDV signal corresponding to single heartbeats, called the LDV pulses, are extracted. This paper introduces the use of hidden Markov models (HMMs) to model the dynamics of the LDV pulse from beat to beat based on pulse morphology, which under resting conditions are primarily due to breathing effects. LDV pulses are classified according to state, by computing the optimal state path through the data using trained HMMs. HMM state dynamics are compared to simultaneous recordings of strain gauges placed on the abdomen. The work presented here provides a robust statistical approach to modeling the dependence of the LDV pulse on latent states.
NASA Astrophysics Data System (ADS)
Xiao, C. W.; Ozpineci, A.; Oset, E.
2015-10-01
Using a coupled channel unitary approach, combining heavy quark spin symmetry and the dynamics of the local hidden gauge, we investigate the meson-meson interaction with hidden beauty. We obtain several new states of isospin I = 0: six bound states, and six more possible weakly bound states whose existence depends on the influence of coupled channel effects.
Kunte, Amit; Zhang, Wei; Paduraru, Crina; Veerapen, Natacha; Cox, Liam R.; Besra, Gurdyal S.; Cresswell, Peter
2013-01-01
The non-classical major histocompatibility complex (MHC) homologue CD1d presents lipid antigens to innate-like lymphocytes called natural-killer T (NKT) cells. These cells, by virtue of their broad cytokine repertoire, shape innate and adaptive immune responses. Here, we have assessed the role of endoplasmic reticulum glycoprotein quality control in CD1d assembly and function, specifically the role of a key component of the quality control machinery, the enzyme UDP glucose glycoprotein glucosyltransferase (UGT1). We observe that in UGT1-deficient cells, CD1d associates prematurely with β2-microglobulin (β2m) and is able to rapidly exit the endoplasmic reticulum. At least some of these CD1d-β2m heterodimers are shorter-lived and can be rescued by provision of a defined exogenous antigen, α-galactosylceramide. Importantly, we show that in UGT1-deficient cells the CD1d-β2m heterodimers have altered antigenicity despite the fact that their cell surface levels are unchanged. We propose that UGT1 serves as a quality control checkpoint during CD1d assembly and further suggest that UGT1-mediated quality control can shape the lipid repertoire of newly synthesized CD1d. The quality control process may play a role in ensuring stability of exported CD1d-β2m complexes, in facilitating presentation of low abundance high affinity antigens, or in preventing deleterious responses to self lipids. PMID:23615906
Markov and non-Markov processes in complex systems by the dynamical information entropy
NASA Astrophysics Data System (ADS)
Yulmetyev, R. M.; Gafarov, F. M.
1999-12-01
We consider Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of two mutually dependent channels of entropy, alternation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation), are discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium; psychology (short-term numeral and pattern human memory and the effect of stress on the dynamical tapping test); the random dynamics of RR intervals in the human ECG (the problem of diagnosing various diseases of the human cardiovascular system); and the chaotic dynamics of parameters of financial markets and ecological systems.
Popovic, Marta; Zaja, Roko; Fent, Karl; Smital, Tvrtko
2014-10-01
Polyspecific transporters from the organic anion transporting polypeptide (OATP/Oatp) superfamily mediate the uptake of a wide range of compounds. In zebrafish, Oatp1d1 transports conjugated steroid hormones and cortisol. It is predominantly expressed in the liver, brain and testes. In this study we have characterized the transport of xenobiotics by the zebrafish Oatp1d1 transporter. We developed a novel assay for assessing Oatp1d1 interactors using the fluorescent probe Lucifer yellow and transient transfection in HEK293 cells. Our data showed that numerous environmental contaminants interact with zebrafish Oatp1d1. Oatp1d1 mediated the transport of diclofenac with very high affinity, followed by high affinity towards perfluorooctanesulfonic acid (PFOS), nonylphenol, gemfibrozil and 17α-ethinylestradiol; moderate affinity towards carbaryl, diazinon and caffeine; and low affinity towards metolachlor. Importantly, many environmental chemicals acted as strong inhibitors of Oatp1d1. A strong inhibition of Oatp1d1 transport activity was found by perfluorooctanoic acid (PFOA), chlorpyrifos-methyl, estrone (E1) and 17β-estradiol (E2), followed by moderate to low inhibition by diethyl phthalate, bisphenol A, 7-acetyl-1,1,3,4,4,6-hexamethyl-1,2,3,4 tetrahydronaphthalene and clofibrate. In this study we identified Oatp1d1 as the first Solute Carrier (SLC) transporter involved in the transport of a wide range of xenobiotics in fish. Considering that Oatps in zebrafish have not been characterized before, our work on zebrafish Oatp1d1 offers important new insights on the understanding of uptake processes of environmental contaminants, and contributes to the better characterization of zebrafish as a model species. - Highlights: • We optimized a novel assay for determination of Oatp1d1 interactors • Oatp1d1 is the first characterized fish xenobiotic SLC transporter • PFOS, nonylphenol, diclofenac, EE2, caffeine are high affinity Oatp1d1 substrates • PFOA, chlorpyrifos
SHARP ENTRYWISE PERTURBATION BOUNDS FOR MARKOV CHAINS
THIEDE, ERIK; VAN KOTEN, BRIAN; WEARE, JONATHAN
2015-01-01
For many Markov chains of practical interest, the invariant distribution is extremely sensitive to perturbations of some entries of the transition matrix, but insensitive to others; we give an example of such a chain, motivated by a problem in computational statistical physics. We derive perturbation bounds on the relative error of the invariant distribution that reveal these variations in sensitivity. Our bounds are sharp; we do not impose any structural assumptions on the transition matrix or on the perturbation; and computing the bounds has the same complexity as computing the invariant distribution or computing other bounds in the literature. Moreover, our bounds have a simple interpretation in terms of hitting times, which can be used to draw intuitive but rigorous conclusions about the sensitivity of a chain to various types of perturbations. PMID:26491218
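The sensitivity phenomenon described above is easy to reproduce numerically; the nearly uncoupled four-state chain below is an invented toy example, not the chain from the paper:

```python
import numpy as np

# Toy illustration: for a nearly uncoupled chain, the invariant
# distribution is very sensitive to the small coupling entries and
# insensitive to the large within-block entries. Matrix is invented.
def invariant(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    return pi / pi.sum()

def perturb(P, i, j, d):
    """Add d to entry (i, j), taking it from the diagonal to stay stochastic."""
    Q = P.copy()
    Q[i, j] += d
    Q[i, i] -= d
    return Q

eps = 1e-3   # weak coupling between blocks {0, 1} and {2, 3}
P = np.array([[0.5 - eps, 0.5, eps, 0.0],
              [0.5, 0.5, 0.0, 0.0],
              [eps, 0.0, 0.5 - eps, 0.5],
              [0.0, 0.0, 0.5, 0.5]])

pi0 = invariant(P)
d = 1e-4
big = np.abs(invariant(perturb(P, 0, 2, d)) - pi0).max()    # coupling entry
small = np.abs(invariant(perturb(P, 0, 1, d)) - pi0).max()  # within-block entry
assert big > 10 * small   # same-size perturbations, very different effect
```

The coupling perturbation shifts the relative mass of the two metastable blocks, while the within-block perturbation only rearranges mass inside a block; this is the hitting-time intuition the paper's bounds make rigorous.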
Estimation and uncertainty of reversible Markov models.
Trendelkamp-Schroer, Benjamin; Wu, Hao; Paul, Fabian; Noé, Frank
2015-11-01
Reversibility is a key concept in Markov models and master-equation models of molecular kinetics. The analysis and interpretation of the transition matrix encoding the kinetic properties of the model rely heavily on the reversibility property. The estimation of a reversible transition matrix from simulation data is, therefore, crucial to the successful application of the previously developed theory. In this work, we discuss methods for the maximum likelihood estimation of transition matrices from finite simulation data and present a new algorithm for the estimation when reversibility with respect to a given stationary vector is desired. We also develop new methods for the Bayesian posterior inference of reversible transition matrices, with and without a given stationary vector, taking into account the need for a suitable prior distribution that preserves the metastable features of the observed process during posterior inference. All algorithms here are implemented in the PyEMMA software (http://pyemma.org) as of version 2.0. PMID:26547152
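As a hedged sketch of the reversible-estimation setting, the snippet below uses the simple count-symmetrization estimator (a common baseline, not the maximum likelihood or Bayesian algorithms the paper develops) on an invented discrete trajectory:

```python
import numpy as np

# Baseline reversible estimator: symmetrize the transition counts.
# This is a simple heuristic, not the paper's MLE; trajectory is toy data.
traj = np.array([0, 0, 1, 2, 1, 0, 1, 1, 2, 2, 1, 0])
n = 3

C = np.zeros((n, n))
for a, b in zip(traj[:-1], traj[1:]):
    C[a, b] += 1                  # observed transition counts

S = C + C.T                       # symmetrized counts
T = S / S.sum(axis=1, keepdims=True)
pi = S.sum(axis=1) / S.sum()      # stationary distribution of T

# By construction T satisfies detailed balance with respect to pi
flows = pi[:, None] * T
assert np.allclose(flows, flows.T)
assert np.allclose(pi @ T, pi)
```

Because S is symmetric, π_i T_ij = S_ij / S_total is symmetric in i and j, which is exactly the detailed balance condition; the paper's algorithms achieve the same property while maximizing the actual likelihood of the counts.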
Growth and Dissolution of Macromolecular Markov Chains
NASA Astrophysics Data System (ADS)
Gaspard, Pierre
2016-07-01
The kinetics and thermodynamics of free living copolymerization are studied for processes with rates depending on k monomeric units of the macromolecular chain behind the unit that is attached or detached. In this case, the sequence of monomeric units in the growing copolymer is a kth-order Markov chain. In the regime of steady growth, the statistical properties of the sequence are determined analytically in terms of the attachment and detachment rates. In this way, the mean growth velocity as well as the thermodynamic entropy production and the sequence disorder can be calculated systematically. These different properties are also investigated in the regime of depolymerization where the macromolecular chain is dissolved by the surrounding solution. In this regime, the entropy production is shown to satisfy Landauer's principle.
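The k = 1 case described above can be sketched as follows; the attachment probabilities are invented for illustration and the detachment process is omitted:

```python
import math
import random

# Toy sketch of the k = 1 case: the attached monomer depends only on the
# terminal unit of the chain, so the grown sequence is a first-order
# Markov chain. Attachment probabilities are invented; detachment omitted.
random.seed(0)
p = {"A": {"A": 0.8, "B": 0.2},   # P(next unit | terminal unit)
     "B": {"A": 0.4, "B": 0.6}}

chain = ["A"]
for _ in range(50000):
    chain.append("A" if random.random() < p[chain[-1]]["A"] else "B")

# Empirical A -> A transition frequency approaches the attachment rule 0.8
n_A = sum(1 for a in chain[:-1] if a == "A")
n_AA = sum(1 for a, b in zip(chain, chain[1:]) if a == "A" and b == "A")

# Sequence disorder: Shannon entropy per monomeric unit of the stationary chain
pi_A = p["B"]["A"] / (p["A"]["B"] + p["B"]["A"])   # stationary fraction of A
h = -sum(w * sum(q * math.log(q) for q in dist.values())
         for w, dist in [(pi_A, p["A"]), (1 - pi_A, p["B"])])
```

For the kth-order case one would condition on the last k units instead of one; the entropy per unit h is the "sequence disorder" that enters the thermodynamic balance discussed in the abstract.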
Forest Pest Occurrence Prediction with a CA-Markov Model
NASA Astrophysics Data System (ADS)
Xie, Fangyi; Zhang, Xiaoli; Chen, Xiaoyan
Since the spatial pattern of forest pest occurrence is determined by biological characteristics and habitat conditions, this paper introduces the construction of a cellular automaton model combined with a Markov model to predict forest pest occurrence. The rules of the model include cell-state rules, neighborhood rules and transition rules, which are defined according to factors drawn from stand conditions, stand structure and climate, and according to the influence of these factors on state conversion. Coding for the model is also part of its implementation. The participants were designed, with their attributes and operations expressed in a UML diagram. Finally, the scale issues in forest pest occurrence prediction, the core of which are the choice of cell size and time interval, are partly discussed in this paper.
Markov state models and molecular alchemy
NASA Astrophysics Data System (ADS)
Schütte, Christof; Nielsen, Adam; Weber, Marcus
2015-01-01
In recent years, Markov state models (MSMs) have attracted a considerable amount of attention with regard to modelling conformation changes and the associated function of biomolecular systems. They have been used successfully, e.g. for peptides including time-resolved spectroscopic experiments, protein function and protein folding, DNA and RNA, and ligand-receptor interaction in drug design and more complicated multivalent scenarios. In this article, a novel reweighting scheme is introduced that makes it possible to construct an MSM for one molecular system out of an MSM for a similar system. This permits studying how molecular properties on long timescales differ between similar molecular systems without performing full molecular dynamics simulations for each system under consideration. The performance of the reweighting scheme is illustrated for simple test cases, including one where the main wells of the respective energy landscapes are located differently and an alchemical transformation of butane to pentane where the dimension of the state space is changed.
Transition-Independent Decentralized Markov Decision Processes
NASA Technical Reports Server (NTRS)
Becker, Raphen; Zilberstein, Shlomo; Lesser, Victor; Goldman, Claudia V.; Morris, Robert (Technical Monitor)
2003-01-01
There has been substantial progress with formal models for sequential decision making by individual agents using the Markov decision process (MDP). However, similar treatment of multi-agent systems is lacking. A recent complexity result, showing that solving decentralized MDPs is NEXP-hard, provides a partial explanation. To overcome this complexity barrier, we identify a general class of transition-independent decentralized MDPs that is widely applicable. The class consists of independent collaborating agents that are tied together by a global reward function that depends on both of their histories. We present a novel algorithm for solving this class of problems and examine its properties. The result is the first effective technique to optimally solve a class of decentralized MDPs. This lays the foundation for further work in this area on both exact and approximate solutions.
Markov state models of biomolecular conformational dynamics
Chodera, John D.; Noé, Frank
2014-01-01
It has recently become practical to construct Markov state models (MSMs) that reproduce the long-time statistical conformational dynamics of biomolecules using data from molecular dynamics simulations. MSMs can predict both stationary and kinetic quantities on long timescales (e.g. milliseconds) using a set of atomistic molecular dynamics simulations that are individually much shorter, thus addressing the well-known sampling problem in molecular dynamics simulation. In addition to providing predictive quantitative models, MSMs greatly facilitate both the extraction of insight into biomolecular mechanism (such as folding and functional dynamics) and quantitative comparison with single-molecule and ensemble kinetics experiments. A variety of methodological advances and software packages now bring the construction of these models closer to routine practice. Here, we review recent progress in this field, considering theoretical and methodological advances, new software tools, and recent applications of these approaches in several domains of biochemistry and biophysics, commenting on remaining challenges. PMID:24836551
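The basic MSM construction step the review covers (discretized trajectory, count matrix at a lag time, transition matrix, implied timescales) can be sketched as follows, with a toy two-state trajectory standing in for molecular dynamics data:

```python
import numpy as np

# Minimal MSM pipeline sketch: discretized trajectory -> count matrix at
# lag tau -> row-normalized transition matrix -> implied timescale.
# The trajectory is a toy two-state chain, not molecular simulation data.
rng = np.random.default_rng(1)

T_true = np.array([[0.95, 0.05],     # metastable two-state system
                   [0.10, 0.90]])
traj = [0]
for _ in range(20000):
    traj.append(rng.choice(2, p=T_true[traj[-1]]))

tau = 1                              # lag time, in trajectory steps
C = np.zeros((2, 2))
for a, b in zip(traj[:-tau], traj[tau:]):
    C[a, b] += 1
T = C / C.sum(axis=1, keepdims=True)

# Slowest implied timescale from the second eigenvalue: t2 = -tau / ln(lambda2)
lam2 = np.sort(np.linalg.eigvals(T).real)[0]
t2 = -tau / np.log(lam2)             # true value is -1/ln(0.85), about 6.2 steps
```

The implied timescale is the quantity checked for convergence in lag time when validating an MSM; the estimate here should land near the true relaxation time of the generating chain.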
Anatomy Ontology Matching Using Markov Logic Networks
Li, Chunhua; Zhao, Pengpeng; Wu, Jian; Cui, Zhiming
2016-01-01
The anatomy of model species is described in ontologies, which are used to standardize the annotations of experimental data, such as gene expression patterns. To compare such data between species, we need to establish relationships between ontologies describing different species. Ontology matching is one way to find semantic correspondences between entities of different ontologies. Markov logic networks, which unify probabilistic graphical models and first-order logic, provide an excellent framework for ontology matching. We combine several different matching strategies through first-order logic formulas according to the structure of anatomy ontologies. Experiments on the adult mouse anatomy and the human anatomy have demonstrated the effectiveness of the proposed approach in terms of the quality of the resulting alignment. PMID:27382498
1-D and 2-D Probabilistic Inversions of Fault Zone Guided Waves
NASA Astrophysics Data System (ADS)
Gulley, A.; Eccles, J. D.; Kaipio, J. P.; Malin, P. E.
2015-12-01
Fault Zone Guided Waves (FZGWs) are seismic coda that are trapped by the low-velocity damage zone of faults. Inversions of these phases can be carried out using their measured dispersion and a Bayesian probability approach. This method utilises Markov chain Monte Carlo, which allows uncertainties and trade-offs to be quantified. Accordingly, we have developed a scheme that estimates the dispersion curve and amplitude response variability from a FZGW record. This method allows the computation of both the point estimates and the covariance of the dispersion curve. The subsequent estimation of fault zone parameters is then based on a Gaussian model for the dispersion curve. We then show that inversions using FZGW dispersion data can resolve only the fault zone velocity contrast and fault zone width; densities, absolute country rock velocities and the earthquake location remain unresolved. We show, however, that these unresolved parameters do significantly affect the estimated fault zone velocities and widths. Since they cannot be resolved, we allow for their effects on the estimates of fault zone width and velocity contrast by using the Bayesian approximation error method. We show that using this method reduces computational time from days to minutes and that the associated loss of accuracy is insignificant compared to carrying out the inversion on all parameters. We have extended our scheme to 2-D using 1-D slices. The Bayesian approximation error methodology is further employed to provide a 'correction term' with uncertainty for the 1-D slice approximation. We investigate these features with both synthetic data and FZGW data from the Alpine Fault of New Zealand.
Probing hidden sector photons through the Higgs window
NASA Astrophysics Data System (ADS)
Ahlers, Markus; Jaeckel, Joerg; Redondo, Javier; Ringwald, Andreas
2008-10-01
We investigate the possibility that a (light) hidden sector extra photon receives its mass via spontaneous symmetry breaking of a hidden sector Higgs boson, the so-called hidden-Higgs. The hidden-photon can mix with the ordinary photon via a gauge kinetic mixing term. The hidden-Higgs can couple to the standard model Higgs via a renormalizable quartic term—sometimes called the Higgs portal. We discuss the implications of this light hidden-Higgs in the context of laser polarization and light-shining-through-the-wall experiments as well as cosmological, astrophysical, and non-Newtonian force measurements. For hidden-photons receiving their mass from a hidden-Higgs, we find in the small mass regime significantly stronger bounds than the bounds on massive hidden sector photons alone.
Crossing over...Markov meets Mendel.
Mneimneh, Saad
2012-01-01
Chromosomal crossover is a biological mechanism to combine parental traits. It is perhaps the first mechanism ever taught in any introductory biology class. The formulation of crossover, and resulting recombination, came about 100 years after Mendel's famous experiments. To a great extent, this formulation is consistent with the basic genetic findings of Mendel. More importantly, it provides a mathematical insight for his two laws (and corrects them). From a mathematical perspective, and while it retains similarities, genetic recombination guarantees diversity so that we do not rapidly converge to the same being. It is this diversity that made the study of biology possible. In particular, the problem of genetic mapping and linkage-one of the first efforts towards a computational approach to biology-relies heavily on the mathematical foundation of crossover and recombination. Nevertheless, as students we often overlook the mathematics of these phenomena. Emphasizing the mathematical aspect of Mendel's laws through crossover and recombination will prepare the students to make an early realization that biology, in addition to being experimental, IS a computational science. This can serve as a first step towards a broader curricular transformation in teaching biological sciences. I will show that a simple and modern treatment of Mendel's laws using a Markov chain will make this step possible, and it will only require basic college-level probability and calculus. My personal teaching experience confirms that students WANT to know Markov chains because they hear about them from bioinformaticists all the time. This entire exposition is based on three homework problems that I designed for a course in computational biology. A typical reader is, therefore, an instructional staff member or a student in a computational field (e.g., computer science, mathematics, statistics, computational biology, bioinformatics). However, other students may easily follow by omitting the
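The Markov-chain treatment advocated above can be sketched minimally: model the parental strand copied at successive loci as a two-state chain that switches with crossover probability r (the value r = 0.1 is illustrative, not data), and recover Mendel's independent assortment for distant loci:

```python
import numpy as np

# Toy sketch of the Markov-chain view of crossover: moving along a
# chromosome, the parental strand a gamete copies from switches with
# recombination probability r between adjacent loci. r = 0.1 is invented.
r = 0.1
P = np.array([[1 - r, r],
              [r, 1 - r]])

# Probability that loci k steps apart are inherited from the same strand
# is the (0, 0) entry of P^k; analytically it equals 1/2 + (1 - 2r)^k / 2,
# which tends to 1/2 as k grows: Mendel's independent assortment emerges
# for distant loci.
same = [np.linalg.matrix_power(P, k)[0, 0] for k in (1, 2, 10, 50)]
print([round(s, 3) for s in same])  # [0.9, 0.82, 0.554, 0.5]
```

The k = 1 value is just the parental (non-recombinant) gamete probability for adjacent loci, which is the quantity exploited in genetic mapping and linkage analysis.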
D1/D5 dopamine receptors modulate spatial memory formation.
da Silva, Weber C N; Köhler, Cristiano C; Radiske, Andressa; Cammarota, Martín
2012-02-01
We investigated the effect of the intra-CA1 administration of the D1/D5 receptor antagonist SCH23390 and the D1/D5 receptor agonist SKF38393 on spatial memory in the water maze. When given immediately, but not 3h after training, SCH23390 hindered long-term spatial memory formation without affecting non-spatial memory or the normal functionality of the hippocampus. On the contrary, post-training infusion of SKF38393 enhanced retention and facilitated the spontaneous recovery of the original spatial preference after reversal learning. Our findings demonstrate that hippocampal D1/D5 receptors play an essential role in spatial memory processing.
Show Me the Invisible: Visualizing Hidden Content.
Geymayer, Thomas; Steinberger, Markus; Lex, Alexander; Streit, Marc; Schmalstieg, Dieter
2014-01-01
Content on computer screens is often inaccessible to users because it is hidden, e.g., occluded by other windows, outside the viewport, or overlooked. In search tasks, the efficient retrieval of sought content is important. Current software, however, only provides limited support to visualize hidden occurrences and rarely supports search synchronization crossing application boundaries. To remedy this situation, we introduce two novel visualization methods to guide users to hidden content. Our first method generates awareness for occluded or out-of-viewport content using see-through visualization. For content that is either outside the screen's viewport or for data sources not opened at all, our second method shows off-screen indicators and an on-demand smart preview. To reduce the chances of overlooking content, we use visual links, i.e., visible edges, to connect the visible content or the visible representations of the hidden content. We show the validity of our methods in a user study, which demonstrates that our technique enables a faster localization of hidden content compared to traditional search functionality and thereby assists users in information retrieval tasks. PMID:25325078
A human serotonin 1D receptor variant (5HT1D beta) encoded by an intronless gene on chromosome 6.
Demchyshyn, L; Sunahara, R K; Miller, K; Teitler, M; Hoffman, B J; Kennedy, J L; Seeman, P; Van Tol, H H; Niznik, H B
1992-01-01
An intronless gene encoding a serotonin receptor (5HT1D beta) has been cloned and functionally expressed in mammalian fibroblast cultures. Based on the deduced amino acid sequence, the gene encodes a 390-amino acid protein displaying considerable homology, within putative transmembrane domains (approximately 75% identity), to the canine and human 5HT1D receptors. Membranes prepared from CHO cells stably expressing the receptor bound [3H]serotonin with high affinity (Kd 4 nM) and displayed a pharmacological profile consistent with, but not identical to, that of the characterized serotonin 5HT1D receptor. Most notably, metergoline and serotonergic piperazine derivatives, as a group, display 3- to 8-fold lower affinity for the 5HT1D beta receptor than for the 5HT1D receptor, whereas both receptors display similar affinities for tryptamine derivatives, including the antimigraine drug sumatriptan. Northern blot analysis revealed an mRNA of approximately 5.5 kilobases expressed in human and monkey frontal cortex, medulla, striatum, hippocampus and amygdala but not in cerebellum, olfactory tubercle, and pituitary. The 5HT1D beta gene maps to human chromosome 6. The existence of multiple neuronal 5HT1D-like receptors may help account for some of the complexities associated with [3H]serotonin binding patterns in native membranes. PMID:1351684
Specification test for Markov models with measurement errors*
Kim, Seonjin; Zhao, Zhibiao
2014-01-01
Most existing works on specification testing assume that we have direct observations from the model of interest. We study specification testing for Markov models based on contaminated observations. The evolving model dynamics of the unobservable Markov chain is implicitly coded into the conditional distribution of the observed process. To test whether the underlying Markov chain follows a parametric model, we propose measuring the deviation between nonparametric and parametric estimates of conditional regression functions of the observed process. Specifically, we construct a nonparametric simultaneous confidence band for conditional regression functions and check whether the parametric estimate is contained within the band. PMID:25346552
Sheehan, Sara; Harris, Kelley; Song, Yun S.
2013-01-01
Throughout history, the population size of modern humans has varied considerably due to changes in environment, culture, and technology. More accurate estimates of population size changes, and when they occurred, should provide a clearer picture of human colonization history and help remove confounding effects from natural selection inference. Demography influences the pattern of genetic variation in a population, and thus genomic data of multiple individuals sampled from one or more present-day populations contain valuable information about the past demographic history. Recently, Li and Durbin developed a coalescent-based hidden Markov model, called the pairwise sequentially Markovian coalescent (PSMC), for a pair of chromosomes (or one diploid individual) to estimate past population sizes. This is an efficient, useful approach, but its accuracy in the very recent past is hampered by the fact that, because of the small sample size, only few coalescence events occur in that period. Multiple genomes from the same population contain more information about the recent past, but are also more computationally challenging to study jointly in a coalescent framework. Here, we present a new coalescent-based method that can efficiently infer population size changes from multiple genomes, providing access to a new store of information about the recent past. Our work generalizes the recently developed sequentially Markov conditional sampling distribution framework, which provides an accurate approximation of the probability of observing a newly sampled haplotype given a set of previously sampled haplotypes. Simulation results demonstrate that we can accurately reconstruct the true population histories, with a significant improvement over the PSMC in the recent past. We apply our method, called diCal, to the genomes of multiple human individuals of European and African ancestry to obtain a detailed population size change history during recent times. PMID:23608192
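At the core of PSMC-style methods is standard HMM machinery: a forward pass over hidden states (discretized coalescence times in PSMC). A minimal, generic scaled forward algorithm is sketched below with arbitrary two-state matrices; the actual models use many states and data-dependent emissions, so this is only the computational skeleton.

```python
import numpy as np

# Toy HMM: 2 hidden states, binary emissions (arbitrary illustrative numbers).
A = np.array([[0.9, 0.1], [0.2, 0.8]])    # state transition probabilities
B = np.array([[0.95, 0.05], [0.4, 0.6]])  # emission probabilities
pi = np.array([0.5, 0.5])                 # initial state distribution

def forward_loglik(obs):
    """Log-likelihood of an observation sequence via the scaled forward pass."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
        s = alpha.sum()
        loglik += np.log(s)
        alpha = alpha / s               # rescale to avoid underflow
    return loglik

obs = [0, 0, 1, 0, 1, 1, 0]
print(forward_loglik(obs))
```

The rescaling at each step is what makes the pass numerically usable on genome-length sequences.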
60. BOILER CHAMBER No. 1, D LOOP STEAM GENERATOR AND MAIN COOLANT PUMP LOOKING NORTHEAST (LOCATION OOO) - Shippingport Atomic Power Station, On Ohio River, 25 miles Northwest of Pittsburgh, Shippingport, Beaver County, PA
Severe Hypertriglyceridemia in Glut1D on Ketogenic Diet.
Klepper, Joerg; Leiendecker, Baerbel; Heussinger, Nicole; Lausch, Ekkehart; Bosch, Friedrich
2016-04-01
High-fat ketogenic diets are the only treatment available for Glut1 deficiency (Glut1D). Here, we describe an 8-year-old girl with classical Glut1D responsive to a 3:1 ketogenic diet and ethosuximide. After 3 years on the diet a gradual increase of blood lipids was followed by rapid, severe asymptomatic hypertriglyceridemia (1,910 mg/dL). Serum lipid apheresis was required to determine liver, renal, and pancreatic function. A combination of medium chain triglyceride-oil and a reduction of the ketogenic diet to 1:1 ratio normalized triglyceride levels within days but triggered severe myoclonic seizures requiring comedication with sultiam. Severe hypertriglyceridemia in children with Glut1D on ketogenic diets may be underdiagnosed and harmful. In contrast to congenital hypertriglyceridemias, children with Glut1D may be treated effectively by dietary adjustments alone. PMID:26902182
1D Nanostructures: Controlled Fabrication and Energy Applications
Hu, Michael Z.
2013-01-01
Jian Wei, Xuchun Song, Chunli Yang, and Michael Z. Hu, 1D Nanostructures: Controlled Fabrication and Energy Applications, Journal of Nanomaterials, published special issue (http://www.hindawi.com/journals/jnm/si/197254/) (2013).
Black hole portal into hidden valleys
Dubovsky, Sergei; Gorbenko, Victor
2011-05-15
Superradiant instability turns rotating astrophysical black holes into unique probes of light axions. We consider what happens when a light axion is coupled to a strongly coupled hidden gauge sector. In this case superradiance results in an adiabatic increase of a hidden sector CP-violating θ parameter in a near horizon region. This may trigger a first order phase transition in the gauge sector. As a result a significant fraction of a black hole mass is released as a cloud of hidden mesons and can be later converted into electromagnetic radiation. This results in a violent electromagnetic burst. The characteristic frequency of such bursts may range from ∼100 eV to ∼100 MeV.
Hidden Variables Theorems with Fewer Measurements
NASA Astrophysics Data System (ADS)
Lawrence, Jay
A Greenberger-Horne-Zeilinger (GHZ) contradiction may be thought of as a sequence of measurements on a system of N particles, in which each measurement may be duplicated by local hidden variables up to, but not including, the last of an irreducible set. Each measurement consists of N spatially separated local measurements on individual particles. Existing contradictions require more such measurements than there are particles, the minimum number being N + 1. By allowing successive measurements to impose incremental local constraints on the hidden variables (as opposed to global constraints associated with products of hidden variables), we derive contradictions that require fewer measurements. We have found protocols for which the number of measurements, Nm, grows more slowly than linearly with the number of particles: asymptotically, Nm ∼ √(2N) for large N if the particles are qubits, and a similar relation holds for particles of higher spins.
Hidden conformal symmetry and quasinormal modes
NASA Astrophysics Data System (ADS)
Chen, Bin; Long, Jiang
2010-12-01
We provide an algebraic way to calculate the quasinormal modes of a black hole, which possesses a hidden conformal symmetry. We construct an infinite tower of quasinormal modes from the highest-weight mode, in a simple and elegant way. For the scalar, the hidden conformal symmetry manifests itself in the fact that the scalar Laplacian could be rewritten in terms of the SL(2,R) quadratic Casimir. For the vector and the tensor, the hidden conformal symmetry acts on them through Lie derivatives. We show that for three-dimensional black holes, with an appropriate combination of the components, the radial equations of the vector and the tensor could be written in terms of the Lie-induced quadratic Casimir. This makes the algebraic construction of the quasinormal modes feasible. Our results are in good agreement with the previous study.
TBC1D24 genotype–phenotype correlation
Balestrini, Simona; Milh, Mathieu; Castiglioni, Claudia; Lüthy, Kevin; Finelli, Mattea J.; Verstreken, Patrik; Cardon, Aaron; Stražišar, Barbara Gnidovec; Holder, J. Lloyd; Lesca, Gaetan; Mancardi, Maria M.; Poulat, Anne L.; Repetto, Gabriela M.; Banka, Siddharth; Bilo, Leonilda; Birkeland, Laura E.; Bosch, Friedrich; Brockmann, Knut; Cross, J. Helen; Doummar, Diane; Félix, Temis M.; Giuliano, Fabienne; Hori, Mutsuki; Hüning, Irina; Kayserili, Hulia; Kini, Usha; Lees, Melissa M.; Meenakshi, Girish; Mewasingh, Leena; Pagnamenta, Alistair T.; Peluso, Silvio; Mey, Antje; Rice, Gregory M.; Rosenfeld, Jill A.; Taylor, Jenny C.; Troester, Matthew M.; Stanley, Christine M.; Ville, Dorothee; Walkiewicz, Magdalena; Falace, Antonio; Fassio, Anna; Lemke, Johannes R.; Biskup, Saskia; Tardif, Jessica; Ajeawung, Norbert F.; Tolun, Aslihan; Corbett, Mark; Gecz, Jozef; Afawi, Zaid; Howell, Katherine B.; Oliver, Karen L.; Berkovic, Samuel F.; Scheffer, Ingrid E.; de Falco, Fabrizio A.; Oliver, Peter L.; Striano, Pasquale; Zara, Federico
2016-01-01
Objective: To evaluate the phenotypic spectrum associated with mutations in TBC1D24. Methods: We acquired new clinical, EEG, and neuroimaging data of 11 previously unreported and 37 published patients. TBC1D24 mutations, identified through various sequencing methods, can be found online (http://lovd.nl/TBC1D24). Results: Forty-eight patients were included (28 men, 20 women, average age 21 years) from 30 independent families. Eighteen patients (38%) had myoclonic epilepsies. The other patients carried diagnoses of focal (25%), multifocal (2%), generalized (4%), and unclassified epilepsy (6%), and early-onset epileptic encephalopathy (25%). Most patients had drug-resistant epilepsy. We detail EEG, neuroimaging, developmental, and cognitive features, treatment responsiveness, and physical examination. In silico evaluation revealed 7 different highly conserved motifs, with the most common pathogenic mutation located in the first. Neuronal outgrowth assays showed that some TBC1D24 mutations, associated with the most severe TBC1D24-associated disorders, are not necessarily the most disruptive to this gene function. Conclusions: TBC1D24-related epilepsy syndromes show marked phenotypic pleiotropy, with multisystem involvement and severity spectrum ranging from isolated deafness (not studied here), benign myoclonic epilepsy restricted to childhood with complete seizure control and normal intellect, to early-onset epileptic encephalopathy with severe developmental delay and early death. There is no distinct correlation with mutation type or location yet, but patterns are emerging. Given the phenotypic breadth observed, TBC1D24 mutation screening is indicated in a wide variety of epilepsies. A TBC1D24 consortium was formed to develop further research on this gene and its associated phenotypes. PMID:27281533
Hidden treasures - 50 km points of interests
NASA Astrophysics Data System (ADS)
Lommi, Matias; Kortelainen, Jaana
2015-04-01
Tampere is the third-largest city in Finland and a regional centre. During the 1970s several municipal mergers took place. Nowadays this local area has a strong and diverse identity, from wilderness and agricultural fields to high-density city living. Outside the city centre there are interesting geological points unknown to modern city dwellers. There is even a local proverb, "Go abroad to Teisko!". That is the area the Hidden Treasures student project is focused on. Our school, Tammerkoski Upper Secondary School (or Gymnasium), has an emphasis on visual arts. We are going to offer our art students scientific and artistic experiences and knowledge about the hidden treasures of the Teisko area and involve the Teisko inhabitants in this project. Hidden treasures: - A Precambrian subduction zone and a volcanic belt with rich deposits of gold (Au) and arsenic (As), operating gold mines, and quarries of minerals and metamorphic slates. - North of the subduction zone, a homogeneous Precambrian magmatic rock area with quarries, whose products are known as Kuru Grey. - The former shores of post-glacial Lake Näsijärvi, whose sediments enabled developing agriculture and sustained settlement. Nowadays these shores have both scenic and biodiversity value. - Old cattle sheds and dairy buildings made of local granite stones, related to the cultural stone-building heritage. - The local active community of Kapee, about 100 inhabitants. Students will discover information about these "hidden" phenomena and render this information through an Environmental Art Method. The final form of this project will be published as several artistic and informative geocaches. These caches are reached via a GPS-based Hidden Treasures Cycling Route and a website guiding people to these hidden points of interest.
Signatures of a hidden cosmic microwave background.
Jaeckel, Joerg; Redondo, Javier; Ringwald, Andreas
2008-09-26
If there is a light Abelian gauge boson gamma' in the hidden sector its kinetic mixing with the photon can produce a hidden cosmic microwave background (HCMB). For meV masses, resonant oscillations gamma<-->gamma' happen after big bang nucleosynthesis (BBN) but before CMB decoupling, increasing the effective number of neutrinos Nnu(eff) and the baryon to photon ratio, and distorting the CMB blackbody spectrum. The agreement between BBN and CMB data provides new constraints. However, including Lyman-alpha data, Nnu(eff) > 3 is preferred. It is tempting to attribute this effect to the HCMB. The interesting parameter range will be tested in upcoming laboratory experiments. PMID:18851438
The Corporate Illiterates: The Hidden Illiterates of Silicon Valley.
ERIC Educational Resources Information Center
Chase, Sharon
1991-01-01
Describes the writing and business communication problems of college-educated workers in Silicon Valley. Discusses hidden illiterates in the universities and in the workplace. Offers solutions for professors and managers faced with the problem of hidden illiterates. (PRA)
Li, Ying; Qu, Xiaohui; Ma, Ao; Smith, Glenna J; Scherer, Norbert F; Dinner, Aaron R
2009-05-28
Traditionally, microscopic fluctuations of molecules have been probed by measuring responses of an ensemble to perturbations. Now, single-molecule experiments are capable of following fluctuations without introducing perturbations. However, dynamics not readily sampled at equilibrium should be accessible to nonequilibrium single-molecule measurements. In a recent study [Qu, X. et al. Proc. Natl. Acad. Sci. U.S.A. 2008, 105, 6602-6607], the efficiency of fluorescence resonance energy transfer (FRET) between probes on the L18 loop and 3' terminus of the 260 nucleotide RNase P RNA from Bacillus stearothermophilus was found to exhibit complex kinetics that depended on the (periodically alternating) concentration of magnesium ions ([Mg2+]) in solution. Specifically, this time series was found to exhibit a quasi-periodic response to a square-wave pattern of [Mg2+] changes. Because these experiments directly probe only one of the many degrees of freedom in the macromolecule, models are needed to interpret these data. We find that Hidden Markov Models are inadequate for describing the nonequilibrium dynamics, but they serve as starting points for the construction of models in which a discrete observable degree of freedom is coupled to a continuously evolving (hidden) variable. Consideration of several models of this general form indicates that the quasi-periodic response in the nonequilibrium experiments results from the switching (back and forth) in positions of the minima of the effective potential for the hidden variable. This switching drives oscillation of that variable and synchronizes the population to the changing [Mg2+]. We set the models in the context of earlier theoretical and experimental studies and conclude that single-molecule experiments with periodic perturbations can indeed yield qualitatively new information beyond that obtained at equilibrium. PMID:19415919
Optimal q-Markov COVER for finite precision implementation
NASA Technical Reports Server (NTRS)
Williamson, Darrell; Skelton, Robert E.
1989-01-01
The existing q-Markov COVER realization theory does not take into account the problems of arithmetic errors due to both the quantization of states and coefficients of the reduced order model. All q-Markov COVERs allow some freedom in the choice of parameters. Here, researchers exploit this freedom in the existing theory to optimize the models with respect to these finite wordlength effects.
NonMarkov Ito Processes with 1-state memory
NASA Astrophysics Data System (ADS)
McCauley, Joseph L.
2010-08-01
A Markov process, by definition, cannot depend on any previous state other than the last observed state. An Ito process implies the Fokker-Planck and Kolmogorov backward time partial differential eqns. for transition densities, which in turn imply the Chapman-Kolmogorov eqn., but without requiring the Markov condition. We present a class of Ito processes superficially resembling Markov processes, but with 1-state memory. In finance, such processes would obey the efficient market hypothesis up through the level of pair correlations. These stochastic processes have been mislabeled in recent literature as 'nonlinear Markov processes'. Inspired by Doob and Feller, who pointed out that the Chapman-Kolmogorov eqn. is not restricted to Markov processes, we exhibit a Gaussian Ito transition density with 1-state memory in the drift coefficient that satisfies both of Kolmogorov's partial differential eqns. and also the Chapman-Kolmogorov eqn. In addition, we show that three of the examples from McKean's seminal 1966 paper are also nonMarkov Ito processes. Last, we show that the transition density of the generalized Black-Scholes type partial differential eqn. describes a martingale and satisfies the Chapman-Kolmogorov eqn. This leads to the shortest-known proof that the Green function of the Black-Scholes eqn. with variable diffusion coefficient provides the so-called martingale measure of option pricing.
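The flavor of such a process can be seen in a toy Euler-Maruyama simulation in which the drift depends on both the current and the immediately preceding state; the coefficients are arbitrary illustrative choices, not the Gaussian transition density constructed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n_steps, dt=0.01, sigma=0.3):
    """Euler-Maruyama path of dX = mu(x_t, x_{t-dt}) dt + sigma dW, where
    the drift keeps 1-state memory of the previously visited state."""
    x = np.zeros(n_steps)
    for t in range(1, n_steps):
        prev = x[t - 2] if t >= 2 else x[0]
        drift = -x[t - 1] + 0.5 * (x[t - 1] - prev)   # memory enters here
        x[t] = x[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

path = simulate(10_000)
print(path.mean(), path.std())
```

Because the drift at each step consults two past states, the sampled path is not Markov, even though each increment is conditionally Gaussian.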
Fusion moves for Markov random field optimization.
Lempitsky, Victor; Rother, Carsten; Roth, Stefan; Blake, Andrew
2010-08-01
The efficient application of graph cuts to Markov Random Fields (MRFs) with multiple discrete or continuous labels remains an open question. In this paper, we demonstrate one possible way of achieving this by using graph cuts to combine pairs of suboptimal labelings or solutions. We call this combination process the fusion move. By employing recently developed graph-cut-based algorithms (so-called QPBO-graph cut), the fusion move can efficiently combine two proposal labelings in a theoretically sound way, which is in practice often globally optimal. We demonstrate that fusion moves generalize many previous graph-cut approaches, which allows them to be used as building blocks within a broader variety of optimization schemes than were considered before. In particular, we propose new optimization schemes for computer vision MRFs with applications to image restoration, stereo, and optical flow, among others. Within these schemes the fusion moves are used 1) for the parallelization of MRF optimization into several threads, 2) for fast MRF optimization by combining cheap-to-compute solutions, and 3) for the optimization of highly nonconvex continuous-labeled MRFs with 2D labels. Our final example is a nonvision MRF concerned with cartographic label placement, where fusion moves can be used to improve the performance of a standard inference method (loopy belief propagation).
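On a chain-structured MRF the fusion of two proposal labelings can be solved exactly by dynamic programming, which makes for a compact illustration of the move itself; the paper's contribution is doing this on loopy grid MRFs via QPBO graph cuts, so everything below is a toy.

```python
import numpy as np

def fuse_chain(unary, pairwise, prop0, prop1):
    """Optimal per-node choice between two proposal labelings on a chain
    MRF, found by dynamic programming over the binary 'which proposal'
    variable (graph cuts play this role on grid MRFs)."""
    n = len(prop0)
    props = np.stack([prop0, prop1])       # props[b, i]: label at i if bit b chosen
    cost = np.full((n, 2), np.inf)
    back = np.zeros((n, 2), dtype=int)
    for b in range(2):
        cost[0, b] = unary(0, props[b, 0])
    for i in range(1, n):
        for b in range(2):
            for pb in range(2):
                c = (cost[i - 1, pb] + unary(i, props[b, i])
                     + pairwise(props[pb, i - 1], props[b, i]))
                if c < cost[i, b]:
                    cost[i, b] = c
                    back[i, b] = pb
    # Backtrack the best fused labeling.
    b = int(np.argmin(cost[-1]))
    fused = [0] * n
    for i in range(n - 1, -1, -1):
        fused[i] = props[b, i]
        b = back[i, b]
    return np.array(fused)

# Toy denoising: quadratic data term, truncated-linear smoothness term.
obs = np.array([0.0, 0.1, 0.9, 1.0, 0.2])
unary = lambda i, l: (obs[i] - l) ** 2
pairwise = lambda a, b: 0.2 * min(abs(a - b), 1.0)
fused = fuse_chain(unary, pairwise, np.zeros(5), np.ones(5))
print(fused)
```

The fused result mixes the two constant proposals, taking each where it fits the data better while paying the smoothness cost at the two switch points.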
Optimized Markov state models for metastable systems
NASA Astrophysics Data System (ADS)
Guarnera, Enrico; Vanden-Eijnden, Eric
2016-07-01
A method is proposed to identify target states that optimize a metastability index amongst a set of trial states and use these target states as milestones (or core sets) to build Markov State Models (MSMs). If the optimized metastability index is small, this automatically guarantees the accuracy of the MSM, in the sense that the transitions between the target milestones are indeed approximately Markovian. The method is simple to implement and use, it does not require that the dynamics on the trial milestones be Markovian, and it also offers the possibility to partition the system's state-space by assigning every trial milestone to the target milestone it is most likely to visit next and to identify transition state regions. Here the method is tested on the Gly-Ala-Gly peptide, where it is shown to correctly identify the expected metastable states in the dihedral angle space of the molecule without a priori information about these states. It is also applied to analyze the folding landscape of the Beta3s mini-protein, where it is shown to identify the folded basin as a connecting hub between a helix-rich region, which is entropically stabilized, and a beta-rich region, which is energetically stabilized and acts as a kinetic trap.
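The milestoning bookkeeping behind such an MSM can be sketched as follows: follow a trajectory, record which core set is hit next after leaving the current one, and row-normalize the counts. The double-well dynamics, core-set intervals, and parameters are illustrative assumptions; the paper's metastability-index optimization is not reproduced here.

```python
import numpy as np

def msm_from_milestones(traj, cores):
    """Transition matrix between core sets: record which core is hit next
    after leaving the current one, then row-normalize the counts."""
    k = len(cores)
    counts = np.zeros((k, k))
    current = None
    for x in traj:
        hit = next((i for i, (lo, hi) in enumerate(cores) if lo <= x <= hi), None)
        if hit is not None:
            if current is not None and hit != current:
                counts[current, hit] += 1
            current = hit
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Overdamped double-well dynamics hopping between two metastable basins.
rng = np.random.default_rng(2)
dt = 0.01
noise = 0.7 * np.sqrt(dt) * rng.normal(size=200_000)
x, traj = -1.0, []
for w in noise:
    x += -4.0 * x * (x * x - 1.0) * dt + w   # drift of U(x) = (x^2 - 1)^2
    traj.append(x)

T = msm_from_milestones(traj, cores=[(-1.3, -0.7), (0.7, 1.3)])
print(T)
```

With only two core sets every recorded transition is a hop to the other basin, so the point of the construction is the hop counts themselves (and, in richer systems, where probability flows next).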
Markov source model for printed music decoding
NASA Astrophysics Data System (ADS)
Kopec, Gary E.; Chou, Philip A.; Maltz, David A.
1995-03-01
This paper describes a Markov source model for a simple subset of printed music notation. The model is based on the Adobe Sonata music symbol set and a message language of our own design. Chord imaging is the most complex part of the model. Much of the complexity follows from a rule of music typography that requires the noteheads for adjacent pitches to be placed on opposite sides of the chord stem. This rule leads to a proliferation of cases for other typographic details such as dot placement. We describe the language of message strings accepted by the model and discuss some of the imaging issues associated with various aspects of the message language. We also point out some aspects of music notation that appear problematic for a finite-state representation. Development of the model was greatly facilitated by the duality between image synthesis and image decoding. Although our ultimate objective was a music image model for use in decoding, most of the development proceeded by using the evolving model for image synthesis, since it is computationally far less costly to image a message than to decode an image.
Markov branching in the vertex splitting model
NASA Astrophysics Data System (ADS)
Örn Stefánsson, Sigurdur
2012-04-01
We study a special case of the vertex splitting model which is a recent model of randomly growing trees. For any finite maximum vertex degree D, we find a one parameter model, with parameter α ∈ [0,1], which has a so-called Markov branching property. When D = ∞ we find a two parameter model with an additional parameter γ ∈ [0,1] which also has this feature. In the case D = 3, the model bears resemblance to Ford's α-model of phylogenetic trees and when D = ∞ it is similar to its generalization, the αγ-model. For α = 0, the model reduces to the well known model of preferential attachment. In the case α > 0, we prove convergence of the finite volume probability measures, generated by the growth rules, to a measure on infinite trees which is concentrated on the set of trees with a single spine. We show that the annealed Hausdorff dimension with respect to the infinite volume measure is 1/α. When γ = 0 the model reduces to a model of growing caterpillar graphs in which case we prove that the Hausdorff dimension is almost surely 1/α and that the spectral dimension is almost surely 2/(1 + α). We comment briefly on the distribution of vertex degrees and correlations between degrees of neighbouring vertices.
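The α = 0 limit mentioned above, preferential attachment, is simple to simulate: each new vertex attaches to an existing vertex chosen with probability proportional to its degree. A standard sketch using the edge-endpoint-list trick:

```python
import random
from collections import Counter

random.seed(3)

def preferential_attachment_tree(n):
    """Random tree in which each new vertex attaches to an existing vertex
    chosen with probability proportional to its degree (the alpha = 0 limit)."""
    # Every edge contributes both endpoints to `ends`, so a uniform draw
    # from `ends` picks a vertex with probability proportional to its degree.
    ends = [0, 1]
    edges = [(0, 1)]
    for v in range(2, n):
        target = random.choice(ends)
        edges.append((target, v))
        ends.extend([target, v])
    return edges

edges = preferential_attachment_tree(10_000)
deg = Counter()
for a, b in edges:
    deg[a] += 1
    deg[b] += 1
print("max degree:", max(deg.values()))
```

The resulting degree sequence is heavy-tailed, with early vertices accumulating degrees far above the average of just under 2.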
Leukocytes segmentation using Markov random fields.
Reta, C; Gonzalez, J A; Diaz, R; Guichard, J S
2011-01-01
The segmentation of leukocytes and their components plays an important role in the extraction of geometric, texture, and morphological characteristics used to diagnose different diseases. This paper presents a novel method to segment leukocytes and their respective nucleus and cytoplasm from microscopic bone marrow leukemia cell images. Our method uses color and texture contextual information of image pixels to extract cellular elements from images, which show heterogeneous color and texture staining and a high cell population. The CIEL*a*b* color space is used to extract color features, whereas a 2D Wold decomposition model is applied to extract structural and stochastic texture features. The color and texture contextual information is incorporated into an unsupervised binary Markov Random Field segmentation model. Experimental results show the performance of the proposed method on both synthetic and real leukemia cell images. An average accuracy of 95% was achieved in the segmentation of real cell images by comparing those results with manually segmented cell images.
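The MRF part of such a pipeline can be illustrated with iterated conditional modes (ICM) on a synthetic binary image. The actual method uses color and Wold-texture features with its own inference scheme, so everything below (features, energies, parameters) is a simplified stand-in for the smoothness idea only.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic binary image: bright square (the "cell") on a dark background.
truth = np.zeros((32, 32))
truth[8:24, 8:24] = 1.0
img = truth + rng.normal(scale=0.6, size=truth.shape)

def icm_segment(img, beta=1.5, n_iter=10):
    """Iterated conditional modes for a binary MRF: data term is squared
    distance to the class mean, smoothness term counts disagreeing
    4-neighbours weighted by beta."""
    labels = (img > 0.5).astype(int)
    means = np.array([0.0, 1.0])
    rows, cols = img.shape
    for _ in range(n_iter):
        for i in range(rows):
            for j in range(cols):
                costs = []
                for lab in (0, 1):
                    data = (img[i, j] - means[lab]) ** 2
                    smooth = sum(
                        labels[a, b] != lab
                        for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= a < rows and 0 <= b < cols
                    )
                    costs.append(data + beta * smooth)
                labels[i, j] = int(np.argmin(costs))
    return labels

seg = icm_segment(img)
print("pixel accuracy:", (seg == truth).mean())
```

A simple per-pixel threshold misclassifies roughly a fifth of the pixels at this noise level; the neighbourhood term recovers most of them, which is the point of putting an MRF prior behind the per-pixel features.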
Noiseless compression using non-Markov models
NASA Technical Reports Server (NTRS)
Blumer, Anselm
1989-01-01
Adaptive data compression techniques can be viewed as consisting of a model specified by a database common to the encoder and decoder, an encoding rule and a rule for updating the model to ensure that the encoder and decoder always agree on the interpretation of the next transmission. The techniques which fit this framework range from run-length coding, to adaptive Huffman and arithmetic coding, to the string-matching techniques of Lempel and Ziv. The compression obtained by arithmetic coding is dependent on the generality of the source model. For many sources, an independent-letter model is clearly insufficient. Unfortunately, a straightforward implementation of a Markov model requires an amount of space exponential in the number of letters remembered. The Directed Acyclic Word Graph (DAWG) can be constructed in time and space proportional to the text encoded, and can be used to estimate the probabilities required for arithmetic coding based on an amount of memory which varies naturally depending on the encoded text. The tail of that portion of the text which was encoded is the longest suffix that has occurred previously. The frequencies of letters following these previous occurrences can be used to estimate the probability distribution of the next letter. Experimental results indicate that compression is often far better than that obtained using independent-letter models, and sometimes also significantly better than other non-independent techniques.
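The prediction rule described above, estimating the next letter from what followed earlier occurrences of the longest repeated suffix, can be shown in a few lines. A brute-force search stands in for the DAWG, which only makes the same computation efficient:

```python
from collections import Counter

def next_letter_probs(text):
    """Next-letter distribution from the longest suffix of `text` that has
    occurred earlier (brute-force search in place of a DAWG, for clarity)."""
    for k in range(len(text) - 1, 0, -1):
        suffix = text[-k:]
        followers = Counter(
            text[i + k]
            for i in range(len(text) - k)      # earlier occurrences only
            if text[i:i + k] == suffix
        )
        if followers:
            total = sum(followers.values())
            return {c: n / total for c, n in followers.items()}
    # No suffix seen before: fall back to single-letter frequencies.
    counts = Counter(text)
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

print(next_letter_probs("abracadabra"))
```

For "abracadabra" the longest previously seen suffix is "abra", whose single earlier occurrence was followed by "c", so the model predicts "c" with certainty; an arithmetic coder would consume these probabilities directly.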
The Hidden Curriculum of Doctoral Advising
ERIC Educational Resources Information Center
Harding-DeKam, Jenni L.; Hamilton, Boni; Loyd, Stacy
2012-01-01
We examined the hidden curriculum of doctoral advising by conceptualizing the advisor as a teacher. Using autoethnographic methods in this case study, we simultaneously explored both sides of the advisor-student relationship. The constructivist paradigm permeated all aspects of the research: data collection, analysis, and interpretation. The…
Hidden Messages: Instructional Materials for Investigating Culture.
ERIC Educational Resources Information Center
Finkelstein, Barbara, Ed.; Eder, Elizabeth K., Ed.
This book, intended to be used in the middle and high school classroom, provides teachers with unique ideas and lesson plans for exploring culture and adding a multicultural perspective to diverse subjects. "Hidden messages" are the messages of culture that are entwined in everyday lives, but which are seldom recognized or appreciated for the…
The Hidden Labour Market of the Academic.
ERIC Educational Resources Information Center
Rouhelo, Anne
Finding employment as an academic is becoming increasingly challenging for several reasons, including the tightening employment market and increases in the qualifications demanded of jobseekers and the pool of academically trained job seekers. A two-round Delphi study was therefore conducted to identify recruitment channels in the hidden labor…
Discovering Hidden Treasures with GPS Technology
ERIC Educational Resources Information Center
Nagel, Paul; Palmer, Roger
2014-01-01
"I found it!" Addison proudly proclaimed, as she used an iPhone and Global Positioning System (GPS) software to find the hidden geocache along the riverbank. Others in Lisa Bostick's fourth grade class were jealous, but there would be other geocaches to find. With the excitement of movies like "Pirates of the Caribbean"…
Registration of 'Hidden Valley' meadow fescue
Technology Transfer Automated Retrieval System (TEKTRAN)
'Hidden Valley' (Reg. No. CV-xxxx, PI xxxxxx) meadow fescue [Schedonorus pratensis (Huds.) P. Beauv.; syn. Festuca pratensis Huds.; syn. Lolium pratense (Huds.) Darbysh.] is a synthetic population originating from 561 parental genotypes. The original germplasm is of unknown central or northern Europ...
Hidden Transcripts of Flight Attendant Resistance.
ERIC Educational Resources Information Center
Murphy, Alexandra G.
1998-01-01
Analyzes (using flight attendants) hidden transcripts--interactions, stories, myths, and rituals in which employees participate beyond direct observation--to provide an avenue to identify resistance and change in the organizing process. Challenges the outdated ideal of transmissional meaning, questions organizational power by including the…
A Hidden Minority Amidst White Privilege
ERIC Educational Resources Information Center
Singer, Miriam J.
2008-01-01
It seems rather amusing to say that the author belongs to a minority, no less a hidden minority. After all, at first glance, she appears to be just another white girl (or woman). She grew up in the mid-west in a predominantly white community, middle class, and well educated. The paradox comes in their definition of minority. Today, as they seek to…
Taking Impressions of Hidden Cavity Walls
NASA Technical Reports Server (NTRS)
Burley, D.; Mayer, W.
1987-01-01
Lightweight, portable internal-molding device makes it possible to measure radii of, or examine contours of, passageways in hidden or complicated cavities. With the device, measurements are made in the field, without returning assemblies to a shop or laboratory for inspection. Molding head expands when compressed air is applied. Inflatable tubes around head perform dual sealing and aligning function.
Subtleties of Hidden Quantifiers in Implication
ERIC Educational Resources Information Center
Shipman, Barbara A.
2016-01-01
Mathematical conjectures and theorems are most often of the form P(x) ⇒ Q(x), meaning ∀x, P(x) ⇒ Q(x). The hidden quantifier ∀x is crucial in understanding the implication as a statement with a truth value. Here P(x) and Q(x) alone are only predicates, without truth values, since they contain unquantified variables. But standard textbook…
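The point is easy to make executable: once the quantifier is explicit, the implication has a definite truth value over a domain. The predicates below are illustrative choices (P: divisible by 4, Q: even), and the check also shows the unquantified converse failing.

```python
# The hidden quantifier made explicit: "P(x) => Q(x)" really asserts
# "for all x, P(x) implies Q(x)".  Illustrative predicates:
# P(x): x is divisible by 4;  Q(x): x is even.
P = lambda x: x % 4 == 0
Q = lambda x: x % 2 == 0

domain = range(-100, 101)
claim = all((not P(x)) or Q(x) for x in domain)      # forall x, P(x) => Q(x)
converse = all((not Q(x)) or P(x) for x in domain)   # forall x, Q(x) => P(x)
print(claim, converse)
```

P(x) and Q(x) on their own return a value only when given a concrete x; it is the `all(...)` over the domain that turns the implication into a statement with a truth value.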
Hidden supersymmetry in quantum bosonic systems
Correa, Francisco Plyushchay, Mikhail S.
2007-10-15
We show that some simple well-studied quantum mechanical systems without fermion (spin) degrees of freedom display, surprisingly, a hidden supersymmetry. The list includes the bound state Aharonov-Bohm, the Dirac delta and the Poeschl-Teller potential problems, in which the unbroken and broken N = 2 supersymmetry of linear and nonlinear (polynomial) forms is revealed.
Krejci, Adam; Hupp, Ted R.; Lexa, Matej; Vojtesek, Borivoj; Muller, Petr
2016-01-01
Motivation: Proteins often recognize their interaction partners on the basis of short linear motifs located in disordered regions on proteins’ surface. Experimental techniques that study such motifs use short peptides to mimic the structural properties of interacting proteins. Continued development of these methods allows for large-scale screening, resulting in vast amounts of peptide sequences, potentially containing information on multiple protein-protein interactions. Processing of such datasets is a complex but essential task for large-scale studies investigating protein-protein interactions. Results: The software tool presented in this article is able to rapidly identify multiple clusters of sequences carrying shared specificity motifs in massive datasets from various sources and generate multiple sequence alignments of identified clusters. The method was applied on a previously published smaller dataset containing distinct classes of ligands for SH3 domains, as well as on a new, an order of magnitude larger dataset containing epitopes for several monoclonal antibodies. The software successfully identified clusters of sequences mimicking epitopes of antibody targets, as well as secondary clusters revealing that the antibodies accept some deviations from original epitope sequences. Another test indicates that processing of even much larger datasets is computationally feasible. Availability and implementation: Hammock is published under GNU GPL v. 3 license and is freely available as a standalone program (from http://www.recamo.cz/en/software/hammock-cluster-peptides/) or as a tool for the Galaxy toolbox (from https://toolshed.g2.bx.psu.edu/view/hammock/hammock). The source code can be downloaded from https://github.com/hammock-dev/hammock/releases. Contact: muller@mou.cz Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26342231
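The clustering task Hammock addresses can be caricatured in a few lines: group equal-length peptides by mismatch distance to cluster seeds. This greedy toy is only a stand-in, with made-up peptide strings and threshold; Hammock itself builds and merges hidden Markov model profiles via alignments.

```python
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def greedy_cluster(peptides, max_dist=2):
    """Greedy clustering of equal-length peptides: each sequence joins the
    first cluster whose seed is within `max_dist` mismatches, otherwise it
    founds a new cluster."""
    clusters = []  # list of (seed, members)
    for p in peptides:
        for seed, members in clusters:
            if hamming(seed, p) <= max_dist:
                members.append(p)
                break
        else:
            clusters.append((p, [p]))
    return clusters

# Hypothetical peptide sequences carrying two distinct motifs.
peps = ["PTAPPEY", "PTAPPEW", "PSAPPEY", "WRRRRRG", "WRKRRRG"]
clusters = greedy_cluster(peps)
print([(seed, len(members)) for seed, members in clusters])
```

The two motif families separate cleanly here; real screening data needs alignment-aware profiles rather than a fixed Hamming threshold, which is exactly the gap Hammock fills.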
NASA Astrophysics Data System (ADS)
Knapmeyer-Endrun, Brigitte; Hammer, Conny
2015-10-01
Detection and identification of interesting events in single-station seismic data with little prior knowledge and under tight time constraints is a typical scenario in planetary seismology. The Apollo lunar seismic data, with the only confirmed events recorded on any extraterrestrial body yet, provide a valuable test case. Here we present the application of a stochastic event detector and classifier to the data of station Apollo 16. Based on a single-waveform example for each event class and some hours of background noise, the system is trained to recognize deep moonquakes, impacts, and shallow moonquakes and performs reliably over 3 years of data. The algorithm's demonstrated ability to detect rare events and flag previously undefined signal classes as new event types is of particular interest in the analysis of the first seismic recordings from a completely new environment. We are able to classify more than 50% of previously unclassified lunar events, and additionally find over 200 new events not listed in the current lunar event catalog. These events include deep moonquakes as well as impacts and could be used to update studies on temporal variations in event rate or deep moonquake stacks used in phase picking for localization. No unambiguous new shallow moonquake was detected, but application to data of the other Apollo stations has the potential for additional new discoveries 40 years after the data were recorded. In addition, the classification system could be useful for future seismometer missions to other planets, e.g., the InSight mission to Mars.
Slator, Paddy J.; Cairo, Christopher W.; Burroughs, Nigel J.
2015-01-01
We develop a Bayesian analysis framework to detect heterogeneity in the diffusive behaviour of single particle trajectories on cells, implementing model selection to classify trajectories as either consistent with Brownian motion or with a two-state (diffusion coefficient) switching model. The incorporation of localisation accuracy is essential, as otherwise false detection of switching within a trajectory was observed and diffusion coefficient estimates were inflated. Since our analysis is on a single trajectory basis, we are able to examine heterogeneity between trajectories in a quantitative manner. Applying our method to the lymphocyte function-associated antigen 1 (LFA-1) receptor tagged with latex beads (4 s trajectories at 1000 frames s⁻¹), both intra- and inter-trajectory heterogeneity were detected; 12–26% of trajectories display clear switching between diffusive states dependent on condition, whilst the inter-trajectory variability is highly structured with the diffusion coefficients being related by D₁ = 0.68D₀ − 1.5 × 10⁴ nm² s⁻¹, suggesting that on these time scales we are detecting switching due to a single process. Further, the inter-trajectory variability of the diffusion coefficient estimates (1.6 × 10² − 2.6 × 10⁵ nm² s⁻¹) is very much larger than the measurement uncertainty within trajectories, suggesting that LFA-1 aggregation and cytoskeletal interactions are significantly affecting mobility, whilst the timescales of these processes are distinctly different giving rise to inter- and intra-trajectory variability. There is also an ‘immobile’ state (defined as D < 3.0 × 10³ nm² s⁻¹) that is rarely involved in switching, immobility occurring with the highest frequency (47%) under T cell activation (phorbol-12-myristate-13-acetate (PMA) treatment) with enhanced cytoskeletal attachment (calpain inhibition). Such ‘immobile’ states frequently display slow linear drift, potentially reflecting binding to a dynamic actin cortex.
Our methods allow significantly more information to be extracted from individual trajectories (ultimately limited by time resolution and time-series length), and allow statistical comparisons between trajectories thereby quantifying inter-trajectory heterogeneity. Such methods will be highly informative for the construction and fitting of molecule mobility models within membranes incorporating aggregation, binding to the cytoskeleton, or traversing membrane microdomains. PMID:26473352
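The two-state switching picture in the abstract above can be illustrated with a toy simulation: a hidden state flips between two diffusion coefficients, and a naive single-state estimator recovers D from the mean squared increments. This is a minimal sketch for intuition only, not the authors' Bayesian model-selection framework; all function names and parameter values are invented for illustration.

```python
import random

def simulate_switching_traj(n, D0, D1, p_switch, dt, seed=1):
    """Simulate a 1D trajectory whose diffusion coefficient switches
    between two hidden states (toy version of a two-state model).
    Step size in each state: Gaussian with variance 2*D*dt."""
    rng = random.Random(seed)
    x, state = 0.0, 0
    xs, states = [0.0], [0]
    for _ in range(n):
        if rng.random() < p_switch:      # hidden Markov switching
            state = 1 - state
        D = D1 if state else D0
        x += rng.gauss(0.0, (2.0 * D * dt) ** 0.5)
        xs.append(x)
        states.append(state)
    return xs, states

def estimate_D(xs, dt):
    """Naive single-state estimate: D = <dx^2> / (2 dt)."""
    dx2 = [(b - a) ** 2 for a, b in zip(xs, xs[1:])]
    return sum(dx2) / (2.0 * dt * len(dx2))
```

With p_switch = 0 the estimator recovers D0; with switching on, it returns a biased blend of D0 and D1, which is exactly the failure mode that motivates per-trajectory state inference.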
NASA Astrophysics Data System (ADS)
Pham, Tuan D.; Salvetti, Federica; Wang, Bing; Diani, Marco; Heindel, Walter; Knecht, Stefan; Wersching, Heike; Baune, Bernhard T.; Berger, Klaus
2011-02-01
Rating and quantification of cerebral white matter hyperintensities on magnetic resonance imaging (MRI) are important tasks in various clinical and scientific settings. As manual evaluation is time consuming and imprecise, much effort has been made to automate the quantification of white matter hyperintensities. Few reports, however, have attempted to study the similarity/dissimilarity of white matter hyperintensity patterns that have different sizes, shapes and spatial localizations on the MRI. This paper proposes an original computational neuroscience framework for such a conceptual study, with the standpoint that prior knowledge about white matter hyperintensities can be accumulated and utilized to enable a reliable inference of the rating of a new white matter hyperintensity observation. This computational approach for rating inference of white matter hyperintensities, apparently the first of its kind, can serve as a computerized rating-assisting tool and can be very economical for diagnostic evaluation of brain tissue lesions.
1D nanocrystals with precisely controlled dimensions, compositions, and architectures
NASA Astrophysics Data System (ADS)
Pang, Xinchang; He, Yanjie; Jung, Jaehan; Lin, Zhiqun
2016-09-01
The ability to synthesize a diverse spectrum of one-dimensional (1D) nanocrystals presents an enticing prospect for exploring nanoscale size- and shape-dependent properties. Here we report a general strategy to craft a variety of plain nanorods, core-shell nanorods, and nanotubes with precisely controlled dimensions and compositions by capitalizing on functional bottlebrush-like block copolymers with well-defined structures and narrow molecular weight distributions as nanoreactors. These cylindrical unimolecular nanoreactors enable a high degree of control over the size, shape, architecture, surface chemistry, and properties of 1D nanocrystals. We demonstrate the synthesis of metallic, ferroelectric, upconversion, semiconducting, and thermoelectric 1D nanocrystals, among others, as well as combinations thereof.
The GIRAFFE Archive: 1D and 3D Spectra
NASA Astrophysics Data System (ADS)
Royer, F.; Jégouzo, I.; Tajahmady, F.; Normand, J.; Chilingarian, I.
2013-10-01
The GIRAFFE Archive (http://giraffe-archive.obspm.fr) contains the reduced spectra observed with the intermediate and high resolution multi-fiber spectrograph installed at VLT/UT2 (ESO). In its multi-object configuration and the different integral field unit configurations, GIRAFFE produces 1D spectra and 3D spectra. We present here the status of the archive and the different functionalities to select and download both 1D and 3D data products, as well as the present content. The two collections are available in the VO: the 1D spectra (summed in the case of integral field observations) and the 3D field observations. These latter products can be explored using the VO Paris Euro3D Client (http://voplus.obspm.fr/ chil/Euro3D).
PC-1D installation manual and user's guide
Basore, P.A.
1991-05-01
PC-1D is a software package for personal computers that uses finite-element analysis to solve the fully-coupled two-carrier semiconductor transport equations in one dimension. This program is particularly useful for analyzing the performance of optoelectronic devices such as solar cells, but can be applied to any bipolar device whose carrier flows are primarily one-dimensional. This User's Guide provides the information necessary to install PC-1D, define a problem for solution, solve the problem, and examine the results. Example problems are presented which illustrate these steps. The physical models and numerical methods utilized are presented in detail. This document supports version 3.1 of PC-1D, which incorporates faster numerical algorithms with better convergence properties than previous versions of the program. 51 refs., 17 figs., 5 tabs.
Pitch-based pattern splitting for 1D layout
NASA Astrophysics Data System (ADS)
Nakayama, Ryo; Ishii, Hiroyuki; Mikami, Koji; Tsujita, Koichiro; Yaegashi, Hidetami; Oyama, Kenichi; Smayling, Michael C.; Axelrad, Valery
2015-07-01
The pattern splitting algorithm for 1D Gridded-Design-Rules layout (1D layout) for sub-10 nm node logic devices is shown. It is performed with integer linear programming (ILP) based on the conflict graph created from a grid map for each designated pitch. The relation between the number of patterning steps and the minimum pitch is shown systematically with a sample pattern of contact layer for each node. The results show that 1D layout requires fewer patterning steps than conventional 2D layout. Moreover, an experimental result including SMO and total integrated process with hole repair technique is presented with the sample pattern of contact layer whose pattern density is relatively high among critical layers (fin, gate, local interconnect, contact, and metal).
Flexible Photodetectors Based on 1D Inorganic Nanostructures
Lou, Zheng
2015-01-01
Flexible photodetectors with excellent flexibility, high mechanical stability and good detectivity, have attracted great research interest in recent years. 1D inorganic nanostructures provide a number of opportunities and capabilities for use in flexible photodetectors as they have unique geometry, good transparency, outstanding mechanical flexibility, and excellent electronic/optoelectronic properties. This article offers a comprehensive review of several types of flexible photodetectors based on 1D nanostructures from the past ten years, including flexible ultraviolet, visible, and infrared photodetectors. High‐performance organic‐inorganic hybrid photodetectors, as well as devices with 1D nanowire (NW) arrays, are also reviewed. Finally, new concepts of flexible photodetectors including piezophototronic, stretchable and self‐powered photodetectors are examined to showcase the future research in this exciting field. PMID:27774404
GIS-BASED 1-D DIFFUSIVE WAVE OVERLAND FLOW MODEL
KALYANAPU, ALFRED; MCPHERSON, TIMOTHY N.; BURIAN, STEVEN J.
2007-01-17
This paper presents a GIS-based 1-d distributed overland flow model and summarizes an application to simulate a flood event. The model estimates infiltration using the Green-Ampt approach and routes excess rainfall using the 1-d diffusive wave approximation. The model was designed to use readily available topographic, soils, and land use/land cover data and rainfall predictions from a meteorological model. An assessment of model performance was performed for a small catchment and a large watershed, both in urban environments. Simulated runoff hydrographs were compared to observations for a selected set of validation events. Results confirmed the model provides reasonable predictions in a short period of time.
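The Green-Ampt approach mentioned above computes cumulative infiltration from an implicit equation. A minimal sketch, solving it by fixed-point iteration (not the model's actual code; parameter values in the usage note are invented for illustration):

```python
import math

def green_ampt_F(t, K, psi, dtheta, tol=1e-9):
    """Cumulative infiltration F(t) from the implicit Green-Ampt relation
        F = K*t + psi*dtheta * ln(1 + F/(psi*dtheta)),
    solved by fixed-point iteration (the map is a contraction for F > 0).
    K: saturated hydraulic conductivity, psi: wetting-front suction head,
    dtheta: moisture deficit. Units must be consistent (e.g. cm, hours)."""
    if t <= 0:
        return 0.0
    S = psi * dtheta              # suction-moisture term psi*dtheta
    F = max(K * t, S)             # starting guess
    for _ in range(200):
        F_new = K * t + S * math.log(1.0 + F / S)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F_new
```

For example, green_ampt_F(1.0, K=0.65, psi=16.7, dtheta=0.34) gives the cumulative infiltration after one hour for a loam-like soil; the infiltration rate then follows from f = K(1 + S/F), and excess rainfall is what remains for diffusive-wave routing.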
Observation of Dynamical Fermionization in 1D Bose Gases
NASA Astrophysics Data System (ADS)
Malvania, Neel; Xia, Lin; Xu, Wei; Wilson, Joshua M.; Zundel, Laura A.; Rigol, Marcos; Weiss, David S.
2016-05-01
The momentum distribution of a harmonically trapped 1D Bose gas in the Tonks-Girardeau limit is expected to undergo dynamical fermionization. That is, after the harmonic trap is suddenly turned off, the momentum distribution steadily transforms into that of an ideal Fermi gas in the same initial trap. We measure 1D momentum distributions at variable times after such a quench, and observe the predicted dynamical fermionization. In addition to working in the strong coupling limit, we also perform the experiment at intermediate coupling, where theoretical calculations are more challenging.
Fitting complex population models by combining particle filters with Markov chain Monte Carlo.
Knape, Jonas; de Valpine, Perry
2012-02-01
We show how a recent framework combining Markov chain Monte Carlo (MCMC) with particle filters (PFMCMC) may be used to estimate population state-space models. With the purpose of utilizing the strengths of each method, PFMCMC explores hidden states by particle filters, while process and observation parameters are estimated using an MCMC algorithm. PFMCMC is exemplified by analyzing time series data on a red kangaroo (Macropus rufus) population in New South Wales, Australia, using MCMC over model parameters based on an adaptive Metropolis-Hastings algorithm. We fit three population models to these data: a density-dependent logistic diffusion model with environmental variance, an unregulated stochastic exponential growth model, and a random-walk model. Bayes factors and posterior model probabilities show that there is little support for density dependence and that the random-walk model is the most parsimonious model. The particle filter Metropolis-Hastings algorithm is a brute-force method that may be used to fit a range of complex population models. Implementation is straightforward and less involved than standard MCMC for many models, and marginal densities for model selection can be obtained with little additional effort. The cost is mainly computational, resulting in long running times that may be improved by parallelizing the algorithm.
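The core of such a particle filter MCMC is an unbiased particle-filter estimate of the marginal likelihood, which a Metropolis-Hastings sampler then uses in its acceptance ratio. A minimal bootstrap-filter sketch for a Gaussian random-walk state-space model (a toy stand-in, not the kangaroo models of the paper; all names and values are invented):

```python
import math
import random

def pf_loglik(y, sigma_proc, sigma_obs, n_particles=500, seed=0):
    """Bootstrap particle filter estimate of the log marginal likelihood
    of observations y under the state-space model
        x_t = x_{t-1} + N(0, sigma_proc^2),  y_t = x_t + N(0, sigma_obs^2)."""
    rng = random.Random(seed)
    # crude initialization: particles scattered around the first observation
    particles = [y[0] + rng.gauss(0.0, sigma_obs) for _ in range(n_particles)]
    loglik = 0.0
    for obs in y:
        # propagate each particle through the process model
        particles = [x + rng.gauss(0.0, sigma_proc) for x in particles]
        # weight by the Gaussian observation density
        w = [math.exp(-0.5 * ((obs - x) / sigma_obs) ** 2)
             / (sigma_obs * math.sqrt(2.0 * math.pi)) for x in particles]
        loglik += math.log(sum(w) / n_particles)
        # multinomial resampling
        particles = rng.choices(particles, weights=w, k=n_particles)
    return loglik
```

Inside PFMCMC, a Metropolis-Hastings step would propose new values of (sigma_proc, sigma_obs) and accept with probability min(1, exp(loglik_new − loglik_old) × prior ratio), which is the "brute-force" cost the abstract refers to: one full filter run per proposal.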
Modeling stereopsis via Markov random field.
Ming, Yansheng; Hu, Zhanyi
2010-08-01
Markov random field (MRF) and belief propagation have given birth to stereo vision algorithms with top performance. This article explores their biological plausibility. First, an MRF model guided by physiological and psychophysical facts was designed. Typically an MRF-based stereo vision algorithm employs a likelihood function that reflects the local similarity of two regions and a potential function that models the continuity constraint. In our model, the likelihood function is constructed on the basis of the disparity energy model because complex cells are considered as front-end disparity encoders in the visual pathway. Our likelihood function is also relevant to several psychological findings. The potential function in our model is constrained by the psychological finding that the strength of the cooperative interaction minimizing relative disparity decreases as the separation between stimuli increases. Our model is tested on three kinds of stereo images. In simulations on images with repetitive patterns, we demonstrate that our model could account for the human depth percepts that were previously explained by the second-order mechanism. In simulations on random dot stereograms and natural scene images, we demonstrate that false matches introduced by the disparity energy model can be reliably removed using our model. A comparison with the coarse-to-fine model shows that our model is able to compute the absolute disparity of small objects with larger relative disparity. We also relate our model to several physiological findings. The hypothesized neurons of the model are selective for absolute disparity and have facilitative extra receptive field. There are plenty of such neurons in the visual cortex. In conclusion, we think that stereopsis can be implemented by neural networks resembling MRF.
On Markov Earth Mover’s Distance
Wei, Jie
2015-01-01
In statistics, pattern recognition and signal processing, it is of utmost importance to have an effective and efficient distance to measure the similarity between two distributions and sequences. In statistics this is referred to as the goodness-of-fit problem. Two leading goodness-of-fit methods are the chi-square and Kolmogorov–Smirnov distances. The strictly localized nature of these two measures hinders their practical utility in patterns and signals where the sample size is usually small. In view of this problem Rubner and colleagues developed the earth mover’s distance (EMD) to allow for cross-bin moves in evaluating the distance between two patterns, which finds a broad spectrum of applications. EMD-L1 was later proposed to reduce the time complexity of EMD from super-cubic by one order of magnitude by exploiting the special L1 metric. EMD-hat was developed to turn the global EMD into a localized one by discarding long-distance earth movements. In this work, we introduce a Markov EMD (MEMD) by treating the source and destination nodes absolutely symmetrically. In MEMD, like EMD-hat, the earth is only moved locally as dictated by the degree d of the neighborhood system. Nodes that cannot be matched locally are handled by dummy source and destination nodes. By use of this localized network structure, a greedy algorithm that is linear in the degree d and the number of nodes is then developed to evaluate the MEMD. Empirical studies on the use of MEMD on deterministic and statistical synthetic sequences and SIFT-based image retrieval suggested encouraging performance. PMID:25983362
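The local-move idea can be sketched for 1D histograms: mass flows greedily between bins at most d apart, and leftover mass is charged to dummy nodes at a fixed penalty. This simplified sketch illustrates only the locality and dummy-node devices; it is not the published MEMD algorithm, and the cost constants are invented:

```python
def local_emd_1d(p, q, d=1, dummy_cost=2.0):
    """Greedy, localized earth mover's distance between two 1D histograms
    p and q. Mass may only move between bins at most d apart (unit cost
    per unit distance); unmatched mass is routed to dummy nodes at a
    fixed penalty. Runs in O(n*d) for n bins."""
    n = len(p)
    surplus = [pi - qi for pi, qi in zip(p, q)]   # >0: excess, <0: deficit
    cost = 0.0
    for i in range(n):
        for j in range(max(0, i - d), min(n, i + d + 1)):
            if surplus[i] > 0 and surplus[j] < 0:
                moved = min(surplus[i], -surplus[j])
                cost += moved * abs(i - j)        # local earth movement
                surplus[i] -= moved
                surplus[j] += moved
    # leftover mass is matched against dummy source/destination nodes
    cost += dummy_cost * sum(s for s in surplus if s > 0)
    return cost
```

For identical histograms the cost is 0; mass one bin away costs its distance; mass farther than d away is charged the dummy penalty instead of a long-distance move, which is the localization the abstract describes.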
Hidden gauged U(1) model: Unifying scotogenic neutrino and flavor dark matter
NASA Astrophysics Data System (ADS)
Yu, Jiang-Hao
2016-06-01
In both scotogenic neutrino and flavor dark matter models, the dark sector communicates with the standard model fermions via Yukawa portal couplings. We propose an economic scenario where the scotogenic neutrino and a flavored mediator share the same inert Higgs doublet and all are charged under a hidden gauged U(1) symmetry. The dark Z2 symmetry in the dark sector is regarded as the remnant of this hidden U(1) symmetry breaking. In particular, we investigate a dark U(1)_D [and also U(1)_{B-L}] model which unifies the scotogenic neutrino and top-flavored mediator. Thus dark tops and dark neutrinos are the standard model fermion partners, and the dark matter could be the inert Higgs or the lightest dark neutrino. We note that this model has rich collider signatures on dark tops, the inert Higgs and the Z' gauge boson. Moreover, the scalar associated with the U(1)_D [and also U(1)_{B-L}] symmetry breaking could explain the 750 GeV diphoton excess reported by ATLAS and CMS recently.
Non-cooperative Brownian donkeys: A solvable 1D model
NASA Astrophysics Data System (ADS)
Jiménez de Cisneros, B.; Reimann, P.; Parrondo, J. M. R.
2003-12-01
A paradigmatic 1D model for Brownian motion in a spatially symmetric, periodic system is tackled analytically. Upon application of an external static force F the system's response is an average current which is positive for F < 0 and negative for F > 0 (absolute negative mobility). Under suitable conditions, the system approaches 100% efficiency when working against the external force F.
1D design style implications for mask making and CEBL
NASA Astrophysics Data System (ADS)
Smayling, Michael C.
2013-09-01
At advanced nodes, CMOS logic is being designed in a highly regular design style because of the resolution limitations of optical lithography equipment. Logic and memory layouts using 1D Gridded Design Rules (GDR) have been demonstrated to nodes beyond 12nm.[1-4] Smaller nodes will require the same regular layout style but with multiple patterning for critical layers. One of the significant advantages of 1D GDR is the ease of splitting layouts into lines and cuts. A lines and cuts approach has been used to achieve good pattern fidelity and process margin to below 12nm.[4] Line scaling with excellent line-edge roughness (LER) has been demonstrated with self-aligned spacer processing.[5] This change in design style has important implications for mask making:
• The complexity of the masks will be greatly reduced from what would be required for 2D designs with very complex OPC or inverse lithography corrections.
• The number of masks will initially increase, as for conventional multiple patterning. But in the case of 1D design, there are future options for mask count reduction.
• The line masks will remain simple, with little or no OPC, at pitches (1x) above 80nm. This provides an excellent opportunity for continual improvement of line CD and LER. The line pattern will be processed through a self-aligned pitch division sequence to divide pitch by 2 or by 4.
• The cut masks can be done with "simple OPC" as demonstrated to beyond 12nm.[6] Multiple simple cut masks may be required at advanced nodes. "Coloring" has been demonstrated to below 12nm for two colors and to 8nm for three colors.
• Cut/hole masks will eventually be replaced by e-beam direct write using complementary e-beam lithography (CEBL).[7-11] This transition is gated by the availability of multiple column e-beam systems with throughput adequate for high-volume manufacturing.
A brief description of 1D and 2D design styles will be presented, followed by examples of 1D layouts. Mask complexity for 1
Laser experiments explore the hidden sector
NASA Astrophysics Data System (ADS)
Ahlers, M.; Gies, H.; Jaeckel, J.; Redondo, J.; Ringwald, A.
2008-05-01
Recently, the laser experiments BMV and GammeV, searching for light shining through walls, have published data and calculated new limits on the allowed masses and couplings for axionlike particles. In this paper we point out that these experiments can serve to constrain a much wider variety of hidden-sector particles such as, e.g., minicharged particles and hidden-sector photons. The new experiments improve the existing bounds from the older BFRT experiment by a factor of 2. Moreover, we use the new PVLAS constraints on a possible rotation and ellipticity of light after it has passed through a strong magnetic field to constrain pure minicharged particle models. For masses ≲0.05 eV, the charge is now restricted to be less than (3–4)×10⁻⁷ times the electron electric charge. This is the best laboratory bound and comparable to bounds inferred from the energy spectrum of the cosmic microwave background.
Hidden SU (N ) glueball dark matter
NASA Astrophysics Data System (ADS)
Soni, Amarjit; Zhang, Yue
2016-06-01
We investigate the possibility that the dark matter candidate is from a pure non-Abelian gauge theory of the hidden sector, motivated in large part by its elegance and simplicity. The dark matter is the lightest bound state made of the confined gauge fields, the hidden glueball. We point out that this simple setup is capable of providing rich and novel phenomena in the dark sector, especially in the parameter space of large N. They include self-interacting and warm dark matter scenarios, Bose-Einstein condensation leading to massive dark stars possibly millions of times heavier than our sun giving rise to gravitational lensing effects, and indirect detections through higher dimensional operators as well as interesting collider signatures.
Hidden penis release: adjunctive suprapubic lipectomy.
Horton, C E; Vorstman, B; Teasley, D; Winslow, B
1987-08-01
We believe the hidden penis may be caused and concealed by a prominent suprapubic fat pad in addition to the restrictive fibrous bands of the dartos fascia fixing the shaft of the penis proximally while loose skin folds prolapse distally over the phallus. A penis of inadequate length or appearance may affect body image. Patients with this problem often require psychological support. Hidden penis may be distinguished from micropenis by palpating adequate corpora and showing a stretched penile length within 2 SD of normal. Excision of suprapubic fat with sectioning of the tethering dartos bands will release and increase the length of the penis. Suprapubic fat pad resection may also be helpful to elongate a short penis in cases of adult microphallus, or after partial penectomy because of trauma or cancer. Circumcision is contraindicated.
Biofortification for combating 'hidden hunger' for iron.
Murgia, Irene; Arosio, Paolo; Tarantino, Delia; Soave, Carlo
2012-01-01
Micronutrient deficiencies are responsible for so-called 'hidden undernutrition'. In particular, iron (Fe) deficiency adversely affects growth, immune function and can cause anaemia. However, supplementation of iron can exacerbate infectious diseases and current policies of iron therapy carefully evaluate the risks and benefits of these interventions. Here we review the approaches of biofortification of valuable crops for reducing 'hidden undernutrition' of iron in the light of the latest nutritional and medical advances. The increase of iron and prebiotics in edible parts of plants is expected to improve health, whereas the reduction of phytic acid concentration, in crops valuable for human diet, might be less beneficial for the developed countries, or for the developing countries exposed to endemic infections. PMID:22093370
Hidden variables and nonlocality in quantum mechanics
NASA Astrophysics Data System (ADS)
Hemmick, Douglas Lloyd
1997-05-01
Most physicists hold a skeptical attitude toward a 'hidden variables' interpretation of quantum theory, despite David Bohm's successful construction of such a theory and John S. Bell's strong arguments in favor of the idea. The first reason for doubt concerns certain mathematical theorems (von Neumann's, Gleason's, Kochen and Specker's, and Bell's) which can be applied to the hidden variables issue. These theorems are often credited with proving that hidden variables are indeed 'impossible', in the sense that they cannot replicate the predictions of quantum mechanics. Many who do not draw such a strong conclusion nevertheless accept that hidden variables have been shown to exhibit prohibitively complicated features. The second concern is that the most sophisticated example of a hidden variables theory-that of David Bohm-exhibits non-locality, i.e., consequences of events at one place can propagate to other places instantaneously. However, neither the mathematical theorems in question nor the attribute of nonlocality detract from the importance of a hidden variables interpretation of quantum theory. Nonlocality is present in quantum mechanics itself, and is a required characteristic of any theory that agrees with the quantum mechanical predictions. We first discuss the earliest analysis of hidden variables-that of von Neumann's theorem-and review John S. Bell's refutation of von Neumann's 'impossibility proof'. We recall and elaborate on Bell's arguments regarding the theorems of Gleason, and Kochen and Specker. According to Bell, these latter theorems do not imply that hidden variables interpretations are untenable, but instead that such theories must exhibit contextuality, i.e., they must allow for the dependence of measurement results on the characteristics of both measured system and measuring apparatus. We demonstrate a new way to understand the implications of both Gleason's theorem and Kochen and Specker's theorem by noting that they prove a result we call
Hydraulic-Leak Detector for Hidden Joints
NASA Technical Reports Server (NTRS)
Anderson, G. E.; Loo, S.
1986-01-01
Slow leakage of fluid made obvious. Indicator consists of wick wrapped at one end around joint to be monitored. Wick absorbs hydraulic fluid leaking from joint and transmits to opposite end, located outside cover plate and visible to inspector. Leakage manifested as discoloration of outside end of wick. Indicator reveals leaks in hidden fittings on hydraulic lines. Fast inspection of joints without disassembly. Used in aerospace, petroleum, chemical, nuclear, and other industries where removing covers for inspection impossible, difficult, or time-consuming.
The Hidden Gifts of Quiet Kids
ERIC Educational Resources Information Center
Trierweiler, Hannah
2006-01-01
The author relates that she was an introvert child. It has always taken her time and energy to find her place in a group. As a grown-up, she still needed quiet time to regroup during a busy day. In this article, the author presents an interview with Marti Olsen Laney, author of "The Hidden Gifts of the Introverted Child." During the interview,…
Kaiglová, Jana; Langhammer, Jakub; Jiřinec, Petr; Janský, Bohumír; Chalupová, Dagmar
2015-03-01
This article used various hydrodynamic and sediment transport models to analyze the potential and the limits of different channel schematizations. The main aim was to select and evaluate the most suitable simulation method for fine-grained sediment remobilization assessment. Three types of channel schematization were applied to study the flow potential for remobilizing fine-grained sediment in artificially modified channels: a 1D cross-sectional schematization, a 1D+ approach splitting the riverbed into different functional zones, and a full 2D mesh, all implemented in the MIKE by DHI modeling suite. For the case study, a 55-km stretch of the Bílina River, in the Czech Republic, Central Europe, which has been heavily polluted by the chemical and coal mining industry since the mid-twentieth century, was selected. Long-term exposure to direct emissions of toxic pollutants including heavy metals and persistent organic pollutants (POPs) resulted in deposits of pollutants in fine-grained sediments in the riverbed. Simulations based on the three hydrodynamic model schematizations proved that for events not exceeding the extent of the riverbed profile, the 1D schematization can provide results comparable to a 2D model. The 1D+ schematization can improve accuracy while keeping the benefits of high-speed simulation and low requirements on input DEM data, but the method's suitability is limited by the channel properties. PMID:25687259
Extracting hidden messages in steganographic images
Quach, Tu-Thach
2014-07-17
The eventual goal of steganalytic forensics is to extract the hidden messages embedded in steganographic images. A promising technique that addresses this problem partially is steganographic payload location, an approach that reveals the message bits, but not their logical order. It works by finding modified pixels, or residuals, as an artifact of the embedding process. This technique is successful against simple least-significant-bit steganography and group-parity steganography. The actual messages, however, remain hidden, as no logical order can be inferred from the located payload. This paper establishes an important result addressing this shortcoming: we show that the expected mean residuals contain enough information to logically order the located payload, provided that the size of the payload in each stego image is not fixed. The located payload can be ordered as prescribed by the mean residuals to obtain the hidden messages without knowledge of the embedding key, exposing the vulnerability of these embedding algorithms. We provide experimental results to support our analysis.
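The residual idea described above can be illustrated with a toy sketch: for least-significant-bit (LSB) steganography, a residual marks a pixel whose LSB was flipped by embedding. The pixel values, message bits, and positions below are hypothetical, and this is not the paper's mean-residual ordering procedure, only the basic embed-and-compare step it builds on:

```python
# Toy LSB embedding and residual computation (illustrative values only).
def embed_lsb(cover, bits, positions):
    """Embed message bits into the LSBs of the given pixel positions."""
    stego = list(cover)
    for pos, bit in zip(positions, bits):
        stego[pos] = (stego[pos] & ~1) | bit
    return stego

def residuals(cover, stego):
    """1 where embedding modified the LSB, 0 elsewhere."""
    return [int((c ^ s) & 1) for c, s in zip(cover, stego)]

cover = [120, 37, 54, 201, 88, 63]              # hypothetical pixel values
stego = embed_lsb(cover, bits=[1, 0, 1], positions=[1, 3, 5])
res = residuals(cover, stego)                   # nonzero only where a bit changed
```

Note that a residual appears only where the embedded bit differed from the cover pixel's existing LSB, which is why, without the ordering result above, the located payload alone does not recover the message.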
Lepton mixing from the hidden sector
NASA Astrophysics Data System (ADS)
Ludl, P. O.; Smirnov, A. Yu.
2015-10-01
Experimental results indicate a possible relation between the lepton and quark mixing matrices of the form UPMNS≈VCKM†UX , where UX is a matrix with special structure related to the mechanism of neutrino mass generation. We propose a framework which can realize such a relation. The main ingredients of the framework are the double seesaw mechanism, SO(10) grand unification and a hidden sector of the theory. The latter is composed of singlets (fermions and bosons) of the grand unified theory (GUT) symmetry with masses between the GUT and Planck scales. The interactions in this sector obey certain symmetries Ghidden. We explore the conditions under which the symmetries Ghidden can produce flavor structures in the visible sector. Here the key elements are the basis-fixing symmetry and the mediators which communicate information about properties of the hidden sector to the visible one. The interplay of SO(10) symmetry, the basis-fixing symmetry identified as Z2×Z2, and Ghidden can lead to the required form of UX. A different kind of new physics is responsible for the generation of the CKM mixing. We present the simplest realizations of the framework, which differ by the nature of the mediators and by the symmetries of the hidden sector.
MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes
Williams, B.K.
1988-01-01
Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
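As a hedged illustration of the value-improvement technique the abstract mentions, the following is a generic value-iteration sketch for a finite-state, finite-action, discounted MDP. The transition and reward arrays are placeholders, and this is not the MARKOV package's actual procedure:

```python
# Generic value iteration for a finite, discounted MDP (illustrative sketch).
# P[a][s][t] is the probability of moving from state s to state t under
# action a; R[a][s] is the expected immediate reward of action a in state s.
def value_iteration(P, R, gamma=0.95, tol=1e-8):
    n_actions, n_states = len(P), len(P[0])
    V = [0.0] * n_states
    while True:
        # Bellman backup: best expected return over actions for each state
        V_new = [
            max(R[a][s] + gamma * sum(P[a][s][t] * V[t] for t in range(n_states))
                for a in range(n_actions))
            for s in range(n_states)
        ]
        if max(abs(x - y) for x, y in zip(V, V_new)) < tol:
            V = V_new
            break
        V = V_new
    # Extract the greedy policy with respect to the converged values
    policy = [
        max(range(n_actions),
            key=lambda a: R[a][s] + gamma * sum(P[a][s][t] * V[t] for t in range(n_states)))
        for s in range(n_states)
    ]
    return V, policy
```

Policy-improvement variants iterate policy evaluation and greedy improvement instead of sweeping values, but converge to the same optimal policy for discounted problems.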
Influence of credit scoring on the dynamics of Markov chain
NASA Astrophysics Data System (ADS)
Galina, Timofeeva
2015-11-01
Markov processes are widely used to model the dynamics of a credit portfolio and to forecast portfolio risk and profitability. In the Markov chain model the loan portfolio is divided into several groups of different quality, determined by the presence of indebtedness and its terms. It is proposed that the dynamics of the portfolio shares be described by a multistage controlled system. The article outlines a mathematical formalization of the controls that reflect the actions of the bank's management to improve loan portfolio quality. The most important control is the organization of the approval procedure for loan applications. Credit scoring is studied as a control affecting the dynamic system. Different formalizations of "good" and "bad" consumers are proposed in connection with the Markov chain model.
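The uncontrolled part of such a model can be sketched in a few lines: portfolio shares evolve by repeated multiplication with a row-stochastic transition matrix. The quality groups and transition probabilities below are invented for illustration and are not taken from the article:

```python
# Illustrative loan-portfolio share dynamics under a Markov chain.
# Hypothetical quality groups: 0 = current, 1 = overdue, 2 = default.
# The transition probabilities are assumed numbers, not the article's data.
T = [
    [0.90, 0.08, 0.02],   # current  -> current / overdue / default
    [0.40, 0.45, 0.15],   # overdue  -> current / overdue / default
    [0.00, 0.00, 1.00],   # default is absorbing
]

def step(shares, T):
    """One period of share dynamics: shares' = shares * T (row vector)."""
    n = len(T)
    return [sum(shares[i] * T[i][j] for i in range(n)) for j in range(n)]

shares = [1.0, 0.0, 0.0]      # portfolio starts fully "current"
for _ in range(12):           # evolve twelve periods
    shares = step(shares, T)
```

In the controlled setting of the article, management actions such as scoring thresholds would modify the inflow into the groups, i.e. the effective transition matrix, at each stage.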
Markov sequential pattern recognition : dependency and the unknown class.
Malone, Kevin Thomas; Haschke, Greg Benjamin; Koch, Mark William
2004-10-01
The sequential probability ratio test (SPRT) minimizes the expected number of observations to a decision and can solve problems in sequential pattern recognition. Some problems have dependencies between the observations, and Markov chains can model dependencies where the state occupancy probability is geometric. For a non-geometric process we show how to use the effective amount of independent information to modify the decision process, so that we can account for the remaining dependencies. Along with dependencies between observations, a successful system needs to handle the unknown class in unconstrained environments. For example, in an acoustic pattern recognition problem any sound source not belonging to the target set is in the unknown class. We show how to incorporate goodness-of-fit (GOF) classifiers into the Markov SPRT, and determine the worst-case nontarget model. We also develop a multiclass Markov SPRT using the GOF concept.
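The classical SPRT on which this work builds can be sketched for i.i.d. Bernoulli observations; the Markov SPRT with dependency corrections and GOF classifiers described above is more involved, so this is only the baseline test with Wald's thresholds:

```python
# Wald's SPRT for i.i.d. Bernoulli observations (baseline sketch only).
# H0: success probability p0; H1: success probability p1.
import math

def sprt(observations, p0, p1, alpha=0.01, beta=0.01):
    """Return ('H0' | 'H1' | 'undecided', number of observations used)."""
    A = math.log((1 - beta) / alpha)   # accept H1 when the LLR crosses A
    B = math.log(beta / (1 - alpha))   # accept H0 when the LLR crosses B
    llr = 0.0
    for n, x in enumerate(observations, 1):
        # Accumulate the log-likelihood ratio for this observation
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= A:
            return 'H1', n
        if llr <= B:
            return 'H0', n
    return 'undecided', len(observations)
```

Dependent observations inflate the per-step information the i.i.d. accumulation assumes; the paper's modification discounts the log-likelihood increments by the effective amount of independent information.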
1-D Numerical Analysis of ABCC Engine Performance
NASA Technical Reports Server (NTRS)
Holden, Richard
1999-01-01
The ABCC engine combines an air-breathing engine and a rocket engine into a single engine to increase the specific impulse over an entire flight trajectory. Except for the heat source, the basic operation of the ABCC is similar to that of the RBCC engine. The ABCC is intended to have a higher specific impulse than the RBCC for a single-stage Earth-to-orbit vehicle. Computational fluid dynamics (CFD) is a useful tool for analyzing the complex transport processes in the various components of the ABCC propulsion system. The objective of the present research was to develop a transient 1-D numerical model, based on the conservation of mass, linear momentum, and energy equations, that could predict flow behavior throughout a generic ABCC engine following a flight path. At specific points during the development of the 1-D numerical model, numerous tests were performed to verify that the program produced consistent, realistic numbers that follow compressible flow theory for various inlet conditions.