Science.gov

Sample records for kernel fisher discriminant

  1. The use of kernel local Fisher discriminant analysis for the channelization of the Hotelling model observer

    NASA Astrophysics Data System (ADS)

    Wen, Gezheng; Markey, Mia K.

    2015-03-01

    It is resource-intensive to conduct human studies for task-based assessment of medical image quality and system optimization. Thus, numerical model observers have been developed as a surrogate for human observers. The Hotelling observer (HO) is the optimal linear observer for signal-detection tasks, but the high dimensionality of imaging data results in a heavy computational burden. Channelization is often used to approximate the HO through a dimensionality reduction step, but how to produce channelized images without losing significant image information remains a key challenge. Kernel local Fisher discriminant analysis (KLFDA) uses kernel techniques to perform supervised dimensionality reduction, which finds an embedding transformation that maximizes between-class separability and preserves within-class local structure in the low-dimensional manifold. It is powerful for classification tasks, especially when the distribution of a class is multimodal. Such multimodality could be observed in many practical clinical tasks. For example, primary and metastatic lesions may both appear in medical imaging studies, but the distributions of their typical characteristics (e.g., size) may be very different. In this study, we propose to use KLFDA as a novel channelization method. The dimension of the embedded manifold (i.e., the result of KLFDA) is a counterpart to the number of channels in the state-of-the-art linear channelization. We present a simulation study to demonstrate the potential usefulness of KLFDA for building the channelized HOs (CHOs) and generating reliable decision statistics for clinical tasks. We show that the performance of the CHO with KLFDA channels is comparable to that of the benchmark CHOs.
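
    As a minimal illustration of the channelized Hotelling observer that such an embedding would feed, the sketch below (a hypothetical example, not the authors' code) computes the Hotelling template and decision statistics in channel space; the channel matrix U is a random placeholder standing in for the KLFDA embedding described above.

    ```python
    import numpy as np

    def channelized_hotelling(signal_imgs, noise_imgs, channels):
        """Hotelling observer on channelized images.

        signal_imgs, noise_imgs: (n_images, n_pixels) training sets
        channels: (n_pixels, n_channels) channel matrix, e.g. a KLFDA embedding
        Returns the channel-space template and a scoring function.
        """
        vs = signal_imgs @ channels           # channelized signal-present images
        vn = noise_imgs @ channels            # channelized signal-absent images
        dmean = vs.mean(axis=0) - vn.mean(axis=0)
        # intra-class scatter averaged over the two classes
        S = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
        template = np.linalg.solve(S, dmean)  # w = S^{-1} (mu_s - mu_n)
        score = lambda imgs: imgs @ channels @ template  # decision statistics
        return template, score

    # toy usage with random data and a random (placeholder) channel matrix
    rng = np.random.default_rng(0)
    sig = rng.normal(1.0, 1.0, (200, 64))
    noi = rng.normal(0.0, 1.0, (200, 64))
    U = rng.normal(size=(64, 5))              # stand-in for KLFDA channels
    w, score = channelized_hotelling(sig, noi, U)
    ```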

  2. Fault diagnosis of nonlinear and large-scale processes using novel modified kernel Fisher discriminant analysis approach

    NASA Astrophysics Data System (ADS)

    Shi, Huaitao; Liu, Jianchang; Wu, Yuhou; Zhang, Ke; Zhang, Lixiu; Xue, Peng

    2016-04-01

    Timely and accurate fault diagnosis is important for improving the dependability of industrial processes. In this study, fault diagnosis of nonlinear and large-scale processes by variable-weighted kernel Fisher discriminant analysis (KFDA) based on improved biogeography-based optimisation (IBBO) is proposed, referred to as IBBO-KFDA, where IBBO is used to determine the parameters of variable-weighted KFDA, and variable-weighted KFDA is used to solve the multi-classification overlapping problem. The main contributions of this work, which further improve the performance of KFDA for fault diagnosis, are four-fold. First, a nonlinear fault diagnosis approach with variable-weighted KFDA is developed to maximise the separation between overlapping fault samples. Second, the kernel parameters and the feature selection of variable-weighted KFDA are optimised simultaneously using IBBO. Third, a single fitness function that combines the erroneous diagnosis rate with the feature cost is created and serves as the target function of the optimisation problem, and a novel mixed kernel function is introduced to improve the classification capability in the feature space and the diagnosis accuracy of IBBO-KFDA. Finally, an IBBO approach is developed to obtain better solution quality and faster convergence. The proposed IBBO-KFDA method is first used on the Tennessee Eastman process benchmark data sets to validate its feasibility and efficiency, and is then applied to diagnose faults of an automation gauge control system. Simulation results demonstrate that IBBO-KFDA can obtain better kernel parameters and feature vectors with a lower computing cost, higher diagnosis accuracy and better real-time capacity.

  3. A one-class kernel fisher criterion for outlier detection.

    PubMed

    Dufrenois, Franck

    2015-05-01

    Recently, Dufrenois and Noyer proposed a one-class Fisher's linear discriminant to isolate normal data from outliers. In this paper, a kernelized version of their criterion is presented. Their criterion was originally optimized by an iterative process alternating between subspace selection and clustering; I show here that it admits an upper bound which makes these two problems independent. In particular, the estimation of the label vector is formulated as an unconstrained binary linear problem (UBLP) which can be solved using an iterative perturbation method. Once the label vector is estimated, an optimal projection subspace is obtained by solving a generalized eigenvalue problem. Like many other kernel methods, the performance of the proposed approach depends on the choice of the kernel. Constructed with a Gaussian kernel, the proposed contrast measure is shown to be an efficient indicator for selecting an optimal kernel width. This property simplifies the model selection problem, which is typically solved by costly (generalized) cross-validation procedures. Initialization, convergence analysis, and computational complexity are also discussed. Lastly, the proposed algorithm is compared with recent novelty detectors on synthetic and real data sets.

  4. Kernel Optimization in Discriminant Analysis

    PubMed Central

    You, Di; Hamsici, Onur C.; Martinez, Aleix M.

    2011-01-01

    Kernel mapping is one of the most used approaches to intrinsically derive nonlinear classifiers. The idea is to use a kernel function which maps the original nonlinearly separable problem to a space of intrinsically larger dimensionality where the classes are linearly separable. A major problem in the design of kernel methods is to find the kernel parameters that make the problem linear in the mapped representation. This paper derives the first criterion that specifically aims to find a kernel representation where the Bayes classifier becomes linear. We illustrate how this result can be successfully applied in several kernel discriminant analysis algorithms. Experimental results using a large number of databases and classifiers demonstrate the utility of the proposed approach. The paper also shows (theoretically and experimentally) that a kernel version of Subclass Discriminant Analysis yields the highest recognition rates. PMID:20820072

  5. The MDF discrimination measure: Fisher in disguise.

    PubMed

    Loog, Marco; Duin, Robert P W; Viergever, Max A

    2004-05-01

    Recently, a discrimination measure for feature extraction for two-class data, called the maximum discriminating feature (MDF) measure (Talukder and Casasent [Neural Networks 14 (2001) 1201-1218]), was introduced. In the present paper, it is shown that the MDF discrimination measure produces exactly the same results as the classical Fisher criterion, on the condition that the two prior probabilities are chosen to be equal. The effect of unequal priors on the efficiency of the measures is also discussed.
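
    For reference, the classical two-class Fisher criterion that the paper compares against is J(w) = (w^T S_B w) / (w^T S_W w); a minimal sketch, assuming equal priors as the equivalence requires:

    ```python
    import numpy as np

    def fisher_criterion(X0, X1, w):
        """Two-class Fisher ratio J(w) = (w' S_B w) / (w' S_W w), equal priors."""
        m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
        Sb = np.outer(m1 - m0, m1 - m0)                           # between-class scatter
        Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # within-class scatter
        return (w @ Sb @ w) / (w @ Sw @ w)

    def fisher_direction(X0, X1):
        """Direction maximizing J(w): w = S_W^{-1} (m1 - m0)."""
        m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
        Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
        return np.linalg.solve(Sw, m1 - m0)
    ```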

  6. Semisupervised kernel marginal Fisher analysis for face recognition.

    PubMed

    Wang, Ziqiang; Sun, Xia; Sun, Lijun; Huang, Yuchun

    2013-01-01

    Dimensionality reduction is a key problem in face recognition due to the high dimensionality of face images. To effectively cope with this problem, a novel dimensionality reduction algorithm called semisupervised kernel marginal Fisher analysis (SKMFA) for face recognition is proposed in this paper. SKMFA can make use of both labeled and unlabeled samples to learn the projection matrix for nonlinear dimensionality reduction. Meanwhile, it can successfully avoid the singularity problem by not calculating the matrix inverse. In addition, in order to make the nonlinear structure captured by the data-dependent kernel consistent with the intrinsic manifold structure, a manifold adaptive nonparametric kernel is incorporated into the learning process of SKMFA. Experimental results on three face image databases demonstrate the effectiveness of our proposed algorithm.

  7. Semisupervised Kernel Marginal Fisher Analysis for Face Recognition

    PubMed Central

    Wang, Ziqiang; Sun, Xia; Sun, Lijun; Huang, Yuchun

    2013-01-01

    Dimensionality reduction is a key problem in face recognition due to the high dimensionality of face images. To effectively cope with this problem, a novel dimensionality reduction algorithm called semisupervised kernel marginal Fisher analysis (SKMFA) for face recognition is proposed in this paper. SKMFA can make use of both labeled and unlabeled samples to learn the projection matrix for nonlinear dimensionality reduction. Meanwhile, it can successfully avoid the singularity problem by not calculating the matrix inverse. In addition, in order to make the nonlinear structure captured by the data-dependent kernel consistent with the intrinsic manifold structure, a manifold adaptive nonparametric kernel is incorporated into the learning process of SKMFA. Experimental results on three face image databases demonstrate the effectiveness of our proposed algorithm. PMID:24163638

  8. Emotion recognition from single-trial EEG based on kernel Fisher's emotion pattern and imbalanced quasiconformal kernel support vector machine.

    PubMed

    Liu, Yi-Hung; Wu, Chien-Te; Cheng, Wei-Teng; Hsiao, Yu-Tsung; Chen, Po-Ming; Teng, Jyh-Tong

    2014-07-24

    Electroencephalogram-based emotion recognition (EEG-ER) has received increasing attention in the fields of health care, affective computing, and brain-computer interface (BCI). However, satisfactory ER performance within a bi-dimensional and non-discrete emotional space using single-trial EEG data remains a challenging task. To address this issue, we propose a three-layer scheme for single-trial EEG-ER. In the first layer, a set of spectral powers of different EEG frequency bands are extracted from multi-channel single-trial EEG signals. In the second layer, the kernel Fisher's discriminant analysis method is applied to further extract features with better discrimination ability from the EEG spectral powers. The feature vector produced by layer 2 is called a kernel Fisher's emotion pattern (KFEP), and is sent into layer 3 for further classification where the proposed imbalanced quasiconformal kernel support vector machine (IQK-SVM) serves as the emotion classifier. The outputs of the three layer EEG-ER system include labels of emotional valence and arousal. Furthermore, to collect effective training and testing datasets for the current EEG-ER system, we also use an emotion-induction paradigm in which a set of pictures selected from the International Affective Picture System (IAPS) are employed as emotion induction stimuli. The performance of the proposed three-layer solution is compared with that of other EEG spectral power-based features and emotion classifiers. Results on 10 healthy participants indicate that the proposed KFEP feature performs better than other spectral power features, and IQK-SVM outperforms traditional SVM in terms of the EEG-ER accuracy. Our findings also show that the proposed EEG-ER scheme achieves the highest classification accuracies of valence (82.68%) and arousal (84.79%) among all testing methods.

  9. Emotion Recognition from Single-Trial EEG Based on Kernel Fisher's Emotion Pattern and Imbalanced Quasiconformal Kernel Support Vector Machine

    PubMed Central

    Liu, Yi-Hung; Wu, Chien-Te; Cheng, Wei-Teng; Hsiao, Yu-Tsung; Chen, Po-Ming; Teng, Jyh-Tong

    2014-01-01

    Electroencephalogram-based emotion recognition (EEG-ER) has received increasing attention in the fields of health care, affective computing, and brain-computer interface (BCI). However, satisfactory ER performance within a bi-dimensional and non-discrete emotional space using single-trial EEG data remains a challenging task. To address this issue, we propose a three-layer scheme for single-trial EEG-ER. In the first layer, a set of spectral powers of different EEG frequency bands are extracted from multi-channel single-trial EEG signals. In the second layer, the kernel Fisher's discriminant analysis method is applied to further extract features with better discrimination ability from the EEG spectral powers. The feature vector produced by layer 2 is called a kernel Fisher's emotion pattern (KFEP), and is sent into layer 3 for further classification where the proposed imbalanced quasiconformal kernel support vector machine (IQK-SVM) serves as the emotion classifier. The outputs of the three layer EEG-ER system include labels of emotional valence and arousal. Furthermore, to collect effective training and testing datasets for the current EEG-ER system, we also use an emotion-induction paradigm in which a set of pictures selected from the International Affective Picture System (IAPS) are employed as emotion induction stimuli. The performance of the proposed three-layer solution is compared with that of other EEG spectral power-based features and emotion classifiers. Results on 10 healthy participants indicate that the proposed KFEP feature performs better than other spectral power features, and IQK-SVM outperforms traditional SVM in terms of the EEG-ER accuracy. Our findings also show that the proposed EEG-ER scheme achieves the highest classification accuracies of valence (82.68%) and arousal (84.79%) among all testing methods. PMID:25061837
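
    The first layer of this scheme (band-limited spectral powers from multi-channel single-trial EEG) can be approximated with a standard Welch estimate, as in the sketch below; the band edges and segment length are conventional choices, not necessarily those used by the authors.

    ```python
    import numpy as np
    from scipy.signal import welch

    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

    def band_powers(eeg, fs):
        """eeg: (n_channels, n_samples) single trial.
        Returns a (n_channels * n_bands,) spectral-power feature vector."""
        freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)
        feats = []
        for lo, hi in BANDS.values():
            idx = (freqs >= lo) & (freqs < hi)
            feats.append(psd[:, idx].mean(axis=-1))   # mean power per channel in band
        return np.concatenate(feats)

    # toy usage: 32-channel, 5-second trial sampled at 256 Hz
    x = np.random.randn(32, 256 * 5)
    features = band_powers(x, fs=256)   # fed to kernel Fisher's discriminant analysis in layer 2
    ```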

  10. Kernel PLS-SVC for Linear and Nonlinear Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Trejo, Leonard J.; Matthews, Bryan

    2003-01-01

    A new methodology for discrimination is proposed. It is based on kernel orthonormalized partial least squares (PLS) dimensionality reduction of the original data space followed by support vector machines for classification. The close connection between orthonormalized PLS and Fisher's approach to linear discrimination, or equivalently canonical correlation analysis, is described. This motivates the preference for orthonormalized PLS over principal component analysis. Good behavior of the proposed method is demonstrated on 13 different benchmark data sets and on the real-world problem of classifying finger-movement periods versus non-movement periods based on the electroencephalogram.

  11. Kernel Partial Least Squares for Nonlinear Regression and Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper summarizes recent results on applying the method of partial least squares (PLS) in a reproducing kernel Hilbert space (RKHS). A previously proposed kernel PLS regression model was proven to be competitive with other regularized regression methods in RKHS. The family of nonlinear kernel-based PLS models is extended by considering the kernel PLS method for discrimination. Theoretical and experimental results on a two-class discrimination problem indicate the usefulness of the method.

  12. Selection of principal components based on Fisher discriminant ratio

    NASA Astrophysics Data System (ADS)

    Zeng, Xiangyan; Naghedolfeizi, Masoud; Arora, Sanjeev; Yousif, Nabil; Aberra, Dawit

    2016-05-01

    Principal component analysis transforms a set of possibly correlated variables into uncorrelated variables, and is widely used as a technique of dimensionality reduction and feature extraction. In some applications of dimensionality reduction, the objective is to use a small number of principal components to represent most of the variation in the data. On the other hand, the main purpose of feature extraction is to facilitate subsequent pattern recognition and machine learning tasks, such as classification. Selecting principal components for classification tasks aims for more than dimensionality reduction: the capability of distinguishing different classes is another major concern, and components that have larger eigenvalues do not necessarily have better distinguishing capabilities. In this paper, we investigate a strategy of selecting principal components based on the Fisher discriminant ratio. The ratio of between-class variance to within-class variance is calculated for each component, and the principal components are selected on that basis. The main objective is to select the components that have large Fisher discriminant ratios so that adequate class separability is obtained. To alleviate overfitting, which is common when few training data are available, the number of selected components is determined by the classification accuracy on validation data within a cross-validation procedure. The selection method is evaluated by face recognition experiments.
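
    A minimal sketch of this selection strategy, assuming numpy/scikit-learn; the paper chooses the number of retained components by cross-validated classification accuracy, which is left here as the n_keep parameter.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def fisher_ratios(Z, y):
        """Fisher discriminant ratio (between-class / within-class variance) per component.
        Z: (n_samples, n_components) scores, y: integer class labels (numpy array)."""
        classes = np.unique(y)
        overall = Z.mean(axis=0)
        between = sum((y == c).sum() * (Z[y == c].mean(axis=0) - overall) ** 2
                      for c in classes)
        within = sum(((Z[y == c] - Z[y == c].mean(axis=0)) ** 2).sum(axis=0)
                     for c in classes)
        return between / (within + 1e-12)

    def select_components(X, y, n_keep):
        """Rank PCA components by Fisher ratio (not by eigenvalue) and keep the top n_keep."""
        pca = PCA().fit(X)
        Z = pca.transform(X)
        order = np.argsort(fisher_ratios(Z, y))[::-1]
        return pca, order[:n_keep]
    ```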

  13. Kernel generalized neighbor discriminant embedding for SAR automatic target recognition

    NASA Astrophysics Data System (ADS)

    Huang, Yulin; Pei, Jifang; Yang, Jianyu; Wang, Tao; Yang, Haiguang; Wang, Bing

    2014-12-01

    In this paper, we propose a new supervised feature extraction algorithm for synthetic aperture radar automatic target recognition (SAR ATR), called generalized neighbor discriminant embedding (GNDE). Based on manifold learning, GNDE integrates class and neighborhood information to enhance the discriminative power of the extracted features. In addition, a kernelized counterpart of this algorithm, called kernel GNDE (KGNDE), is also proposed. The experiments in this paper show that the proposed algorithms have better recognition performance than PCA and KPCA.

  14. Plethysmographic arterial waveform strain discrimination by Fisher's method.

    PubMed

    Kucewicz, John C; Huang, Lingyun; Beach, Kirk W

    2004-06-01

    Plethysmography has been used for over 50 years to measure gross change in tissue blood volume. Over the cardiac cycle, perfused tissue initially expands as the blood flow into the arterioles exceeds the flow through the capillary bed. Later in the cardiac cycle, the accumulated blood drains into the venous vasculature, allowing the tissue to return to its presystolic blood volume. Specific features in the plethysmographic waveform can be used to identify normal and abnormal perfusion. We are developing a Doppler strain-imaging technique to measure the local pulsatile expansion and relaxation of tissue analogous to the gross measurement of tissue volume change with conventional plethysmography. A phantom has been built to generate plethysmographic-style strains with amplitudes of less than 0.1% in a tissue-mimicking material. With Fisher's discriminant analysis, it is shown that normal and abnormal plethysmographic-style strains can be differentiated with high sensitivities using the Fourier components of the strain waveforms normalized to compensate for the variance in the strain amplitude estimate.

  15. Semi-supervised learning for ordinal Kernel Discriminant Analysis.

    PubMed

    Pérez-Ortiz, M; Gutiérrez, P A; Carbonero-Ruz, M; Hervás-Martínez, C

    2016-12-01

    Ordinal classification considers those classification problems where the labels of the variable to predict follow a given order. Naturally, labelled data is scarce or difficult to obtain in this type of problems because, in many cases, ordinal labels are given by a user or expert (e.g. in recommendation systems). Firstly, this paper develops a new strategy for ordinal classification where both labelled and unlabelled data are used in the model construction step (a scheme which is referred to as semi-supervised learning). More specifically, the ordinal version of kernel discriminant learning is extended for this setting considering the neighbourhood information of unlabelled data, which is proposed to be computed in the feature space induced by the kernel function. Secondly, a new method for semi-supervised kernel learning is devised in the context of ordinal classification, which is combined with our developed classification strategy to optimise the kernel parameters. The experiments conducted compare 6 different approaches for semi-supervised learning in the context of ordinal classification in a battery of 30 datasets, showing (1) the good synergy of the ordinal version of discriminant analysis and the use of unlabelled data and (2) the advantage of computing distances in the feature space induced by the kernel function.

  16. Learning Discriminative Stein Kernel for SPD Matrices and Its Applications.

    PubMed

    Zhang, Jianjia; Wang, Lei; Zhou, Luping; Li, Wanqing

    2016-05-01

    Stein kernel (SK) has recently shown promising performance on classifying images represented by symmetric positive definite (SPD) matrices. It evaluates the similarity between two SPD matrices through their eigenvalues. In this paper, we argue that directly using the original eigenvalues may be problematic because: 1) eigenvalue estimation becomes biased when the number of samples is inadequate, which may lead to unreliable kernel evaluation, and 2) more importantly, eigenvalues reflect only the property of an individual SPD matrix. They are not necessarily optimal for computing SK when the goal is to discriminate different classes of SPD matrices. To address the two issues, we propose a discriminative SK (DSK), in which an extra parameter vector is defined to adjust the eigenvalues of input SPD matrices. The optimal parameter values are sought by optimizing a proxy of classification performance. To show the generality of the proposed method, three kernel learning criteria that are commonly used in the literature are employed as a proxy. A comprehensive experimental study is conducted on a variety of image classification tasks to compare the proposed DSK with the original SK and other methods for evaluating the similarity between SPD matrices. The results demonstrate that the DSK can attain greater discrimination and better align with classification tasks by altering the eigenvalues. This makes it produce higher classification performance than the original SK and other commonly used methods.
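
    For context, the original (non-discriminative) Stein kernel between SPD matrices X and Y is commonly written as k(X, Y) = exp(-sigma * S(X, Y)), with the symmetric Stein divergence S(X, Y) = log det((X + Y)/2) - (1/2) log det(XY); a minimal numpy sketch follows. The paper's DSK would additionally rescale the eigenvalues of each input matrix by a learned parameter vector before this computation.

    ```python
    import numpy as np

    def stein_divergence(X, Y):
        """Symmetric Stein (Jensen-Bregman LogDet) divergence between SPD matrices."""
        _, ld_mid = np.linalg.slogdet(0.5 * (X + Y))
        _, ld_x = np.linalg.slogdet(X)
        _, ld_y = np.linalg.slogdet(Y)
        return ld_mid - 0.5 * (ld_x + ld_y)

    def stein_kernel(mats_a, mats_b, sigma=1.0):
        """Gram matrix of exp(-sigma * S(X, Y)) over two lists of SPD matrices."""
        K = np.empty((len(mats_a), len(mats_b)))
        for i, X in enumerate(mats_a):
            for j, Y in enumerate(mats_b):
                K[i, j] = np.exp(-sigma * stein_divergence(X, Y))
        return K
    ```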

  17. Multilevel image recognition using discriminative patches and kernel covariance descriptor

    NASA Astrophysics Data System (ADS)

    Lu, Le; Yao, Jianhua; Turkbey, Evrim; Summers, Ronald M.

    2014-03-01

    Computer-aided diagnosis of medical images has emerged as an important tool to objectively improve the performance, accuracy and consistency of clinical workflow. Computerizing medical image diagnostic recognition involves three fundamental problems: where to look (i.e., where the region of interest lies within the whole image/volume), image feature description/encoding, and similarity metrics for classification or matching. In this paper, we exploit the motivation, implementation and performance evaluation of task-driven iterative, discriminative image patch mining; a covariance matrix based descriptor via intensity, gradient and spatial layout; and a log-Euclidean distance kernel for support vector machines, to address these three aspects respectively. To cope with the often visually ambiguous image patterns of the region of interest in medical diagnosis, discovery of multilabel selective discriminative patches is desired. The covariance of several image statistics summarizes their second-order interactions within an image patch and has proven to be an effective image descriptor, with low dimensionality compared with joint statistics and fast computation regardless of the patch size. We extensively evaluate two extended Gaussian kernels using the affine-invariant Riemannian metric or the log-Euclidean metric with support vector machines (SVM), on two medical image classification problems: degenerative disc disease (DDD) detection on cortical shell unwrapped CT maps and colitis detection on CT key images. The proposed approach is validated with promising quantitative results on these challenging tasks. Our experimental findings and discussion also unveil some interesting insights on the covariance feature composition with or without spatial layout for classification and retrieval, and on different kernel constructions for SVM. This will also shed some light on future work using covariance features and kernel classification for medical image analysis.
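
    A minimal sketch of the covariance descriptor and a log-Euclidean kernel for an SVM with a precomputed Gram matrix, assuming per-pixel feature vectors (e.g., intensity, gradients, coordinates) have already been extracted; the feature choice and the gamma value are placeholders, not the paper's settings.

    ```python
    import numpy as np
    from scipy.linalg import logm
    from sklearn.svm import SVC

    def covariance_descriptor(feats, eps=1e-5):
        """feats: (n_pixels, n_features) per-pixel statistics of one patch.
        Returns a regularized SPD covariance matrix describing the patch."""
        C = np.cov(feats, rowvar=False)
        return C + eps * np.eye(C.shape[0])      # keep it strictly positive definite

    def log_euclidean_kernel(descs_a, descs_b, gamma=0.1):
        """Gram matrix of exp(-gamma * ||logm(X) - logm(Y)||_F^2) (log-Euclidean metric)."""
        logs_a = [np.real(logm(X)) for X in descs_a]
        logs_b = [np.real(logm(Y)) for Y in descs_b]
        K = np.empty((len(logs_a), len(logs_b)))
        for i, Lx in enumerate(logs_a):
            for j, Ly in enumerate(logs_b):
                K[i, j] = np.exp(-gamma * np.linalg.norm(Lx - Ly, "fro") ** 2)
        return K

    # usage with an SVM on the precomputed Gram matrix:
    # clf = SVC(kernel="precomputed").fit(log_euclidean_kernel(train_descs, train_descs), y_train)
    ```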

  18. A Gabor-block-based kernel discriminative common vector approach using cosine kernels for human face recognition.

    PubMed

    Kar, Arindam; Bhattacharjee, Debotosh; Basu, Dipak Kumar; Nasipuri, Mita; Kundu, Mahantapas

    2012-01-01

    In this paper, a nonlinear Gabor Wavelet Transform (GWT) discriminant feature extraction approach for enhanced face recognition is proposed. Firstly, the low-energized blocks from Gabor wavelet transformed images are extracted. Secondly, the nonlinear discriminating features are analyzed and extracted from the selected low-energized blocks by the generalized Kernel Discriminative Common Vector (KDCV) method. The KDCV method is extended to include a cosine kernel function in the discriminating method. The KDCV with the cosine kernels is then applied on the extracted low-energized discriminating feature vectors to obtain the real component of a complex quantity for face recognition. In order to derive positive kernel discriminative vectors, we apply only those kernel discriminative eigenvectors that are associated with nonzero eigenvalues. The feasibility of the low-energized Gabor-block-based generalized KDCV method with cosine kernel function models has been successfully tested for classification using the L1 and L2 distance measures and the cosine similarity measure, on both frontal and pose-angled face recognition. Experimental results on the FRAV2D and FERET databases demonstrate the effectiveness of this new approach.

  19. Radial basis function neural networks for nonlinear Fisher discrimination and Neyman-Pearson classification.

    PubMed

    Casasent, David; Chen, Xue-wen

    2003-01-01

    We propose a novel technique for the design of radial basis function (RBF) neural networks (NNs). To select various RBF parameters, the class membership information of training samples is utilized to produce new cluster classes. This allows emphasis of classification performance for certain class data rather than best overall classification, which lets us control performance as desired and approximate Neyman-Pearson classification. We also show that, by properly choosing the desired output neuron levels, the RBF hidden-to-output layer performs Fisher discrimination analysis, and that the full system performs a nonlinear Fisher analysis. Data on an agricultural product inspection problem and on synthetic data confirm the effectiveness of these methods.

  20. Sparse dimensionality reduction of hyperspectral image based on semi-supervised local Fisher discriminant analysis

    NASA Astrophysics Data System (ADS)

    Shao, Zhenfeng; Zhang, Lei

    2014-09-01

    This paper presents a novel sparse dimensionality reduction method for hyperspectral images based on semi-supervised local Fisher discriminant analysis (SELF). The proposed method is designed to be especially effective for the out-of-sample extrapolation, realizing advantageous complementarities between SELF and sparsity preserving projections (SPP). Compared to SELF and SPP, the proposed method offers highly discriminative ability and produces an explicit nonlinear feature mapping for the out-of-sample extrapolation; this explicit mapping for dimensionality reduction allows it to improve the classification performance of subsequent classifiers. Experimental analysis of the sparsity and efficacy of the low-dimensional outputs shows that sparse dimensionality reduction based on SELF can yield good classification results and interpretability in the field of hyperspectral remote sensing.

  1. A Feature Selection Method Based on Fisher's Discriminant Ratio for Text Sentiment Classification

    NASA Astrophysics Data System (ADS)

    Wang, Suge; Li, Deyu; Wei, Yingjie; Li, Hongxia

    With the rapid growth of e-commerce, product reviews on the Web have become an important information source for customers' decision making when they intend to buy some product. As the reviews are often too many for customers to go through, how to automatically classify them into different sentiment orientation categories (i.e. positive/negative) has become a research problem. In this paper, based on Fisher's discriminant ratio, an effective feature selection method is proposed for product review text sentiment classification. To validate the proposed method, we compared it with methods based on information gain and mutual information, with a support vector machine adopted as the classifier. Six sub-experiments are conducted by combining the different feature selection methods with two kinds of candidate feature sets. On 1006 car review documents, the experimental results indicate that Fisher's discriminant ratio based on word frequency estimation gives the best performance, with an F value of 83.3%, when the candidate features are the words that appear in both positive and negative texts.
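
    A minimal sketch of ranking word features by Fisher's discriminant ratio from positive and negative document-term matrices; the one-dimensional form (mu_pos - mu_neg)^2 / (var_pos + var_neg) is used here, and the cutoff k is a placeholder. The retained features would then be fed to an SVM, as in the paper.

    ```python
    import numpy as np

    def fisher_ratio_per_feature(X_pos, X_neg):
        """Fisher's discriminant ratio for each column (word) of two
        document-term matrices: (mu_pos - mu_neg)^2 / (var_pos + var_neg)."""
        mu_p, mu_n = X_pos.mean(axis=0), X_neg.mean(axis=0)
        var_p, var_n = X_pos.var(axis=0), X_neg.var(axis=0)
        return (mu_p - mu_n) ** 2 / (var_p + var_n + 1e-12)

    def select_words(X_pos, X_neg, vocab, k=2000):
        """Keep the k words with the largest Fisher ratio."""
        scores = fisher_ratio_per_feature(X_pos, X_neg)
        top = np.argsort(scores)[::-1][:k]
        return [vocab[i] for i in top], top
    ```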

  2. Fisher's linear discriminant ratio based threshold for moving human detection in thermal video

    NASA Astrophysics Data System (ADS)

    Sharma, Lavanya; Yadav, Dileep Kumar; Singh, Annapurna

    2016-09-01

    In video surveillance, moving human detection in thermal video is a critical phase that filters out redundant information to extract relevant information. Moving object detection is applied to thermal video because thermal imaging mitigates challenging problems such as dynamic background and illumination variation. In this work, we propose a new background subtraction method using a threshold based on Fisher's linear discriminant ratio. This threshold is determined automatically at run-time for each pixel of every sequential frame; automatically means that no external input, such as from a programmer or user, is required for threshold selection. The threshold provides better pixel classification at run-time and, by using Fisher's ratio, handles problems arising from the multi-modal behaviour of the background more accurately, maximizing the separation between object pixels and background pixels. To check its efficacy, the performance of this work is evaluated in terms of the various parameters depicted in the analysis. The experimental results and their analysis demonstrate better performance of the proposed method against the considered peer methods.
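
    The kind of threshold described above can be illustrated with the following sketch, which picks the cut on the frame-background difference that maximizes the ratio of between-class to within-class variance; the paper computes such a threshold per pixel over sequential frames, whereas this simplified version is global.

    ```python
    import numpy as np

    def fisher_ratio_threshold(values, n_bins=64):
        """Pick the threshold on a 1-D sample that maximizes Fisher's ratio
        (between-class variance / within-class variance) of the two induced classes."""
        edges = np.linspace(values.min(), values.max(), n_bins + 1)[1:-1]
        best_t, best_j = edges[0], -np.inf
        for t in edges:
            lo, hi = values[values <= t], values[values > t]
            if len(lo) < 2 or len(hi) < 2:
                continue
            between = (lo.mean() - hi.mean()) ** 2
            within = lo.var() + hi.var()
            j = between / (within + 1e-12)
            if j > best_j:
                best_t, best_j = t, j
        return best_t

    def foreground_mask(frame, background):
        """Classify pixels as moving object (True) or background (False)."""
        diff = np.abs(frame.astype(float) - background.astype(float))
        t = fisher_ratio_threshold(diff.ravel())   # global here; per-pixel in the paper
        return diff > t
    ```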

  3. Color model and method for video fire flame and smoke detection using Fisher linear discriminant

    NASA Astrophysics Data System (ADS)

    Wei, Yuan; Jie, Li; Jun, Fang; Yongming, Zhang

    2013-02-01

    Video fire detection is playing an increasingly important role in our life, but recent research is often based on the traditional RGB color model for analyzing flames, which may not be the optimal color space for fire recognition. The situation is worse when smoke is studied simply using gray images instead of color ones. We clarify the importance of color information for fire detection and present a fire discriminant color (FDC) model for flame or smoke recognition based on color images. The FDC models aim to unify fire color image representation and the fire recognition task into one framework. With the definition of between-class scatter matrices and within-class scatter matrices of the Fisher linear discriminant, the proposed models seek to obtain one color-space-transform matrix and a discriminant projection basis vector by maximizing the ratio of these two scatter matrices. First, an iterative basic algorithm is designed to get a one-component color space transformed from RGB. Then, a general algorithm is extended to generate a three-component color space for further improvement. Moreover, we propose a method for video fire detection based on the models using the kNN classifier. To evaluate the recognition performance, we create a database including flame, smoke, and nonfire images for training and testing. The test experiments show that the proposed model achieves a flame verification rate receiver operating characteristic (ROC I) of 97.5% at a false alarm rate (FAR) of 1.06% and a smoke verification rate (ROC II) of 91.5% at a FAR of 1.2%, and numerous fire video experiments demonstrate that our method reaches high accuracy for fire recognition.

  4. Multiple Kernel Sparse Representation based Orthogonal Discriminative Projection and Its Cost-Sensitive Extension.

    PubMed

    Zhang, Guoqing; Sun, Huaijiang; Xia, Guiyu; Sun, Quansen

    2016-07-07

    Sparse representation based classification (SRC) has been developed and has shown great potential for real-world applications. Based on SRC, Yang et al. [10] devised an SRC steered discriminative projection (SRC-DP) method. However, as a linear algorithm, SRC-DP cannot handle data with highly nonlinear distributions. The kernel sparse representation-based classifier (KSRC) is a nonlinear extension of SRC and can remedy this drawback, but KSRC requires the use of a predetermined kernel function, and selecting the kernel function and its parameters is difficult. Recently, multiple kernel learning for SRC (MKL-SRC) [22] has been proposed to learn a kernel from a set of base kernels. However, MKL-SRC considers only the within-class reconstruction residual while ignoring the between-class relationship when learning the kernel weights. In this paper, we propose a novel multiple kernel sparse representation-based classifier (MKSRC), and then use it as a criterion to design a multiple kernel sparse representation based orthogonal discriminative projection method (MK-SR-ODP). The proposed algorithm aims at learning a projection matrix and a corresponding kernel from the given base kernels such that, in the low-dimensional subspace, the between-class reconstruction residual is maximized and the within-class reconstruction residual is minimized. Furthermore, to achieve a minimum overall loss by performing recognition in the learned low-dimensional subspace, we introduce cost information into the dimensionality reduction method. The solutions for the proposed method can be found efficiently using the trace ratio optimization method [33]. Extensive experimental results demonstrate the superiority of the proposed algorithm when compared with state-of-the-art methods.

  5. Early discriminant method of infected kernel based on the erosion effects of laser ultrasonics

    NASA Astrophysics Data System (ADS)

    Fan, Chao

    2015-07-01

    To discriminate infected wheat kernels as early as possible, a new detection method for hidden insects, especially in their egg and larval stages, is put forward in this paper based on the erosion effect of laser ultrasonics. The surface of the grain is exposed to a pulsed laser, whose energy is absorbed and excites ultrasound, and the infected kernel can be recognized by appropriate signal analysis. Firstly, the detection principle is given based on the classical wave equation and the experimental platform is established. Then, the detected ultrasonic signal is processed both in the time domain and the frequency domain using the FFT and DCT, and six significant features are selected as the characteristic parameters of the signal by stepwise discriminant analysis. Finally, a BP neural network is designed using these six parameters as inputs to separate the infected kernels from the normal ones. Numerous experiments were performed using twenty wheat varieties; the results showed that infected kernels can be recognized effectively, with a false negative error of 12% and a false positive error of 9%, indicating that the discriminant method for infected kernels based on the erosion effect of laser ultrasonics is feasible.
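
    A rough sketch of the signal-processing and classification pipeline described above, using FFT/DCT descriptors and a small neural network as a stand-in for the BP network; the six features shown are placeholders, not the ones selected by the paper's stepwise discriminant analysis.

    ```python
    import numpy as np
    from scipy.fft import rfft, dct
    from sklearn.neural_network import MLPClassifier

    def ultrasonic_features(signal):
        """Crude time/frequency descriptors of one ultrasonic trace (placeholders
        for the six features the paper selects by stepwise discriminant analysis)."""
        spec = np.abs(rfft(signal))
        cepstral = dct(signal, norm="ortho")
        return np.array([
            signal.std(),                   # time-domain energy spread
            np.argmax(spec),                # dominant frequency bin
            spec.mean(), spec.std(),        # spectral level and spread
            np.abs(cepstral[:10]).mean(),   # low-order DCT energy
            np.abs(cepstral[10:50]).mean(), # mid-order DCT energy
        ])

    # X = np.vstack([ultrasonic_features(s) for s in traces]); y = labels (0 = normal, 1 = infected)
    # clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)  # BP-network stand-in
    ```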

  6. Kernel-Based Discriminant Techniques for Educational Placement

    ERIC Educational Resources Information Center

    Lin, Miao-hsiang; Huang, Su-yun; Chang, Yuan-chin

    2004-01-01

    This article considers the problem of educational placement. Several discriminant techniques are applied to a data set from a survey project of science ability. A profile vector for each student consists of five science-educational indicators. The students are intended to be placed into three reference groups: advanced, regular, and remedial.…

  7. Identification of wheat varieties with a parallel-plate capacitance sensor using fisher linear discriminant analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Fisher’s linear discriminant (FLD) models for wheat variety classification were developed and validated. The inputs to the FLD models were the capacitance (C), impedance (Z), and phase angle, measured at two frequencies. Classification of wheat varieties was obtained as output of the FLD mod...

  8. Bilinear analysis for kernel selection and nonlinear feature extraction.

    PubMed

    Yang, Shu; Yan, Shuicheng; Zhang, Chao; Tang, Xiaoou

    2007-09-01

    This paper presents a unified criterion, the Fisher + kernel criterion (FKC), for feature extraction and recognition. This new criterion is intended to extract the most discriminant features in different nonlinear spaces and then fuse these features under a unified measurement. Thus, FKC can simultaneously achieve nonlinear discriminant analysis and kernel selection. In addition, we present an efficient algorithm, Fisher + kernel analysis (FKA), which utilizes bilinear analysis to optimize the new criterion. The FKA algorithm can alleviate the ill-posed problem that exists in traditional kernel discriminant analysis (KDA) and usually has no singularity problem. The effectiveness of our proposed algorithm is validated by a series of face-recognition experiments on several different databases.

  9. Discriminative Learning for Automatic Staging of Placental Maturity via Multi-layer Fisher Vector

    NASA Astrophysics Data System (ADS)

    Lei, Baiying; Yao, Yuan; Chen, Siping; Li, Shengli; Li, Wanjun; Ni, Dong; Wang, Tianfu

    2015-07-01

    Currently, placental maturity staging is performed by subjective evaluation, which can be unreliable as it is highly dependent on the observations and experiences of clinicians. To address this problem, this paper proposes a method to automatically stage placental maturity from B-mode ultrasound (US) images based on dense sampling and novel feature descriptors. Specifically, our proposed method first densely extracts features on a regular grid based on dense sampling instead of a few unreliable interest points. These features are then clustered using a generative Gaussian mixture model (GMM) to obtain high-order statistics of the features. The clustering representatives (i.e., cluster means) are encoded by a Fisher vector (FV) to enhance staging accuracy. Differing from previous studies, a multi-layer FV, rather than a single-layer FV, is investigated to exploit spatial information. Experimental results show that the proposed method with the dense FV has achieved an area under the receiver operating characteristic curve (AUC) of 96.77%, and sensitivity and specificity of 98.04% and 93.75%, respectively, for placental maturity staging. Our experimental results also demonstrate that the dense feature outperforms the traditional sparse feature for placental maturity staging.
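
    A simplified Fisher-vector encoding (gradients with respect to the GMM means only, with power and L2 normalization), assuming diagonal-covariance components; the full FV also includes gradients with respect to the covariances, and the paper stacks such encodings over multiple spatial layers.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def fisher_vector_means(X, gmm):
        """Simplified Fisher vector: gradients w.r.t. the GMM means only.
        X: (n_desc, d) densely sampled local descriptors from one image.
        gmm: fitted GaussianMixture with covariance_type='diag' (assumed)."""
        gamma = gmm.predict_proba(X)                      # (n_desc, k) posteriors
        sigma = np.sqrt(gmm.covariances_)                 # (k, d) per-component std devs
        n, k = X.shape[0], gmm.n_components
        fv = []
        for j in range(k):
            diff = (X - gmm.means_[j]) / sigma[j]
            g = (gamma[:, j:j + 1] * diff).sum(axis=0) / (n * np.sqrt(gmm.weights_[j]))
            fv.append(g)
        fv = np.concatenate(fv)
        fv = np.sign(fv) * np.sqrt(np.abs(fv))            # power normalization
        return fv / (np.linalg.norm(fv) + 1e-12)          # L2 normalization

    # gmm = GaussianMixture(n_components=32, covariance_type="diag").fit(all_descriptors)
    ```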

  10. Joint L2,1 Norm and Fisher Discrimination Constrained Feature Selection for Rational Synthesis of Microporous Aluminophosphates.

    PubMed

    Qi, Miao; Wang, Ting; Yi, Yugen; Gao, Na; Kong, Jun; Wang, Jianzhong

    2017-04-01

    Feature selection has been regarded as an effective tool to help researchers understand the generating process of data. For mining the synthesis mechanism of microporous AlPOs, this paper proposes a novel feature selection method based on joint l2,1-norm and Fisher discrimination constraints (JNFDC). In order to obtain a more effective feature subset, the proposed method is carried out in two steps. The first step is to rank the features according to sparse and discriminative constraints. The second step is to establish a predictive model with the ranked features and select the most significant features in light of their contribution to improving the predictive accuracy. To the best of our knowledge, JNFDC is the first work which employs sparse representation theory to explore the synthesis mechanism of six kinds of pore rings. Numerical simulations demonstrate that our proposed method can select significant features affecting the specified structural property and improve the predictive accuracy. Moreover, comparison results show that JNFDC obtains better predictive performance than some other state-of-the-art feature selection methods.

  11. Unsupervised Wishart Classification of Wetlands in Newfoundland, Canada Using PolSAR Data Based on Fisher Linear Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Mohammadimanesh, F.; Salehi, B.; Mahdianpari, M.; Homayouni, S.

    2016-06-01

    Polarimetric Synthetic Aperture Radar (PolSAR) imagery is a complex multi-dimensional dataset, which is an important source of information for various natural resources and environmental classification and monitoring applications. PolSAR imagery produces valuable information by observing scattering mechanisms from different natural and man-made objects. Land cover mapping using PolSAR data classification is one of the most important applications of SAR remote sensing earth observations, and it has gained increasing attention in recent years. However, one of the most challenging aspects of classification is selecting features with maximum discrimination capability. To address this challenge, a statistical approach based on Fisher Linear Discriminant Analysis (FLDA) and the incorporation of the physical interpretation of PolSAR data into classification is proposed in this paper. After pre-processing of the PolSAR data, including speckle reduction, the H/α classification is used to classify the basic scattering mechanisms. Then, a new method for feature weighting, based on the fusion of FLDA and physical interpretation, is implemented. This method proves to increase the classification accuracy as well as the between-class discrimination in the final Wishart classification. The proposed method was applied to a full polarimetric C-band RADARSAT-2 data set from the Avalon area, Newfoundland and Labrador, Canada. This imagery was acquired in June 2015 and covers various types of wetlands, including bogs, fens, marshes and shallow water. The results were compared with the standard Wishart classification, and an improvement of about 20% was achieved in the overall accuracy. This method provides an opportunity for operational wetland classification in northern latitudes with high accuracy using only SAR polarimetric data.

  12. Accurate palm vein recognition based on wavelet scattering and spectral regression kernel discriminant analysis

    NASA Astrophysics Data System (ADS)

    Elnasir, Selma; Shamsuddin, Siti Mariyam; Farokhi, Sajad

    2015-01-01

    Palm vein recognition (PVR) is a promising new biometric that has been applied successfully as a method of access control by many organizations and has even further potential in the field of forensics. The palm vein pattern has highly discriminative features that are difficult to forge because of its subcutaneous position in the palm. Despite considerable progress, a few practical issues remain, and providing accurate palm vein readings is still an unsolved problem in biometrics. We propose a robust and more accurate PVR method based on the combination of wavelet scattering (WS) with spectral regression kernel discriminant analysis (SRKDA). As the dimension of the WS-generated features is quite large, SRKDA is required to reduce the extracted features and enhance the discrimination. The results, based on two public databases (the PolyU Hyper Spectral Palmprint database and the PolyU Multi Spectral Palmprint database), show the high performance of the proposed scheme in comparison with state-of-the-art methods. The proposed approach scored a 99.44% identification rate and a 99.90% verification rate [equal error rate (EER) = 0.1%] for the hyperspectral database, and a 99.97% identification rate and a 99.98% verification rate (EER = 0.019%) for the multispectral database.

  13. Discriminating between HuR and TTP binding sites using the k-spectrum kernel method

    PubMed Central

    Goldberg, Debra S.; Dowell, Robin

    2017-01-01

    Background: The RNA binding proteins (RBPs) human antigen R (HuR) and Tristetraprolin (TTP) are known to exhibit competitive binding but have opposing effects on the bound messenger RNA (mRNA). How cells discriminate between the two proteins is an interesting problem. Machine learning approaches, such as support vector machines (SVMs), may be useful in the identification of discriminative features. However, this method has yet to be applied to studies of RNA binding protein motifs. Results: Applying the k-spectrum kernel to a support vector machine (SVM), we first verified the published binding sites of both HuR and TTP. Additional feature engineering highlighted the U-rich binding preference of HuR and the AU-rich binding preference of TTP. Domain adaptation along with multi-task learning was used to predict the common binding sites. Conclusion: The distinction between HuR and TTP binding appears to rest on subtle content features. HuR prefers strongly U-rich sequences, whereas TTP prefers AU-rich sequences; with increasing A content, sequences are more likely to be bound only by TTP. Our model is consistent with competitive binding of the two proteins, particularly at intermediate AU-balanced sequences. This suggests that fine changes in the A/U balance within an untranslated region (UTR) can alter the binding and subsequent stability of the message. Both feature engineering and domain adaptation emphasized the extent to which these proteins recognize similar general sequence features. This work suggests that the k-spectrum kernel method could be useful when studying RNA binding proteins, and that domain adaptation techniques such as feature augmentation could be employed particularly when examining RBPs with similar binding preferences. PMID:28333956
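
    A minimal sketch of the k-spectrum kernel (dot products between k-mer count vectors) fed to an SVM through a precomputed Gram matrix; the value of k and the toy usage are placeholders, not the study's settings.

    ```python
    import numpy as np
    from collections import Counter
    from sklearn.svm import SVC

    def kmer_counts(seq, k=3):
        """k-spectrum feature map: counts of every length-k substring of a sequence."""
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def spectrum_kernel(seqs_a, seqs_b, k=3):
        """Gram matrix of dot products between k-mer count vectors."""
        counts_a = [kmer_counts(s, k) for s in seqs_a]
        counts_b = [kmer_counts(s, k) for s in seqs_b]
        K = np.zeros((len(counts_a), len(counts_b)))
        for i, ca in enumerate(counts_a):
            for j, cb in enumerate(counts_b):
                K[i, j] = sum(v * cb.get(m, 0) for m, v in ca.items())
        return K

    # usage: clf = SVC(kernel="precomputed").fit(spectrum_kernel(train_seqs, train_seqs, k=5), y_train)
    ```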

  14. Discriminative time-frequency kernels for gait analysis for amyotrophic lateral sclerosis.

    PubMed

    Sugavaneswaran, Lakshmi; Umapathy, Karthikeyan; Krishnan, Sridhar

    2011-01-01

    Many stochastic systems show certain trends which in turn govern their underlying non-stationary, time-varying behavior. In order to facilitate efficient quantification of such signals, their analysis necessitates robust tools for discerning between different classes of data. Research shows that time-frequency techniques offer intelligible representations for non-stationary signals, along with facilitating the computation of instantaneous parameters. Further, in order to obtain efficient discrimination, machine learning (ML) modules are often used alongside suitable representation techniques. In this work, we exploit the concepts of ML kernel functions directly by incorporating them in the ambiguity time-frequency (TF) space, thereby obtaining one-step discrimination between different non-stationary patterns. The proposed technique is evaluated for quantification applications in gait signal analysis. An overall classification accuracy of 93.1% is reported for the neurological gait database consisting of signals from 16 control and 13 amyotrophic lateral sclerosis (ALS) subjects. Results indicate that this scheme offers great potential in designing robust tools for time-varying signal analysis.

  15. A Method for Selecting between Fisher's Linear Classification Functions and Least Absolute Deviation in Predictive Discriminant Analysis.

    ERIC Educational Resources Information Center

    Meshbane, Alice; Morris, John D.

    A method for comparing the cross-validated classification accuracy of Fisher's linear classification functions (FLCFs) and the least absolute deviation is presented under varying data conditions for the two-group classification problem. With this method, separate-group as well as total-sample proportions of correct classifications can be compared…

  16. Discrimination of pulp oil and kernel oil from pequi (Caryocar brasiliense) by fatty acid methyl esters fingerprinting, using GC-FID and multivariate analysis.

    PubMed

    Faria-Machado, Adelia F; Tres, Alba; van Ruth, Saskia M; Antoniassi, Rosemar; Junqueira, Nilton T V; Lopes, Paulo Sergio N; Bizzo, Humberto R

    2015-11-18

    Pequi is an oleaginous fruit whose edible oil is composed mainly of saturated and monounsaturated fatty acids. The biological and nutritional properties of pequi oil depend on its composition, which can change according to the oil source (pulp or kernel). There is little data in the scientific literature concerning the differences between the compositions of pequi kernel and pulp oils. Therefore, in this study, different pequi genotypes were evaluated to determine the fatty acid composition of pulp and kernel oils. PCA and PLS-DA were applied to develop a model to distinguish these oils. For all evaluated genotypes, the major fatty acids of both pulp and kernel oils were oleic and palmitic acids. Despite the apparent similarity between the analyzed samples, it was possible to discriminate pulp and kernel oils by means of their fatty acid composition using chemometrics, as well as the unique pequi genotype without endocarp spines (CPAC-PQ-SE-06).

  17. Choosing parameters of kernel subspace LDA for recognition of face images under pose and illumination variations.

    PubMed

    Huang, Jian; Yuen, Pong C; Chen, Wen-Sheng; Lai, Jian Huang

    2007-08-01

    This paper addresses the problem of automatically tuning multiple kernel parameters for the kernel-based linear discriminant analysis (LDA) method. The kernel approach has been proposed to solve face recognition problems under complex distribution by mapping the input space to a high-dimensional feature space. Some recognition algorithms such as the kernel principal components analysis, kernel Fisher discriminant, generalized discriminant analysis, and kernel direct LDA have been developed in the last five years. The experimental results show that the kernel-based method is a good and feasible approach to tackle the pose and illumination variations. One of the crucial factors in the kernel approach is the selection of kernel parameters, which highly affects the generalization capability and stability of the kernel-based learning methods. In view of this, we propose an eigenvalue-stability-bounded margin maximization (ESBMM) algorithm to automatically tune the multiple parameters of the Gaussian radial basis function kernel for the kernel subspace LDA (KSLDA) method, which is developed based on our previously developed subspace LDA method. The ESBMM algorithm improves the generalization capability of the kernel-based LDA method by maximizing the margin maximization criterion while maintaining the eigenvalue stability of the kernel-based LDA method. An in-depth investigation on the generalization performance on pose and illumination dimensions is performed using the YaleB and CMU PIE databases. The FERET database is also used for benchmark evaluation. Compared with the existing PCA-based and LDA-based methods, our proposed KSLDA method, with the ESBMM kernel parameter estimation algorithm, gives superior performance.

  18. Support vector machine with a Pearson VII function kernel for discriminating halophilic and non-halophilic proteins.

    PubMed

    Zhang, Guangya; Ge, Huihua

    2013-10-01

    Understanding proteins adapted to hypersaline environments and identifying them is a challenging task and would help in designing stable proteins. Here, we have systematically analyzed the normalized amino acid compositions of 2121 halophilic and 2400 non-halophilic proteins. The results showed that halophilic proteins contain more Asp at the expense of Lys, Ile, Cys and Met, have fewer small and hydrophobic residues, and show a large excess of acidic over basic amino acids. We then introduce a support vector machine method to discriminate halophilic from non-halophilic proteins using a novel Pearson VII universal function based kernel. In the three validation check methods, it achieved overall accuracies of 97.7%, 91.7% and 86.9% and outperformed other machine learning algorithms. We also address the influence of protein size on prediction accuracy and found that the worse performance for small proteins might be because some significant residues (Cys and Lys) are missing in those proteins.
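
    The Pearson VII universal kernel (PUK) is commonly parameterized as below; this sketch passes it to scikit-learn's SVC as a callable kernel, with sigma, omega and C as placeholder values rather than the study's settings.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def puk_kernel(sigma=1.0, omega=1.0):
        """Pearson VII universal kernel (PUK), as commonly parameterized:
        K(x, y) = 1 / (1 + (2 * ||x - y|| * sqrt(2^(1/omega) - 1) / sigma)^2)^omega."""
        def kernel(X, Y):
            # squared Euclidean distances between all rows of X and all rows of Y
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
            base = (2.0 * np.sqrt(d2) * np.sqrt(2.0 ** (1.0 / omega) - 1.0) / sigma) ** 2
            return 1.0 / (1.0 + base) ** omega
        return kernel

    # usage on amino-acid composition features (e.g., 20-dimensional vectors):
    # clf = SVC(kernel=puk_kernel(sigma=1.0, omega=1.0), C=10.0).fit(X_train, y_train)
    ```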

  19. Discrimination of adulterated milk based on two-dimensional correlation spectroscopy (2D-COS) combined with kernel orthogonal projection to latent structure (K-OPLS).

    PubMed

    Yang, Renjie; Liu, Rong; Xu, Kexin; Yang, Yanrong

    2013-12-01

    A new method for discriminating adulterated milk from pure milk is proposed by combining two-dimensional correlation spectroscopy (2D-COS) with kernel orthogonal projection to latent structure (K-OPLS). Three types of adulterated milk, containing urea, melamine, and glucose, respectively, were prepared. The synchronous 2D spectra of the adulterated milk and pure milk samples were calculated. Based on the characteristics of the 2D correlation spectra of adulterated milk and pure milk, a discriminant model of urea-tainted milk, melamine-tainted milk, glucose-tainted milk, and pure milk was built by K-OPLS. The classification accuracy rates for unknown samples were 85.7, 92.3, 100, and 87.5%, respectively. The results show that this method has great potential for the rapid discrimination of adulterated milk from pure milk.

  20. Discriminative clustering via extreme learning machine.

    PubMed

    Huang, Gao; Liu, Tianchi; Yang, Yan; Lin, Zhiping; Song, Shiji; Wu, Cheng

    2015-10-01

    Discriminative clustering is an unsupervised learning framework which introduces the discriminative learning rule of supervised classification into clustering. The underlying assumption is that a good partition (clustering) of the data should yield high discrimination, namely, the partitioned data can be easily classified by some classification algorithms. In this paper, we propose three discriminative clustering approaches based on the Extreme Learning Machine (ELM). The first algorithm iteratively trains a weighted ELM (W-ELM) classifier to gradually maximize the data discrimination. The second and third methods are both built on Fisher's Linear Discriminant Analysis (LDA), but one approach adopts alternative optimization while the other leverages kernel k-means. We show that the proposed algorithms can be easily implemented and yield competitive clustering accuracy on real-world data sets compared to state-of-the-art clustering methods.

  1. A conditional entropy minimization criterion for dimensionality reduction and multiple kernel learning.

    PubMed

    Hino, Hideitsu; Murata, Noboru

    2010-11-01

    Reducing the dimensionality of high-dimensional data without losing its essential information is an important task in information processing. When class labels of training data are available, Fisher discriminant analysis (FDA) has been widely used. However, the optimality of FDA is guaranteed only in a very restricted ideal circumstance, and it is often observed that FDA does not provide a good classification surface for many real problems. This letter treats the problem of supervised dimensionality reduction from the viewpoint of information theory and proposes a framework of dimensionality reduction based on class-conditional entropy minimization. The proposed linear dimensionality-reduction technique is validated both theoretically and experimentally. Then, through kernel Fisher discriminant analysis (KFDA), the multiple kernel learning problem is treated in the proposed framework, and a novel algorithm, which iteratively optimizes the parameters of the classification function and the kernel combination coefficients, is proposed. The algorithm is experimentally shown to be comparable to or to outperform KFDA for large-scale benchmark data sets, and comparable to other multiple kernel learning techniques on the yeast protein function annotation task.

  2. Generalized Fisher matrices

    NASA Astrophysics Data System (ADS)

    Heavens, A. F.; Seikel, M.; Nord, B. D.; Aich, M.; Bouffanais, Y.; Bassett, B. A.; Hobson, M. P.

    2014-12-01

    The Fisher Information Matrix formalism (Fisher 1935) is extended to cases where the data are divided into two parts (X, Y), where the expectation value of Y depends on X according to some theoretical model, and X and Y both have errors with arbitrary covariance. In the simplest case, (X, Y) represent data pairs of abscissa and ordinate, in which case the analysis deals with the case of data pairs with errors in both coordinates, but X can be any measured quantities on which Y depends. The analysis applies for arbitrary covariance, provided all errors are Gaussian, and provided the errors in X are small, both in comparison with the scale over which the expected signal Y changes, and with the width of the prior distribution. This generalizes the Fisher Matrix approach, which normally only considers errors in the `ordinate' Y. In this work, we include errors in X by marginalizing over latent variables, effectively employing a Bayesian hierarchical model, and deriving the Fisher Matrix for this more general case. The methods here also extend to likelihood surfaces which are not Gaussian in the parameter space, and so techniques such as DALI (Derivative Approximation for Likelihoods) can be generalized straightforwardly to include arbitrary Gaussian data error covariances. For simple mock data and theoretical models, we compare to Markov Chain Monte Carlo experiments, illustrating the method with cosmological supernova data. We also include the new method in the FISHER4CAST software.
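
    For reference, the record above generalizes the standard "errors in the ordinate only" case. For Gaussian-distributed data with mean μ(θ) and covariance C(θ), that standard Fisher matrix takes the familiar form below; the generalized expression with errors in X derived in the paper is not reproduced here.

```latex
% Standard Fisher matrix for Gaussian data y with mean \mu(\theta) and
% covariance C(\theta), together with the Cramer-Rao bound it implies.
F_{ab} \;=\; \tfrac{1}{2}\,\mathrm{Tr}\!\left[\, C^{-1}\,\partial_a C \; C^{-1}\,\partial_b C \,\right]
\;+\; (\partial_a \boldsymbol{\mu})^{\mathsf T} C^{-1} (\partial_b \boldsymbol{\mu}),
\qquad
\mathrm{Var}\!\left(\hat\theta_a\right) \;\ge\; \left(F^{-1}\right)_{aa}.
```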

  3. Fisher in Adelaide.

    PubMed

    Mayo, Oliver

    2014-06-01

    R. A. Fisher spent much of the final 3 years of his life in Adelaide. It was a congenial place to live and work, and he was much in demand as a speaker, in Australia and overseas. It was, however, a difficult time for him because of the sustained criticism of fiducial inference from the early 1950s onwards. The article discusses some of Fisher's work on inference from an Adelaide perspective. It also considers some of the successes arising from this time, in the statistics of field experimentation and in evolutionary genetics. A few personal recollections of Fisher as a houseguest are provided. This article is the text of an article presented on August 31, 2012, at the 26th International Biometric Conference, Kobe, Japan.

  4. Analysis of the Fisher solution

    SciTech Connect

    Abdolrahimi, Shohreh; Shoom, Andrey A.

    2010-01-15

    We study the d-dimensional Fisher solution which represents a static, spherically symmetric, asymptotically flat spacetime with a massless scalar field. The solution has two parameters, the mass M and the 'scalar charge' Σ. The Fisher solution has a naked curvature singularity which divides the spacetime manifold into two disconnected parts. The part which is asymptotically flat we call the Fisher spacetime, and the other part we call the Fisher universe. The d-dimensional Schwarzschild-Tangherlini solution and the Fisher solution belong to the same theory and are dual to each other. The duality transformation acting in the parameter space (M, Σ) maps the exterior region of the Schwarzschild-Tangherlini black hole into the Fisher spacetime, which has a naked timelike singularity, and the interior region of the black hole into the Fisher universe, which is an anisotropic expanding-contracting universe with two spacelike singularities representing its 'big bang' and 'big crunch'. The big bang singularity and the singularity of the Fisher spacetime are radially weak in the sense that a 1-dimensional object moving along a timelike radial geodesic can arrive at the singularities intact. In the vicinity of the singularity, the Fisher spacetime of nonzero mass has a region where its Misner-Sharp energy is negative. The Fisher universe has a marginally trapped surface corresponding to the state of its maximal expansion in the angular directions. These results and derived relations between geometric quantities of the Fisher spacetime, the Fisher universe, and the Schwarzschild-Tangherlini black hole may suggest that the massless scalar field transforms the black hole event horizon into the naked, radially weak, disjoint singularities of the Fisher spacetime and the Fisher universe, which are 'dual to the horizon'.

  5. phase_space_cosmo_fisher: Fisher matrix 2D contours

    NASA Astrophysics Data System (ADS)

    Stark, Alejo

    2016-11-01

    phase_space_cosmo_fisher produces Fisher matrix 2D contours from which the constraints on cosmological parameters can be derived. Given a specified redshift array and cosmological case, 2D marginalized contours of cosmological parameters are generated; the code can also plot the derivatives used in the Fisher matrix. In addition, this package can generate 3D plots of qH^2 and other cosmological quantities as a function of redshift and cosmology.
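
    As an illustration of what such a package computes, the sketch below turns a toy Fisher matrix into the semi-axes and tilt angle of a 2D marginalized confidence ellipse using the standard textbook formulas. This is not the phase_space_cosmo_fisher API; the matrix values and parameter indices are made up.

```python
import numpy as np

def marginalized_ellipse(fisher, i, j, delta_chi2=2.30):
    """Semi-axes (a, b) and tilt angle (radians) of the 2D marginalized
    confidence ellipse for parameters i and j.  delta_chi2 = 2.30
    corresponds to the 68.3% region for two parameters.  Illustrative
    sketch only -- not the package's actual interface."""
    cov = np.linalg.inv(fisher)              # marginalization = inversion
    sxx, syy, sxy = cov[i, i], cov[j, j], cov[i, j]
    mean = 0.5 * (sxx + syy)
    diff = np.sqrt(0.25 * (sxx - syy) ** 2 + sxy ** 2)
    a = np.sqrt(delta_chi2 * (mean + diff))  # semi-major axis
    b = np.sqrt(delta_chi2 * (mean - diff))  # semi-minor axis
    angle = 0.5 * np.arctan2(2.0 * sxy, sxx - syy)
    return a, b, angle

# toy 3-parameter Fisher matrix
F = np.array([[40.0, -5.0,  2.0],
              [-5.0, 25.0, -3.0],
              [ 2.0, -3.0, 10.0]])
print(marginalized_ellipse(F, 0, 1))
```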

  6. Natural selection maximizes Fisher information.

    PubMed

    Frank, S A

    2009-02-01

    In biology, information flows from the environment to the genome by the process of natural selection. However, it has not been clear precisely what sort of information metric properly describes natural selection. Here, I show that Fisher information arises as the intrinsic metric of natural selection and evolutionary dynamics. Maximizing the amount of Fisher information about the environment captured by the population leads to Fisher's fundamental theorem of natural selection, the most profound statement about how natural selection influences evolutionary dynamics. I also show a relation between Fisher information and Shannon information (entropy) that may help to unify the correspondence between information and dynamics. Finally, I discuss possible connections between the fundamental role of Fisher information in statistics, biology and other fields of science.

  7. Band-Reweighed Gabor Kernel Embedding for Face Image Representation and Recognition.

    PubMed

    Ren, Chuan-Xian; Dai, Dao-Qing; Li, Xiao-Xin; Lai, Zhao-Rong

    2014-02-01

    Face recognition with illumination or pose variation is a challenging problem in image processing and pattern recognition. A novel algorithm using band-reweighed Gabor kernel embedding to deal with the problem is proposed in this paper. For a given image, it is first transformed by a group of Gabor filters, which output Gabor features using different orientation and scale parameters. A Fisher scoring function is used to measure the importance of features in each band, and then the features with the largest scores are preserved to reduce memory requirements. The reduced bands are combined by a vector, which is determined by a weighted kernel discriminant criterion and solved by a constrained quadratic programming method; the weighted sum of these nonlinear bands is then defined as the similarity between two images. Compared with existing concatenation-based Gabor feature representation and the uniformly weighted similarity calculation approaches, our method provides a new way to use Gabor features for face recognition and presents a reasonable interpretation for highlighting discriminant orientations and scales. The minimum Mahalanobis distance considering the spatial correlations within the data is exploited for feature matching, and the graphical lasso is used therein for directly estimating the sparse inverse covariance matrix. Experiments using benchmark databases show that our new algorithm improves the recognition results and obtains competitive performance.

  8. The Influence of Fractional Diffusion in Fisher-KPP Equations

    NASA Astrophysics Data System (ADS)

    Cabré, Xavier; Roquejoffre, Jean-Michel

    2013-06-01

    We study the Fisher-KPP equation where the Laplacian is replaced by the generator of a Feller semigroup with power decaying kernel, an important example being the fractional Laplacian. In contrast with the case of the standard Laplacian where the stable state invades the unstable one at constant speed, we prove that with fractional diffusion, generated for instance by a stable Lévy process, the front position is exponential in time. Our results provide a mathematically rigorous justification of numerous heuristics about this model.
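
    For context, a standard form of the equation studied above is given below, with s ∈ (0,1) the order of the fractional Laplacian; s = 1 recovers the classical case, whose fronts travel at the constant KPP speed c* = 2√(f′(0)) for f(u) = u(1−u). The exponential-in-time front law proved in the paper is not reproduced here.

```latex
% Fractional Fisher-KPP equation (standard form).
\partial_t u + (-\Delta)^s u = u(1-u), \qquad s \in (0,1).
```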

  9. Weighted Bergman kernels and virtual Bergman kernels

    NASA Astrophysics Data System (ADS)

    Roos, Guy

    2005-12-01

    We introduce the notion of "virtual Bergman kernel" and apply it to the computation of the Bergman kernel of "domains inflated by Hermitian balls", in particular when the base domain is a bounded symmetric domain.

  10. Flooded area cartography with kernel-based classifiers and Landsat TM imagery

    NASA Astrophysics Data System (ADS)

    Volpi, M.; Petropoulos, G. P.; Kanevski, M.

    2012-04-01

    Timely and accurate flooding extent maps for both emergency and recovery phases are required by scientists, local authorities and decision makers. In particular, the issue of reducing exposure by quantifying vulnerability to inundation has recently begun to be considered by European policies. Remote sensing can provide valuable information to this task, particularly over inaccessible regions. Provided that cloud-free conditions exist, multi-temporal optical images can be exploited for automatic cartography of the inundation. Image processing techniques based on kernels are promising tools in many remote sensing problems, ranging from biophysical parameter estimation to multi-temporal classification and change detection. The success of such methods is largely due to the explicit non-linear nature of the discriminant function and to their robustness to high-dimensional input spaces, such as those generated from remote sensing spectral bands. In our study, we examined the application of two supervised kernel-based classifiers for flooded area extraction from Landsat TM imagery. As a case study, we analyzed a region of the Missouri River in South Dakota, United States, in which images before and after a flood that took place in 2011 were available. In our approach, the mapping issue is recast as a change detection problem, whereby only the amount of water in excess of the permanent standing water was considered. Support Vector Machine (SVM) and Fisher's Linear Discriminant Analysis (LDA) classifications were applied successfully. Both classifiers were utilized in their linear and non-linear (kernel) versions. Evaluation of the ability of the two methods to delineate the flooding extent was conducted on the basis of classification accuracy assessment metrics as well as McNemar's statistical significance test. Our findings showed the suitability of the non-linear kernel extensions to accurately map the flood extent. Possible future developments of the methodology
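
    A minimal sketch of the classifier comparison described above is given below using scikit-learn, with synthetic stand-in data for the two-date band stacks; scikit-learn has no kernel LDA, so only the linear LDA variant appears alongside linear and RBF-kernel SVMs, and the feature dimensions and accuracies are illustrative, not the study's.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for pre/post-event Landsat TM band stacks: each pixel is
# a vector of spectral bands from both dates, labelled flooded (1) or not (0).
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("linear SVM", SVC(kernel="linear", C=1.0)),
                  ("RBF-kernel SVM", SVC(kernel="rbf", C=1.0, gamma="scale")),
                  ("linear LDA", LinearDiscriminantAnalysis())]:
    acc = clf.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: overall accuracy = {acc:.3f}")
```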

  11. FISHER GULCH ROADLESS AREA, CALIFORNIA.

    USGS Publications Warehouse

    Huber, Donald F.; Cather, Eric E.

    1984-01-01

    The Fisher Gulch Roadless Area occupies an area of about 5.2 sq mi near the Trinity Alps in the Klamath Mountains, about 10 mi northwest of Weaverville, California. On the basis of a study, the Fisher Gulch Roadless Area has a probable potential for small amounts of placer gold resources in a narrow elongate area along the northeast boundary. There is little promise for the occurrence of other metallic or nonmetallic resources, and the geologic terrane precludes the occurrence of fossil fuel resources.

  12. Quantum criticality from Fisher information

    NASA Astrophysics Data System (ADS)

    Song, Hongting; Luo, Shunlong; Fu, Shuangshuang

    2017-04-01

    Quantum phase transition is primarily characterized by a qualitative sudden change in the ground state of a quantum system when an external or internal parameter of the Hamiltonian is continuously varied. Investigating quantum criticality using information-theoretic methods has generated fruitful results. Quantum correlations and fidelity have been exploited to characterize the quantum critical phenomena. In this work, we employ quantum Fisher information to study quantum criticality. The singular or extremal point of the quantum Fisher information is adopted as the estimated thermal critical point. Using the model constructed in Quan et al. (Phys Rev Lett 96:140604, 2006), the effectiveness of this method is illustrated explicitly.

  13. "Fisher v. Texas": Strictly Disappointing

    ERIC Educational Resources Information Center

    Nieli, Russell K.

    2013-01-01

    Russell K. Nieli writes in this opinion paper that as far as the ability of state colleges and universities to use race as a criterion for admission goes, "Fisher v. Texas" was a big disappointment, and failed in the most basic way. Nieli states that although some affirmative action opponents have tried to put a more positive spin on the…

  14. On Fisher Information and Thermodynamics

    EPA Science Inventory

    Fisher information is a measure of the information obtainable by an observer from the observation of reality. However, information is obtainable only when there are patterns or features to observe, and these only exist when there is order. For example, a system in perfect disor...

  15. A novel fuzzy Fisher classifier for signal peptide prediction.

    PubMed

    Gao, Cui-Fang; Qiu, Zi-Xue; Wu, Xiao-Jun; Tian, Feng-Wei; Zhang, Hao; Chen, Wei

    2011-08-01

    Signal peptide recognition by bioinformatics approaches is particularly important for the efficient secretion and production of specific proteins. We concentrate on developing an integrated fuzzy Fisher clustering (IFFC) and designing a novel classifier based on IFFC for predicting secretory proteins. IFFC provides a powerful optimal discriminant vector calculated from the fuzzy intra-cluster and inter-cluster scatter matrices. Because the training samples and test samples are processed together in IFFC, it is convenient for users to employ their own specific samples of high reliability as training data if necessary. The cross-validation results on some existing datasets indicate that the fuzzy Fisher classifier is quite promising for signal peptide prediction.

  16. Semisupervised kernel matrix learning by kernel propagation.

    PubMed

    Hu, Enliang; Chen, Songcan; Zhang, Daoqiang; Yin, Xuesong

    2010-11-01

    The goal of semisupervised kernel matrix learning (SS-KML) is to learn a kernel matrix on all the given samples on which just a little supervised information, such as class label or pairwise constraint, is provided. Despite extensive research, the performance of SS-KML still leaves some space for improvement in terms of effectiveness and efficiency. For example, a recent pairwise constraints propagation (PCP) algorithm has formulated SS-KML into a semidefinite programming (SDP) problem, but its computation is very expensive, which undoubtedly restricts PCP's scalability in practice. In this paper, a novel algorithm, called kernel propagation (KP), is proposed to improve the comprehensive performance in SS-KML. The main idea of KP is first to learn a small-sized sub-kernel matrix (named seed-kernel matrix) and then propagate it into a larger-sized full-kernel matrix. Specifically, the implementation of KP consists of three stages: 1) separate the supervised sample (sub)set X(l) from the full sample set X; 2) learn a seed-kernel matrix on X(l) through solving a small-scale SDP problem; and 3) propagate the learnt seed-kernel matrix into a full-kernel matrix on X. Furthermore, following the idea in KP, we naturally develop two conveniently realizable out-of-sample extensions for KML: one is batch-style extension, and the other is online-style extension. The experiments demonstrate that KP is encouraging in both effectiveness and efficiency compared with three state-of-the-art algorithms and its related out-of-sample extensions are promising too.

  17. Approximate kernel competitive learning.

    PubMed

    Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang

    2015-03-01

    Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large-scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be calculated and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large-scale data sets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL), which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling would work for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large-scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL can perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision against related approximate clustering approaches.

  18. The arms race between fishers

    NASA Astrophysics Data System (ADS)

    Rijnsdorp, Adriaan D.; Poos, Jan Jaap; Quirijns, Floor J.; HilleRisLambers, Reinier; De Wilde, Jan W.; Den Heijer, Willem M.

    An analysis of the changes in the Dutch demersal fishing fleet since the 1950s revealed that competitive interactions among vessels and gear types within the constraints imposed by biological, economic and fisheries management factors are the dominant processes governing the dynamics of fishing fleets. Double beam trawling, introduced in the early 1960s, proved a successful fishing method to catch deep burying flatfish, in particular sole. In less than 10 years, the otter trawl fleet was replaced by a highly specialised beam trawling fleet, despite an initial doubling of the loss rate of vessels due to stability problems. Engine power, size of the beam trawl, number of tickler chains and fishing speed rapidly increased and fishing activities expanded into previously lightly fished grounds and seasons. Following the ban on flatfish trawling within the 12 nautical mile zone for vessels of more than 300 hp in 1975 and with the restriction of engine power to 2000 hp in 1987, the beam trawl fleet bifurcated. Changes in the fleet capacity were related to the economic results and showed a cyclic pattern with a period of 6-7 years. The arms race between fishers was fuelled by competitive interactions among fishers: while the catchability of the fleet more than doubled in the ten years following the introduction of the beam trawl, a decline in catchability was observed in reference beam trawlers that remained the same. Vessel performance was not only affected by the technological characteristics but also by the number and characteristics of competing vessels.

  19. Implementing Kernel Methods Incrementally by Incremental Nonlinear Projection Trick.

    PubMed

    Kwak, Nojun

    2016-05-20

    Recently, the nonlinear projection trick (NPT) was introduced, enabling direct computation of coordinates of samples in a reproducing kernel Hilbert space. With NPT, any machine learning algorithm can be extended to a kernel version without relying on the so-called kernel trick. However, NPT is inherently difficult to implement incrementally because an ever-increasing kernel matrix must be handled as additional training samples are introduced. In this paper, an incremental version of the NPT (INPT) is proposed based on the observation that the centerization step in NPT is unnecessary. Because the proposed INPT does not change the coordinates of the old data, the coordinates obtained by INPT can directly be used in any incremental methods to implement a kernel version of the incremental methods. The effectiveness of the INPT is shown by applying it to implement incremental versions of kernel methods such as kernel singular value decomposition, kernel principal component analysis, and kernel discriminant analysis, which are utilized for problems of kernel matrix reconstruction, letter classification, and face image retrieval, respectively.

  20. Seeing the Fisher Z-Transformation

    ERIC Educational Resources Information Center

    Bond, Charles F., Jr.; Richardson, Ken

    2004-01-01

    Since 1915, statisticians have been applying Fisher's Z-transformation to Pearson product-moment correlation coefficients. We offer new geometric interpretations of this transformation. (Contains 9 figures.)
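
    For reference, the transformation in question and its approximate standard error for a sample of size n are:

```latex
z \;=\; \operatorname{artanh}(r) \;=\; \tfrac{1}{2}\,\ln\!\frac{1+r}{1-r},
\qquad
\operatorname{SE}(z) \;\approx\; \frac{1}{\sqrt{n-3}}.
```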

  1. Multivariate acoustic detection of small explosions using Fisher's combined probability test.

    PubMed

    Arrowsmith, Stephen J; Taylor, Steven R

    2013-03-01

    A methodology for the combined acoustic detection and discrimination of explosions, which uses three discriminants, is developed for the purpose of identifying weak explosion signals embedded in complex background noise. By utilizing physical models for simple explosions that are formulated as statistical hypothesis tests, the detection/discrimination approach does not require a model for the background noise, which can be highly complex and variable in practice. Fisher's Combined Probability Test is used to combine the p-values from all multivariate discriminants. This framework is applied to acoustic data from a 400 g explosion conducted at Los Alamos National Laboratory.
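
    Fisher's combined probability test itself is compact: with k independent p-values, the statistic −2 Σ ln pᵢ follows a χ² distribution with 2k degrees of freedom under the joint null hypothesis. The snippet below shows both a hand-rolled version and SciPy's built-in routine; the p-values are made up for illustration and are not from the study.

```python
import numpy as np
from scipy import stats

# p-values from k independent discriminants applied to one candidate signal
p_values = np.array([0.04, 0.20, 0.11])

# Fisher's method: X^2 = -2 * sum(ln p_i) ~ chi^2 with 2k degrees of freedom
statistic = -2.0 * np.sum(np.log(p_values))
combined_p = stats.chi2.sf(statistic, df=2 * len(p_values))

# SciPy provides the same test directly
stat2, p2 = stats.combine_pvalues(p_values, method="fisher")
print(statistic, combined_p, stat2, p2)
```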

  2. Optimized Kernel Entropy Components.

    PubMed

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2016-02-25

    This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by means of compacting the information in very few features (often in just one or two). The proposed method produces features which have higher expressive power. In particular, it is based on the independent component analysis framework, and introduces an extra rotation to the eigen decomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both methods are illustrated in different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, the most successful rule for estimating the kernel parameter is based on maximum likelihood, and OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.

  3. Fisher Information, Sustainability, Development and Political Instability

    EPA Science Inventory

    Fisher information is a measure of order inherent in the time series data for any dynamic system. We have computed the Fisher Information for nation-states using data from 1960 to 1997 from the State Instability Task Force. We find that nation-states fall into two categories...

  4. Intelligent classification methods of grain kernels using computer vision analysis

    NASA Astrophysics Data System (ADS)

    Lee, Choon Young; Yan, Lei; Wang, Tianfeng; Lee, Sang Ryong; Park, Cheol Woo

    2011-06-01

    In this paper, a digital image analysis method was developed to classify seven kinds of individual grain kernels (common rice, glutinous rice, rough rice, brown rice, buckwheat, common barley and glutinous barley) widely planted in Korea. A total of 2800 color images of individual grain kernels were acquired as a data set. Seven color and ten morphological features were extracted and processed by linear discriminant analysis to improve the efficiency of the identification process. The output features from linear discriminant analysis were used as input to the four-layer back-propagation network to classify different grain kernel varieties. The data set was divided into three groups: 70% for training, 20% for validation, and 10% for testing the network. The classification experimental results show that the proposed method is able to classify the grain kernel varieties efficiently.
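
    A rough sketch of this pipeline (with the image feature extraction omitted) is shown below with scikit-learn: LDA reduces the 17 colour/morphological features to at most six discriminant axes for seven classes, and a small multilayer perceptron stands in for the paper's four-layer back-propagation network. The data are synthetic and the split is simplified to train/test only.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Stand-in for the 17 colour/morphological features of 2800 kernels spread
# over 7 grain classes (synthetic data, illustrative only).
X, y = make_classification(n_samples=2800, n_features=17, n_informative=10,
                           n_classes=7, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# LDA projects to at most (n_classes - 1) = 6 discriminant axes; the reduced
# features feed a small back-propagation network (an MLP with two hidden
# layers standing in for the paper's four-layer architecture).
model = make_pipeline(
    LinearDiscriminantAnalysis(n_components=6),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
)
print("test accuracy:", model.fit(X_tr, y_tr).score(X_te, y_te))
```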

  5. Gene Selection for Multiclass Prediction by Weighted Fisher Criterion

    PubMed Central

    2007-01-01

    Gene expression profiling has been widely used to study molecular signatures of many diseases and to develop molecular diagnostics for disease prediction. Gene selection, as an important step for improved diagnostics, screens tens of thousands of genes and identifies a small subset that discriminates between disease types. A two-step gene selection method is proposed to identify informative gene subsets for accurate classification of multiclass phenotypes. In the first step, individually discriminatory genes (IDGs) are identified by using one-dimensional weighted Fisher criterion (wFC). In the second step, jointly discriminatory genes (JDGs) are selected by sequential search methods, based on their joint class separability measured by multidimensional weighted Fisher criterion (wFC). The performance of the selected gene subsets for multiclass prediction is evaluated by artificial neural networks (ANNs) and/or support vector machines (SVMs). By applying the proposed IDG/JDG approach to two microarray studies, that is, small round blue cell tumors (SRBCTs) and muscular dystrophies (MDs), we successfully identified a much smaller yet efficient set of JDGs for diagnosing SRBCTs and MDs with high prediction accuracies (96.9% for SRBCTs and 92.3% for MDs, resp.). These experimental results demonstrated that the two-step gene selection method is able to identify a subset of highly discriminative genes for improved multiclass prediction. PMID:17713593
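
    Both selection steps rest on the Fisher criterion. Its standard (unweighted) form for a projection direction w is recalled below, where S_B and S_W are the between-class and within-class scatter matrices; the weighted variants (wFC) used in the record introduce class weights into these scatter matrices, and their exact form is not reproduced here.

```latex
J(\mathbf{w}) \;=\; \frac{\mathbf{w}^{\mathsf T} S_B\, \mathbf{w}}{\mathbf{w}^{\mathsf T} S_W\, \mathbf{w}},
\qquad
S_B = \sum_{c} n_c\, (\mathbf{m}_c - \mathbf{m})(\mathbf{m}_c - \mathbf{m})^{\mathsf T},
\qquad
S_W = \sum_{c} \sum_{\mathbf{x} \in c} (\mathbf{x} - \mathbf{m}_c)(\mathbf{x} - \mathbf{m}_c)^{\mathsf T}.
```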

  6. Iterative software kernels

    SciTech Connect

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: 'Current status of user level sparse BLAS'; 'Current status of the sparse BLAS toolkit'; and 'Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit'.

  7. Learning with Box Kernels.

    PubMed

    Melacci, Stefano; Gori, Marco

    2013-04-12

    Supervised examples and prior knowledge on regions of the input space have been profitably integrated in kernel machines to improve the performance of classifiers in different real-world contexts. The proposed solutions, which rely on the unified supervision of points and sets, have been mostly based on specific optimization schemes in which, as usual, the kernel function operates on points only. In this paper, arguments from variational calculus are used to support the choice of a special class of kernels, referred to as box kernels, which emerges directly from the choice of the kernel function associated with a regularization operator. It is proven that there is no need to search for kernels to incorporate the structure deriving from the supervision of regions of the input space, since the optimal kernel arises as a consequence of the chosen regularization operator. Although most of the given results hold for sets, we focus attention on boxes, whose labeling is associated with their propositional description. Based on different assumptions, some representer theorems are given which dictate the structure of the solution in terms of box kernel expansion. Successful results are given for problems of medical diagnosis, image, and text categorization.

  8. Learning with box kernels.

    PubMed

    Melacci, Stefano; Gori, Marco

    2013-11-01

    Supervised examples and prior knowledge on regions of the input space have been profitably integrated in kernel machines to improve the performance of classifiers in different real-world contexts. The proposed solutions, which rely on the unified supervision of points and sets, have been mostly based on specific optimization schemes in which, as usual, the kernel function operates on points only. In this paper, arguments from variational calculus are used to support the choice of a special class of kernels, referred to as box kernels, which emerges directly from the choice of the kernel function associated with a regularization operator. It is proven that there is no need to search for kernels to incorporate the structure deriving from the supervision of regions of the input space, because the optimal kernel arises as a consequence of the chosen regularization operator. Although most of the given results hold for sets, we focus attention on boxes, whose labeling is associated with their propositional description. Based on different assumptions, some representer theorems are given that dictate the structure of the solution in terms of box kernel expansion. Successful results are given for problems of medical diagnosis, image, and text categorization.

  9. Kernel Affine Projection Algorithms

    NASA Astrophysics Data System (ADS)

    Liu, Weifeng; Príncipe, José C.

    2008-12-01

    The combination of the famed kernel trick and affine projection algorithms (APAs) yields powerful nonlinear extensions, named collectively here, KAPA. This paper is a follow-up study of the recently introduced kernel least-mean-square algorithm (KLMS). KAPA inherits the simplicity and online nature of KLMS while reducing its gradient noise, boosting performance. More interestingly, it provides a unifying model for several neural network techniques, including kernel least-mean-square algorithms, kernel adaline, sliding-window kernel recursive-least squares (KRLS), and regularization networks. Therefore, many insights can be gained into the basic relations among them and the tradeoff between computation complexity and performance. Several simulations illustrate its wide applicability.

  10. General perspective view of the Fisher School Covered Bridge, view ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    General perspective view of the Fisher School Covered Bridge, view looking east along Five Rivers Road. - Fisher School Covered Bridge, Crab Creek Road at Five Rivers Road, Fisher, Lincoln County, OR

  11. General topographic view of the Fisher School Covered Bridge, view ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    General topographic view of the Fisher School Covered Bridge, view looking northwest from Crab Creek Road. - Fisher School Covered Bridge, Crab Creek Road at Five Rivers Road, Fisher, Lincoln County, OR

  12. Interior of the Fisher School Covered Bridge, view to north ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior of the Fisher School Covered Bridge, view to north showing road deck, guardrail, and Howe truss. - Fisher School Covered Bridge, Crab Creek Road at Five Rivers Road, Fisher, Lincoln County, OR

  13. General perspective view of the Fisher School Covered Bridge, view ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    General perspective view of the Fisher School Covered Bridge, view looking southwest from Five Rivers Road. - Fisher School Covered Bridge, Crab Creek Road at Five Rivers Road, Fisher, Lincoln County, OR

  14. LETTER: Fisher renormalization for logarithmic corrections

    NASA Astrophysics Data System (ADS)

    Kenna, Ralph; Hsu, Hsiao-Ping; von Ferber, Christian

    2008-10-01

    For continuous phase transitions characterized by power-law divergences, Fisher renormalization prescribes how to obtain the critical exponents for a system under constraint from their ideal counterparts. In statistical mechanics, such ideal behaviour at phase transitions is frequently modified by multiplicative logarithmic corrections. Here, Fisher renormalization for the exponents of these logarithms is developed in a general manner. As for the leading exponents, Fisher renormalization at the logarithmic level is seen to be involutory and the renormalized exponents obey the same scaling relations as their ideal analogues. The scheme is tested in lattice animals and the Yang-Lee problem at their upper critical dimensions, where predictions for logarithmic corrections are made.
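
    For orientation, the classical leading-order Fisher renormalization relations for a constrained system with unconstrained specific-heat exponent α > 0 are recalled below; the logarithmic-correction analogues derived in the letter are not reproduced here.

```latex
\alpha_X = \frac{-\alpha}{1-\alpha}, \qquad
\beta_X  = \frac{\beta}{1-\alpha}, \qquad
\gamma_X = \frac{\gamma}{1-\alpha}, \qquad
\nu_X    = \frac{\nu}{1-\alpha}.
```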

  15. Multiple collaborative kernel tracking.

    PubMed

    Fan, Zhimin; Yang, Ming; Wu, Ying

    2007-07-01

    Those motion parameters that cannot be recovered from image measurements are unobservable in the visual dynamic system. This paper studies this important issue of singularity in the context of kernel-based tracking and presents a novel approach that is based on a motion field representation which employs redundant but sparsely correlated local motion parameters instead of compact but uncorrelated global ones. This approach makes it easy to design fully observable kernel-based motion estimators. This paper shows that these high-dimensional motion fields can be estimated efficiently by the collaboration among a set of simpler local kernel-based motion estimators, which makes the new approach very practical.

  16. [Pathophysiology of Ataxia in Fisher Syndrome].

    PubMed

    Kuwabara, Satoshi

    2016-12-01

    Fisher syndrome is regarded as a peculiar inflammatory neuropathy associated with ophthalmoplegia, ataxia, and areflexia. The disorder is associated with preceding infection, cerebrospinal fluid albumino-cytological dissociation, and spontaneous recovery, and regarded as a variant of Guillain-Barré syndrome. The discovery of anti-GQ1b IgG antibodies led to dramatic advances in understanding the pathophysiology of Fisher syndrome. The lesions in Fisher syndrome are determined by expression of ganglioside GQ1b in the human nervous system. This review article focuses on the pathophysiology of ataxia in Fisher syndrome. Current evidence suggests that antibody attack on Group Ia neurons in the dorsal root ganglia is mainly responsible for the sensory ataxia. Involvement of the muscle spindles might also contribute to the development of ataxia.

  17. An evaluation of parturition indices in fishers

    USGS Publications Warehouse

    Frost, H.C.; York, E.C.; Krohn, W.B.; Elowe, K.D.; Decker, T.A.; Powell, S.M.; Fuller, T.K.

    1999-01-01

    Fishers (Martes pennanti) are important forest carnivores and furbearers that are susceptible to overharvest. Traditional indices used to monitor fisher populations typically overestimate litter size and proportion of females that give birth. We evaluated the usefulness of 2 indices of reproduction to determine proportion of female fishers that gave birth in a particular year. We used female fishers of known age and reproductive histories to compare appearance of placental scars with incidence of pregnancy and litter size. Microscopic observation of freshly removed reproductive tracts correctly identified pregnant fishers and correctly estimated litter size in 3 of 4 instances, but gross observation of placental scars failed to correctly identify pregnant fishers and litter size. Microscopic observations of reproductive tracts in carcasses that were not fresh also failed to identify pregnant animals and litter size. We evaluated mean sizes of anterior nipples to see if different reproductive classes could be distinguished. Mean anterior nipple size of captive and wild fishers correctly identified current-year breeders from nonbreeders. Former breeders were misclassified in 4 of 13 instances. Presence of placental scars accurately predicted parturition in a small sample size of fishers, but absence of placental scars did not signify that a female did not give birth. In addition to enabling the estimation of parturition rates in live animals more accurately than traditional indices, mean anterior nipple size also provided an estimate of the percentage of adult females that successfully raised young. Though using mean anterior nipple size to index reproductive success looks promising, additional data are needed to evaluate effects of using dried, stretched pelts on nipple size for management purposes.

  18. Fishers' knowledge and seahorse conservation in Brazil

    PubMed Central

    Rosa, Ierecê ML; Alves, Rômulo RN; Bonifácio, Kallyne M; Mourão, José S; Osório, Frederico M; Oliveira, Tacyana PR; Nottingham, Mara C

    2005-01-01

    From a conservationist perspective, seahorses are threatened fishes. Concomitantly, from a socioeconomic perspective, they represent a source of income to many fishing communities in developing countries. An integration between these two views requires, among other things, the recognition that seahorse fishers have knowledge and abilities that can assist the implementation of conservation strategies and of management plans for seahorses and their habitats. This paper documents the knowledge held by Brazilian fishers on the biology and ecology of the longsnout seahorse Hippocampus reidi. Its aims were to explore collaborative approaches to seahorse conservation and management in Brazil; to assess fishers' perception of seahorse biology and ecology, in the context of evaluating potential management options; to increase fishers' involvement with seahorse conservation in Brazil. Data were obtained through questionnaires and interviews made during field surveys conducted in fishing villages located in the States of Piauí, Ceará, Paraíba, Maranhão, Pernambuco and Pará. We consider the following aspects as positive for the conservation of seahorses and their habitats in Brazil: fishers were willing to dialogue with researchers; although captures and/or trade of brooding seahorses occurred, most interviewees recognized the importance of reproduction to the maintenance of seahorses in the wild (and therefore of their source of income), and expressed concern over population declines; fishers associated the presence of a ventral pouch with reproduction in seahorses (regardless of them knowing which sex bears the pouch), and this may facilitate the construction of collaborative management options designed to eliminate captures of brooding specimens; fishers recognized microhabitats of importance to the maintenance of seahorse wild populations; fishers who kept seahorses in captivity tended to recognize the conditions as poor, and as being a cause of seahorse mortality.

  19. Robotic Intelligence Kernel: Communications

    SciTech Connect

    Walton, Mike C.

    2009-09-16

    The INL Robotic Intelligence Kernel-Comms is the communication server that transmits information between one or more robots using the RIK and one or more user interfaces. It supports event handling and multiple hardware communication protocols.

  20. Robotic Intelligence Kernel: Driver

    SciTech Connect

    2009-09-16

    The INL Robotic Intelligence Kernel-Driver is built on top of the RIK-A and implements a dynamic autonomy structure. The RIK-D is used to orchestrate hardware for sensing and action as well as software components for perception, communication, behavior and world modeling into a single cognitive behavior kernel that provides intrinsic intelligence for a wide variety of unmanned ground vehicle systems.

  1. Nonparametric estimation of Fisher information from real data

    NASA Astrophysics Data System (ADS)

    Har-Shemesh, Omri; Quax, Rick; Miñano, Borja; Hoekstra, Alfons G.; Sloot, Peter M. A.

    2016-02-01

    The Fisher information matrix (FIM) is a widely used measure for applications including statistical inference, information geometry, experiment design, and the study of criticality in biological systems. The FIM is defined for a parametric family of probability distributions and its estimation from data follows one of two paths: either the distribution is assumed to be known and the parameters are estimated from the data or the parameters are known and the distribution is estimated from the data. We consider the latter case which is applicable, for example, to experiments where the parameters are controlled by the experimenter and a complicated relation exists between the input parameters and the resulting distribution of the data. Since we assume that the distribution is unknown, we use a nonparametric density estimation on the data and then compute the FIM directly from that estimate using a finite-difference approximation to estimate the derivatives in its definition. The accuracy of the estimate depends on both the method of nonparametric estimation and the difference Δ θ between the densities used in the finite-difference formula. We develop an approach for choosing the optimal parameter difference Δ θ based on large deviations theory and compare two nonparametric density estimation methods, the Gaussian kernel density estimator and a novel density estimation using field theory method. We also compare these two methods to a recently published approach that circumvents the need for density estimation by estimating a nonparametric f divergence and using it to approximate the FIM. We use the Fisher information of the normal distribution to validate our method and as a more involved example we compute the temperature component of the FIM in the two-dimensional Ising model and show that it obeys the expected relation to the heat capacity and therefore peaks at the phase transition at the correct critical temperature.
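
    For reference, the quantity being estimated and the relation that motivates finite-difference schemes of the kind described above are given below; the paper's specific choice of the optimal Δθ and the density-estimation-using-field-theory estimator are not reproduced here.

```latex
F_{ij}(\theta) \;=\; \mathbb{E}_{x \sim p(\cdot\,|\,\theta)}\!\left[
  \frac{\partial \ln p(x\,|\,\theta)}{\partial \theta_i}\,
  \frac{\partial \ln p(x\,|\,\theta)}{\partial \theta_j} \right],
\qquad
D_{\mathrm{KL}}\!\left( p_{\theta} \,\Vert\, p_{\theta + \Delta\theta} \right)
  \;\approx\; \tfrac{1}{2}\, \Delta\theta^{\mathsf T} F(\theta)\, \Delta\theta .
```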

  2. A fisher vector representation of GPR data for detecting buried objects

    NASA Astrophysics Data System (ADS)

    Karem, Andrew; Khalifa, Amine B.; Frigui, Hichem

    2016-05-01

    We present a new method, based on the Fisher Vector (FV), for detecting buried explosive objects using ground-penetrating radar (GPR) data. First, low-level dense SIFT features are extracted from a grid covering regions of interest (ROIs). ROIs are identified as regions with high energy along the (down-track, depth) dimensions of the 3-D GPR cube, or with high energy along the (cross-track, depth) dimensions. Next, we model the training data (in the SIFT feature space) by a mixture of Gaussian components. Then, we construct FV descriptors based on the Fisher Kernel. The Fisher Kernel characterizes low-level features from an ROI by their deviation from a generative model. The deviation is the gradient of the ROI log-likelihood with respect to the generative model parameters. The vectorial representation of all the deviations is called the Fisher Vector. FV is a generalization of the standard Bag of Words (BoW) method, which provides a framework to map a set of local descriptors to a global feature vector. It is more efficient to compute than the BoW since it relies on a significantly smaller codebook. In addition, mapping a GPR signature into one global feature vector using this technique makes it more efficient to classify using simple and fast linear classifiers such as Support Vector Machines. The proposed approach is applied to detect buried explosive objects using GPR data. The selected data were accumulated across multiple dates and multiple test sites by a vehicle-mounted mine detector (VMMD) using a GPR sensor. These data consist of a diverse set of conventional landmines and other buried explosive objects with varying shapes, metal content, and burial depths. The performance of the proposed approach is analyzed using receiver operating characteristics (ROC) and is compared to other state-of-the-art feature representation methods.

  3. Application of Fisher Information to Complex Dynamic Systems (Tucson)

    EPA Science Inventory

    Fisher information was developed by the statistician Ronald Fisher as a measure of the information obtainable from data being used to fit a related parameter. Starting from the work of Ronald Fisher [1] and B. Roy Frieden [2], we have developed Fisher information as a measure of order ...

  4. Application of Fisher Information to Complex Dynamic Systems

    EPA Science Inventory

    Fisher information was developed by the statistician Ronald Fisher as a measure of the information obtainable from data being used to fit a related parameter. Starting from the work of Ronald Fisher [1] and B. Roy Frieden [2], we have developed Fisher information as a measure of order ...

  5. UNICOS Kernel Internals Application Development

    NASA Technical Reports Server (NTRS)

    Caredo, Nicholas; Craw, James M. (Technical Monitor)

    1995-01-01

    Having an understanding of UNICOS Kernel Internals is valuable. However, having the knowledge is only half the value. The second half comes with knowing how to use this information and apply it to the development of tools. The kernel contains vast amounts of useful information that can be utilized. This paper discusses the intricacies of developing utilities that utilize kernel information. In addition, algorithms, logic, and code will be discussed for accessing kernel information. Code segments will be provided that demonstrate how to locate and read kernel structures. Types of applications that can utilize kernel information will also be discussed.

  6. Spectrum-based kernel length estimation for Gaussian process classification.

    PubMed

    Wang, Liang; Li, Chuan

    2014-06-01

    Recent studies have shown that Gaussian process (GP) classification, a discriminative supervised learning approach, has achieved competitive performance in real applications compared with most state-of-the-art supervised learning methods. However, the problem of automatic model selection in GP classification, involving the kernel function form and the corresponding parameter values (which are unknown in advance), remains a challenge. To make GP classification a more practical tool, this paper presents a novel spectrum analysis-based approach for model selection by refining the GP kernel function to match the given input data. Specifically, we target the problem of GP kernel length scale estimation. Spectrums are first calculated analytically from the kernel function itself using the autocorrelation theorem as well as being estimated numerically from the training data themselves. Then, the kernel length scale is automatically estimated by equating the two spectrum values, i.e., the kernel function spectrum equals to the estimated training data spectrum. Compared with the classical Bayesian method for kernel length scale estimation via maximizing the marginal likelihood (which is time consuming and could suffer from multiple local optima), extensive experimental results on various data sets show that our proposed method is both efficient and accurate.

  7. Kernel mucking in top

    SciTech Connect

    LeFebvre, W.

    1994-08-01

    For many years, the popular program top has aided system administrators in the examination of process resource usage on their machines. Yet few are familiar with the techniques involved in obtaining this information. Most of what is displayed by top is available only in the dark recesses of kernel memory. Extracting this information requires familiarity not only with how bytes are read from the kernel, but also with what data needs to be read. The wide variety of systems and variants of the Unix operating system in today's marketplace makes writing such a program very challenging. This paper explores the tremendous diversity in kernel information across the many platforms and the solutions employed by top to achieve and maintain ease of portability in the presence of such divergent systems.

  8. Anytime query-tuned kernel machine classifiers via Cholesky factorization

    NASA Technical Reports Server (NTRS)

    DeCoste, D.

    2002-01-01

    We recently demonstrated 2 to 64-fold query-time speedups of Support Vector Machine and Kernel Fisher classifiers via a new computational geometry method for anytime output bounds (DeCoste, 2002). This new paper refines our approach in two key ways. First, we introduce a simple linear algebra formulation based on Cholesky factorization, yielding simpler equations and lower computational overhead. Second, this new formulation suggests new methods for achieving additional speedups, including tuning on query samples. We demonstrate effectiveness on benchmark datasets.

  9. Minimum classification error-based weighted support vector machine kernels for speaker verification.

    PubMed

    Suh, Youngjoo; Kim, Hoirin

    2013-04-01

    Support vector machines (SVMs) have been proved to be an effective approach to speaker verification. An appropriate selection of the kernel function is a key issue in SVM-based classification. In this letter, a new SVM-based speaker verification method utilizing weighted kernels in the Gaussian mixture model supervector space is proposed. The weighted kernels are derived by using the discriminative training approach, which minimizes speaker verification errors. Experiments performed on the NIST 2008 speaker recognition evaluation task showed that the proposed approach provides substantially improved performance over the baseline kernel-based method.

  10. H. A. L. Fisher: Scholar and Minister

    ERIC Educational Resources Information Center

    Judge, Harry

    2006-01-01

    H. A. L. Fisher came from an influential family, studied at Oxford and in France and Germany, and became an Oxford academic with a strong interest in public affairs. In 1912 he became Vice-Chancellor of Sheffield University and four years later was recruited by the new British Prime Minister to become his Minister of Education. In that office he…

  11. EXERGY AND FISHER INFORMATION AS ECOLOGICAL INDEXES

    EPA Science Inventory

    Ecological indices are used to provide summary information about a particular aspect of ecosystem behavior. Many such indices have been proposed and here we investigate two: exergy and Fisher Information. Exergy, a thermodynamically based index, is a measure of maximum amount o...

  12. FISHER INFORMATION AND ECOSYSTEM REGIME CHANGES

    EPA Science Inventory

    Following Fisher’s work, we propose two different expressions for the Fisher Information along with Shannon Information as a means of detecting and assessing shifts between alternative ecosystem regimes. Regime shifts are a consequence of bifurcations in the dynamics of an ecosys...

  13. Robotic Intelligence Kernel: Architecture

    SciTech Connect

    2009-09-16

    The INL Robotic Intelligence Kernel Architecture (RIK-A) is a multi-level architecture that supports a dynamic autonomy structure. The RIK-A is used to coalesce hardware for sensing and action as well as software components for perception, communication, behavior and world modeling into a framework that can be used to create behaviors for humans to interact with the robot.

  14. Robotic Intelligence Kernel: Visualization

    SciTech Connect

    2009-09-16

    The INL Robotic Intelligence Kernel-Visualization is the software that supports the user interface. It uses the RIK-C software to communicate information to and from the robot. The RIK-V illustrates the data in a 3D display and provides an operating picture wherein the user can task the robot.

  15. HMM-Fisher: identifying differential methylation using a hidden Markov model and Fisher's exact test.

    PubMed

    Sun, Shuying; Yu, Xiaoqing

    2016-03-01

    DNA methylation is an epigenetic event that plays an important role in regulating gene expression. It is important to study DNA methylation, especially differential methylation patterns between two groups of samples (e.g. patients vs. normal individuals). With next-generation sequencing (NGS) technologies, it is now possible to identify differential methylation patterns by considering methylation at the single CG site level in an entire genome. However, it is challenging to analyze large and complex NGS data. In order to address this difficult question, we have developed a new statistical method using a hidden Markov model and Fisher's exact test (HMM-Fisher) to identify differentially methylated cytosines and regions. We first use a hidden Markov chain to model the methylation signals to infer the methylation state as Not methylated (N), Partly methylated (P), and Fully methylated (F) for each individual sample. We then use Fisher's exact test to identify differentially methylated CG sites. We demonstrate the HMM-Fisher method and compare it with commonly cited methods using both simulated data and real sequencing data. The results show that HMM-Fisher outperforms the currently available methods with which it was compared. HMM-Fisher is efficient and robust in identifying heterogeneous differentially methylated (DM) regions.
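
    The second stage of the method is a per-site Fisher's exact test on methylation counts. The snippet below illustrates only that stage with SciPy, using made-up read counts for a single CG site; the HMM state-calling step is not shown.

```python
from scipy.stats import fisher_exact

# Hypothetical read counts at one CG site, pooled within each group after
# the HMM step has assigned per-sample methylation states:
#                 methylated   unmethylated
# tumour group        18             4
# normal group         6            20
table = [[18, 4],
         [6, 20]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4g}")
```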

  16. Optimizing spatial filters with kernel methods for BCI applications

    NASA Astrophysics Data System (ADS)

    Zhang, Jiacai; Tang, Jianjun; Yao, Li

    2007-11-01

    Brain Computer Interface (BCI) is a communication or control system in which the user's messages or commands do not depend on the brain's normal output channels. The key step of BCI technology is to find a reliable method to detect particular brain signals, such as the alpha, beta and mu components in EEG/ECoG trials, and then translate them into usable control signals. In this paper, our objective is to introduce a novel approach that is able to extract the discriminative pattern from non-stationary EEG signals based on common spatial patterns (CSP) analysis combined with kernel methods. The basic idea of our kernel CSP method is performing a nonlinear form of CSP by the use of kernel methods that can efficiently compute the common and distinct components in high-dimensional feature spaces related to the input space by some nonlinear map. The algorithm described here is tested off-line with dataset I from the BCI Competition 2005. Our experiments show that the spatial filters employed with kernel CSP can effectively extract discriminatory information from single-trial ECoG recorded during imagined movements. The high recognition rates and the computational simplicity of the kernel trick make it a promising method for BCI systems.
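
    For concreteness, the sketch below implements ordinary (linear) CSP as a generalized eigendecomposition of the two class covariance matrices; the kernel CSP described above applies the same construction after a nonlinear feature-space mapping, which is not shown. The toy EEG-like arrays and filter count are arbitrary.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_filters=3):
    """Ordinary (linear) CSP.  trials_* have shape (n_trials, n_channels,
    n_samples); returns spatial filters whose projections maximize variance
    for one class while minimizing it for the other."""
    def mean_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)  # channel covariances

    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenproblem  Ca w = lambda (Ca + Cb) w
    eigvals, eigvecs = eigh(Ca, Ca + Cb)
    order = np.argsort(eigvals)                 # smallest and largest eigenvalues
    picks = np.concatenate([order[:n_filters], order[-n_filters:]])
    return eigvecs[:, picks]

# toy EEG-like data: 20 trials per class, 8 channels, 200 samples
rng = np.random.default_rng(0)
a = rng.standard_normal((20, 8, 200))
b = rng.standard_normal((20, 8, 200)) * np.linspace(0.5, 2.0, 8)[None, :, None]
W = csp_filters(a, b)
print(W.shape)   # (8, 6): six spatial filters
```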

  17. Kernel-aligned multi-view canonical correlation analysis for image recognition

    NASA Astrophysics Data System (ADS)

    Su, Shuzhi; Ge, Hongwei; Yuan, Yun-Hao

    2016-09-01

    Existing kernel-based correlation analysis methods mainly adopt a single kernel in each view. However, a single kernel is usually insufficient to characterize the nonlinear distribution information of a view. To solve the problem, we transform each original feature vector into a 2-dimensional feature matrix by means of kernel alignment, and then propose a novel kernel-aligned multi-view canonical correlation analysis (KAMCCA) method on the basis of the feature matrices. Our proposed method can simultaneously employ multiple kernels to better capture the nonlinear distribution information of each view, so that correlation features learned by KAMCCA can have good discriminating power in real-world image recognition. Extensive experiments are designed on five real-world image datasets, including NIR face images, thermal face images, visible face images, handwritten digit images, and object images. Promising experimental results on the datasets have demonstrated the effectiveness of our proposed method.

  18. Differential evolution algorithm-based kernel parameter selection for Fukunaga-Koontz Transform subspaces construction

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin

    2015-10-01

    The performance of kernel-based techniques depends on the selection of kernel parameters, so suitable parameter selection is an important problem for many such methods. This article presents a novel technique for learning the kernel parameters of a kernel Fukunaga-Koontz transform (KFKT)-based classifier. The proposed approach determines appropriate kernel parameter values by optimizing an objective function constructed from the discrimination ability of KFKT. For this purpose, a differential evolution algorithm (DEA) is utilized. The new technique overcomes disadvantages of the traditional cross-validation method, such as its high time consumption, and it can be applied to any type of data. Experiments on target detection in hyperspectral images verify the effectiveness of the proposed method.
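
    A rough sketch of evolutionary kernel-parameter selection is given below using SciPy's differential evolution. An RBF-kernel SVM scored by cross-validated accuracy stands in for the KFKT classifier and its discrimination-based objective, so this illustrates only the search strategy, not the paper's actual method; the data, bounds, and iteration budget are arbitrary.

```python
from scipy.optimize import differential_evolution
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic classification data standing in for the hyperspectral targets.
X, y = make_classification(n_samples=400, n_features=20, random_state=0)

def neg_cv_accuracy(params):
    # Search in log10 space for C and gamma of an RBF-kernel SVM.
    C, gamma = 10.0 ** params[0], 10.0 ** params[1]
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    return -cross_val_score(clf, X, y, cv=3).mean()

result = differential_evolution(neg_cv_accuracy, bounds=[(-2, 3), (-4, 1)],
                                maxiter=15, seed=0, tol=1e-3)
print("best log10(C), log10(gamma):", result.x, "accuracy:", -result.fun)
```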

  19. Multiple Kernel Point Set Registration.

    PubMed

    Nguyen, Thanh Minh; Wu, Q M Jonathan

    2015-12-22

    The finite Gaussian mixture model with kernel correlation is a flexible tool that has recently received attention for point set registration. While many algorithms for point set registration have been presented in the literature, an important issue arising from these studies concerns the mapping of data with nonlinear relationships and the ability to select a suitable kernel. Kernel selection is crucial for effective point set registration, and we focus here on multiple kernel point set registration. We make several contributions in this paper. First, each observation is modeled using the Student's t-distribution, which is heavy-tailed and more robust than the Gaussian distribution. Second, by automatically adjusting the kernel weights, the proposed method allows us to prune the ineffective kernels: after parameter learning, the kernel saliencies of the irrelevant kernels go to zero. This makes the choice of kernels less crucial and makes it easy to include other kinds of kernels. Finally, we show empirically that our model outperforms state-of-the-art methods recently proposed in the literature.

  1. Kernel methods for phenotyping complex plant architecture.

    PubMed

    Kawamura, Koji; Hibrand-Saint Oyant, Laurence; Foucher, Fabrice; Thouroude, Tatiana; Loustau, Sébastien

    2014-02-07

    The Quantitative Trait Loci (QTL) mapping of plant architecture is a critical step for understanding the genetic determinism of plant architecture. Previous studies adopted simple measurements, such as plant height, stem diameter and branching intensity, for QTL mapping of plant architecture. Many of these quantitative traits are generally correlated with each other, which gives rise to statistical problems in the detection of QTLs. We aim to test the applicability of kernel methods to phenotyping inflorescence architecture and its QTL mapping. We first test Kernel Principal Component Analysis (KPCA) and Support Vector Machines (SVM) on an artificial dataset of simulated inflorescences with different types of flower distribution, each coded as a sequence of flower number per node along a shoot. The ability of SVM and KPCA to discriminate the different inflorescence types is illustrated. We then apply the KPCA representation to a real dataset of rose inflorescence shoots (n=1460) obtained from a mapping population of 98 F1 hybrids. We find kernel principal components with high heritability (>0.7), and the QTL analysis identifies a new QTL that was not detected by a trait-by-trait analysis of simple architectural measurements. The main tools developed in this paper could be used to tackle the general problem of QTL mapping of complex (sequences, 3D structures, graphs) phenotypic traits.
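
    A minimal sketch of the KPCA-plus-SVM idea on toy data is shown below; fixed-length flower-count vectors and an RBF kernel stand in for the sequence representation and kernel used in the study.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Toy data: each "inflorescence" is a fixed-length vector of flower counts per node
rng = np.random.default_rng(2)
type_a = rng.poisson(lam=np.linspace(3, 0.5, 15), size=(60, 15))  # flowers near the base
type_b = rng.poisson(lam=np.linspace(0.5, 3, 15), size=(60, 15))  # flowers near the tip
X = np.vstack([type_a, type_b]).astype(float)
y = np.r_[np.zeros(60, dtype=int), np.ones(60, dtype=int)]

# Kernel principal components: candidate quantitative traits for QTL mapping
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.05)
scores = kpca.fit_transform(X)            # shape (120, 5), one column per component

# SVM with the same kernel to check that the two types are separable
acc = cross_val_score(SVC(kernel="rbf", gamma=0.05), X, y, cv=5).mean()
print(scores.shape, "cv accuracy:", round(acc, 2))
```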

  2. Fisher classifier and its probability of error estimation

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
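
    The sketch below illustrates the two-class Fisher projection with a midpoint threshold and a brute-force leave-one-out error estimate; the paper itself derives closed-form expressions for the leave-one-out error and an optimal threshold, which are not reproduced here.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher discriminant direction w = Sw^{-1} (mu1 - mu2)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # np.cov uses ddof=1, so multiplying by (n-1) recovers the within-class scatter
    Sw = np.cov(X1, rowvar=False) * (len(X1) - 1) + np.cov(X2, rowvar=False) * (len(X2) - 1)
    return np.linalg.solve(Sw, m1 - m2)

def loo_error(X1, X2):
    """Leave-one-out error of the projected classifier with a midpoint threshold."""
    X = np.vstack([X1, X2])
    y = np.r_[np.zeros(len(X1)), np.ones(len(X2))]
    errors = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        A, B = X[mask][y[mask] == 0], X[mask][y[mask] == 1]
        w = fisher_direction(A, B)
        thr = 0.5 * (w @ A.mean(axis=0) + w @ B.mean(axis=0))  # midpoint of projected means
        pred = 0 if w @ X[i] > thr else 1
        errors += pred != y[i]
    return errors / len(X)

rng = np.random.default_rng(3)
X1 = rng.normal(0, 1, (40, 4))
X2 = rng.normal(1, 1, (40, 4))
print("LOO error:", loo_error(X1, X2))
```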

  3. Fisher equation for a decaying brane

    NASA Astrophysics Data System (ADS)

    Ghoshal, Debashis

    2011-12-01

    We consider the inhomogeneous decay of an unstable D-brane. The dynamical equation that describes this process (in light-cone time) is a variant of the non-linear reaction-diffusion equation that first made its appearance in the pioneering work of (Luther and) Fisher and appears in a variety of natural phenomena. We analyze its travelling front solution using singular perturbation theory.
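
    For reference, the classic (local) Fisher-KPP equation and its travelling-front ansatz take the form below; the symbols are generic, and the light-cone-time variant analysed for the decaying brane differs in detail.

```latex
\[
\partial_t u \;=\; D\,\partial_x^{2} u \;+\; r\,u\,(1-u),
\qquad
u(x,t) = U(x - vt),
\qquad
v_{\min} = 2\sqrt{rD},
\]
```

    with fronts connecting the unstable state u = 0 to the stable state u = 1.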

  4. A theory of Fisher's reproductive value.

    PubMed

    Grafen, Alan

    2006-07-01

    The formal Darwinism project aims to provide a mathematically rigorous basis for optimisation thinking in relation to natural selection. This paper deals with the situation in which individuals in a population belong to classes, such as sexes, or size and/or age classes. Fisher introduced the concept of reproductive value into biology to help analyse evolutionary processes of populations divided into classes. Here a rigorously defined and very general structure justifies, and shows the unity of concept behind, Fisher's uses of reproductive value as measuring the significance for evolutionary processes of (i) an individual and (ii) a class; (iii) recursively, as calculable for a parent as a sum of its shares in the reproductive values of its offspring; and (iv) as an evolutionary maximand under natural selection. The maximand is the same for all parental classes, and is a weighted sum of offspring numbers, which implies that a tradeoff in one aspect of the phenotype can legitimately be studied separately from other aspects. The Price equation, measure theory, Markov theory and positive operators contribute to the framework, which is then applied to a number of examples, including a new and fully rigorous version of Fisher's sex ratio argument. Classes may be discrete (e.g. sex), continuous (e.g. weight at fledging) or multidimensional with discrete and continuous components (e.g. sex and weight at fledging and adult tarsus length).

  5. Canonical energy is quantum Fisher information

    NASA Astrophysics Data System (ADS)

    Lashkari, Nima; Van Raamsdonk, Mark

    2016-04-01

    In quantum information theory, Fisher Information is a natural metric on the space of perturbations to a density matrix, defined by calculating the relative entropy with the unperturbed state at quadratic order in perturbations. In gravitational physics, Canonical Energy defines a natural metric on the space of perturbations to spacetimes with a Killing horizon. In this paper, we show that the Fisher information metric for perturbations to the vacuum density matrix of a ball-shaped region B in a holographic CFT is dual to the canonical energy metric for perturbations to a corresponding Rindler wedge R B of Anti-de-Sitter space. Positivity of relative entropy at second order implies that the Fisher information metric is positive definite. Thus, for physical perturbations to anti-de-Sitter spacetime, the canonical energy associated to any Rindler wedge must be positive. This second-order constraint on the metric extends the first order result from relative entropy positivity that physical perturbations must satisfy the linearized Einstein's equations.

  6. Sir Ronald A. Fisher and the International Biometric Society.

    PubMed

    Billard, Lynne

    2014-06-01

    The year 2012 marks the 50th anniversary of the death of Sir Ronald A. Fisher, one of the two Fathers of Statistics and a Founder of the International Biometric Society (the "Society"). To celebrate the extraordinary genius of Fisher and the far-sighted vision of Fisher and Chester Bliss in organizing and promoting the formation of the Society, this article looks at the origins and growth of the Society, some of the key players and events, and especially the roles played by Fisher himself as the First President. A fresh look at Fisher, the man rather than the scientific genius is also presented.

  7. Kernel machine SNP-set testing under multiple candidate kernels.

    PubMed

    Wu, Michael C; Maity, Arnab; Lee, Seunggeun; Simmons, Elizabeth M; Harmon, Quaker E; Lin, Xinyi; Engel, Stephanie M; Molldrem, Jeffrey J; Armistead, Paul M

    2013-04-01

    Joint testing for the cumulative effect of multiple single-nucleotide polymorphisms grouped on the basis of prior biological knowledge has become a popular and powerful strategy for the analysis of large-scale genetic association studies. The kernel machine (KM)-testing framework is a useful approach that has been proposed for testing associations between multiple genetic variants and many different types of complex traits by comparing pairwise similarity in phenotype between subjects to pairwise similarity in genotype, with similarity in genotype defined via a kernel function. An advantage of the KM framework is its flexibility: choosing different kernel functions allows for different assumptions concerning the underlying model and can allow for improved power. In practice, it is difficult to know which kernel to use a priori because this depends on the unknown underlying trait architecture and selecting the kernel which gives the lowest P-value can lead to inflated type I error. Therefore, we propose practical strategies for KM testing when multiple candidate kernels are present based on constructing composite kernels and based on efficient perturbation procedures. We demonstrate through simulations and real data applications that the procedures protect the type I error rate and can lead to substantially improved power over poor choices of kernels and only modest differences in power vs. using the best candidate kernel.
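
    The sketch below shows how candidate SNP-set kernels (linear and identity-by-state) can be combined into a composite kernel; the 0/1/2 genotype coding and the unweighted average are illustrative assumptions, and the variance-component score test and perturbation procedures from the paper are not reproduced.

```python
import numpy as np

def linear_kernel(G):
    """Linear (additive) kernel from a genotype matrix G (subjects x SNPs, coded 0/1/2)."""
    return G @ G.T / G.shape[1]

def ibs_kernel(G):
    """Identity-by-state kernel: average IBS sharing across SNPs."""
    diff = np.abs(G[:, None, :] - G[None, :, :])      # pairwise allele-count differences
    return 1.0 - diff.mean(axis=2) / 2.0

def composite_kernel(kernels, weights=None):
    """Convex combination of candidate kernels (unweighted average by default)."""
    weights = np.ones(len(kernels)) / len(kernels) if weights is None else np.asarray(weights)
    return sum(w * K for w, K in zip(weights, kernels))

rng = np.random.default_rng(4)
G = rng.integers(0, 3, size=(30, 50)).astype(float)   # 30 subjects, 50 SNPs
K = composite_kernel([linear_kernel(G), ibs_kernel(G)])
print(K.shape)
```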

  8. A Global Estimate of the Number of Coral Reef Fishers.

    PubMed

    Teh, Louise S L; Teh, Lydia C L; Sumaila, U Rashid

    2013-01-01

    Overfishing threatens coral reefs worldwide, yet there is no reliable estimate on the number of reef fishers globally. We address this data gap by quantifying the number of reef fishers on a global scale, using two approaches - the first estimates reef fishers as a proportion of the total number of marine fishers in a country, based on the ratio of reef-related to total marine fish landed values. The second estimates reef fishers as a function of coral reef area, rural coastal population, and fishing pressure. In total, we find that there are 6 million reef fishers in 99 reef countries and territories worldwide, of which at least 25% are reef gleaners. Our estimates are an improvement over most existing fisher population statistics, which tend to omit accounting for gleaners and reef fishers. Our results suggest that slightly over a quarter of the world's small-scale fishers fish on coral reefs, and half of all coral reef fishers are in Southeast Asia. Coral reefs evidently support the socio-economic well-being of numerous coastal communities. By quantifying the number of people who are employed as reef fishers, we provide decision-makers with an important input into planning for sustainable coral reef fisheries at the appropriate scale.

  9. 7 CFR 51.1415 - Inedible kernels.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Inedible kernels. 51.1415 Section 51.1415 Agriculture... Standards for Grades of Pecans in the Shell 1 Definitions § 51.1415 Inedible kernels. Inedible kernels means that the kernel or pieces of kernels are rancid, moldy, decayed, injured by insects or...

  10. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Inedible kernel. 981.408 Section 981.408 Agriculture... Administrative Rules and Regulations § 981.408 Inedible kernel. Pursuant to § 981.8, the definition of inedible kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored...

  11. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Inedible kernel. 981.8 Section 981.8 Agriculture... Regulating Handling Definitions § 981.8 Inedible kernel. Inedible kernel means a kernel, piece, or particle of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel,...

  12. Fisher-Mendel controversy in genetics: scientific argument, intellectual integrity, a fair society, Western falls and bioethical evaluation.

    PubMed

    Tang, Bing H

    2009-10-01

    This review article aims to discuss and analyze the background and findings of the Fisher-Mendel controversy in genetics and to elucidate the scientific argument and intellectual integrity involved, as well as their importance in a fair society and the lessons to be learned from the decline of the West. At the outset of the review, the kernel of the Mendel-Fisher controversy is dissected and identified. The article then documents an organizational restructuring that never reached a happy synchronization in the years following 1933, when Fisher succeeded Karl Pearson not only as the Francis Galton Professor of Eugenics but also as head of the Galton Laboratory at University College London. The academic style of eugenics in the late 19th and early 20th centuries in the UK is then introduced. Fisher's ideology at that time, and its effects on the human value system and policy-making at that juncture, are portrayed. A bioethical assessment is provided. Lessons from history, the emergence of the Eastern phenomenon and the decline of Western power are outlined.

  13. Fisher, Neyman, and Bayes at FDA.

    PubMed

    Rubin, Donald B

    2016-01-01

    The wise use of statistical ideas in practice essentially requires some Bayesian thinking, in contrast to the classical rigid frequentist dogma. This dogma too often has seemed to influence the applications of statistics, even at agencies like the FDA. Greg Campbell was one of the most important advocates there for more nuanced modes of thought, especially Bayesian statistics. Because two brilliant statisticians, Ronald Fisher and Jerzy Neyman, are often credited with instilling the traditional frequentist approach in current practice, I argue that both men were actually seeking very Bayesian answers, and neither would have endorsed the rigid application of their ideas.

  14. Kernel phase and kernel amplitude in Fizeau imaging

    NASA Astrophysics Data System (ADS)

    Pope, Benjamin J. S.

    2016-12-01

    Kernel phase interferometry is an approach to high angular resolution imaging which enhances the performance of speckle imaging with adaptive optics. Kernel phases are self-calibrating observables that generalize the idea of closure phases from non-redundant arrays to telescopes with arbitrarily shaped pupils, by considering a matrix-based approximation to the diffraction problem. In this paper I discuss the recent history of kernel phase, in particular in the matrix-based study of sparse arrays, and propose an analogous generalization of the closure amplitude to kernel amplitudes. This new approach can self-calibrate throughput and scintillation errors in optical imaging, which extends the power of kernel phase-like methods to symmetric targets where amplitude and not phase calibration can be a significant limitation, and will enable further developments in high angular resolution astronomy.
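
    A minimal numerical sketch of the kernel-phase idea follows: kernel phases are linear combinations lying in the left null space of the phase transfer matrix, so pupil-plane instrumental phase errors cancel. The transfer matrix below is a random placeholder; in practice it would be built from a discretized pupil model.

```python
import numpy as np

def kernel_operator(A, tol=1e-10):
    """Left null space of the phase transfer matrix A: rows K with K @ A = 0."""
    U, s, Vt = np.linalg.svd(A)
    rank = np.sum(s > tol * s.max())
    return U[:, rank:].T                    # self-calibrating (kernel) combinations

# Placeholder transfer matrix standing in for the pupil-to-Fourier phase model
rng = np.random.default_rng(5)
A = rng.normal(size=(40, 12))               # 40 Fourier phases, 12 pupil-plane phases

K = kernel_operator(A)
pupil_errors = rng.normal(size=12)          # instrumental (pupil-plane) phase errors
measured = A @ pupil_errors                 # Fourier phases corrupted by the instrument
print(np.allclose(K @ measured, 0))         # kernel phases cancel the instrumental term
```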

  15. Studying Cerebral Vasculature Using Structure Proximity and Graph Kernels

    PubMed Central

    Kwitt, Roland; Pace, Danielle; Niethammer, Marc; Aylward, Stephen

    2014-01-01

    An approach to study population differences in cerebral vasculature is proposed. This is done by 1) extending the concept of encoding cerebral blood vessel networks as spatial graphs and 2) quantifying graph similarity in a kernel-based discriminant classifier setup. We argue that augmenting graph vertices with information about their proximity to selected brain structures adds discriminative information and consequently leads to a more expressive encoding. Using graph kernels then allows us to quantify graph similarity in a principled way. To demonstrate our approach, we assess the hypothesis that gender differences manifest as variations in the architecture of cerebral blood vessels, an observation that previously had only been tested and confirmed for the Circle of Willis. Our results strongly support this hypothesis, i.e., we can demonstrate non-trivial, statistically significant deviations from random gender classification in a cross-validation setup on 40 healthy patients. PMID:24579182

  16. The Adaptive Kernel Neural Network

    DTIC Science & Technology

    1989-10-01

    A neural network architecture for clustering and classification is described. The Adaptive Kernel Neural Network (AKNN) is a density estimation...classification layer. The AKNN retains the inherent parallelism common in neural network models. Its relationship to the kernel estimator allows the network to

  17. Manifold Kernel Sparse Representation of Symmetric Positive-Definite Matrices and Its Applications.

    PubMed

    Wu, Yuwei; Jia, Yunde; Li, Peihua; Zhang, Jian; Yuan, Junsong

    2015-11-01

    Symmetric positive-definite (SPD) matrices, which form a connected Riemannian manifold, have become increasingly popular for encoding image information. Most existing sparse models are still primarily developed in the Euclidean space. They do not consider the nonlinear geometrical structure of the data space, and thus are not directly applicable to the Riemannian manifold. In this paper, we propose a novel sparse representation method for SPD matrices in a data-dependent manifold kernel space. The graph Laplacian is incorporated into the kernel space to better reflect the underlying geometry of SPD matrices. Under the proposed framework, we design two different positive definite kernel functions that can be readily transformed to the corresponding manifold kernels. The sparse representation obtained has more discriminating power. Extensive experimental results demonstrate the good performance of manifold kernel sparse codes in image classification, face recognition, and visual tracking.

  18. GeneFisher-P: variations of GeneFisher as processes in Bio-jETI

    PubMed Central

    Lamprecht, Anna-Lena; Margaria, Tiziana; Steffen, Bernhard; Sczyrba, Alexander; Hartmeier, Sven; Giegerich, Robert

    2008-01-01

    Background PCR primer design is an everyday, but not trivial task requiring state-of-the-art software. We describe the popular tool GeneFisher and explain its recent restructuring using workflow techniques. We apply a service-oriented approach to model and implement GeneFisher-P, a process-based version of the GeneFisher web application, as a part of the Bio-jETI platform for service modeling and execution. We show how to introduce a flexible process layer to meet the growing demand for improved user-friendliness and flexibility. Results Within Bio-jETI, we model the process using the jABC framework, a mature model-driven, service-oriented process definition platform. We encapsulate remote legacy tools and integrate web services using jETI, an extension of the jABC for seamless integration of remote resources as basic services, ready to be used in the process. Some of the basic services used by GeneFisher are in fact already provided as individual web services at BiBiServ and can be directly accessed. Others are legacy programs, and are made available to Bio-jETI via the jETI technology. The full power of service-based process orientation is required when more bioinformatics tools, available as web services or via jETI, lead to easy extensions or variations of the basic process. This concerns for instance variations of data retrieval or alignment tools as provided by the European Bioinformatics Institute (EBI). Conclusions The resulting service- and process-oriented GeneFisher-P demonstrates how basic services from heterogeneous sources can be easily orchestrated in the Bio-jETI platform and lead to a flexible family of specialized processes tailored to specific tasks. PMID:18460174

  19. Modulated traveling fronts for a nonlocal Fisher-KPP equation: A dynamical systems approach

    NASA Astrophysics Data System (ADS)

    Faye, Grégory; Holzer, Matt

    2015-04-01

    We consider a nonlocal generalization of the Fisher-KPP equation in one spatial dimension. As a parameter is varied, the system undergoes a Turing bifurcation. We study the dynamics near this Turing bifurcation. Our results are two-fold. First, we prove the existence of a two-parameter family of bifurcating stationary periodic solutions and derive a rigorous asymptotic approximation of these solutions. We also study the spectral stability of the bifurcating stationary periodic solutions with respect to almost co-periodic perturbations. Second, we restrict to a specific class of exponential kernels for which the nonlocal problem is transformed into a higher order partial differential equation. In this context, we prove the existence of modulated traveling fronts near the Turing bifurcation that describe the invasion of the Turing unstable homogeneous state by the periodic pattern established in the first part. Both results rely on a center manifold reduction to a finite dimensional ordinary differential equation.
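
    For orientation, a standard form of the nonlocal Fisher-KPP equation studied in this setting is given below (with the diffusion coefficient scaled to one); the specific kernel class and scaling in the paper may differ.

```latex
\[
\partial_t u \;=\; \partial_x^{2} u \;+\; u\bigl(1 - \phi * u\bigr),
\qquad
(\phi * u)(x) \;=\; \int_{\mathbb{R}} \phi(x-y)\,u(t,y)\,dy .
\]
```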

  20. Olympic Fisher Reintroduction Project: Progress report 2008-2011

    USGS Publications Warehouse

    Jeffrey C. Lewis,; Patti J. Happe,; Jenkins, Kurt J.; Manson, David J.

    2012-01-01

    This progress report summarizes the final year of activities of Phase I of the Olympic fisher restoration project. The intent of the Olympic fisher reintroduction project is to reestablish a self-sustaining population of fishers on the Olympic Peninsula. To achieve this goal, the Olympic fisher reintroduction project released 90 fishers within Olympic National Park from 2008 to 2010. The reintroduction of fishers to the Olympic Peninsula was designed as an adaptive management project, including the monitoring of released fishers as a means to (1) evaluate reintroduction success, (2) investigate key biological and ecological traits of fishers, and (3) inform future reintroduction, monitoring, and research efforts. This report summarizes reintroduction activities and preliminary research and monitoring results completed through December 2011. The report is non-interpretational in nature. Although we report the status of movement, survival, and home range components of the research, we have not completed final analyses and interpretation of research results. Much of the data collected during the monitoring and research project will be analyzed and interpreted in the doctoral dissertation being developed by Jeff Lewis; the completion of this dissertation is anticipated prior to April 2013. We anticipate that this work, and analyses of other data collected during the project, will result in several peer-reviewed scientific publications in ecological and conservation journals, which collectively will comprise the final reporting of work summarized here. These publications will include papers addressing post-release movements, survival, resource selection, food habits, and age determination of fishers.

  1. 77 FR 15650 - Fisher House and Other Temporary Lodging

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-16

    ... treatment associated with organ transplant, chemotherapy, or radiation.'' The use of Fisher House or other... organ transplant, chemotherapy, or radiation. (3) Hospitalization for a critical injury or...

  2. Robotic intelligence kernel

    DOEpatents

    Bruemmer, David J.

    2009-11-17

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes a robot intelligence kernel (RIK) that includes a multi-level architecture and a dynamic autonomy structure. The multi-level architecture includes a robot behavior level for defining robot behaviors that incorporate robot attributes, and a cognitive level for defining conduct modules that blend an adaptive interaction between predefined decision functions and the robot behaviors. The dynamic autonomy structure is configured for modifying a transaction capacity between an operator intervention and a robot initiative and may include multiple levels with at least a teleoperation mode configured to maximize the operator intervention and minimize the robot initiative and an autonomous mode configured to minimize the operator intervention and maximize the robot initiative. Within the RIK at least the cognitive level includes the dynamic autonomy structure.

  3. Flexible Kernel Memory

    PubMed Central

    Nowicki, Dimitri; Siegelmann, Hava

    2010-01-01

    This paper introduces a new model of associative memory, capable of both binary and continuous-valued inputs. Based on kernel theory, the memory model is on one hand a generalization of Radial Basis Function networks and, on the other, analogous in feature space to a Hopfield network. Attractors can be added, deleted, and updated on-line simply, without harming existing memories, and the number of attractors is independent of input dimension. Input vectors do not have to adhere to a fixed or bounded dimensionality; they can increase and decrease in dimension without relearning previous memories. A memory consolidation process enables the network to generalize concepts and form clusters of input data, and outperforms many unsupervised clustering techniques; this process is demonstrated on handwritten digits from MNIST. Another process, reminiscent of memory reconsolidation, is introduced in which existing memories are refreshed and tuned with new inputs; this process is demonstrated on a series of morphed faces. PMID:20552013

  4. Prenatal development in fishers (Martes pennanti)

    USGS Publications Warehouse

    Frost, H.C.; Krohn, W.B.; Bezembluk, E.A.; Lott, R.; Wallace, C.R.

    2005-01-01

    We evaluated and quantified prenatal growth of fishers (Martes pennanti) using ultrasonography. Seven females gave birth to 21 kits. The first identifiable embryonic structures were seen 42 d prepartum; these appeared to be unimplanted blastocysts or gestational sacs, which subsequently implanted in the uterine horns. Maternal and fetal heart rates were monitored from first detection to birth. Maternal heart rates did not differ among sampling periods, while fetal heart rates increased from first detection to birth. Head and body differentiation, visible limbs and skeletal ossification were visible by 30, 23 and 21 d prepartum, respectively. Mean diameter of gestational sacs and crown-rump lengths were linearly related to gestational age (P < 0.001). Biparietal and body diameters were also linearly related to gestational age (P < 0.001) and correctly predicted parturition dates within 1-2 d. © 2004 Elsevier Inc. All rights reserved.

  5. Trajectory Synthesis for Fisher Information Maximization

    PubMed Central

    Wilson, Andrew D.; Schultz, Jarvis A.; Murphey, Todd D.

    2015-01-01

    Estimation of model parameters in a dynamic system can be significantly improved with the choice of experimental trajectory. For general nonlinear dynamic systems, finding globally “best” trajectories is typically not feasible; however, given an initial estimate of the model parameters and an initial trajectory, we present a continuous-time optimization method that produces a locally optimal trajectory for parameter estimation in the presence of measurement noise. The optimization algorithm is formulated to find system trajectories that improve a norm on the Fisher information matrix (FIM). A double-pendulum cart apparatus is used to numerically and experimentally validate this technique. In simulation, the optimized trajectory increases the minimum eigenvalue of the FIM by three orders of magnitude, compared with the initial trajectory. Experimental results show that this optimized trajectory translates to an order-of-magnitude improvement in the parameter estimate error in practice. PMID:25598763

  6. On the realization of quantum Fisher information

    NASA Astrophysics Data System (ADS)

    Saha, Aparna; Talukdar, B.; Chatterjee, Supriya

    2017-03-01

    With special attention to the role of information theory in physical sciences we present analytical results for the coordinate- and momentum-space Fisher information of some important one-dimensional quantum systems which differ in spacing of their energy levels. The studies envisaged allow us to relate the coordinate-space information I_ρ with the familiar energy levels of the quantum system. The corresponding momentum-space information I_γ does not obey such a simple relationship with the energy spectrum. Our results for the product I_ρ I_γ depend quadratically on the principal quantum number n and satisfy an appropriate uncertainty relation derived by Dehesa et al (2007 J. Phys. A: Math. Theor. 40 1845).
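
    For completeness, the standard coordinate- and momentum-space Fisher information functionals referred to above are

```latex
\[
I_{\rho} \;=\; \int \frac{[\rho'(x)]^{2}}{\rho(x)}\,dx,
\qquad
I_{\gamma} \;=\; \int \frac{[\gamma'(p)]^{2}}{\gamma(p)}\,dp,
\]
```

    where ρ(x) and γ(p) are the position- and momentum-space probability densities of the state.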

  7. Labeled Graph Kernel for Behavior Analysis.

    PubMed

    Zhao, Ruiqi; Martinez, Aleix M

    2016-08-01

    Automatic behavior analysis from video is a major topic in many areas of research, including computer vision, multimedia, robotics, biology, cognitive science, social psychology, psychiatry, and linguistics. Two major problems are of interest when analyzing behavior. First, we wish to automatically categorize observed behaviors into a discrete set of classes (i.e., classification). For example, to determine word production from video sequences in sign language. Second, we wish to understand the relevance of each behavioral feature in achieving this classification (i.e., decoding). For instance, to know which behavior variables are used to discriminate between the words apple and onion in American Sign Language (ASL). The present paper proposes to model behavior using a labeled graph, where the nodes define behavioral features and the edges are labels specifying their order (e.g., before, overlaps, start). In this approach, classification reduces to a simple labeled graph matching. Unfortunately, the complexity of labeled graph matching grows exponentially with the number of categories we wish to represent. Here, we derive a graph kernel to quickly and accurately compute this graph similarity. This approach is very general and can be plugged into any kernel-based classifier. Specifically, we derive a Labeled Graph Support Vector Machine (LGSVM) and a Labeled Graph Logistic Regressor (LGLR) that can be readily employed to discriminate between many actions (e.g., sign language concepts). The derived approach can be readily used for decoding too, yielding invaluable information for the understanding of a problem (e.g., to know how to teach a sign language). The derived algorithms allow us to achieve higher accuracy results than those of state-of-the-art algorithms in a fraction of the time. We show experimental results on a variety of problems and datasets, including multimodal data.

  8. Labeled Graph Kernel for Behavior Analysis

    PubMed Central

    Zhao, Ruiqi; Martinez, Aleix M.

    2016-01-01

    Automatic behavior analysis from video is a major topic in many areas of research, including computer vision, multimedia, robotics, biology, cognitive science, social psychology, psychiatry, and linguistics. Two major problems are of interest when analyzing behavior. First, we wish to automatically categorize observed behaviors into a discrete set of classes (i.e., classification). For example, to determine word production from video sequences in sign language. Second, we wish to understand the relevance of each behavioral feature in achieving this classification (i.e., decoding). For instance, to know which behavior variables are used to discriminate between the words apple and onion in American Sign Language (ASL). The present paper proposes to model behavior using a labeled graph, where the nodes define behavioral features and the edges are labels specifying their order (e.g., before, overlaps, start). In this approach, classification reduces to a simple labeled graph matching. Unfortunately, the complexity of labeled graph matching grows exponentially with the number of categories we wish to represent. Here, we derive a graph kernel to quickly and accurately compute this graph similarity. This approach is very general and can be plugged into any kernel-based classifier. Specifically, we derive a Labeled Graph Support Vector Machine (LGSVM) and a Labeled Graph Logistic Regressor (LGLR) that can be readily employed to discriminate between many actions (e.g., sign language concepts). The derived approach can be readily used for decoding too, yielding invaluable information for the understanding of a problem (e.g., to know how to teach a sign language). The derived algorithms allow us to achieve higher accuracy results than those of state-of-the-art algorithms in a fraction of the time. We show experimental results on a variety of problems and datasets, including multimodal data. PMID:26415154

  9. 7 CFR 51.2295 - Half kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Half kernel. 51.2295 Section 51.2295 Agriculture... Standards for Shelled English Walnuts (Juglans Regia) Definitions § 51.2295 Half kernel. Half kernel means the separated half of a kernel with not more than one-eighth broken off....

  10. 7 CFR 981.9 - Kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Kernel weight. 981.9 Section 981.9 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Definitions § 981.9 Kernel weight. Kernel weight means the weight of kernels,...

  11. 7 CFR 981.7 - Edible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Edible kernel. 981.7 Section 981.7 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Definitions § 981.7 Edible kernel. Edible kernel means a kernel, piece, or particle...

  12. Job Satisfaction among Fishers in the Dominican Republic

    ERIC Educational Resources Information Center

    Ruiz, Victor

    2012-01-01

    This paper reflects on the results of a job satisfaction study of small-scale fishers in the Dominican Republic. The survey results suggest that, although fishers are generally satisfied with their occupations, they also have serious concerns. These concerns include anxieties about the level of earnings, the condition of marine resources and the…

  13. R. A. Fisher and his advocacy of randomization.

    PubMed

    Hall, Nancy S

    2007-01-01

    The requirement of randomization in experimental design was first stated by R. A. Fisher, statistician and geneticist, in 1925 in his book Statistical Methods for Research Workers. Earlier designs were systematic and involved the judgment of the experimenter; this led to possible bias and inaccurate interpretation of the data. Fisher's dictum was that randomization eliminates bias and permits a valid test of significance. Randomization in experimenting had been used by Charles Sanders Peirce in 1885 but the practice was not continued. Fisher developed his concepts of randomizing as he considered the mathematics of small samples, in discussions with "Student," William Sealy Gosset. Fisher published extensively. His principles of experimental design were spread worldwide by the many "voluntary workers" who came from other institutions to Rothamsted Agricultural Station in England to learn Fisher's methods.

  14. Characterizing contrast adaptation in a population of cat primary visual cortical neurons using Fisher information.

    PubMed

    Durant, Szonya; Clifford, Colin W G; Crowder, Nathan A; Price, Nicholas S C; Ibbotson, Michael R

    2007-06-01

    When cat V1/V2 cells are adapted to contrast at their optimal orientation, a reduction in gain and/or a shift in the contrast response function is found. We investigated how these factors combine at the population level to affect the accuracy for detecting variations in contrast. Using the contrast response function parameters from a physiologically measured population, we model the population accuracy (using Fisher information) for contrast discrimination. Adaptation at 16%, 32%, and 100% contrast causes a shift in peak accuracy. Despite an overall drop in firing rate over the whole population, accuracy is enhanced around the adapted contrast and at higher contrasts, leading to greater efficiency of contrast coding at these levels. The estimated contrast discrimination threshold curve becomes elevated and shifted toward higher contrasts after adaptation, as has been found previously in human psychophysical experiments.

  15. Kernel-based whole-genome prediction of complex traits: a review

    PubMed Central

    Morota, Gota; Gianola, Daniel

    2014-01-01

    Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics. PMID:25360145
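
    As a minimal illustration of kernel-based whole-genome regression, the sketch below fits a Gaussian-kernel ridge regression to simulated marker data; the genotype coding, kernel bandwidth and additive simulated trait are assumptions, and the sketch does not reproduce GBLUP or the Bayesian RKHS machinery discussed in the review.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

# Toy marker data: 200 individuals, 500 SNPs coded 0/1/2, with an additive simulated trait
rng = np.random.default_rng(6)
G = rng.integers(0, 3, size=(200, 500)).astype(float)
beta = rng.normal(0, 0.1, 500)
y = G @ beta + rng.normal(0, 1.0, 200)

# Gaussian-kernel RKHS regression (kernel ridge) as a genome-enabled predictor
model = KernelRidge(kernel="rbf", gamma=1.0 / G.shape[1], alpha=1.0)
r2 = cross_val_score(model, G, y, cv=5, scoring="r2").mean()
print("cross-validated R^2:", round(r2, 2))
```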

  16. [Charles Miller Fisher: a giant of neurology].

    PubMed

    Tapia, Jorge

    2013-08-01

    C. Miller Fisher MD, one of the great neurologists in the 20th century, died in April 2012. Born in Canada, he studied medicine at the University of Toronto. As a Canadian Navy medical doctor he participated in World War II and was a war prisoner from 1941 to 1944. He did a residency in neurology at the Montreal Neurological Institute between 1946 and 1948, and later on was a Fellow in Neurology and Neuropathology at the Boston City Hospital. In 1954 he entered the Massachusetts General Hospital as a neurologist and neuropathologist, where he remained until his retirement, in 2005. His academic career ended as Professor Emeritus at Harvard University. His area of special interest in neurology was cerebrovascular disease (CVD). In 1954 he created the first Vascular Neurology service in the world and trained many leading neurologists on this field. His scientific contributions are present in more than 250 publications, as journal articles and book chapters. Many of his articles, certainly not restricted to CVD, were seminal in neurology. Several concepts and terms that he coined are currently used in daily clinical practice. The chapters on CVD, in seven consecutive editions of Harrison's Internal Medicine textbook, are among his highlights. His death was deeply felt by the neurological community.

  17. 76 FR 63355 - Proposed Information Collection (Regulation on Application for Fisher Houses and Other Temporary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-12

    ... Information Collection (Regulation on Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application); Comment Request AGENCY: Veterans Health Administration, Department of Veterans... use of other forms of information technology. Title: Regulation on Application for Fisher Houses...

  18. Statistical classification methods applied to seismic discrimination

    SciTech Connect

    Ryan, F.M.; Anderson, D.N.; Anderson, K.K.; Hagedorn, D.N.; Higbee, K.T.; Miller, N.E.; Redgate, T.; Rohay, A.C.

    1996-06-11

    To verify compliance with a Comprehensive Test Ban Treaty (CTBT), low energy seismic activity must be detected and discriminated. Monitoring small-scale activity will require regional (within ~2000 km) monitoring capabilities. This report provides background information on various statistical classification methods and discusses the relevance of each method in the CTBT seismic discrimination setting. Criteria for classification method selection are explained and examples are given to illustrate several key issues. This report describes in more detail the issues and analyses that were initially outlined in a poster presentation at a recent American Geophysical Union (AGU) meeting. Section 2 of this report describes both the CTBT seismic discrimination setting and the general statistical classification approach to this setting. Seismic data examples illustrate the importance of synergistically using multivariate data as well as the difficulties due to missing observations. Classification method selection criteria are presented and discussed in Section 3. These criteria are grouped into the broad classes of simplicity, robustness, applicability, and performance. Section 4 follows with a description of several statistical classification methods: linear discriminant analysis, quadratic discriminant analysis, variably regularized discriminant analysis, flexible discriminant analysis, logistic discriminant analysis, K-th Nearest Neighbor discrimination, kernel discrimination, and classification and regression tree discrimination. The advantages and disadvantages of these methods are summarized in Section 5.
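
    To make the method list concrete, the sketch below cross-validates a few of the cited classifiers on synthetic two-class features; the data are placeholders, not the report's seismic discriminants.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic two-class features standing in for seismic discriminants (e.g. spectral ratios)
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(1, 1.5, (100, 3))])
y = np.r_[np.zeros(100, dtype=int), np.ones(100, dtype=int)]

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "logistic": LogisticRegression(),
}
for name, clf in classifiers.items():
    print(name, round(cross_val_score(clf, X, y, cv=5).mean(), 2))
```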

  19. Infrasound analysis using Fisher detector and Hough transform

    NASA Astrophysics Data System (ADS)

    Averbuch, Gil; Assink, Jelle D.; Smets, Pieter S. M.; Evers, Läslo G.

    2016-04-01

    Automatic detection of infrasound signals from the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty requires low rates of both false alarms and missed events. The Fisher detector is a statistical method for detecting such infrasonic events. The detector aims to detect coherent signals after beamforming is applied to the recordings, and a detection is declared when the Fisher ratio exceeds a threshold value. The Fisher ratio distribution of such a detection is affected by the signal-to-noise ratio (SNR). While events with a high Fisher ratio and SNR can easily be detected automatically, events with lower Fisher ratios and SNRs might be missed. The Hough transform is a post-processing step based on a slope-intercept transform applied to discretely sampled data, with the goal of finding straight lines (in apparent velocity and back azimuth). Applying it to the results of the Fisher detector is advantageous for noisy data, which corresponds to low Fisher ratios and SNRs. Results of the Hough transform on synthetic data with SNR down to 0.7 reduced the number of missed events. In this work, we will present the results of an automatic detector based on both methods. Synthetic data with different lengths and SNRs are evaluated. Furthermore, continuous data from the IMS infrasound station I18DK will be analyzed. We will compare the performance of both methods and investigate their ability to reduce the number of missed events.

  20. An Approximate Approach to Automatic Kernel Selection.

    PubMed

    Ding, Lizhong; Liao, Shizhong

    2016-02-02

    Kernel selection is a fundamental problem of kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection, and develop two approximate kernel selection algorithms by exploiting the computational virtues of multilevel circulant matrices. The complexity of the proposed algorithms is quasi-linear in the number of data points. Then, we prove an approximation error bound to measure the effect of the approximation in kernel matrices by multilevel circulant matrices on the hypothesis and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.

  1. The Fisher-Shannon information plane, an electron correlation tool.

    PubMed

    Romera, E; Dehesa, J S

    2004-05-15

    A new correlation measure, the product of the Shannon entropy power and the Fisher information of the electron density, is introduced by analyzing the Fisher-Shannon information plane of some two-electron systems (He-like ions, Hooke's atoms). The uncertainty and scaling properties of this information product are pointed out. In addition, the Fisher and Shannon measures of a finite many-electron system are shown to be bounded by the corresponding single-electron measures and the number of electrons of the system.
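
    For a three-dimensional electron density, the quantities entering the Fisher-Shannon plane are commonly written as below; the exact normalisation conventions may differ from the paper.

```latex
\[
J_{\rho} \;=\; \frac{1}{2\pi e}\,e^{\,2S_{\rho}/3},
\qquad
P_{\rho} \;=\; \tfrac{1}{3}\,J_{\rho}\,I_{\rho} \;\ge\; 1,
\]
```

    where S_ρ is the Shannon entropy, J_ρ the entropy power and I_ρ the Fisher information of the density, and the lower bound follows from Stam's uncertainty relation.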

  2. Entropy, Fisher Information and Variance with Frost-Musulin Potential

    NASA Astrophysics Data System (ADS)

    Idiodi, J. O. A.; Onate, C. A.

    2016-09-01

    This study presents the Shannon and Renyi information entropies in both position and momentum space, and the Fisher information, for the position-dependent mass Schrödinger equation with the Frost-Musulin potential. The quantum-mechanical probability has been analysed via the Fisher information. The variance for this potential is also computed; such quantities govern both the chemical and physical properties of some molecular systems. We examine the behaviour of the Shannon entropy, Renyi entropy, Fisher information and variance with the quantum number n.

  3. Sparse kernel entropy component analysis for dimensionality reduction of neuroimaging data.

    PubMed

    Jiang, Qikun; Shi, Jun

    2014-01-01

    Neuroimaging data typically have extremely high dimensionality, so dimensionality reduction is commonly used to extract discriminative features. Kernel entropy component analysis (KECA) is a newly developed data transformation method whose key idea is to preserve as much as possible of the estimated Renyi entropy of the input-space data set via a kernel-based estimator. Despite its good performance, KECA still suffers from low computational efficiency on large-scale data. In this paper, we propose a sparse KECA (SKECA) algorithm with a recursive divide-and-conquer solution, and apply it to dimensionality reduction of neuroimaging data for classification of Alzheimer's disease (AD). We compare SKECA with KECA, principal component analysis (PCA), kernel PCA (KPCA) and sparse KPCA. The experimental results indicate that the proposed SKECA outperforms all the other algorithms when extracting discriminative features from neuroimaging data for AD classification.
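
    The sketch below implements plain (non-sparse) KECA on random stand-in features: eigenpairs of the kernel matrix are ranked by their contribution to the Renyi entropy estimate rather than by eigenvalue alone. The sparse, divide-and-conquer SKECA variant described in the record is not reproduced.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def keca(X, n_components=2, gamma=0.1):
    """Plain kernel entropy component analysis (KECA) embedding of the training points."""
    K = rbf_kernel(X, gamma=gamma)
    evals, evecs = np.linalg.eigh(K)                   # eigenvalues in ascending order
    ones = np.ones(len(X))
    # Contribution of each eigenpair to the Renyi entropy estimate (1/N^2) * 1^T K 1
    entropy = evals * (evecs.T @ ones) ** 2
    top = np.argsort(entropy)[::-1][:n_components]     # keep the most "entropic" components
    return evecs[:, top] * np.sqrt(np.clip(evals[top], 0, None))

rng = np.random.default_rng(8)
X = rng.normal(size=(100, 30))                         # stand-in for vectorised imaging features
Z = keca(X, n_components=5)
print(Z.shape)
```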

  4. A kernel Gabor-based weighted region covariance matrix for face recognition.

    PubMed

    Qin, Huafeng; Qin, Lan; Xue, Lian; Li, Yantao

    2012-01-01

    This paper proposes a novel image region descriptor for face recognition, named the kernel Gabor-based weighted region covariance matrix (KGWRCM). As different parts of a face are not equally effective in characterizing and recognizing faces, we construct a weighting matrix by computing the similarity of each pixel within a face sample, so as to emphasize informative features. We then incorporate the weighting matrices into a region covariance matrix, named the weighted region covariance matrix (WRCM), to obtain discriminative features of faces for recognition. Finally, to further preserve discriminative features in a higher-dimensional space, we develop the kernel Gabor-based weighted region covariance matrix (KGWRCM). Experimental results show that the KGWRCM outperforms other algorithms, including the kernel Gabor-based region covariance matrix (KGCRM).

  5. RTOS kernel in portable electrocardiograph

    NASA Astrophysics Data System (ADS)

    Centeno, C. A.; Voos, J. A.; Riva, G. G.; Zerbini, C.; Gonzalez, E. A.

    2011-12-01

    This paper presents the use of a Real Time Operating System (RTOS) in a portable electrocardiograph based on a microcontroller platform. All of the medical device's digital functions are performed by the microcontroller. The electrocardiograph CPU is based on the 18F4550 microcontroller, in which the uCOS-II RTOS can be embedded. The decision to use the kernel is based on its benefits, its license for educational use, and its intrinsic time control and peripheral management. The feasibility of its use in the electrocardiograph is evaluated from the minimum memory requirements imposed by the kernel structure. The kernel's own tools were used for time estimation and for evaluating the resources used by each process. After this feasibility analysis, the firmware is migrated from cyclic code to a structure based on separate processes or tasks able to synchronize through events, resulting in an electrocardiograph running on a single Central Processing Unit (CPU) under the RTOS.

  6. Using Fisher information to track stability in multivariate systems

    EPA Science Inventory

    With the current proliferation of data, the proficient use of statistical and mining techniques offer substantial benefits to capture useful information from any dataset. As numerous approaches make use of information theory concepts, here, we discuss how Fisher information (FI) ...

  7. Density Estimation with Mercer Kernels

    NASA Technical Reports Server (NTRS)

    Macready, William G.

    2003-01-01

    We present a new method for density estimation based on Mercer kernels. The density estimate can be understood as the density induced on a data manifold by a mixture of Gaussians fit in a feature space. As usual, the feature space and data manifold are defined by any suitable positive-definite kernel function. We modify the standard EM algorithm for mixtures of Gaussians to infer the parameters of the density. One benefit of the approach is its conceptual simplicity and uniform applicability over many different types of data. Preliminary results are presented for a number of simple problems.

  8. Local Observed-Score Kernel Equating

    ERIC Educational Resources Information Center

    Wiberg, Marie; van der Linden, Wim J.; von Davier, Alina A.

    2014-01-01

    Three local observed-score kernel equating methods that integrate methods from the local equating and kernel equating frameworks are proposed. The new methods were compared with their earlier counterparts with respect to such measures as bias--as defined by Lord's criterion of equity--and percent relative error. The local kernel item response…

  9. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  10. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  11. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  12. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  13. Travel-Time and Amplitude Sensitivity Kernels

    DTIC Science & Technology

    2011-09-01

    amplitude sensitivity kernels shown in the lower panels concentrate about the corresponding eigenrays. Each 3D kernel exhibits a broad negative...in 2 and 3 dimensions have similar shapes to corresponding travel-time sensitivity kernels (TSKs), centered about the respective eigenrays

  14. Adaptive wiener image restoration kernel

    SciTech Connect

    Yuan, Ding

    2007-06-05

    A method and device for the restoration of electro-optical image data using an adaptive Wiener filter begins with constructing the imaging system's optical transfer function and the Fourier transforms of the noise and the image. A spatial representation of the imaged object is restored by spatial convolution of the image with a Wiener restoration kernel.
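
    The sketch below shows a generic frequency-domain Wiener restoration with a constant noise-to-signal ratio standing in for the full noise and image power spectra; it illustrates the kernel construction only and is not the patented adaptive scheme.

```python
import numpy as np

def wiener_restore(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener restoration: W = H* / (|H|^2 + noise-to-signal ratio)."""
    H = np.fft.fft2(psf, s=blurred.shape)              # optical transfer function
    G = np.fft.fft2(blurred)                           # Fourier transform of the image
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)            # Wiener restoration kernel
    return np.real(np.fft.ifft2(W * G))

# Toy example: blur a random "scene" with a small uniform PSF, then restore it
rng = np.random.default_rng(9)
scene = rng.normal(size=(64, 64))
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(psf, s=scene.shape) * np.fft.fft2(scene)))
restored = wiener_restore(blurred, psf, nsr=1e-3)
print("blurred error:", round(float(np.abs(blurred - scene).mean()), 3),
      "restored error:", round(float(np.abs(restored - scene).mean()), 3))
```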

  15. The NAS kernel benchmark program

    NASA Technical Reports Server (NTRS)

    Bailey, D. H.; Barton, J. T.

    1985-01-01

    A collection of benchmark test kernels that measure supercomputer performance has been developed for the use of the NAS (Numerical Aerodynamic Simulation) program at the NASA Ames Research Center. This benchmark program is described in detail and the specific ground rules are given for running the program as a performance test.

  16. 76 FR 78739 - Agency Information Collection (Regulation on Application for Fisher Houses and Other Temporary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-19

    ... AFFAIRS Agency Information Collection (Regulation on Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application) Activity Under OMB Review AGENCY: Veterans Health Administration... Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application, VA Forms...

  17. On the validity of cosmological Fisher matrix forecasts

    SciTech Connect

    Wolz, Laura; Kilbinger, Martin; Weller, Jochen; Giannantonio, Tommaso

    2012-09-01

    We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation-of-state parameters w_0 and w_a. For purely geometrical probes, and especially when marginalising over w_a, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalized errors for purely geometrical probes by 30%-70%. For cases including structure formation, such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecasted errors changing only at the 5% level. We then explore non-linear transformations resulting in physically motivated parameters and investigate whether these parameterisations exhibit Gaussian behaviour. We conclude that for purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
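
    For reference, the Fisher matrix used in such forecasts (for a Gaussian likelihood with parameter-independent covariance) and the resulting marginalised error estimate are

```latex
\[
F_{ij} \;=\; \sum_{a,b} \frac{\partial \mu_a}{\partial \theta_i}\,
\bigl(C^{-1}\bigr)_{ab}\,
\frac{\partial \mu_b}{\partial \theta_j},
\qquad
\sigma(\theta_i) \;\simeq\; \sqrt{\bigl(F^{-1}\bigr)_{ii}},
\]
```

    where μ is the vector of theoretical observables, C the data covariance and θ the cosmological parameters; the underlying Gaussian-likelihood approximation is precisely what breaks down for the purely geometrical probes discussed above.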

  18. Fisher statistics for analysis of diffusion tensor directional information.

    PubMed

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; however, in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation.
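
    A minimal sketch of the descriptive statistics involved (mean direction, resultant length, and concentration parameter for the Fisher distribution on the sphere), using the standard estimators rather than the authors' full hypothesis-testing pipeline; the approximation for kappa assumes reasonably concentrated data, and the sample below is synthetic.

```python
import numpy as np

def fisher_stats(directions):
    """Mean direction, resultant length, and concentration for unit vectors (n x 3)."""
    v = np.asarray(directions, dtype=float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)   # ensure unit length
    resultant = v.sum(axis=0)
    R = np.linalg.norm(resultant)                      # resultant length
    mean_dir = resultant / R
    n = len(v)
    kappa = (n - 1) / (n - R)                          # large-concentration approximation
    return mean_dir, R, kappa

rng = np.random.default_rng(0)
sample = rng.normal([0.0, 0.0, 1.0], 0.1, size=(50, 3))   # directions clustered near +z
print(fisher_stats(sample))
```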

  19. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on the utilization of an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a properly energized sample surface. The study investigated the possibility of applying HSI techniques for classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies have been acquired by a laboratory device equipped with an HSI system working in the near-infrared range (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics of wheat.
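
    A minimal sketch of the PCA plus PLS-DA classification step on reflectance spectra, using scikit-learn with one-hot class coding; the synthetic spectra, class labels, and component counts below are placeholders, not the study's data or settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 121))               # stand-in spectra: 300 kernels x 121 wavelengths
y = rng.integers(0, 3, size=300)              # 0 = vitreous, 1 = yellow berry, 2 = fusarium-damaged
Y = np.eye(3)[y]                              # one-hot coding turns PLS regression into PLS-DA

X_tr, X_te, Y_tr, Y_te, y_tr, y_te = train_test_split(X, Y, y, random_state=0)

pca = PCA(n_components=10).fit(X_tr)          # reduce the spectral dimensionality
pls = PLSRegression(n_components=5).fit(pca.transform(X_tr), Y_tr)

pred = pls.predict(pca.transform(X_te)).argmax(axis=1)
print("accuracy:", (pred == y_te).mean())
```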

  20. Localized Multiple Kernel Learning With Dynamical Clustering and Matrix Regularization.

    PubMed

    Han, Yina; Yang, Kunde; Yang, Yixin; Ma, Yuanliang

    2016-12-20

    Localized multiple kernel learning (LMKL) is an attractive strategy for combining multiple heterogeneous features with regard to their discriminative power for each individual sample. However, the learning of numerous local solutions may not scale well even for a moderately sized training set, and the independently learned local models may suffer from overfitting. Hence, in existing local methods, the distributed samples are typically assumed to share the same weights, and various unsupervised clustering methods are applied as preprocessing. In this paper, to enable the learner to discover and benefit from the underlying local coherence and diversity of the samples, we incorporate the clustering procedure into the canonical support vector machine-based LMKL framework. Then, to explore the relatedness among different samples, which has been ignored in a vector ℓp-norm analysis, we organize the cluster-specific kernel weights into a matrix and introduce a matrix-based extension of the ℓp-norm for constraint enforcement. By casting the joint optimization problem as a problem of alternating optimization, we show how the cluster structure is gradually revealed and how the matrix-regularized kernel weights are obtained. A theoretical analysis of such a regularizer is performed using a Rademacher complexity bound, and complementary empirical experiments on real-world data sets demonstrate the effectiveness of our technique.

  1. Local Kernel for Brains Classification in Schizophrenia

    NASA Astrophysics Data System (ADS)

    Castellani, U.; Rossato, E.; Murino, V.; Bellani, M.; Rambaldelli, G.; Tansella, M.; Brambilla, P.

    In this paper a novel framework for brain classification is proposed in the context of mental health research. A learning-by-example method is introduced by combining local measurements with a nonlinear Support Vector Machine. Instead of considering a voxel-by-voxel comparison between patients and controls, we focus on landmark points which are characterized by local region descriptors, namely the Scale Invariant Feature Transform (SIFT). Then, matching is obtained by introducing the local kernel, for which the samples are represented by unordered sets of features. Moreover, a new weighting approach is proposed to take into account the discriminative relevance of the detected groups of features. Experiments have been performed on a set of 54 patients with schizophrenia and 54 normal controls, on which regions of interest (ROI) have been manually traced by experts. Preliminary results on the Dorso-lateral PreFrontal Cortex (DLPFC) region are promising, since a successful classification rate of up to 75% has been obtained with this technique, and the performance improved to 85% when the subjects were stratified by sex.

  2. Nonlinear Deep Kernel Learning for Image Annotation.

    PubMed

    Jiu, Mingyuan; Sahbi, Hichem

    2017-02-08

    Multiple kernel learning (MKL) is a widely used technique for kernel design. Its principle consists in learning, for a given support vector classifier, the most suitable convex (or sparse) linear combination of standard elementary kernels. However, these combinations are shallow and often powerless to capture the actual similarity between highly semantic data, especially for challenging classification tasks such as image annotation. In this paper, we redefine multiple kernels using deep multi-layer networks. In this new contribution, a deep multiple kernel is recursively defined as a multi-layered combination of nonlinear activation functions, each of which involves a combination of several elementary or intermediate kernels and results in a positive semi-definite deep kernel. We propose four different frameworks in order to learn the weights of these networks: supervised, unsupervised, kernel-based semi-supervised, and Laplacian-based semi-supervised. When plugged into support vector machines (SVMs), the resulting deep kernel networks show a clear gain compared to several shallow kernels for the task of image annotation. Extensive experiments and analysis on the challenging ImageCLEF photo annotation benchmark, the COREL5k database and the Banana dataset validate the effectiveness of the proposed method.

  3. Robust Pedestrian Classification Based on Hierarchical Kernel Sparse Representation

    PubMed Central

    Sun, Rui; Zhang, Guanghai; Yan, Xiaoxing; Gao, Jun

    2016-01-01

    Vision-based pedestrian detection has become an active topic in computer vision and autonomous vehicles. It aims at detecting pedestrians appearing ahead of the vehicle using a camera so that autonomous vehicles can assess the danger and take action. Due to varied illumination and appearance, complex backgrounds, and occlusion, pedestrian detection in outdoor environments is a difficult problem. In this paper, we propose a novel hierarchical feature extraction and weighted kernel sparse representation model for pedestrian classification. Initially, hierarchical feature extraction based on a CENTRIST descriptor is used to capture discriminative structures. A max pooling operation is used to enhance invariance to varying appearance. Then, a kernel sparse representation model is proposed to fully exploit the discrimination information embedded in the hierarchical local features, and a Gaussian weight function is used as the measure to effectively handle occlusion in pedestrian images. Extensive experiments are conducted on benchmark databases, including INRIA, Daimler, an artificially generated dataset and a real occluded dataset, demonstrating the more robust performance of the proposed method compared to state-of-the-art pedestrian classification methods. PMID:27537888

  4. Nonlinear projection trick in kernel methods: an alternative to the kernel trick.

    PubMed

    Kwak, Nojun

    2013-12-01

    In kernel methods such as kernel principal component analysis (PCA) and support vector machines, the so-called kernel trick is used to avoid direct calculations in a high (virtually infinite) dimensional kernel space. In this brief, based on the fact that the effective dimensionality of a kernel space is less than the number of training samples, we propose an alternative to the kernel trick that explicitly maps the input data into a reduced dimensional kernel space. This is easily obtained by the eigenvalue decomposition of the kernel matrix. The proposed method is named the nonlinear projection trick in contrast to the kernel trick. With this technique, the applicability of kernel methods is widened to arbitrary algorithms that do not use the dot product. The equivalence between the kernel trick and the nonlinear projection trick is shown for several conventional kernel methods. In addition, we extend PCA-L1, which uses the L1-norm instead of the L2-norm (or dot product), into a kernel version and show the effectiveness of the proposed approach.
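
    A minimal sketch of the explicit mapping, assuming an RBF kernel and skipping centering for brevity: eigendecomposition of the kernel matrix yields finite-dimensional coordinates whose dot products reproduce the kernel values, and new points are projected with the same map.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def nonlinear_projection(X_train, X_new, gamma=0.5, tol=1e-10):
    """Explicitly embed data into a reduced-dimensional kernel space."""
    K = rbf_kernel(X_train, X_train, gamma=gamma)
    w, V = np.linalg.eigh(K)
    keep = w > tol                       # effective dimensionality is at most n_train
    w, V = w[keep], V[:, keep]
    Y_train = V * np.sqrt(w)             # n_train x d coordinates
    K_new = rbf_kernel(X_new, X_train, gamma=gamma)
    Y_new = K_new @ V / np.sqrt(w)       # project new points with the same map
    return Y_train, Y_new

X, Xq = np.random.rand(20, 3), np.random.rand(5, 3)
Yt, Yq = nonlinear_projection(X, Xq)
# dot products in the embedded space match the kernel values (up to numerical error)
print(np.allclose(Yt @ Yt.T, rbf_kernel(X, X, gamma=0.5)))
```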

  5. FISHER INFORMATION AS A METRIC FOR SUSTAINABLE SYSTEM REGIMES

    EPA Science Inventory

    The important question in sustainability is not whether the world is sustainable, but whether a humanly acceptable regime of the world is sustainable. We propose Fisher Information as a metric for the sustainability of dynamic regimes in complex systems. The quantity now known ...

  6. Detection and Assessment of Ecosystem Regime Shifts from Fisher Information

    EPA Science Inventory

    Ecosystem regime shifts, which are long-term system reorganizations, have profound implications for sustainability. There is a great need for indicators of regime shifts, particularly methods that are applicable to data from real systems. We have developed a form of Fisher info...

  7. Postcolonial Appalachia: Bhabha, Bakhtin, and Diane Gilliam Fisher's "Kettle Bottom"

    ERIC Educational Resources Information Center

    Stevenson, Sheryl

    2006-01-01

    Diane Gilliam Fisher's 2004 award-winning book of poems, "Kettle Bottom," offers students a revealing vantage point for seeing Appalachian regional culture in a postcolonial context. An artful and accessible poetic sequence that was selected as the 2005 summer reading for entering students at Smith College, "Kettle Bottom"…

  8. Fisher, Wall and Wilson on "Punishment": A Critique

    ERIC Educational Resources Information Center

    Wilson, P. S.

    1973-01-01

    Discussion based on "Wilson on the justification of punishment," by M. Fisher and G. Wall, Journal of Moral Education, v1 n3; and "The justification of punishment," by J. Wilson, British Journal of Educational Studies, v19 pt2. (CB)

  9. Fisher information and quantum potential well model for finance

    NASA Astrophysics Data System (ADS)

    Nastasiuk, V. A.

    2015-09-01

    The probability distribution function (PDF) for prices on financial markets is derived by extremization of Fisher information. It is shown how on that basis the quantum-like description for financial markets arises and different financial market models are mapped by quantum mechanical ones.
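
    For reference, the Fisher information functional that is extremized in this kind of approach, written for a one-dimensional price density p(x); this is the standard (translation-family) definition, not the paper's specific derivation:

```latex
I[p] \;=\; \int \frac{\bigl(p'(x)\bigr)^{2}}{p(x)}\,dx
      \;=\; 4\int \bigl(\psi'(x)\bigr)^{2}\,dx ,
\qquad p(x)=\psi(x)^{2}.
```

    Writing the density through a real amplitude psi = sqrt(p) is what makes the extremization, subject to market constraints, take the form of a Schrödinger-type (quantum potential well) problem.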

  10. All about Community: Jane Fisher--New York Public Library

    ERIC Educational Resources Information Center

    Library Journal, 2004

    2004-01-01

    This brief article focuses on the career and accomplishments of Coordinator of Information Services, New York Public Library (NYPL), Jane Fisher. Her professional and academic career has spanned the fields of librarianship, health care, and public administration. Based on her most recent experiences and advancements, she has learned how libraries…

  11. FISHER INFORMATION AS A METRIC FOR SUSTAINABLE REGIMES

    EPA Science Inventory

    The important question in sustainability is not whether the world is sustainable, but whether a humanly acceptable regime of the world is sustainable. We propose Fisher Information as a metric for the sustainability of dynamic regimes in complex systems. The quantity now known ...

  12. FISHER INFORMATION AND DYNAMIC REGIME CHANGES IN ECOLOGICAL SYSTEMS

    EPA Science Inventory

    Fisher Information and Dynamic Regime Changes in Ecological Systems
    Abstract for the 3rd Conference of the International Society for Ecological Informatics
    Audrey L. Mayer, Christopher W. Pawlowski, and Heriberto Cabezas

    The sustainable nature of particular dynamic...

  13. Diffusion Map Kernel Analysis for Target Classification

    DTIC Science & Technology

    2010-06-01

    Gaussian and Polynomial kernels are most familiar from support vector machines. The Laplacian and Rayleigh were introduced previously in [7]. ...Cancer • Clev. Heart: Heart Disease Data Set, Cleveland • Wisc. BC: Wisconsin Breast Cancer Original • Sonar2: Shallow Water Acoustic Toolset [9...the Rayleigh kernel captures the embedding with an average PC of 77.3% and a slightly higher PFA than the Gaussian kernel. For the Wisc. BC

  14. Damage Identification with Linear Discriminant Operators

    SciTech Connect

    Farrar, C.R.; Nix, D.A.; Duffey, T.A.; Cornwell, P.J.; Pardoen, G.C.

    1999-02-08

    This paper explores the application of statistical pattern recognition and machine learning techniques to vibration-based damage detection. First, the damage detection process is described in terms of a problem in statistical pattern recognition. Next, a specific example of a statistical-pattern-recognition-based damage detection process using a linear discriminant operator, "Fisher's Discriminant", is applied to the problem of identifying structural damage in a physical system. Accelerometer time histories are recorded from sensors attached to the system as that system is excited using a measured input. Linear Prediction Coding (LPC) coefficients are utilized to convert the accelerometer time-series data into multi-dimensional samples representing the resonances of the system during a brief segment of the time series. Fisher's discriminant is then used to find the linear projection of the LPC data distributions that best separates data from undamaged and damaged systems. The method is applied to data from concrete bridge columns as the columns are progressively damaged. For this case, the method captures a clear distinction between undamaged and damaged vibration profiles. Further, the method assigns a probability of damage that can be used to rank systems in order of priority for inspection.
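
    A minimal two-class Fisher discriminant sketch, assuming each observation has already been reduced to a feature vector (e.g., a vector of LPC coefficients); the data below are synthetic stand-ins, not the bridge-column measurements.

```python
import numpy as np

def fisher_discriminant(X0, X1):
    """Two-class Fisher discriminant: projection direction and projected samples."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # within-class scatter matrix
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, m1 - m0)     # direction that best separates the projected classes
    return w, X0 @ w, X1 @ w

rng = np.random.default_rng(2)
undamaged = rng.normal(0.0, 1.0, size=(100, 8))   # stand-ins for LPC feature vectors
damaged = rng.normal(0.5, 1.0, size=(100, 8))
w, p0, p1 = fisher_discriminant(undamaged, damaged)
print("projected class means:", p0.mean(), p1.mean())
```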

  15. Assessing Fishers' Support of Striped Bass Management Strategies.

    PubMed

    Murphy, Robert D; Scyphers, Steven B; Grabowski, Jonathan H

    2015-01-01

    Incorporating the perspectives and insights of stakeholders is an essential component of ecosystem-based fisheries management, such that policy strategies should account for the diverse interests of various groups of anglers to enhance their efficacy. Here we assessed fishing stakeholders' perceptions on the management of Atlantic striped bass (Morone saxatilis) and receptiveness to potential future regulations using an online survey of recreational and commercial fishers in Massachusetts and Connecticut (USA). Our results indicate that most fishers harbored adequate to positive perceptions of current striped bass management policies when asked to grade their state's management regime. Yet, subtle differences in perceptions existed between recreational and commercial fishers, as well as across individuals with differing levels of fishing experience, resource dependency, and tournament participation. Recreational fishers in both states were generally supportive or neutral towards potential management actions including slot limits (71%) and mandated circle hooks to reduce mortality of released fish (74%), but less supportive of reduced recreational bag limits (51%). Although commercial anglers were typically less supportive of management changes than their recreational counterparts, the majority were still supportive of slot limits (54%) and mandated use of circle hooks (56%). Our study suggests that both recreational and commercial fishers are generally supportive of additional management strategies aimed at sustaining healthy striped bass populations and agree on a variety of strategies. However, both stakeholder groups were less supportive of harvest reductions, which is the most direct measure of reducing mortality available to fisheries managers. By revealing factors that influence stakeholders' support or willingness to comply with management strategies, studies such as ours can help managers identify potential stakeholder support for or conflicts that may

  16. Kernel Learning of Histogram of Local Gabor Phase Patterns for Face Recognition

    NASA Astrophysics Data System (ADS)

    Zhang, Baochang; Wang, Zongli; Zhong, Bineng

    2008-12-01

    This paper proposes a new face recognition method, named kernel learning of histogram of local Gabor phase pattern (K-HLGPP), which is based on Daugman's method for iris recognition and the local XOR pattern (LXP) operator. Unlike traditional Gabor usage exploiting the magnitude part in face recognition, we encode the Gabor phase information for face classification by the quadrant bit coding (QBC) method. Two schemes are proposed for face recognition. One is based on the nearest-neighbor classifier with chi-square as the similarity measurement, and the other applies kernel discriminant analysis to HLGPP (K-HLGPP) using histogram intersection and Gaussian-weighted chi-square kernels. The comparative experiments show that K-HLGPP achieves a higher recognition rate than other well-known face recognition systems on the large-scale standard FERET, FERET200, and CAS-PEAL-R1 databases.
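
    A minimal sketch of the two similarity measures named here, histogram intersection and a Gaussian-weighted (exponential) chi-square kernel, for nonnegative histograms; the gamma value and the toy histograms are placeholders.

```python
import numpy as np

def histogram_intersection(h1, h2):
    return np.minimum(h1, h2).sum()

def chi2_gaussian_kernel(h1, h2, gamma=1.0, eps=1e-12):
    """exp(-gamma * chi-square distance) between two histograms."""
    chi2 = np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))
    return np.exp(-gamma * chi2)

a = np.array([0.2, 0.5, 0.3])
b = np.array([0.3, 0.4, 0.3])
print(histogram_intersection(a, b), chi2_gaussian_kernel(a, b))
```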

  17. Molecular Hydrodynamics from Memory Kernels

    NASA Astrophysics Data System (ADS)

    Lesnicki, Dominika; Vuilleumier, Rodolphe; Carof, Antoine; Rotenberg, Benjamin

    2016-04-01

    The memory kernel for a tagged particle in a fluid, computed from molecular dynamics simulations, decays algebraically as t^(-3/2). We show how the hydrodynamic Basset-Boussinesq force naturally emerges from this long-time tail and generalize the concept of hydrodynamic added mass. This mass term is negative in the present case of a molecular solute, which is at odds with incompressible hydrodynamics predictions. Lastly, we discuss the various contributions to the friction, the associated time scales, and the crossover between the molecular and hydrodynamic regimes upon increasing the solute radius.

  18. Data-Driven Hierarchical Structure Kernel for Multiscale Part-Based Object Recognition

    PubMed Central

    Wang, Botao; Xiong, Hongkai; Jiang, Xiaoqian; Zheng, Yuan F.

    2017-01-01

    Detecting generic object categories in images and videos is a fundamental issue in computer vision. However, it faces challenges from inter- and intraclass diversity, as well as distortions caused by viewpoints, poses, deformations, and so on. To handle object variations, this paper constructs a structure kernel and proposes a multiscale part-based model incorporating the discriminative power of kernels. The structure kernel measures the resemblance of part-based objects in three aspects: 1) the global similarity term to measure the resemblance of the global visual appearance of relevant objects; 2) the part similarity term to measure the resemblance of the visual appearance of distinctive parts; and 3) the spatial similarity term to measure the resemblance of the spatial layout of parts. In essence, the deformation of parts in the structure kernel is penalized in a multiscale space with respect to horizontal displacement, vertical displacement, and scale difference. Part similarities are combined with different weights, which are optimized efficiently to maximize the intraclass similarities and minimize the interclass similarities by the normalized stochastic gradient ascent algorithm. In addition, the parameters of the structure kernel are learned during the training process with regard to the distribution of the data in a more discriminative way. With flexible part sizes on scale and displacement, it can be more robust to the intraclass variations, poses, and viewpoints. Theoretical analysis and experimental evaluations demonstrate that the proposed multiscale part-based representation model with structure kernel exhibits accurate and robust performance, and outperforms state-of-the-art object classification approaches. PMID:24808345

  19. Discrimination of Maize Haploid Seeds from Hybrid Seeds Using Vis Spectroscopy and Support Vector Machine Method.

    PubMed

    Liu, Jin; Guo, Ting-ting; Li, Hao-chuan; Jia, Shi-qiang; Yan, Yan-lu; An, Dong; Zhang, Yao; Chen, Shao-jiang

    2015-11-01

    Doubled haploid (DH) lines are routinely applied in the hybrid maize breeding programs of many institutes and companies for their advantages of complete homozygosity and short breeding cycle length. A key issue in this approach is an efficient screening system to identify haploid kernels from the hybrid kernels crossed with the inducer. At present, haploid kernel selection is carried out manually using the "red-crown" kernel trait (the haploid kernel has a non-pigmented embryo and pigmented endosperm) controlled by the R1-nj gene. Manual selection is time-consuming and unreliable. Furthermore, the color of the kernel embryo is concealed by the pericarp. Here, we establish a novel approach for identifying maize haploid kernels based on visible (Vis) spectroscopy and support vector machine (SVM) pattern recognition technology. The diffuse transmittance spectra of individual kernels (141 haploid kernels and 141 hybrid kernels from 9 genotypes) were collected using a portable UV-Vis spectrometer and integrating sphere. The raw spectral data were preprocessed using smoothing and vector normalization methods. The desired feature wavelengths were selected based on the results of the Kolmogorov-Smirnov test. The wavelengths with p values above 0.05 were eliminated because the distributions of absorbance data in these wavelengths show no significant difference between haploid and hybrid kernels. Principal component analysis was then performed to reduce the number of variables. The SVM model was evaluated by 9-fold cross-validation. In each round, samples of one genotype were used as the testing set, while those of other genotypes were used as the training set. The mean rate of correct discrimination was 92.06%. This result demonstrates the feasibility of using Vis spectroscopy to identify haploid maize kernels. The method would help develop a rapid and accurate automated screening system for haploid kernels.
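
    A minimal sketch of the described pipeline (smoothing, vector normalization, Kolmogorov-Smirnov wavelength screening, PCA, SVM, and leave-one-genotype-out validation), with synthetic spectra and hypothetical genotype labels rather than the study's measurements or settings.

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.stats import ks_2samp
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.normal(size=(282, 200))                 # synthetic spectra: 282 kernels x 200 wavelengths
X[141:, 80:120] += 0.5                          # give the hybrid class a synthetic spectral difference
y = np.repeat([0, 1], 141)                      # 0 = haploid, 1 = hybrid (placeholder labels)
genotype = np.tile(np.arange(9), 32)[:282]      # hypothetical genotype grouping

X = savgol_filter(X, window_length=11, polyorder=2, axis=1)   # smoothing
X = X / np.linalg.norm(X, axis=1, keepdims=True)              # vector normalization

# keep wavelengths where the two classes differ (two-sample KS test, p < 0.05)
pvals = np.array([ks_2samp(X[y == 0, j], X[y == 1, j]).pvalue for j in range(X.shape[1])])
X_sel = X[:, pvals < 0.05]

model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(model, X_sel, y, groups=genotype, cv=LeaveOneGroupOut())
print("per-genotype accuracy:", scores.round(2))
```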

  20. 77 FR 75670 - Importer of Controlled Substances; Notice of Registration; Fisher Clinical Services,Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-21

    ... Enforcement Administration Importer of Controlled Substances; Notice of Registration; Fisher Clinical Services... FR 60143, Fisher Clinical Services, Inc., 7554 Schantz Road, Allentown, Pennsylvania 18106, made... listed substances for analytical research and clinical trials. No comments or objections have...

  1. Astronaut Anna Fisher practices control of the RMS in a trainer

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Astronaut Anna Lee Fisher, mission specialist for 51-A, practices control of the remote manipulator system (RMS) at a special trainer at JSC. Dr. Fisher is pictured in the manipulator development facility (MDF) of JSC's Shuttle mockup and integration laboratory.

  2. 77 FR 72409 - Importer of Controlled Substances; Notice of Application; Fisher Clinical Services, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-05

    ... Enforcement Administration Importer of Controlled Substances; Notice of Application; Fisher Clinical Services..., 2012, Fisher Clinical Services, Inc., 7554 Schantz Road, Allentown, Pennsylvania 18106, made application to the Drug Enforcement Administration (DEA) for registration as an importer of levorphanol...

  3. 77 FR 67396 - Importer of Controlled Substances; Notice of Application, Fisher Clinical Services, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... Enforcement Administration Importer of Controlled Substances; Notice of Application, Fisher Clinical Services..., 2012, Fisher Clinical Services, Inc., 7554 Schantz Road, ] Allentown, Pennsylvania 18106, made application to the Drug Enforcement Administration (DEA) for registration as an importer of Tapentadol...

  4. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of the separated halves of an entire pecan kernel with not more than one-eighth of its original volume...

  5. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of the separated halves of an entire pecan kernel with not more than...

  6. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of the separated halves of an entire pecan kernel with not more than...

  7. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of the separated halves of an entire pecan kernel with not more than one-eighth of its original volume...

  8. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of the separated halves of an entire pecan kernel with not more than one-eighth of its original volume...

  9. Bergman Kernel from Path Integral

    NASA Astrophysics Data System (ADS)

    Douglas, Michael R.; Klevtsov, Semyon

    2010-01-01

    We rederive the expansion of the Bergman kernel on Kähler manifolds developed by Tian, Yau, Zelditch, Lu and Catlin, using path integral and perturbation theory, and generalize it to supersymmetric quantum mechanics. One physics interpretation of this result is as an expansion of the projector of wave functions on the lowest Landau level, in the special case that the magnetic field is proportional to the Kähler form. This is relevant for the quantum Hall effect in curved space, and for its higher dimensional generalizations. Other applications include the theory of coherent states, the study of balanced metrics, noncommutative field theory, and a conjecture on metrics in black hole backgrounds discussed in [24]. We give a short overview of these various topics. From a conceptual point of view, this expansion is noteworthy as it is a geometric expansion, somewhat similar to the DeWitt-Seeley-Gilkey et al short time expansion for the heat kernel, but in this case describing the long time limit, without depending on supersymmetry.

  10. Kernel current source density method.

    PubMed

    Potworowski, Jan; Jakuczun, Wit; Lȩski, Szymon; Wójcik, Daniel

    2012-02-01

    Local field potentials (LFP), the low-frequency part of extracellular electrical recordings, are a measure of the neural activity reflecting dendritic processing of synaptic inputs to neuronal populations. To localize synaptic dynamics, it is convenient, whenever possible, to estimate the density of transmembrane current sources (CSD) generating the LFP. In this work, we propose a new framework, the kernel current source density method (kCSD), for nonparametric estimation of CSD from LFP recorded from arbitrarily distributed electrodes using kernel methods. We test specific implementations of this framework on model data measured with one-, two-, and three-dimensional multielectrode setups. We compare these methods with the traditional approach through numerical approximation of the Laplacian and with the recently developed inverse current source density methods (iCSD). We show that iCSD is a special case of kCSD. The proposed method opens up new experimental possibilities for CSD analysis from existing or new recordings on arbitrarily distributed electrodes (not necessarily on a grid), which can be obtained in extracellular recordings of single unit activity with multiple electrodes.

  11. KERNEL PHASE IN FIZEAU INTERFEROMETRY

    SciTech Connect

    Martinache, Frantz

    2010-11-20

    The detection of high contrast companions at small angular separation appears feasible in conventional direct images using the self-calibration properties of interferometric observable quantities. The friendly notion of closure phase, which is key to the recent observational successes of non-redundant aperture masking interferometry used with adaptive optics, appears to be one example of a wide family of observable quantities that are not contaminated by phase noise. In the high-Strehl regime, soon to be available thanks to the coming generation of extreme adaptive optics systems on ground-based telescopes, and already available from space, closure phase like information can be extracted from any direct image, even taken with a redundant aperture. These new phase-noise immune observable quantities, called kernel phases, are determined a priori from the knowledge of the geometry of the pupil only. Re-analysis of archive data acquired with the Hubble Space Telescope NICMOS instrument using this new kernel-phase algorithm demonstrates the power of the method as it clearly detects and locates with milliarcsecond precision a known companion to a star at angular separation less than the diffraction limit.

  12. Ranking Support Vector Machine with Kernel Approximation

    PubMed Central

    Dou, Yong

    2017-01-01

    Learning-to-rank algorithms have become important in recent years due to their successful application in information retrieval, recommender systems, computational biology, and so forth. Ranking support vector machine (RankSVM) is one of the state-of-the-art ranking models and has been favorably used. Nonlinear RankSVM (RankSVM with nonlinear kernels) can give higher accuracy than linear RankSVM (RankSVM with a linear kernel) for complex nonlinear ranking problems. However, the learning methods for nonlinear RankSVM are still time-consuming because of the calculation of the kernel matrix. In this paper, we propose a fast ranking algorithm based on kernel approximation to avoid computing the kernel matrix. We explore two types of kernel approximation methods, namely, the Nyström method and random Fourier features. A primal truncated Newton method is used to optimize the pairwise L2-loss (squared Hinge-loss) objective function of the ranking model after the nonlinear kernel approximation. Experimental results demonstrate that our proposed method achieves a much faster training speed than kernel RankSVM and comparable or better performance than state-of-the-art ranking algorithms. PMID:28293256

  13. Ranking Support Vector Machine with Kernel Approximation.

    PubMed

    Chen, Kai; Li, Rongchun; Dou, Yong; Liang, Zhengfa; Lv, Qi

    2017-01-01

    Learning-to-rank algorithms have become important in recent years due to their successful application in information retrieval, recommender systems, computational biology, and so forth. Ranking support vector machine (RankSVM) is one of the state-of-the-art ranking models and has been favorably used. Nonlinear RankSVM (RankSVM with nonlinear kernels) can give higher accuracy than linear RankSVM (RankSVM with a linear kernel) for complex nonlinear ranking problems. However, the learning methods for nonlinear RankSVM are still time-consuming because of the calculation of the kernel matrix. In this paper, we propose a fast ranking algorithm based on kernel approximation to avoid computing the kernel matrix. We explore two types of kernel approximation methods, namely, the Nyström method and random Fourier features. A primal truncated Newton method is used to optimize the pairwise L2-loss (squared Hinge-loss) objective function of the ranking model after the nonlinear kernel approximation. Experimental results demonstrate that our proposed method achieves a much faster training speed than kernel RankSVM and comparable or better performance than state-of-the-art ranking algorithms.
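
    A minimal sketch of one of the two approximations named here, random Fourier features for the RBF kernel; the feature dimension and bandwidth are placeholders, and the Nyström alternative is not shown.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def random_fourier_features(X, D=500, gamma=0.5, seed=0):
    """Map X so that z(x) . z(y) approximates exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], D))  # spectral samples of the RBF kernel
    b = rng.uniform(0, 2 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = np.random.rand(100, 5)
Z = random_fourier_features(X)
err = np.abs(Z @ Z.T - rbf_kernel(X, gamma=0.5)).max()
print("max absolute kernel approximation error:", err)
```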

  14. Improving the Bandwidth Selection in Kernel Equating

    ERIC Educational Resources Information Center

    Andersson, Björn; von Davier, Alina A.

    2014-01-01

    We investigate the current bandwidth selection methods in kernel equating and propose a method based on Silverman's rule of thumb for selecting the bandwidth parameters. In kernel equating, the bandwidth parameters have previously been obtained by minimizing a penalty function. This minimization process has been criticized by practitioners…
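
    For reference, Silverman's rule of thumb in its common Gaussian-kernel form; a sketch of the generic rule, not the authors' exact adaptation to kernel equating.

```python
import numpy as np

def silverman_bandwidth(x):
    """h = 0.9 * min(std, IQR / 1.349) * n^(-1/5)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    sigma = min(x.std(ddof=1), iqr / 1.349)
    return 0.9 * sigma * n ** (-1 / 5)

scores = np.random.default_rng(4).normal(50, 10, size=1000)
print("bandwidth:", silverman_bandwidth(scores))
```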

  15. Fisheries productivity and its effects on the consumption of animal protein and food sharing of fishers' and non-fishers' families.

    PubMed

    da Costa, Mikaelle Kaline Bezerra; de Melo, Clarissy Dinyz; Lopes, Priscila Fabiana Macedo

    2014-01-01

    This study compared the consumption of animal protein and food sharing among fishers' and non-fishers' families of the northeastern Brazilian coast. The diet of these families was registered through the 24-hour-recall method during 10 consecutive days in January (good fishing season) and June (bad fishing season) 2012. Fish consumption was not different between the fishers' and non-fishers' families, but varied according to fisheries productivity to both groups. Likewise, food sharing was not different between the two groups, but food was shared more often when fisheries were productive. Local availability of fish, more than a direct dependency on fisheries, determines local patterns of animal protein consumption, but a direct dependency on fisheries exposes families to a lower-quality diet in less-productive seasons. As such, fisheries could shape and affect the livelihoods of coastal villages, including fishers' and non-fishers' families.

  16. 75 FR 62423 - Barnstead Thermolyne Corporation, a Subsidiary of Thermo Fisher Scientific, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-08

    ...] Barnstead Thermolyne Corporation, a Subsidiary of Thermo Fisher Scientific, Including On-Site Leased Workers... subsidiary of Thermo Fisher Scientific, including on- site leased workers from Sedona Staffing, Dubuque, Iowa... Thermolyne Corporation, a subsidiary of Thermo Fisher Scientific. The Department has determined that...

  17. The context-tree kernel for strings.

    PubMed

    Cuturi, Marco; Vert, Jean-Philippe

    2005-10-01

    We propose a new kernel for strings which borrows ideas and techniques from information theory and data compression. This kernel can be used in combination with any kernel method, in particular Support Vector Machines for string classification, with notable applications in proteomics. By using a Bayesian averaging framework with conjugate priors on a class of Markovian models known as probabilistic suffix trees or context-trees, we compute the value of this kernel in linear time and space while only using the information contained in the spectrum of the considered strings. This is ensured through an adaptation of a compression method known as the context-tree weighting algorithm. Encouraging classification results are reported on a standard protein homology detection experiment, showing that the context-tree kernel performs well with respect to other state-of-the-art methods while using no biological prior knowledge.

  18. Kernel method for corrections to scaling.

    PubMed

    Harada, Kenji

    2015-07-01

    Scaling analysis, in which one infers scaling exponents and a scaling function in a scaling law from given data, is a powerful tool for determining universal properties of critical phenomena in many fields of science. However, there are corrections to scaling in many cases, and then the inference problem becomes ill-posed by an uncontrollable irrelevant scaling variable. We propose a new kernel method based on Gaussian process regression to fix this problem generally. We test the performance of the new kernel method for some example cases. In all cases, when the precision of the example data increases, inference results of the new kernel method correctly converge. Because there is no limitation in the new kernel method for the scaling function even with corrections to scaling, unlike in the conventional method, the new kernel method can be widely applied to real data in critical phenomena.

  19. Improved dynamical scaling analysis using the kernel method for nonequilibrium relaxation.

    PubMed

    Echinaka, Yuki; Ozeki, Yukiyasu

    2016-10-01

    The dynamical scaling analysis for the Kosterlitz-Thouless transition in the nonequilibrium relaxation method is improved by the use of Bayesian statistics and the kernel method. This allows data to be fitted to a scaling function without using any parametric model function, which makes the results more reliable and reproducible and enables automatic and faster parameter estimation. Applying this method, the bootstrap method is introduced and a numerical discrimination for the transition type is proposed.

  20. A lattice Boltzmann model for the Burgers-Fisher equation.

    PubMed

    Zhang, Jianying; Yan, Guangwu

    2010-06-01

    A lattice Boltzmann model is developed for the one- and two-dimensional Burgers-Fisher equation based on the method of the higher-order moment of equilibrium distribution functions and a series of partial differential equations in different time scales. In order to obtain the two-dimensional Burgers-Fisher equation, vector sigma(j) has been used. And in order to overcome the drawbacks of "error rebound," a new assumption of additional distribution is presented, where two additional terms, in first order and second order separately, are used. Comparisons with the results obtained by other methods reveal that the numerical solutions obtained by the proposed method converge to exact solutions. The model under new assumption gives better results than that with second order assumption.
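
    For reference, the one-dimensional Burgers-Fisher equation in a common form (the paper's notation and scaling may differ):

```latex
\frac{\partial u}{\partial t} + \alpha\, u \frac{\partial u}{\partial x}
  = \frac{\partial^{2} u}{\partial x^{2}} + \beta\, u\,(1-u).
```

    It combines the nonlinear convection term of Burgers' equation with the logistic reaction term of Fisher's equation; setting alpha = 0 recovers the Fisher (KPP) equation, and beta = 0 recovers Burgers' equation with unit viscosity.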

  1. Scombroid fish poisoning. Underreporting and prevention among noncommercial recreational fishers.

    PubMed Central

    Gellert, G A; Ralls, J; Brown, C; Huston, J; Merryman, R

    1992-01-01

    Food-borne diseases, including those caused by seafood products, are common and greatly underreported sources of morbidity. In this article we review the epidemiology of scombroid fish poisoning and its possible relationship to the noncommercial and recreational catch and sale of fish. More than 20% of all fish sold in the United States is caught by sport fishers, and outbreaks of scombroid fish poisoning have involved improperly handled fish from private catches. We report an outbreak of scombroid fish poisoning among recreational fishers in California. The unregulated sale of recreationally caught fish for consumption and the prevention of scombrotoxism are discussed from the perspectives of public health agencies, clinicians, and the fishing public. Scientific and policy issues that require further attention are highlighted. PMID:1475947

  2. Takotsubo cardiomyopathy associated with Miller-Fisher syndrome.

    PubMed

    Gill, Dalvir; Liu, Kan

    2016-12-22

    A 51-year-old woman presented with progressive paresthesia, numbness of the lower extremities, double vision, and trouble walking. Physical exam was remarkable for areflexia and ptosis. Her initial EKG showed nonspecific ST segment changes, and her Troponin T was elevated to 0.41 ng/mL, peaking at 0.66 ng/mL. Echocardiogram showed a depressed left ventricular ejection fraction of 35%, with a severely hypokinetic anterior wall and a severely hypokinetic left ventricular apex. An EMG nerve conduction study showed severely decreased conduction velocity and prolonged distal latency in all nerves, consistent with demyelinating disease. She was treated with 5 days of intravenous immunoglobulin therapy, after which she showed significant improvement in strength in her lower extremities. An echocardiogram repeated 4 days later showed an improved left ventricular ejection fraction of 55% and no left ventricular wall motion abnormalities. Takotsubo cardiomyopathy is a rare complication of Miller-Fisher syndrome, and a literature review did not reveal any prior cases. Miller-Fisher syndrome is an autoimmune process that affects the peripheral nervous system, causing autonomic dysfunction which may involve the heart. Because the autonomic dysfunction of Miller-Fisher syndrome can lead to arrhythmias, blood pressure changes, acute coronary syndrome, and myocarditis, Takotsubo cardiomyopathy can be difficult to distinguish. Treatment of Takotsubo cardiomyopathy is supportive; beta-blockers and angiotensin-converting enzyme inhibitors are recommended until the left ventricular ejection fraction improves. Takotsubo cardiomyopathy is a rare complication during the acute phase of Miller-Fisher syndrome and must be distinguished from autonomic dysfunction, as the two diagnoses have different approaches to treatment.

  3. Traditional botanical knowledge of artisanal fishers in southern Brazil

    PubMed Central

    2013-01-01

    Background This study characterized the botanical knowledge of artisanal fishers of the Lami community, Porto Alegre, southern Brazil based on answers to the following question: Is the local botanical knowledge of the artisanal fishers of the rural-urban district of Lami still active, even since the district’s insertion into the metropolitan region of Porto Alegre? Methods This region, which contains a mosaic of urban and rural areas, hosts the Lami Biological Reserve (LBR) and a community of 13 artisanal fisher families. Semi-structured interviews were conducted with 15 fishers, complemented by participatory observation techniques and free-lists; in these interviews, the species of plants used by the community and their indicated uses were identified. Results A total of 111 species belonging to 50 families were identified. No significant differences between the diversities of native and exotic species were found. Seven use categories were reported: medicinal (49%), human food (23.2%), fishing (12.3%), condiments (8%), firewood (5%), mystical purposes (1.45%), and animal food (0.72%). The medicinal species with the highest level of agreement regarding their main uses (AMUs) were Aloe arborescens Mill., Plectranthus barbatus Andrews, Dodonaea viscosa Jacq., Plectranthus ornatus Codd, Eugenia uniflora L., and Foeniculum vulgare Mill. For illness and diseases, most plants were used for problems with the digestive system (20 species), followed by the respiratory system (16 species). This community possesses a wide botanical knowledge, especially of medicinal plants, comparable to observations made in other studies with fishing communities in coastal areas of the Atlantic Forest of Brazil. Conclusions Ethnobotanical studies in rural-urban areas contribute to preserving local knowledge and provide information that aids in conserving the remaining ecosystems in the region. PMID:23898973

  4. R A Fisher, design theory, and the Indian connection.

    PubMed

    Rau, A R P

    2009-09-01

    Design Theory, a branch of mathematics, was born out of the experimental statistics research of the population geneticist R A Fisher and of Indian mathematical statisticians in the 1930s. The field combines elements of combinatorics, finite projective geometries, Latin squares, and a variety of further mathematical structures, brought together in surprising ways. This essay will present these structures and ideas as well as how the field came together, in itself an interesting story.

  5. Fisher equation for anisotropic diffusion: simulating South American human dispersals.

    PubMed

    Martino, Luis A; Osella, Ana; Dorso, Claudio; Lanata, José L

    2007-09-01

    The Fisher equation is commonly used to model population dynamics. It describes reaction-diffusion processes, considering both population growth and a diffusion mechanism. Some results have been reported on modeling human dispersion, always assuming isotropic diffusion. Nevertheless, it is well known that dispersion depends not only on the characteristics of the habitats where individuals are but also on the properties of the places where they intend to move, so isotropic approaches cannot adequately reproduce the evolution of the wave of advance of populations. Solutions to a Fisher equation are difficult to obtain for complex geometries, especially when anisotropy has to be considered, and so few studies have been conducted in this direction. With this scope in mind, we present in this paper a solution for a Fisher equation that introduces anisotropy. We apply a finite difference method using the Crank-Nicolson approximation and analyze the results as a function of the characteristic parameters. Finally, this methodology is applied to model South American human dispersal.
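
    The anisotropic generalization amounts to replacing the scalar diffusion coefficient of the Fisher (KPP) equation with a spatially varying diffusion tensor; a common form (generic notation, not necessarily the authors') is:

```latex
\frac{\partial u}{\partial t}
  = \nabla \cdot \bigl( \mathbf{D}(\mathbf{x})\, \nabla u \bigr)
  + r\, u \left( 1 - \frac{u}{K} \right),
```

    where D(x) is a symmetric positive-definite tensor encoding direction-dependent dispersal, r is the growth rate, and K the carrying capacity; isotropic diffusion is recovered when D(x) reduces to a constant multiple of the identity.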

  6. Sustainable theory of a logistic model - Fisher information approach.

    PubMed

    Al-Saffar, Avan; Kim, Eun-Jin

    2017-03-01

    Information theory provides a useful tool to understand the evolution of complex nonlinear systems and their sustainability. In particular, Fisher information has been evoked as a useful measure of sustainability and the variability of dynamical systems including self-organising systems. By utilising Fisher information, we investigate the sustainability of the logistic model for different perturbations in the positive and/or negative feedback. Specifically, we consider different oscillatory modulations in the parameters for positive and negative feedback and investigate their effect on the evolution of the system and Probability Density Functions (PDFs). Depending on the relative time scale of the perturbation to the response time of the system (the linear growth rate), we demonstrate the maintenance of the initial condition for a long time, manifested by a broad bimodal PDF. We present the analysis of Fisher information in different cases and elucidate its implications for the sustainability of population dynamics. We also show that a purely oscillatory growth rate can lead to a finite amplitude solution while self-organisation of these systems can break down with an exponentially growing solution due to the periodic fluctuations in negative feedback.

  7. Fisher Pierce products for improving distribution system reliability

    SciTech Connect

    1994-12-31

    The challenges facing the electric power utility today in the 1990s have changed significantly from those of even 10 years ago. The proliferation of automation and the personal computer have heightened the requirements and demands put on the electric distribution system. Today's customers, fighting to compete in a world market, demand quality, uninterrupted power service. Privatization and the concept of unregulated competition require utilities to streamline to minimize system support costs and optimize power delivery efficiency. Fisher Pierce, serving the electric utility industry for over 50 years, offers a line of products to assist utilities in meeting these challenges. The Fisher Pierce family of products provides tools for the electric utility to exceed customer service demands. A full line of fault-indicating devices is offered to expedite system power restoration both locally and in conjunction with SCADA systems. Fisher Pierce is the largest supplier of roadway lighting controls, manufacturing on a 6 million dollar automated line assuring the highest quality in the world. The distribution system capacitor control line offers intelligent local or radio-linked switching control to maintain system voltage and Var levels for quality and cost-efficient power delivery under varying customer loads. Additional products, designed to authenticate revenue metering calibration and verify on-site metering service wiring, help optimize the profitability of the utility, assuring continuous system service improvements for their customers.

  8. Bayesian Kernel Mixtures for Counts.

    PubMed

    Canale, Antonio; Dunson, David B

    2011-12-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online.

  9. MULTIVARIATE KERNEL PARTITION PROCESS MIXTURES

    PubMed Central

    Dunson, David B.

    2013-01-01

    Mixtures provide a useful approach for relaxing parametric assumptions. Discrete mixture models induce clusters, typically with the same cluster allocation for each parameter in multivariate cases. As a more flexible approach that facilitates sparse nonparametric modeling of multivariate random effects distributions, this article proposes a kernel partition process (KPP) in which the cluster allocation varies for different parameters. The KPP is shown to be the driving measure for a multivariate ordered Chinese restaurant process that induces a highly-flexible dependence structure in local clustering. This structure allows the relative locations of the random effects to inform the clustering process, with spatially-proximal random effects likely to be assigned the same cluster index. An exact block Gibbs sampler is developed for posterior computation, avoiding truncation of the infinite measure. The methods are applied to hormone curve data, and a dependent KPP is proposed for classification from functional predictors. PMID:24478563

  10. Putting Priors in Mixture Density Mercer Kernels

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2004-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn the kernel function directly from data, rather than using predefined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS). The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code. The results show that the Mixture Density Mercer-Kernel described here outperforms tree-based classification in distinguishing high-redshift galaxies from low-redshift galaxies by approximately 16% on test data, bagged trees by approximately 7%, and bagged trees built on a much larger sample of data by approximately 2%.

  11. Perturbed kernel approximation on homogeneous manifolds

    NASA Astrophysics Data System (ADS)

    Levesley, J.; Sun, X.

    2007-02-01

    Current methods for interpolation and approximation within a native space rely heavily on the strict positive-definiteness of the underlying kernels. If the domains of approximation are the unit spheres in Euclidean spaces, then zonal kernels (kernels that are invariant under the orthogonal group action) are strongly favored. In the implementation of these methods to handle real-world problems, however, some or all of the symmetries and positive-definiteness may be lost in digitalization due to small random errors that occur unpredictably during various stages of the execution. Perturbation analysis is therefore needed to address the stability problem encountered. In this paper we study two kinds of perturbations of positive-definite kernels: small random perturbations and perturbations by Dunkl's intertwining operators [C. Dunkl, Y. Xu, Orthogonal polynomials of several variables, Encyclopedia of Mathematics and Its Applications, vol. 81, Cambridge University Press, Cambridge, 2001]. We show that with some reasonable assumptions, a small random perturbation of a strictly positive-definite kernel can still provide vehicles for interpolation and enjoy the same error estimates. We examine the actions of the Dunkl intertwining operators on zonal (strictly) positive-definite kernels on spheres. We show that the resulting kernels are (strictly) positive-definite on spheres of lower dimensions.

  12. A Physical Basis of the Tully-Fisher Relation

    NASA Astrophysics Data System (ADS)

    Rhee, M.-H.

    1996-04-01

    This thesis consists of two main parts. The first part presents a series of theoretical and interpretative studies on the subject of the Tully-Fisher relation. We mainly focus on a better understanding of the physical basis of the Tully-Fisher relation. The second part of the thesis presents the results of the new Westerbork HI observations of spiral galaxies. We study the HI properties and the Tully-Fisher relation of spiral and irregular galaxies based on these observations. In the first part, we first analyse the dependence of the luminous mass-to-light ratio of spiral galaxies on the present star formation rate, and find that galaxies with high present star formation rates have low luminous mass-to-light ratios, presumably as a result of the enhanced luminosity. On this basis we argue that variations in the stellar content of galaxies constitute a major source of intrinsic scatter in the Tully-Fisher relation. We have also analysed the relation between the (maximum) luminous mass and circular velocity, and find it to have small scatter. We therefore propose that the physical basis of the Tully-Fisher relation lies in a relationship between the luminous mass and circular velocity, in combination with a 'well-behaved' relation between luminous and dark matter (Chapter 2). We show that the errors in the Tully-Fisher relation are intercorrelated through inclination corrections, resulting in a lower combined scatter than the individual errors. Ignoring this effect could result in an underestimation of the intrinsic scatter in the Tully-Fisher relation. We discuss a compensation effect between luminosity enhancement due to stellar activity and luminosity dimming due to dust, which could result in a small apparent scatter in the TF relation because high star formation activity is associated with high dust content. We argue that the Tully-Fisher relation for the low surface brightness galaxies and IRAS mini-survey galaxies could also be the results of some compensation

  13. Retrieval of Brain Tumors by Adaptive Spatial Pooling and Fisher Vector Representation

    PubMed Central

    Huang, Meiyan; Huang, Wei; Jiang, Jun; Zhou, Yujia; Yang, Ru; Zhao, Jie; Feng, Yanqiu; Feng, Qianjin; Chen, Wufan

    2016-01-01

    Content-based image retrieval (CBIR) techniques have currently gained increasing popularity in the medical field because they can use numerous and valuable archived images to support clinical decisions. In this paper, we concentrate on developing a CBIR system for retrieving brain tumors in T1-weighted contrast-enhanced MRI images. Specifically, when the user roughly outlines the tumor region of a query image, brain tumor images in the database of the same pathological type are expected to be returned. We propose a novel feature extraction framework to improve the retrieval performance. The proposed framework consists of three steps. First, we augment the tumor region and use the augmented tumor region as the region of interest to incorporate informative contextual information. Second, the augmented tumor region is split into subregions by an adaptive spatial division method based on intensity orders; within each subregion, we extract raw image patches as local features. Third, we apply the Fisher kernel framework to aggregate the local features of each subregion into a respective single vector representation and concatenate these per-subregion vector representations to obtain an image-level signature. After feature extraction, a closed-form metric learning algorithm is applied to measure the similarity between the query image and database images. Extensive experiments are conducted on a large dataset of 3604 images with three types of brain tumors, namely, meningiomas, gliomas, and pituitary tumors. The mean average precision can reach 94.68%. Experimental results demonstrate the power of the proposed algorithm against some related state-of-the-art methods on the same dataset. PMID:27273091
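
    The Fisher-kernel aggregation step can be illustrated with a minimal sketch (not the authors' implementation): local patch descriptors are soft-assigned to a Gaussian mixture model, and the normalized gradients with respect to the component means form the per-subregion signature. All names, sizes, and parameters below are illustrative assumptions.

```python
# Minimal Fisher-vector sketch (mean-gradient component only) with a diagonal-
# covariance GMM; illustrative, not the paper's implementation.
import numpy as np
from sklearn.mixture import GaussianMixture

def fisher_vector(local_feats, gmm):
    """Aggregate local descriptors (N x D) into a single (K*D,) signature."""
    gamma = gmm.predict_proba(local_feats)              # (N, K) soft assignments
    mu = gmm.means_                                     # (K, D)
    sigma = np.sqrt(gmm.covariances_)                   # (K, D) for a diag GMM
    w = gmm.weights_                                    # (K,)
    n = local_feats.shape[0]
    diff = (local_feats[:, None, :] - mu[None, :, :]) / sigma[None, :, :]
    fv = (gamma[:, :, None] * diff).sum(axis=0) / (n * np.sqrt(w)[:, None])
    fv = fv.ravel()
    fv = np.sign(fv) * np.sqrt(np.abs(fv))              # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)            # L2 normalization

# Random stand-in "patches"; a real pipeline would extract raw image patches
# from each tumor subregion and concatenate the per-subregion signatures.
rng = np.random.default_rng(0)
gmm = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(rng.normal(size=(500, 16)))
signature = fisher_vector(rng.normal(size=(120, 16)), gmm)
print(signature.shape)                                  # (8 * 16,)
```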

  14. Sugar Profile of Kernels as a Marker of Origin and Ripening Time of Peach (Prunus persicae L.).

    PubMed

    Stanojević, Marija; Trifković, Jelena; Akšić, Milica Fotirić; Rakonjac, Vera; Nikolić, Dragan; Šegan, Sandra; Milojković-Opsenica, Dušanka

    2015-12-01

    Large amounts of fruit seeds, especially peach, are discarded annually in juice- or conserve-producing industries, which is a potential waste of a valuable resource and a serious disposal problem. Given that peach seeds can be obtained as a byproduct from processing companies, their exploitation should be greater and, consequently, more information on cultivars' kernels and their composition is required. A total of 25 samples of kernels from various peach germplasm (including commercial cultivars, perspective hybrids and vineyard peach accessions) differing in origin and ripening time were characterized by evaluation of their sugar composition. Twenty characteristic carbohydrates and sugar alcohols were determined and quantified using high-performance anion-exchange chromatography with pulsed amperometric detection (HPAEC/PAD). Sucrose, glucose and fructose are the most important sugars in peach kernels, similar to other representatives of the Rosaceae family. Also, high amounts of sugars in seeds of promising hybrids imply that through conventional breeding programs peach kernels with high sugar content can be obtained. In addition, by means of several pattern recognition methods the variables that discriminate peach kernels arising from diverse germplasm and different stages of maturity were identified and successful models for further prediction were developed. Sugars such as ribose, trehalose, arabinose, galactitol, fructose, maltose, sorbitol, sucrose, and iso-maltotriose were marked as most important for such discrimination.

  15. A Further Evaluation of Picture Prompts during Auditory-Visual Conditional Discrimination Training

    ERIC Educational Resources Information Center

    Carp, Charlotte L.; Peterson, Sean P.; Arkel, Amber J.; Petursdottir, Anna I.; Ingvarsson, Einar T.

    2012-01-01

    This study was a systematic replication and extension of Fisher, Kodak, and Moore (2007), in which a picture prompt embedded into a least-to-most prompting sequence facilitated acquisition of auditory-visual conditional discriminations. Participants were 4 children who had been diagnosed with autism; 2 had limited prior receptive skills, and 2 had…

  16. Relationship between cyanogenic compounds in kernels, leaves, and roots of sweet and bitter kernelled almonds.

    PubMed

    Dicenta, F; Martínez-Gómez, P; Grané, N; Martín, M L; León, A; Cánovas, J A; Berenguer, V

    2002-03-27

    The relationship between the levels of cyanogenic compounds (amygdalin and prunasin) in kernels, leaves, and roots of 5 sweet-, 5 slightly bitter-, and 5 bitter-kernelled almond trees was determined. Variability was observed among the genotypes for these compounds. Prunasin was found only in the vegetative part (roots and leaves) for all genotypes tested. Amygdalin was detected only in the kernels, mainly in bitter genotypes. In general, bitter-kernelled genotypes had higher levels of prunasin in their roots than nonbitter ones, but the correlation between cyanogenic compounds in the different parts of plants was not high. While prunasin seems to be present in most almond roots (with a variable concentration) only bitter-kernelled genotypes are able to transform it into amygdalin in the kernel. Breeding for prunasin-based resistance to the buprestid beetle Capnodis tenebrionis L. is discussed.

  17. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    United States Standards for Grades of Pecans in the Shell; § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the...

  18. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    United States Standards for Grades of Pecans in the Shell; § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the...

  19. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the color classifications provided in this section. When the color of kernels in a...

  20. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the color classifications provided in this section. When the color of kernels in a...

  1. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    United States Standards for Grades of Pecans in the Shell; § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the...

  2. 7 CFR 51.2296 - Three-fourths half kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 51.2296 Three-fourths half kernel. Three-fourths half kernel means a portion of a half of a kernel which has more...

  3. 7 CFR 51.2125 - Split or broken kernels.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 51.2125 Split or broken kernels. Split or broken kernels means seven-eighths or less of complete whole kernels but which will...

  4. Discriminating harmonicity

    NASA Astrophysics Data System (ADS)

    Kidd, Gerald; Mason, Christine R.; Brughera, Andrew; Chiu, Chung-Yiu Peter

    2003-08-01

    Simultaneous tones that are harmonically related tend to be grouped perceptually to form a unitary auditory image. A partial that is mistuned stands out from the other tones, and harmonic complexes with different fundamental frequencies can readily be perceived as separate auditory objects. These phenomena are evidence for the strong role of harmonicity in perceptual grouping and segregation of sounds. This study measured the discriminability of harmonicity directly. In a two-interval, two-alternative forced-choice (2I2AFC) paradigm, the listener chose which of two sounds, signal or foil, was composed of tones that more closely matched an exact harmonic relationship. In one experiment, the signal was varied from perfectly harmonic to highly inharmonic by adding frequency perturbation to each component. The foil always had 100% perturbation. Group mean performance decreased from greater than 90% correct for 0% signal perturbation to near chance for 80% signal perturbation. In the second experiment, adding a masker presented simultaneously with the signals and foils disrupted harmonicity. Both monaural and dichotic conditions were tested. Signal level was varied relative to masker level to obtain psychometric functions from which slopes and midpoints were estimated. Dichotic presentation of these audible stimuli improved performance by 3-10 dB, due primarily to a release from "informational masking" by the perceptual segregation of the signal from the masker.

  5. Kernel-Based Equiprobabilistic Topographic Map Formation.

    PubMed

    Van Hulle MM

    1998-09-15

    We introduce a new unsupervised competitive learning rule, the kernel-based maximum entropy learning rule (kMER), which performs equiprobabilistic topographic map formation in regular, fixed-topology lattices, for use with nonparametric density estimation as well as nonparametric regression analysis. The receptive fields of the formal neurons are overlapping radially symmetric kernels, compatible with radial basis functions (RBFs); but unlike other learning schemes, the radii of these kernels do not have to be chosen in an ad hoc manner: the radii are adapted to the local input density, together with the weight vectors that define the kernel centers, so as to produce maps of which the neurons have an equal probability to be active (equiprobabilistic maps). Both an "online" and a "batch" version of the learning rule are introduced, which are applied to nonparametric density estimation and regression, respectively. The application envisaged is blind source separation (BSS) from nonlinear, noisy mixtures.

  6. Bergman kernel from the lowest Landau level

    NASA Astrophysics Data System (ADS)

    Klevtsov, S.

    2009-07-01

    We use path integral representation for the density matrix, projected on the lowest Landau level, to generalize the expansion of the Bergman kernel on Kähler manifold to the case of arbitrary magnetic field.

  7. Quantum kernel applications in medicinal chemistry.

    PubMed

    Huang, Lulu; Massa, Lou

    2012-07-01

    Progress in the quantum mechanics of biological molecules is being driven by computational advances. The notion of quantum kernels can be introduced to simplify the formalism of quantum mechanics, making it especially suitable for parallel computation of very large biological molecules. The essential idea is to mathematically break large biological molecules into smaller kernels that are calculationally tractable, and then to represent the full molecule by a summation over the kernels. The accuracy of the kernel energy method (KEM) is shown by systematic application to a great variety of molecular types found in biology. These include peptides, proteins, DNA and RNA. Examples are given that explore the KEM across a variety of chemical models, and to the outer limits of energy accuracy and molecular size. KEM represents an advance in quantum biology applicable to problems in medicine and drug design.
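
    As a hedged sketch of the underlying ansatz (standard KEM notation, not quoted from this abstract): if a molecule is partitioned into n kernels, the total energy is approximated from single- and double-kernel calculations as

```latex
E_{\mathrm{total}} \;\approx\; \sum_{a=1}^{n-1}\sum_{b=a+1}^{n} E_{ab} \;-\; (n-2)\sum_{a=1}^{n} E_{a},
```

    where E_a is the energy of kernel a and E_ab is the energy of the double kernel formed from kernels a and b, so that each individual calculation stays small enough to be tractable in parallel.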

  8. KITTEN Lightweight Kernel 0.1 Beta

    SciTech Connect

    Pedretti, Kevin; Levenhagen, Michael; Kelly, Suzanne; VanDyke, John; Hudson, Trammell

    2007-12-12

    The Kitten Lightweight Kernel is a simplified OS (operating system) kernel that is intended to manage a compute node's hardware resources. It provides a set of mechanisms to user-level applications for utilizing hardware resources (e.g., allocating memory, creating processes, accessing the network). Kitten is much simpler than general-purpose OS kernels, such as Linux or Windows, but includes all of the essential functionality needed to support HPC (high-performance computing) MPI, PGAS and OpenMP applications. Kitten provides unique capabilities such as physically contiguous application memory, transparent large page support, and noise-free tick-less operation, which enable HPC applications to obtain greater efficiency and scalability than with general-purpose OS kernels.

  9. TICK: Transparent Incremental Checkpointing at Kernel Level

    SciTech Connect

    Petrini, Fabrizio; Gioiosa, Roberto

    2004-10-25

    TICK is a software package implemented in Linux 2.6 that allows user processes to be saved and restored without any change to the user code or binary. With TICK a process can be suspended by the Linux kernel upon receiving an interrupt and saved in a file. This file can later be thawed on another computer running Linux (potentially the same computer). TICK is implemented as a Linux kernel module for Linux version 2.6.5.

  10. Feature expectation heightens visual sensitivity during fine orientation discrimination

    PubMed Central

    Cheadle, Sam; Egner, Tobias; Wyart, Valentin; Wu, Claire; Summerfield, Christopher

    2015-01-01

    Attending to a stimulus enhances the sensitivity of perceptual decisions. However, it remains unclear how perceptual sensitivity varies according to whether a feature is expected or unexpected. Here, observers made fine discrimination judgments about the orientation of visual gratings embedded in low spatial-frequency noise, and psychophysical reverse correlation was used to estimate decision ‘kernels' that revealed how visual features influenced choices. Orthogonal cues alerted subjects to which of two spatial locations was likely to be probed (spatial attention cue) and which of two oriented gratings was likely to occur (feature expectation cue). When an expected (relative to unexpected) feature occurred, decision kernels shifted away from the category boundary, allowing observers to capitalize on more informative, “off-channel” stimulus features. By contrast, the spatial attention cue had a multiplicative influence on decision kernels, consistent with an increase in response gain. Feature expectation thus heightens sensitivity to the most informative visual features, independent of selective attention. PMID:26505967
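
    The reverse-correlation estimate of a decision kernel is simple to sketch; the code below uses simulated choices and hypothetical orientation-channel noise, not the study's stimuli or analysis pipeline.

```python
# Psychophysical reverse correlation: the decision "kernel" is the difference
# between the average stimulus fluctuation on trials sorted by the choice.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_channels = 2000, 21               # 21 orientation channels (assumption)
noise = rng.normal(size=(n_trials, n_channels))

# Simulated observer whose weighting is peaked slightly off the category boundary.
true_weights = np.exp(-0.5 * ((np.arange(n_channels) - 12) / 2.0) ** 2)
choices = (noise @ true_weights + rng.normal(size=n_trials)) > 0

# Estimated decision kernel: mean noise for one choice minus the other.
kernel = noise[choices].mean(axis=0) - noise[~choices].mean(axis=0)
print(np.round(kernel, 2))                    # peaks near the simulated weighting
```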

  11. Evidence for Fisher renormalization in the compressible phi4 model.

    PubMed

    Tröster, A

    2008-04-11

    We present novel Fourier Monte Carlo simulations of a compressible phi4-model on a simple-cubic lattice with linear-quadratic coupling of order parameter and strain, focusing on the detection of fluctuation-induced first-order transitions and deviations from standard critical behavior. The former is indeed observed in the constant stress ensemble and for auxetic systems at constant strain, while for regular isotropic systems at constant strain, we find strong evidence for Fisher-renormalized critical behavior and are led to predict the existence of a tricritical point.

  12. Evaluating the Gradient of the Thin Wire Kernel

    NASA Technical Reports Server (NTRS)

    Wilton, Donald R.; Champagne, Nathan J.

    2008-01-01

    Recently, a formulation for evaluating the thin wire kernel was developed that employed a change of variable to smooth the kernel integrand, canceling the singularity in the integrand. Hence, the typical expansion of the wire kernel in a series for use in the potential integrals is avoided. The new expression for the kernel is exact and may be used directly to determine the gradient of the wire kernel, which consists of components that are parallel and radial to the wire axis.

  13. Weighted Bergman Kernels and Quantization

    NASA Astrophysics Data System (ADS)

    Engliš, Miroslav

    Let Ω be a bounded pseudoconvex domain in C^N, φ, ψ two positive functions on Ω such that - log ψ, - log φ are plurisubharmonic, and z ∈ Ω a point at which - log φ is smooth and strictly plurisubharmonic. We show that as k → ∞, the Bergman kernels with respect to the weights φ^k ψ have an asymptotic expansion for x, y near z, where φ(x,y) is an almost-analytic extension of φ(x) = φ(x,x) and similarly for ψ. If in addition Ω is of finite type, φ, ψ behave reasonably at the boundary, and - log φ, - log ψ are strictly plurisubharmonic on Ω, we obtain also an analogous asymptotic expansion for the Berezin transform and give applications to the Berezin quantization. Finally, for Ω smoothly bounded and strictly pseudoconvex and φ a smooth strictly plurisubharmonic defining function for Ω, we also obtain results on the Berezin-Toeplitz quantization.

  14. RKF-PCA: robust kernel fuzzy PCA.

    PubMed

    Heo, Gyeongyong; Gader, Paul; Frigui, Hichem

    2009-01-01

    Principal component analysis (PCA) is a mathematical method that reduces the dimensionality of the data while retaining most of the variation in the data. Although PCA has been applied in many areas successfully, it suffers from sensitivity to noise and is limited to linear principal components. The noise sensitivity problem comes from the least-squares measure used in PCA and the limitation to linear components originates from the fact that PCA uses an affine transform defined by eigenvectors of the covariance matrix and the mean of the data. In this paper, a robust kernel PCA method that extends the kernel PCA and uses fuzzy memberships is introduced to tackle the two problems simultaneously. We first introduce an iterative method to find robust principal components, called Robust Fuzzy PCA (RF-PCA), which has a connection with robust statistics and entropy regularization. The RF-PCA method is then extended to a non-linear one, Robust Kernel Fuzzy PCA (RKF-PCA), using kernels. The modified kernel used in the RKF-PCA satisfies the Mercer's condition, which means that the derivation of the K-PCA is also valid for the RKF-PCA. Formal analyses and experimental results suggest that the RKF-PCA is an efficient non-linear dimension reduction method and is more noise-robust than the original kernel PCA.

  15. Kernel Manifold Alignment for Domain Adaptation.

    PubMed

    Tuia, Devis; Camps-Valls, Gustau

    2016-01-01

    The wealth of sensory data coming from different modalities has opened numerous opportunities for data analysis. The data are of increasing volume, complexity and dimensionality, thus calling for new methodological innovations towards multimodal data processing. However, multimodal architectures must rely on models able to adapt to changes in the data distribution. Differences in the density functions can be due to changes in acquisition conditions (pose, illumination), sensors characteristics (number of channels, resolution) or different views (e.g. street level vs. aerial views of a same building). We call these different acquisition modes domains, and refer to the adaptation problem as domain adaptation. In this paper, instead of adapting the trained models themselves, we alternatively focus on finding mappings of the data sources into a common, semantically meaningful, representation domain. This field of manifold alignment extends traditional techniques in statistics such as canonical correlation analysis (CCA) to deal with nonlinear adaptation and possibly non-corresponding data pairs between the domains. We introduce a kernel method for manifold alignment (KEMA) that can match an arbitrary number of data sources without needing corresponding pairs, just few labeled examples in all domains. KEMA has interesting properties: 1) it generalizes other manifold alignment methods, 2) it can align manifolds of very different complexities, performing a discriminative alignment preserving each manifold inner structure, 3) it can define a domain-specific metric to cope with multimodal specificities, 4) it can align data spaces of different dimensionality, 5) it is robust to strong nonlinear feature deformations, and 6) it is closed-form invertible, which allows transfer across-domains and data synthesis. To authors' knowledge this is the first method addressing all these important issues at once. We also present a reduced-rank version of KEMA for computational

  16. Kernel Manifold Alignment for Domain Adaptation

    PubMed Central

    Tuia, Devis; Camps-Valls, Gustau

    2016-01-01

    The wealth of sensory data coming from different modalities has opened numerous opportunities for data analysis. The data are of increasing volume, complexity and dimensionality, thus calling for new methodological innovations towards multimodal data processing. However, multimodal architectures must rely on models able to adapt to changes in the data distribution. Differences in the density functions can be due to changes in acquisition conditions (pose, illumination), sensors characteristics (number of channels, resolution) or different views (e.g. street level vs. aerial views of a same building). We call these different acquisition modes domains, and refer to the adaptation problem as domain adaptation. In this paper, instead of adapting the trained models themselves, we alternatively focus on finding mappings of the data sources into a common, semantically meaningful, representation domain. This field of manifold alignment extends traditional techniques in statistics such as canonical correlation analysis (CCA) to deal with nonlinear adaptation and possibly non-corresponding data pairs between the domains. We introduce a kernel method for manifold alignment (KEMA) that can match an arbitrary number of data sources without needing corresponding pairs, just few labeled examples in all domains. KEMA has interesting properties: 1) it generalizes other manifold alignment methods, 2) it can align manifolds of very different complexities, performing a discriminative alignment preserving each manifold inner structure, 3) it can define a domain-specific metric to cope with multimodal specificities, 4) it can align data spaces of different dimensionality, 5) it is robust to strong nonlinear feature deformations, and 6) it is closed-form invertible, which allows transfer across-domains and data synthesis. To authors’ knowledge this is the first method addressing all these important issues at once. We also present a reduced-rank version of KEMA for computational

  17. Which Fishers are Satisfied in the Caribbean? A Comparative Analysis of Job Satisfaction Among Caribbean Lobster Fishers.

    PubMed

    Monnereau, Iris; Pollnac, Richard

    2012-10-01

    Lobster fishing (targeting the spiny lobster Panulirus argus) is an important economic activity throughout the Wider Caribbean Region, both as a source of income and employment for the local population and as a source of foreign exchange for national governments. Due to the high unit prices of the product, international lobster trade provides a way to improve the livelihoods of fisheries-dependent populations. The species harvested is identical throughout the region and end market prices are roughly similar. In this paper we wish to investigate to what extent lobster fishers' job satisfaction differs in three countries in the Caribbean and how these differences can be explained by looking at the national governance arrangements.

  18. The Stefan problem for the Fisher-KPP equation

    NASA Astrophysics Data System (ADS)

    Du, Yihong; Guo, Zongming

    We study the Fisher-KPP equation with a free boundary governed by a one-phase Stefan condition. Such a problem arises in the modeling of the propagation of a new or invasive species, with the free boundary representing the propagation front. In one space dimension this problem was investigated in Du and Lin (2010) [11], and the radially symmetric case in higher space dimensions was studied in Du and Guo (2011) [10]. In both cases a spreading-vanishing dichotomy was established, namely the species either successfully spreads to all the new environment and stabilizes at a positive equilibrium state, or fails to establish and dies out in the long run; moreover, in the case of spreading, the asymptotic spreading speed was determined. In this paper, we consider the non-radially symmetric case. In such a situation, similar to the classical Stefan problem, smooth solutions need not exist even if the initial data are smooth. We thus introduce and study the "weak solution" for a class of free boundary problems that include the Fisher-KPP as a special case. We establish the existence and uniqueness of the weak solution, and through suitable comparison arguments, we extend some of the results obtained earlier in Du and Lin (2010) [11] and Du and Guo (2011) [10] to this general case. We also show that the classical Aronson-Weinberger result on the spreading speed obtained through the traveling wave solution approach is a limiting case of our free boundary problem here.
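
    For orientation, in the one-dimensional setting of Du and Lin (2010) the free boundary problem takes roughly the following form (a sketch, with d the diffusion rate and mu the free-boundary parameter):

```latex
\begin{aligned}
& u_t = d\,u_{xx} + u(1-u), && 0 < x < h(t),\; t > 0,\\
& u_x(t,0) = 0, \qquad u(t,h(t)) = 0, && t > 0,\\
& h'(t) = -\mu\, u_x(t,h(t)), && t > 0 \quad \text{(one-phase Stefan condition)},
\end{aligned}
```

    and the weak formulation introduced in the present paper extends this picture to non-radially symmetric domains in higher dimensions.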

  19. A Case of Miller Fisher Syndrome and Literature Review

    PubMed Central

    Taboada, Javier

    2017-01-01

    Miller Fisher syndrome (MFS) was first recognized by James Collier in 1932 as a clinical triad of ataxia, areflexia, and ophthalmoplegia. Later, it was described in 1956 by Charles Miller Fisher as a possible variant of Guillain-Barré syndrome (GBS). Here, we report the case of a patient with an atypical presentation of this clinical triad: the patient initially presented with double vision due to unilateral ocular involvement that progressed to bilateral ophthalmoplegia. He subsequently developed weakness of the lower extremities and areflexia. A diagnosis of MFS was made due to the clinical presentation and the presence of albuminocytologic dissociation in the cerebrospinal fluid (CSF) along with normal results of brain imaging and blood workup. The patient received intravenous immune globulin (IVIG), and his symptoms improved. The initial diagnosis of MFS is based on the clinical presentation and is confirmed by cerebral spinal fluid analysis and clinical neurophysiology studies. This case, which emphasizes knowledge of a rare syndrome, can help narrow down the differential diagnosis so that such patients are promptly and appropriately managed. PMID:28367386

  20. Using Fisher information to track stability in multivariate ...

    EPA Pesticide Factsheets

    With the current proliferation of data, the proficient use of statistical and mining techniques offers substantial benefits to capture useful information from any dataset. As numerous approaches make use of information theory concepts, here, we discuss how Fisher information (FI) can be applied to sustainability science problems and used in data mining applications by analyzing patterns in data. FI was developed as a measure of information content in data, and it has been adapted to assess order in complex system behaviors. The main advantage of the approach is the ability to collapse multiple variables into an index that can be used to assess stability and track overall trends in a system, including its regimes and regime shifts. Here, we provide a brief overview of FI theory, followed by a simple step-by-step numerical example on how to compute FI. Furthermore, we introduce an open source Python library that can be freely downloaded from GitHub, and we use it in a simple case study to evaluate the evolution of FI for the global-mean temperature from 1880 to 2015. Results indicate significant declines in FI starting in 1978, suggesting a possible regime shift. The work demonstrates Fisher information as a useful method for assessing patterns in big data.
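
    The numerical recipe itself is not reproduced in the abstract; the sketch below shows one simple discretized estimator in the spirit of this literature (it is not the API of the GitHub library mentioned above): states in consecutive sliding windows are binned, and FI is approximated from the change in the square roots of the state probabilities.

```python
# Sliding-window Fisher information index for a multivariate time series.
# A simple discretized estimator, FI ~ 4 * sum_i (sqrt(p_i) - sqrt(q_i))^2,
# comparing state probabilities p and q of consecutive windows (illustrative).
import numpy as np

def fisher_index(window_a, window_b, edges):
    pa, _ = np.histogramdd(window_a, bins=edges)
    pb, _ = np.histogramdd(window_b, bins=edges)
    pa = pa.ravel() / pa.sum()
    pb = pb.ravel() / pb.sum()
    return 4.0 * np.sum((np.sqrt(pa) - np.sqrt(pb)) ** 2)

# Hypothetical two-variable system slowly drifting toward a new regime.
rng = np.random.default_rng(2)
t = np.arange(1000)
data = np.column_stack([np.sin(t / 50) + 0.002 * t + 0.1 * rng.normal(size=t.size),
                        np.cos(t / 50) + 0.1 * rng.normal(size=t.size)])
edges = [np.linspace(data[:, j].min(), data[:, j].max(), 6) for j in range(2)]

width = 100
fi = [fisher_index(data[s:s + width], data[s + width:s + 2 * width], edges)
      for s in range(0, len(data) - 2 * width, width)]
print(np.round(fi, 3))   # sustained changes in the index flag candidate regime shifts
```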

  1. Identifying alternate pathways for climate change to impact inland recreational fishers

    USGS Publications Warehouse

    Hunt, Len M.; Fenichel, Eli P.; Fulton, David C.; Mendelsohn, Robert; Smith, Jordan W.; Tunney, Tyler D.; Lynch, Abigail J.; Paukert, Craig P.; Whitney, James E.

    2016-01-01

    Fisheries and human dimensions literature suggests that climate change influences inland recreational fishers in North America through three major pathways. The most widely recognized pathway suggests that climate change impacts habitat and fish populations (e.g., water temperature impacting fish survival) and cascades to impact fishers. Climate change also impacts recreational fishers by influencing environmental conditions that directly affect fishers (e.g., increased temperatures in northern climates resulting in extended open water fishing seasons and increased fishing effort). The final pathway occurs from climate change mitigation and adaptation efforts (e.g., refined energy policies result in higher fuel costs, making distant trips more expensive). To address limitations of past research (e.g., assessing climate change impacts for only one pathway at a time and not accounting for climate variability, extreme weather events, or heterogeneity among fishers), we encourage researchers to refocus their efforts to understand and document climate change impacts to inland fishers.

  2. Kernel-Based Reconstruction of Graph Signals

    NASA Astrophysics Data System (ADS)

    Romero, Daniel; Ma, Meng; Giannakis, Georgios B.

    2017-02-01

    A number of applications in engineering, social sciences, physics, and biology involve inference over networks. In this context, graph signals are widely encountered as descriptors of vertex attributes or features in graph-structured data. Estimating such signals in all vertices given noisy observations of their values on a subset of vertices has been extensively analyzed in the literature of signal processing on graphs (SPoG). This paper advocates kernel regression as a framework generalizing popular SPoG modeling and reconstruction and expanding their capabilities. Formulating signal reconstruction as a regression task on reproducing kernel Hilbert spaces of graph signals permeates benefits from statistical learning, offers fresh insights, and allows for estimators to leverage richer forms of prior information than existing alternatives. A number of SPoG notions such as bandlimitedness, graph filters, and the graph Fourier transform are naturally accommodated in the kernel framework. Additionally, this paper capitalizes on the so-called representer theorem to devise simpler versions of existing Tikhonov-regularized estimators, and offers a novel probabilistic interpretation of kernel methods on graphs based on graphical models. Motivated by the challenges of selecting the bandwidth parameter in SPoG estimators or the kernel map in kernel-based methods, the present paper further proposes two multi-kernel approaches with complementary strengths. Whereas the first enables estimation of the unknown bandwidth of bandlimited signals, the second allows for efficient graph filter selection. Numerical tests with synthetic as well as real data demonstrate the merits of the proposed methods relative to state-of-the-art alternatives.
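
    A minimal sketch of the representer-theorem estimator on a graph (illustrative only; the paper develops richer kernels, multi-kernel selection, and probabilistic interpretations): build a diffusion kernel from the graph Laplacian and regress the observed vertex values.

```python
# Kernel ridge reconstruction of a graph signal from a sampled subset of vertices.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
n = 30
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T                       # random undirected graph
L = np.diag(A.sum(1)) - A                            # combinatorial Laplacian
K = expm(-0.5 * L)                                   # diffusion (heat) kernel

eigval, eigvec = np.linalg.eigh(L)
f_true = eigvec[:, 1]                                # smooth "bandlimited-like" signal

obs = rng.choice(n, size=10, replace=False)          # observed vertices
y = f_true[obs] + 0.01 * rng.normal(size=obs.size)   # noisy samples

lam = 1e-3
alpha = np.linalg.solve(K[np.ix_(obs, obs)] + lam * np.eye(obs.size), y)
f_hat = K[:, obs] @ alpha                            # representer-theorem estimate
print(np.round(np.corrcoef(f_true, f_hat)[0, 1], 3))
```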

  3. Multiple Factors Affect Socioeconomics and Wellbeing of Artisanal Sea Cucumber Fishers

    PubMed Central

    Ngaluafe, Poasi; Foale, Simon J.; Cocks, Nicole; Cullis, Brian R.; Lalavanua, Watisoni

    2016-01-01

    Small-scale fisheries are important to livelihoods and subsistence seafood consumption of millions of fishers. Sea cucumbers are fished worldwide for export to Asia, yet few studies have assessed factors affecting socioeconomics and wellbeing among fishers. We interviewed 476 men and women sea cucumber fishers at multiple villages within multiple locations in Fiji, Kiribati, Tonga and New Caledonia using structured questionnaires. Low rates of subsistence consumption confirmed a primary role of sea cucumbers in income security. Prices of sea cucumbers sold by fishers varied greatly among countries, depending on the species. Gender variation in landing prices could be due to women catching smaller sea cucumbers or because some traders take advantage of them. Dissatisfaction with fishery income was common (44% of fishers), especially for i-Kiribati fishers, male fishers, and fishers experiencing difficulty selling their catch, but was uncorrelated with sale prices. Income dissatisfaction worsened with age. The number of livelihood activities averaged 2.2–2.5 across countries, and varied significantly among locations. Sea cucumbers were often a primary source of income to fishers, especially in Tonga. Other common livelihood activities were fishing other marine resources, copra production in Kiribati, agriculture in Fiji, and salaried jobs in New Caledonia. Fishing other coastal and coral reef resources was the most common fall-back livelihood option if fishers were forced to exit the fishery. Our data highlight large disparities in subsistence consumption, gender-related price equity, and livelihood diversity among parallel artisanal fisheries. Improvement of supply chains in dispersed small-scale fisheries appears as a critical need for enhancing income and wellbeing of fishers. Strong evidence for co-dependence among small-scale fisheries, through fall-back livelihood preferences of fishers, suggests that resource managers must mitigate concomitant effects on

  4. Using Fisher Information Criteria for Chemical Sensor Selection via Convex Optimization Methods

    DTIC Science & Technology

    2016-11-16

    ...parametrized to select the best sensors after an optimization procedure. Due to the positive definite nature of the Fisher information matrix, convex optimization may be used to... (Naval Research Laboratory, Washington, DC 20375-5320; report NRL/MR/6180--16-9711.)

  5. Diffusion on a hypersphere: application to the Wright-Fisher model

    NASA Astrophysics Data System (ADS)

    Maruyama, Kishiko; Itoh, Yoshiaki

    2016-04-01

    The eigenfunction expansion by Gegenbauer polynomials for the diffusion on a hypersphere is transformed into the diffusion for the Wright-Fisher model with a particular mutation rate. We use the Ito calculus considering stochastic differential equations. The expansion gives a simple interpretation of the Griffiths eigenfunction expansion for the Wright-Fisher model. Our representation is useful to simulate the Wright-Fisher model as well as Brownian motion on a hypersphere.

  6. Multiple Factors Affect Socioeconomics and Wellbeing of Artisanal Sea Cucumber Fishers.

    PubMed

    Purcell, Steven W; Ngaluafe, Poasi; Foale, Simon J; Cocks, Nicole; Cullis, Brian R; Lalavanua, Watisoni

    2016-01-01

    Small-scale fisheries are important to livelihoods and subsistence seafood consumption of millions of fishers. Sea cucumbers are fished worldwide for export to Asia, yet few studies have assessed factors affecting socioeconomics and wellbeing among fishers. We interviewed 476 men and women sea cucumber fishers at multiple villages within multiple locations in Fiji, Kiribati, Tonga and New Caledonia using structured questionnaires. Low rates of subsistence consumption confirmed a primary role of sea cucumbers in income security. Prices of sea cucumbers sold by fishers varied greatly among countries, depending on the species. Gender variation in landing prices could be due to women catching smaller sea cucumbers or because some traders take advantage of them. Dissatisfaction with fishery income was common (44% of fishers), especially for i-Kiribati fishers, male fishers, and fishers experiencing difficulty selling their catch, but was uncorrelated with sale prices. Income dissatisfaction worsened with age. The number of livelihood activities averaged 2.2-2.5 across countries, and varied significantly among locations. Sea cucumbers were often a primary source of income to fishers, especially in Tonga. Other common livelihood activities were fishing other marine resources, copra production in Kiribati, agriculture in Fiji, and salaried jobs in New Caledonia. Fishing other coastal and coral reef resources was the most common fall-back livelihood option if fishers were forced to exit the fishery. Our data highlight large disparities in subsistence consumption, gender-related price equity, and livelihood diversity among parallel artisanal fisheries. Improvement of supply chains in dispersed small-scale fisheries appears as a critical need for enhancing income and wellbeing of fishers. Strong evidence for co-dependence among small-scale fisheries, through fall-back livelihood preferences of fishers, suggests that resource managers must mitigate concomitant effects on other

  7. Oecophylla longinoda (Hymenoptera: Formicidae) Lead to Increased Cashew Kernel Size and Kernel Quality.

    PubMed

    Anato, F M; Sinzogan, A A C; Offenberg, J; Adandonon, A; Wargui, R B; Deguenon, J M; Ayelo, P M; Vayssières, J-F; Kossou, D K

    2017-03-03

    Weaver ants, Oecophylla spp., are known to positively affect cashew, Anacardium occidentale L., raw nut yield, but their effects on the kernels have not been reported. We compared nut size and the proportion of marketable kernels between raw nuts collected from trees with and without ants. Raw nuts collected from trees with weaver ants were 2.9% larger than nuts from control trees (i.e., without weaver ants), leading to 14% higher proportion of marketable kernels. On trees with ants, the kernel: raw nut ratio from nuts damaged by formic acid was 4.8% lower compared with nondamaged nuts from the same trees. Weaver ants provided three benefits to cashew production by increasing yields, yielding larger nuts, and by producing greater proportions of marketable kernel mass.

  8. A new Mercer sigmoid kernel for clinical data classification.

    PubMed

    Carrington, André M; Fieguth, Paul W; Chen, Helen H

    2014-01-01

    In classification with Support Vector Machines, only Mercer kernels, i.e. valid kernels, such as the Gaussian RBF kernel, are widely accepted and thus suitable for clinical data. Practitioners would also like to use the sigmoid kernel, a non-Mercer kernel, but its range of validity is difficult to determine, and even within range its validity is in dispute. Despite these shortcomings the sigmoid kernel is used by some, and two kernels in the literature attempt to emulate and improve upon it. We propose the first Mercer sigmoid kernel, that is therefore trustworthy for the classification of clinical data. We show the similarity between the Mercer sigmoid kernel and the sigmoid kernel and, in the process, identify a normalization technique that improves the classification accuracy of the latter. The Mercer sigmoid kernel achieves the best mean accuracy on three clinical data sets, detecting melanoma in skin lesions better than the most popular kernels; while with non-clinical data sets it has no significant difference in median accuracy as compared with the Gaussian RBF kernel. It consistently classifies some points correctly that the Gaussian RBF kernel does not and vice versa.
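
    The validity question above can also be probed empirically: a Mercer kernel must produce a positive semidefinite Gram matrix on every data set. The sketch below (illustrative, not from the paper) checks the smallest Gram-matrix eigenvalue for the Gaussian RBF kernel and for the classical sigmoid kernel.

```python
# Empirical Mercer check: the Gram matrix of a valid kernel is positive
# semidefinite, so its smallest eigenvalue should be >= 0 up to round-off.
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 5))

def gram(kernel, X):
    return np.array([[kernel(a, b) for b in X] for a in X])

rbf     = lambda a, b: np.exp(-np.sum((a - b) ** 2))
sigmoid = lambda a, b: np.tanh(a @ b + 1.0)     # classical (non-Mercer) sigmoid

for name, k in [("RBF", rbf), ("sigmoid", sigmoid)]:
    eig = np.linalg.eigvalsh(gram(k, X))
    print(f"{name:8s} min eigenvalue = {eig.min(): .4f}")
# The RBF Gram matrix is PSD; the sigmoid kernel is not guaranteed to be, and
# for many parameter/data choices it yields negative eigenvalues.
```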

  9. Sarcocystis neurona-associated meningoencephalitis and description of intramuscular sarcocysts in a fisher (Martes pennanti).

    PubMed

    Gerhold, Richard W; Howerth, Elizabeth W; Lindsay, David S

    2005-01-01

    A free-ranging juvenile fisher (Martes pennanti) with ataxia, lethargy, stupor, and intermittent, whole-body tremors was examined postmortem. Microscopically, the fisher had protozoal meningoencephalitis caused by Sarcocystis neurona, which was confirmed by immunohistochemistry, polymerase chain reaction (PCR) and restriction fragment length polymorphism testing, and genetic sequencing. Sarcocysts found in the skeletal muscle of the fisher were negative for S. neurona by PCR, but were morphologically similar to previous light and electron microscopy descriptions of S. neurona. This is the first report of clinical neural S. neurona infection in a fisher.

  10. Statistical algorithms for a comprehensive test ban treaty discrimination framework

    SciTech Connect

    Foote, N.D.; Anderson, D.N.; Higbee, K.T.; Miller, N.E.; Redgate, T.; Rohay, A.C.; Hagedorn, D.N.

    1996-10-01

    Seismic discrimination is the process of identifying a candidate seismic event as an earthquake or explosion using information from seismic waveform features (seismic discriminants). In the CTBT setting, low-energy seismic activity must be detected and identified. A defensible CTBT discrimination decision requires an understanding of false-negative (declaring an event to be an earthquake given it is an explosion) and false-positive (declaring an event to be an explosion given it is an earthquake) rates. These rates are derived from a statistical discrimination framework. A discrimination framework can be as simple as a single statistical algorithm or it can be a mathematical construct that integrates many different types of statistical algorithms and CTBT technologies. In either case, the result is the identification of an event and the numerical assessment of the accuracy of an identification, that is, false-negative and false-positive rates. In Anderson et al., eight statistical discrimination algorithms are evaluated relative to their ability to give results that effectively contribute to a decision process and to be interpretable with physical (seismic) theory. These algorithms can be discrimination frameworks individually or components of a larger framework. The eight algorithms are linear discrimination (LDA), quadratic discrimination (QDA), variably regularized discrimination (VRDA), flexible discrimination (FDA), logistic discrimination, K-th nearest neighbor (KNN), kernel discrimination, and classification and regression trees (CART). In this report, the performance of these eight algorithms, as applied to regional seismic data, is documented. Based on the findings in Anderson et al. and this analysis, CART is an appropriate algorithm for an automated CTBT setting.
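
    A hedged sketch of such a comparison on synthetic stand-in "discriminant" features (not the report's data, features, or code), using off-the-shelf implementations of several of the algorithms listed above, is shown below.

```python
# Cross-validated comparison of several discrimination algorithms on synthetic
# two-class data standing in for seismic discriminants (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                           class_sep=1.0, random_state=0)   # 0=earthquake, 1=explosion

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "logistic": LogisticRegression(max_iter=1000),
    "kNN": KNeighborsClassifier(n_neighbors=7),
    "CART": DecisionTreeClassifier(max_depth=5, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5)
    print(f"{name:10s} accuracy = {acc.mean():.3f} +/- {acc.std():.3f}")
# In a CTBT setting the quantities of interest are the per-class error rates
# (false negatives and false positives), read off the confusion matrix.
```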

  11. The Cosmological Origin of the Tully-Fisher Relation

    NASA Astrophysics Data System (ADS)

    Steinmetz, Matthias; Navarro, Julio F.

    1999-03-01

    We use high-resolution cosmological simulations that include the effects of gasdynamics and star formation to investigate the origin of the Tully-Fisher relation in the standard cold dark matter cosmogony. Stars are assumed to form in collapsing, Jeans-unstable gas clumps at a rate set by the local gas density and the dynamical/cooling timescale. The energetic feedback from stellar evolution is assumed to heat the gas-surrounding regions of ongoing star formation, where it is radiated away very rapidly. The star formation algorithm thus has little effect on the rate at which gas cools and collapses, and, as a result, most galaxies form their stars very early. Luminosities are computed for each model galaxy using their full star formation histories and the latest spectrophotometric models. We find that the stellar mass of model galaxies is proportional to the total baryonic mass within the virial radius of their surrounding halos. Circular velocity then correlates tightly with the total luminosity of the galaxy, which reflects the equivalence between mass and circular velocity of systems identified in a cosmological context. The slope of the relation steepens slightly from the blue to the red bandpasses and is in fairly good agreement with observations. Its scatter is small, decreasing from ~0.38 mag in the U band to ~0.24 mag in the K band. The particular cosmological model we explore here seems unable to account for the zero point of the correlation. Model galaxies are too faint at z=0 (by about 2 mag) if the circular velocity at the edge of the luminous galaxy is used as an estimator of the rotation speed. The model Tully-Fisher relation is brighter in the past by ~0.7 mag in the B band at z=1, which is at odds with recent observations of z~1 galaxies. We conclude that the slope and tightness of the Tully-Fisher relation can be naturally explained in hierarchical models, but that its normalization and evolution depend strongly on the star formation algorithm

  12. Kernel bandwidth optimization in spike rate estimation.

    PubMed

    Shimazaki, Hideaki; Shinomoto, Shigeru

    2010-08-01

    Kernel smoother and a time-histogram are classical tools for estimating an instantaneous rate of spike occurrences. We recently established a method for selecting the bin width of the time-histogram, based on the principle of minimizing the mean integrated square error (MISE) between the estimated rate and unknown underlying rate. Here we apply the same optimization principle to the kernel density estimation in selecting the width or "bandwidth" of the kernel, and further extend the algorithm to allow a variable bandwidth, in conformity with data. The variable kernel has the potential to accurately grasp non-stationary phenomena, such as abrupt changes in the firing rate, which we often encounter in neuroscience. In order to avoid possible overfitting that may take place due to excessive freedom, we introduced a stiffness constant for bandwidth variability. Our method automatically adjusts the stiffness constant, thereby adapting to the entire set of spike data. It is revealed that the classical kernel smoother may exhibit goodness-of-fit comparable to, or even better than, that of modern sophisticated rate estimation methods, provided that the bandwidth is selected properly for a given set of spike data, according to the optimization methods presented here.

  13. Analog forecasting with dynamics-adapted kernels

    NASA Astrophysics Data System (ADS)

    Zhao, Zhizhen; Giannakis, Dimitrios

    2016-09-01

    Analog forecasting is a nonparametric technique introduced by Lorenz in 1969 which predicts the evolution of states of a dynamical system (or observables defined on the states) by following the evolution of the sample in a historical record of observations which most closely resembles the current initial data. Here, we introduce a suite of forecasting methods which improve traditional analog forecasting by combining ideas from kernel methods developed in harmonic analysis and machine learning and state-space reconstruction for dynamical systems. A key ingredient of our approach is to replace single-analog forecasting with weighted ensembles of analogs constructed using local similarity kernels. The kernels used here employ a number of dynamics-dependent features designed to improve forecast skill, including Takens’ delay-coordinate maps (to recover information in the initial data lost through partial observations) and a directional dependence on the dynamical vector field generating the data. Mathematically, our approach is closely related to kernel methods for out-of-sample extension of functions, and we discuss alternative strategies based on the Nyström method and the multiscale Laplacian pyramids technique. We illustrate these techniques in applications to forecasting in a low-order deterministic model for atmospheric dynamics with chaotic metastability, and interannual-scale forecasting in the North Pacific sector of a comprehensive climate model. We find that forecasts based on kernel-weighted ensembles have significantly higher skill than the conventional approach following a single analog.
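
    The core idea of a kernel-weighted analog forecast can be sketched in a few lines (an illustration of the basic construction only; the paper's kernels additionally use delay-coordinate maps and a directional dependence on the dynamical vector field).

```python
# Kernel-weighted analog forecasting: predict the state "lead" steps ahead as a
# similarity-weighted average of the successors of historical analogs.
import numpy as np

def analog_forecast(history, x_now, lead, eps):
    """history: (T, d) record; x_now: (d,) current state; lead: steps ahead."""
    past = history[:-lead]                    # candidate analogs
    future = history[lead:]                   # their observed successors
    d2 = np.sum((past - x_now) ** 2, axis=1)
    w = np.exp(-d2 / eps)                     # Gaussian similarity kernel
    return (w[:, None] * future).sum(axis=0) / w.sum()

# Hypothetical chaotic record (logistic map) standing in for the historical data.
T = 5000
x = np.empty(T); x[0] = 0.31
for t in range(T - 1):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
history = x[:, None]

x_now = np.array([0.43])
print(analog_forecast(history, x_now, lead=1, eps=1e-4))   # ensemble forecast
print(3.9 * 0.43 * (1.0 - 0.43))                           # true one-step evolution
```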

  14. Online Sequential Extreme Learning Machine With Kernels.

    PubMed

    Scardapane, Simone; Comminiello, Danilo; Scarpiniti, Michele; Uncini, Aurelio

    2015-09-01

    The extreme learning machine (ELM) was recently proposed as a unifying framework for different families of learning algorithms. The classical ELM model consists of a linear combination of a fixed number of nonlinear expansions of the input vector. Learning in ELM is hence equivalent to finding the optimal weights that minimize the error on a dataset. The update works in batch mode, either with explicit feature mappings or with implicit mappings defined by kernels. Although an online version has been proposed for the former, no work has been done up to this point for the latter, and whether an efficient learning algorithm for online kernel-based ELM exists remains an open problem. By explicating some connections between nonlinear adaptive filtering and ELM theory, in this brief, we present an algorithm for this task. In particular, we propose a straightforward extension of the well-known kernel recursive least-squares, belonging to the kernel adaptive filtering (KAF) family, to the ELM framework. We call the resulting algorithm the kernel online sequential ELM (KOS-ELM). Moreover, we consider two different criteria used in the KAF field to obtain sparse filters and extend them to our context. We show that KOS-ELM, with their integration, can result in a highly efficient algorithm, both in terms of obtained generalization error and training time. Empirical evaluations demonstrate interesting results on some benchmarking datasets.

  15. The connection between regularization operators and support vector kernels.

    PubMed

    Smola, Alex J.; Schölkopf, Bernhard; Müller, Klaus Robert

    1998-06-01

    In this paper a correspondence is derived between regularization operators used in regularization networks and support vector kernels. We prove that the Green's Functions associated with regularization operators are suitable support vector kernels with equivalent regularization properties. Moreover, the paper provides an analysis of currently used support vector kernels in the view of regularization theory and corresponding operators associated with the classes of both polynomial kernels and translation invariant kernels. The latter are also analyzed on periodical domains. As a by-product we show that a large number of radial basis functions, namely conditionally positive definite functions, may be used as support vector kernels.
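
    The central correspondence can be summarized, with the caveat that notation varies across presentations, as follows: a kernel k is an admissible support vector kernel for a regularization operator P when it acts as a Green's function of P*P, i.e.

```latex
(P^{*}P\,k)(x,\cdot) = \delta_{x}
\qquad\Longrightarrow\qquad
\langle (Pk)(x,\cdot),\,(Pk)(y,\cdot)\rangle = k(x,y),
```

    so that the regularization term ||Pf||^2 coincides with the RKHS norm induced by k.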

  16. [Fisher's syndrome. Peripheral or central origin (author's transl)].

    PubMed

    Collard, M; Mathe, J F; Guihenneuc, P; Coquillat, G; Eber, A M; Ruh, D

    1978-05-01

    The syndrome described by M. Fisher in 1956 includes ophthalmoplegia, ataxia, and generalized loss of reflexes. It is classically considered to be of peripheral origin, and its relation to Guillain-Barré syndrome in its mesencephalic form is debatable. The authors review 5 cases and discuss the question of a probable central origin. They base their opinion on the pathognomonic features of these cases and those in the literature, as well as the results of their oculographic and electromyographic studies. They stress the importance of the nature of the ataxia; the severe equilibrium disturbances noted in these patients could result, contrary to usual thinking, more from a central vestibular syndrome than from a cerebellar lesion.

  17. Fisher-Wright model with deterministic seed bank and selection.

    PubMed

    Koopmann, Bendix; Müller, Johannes; Tellier, Aurélien; Živković, Daniel

    2017-04-01

    Seed banks are common characteristics to many plant species, which allow storage of genetic diversity in the soil as dormant seeds for various periods of time. We investigate an above-ground population following a Fisher-Wright model with selection coupled with a deterministic seed bank assuming the length of the seed bank is kept constant and the number of seeds is large. To assess the combined impact of seed banks and selection on genetic diversity, we derive a general diffusion model. The applied techniques outline a path of approximating a stochastic delay differential equation by an appropriately rescaled stochastic differential equation. We compute the equilibrium solution of the site-frequency spectrum and derive the times to fixation of an allele with and without selection. Finally, it is demonstrated that seed banks enhance the effect of selection onto the site-frequency spectrum while slowing down the time until the mutation-selection equilibrium is reached.
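
    For orientation, a plain (above-ground, no seed bank) Fisher-Wright generation with selection is easy to simulate; the deterministic seed bank of the paper would replace the parent frequency below by a weighted average over several past generations. Parameters are illustrative.

```python
# Haploid Fisher-Wright model with selection: each generation is a binomial
# draw of N offspring around the selection-weighted allele frequency.
import numpy as np

def wright_fisher(N=1000, s=0.02, p0=0.1, generations=500, seed=0):
    rng = np.random.default_rng(seed)
    p, traj = p0, [p0]
    for _ in range(generations):
        p_sel = (1 + s) * p / ((1 + s) * p + (1 - p))    # selection step
        p = rng.binomial(N, p_sel) / N                   # drift step
        traj.append(p)
        if p in (0.0, 1.0):                              # fixation or loss
            break
    return np.array(traj)

traj = wright_fisher()
print(f"final frequency {traj[-1]:.2f} after {len(traj) - 1} generations")
```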

  18. Enhancing teleportation of quantum Fisher information by partial measurements

    NASA Astrophysics Data System (ADS)

    Xiao, Xing; Yao, Yao; Zhong, Wo-Jun; Li, Yan-Ling; Xie, Ying-Mao

    2016-01-01

    The purport of quantum teleportation is to completely transfer information from one party to another distant partner. However, from the perspective of parameter estimation, it is the information carried by a particular parameter, not the information of the total quantum state, that needs to be teleported. Due to the inevitable noise in environments, we propose two schemes to enhance quantum Fisher information (QFI) teleportation under amplitude damping noise with the technique of partial measurements. We find that post-partial measurement can greatly enhance the teleported QFI, while the combination of prior partial measurement and post-partial measurement reversal could completely eliminate the effect of decoherence. We show that, somewhat consequentially, enhancing QFI teleportation is more economical than improving fidelity teleportation. Our work extends the ability of partial measurements as a quantum technique to battle decoherence in quantum information processing.

  19. Fisher symmetry and the geometry of quantum states

    NASA Astrophysics Data System (ADS)

    Gross, Jonathan A.; Barnum, Howard; Caves, Carlton M.

    The quantum Fisher information (QFI) is a valuable tool on account of the achievable lower bound it provides for single-parameter estimation. Due to the existence of incompatible quantum observables, however, the lower bound provided by the QFI cannot be saturated in the general multi-parameter case. A bound demonstrated by Gill and Massar (GM) captures some of the limitations that incompatibility imposes in the multi-parameter case. We further explore the structure of measurements allowed by quantum mechanics, identifying restrictions beyond those given by the QFI and GM bound. These additional restrictions give insight into the geometry of quantum state space and notions of measurement symmetry related to the QFI.
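
    For context, the single-parameter bound referred to above is the quantum Cramér-Rao bound, written here in standard (not abstract-specific) notation:

```latex
\mathrm{Var}(\hat{\theta}) \;\ge\; \frac{1}{\nu\, F_Q[\rho_\theta]},
\qquad
F_Q[\rho_\theta] = \mathrm{Tr}\!\left(\rho_\theta L_\theta^{2}\right),
\qquad
\partial_\theta \rho_\theta = \tfrac{1}{2}\left(L_\theta \rho_\theta + \rho_\theta L_\theta\right),
```

    where nu is the number of independent repetitions and L_theta is the symmetric logarithmic derivative; in the multi-parameter case the bound is generally not saturable, which is where the Gill-Massar bound and the additional restrictions studied here enter.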

  20. Fisher information and the thermodynamics of scale-invariant systems

    NASA Astrophysics Data System (ADS)

    Hernando, A.; Vesperinas, C.; Plastino, A.

    2010-02-01

    We present a thermodynamic formulation for scale-invariant systems based on the minimization with constraints of the Fisher information measure. In such a way a clear analogy between these systems’ thermal properties and those of gases and fluids is seen to emerge in a natural fashion. We focus our attention on the non-interacting scenario, speaking thus of scale-free ideal gases (SFIGs), and present some empirical evidence regarding such disparate systems as electoral results, city populations and total citations in Physics journals, which seem to indicate that SFIGs do exist. We also illustrate the way in which Zipf’s law can be understood in a thermodynamical context as the surface of a finite system. Finally, we derive an equivalent microscopic description of our systems, which totally agrees with previous numerical simulations found in the literature.

  1. Detailed H I kinematics of Tully-Fisher calibrator galaxies

    NASA Astrophysics Data System (ADS)

    Ponomareva, Anastasia A.; Verheijen, Marc A. W.; Bosma, Albert

    2016-12-01

    We present spatially resolved H I kinematics of 32 spiral galaxies which have Cepheid or/and tip of the red giant branch distances, and define a calibrator sample for the Tully-Fisher relation. The interferometric H I data for this sample were collected from available archives and supplemented with new Giant Metrewave Radio Telescope observations. This paper describes a uniform analysis of the H I kinematics of this inhomogeneous data set. Our main result is an atlas for our calibrator sample that presents global H I profiles, integrated H I column-density maps, H I surface-density profiles and, most importantly, detailed kinematic information in the form of high-quality rotation curves derived from highly resolved, two-dimensional velocity fields and position-velocity diagrams.

  2. Nonparametric entropy estimation using kernel densities.

    PubMed

    Lake, Douglas E

    2009-01-01

    The entropy of experimental data from the biological and medical sciences provides additional information over summary statistics. Calculating entropy involves estimates of probability density functions, which can be effectively accomplished using kernel density methods. Kernel density estimation has been widely studied and a univariate implementation is readily available in MATLAB. The traditional definition of Shannon entropy is part of a larger family of statistics, called Renyi entropy, which are useful in applications that require a measure of the Gaussianity of data. Of particular note is the quadratic entropy which is related to the Friedman-Tukey (FT) index, a widely used measure in the statistical community. One application where quadratic entropy is very useful is the detection of abnormal cardiac rhythms, such as atrial fibrillation (AF). Asymptotic and exact small-sample results for optimal bandwidth and kernel selection to estimate the FT index are presented and lead to improved methods for entropy estimation.
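    Since the abstract rests on the link between a Gaussian kernel density estimate and the quadratic (order-2) Renyi entropy, a minimal sketch may help; the bandwidth rule and the test data below are illustrative assumptions, not the paper's recommended estimator.

    ```python
    # Minimal sketch: Renyi quadratic entropy H2 = -log \int p(x)^2 dx from 1-D data,
    # using the closed form that a Gaussian KDE admits for the integral.
    import numpy as np

    def quadratic_entropy(x, bandwidth=None):
        x = np.asarray(x, dtype=float)
        n = x.size
        if bandwidth is None:
            # Silverman's rule of thumb (an assumption, not the paper's optimal choice)
            bandwidth = 1.06 * x.std(ddof=1) * n ** (-1 / 5)
        diffs = x[:, None] - x[None, :]
        # For a Gaussian KDE, \int p^2 dx is a double sum of Gaussians evaluated
        # at pairwise differences with variance 2*h^2.
        var = 2.0 * bandwidth ** 2
        integral = np.exp(-diffs ** 2 / (2 * var)).sum() / (n ** 2 * np.sqrt(2 * np.pi * var))
        return -np.log(integral)

    rng = np.random.default_rng(0)
    # For N(0,1) data the true value is log(2*sqrt(pi)), about 1.27.
    print(quadratic_entropy(rng.normal(size=500)))
    ```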

  3. Fast generation of sparse random kernel graphs

    SciTech Connect

    Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo

    2015-09-10

    The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n (log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.
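    For orientation, the sketch below samples the random kernel graph model in the naive O(n²) way, checking every vertex pair; the kernel function and edge-probability scaling are illustrative assumptions, and the point of the paper is precisely to avoid this quadratic cost.

    ```python
    # Naive O(n^2) sampler for an inhomogeneous random kernel graph (baseline only).
    import numpy as np

    def sample_kernel_graph(n, kappa, seed=None):
        rng = np.random.default_rng(seed)
        x = rng.uniform(size=n)                      # vertex types on [0, 1]
        edges = []
        for i in range(n):
            for j in range(i + 1, n):
                p = min(1.0, kappa(x[i], x[j]) / n)  # sparse regime: O(n) expected edges
                if rng.random() < p:
                    edges.append((i, j))
        return x, edges

    # A product-form kernel gives a heavy-tailed degree sequence.
    x, edges = sample_kernel_graph(2000, lambda u, v: 0.4 / np.sqrt(u * v), seed=1)
    print(len(edges), "edges")
    ```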

  4. Fast generation of sparse random kernel graphs

    DOE PAGES

    Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo

    2015-09-10

    The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n (log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.

  5. Phenolic constituents of shea (Vitellaria paradoxa) kernels.

    PubMed

    Maranz, Steven; Wiesman, Zeev; Garti, Nissim

    2003-10-08

    Analysis of the phenolic constituents of shea (Vitellaria paradoxa) kernels by LC-MS revealed eight catechin compounds-gallic acid, catechin, epicatechin, epicatechin gallate, gallocatechin, epigallocatechin, gallocatechin gallate, and epigallocatechin gallate-as well as quercetin and trans-cinnamic acid. The mean kernel content of the eight catechin compounds was 4000 ppm (0.4% of kernel dry weight), with a 2100-9500 ppm range. Comparison of the profiles of the six major catechins from 40 Vitellaria provenances from 10 African countries showed that the relative proportions of these compounds varied from region to region. Gallic acid was the major phenolic compound, comprising an average of 27% of the measured total phenols and exceeding 70% in some populations. Colorimetric analysis (101 samples) of total polyphenols extracted from shea butter into hexane gave an average of 97 ppm, with the values for different provenances varying between 62 and 135 ppm of total polyphenols.

  6. Tile-Compressed FITS Kernel for IRAF

    NASA Astrophysics Data System (ADS)

    Seaman, R.

    2011-07-01

    The Flexible Image Transport System (FITS) is a ubiquitously supported standard of the astronomical community. Similarly, the Image Reduction and Analysis Facility (IRAF), developed by the National Optical Astronomy Observatory, is a widely used astronomical data reduction package. IRAF supplies compatibility with FITS format data through numerous tools and interfaces. The most integrated of these is IRAF's FITS image kernel that provides access to FITS from any IRAF task that uses the basic IMIO interface. The original FITS kernel is a complex interface of purpose-built procedures that presents growing maintenance issues and lacks recent FITS innovations. A new FITS kernel is being developed at NOAO that is layered on the CFITSIO library from the NASA Goddard Space Flight Center. The simplified interface will minimize maintenance headaches as well as add important new features such as support for the FITS tile-compressed (fpack) format.

  7. Fractal Weyl law for Linux Kernel architecture

    NASA Astrophysics Data System (ADS)

    Ermann, L.; Chepelianskii, A. D.; Shepelyansky, D. L.

    2011-01-01

    We study the properties of spectrum and eigenstates of the Google matrix of a directed network formed by the procedure calls in the Linux Kernel. Our results obtained for various versions of the Linux Kernel show that the spectrum is characterized by the fractal Weyl law established recently for systems of quantum chaotic scattering and the Perron-Frobenius operators of dynamical maps. The fractal Weyl exponent is found to be ν ≈ 0.65 that corresponds to the fractal dimension of the network d ≈ 1.3. An independent computation of the fractal dimension by the cluster growing method, generalized for directed networks, gives a close value d ≈ 1.4. The eigenmodes of the Google matrix of Linux Kernel are localized on certain principal nodes. We argue that the fractal Weyl law should be generic for directed networks with the fractal dimension d < 2.
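    The construction at the heart of this analysis, the Google matrix of a directed call graph, is easy to reproduce on a toy example; the adjacency matrix and damping factor below are illustrative assumptions, and the sketch only shows the matrix and its spectrum, not the fractal Weyl analysis.

    ```python
    # Minimal sketch: Google matrix G = alpha*S + (1-alpha)/N for a toy directed graph.
    import numpy as np

    def google_matrix(adj, alpha=0.85):
        adj = np.asarray(adj, dtype=float)
        n = adj.shape[0]
        out_deg = adj.sum(axis=1)
        # Row-stochastic S: normalize rows; dangling nodes link uniformly to all nodes.
        S = np.where(out_deg[:, None] > 0, adj / np.maximum(out_deg, 1)[:, None], 1.0 / n)
        return alpha * S + (1 - alpha) / n

    # Toy "call graph": entry adj[i, j] = 1 means procedure i calls procedure j.
    adj = np.array([[0, 1, 1, 0],
                    [0, 0, 1, 0],
                    [1, 0, 0, 1],
                    [0, 0, 0, 0]])
    G = google_matrix(adj)
    eigvals = np.linalg.eigvals(G)
    print(np.sort(np.abs(eigvals))[::-1])   # leading eigenvalue is 1
    ```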

  8. A kernel-based approach for biomedical named entity recognition.

    PubMed

    Patra, Rakesh; Saha, Sujan Kumar

    2013-01-01

    Support vector machine (SVM) is one of the popular machine learning techniques used in various text processing tasks including named entity recognition (NER). The performance of the SVM classifier largely depends on the appropriateness of the kernel function. In the last few years a number of task-specific kernel functions have been proposed and used in various text processing tasks, for example, string kernel, graph kernel, tree kernel and so on. So far very few efforts have been devoted to the development of NER task specific kernel. In the literature we found that the tree kernel has been used in NER task only for entity boundary detection or reannotation. The conventional tree kernel is unable to execute the complete NER task on its own. In this paper we have proposed a kernel function, motivated by the tree kernel, which is able to perform the complete NER task. To examine the effectiveness of the proposed kernel, we have applied the kernel function on the openly available JNLPBA 2004 data. Our kernel executes the complete NER task and achieves reasonable accuracy.
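    To make the kernel-plus-SVM pipeline concrete, here is a minimal sketch of training an SVM on a precomputed kernel matrix; the binary feature vectors and the simple overlap kernel are illustrative assumptions and stand in for the paper's tree-motivated NER kernel.

    ```python
    # Minimal sketch: plugging a custom (precomputed) kernel into an SVM classifier.
    import numpy as np
    from sklearn.svm import SVC

    def overlap_kernel(A, B):
        """Similarity = number of shared non-zero features between binary vectors."""
        return A @ B.T

    rng = np.random.default_rng(1)
    X_train = (rng.random((40, 30)) < 0.2).astype(float)
    y_train = rng.integers(0, 2, size=40)
    X_test = (rng.random((10, 30)) < 0.2).astype(float)

    clf = SVC(kernel="precomputed")
    clf.fit(overlap_kernel(X_train, X_train), y_train)   # train kernel: n_train x n_train
    pred = clf.predict(overlap_kernel(X_test, X_train))  # test kernel: n_test x n_train
    print(pred)
    ```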

  9. Classification of Hazelnut Kernels by Using Impact Acoustic Time-Frequency Patterns

    NASA Astrophysics Data System (ADS)

    Kalkan, Habil; Ince, Nuri Firat; Tewfik, Ahmed H.; Yardimci, Yasemin; Pearson, Tom

    2007-12-01

    Hazelnuts with damaged or cracked shells are more prone to infection with aflatoxin producing molds ( Aspergillus flavus). These molds can cause cancer. In this study, we introduce a new approach that separates damaged/cracked hazelnut kernels from good ones by using time-frequency features obtained from impact acoustic signals. The proposed technique requires no prior knowledge of the relevant time and frequency locations. In an offline step, the algorithm adaptively segments impact signals from a training data set in time using local cosine packet analysis and a Kullback-Leibler criterion to assess the discrimination power of different segmentations. In each resulting time segment, the signal is further decomposed into subbands using an undecimated wavelet transform. The most discriminative subbands are selected according to the Euclidean distance between the cumulative probability distributions of the corresponding subband coefficients. The most discriminative subbands are fed into a linear discriminant analysis classifier. In the online classification step, the algorithm simply computes the learned features from the observed signal and feeds them to the linear discriminant analysis (LDA) classifier. The algorithm achieved a throughput rate of 45 nuts/s and a classification accuracy of 96% with the 30 most discriminative features, a higher rate than those provided with prior methods.

  10. Experimental study of turbulent flame kernel propagation

    SciTech Connect

    Mansour, Mohy; Peters, Norbert; Schrader, Lars-Uve

    2008-07-15

    Flame kernels in spark-ignited combustion systems dominate the flame propagation and combustion stability and performance. They are likely controlled by the spark energy, flow field and mixing field. The aim of the present work is to experimentally investigate the structure and propagation of the flame kernel in turbulent premixed methane flow using advanced laser-based techniques. The spark is generated using a pulsed Nd:YAG laser with 20 mJ pulse energy in order to avoid the effect of the electrodes on the flame kernel structure and the shot-to-shot variation of spark energy. Four flames have been investigated at equivalence ratios, φ_j, of 0.8 and 1.0 and jet velocities, U_j, of 6 and 12 m/s. A combined two-dimensional Rayleigh and LIPF-OH technique has been applied. The flame kernel structure has been collected at several time intervals from the laser ignition between 10 µs and 2 ms. The data show that the flame kernel structure starts with a spherical shape and changes gradually to peanut-like, then to mushroom-like, and is finally disturbed by the turbulence. The mushroom-like structure lasts longer in the stoichiometric and slower jet-velocity cases. The growth rate of the average flame kernel radius is divided into two linear regimes; the first one, during the first 100 µs, is almost three times faster than that at the later stage between 100 and 2000 µs. The flame propagation is slightly faster in leaner flames. The trends of the flame propagation, flame radius, flame cross-sectional area and mean flame temperature are related to the jet velocity and equivalence ratio. The relations obtained in the present work allow the prediction of any of these parameters at different conditions.

  11. A dynamic kernel modifier for linux

    SciTech Connect

    Minnich, R. G.

    2002-09-03

    Dynamic Kernel Modifier, or DKM, is a kernel module for Linux that allows user-mode programs to modify the execution of functions in the kernel without recompiling or modifying the kernel source in any way. Functions may be traced, either function entry only or function entry and exit; nullified; or replaced with some other function. For the tracing case, function execution results in the activation of a watchpoint. When the watchpoint is activated, the address of the function is logged in a FIFO buffer that is readable by external applications. The watchpoints are time-stamped with the resolution of the processor high-resolution timers, which on most modern processors are accurate to a single processor tick. DKM is very similar to earlier systems such as the SunOS trace device or Linux TT. Unlike these two systems, and other similar systems, DKM requires no kernel modifications. DKM allows users to do initial probing of the kernel to look for performance problems, or even to resolve potential problems by turning functions off or replacing them. DKM watchpoints are not without cost: it takes about 200 nanoseconds to make a log entry on an 800 MHz Pentium III. The overhead numbers are actually competitive with other hardware-based trace systems, although DKM has less accuracy than an In-Circuit Emulator such as the American Arium. Once the user has zeroed in on a problem, other mechanisms with a higher degree of accuracy can be used.

  12. Kernel abortion in maize. II. Distribution of ¹⁴C among kernel carbohydrates

    SciTech Connect

    Hanft, J.M.; Jones, R.J.

    1986-06-01

    This study was designed to compare the uptake and distribution of ¹⁴C among fructose, glucose, sucrose, and starch in the cob, pedicel, and endosperm tissues of maize (Zea mays L.) kernels induced to abort by high temperature with those that develop normally. Kernels cultured in vitro at 30 and 35°C were transferred to (¹⁴C)sucrose media 10 days after pollination. Kernels cultured at 35°C aborted prior to the onset of linear dry matter accumulation. Significant uptake into the cob, pedicel, and endosperm of radioactivity associated with the soluble and starch fractions of the tissues was detected after 24 hours in culture on labeled media. After 8 days in culture on (¹⁴C)sucrose media, 48 and 40% of the radioactivity associated with the cob carbohydrates was found in the reducing sugars at 30 and 35°C, respectively. Of the total carbohydrates, a higher percentage of label was associated with sucrose and a lower percentage with fructose and glucose in pedicel tissue of kernels cultured at 35°C compared to kernels cultured at 30°C. These results indicate that sucrose was not cleaved to fructose and glucose as rapidly during the unloading process in the pedicel of kernels induced to abort by high temperature. Kernels cultured at 35°C had a much lower proportion of label associated with endosperm starch (29%) than did kernels cultured at 30°C (89%). Kernels cultured at 35°C had a correspondingly higher proportion of ¹⁴C in endosperm fructose, glucose, and sucrose.

  13. Reduced multiple empirical kernel learning machine.

    PubMed

    Wang, Zhe; Lu, MingZhe; Gao, Daqi

    2015-02-01

    Multiple kernel learning (MKL) is demonstrated to be flexible and effective in depicting heterogeneous data sources since MKL can introduce multiple kernels rather than a single fixed kernel into applications. However, MKL incurs a high time and space complexity in contrast to single kernel learning, which is not desirable in real-world applications. Meanwhile, it is known that the kernel mapping ways of MKL generally have two forms, implicit kernel mapping and empirical kernel mapping (EKM), where the latter has attracted less attention. In this paper, we focus on the MKL with the EKM, and propose a reduced multiple empirical kernel learning machine, named RMEKLM for short. To the best of our knowledge, it is the first work to reduce both time and space complexity of the MKL with EKM. Different from the existing MKL, the proposed RMEKLM adopts the Gauss Elimination technique to extract a set of feature vectors, and it is validated that doing so does not lose much information of the original feature space. RMEKLM then adopts the extracted feature vectors to span a reduced orthonormal subspace of the feature space, which is visualized in terms of its geometric structure. It can be demonstrated that the spanned subspace is isomorphic to the original feature space, which means that the dot product of two vectors in the original feature space is equal to that of the two corresponding vectors in the generated orthonormal subspace. More importantly, the proposed RMEKLM brings a simpler computation and meanwhile needs less storage space, especially in the processing of testing. Finally, the experimental results show that RMEKLM delivers an efficient and effective performance in terms of both complexity and classification. The contributions of this paper can be given as follows: (1) by mapping the input space into an orthonormal subspace, the geometry of the generated subspace is visualized; (2) this paper first reduces both the time and space complexity of the EKM-based MKL; (3
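    As background for the abstract above, the sketch below shows the plain empirical kernel mapping (EKM) that RMEKLM builds on: samples are mapped to explicit feature vectors through the eigendecomposition of the training kernel matrix, so that dot products in the mapped space reproduce kernel values. The RBF kernel, the toy data, and the rank cut-off are illustrative assumptions; the paper's Gauss-elimination-based reduction is not reproduced here.

    ```python
    # Minimal sketch: empirical kernel mapping (EKM) for a single kernel.
    import numpy as np

    def rbf_kernel(A, B, gamma=0.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def empirical_kernel_map(X_train, X, gamma=0.5, tol=1e-10):
        K = rbf_kernel(X_train, X_train, gamma)
        vals, vecs = np.linalg.eigh(K)
        keep = vals > tol                         # drop numerically null directions
        P = vecs[:, keep] / np.sqrt(vals[keep])   # n x r projection, r = effective rank
        return rbf_kernel(X, X_train, gamma) @ P

    rng = np.random.default_rng(2)
    X_train = rng.normal(size=(50, 3))
    Z = empirical_kernel_map(X_train, X_train)
    # Dot products of the mapped vectors reproduce the kernel matrix.
    print(np.allclose(Z @ Z.T, rbf_kernel(X_train, X_train), atol=1e-8))
    ```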

  14. Full Waveform Inversion Using Waveform Sensitivity Kernels

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang

    2013-04-01

    We present a full waveform inversion concept for applications ranging from seismological to engineering contexts, in which the steps of forward simulation, computation of sensitivity kernels, and the actual inversion are kept separate from each other. We derive waveform sensitivity kernels from Born scattering theory, which for unit material perturbations are identical to the Born integrand for the considered path between source and receiver. The evaluation of such a kernel requires the calculation of Green functions and their strains for single forces at the receiver position, as well as displacement fields and strains originating at the seismic source. We compute these quantities in the frequency domain using the 3D spectral element code SPECFEM3D (Tromp, Komatitsch and Liu, 2008) and the 1D semi-analytical code GEMINI (Friederich and Dalkolmo, 1995), in both Cartesian and spherical frameworks. We developed and implemented the modularized software package ASKI (Analysis of Sensitivity and Kernel Inversion) to compute waveform sensitivity kernels from wavefields generated by any of the above methods (support for more methods is planned), and some examples will be shown. As the kernels can be computed independently of any data values, this approach allows a sensitivity and resolution analysis to be done first without inverting any data. In the context of active seismic experiments, this property may be used to investigate optimal acquisition geometry and expectable resolution before actually collecting any data, assuming the background model is known sufficiently well. The actual inversion step can then be repeated at relatively low cost with different (sub)sets of data, adding different smoothing conditions. Using the sensitivity kernels, we expect the waveform inversion to have better convergence properties compared with strategies that use gradients of a misfit function. Also the propagation of the forward wavefield and the backward propagation from the receiver

  15. Regularization techniques for PSF-matching kernels - I. Choice of kernel basis

    NASA Astrophysics Data System (ADS)

    Becker, A. C.; Homrighausen, D.; Connolly, A. J.; Genovese, C. R.; Owen, R.; Bickerton, S. J.; Lupton, R. H.

    2012-09-01

    We review current methods for building point spread function (PSF)-matching kernels for the purposes of image subtraction or co-addition. Such methods use a linear decomposition of the kernel on a series of basis functions. The correct choice of these basis functions is fundamental to the efficiency and effectiveness of the matching - the chosen bases should represent the underlying signal using a reasonably small number of shapes, and/or have a minimum number of user-adjustable tuning parameters. We examine methods whose bases comprise multiple Gauss-Hermite polynomials, as well as a form-free basis composed of delta-functions. Kernels derived from delta-functions are unsurprisingly shown to be more expressive; they are able to take more general shapes and perform better in situations where sum-of-Gaussian methods are known to fail. However, due to its many degrees of freedom (the maximum number allowed by the kernel size) this basis tends to overfit the problem and yields noisy kernels having large variance. We introduce a new technique to regularize these delta-function kernel solutions, which bridges the gap between the generality of delta-function kernels and the compactness of sum-of-Gaussian kernels. Through this regularization we are able to create general kernel solutions that represent the intrinsic shape of the PSF-matching kernel with only one degree of freedom, the strength of the regularization λ. The role of λ is effectively to exchange variance in the resulting difference image with variance in the kernel itself. We examine considerations in choosing the value of λ, including statistical risk estimators and the ability of the solution to predict solutions for adjacent areas. Both of these suggest moderate strengths of λ between 0.1 and 1.0, although this optimization is likely data set dependent. This model allows for flexible representations of the convolution kernel that have significant predictive ability and will prove useful in implementing
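    A stripped-down version of the idea, fitting a pixel (delta-function) basis kernel by penalized least squares, is sketched below; the image sizes, kernel size, Tikhonov penalty, and the synthetic "true" kernel are illustrative assumptions and do not reproduce the paper's regularization scheme.

    ```python
    # Minimal sketch: ridge-regression fit of a pixel-basis PSF-matching kernel k
    # such that (reference image convolved with k) approximates the science image.
    import numpy as np

    def fit_matching_kernel(ref, sci, ksize=3, lam=0.01):
        r = ksize // 2
        rows, targets = [], []
        for y in range(r, ref.shape[0] - r):
            for x in range(r, ref.shape[1] - r):
                rows.append(ref[y - r:y + r + 1, x - r:x + r + 1].ravel())
                targets.append(sci[y, x])
        A, b = np.asarray(rows), np.asarray(targets)
        # Ridge (Tikhonov) solution: (A^T A + lam * I) k = A^T b
        k = np.linalg.solve(A.T @ A + lam * np.eye(ksize * ksize), A.T @ b)
        return k.reshape(ksize, ksize)

    rng = np.random.default_rng(3)
    ref = rng.normal(size=(64, 64))
    true_k = np.outer([0.25, 0.5, 0.25], [0.25, 0.5, 0.25])   # a 3x3 smoothing kernel
    sci = np.zeros_like(ref)
    for dy in range(3):
        for dx in range(3):
            sci[1:-1, 1:-1] += true_k[dy, dx] * ref[dy:dy + 62, dx:dx + 62]

    print(np.round(fit_matching_kernel(ref, sci), 3))   # recovers true_k closely
    ```

    Increasing lam trades fidelity to the data for a smoother, lower-variance kernel, which is the same exchange of variance between the difference image and the kernel described in the abstract.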

  16. Fisher Information, Entropy, and the Second and Third Laws of Thermodynamics

    EPA Science Inventory

    We propose Fisher Information as a new calculable thermodynamic property that can be shown to follow the Second and the Third Laws of Thermodynamics. Fisher Information is, however, qualitatively different from entropy and potentially possessing a great deal more structure. Hence...

  17. A Case of Miller Fisher Syndrome, Thromboembolic Disease, and Angioedema: Association or Coincidence?

    PubMed Central

    Salehi, Nooshin; Choi, Eric D.; Garrison, Roger C.

    2017-01-01

    Patient: Male, 32 Final Diagnosis: Miller Fisher syndrome Symptoms: Ataxia • headache • ophthalmoplegia Medication: — Clinical Procedure: Plasmapheresis Specialty: Neurology Objective: Rare co-existence of disease or pathology Background: Miller Fisher Syndrome is characterized by the clinical triad of ophthalmoplegia, ataxia, and areflexia, and is considered to be a variant of Guillain-Barre Syndrome. Miller Fisher Syndrome is observed in approximately 1–5% of all Guillain-Barre cases in Western countries. Patients with Miller Fisher Syndrome usually have good recovery without residual deficits. Venous thromboembolism is a common complication of Guillain-Barre Syndrome and has also been reported in Miller Fisher Syndrome, but it has generally been reported in the presence of at least one prothrombotic risk factor such as immobility. A direct correlation between venous thromboembolism and Miller Fisher Syndrome or Guillain-Barre Syndrome has not been previously described. Case Report: We report the case of a 32-year-old Hispanic male who presented with acute, severe thromboembolic disease and concurrently demonstrated characteristic clinical features of Miller Fisher Syndrome including ophthalmoplegia, ataxia, and areflexia. Past medical and family history were negative for thromboembolic disease, and subsequent hypercoagulability workup was unremarkable. During the course of hospitalization, the patient also developed angioedema. Conclusions: We describe a possible association between Miller Fisher Syndrome, thromboembolic disease, and angioedema. PMID:28090073

  18. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Fishers Island Sound, Stonington, Conn. 110.50a Section 110.50a Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island...

  19. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Fishers Island Sound, Stonington, Conn. 110.50a Section 110.50a Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island...

  20. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Fishers Island Sound, Stonington, Conn. 110.50a Section 110.50a Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island...

  1. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Fishers Island Sound, Stonington, Conn. 110.50a Section 110.50a Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island...

  2. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Fishers Island Sound, Stonington, Conn. 110.50a Section 110.50a Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island...

  3. Evaluating the sustainability of a regional system using Fisher information in the San Luis Basin, Colorado

    EPA Science Inventory

    This paper describes the theory, data, and methodology necessary for using Fisher information to assess the sustainability of the San Luis Basin (SLB) regional system over time. Fisher information was originally developed as a measure of the information content in data and is an ...

  4. Accuracy of Reduced and Extended Thin-Wire Kernels

    SciTech Connect

    Burke, G J

    2008-11-24

    Some results are presented comparing the accuracy of the reduced thin-wire kernel and an extended kernel with exact integration of the 1/R term of the Green's function, with results shown for simple wire structures.

  5. Centered Kernel Alignment Enhancing Neural Network Pretraining for MRI-Based Dementia Diagnosis

    PubMed Central

    Cárdenas-Peña, David; Collazos-Huertas, Diego; Castellanos-Dominguez, German

    2016-01-01

    Dementia is a growing problem that affects elderly people worldwide. More accurate evaluation of dementia diagnosis can help during the medical examination. Several methods for computer-aided dementia diagnosis have been proposed using resonance imaging scans to discriminate between patients with Alzheimer's disease (AD) or mild cognitive impairment (MCI) and healthy controls (NC). Nonetheless, the computer-aided diagnosis is especially challenging because of the heterogeneous and intermediate nature of MCI. We address the automated dementia diagnosis by introducing a novel supervised pretraining approach that takes advantage of the artificial neural network (ANN) for complex classification tasks. The proposal initializes an ANN based on linear projections to achieve more discriminating spaces. Such projections are estimated by maximizing the centered kernel alignment criterion that assesses the affinity between the resonance imaging data kernel matrix and the label target matrix. As a result, the performed linear embedding allows accounting for features that contribute the most to the MCI class discrimination. We compare the supervised pretraining approach to two unsupervised initialization methods (autoencoders and Principal Component Analysis) and against the best four performing classification methods of the 2014 CADDementia challenge. As a result, our proposal outperforms all the baselines (by 7% in classification accuracy and area under the receiver-operating-characteristic curve) while also reducing class bias. PMID:27148392
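    The alignment criterion mentioned above has a compact closed form, sketched below: the normalized Frobenius inner product between a centered data kernel and a centered label kernel. The linear kernel and the toy two-class data are illustrative assumptions, not the MRI features used in the paper.

    ```python
    # Minimal sketch: centered kernel alignment (CKA) between a data kernel and a label kernel.
    import numpy as np

    def centered_kernel_alignment(K, L):
        n = K.shape[0]
        H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
        Kc, Lc = H @ K @ H, H @ L @ H
        return np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc))

    rng = np.random.default_rng(4)
    y = rng.integers(0, 2, size=60)
    X = rng.normal(size=(60, 10)) + 2.0 * y[:, None]     # classes shifted apart
    K = X @ X.T                                          # linear kernel on the data
    L = (y[:, None] == y[None, :]).astype(float)         # ideal label (target) kernel
    print(round(centered_kernel_alignment(K, L), 3))     # high alignment for separated classes
    ```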

  6. Analysis of maize ( Zea mays ) kernel density and volume using microcomputed tomography and single-kernel near-infrared spectroscopy.

    PubMed

    Gustin, Jeffery L; Jackson, Sean; Williams, Chekeria; Patel, Anokhee; Armstrong, Paul; Peter, Gary F; Settles, A Mark

    2013-11-20

    Maize kernel density affects milling quality of the grain. Kernel density of bulk samples can be predicted by near-infrared reflectance (NIR) spectroscopy, but no accurate method to measure individual kernel density has been reported. This study demonstrates that individual kernel density and volume are accurately measured using X-ray microcomputed tomography (μCT). Kernel density was significantly correlated with kernel volume, air space within the kernel, and protein content. Embryo density and volume did not influence overall kernel density. Partial least-squares (PLS) regression of μCT traits with single-kernel NIR spectra gave stable predictive models for kernel density (R² = 0.78, SEP = 0.034 g/cm³) and volume (R² = 0.86, SEP = 2.88 cm³). Density and volume predictions were accurate for data collected over 10 months based on kernel weights calculated from predicted density and volume (R² = 0.83, SEP = 24.78 mg). Kernel density was significantly correlated with bulk test weight (r = 0.80), suggesting that selection of dense kernels can translate to improved agronomic performance.
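    As a generic illustration of the modeling step described above (not the study's data or its μCT-derived reference values), the sketch below fits a PLS regression of a scalar trait on synthetic spectra; the number of latent components and the simulated spectra are assumptions.

    ```python
    # Minimal sketch: PLS regression of a scalar trait on (synthetic) spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    n_samples, n_wavelengths = 200, 150
    spectra = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)   # smooth-ish curves
    weights = np.zeros(n_wavelengths)
    weights[40:60] = 0.02                                                  # a few informative bands
    trait = spectra @ weights + 1.30 + rng.normal(scale=0.02, size=n_samples)

    X_tr, X_te, y_tr, y_te = train_test_split(spectra, trait, random_state=0)
    pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
    print(f"test R^2 = {pls.score(X_te, y_te):.2f}")
    ```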

  7. Fabrication of Uranium Oxycarbide Kernels for HTR Fuel

    SciTech Connect

    Charles Barnes; CLay Richardson; Scott Nagley; John Hunn; Eric Shaber

    2010-10-01

    Babcock and Wilcox (B&W) has been producing high quality uranium oxycarbide (UCO) kernels for Advanced Gas Reactor (AGR) fuel tests at the Idaho National Laboratory. In 2005, 350-µm, 19.7% 235U-enriched UCO kernels were produced for the AGR-1 test fuel. Following coating of these kernels and forming the coated particles into compacts, this fuel was irradiated in the Advanced Test Reactor (ATR) from December 2006 until November 2009. B&W produced 425-µm, 14% enriched UCO kernels in 2008, and these kernels were used to produce fuel for the AGR-2 experiment that was inserted in ATR in 2010. B&W also produced 500-µm, 9.6% enriched UO2 kernels for the AGR-2 experiments. Kernels of the same size and enrichment as AGR-1 were also produced for the AGR-3/4 experiment. In addition to fabricating enriched UCO and UO2 kernels, B&W has produced more than 100 kg of natural uranium UCO kernels which are being used in coating development tests. Successive lots of kernels have demonstrated consistent high quality and also allowed for fabrication process improvements. Improvements in kernel forming were made subsequent to AGR-1 kernel production. Following fabrication of AGR-2 kernels, incremental increases in sintering furnace charge size have been demonstrated. Recently, small-scale sintering tests using a small development furnace equipped with a residual gas analyzer (RGA) have increased understanding of how kernel sintering parameters affect sintered kernel properties. The steps taken to increase throughput and process knowledge have reduced kernel production costs. Studies have been performed of additional modifications toward the goal of increasing capacity of the current fabrication line to use for production of first core fuel for the Next Generation Nuclear Plant (NGNP) and providing a basis for the design of a full scale fuel fabrication facility.

  8. Fisher's contributions to genetics and heredity, with special emphasis on the Gregor Mendel controversy.

    PubMed

    Piegorsch, W W

    1990-12-01

    R. A. Fisher is widely respected for his contributions to both statistics and genetics. For instance, his 1930 text on The Genetical Theory of Natural Selection remains a watershed contribution in that area. Fisher's subsequent research led him to study the work of (Johann) Gregor Mendel, the 19th century monk who first developed the basic principles of heredity with experiments on garden peas. In examining Mendel's original 1865 article, Fisher noted that the conformity between Mendel's reported and proposed (theoretical) ratios of segregating individuals was unusually good, "too good" perhaps. The resulting controversy as to whether Mendel "cooked" his data for presentation has continued to the current day. This review highlights Fisher's most salient points as regards Mendel's "too good" fit, within the context of Fisher's extensive contributions to the development of genetical and evolutionary theory.

  9. Post-tsunami relocation of fisher settlements in South Asia: evidence from the Coromandel Coast, India.

    PubMed

    Bavinck, Maarten; de Klerk, Leo; van der Plaat, Felice; Ravesteijn, Jorik; Angel, Dominique; Arendsen, Hendrik; van Dijk, Tom; de Hoog, Iris; van Koolwijk, Ant; Tuijtel, Stijn; Zuurendonk, Benjamin

    2015-07-01

    The tsunami that struck the coasts of India on 26 December 2004 resulted in the large-scale destruction of fisher habitations. The post-tsunami rehabilitation effort in Tamil Nadu was directed towards relocating fisher settlements in the interior. This paper discusses the outcomes of a study on the social effects of relocation in a sample of nine communities along the Coromandel Coast. It concludes that, although the participation of fishing communities in house design and in allocation procedures has been limited, many fisher households are satisfied with the quality of the facilities. The distance of the new settlements to the shore, however, is regarded as an impediment to engaging in the fishing profession, and many fishers are actually moving back to their old locations. This raises questions as to the direction of coastal zone policy in India, as well as to the weight accorded to safety (and other coastal development interests) vis-à-vis the livelihood needs of fishers.

  10. The interaction between seaweed farming as an alternative occupation and fisher numbers in the central Philippines.

    PubMed

    Hill, Nicholas A O; Rowcliffe, J Marcus; Koldewey, Heather J; Milner-Gulland, E J

    2012-04-01

    Alternative occupations are frequently promoted as a means to reduce the number of people exploiting declining fisheries. However, there is little evidence that alternative occupations reduce fisher numbers. Seaweed farming is frequently promoted as a lucrative alternative occupation for artisanal fishers in Southeast Asia. We examined how the introduction of seaweed farming has affected village-level changes in the number of fishers on Danajon Bank, central Philippines, where unsustainable fishing has led to declining fishery yields. To determine how fisher numbers had changed since seaweed farming started, we interviewed the heads of household from 300 households in 10 villages to examine their perceptions of how fisher numbers had changed in their village and the reasons they associated with these changes. We then asked key informants (people with detailed knowledge of village members) to estimate fisher numbers in these villages before seaweed farming began and at the time of the survey. We compared the results of how fisher numbers had changed in each village with the wealth, education, seaweed farm sizes, and other attributes of households in these villages, which we collected through interviews, and with village-level factors such as distance to markets. We also asked people why they either continued to engage in or ceased fishing. In four villages, respondents thought seaweed farming and low fish catches had reduced fisher numbers, at least temporarily. In one of these villages, there was a recent return to fishing due to declines in the price of seaweed and increased theft of seaweed. In another four villages, fisher numbers increased as human population increased, despite the widespread uptake of seaweed farming. Seaweed farming failed for technical reasons in two other villages. Our results suggest seaweed farming has reduced fisher numbers in some villages, a result that may be correlated with socioeconomic status, but the heterogeneity of outcomes is

  11. 7 CFR 868.254 - Broken kernels determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Broken kernels determination. 868.254 Section 868.254 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Governing Application of Standards § 868.254 Broken kernels determination. Broken kernels shall...

  12. 7 CFR 868.304 - Broken kernels determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Broken kernels determination. 868.304 Section 868.304 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Application of Standards § 868.304 Broken kernels determination. Broken kernels shall be determined by the...

  13. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Determination of kernel weight. 981.60 Section 981.60... Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  14. Multiple spectral kernel learning and a gaussian complexity computation.

    PubMed

    Reyhani, Nima

    2013-07-01

    Multiple kernel learning (MKL) partially solves the kernel selection problem in support vector machines and similar classifiers by minimizing the empirical risk over a subset of the linear combination of given kernel matrices. For large sample sets, the size of the kernel matrices becomes a numerical issue. In many cases, the kernel matrix is of low effective rank. However, the low-rank property is not efficiently utilized in MKL algorithms. Here, we suggest multiple spectral kernel learning that efficiently uses the low-rank property by finding a kernel matrix from a set of Gram matrices of a few eigenvectors from all given kernel matrices, called a spectral kernel set. We provide a new bound for the gaussian complexity of the proposed kernel set, which depends on both the geometry of the kernel set and the number of Gram matrices. This characterization of the complexity implies that in an MKL setting, adding more kernels may not monotonically increase the complexity, while previous bounds show otherwise.

  15. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 3 2010-04-01 2009-04-01 true Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in...

  16. 7 CFR 981.61 - Redetermination of kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Redetermination of kernel weight. 981.61 Section 981... GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.61 Redetermination of kernel weight. The Board, on the basis of reports by handlers, shall redetermine the kernel weight of...

  17. Thermomechanical property of rice kernels studied by DMA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The thermomechanical property of the rice kernels was investigated using a dynamic mechanical analyzer (DMA). The length change of rice kernel with a loaded constant force along the major axis direction was detected during temperature scanning. The thermomechanical transition occurred in rice kernel...

  18. NIRS method for precise identification of Fusarium damaged wheat kernels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Development of scab resistant wheat varieties may be enhanced by non-destructive evaluation of kernels for Fusarium damaged kernels (FDKs) and deoxynivalenol (DON) levels. Fusarium infection generally affects kernel appearance, but insect damage and other fungi can cause similar symptoms. Also, some...

  19. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... shall mean the actual gross weight of any lot of almonds: Less weight of containers; less moisture of... material, 350 grams, and moisture content of kernels, seven percent. Excess moisture is two percent. The...: Edible kernels, 840 grams; inedible kernels, 120 grams; foreign material, 40 grams; and moisture...

  20. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... shall mean the actual gross weight of any lot of almonds: Less weight of containers; less moisture of... material, 350 grams, and moisture content of kernels, seven percent. Excess moisture is two percent. The...: Edible kernels, 840 grams; inedible kernels, 120 grams; foreign material, 40 grams; and moisture...

  1. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... shall mean the actual gross weight of any lot of almonds: Less weight of containers; less moisture of... material, 350 grams, and moisture content of kernels, seven percent. Excess moisture is two percent. The...: Edible kernels, 840 grams; inedible kernels, 120 grams; foreign material, 40 grams; and moisture...

  2. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... shall mean the actual gross weight of any lot of almonds: Less weight of containers; less moisture of... material, 350 grams, and moisture content of kernels, seven percent. Excess moisture is two percent. The...: Edible kernels, 840 grams; inedible kernels, 120 grams; foreign material, 40 grams; and moisture...

  3. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... shall mean the actual gross weight of any lot of almonds: Less weight of containers; less moisture of... material, 350 grams, and moisture content of kernels, seven percent. Excess moisture is two percent. The...: Edible kernels, 840 grams; inedible kernels, 120 grams; foreign material, 40 grams; and moisture...

  4. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  5. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  6. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  7. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  8. Protein Structure Prediction Using String Kernels

    DTIC Science & Technology

    2006-03-03

    The dataset consists of 4352 sequences from SCOP version 1.53 extracted from the Astral database, grouped into families and superfamilies. The dataset is processed

  9. Kernel Temporal Differences for Neural Decoding

    PubMed Central

    Bae, Jihye; Sanchez Giraldo, Luis G.; Pohlmeyer, Eric A.; Francis, Joseph T.; Sanchez, Justin C.; Príncipe, José C.

    2015-01-01

    We study the feasibility and capability of the kernel temporal difference (KTD)(λ) algorithm for neural decoding. KTD(λ) is an online, kernel-based learning algorithm, which has been introduced to estimate value functions in reinforcement learning. This algorithm combines kernel-based representations with the temporal difference approach to learning. One of our key observations is that by using strictly positive definite kernels, the algorithm's convergence can be guaranteed for policy evaluation. The algorithm's nonlinear functional approximation capabilities are shown in both simulations of policy evaluation and neural decoding problems (policy improvement). KTD can handle high-dimensional neural states containing spatial-temporal information at a reasonable computational complexity, allowing real-time applications. When the algorithm seeks a proper mapping between a monkey's neural states and desired positions of a computer cursor or a robot arm, in both open-loop and closed-loop experiments, it can effectively learn the neural state to action mapping. Finally, a visualization of the coadaptation process between the decoder and the subject shows the algorithm's capabilities in reinforcement learning brain machine interfaces. PMID:25866504

  10. Convolution kernels for multi-wavelength imaging

    NASA Astrophysics Data System (ADS)

    Boucaud, A.; Bocchio, M.; Abergel, A.; Orieux, F.; Dole, H.; Hadj-Youcef, M. A.

    2016-12-01

    Astrophysical images issued from different instruments and/or spectral bands often need to be processed together, either for fitting or comparison purposes. However, each image is affected by an instrumental response, also known as the point-spread function (PSF), that depends on the characteristics of the instrument as well as the wavelength and the observing strategy. Given the knowledge of the PSF in each band, a straightforward way of processing images is to homogenise them all to a target PSF using convolution kernels, so that they appear as if they had been acquired by the same instrument. We propose an algorithm that generates such PSF-matching kernels, based on Wiener filtering with a tunable regularisation parameter. This method ensures that all anisotropic features in the PSFs are taken into account. We compare our method to existing procedures using measured Herschel/PACS and SPIRE PSFs and simulated JWST/MIRI PSFs. Significant gains up to two orders of magnitude are obtained with respect to the use of kernels computed assuming Gaussian or circularised PSFs. Software to compute these kernels is available at https://github.com/aboucaud/pypher
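    The core of the approach can be illustrated in a few lines of numpy: divide the target PSF by the source PSF in Fourier space, with a regularisation term damping the poorly constrained high frequencies. The Gaussian PSFs and the regularisation parameter mu below are illustrative assumptions, and this sketch is not the pypher implementation.

    ```python
    # Minimal sketch: Wiener-like PSF-matching kernel K = T * conj(S) / (|S|^2 + mu).
    import numpy as np

    def gaussian_psf(size, sigma):
        y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
        p = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
        return p / p.sum()

    def matching_kernel(psf_source, psf_target, mu=1e-3):
        S = np.fft.fft2(np.fft.ifftshift(psf_source))
        T = np.fft.fft2(np.fft.ifftshift(psf_target))
        K = T * np.conj(S) / (np.abs(S) ** 2 + mu)   # regularised deconvolution
        return np.real(np.fft.fftshift(np.fft.ifft2(K)))

    narrow, broad = gaussian_psf(65, 2.0), gaussian_psf(65, 3.0)
    k = matching_kernel(narrow, broad)
    # Convolving the narrow PSF with k should reproduce the broad PSF (small residual).
    conv = np.real(np.fft.fftshift(np.fft.ifft2(
        np.fft.fft2(np.fft.ifftshift(narrow)) * np.fft.fft2(np.fft.ifftshift(k)))))
    print(np.abs(conv - broad).max())
    ```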

  11. Management decision making for fisher populations informed by occupancy modeling

    USGS Publications Warehouse

    Fuller, Angela K.; Linden, Daniel W.; Royle, J. Andrew

    2016-01-01

    Harvest data are often used by wildlife managers when setting harvest regulations for species because the data are regularly collected and do not require implementation of logistically and financially challenging studies to obtain the data. However, when harvest data are not available because an area had not previously supported a harvest season, alternative approaches are required to help inform management decision making. When distribution or density data are required across large areas, occupancy modeling is a useful approach, and under certain conditions, can be used as a surrogate for density. We collaborated with the New York State Department of Environmental Conservation (NYSDEC) to conduct a camera trapping study across a 70,096-km2 region of southern New York in areas that were currently open to fisher (Pekania [Martes] pennanti) harvest and those that had been closed to harvest for approximately 65 years. We used detection–nondetection data at 826 sites to model occupancy as a function of site-level landscape characteristics while accounting for sampling variation. Fisher occupancy was influenced positively by the proportion of conifer and mixed-wood forest within a 15-km2 grid cell and negatively associated with road density and the proportion of agriculture. Model-averaged predictions indicated high occupancy probabilities (>0.90) when road densities were low (<1 km/km2) and coniferous and mixed forest proportions were high (>0.50). Predicted occupancy ranged 0.41–0.67 in wildlife management units (WMUs) currently open to trapping, which could be used to guide a minimum occupancy threshold for opening new areas to trapping seasons. There were 5 WMUs that had been closed to trapping but had an average predicted occupancy of 0.52 (0.07 SE), and above the threshold of 0.41. These areas are currently under consideration by NYSDEC for opening a conservative harvest season. We demonstrate the use of occupancy modeling as an aid to management

  12. Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to characterize vegetation recovery after fire disturbance

    NASA Astrophysics Data System (ADS)

    Lanorte, Antonio; Lasaponara, Rosa; Lovallo, Michele; Telesca, Luciano

    2014-02-01

    The time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed by using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites located in Galizia (North Spain) and Peloponnese (South Greece), selected for the vast fires which occurred during the summer of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland in the Galizia test site and evergreen forest in Peloponnese. Time series of MVC-NDVI have been analyzed before and after the occurrence of the fire events. Results obtained for both the investigated areas clearly pointed out that the dynamics of the pixel time series before the occurrence of the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the occurrence of the fire feature a higher degree of organization and order. In particular, this discrimination is more evident for the Peloponnese fire than for the Galizia fire. This suggests a clear possibility to discriminate the different post-fire behaviors and dynamics exhibited by the different vegetation covers.
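    For readers unfamiliar with the FS plane, the sketch below computes its two coordinates, the Shannon entropy power N_X and a histogram-based Fisher information measure I_X, for a plain Gaussian series; the binning rule and the test series are illustrative assumptions rather than the NDVI processing used in the paper.

    ```python
    # Minimal sketch: Fisher-Shannon plane coordinates (N_X, I_X) from a 1-D series.
    import numpy as np

    def fisher_shannon(series, bins=40):
        p, edges = np.histogram(series, bins=bins, density=True)
        dx = edges[1] - edges[0]
        nz = p > 0
        H = -np.sum(p[nz] * np.log(p[nz])) * dx            # differential entropy
        N = np.exp(2 * H) / (2 * np.pi * np.e)             # Shannon entropy power
        dpdx = np.diff(p) / dx                             # finite-difference density slope
        mask = p[:-1] > 0
        I = np.sum(dpdx[mask] ** 2 / p[:-1][mask]) * dx    # Fisher information measure
        return N, I

    rng = np.random.default_rng(6)
    N, I = fisher_shannon(rng.normal(size=20000))
    # For an ideal Gaussian N_X * I_X = 1; the crude histogram estimate overshoots a little.
    print(f"N_X = {N:.3f}  I_X = {I:.3f}  N_X * I_X = {N * I:.2f}")
    ```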

  13. Kernel weights optimization for error diffusion halftoning method

    NASA Astrophysics Data System (ADS)

    Fedoseev, Victor

    2015-02-01

    This paper describes a study to find the best error diffusion kernel for digital halftoning under various restrictions on the number of non-zero kernel coefficients and their set of values. As an objective measure of quality, WSNR was used. The problem of multidimensional optimization was solved numerically using several well-known algorithms: Nelder-Mead, BFGS, and others. The study found a kernel function that provides a quality gain of about 5% in comparison with the best of the commonly used kernels, the one introduced by Floyd and Steinberg. Other kernels obtained allow the computational complexity of the halftoning process to be significantly reduced without reducing its quality.
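    For reference, the baseline kernel mentioned above is easy to state in code; the sketch below halftones a gray-level gradient with the classic Floyd-Steinberg weights (7, 3, 5, 1)/16, with the test image being an illustrative assumption.

    ```python
    # Minimal sketch: error-diffusion halftoning with the Floyd-Steinberg kernel.
    import numpy as np

    def error_diffusion(img, kernel=((0, 0, 7), (3, 5, 1)), divisor=16):
        out = img.astype(float).copy()
        h, w = out.shape
        for y in range(h):
            for x in range(w):
                old = out[y, x]
                new = 1.0 if old >= 0.5 else 0.0
                out[y, x] = new
                err = old - new
                # Push the quantization error onto not-yet-visited neighbours.
                for dy, row in enumerate(kernel):
                    for dx, weight in enumerate(row):
                        xx, yy = x + dx - 1, y + dy
                        if weight and 0 <= xx < w and yy < h:
                            out[yy, xx] += err * weight / divisor
        return out

    gradient = np.tile(np.linspace(0, 1, 128), (32, 1))
    halftone = error_diffusion(gradient)
    print(gradient.mean(), halftone.mean())   # mean intensity is approximately preserved
    ```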

  14. Generalization Performance of Regularized Ranking With Multiscale Kernels.

    PubMed

    Zhou, Yicong; Chen, Hong; Lan, Rushi; Pan, Zhibin

    2016-05-01

    The regularized kernel method for the ranking problem has attracted increasing attention in machine learning. Previous regularized ranking algorithms are usually based on reproducing kernel Hilbert spaces with a single kernel. In this paper, we go beyond this framework by investigating the generalization performance of regularized ranking with multiscale kernels. A novel ranking algorithm with multiscale kernels is proposed and its representer theorem is proved. We establish the upper bound of the generalization error in terms of the complexity of hypothesis spaces. It shows that the multiscale ranking algorithm can achieve satisfactory learning rates under mild conditions. Experiments demonstrate the effectiveness of the proposed method for drug discovery and recommendation tasks.

  15. The Local Tully–Fisher Relation for Dwarf Galaxies

    NASA Astrophysics Data System (ADS)

    Karachentsev, Igor D.; Kaisina, Elena I.; Kashibadze (Nasonova), Olga G.

    2017-01-01

    We study different incarnations of the Tully–Fisher (TF) relation for the Local Volume (LV) galaxies taken from the Updated Nearby Galaxy Catalog. The UNGC sample contains 656 galaxies with W50 H I line-width estimates, mostly belonging to low-mass dwarfs. Of them, 296 objects have distances measured with accuracies better than 10%. For the sample of 331 LV galaxies having baryonic masses log(M_bar/M_⊙) > 5.8, we obtain a relation log M_bar = 2.49 log W50 + 3.97 with an observed scatter of 0.38 dex. The largest factors affecting the scatter are observational errors in K-band magnitudes and W50 line widths for the tiny dwarfs, as well as the uncertainty of their inclinations. We find that accounting for the surface brightness of the LV galaxies or their gas fraction, specific star-formation rate, or isolation index does not essentially reduce the observed scatter on the baryonic TF diagram. We also notice that a sample of 71 dSph satellites of the Milky Way and M31 with a known stellar velocity dispersion σ* tends to follow nearly the same bTF relation, having slightly lower masses than that of late-type dwarfs.

  16. THE SLOPE OF THE BARYONIC TULLY-FISHER RELATION

    SciTech Connect

    Gurovich, Sebastian; Freeman, Kenneth; Jerjen, Helmut; Staveley-Smith, Lister; Puerari, Ivanio

    2010-09-15

    We present the results of a baryonic Tully-Fisher relation (BTFR) study for a local sample of relatively isolated disk galaxies. We derive a BTFR with a slope near 3 measured over about 4 dex in baryon mass for our combined H I and bright spiral disk samples. This BTFR is significantly flatter and has less scatter than the TFR (stellar mass only) with its slope near 4 reported for other samples and studies. A BTFR slope near 3 is in better agreement with the expected slope from simple ΛCDM cosmological simulations that include both stellar and gas baryons. The scatter in the TFR/BTFR appears to depend on W_20: galaxies that rotate slower have more scatter. The atomic gas-to-stars ratio shows a break near W_20 = 250 km/s, probably associated with a change in star formation efficiency. In contrast, the absence of such a break in the BTFR suggests that this relation was probably set at the main epoch of baryon dissipation rather than as a product of later galactic evolution.

  17. Trichloroethylene induces dopaminergic neurodegeneration in Fisher 344 rats.

    PubMed

    Liu, Mei; Choi, Dong-Young; Hunter, Randy L; Pandya, Jignesh D; Cass, Wayne A; Sullivan, Patrick G; Kim, Hyoung-Chun; Gash, Don M; Bing, Guoying

    2010-02-01

    Trichloroethylene, a chlorinated solvent widely used as a degreasing agent, is a common environmental contaminant. Emerging evidence suggests that chronic exposure to trichloroethylene may contribute to the development of Parkinson's disease. The purpose of this study was to determine if selective loss of nigrostriatal dopaminergic neurons could be reproduced by systemic exposure of adult Fisher 344 rats to trichloroethylene. In our experiments, oral administration of trichloroethylene induced a significant loss of dopaminergic neurons in the substantia nigra pars compacta in a dose-dependent manner, whereas the numbers of both cholinergic and GABAergic neurons were not decreased in the striatum. There was a robust decline in striatal levels of 3,4-dihydroxyphenylacetic acid without a significant depletion of striatal dopamine. Rats treated with trichloroethylene showed defects in the rotarod behavior test. We also found significantly reduced mitochondrial complex I activity with elevated oxidative stress markers and activated microglia in the nigral area. In addition, we observed intracellular alpha-synuclein accumulation in the dorsal motor nucleus of the vagus nerve, with some in nigral neurons, but little in neurons of the cerebral cortex. Overall, our animal model exhibits some important features of Parkinsonism, and further supports that trichloroethylene may be an environmental risk factor for Parkinson's disease.

  18. Reproductive Biology of Leptocybe invasa Fisher & La Salle (Hymenoptera: Eulophidae).

    PubMed

    Zheng, X-L; Huang, Z-Y; Li, J; Yang, Z-D; Yang, X-H; Lu, W

    2017-03-14

    Leptocybe invasa Fisher & La Salle (Hymenoptera: Eulophidae) is an invasive pest in Eucalyptus plantations around the world. The successful colonization of L. invasa is possibly related to its reproductive biology. The objective of this study was to examine the reproductive biology of L. invasa. In Guangxi Province, the sex ratio (proportion of female, 0.99) of L. invasa was female-dominant throughout the year based on natural and artificial infestation. This result was similar to the ratios observed for other geographic populations in China, including those in Fujian (0.99), Guangdong (0.98), Hainan (0.95), Jiangxi (0.96), and Sichuan (0.99). The offspring sex ratio favored females. A large number of females emerged from the galls produced by females, with few males found. Galls on the petioles and midribs of Eucalyptus plants could be caused by newly emerged females with mature eggs. The lengths of the ovariole, spermatheca, common oviduct, and reproductive glands did not differ among L. invasa females, but their lateral oviducts showed differences from 0 to 42 h after emergence, indicating that this insect is proovigenic. These results could explain why L. invasa populations can rapidly increase in invaded areas.

  19. Using Fisher information to track stability in multivariate systems

    PubMed Central

    Derrible, Sybil; Eason, Tarsha; Cabezas, Heriberto

    2016-01-01

    With the current proliferation of data, the proficient use of statistical and mining techniques offers substantial benefits to capture useful information from any dataset. As numerous approaches make use of information theory concepts, here we discuss how Fisher information (FI) can be applied to sustainability science problems and used in data mining applications by analysing patterns in data. FI was developed as a measure of information content in data, and it has been adapted to assess order in complex system behaviour. The main advantage of the approach is the ability to collapse multiple variables into an index that can be used to assess stability and track overall trends in a system, including its regimes and regime shifts. Here, we provide a brief overview of FI theory, followed by a simple step-by-step numerical example of how to compute FI. Furthermore, we introduce an open-source Python library that can be freely downloaded from GitHub, and we use it in a simple case study to evaluate the evolution of FI for the global-mean temperature from 1880 to 2015. Results indicate significant declines in FI starting in 1978, suggesting a possible regime shift. PMID:28018650
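
    The abstract does not spell out the discrete estimator used by the accompanying Python library, so the sketch below assumes one binned approximation that is commonly quoted in this line of work, FI ≈ 4 Σ_i (√p_i − √p_{i+1})², where p_i are the probabilities of discretized system states within a moving window. The function names, window length, and bin count are illustrative assumptions, not part of the published library.

        import numpy as np

        def fisher_information(window, n_bins=10):
            """Binned approximation of Fisher information for one window of a 1-D series.

            Assumes the discrete form FI ~= 4 * sum_i (sqrt(p_i) - sqrt(p_{i+1}))^2,
            where p_i is the fraction of points falling in bin i (an assumption; the
            published library may use a different state-binning scheme).
            """
            counts, _ = np.histogram(window, bins=n_bins)
            p = counts / counts.sum()
            q = np.sqrt(p)
            return 4.0 * np.sum(np.diff(q) ** 2)

        def sliding_fi(series, window_size=48, step=12, n_bins=10):
            """Evaluate FI over overlapping windows to track changes in system order."""
            return np.array([
                fisher_information(series[start:start + window_size], n_bins)
                for start in range(0, len(series) - window_size + 1, step)
            ])

        # Illustrative series: a noisy signal whose variability changes halfway through,
        # loosely mimicking a regime shift.
        rng = np.random.default_rng(0)
        series = np.concatenate([rng.normal(0.0, 0.2, 600), rng.normal(0.5, 0.6, 600)])
        print(sliding_fi(series).round(3))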

  20. Distribution of quantum Fisher information in asymmetric cloning machines

    PubMed Central

    Xiao, Xing; Yao, Yao; Zhou, Lei-Ming; Wang, Xiaoguang

    2014-01-01

    An unknown quantum state cannot be copied and broadcast freely due to the no-cloning theorem. Approximate cloning schemes have been proposed to achieve the optimal cloning characterized by the maximal fidelity between the original and its copies. Here, from the perspective of quantum Fisher information (QFI), we investigate the distribution of QFI in asymmetric cloning machines which produce two nonidentical copies. As one might expect, improving the QFI of one copy results in decreasing the QFI of the other copy. It is perhaps also unsurprising that asymmetric phase-covariant cloning outperforms universal cloning in distributing QFI since a priori information of the input state has been utilized. However, interesting results appear when we compare the distributabilities of fidelity (which quantifies the full information of quantum states), and QFI (which only captures the information of relevant parameters) in asymmetric cloning machines. Unlike the results of fidelity, where the distributability of symmetric cloning is always optimal for any d-dimensional cloning, we find that any asymmetric cloning outperforms symmetric cloning on the distribution of QFI for d ≤ 18, whereas some but not all asymmetric cloning strategies could be worse than symmetric ones when d > 18. PMID:25484234

  1. Generalized Fisher information matrix in nonextensive systems with spatial correlation

    NASA Astrophysics Data System (ADS)

    Hasegawa, Hideo

    2009-11-01

    By using the q-Gaussian distribution derived by the maximum-entropy method for spatially correlated N-unit nonextensive systems, we have calculated the generalized Fisher information matrix g_{θ_n θ_m} for (θ_1, θ_2, θ_3) = (μ_q, σ_q², s), where μ_q, σ_q², and s denote the mean, variance, and degree of spatial correlation, respectively, for a given entropic index q. It has been shown from the Cramér-Rao theorem that (1) the accuracy of an unbiased estimate of μ_q is improved (degraded) by a negative (positive) correlation s, (2) that of σ_q² is worsened with increasing s, and (3) that of s is much improved for s ≃ -1/(N-1) or s ≃ 1.0, though it is worst at s = (N-2)/[2(N-1)]. Our calculation provides clear insight into the long-standing controversy over whether spatial correlation is beneficial or detrimental to decoding in neuronal ensembles. We also discuss a calculation of the q-Gaussian distribution obtained by applying superstatistics to the Langevin model subjected to spatially correlated inputs.

  2. Hellmann–Feynman connection for the relative Fisher information

    SciTech Connect

    Venkatesan, R.C.; Plastino, A.

    2015-08-15

    The (i) reciprocity relations for the relative Fisher information (RFI, hereafter) and (ii) a generalized RFI–Euler theorem are self-consistently derived from the Hellmann–Feynman theorem. These new reciprocity relations generalize the RFI–Euler theorem and constitute the basis for building up a mathematical Legendre transform structure (LTS, hereafter), akin to that of thermodynamics, that underlies the RFI scenario. This demonstrates the possibility of translating the entire mathematical structure of thermodynamics into a RFI-based theoretical framework. Virial theorems play a prominent role in this endeavor, as a Schrödinger-like equation can be associated to the RFI. Lagrange multipliers are determined invoking the RFI–LTS link and the quantum mechanical virial theorem. An appropriate ansatz allows for the inference of probability density functions (pdf’s, hereafter) and energy-eigenvalues of the above mentioned Schrödinger-like equation. The energy-eigenvalues obtained here via inference are benchmarked against established theoretical and numerical results. A principled theoretical basis to reconstruct the RFI-framework from the FIM framework is established. Numerical examples for exemplary cases are provided. - Highlights: • Legendre transform structure for the RFI is obtained with the Hellmann–Feynman theorem. • Inference of the energy-eigenvalues of the SWE-like equation for the RFI is accomplished. • Basis for reconstruction of the RFI framework from the FIM-case is established. • Substantial qualitative and quantitative distinctions with prior studies are discussed.

  3. Using Fisher information to track stability in multivariate systems.

    PubMed

    Ahmad, Nasir; Derrible, Sybil; Eason, Tarsha; Cabezas, Heriberto

    2016-11-01

    With the current proliferation of data, the proficient use of statistical and mining techniques offers substantial benefits to capture useful information from any dataset. As numerous approaches make use of information theory concepts, here we discuss how Fisher information (FI) can be applied to sustainability science problems and used in data mining applications by analysing patterns in data. FI was developed as a measure of information content in data, and it has been adapted to assess order in complex system behaviour. The main advantage of the approach is the ability to collapse multiple variables into an index that can be used to assess stability and track overall trends in a system, including its regimes and regime shifts. Here, we provide a brief overview of FI theory, followed by a simple step-by-step numerical example of how to compute FI. Furthermore, we introduce an open-source Python library that can be freely downloaded from GitHub, and we use it in a simple case study to evaluate the evolution of FI for the global-mean temperature from 1880 to 2015. Results indicate significant declines in FI starting in 1978, suggesting a possible regime shift.

  4. A generalized Fisher equation and its utility in chemical kinetics

    PubMed Central

    Ross, John; Villaverde, Alejandro Fernández; Banga, Julio R.; Vázquez, Sara; Morán, Federico

    2010-01-01

    A generalized Fisher equation (GFE) relates the time derivative of the average of the intrinsic rate of growth to its variance. The GFE is an exact mathematical result that has been widely used in population dynamics and genetics, where it originated. Here we demonstrate that the GFE can also be useful in other fields, specifically in chemistry, with models of two chemical reaction systems for which the mechanisms and rate coefficients correspond reasonably well to experiments. A bad fit of the GFE can be a sign of high levels of measurement noise; for low or moderate levels of noise, fulfillment of the GFE is not degraded. Hence, the GFE presents a noise threshold that may be used to test the validity of experimental measurements without requiring any additional information. In a different approach information about the system (model) is included in the calculations. In that case, the discrepancy with the GFE can be used as an optimization criterion for the determination of rate coefficients in a given reaction mechanism. PMID:20615992

  5. A generalized Fisher equation and its utility in chemical kinetics.

    PubMed

    Ross, John; Fernández Villaverde, Alejandro; Banga, Julio R; Vázquez, Sara; Morán, Federico

    2010-07-20

    A generalized Fisher equation (GFE) relates the time derivative of the average of the intrinsic rate of growth to its variance. The GFE is an exact mathematical result that has been widely used in population dynamics and genetics, where it originated. Here we demonstrate that the GFE can also be useful in other fields, specifically in chemistry, with models of two chemical reaction systems for which the mechanisms and rate coefficients correspond reasonably well to experiments. A bad fit of the GFE can be a sign of high levels of measurement noise; for low or moderate levels of noise, fulfillment of the GFE is not degraded. Hence, the GFE presents a noise threshold that may be used to test the validity of experimental measurements without requiring any additional information. In a different approach information about the system (model) is included in the calculations. In that case, the discrepancy with the GFE can be used as an optimization criterion for the determination of rate coefficients in a given reaction mechanism.

  6. Canine distemper in an isolated population of fishers (Martes pennanti) from California.

    PubMed

    Keller, Stefan M; Gabriel, Mourad; Terio, Karen A; Dubovi, Edward J; VanWormer, Elizabeth; Sweitzer, Rick; Barret, Reginald; Thompson, Craig; Purcell, Kathryn; Munson, Linda

    2012-10-01

    Four fishers (Martes pennanti) from an insular population in the southern Sierra Nevada Mountains, California, USA died as a consequence of an infection with canine distemper virus (CDV) in 2009. Three fishers were found in close temporal and spatial relationship; the fourth fisher died 4 mo later at a 70 km distance from the initial group. Gross lesions were restricted to hyperkeratosis of periocular skin and ulceration of footpads. All animals had necrotizing bronchitis and bronchiolitis with syncytia and intracytoplasmic inclusion bodies. Inclusion bodies were abundant in the epithelia of urinary bladder and epididymis but were infrequent in the renal pelvis and the female genital epithelia. No histopathologic or immunohistochemical evidence for virus spread to the central nervous system was found. One fisher had encephalitis caused by Sarcocystis neurona and another had severe head trauma as a consequence of predation. The H gene nucleotide sequence of the virus isolates from the first three fishers was identical and was 99.6% identical to the isolate from the fourth fisher. Phylogenetically, the isolates clustered with other North American isolates separate from classical European wildlife lineage strains. These data suggest that the European wildlife lineage might consist of two separate subgroups that are genetically distinct and endemic in different geographic regions. The source of infection as well as pertinent transmission routes remained unclear. This is the first report of CDV in fishers and underscores the significance of CDV as a pathogen of management concern.

  7. A Case of Miller Fisher Syndrome, Thromboembolic Disease, and Angioedema: Association or Coincidence?

    PubMed

    Salehi, Nooshin; Choi, Eric D; Garrison, Roger C

    2017-01-16

    BACKGROUND Miller Fisher Syndrome is characterized by the clinical triad of ophthalmoplegia, ataxia, and areflexia, and is considered to be a variant of Guillain-Barre Syndrome. Miller Fisher Syndrome is observed in approximately 1-5% of all Guillain-Barre cases in Western countries. Patients with Miller Fisher Syndrome usually have good recovery without residual deficits. Venous thromboembolism is a common complication of Guillain-Barre Syndrome and has also been reported in Miller Fisher Syndrome, but it has generally been reported in the presence of at least one prothrombotic risk factor such as immobility. A direct correlation between venous thromboembolism and Miller Fisher Syndrome or Guillain-Barre Syndrome has not been previously described. CASE REPORT We report the case of a 32-year-old Hispanic male who presented with acute, severe thromboembolic disease and concurrently demonstrated characteristic clinical features of Miller Fisher Syndrome including ophthalmoplegia, ataxia, and areflexia. Past medical and family history were negative for thromboembolic disease, and subsequent hypercoagulability workup was unremarkable. During the course of hospitalization, the patient also developed angioedema. CONCLUSIONS We describe a possible association between Miller Fisher Syndrome, thromboembolic disease, and angioedema.

  8. Color measurement and discrimination

    NASA Technical Reports Server (NTRS)

    Wandell, B. A.

    1985-01-01

    The present investigation is concerned with new results which show that for test lights with slow temporal modulations, and thus little effect on the luminance system, the vector-difference hypothesis represents an adequate characterization of discrimination data. It is pointed out that for certain experimental conditions color measurements can be successfully extended to include a difference measure which predicts the discriminability of pairs of lights. When discrimination depends principally on opponent-channel responses, discrimination thresholds can be predicted from the detection contour alone. Attention is given to discriminations with a 6-Hz Gabor function, the categorization of stimulus regions, and the nature of the visual mechanisms.

  9. Difference image analysis: automatic kernel design using information criteria

    NASA Astrophysics Data System (ADS)

    Bramich, D. M.; Horne, Keith; Alsubai, K. A.; Bachelet, E.; Mislis, D.; Parley, N.

    2016-03-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components; namely, a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularization. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unregularized delta basis functions, combined with either the Akaike or Takeuchi information criterion, is the best kernel solution method in terms of photometric accuracy. Our results are validated by tests performed on two independent sets of real data. Finally, we provide some important recommendations for software implementations of difference image analysis.
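
    To make the kernel-solution step concrete, here is a minimal, hedged sketch of the basic ingredients the abstract describes: a spatially invariant convolution kernel expanded in delta basis functions is solved for by linear least squares against the target image, and candidate kernel sizes are compared with the Akaike information criterion. This is a toy illustration rather than the authors' pipeline; the candidate stamp radii and the Gaussian-error AIC form N ln(RSS/N) + 2k are assumptions.

        import numpy as np

        def kernel_design_matrix(ref, half):
            """Columns are shifted copies of the reference image: one delta basis
            function per kernel pixel in a (2*half+1)^2 stamp."""
            size = 2 * half + 1
            cols = []
            for dy in range(-half, half + 1):
                for dx in range(-half, half + 1):
                    cols.append(np.roll(np.roll(ref, dy, axis=0), dx, axis=1).ravel())
            return np.column_stack(cols), size

        def solve_kernel(ref, target, half):
            """Least-squares fit of a delta-basis kernel mapping ref -> target."""
            A, size = kernel_design_matrix(ref, half)
            coeffs, *_ = np.linalg.lstsq(A, target.ravel(), rcond=None)
            rss = np.sum((A @ coeffs - target.ravel()) ** 2)
            return coeffs.reshape(size, size), rss

        def aic(rss, n_data, n_params):
            return n_data * np.log(rss / n_data) + 2 * n_params

        # Toy data: the "target" is the reference shifted and scaled, plus noise.
        rng = np.random.default_rng(1)
        ref = rng.normal(size=(64, 64))
        target = 0.9 * np.roll(ref, 1, axis=1) + rng.normal(scale=0.01, size=ref.shape)

        for half in (1, 2, 3):  # candidate kernel stamp radii (assumed values)
            kernel, rss = solve_kernel(ref, target, half)
            print(f"stamp {2*half+1}x{2*half+1}: AIC = {aic(rss, target.size, kernel.size):.1f}")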

  10. Post Tsunami Job Satisfaction among the Fishers of Na Pru Village, on the Andaman Sea Coast of Thailand

    ERIC Educational Resources Information Center

    Pollnac, Richard B.; Kotowicz, Dawn

    2012-01-01

    The paper examines job satisfaction among fishers in a tsunami-impacted area on the Andaman coast of Thailand. Following the tsunami, many predicted that fishers would be reluctant to resume their fishing activities. Observations in the fishing communities, however, indicated that as soon as fishers obtained replacements for equipment damaged by…

  11. Fast discrimination of hydroxypropyl methyl cellulose using portable Raman spectrometer and multivariate methods

    NASA Astrophysics Data System (ADS)

    Song, Biao; Lu, Dan; Peng, Ming; Li, Xia; Zou, Ye; Huang, Meizhen; Lu, Feng

    2017-02-01

    Raman spectroscopy is developed as a fast and non-destructive method for the discrimination and classification of hydroxypropyl methyl cellulose (HPMC) samples. Forty-four E-series and 41 K-series HPMC samples are measured by a self-developed portable Raman spectrometer (Hx-Raman), which is excited by a 785 nm diode laser; the spectral range is 200-2700 cm^-1 with a resolution (FWHM) of 6 cm^-1. Multivariate analysis is applied to discriminate the E series from the K series. Using principal component analysis (PCA) and Fisher discriminant analysis (FDA), a discrimination result with a sensitivity of 90.91% and a specificity of 95.12% is achieved. The corresponding area under the receiver operating characteristic (ROC) curve is 0.99, indicating the accuracy of the predictive model. This result demonstrates the prospect of portable Raman spectrometers for rapid, non-destructive classification and discrimination of E-series and K-series samples of HPMC.
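
    As a rough illustration of the PCA-plus-Fisher-discriminant workflow described above, the sketch below chains PCA and LinearDiscriminantAnalysis from scikit-learn on synthetic "spectra" and reports sensitivity and specificity. The synthetic data, number of components, and train/test split are placeholders, not the paper's settings.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.metrics import confusion_matrix

        # Synthetic stand-in for baseline-corrected Raman spectra of two HPMC series.
        rng = np.random.default_rng(0)
        n_per_class, n_channels = 44, 500
        class_e = rng.normal(0.0, 1.0, (n_per_class, n_channels)) + np.linspace(0, 1, n_channels)
        class_k = rng.normal(0.2, 1.0, (n_per_class, n_channels)) + np.linspace(0, 1, n_channels)
        X = np.vstack([class_e, class_k])
        y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = E series, 1 = K series

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, random_state=0, stratify=y)

        # PCA for dimensionality reduction, then Fisher (linear) discriminant analysis.
        model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
        model.fit(X_train, y_train)

        tn, fp, fn, tp = confusion_matrix(y_test, model.predict(X_test)).ravel()
        print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")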

  12. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images

    PubMed Central

    Khansari, Maziyar M; O’Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-01-01

    The conjunctiva is a densely vascularized mucous membrane covering the sclera of the eye, with the unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study is to apply an automated quantitative method for discrimination of different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of conjunctival microvasculature images was performed by ordinary least squares regression and Fisher linear discriminant analysis. Conjunctival images were discriminated between groups of non-diabetic subjects and diabetic subjects at different stages of DR. The automated method's discrimination rates were higher than those determined by human observers. The method allows sensitive and rapid discrimination through assessment of conjunctival microvasculature images and can be potentially useful for DR screening and monitoring. PMID:27446692

  13. Efficient $\chi^{2}$ Kernel Linearization via Random Feature Maps.

    PubMed

    Yuan, Xiao-Tong; Wang, Zhenzhen; Deng, Jiankang; Liu, Qingshan

    2016-11-01

    Explicit feature mapping is an appealing way to linearize additive kernels, such as the χ² kernel, for training large-scale support vector machines (SVMs). Although accurate in approximation, feature mapping can pose computational challenges in high-dimensional settings as it expands the original features to a higher-dimensional space. To handle this issue in the context of χ² kernel SVM learning, we introduce a simple yet efficient method to approximately linearize the χ² kernel through random feature maps. The main idea is to use sparse random projection to reduce the dimensionality of the feature maps while preserving their approximation capability to the original kernel. We provide an approximation error bound for the proposed method. Furthermore, we extend our method to χ² multiple kernel SVM learning. Extensive experiments on large-scale image classification tasks confirm that the proposed approach is able to significantly speed up the training process of χ² kernel SVMs at almost no cost in testing accuracy.
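
    The paper's specific sparse-random-projection construction is not reproduced here; as a hedged stand-in, the sketch below uses scikit-learn's AdditiveChi2Sampler (the sampled explicit feature map of Vedaldi and Zisserman) followed by a linear SVM, which illustrates the same overall idea of linearizing an additive χ² kernel so that a fast linear solver can be used. Data and hyperparameters are placeholders.

        import numpy as np
        from sklearn.kernel_approximation import AdditiveChi2Sampler
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import train_test_split

        # The chi^2 kernel is defined for non-negative features (e.g. histograms),
        # so the synthetic data below are non-negative histogram-like vectors.
        rng = np.random.default_rng(0)
        X = rng.random((1000, 64))
        y = (X[:, :32].sum(axis=1) > X[:, 32:].sum(axis=1)).astype(int)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        # Explicit (approximate) feature map for the additive chi^2 kernel,
        # followed by a linear SVM trained in the expanded feature space.
        clf = make_pipeline(AdditiveChi2Sampler(sample_steps=2), LinearSVC(C=1.0))
        clf.fit(X_train, y_train)
        print("test accuracy:", round(clf.score(X_test, y_test), 3))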

  14. Enhancing quantum coherence and quantum Fisher information by quantum partially collapsing measurements

    NASA Astrophysics Data System (ADS)

    Liu, Zhi; Qiu, Liang; Pan, Fei

    2017-04-01

    We consider the enhancement effect of quantum partially collapsing measurements, i.e., weak measurement and quantum measurement reversal, on quantum coherence and quantum Fisher information, both of which are transmitted through a spin-chain channel. For the state parameter lying in the region (π/2, π), weak measurement can enhance quantum coherence and quantum Fisher information. For the state parameter lying in the region (0, π/2), quantum coherence and quantum Fisher information can be enhanced by quantum measurement reversal combined with weak measurement. We suggest that the probabilistic nature of the method is responsible for the enhancement.

  15. Dangers, delights, and destiny on the sea: fishers along the East coast of north sumatra, indonesia.

    PubMed

    Markkanen, Pia

    2005-01-01

    This article describes a collaborative project between the International Labour Organization's International Programme on the Elimination of Child Labour (IPEC) and the Lowell Center for Sustainable Production, in identifying work hazards of fishers along the east coast of North Sumatra, Indonesia, in July 2004. The study employed qualitative investigation techniques: participant observations at fishing villages and harbors; and interviews with local fishers and skippers. Fishers work long hours in life-threatening conditions, often with low pay. It would be synergistic to incorporate fishing safety and health policies and advocacy efforts into reconstruction undertakings of fisheries devastated by the 2004 tsunami.

  16. A Novel Framework for Learning Geometry-Aware Kernels.

    PubMed

    Pan, Binbin; Chen, Wen-Sheng; Xu, Chen; Chen, Bo

    2016-05-01

    Data from the real world usually have nonlinear geometric structure and are often assumed to lie on or close to a low-dimensional manifold in a high-dimensional space. How to detect this nonlinear geometric structure of the data is important for learning algorithms. Recently, there has been a surge of interest in utilizing kernels to exploit the manifold structure of the data. Such kernels are called geometry-aware kernels and are widely used in machine learning algorithms. The performance of these algorithms critically relies on the choice of the geometry-aware kernels. Intuitively, a good geometry-aware kernel should utilize additional information other than the geometric information. In many applications, it is required to compute the out-of-sample data directly. However, most of the geometry-aware kernel methods are restricted to the available data given beforehand, with no straightforward extension for out-of-sample data. In this paper, we propose a framework for more general geometry-aware kernel learning. The proposed framework integrates multiple sources of information and enables us to develop flexible and effective kernel matrices. Then, we theoretically show how the learned kernel matrices are extended to the corresponding kernel functions, in which the out-of-sample data can be computed directly. Under our framework, a novel family of geometry-aware kernels is developed. In particular, some existing geometry-aware kernels can be viewed as instances of our framework. The performance of the kernels is evaluated on dimensionality reduction, classification, and clustering tasks. The empirical results show that our kernels significantly improve the performance.

  17. Kernel Density Estimation, Kernel Methods, and Fast Learning in Large Data Sets.

    PubMed

    Wang, Shitong; Wang, Jun; Chung, Fu-lai

    2014-01-01

    Kernel methods such as the standard support vector machine and support vector regression trainings take O(N^3) time and O(N^2) space complexities in their naive implementations, where N is the training set size. It is thus computationally infeasible to apply them to large data sets, and a replacement of the naive method for finding the quadratic programming (QP) solutions is highly desirable. By observing that many kernel methods can be linked up with kernel density estimation (KDE), which can be efficiently implemented by some approximation techniques, a new learning method called fast KDE (FastKDE) is proposed to scale up kernel methods. It is based on establishing a connection between KDE and the QP problems formulated for kernel methods using an entropy-based integrated-squared-error criterion. As a result, FastKDE approximation methods can be applied to solve these QP problems. In this paper, the latest advance in fast data reduction via KDE is exploited. With just a simple sampling strategy, the resulting FastKDE method can be used to scale up various kernel methods with a theoretical guarantee that their performance does not degrade much. It has a time complexity of O(m^3), where m is the number of data points sampled from the training set. Experiments on different benchmarking data sets demonstrate that the proposed method has comparable performance with the state-of-the-art method and is effective for a wide range of kernel methods to achieve fast learning in large data sets.
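
    The exact FastKDE reduction scheme is not spelled out in the abstract, so the sketch below only illustrates the underlying intuition: kernel-based estimates built from a modest random sample of size m can closely track those built from all N points, at far lower cost. It uses scipy.stats.gaussian_kde on a subsample; this is a didactic stand-in, not the proposed FastKDE algorithm.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        N, m = 100_000, 1_000                      # full data size vs. sampled size
        data = np.concatenate([rng.normal(-2, 0.5, N // 2), rng.normal(1, 1.0, N // 2)])

        sample = rng.choice(data, size=m, replace=False)

        kde_full = gaussian_kde(data)              # cost grows with N per evaluation
        kde_fast = gaussian_kde(sample)            # cost grows with m per evaluation

        grid = np.linspace(-5, 5, 200)
        max_abs_err = np.max(np.abs(kde_full(grid) - kde_fast(grid)))
        print(f"max |density difference| between full and subsampled KDE: {max_abs_err:.4f}")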

  18. Interpretation and Visualization of Non-Linear Data Fusion in Kernel Space: Study on Metabolomic Characterization of Progression of Multiple Sclerosis

    PubMed Central

    Smolinska, Agnieszka; Blanchet, Lionel; Coulier, Leon; Ampt, Kirsten A. M.; Luider, Theo; Hintzen, Rogier Q.; Wijmenga, Sybren S.; Buydens, Lutgarde M. C.

    2012-01-01

    Background In the last decade data fusion has become widespread in the field of metabolomics. Linear data fusion is performed most commonly. However, many data display non-linear parameter dependences, and linear methods are bound to fail in such situations. We used proton Nuclear Magnetic Resonance and Gas Chromatography-Mass Spectrometry, two well established techniques, to generate metabolic profiles of cerebrospinal fluid of Multiple Sclerosis (MScl) individuals. These datasets represent non-linearly separable groups; thus, to extract relevant information and to combine them, a special framework for data fusion is required. Methodology The main aim is to demonstrate a novel approach to data fusion for classification; the approach is applied to metabolomics datasets coming from patients suffering from MScl at different stages of the disease. The approach involves data fusion in kernel space and consists of four main steps. The first is to extract the significant information per data source using Support Vector Machine Recursive Feature Elimination, which allows one to select a set of relevant variables. In the next step the optimized kernel matrices are merged by linear combination. In step 3 the merged datasets are analyzed with a classification technique, namely Kernel Partial Least Square Discriminant Analysis. In the final step, the variables in kernel space are visualized and their significance established. Conclusions We find that fusion in kernel space allows for efficient and reliable discrimination of classes (MScl and early stage). This data fusion approach achieves better class prediction accuracy than analysis of individual datasets and the commonly used mid-level fusion. The prediction accuracy on an independent test set (8 samples) reaches 100%. Additionally, the classification model obtained on fused kernels is simpler in terms of complexity, i.e. just one latent variable was sufficient. Finally, visualization of variable importance in kernel space is demonstrated.
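
    A minimal sketch of the kernel-space fusion step described above: compute one kernel matrix per data source, merge them by a weighted linear combination, and feed the fused kernel to a kernelized classifier. For brevity the classifier here is an SVM with a precomputed kernel rather than kernel PLS-DA, and the RBF kernels, weights, and synthetic data are assumptions.

        import numpy as np
        from sklearn.metrics.pairwise import rbf_kernel
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 120
        y = rng.integers(0, 2, n)
        # Two synthetic "sources" (e.g. NMR-like and GC-MS-like feature blocks).
        X1 = rng.normal(size=(n, 40)) + y[:, None] * 0.8
        X2 = rng.normal(size=(n, 60)) + y[:, None] * 0.5

        idx_train, idx_test = train_test_split(
            np.arange(n), test_size=0.3, random_state=0, stratify=y)

        def fused_kernel(rows, cols, w1=0.6, w2=0.4, gamma1=0.02, gamma2=0.01):
            """Weighted linear combination of per-source RBF kernels (weights assumed)."""
            K1 = rbf_kernel(X1[rows], X1[cols], gamma=gamma1)
            K2 = rbf_kernel(X2[rows], X2[cols], gamma=gamma2)
            return w1 * K1 + w2 * K2

        clf = SVC(kernel="precomputed")
        clf.fit(fused_kernel(idx_train, idx_train), y[idx_train])
        acc = clf.score(fused_kernel(idx_test, idx_train), y[idx_test])
        print(f"fused-kernel test accuracy: {acc:.2f}")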

  19. Wilson Dslash Kernel From Lattice QCD Optimization

    SciTech Connect

    Joo, Balint; Smelyanskiy, Mikhail; Kalamkar, Dhiraj D.; Vaidyanathan, Karthikeyan

    2015-07-01

    Lattice Quantum Chromodynamics (LQCD) is a numerical technique used for calculations in theoretical nuclear and high-energy physics. LQCD is traditionally one of the first applications ported to many new high-performance computing architectures, and indeed LQCD practitioners have been known to design and build custom LQCD computers. Lattice QCD kernels are frequently used as benchmarks (e.g. 168.wupwise in the SPEC suite) and are generally well understood, and as such are ideal for illustrating several optimization techniques. In this chapter we detail our work on optimizing the Wilson-Dslash kernels for the Intel Xeon Phi; however, as we show, the techniques give excellent performance on regular Xeon architectures as well.

  20. [Study on the method of feature extraction for brain-computer interface using discriminative common vector].

    PubMed

    Wang, Jinjia; Hu, Bei

    2013-02-01

    Discriminative common vector (DCV) is an effective method that was proposed for the small-sample-size problem in face recognition. The same problem arises in brain-computer interfaces (BCI): directly applying linear discriminant analysis (LDA) can produce errors because the within-class scatter matrix of the data is singular. In our studies, we used the DCV method, applying common-vector theory to the within-class scatter matrix of the data of all classes, and then applied eigenvalue decomposition to the common vectors to obtain the final projection vectors. We then used the kernel discriminative common vector (KDCV) method with different kernels. Three data sets were used in the experiments: the BCI Competition I data set, Competition II data set IV, and a data set collected by ourselves. The experimental results of 93%, 77%, and 97% showed that this feature extraction method can be used well for the classification of imagery data in BCI.
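
    To make the core idea concrete, here is a minimal sketch (not the authors' exact procedure) of the common-vector step: samples are projected onto the null space of the pooled within-class scatter matrix, where all samples of a class collapse onto approximately a single "common vector", and class membership can then be decided by distance to those common vectors. Dimensions and data are synthetic.

        import numpy as np

        rng = np.random.default_rng(0)
        n_per_class, dim, n_classes = 5, 50, 3          # few samples, high dimension
        X = [rng.normal(size=(n_per_class, dim)) + 3 * rng.normal(size=dim)
             for _ in range(n_classes)]

        # Pooled within-class scatter matrix S_w (rank <= n_classes * (n_per_class - 1)).
        Sw = sum(((Xi - Xi.mean(axis=0)).T @ (Xi - Xi.mean(axis=0))) for Xi in X)

        # Orthonormal basis of the null space of S_w via SVD.
        U, s, _ = np.linalg.svd(Sw)
        null_basis = U[:, s < 1e-8 * s.max()]           # columns spanning null(S_w)

        # In the null space, every sample of a class projects onto the same common vector.
        common_vectors = [Xi.mean(axis=0) @ null_basis for Xi in X]
        spread = max(np.ptp(Xi @ null_basis, axis=0).max() for Xi in X)
        print(f"within-class spread in null(S_w): {spread:.2e} (numerically zero)")

        # Classify a noisy copy of a class-0 sample by nearest common vector.
        test = X[0][0] + 0.1 * rng.normal(size=dim)
        proj = test @ null_basis
        pred = int(np.argmin([np.linalg.norm(proj - cv) for cv in common_vectors]))
        print("predicted class:", pred)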

  1. Transit light curves with finite integration time: Fisher information analysis

    SciTech Connect

    Price, Ellen M.; Rogers, Leslie A.

    2014-10-10

    Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal-to-noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances on the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances on the inferred parameters when comparing scenarios with constant total signal-to-noise (constant total integration time in the absence of read noise). Uncertainties on the transit ingress/egress time increase by a factor of 34 for Earth-size planets and 3.4 for Jupiter-size planets around Sun-like stars for integration times of 30 minutes compared to instantaneously sampled light curves. Similarly, uncertainties on the mid-transit time for Earth and Jupiter-size planets increase by factors of 3.9 and 1.4. Uncertainties on the transit depth are largely unaffected by finite integration times. While correlations among the transit depth, ingress duration, and transit duration all increase in magnitude with longer integration times, the mid-transit time remains uncorrelated with the other parameters. We provide code in Python and Mathematica for predicting the variances and covariances at www.its.caltech.edu/~eprice.
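
    The following sketch illustrates the general Fisher-matrix recipe the abstract relies on, applied to a deliberately simple model rather than the authors' binned transit model: for a model m(t; θ) with independent Gaussian noise of standard deviation σ, the Fisher matrix is F_ij = Σ_t (∂m/∂θ_i)(∂m/∂θ_j)/σ², and the Cramér-Rao bound gives parameter variances from the diagonal of F⁻¹. The Gaussian "dip" model, parameter values, and noise level below are placeholders.

        import numpy as np

        def model(t, depth, t0, width):
            """Toy transit-like dip: a Gaussian depression in an otherwise flat light curve."""
            return 1.0 - depth * np.exp(-0.5 * ((t - t0) / width) ** 2)

        def fisher_matrix(t, theta, sigma, eps=1e-6):
            """F_ij = sum_t dm/dtheta_i * dm/dtheta_j / sigma^2 (central-difference derivatives)."""
            theta = np.asarray(theta, dtype=float)
            derivs = []
            for i in range(theta.size):
                dp, dm = theta.copy(), theta.copy()
                dp[i] += eps
                dm[i] -= eps
                derivs.append((model(t, *dp) - model(t, *dm)) / (2 * eps))
            D = np.vstack(derivs)                 # shape (n_params, n_times)
            return D @ D.T / sigma**2

        t = np.linspace(-0.2, 0.2, 2000)          # days; placeholder cadence
        theta = (0.01, 0.0, 0.02)                 # depth, mid-time, width (placeholders)
        sigma = 5e-4                              # per-point photometric noise (placeholder)

        F = fisher_matrix(t, theta, sigma)
        cramer_rao_sigmas = np.sqrt(np.diag(np.linalg.inv(F)))
        for name, s in zip(("depth", "t0", "width"), cramer_rao_sigmas):
            print(f"sigma({name}) >= {s:.2e}")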

  2. The different baryonic Tully-Fisher relations at low masses.

    PubMed

    Brook, Chris B; Santos-Santos, Isabel; Stinson, Greg

    2016-06-11

    We compare the Baryonic Tully-Fisher relation (BTFR) of simulations and observations of galaxies ranging from dwarfs to spirals, using various measures of rotational velocity Vrot. We explore the BTFR when measuring Vrot at the flat part of the rotation curve, Vflat, at the extent of H i gas, Vlast, and using 20 per cent (W20) and 50 per cent (W50) of the width of H i line profiles. We also compare with the maximum circular velocity of the parent halo, Vmax, within dark matter only simulations. The different BTFRs increasingly diverge as galaxy mass decreases. Using Vlast one obtains a power law over four orders of magnitude in baryonic mass, with slope similar to the observed BTFR. Measuring Vflat gives similar results as Vlast when galaxies with rising rotation curves are excluded. However, higher rotation velocities would be found for low-mass galaxies if the cold gas extended far enough for Vrot to reach a maximum. W20 gives a similar slope as Vlast but with slightly lower values of Vrot for low-mass galaxies, although this may depend on the extent of the gas in the galaxy sample. W50 bends away from these other relations towards low velocities at low masses. By contrast, Vmax bends towards high velocities for low-mass galaxies, as cold gas does not extend out to the radius at which haloes reach Vmax. Our study highlights the need for careful comparisons between observations and models: one needs to be consistent about the particular method of measuring Vrot, and precise about the radius at which velocities are measured.

  3. The Tully-Fisher relation of COLD GASS Galaxies

    NASA Astrophysics Data System (ADS)

    Tiley, Alfred L.; Bureau, Martin; Saintonge, Amélie; Topal, Selcuk; Davis, Timothy A.; Torii, Kazufumi

    2016-10-01

    We present the stellar mass (M*) and Wide-Field Infrared Survey Explorer absolute Band 1 magnitude (M_W1) Tully-Fisher relations (TFRs) of subsets of galaxies from the CO Legacy Database for the GALEX Arecibo SDSS Survey (COLD GASS). We examine the benefits and drawbacks of several commonly used fitting functions in the context of measuring CO(1-0) linewidths (and thus rotation velocities), favouring the Gaussian Double Peak function. For a carefully selected sub-sample, we find the M_W1 and M* TFRs to be M_W1 = (-7.1 ± 0.6)[log(W50/sin i / km s^-1) - 2.58] - 23.83 ± 0.09 and log(M*/M_⊙) = (3.3 ± 0.3)[log(W50/sin i / km s^-1) - 2.58] + 10.51 ± 0.04, respectively, where W50 is the width of a galaxy's CO(1-0) integrated profile at 50 per cent of its maximum and the inclination i is derived from the galaxy axial ratio measured on the Sloan Digital Sky Survey r-band image. We find no evidence for any significant offset between the TFRs of COLD GASS galaxies and those of comparison samples of similar redshifts and morphologies. The slope of the COLD GASS M* TFR agrees with the relation of Pizagno et al. However, we measure a comparatively shallower slope for the COLD GASS M_W1 TFR as compared to the relation of Tully & Pierce. We attribute this to the fact that the COLD GASS sample comprises galaxies of various (late-type) morphologies. Nevertheless, our work provides a robust reference point with which to compare future CO TFR studies.

  4. Bergman kernel and complex singularity exponent

    NASA Astrophysics Data System (ADS)

    Chen, Boyong; Lee, Hanjin

    2009-12-01

    We give a precise estimate of the Bergman kernel for the model domain defined by $\Omega_F=\{(z,w)\in \mathbb{C}^{n+1}:{\rm Im}\,w-|F(z)|^2>0\}$, where $F=(f_1,\dots,f_m)$ is a holomorphic map from $\mathbb{C}^n$ to $\mathbb{C}^m$, in terms of the complex singularity exponent of $F$.

  5. Advanced Development of Certified OS Kernels

    DTIC Science & Technology

    2015-06-01

    SUBJECT TERMS: Certified Software; Certified OS Kernels; Certified Compilers; Abstraction Layers; Modularity; Deep Specifications; Coq and Ltac libraries. Verification of each module should only need to be done once, to show that it implements its deep functional specification [14]. Global properties should be derived from the layered structure obtained by building certified abstraction layers with deep specifications. A certified layer is a new language-based module construct that consists of a triple (L1, M, ...).

  6. The Palomar kernel-phase experiment: testing kernel phase interferometry for ground-based astronomical observations

    NASA Astrophysics Data System (ADS)

    Pope, Benjamin; Tuthill, Peter; Hinkley, Sasha; Ireland, Michael J.; Greenbaum, Alexandra; Latyshev, Alexey; Monnier, John D.; Martinache, Frantz

    2016-01-01

    At present, the principal limitation on the resolution and contrast of astronomical imaging instruments comes from aberrations in the optical path, which may be imposed by the Earth's turbulent atmosphere or by variations in the alignment and shape of the telescope optics. These errors can be corrected physically, with active and adaptive optics, and in post-processing of the resulting image. A recently developed adaptive optics post-processing technique, called kernel-phase interferometry, uses linear combinations of phases that are self-calibrating with respect to small errors, with the goal of constructing observables that are robust against the residual optical aberrations in otherwise well-corrected imaging systems. Here, we present a direct comparison between kernel phase and the more established competing techniques, aperture masking interferometry, point spread function (PSF) fitting and bispectral analysis. We resolve the α Ophiuchi binary system near periastron, using the Palomar 200-Inch Telescope. This is the first case in which kernel phase has been used with a full aperture to resolve a system close to the diffraction limit with ground-based extreme adaptive optics observations. Excellent agreement in astrometric quantities is found between kernel phase and masking, and kernel phase significantly outperforms PSF fitting and bispectral analysis, demonstrating its viability as an alternative to conventional non-redundant masking under appropriate conditions.

  7. A Fast Reduced Kernel Extreme Learning Machine.

    PubMed

    Deng, Wan-Yu; Ong, Yew-Soon; Zheng, Qing-Hua

    2016-04-01

    In this paper, we present a fast and accurate kernel-based supervised algorithm referred to as the Reduced Kernel Extreme Learning Machine (RKELM). In contrast to work on the Support Vector Machine (SVM) or Least Squares SVM (LS-SVM), which identifies the support vectors or weight vectors iteratively, the proposed RKELM randomly selects a subset of the available data samples as support vectors (or mapping samples). By avoiding the iterative steps of SVM, significant cost savings in the training process can be readily attained, especially on big data sets. RKELM is established based on a rigorous proof of universal learning involving reduced kernel-based single-hidden-layer feedforward networks (SLFNs). In particular, we prove that RKELM can approximate any nonlinear function accurately under the condition that the support vectors are sufficient. Experimental results on a wide variety of real-world applications, both small and large in instance size and spanning binary classification, multi-class classification, and regression, show that RKELM achieves generalization performance competitive with SVM/LS-SVM at only a fraction of the computational effort incurred.
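
    A minimal numpy sketch of the reduced-kernel idea described above: randomly pick m of the N training points as kernel centres, build the N×m RBF kernel matrix, and solve a regularized least-squares problem for the output weights in closed form, avoiding iterative SVM training. The RBF kernel, regularization value, and synthetic regression task are assumptions, not the paper's exact setup.

        import numpy as np

        rng = np.random.default_rng(0)

        def rbf(A, B, gamma=1.0):
            """RBF kernel matrix between row-sets A and B."""
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        # Synthetic 1-D regression task.
        N, m, lam = 2000, 100, 1e-3                        # training size, #centres, ridge term
        X = rng.uniform(-3, 3, (N, 1))
        y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=N)

        centres = X[rng.choice(N, size=m, replace=False)]  # randomly selected "support vectors"
        K = rbf(X, centres)                                # N x m reduced kernel matrix

        # Closed-form ridge solution: beta = (K^T K + lam*I)^(-1) K^T y
        beta = np.linalg.solve(K.T @ K + lam * np.eye(m), K.T @ y)

        X_test = np.linspace(-3, 3, 200)[:, None]
        y_pred = rbf(X_test, centres) @ beta
        rmse = np.sqrt(np.mean((y_pred - np.sin(2 * X_test[:, 0])) ** 2))
        print(f"test RMSE against the noise-free target: {rmse:.3f}")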

  8. Kernel Non-Rigid Structure from Motion

    PubMed Central

    Gotardo, Paulo F. U.; Martinez, Aleix M.

    2013-01-01

    Non-rigid structure from motion (NRSFM) is a difficult, underconstrained problem in computer vision. The standard approach in NRSFM constrains 3D shape deformation using a linear combination of K basis shapes; the solution is then obtained as the low-rank factorization of an input observation matrix. An important but overlooked problem with this approach is that non-linear deformations are often observed; these deformations lead to a weakened low-rank constraint due to the need to use additional basis shapes to linearly model points that move along curves. Here, we demonstrate how the kernel trick can be applied in standard NRSFM. As a result, we model complex, deformable 3D shapes as the outputs of a non-linear mapping whose inputs are points within a low-dimensional shape space. This approach is flexible and can use different kernels to build different non-linear models. Using the kernel trick, our model complements the low-rank constraint by capturing non-linear relationships in the shape coefficients of the linear model. The net effect can be seen as using non-linear dimensionality reduction to further compress the (shape) space of possible solutions. PMID:24002226

  9. Balancing continuous covariates based on Kernel densities.

    PubMed

    Ma, Zhenjun; Hu, Feifang

    2013-03-01

    The balance of important baseline covariates is essential for convincing treatment comparisons. Stratified permuted block design and minimization are the two most commonly used balancing strategies, both of which require the covariates to be discrete. Continuous covariates are typically discretized in order to be included in the randomization scheme. But breaking continuous covariates into subcategories often changes the nature of the covariates and makes distributional balance unattainable. In this article, we propose to balance continuous covariates based on kernel density estimation, which preserves the continuity of the covariates. Simulation studies show that the proposed Kernel-Minimization can achieve distributional balance of both continuous and categorical covariates, while also keeping the group sizes well balanced. It is also shown that Kernel-Minimization is less predictable than stratified permuted block design and minimization. Finally, we apply the proposed method to redesign the NINDS trial, which has been a source of controversy due to imbalance of continuous baseline covariates. Simulation shows that imbalances such as those observed in the NINDS trial can generally be avoided through the implementation of the new method.

  10. Entanglement detection in a coupled atom-field system via quantum Fisher information

    NASA Astrophysics Data System (ADS)

    Mirkhalaf, Safoura Sadat; Smerzi, Augusto

    2017-02-01

    We consider a system of finite number of particles collectively interacting with a single-mode coherent field inside a cavity. Depending on the strength of the initial field compared to the number of atoms, we consider three regimes of weak-, intermediate-, and strong-field interaction. The dynamics of multiparticle entanglement detected by quantum Fisher information and spin squeezing are studied in each regime. It is seen that in the weak-field regime, spin squeezing and quantum Fisher information coincide. However, by increasing the initial field population toward the strong-field regime, quantum Fisher information is more effective in detecting entanglement compared to spin squeezing. In addition, in the two-atom system, we also study concurrence. In this case, the quantum Fisher information as a function of time is in good agreement with concurrence in predicting entanglement peaks.

  11. On the fractional Fisher information with applications to a hyperbolic-parabolic system of chemotaxis

    NASA Astrophysics Data System (ADS)

    Granero-Belinchón, Rafael

    2017-02-01

    We introduce new lower bounds for the fractional Fisher information. Equipped with these bounds we study a hyperbolic-parabolic model of chemotaxis and prove the global existence of solutions in certain dissipation regimes.

  12. Fisher Sand & Gravel New Mexico, Inc. General Air Quality Permit: Related Documents

    EPA Pesticide Factsheets

    Documents related to the Fisher Sand & Gravel – New Mexico, Inc., Grey Mesa Gravel Pit General Air Quality Permit for New or Modified Minor Source Stone Quarrying, Crushing, and Screening Facilities in Indian Country.

  13. Determining the continuous family of quantum Fisher information from linear-response theory

    NASA Astrophysics Data System (ADS)

    Shitara, Tomohiro; Ueda, Masahito

    2016-12-01

    The quantum Fisher information represents a continuous family of metrics on the space of quantum states and places the fundamental limit on the accuracy of quantum state estimation. We show that the entire family of quantum Fisher information can be determined from linear-response theory through generalized covariances. We derive the generalized fluctuation-dissipation theorem that relates linear-response functions to generalized covariances and hence allows us to determine the quantum Fisher information from linear-response functions, which are experimentally measurable quantities. As an application, we examine the skew information, which is a quantum Fisher information, of a harmonic oscillator in thermal equilibrium, and show that the equality of the skew-information-based uncertainty relation holds.

  14. 77 FR 60143 - Importer of Controlled Substances; Notice of Application; Fisher Clinical Services, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-02

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF JUSTICE Drug Enforcement Administration Importer of Controlled Substances; Notice of Application; Fisher Clinical Services... renewal to the Drug Enforcement Administration (DEA) for registration as an importer of...

  15. Fisher information for the position-dependent mass Schrödinger system

    NASA Astrophysics Data System (ADS)

    Falaye, B. J.; Serrano, F. A.; Dong, Shi-Hai

    2016-01-01

    This study presents the Fisher information for the position-dependent mass Schrödinger equation with hyperbolic potential V(x) = -V0 csch²(ax). The analysis of the quantum-mechanical probability for the ground and excited states (n = 0, 1, 2) has been obtained via the Fisher information. This controls both chemical and physical properties of some molecular systems. The Fisher information is considered only for x > 0 due to the singular point at x = 0. We found that the Fisher-information-based uncertainty relation and the Cramér-Rao inequality hold. Some relevant numerical results are presented. The results show that the Cramér-Rao and Heisenberg products in both spaces provide a natural measure for the anharmonicity of -V0 csch²(ax).

  16. The Milky Way, the Local Group & the IR Tully-Fisher Diagram

    NASA Technical Reports Server (NTRS)

    Malhotra, S.; Spergel, D.; Rhoads, J.; Li, J.

    1996-01-01

    Using the near-infrared fluxes of Local Group galaxies derived from Cosmic Background Explorer/Diffuse Infrared Background Experiment (COBE/DIRBE) band maps and published Cepheid distances, we construct Tully-Fisher diagrams for the Local Group.

  17. 77 FR 47818 - Proposed Information Collection; Comment Request; Socioeconomics of Commercial Fishers and for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-10

    ...; Socioeconomics of Commercial Fishers and for Hire Diving and Fishing Operations in the Flower Garden Banks...). In 1996, the Flower Gardens Bank National Marine Sanctuary (FGBNMS) was added to the system of...

  18. Who's Qualified? Seeing Race in Color-Blind Times: Lessons from Fisher v. University of Texas

    ERIC Educational Resources Information Center

    Donnor, Jamel K.

    2015-01-01

    Using Howard Winant's racial dualism theory, this chapter explains how race was discursively operationalized in the recent U.S. Supreme Court higher education antiracial diversity case Fisher v. University of Texas at Austin.

  19. [Gender determination based on osteometric characteristics of the upper and lower extremities by discriminant analysis].

    PubMed

    Zviagin, V N; Sineva, I M

    2007-01-01

    The authors studied the osteological collection of the Chair of Anthropology of Moscow State University. The results of measurements of the lengths of the long tubular bones and of the articular parts of the scapula and pelvis were statistically analyzed. A complex of discriminant models calculated by Fisher's method is recommended for sex identification. The diagnostic accuracy is 74-83.5% for separate bones and 85.7-95.2% for the complex of bones of the upper and lower extremities.

  20. A Perfect Price Discrimination Market Model with Production, and a (Rational) Convex Program for It

    NASA Astrophysics Data System (ADS)

    Goel, Gagan; Vazirani, Vijay

    Recent results showed PPAD-completeness of the problem of computing an equilibrium for Fisher's market model under additively separable, piecewise-linear, concave utilities. We show that introducing perfect price discrimination in this model renders its equilibrium polynomial time computable. Moreover, its set of equilibria are captured by a convex program that generalizes the classical Eisenberg-Gale program, and always admits a rational solution.

  1. Association of Fisher scale and changes of language in patients with aneurysmal subarachnoid hemorrhage.

    PubMed

    Souza, Moysés Loiola Ponte de

    2014-11-01

    Cognitive deficits caused by subarachnoid hemorrhage (SAH) after rupture of cerebral aneurysms are common: approximately half of patients have severe, or at least striking, declines in one or more functions of the cognitive domain. The Fisher Scale is associated with the development of vasospasm and thus with the final outcome of the patient after SAH. An association of this scale with language disorders in the period preceding treatment has not previously been reported in the literature. The aim was to associate the presence of language deficits with the different grades of the Fisher Scale in patients with SAH before treatment of the aneurysm, and to compare the grades of this scale, identifying those most associated with language decline. A database of 185 preoperative language evaluations of patients with aneurysmal SAH at Hospital da Restauração was studied, using the Montreal-Toulouse Protocol (Alpha version) for language and the CERAD battery for verbal fluency. Data on the Fisher Scale, aneurysm location, age, and gender were obtained from a review of medical records. Patients were grouped according to the Fisher Scale (Fisher I, II, III, or IV) and compared with a control group of individuals considered normal. Disorders of language and verbal fluency were demonstrated in patients with SAH in the preoperative period. Classifying the patients according to the Fisher Scale allowed differences between the sub-groups to be identified and showed that patients with bulkier bleeding (Fisher III and IV) have larger declines in the functions analyzed.

  2. Resources and estuarine health: Perceptions of elected officials and recreational fishers

    SciTech Connect

    Burger, J.; Sanchez, J.; McMahon, M.; Leonard, J.; Lord, C.G.; Ramos, R.; Gochfeld, M.

    1999-10-29

    It is important to understand the perceptions of user groups regarding both the health of their estuaries and environmental problems requiring management. Recreational fishers were interviewed to determine the perceptions of one of the traditional user groups of Barnegat Bay (New Jersey), and elected officials were interviewed to determine if the people charged with making decisions about environmental issues in the bay held similar perceptions. Although relative ratings were similar, there were significant differences in perceptions of the severity of environmental problems, and for the most part, public officials thought the problems were more severe than did the fishers. Personal watercraft (often called Jet Skis) were rated as the most severe problem, followed by chemical pollution, junk, overfishing, street runoff, and boat oil. Small boats, sailboats, wind surfers, and foraging birds were not considered environmental problems by either elected officials or fishermen. The disconnect between the perceptions of the recreational fishers and those of the locally elected public officials suggests that officials may be hearing from some of the more vocal people about problems, rather than from the typical fishers. Both groups felt there were decreases in some of the resources in the bay; over 50% felt the number of fish and crabs had declined, the size of fish and crabs had declined, and the number of turtles had declined. Among recreational fishers, there were almost no differences in perceptions of the severity of environmental problems or in changes in the bay. The problems that were rated the most severe were personal watercraft and overfishing by commercial fishers. Recreational fishers ranked sailboats, wind surfers, and fishing by birds as posing no problem for the bay. Most fishers felt there had been recent major changes in Barnegat Bay, with there now being fewer and smaller fish, fewer and smaller crabs, and fewer turtles. The results suggest that the

  3. Comparing Alternative Kernels for the Kernel Method of Test Equating: Gaussian, Logistic, and Uniform Kernels. Research Report. ETS RR-08-12

    ERIC Educational Resources Information Center

    Lee, Yi-Hsuan; von Davier, Alina A.

    2008-01-01

    The kernel equating method (von Davier, Holland, & Thayer, 2004) is based on a flexible family of equipercentile-like equating functions that use a Gaussian kernel to continuize the discrete score distributions. While the classical equipercentile, or percentile-rank, equating method carries out the continuization step by linear interpolation,…
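
    To illustrate the continuization step the abstract refers to, the sketch below smooths a discrete score distribution with a Gaussian kernel: each score point x_j with probability p_j contributes a Gaussian centred at x_j, and the bandwidth h controls how far the result departs from the discrete distribution. This is plain Gaussian smoothing only; the published kernel equating method additionally rescales the scores so that the continuized distribution keeps the discrete mean and variance, a step omitted here (the printout shows the variance inflation that motivates it). The score range and probabilities are placeholders.

        import numpy as np
        from scipy.stats import norm

        # Discrete score distribution: scores 0..20 with illustrative probabilities.
        scores = np.arange(21)
        probs = np.exp(-0.5 * ((scores - 12) / 4.0) ** 2)
        probs /= probs.sum()

        def continuized_pdf(x, h):
            """Plain Gaussian-kernel continuization: sum_j p_j * N(x; x_j, h^2)."""
            return sum(p * norm.pdf(x, loc=s, scale=h) for p, s in zip(probs, scores))

        grid = np.linspace(-5, 25, 2000)
        dx = grid[1] - grid[0]
        disc_mean = np.sum(probs * scores)
        disc_var = np.sum(probs * (scores - disc_mean) ** 2)
        for h in (0.5, 1.0, 2.0):                      # illustrative bandwidths
            pdf = continuized_pdf(grid, h)
            mean = np.sum(grid * pdf) * dx
            var = np.sum((grid - mean) ** 2 * pdf) * dx
            print(f"h = {h}: continuized mean = {mean:.2f}, variance = {var:.2f} "
                  f"(discrete: mean = {disc_mean:.2f}, variance = {disc_var:.2f})")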

  4. Can reliable values of Young's modulus be deduced from Fisher's (1971) spinning lens measurements?

    PubMed

    Burd, H J; Wilde, G S; Judge, S J

    2006-04-01

    The current textbook view of the causes of presbyopia rests very largely on a series of experiments reported by R.F. Fisher some three decades ago, and in particular on the values of lens Young's modulus inferred from the deformation caused by spinning excised lenses about their optical axis (Fisher 1971). We studied the extent to which inferred values of Young's modulus are influenced by assumptions inherent in the mathematical procedures used by Fisher to interpret the test, and we investigated several alternative interpretation methods. The results suggest that modelling assumptions inherent in Fisher's original method may have led to systematic errors in the determination of the Young's modulus of the cortex and nucleus. Fisher's conclusion that the cortex is stiffer than the nucleus, particularly in middle age, may be an artefact associated with these systematic errors. Moreover, none of the models we explored is able to account for Fisher's claim that the removal of the capsule has only a modest effect on the deformations induced in the spinning lens.

  5. Rapidly shifting environmental baselines among fishers of the Gulf of California

    PubMed Central

    Sáenz-Arroyo, Andrea; Roberts, Callum M; Torre, Jorge; Cariño-Olvera, Micheline; Enríquez-Andrade, Roberto R

    2005-01-01

    Shifting environmental baselines are inter-generational changes in perception of the state of the environment. As one generation replaces another, people's perceptions of what is natural change even to the extent that they no longer believe historical anecdotes of past abundance or size of species. Although widely accepted, this phenomenon has yet to be quantitatively tested. Here we survey three generations of fishers from Mexico's Gulf of California (N=108), where fish populations have declined steeply over the last 60 years, to investigate how far and fast their environmental baselines are shifting. Compared to young fishers, old fishers named five times as many species and four times as many fishing sites as once being abundant/productive but now depleted (Kruskal–Wallis tests, both p<0.001) with no evidence of a slowdown in rates of loss experienced by younger compared to older generations (Kruskal–Wallis test, n.s. in both cases). Old fishers caught up to 25 times as many Gulf grouper Mycteroperca jordani as young fishers on their best ever fishing day (regression r2=0.62, p<0.001). Despite times of plentiful large fish still being within living memory, few young fishers appreciated that large species had ever been common or nearshore sites productive. Such rapid shifts in perception of what is natural help explain why society is tolerant of the creeping loss of biodiversity. They imply a large educational hurdle in efforts to reset expectations and targets for conservation. PMID:16191603

  6. Small convolution kernels for high-fidelity image restoration

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Park, Stephen K.

    1991-01-01

    An algorithm is developed for computing the mean-square-optimal values for small, image-restoration kernels. The algorithm is based on a comprehensive, end-to-end imaging system model that accounts for the important components of the imaging process: the statistics of the scene, the point-spread function of the image-gathering device, sampling effects, noise, and display reconstruction. Subject to constraints on the spatial support of the kernel, the algorithm generates the kernel values that restore the image with maximum fidelity, that is, the kernel minimizes the expected mean-square restoration error. The algorithm is consistent with the derivation of the spatially unconstrained Wiener filter, but leads to a small, spatially constrained kernel that, unlike the unconstrained filter, can be efficiently implemented by convolution. Simulation experiments demonstrate that for a wide range of imaging systems these small kernels can restore images with fidelity comparable to images restored with the unconstrained Wiener filter.
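
    A minimal numerical sketch of the idea of a small, spatially constrained restoration kernel: instead of the statistically derived mean-square-optimal kernel of the report, it simply fits a 3x3 kernel by least squares to a known blurred/original image pair and applies it by convolution. The scene, PSF, and noise level are all invented for the example.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)

# Hypothetical imaging model: a random "scene", a 5x5 box-blur PSF, and additive noise.
scene = rng.standard_normal((64, 64))
psf = np.ones((5, 5)) / 25.0
acquired = convolve2d(scene, psf, mode="same", boundary="wrap")
acquired += 0.01 * rng.standard_normal(acquired.shape)

# Fit a small (3x3) restoration kernel by least squares on shifted copies of the
# acquired image -- an empirical stand-in for the mean-square-optimal kernel.
shifts = [(dy, dx) for dy in range(-1, 2) for dx in range(-1, 2)]
cols = [np.roll(acquired, (dy, dx), axis=(0, 1)).ravel() for dy, dx in shifts]
A = np.stack(cols, axis=1)
coef, *_ = np.linalg.lstsq(A, scene.ravel(), rcond=None)
kernel = coef.reshape(3, 3)

# Restoration is then a single small convolution, cheap compared with a full Wiener filter.
restored = convolve2d(acquired, kernel, mode="same", boundary="wrap")
print("RMS restoration error:", np.sqrt(np.mean((restored - scene) ** 2)))
```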

  7. Influence of wheat kernel physical properties on the pulverizing process.

    PubMed

    Dziki, Dariusz; Cacak-Pietrzak, Grażyna; Miś, Antoni; Jończyk, Krzysztof; Gawlik-Dziki, Urszula

    2014-10-01

    The physical properties of wheat kernels were determined and related to pulverizing performance by correlation analysis. Nineteen samples of wheat cultivars with a similar level of protein content (11.2-12.8 % w.b.), obtained from an organic farming system, were used for the analysis. The kernels (moisture content 10 % w.b.) were pulverized using a laboratory hammer mill equipped with a 1.0 mm round-hole screen. The specific grinding energy ranged from 120 kJ kg(-1) to 159 kJ kg(-1). Many significant correlations (p < 0.05) were found between the physical properties of wheat kernels and the pulverizing process; in particular, the wheat kernel hardness index (obtained with the Single Kernel Characterization System) and vitreousness correlated significantly and positively with the grinding energy indices and the mass fraction of coarse particles (> 0.5 mm). Among the kernel mechanical properties determined by the uniaxial compression test, only the rupture force was correlated with the impact grinding results. The results also showed positive and significant relationships between kernel ash content and grinding energy requirements. On the basis of the wheat physical properties, a multiple linear regression was proposed for predicting the average particle size of the pulverized kernels.

  8. Geometric tree kernels: classification of COPD from airway tree geometry.

    PubMed

    Feragen, Aasa; Petersen, Jens; Grimm, Dominik; Dirksen, Asger; Pedersen, Jesper Holst; Borgwardt, Karsten; de Bruijne, Marleen

    2013-01-01

    Methodological contributions: This paper introduces a family of kernels for analyzing (anatomical) trees endowed with vector-valued measurements made along the tree. While state-of-the-art graph and tree kernels use combinatorial tree/graph structure with discrete node and edge labels, the kernels presented in this paper can include geometric information such as branch shape, branch radius or other vector-valued properties. In addition to being flexible in their ability to model different types of attributes, the presented kernels are computationally efficient and some of them can easily be computed for large datasets (N ≈ 10,000) of trees with 30-600 branches. Combining the kernels with standard machine learning tools enables us to analyze the relation between disease and anatomical tree structure and geometry. Experimental results: The kernels are used to compare airway trees segmented from low-dose CT, endowed with branch shape descriptors and airway wall area percentage measurements made along the tree. Using kernelized hypothesis testing we show that the geometric airway trees are significantly differently distributed in patients with Chronic Obstructive Pulmonary Disease (COPD) than in healthy individuals. The geometric tree kernels also give a significant increase in the classification accuracy of COPD from geometric tree structure endowed with airway wall thickness measurements in comparison with state-of-the-art methods, giving further insight into the relationship between airway wall thickness and COPD. Software: Software for computing kernels and statistical tests is available at http://image.diku.dk/aasa/software.php.

  9. A Kernel-based Account of Bibliometric Measures

    NASA Astrophysics Data System (ADS)

    Ito, Takahiko; Shimbo, Masashi; Kudo, Taku; Matsumoto, Yuji

    The application of kernel methods to citation analysis is explored. We show that a family of kernels on graphs provides a unified perspective on the three bibliometric measures that have been discussed independently: relatedness between documents, global importance of individual documents, and importance of documents relative to one or more (root) documents (relative importance). The framework provided by the kernels establishes relative importance as an intermediate between relatedness and global importance, in which the degree of `relativity,' or the bias between relatedness and importance, is naturally controlled by a parameter characterizing individual kernels in the family.
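
    As an illustration only (not code from the paper), the snippet below builds one commonly cited member of such a parameterized family of graph kernels, a Neumann-style kernel K = A(I - alpha*A)^(-1), on a small hypothetical citation graph, and shows how the parameter alpha moves the kernel between a relatedness-like regime and an importance-like regime; the adjacency matrix and alpha values are invented.

```python
import numpy as np

def neumann_kernel(A, alpha):
    """Neumann-style graph kernel K = sum_{k>=1} alpha^(k-1) A^k = A (I - alpha*A)^(-1).

    Small alpha keeps K close to A (local relatedness between documents); as alpha
    approaches 1/rho(A), K is increasingly dominated by the principal eigenvector,
    i.e. a global, importance-like score.
    """
    n = A.shape[0]
    return A @ np.linalg.inv(np.eye(n) - alpha * A)

# Hypothetical 5-document graph (symmetrized co-citation counts).
A = np.array([[0, 2, 1, 0, 0],
              [2, 0, 1, 1, 0],
              [1, 1, 0, 0, 1],
              [0, 1, 0, 0, 2],
              [0, 0, 1, 2, 0]], dtype=float)

rho = np.max(np.abs(np.linalg.eigvals(A)))
for alpha in (0.01, 0.9 / rho):        # near-zero vs near-critical diffusion
    K = neumann_kernel(A, alpha)
    print(np.round(K / K.sum(), 3))    # normalized so the two regimes are comparable
```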

  10. TGDA: Nonparametric Discriminant Analysis

    ERIC Educational Resources Information Center

    Pohl, Norval F.; Bruno, Albert V.

    1976-01-01

    A computer program for two-group nonparametric discriminant analysis is presented. Based on Bayes' Theorem for probability revision, the statistical rationale for this program uses the calculation of maximum likelihood estimates of group membership. The program compares the Bayesian procedure to the standard Linear Discriminant Function.…

  11. Flash-Type Discrimination

    NASA Technical Reports Server (NTRS)

    Koshak, William J.

    2010-01-01

    This viewgraph presentation describes the significant progress made in the flash-type discrimination algorithm development. The contents include: 1) Highlights of Progress for GLM-R3 Flash-Type discrimination Algorithm Development; 2) Maximum Group Area (MGA) Data; 3) Retrieval Errors from Simulations; and 4) Preliminary Global-scale Retrieval.

  12. The "Taste" for Discrimination.

    ERIC Educational Resources Information Center

    Chiswick, Barry R.

    1985-01-01

    Discusses, in terms of consumers, employers, and employees, how a "taste for discrimination," that is, someone's preference for or against association with some group in the labor market, can influence behavior and hence who gets hired. Argues that people with the strongest tastes for discrimination pay the heaviest cost. (RDN)

  13. Discrimination against Black Students

    ERIC Educational Resources Information Center

    Aloud, Ashwaq; Alsulayyim, Maryam

    2016-01-01

    Discrimination is a structured way of abusing people based on racial differences, thereby barring them from accessing wealth, political participation and engagement in many spheres of human life. Racism and discrimination are inherently rooted in the institutions of society; the problem has spread across many social segments of society, including…

  14. Microscale acceleration history discriminators

    DOEpatents

    Polosky, Marc A.; Plummer, David W.

    2002-01-01

    A new class of micromechanical acceleration history discriminators is claimed. These discriminators allow the precise differentiation of a wide range of acceleration-time histories, thereby allowing adaptive events to be triggered in response to the severity (or lack thereof) of an external environment. Such devices have applications in airbag activation, and other safety and surety applications.

  15. Rapid discrimination of main red meat species based on near-infrared hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Qiao, Lu; Peng, Yankun; Chao, Kuanglin; Qin, Jianwei

    2016-05-01

    Meat is a necessary source of essential nutrients for people, including protein, fat, and so on. The discrimination of meat species and the determination of meat authenticity have been an important issue in the meat industry. The objective of this study is to realize the fast and accurate identification of three main red meats, namely beef, lamb and pork, by using near-infrared hyperspectral imaging (HSI) technology. After acquiring the hyperspectral images of the meat samples, the calibration of the acquired images and the selection of the region of interest (ROI) were carried out. Then the spectral preprocessing method of standard normal variate correction (SNV) was used to reduce light scattering and random noise before the spectral analysis. Finally, characteristic wavelengths were extracted by principal component analysis (PCA), and the Fisher linear discriminant method was applied to establish Fisher discriminant functions to identify the meat species. All the samples were collected from different batches in order to improve the coverage of the models. In addition to the validation of the samples in the training set and cross validation, three different meat samples were cut into pieces of approximately 2 cm × 2 cm × 2 cm and joined together at one interface to be scanned by the HSI system. The acquired hyperspectral data were used to further validate the discriminant model. The results demonstrated that near-infrared hyperspectral imaging technology could be applied as an effective, rapid and non-destructive discrimination method for the main red meats.
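
    To make the processing chain above concrete, here is a minimal, hypothetical sketch of the same sequence (SNV scatter correction, PCA dimension reduction, Fisher/linear discriminant analysis) using scikit-learn on synthetic stand-in spectra; the data, component count, and fold count are illustrative assumptions, not the study's settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer

def snv(X):
    """Standard normal variate: center and scale each spectrum individually."""
    X = np.asarray(X, dtype=float)
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

# Hypothetical stand-in data: 90 ROI-mean spectra (3 classes x 30 samples, 200 bands)
# with a class-dependent spectral shape plus noise.
rng = np.random.default_rng(1)
bands = np.linspace(0, 3 * np.pi, 200)
labels = np.repeat([0, 1, 2], 30)
X = 0.1 * rng.standard_normal((90, 200)) + np.sin(bands[None, :] + 0.3 * labels[:, None])
y = np.array(["beef", "lamb", "pork"])[labels]

model = make_pipeline(FunctionTransformer(snv),        # scatter correction
                      PCA(n_components=10),            # characteristic-feature surrogate
                      LinearDiscriminantAnalysis())    # Fisher discriminant functions
print(cross_val_score(model, X, y, cv=5).mean())
```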

  16. Model-based online learning with kernels.

    PubMed

    Li, Guoqi; Wen, Changyun; Li, Zheng Guo; Zhang, Aimin; Yang, Feng; Mao, Kezhi

    2013-03-01

    New optimization models and algorithms for online learning with kernels (OLK) in classification, regression, and novelty detection are proposed in a reproducing kernel Hilbert space. Unlike the stochastic gradient descent algorithm, called the naive online regularized risk minimization algorithm (NORMA), OLK algorithms are obtained by solving a constrained optimization problem based on the proposed models. By exploiting the techniques of the Lagrange dual problem, as in Vapnik's support vector machine (SVM), the solution of the optimization problem can be obtained iteratively, and the iteration process is similar to that of NORMA. This further strengthens the foundation of OLK and enriches the research area of SVM. We also apply the obtained OLK algorithms to problems in classification, regression, and novelty detection, including real-time background subtraction, to show their effectiveness. The experimental results on both classification and regression illustrate that the accuracy of the OLK algorithms is comparable with that of traditional SVM-based algorithms, such as SVM and least squares SVM (LS-SVM), and with that of state-of-the-art algorithms such as the kernel recursive least squares (KRLS) method and the projectron method, while it is slightly higher than that of NORMA. On the other hand, the computational cost of the OLK algorithms is comparable with or slightly lower than that of existing online methods, such as the above-mentioned NORMA, KRLS, and projectron methods, but much lower than that of SVM-based algorithms. In addition, unlike SVM and LS-SVM, it is possible for OLK algorithms to be applied to non-stationary problems. The applicability of OLK to novelty detection is also illustrated by simulation results.
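
    For orientation, the following is a minimal sketch of the NORMA-style stochastic gradient baseline mentioned above (not the constrained-optimization OLK algorithms of the paper): an online kernel classifier with hinge loss, coefficient decay from regularization, and expansion on margin violations. The RBF kernel, step size, and toy data are assumptions.

```python
import numpy as np

def rbf(x, z, gamma=0.5):
    return np.exp(-gamma * np.sum((x - z) ** 2))

def norma_hinge(stream, eta=0.2, lam=0.01, gamma=0.5):
    """NORMA-style online kernel classifier with hinge loss.

    f(x) = sum_i alpha_i k(x_i, x); each step shrinks all coefficients by
    (1 - eta*lam) (regularization) and adds a new term when the margin is violated.
    """
    centers, alphas, mistakes = [], [], 0
    for x, y in stream:
        f = sum(a * rbf(c, x, gamma) for c, a in zip(centers, alphas))
        mistakes += int(np.sign(f) != y)
        alphas = [(1.0 - eta * lam) * a for a in alphas]   # decay of old terms
        if y * f < 1.0:                                    # margin violation -> expand
            centers.append(x)
            alphas.append(eta * y)
    return centers, alphas, mistakes

# Toy stream: two Gaussian blobs with labels +/-1, presented in random order.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1, 0.5, (100, 2)), rng.normal(1, 0.5, (100, 2))])
y = np.r_[np.full(100, -1), np.full(100, 1)]
idx = rng.permutation(200)
_, _, errs = norma_hinge(zip(X[idx], y[idx]))
print("online mistakes:", errs)
```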

  17. Robust kernel collaborative representation for face recognition

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Wang, Xiaohui; Ma, Yanbo; Jiang, Yuzheng; Zhu, Yinghui; Jin, Zhong

    2015-05-01

    One of the greatest challenges of representation-based face recognition is that the training samples are usually insufficient. In other words, the training set usually does not include enough samples to show the varieties of high-dimensional face images caused by illuminations, facial expressions, and postures. When the test sample is significantly different from the training samples of the same subject, the recognition performance will be sharply reduced. We propose a robust kernel collaborative representation based on virtual samples for face recognition. We consider that the virtual training set conveys some reasonable and possible variations of the original training samples. Hence, we design a new objective function to more closely match the representation coefficients generated from the original and virtual training sets. In order to further improve the robustness, we implement the corresponding representation-based face recognition in kernel space. It is noteworthy that any kind of virtual training samples can be used in our method. We use noised face images to obtain virtual face samples; the noise can be approximately viewed as a reflection of the varieties of illuminations, facial expressions, and postures. Imposing Gaussian noise (or other types of noise) on the original training samples is a simple and feasible way to obtain virtual face samples that represent possible variations of the original samples. Experimental results on the FERET, Georgia Tech, and ORL face databases show that the proposed method is more robust than two state-of-the-art face recognition methods, namely CRC and kernel CRC.

  18. Discrimination of raw and processed Dipsacus asperoides by near infrared spectroscopy combined with least squares-support vector machine and random forests

    NASA Astrophysics Data System (ADS)

    Xin, Ni; Gu, Xiao-Feng; Wu, Hao; Hu, Yu-Zhu; Yang, Zhong-Lin

    2012-04-01

    Most herbal medicines could be processed to fulfill the different requirements of therapy. The purpose of this study was to discriminate between raw and processed Dipsacus asperoides, a common traditional Chinese medicine, based on their near infrared (NIR) spectra. Least squares-support vector machine (LS-SVM) and random forests (RF) were employed for full-spectrum classification. Three types of kernels, including linear kernel, polynomial kernel and radial basis function kernel (RBF), were checked for optimization of LS-SVM model. For comparison, a linear discriminant analysis (LDA) model was performed for classification, and the successive projections algorithm (SPA) was executed prior to building an LDA model to choose an appropriate subset of wavelengths. The three methods were applied to a dataset containing 40 raw herbs and 40 corresponding processed herbs. We ran 50 runs of 10-fold cross validation to evaluate the model's efficiency. The performance of the LS-SVM with RBF kernel (RBF LS-SVM) was better than the other two kernels. The RF, RBF LS-SVM and SPA-LDA successfully classified all test samples. The mean error rates for the 50 runs of 10-fold cross validation were 1.35% for RBF LS-SVM, 2.87% for RF, and 2.50% for SPA-LDA. The best classification results were obtained by using LS-SVM with RBF kernel, while RF was fast in the training and making predictions.
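
    The comparison above can be mimicked in outline with scikit-learn, shown below on synthetic stand-in spectra: an RBF-kernel SVM (used here as a stand-in for LS-SVM, which scikit-learn does not provide) against a random forest, each evaluated with 50 repetitions of 10-fold cross validation. All data and hyperparameters are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical stand-in for the NIR dataset: 80 spectra (40 raw, 40 processed).
rng = np.random.default_rng(3)
X = rng.random((80, 150)) + np.r_[np.zeros(40), np.full(40, 0.08)][:, None]
y = np.r_[np.zeros(40), np.ones(40)]

cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=50, random_state=0)
models = [("RBF SVM (LS-SVM stand-in)",
           make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))),
          ("Random forest", RandomForestClassifier(n_estimators=100, random_state=0))]
for name, clf in models:
    err = 1.0 - cross_val_score(clf, X, y, cv=cv).mean()
    print(f"{name}: mean CV error {err:.2%}")
```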

  19. Prediction of kernel density of corn using single-kernel near infrared spectroscopy

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Corn hardness is an important property for dry and wet-millers, food processors and corn breeders developing hybrids for specific markets. Of the several methods used to measure hardness, kernel density measurements are one of the more repeatable methods to quantify hardness. Near infrared spec...

  20. Neutron scattering kernel for solid deuterium

    NASA Astrophysics Data System (ADS)

    Granada, J. R.

    2009-06-01

    A new scattering kernel to describe the interaction of slow neutrons with solid deuterium was developed. The main characteristics of that system are contained in the formalism, including the lattice's density of states, the Young-Koppel quantum treatment of the rotations, and the internal molecular vibrations. The elastic processes involving coherent and incoherent contributions are fully described, as well as the spin-correlation effects. The results from the new model are compared with the best available experimental data, showing very good agreement.

  1. Oil point pressure of Indian almond kernels

    NASA Astrophysics Data System (ADS)

    Aregbesola, O.; Olatunde, G.; Esuola, S.; Owolarafe, O.

    2012-07-01

    The effect of preprocessing conditions such as moisture content, heating temperature, heating time and particle size on the oil point pressure of Indian almond kernels was investigated. Results showed that oil point pressure was significantly (P < 0.05) affected by the above-mentioned parameters. It was also observed that oil point pressure decreased with increasing heating temperature and heating time for both coarse and fine particles. Furthermore, an increase in moisture content resulted in increased oil point pressure for coarse particles, while there was a reduction in oil point pressure with increasing moisture content for fine particles.

  2. Verification of Chare-kernel programs

    SciTech Connect

    Bhansali, S.; Kale, L.V. )

    1989-01-01

    Experience with concurrent programming has shown that concurrent programs can conceal bugs even after extensive testing. Thus, there is a need for practical techniques which can establish the correctness of parallel programs. This paper proposes a method for proving the partial correctness of programs written in the Chare-kernel language, a language designed to support the parallel execution of computations with irregular structures. The proof is based on the lattice proof technique and is divided into two parts. The first part is concerned with the program behavior within a single chare instance, whereas the second part captures the inter-chare interaction.

  3. Fish Consumption Patterns and Mercury Advisory Knowledge Among Fishers in the Haw River Basin

    PubMed Central

    Johnston, Jill E.; Hoffman, Kate; Wing, Steve; Lowman, Amy

    2016-01-01

    BACKGROUND Fish consumption has numerous health benefits, with fish providing a source of protein as well as omega-3 fatty acids. However, some fish also contain contaminants that can impair human health. In North Carolina, the Department of Health and Human Services has issued fish consumption advisories due to methylmercury contamination in fish. Little is known about local fishers’ consumption patterns and advisory adherence in North Carolina. METHODS We surveyed a consecutive sample of 50 fishers (74.6% positive response rate) who reported eating fish caught from the Haw River Basin or Jordan Lake. They provided information on demographic characteristics, species caught, and the frequency of local fish consumption. Additionally, fishers provided information on their knowledge of fish consumption advisories and the impact of those advisories on their fishing and fish consumption patterns. RESULTS The majority of participants were male (n = 44) and reported living in central North Carolina. Catfish, crappie, sunfish, and large-mouth bass were consumed more frequently than other species of fish. Of the fishers surveyed, 8 reported eating more than 1 fish meal high in mercury per week, which exceeds the North Carolina advisory recommendation. Most participants (n = 32) had no knowledge of local fish advisories, and only 4 fishers reported that advisories impacted their fishing practices. LIMITATIONS We sampled 50 fishers at 11 locations. There is no enumeration of the dynamic population of fishers and no way to assess the representativeness of this sample. CONCLUSIONS Additional outreach is needed to make local fishers aware of fish consumption advisories and the potential health impacts of eating high-mercury fish, which may also contain other persistent and bioaccumulative toxins. PMID:26763238

  4. Kernel learning at the first level of inference.

    PubMed

    Cawley, Gavin C; Talbot, Nicola L C

    2014-05-01

    Kernel learning methods, whether Bayesian or frequentist, typically involve multiple levels of inference, with the coefficients of the kernel expansion being determined at the first level and the kernel and regularisation parameters carefully tuned at the second level, a process known as model selection. Model selection for kernel machines is commonly performed via optimisation of a suitable model selection criterion, often based on cross-validation or theoretical performance bounds. However, if there are a large number of kernel parameters, as for instance in the case of automatic relevance determination (ARD), there is a substantial risk of over-fitting the model selection criterion, resulting in poor generalisation performance. In this paper we investigate the possibility of learning the kernel, for the Least-Squares Support Vector Machine (LS-SVM) classifier, at the first level of inference, i.e. parameter optimisation. The kernel parameters and the coefficients of the kernel expansion are jointly optimised at the first level of inference, minimising a training criterion with an additional regularisation term acting on the kernel parameters. The key advantage of this approach is that the values of only two regularisation parameters need be determined in model selection, substantially alleviating the problem of over-fitting the model selection criterion. The benefits of this approach are demonstrated using a suite of synthetic and real-world binary classification benchmark problems, where kernel learning at the first level of inference is shown to be statistically superior to the conventional approach, improves on our previous work (Cawley and Talbot, 2007) and is competitive with Multiple Kernel Learning approaches, but with reduced computational expense.

  5. Analysis of maize (Zea mays) kernel density and volume using micro-computed tomography and single-kernel near infrared spectroscopy

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Maize kernel density impacts milling quality of the grain due to kernel hardness. Harder kernels are correlated with higher test weight and are more resistant to breakage during harvest and transport. Softer kernels, in addition to being susceptible to mechanical damage, are also prone to pathogen ...

  6. Landscape-scale habitat selection by fishers translocated to the Olympic Peninsula of Washington

    USGS Publications Warehouse

    Lewis, Jeffrey C.; Jenkins, Kurt J.; Happe, Patricia J.; Manson, David J.; McCalmon, Marc

    2016-01-01

    The fisher was extirpated from much of the Pacific Northwestern United States during the mid- to late-1900s and is now proposed for federal listing as a threatened species in all or part of its west coast range. Following the translocation of 90 fishers from central British Columbia, Canada, to the Olympic Peninsula of Washington State from 2008 to 2010, we investigated the landscape-scale habitat selection of reintroduced fishers across a broad range of forest ages and disturbance histories, providing the first information on habitat relationships of newly reintroduced fishers in coastal coniferous forests in the Pacific Northwest. We developed 17 a priori models to evaluate several habitat-selection hypotheses based on premises of habitat models used to forecast habitat suitability for the reintroduced population. Further, we hypothesized that female fishers, because of their smaller body size than males, greater vulnerability to predation, and specific reproductive requirements, would be more selective than males for mid- to late-seral forest communities, where complex forest structural elements provide secure foraging, resting, and denning sites. We assessed 11 forest structure and landscape characteristics within the home range core-areas used by 19 females and 12 males and within randomly placed pseudo core areas that represented available habitats. We used case-controlled logistic regression to compare the characteristics of used and pseudo core areas and to assess selection by male and female fishers. Females were more selective of core area placement than males. Fifteen of 19 females (79%) and 5 of 12 males (42%) selected core areas within federal lands that encompassed primarily forests with an overstory of mid-sized or large trees. Male fishers exhibited only weak selection for core areas dominated by forests with an overstory of small trees, primarily on land managed for timber production or at high elevations. The amount of natural open area best

  7. Delimiting Areas of Endemism through Kernel Interpolation

    PubMed Central

    Oliveira, Ubirajara; Brescovit, Antonio D.; Santos, Adalberto J.

    2015-01-01

    We propose a new approach for the identification of areas of endemism, the Geographical Interpolation of Endemism (GIE), based on kernel spatial interpolation. This method differs from others in being independent of grid cells. The approach estimates the overlap between species distributions through a kernel interpolation of the centroids of species distributions, with areas of influence defined from the distance between the centroid and the farthest point of occurrence of each species. We used this method to delimit areas of endemism of spiders from Brazil. To assess the effectiveness of GIE, we analyzed the same data using Parsimony Analysis of Endemism and NDM and compared the areas identified through each method. The analyses using GIE identified 101 areas of endemism of spiders in Brazil. GIE proved to be effective in identifying areas of endemism at multiple scales, with fuzzy edges and supported by more synendemic species than the other methods. The areas of endemism identified with GIE were generally congruent with those identified for other taxonomic groups, suggesting that common processes can be responsible for the origin and maintenance of these biogeographic units. PMID:25611971
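
    A minimal, grid-free sketch in the spirit of GIE (not the published implementation): each hypothetical species contributes a Gaussian bump centred on the centroid of its records, with a spread set by the distance from the centroid to its farthest record, and overlapping bumps mark candidate areas of endemism. All coordinates below are invented.

```python
import numpy as np

def gie_surface(species_points, grid_x, grid_y):
    """Minimal GIE-like surface: one Gaussian bump per species, centred on the
    centroid of its records, with spread set by the distance to the farthest record."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    surface = np.zeros_like(gx)
    for pts in species_points:
        pts = np.asarray(pts, dtype=float)
        centroid = pts.mean(axis=0)
        radius = np.max(np.linalg.norm(pts - centroid, axis=1)) + 1e-9  # area of influence
        d2 = (gx - centroid[0]) ** 2 + (gy - centroid[1]) ** 2
        surface += np.exp(-d2 / (2.0 * radius ** 2))
    return surface   # high values = many overlapping ranges (candidate endemism area)

# Three hypothetical species with clustered occurrence records.
rng = np.random.default_rng(4)
species = [rng.normal(loc=c, scale=0.3, size=(15, 2)) for c in ([0, 0], [0.4, 0.2], [3, 3])]
S = gie_surface(species, np.linspace(-1, 4, 60), np.linspace(-1, 4, 60))
print("peak overlap:", S.max())
```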

  8. Bergman kernel, balanced metrics and black holes

    NASA Astrophysics Data System (ADS)

    Klevtsov, Semyon

    In this thesis we explore the connections between the Kahler geometry and Landau levels on compact manifolds. We rederive the expansion of the Bergman kernel on Kahler manifolds developed by Tian, Yau, Zelditch, Lu and Catlin, using path integral and perturbation theory. The physics interpretation of this result is as an expansion of the projector of wavefunctions on the lowest Landau level, in the special case that the magnetic field is proportional to the Kahler form. This is a geometric expansion, somewhat similar to the DeWitt-Seeley-Gilkey short time expansion for the heat kernel, but in this case describing the long time limit, without depending on supersymmetry. We also generalize this expansion to supersymmetric quantum mechanics and more general magnetic fields, and explore its applications. These include the quantum Hall effect in curved space, the balanced metrics and Kahler gravity. In particular, we conjecture that for a probe in a BPS black hole in type II strings compactified on Calabi-Yau manifolds, the moduli space metric is the balanced metric.

  9. Scientific Computing Kernels on the Cell Processor

    SciTech Connect

    Williams, Samuel W.; Shalf, John; Oliker, Leonid; Kamil, Shoaib; Husbands, Parry; Yelick, Katherine

    2007-04-04

    The slowing pace of commodity microprocessor performance improvements combined with ever-increasing chip power demands has become of utmost concern to computational scientists. As a result, the high performance computing community is examining alternative architectures that address the limitations of modern cache-based designs. In this work, we examine the potential of using the recently-released STI Cell processor as a building block for future high-end computing systems. Our work contains several novel contributions. First, we introduce a performance model for Cell and apply it to several key scientific computing kernels: dense matrix multiply, sparse matrix vector multiply, stencil computations, and 1D/2D FFTs. The difficulty of programming Cell, which requires assembly level intrinsics for the best performance, makes this model useful as an initial step in algorithm design and evaluation. Next, we validate the accuracy of our model by comparing results against published hardware results, as well as our own implementations on a 3.2GHz Cell blade. Additionally, we compare Cell performance to benchmarks run on leading superscalar (AMD Opteron), VLIW (Intel Itanium2), and vector (Cray X1E) architectures. Our work also explores several different mappings of the kernels and demonstrates a simple and effective programming model for Cell's unique architecture. Finally, we propose modest microarchitectural modifications that could significantly increase the efficiency of double-precision calculations. Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency.

  10. Generalized Langevin equation with tempered memory kernel

    NASA Astrophysics Data System (ADS)

    Liemert, André; Sandev, Trifce; Kantz, Holger

    2017-01-01

    We study a generalized Langevin equation for a free particle in the presence of a truncated power-law and Mittag-Leffler memory kernel. It is shown that, in the presence of truncation, the particle turns from subdiffusive behavior in the short time limit to normal diffusion in the long time limit. The case of the harmonic oscillator is considered as well, and the relaxation functions and the normalized displacement correlation function are represented in an exact form. By considering an external time-dependent periodic force we obtain resonant behavior even in the case of a free particle, due to the influence of the environment on the particle movement. Additionally, the double-peak phenomenon in the imaginary part of the complex susceptibility is observed. The truncation parameter is found to have a strong influence on the behavior of these quantities, and it is shown how the truncation parameter changes the critical frequencies. The normalized displacement correlation function for a fractional generalized Langevin equation is investigated as well. All the results are exact and given in terms of the three-parameter Mittag-Leffler function and the Prabhakar generalized integral operator, whose kernel contains a three-parameter Mittag-Leffler function. This kind of truncated Langevin equation can be of high relevance for the description of the lateral diffusion of lipids and proteins in cell membranes.

  11. Transcriptome analysis of Ginkgo biloba kernels

    PubMed Central

    He, Bing; Gu, Yincong; Xu, Meng; Wang, Jianwen; Cao, Fuliang; Xu, Li-an

    2015-01-01

    Ginkgo biloba is a dioecious species native to China with medicinally and phylogenetically important characteristics; however, genomic resources for this species are limited. In this study, we performed the first transcriptome sequencing of Ginkgo kernels at five time points using Illumina paired-end sequencing. Approximately 25.08 Gb of clean reads were obtained, and 68,547 unigenes with an average length of 870 bp were generated by de novo assembly. Of these unigenes, 29,987 (43.74%) were annotated in publicly available plant protein databases. A total of 3,869 genes were identified as significantly differentially expressed, and enrichment analysis was conducted at the different time points. Furthermore, metabolic pathway analysis revealed that 66 unigenes were responsible for terpenoid backbone biosynthesis, with up to 12 up-regulated unigenes involved in the biosynthesis of ginkgolide and bilobalide. Differential gene expression analysis together with real-time PCR experiments indicated that the synthesis of bilobalide may have interfered with the ginkgolide synthesis process in the kernel. These data markedly expand the existing transcriptome resources of Ginkgo and provide a valuable platform to reveal more about the developmental and metabolic mechanisms of this species. PMID:26500663

  12. Aligning Biomolecular Networks Using Modular Graph Kernels

    NASA Astrophysics Data System (ADS)

    Towfic, Fadi; Greenlee, M. Heather West; Honavar, Vasant

    Comparative analysis of biomolecular networks constructed using measurements from different conditions, tissues, and organisms offer a powerful approach to understanding the structure, function, dynamics, and evolution of complex biological systems. We explore a class of algorithms for aligning large biomolecular networks by breaking down such networks into subgraphs and computing the alignment of the networks based on the alignment of their subgraphs. The resulting subnetworks are compared using graph kernels as scoring functions. We provide implementations of the resulting algorithms as part of BiNA, an open source biomolecular network alignment toolkit. Our experiments using Drosophila melanogaster, Saccharomyces cerevisiae, Mus musculus and Homo sapiens protein-protein interaction networks extracted from the DIP repository of protein-protein interaction data demonstrate that the performance of the proposed algorithms (as measured by % GO term enrichment of subnetworks identified by the alignment) is competitive with some of the state-of-the-art algorithms for pair-wise alignment of large protein-protein interaction networks. Our results also show that the inter-species similarity scores computed based on graph kernels can be used to cluster the species into a species tree that is consistent with the known phylogenetic relationships among the species.

  13. Conservation of the Eastern Taiwan Strait Chinese White Dolphin (Sousa chinensis): Fishers' Perspectives and Management Implications.

    PubMed

    Liu, Ta-Kang; Wang, Yu-Cheng; Chuang, Laurence Zsu-Hsin; Chen, Chih-How

    2016-01-01

    The abundance of the eastern Taiwan Strait (ETS) population of the Chinese white dolphin (Sousa chinensis) has been estimated to be less than 100 individuals. It is categorized as critically endangered in the IUCN Red List of Threatened Species. Thus, immediate conservation measures should be taken to protect it from extinction. Currently, the Taiwanese government plans to designate its habitat as a Major Wildlife Habitat (MWH), a type of marine protected area (MPA) for the conservation of wildlife species. Although the designation allows the current exploitation to continue, it may cause conflicts among multiple stakeholders with competing interests. This study explores the attitudes and opinions of the stakeholders in order to better manage the MPA. It employs a semi-structured interview and a questionnaire survey of local fishers. Results from the interviews indicated that the subsistence of fishers remains a major problem. It was found that stakeholders have different perceptions of the fishers' attitude towards conservation and also thought that fishery-related law enforcement could be difficult. The quantitative survey showed that fishers are generally positive towards the conservation of the Chinese white dolphin but are less willing to participate in the planning process. Most fishers considered a temporary fishing closure feasible for conservation. The results of this study provide recommendations for future efforts towards the goal of better conservation of this endangered species.

  14. Statistical Inference in the Wright-Fisher Model Using Allele Frequency Data.

    PubMed

    Tataru, Paula; Simonsen, Maria; Bataillon, Thomas; Hobolth, Asger

    2016-08-02

    The Wright-Fisher model provides an elegant mathematical framework for understanding allele frequency data. In particular, the model can be used to infer the demographic history of species and identify loci under selection. A crucial quantity for inference under the Wright-Fisher model is the distribution of allele frequencies (DAF). Despite the apparent simplicity of the model, the calculation of the DAF is challenging. We review and discuss strategies for approximating the DAF, and how these are used in methods that perform inference from allele frequency data. Various evolutionary forces can be incorporated in the Wright-Fisher model, and we consider these in turn. We begin our review with the basic bi-allelic Wright-Fisher model where random genetic drift is the only evolutionary force. We then consider mutation, migration, and selection. In particular, we compare diffusion-based and moment-based methods in terms of accuracy, computational efficiency, and analytical tractability. We conclude with a brief overview of the multi-allelic process with a general mutation model. [Allele frequency, diffusion, inference, moments, selection, Wright-Fisher.].
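
    As a small illustration of the model discussed above (not the review's moment- or diffusion-based approximations), the sketch below approximates the distribution of allele frequencies by direct Monte Carlo simulation of a bi-allelic Wright-Fisher population with drift, selection, and symmetric mutation; the population size and parameter values are arbitrary.

```python
import numpy as np

def wright_fisher_daf(N=100, p0=0.2, s=0.0, u=0.0, generations=200, replicates=50_000, seed=0):
    """Monte Carlo approximation of the distribution of allele frequencies (DAF)
    under a bi-allelic Wright-Fisher model with selection s and mutation rate u."""
    rng = np.random.default_rng(seed)
    p = np.full(replicates, p0)
    for _ in range(generations):
        w = p * (1.0 + s) / (p * (1.0 + s) + (1.0 - p))   # selection on allele A
        w = w * (1.0 - u) + (1.0 - w) * u                 # symmetric mutation
        p = rng.binomial(2 * N, w) / (2.0 * N)            # genetic drift (binomial sampling)
    return p

freqs = wright_fisher_daf(s=0.01)
print("P(lost) =", np.mean(freqs == 0), " P(fixed) =", np.mean(freqs == 1))
```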

  15. A beacon configuration optimization method based on Fisher information for Mars atmospheric entry

    NASA Astrophysics Data System (ADS)

    Zhao, Zeduan; Yu, Zhengshi; Cui, Pingyuan

    2017-04-01

    The navigation capability of the proposed Mars network-based entry navigation system is directly related to the number of beacons and the relative configuration between the beacons and the entry vehicle. In this paper, a new beacon configuration optimization method is developed based on Fisher information theory; the method is suitable for any number of visible beacons. The proposed method can be used for navigation schemes based on range measurements provided by radio transceivers or other sensors for Mars entry. The observability of a specific state is defined as its Fisher information based on the observation model. The overall navigation capability is improved by maximizing the minimum average Fisher information, even though the navigation system is not fully observable. In addition, when there is only one beacon available for entry navigation and the observation information is relatively limited, the optimization method can be modified to maximize the Fisher information of a specific state that may be preferred by the guidance and control system to improve its estimation accuracy. Finally, navigation scenarios consisting of 1-3 beacons are tested to validate the effectiveness of the developed optimization method. The extended Kalman filter (EKF) is employed to derive the state estimation error covariance. The results also show that the zero-Fisher-information situation should be avoided, especially when the dynamic system is highly nonlinear and the states change dramatically.
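
    To make the "observability as Fisher information" idea concrete, here is a simplified sketch under stated assumptions (position states only, independent Gaussian range noise, no entry dynamics): the Fisher information matrix for range measurements is the sum of outer products of the unit line-of-sight vectors scaled by the noise variance. The geometry and noise level are invented.

```python
import numpy as np

def range_fim(vehicle, beacons, sigma=0.05):
    """Fisher information matrix of the vehicle position for range-only measurements
    with i.i.d. Gaussian noise: FIM = sum_i u_i u_i^T / sigma^2, where u_i is the
    unit line-of-sight vector from beacon i to the vehicle."""
    vehicle = np.asarray(vehicle, dtype=float)
    fim = np.zeros((3, 3))
    for b in np.atleast_2d(beacons):
        u = vehicle - np.asarray(b, dtype=float)
        u /= np.linalg.norm(u)
        fim += np.outer(u, u) / sigma ** 2
    return fim

# Hypothetical geometry (km): an entry vehicle and two surface beacons.
vehicle = np.array([0.0, 0.0, 60.0])
beacons = np.array([[200.0, 0.0, 0.0], [-50.0, 180.0, 0.0]])
F = range_fim(vehicle, beacons)
# With fewer beacons than position states the FIM is singular: some directions carry
# zero Fisher information, which is the situation the optimization tries to avoid.
print("eigenvalues of FIM:", np.round(np.linalg.eigvalsh(F), 4))
```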

  16. Fast and robust discrimination of almonds (Prunus amygdalus) with respect to their bitterness by using near infrared and partial least squares-discriminant analysis.

    PubMed

    Borràs, Eva; Amigo, José Manuel; van den Berg, Frans; Boqué, Ricard; Busto, Olga

    2014-06-15

    In this study, near-infrared spectroscopy (NIR) coupled to chemometrics is used to develop a fast, simple, non-destructive and robust method for discriminating sweet and bitter almonds (Prunus amygdalus) by the in situ measurement of the kernel surface without any sample pre-treatment. Principal component analysis (PCA) and partial least-squares discriminant analysis (PLS-DA) models were built to discriminate both types of almonds, obtaining high levels of sensitivity and specificity for both classes, with more than 95% of the samples correctly classified and discriminated. Moreover, the almonds were also analysed by Raman spectroscopy, the reference technique for this type of analysis, to validate and confirm the results obtained by NIR.

  17. Sugar uptake into kernels of tunicate tassel-seed maize

    SciTech Connect

    Thomas, P.A.; Felker, F.C.; Crawford, C.G. )

    1990-05-01

    A maize (Zea mays L.) strain expressing both the tassel-seed (Ts-5) and tunicate (Tu) characters was developed which produces glume-covered kernels on the tassel, often borne on 7-10 mm pedicels. Vigorous plants produce up to 100 such kernels interspersed with additional sessile kernels. This floral unit provides a potentially valuable experimental system for studying sugar uptake into developing maize seeds. When detached kernels (with glumes and pedicel intact) are placed in incubation solution, fluid flows up the pedicel and into the glumes, entering the pedicel apoplast near the kernel base. The unusual anatomical features of this maize strain permit experimental access to the pedicel apoplast with much less possibility of kernel base tissue damage than with kernels excised from the cob. (¹⁴C)Fructose incorporation into soluble and insoluble fractions of the endosperm increased for 8 days. Endosperm uptake of sucrose, fructose, and D-glucose was significantly greater than that of L-glucose. Fructose uptake was significantly inhibited by CCCP, DNP, and PCMBS. These results suggest the presence of an active, non-diffusion component of sugar transport in maize kernels.

  18. Integral Transform Methods: A Critical Review of Various Kernels

    NASA Astrophysics Data System (ADS)

    Orlandini, Giuseppina; Turro, Francesco

    2017-03-01

    Some general remarks about integral transform approaches to response functions are made. Their advantage for calculating cross sections at energies in the continuum is stressed. In particular we discuss the class of kernels that allow calculations of the transform by matrix diagonalization. A particular set of such kernels, namely the wavelets, is tested in a model study.

  19. Evidence-Based Kernels: Fundamental Units of Behavioral Influence

    ERIC Educational Resources Information Center

    Embry, Dennis D.; Biglan, Anthony

    2008-01-01

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior-influence procedure shown through experimental analysis to affect a specific behavior and that is indivisible in the sense that removing any of…

  20. Comparison of Kernel Equating and Item Response Theory Equating Methods

    ERIC Educational Resources Information Center

    Meng, Yu

    2012-01-01

    The kernel method of test equating is a unified approach to test equating with some advantages over traditional equating methods. Therefore, it is important to evaluate in a comprehensive way the usefulness and appropriateness of the Kernel equating (KE) method, as well as its advantages and disadvantages compared with several popular item…

  1. Integrating the Gradient of the Thin Wire Kernel

    NASA Technical Reports Server (NTRS)

    Champagne, Nathan J.; Wilton, Donald R.

    2008-01-01

    A formulation for integrating the gradient of the thin wire kernel is presented. This approach employs a new expression for the gradient of the thin wire kernel derived from a recent technique for numerically evaluating the exact thin wire kernel. This approach should provide essentially arbitrary accuracy and may be used with higher-order elements and basis functions using the procedure described in [4]. When the source and observation points are close, the potential integrals over wire segments involving the wire kernel are split into parts to handle the singular behavior of the integrand [1]. The singularity characteristics of the gradient of the wire kernel are different than those of the wire kernel, and the axial and radial components have different singularities. The characteristics of the gradient of the wire kernel are discussed in [2]. To evaluate the near electric and magnetic fields of a wire, the integration of the gradient of the wire kernel needs to be calculated over the source wire. Since the vector bases for current have constant direction on linear wire segments, these integrals reduce to integrals of the form

  2. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AGREEMENTS AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  3. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  4. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AGREEMENTS AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  5. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  6. High speed sorting of Fusarium-damaged wheat kernels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Recent studies have found that resistance to Fusarium fungal infection can be inherited in wheat from one generation to another. However, a cost-effective method is not yet available to separate Fusarium-damaged wheat kernels from undamaged kernels so that wheat breeders can take advantage of...

  7. End-use quality of soft kernel durum wheat

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Kernel texture is a major determinant of end-use quality of wheat. Durum wheat is known for its very hard texture, which influences how it is milled and for what products it is well suited. We developed soft kernel durum wheat lines via Ph1b-mediated homoeologous recombination with Dr. Leonard Joppa...

  8. Optimal Bandwidth Selection in Observed-Score Kernel Equating

    ERIC Educational Resources Information Center

    Häggström, Jenny; Wiberg, Marie

    2014-01-01

    The selection of bandwidth in kernel equating is important because it has a direct impact on the equated test scores. The aim of this article is to examine the use of double smoothing when selecting bandwidths in kernel equating and to compare double smoothing with the commonly used penalty method. This comparison was made using both an equivalent…

  9. Parametric kernel-driven active contours for image segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Qiongzhi; Fang, Jiangxiong

    2012-10-01

    We investigated a parametric kernel-driven active contour (PKAC) model, which implicitly applies kernel mapping and a piecewise-constant model to the image data via a kernel function. The proposed model consists of a curve evolution functional with three terms: global and local kernel-driven terms, which evaluate the deviation of the mapped image data within each region from the piecewise-constant model, and a regularization term expressed as the length of the evolution curves. Through the local kernel-driven term, the proposed model can effectively segment images with intensity inhomogeneity by incorporating local image information. By balancing the weight between the global kernel-driven term and the local kernel-driven term, the proposed model can segment images with either intensity homogeneity or intensity inhomogeneity. To ensure the smoothness of the level set function and reduce the computational cost, a distance regularizing term is applied to penalize the deviation of the level set function and eliminate the requirement of re-initialization. Compared with the local image fitting model and the local binary fitting model, experimental results show the advantages of the proposed method in terms of computational efficiency and accuracy.

  10. Evidence-based Kernels: Fundamental Units of Behavioral Influence

    PubMed Central

    Biglan, Anthony

    2008-01-01

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior–influence procedure shown through experimental analysis to affect a specific behavior and that is indivisible in the sense that removing any of its components would render it inert. Existing evidence shows that a variety of kernels can influence behavior in context, and some evidence suggests that frequent use or sufficient use of some kernels may produce longer lasting behavioral shifts. The analysis of kernels could contribute to an empirically based theory of behavioral influence, augment existing prevention or treatment efforts, facilitate the dissemination of effective prevention and treatment practices, clarify the active ingredients in existing interventions, and contribute to efficiently developing interventions that are more effective. Kernels involve one or more of the following mechanisms of behavior influence: reinforcement, altering antecedents, changing verbal relational responding, or changing physiological states directly. The paper describes 52 of these kernels, and details practical, theoretical, and research implications, including calling for a national database of kernels that influence human behavior. PMID:18712600

  11. Computing the roots of complex orthogonal and kernel polynomials

    SciTech Connect

    Saylor, P.E.; Smolarski, D.C.

    1988-01-01

    A method is presented to compute the roots of complex orthogonal and kernel polynomials. An important application of complex kernel polynomials is the acceleration of iterative methods for the solution of nonsymmetric linear equations. In the real case, the roots of orthogonal polynomials coincide with the eigenvalues of the Jacobi matrix, a symmetric tridiagonal matrix obtained from the defining three-term recurrence relationship for the orthogonal polynomials. In the real case kernel polynomials are orthogonal. The Stieltjes procedure is an algorithm to compute the roots of orthogonal and kernel polynomials based on these facts. In the complex case, the Jacobi matrix generalizes to a Hessenberg matrix, the eigenvalues of which are roots of either orthogonal or kernel polynomials. The resulting algorithm generalizes the Stieltjes procedure. It may not be defined in the case of kernel polynomials, a consequence of the fact that they are orthogonal with respect to a nonpositive bilinear form. (Another consequence is that kernel polynomials need not be of exact degree.) A second algorithm that is always defined is presented for kernel polynomials. Numerical examples are described.
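
    For the real, symmetric case mentioned above, a minimal sketch of the Jacobi-matrix idea: the roots of the degree-n Legendre polynomial are obtained as eigenvalues of the symmetric tridiagonal matrix built from its three-term recurrence. The complex Hessenberg generalization described in the report is not reproduced here.

```python
import numpy as np

def legendre_roots(n):
    """Roots of the degree-n Legendre polynomial as eigenvalues of its Jacobi matrix
    (symmetric tridiagonal matrix built from the three-term recurrence)."""
    k = np.arange(1, n)
    beta = k / np.sqrt(4.0 * k ** 2 - 1.0)           # off-diagonal recurrence coefficients
    J = np.diag(beta, 1) + np.diag(beta, -1)         # diagonal is zero for Legendre
    return np.sort(np.linalg.eigvalsh(J))

print(np.round(legendre_roots(5), 6))
print(np.round(np.sort(np.polynomial.legendre.legroots([0, 0, 0, 0, 0, 1])), 6))  # cross-check
```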

  12. OSKI: A Library of Automatically Tuned Sparse Matrix Kernels

    SciTech Connect

    Vuduc, R; Demmel, J W; Yelick, K A

    2005-07-19

    The Optimized Sparse Kernel Interface (OSKI) is a collection of low-level primitives that provide automatically tuned computational kernels on sparse matrices, for use by solver libraries and applications. These kernels include sparse matrix-vector multiply and sparse triangular solve, among others. The primary aim of this interface is to hide the complex decision-making process needed to tune the performance of a kernel implementation for a particular user's sparse matrix and machine, while also exposing the steps and potentially non-trivial costs of tuning at run-time. This paper provides an overview of OSKI, which is based on our research on automatically tuned sparse kernels for modern cache-based superscalar machines.
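
    For readers unfamiliar with the two kernels named above, the snippet below shows plain (untuned) sparse matrix-vector multiplication and a sparse triangular solve using SciPy's CSR format; it only illustrates the operations OSKI tunes and does not use the OSKI library itself. The matrix size and density are arbitrary.

```python
import numpy as np
from scipy.sparse import random as sparse_random, tril, eye
from scipy.sparse.linalg import spsolve_triangular

rng = np.random.default_rng(5)
A = sparse_random(1000, 1000, density=0.01, format="csr", random_state=5)
x = rng.standard_normal(1000)

y = A @ x                                              # sparse matrix-vector multiply (SpMV)
L = (tril(A, format="csr") + eye(1000, format="csr")).tocsr()  # nonsingular lower-triangular factor
z = spsolve_triangular(L, y, lower=True)               # sparse triangular solve
print(y[:3], z[:3])
```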

  13. Direct Measurement of Wave Kernels in Time-Distance Helioseismology

    NASA Technical Reports Server (NTRS)

    Duvall, T. L., Jr.

    2006-01-01

    Solar f-mode waves are surface-gravity waves which propagate horizontally in a thin layer near the photosphere with a dispersion relation approximately that of deep water waves. At the power maximum near 3 mHz, the wavelength of 5 Mm is large enough for various wave scattering properties to be observable. Gizon and Birch (2002, ApJ, 571, 966) have calculated kernels, in the Born approximation, for the sensitivity of wave travel times to local changes in damping rate and source strength. In this work, using isolated small magnetic features as approximate point-source scatterers, such a kernel has been measured. The observed kernel contains similar features to a theoretical damping kernel but not for a source kernel. A full understanding of the effect of small magnetic features on the waves will require more detailed modeling.

  14. Anatomically-aided PET reconstruction using the kernel method

    NASA Astrophysics Data System (ADS)

    Hutchcroft, Will; Wang, Guobao; Chen, Kevin T.; Catana, Ciprian; Qi, Jinyi

    2016-09-01

    This paper extends the kernel method that was proposed previously for dynamic PET reconstruction, to incorporate anatomical side information into the PET reconstruction model. In contrast to existing methods that incorporate anatomical information using a penalized likelihood framework, the proposed method incorporates this information in the simpler maximum likelihood (ML) formulation and is amenable to ordered subsets. The new method also does not require any segmentation of the anatomical image to obtain edge information. We compare the kernel method with the Bowsher method for anatomically-aided PET image reconstruction through a simulated data set. Computer simulations demonstrate that the kernel method offers advantages over the Bowsher method in region of interest quantification. Additionally the kernel method is applied to a 3D patient data set. The kernel method results in reduced noise at a matched contrast level compared with the conventional ML expectation maximization algorithm.
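
    A toy 1-D sketch of the kernelized image model described above, under stated assumptions: the image is written as x = K·alpha with K an RBF kernel built from a hypothetical anatomical feature, and the ordinary MLEM multiplicative update is applied to alpha. The system matrix, counts, and kernel width are all invented, and no ordered subsets are used.

```python
import numpy as np

def kernel_mlem(y, A, K, iters=50):
    """Kernelized ML-EM: the image is parameterized as x = K @ alpha and the
    standard MLEM update is applied to the coefficient vector alpha."""
    alpha = np.ones(K.shape[1])
    sens = K.T @ (A.T @ np.ones(A.shape[0])) + 1e-12    # sensitivity term K^T A^T 1
    for _ in range(iters):
        proj = A @ (K @ alpha) + 1e-12                  # forward projection of K @ alpha
        alpha *= (K.T @ (A.T @ (y / proj))) / sens      # multiplicative EM update
    return K @ alpha                                    # reconstructed image

# Tiny hypothetical problem: 40-pixel image, 60 detector bins, RBF kernel built from a
# noise-free "anatomical" feature per pixel.
rng = np.random.default_rng(6)
n_pix, n_det = 40, 60
A = rng.random((n_det, n_pix)) * (rng.random((n_det, n_pix)) < 0.2)
feat = np.linspace(0, 1, n_pix) + 0.2 * (np.arange(n_pix) > 25)   # stand-in anatomical prior
K = np.exp(-(feat[:, None] - feat[None, :]) ** 2 / (2 * 0.05 ** 2))
x_true = 1.0 + 4.0 * (np.arange(n_pix) > 25)
y = rng.poisson(A @ x_true).astype(float)
print(np.round(kernel_mlem(y, A, K)[:8], 2))
```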

  15. A novel extended kernel recursive least squares algorithm.

    PubMed

    Zhu, Pingping; Chen, Badong; Príncipe, José C

    2012-08-01

    In this paper, a novel extended kernel recursive least squares algorithm is proposed combining the kernel recursive least squares algorithm and the Kalman filter or its extensions to estimate or predict signals. Unlike the extended kernel recursive least squares (Ex-KRLS) algorithm proposed by Liu, the state model of our algorithm is still constructed in the original state space and the hidden state is estimated using the Kalman filter. The measurement model used in hidden state estimation is learned by the kernel recursive least squares algorithm (KRLS) in reproducing kernel Hilbert space (RKHS). The novel algorithm has more flexible state and noise models. We apply this algorithm to vehicle tracking and the nonlinear Rayleigh fading channel tracking, and compare the tracking performances with other existing algorithms.

  16. Quantity discrimination in salamanders.

    PubMed

    Krusche, Paul; Uller, Claudia; Dicke, Ursula

    2010-06-01

    We investigated discrimination of large quantities in salamanders of the genus Plethodon. Animals were challenged with two different quantities (8 vs 12 or 8 vs 16) in a two-alternative choice task. Stimuli were live crickets, videos of live crickets or images animated by a computer program. Salamanders reliably chose the larger of two quantities when the ratio between the sets was 1:2 and stimuli were live crickets or videos thereof. Magnitude discrimination was not successful when the ratio was 2:3, or when the ratio was 1:2 when stimuli were computer animated. Analysis of the salamanders' success and failure as well as analysis of stimulus features points towards movement as a dominant feature for quantity discrimination. The results are generally consistent with large quantity discrimination investigated in many other animals (e.g. primates, fish), current models of quantity representation (analogue magnitudes) and data on sensory aspects of amphibian prey-catching behaviour (neuronal motion processing).

  17. Mass discrimination during weightlessness

    NASA Technical Reports Server (NTRS)

    Ross, H.

    1981-01-01

    An experiment concerned with the ability of astronauts to discriminate between the mass of objects when both the objects and the astronauts are in weightless states is described. The main object of the experiment is to compare the threshold for weight-discrimination on Earth with that for mass-discrimination in orbit. Tests will be conducted premission and postmission and early and late during the mission while the crew is experiencing weightlessness. A comparison of early and late tests inflight and postflight will reveal the rate of adaptation to zero-gravity and 1-g. The mass discrimination box holds 24 balls which the astronaut will compare to one another in a random routine.

  18. Fishers' knowledge as a source of information about the estuarine dolphin (Sotalia guianensis, van Bénéden, 1864).

    PubMed

    Manzan, Maíra Fontes; Lopes, Priscila F M

    2015-01-01

    Fishers' local ecological knowledge (LEK) is an additional tool to obtain information about cetaceans, regarding their local particularities, fishing interactions, and behavior. However, this knowledge could vary in depth of detail according to the level of interaction that fishers have with a specific species. This study investigated differences in small-scale fishers' LEK regarding the estuarine dolphin (Sotalia guianensis) in three Brazilian northeast coastal communities where fishing is practiced in estuarine lagoons and/or coastal waters and where dolphin-watching tourism varies from incipient to important. The fishers (N = 116) were asked about general characteristics of S. guianensis and their interactions with this dolphin during fishing activities. Compared to lagoon fishers, coastal fishers showed greater knowledge about the species but had more negative interactions with the dolphin during fishing activities. Coastal fishing not only offered the opportunity for fishers to observe a wider variety of the dolphin's behavior, but also implied direct contact with the dolphins, as they are bycaught in coastal gillnets. Besides complementing information that could be used for the management of cetaceans, this study shows that the type of environment most used by fishers also affects the accuracy of the information they provide. When designing studies to gather information on species and/or populations with the support of fishers, special consideration should be given to local particularities such as gear and habitats used within the fishing community.

  19. Mean-Field Dynamics and Fisher Information in Matter Wave Interferometry

    NASA Astrophysics Data System (ADS)

    Haine, Simon A.

    2016-06-01

    There has been considerable recent interest in the mean-field dynamics of various atom-interferometry schemes designed for precision sensing. In the field of quantum metrology, the standard tools for evaluating metrological sensitivity are the classical and quantum Fisher information. In this Letter, we show how these tools can be adapted to evaluate the sensitivity when the behavior is dominated by mean-field dynamics. As an example, we compare the behavior of four recent theoretical proposals for gyroscopes based on matter-wave interference in toroidally trapped geometries. We show that while the quantum Fisher information increases at different rates for the various schemes considered, in all cases it is consistent with the well-known Sagnac phase shift after the matter waves have traversed a closed path. However, we argue that the relevant metric for quantifying interferometric sensitivity is the classical Fisher information, which can vary considerably between the schemes.

  20. Fisher information of a squeezed-state interferometer with a finite photon-number resolution

    NASA Astrophysics Data System (ADS)

    Liu, P.; Wang, P.; Yang, W.; Jin, G. R.; Sun, C. P.

    2017-02-01

    Squeezed-state interferometry plays an important role in quantum-enhanced optical phase estimation, as it allows the estimation precision to be improved up to the Heisenberg limit by using ideal photon-number-resolving detectors at the output ports. Here we show that for each individual N-photon component of the phase-matched coherent ⊗ squeezed vacuum input state, the classical Fisher information always saturates the quantum Fisher information. Moreover, the total Fisher information is the sum of the contributions from each individual N-photon component, where the largest N is limited by the finite number resolution of available photon counters. Based on this observation, we provide an approximate analytical formula that quantifies the amount of lost information due to the finite photon-number resolution; e.g., given the mean photon number n̄ in the input state, over 96% of the Heisenberg limit can be achieved with a number resolution larger than 5n̄.
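
    For reference, the classical Fisher information quoted in this abstract is that of the photon-counting distribution P(n|φ) at the output, together with the Cramér-Rao bound on the phase uncertainty (standard definitions, not specific to this paper):
    \[
    F(\phi) = \sum_{n} \frac{1}{P(n\mid\phi)} \left( \frac{\partial P(n\mid\phi)}{\partial \phi} \right)^{2}, \qquad \Delta\phi \ge \frac{1}{\sqrt{\nu\, F(\phi)}},
    \]
    with ν the number of independent repetitions; the quantum Fisher information is the maximum of F(φ) over all possible measurements.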

  1. Low Titers of Canine Distemper Virus Antibody in Wild Fishers (Martes pennanti) in the Eastern USA.

    PubMed

    Peper, Steven T; Peper, Randall L; Mitcheltree, Denise H; Kollias, George V; Brooks, Robert P; Stevens, Sadie S; Serfass, Thomas L

    2016-01-01

    Canine distemper virus (CDV) infects species in the order Carnivora. Members of the family Mustelidae are among the species most susceptible to CDV and have a high mortality rate after infection. Assessing an animal's pathogen or disease load prior to any reintroduction project is important to help protect the animal being reintroduced, as well as the wildlife and livestock in the area of relocation. We screened 58 fishers for CDV antibody prior to their release into Pennsylvania, US, as part of a reintroduction program. Five of the 58 (9%) fishers had a weak-positive reaction for CDV antibody at a dilution of 1:16. None of the fishers exhibited any clinical sign of canine distemper while being held prior to release.

  2. Angular velocity discrimination

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.

    1990-01-01

    Three experiments designed to investigate the ability of naive observers to discriminate rotational velocities of two simultaneously viewed objects are described. Rotations are constrained to occur about the x and y axes, resulting in linear two-dimensional image trajectories. The results indicate that observers can discriminate angular velocities with a competence near that for linear velocities. However, perceived angular rate is influenced by structural aspects of the stimuli.

  3. Image quality of mixed convolution kernel in thoracic computed tomography.

    PubMed

    Neubauer, Jakob; Spira, Eva Maria; Strube, Juliane; Langer, Mathias; Voss, Christian; Kotter, Elmar

    2016-11-01

    The mixed convolution kernel alters its properties geographically according to the depicted organ structure, especially for the lung. Therefore, we compared the image quality of the mixed convolution kernel to standard soft and hard kernel reconstructions for different organ structures in thoracic computed tomography (CT) images. Our Ethics Committee approved this prospective study. In total, 31 patients who underwent contrast-enhanced thoracic CT studies were included after informed consent. Axial reconstructions were performed with hard, soft, and mixed convolution kernels. Three independent and blinded observers rated the image quality according to the European Guidelines for Quality Criteria of Thoracic CT for 13 organ structures. The observers rated the depiction of the structures in all reconstructions on a 5-point Likert scale. Statistical analysis was performed with the Friedman test and post hoc analysis with the Wilcoxon rank-sum test. Compared to the soft convolution kernel, the mixed convolution kernel was rated with a higher image quality for lung parenchyma, segmental bronchi, and the border between the pleura and the thoracic wall (P < 0.03). Compared to the hard convolution kernel, the mixed convolution kernel was rated with a higher image quality for the aorta, anterior mediastinal structures, paratracheal soft tissue, hilar lymph nodes, esophagus, pleuromediastinal border, large and medium sized pulmonary vessels and the abdomen (P < 0.004), but a lower image quality for the trachea, segmental bronchi, lung parenchyma, and skeleton (P < 0.001). The mixed convolution kernel cannot fully substitute for the standard CT reconstructions. Hard and soft convolution kernel reconstructions still seem to be mandatory for thoracic CT.

  4. Genomic Prediction of Genotype × Environment Interaction Kernel Regression Models.

    PubMed

    Cuevas, Jaime; Crossa, José; Soberanis, Víctor; Pérez-Elizalde, Sergio; Pérez-Rodríguez, Paulino; Campos, Gustavo de Los; Montesinos-López, O A; Burgueño, Juan

    2016-11-01

    In genomic selection (GS), genotype × environment interaction (G × E) can be modeled by a marker × environment interaction (M × E). The G × E may be modeled through a linear kernel or a nonlinear (Gaussian) kernel. In this study, we propose using two nonlinear Gaussian kernels: the reproducing kernel Hilbert space with kernel averaging (RKHS KA) and the Gaussian kernel with the bandwidth estimated through an empirical Bayesian method (RKHS EB). We performed single-environment analyses and extended them to account for G × E interaction (GBLUP-G × E, RKHS KA-G × E and RKHS EB-G × E) in wheat and maize data sets. For single-environment analyses of the wheat and maize data sets, RKHS EB and RKHS KA had higher prediction accuracy than GBLUP for all environments. For the wheat data, the RKHS KA-G × E and RKHS EB-G × E models showed up to 60 to 68% superiority over the corresponding single-environment models for pairs of environments with positive correlations. For the wheat data set, the models with Gaussian kernels had accuracies up to 17% higher than that of GBLUP-G × E. For the maize data set, the prediction accuracy of RKHS EB-G × E and RKHS KA-G × E was, on average, 5 to 6% higher than that of GBLUP-G × E. The superiority of the Gaussian kernel models over the linear kernel is due to more flexible kernels that account for small, more complex marker main effects and marker-specific interaction effects.
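
    As a rough illustration of the kernel construction behind such models, the sketch below builds a Gaussian kernel from a marker matrix and fits a simple kernel ridge regression as a stand-in for the paper's Bayesian RKHS machinery; kernel averaging, the empirical Bayes bandwidth, and the G × E extension are not reproduced, and all names and values are illustrative.

```python
import numpy as np

def gaussian_kernel(X, h):
    """Gaussian kernel K[i, j] = exp(-h * ||x_i - x_j||^2) from a marker matrix X (n x p)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-h * np.maximum(d2, 0.0))

def kernel_ridge_fit_predict(K, y, train, test, lam=1.0):
    """Solve (K_tt + lam*I) alpha = y_train on the training block, predict on the test block."""
    K_tt = K[np.ix_(train, train)]
    alpha = np.linalg.solve(K_tt + lam * np.eye(len(train)), y[train])
    return K[np.ix_(test, train)] @ alpha

# toy data: 200 lines genotyped at 500 markers, phenotype driven by the first 10 markers
rng = np.random.default_rng(0)
X = rng.choice([0.0, 1.0, 2.0], size=(200, 500))
y = X[:, :10].sum(axis=1) + rng.normal(scale=1.0, size=200)

K = gaussian_kernel((X - X.mean(axis=0)) / X.std(axis=0), h=1.0 / X.shape[1])
idx = rng.permutation(200)
train, test = idx[:150], idx[150:]
y_hat = kernel_ridge_fit_predict(K, y, train, test)
print("predictive correlation:", np.corrcoef(y_hat, y[test])[0, 1])
```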

  5. On the Karlin-Kimura approaches to the Wright-Fisher diffusion with fluctuating selection

    NASA Astrophysics Data System (ADS)

    Huillet, Thierry

    2011-02-01

    The goal of this work is a comparative study of two Wright-Fisher-like diffusion processes on the interval, one due to Karlin and the other one due to Kimura. Each model accounts for the evolution of one two-locus colony undergoing random mating, under the additional action of selection in a random environment. In other words, we study the effect of disorder on the usual Wright-Fisher model with fixed (nonrandom) selection. There is a drastic qualitative difference between the two models and between the random and nonrandom selection hypotheses. We first present a series of elementary stochastic models and tools that are needed to conduct this study in the context of diffusion process theory, including Kolmogorov backward and forward equations, scale and speed functions, classification of boundaries, and Doob transformation of sample paths using additive functionals. In this spirit, we briefly revisit the neutral Wright-Fisher diffusion and the Wright-Fisher diffusion with nonrandom selection. With these tools at hand, we first deal with the Karlin approach to the Wright-Fisher diffusion model with randomized selection differentials. The specificity of this model is that in the large population case, the boundaries of the state space are natural and hence inaccessible, and so quasi-absorbing only. We supply some limiting properties pertaining to times of hitting of points close to the boundaries. Next, we study the Kimura approach to the Wright-Fisher model with randomized selection, which may be viewed as a modification of the Karlin model, using an appropriate Doob transform which we describe. This model also has natural boundaries, but they turn out to be much more attracting and sticky than in Karlin's version. This leads to a faster approach to the quasi-absorbing states, to a larger time needed to move from the vicinity of one boundary to the other and to a local critical behavior of the branching diffusion obtained after the relevant Doob transformation.
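
    For readers who want the baseline object being perturbed: with nonrandom selection the Wright-Fisher diffusion on [0,1] has, in one common time scaling, generator
    \[
    G f(x) = \tfrac{1}{2}\, x(1-x)\, f''(x) + s\, x(1-x)\, f'(x), \qquad x \in [0,1],
    \]
    and the Karlin and Kimura models discussed here replace the constant selection differential s by a fluctuating, environment-dependent quantity, which is what alters the boundary classification and hitting behaviour (the exact scaling conventions differ between the two approaches and are spelled out in the paper).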

  6. Behaviours and attitudes of recreational fishers toward safety at a 'blackspot'.

    PubMed

    Jasper, Randall; Stewart, Barbara A; Knight, Andrew

    2017-01-27

    Issue addressed: Recreational fishing, particularly rock fishing, can be dangerous; 30 fatalities were recorded in Western Australia from 2002-2014. This study investigates differences in behaviours and attitudes towards safety among fishers at a fishing fatality 'black spot' in Australia. Methods: A total of 236 fishers were surveyed at Salmon Holes, Western Australia in 2015. Fishers were grouped by country of origin and significant differences among groups for behaviours and attitudes towards personal safety were identified. Results: Of fishers surveyed, 53% were born in Asia. These fishers self-assessed as poorer swimmers (F=23.27, P<0.001), yet were more likely to have fished from rocks (χ2=20.94, P<0.001). They were less likely to go close to the water to get a snagged line (χ2=15.44, P<0.001) or to drink alcohol while fishing (χ2=8.63, P<0.001), and were more likely to agree that they would drown if swept into the sea (χ2=9.49, P<0.001). Although most respondents agreed that wearing a life jacket made fishing safer, 78% 'never' wore a life jacket while fishing. Conclusions: Some fishers who were poor swimmers and were aware of the dangers of rock fishing still chose to fish from rocks. So what?: Our results support the proposal that the wearing of life jackets should be promoted, if not made mandatory, while water safety education campaigns should be continued and should target vulnerable communities.

  7. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    PubMed

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process

  8. A visualization tool for the kernel-driven model with improved ability in data analysis and kernel assessment

    NASA Astrophysics Data System (ADS)

    Dong, Yadong; Jiao, Ziti; Zhang, Hu; Bai, Dongni; Zhang, Xiaoning; Li, Yang; He, Dandan

    2016-10-01

    The semi-empirical, kernel-driven Bidirectional Reflectance Distribution Function (BRDF) model has been widely used for many aspects of remote sensing. With the development of the kernel-driven model, there is a need to further assess the performance of newly developed kernels. The use of visualization tools can facilitate the analysis of model results and the assessment of newly developed kernels. However, the current version of the kernel-driven model does not contain a visualization function. In this study, a user-friendly visualization tool, named MaKeMAT, was developed specifically for the kernel-driven model. The POLDER-3 and CAR BRDF datasets were used to demonstrate the applicability of MaKeMAT. The visualization of inputted multi-angle measurements enhances understanding of multi-angle measurements and allows the choice of measurements with good representativeness. The visualization of modeling results facilitates the assessment of newly developed kernels. The study shows that the visualization tool MaKeMAT can promote the widespread application of the kernel-driven model.

  9. Discriminating dysplasia: Optical tomographic texture analysis of colorectal polyps.

    PubMed

    Li, Wenqi; Coats, Maria; Zhang, Jianguo; McKenna, Stephen J

    2015-12-01

    Optical projection tomography enables 3-D imaging of colorectal polyps at resolutions of 5-10 µm. This paper investigates the ability of image analysis based on 3-D texture features to discriminate diagnostic levels of dysplastic change from such images, specifically, low-grade dysplasia, high-grade dysplasia and invasive cancer. We build a patch-based recognition system and evaluate both multi-class classification and ordinal regression formulations on a 90 polyp dataset. 3-D texture representations computed with a hand-crafted feature extractor, random projection, and unsupervised image filter learning are compared using a bag-of-words framework. We measure performance in terms of error rates, F-measures, and ROC surfaces. Results demonstrate that randomly projected features are effective. Discrimination was improved by carefully manipulating various important aspects of the system, including class balancing, output calibration and approximation of non-linear kernels.

  10. Discriminant power analyses of non-linear dimension expansion methods

    NASA Astrophysics Data System (ADS)

    Woo, Seongyoun; Lee, Chulhee

    2016-05-01

    Most non-linear classification methods can be viewed as non-linear dimension expansion methods followed by a linear classifier. For example, the support vector machine (SVM) expands the dimensions of the original data using various kernels and classifies the data in the expanded data space using a linear SVM. In the case of extreme learning machines or neural networks, the dimensions are expanded by hidden neurons and the final layer represents the linear classification. In this paper, we analyze the discriminant powers of various non-linear classifiers. Some analyses of the discriminating powers of non-linear dimension expansion methods are presented along with a suggestion of how to improve separability in non-linear classifiers.
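
    The "non-linear expansion followed by a linear classifier" viewpoint can be made concrete with a small scikit-learn sketch, using a random Fourier feature expansion as the explicit dimension expansion; this is only an illustration of the viewpoint, not the analysis carried out in the paper, and the dataset and parameters are arbitrary.

```python
from sklearn.datasets import make_moons
from sklearn.kernel_approximation import RBFSampler
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC, LinearSVC

X, y = make_moons(n_samples=1000, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# explicit non-linear dimension expansion (random Fourier features) + linear classifier
expanded_linear = make_pipeline(
    RBFSampler(gamma=1.0, n_components=300, random_state=0),
    LinearSVC(max_iter=5000),
)
expanded_linear.fit(X_tr, y_tr)

# the same idea done implicitly: an RBF-kernel SVM
kernel_svm = SVC(kernel="rbf", gamma=1.0).fit(X_tr, y_tr)

print("expansion + linear classifier:", expanded_linear.score(X_te, y_te))
print("RBF-kernel SVM:               ", kernel_svm.score(X_te, y_te))
```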

  11. A Fisher-gradient complexity in systems with spatio-temporal dynamics

    NASA Astrophysics Data System (ADS)

    Arbona, A.; Bona, C.; Massó, J.; Miñano, B.; Plastino, A.

    2016-04-01

    We define a benchmark for definitions of complexity in systems with spatio-temporal dynamics and employ it in the study of Collective Motion. We show that LMC's complexity displays interesting properties in such systems, while a statistical complexity model (SCM) based on autocorrelation reasonably meets our perception of complexity. However this SCM is not as general as desirable, as it does not merely depend on the system's Probability Distribution Function. Inspired by the notion of Fisher information, we develop a SCM candidate, which we call the Fisher-gradient complexity, which exhibits nice properties from the viewpoint of our benchmark.

  12. Numerical method based on the lattice Boltzmann model for the Fisher equation.

    PubMed

    Yan, Guangwu; Zhang, Jianying; Dong, Yinfeng

    2008-06-01

    In this paper, a lattice Boltzmann model for the Fisher equation is proposed. First, the Chapman-Enskog expansion and the multiscale time expansion are used to describe higher-order moment of equilibrium distribution functions and a series of partial differential equations in different time scales. Second, the modified partial differential equation of the Fisher equation with the higher-order truncation error is obtained. Third, comparison between numerical results of the lattice Boltzmann models and exact solution is given. The numerical results agree well with the classical ones.
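
    For completeness, the Fisher (Fisher-KPP) equation targeted by the lattice Boltzmann scheme is the reaction-diffusion equation
    \[
    \frac{\partial u}{\partial t} = D\, \frac{\partial^{2} u}{\partial x^{2}} + r\, u\,(1-u),
    \]
    which the model must recover in the macroscopic limit through the Chapman-Enskog and multiscale expansions described above (D and r denote generic diffusion and growth coefficients, not necessarily the paper's notation).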

  13. The 2MASS Tully-Fisher Survey: Mapping the mass in the Universe

    NASA Astrophysics Data System (ADS)

    Hong, T.; Staveley-Smith, L.; Masters, K.; Springob, C.; Macri, L.; Koribalski, B.; Jones, H.; Jarrett, T.

    2013-02-01

    The 2MASS Tully-Fisher Survey (2MTF) aims to measure Tully-Fisher (TF) distances for all bright, inclined spirals in the 2MASS Redshift Survey (2MRS) using high-quality HI widths and 2MASS photometry. Compared with previous peculiar-velocity surveys, the 2MTF survey provides more accurate width measurements and more uniform sky coverage, combining observations with the Green Bank, Arecibo, and Parkes telescopes. With this new redshift-independent distance database, we will significantly improve our understanding of the mass distribution in the local Universe.

  14. Cosmology with the largest galaxy cluster surveys: going beyond Fisher matrix forecasts

    SciTech Connect

    Khedekar, Satej; Majumdar, Subhabrata E-mail: subha@tifr.res.in

    2013-02-01

    We make the first detailed MCMC likelihood study of cosmological constraints that are expected from some of the largest, ongoing and proposed, cluster surveys in different wave-bands and compare the estimates to the prevalent Fisher matrix forecasts. Mock catalogs of cluster counts expected from the surveys — eROSITA, WFXT, RCS2, DES and Planck — along with a mock dataset of follow-up mass calibrations, are analyzed for this purpose. A fair agreement between MCMC and Fisher results is found only in the case of minimal models. However, for many cases, the marginalized constraints obtained from Fisher and MCMC methods can differ by factors of 30-100%. The discrepancy can be alarmingly large for a time-dependent dark energy equation of state, w(a); the Fisher methods are seen to under-estimate the constraints by as much as a factor of 4-5. Typically, Fisher estimates become more and more inappropriate as we move away from ΛCDM, to a constant-w dark energy, to varying-w dark energy cosmologies. Fisher analysis also predicts incorrect parameter degeneracies. There are noticeable offsets in the likelihood contours obtained from Fisher methods, caused by an asymmetry in the posterior likelihood distribution as seen through an MCMC analysis. From the point of view of mass-calibration uncertainties, a high value of unknown scatter about the mean mass-observable relation, and its redshift dependence, is seen to have large degeneracies with the cosmological parameters σ8 and w(a) and can degrade the cosmological constraints considerably. We find that the addition of mass-calibrated cluster datasets can improve dark energy and σ8 constraints by factors of 2-3 from what can be obtained from CMB+SNe+BAO only. Finally, we show that a joint analysis of datasets of two (or more) different cluster surveys would significantly tighten cosmological constraints from using clusters only. Since details of future cluster surveys are still being planned, we emphasize
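
    For context, a Fisher matrix forecast approximates the likelihood as Gaussian around a fiducial model; the Fisher matrix and the resulting parameter bound are (standard definitions)
    \[
    F_{ij} = -\left\langle \frac{\partial^{2} \ln \mathcal{L}}{\partial \theta_{i}\, \partial \theta_{j}} \right\rangle, \qquad \sigma(\theta_{i}) \ge \sqrt{\left(F^{-1}\right)_{ii}},
    \]
    so whenever the true posterior is strongly non-Gaussian or asymmetric, as the MCMC analysis here finds for evolving dark energy, the forecast contours can be badly misplaced.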

  15. Recurrent miller fisher syndrome with abnormal terminal axon dysfunction: a case report.

    PubMed

    Tomcík, Jan; Dufek, Michal; Hromada, Jan; Rektor, Ivan; Bares, Martin

    2007-12-01

    Miller Fisher syndrome (MFS) is a localized variant of Guillain-Barré syndrome (GBS), characterized by ophthalmoplegia, areflexia, and ataxia. Recent neurophysiological studies have suggested that abnormal terminal axon dysfunction occurs in some cases of Miller Fisher syndrome and Guillain-Barré syndrome. We present a rare case report of recurrent MFS with abnormal terminal axon dysfunction. To the best of our knowledge, this is the first case report of recurrent MFS with terminal axon dysfunction that persisted up to nine months after the initial presentation of the second attack with positive antiganglioside antibodies and full clinical recovery.

  16. False-Positive Serum Botulism Bioassay in Miller-Fisher Syndrome.

    PubMed

    Zeylikman, Yuriy; Shah, Vishal; Shah, Umang; Mirsen, Thomas R; Campellone, Joseph V

    2015-09-01

    We describe a patient with acute progressive weakness and areflexia. Both botulism and Miller-Fisher variant of Guillain-Barré syndrome were initial diagnostic considerations, and she was treated with intravenous immunoglobulin and botulinum antitoxin. A mouse bioassay was positive for botulinum toxin A, although her clinical course, electrodiagnostic studies, and cerebrospinal fluid findings supported Miller-Fisher syndrome. This patient's atypical features offer points of discussion regarding the evaluation of patients with acute neuromuscular weakness and emphasize the limitations of the botulism bioassay.

  17. Combining features from ERP components in single-trial EEG for discriminating four-category visual objects

    NASA Astrophysics Data System (ADS)

    Wang, Changming; Xiong, Shi; Hu, Xiaoping; Yao, Li; Zhang, Jiacai

    2012-10-01

    The category of an image containing a visual object can be successfully recognized using single-trial electroencephalogram (EEG) data measured while subjects view the image. Previous studies have shown that task-related information contained in event-related potential (ERP) components could discriminate two or three categories of object images. In this study, we investigated whether four categories of objects (human faces, buildings, cats and cars) could be mutually discriminated using single-trial EEG data. Here, the EEG waveforms acquired while subjects were viewing the four categories of object images were segmented into several ERP components (P1, N1, P2a and P2b), and then Fisher linear discriminant analysis (Fisher-LDA) was used to classify EEG features extracted from the ERP components. Firstly, we compared the classification results using features from single ERP components, and identified that the N1 component achieved the highest classification accuracies. Secondly, we discriminated the four categories of objects using combined features from multiple ERP components, and showed that combining ERP components improved four-category classification accuracies by utilizing the complementarity of the discriminative information in the ERP components. These findings confirm that four categories of object images can be discriminated with single-trial EEG and can guide the selection of effective EEG features for classifying visual objects.
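
    A minimal sketch of the "ERP-window features + Fisher LDA" pipeline is given below; the window boundaries, array shapes and synthetic data are purely illustrative and not the study's actual recording parameters.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples, fs = 400, 32, 300, 250  # synthetic single-trial EEG
eeg = rng.normal(size=(n_trials, n_channels, n_samples))
labels = rng.integers(0, 4, size=n_trials)                # four object categories

# illustrative ERP component windows (seconds after stimulus onset)
windows = {"P1": (0.08, 0.13), "N1": (0.13, 0.20), "P2a": (0.20, 0.26), "P2b": (0.26, 0.32)}

def erp_features(eeg, windows, fs):
    """Mean amplitude per channel within each ERP window, concatenated per trial."""
    feats = []
    for lo, hi in windows.values():
        a, b = int(lo * fs), int(hi * fs)
        feats.append(eeg[:, :, a:b].mean(axis=2))   # (n_trials, n_channels)
    return np.concatenate(feats, axis=1)

X = erp_features(eeg, windows, fs)
clf = LinearDiscriminantAnalysis()                   # Fisher-LDA classifier
print(cross_val_score(clf, X, labels, cv=5).mean())  # ~0.25 (chance) on pure-noise data
```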

  18. On the Kernelization Complexity of Colorful Motifs

    NASA Astrophysics Data System (ADS)

    Ambalath, Abhimanyu M.; Balasundaram, Radheshyam; Rao H., Chintan; Koppula, Venkata; Misra, Neeldhara; Philip, Geevarghese; Ramanujan, M. S.

    The Colorful Motif problem asks if, given a vertex-colored graph G, there exists a subset S of vertices of G such that the graph induced by G on S is connected and contains every color in the graph exactly once. The problem is motivated by applications in computational biology and is also well-studied from the theoretical point of view. In particular, it is known to be NP-complete even on trees of maximum degree three [Fellows et al, ICALP 2007]. In their pioneering paper that introduced the color-coding technique, Alon et al. [STOC 1995] show, inter alia, that the problem is FPT on general graphs. More recently, Cygan et al. [WG 2010] showed that Colorful Motif is NP-complete on comb graphs, a special subclass of the set of trees of maximum degree three. They also showed that the problem is not likely to admit polynomial kernels on forests.

  19. Kernel density estimation using graphical processing unit

    NASA Astrophysics Data System (ADS)

    Sunarko, Su'ud, Zaki

    2015-09-01

    Kernel density estimation for particles distributed over a 2-dimensional space is calculated using a single graphical processing unit (GTX 660Ti GPU) and the CUDA-C language. Parallel calculations are done for particles having a bivariate normal distribution and by assigning calculations for equally-spaced node points to each scalar processor in the GPU. The number of particles, blocks and threads are varied to identify a favorable configuration. Comparisons are obtained by performing the same calculation using 1, 2 and 4 processors on a 3.0 GHz CPU using MPICH 2.0 routines. Speedups attained with the GPU are in the range of 88 to 349 times compared with the multiprocessor CPU. Blocks of 128 threads are found to be the optimum configuration for this case.
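
    The computation being parallelized is the evaluation of a bivariate Gaussian kernel density estimate at a grid of node points; since every node is independent, mapping one node to one scalar processor is natural. A plain NumPy version of the same calculation (illustrative sizes, not the paper's configuration) looks like this:

```python
import numpy as np

def kde_grid(particles, nodes, bandwidth):
    """Bivariate Gaussian kernel density estimate evaluated at grid nodes.

    particles: (N, 2) sample positions; nodes: (M, 2) evaluation points.
    Each node is independent, which is what makes the one-node-per-processor
    GPU mapping described in the paper natural.
    """
    d = nodes[:, None, :] - particles[None, :, :]          # (M, N, 2)
    q = np.sum(d ** 2, axis=2) / (2.0 * bandwidth ** 2)    # scaled squared distances
    norm = 2.0 * np.pi * bandwidth ** 2 * particles.shape[0]
    return np.exp(-q).sum(axis=1) / norm                   # (M,)

rng = np.random.default_rng(1)
particles = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 1.0]], size=2000)
xs = np.linspace(-4, 4, 64)
nodes = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)
density = kde_grid(particles, nodes, bandwidth=0.3)
print(density.reshape(64, 64).max())
```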

  20. Privacy preserving RBF kernel support vector machine.

    PubMed

    Li, Haoran; Xiong, Li; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2014-01-01

    Data sharing is challenging but important for healthcare research. Methods for privacy-preserving data dissemination based on the rigorous differential privacy standard have been developed but they did not consider the characteristics of biomedical data and make full use of the available information. This often results in too much noise in the final outputs. We hypothesized that this situation can be alleviated by leveraging a small portion of open-consented data to improve utility without sacrificing privacy. We developed a hybrid privacy-preserving differentially private support vector machine (SVM) model that uses public data and private data together. Our model leverages the RBF kernel and can handle nonlinearly separable cases. Experiments showed that this approach outperforms two baselines: (1) SVMs that only use public data, and (2) differentially private SVMs that are built from private data. Our method demonstrated very close performance metrics compared to nonprivate SVMs trained on the private data.

  1. Learning molecular energies using localized graph kernels

    NASA Astrophysics Data System (ADS)

    Ferré, Grégoire; Haut, Terry; Barros, Kipton

    2017-03-01

    Recent machine learning methods make it possible to model potential energy of atomic configurations with chemical-level accuracy (as calculated from ab initio calculations) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although simple to state, these symmetries are complicated to encode into machine learning algorithms. In this paper, we present a machine learning approach based on graph theory that naturally incorporates translation, rotation, and permutation symmetries. Specifically, we use a random walk graph kernel to measure the similarity of two adjacency matrices, each of which represents a local atomic environment. This Graph Approximated Energy (GRAPE) approach is flexible and admits many possible extensions. We benchmark a simple version of GRAPE by predicting atomization energies on a standard dataset of organic molecules.
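
    GRAPE builds on the random walk graph kernel; a minimal version of that kernel for two small, unlabeled adjacency matrices is sketched below. GRAPE's atomic labels, weighting and locality are not reproduced here, and lam is an illustrative decay parameter that must be smaller than the inverse spectral radius of the product graph.

```python
import numpy as np

def random_walk_kernel(A1, A2, lam=0.05):
    """Geometric random walk kernel between two adjacency matrices.

    k(G1, G2) = sum_k lam^k * 1^T W^k 1, where W is the adjacency matrix of the
    direct product graph (the Kronecker product of A1 and A2). The geometric
    series is summed in closed form by solving (I - lam*W) x = 1.
    """
    W = np.kron(A1, A2)
    n = W.shape[0]
    x = np.linalg.solve(np.eye(n) - lam * W, np.ones(n))
    return float(np.ones(n) @ x)

# two toy "local environments": a triangle and a 4-cycle
A_tri = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
A_sq = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)
print(random_walk_kernel(A_tri, A_tri), random_walk_kernel(A_tri, A_sq))
```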

  2. On the characterization of vegetation recovery after fire disturbance using Fisher-Shannon analysis and SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series

    NASA Astrophysics Data System (ADS)

    Lasaponara, Rosa; Lanorte, Antonio; Lovallo, Michele; Telesca, Luciano

    2015-04-01

    Time series can fruitfully support fire monitoring and management, from statistical analysis of fire occurrence (Tuia et al. 2008) to danger estimation (Lasaponara 2005), damage evaluation (Lanorte et al. 2014) and post-fire recovery (Lanorte et al. 2014). In this paper, the time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed by using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites located in Galizia (North Spain) and the Peloponnese (South Greece), selected for the vast fires which occurred during the summers of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland in the Galizia test site and evergreen forest in the Peloponnese. Time series of MVC-NDVI have been analyzed before and after the occurrence of the fire events. Results obtained for both investigated areas clearly pointed out that the dynamics of the pixel time series before the occurrence of the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the occurrence of the fire are featured by a higher degree of organization and order. In particular, this discrimination is more evident for the Peloponnese fire than for the Galizia fire. This suggests a clear possibility to discriminate the different post-fire behaviors and dynamics exhibited by the different vegetation covers. Reference: Lanorte A., Lasaponara R., Lovallo M., Telesca L., 2014, Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to

  3. The flare kernel in the impulsive phase

    NASA Technical Reports Server (NTRS)

    Dejager, C.

    1986-01-01

    The impulsive phase of a flare is characterized by impulsive bursts of X-ray and microwave radiation, related to impulsive footpoint heating up to 50 or 60 MK, by upward gas velocities (150 to 400 km/sec) and by a gradual increase of the flare's thermal energy content. These phenomena, as well as non-thermal effects, are all related to the impulsive energy injection into the flare. The available observations are also quantitatively consistent with a model in which energy is injected into the flare by beams of energetic electrons, causing ablation of chromospheric gas, followed by convective rise of gas. Thus, a hole is burned into the chromosphere; at the end of the impulsive phase of an average flare the lower part of that hole is situated about 1800 km above the photosphere. H alpha and other optical and UV line emission is radiated by a thin layer (approx. 20 km) at the bottom of the flare kernel. The upward rising and outward streaming gas cools down by conduction in about 45 s. The non-thermal effects in the initial phase are due to curtailing of the energy distribution function by escape of energetic electrons. The single flux tube model of a flare does not fit with these observations; instead we propose the spaghetti-bundle model. Microwave and gamma-ray observations suggest the occurrence of dense flare knots of approx. 800 km diameter, and of high temperature. Future observations should concentrate on locating the microwave/gamma-ray sources, and on determining the kernel's fine structure and the related multi-loop structure of the flaring area.

  4. Topology-based kernels with application to inference problems in Alzheimer's disease.

    PubMed

    Pachauri, Deepti; Hinrichs, Chris; Chung, Moo K; Johnson, Sterling C; Singh, Vikas

    2011-10-01

    Alzheimer's disease (AD) research has recently witnessed a great deal of activity focused on developing new statistical learning tools for automated inference using imaging data. The workhorse for many of these techniques is the support vector machine (SVM) framework (or more generally kernel-based methods). Most of these require, as a first step, specification of a kernel matrix K between input examples (i.e., images). The inner product between images I(i) and I(j) in a feature space can generally be written in closed form and so it is convenient to treat K as "given." However, in certain neuroimaging applications such an assumption becomes problematic. As an example, it is rather challenging to provide a scalar measure of similarity between two instances of highly attributed data such as cortical thickness measures on cortical surfaces. Note that cortical thickness is known to be discriminative for neurological disorders, so leveraging such information in an inference framework, especially within a multi-modal method, is potentially advantageous. But despite being clinically meaningful, relatively few works have successfully exploited this measure for classification or regression. Motivated by these applications, our paper presents novel techniques to compute similarity matrices for such topologically-based attributed data. Our ideas leverage recent developments to characterize signals (e.g., cortical thickness) motivated by the persistence of their topological features, leading to a scheme for simple constructions of kernel matrices. As a proof of principle, on a dataset of 356 subjects from the Alzheimer's Disease Neuroimaging Initiative study, we report good performance on several statistical inference tasks without any feature selection, dimensionality reduction, or parameter tuning.

  5. Training Lp norm multiple kernel learning in the primal.

    PubMed

    Liang, Zhizheng; Xia, Shixiong; Zhou, Yong; Zhang, Lei

    2013-10-01

    Some multiple kernel learning (MKL) models are usually solved by utilizing the alternating optimization method, where one alternately solves SVMs in the dual and updates kernel weights. Since the dual and primal optimization can achieve the same aim, it is valuable to explore how to perform Lp norm MKL in the primal. In this paper, we propose an Lp norm multiple kernel learning algorithm in the primal, where we resort to the alternating optimization method: one cycle solves SVMs in the primal by using the preconditioned conjugate gradient method and the other cycle learns the kernel weights. It is interesting to note that the kernel weights in our method admit analytical solutions. Most importantly, the proposed method is well suited for the manifold regularization framework in the primal, since solving LapSVMs in the primal is much more effective than solving LapSVMs in the dual. In addition, we also carry out theoretical analysis for multiple kernel learning in the primal in terms of the empirical Rademacher complexity. It is found that optimizing the empirical Rademacher complexity may yield a particular type of kernel weights. Experiments on several datasets are carried out to demonstrate the feasibility and effectiveness of the proposed method.
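
    The analytical kernel weights mentioned here correspond, in the standard lp-norm MKL setting (minimize Σ_m ||w_m||²/d_m subject to ||d||_p ≤ 1, d ≥ 0), to the closed-form update
    \[
    d_{m} = \frac{\lVert w_{m} \rVert_{2}^{2/(p+1)}}{\left( \sum_{m'} \lVert w_{m'} \rVert_{2}^{2p/(p+1)} \right)^{1/p}},
    \]
    which is what makes the kernel-weight cycle of the alternating optimization cheap; the paper adapts this kind of update to its primal (and LapSVM) formulation, so the exact expression there may differ.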

  6. Gaussian kernel width optimization for sparse Bayesian learning.

    PubMed

    Mohsenzadeh, Yalda; Sheikhzadeh, Hamid

    2015-04-01

    Sparse kernel methods have been widely used in regression and classification applications. The performance and the sparsity of these methods are dependent on the appropriate choice of the corresponding kernel functions and their parameters. Typically, the kernel parameters are selected using a cross-validation approach. In this paper, a learning method that is an extension of the relevance vector machine (RVM) is presented. The proposed method can find the optimal values of the kernel parameters during the training procedure. This algorithm uses an expectation-maximization approach for updating kernel parameters as well as other model parameters; therefore, the speed of convergence and computational complexity of the proposed method are the same as the standard RVM. To control the convergence of this fully parameterized model, the optimization with respect to the kernel parameters is performed using a constraint on these parameters. The proposed method is compared with the typical RVM and other competing methods to analyze the performance. The experimental results on the commonly used synthetic data, as well as benchmark data sets, demonstrate the effectiveness of the proposed method in reducing the performance dependency on the initial choice of the kernel parameters.

  7. Relaxation and diffusion models with non-singular kernels

    NASA Astrophysics Data System (ADS)

    Sun, HongGuang; Hao, Xiaoxiao; Zhang, Yong; Baleanu, Dumitru

    2017-02-01

    Anomalous relaxation and diffusion processes have been widely quantified by fractional derivative models, where the definition of the fractional-order derivative remains a historical debate due to its limitation in describing different kinds of non-exponential decays (e.g. stretched exponential decay). Meanwhile, many efforts by mathematicians and engineers have been made to overcome the singularity of power function kernel in its definition. This study first explores physical properties of relaxation and diffusion models where the temporal derivative was defined recently using an exponential kernel. Analytical analysis shows that the Caputo type derivative model with an exponential kernel cannot characterize non-exponential dynamics well-documented in anomalous relaxation and diffusion. A legitimate extension of the previous derivative is then proposed by replacing the exponential kernel with a stretched exponential kernel. Numerical tests show that the Caputo type derivative model with the stretched exponential kernel can describe a much wider range of anomalous diffusion than the exponential kernel, implying the potential applicability of the new derivative in quantifying real-world, anomalous relaxation and diffusion processes.
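
    The exponential-kernel derivative discussed here is usually written in the Caputo-Fabrizio form, and the paper's extension replaces the exponential kernel by a stretched exponential; written schematically (the normalization function and exact parameterization may differ from the paper's):
    \[
    D_{t}^{\alpha} f(t) = \frac{M(\alpha)}{1-\alpha} \int_{0}^{t} f'(\tau)\, \exp\!\left( -\frac{\alpha\,(t-\tau)}{1-\alpha} \right) d\tau
    \;\;\longrightarrow\;\;
    \frac{M(\alpha)}{1-\alpha} \int_{0}^{t} f'(\tau)\, \exp\!\left( -\left[ \frac{\alpha\,(t-\tau)}{1-\alpha} \right]^{\beta} \right) d\tau ,
    \]
    with 0 < α < 1 and β the stretching exponent.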

  8. Localized Multiple Kernel Learning Via Sample-Wise Alternating Optimization.

    PubMed

    Han, Yina; Yang, Kunde; Ma, Yuanliang; Liu, Guizhong

    2014-01-01

    Our objective is to train support vector machines (SVM)-based localized multiple kernel learning (LMKL), using the alternating optimization between the standard SVM solvers with the local combination of base kernels and the sample-specific kernel weights. The advantage of alternating optimization developed from the state-of-the-art MKL is the SVM-tied overall complexity and the simultaneous optimization on both the kernel weights and the classifier. Unfortunately, in LMKL, the sample-specific character makes the updating of kernel weights a difficult quadratic nonconvex problem. In this paper, starting from a new primal-dual equivalence, the canonical objective on which state-of-the-art methods are based is first decomposed into an ensemble of objectives corresponding to each sample, namely, sample-wise objectives. Then, the associated sample-wise alternating optimization method is conducted, in which the localized kernel weights can be independently obtained by solving their exclusive sample-wise objectives, either linear programming (for l1-norm) or with closed-form solutions (for lp-norm). At test time, the learnt kernel weights for the training data are deployed based on the nearest-neighbor rule. Hence, to guarantee their generality among the test part, we introduce the neighborhood information and incorporate it into the empirical loss when deriving the sample-wise objectives. Extensive experiments on four benchmark machine learning datasets and two real-world computer vision datasets demonstrate the effectiveness and efficiency of the proposed algorithm.

  9. Effects of sample size on KERNEL home range estimates

    USGS Publications Warehouse

    Seaman, D.E.; Millspaugh, J.J.; Kernohan, Brian J.; Brundige, Gary C.; Raedeke, Kenneth J.; Gitzen, Robert A.

    1999-01-01

    Kernel methods for estimating home range are being used increasingly in wildlife research, but the effect of sample size on their accuracy is not known. We used computer simulations of 10-200 points/home range and compared accuracy of home range estimates produced by fixed and adaptive kernels with the reference (REF) and least-squares cross-validation (LSCV) methods for determining the amount of smoothing. Simulated home ranges varied from simple to complex shapes created by mixing bivariate normal distributions. We used the size of the 95% home range area and the relative mean squared error of the surface fit to assess the accuracy of the kernel home range estimates. For both measures, the bias and variance approached an asymptote at about 50 observations/home range. The fixed kernel with smoothing selected by LSCV provided the least-biased estimates of the 95% home range area. All kernel methods produced similar surface fit for most simulations, but the fixed kernel with LSCV had the lowest frequency and magnitude of very poor estimates. We reviewed 101 papers published in The Journal of Wildlife Management (JWM) between 1980 and 1997 that estimated animal home ranges. A minority of these papers used nonparametric utilization distribution (UD) estimators, and most did not adequately report sample sizes. We recommend that home range studies using kernel estimates use LSCV to determine the amount of smoothing, obtain a minimum of 30 observations per animal (but preferably ≥50), and report sample sizes in published results.

  10. Texture analysis in gel electrophoresis images using an integrative kernel-based approach

    PubMed Central

    Fernandez-Lozano, Carlos; Seoane, Jose A.; Gestal, Marcos; Gaunt, Tom R.; Dorado, Julian; Pazos, Alejandro; Campbell, Colin

    2016-01-01

    Texture information could be used in proteomics to improve the quality of the image analysis of proteins separated on a gel. In order to evaluate the best technique to identify relevant textures, we use several different kernel-based machine learning techniques to classify proteins in 2-DE images into spot and noise. We evaluate the classification accuracy of each of these techniques with proteins extracted from ten 2-DE images of different types of tissues and different experimental conditions. We found that the best classification model was FSMKL, a data integration method using multiple kernel learning, which achieved AUROC values above 95% while using a reduced number of features. This technique allows us to increase the interpretability of the complex combinations of textures and to weight the importance of each particular feature in the final model. In particular, the Inverse Difference Moment exhibited the highest discriminating power. A higher value can be associated with a homogeneous structure, as this feature describes homogeneity: the larger the value, the more homogeneous the structure. The final model is built from the combination of different groups of textural features. Here we demonstrated the feasibility of combining different groups of textures in 2-DE image analysis for spot detection. PMID:26758643

  11. Towards a Holistic Cortical Thickness Descriptor: Heat Kernel-Based Grey Matter Morphology Signatures.

    PubMed

    Wang, Gang; Wang, Yalin

    2017-02-15

    In this paper, we propose a heat kernel based regional shape descriptor that may be capable of better exploiting volumetric morphological information than other available methods, thereby improving statistical power on brain magnetic resonance imaging (MRI) analysis. The mechanism of our analysis is driven by the graph spectrum and the heat kernel theory, to capture the volumetric geometry information in the constructed tetrahedral meshes. In order to capture profound brain grey matter shape changes, we first use the volumetric Laplace-Beltrami operator to determine the point pair correspondence between white-grey matter and CSF-grey matter boundary surfaces by computing the streamlines in a tetrahedral mesh. Secondly, we propose multi-scale grey matter morphology signatures to describe the transition probability by random walk between the point pairs, which reflects the inherent geometric characteristics. Thirdly, a point distribution model is applied to reduce the dimensionality of the grey matter morphology signatures and generate the internal structure features. With the sparse linear discriminant analysis, we select a concise morphology feature set with improved classification accuracies. In our experiments, the proposed work outperformed the cortical thickness features computed by FreeSurfer software in the classification of Alzheimer's disease and its prodromal stage, i.e., mild cognitive impairment, on publicly available data from the Alzheimer's Disease Neuroimaging Initiative. The multi-scale and physics based volumetric structure feature may bring stronger statistical power than some traditional methods for MRI-based grey matter morphology analysis.
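
    The heat-kernel construction referenced here can be illustrated on a small graph; the paper itself works with a volumetric Laplace-Beltrami operator on tetrahedral meshes and adds point correspondence, random-walk signatures and sparse LDA on top, none of which is shown in this sketch.

```python
import numpy as np

def heat_kernel(adjacency, t):
    """Heat kernel H_t = Phi exp(-t Lambda) Phi^T of the graph Laplacian L = D - A.

    H_t[i, j] is the heat transferred from node j to node i after time t;
    evaluating it at several scales t gives multi-scale signatures of the
    underlying geometry.
    """
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency
    evals, evecs = np.linalg.eigh(laplacian)
    return evecs @ np.diag(np.exp(-t * evals)) @ evecs.T

# toy 5-node path graph; compare diagonal signatures at two diffusion scales
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
print(np.diag(heat_kernel(A, t=0.5)))
print(np.diag(heat_kernel(A, t=5.0)))
```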

  12. Transfer String Kernel for Cross-Context DNA-Protein Binding Prediction.

    PubMed

    Singh, Ritambhara; Lanchantin, Jack; Robins, Gabriel; Qi, Yanjun

    2016-09-15

    Through sequence-based classification, this paper tries to accurately predict the DNA binding sites of transcription factors (TFs) in an unannotated cellular context. Related methods in the literature fail to perform such predictions accurately, since they do not consider sample distribution shift of sequence segments from an annotated (source) context to an unannotated (target) context. We, therefore, propose a method called "Transfer String Kernel" (TSK) that achieves improved prediction of transcription factor binding site (TFBS) using knowledge transfer via cross-context sample adaptation. TSK maps sequence segments to a high-dimensional feature space using a discriminative mismatch string kernel framework. In this high-dimensional space, labeled examples of the source context are re-weighted so that the revised sample distribution matches the target context more closely. We have experimentally verified TSK for TFBS identifications on fourteen different TFs under a cross-organism setting. We find that TSK consistently outperforms the state-of-the-art TFBS tools, especially when working with TFs whose binding sequences are not conserved across contexts. We also demonstrate the generalizability of TSK by showing its cutting-edge performance on a different set of cross-context tasks for the MHC peptide binding predictions.
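
    TSK is built on mismatch string kernels; the simpler exact-match spectrum kernel below shows the basic idea of mapping sequence segments to k-mer count features. The mismatch tolerance and the re-weighting of source-context examples that define TSK are not shown, and the example sequences are arbitrary.

```python
from collections import Counter
from itertools import product
import numpy as np

def spectrum_features(seq, k=3, alphabet="ACGT"):
    """Count vector over all k-mers of the alphabet (the spectrum feature map)."""
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return np.array([counts[km] for km in kmers], dtype=float)

def spectrum_kernel(s1, s2, k=3):
    """Inner product of the spectrum feature vectors of two DNA segments."""
    return float(spectrum_features(s1, k) @ spectrum_features(s2, k))

print(spectrum_kernel("ACGTACGTAC", "ACGTTTGTAC"))  # similar segments: larger value
print(spectrum_kernel("ACGTACGTAC", "GGGGCCCCGG"))  # dissimilar segments: smaller value
```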

  13. Improved Prediction of Malaria Degradomes by Supervised Learning with SVM and Profile Kernel

    PubMed Central

    Kuang, Rui; Gu, Jianying; Cai, Hong; Wang, Yufeng

    2009-01-01

    The spread of drug resistance through malaria parasite populations calls for the development of new therapeutic strategies. However, the seemingly promising genomics-driven target identification paradigm is hampered by the weak annotation coverage. To identify potentially important yet uncharacterized proteins, we apply support vector machines using profile kernels, a supervised discriminative machine learning technique for remote homology detection, as a complement to the traditional alignment based algorithms. In this study, we focus on the prediction of proteases, which have long been considered attractive drug targets because of their indispensable roles in parasite development and infection. Our analysis demonstrates that an abundant and complex repertoire is conserved in five Plasmodium parasite species. Several putative proteases may be important components in networks that mediate cellular processes, including hemoglobin digestion, invasion, trafficking, cell cycle fate, and signal transduction. This catalog of proteases provides a short list of targets for functional characterization and rational inhibitor design. PMID:19057851

  14. A multi-label image annotation scheme based on improved SVM multiple kernel learning

    NASA Astrophysics Data System (ADS)

    Jin, Cong; Jin, Shu-Wei

    2017-02-01

    Multi-label image annotation (MIA) has been widely studied during recent years and many MIA schemes have been proposed. However, most existing schemes are not satisfactory. In this paper, an improved multiple kernel learning (IMKL) method for support vector machines (SVM) is proposed to improve the classification accuracy of the SVM; a novel MIA scheme based on IMKL is then presented, which uses the discriminant loss to control the number of top semantic labels, and a feature selection approach is also used to improve the performance of MIA. The experimental results show that the proposed MIA scheme achieves higher performance than the other existing MIA schemes, and its performance is satisfactory for large image datasets.

  15. The Use of Fisher's Z in Schmidt-Hunter-Type Meta-Analyses.

    ERIC Educational Resources Information Center

    Law, Kenneth S.

    1995-01-01

    Two new methods of estimating the mean population correlation (M) and the standard deviation of population correlations (SD) were suggested and tested by Monte Carlo simulations. Results show no consistent advantage to using the Pearson correlation or Fisher's Z in estimating M or SD; estimates from all methods are similar. (SLD)
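
    For reference, Fisher's Z transform of a sample correlation r and its large-sample standard error are
    \[
    z = \tfrac{1}{2} \ln\!\frac{1+r}{1-r} = \operatorname{arctanh}(r), \qquad \operatorname{SE}(z) \approx \frac{1}{\sqrt{n-3}},
    \]
    with back-transform r = tanh(z); the simulations compare meta-analytic estimates of M and SD computed in the r metric against those computed after this transformation.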

  16. 76 FR 18151 - Kootenai National Forest, Lincoln County, MT; Miller West Fisher Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... areas outside the recovery zone that are occupied by grizzly bears; 5. Provide further explanation for..., increase big game security through reduction in open road density (ORD), and to create grizzly bear core... stabilization in West Fisher Creek; improvement of trails and trailheads, including 5.9 miles of trail...

  17. Evaluating the sustainability of a regional system using Fisher information in the San Luis Basin, Colorado.

    PubMed

    Eason, Tarsha; Cabezas, Heriberto

    2012-02-01

    This paper describes the theory, data, and methodology necessary for using Fisher information to assess the sustainability of the San Luis Basin (SLB) regional system over time. Fisher information was originally developed as a measure of the information content in data and is an important method in information theory. Our adaptation of Fisher information provides a means of monitoring the variables of a system to characterize dynamic order, and, therefore, its regimes and regime shifts. This work is part of the SLB Sustainability Metrics Project, which aimed to evaluate movement over time towards or away from regional sustainability. One of the key goals of this project was to use readily available data to assess the sustainability of the system including its environmental, social and economic aspects. For this study, Fisher information was calculated for fifty-three variables which characterize the consumption of food and energy, agricultural production, environmental characteristics, demographic properties and changes in land use for the SLB system from 1980 to 2005. Our analysis revealed that while the system displayed small changes in dynamic order over time with a slight decreasing trend near the end of the period, there is no indication of a regime shift. Therefore, the SLB system is stable with very slight movement away from sustainability in more recent years.
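
    The underlying quantity is the Fisher information of the probability density p(s) of system states s, shown here in its continuous form (the SLB work applies a discretized, moving-window adaptation of this measure to the 53-variable time series):
    \[
    I = \int \frac{1}{p(s)} \left( \frac{d p(s)}{d s} \right)^{2} ds ,
    \]
    so sustained dynamic order appears as a roughly constant I over time, while a regime shift appears as a sharp, persistent change in I.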

  18. 75 FR 11939 - Fisher & Paykel Appliances, Inc., Huntington Beach, CA; Notice of Termination of Investigation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration Fisher & Paykel Appliances, Inc., Huntington Beach, CA; Notice of Termination of Investigation Pursuant to Section 221 of the Trade Act of 1974, as amended, an...

  19. The Role of Fisher Information Theory in the Development of Fundamental Laws in Physical Chemistry

    ERIC Educational Resources Information Center

    Honig, J. M.

    2009-01-01

    The unifying principle that involves rendering the Fisher information measure an extremum is reviewed. It is shown that with this principle, in conjunction with appropriate constraints, a large number of fundamental laws can be derived from a common source in a unified manner. The resulting economy of thought pertaining to fundamental principles…

  20. Balancing Liberty and Equality: Justice Kennedy's Decisive Vote in "Fisher v. University of Texas," Part II

    ERIC Educational Resources Information Center

    Garces, Liliana M.

    2015-01-01

    For the second time in three years, the Supreme Court is reviewing the constitutionality of a race-conscious admissions policy at the University of Texas, Austin. While the case, "Fisher v. University of Texas," raises questions specific to UT Austin, the Court's second review could change the ways higher education institutions across…

  1. 78 FR 59064 - Importer of Controlled Substances; Notice of Application; Fisher Clinical Services, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-25

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF JUSTICE Drug Enforcement Administration Importer of Controlled Substances; Notice of Application; Fisher Clinical Services... application by renewal to the Drug Enforcement Administration (DEA) for registration as an importer of...

  2. A COMPARISON OF MERCURY IN MINK AND FISHER IN RHODE ISLAND

    EPA Science Inventory

    Comparison of total mercury concentrations and nitrogen and carbon stable isotope values in muscle tissue and stomach contents of mink (Mustela vison) and fisher (Martes pennanti) from Rhode Island in 2000- 2003 showed results which appeared to reflect dietary differences betwee...

  3. 38 CFR 60.10 - Eligibility criteria for Fisher House or other temporary lodging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... from the VA health care facility without an overnight stay. (e) Special authority for organ transplant... status as an organ donor for a veteran. VA may also provide Fisher House or other temporary lodging for the donor's accompanying individuals at all phases of the transplant process. (Authority: 38...

  4. 38 CFR 60.10 - Eligibility criteria for Fisher House or other temporary lodging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... from the VA health care facility without an overnight stay. (e) Special authority for organ transplant... status as an organ donor for a veteran. VA may also provide Fisher House or other temporary lodging for the donor's accompanying individuals at all phases of the transplant process. (Authority: 38...

  5. Disentangling Similarity Judgments from Pragmatic Judgments: Response to Sloutsky and Fisher (2012)

    ERIC Educational Resources Information Center

    Noles, Nicholaus S.; Gelman, Susan A.

    2012-01-01

    Sloutsky and Fisher (2012) attempt to reframe the results presented in Noles and Gelman (2012) as a pure replication of their original work validating the similarity, induction, naming, and categorization (SINC) model. However, their critique fails to engage with the central findings reported in Noles and Gelman, and their reanalysis fails to…

  6. The effects of isoflurane on airway smooth muscle crossbridge kinetics in Fisher and Lewis rats.

    PubMed

    Duracher, Caroline; Blanc, François-Xavier; Gueugniaud, Pierre-Yves; David, Jean Stéphane; Riou, Bruno; Lecarpentier, Yves; Coirault, Catherine

    2005-07-01

    Our aim was to determine how isoflurane modified crossbridge (CB) number and kinetics in airway smooth muscle (ASM) and to compare its effects in Fisher and Lewis rats, two strains with differences in airway responsiveness. The effects of isoflurane (2 MAC) on isotonic and isometric contractility in tracheal ASM strips were investigated after methacholine (10⁻⁶ M)-induced contraction. CB mechanics and kinetics were analyzed using the formalism of Huxley's equations adapted to ASM. After isoflurane, maximum velocity did not differ from baseline in Lewis rats, whereas it was significantly (approximately 25%) less than baseline in Fisher rats, the most reactive strain. Isoflurane totally reversed the methacholine-induced increase in active CB number in Lewis rats (2.4 ± 0.5 versus 1.8 ± 0.4 × 10⁹/mm² after methacholine and isoflurane, respectively), whereas reversal was only partial in Fisher rats (2.7 ± 0.4 versus 2.1 ± 0.3 × 10⁹/mm² after methacholine and isoflurane, respectively). Isoflurane induced a 40% increase in attachment step duration in both strains and an almost twofold increase in the CB cycle duration compared with baseline in Lewis rats. The isoflurane-induced increase in detachment step duration was less in Lewis than in Fisher rats (P < 0.05). We concluded that isoflurane modulated CB number and CB cycling rates of isolated rat ASM differently depending on the level of airway responsiveness.

  7. Landscape genetics of fishers (Martes pennanti) in the Northeast: dispersal barriers and historical influences.

    PubMed

    Hapeman, Paul; Latch, Emily K; Fike, Jennifer A; Rhodes, Olin E; Kilpatrick, C William

    2011-01-01

    Habitat fragmentation and overtrapping are thought to have resulted in severe population declines for fisher (Martes pennanti) across the northeastern United States, and by the end of the 1930s only 3 remnant populations remained. Subsequent trapping cessation, extensive reintroduction programs, and natural recolonization have helped fishers to reclaim much of their historical range. The degree to which these processes have impacted genetic structure in this species, however, remains unknown. We used 11 microsatellites from tissue samples (n = 432) of fishers to characterize contemporary population structure in light of historical population structure and thus to determine the relative influence of anthropogenic disturbances and natural landscape features in shaping genetic structure of the contemporary population. Our results indicated that 3 well-differentiated contemporary populations are present that correspond well with what would be expected based on their reported history. A coarse barrier to dispersal appears in the western portion of the study area, associated with several lakes including Lake George and Great Sacandaga Lake. Large-scale reintroduction efforts and natural recolonizations have largely had predictable impacts on population structure. An important exception is the substantial impact of the reintroduction of fishers to Vermont.

  8. Cerebral infarction complicating intravenous immunoglobulin therapy in a patient with Miller Fisher syndrome

    PubMed Central

    Turner, B.; Wills, A.

    2000-01-01

    Intravenous immunoglobulin (IVIg) therapy is being increasingly used in a wide range of neurological conditions. However, treatment is expensive and side effects may be severe. We report a patient with Miller Fisher syndrome who developed cortical blindness as a consequence of occipital infarction precipitated by IVIg. PMID:10811710

  9. Conservation of the Eastern Taiwan Strait Chinese White Dolphin (Sousa chinensis): Fishers' Perspectives and Management Implications

    PubMed Central

    Liu, Ta-Kang; Wang, Yu-Cheng; Chuang, Laurence Zsu-Hsin; Chen, Chih-How

    2016-01-01

    The abundance of the eastern Taiwan Strait (ETS) population of the Chinese white dolphin (Sousa chinensis) has been estimated to be less than 100 individuals. It is categorized as critically endangered in the IUCN Red List of Threatened Species. Thus, immediate measures of conservation should be taken to protect it from extinction. Currently, the Taiwanese government plans to designate its habitat as a Major Wildlife Habitat (MWH), a type of marine protected area (MPA) for conservation of wildlife species. Although the designation allows current exploitation to continue, it may cause conflicts among multiple stakeholders with competing interests. This study explores stakeholders' attitudes and opinions in order to better manage the MPA, employing semi-structured interviews and a questionnaire survey of local fishers. Results from interviews indicated that the subsistence of fishers remains a major problem. Stakeholders were found to have different perceptions of the fishers' attitudes towards conservation and considered fishery-related law enforcement likely to be difficult. The quantitative survey showed that fishers are generally positive towards the conservation of the Chinese white dolphin but are less willing to participate in the planning process. Most fishers considered temporary fishing closures feasible for conservation. The results of this study provide recommendations for future efforts towards the goal of better conservation for this endangered species. PMID:27526102

  10. Book review: Biology and conservation of martens, sables, and fishers: A new synthesis

    USGS Publications Warehouse

    Jenkins, Kurt J.

    2013-01-01

    Review info: Biology and conservation of martens, sables, and fishers: A new synthesis. Edited by K.B. Aubry, W.J. Zielinski, M.G. Raphael, G. Proulx, and S.W. Buskirk, 2012. ISBN: 978-08014, 580pp.

  11. Existence of travelling wave solutions for a Fisher-Kolmogorov system with biomedical applications

    NASA Astrophysics Data System (ADS)

    Belmonte-Beitia, Juan

    2016-07-01

    We consider a Fisher-Kolmogorov system with applications in oncology (Pérez-García et al., 2015). Of interest is the question of the existence of travelling front solutions of the system. When the speed of the travelling wave is sufficiently large, the existence of such fronts is shown using singular geometric perturbation theory.
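
    For context, the classical scalar Fisher-Kolmogorov (Fisher-KPP) prototype and its travelling-wave reduction are (the paper treats a coupled system, so this is only the single-equation baseline):

      u_t = D\,u_{xx} + \rho\,u(1-u), \qquad u(x,t) = U(\xi),\ \ \xi = x - ct
      \;\Longrightarrow\; D\,U'' + c\,U' + \rho\,U(1-U) = 0,

    with monotone fronts connecting U = 1 to U = 0 existing for every speed c ≥ 2√(Dρ); "sufficiently large wave speed" conditions of the kind quoted above play the analogous role for systems.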

  12. Fisher Ames and Political Judgment: Reason, Passion, and Vehement Style in the Jay Treaty Speech.

    ERIC Educational Resources Information Center

    Farrell, James M.

    1990-01-01

    Analyzes Fisher Ames' fiery speech of 1796 on the Jay Treaty. Demonstrates the influence of Scottish enlightenment thinkers (particularly in moral sense philosophy and faculty psychology) on Ames and his rhetoric. Demonstrates how Ames made a compelling case to shift the standard of political judgment from reason to passion. (SR)

  13. Derivation of the equations of nonrelativistic quantum mechanics using the principle of minimum Fisher information

    SciTech Connect

    Reginatto, M.

    1998-09-01

    The many-particle time-dependent Schrödinger equation is derived using the principle of minimum Fisher information. This application of information theory leads to a physically well motivated derivation of the Schrödinger equation, which distinguishes between subjective and objective elements of the theory.
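
    As background (conventions and constant factors vary between presentations; this is not a reproduction of the paper's derivation), the key object is the Fisher information of the position distribution p,

      I[p] = \int \frac{|\nabla p|^{2}}{p}\, d^{3}x ,

    which, when added to a classical ensemble action with weight ħ²/8m and combined with the substitution ψ = √p e^{iS/ħ}, produces the quantum-potential term

      Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}\sqrt{p}}{\sqrt{p}} ,

    the term that turns the classical continuity/Hamilton-Jacobi pair into the Schrödinger equation.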

  14. Fisher-Level Decision Making to Participate in Fisheries Improvement Projects (FIPs) for Yellowfin Tuna in the Philippines

    PubMed Central

    Berentsen, Paul; Bush, Simon R.; Digal, Larry; Oude Lansink, Alfons

    2016-01-01

    This study identifies the capabilities needed by small-scale fishers to participate in Fishery Improvement Projects (FIPs) for yellowfin tuna in the Philippines. The current literature provides little empirical evidence on how different models, or types of FIPs, influence the participation of fishers in their programs and the degree to which FIPs are able to foster improvements in fishing practices. To address this literature gap, two different FIPs are empirically analysed, each with different approaches for fostering improvement. The first is the non-governmental organisation-led Partnership Programme Towards Sustainable Tuna, which adopts a bottom-up or development-oriented FIP model. The second is the private-led Artesmar FIP, which adopts a top-down or market-oriented FIP approach. The data were obtained from a survey of 350 fishers and were analysed using two separate models run in succession, taking into consideration full, partial, and non-participation in the two FIPs. The results demonstrate that different types of capabilities are required in order to participate in different FIP models. Individual firm capabilities are more important for fishers' participation in market-oriented FIPs, which use direct economic incentives to encourage improvements in fisher practices. Collective capabilities are more important for fishers to participate in development-oriented FIPs, which drive improvement by supporting fishers, fisher associations, and governments to move towards market requirements. PMID:27732607

  15. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  16. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  17. 33 CFR 207.260 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 207.260 Section 207.260... REGULATIONS § 207.260 Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher... canal at any stage from the mouth of the Yazoo Diversion Canal where it enters into the...

  18. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  19. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  20. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  1. 33 CFR 207.260 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 207.260 Section 207.260... REGULATIONS § 207.260 Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher... canal at any stage from the mouth of the Yazoo Diversion Canal where it enters into the...

  2. Lessons from integrating fishers of arapaima in small-scale fisheries management at the Mamirauá Reserve, Amazon.

    PubMed

    Castello, Leandro; Viana, João P; Watkins, Graham; Pinedo-Vasquez, Miguel; Luzadis, Valerie A

    2009-02-01

    Fishers and small-scale fisheries worldwide have been marginalized historically. Now it is clear that integrating fishers in management processes is key to resource conservation, but it is less clear how to do it. Here, based on a literature review and new information, we present and analyze a case in which the participation of fishers in the management process was crucial in recovering an overexploited small-scale fishery for the pirarucu (Arapaima spp.) in the Amazon Basin, Brazil. In 8 years of experimental management, from 1999 to 2006, the population of pirarucu increased 9-fold (from about 2200 to 20,650 individuals), harvest quotas increased 10-fold (from 120 to 1249 individuals), and fishers' participation in the management process increased and they benefited from increased monetary returns. Additionally, the number of communities conducting the management scheme increased from 4 in 1999 to 108 in 2006, following the demands of fishers and regional government agencies. Based on our analysis, we suggest that the participation of fishers in the management of other small-scale fisheries in the world can be improved by focusing on (1) applying the knowledge and skills of fishers in resource monitoring and management, (2) bridging knowledge systems among all involved stakeholders, (3) collaborating with fishers that are interested in, and capable of conducting, resource conservation schemes, and (4) conducting management under conditions of uncertainty.

  3. Burden's on U! the Impact of the "Fisher v. University of Texas at Austin" Decision on K-16 Admissions Policies

    ERIC Educational Resources Information Center

    Nguyen, David H. K.

    2014-01-01

    Using race as a factor in admissions policies was contested in "Fisher v. University of Texas at Austin." Although the U.S. Supreme Court firmly held in "Grutter v. Bollinger" that race can be considered among many factors in admitting students, the recent decision in "Fisher" has posed many questions and challenges…

  4. Discriminant learning analysis.

    PubMed

    Peng, Jing; Zhang, Peng; Riedel, Norbert

    2008-12-01

    Linear discriminant analysis (LDA) as a dimension reduction method is widely used in classification tasks such as face recognition. However, it suffers from the small sample size (SSS) problem when data dimensionality is greater than the sample size, as in images where features are high dimensional and correlated. In this paper, we propose to address the SSS problem in the framework of statistical learning theory. We compute linear discriminants by regularized least squares regression, which resolves the singularity problem. The resulting discriminants are complete in that they include both regular and irregular information. We show that our proposal and its nonlinear extension belong to the same framework in which powerful classifiers such as support vector machines are formulated. In addition, our approach allows us to establish an error bound for LDA. Finally, our experiments validate our theoretical analysis.
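
    A minimal sketch of the central idea for the two-class case, assuming a ridge-regularized least-squares regression on ±1 targets (which stays well-posed when the feature dimension exceeds the sample size); function names and the toy data are illustrative, and this is not the authors' full "regular + irregular" construction.

      import numpy as np

      def ridge_discriminant(X, y, lam=1e-2):
          # X: (n_samples, n_features); y: labels in {0, 1}.
          # Regression on +/-1 targets gives a direction proportional to the
          # (regularized) two-class LDA discriminant; the ridge term lam*I keeps
          # the normal equations well-posed when n_features > n_samples.
          t = np.where(y == 1, 1.0, -1.0)
          Xc = X - X.mean(axis=0)
          A = Xc.T @ Xc + lam * np.eye(X.shape[1])
          w = np.linalg.solve(A, Xc.T @ (t - t.mean()))
          b = -X.mean(axis=0) @ w          # threshold scores at the grand mean
          return w, b

      # Toy data in the small-sample-size regime: 40 samples, 500 features.
      rng = np.random.default_rng(1)
      X = np.r_[rng.normal(0.0, 1.0, (20, 500)), rng.normal(0.5, 1.0, (20, 500))]
      y = np.r_[np.zeros(20), np.ones(20)]
      w, b = ridge_discriminant(X, y)
      print(((X @ w + b > 0).astype(float) == y).mean())   # training accuracy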

  5. Bias, discrimination, and obesity.

    PubMed

    Puhl, R; Brownell, K D

    2001-12-01

    This article reviews information on discriminatory attitudes and behaviors against obese individuals, integrates this to show whether systematic discrimination occurs and why, and discusses needed work in the field. Clear and consistent stigmatization, and in some cases discrimination, can be documented in three important areas of living: employment, education, and health care. Among the findings are that 28% of teachers in one study said that becoming obese is the worst thing that can happen to a person; 24% of nurses said that they are "repulsed" by obese persons; and, controlling for income and grades, parents provide less college support for their overweight than for their thin children. There are also suggestions but not yet documentation of discrimination occurring in adoption proceedings, jury selection, housing, and other areas. Given the vast numbers of people potentially affected, it is important to consider the research-related, educational, and social policy implications of these findings.

  6. Rare variant testing across methods and thresholds using the multi-kernel sequence kernel association test (MK-SKAT).

    PubMed

    Urrutia, Eugene; Lee, Seunggeun; Maity, Arnab; Zhao, Ni; Shen, Judong; Li, Yun; Wu, Michael C

    Analysis of rare genetic variants has focused on region-based analysis wherein a subset of the variants within a genomic region is tested for association with a complex trait. Two important practical challenges have emerged. First, it is difficult to choose which test to use. Second, it is unclear which group of variants within a region should be tested. Both depend on the unknown true state of nature. Therefore, we develop the Multi-Kernel SKAT (MK-SKAT), which tests across a range of rare variant tests and groupings. Specifically, we demonstrate that several popular rare variant tests are special cases of the sequence kernel association test (SKAT), which compares pair-wise similarity in trait value to similarity in the rare variant genotypes between subjects as measured through a kernel function. Choosing a particular test is equivalent to choosing a kernel; similarly, choosing which group of variants to test also reduces to choosing a kernel. Thus, MK-SKAT uses perturbation to test across a range of kernels. Simulations and real data analyses show that our framework controls the type I error rate while maintaining high power across settings: MK-SKAT loses some power relative to the best kernel for any particular scenario but has much greater power than poorly chosen kernels.
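
    A simplified sketch (not the published implementation) of the variance-component score statistic underlying SKAT-type tests, Q = (y − ȳ)ᵀ K (y − ȳ) with a weighted linear kernel K = G W Gᵀ, evaluated for a few candidate kernels/weightings. MK-SKAT additionally combines evidence across kernels by perturbation, and a real test needs the null distribution (a mixture of chi-squares) to turn Q into a p-value; both are omitted here, and all names and weighting schemes are illustrative.

      import numpy as np

      def linear_weighted_kernel(G, weights):
          # K = G diag(w^2) G^T: the weighted linear kernel commonly used in SKAT.
          Gw = G * weights
          return Gw @ Gw.T

      def skat_style_score(y, G, weights):
          # Quadratic-form score statistic Q = r^T K r with r = y - mean(y)
          # (continuous trait, intercept-only null model).
          r = y - y.mean()
          return r @ linear_weighted_kernel(G, weights) @ r

      # Candidate "kernels" = candidate weighting schemes over the same region.
      rng = np.random.default_rng(2)
      G = rng.binomial(2, 0.02, size=(200, 30)).astype(float)   # rare-variant genotypes
      y = rng.normal(size=200)
      maf = G.mean(axis=0) / 2
      candidates = {
          "unweighted": np.ones(G.shape[1]),
          "maf-downweighted": 1.0 / np.sqrt(np.maximum(maf * (1 - maf), 1e-6)),
      }
      for name, w in candidates.items():
          print(name, skat_style_score(y, G, w))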

  7. An information measure for class discrimination. [in remote sensing of crop observation]

    NASA Technical Reports Server (NTRS)

    Shen, S. S.; Badhwar, G. D.

    1986-01-01

    This article describes a separability measure for class discrimination. This measure is based on the Fisher information measure for estimating the mixing proportion of two classes. The Fisher information measure not only provides a means to assess quantitatively the information content in the features for separating classes, but also gives the lower bound for the variance of any unbiased estimate of the mixing proportion based on observations of the features. Unlike most commonly used separability measures, this measure does not depend on the form of the probability distribution of the features and does not imply a specific estimation procedure. This is important because the probability distribution function that describes the data for a given class does not have a simple analytic form, such as a Gaussian. Results of applying this measure to compare the information content provided by three Landsat-derived feature vectors for the purpose of separating small grains from other crops are presented.
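
    The measure can be stated concretely (a standard result, given here for orientation): for a two-class mixture with class densities f₁, f₂ and mixing proportion π,

      f(x;\pi) = \pi f_1(x) + (1-\pi) f_2(x), \qquad
      I(\pi) = \int \frac{\left[f_1(x) - f_2(x)\right]^{2}}{\pi f_1(x) + (1-\pi) f_2(x)}\,dx ,

    and by the Cramér-Rao inequality any unbiased estimator of π from n observations satisfies Var(π̂) ≥ 1/(n I(π)), which is the lower-bound property mentioned above. Note that I(π) places no parametric restriction on f₁ and f₂.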

  8. Optical fiber phase discriminator.

    PubMed

    Danielson, B L

    1978-11-15

    Phase discriminators are devices widely used at rf and microwave frequencies to convert phase, or frequency, changes to amplitude changes. They find widespread use in generating audio feedback signals for frequency stabilization of oscillators and in angle demodulation applications. This paper demonstrates that similar devices, with similar functions, can be constructed in the visible region using optical fibers as delay-line elements. The operating principles of an optical-fiber delay-line phase discriminator are discussed. The sensitivity is shown to be proportional to the fiber propagation-delay time. A device working at 0.6328 μm is described and compared with predictions.
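
    The stated proportionality is consistent with the standard delay-line discriminator relation (given here as background; the paper's own analysis may differ in detail): interfering a signal of angular frequency ω with a copy of itself delayed by τ yields a detected term

      V(\omega) \propto \cos(\omega\tau + \phi_0), \qquad
      \left.\frac{dV}{d\omega}\right|_{\text{quadrature}} \propto \tau ,

    so, biased near quadrature, frequency (or phase) fluctuations are converted to amplitude changes with a sensitivity that scales with the fiber propagation delay.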

  9. Fishing for space: fine-scale multi-sector maritime activities influence fisher location choice.

    PubMed

    Tidd, Alex N; Vermard, Youen; Marchal, Paul; Pinnegar, John; Blanchard, Julia L; Milner-Gulland, E J

    2015-01-01

    The European Union and other states are moving towards Ecosystem Based Fisheries Management to balance food production and security with wider ecosystem concerns. Fishing is only one of several sectors operating within the ocean environment, competing for renewable and non-renewable resources that overlap in a limited space. Other sectors include marine mining, energy generation, recreation, transport and conservation. Trade-offs of these competing sectors are already part of the process, but attempts to detail how the seas are being utilised have been based primarily on compilations of data on human activity at large spatial scales. Advances including satellite and automatic ship tracking enable investigation of the factors influencing fishers' choice of fishing grounds at spatial scales relevant to decision-making, including the presence or avoidance of activities by other sectors. We analyse the determinants of English and Welsh scallop-dredging fleet behaviour, including competing sectors, operating in the eastern English Channel. Results indicate that aggregate mining activity, maritime traffic, increased fishing costs, and the English inshore 6 and French 12 nautical mile limits negatively affect fishers' likelihood of fishing in otherwise suitable areas. Past success, net benefits and fishing within the 12 NM limit predispose fishers to use areas. Systematic conservation planning has yet to be widely applied in marine systems, and the dynamics of spatial overlap of fishing with other activities have not been studied at scales relevant to fisher decision-making. This study demonstrates that fisher decision-making is indeed affected by the real-time presence of other sectors in an area, and therefore reveals trade-offs which need to be accounted for in marine planning. As marine resource extraction demands intensify, governments will need to take a more proactive approach to resolving these trade-offs, and studies such as this will be required as the evidential foundation for future

  10. The Influence of Fisher Knowledge on the Susceptibility of Reef Fish Aggregations to Fishing

    PubMed Central

    Robinson, Jan; Cinner, Joshua E.; Graham, Nicholas A. J.

    2014-01-01

    Reef fishes that exhibit predictable aggregating behaviour are often considered vulnerable to overexploitation. However, fisher knowledge of this behaviour is often heterogeneous and, coupled with socioeconomic factors that constrain demand for or access to aggregated fish, will influence susceptibility to fishing. At two case study locations in Papua New Guinea, Ahus and Karkar islands, we conducted interview-based surveys to examine how local context influenced heterogeneity in knowledge of fish aggregations. We then explored the role of fisher knowledge in conferring susceptibility to fishing relative to socioeconomic drivers of fishing effort. Local heterogeneity in knowledge of aggregating behaviour differed between our case studies. At Ahus, variable access rights among fishers and genders to the main habitats were sources of heterogeneity in knowledge. By contrast, knowledge was more homogenous at Karkar and the sole source of variation was gear type. Differences between locations in the susceptibility of aggregations to fishing depended primarily on socioeconomic drivers of fishing effort rather than catchability. While Ahus fishers were knowledgeable of fish aggregations and used more selective gears, Karkar fishers were less constrained by tenure in their access to aggregation habitat. However, fishing effort was greater at Ahus and likely related to high dependency on fishing, greater access to provincial capital markets than Karkar and a weakening of customary management. Moreover, highly efficient fishing techniques have emerged at Ahus to exploit the non-reproductive aggregating behaviour of target species. Understanding how knowledge is structured within fishing communities and its relation to socioeconomic drivers of fishing effort is important if customary practices for conservation, such as tambu areas, are to be supported. The findings of this study call for a holistic approach to assessing the risks posed to reef fish aggregations by fishing

  11. The influence of fisher knowledge on the susceptibility of reef fish aggregations to fishing.

    PubMed

    Robinson, Jan; Cinner, Joshua E; Graham, Nicholas A J

    2014-01-01

    Reef fishes that exhibit predictable aggregating behaviour are often considered vulnerable to overexploitation. However, fisher knowledge of this behaviour is often heterogeneous and, coupled with socioeconomic factors that constrain demand for or access to aggregated fish, will influence susceptibility to fishing. At two case study locations in Papua New Guinea, Ahus and Karkar islands, we conducted interview-based surveys to examine how local context influenced heterogeneity in knowledge of fish aggregations. We then explored the role of fisher knowledge in conferring susceptibility to fishing relative to socioeconomic drivers of fishing effort. Local heterogeneity in knowledge of aggregating behaviour differed between our case studies. At Ahus, variable access rights among fishers and genders to the main habitats were sources of heterogeneity in knowledge. By contrast, knowledge was more homogenous at Karkar and the sole source of variation was gear type. Differences between locations in the susceptibility of aggregations to fishing depended primarily on socioeconomic drivers of fishing effort rather than catchability. While Ahus fishers were knowledgeable of fish aggregations and used more selective gears, Karkar fishers were less constrained by tenure in their access to aggregation habitat. However, fishing effort was greater at Ahus and likely related to high dependency on fishing, greater access to provincial capital markets than Karkar and a weakening of customary management. Moreover, highly efficient fishing techniques have emerged at Ahus to exploit the non-reproductive aggregating behaviour of target species. Understanding how knowledge is structured within fishing communities and its relation to socioeconomic drivers of fishing effort is important if customary practices for conservation, such as tambu areas, are to be supported. The findings of this study call for a holistic approach to assessing the risks posed to reef fish aggregations by fishing

  12. ODVBA: Optimally-Discriminative Voxel-Based Analysis

    PubMed Central

    Davatzikos, Christos

    2012-01-01

    Gaussian smoothing of images prior to applying voxel-based statistics is an important step in Voxel-Based Analysis and Statistical Parametric Mapping (VBA-SPM), and is used to account for registration errors, to Gaussianize the data, and to integrate imaging signals from a region around each voxel. However, it has also become a limitation of VBA-SPM based methods, since it is often chosen empirically and lacks spatial adaptivity to the shape and spatial extent of the region of interest, such as a region of atrophy or functional activity. In this paper, we propose a new framework, named Optimally-Discriminative Voxel-Based Analysis (ODVBA), for determining the optimal spatially adaptive smoothing of images, followed by applying voxel-based group analysis. In ODVBA, Nonnegative Discriminative Projection is applied regionally to get the direction that best discriminates between two groups, e.g., patients and controls; this direction is equivalent to local filtering by an optimal kernel whose coefficients define the optimally discriminative direction. By considering all the neighborhoods that contain a given voxel, we then compose this information to produce the statistic for each voxel. Finally, permutation tests are used to obtain a statistical parametric map of group differences. ODVBA has been evaluated using simulated data in which the ground truth is known and with data from an Alzheimer’s disease (AD) study. The experimental results have shown that the proposed ODVBA can precisely describe the shape and location of structural abnormality. PMID:21324774

  13. Impact of Species and Variety on Concentrations of Minor Lipophilic Bioactive Compounds in Oils Recovered from Plum Kernels.

    PubMed

    Górnaś, Paweł; Rudzińska, Magdalena; Raczyk, Marianna; Mišina, Inga; Soliven, Arianne; Lācis, Gunārs; Segliņa, Dalija

    2016-02-03

    The profile of bioactive compounds (carotenoids, tocopherols, tocotrienols, phytosterols, and squalene) in oils recovered from the kernels of 28 plum varieties of the hexaploid species Prunus domestica L. and the diploid plum Prunus cerasifera Ehrh. and their crossbreeds was studied. Oil yields in plum kernels of both P. cerasifera and P. domestica were in the wide ranges of 22.6-53.1 and 24.2-46.9% (w/w) dw, respectively. The contents of total tocochromanols, carotenoids, phytosterols, and squalene were significantly affected by the variety and ranged between 70.7 and 208.7 mg/100 g of oil, between 0.41 and 3.07 mg/100 g of oil, between 297.2 and 1569.6 mg/100 g of oil, and between 25.7 and 80.4 mg/100 g of oil, respectively. Regardless of the cultivar, β-sitosterol and γ-tocopherol were the main minor lipophilic compounds in plum kernel oils and constituted between 208.5 and 1258.7 mg/100 g of oil and between 60.5 and 182.0 mg/100 g of oil, respectively. Between the studied plum species, significant differences were recorded for δ-tocopherol (p = 0.007), 24-methylenecycloartanol (p = 0.038), and citrostadienol (p = 0.003), but they were insufficient for discrimination by PCA.

  14. Two dimensional discriminant neighborhood preserving embedding in face recognition

    NASA Astrophysics Data System (ADS)

    Pang, Meng; Jiang, Jifeng; Lin, Chuang; Wang, Binghui

    2015-03-01

    One of the key issues in face recognition is extracting the features of face images. In this paper, we propose a novel method, named two-dimensional discriminant neighborhood preserving embedding (2DDNPE), for image feature extraction and face recognition. 2DDNPE benefits from four techniques, i.e., neighborhood preserving embedding (NPE), locality preserving projection (LPP), image-based projection, and the Fisher criterion. Firstly, NPE and LPP are two popular manifold learning techniques which can optimally preserve the local geometric structures of the original samples from different angles. Secondly, image-based projection enables us to extract the optimal projection vectors directly from two-dimensional image matrices rather than from vectors, which avoids the small sample size problem and preserves useful structural information embedded in the original images. Finally, the Fisher criterion applied in 2DDNPE boosts face recognition rates by minimizing the within-class distance while maximizing the between-class distance. To evaluate the performance of 2DDNPE, several experiments are conducted on the ORL and Yale face datasets. The results confirm that 2DDNPE outperforms existing 1D feature extraction methods, such as NPE, LPP, LDA and PCA, across all experiments with respect to recognition rate and training time. 2DDNPE also delivers consistently promising results compared with other competing 2D methods such as 2DNPP, 2DLPP, 2DLDA and 2DPCA.
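
    For reference, the Fisher criterion invoked here is the usual scatter ratio, written below in its generic vector form (in 2DDNPE the projection acts directly on image matrices, but the criterion is of the same type):

      J(w) = \frac{w^{\top} S_B\, w}{w^{\top} S_W\, w}, \qquad
      S_B = \sum_{c} n_c (\mu_c - \mu)(\mu_c - \mu)^{\top}, \qquad
      S_W = \sum_{c} \sum_{x_i \in c} (x_i - \mu_c)(x_i - \mu_c)^{\top},

    so maximizing J simultaneously minimizes the within-class distance and maximizes the between-class distance, as stated above.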

  15. Inheritance of Kernel Color in Corn: Explanations and Investigations.

    ERIC Educational Resources Information Center

    Ford, Rosemary H.

    2000-01-01

    Offers a new perspective on traditional problems in genetics on kernel color in corn, including information about genetic regulation, metabolic pathways, and evolution of genes. (Contains 15 references.) (ASK)

  16. Nonlinear hyperspectral unmixing based on constrained multiple kernel NMF

    NASA Astrophysics Data System (ADS)

    Cui, Jiantao; Li, Xiaorun; Zhao, Liaoying

    2014-05-01

    Nonlinear spectral unmixing constitutes an important field of research for hyperspectral imagery. An unsupervised nonlinear spectral unmixing algorithm, namely multiple kernel constrained nonnegative matrix factorization (MKCNMF), is proposed by coupling multiple-kernel selection with kernel NMF. Additionally, a minimum endmember-wise distance constraint and an abundance smoothness constraint are introduced to alleviate the uniqueness problem of NMF in the algorithm. In MKCNMF, the two problems of optimizing the factor matrices and selecting the proper kernel are solved jointly. The performance of the proposed unmixing algorithm is evaluated via experiments based on synthetic and real hyperspectral data sets. The experimental results demonstrate that the proposed method outperforms some existing unmixing algorithms in terms of spectral angle distance (SAD) and abundance fractions.
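
    As general background (the paper's exact objective is not reproduced here), multiple-kernel methods of this kind combine base kernels as a constrained nonnegative mixture, while hyperspectral unmixing retains the usual abundance constraints:

      k(x, x') = \sum_{m=1}^{M} \beta_m\, k_m(x, x'), \qquad \beta_m \ge 0,\ \ \sum_{m} \beta_m = 1;
      \qquad a_i \ge 0,\ \ \sum_{i} a_i = 1,

    where the β_m weight the candidate kernels and a is the per-pixel abundance vector; the NMF-style factorization is then carried out in the feature space induced by the combined kernel.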

  17. Hash subgraph pairwise kernel for protein-protein interaction extraction.

    PubMed

    Zhang, Yijia; Lin, Hongfei; Yang, Zhihao; Wang, Jian; Li, Yanpeng

    2012-01-01

    Extracting protein-protein interactions (PPI) from the biomedical literature is an important task in biomedical text mining (BioTM). In this paper, we propose a hash subgraph pairwise (HSP) kernel-based approach for this task. The key to the novel kernel is using hierarchical hash labels to express the structural information of subgraphs in linear time. We apply the graph kernel to dependency graphs representing sentence structure for the protein-protein interaction extraction task; this efficiently makes use of the full graph structure and, in particular, captures contiguous topological and label information that earlier approaches ignored. We evaluate the proposed approach on five publicly available PPI corpora. The experimental results show that our approach significantly outperforms the all-path kernel approach on all five corpora and achieves state-of-the-art performance.
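
    A toy sketch in the spirit of iterative hash relabeling over a dependency graph (the published HSP kernel's exact construction and complexity guarantees are not reproduced here): each node's label is repeatedly replaced by a hash of its own label together with its sorted neighbour labels, and the kernel value counts the hashed labels common to two graphs. The graph encoding and labels below are illustrative.

      from collections import Counter

      def hash_labels(adj, labels, iterations=2):
          # adj: node -> list of neighbours; labels: node -> initial label (e.g. token).
          # Each round replaces a node's label with a hash of (own label, sorted
          # neighbour labels); labels from every level are collected in a multiset.
          seen = Counter(labels.values())
          current = dict(labels)
          for _ in range(iterations):
              current = {
                  n: hash((current[n], tuple(sorted(current[m] for m in adj[n]))))
                  for n in adj
              }
              seen.update(current.values())
          return seen

      def subgraph_hash_kernel(graph_a, graph_b, iterations=2):
          # Kernel value = size of the multiset intersection of hashed labels.
          c_a = hash_labels(*graph_a, iterations=iterations)
          c_b = hash_labels(*graph_b, iterations=iterations)
          return sum((c_a & c_b).values())

      # Tiny dependency-graph example (nodes: words, edges: dependency links).
      g_a = ({0: [1], 1: [0, 2], 2: [1]}, {0: "ProtA", 1: "binds", 2: "ProtB"})
      g_b = ({0: [1], 1: [0, 2], 2: [1]}, {0: "ProtA", 1: "inhibits", 2: "ProtB"})
      print(subgraph_hash_kernel(g_a, g_b))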

  18. On the asymptotic expansion of the Bergman kernel

    NASA Astrophysics Data System (ADS)

    Seto, Shoo

    Let (L, h) → (M, ω) be a polarized Kähler manifold. We define the Bergman kernel for H^0(M, L^k), the space of holomorphic sections of high tensor powers of the line bundle L. In this thesis, we study the asymptotic expansion of the Bergman kernel in the on-diagonal, near-diagonal, and far off-diagonal regimes, using L^2 estimates to establish the existence of the asymptotic expansion and to compute its coefficients in the on- and near-diagonal cases, and a heat kernel approach to show the exponential decay of the off-diagonal Bergman kernel for noncompact manifolds, assuming only a lower bound on the Ricci curvature and C^2 regularity of the metric.
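
    For orientation (standard definitions; normalization conventions vary between references): if {s_i} is an orthonormal basis of H^0(M, L^k) for the L² inner product induced by h^k and the volume form of the polarization metric, the on-diagonal Bergman kernel and its expansion read

      B_k(x) = \sum_i \| s_i(x) \|^{2}_{h^{k}}, \qquad
      B_k(x) \sim k^{n}\Big(1 + \frac{a_1(x)}{k} + \frac{a_2(x)}{k^{2}} + \cdots\Big), \quad k \to \infty,

    where n = dim_C M and, up to the chosen normalization, the first coefficient a_1 is one half of the scalar curvature of the polarization metric.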

  19. Kernel-based Linux emulation for Plan 9.

    SciTech Connect

    Minnich, Ronald G.

    2010-09-01

    CNKemu is a kernel-based system for the 9k variant of the Plan 9 kernel. It is designed to provide transparent binary support for programs compiled for IBM's Compute Node Kernel (CNK) on the Blue Gene series of supercomputers. This support allows users to build applications with the standard Blue Gene toolchain, including C++ and Fortran compilers. While the CNK is not Linux, IBM designed the CNK so that the user interface has much in common with the Linux 2.0 system call interface. The Plan 9 CNK emulator hence provides the foundation of kernel-based Linux system call support on Plan 9. In this paper we discuss cnkemu's implementation and some of its more interesting features, such as the ability to easily intermix Plan 9 and Linux system calls.

  20. Resummed memory kernels in generalized system-bath master equations.

    PubMed

    Mavros, Michael G; Van Voorhis, Troy

    2014-08-07

    Generalized master equations provide a concise formalism for studying reduced population dynamics. Usually, these master equations require a perturbative expansion of the memory kernels governing the dynamics; in order to prevent divergences, these expansions must be resummed. Resummation techniques for perturbation series are ubiquitous in physics, but they have not been readily studied for the time-dependent memory kernels used in generalized master equations. In this paper, we present a comparison of different resummation techniques for such memory kernels up to fourth order. We study specifically the spin-boson Hamiltonian as a model system-bath Hamiltonian, treating the diabatic coupling between the two states as a perturbation. A novel derivation of the fourth-order memory kernel for the spin-boson problem is presented; then, the second- and fourth-order kernels are evaluated numerically for a variety of spin-boson parameter regimes. We find that resumming the kernels through fourth order using a Padé approximant results in divergent populations in the strong electronic coupling regime, due to a singularity introduced by the nature of the resummation, and we thus recommend a non-divergent exponential resummation (the "Landau-Zener resummation" of previous work). The inclusion of fourth-order effects in a Landau-Zener-resummed kernel is shown to improve both the dephasing rate and the obedience of detailed balance over simpler prescriptions like the non-interacting blip approximation, showing relatively quick convergence on the exact answer. The results suggest that including higher-order contributions to the memory kernel of a generalized master equation and performing an appropriate resummation can provide a numerically exact solution to system-bath dynamics for a general spectral density, opening the way to a new class of methods for treating system-bath dynamics.
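
    Schematically (the precise time- or Laplace-domain treatment follows the paper, not this note, and signs depend on convention), the objects involved are the generalized master equation and the perturbative kernel series in the diabatic coupling λ,

      \dot{P}(t) = \int_0^{t} \mathcal{K}(t-\tau)\,P(\tau)\,d\tau, \qquad
      \mathcal{K} = \lambda^{2}\mathcal{K}^{(2)} + \lambda^{4}\mathcal{K}^{(4)} + \cdots .

    A [1,1] Padé resummation of the two known orders, and an exponential form matching the same orders, can be written as

      \mathcal{K}_{\mathrm{Pade}} \approx \frac{\lambda^{2}\mathcal{K}^{(2)}}{1 - \lambda^{2}\,\mathcal{K}^{(4)}/\mathcal{K}^{(2)}},
      \qquad
      \mathcal{K}_{\mathrm{exp}} \approx \lambda^{2}\mathcal{K}^{(2)}\exp\!\Big(\lambda^{2}\,\frac{\mathcal{K}^{(4)}}{\mathcal{K}^{(2)}}\Big);

    both reproduce the second- and fourth-order terms when expanded, but the Padé form can develop a spurious pole at strong coupling (the divergence noted above), whereas the exponential-type form does not, consistent with the abstract's preference for the Landau-Zener-style resummation.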