Science.gov

Sample records for kernel fisher discriminant

  1. The use of kernel local Fisher discriminant analysis for the channelization of the Hotelling model observer

    NASA Astrophysics Data System (ADS)

    Wen, Gezheng; Markey, Mia K.

    2015-03-01

    It is resource-intensive to conduct human studies for task-based assessment of medical image quality and system optimization. Thus, numerical model observers have been developed as a surrogate for human observers. The Hotelling observer (HO) is the optimal linear observer for signal-detection tasks, but the high dimensionality of imaging data results in a heavy computational burden. Channelization is often used to approximate the HO through a dimensionality reduction step, but how to produce channelized images without losing significant image information remains a key challenge. Kernel local Fisher discriminant analysis (KLFDA) uses kernel techniques to perform supervised dimensionality reduction, which finds an embedding transformation that maximizes between-class separability and preserves within-class local structure in the low-dimensional manifold. It is powerful for classification tasks, especially when the distribution of a class is multimodal. Such multimodality could be observed in many practical clinical tasks. For example, primary and metastatic lesions may both appear in medical imaging studies, but the distributions of their typical characteristics (e.g., size) may be very different. In this study, we propose to use KLFDA as a novel channelization method. The dimension of the embedded manifold (i.e., the result of KLFDA) is a counterpart to the number of channels in the state-of-the-art linear channelization. We present a simulation study to demonstrate the potential usefulness of KLFDA for building the channelized HOs (CHOs) and generating reliable decision statistics for clinical tasks. We show that the performance of the CHO with KLFDA channels is comparable to that of the benchmark CHOs.
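
    As a rough illustration of the observer that any channelization step (KLFDA or otherwise) feeds, the sketch below builds a channelized Hotelling template from channelized training images. The channel matrix U, the image sizes and the random data are illustrative placeholders, not the paper's simulation setup.

```python
# Minimal channelized Hotelling observer (CHO) sketch, assuming a precomputed
# channel matrix U (n_pixels x n_channels), e.g. the output of a
# dimensionality-reduction step such as KLFDA. All data here are synthetic.
import numpy as np

def cho_template(signal_imgs, background_imgs, U):
    """signal_imgs, background_imgs: (n_images, n_pixels); U: (n_pixels, n_channels)."""
    v_sig = signal_imgs @ U            # channelized signal-present images
    v_bkg = background_imgs @ U        # channelized signal-absent images
    dv = v_sig.mean(axis=0) - v_bkg.mean(axis=0)
    S = 0.5 * (np.cov(v_sig, rowvar=False) + np.cov(v_bkg, rowvar=False))
    return np.linalg.solve(S, dv)      # Hotelling template in channel space

def decision_statistic(img, U, w):
    return (img @ U) @ w               # scalar test statistic for one image

rng = np.random.default_rng(0)
U = rng.standard_normal((64 * 64, 50))                 # 50 hypothetical channels
sig = rng.standard_normal((200, 64 * 64)) + 0.1        # signal-present training images
bkg = rng.standard_normal((200, 64 * 64))              # signal-absent training images
w = cho_template(sig, bkg, U)
print(decision_statistic(sig[0], U, w))
```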

  2. Fault diagnosis of nonlinear and large-scale processes using novel modified kernel Fisher discriminant analysis approach

    NASA Astrophysics Data System (ADS)

    Shi, Huaitao; Liu, Jianchang; Wu, Yuhou; Zhang, Ke; Zhang, Lixiu; Xue, Peng

    2016-04-01

    Timely and accurate fault diagnosis is essential for improving the dependability of industrial processes. In this study, fault diagnosis of nonlinear and large-scale processes by variable-weighted kernel Fisher discriminant analysis (KFDA) based on improved biogeography-based optimisation (IBBO) is proposed, referred to as IBBO-KFDA, where IBBO is used to determine the parameters of variable-weighted KFDA, and variable-weighted KFDA is used to solve the multi-classification overlapping problem. The main contributions of this work, which further improve the performance of KFDA for fault diagnosis, are four-fold. First, a nonlinear fault diagnosis approach with variable-weighted KFDA is developed for maximising the separation between overlapping fault samples. Second, kernel parameters and feature selection of variable-weighted KFDA are simultaneously optimised using IBBO. Third, a single fitness function that combines the erroneous diagnosis rate with feature cost is created and serves as the target function in the optimisation problem, and a novel mixed kernel function is introduced to improve the classification capability in the feature space and the diagnosis accuracy of IBBO-KFDA. Finally, an IBBO approach is developed to obtain better solution quality and faster convergence. The proposed IBBO-KFDA method is first used on Tennessee Eastman process benchmark data sets to validate its feasibility and efficiency, and is then applied to diagnose faults in an automatic gauge control system. Simulation results demonstrate that IBBO-KFDA can obtain better kernel parameters and feature vectors with a lower computing cost, higher diagnosis accuracy and better real-time capacity.
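
    The abstract above mentions a mixed kernel function; one common construction, used here purely as an assumed example, is a convex combination of a local RBF kernel and a global polynomial kernel, with the mixing weight and kernel parameters being exactly the kind of quantities an optimizer such as IBBO would tune. The exact form used by IBBO-KFDA is not reproduced.

```python
# Illustrative mixed kernel: convex combination of an RBF (local) and a
# polynomial (global) kernel. lam, gamma and degree are assumed values, not
# the paper's; they would normally be chosen by an optimizer.
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.svm import SVC

def mixed_kernel(X, Y=None, lam=0.7, gamma=0.1, degree=2):
    Y = X if Y is None else Y
    return lam * rbf_kernel(X, Y, gamma=gamma) + (1.0 - lam) * polynomial_kernel(X, Y, degree=degree)

# one way to plug such a kernel into a classifier is a precomputed-kernel SVM
X = np.random.default_rng(1).standard_normal((60, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
clf = SVC(kernel="precomputed").fit(mixed_kernel(X), y)
print(clf.score(mixed_kernel(X), y))
```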

  3. Volcano clustering determination: Bivariate Gauss vs. Fisher kernels

    NASA Astrophysics Data System (ADS)

    Cañón-Tapia, Edgardo

    2013-05-01

    Underlying many studies of volcano clustering is the implicit assumption that vent distribution can be studied by using kernels originally devised for distributions on plane surfaces. Nevertheless, an important change in topology in the volcanic context is related to the distortion that is introduced when attempting to represent features found on the surface of a sphere by projecting them onto a plane. This work explores the extent to which different topologies of the kernel used to study the spatial distribution of vents can introduce significant changes in the obtained density functions. To this end, a planar (Gauss) kernel and a spherical (Fisher) kernel are compared. The role of the smoothing factor in these two kernels is also explored in some detail. The results indicate that the topology of the kernel is not extremely influential, and that either type of kernel can be used to characterize a planar or a spherical distribution with exactly the same detail (provided that a suitable smoothing factor is selected in each case). It is also shown that there is a limitation on the resolution of the Fisher kernel relative to the typical separation between data that can be accurately described, because data sets with separations lower than 500 km are considered as a single cluster by this method. In contrast, the Gauss kernel can provide adequate resolution for vent distributions over a wider range of separations. In addition, this study also shows that the numerical value of the smoothing factor (or bandwidth) of both the Gauss and Fisher kernels has neither a unique nor a direct relationship with the relevant separation among data. In order to establish the relevant distance, it is necessary to take into consideration the value of the respective smoothing factor together with a level of statistical significance at which the contributions to the probability density function will be analyzed. Based on such a reference level, it is possible to create a hierarchy of
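
    A minimal sketch of the two kernel types being contrasted: a planar Gaussian kernel evaluated on projected (x, y) coordinates and a spherical Fisher (von Mises-Fisher) kernel evaluated on unit vectors built from longitude and latitude. The vent locations, bandwidth h and concentration kappa are illustrative values only.

```python
# Planar Gaussian vs spherical Fisher kernel density estimates (sketch).
import numpy as np

def gauss_density(points_xy, grid_xy, h):
    # isotropic 2-D Gaussian kernel on the plane (coordinates in km)
    d2 = ((grid_xy[:, None, :] - points_xy[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / h**2).sum(axis=1) / (2 * np.pi * h**2 * len(points_xy))

def fisher_density(points_lonlat, grid_lonlat, kappa):
    # Fisher kernel on the unit sphere: f(x) proportional to exp(kappa * mu . x)
    def to_xyz(lonlat):
        lon, lat = np.radians(lonlat).T
        return np.stack([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)], axis=1)
    P, G = to_xyz(points_lonlat), to_xyz(grid_lonlat)
    c = kappa / (4 * np.pi * np.sinh(kappa))       # normalizing constant
    return c * np.exp(kappa * (G @ P.T)).mean(axis=1)

vents_lonlat = np.array([[-102.3, 19.5], [-102.1, 19.6], [-101.9, 19.4]])   # degrees
vents_km = np.array([[0.0, 0.0], [22.0, 11.0], [43.0, -11.0]])              # projected
print(fisher_density(vents_lonlat, np.array([[-102.0, 19.5]]), kappa=100.0))
print(gauss_density(vents_km, np.array([[20.0, 0.0]]), h=15.0))
```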

  4. Kernel PLS-SVC for Linear and Nonlinear Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Trejo, Leonard J.; Matthews, Bryan

    2003-01-01

    A new methodology for discrimination is proposed. It is based on kernel orthonormalized partial least squares (PLS) dimensionality reduction of the original data space, followed by support vector machines for classification. The close connection between orthonormalized PLS and Fisher's approach to linear discrimination, or equivalently canonical correlation analysis, is described; this motivates preferring orthonormalized PLS over principal component analysis. The good behavior of the proposed method is demonstrated on 13 different benchmark data sets and on the real-world problem of classifying finger-movement periods versus non-movement periods based on the electroencephalogram.

  5. Emotion Recognition from Single-Trial EEG Based on Kernel Fisher's Emotion Pattern and Imbalanced Quasiconformal Kernel Support Vector Machine

    PubMed Central

    Liu, Yi-Hung; Wu, Chien-Te; Cheng, Wei-Teng; Hsiao, Yu-Tsung; Chen, Po-Ming; Teng, Jyh-Tong

    2014-01-01

    Electroencephalogram-based emotion recognition (EEG-ER) has received increasing attention in the fields of health care, affective computing, and brain-computer interface (BCI). However, satisfactory ER performance within a bi-dimensional and non-discrete emotional space using single-trial EEG data remains a challenging task. To address this issue, we propose a three-layer scheme for single-trial EEG-ER. In the first layer, a set of spectral powers of different EEG frequency bands are extracted from multi-channel single-trial EEG signals. In the second layer, the kernel Fisher's discriminant analysis method is applied to further extract features with better discrimination ability from the EEG spectral powers. The feature vector produced by layer 2 is called a kernel Fisher's emotion pattern (KFEP), and is sent into layer 3 for further classification, where the proposed imbalanced quasiconformal kernel support vector machine (IQK-SVM) serves as the emotion classifier. The outputs of the three-layer EEG-ER system include labels of emotional valence and arousal. Furthermore, to collect effective training and testing datasets for the current EEG-ER system, we also use an emotion-induction paradigm in which a set of pictures selected from the International Affective Picture System (IAPS) are employed as emotion-induction stimuli. The performance of the proposed three-layer solution is compared with that of other EEG spectral power-based features and emotion classifiers. Results on 10 healthy participants indicate that the proposed KFEP feature performs better than other spectral power features, and IQK-SVM outperforms traditional SVM in terms of the EEG-ER accuracy. Our findings also show that the proposed EEG-ER scheme achieves the highest classification accuracies of valence (82.68%) and arousal (84.79%) among all testing methods. PMID:25061837

  6. Optimal Fisher Discriminant Ratio for an Arbitrary Spatial Light Modulator

    NASA Technical Reports Server (NTRS)

    Juday, Richard D.

    1999-01-01

    Optimizing the Fisher ratio is well established in statistical pattern recognition as a means of discriminating between classes. I show how to optimize that ratio for optical correlation intensity by choice of filter on an arbitrary spatial light modulator (SLM). I include the case of additive noise of known power spectral density.
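
    For reference, the scalar two-class Fisher discriminant ratio being optimized has the generic form J = (m1 - m2)^2 / (s1^2 + s2^2); the helper below simply evaluates it for two samples of a scalar statistic (e.g., correlation-peak intensities). The optical filter optimization on the SLM itself is not reproduced here.

```python
# Generic scalar Fisher discriminant ratio for two classes of a 1-D statistic.
import numpy as np

def fisher_ratio(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return (a.mean() - b.mean()) ** 2 / (a.var(ddof=1) + b.var(ddof=1))

# toy values standing in for correlation intensities of two classes
print(fisher_ratio([9.1, 8.7, 9.4, 9.0], [4.2, 5.1, 4.8, 4.5]))
```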

  7. Kernel Partial Least Squares for Nonlinear Regression and Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper summarizes recent results on applying the method of partial least squares (PLS) in a reproducing kernel Hilbert space (RKHS). A previously proposed kernel PLS regression model was proven to be competitive with other regularized regression methods in RKHS. The family of nonlinear kernel-based PLS models is extended by considering the kernel PLS method for discrimination. Theoretical and experimental results on a two-class discrimination problem indicate the usefulness of the method.

  8. Microcalcifications detection using Fisher's linear discriminant and breast density.

    PubMed

    Rodriguez, G A; Gonzalez, J A; Altamirano, L; Guichard, J S; Diaz, R

    2011-01-01

    Breast cancer is one of the main causes of death in women. However, its early detection through microcalcification identification is a powerful tool to save many lives. In this study, we present a supervised microcalcification detection method based on Fisher's Linear Discriminant. Our method considers knowledge about breast density, allowing it to identify microcalcifications even in difficult cases (when there is little contrast between the microcalcification and the surrounding breast tissue). We evaluated our method with two mammogram databases for each of its phases: breast density classification, microcalcification segmentation, and false-positive reduction, obtaining cumulative accuracy results of around 90% for the microcalcification detection task.

  9. Selection of principal components based on Fisher discriminant ratio

    NASA Astrophysics Data System (ADS)

    Zeng, Xiangyan; Naghedolfeizi, Masoud; Arora, Sanjeev; Yousif, Nabil; Aberra, Dawit

    2016-05-01

    Principal component analysis transforms a set of possibly correlated variables into uncorrelated variables, and is widely used as a technique for dimensionality reduction and feature extraction. In some applications of dimensionality reduction, the objective is to use a small number of principal components to represent most of the variation in the data. The main purpose of feature extraction, on the other hand, is to facilitate subsequent pattern recognition and machine learning tasks, such as classification. Selecting principal components for classification therefore aims at more than dimensionality reduction: the capability of distinguishing different classes is another major concern, and components with larger eigenvalues do not necessarily have better distinguishing capabilities. In this paper, we investigate a strategy of selecting principal components based on the Fisher discriminant ratio. The ratio of between-class variance to within-class variance is calculated for each component, and the principal components are selected accordingly, with the objective of retaining those with large Fisher discriminant ratios so that adequate class separability is obtained. To alleviate the overfitting that is common when few training data are available, the number of selected components is determined by the classification accuracy on validation data in a cross-validation procedure. The selection method is evaluated by face recognition experiments.
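
    A compact sketch of the selection strategy described above, using assumed stand-ins (the scikit-learn digits data and a k-NN classifier): compute principal components, rank them by their Fisher discriminant ratio, and pick how many to keep by cross-validated accuracy.

```python
# Rank PCA components by Fisher discriminant ratio, choose the count by CV.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fisher_ratios(Z, y):
    overall = Z.mean(axis=0)
    ratios = []
    for j in range(Z.shape[1]):
        between = sum((y == c).sum() * (Z[y == c, j].mean() - overall[j]) ** 2 for c in np.unique(y))
        within = sum(((Z[y == c, j] - Z[y == c, j].mean()) ** 2).sum() for c in np.unique(y))
        ratios.append(between / within)
    return np.array(ratios)

X, y = load_digits(return_X_y=True)
Z = PCA(n_components=30, random_state=0).fit_transform(X)
order = np.argsort(fisher_ratios(Z, y))[::-1]        # components by decreasing Fisher ratio

best_k, best_acc = 1, -1.0
for k in range(1, Z.shape[1] + 1):
    acc = cross_val_score(KNeighborsClassifier(), Z[:, order[:k]], y, cv=5).mean()
    if acc > best_acc:
        best_k, best_acc = k, acc
print(best_k, round(best_acc, 3))
```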

  10. An intelligent fault diagnosis method of rolling bearings based on regularized kernel Marginal Fisher analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Li; Shi, Tielin; Xuan, Jianping

    2012-05-01

    Generally, the vibration signals of faulty bearings are non-stationary and highly nonlinear under complicated operating conditions. Thus, it is a significant challenge to extract optimal features that improve classification while simultaneously reducing the feature dimension. Kernel Marginal Fisher analysis (KMFA) is a novel supervised manifold learning algorithm for feature extraction and dimensionality reduction. In order to avoid the small sample size problem in KMFA, we propose regularized KMFA (RKMFA). A simple and efficient intelligent fault diagnosis method based on RKMFA is put forward and applied to fault recognition of rolling bearings. So as to directly extract nonlinear features from the original high-dimensional vibration signals, RKMFA constructs two graphs describing the intra-class compactness and the inter-class separability, by combining a traditional manifold learning algorithm with the Fisher criterion. The optimal low-dimensional features are thereby obtained for better classification and finally fed into a simple K-nearest neighbor (KNN) classifier to recognize different fault categories of bearings. The experimental results demonstrate that the proposed approach improves the fault classification performance and outperforms the other conventional approaches.

  11. Fisher kernel based task boundary retrieval in laparoscopic database with single video query.

    PubMed

    Twinanda, Andru Putra; De Mathelin, Michel; Padoy, Nicolas

    2014-01-01

    As minimally invasive surgery becomes increasingly popular, the volume of recorded laparoscopic videos will increase rapidly. Invaluable information for teaching, assistance during difficult cases, and quality evaluation can be accessed from these videos through a video search engine. Typically, video search engines give a list of the most relevant videos pertaining to a keyword. However, instead of a whole video, one is often only interested in a fraction of the video (e.g. intestine stitching in bypass surgeries). In addition, video search requires semantic tags, yet the large amount of data typically generated hinders the feasibility of manual annotation. To tackle these problems, we propose a coarse-to-fine video indexing approach that looks for the time boundaries of a task in a laparoscopic video based on a video snippet query. We combine our search approach with the Fisher kernel (FK) encoding and show that similarity measures on this encoding are better suited for this problem than traditional similarities, such as dynamic time warping (DTW). Despite visual challenges, such as the presence of smoke, motion blur, and lens impurity, our approach performs very well in finding 3 tasks in 49 bypass videos, 1 task in 23 hernia videos, and also 1 cross-surgery task between 49 bypass and 7 sleeve gastrectomy videos. PMID:25320826

  13. Discrimination of healthy and osteoarthritic articular cartilage by Fourier transform infrared imaging and Fisher's discriminant analysis.

    PubMed

    Mao, Zhi-Hua; Yin, Jian-Hua; Zhang, Xue-Xi; Wang, Xiao; Xia, Yang

    2016-02-01

    The Fourier transform infrared spectroscopic imaging (FTIRI) technique can be used to obtain quantitative information on the content and spatial distribution of the principal components in cartilage when combined with chemometric methods. In this study, FTIRI combined with principal component analysis (PCA) and Fisher's discriminant analysis (FDA) was applied to identify healthy and osteoarthritic (OA) articular cartilage samples. Ten 10-μm thick sections of canine cartilage were imaged at 6.25 μm/pixel in FTIRI. The infrared spectra extracted from the FTIR images were imported into SPSS software for PCA and FDA. Based on the PCA result with 2 principal components, the healthy and OA cartilage samples were effectively discriminated by the FDA, with a high accuracy of 94% for the initial samples (training set) and cross validation, as well as 86.67% for the prediction group. The study also showed that cartilage degeneration gradually weakened with increasing depth. FTIRI combined with chemometrics may become an effective method for distinguishing healthy and OA cartilage in the future. PMID:26977354

  14. Multilevel image recognition using discriminative patches and kernel covariance descriptor

    NASA Astrophysics Data System (ADS)

    Lu, Le; Yao, Jianhua; Turkbey, Evrim; Summers, Ronald M.

    2014-03-01

    Computer-aided diagnosis of medical images has emerged as an important tool to objectively improve the performance, accuracy and consistency of clinical workflow. Computerizing the medical image diagnostic recognition problem involves three fundamental questions: where to look (i.e., where is the region of interest in the whole image/volume), image feature description/encoding, and similarity metrics for classification or matching. In this paper, we exploit the motivation, implementation and performance evaluation of task-driven iterative, discriminative image patch mining; a covariance matrix based descriptor built from intensity, gradient and spatial layout; and a log-Euclidean distance kernel for support vector machines, to address these three aspects respectively. To cope with the often visually ambiguous image patterns of the region of interest in medical diagnosis, discovery of multilabel selective discriminative patches is desired. The covariance of several image statistics summarizes their second-order interactions within an image patch and proves to be an effective image descriptor, with low dimensionality compared with joint statistics and fast computation regardless of the patch size. We extensively evaluate two extended Gaussian kernels using the affine-invariant Riemannian metric or the log-Euclidean metric with support vector machines (SVM), on two medical image classification problems: degenerative disc disease (DDD) detection on cortical shell unwrapped CT maps and colitis detection on CT key images. The proposed approach is validated with promising quantitative results on these challenging tasks. Our experimental findings and discussion also unveil some interesting insights on the covariance feature composition with or without spatial layout for classification and retrieval, and on different kernel constructions for SVM. This should also shed some light on future work using covariance features and kernel classification for medical image analysis.

  15. A Gabor-Block-Based Kernel Discriminative Common Vector Approach Using Cosine Kernels for Human Face Recognition

    PubMed Central

    Kar, Arindam; Bhattacharjee, Debotosh; Basu, Dipak Kumar; Nasipuri, Mita; Kundu, Mahantapas

    2012-01-01

    In this paper, a nonlinear Gabor Wavelet Transform (GWT) discriminant feature extraction approach for enhanced face recognition is proposed. Firstly, the low-energized blocks from Gabor wavelet transformed images are extracted. Secondly, the nonlinear discriminating features are analyzed and extracted from the selected low-energized blocks by the generalized Kernel Discriminative Common Vector (KDCV) method. The KDCV method is extended to include a cosine kernel function in the discriminating method. The KDCV with the cosine kernel is then applied to the extracted low-energized discriminating feature vectors to obtain the real component of a complex quantity for face recognition. In order to derive positive kernel discriminative vectors, we apply only those kernel discriminative eigenvectors that are associated with nonzero eigenvalues. The feasibility of the low-energized Gabor-block-based generalized KDCV method with cosine kernel function models has been successfully tested for classification using the L1 and L2 distance measures and the cosine similarity measure on both frontal and pose-angled face recognition. Experimental results on the FRAV2D and the FERET databases demonstrate the effectiveness of this new approach. PMID:23365559

  16. Feature selection of fMRI data based on normalized mutual information and fisher discriminant ratio.

    PubMed

    Wang, Yanbin; Ji, Junzhong; Liang, Peipeng

    2016-03-17

    Pattern classification has been increasingly used in functional magnetic resonance imaging (fMRI) data analysis. However, classification performance is restricted by the high dimensionality and the noise of fMRI data. In this paper, a new feature selection method (named "NMI-F") is proposed by sequentially combining normalized mutual information (NMI) and the Fisher discriminant ratio. In NMI-F, normalized mutual information is first used to evaluate the relationships between features, and the Fisher discriminant ratio is then applied to calculate the importance of each feature involved. Two fMRI datasets (task-related and resting state) were used to test the proposed method. It was found that classification based on the NMI-F method could differentiate brain cognitive and disease states effectively, and that the proposed NMI-F method was superior to the other related methods. The current results also have implications for future studies. PMID:27257882
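
    A rough sketch in the spirit of NMI-F, with the caveat that the paper's exact combination rule is not reproduced: features are ranked by their Fisher discriminant ratio and greedily retained only if their normalized mutual information with already-selected features stays below a redundancy threshold. The threshold, bin count and synthetic data are assumptions.

```python
# Greedy feature selection combining a Fisher ratio ranking with an NMI
# redundancy check (illustrative variant, binary labels assumed).
import numpy as np
from sklearn.metrics import normalized_mutual_info_score

def fisher_ratio(x, y):
    c0, c1 = x[y == 0], x[y == 1]
    return (c0.mean() - c1.mean()) ** 2 / (c0.var() + c1.var() + 1e-12)

def nmi_f_select(X, y, n_keep=10, nmi_max=0.5, bins=8):
    # discretize each feature so NMI between features is well defined
    disc = np.stack([np.digitize(x, np.histogram(x, bins)[1][1:-1]) for x in X.T], axis=1)
    order = np.argsort([fisher_ratio(X[:, j], y) for j in range(X.shape[1])])[::-1]
    selected = []
    for j in order:
        if all(normalized_mutual_info_score(disc[:, j], disc[:, k]) < nmi_max for k in selected):
            selected.append(j)
        if len(selected) == n_keep:
            break
    return selected

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 40))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
print(nmi_f_select(X, y, n_keep=5))
```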

  17. Facial expression recognition using local binary patterns and discriminant kernel locally linear embedding

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaoming; Zhang, Shiqing

    2012-12-01

    Given the nonlinear manifold structure of facial images, a new kernel-based supervised manifold learning algorithm based on locally linear embedding (LLE), called discriminant kernel locally linear embedding (DKLLE), is proposed for facial expression recognition. The proposed DKLLE aims to nonlinearly extract the discriminant information by maximizing the interclass scatter while minimizing the intraclass scatter in a reproducing kernel Hilbert space. DKLLE is compared with LLE, supervised locally linear embedding (SLLE), principal component analysis (PCA), linear discriminant analysis (LDA), kernel principal component analysis (KPCA), and kernel linear discriminant analysis (KLDA). Experimental results on two benchmark facial expression databases, i.e., the JAFFE database and the Cohn-Kanade database, demonstrate the effectiveness and promising performance of DKLLE.

  18. Gabor feature based classification using the enhanced fisher linear discriminant model for face recognition.

    PubMed

    Liu, Chengjun; Wechsler, Harry

    2002-01-01

    This paper introduces a novel Gabor-Fisher classifier (GFC) for face recognition. The GFC method, which is robust to changes in illumination and facial expression, applies the enhanced Fisher linear discriminant model (EFM) to an augmented Gabor feature vector derived from the Gabor wavelet representation of face images. The novelty of this paper comes from 1) the derivation of an augmented Gabor feature vector, whose dimensionality is further reduced using the EFM by considering both data compression and recognition (generalization) performance; 2) the development of a Gabor-Fisher classifier for multi-class problems; and 3) extensive performance evaluation studies. In particular, we performed comparative studies of different similarity measures applied to various classifiers. We also performed comparative experimental studies of various face recognition schemes, including our novel GFC method, the Gabor wavelet method, the eigenfaces method, the Fisherfaces method, the EFM method, the combination of Gabor and the eigenfaces method, and the combination of Gabor and the Fisherfaces method. The feasibility of the new GFC method has been successfully tested on face recognition using 600 FERET frontal face images corresponding to 200 subjects, which were acquired under variable illumination and facial expressions. The novel GFC method achieves 100% accuracy on face recognition using only 62 features. PMID:18244647

  19. Sparse dimensionality reduction of hyperspectral image based on semi-supervised local Fisher discriminant analysis

    NASA Astrophysics Data System (ADS)

    Shao, Zhenfeng; Zhang, Lei

    2014-09-01

    This paper presents a novel sparse dimensionality reduction method for hyperspectral imagery based on semi-supervised local Fisher discriminant analysis (SELF). The proposed method is designed to be especially effective for the out-of-sample extrapolation problem, realizing advantageous complementarities between SELF and sparsity preserving projections (SPP). Compared to SELF and SPP alone, the proposed method offers highly discriminative ability and produces an explicit nonlinear feature mapping for out-of-sample extrapolation. This is because the method obtains an explicit feature mapping for dimensionality reduction, which in turn improves the classification performance of subsequent classifiers. Experimental analysis of the sparsity and efficacy of the low-dimensional outputs shows that sparse dimensionality reduction based on SELF can yield good classification results and interpretability in the field of hyperspectral remote sensing.

  1. Discriminating Between Different Streamflow Regimes by Using the Fisher-Shannon Method: An Application to the Colombia Rivers

    NASA Astrophysics Data System (ADS)

    Pierini, Jorge O.; Restrepo, Juan C.; Lovallo, Michele; Telesca, Luciano

    2015-04-01

    The Fisher-Shannon (FS) information plane, defined by the Fisher information measure (FIM) and the Shannon entropy power (NX), was robustly used to investigate the complex dynamics of eight monthly streamflow time series in Colombia. In the FS plane the streamflow series seem to aggregate into two different clusters corresponding to two different climatological regimes in Colombia. Our findings suggest the use of the statistical quantity defined by the FS information plane as a tool to discriminate among different hydrological regimes.
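
    The two coordinates of the FS plane can be estimated from a univariate series via a kernel density fit: the Fisher information measure FIM = ∫ f'(x)^2 / f(x) dx and the Shannon entropy power N_X = exp(2 H_X) / (2 pi e). The sketch below uses a synthetic gamma-distributed series as a stand-in for a monthly streamflow record.

```python
# Estimate the Fisher information measure and Shannon entropy power of a
# 1-D series from a Gaussian kernel density fit (synthetic data).
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import gaussian_kde

def fisher_shannon(series, n_grid=2048):
    kde = gaussian_kde(series)
    x = np.linspace(series.min() - 3 * series.std(), series.max() + 3 * series.std(), n_grid)
    f = np.clip(kde(x), 1e-300, None)
    df = np.gradient(f, x)
    fim = trapezoid(df**2 / f, x)                       # Fisher information measure
    entropy = -trapezoid(f * np.log(f), x)              # differential entropy H_X
    nx = np.exp(2 * entropy) / (2 * np.pi * np.e)       # Shannon entropy power N_X
    return fim, nx

flow = np.random.default_rng(2).gamma(shape=2.0, scale=50.0, size=240)   # 20 years of monthly values
print(fisher_shannon(flow))
```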

  2. Fisher's linear discriminant ratio based threshold for moving human detection in thermal video

    NASA Astrophysics Data System (ADS)

    Sharma, Lavanya; Yadav, Dileep Kumar; Singh, Annapurna

    2016-09-01

    In video surveillance, moving human detection in thermal video is a critical phase that filters out redundant information to extract relevant information. Moving object detection is applied to thermal video because thermal imaging is robust to challenging problems such as dynamic backgrounds and illumination variation. In this work, we propose a new background subtraction method using a threshold based on Fisher's linear discriminant ratio. This threshold is computed automatically at run-time for each pixel of every sequential frame; automatically means that no external input from a programmer or user is required for threshold selection. The threshold provides better pixel classification at run-time, and the method handles problems generated by the multiple behaviors of the background more accurately by using Fisher's ratio, which maximizes the separation between object pixels and background pixels. To check its efficacy, the performance of the proposed method is evaluated in terms of the various parameters reported in the analysis. The experimental results and their analysis demonstrate that the proposed method performs better than the peer methods considered.
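
    A minimal sketch of a Fisher-ratio-style threshold: scan candidate thresholds over a pixel's recent intensity history and keep the one that maximizes the ratio of between-class to within-class variance. This mirrors the idea described above rather than the paper's exact per-pixel rule, and the intensity history is synthetic.

```python
# Threshold selection by maximizing a Fisher-style separation criterion.
import numpy as np

def fisher_threshold(values, n_candidates=64):
    values = np.asarray(values, float)
    best_t, best_j = values.mean(), -np.inf
    for t in np.linspace(values.min(), values.max(), n_candidates)[1:-1]:
        lo, hi = values[values <= t], values[values > t]
        if len(lo) < 2 or len(hi) < 2:
            continue
        j = (lo.mean() - hi.mean()) ** 2 / (lo.var() + hi.var() + 1e-12)
        if j > best_j:
            best_t, best_j = t, j
    return best_t

rng = np.random.default_rng(3)
history = np.concatenate([rng.normal(30, 2, 80),      # background intensities of one pixel
                          rng.normal(120, 5, 20)])    # occasional foreground intensities
print(fisher_threshold(history))
```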

  3. Tropical cyclone center location based on Fisher discriminant and Chan-Vese model

    NASA Astrophysics Data System (ADS)

    Qiao, Wenfeng; Li, Yuanxiang; Wei, Xian; Shen, Ji

    2011-12-01

    Tropical cyclone (TC) center location is important for weather forecasting and TC analysis. However, TC centers appear with different shapes and sizes at different times, and the difficulty of locating the TC center varies across the stages of the TC lifetime. In order to improve automation and precision, we present a TC center location scheme for both eye and non-eye TCs. The Fisher discriminant is used to segment the TC so that a binary image can be obtained automatically and effectively. Since the cloud wall near a non-eye TC center forms concentric circles, the Chan-Vese model is used to extract the TC contour. Experimental results on TCs show that our scheme can achieve an average error within 0.3 degrees in longitude/latitude in comparison with the best tracks from CMA and RSMC.

  5. Color model and method for video fire flame and smoke detection using Fisher linear discriminant

    NASA Astrophysics Data System (ADS)

    Wei, Yuan; Jie, Li; Jun, Fang; Yongming, Zhang

    2013-02-01

    Video fire detection is playing an increasingly important role in our lives, but recent research is often based on the traditional RGB color model for analyzing flames, which may not be the optimal color space for fire recognition; the situation is worse when smoke is studied using gray-scale images instead of color ones. We clarify the importance of color information for fire detection and present a fire discriminant color (FDC) model for flame and smoke recognition based on color images. The FDC models aim to unify fire color image representation and the fire recognition task in one framework. Using the between-class and within-class scatter matrices of the Fisher linear discriminant, the proposed models seek a color-space-transform matrix and a discriminant projection basis vector by maximizing the ratio of these two scatter matrices. First, an iterative basic algorithm is designed to obtain a one-component color space transformed from RGB. Then, a general algorithm is extended to generate a three-component color space for further improvement. Moreover, we propose a method for video fire detection based on the models using a kNN classifier. To evaluate recognition performance, we created a database including flame, smoke, and non-fire images for training and testing. The test experiments show that the proposed model achieves a flame verification rate (receiver operating characteristic, ROC I) of 97.5% at a false alarm rate (FAR) of 1.06% and a smoke verification rate (ROC II) of 91.5% at a FAR of 1.2%, and numerous fire video experiments demonstrate that our method reaches a high accuracy for fire recognition.
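
    The classical closed form behind a Fisher-discriminant color model is a single projection axis w = Sw^{-1}(m_fire - m_nonfire) computed from labeled fire and non-fire pixels. The sketch below shows only that step, on synthetic RGB samples; it is not the paper's full iterative color-space construction.

```python
# Fisher linear discriminant axis for RGB pixels (fire vs. non-fire).
import numpy as np

def fld_color_axis(fire_rgb, nonfire_rgb):
    m1, m2 = fire_rgb.mean(axis=0), nonfire_rgb.mean(axis=0)
    Sw = np.cov(fire_rgb, rowvar=False) + np.cov(nonfire_rgb, rowvar=False)
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(5)
fire = rng.normal([200, 120, 40], 20, size=(500, 3))      # reddish/orange pixel samples
nonfire = rng.normal([90, 100, 110], 30, size=(500, 3))   # greyish background pixel samples
w = fld_color_axis(fire, nonfire)
print(w, (fire @ w).mean() - (nonfire @ w).mean())        # separation along the learned axis
```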

  6. Early discriminant method of infected kernel based on the erosion effects of laser ultrasonics

    NASA Astrophysics Data System (ADS)

    Fan, Chao

    2015-07-01

    To discriminate infected wheat kernels as early as possible, a new detection method for hidden insects, especially in their egg and larval stages, is put forward in this paper based on the erosion effect of laser ultrasonics. The grain surface is exposed to a pulsed laser; the absorbed energy excites an ultrasonic wave, and infected kernels can be recognized by appropriate signal analysis. First, the detection principle was established from the classical wave equation and the experimental platform was built. Then, the detected ultrasonic signal was processed in both the time domain and the frequency domain using the FFT and DCT, and six significant features were selected as the characteristic parameters of the signal by stepwise discriminant analysis. Finally, a BP neural network taking these six parameters as input was designed to separate infected kernels from normal ones. Numerous experiments were performed on twenty wheat varieties. The results show that infected kernels can be recognized effectively, with false-negative and false-positive error rates of 12% and 9%, respectively, indicating that the discriminant method for infected kernels based on the erosion effect of laser ultrasonics is feasible.

  7. Discrimination of Mine Seismic Events and Blasts Using the Fisher Classifier, Naive Bayesian Classifier and Logistic Regression

    NASA Astrophysics Data System (ADS)

    Dong, Longjun; Wesseloo, Johan; Potvin, Yves; Li, Xibing

    2016-01-01

    Seismic events and blasts generate seismic waveforms that have different characteristics. The challenge to confidently differentiate these two signatures is complex and requires the integration of physical and statistical techniques. In this paper, the different characteristics of blasts and seismic events were investigated by comparing probability density distributions of different parameters. Five typical parameters of blasts and events and the probability density functions of blast time, as well as probability density functions of origin time difference for neighbouring blasts were extracted as discriminant indicators. The Fisher classifier, naive Bayesian classifier and logistic regression were used to establish discriminators. Databases from three Australian and Canadian mines were established for training, calibrating and testing the discriminant models. The classification performances and discriminant precision of the three statistical techniques were discussed and compared. The proposed discriminators have explicit and simple functions which can be easily used by workers in mines or researchers. Back-test, applied results, cross-validated results and analysis of receiver operating characteristic curves in different mines have shown that the discriminator for one of the mines has a reasonably good discriminating performance.
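
    The three discriminators compared above all have standard scikit-learn counterparts; the sketch below simply cross-validates them on a synthetic stand-in for the blast/event parameter tables, since the mine databases themselves are not reproduced here.

```python
# Compare Fisher (LDA), naive Bayes and logistic regression discriminators.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=400, n_features=5, n_informative=4,
                           n_redundant=0, random_state=0)   # 5 stand-in waveform parameters
for name, clf in [("Fisher (LDA)", LinearDiscriminantAnalysis()),
                  ("Naive Bayes", GaussianNB()),
                  ("Logistic regression", LogisticRegression(max_iter=1000))]:
    print(name, cross_val_score(clf, X, y, cv=5).mean().round(3))
```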

  8. Identification of wheat varieties with a parallel-plate capacitance sensor using fisher linear discriminant analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Fisher’s linear discriminant (FLD) models for wheat variety classification were developed and validated. The inputs to the FLD models were the capacitance (C), impedance (Z), and phase angle (θ), measured at two frequencies. Classification of wheat varieties was obtained as output of the FLD mod...

  10. Discriminative Learning for Automatic Staging of Placental Maturity via Multi-layer Fisher Vector

    PubMed Central

    Lei, Baiying; Yao, Yuan; Chen, Siping; Li, Shengli; Li, Wanjun; Ni, Dong; Wang, Tianfu

    2015-01-01

    Currently, placental maturity staging is performed by subjective evaluation, which can be unreliable as it is highly dependent on the observations and experience of clinicians. To address this problem, this paper proposes a method to automatically stage placental maturity from B-mode ultrasound (US) images based on dense sampling and novel feature descriptors. Specifically, the proposed method first densely extracts features on a regular grid based on dense sampling instead of a few unreliable interest points. These features are then clustered using a generative Gaussian mixture model (GMM) to obtain high-order statistics of the features. The clustering representatives (i.e., cluster means) are encoded by the Fisher vector (FV) to enhance staging accuracy. Differing from previous studies, a multi-layer FV is investigated to exploit spatial information rather than the single-layer FV. Experimental results show that the proposed method with the dense FV achieved an area under the receiver operating characteristic curve (AUC) of 96.77%, and sensitivity and specificity of 98.04% and 93.75%, respectively, for placental maturity staging. Our experimental results also demonstrate that the dense feature outperforms the traditional sparse feature for placental maturity staging. PMID:26228175
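
    A heavily simplified Fisher-vector encoder of the kind referenced above: fit a GMM to local descriptors, then keep only the normalized gradients with respect to the component means. Real FV pipelines also include weight and variance gradients plus power and L2 normalization; the descriptor dimension, cluster count and data here are illustrative.

```python
# Fisher vector (mean-gradient part only) from a diagonal-covariance GMM.
import numpy as np
from sklearn.mixture import GaussianMixture

def fisher_vector_means(descriptors, gmm):
    q = gmm.predict_proba(descriptors)                     # (n, K) soft assignments
    parts = []
    for k in range(gmm.n_components):
        diff = (descriptors - gmm.means_[k]) / np.sqrt(gmm.covariances_[k])
        g = (q[:, k:k + 1] * diff).sum(axis=0)
        parts.append(g / (len(descriptors) * np.sqrt(gmm.weights_[k])))
    return np.concatenate(parts)

rng = np.random.default_rng(6)
train_desc = rng.standard_normal((2000, 16))               # pooled local descriptors
gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(train_desc)
image_desc = rng.standard_normal((300, 16))                # descriptors from one image
print(fisher_vector_means(image_desc, gmm).shape)          # (8 * 16,) encoding
```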

  11. Can malt whisky be discriminated from blended whisky? The proof. A modification of Sir Ronald Fisher's hypothetical tea tasting experiment.

    PubMed

    Chadwick, S J; Dudley, H A

    A modified version of Fisher's tea tasting experiment was performed to test the confident assertions of some members of an academic surgical unit that they could easily distinguish malt from blend whisky. Eight male volunteers from the unit, divided into regular and inexperienced whisky drinkers, were blindfolded and given a glass of each of six whiskies. The whiskies included three malts and three blends, and each subject tasted each whisky six times. They were asked whether the whisky was malt or blended, whether they could identify the distillery, and whether they liked it (ranked on a nine-point scale). The four regular whisky drinkers identified the whiskies as malts 57 times, as blends 84 times, and did not know three times; the inexperienced drinkers identified them as malts 64 times, as blends 79 times, and did not know on one occasion. The regular drinkers correctly identified the malts 36 times and the blends 48 times, and the inexperienced drinkers correctly identified malts 30 times and blends 40 times. Statistical analysis of the data suggested that within the unit malt whisky could not be distinguished from blended whisky and that experience did not alter powers of discrimination. A surgeon with no discriminatory prowess at all could be expected to achieve complete discrimination of malt from blend whisky once in 2 occasions. These results suggest that, although "uisgebeatha" has unique properties, the inexpert drinker should choose his whisky to suit his taste and pocket and not his self image.

  12. Unsupervised Wishart Classification of Wetlands in Newfoundland, Canada Using PolSAR Data Based on Fisher Linear Discriminant Analysis

    NASA Astrophysics Data System (ADS)

    Mohammadimanesh, F.; Salehi, B.; Mahdianpari, M.; Homayouni, S.

    2016-06-01

    Polarimetric Synthetic Aperture Radar (PolSAR) imagery is a complex multi-dimensional dataset and an important source of information for various natural resource and environmental classification and monitoring applications. PolSAR imagery produces valuable information by observing the scattering mechanisms of different natural and man-made objects. Land cover mapping using PolSAR data classification is one of the most important applications of SAR remote sensing earth observation, and it has gained increasing attention in recent years. However, one of the most challenging aspects of classification is selecting features with maximum discrimination capability. To address this challenge, a statistical approach based on Fisher Linear Discriminant Analysis (FLDA) and the incorporation of the physical interpretation of PolSAR data into classification is proposed in this paper. After pre-processing of the PolSAR data, including speckle reduction, the H/α classification is used to classify the basic scattering mechanisms. Then, a new method for feature weighting, based on the fusion of FLDA and physical interpretation, is implemented. This method is shown to increase the classification accuracy as well as the between-class discrimination in the final Wishart classification. The proposed method was applied to a full polarimetric C-band RADARSAT-2 data set from the Avalon area, Newfoundland and Labrador, Canada. This imagery was acquired in June 2015 and covers various types of wetlands, including bogs, fens, marshes and shallow water. The results were compared with the standard Wishart classification, and an improvement of about 20% was achieved in the overall accuracy. This method provides an opportunity for operational wetland classification at northern latitudes with high accuracy using only polarimetric SAR data.

  13. Accurate palm vein recognition based on wavelet scattering and spectral regression kernel discriminant analysis

    NASA Astrophysics Data System (ADS)

    Elnasir, Selma; Shamsuddin, Siti Mariyam; Farokhi, Sajad

    2015-01-01

    Palm vein recognition (PVR) is a promising new biometric that has been applied successfully as a method of access control by many organizations and has even further potential in the field of forensics. The palm vein pattern has highly discriminative features that are difficult to forge because of its subcutaneous position in the palm. Despite considerable progress and a few practical issues, providing accurate palm vein readings has remained an unsolved issue in biometrics. We propose a robust and more accurate PVR method based on the combination of wavelet scattering (WS) with spectral regression kernel discriminant analysis (SRKDA). As the dimension of the WS-generated features is quite large, SRKDA is required to reduce the extracted features and enhance their discrimination. The results, based on two public databases (the PolyU Hyperspectral Palmprint database and the PolyU Multispectral Palmprint database), show the high performance of the proposed scheme in comparison with state-of-the-art methods. The proposed approach scored a 99.44% identification rate and a 99.90% verification rate [equal error rate (EER) = 0.1%] for the hyperspectral database, and a 99.97% identification rate and a 99.98% verification rate (EER = 0.019%) for the multispectral database.

  14. Deep learning regularized Fisher mappings.

    PubMed

    Wong, W K; Sun, Mingming

    2011-10-01

    For classification tasks, it is always desirable to extract features that are most effective for preserving class separability. In this brief, we propose a new feature extraction method called regularized deep Fisher mapping (RDFM), which learns an explicit mapping from the sample space to the feature space using a deep neural network to enhance the separability of features according to the Fisher criterion. Compared to kernel methods, the deep neural network is a deep and nonlocal learning architecture, and therefore exhibits more powerful ability to learn the nature of highly variable datasets from fewer samples. To eliminate the side effects of overfitting brought about by the large capacity of powerful learners, regularizers are applied in the learning procedure of RDFM. RDFM is evaluated in various types of datasets, and the results reveal that it is necessary to apply unsupervised regularization in the fine-tuning phase of deep learning. Thus, for very flexible models, the optimal Fisher feature extractor may be a balance between discriminative ability and descriptive ability.

  15. Visualization of nonlinear kernel models in neuroimaging by sensitivity maps.

    PubMed

    Rasmussen, Peter Mondrup; Madsen, Kristoffer Hougaard; Lund, Torben Ellegaard; Hansen, Lars Kai

    2011-04-01

    There is significant current interest in decoding mental states from neuroimages. In this context kernel methods, e.g., support vector machines (SVM), are frequently adopted to learn statistical relations between patterns of brain activation and experimental conditions. In this paper we focus on visualization of such nonlinear kernel models. Specifically, we investigate the sensitivity map as a technique for generating global summary maps of kernel classification models. We illustrate the performance of the sensitivity map on functional magnetic resonance imaging (fMRI) data based on visual stimuli. We show that the performance of linear models is reduced for certain scan labelings/categorizations in this data set, while the nonlinear models provide more flexibility. We show that the sensitivity map can be used to visualize nonlinear versions of kernel logistic regression, the kernel Fisher discriminant, and the SVM, and conclude that the sensitivity map is a versatile and computationally efficient tool for visualization of nonlinear kernel models in neuroimaging.
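
    In the sense used above, a sensitivity map averages the squared partial derivatives of the trained model's decision function over the data. The sketch below estimates it with finite differences for an RBF-kernel SVM on synthetic data standing in for voxel patterns.

```python
# Numerical sensitivity map for a nonlinear kernel classifier.
import numpy as np
from sklearn.svm import SVC

def sensitivity_map(model, X, eps=1e-3):
    base = model.decision_function(X)
    sens = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += eps
        sens[j] = np.mean(((model.decision_function(Xp) - base) / eps) ** 2)
    return sens

rng = np.random.default_rng(7)
X = rng.standard_normal((200, 20))                      # 20 stand-in "voxel" features
y = (np.sin(X[:, 0]) + X[:, 1] ** 2 > 1).astype(int)    # nonlinear labeling rule
clf = SVC(kernel="rbf", gamma=0.1).fit(X, y)
print(np.argsort(sensitivity_map(clf, X))[::-1][:5])    # most influential input dimensions
```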

  16. Choosing parameters of kernel subspace LDA for recognition of face images under pose and illumination variations.

    PubMed

    Huang, Jian; Yuen, Pong C; Chen, Wen-Sheng; Lai, Jian Huang

    2007-08-01

    This paper addresses the problem of automatically tuning multiple kernel parameters for the kernel-based linear discriminant analysis (LDA) method. The kernel approach has been proposed to solve face recognition problems under complex distributions by mapping the input space to a high-dimensional feature space. Recognition algorithms such as kernel principal component analysis, the kernel Fisher discriminant, generalized discriminant analysis, and kernel direct LDA have been developed in the last five years. The experimental results show that the kernel-based method is a good and feasible approach to tackle pose and illumination variations. One of the crucial factors in the kernel approach is the selection of kernel parameters, which highly affects the generalization capability and stability of the kernel-based learning methods. In view of this, we propose an eigenvalue-stability-bounded margin maximization (ESBMM) algorithm to automatically tune the multiple parameters of the Gaussian radial basis function kernel for the kernel subspace LDA (KSLDA) method, which builds on our previously developed subspace LDA method. The ESBMM algorithm improves the generalization capability of the kernel-based LDA method by maximizing the margin maximization criterion while maintaining the eigenvalue stability of the kernel-based LDA method. An in-depth investigation of the generalization performance along the pose and illumination dimensions is performed using the YaleB and CMU PIE databases. The FERET database is also used for benchmark evaluation. Compared with the existing PCA-based and LDA-based methods, our proposed KSLDA method, with the ESBMM kernel parameter estimation algorithm, gives superior performance.

  17. Discriminative clustering via extreme learning machine.

    PubMed

    Huang, Gao; Liu, Tianchi; Yang, Yan; Lin, Zhiping; Song, Shiji; Wu, Cheng

    2015-10-01

    Discriminative clustering is an unsupervised learning framework which introduces the discriminative learning rule of supervised classification into clustering. The underlying assumption is that a good partition (clustering) of the data should yield high discrimination, namely, the partitioned data can be easily classified by some classification algorithms. In this paper, we propose three discriminative clustering approaches based on Extreme Learning Machine (ELM). The first algorithm iteratively trains weighted ELM (W-ELM) classifier to gradually maximize the data discrimination. The second and third methods are both built on Fisher's Linear Discriminant Analysis (LDA); but one approach adopts alternative optimization, while the other leverages kernel k-means. We show that the proposed algorithms can be easily implemented, and yield competitive clustering accuracy on real world data sets compared to state-of-the-art clustering methods. PMID:26143036

  18. Kernels for longitudinal data with variable sequence length and sampling intervals.

    PubMed

    Lu, Zhengdong; Leen, Todd K; Kaye, Jeffrey

    2011-09-01

    We develop several kernel methods for classification of longitudinal data and apply them to detect cognitive decline in the elderly. We first develop mixed-effects models, a type of hierarchical empirical Bayes generative model, for the time series. After demonstrating their utility in likelihood ratio classifiers (and the improvement over standard regression models for such classifiers), we develop novel Fisher kernels based on mixtures of mixed-effects models and use them in support vector machine classifiers. The hierarchical generative model allows us to handle variations in sequence length and sampling interval gracefully. We also give nonparametric kernels not based on generative models, but rather on the reproducing kernel Hilbert space. We apply the methods to detecting cognitive decline from longitudinal clinical data on motor and neuropsychological tests. The likelihood ratio classifiers based on the neuropsychological tests perform better than classifiers based on the motor behavior. Discriminant classifiers performed better than likelihood ratio classifiers for the motor behavior tests.

  19. [Selection of Characteristic Wavelengths Using SPA and Qualitative Discrimination of Mildew Degree of Corn Kernels Based on SVM].

    PubMed

    Yuan, Ying; Wang, Wei; Chu, Xuan; Xi, Ming-jie

    2016-01-01

    The feasibility of Fourier transform near infrared (FT-NIR) spectroscopy, with a spectral range between 833 and 2 500 nm, for detecting moldy corn kernels with different levels of mildew was verified in this paper. Firstly, to reduce the influence of noise, moving average smoothing was used for spectral data preprocessing after four common pretreatment methods were compared. Then, to improve the prediction performance of the model, SPXY (sample set partitioning based on joint x-y distance) was selected and used for sample set partition. Furthermore, in order to reduce the dimensionality of the original spectral data, the successive projection algorithm (SPA) was adopted and ultimately 7 characteristic wavelengths were extracted: 833, 927, 1 208, 1 337, 1 454, 1 861 and 2 280 nm. The experimental results showed that when the spectral data at the 7 characteristic wavelengths were taken as the input of an SVM with the radial basis function (RBF) as the kernel function and kernel parameters C = 7 760 469 and γ = 0.017 003, the classification accuracies of the established SVM model were 97.78% and 93.33% for the training and testing sets, respectively. In addition, an independent validation set was selected by the same standard and used to verify the model; a classification accuracy of 91.11% was achieved for the independent validation set. The results indicate that it is feasible to identify and classify corn kernels with different degrees of mildew using SPA and SVM, and the characteristic wavelengths selected by SPA in this paper also lay a foundation for online NIR detection of mildewed corn kernels. PMID:27228772
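
    For orientation, this is how the reported SVM configuration (an RBF kernel with the abstract's C and gamma, applied to the 7 SPA-selected wavelengths) would be set up in scikit-learn; the spectra below are random placeholders for the FT-NIR measurements, so the printed accuracy is meaningless.

```python
# RBF-kernel SVM with the parameters reported in the abstract (placeholder data).
import numpy as np
from sklearn.svm import SVC

wavelengths_nm = [833, 927, 1208, 1337, 1454, 1861, 2280]   # SPA-selected bands
rng = np.random.default_rng(8)
X_train = rng.standard_normal((90, len(wavelengths_nm)))    # absorbance at the 7 bands
y_train = rng.integers(0, 3, size=90)                       # three mildew-degree classes

clf = SVC(kernel="rbf", C=7_760_469, gamma=0.017003).fit(X_train, y_train)
print(clf.score(X_train, y_train))
```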

  1. Fisher Matrix Preloaded — FISHER4CAST

    NASA Astrophysics Data System (ADS)

    Bassett, Bruce A.; Fantaye, Yabebal; Hlozek, Renée; Kotze, Jacques

    The Fisher Matrix is the backbone of modern cosmological forecasting. We describe the Fisher4Cast software: a general-purpose, easy-to-use Fisher Matrix framework. It is open source, rigorously designed and tested, and includes a Graphical User Interface (GUI) with automated LaTeX file creation capability and point-and-click Fisher ellipse generation. Fisher4Cast was designed for ease of extension and, although written in Matlab, is easily portable to open-source alternatives such as Octave and Scilab. Here we use Fisher4Cast to present new 3D and 4D visualizations of the forecasting landscape and to investigate the effects of growth and curvature on future cosmological surveys. Early releases have been available online since mid-2008. The current release of the code is Version 2.2, which is described here. For ease of reference, a Quick Start guide and the code used to produce the figures in this paper are included, in the hope that they will be useful to the cosmology and wider scientific communities.
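
    As a minimal sketch of the kind of computation such a framework automates, the following builds a two-parameter Fisher matrix for a toy linear model with Gaussian errors and reads off the forecast 1-sigma errors; the model, redshifts and error bars are invented for the example and are not part of Fisher4Cast.

        # Sketch: Fisher matrix forecast for a toy model d(z) = a*z + b with Gaussian errors.
        import numpy as np

        z = np.linspace(0.1, 1.5, 30)          # hypothetical survey redshifts
        sigma = 0.05 * np.ones_like(z)         # assumed measurement errors

        # Derivatives of the model with respect to the parameters (a, b).
        dmu = np.stack([z, np.ones_like(z)])   # shape (2, N): d(model)/da, d(model)/db

        # F_ij = sum_k dmu_i(z_k) * dmu_j(z_k) / sigma_k^2
        F = (dmu / sigma**2) @ dmu.T
        cov = np.linalg.inv(F)                 # forecast parameter covariance
        print("marginalized 1-sigma errors:", np.sqrt(np.diag(cov)))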

  2. After "Fisher": Academic Review and Judicial Scrutiny

    ERIC Educational Resources Information Center

    La Noue, George R.

    2013-01-01

    This article describes the outcomes of the case "Fisher v. University of Texas at Austin," in which the plaintiff had accused the University of Texas (UT) of racial discrimination in the admission process. The author believes that the ruling of the court in this case makes it harder to hide race-based measures used in college admissions.…

  3. Miller Fisher Syndrome

    MedlinePlus

    ... sensory information to the spinal cord and brain. Magnetic resonance (MRI) or other imaging of the brain and/or spinal cord are usually normal. Spinal fluid protein is often elevated. Pure Fisher syndrome is ...

  4. Recurrent Miller Fisher syndrome.

    PubMed

    Madhavan, S; Geetha; Bhargavan, P V

    2004-07-01

    Miller Fisher syndrome (MFS) is a variant of Guillain-Barré syndrome characterized by the triad of ophthalmoplegia, ataxia and areflexia. Recurrences are exceptional with Miller Fisher syndrome. We are reporting a case with two episodes of MFS within two years. Initially he presented with partial ophthalmoplegia and ataxia. The second episode was a full-blown presentation with ataxia, areflexia and ophthalmoplegia. CSF analysis was typical during both episodes. The nerve conduction velocity study was within normal limits. MRI of the brain was within normal limits. He responded to symptomatic measures initially, then to steroids in the second episode. We are reporting the case due to its rarity.

  5. Stem kernels for RNA sequence analyses.

    PubMed

    Sakakibara, Yasubumi; Popendorf, Kris; Ogawa, Nana; Asai, Kiyoshi; Sato, Kengo

    2007-10-01

    Several computational methods based on stochastic context-free grammars have been developed for modeling and analyzing functional RNA sequences. These grammatical methods have succeeded in modeling typical secondary structures of RNA, and are used for structural alignment of RNA sequences. However, such stochastic models cannot sufficiently discriminate member sequences of an RNA family from nonmembers and hence detect noncoding RNA regions from genome sequences. A novel kernel function, stem kernel, for the discrimination and detection of functional RNA sequences using support vector machines (SVMs) is proposed. The stem kernel is a natural extension of the string kernel, specifically the all-subsequences kernel, and is tailored to measure the similarity of two RNA sequences from the viewpoint of secondary structures. The stem kernel examines all possible common base pairs and stem structures of arbitrary lengths, including pseudoknots between two RNA sequences, and calculates the inner product of common stem structure counts. An efficient algorithm is developed to calculate the stem kernels based on dynamic programming. The stem kernels are then applied to discriminate members of an RNA family from nonmembers using SVMs. The study indicates that the discrimination ability of the stem kernel is strong compared with conventional methods. Furthermore, the potential application of the stem kernel is demonstrated by the detection of remotely homologous RNA families in terms of secondary structures. This is because the string kernel is proven to work for the remote homology detection of protein sequences. These experimental results have convinced us to apply the stem kernel in order to find novel RNA families from genome sequences. PMID:17933013
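
    The stem kernel itself requires a dynamic-programming routine over base pairs; as a much simpler stand-in from the string-kernel family it extends, the sketch below computes a k-mer spectrum kernel between two toy RNA strings (an illustration only, not the authors' algorithm).

        # Sketch: a k-mer "spectrum" kernel for RNA strings, a simplified relative of the
        # all-subsequences/stem kernels described above (not the stem kernel itself).
        from collections import Counter

        def spectrum_kernel(s, t, k=3):
            """Inner product of the k-mer count vectors of two sequences."""
            cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
            ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
            return sum(cs[m] * ct[m] for m in cs if m in ct)

        print(spectrum_kernel("GGGAAACCC", "GGGAAAUCC"))   # toy RNA sequences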

  6. Analysis of the Fisher solution

    SciTech Connect

    Abdolrahimi, Shohreh; Shoom, Andrey A.

    2010-01-15

    We study the d-dimensional Fisher solution which represents a static, spherically symmetric, asymptotically flat spacetime with a massless scalar field. The solution has two parameters, the mass M and the 'scalar charge' Σ. The Fisher solution has a naked curvature singularity which divides the spacetime manifold into two disconnected parts. The part which is asymptotically flat we call the Fisher spacetime, and the other part we call the Fisher universe. The d-dimensional Schwarzschild-Tangherlini solution and the Fisher solution belong to the same theory and are dual to each other. The duality transformation acting in the parameter space (M, Σ) maps the exterior region of the Schwarzschild-Tangherlini black hole into the Fisher spacetime, which has a naked timelike singularity, and the interior region of the black hole into the Fisher universe, which is an anisotropic expanding-contracting universe with two spacelike singularities representing its 'big bang' and 'big crunch'. The big bang singularity and the singularity of the Fisher spacetime are radially weak in the sense that a 1-dimensional object moving along a timelike radial geodesic can arrive at the singularities intact. In the vicinity of the singularity, the Fisher spacetime of nonzero mass has a region where its Misner-Sharp energy is negative. The Fisher universe has a marginally trapped surface corresponding to the state of its maximal expansion in the angular directions. These results and derived relations between geometric quantities of the Fisher spacetime, the Fisher universe, and the Schwarzschild-Tangherlini black hole may suggest that the massless scalar field transforms the black hole event horizon into the naked radially weak disjoint singularities of the Fisher spacetime and the Fisher universe which are 'dual to the horizon'.

  7. Kernel Methods on Riemannian Manifolds with Gaussian RBF Kernels.

    PubMed

    Jayasumana, Sadeep; Hartley, Richard; Salzmann, Mathieu; Li, Hongdong; Harandi, Mehrtash

    2015-12-01

    In this paper, we develop an approach to exploiting kernel methods with manifold-valued data. In many computer vision problems, the data can be naturally represented as points on a Riemannian manifold. Due to the non-Euclidean geometry of Riemannian manifolds, usual Euclidean computer vision and machine learning algorithms yield inferior results on such data. In this paper, we define Gaussian radial basis function (RBF)-based positive definite kernels on manifolds that permit us to embed a given manifold with a corresponding metric in a high dimensional reproducing kernel Hilbert space. These kernels make it possible to utilize algorithms developed for linear spaces on nonlinear manifold-valued data. Since the Gaussian RBF defined with any given metric is not always positive definite, we present a unified framework for analyzing the positive definiteness of the Gaussian RBF on a generic metric space. We then use the proposed framework to identify positive definite kernels on two specific manifolds commonly encountered in computer vision: the Riemannian manifold of symmetric positive definite matrices and the Grassmann manifold, i.e., the Riemannian manifold of linear subspaces of a Euclidean space. We show that many popular algorithms designed for Euclidean spaces, such as support vector machines, discriminant analysis and principal component analysis can be generalized to Riemannian manifolds with the help of such positive definite Gaussian kernels. PMID:26539851
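
    One positive definite construction commonly used for symmetric positive definite (SPD) matrices is the Gaussian RBF built on the log-Euclidean metric; the sketch below evaluates such a kernel on two small hypothetical SPD descriptors (the matrices and the bandwidth are assumptions for illustration).

        # Sketch: Gaussian RBF kernel on SPD matrices using the log-Euclidean metric.
        import numpy as np
        from scipy.linalg import logm

        def log_euclidean_rbf(X, Y, gamma=0.5):
            """k(X, Y) = exp(-gamma * ||log(X) - log(Y)||_F^2) for SPD matrices X, Y."""
            d = np.linalg.norm(logm(X) - logm(Y), ord="fro")
            return np.exp(-gamma * d**2)

        A = np.array([[2.0, 0.3], [0.3, 1.0]])   # hypothetical SPD descriptors
        B = np.array([[1.5, 0.1], [0.1, 0.8]])
        print(log_euclidean_rbf(A, B))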

  8. Band-Reweighed Gabor Kernel Embedding for Face Image Representation and Recognition.

    PubMed

    Ren, Chuan-Xian; Dai, Dao-Qing; Li, Xiao-Xin; Lai, Zhao-Rong

    2014-02-01

    Face recognition with illumination or pose variation is a challenging problem in image processing and pattern recognition. A novel algorithm using band-reweighed Gabor kernel embedding to deal with the problem is proposed in this paper. For a given image, it is first transformed by a group of Gabor filters, which output Gabor features using different orientation and scale parameters. Fisher scoring function is used to measure the importance of features in each band, and then, the features with the largest scores are preserved for saving memory requirements. The reduced bands are combined by a vector, which is determined by a weighted kernel discriminant criterion and solved by a constrained quadratic programming method, and then, the weighted sum of these nonlinear bands is defined as the similarity between two images. Compared with existing concatenation-based Gabor feature representation and the uniformly weighted similarity calculation approaches, our method provides a new way to use Gabor features for face recognition and presents a reasonable interpretation for highlighting discriminant orientations and scales. The minimum Mahalanobis distance considering the spatial correlations within the data is exploited for feature matching, and the graphical lasso is used therein for directly estimating the sparse inverse covariance matrix. Experiments using benchmark databases show that our new algorithm improves the recognition results and obtains competitive performance.

  9. Optical resolution from Fisher information

    NASA Astrophysics Data System (ADS)

    Motka, L.; Stoklasa, B.; D'Angelo, M.; Facchi, P.; Garuccio, A.; Hradil, Z.; Pascazio, S.; Pepe, F. V.; Teo, Y. S.; Řeháček, J.; Sánchez-Soto, L. L.

    2016-05-01

    The information gained by performing a measurement on a physical system is most appropriately assessed by the Fisher information, which in fact establishes lower bounds on estimation errors for an arbitrary unbiased estimator. We revisit the basic properties of the Fisher information and demonstrate its potential to quantify the resolution of optical systems. We illustrate this with some conceptually important examples, such as single-slit diffraction, spectroscopy and superresolution techniques.

  10. On Fisher Information and Thermodynamics

    EPA Science Inventory

    Fisher information is a measure of the information obtainable by an observer from the observation of reality. However, information is obtainable only when there are patterns or features to observe, and these only exist when there is order. For example, a system in perfect disor...

  11. "Fisher v. Texas": Strictly Disappointing

    ERIC Educational Resources Information Center

    Nieli, Russell K.

    2013-01-01

    Russell K. Nieli writes in this opinion paper that, as far as the ability of state colleges and universities to use race as a criterion for admission goes, "Fisher v. Texas" was a big disappointment, and failed in the most basic way. Nieli states that although some affirmative action opponents have tried to put a more positive spin on the…

  12. Feasibility of near infrared spectroscopy for analyzing corn kernel damage and viability of soybean and corn kernels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The current US corn grading system accounts for the portion of damaged kernels, which is measured by time-consuming and inaccurate visual inspection. Near infrared spectroscopy (NIRS), a non-destructive and fast analytical method, was tested as a tool for discriminating corn kernels with heat and f...

  13. Approximate kernel competitive learning.

    PubMed

    Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang

    2015-03-01

    Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be calculated and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL) method, which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling works for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) method based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision compared with related approximate clustering approaches.

  14. Approximate kernel competitive learning.

    PubMed

    Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang

    2015-03-01

    Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be calculated and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL) method, which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling works for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) method based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision compared with related approximate clustering approaches. PMID:25528318

  15. The arms race between fishers

    NASA Astrophysics Data System (ADS)

    Rijnsdorp, Adriaan D.; Poos, Jan Jaap; Quirijns, Floor J.; HilleRisLambers, Reinier; De Wilde, Jan W.; Den Heijer, Willem M.

    An analysis of the changes in the Dutch demersal fishing fleet since the 1950s revealed that competitive interactions among vessels and gear types within the constraints imposed by biological, economic and fisheries management factors are the dominant processes governing the dynamics of fishing fleets. Double beam trawling, introduced in the early 1960s, proved a successful fishing method to catch deep burying flatfish, in particular sole. In less than 10 years, the otter trawl fleet was replaced by a highly specialised beam trawling fleet, despite an initial doubling of the loss rate of vessels due to stability problems. Engine power, size of the beam trawl, number of tickler chains and fishing speed rapidly increased and fishing activities expanded into previously lightly fished grounds and seasons. Following the ban on flatfish trawling within the 12 nautical mile zone for vessels of more than 300 hp in 1975 and with the restriction of engine power to 2000 hp in 1987, the beam trawl fleet bifurcated. Changes in the fleet capacity were related to the economic results and showed a cyclic pattern with a period of 6-7 years. The arms race between fishers was fuelled by competitive interactions among fishers: while the catchability of the fleet more than doubled in the ten years following the introduction of the beam trawl, a decline in catchability was observed in reference beam trawlers that remained the same. Vessel performance was not only affected by the technological characteristics but also by the number and characteristics of competing vessels.

  16. Optimization of Polarimetric Contrast Enhancement Based on Fisher Criterion

    NASA Astrophysics Data System (ADS)

    Deng, Qiming; Chen, Jiong; Yang, Jian

    The optimization of polarimetric contrast enhancement (OPCE) is a widely used method for maximizing the received power ratio of a desired target versus an undesired target (clutter). In this letter, a new model of the OPCE is proposed based on the Fisher criterion. By introducing the well-known two-class problem of linear discriminant analysis (LDA), the proposed model enlarges the normalized distance between the mean values of the target and the clutter. In addition, a cross-iterative numerical method is proposed for solving the optimization with a quadratic constraint. Experimental results with polarimetric SAR (POLSAR) data demonstrate the effectiveness of the proposed method.
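
    For reference, the two-class Fisher criterion that the proposed model borrows from LDA can be computed directly as w = Sw^(-1)(m1 - m0); the sketch below does this on synthetic "target" and "clutter" samples, which are stand-ins unrelated to any POLSAR data.

        # Sketch: two-class Fisher discriminant direction and Fisher ratio on synthetic data.
        import numpy as np

        rng = np.random.default_rng(1)
        target = rng.normal(loc=[2.0, 0.0], scale=0.8, size=(100, 2))    # hypothetical "target" samples
        clutter = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(100, 2))   # hypothetical "clutter" samples

        m1, m0 = target.mean(axis=0), clutter.mean(axis=0)
        Sw = np.cov(target, rowvar=False) + np.cov(clutter, rowvar=False)  # pooled within-class scatter
        w = np.linalg.solve(Sw, m1 - m0)       # Fisher discriminant direction

        fisher_ratio = (w @ (m1 - m0))**2 / (w @ Sw @ w)
        print("projection direction:", w, "Fisher ratio:", fisher_ratio)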

  17. Iterative software kernels

    SciTech Connect

    Duff, I.

    1994-12-31

    This workshop focuses on kernels for iterative software packages. Specifically, the three speakers discuss various aspects of sparse BLAS kernels. Their topics are: 'Current status of user level sparse BLAS'; 'Current status of the sparse BLAS toolkit'; and 'Adding matrix-matrix and matrix-matrix-matrix multiply to the sparse BLAS toolkit'.

  18. Intelligent classification methods of grain kernels using computer vision analysis

    NASA Astrophysics Data System (ADS)

    Lee, Choon Young; Yan, Lei; Wang, Tianfeng; Lee, Sang Ryong; Park, Cheol Woo

    2011-06-01

    In this paper, a digital image analysis method was developed to classify seven kinds of individual grain kernels (common rice, glutinous rice, rough rice, brown rice, buckwheat, common barley and glutinous barley) widely planted in Korea. A total of 2800 color images of individual grain kernels were acquired as a data set. Seven color and ten morphological features were extracted and processed by linear discriminant analysis to improve the efficiency of the identification process. The output features from linear discriminant analysis were used as input to the four-layer back-propagation network to classify different grain kernel varieties. The data set was divided into three groups: 70% for training, 20% for validation, and 10% for testing the network. The classification experimental results show that the proposed method is able to classify the grain kernel varieties efficiently.
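
    A minimal sketch of an LDA-then-neural-network pipeline of the kind described above, using scikit-learn on randomly generated stand-ins for the 7 colour and 10 morphological features; the feature values, labels and network size are assumptions for illustration, not the authors' data.

        # Sketch: LDA feature reduction followed by a back-propagation network for 7 grain classes.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        X = rng.normal(size=(2800, 17))       # 7 colour + 10 morphological features (toy values)
        y = rng.integers(0, 7, size=2800)     # 7 grain-kernel classes (toy labels)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = make_pipeline(LinearDiscriminantAnalysis(n_components=6),
                              MLPClassifier(hidden_layer_sizes=(20, 20), max_iter=500))
        model.fit(X_tr, y_tr)
        print("test accuracy:", model.score(X_te, y_te))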

  19. General perspective view of the Fisher School Covered Bridge, view ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    General perspective view of the Fisher School Covered Bridge, view looking southwest from Five Rivers Road. - Fisher School Covered Bridge, Crab Creek Road at Five Rivers Road, Fisher, Lincoln County, OR

  20. General topographic view of the Fisher School Covered Bridge, view ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    General topographic view of the Fisher School Covered Bridge, view looking northwest from Crab Creek Road. - Fisher School Covered Bridge, Crab Creek Road at Five Rivers Road, Fisher, Lincoln County, OR

  1. Interior of the Fisher School Covered Bridge, view to north ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior of the Fisher School Covered Bridge, view to north showing road deck, guardrail, and Howe truss. - Fisher School Covered Bridge, Crab Creek Road at Five Rivers Road, Fisher, Lincoln County, OR

  2. General perspective view of the Fisher School Covered Bridge, view ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    General perspective view of the Fisher School Covered Bridge, view looking east along Five Rivers Road. - Fisher School Covered Bridge, Crab Creek Road at Five Rivers Road, Fisher, Lincoln County, OR

  3. A Spectrum Tree Kernel

    NASA Astrophysics Data System (ADS)

    Kuboyama, Tetsuji; Hirata, Kouichi; Kashima, Hisashi; F. Aoki-Kinoshita, Kiyoko; Yasuda, Hiroshi

    Learning from tree-structured data has received increasing interest with the rapid growth of tree-encodable data in the World Wide Web, in biology, and in other areas. Our kernel function measures the similarity between two trees by counting the number of shared sub-patterns called tree q-grams, and runs, in effect, in linear time with respect to the number of tree nodes. We apply our kernel function with a support vector machine (SVM) to classify biological data, the glycans of several blood components. The experimental results show that our kernel function performs as well as one exclusively tailored to glycan properties.

  4. Fisher Information in Ecological Systems

    NASA Astrophysics Data System (ADS)

    Frieden, B. Roy; Gatenby, Robert A.

    Fisher information is being increasingly used as a tool of research into ecological systems. For example, the information was shown in Chapter 7 to provide a useful diagnostic of the health of an ecology. In other applications to ecology, extreme physical information (EPI) has been used to derive the population-rate (or Lotka-Volterra) equations of ecological systems, both directly [1] and indirectly (Chapter 5) via the quantum Schrödinger wave equation (SWE). We next build on these results to derive (i) an uncertainty principle (8.3) of biology, (ii) a simple decision rule (8.18) for predicting whether a given ecology is susceptible to a sudden drop in population (Section 8.1), (iii) the probability law (8.57) or (8.59) on the worldwide occurrence of the masses of living creatures from mice to elephants and beyond (Section 8.2), and (iv) the famous quarter-power laws for the attributes of biological and other systems. The latter approach uses EPI to derive the simultaneous quarter-power behavior of all attributes obeyed by the law, such as metabolism rate, brain size, grazing range, etc. (Section 8.3). This maximal breadth of scope is allowed by its basis in information, which of course applies to all types of quantitative data (Section 1.4.3, Chapter 1).

  5. Robotic Intelligence Kernel: Communications

    SciTech Connect

    Walton, Mike C.

    2009-09-16

    The INL Robotic Intelligence Kernel-Comms is the communication server that transmits information between one or more robots using the RIK and one or more user interfaces. It supports event handling and multiple hardware communication protocols.

  6. Robotic Intelligence Kernel: Driver

    SciTech Connect

    2009-09-16

    The INL Robotic Intelligence Kernel-Driver is built on top of the RIK-A and implements a dynamic autonomy structure. The RIK-D is used to orchestrate hardware for sensing and action as well as software components for perception, communication, behavior and world modeling into a single cognitive behavior kernel that provides intrinsic intelligence for a wide variety of unmanned ground vehicle systems.

  7. Linearized Kernel Dictionary Learning

    NASA Astrophysics Data System (ADS)

    Golts, Alona; Elad, Michael

    2016-06-01

    In this paper we present a new approach to incorporating kernels into dictionary learning. The kernel K-SVD algorithm (KKSVD), which has been introduced recently, shows an improvement in classification performance relative to its linear counterpart, K-SVD. However, this algorithm requires the storage and handling of a very large kernel matrix, which leads to high computational cost, while also limiting its use to setups with a small number of training examples. We address these problems by combining two ideas: first, we approximate the kernel matrix using a cleverly sampled subset of its columns using the Nyström method; secondly, as we wish to avoid using this matrix altogether, we decompose it by SVD to form new "virtual samples," on which any linear dictionary learning can be employed. Our method, termed "Linearized Kernel Dictionary Learning" (LKDL), can be seamlessly applied as a pre-processing stage on top of any efficient off-the-shelf dictionary learning scheme, effectively "kernelizing" it. We demonstrate the effectiveness of our method on several tasks of both supervised and unsupervised classification and show the efficiency of the proposed scheme, its easy integration and its performance-boosting properties.
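
    The Nyström step at the heart of this idea can be sketched in a few lines: sample a subset of kernel-matrix columns, eigendecompose the small block, and form explicit "virtual sample" features whose inner products approximate the full kernel matrix. The RBF kernel, sample sizes and data below are arbitrary choices, not those of the LKDL paper.

        # Sketch: Nystrom "virtual samples" whose Gram matrix approximates a full RBF kernel matrix.
        import numpy as np
        from sklearn.metrics.pairwise import rbf_kernel

        rng = np.random.default_rng(3)
        X = rng.normal(size=(500, 20))                 # hypothetical training samples
        m = 50                                         # number of sampled columns
        idx = rng.choice(len(X), size=m, replace=False)

        C = rbf_kernel(X, X[idx], gamma=0.1)           # n x m block of the kernel matrix
        W = rbf_kernel(X[idx], X[idx], gamma=0.1)      # m x m block on the sampled points

        lam, U = np.linalg.eigh(W)
        lam = np.clip(lam, 1e-10, None)                # guard against tiny negative eigenvalues
        V = C @ U @ np.diag(1.0 / np.sqrt(lam))        # virtual samples: V @ V.T ~ full kernel matrix
        print("virtual sample matrix:", V.shape)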

  8. Fishers' knowledge and seahorse conservation in Brazil

    PubMed Central

    Rosa, Ierecê ML; Alves, Rômulo RN; Bonifácio, Kallyne M; Mourão, José S; Osório, Frederico M; Oliveira, Tacyana PR; Nottingham, Mara C

    2005-01-01

    From a conservationist perspective, seahorses are threatened fishes. Concomitantly, from a socioeconomic perspective, they represent a source of income to many fishing communities in developing countries. An integration between these two views requires, among other things, the recognition that seahorse fishers have knowledge and abilities that can assist the implementation of conservation strategies and of management plans for seahorses and their habitats. This paper documents the knowledge held by Brazilian fishers on the biology and ecology of the longsnout seahorse Hippocampus reidi. Its aims were to explore collaborative approaches to seahorse conservation and management in Brazil; to assess fishers' perception of seahorse biology and ecology, in the context of evaluating potential management options; and to increase fishers' involvement with seahorse conservation in Brazil. Data were obtained through questionnaires and interviews made during field surveys conducted in fishing villages located in the States of Piauí, Ceará, Paraíba, Maranhão, Pernambuco and Pará. We consider the following aspects as positive for the conservation of seahorses and their habitats in Brazil: fishers were willing to dialogue with researchers; although captures and/or trade of brooding seahorses occurred, most interviewees recognized the importance of reproduction to the maintenance of seahorses in the wild (and therefore of their source of income), and expressed concern over population declines; fishers associated the presence of a ventral pouch with reproduction in seahorses (regardless of them knowing which sex bears the pouch), and this may facilitate the construction of collaborative management options designed to eliminate captures of brooding specimens; fishers recognized microhabitats of importance to the maintenance of seahorse wild populations; fishers who kept seahorses in captivity tended to recognize the conditions as poor, and as being a cause of seahorse mortality.

  9. An evaluation of parturition indices in fishers

    USGS Publications Warehouse

    Frost, H.C.; York, E.C.; Krohn, W.B.; Elowe, K.D.; Decker, T.A.; Powell, S.M.; Fuller, T.K.

    1999-01-01

    Fishers (Martes pennanti) are important forest carnivores and furbearers that are susceptible to overharvest. Traditional indices used to monitor fisher populations typically overestimate litter size and the proportion of females that give birth. We evaluated the usefulness of 2 indices of reproduction to determine the proportion of female fishers that gave birth in a particular year. We used female fishers of known age and reproductive histories to compare the appearance of placental scars with the incidence of pregnancy and litter size. Microscopic observation of freshly removed reproductive tracts correctly identified pregnant fishers and correctly estimated litter size in 3 of 4 instances, but gross observation of placental scars failed to correctly identify pregnant fishers and litter size. Microscopic observations of reproductive tracts in carcasses that were not fresh also failed to identify pregnant animals and litter size. We evaluated mean sizes of anterior nipples to see if different reproductive classes could be distinguished. Mean anterior nipple size of captive and wild fishers correctly distinguished current-year breeders from nonbreeders. Former breeders were misclassified in 4 of 13 instances. Presence of placental scars accurately predicted parturition in a small sample of fishers, but absence of placental scars did not signify that a female did not give birth. In addition to enabling the estimation of parturition rates in live animals more accurately than traditional indices, mean anterior nipple size also provided an estimate of the percentage of adult females that successfully raised young. Though using mean anterior nipple size to index reproductive success looks promising, additional data are needed to evaluate the effects of using dried, stretched pelts on nipple size for management purposes.

  10. Fishers' knowledge and seahorse conservation in Brazil.

    PubMed

    Rosa, Ierecê Ml; Alves, Rômulo Rn; Bonifácio, Kallyne M; Mourão, José S; Osório, Frederico M; Oliveira, Tacyana Pr; Nottingham, Mara C

    2005-12-08

    From a conservationist perspective, seahorses are threatened fishes. Concomitantly, from a socioeconomic perspective, they represent a source of income to many fishing communities in developing countries. An integration between these two views requires, among other things, the recognition that seahorse fishers have knowledge and abilities that can assist the implementation of conservation strategies and of management plans for seahorses and their habitats. This paper documents the knowledge held by Brazilian fishers on the biology and ecology of the longsnout seahorse Hippocampus reidi. Its aims were to explore collaborative approaches to seahorse conservation and management in Brazil; to assess fishers' perception of seahorse biology and ecology, in the context of evaluating potential management options; and to increase fishers' involvement with seahorse conservation in Brazil. Data were obtained through questionnaires and interviews made during field surveys conducted in fishing villages located in the States of Piauí, Ceará, Paraíba, Maranhão, Pernambuco and Pará. We consider the following aspects as positive for the conservation of seahorses and their habitats in Brazil: fishers were willing to dialogue with researchers; although captures and/or trade of brooding seahorses occurred, most interviewees recognized the importance of reproduction to the maintenance of seahorses in the wild (and therefore of their source of income), and expressed concern over population declines; fishers associated the presence of a ventral pouch with reproduction in seahorses (regardless of them knowing which sex bears the pouch), and this may facilitate the construction of collaborative management options designed to eliminate captures of brooding specimens; fishers recognized microhabitats of importance to the maintenance of seahorse wild populations; fishers who kept seahorses in captivity tended to recognize the conditions as poor, and as being a cause of seahorse mortality.

  11. A fisher vector representation of GPR data for detecting buried objects

    NASA Astrophysics Data System (ADS)

    Karem, Andrew; Khalifa, Amine B.; Frigui, Hichem

    2016-05-01

    We present a new method, based on the Fisher Vector (FV), for detecting buried explosive objects using ground-penetrating radar (GPR) data. First, low-level dense SIFT features are extracted from a grid covering each region of interest (ROI). ROIs are identified as regions with high energy along the (down-track, depth) dimensions of the 3-D GPR cube, or with high energy along the (cross-track, depth) dimensions. Next, we model the training data (in the SIFT feature space) by a mixture of Gaussian components. Then, we construct FV descriptors based on the Fisher Kernel. The Fisher Kernel characterizes low-level features from an ROI by their deviation from a generative model. The deviation is the gradient of the ROI log-likelihood with respect to the generative model parameters. The vectorial representation of all the deviations is called the Fisher Vector. FV is a generalization of the standard Bag of Words (BoW) method, which provides a framework to map a set of local descriptors to a global feature vector. It is more efficient to compute than the BoW since it relies on a significantly smaller codebook. In addition, mapping a GPR signature into one global feature vector using this technique makes it more efficient to classify using simple and fast linear classifiers such as Support Vector Machines. The proposed approach is applied to detect buried explosive objects using GPR data. The selected data were accumulated across multiple dates and multiple test sites by a vehicle-mounted mine detector (VMMD) using a GPR sensor. These data consist of a diverse set of conventional landmines and other buried explosive objects of varying shapes, metal content, and burial depths. The performance of the proposed approach is analyzed using receiver operating characteristics (ROC) and is compared to other state-of-the-art feature representation methods.
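
    A minimal sketch of the mean-gradient part of a Fisher Vector computed from a diagonal-covariance Gaussian mixture, using random arrays as stand-ins for the dense SIFT descriptors of one ROI; the descriptor dimension and number of mixture components are arbitrary assumptions.

        # Sketch: Fisher Vector (gradients w.r.t. the GMM means only) for one set of local descriptors.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(4)
        train_desc = rng.normal(size=(2000, 16))   # descriptors used to fit the generative model
        roi_desc = rng.normal(size=(300, 16))      # descriptors from one region of interest

        gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(train_desc)
        gamma = gmm.predict_proba(roi_desc)        # posterior responsibilities, shape (N, K)
        N = roi_desc.shape[0]

        fv = []
        for k in range(gmm.n_components):
            diff = (roi_desc - gmm.means_[k]) / np.sqrt(gmm.covariances_[k])
            g_k = (gamma[:, k, None] * diff).sum(axis=0) / (N * np.sqrt(gmm.weights_[k]))
            fv.append(g_k)
        fv = np.concatenate(fv)                    # one global descriptor for the ROI (length K * D)
        print("Fisher Vector length:", fv.shape[0])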

  12. Application of Fisher Information to Complex Dynamic Systems (Tucson)

    EPA Science Inventory

    Fisher information was developed by the statistician Ronald Fisher as a measure of the information obtainable from data being used to fit a related parameter. Starting from the work of Ronald Fisher [1] and B. Roy Frieden [2], we have developed Fisher information as a measure of order ...

  13. Application of Fisher Information to Complex Dynamic Systems

    EPA Science Inventory

    Fisher information was developed by the statistician Ronald Fisher as a measure of the information obtainable from data being used to fit a related parameter. Starting from the work of Ronald Fisher [1] and B. Roy Frieden [2], we have developed Fisher information as a measure of order ...

  14. Nonparametric estimation of Fisher information from real data

    NASA Astrophysics Data System (ADS)

    Har-Shemesh, Omri; Quax, Rick; Miñano, Borja; Hoekstra, Alfons G.; Sloot, Peter M. A.

    2016-02-01

    The Fisher information matrix (FIM) is a widely used measure for applications including statistical inference, information geometry, experiment design, and the study of criticality in biological systems. The FIM is defined for a parametric family of probability distributions and its estimation from data follows one of two paths: either the distribution is assumed to be known and the parameters are estimated from the data or the parameters are known and the distribution is estimated from the data. We consider the latter case which is applicable, for example, to experiments where the parameters are controlled by the experimenter and a complicated relation exists between the input parameters and the resulting distribution of the data. Since we assume that the distribution is unknown, we use a nonparametric density estimation on the data and then compute the FIM directly from that estimate using a finite-difference approximation to estimate the derivatives in its definition. The accuracy of the estimate depends on both the method of nonparametric estimation and the difference Δ θ between the densities used in the finite-difference formula. We develop an approach for choosing the optimal parameter difference Δ θ based on large deviations theory and compare two nonparametric density estimation methods, the Gaussian kernel density estimator and a novel density estimation using field theory method. We also compare these two methods to a recently published approach that circumvents the need for density estimation by estimating a nonparametric f divergence and using it to approximate the FIM. We use the Fisher information of the normal distribution to validate our method and as a more involved example we compute the temperature component of the FIM in the two-dimensional Ising model and show that it obeys the expected relation to the heat capacity and therefore peaks at the phase transition at the correct critical temperature.

  15. Nonparametric estimation of Fisher information from real data.

    PubMed

    Har-Shemesh, Omri; Quax, Rick; Miñano, Borja; Hoekstra, Alfons G; Sloot, Peter M A

    2016-02-01

    The Fisher information matrix (FIM) is a widely used measure for applications including statistical inference, information geometry, experiment design, and the study of criticality in biological systems. The FIM is defined for a parametric family of probability distributions and its estimation from data follows one of two paths: either the distribution is assumed to be known and the parameters are estimated from the data or the parameters are known and the distribution is estimated from the data. We consider the latter case which is applicable, for example, to experiments where the parameters are controlled by the experimenter and a complicated relation exists between the input parameters and the resulting distribution of the data. Since we assume that the distribution is unknown, we use a nonparametric density estimation on the data and then compute the FIM directly from that estimate using a finite-difference approximation to estimate the derivatives in its definition. The accuracy of the estimate depends on both the method of nonparametric estimation and the difference Δθ between the densities used in the finite-difference formula. We develop an approach for choosing the optimal parameter difference Δθ based on large deviations theory and compare two nonparametric density estimation methods, the Gaussian kernel density estimator and a novel density estimation using field theory method. We also compare these two methods to a recently published approach that circumvents the need for density estimation by estimating a nonparametric f divergence and using it to approximate the FIM. We use the Fisher information of the normal distribution to validate our method and as a more involved example we compute the temperature component of the FIM in the two-dimensional Ising model and show that it obeys the expected relation to the heat capacity and therefore peaks at the phase transition at the correct critical temperature. PMID:26986433
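
    The finite-difference idea can be sketched for a single parameter: draw samples at theta - Δθ, theta and theta + Δθ, estimate the three densities with a Gaussian kernel density estimator, and integrate the squared score against the central density. The toy experiment below uses a unit-variance normal with a controllable mean, whose exact Fisher information is 1; the sample sizes and Δθ are arbitrary choices, not the paper's optimal ones.

        # Sketch: finite-difference estimate of a one-parameter Fisher information from samples.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(5)
        theta, dtheta, n = 0.0, 0.2, 20000
        x_minus = rng.normal(theta - dtheta, 1.0, size=n)   # samples at theta - dtheta
        x_plus = rng.normal(theta + dtheta, 1.0, size=n)    # samples at theta + dtheta
        x_mid = rng.normal(theta, 1.0, size=n)              # samples at theta

        grid = np.linspace(-5, 5, 1000)
        p_minus, p_plus, p_mid = (gaussian_kde(s)(grid) for s in (x_minus, x_plus, x_mid))

        # d(log p)/d(theta) via a central difference of the estimated densities
        score = (p_plus - p_minus) / (2 * dtheta * np.maximum(p_mid, 1e-12))
        fim = np.sum(score**2 * p_mid) * (grid[1] - grid[0])
        print("estimated Fisher information:", fim, "(exact value is 1)")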

  16. Kernel mucking in top

    SciTech Connect

    LeFebvre, W.

    1994-08-01

    For many years, the popular program top has aided system administrators in the examination of process resource usage on their machines. Yet few are familiar with the techniques involved in obtaining this information. Most of what is displayed by top is available only in the dark recesses of kernel memory. Extracting this information requires familiarity not only with how bytes are read from the kernel, but also with what data needs to be read. The wide variety of systems and variants of the Unix operating system in today's marketplace makes writing such a program very challenging. This paper explores the tremendous diversity in kernel information across the many platforms and the solutions employed by top to achieve and maintain ease of portability in the presence of such divergent systems.

  17. Calculates Thermal Neutron Scattering Kernel.

    1989-11-10

    Version 00 THRUSH computes the thermal neutron scattering kernel by the phonon expansion method for both coherent and incoherent scattering processes. The calculation of the coherent part is suitable only for calculating the scattering kernel for heavy water.

  18. Mutual Information, Fisher Information, and Efficient Coding.

    PubMed

    Wei, Xue-Xin; Stocker, Alan A

    2016-02-01

    Fisher information is generally believed to represent a lower bound on mutual information (Brunel & Nadal, 1998), a result that is frequently used in the assessment of neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought. For example, we find that in the small noise regime, Fisher information actually provides an upper bound on mutual information. Generally our results show that it is more appropriate to consider Fisher information as an approximation rather than a bound on mutual information. We analytically derive the correspondence between the two quantities and the conditions under which the approximation is good. Our results have implications for neural coding theories and the link between neural population coding and psychophysically measurable behavior. Specifically, they allow us to formulate the efficient coding problem of maximizing mutual information between a stimulus variable and the response of a neural population in terms of Fisher information. We derive a signature of efficient coding expressed as the correspondence between the population Fisher information and the distribution of the stimulus variable. The signature is more general than previously proposed solutions that rely on specific assumptions about the neural tuning characteristics. We demonstrate that it can explain measured tuning characteristics of cortical neural populations that do not agree with previous models of efficient coding.

  19. Anytime query-tuned kernel machine classifiers via Cholesky factorization

    NASA Technical Reports Server (NTRS)

    DeCoste, D.

    2002-01-01

    We recently demonstrated 2- to 64-fold query-time speedups of Support Vector Machine and Kernel Fisher classifiers via a new computational geometry method for anytime output bounds (DeCoste, 2002). This new paper refines our approach in two key ways. First, we introduce a simple linear algebra formulation based on Cholesky factorization, yielding simpler equations and lower computational overhead. Second, this new formulation suggests new methods for achieving additional speedups, including tuning on query samples. We demonstrate effectiveness on benchmark datasets.

  20. Fisher's ratio-based criterion for finding endmembers in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Gao, Cheng; Chen, Shih-Yu; Chang, Chein-I.

    2014-05-01

    Endmember extraction has recently received considerable interest in hyperspectral imagery. However, several issues in endmember extraction may have been overlooked. The first and foremost is the term "endmember extraction" itself. Many algorithms claimed to be endmember extraction algorithms actually do not extract true endmembers but rather find potential endmember candidates, referred to as virtual endmembers (VEs). Secondly, how difficult it is for an algorithm to find VEs is primarily determined by two key factors, endmember variability and endmember discriminability. While the former issue has been addressed recently in the literature, the latter has yet to be explored. This paper re-invents the wheel by developing a Fisher's ratio approach to finding VEs, using a Fisher's ratio criterion defined as the ratio of endmember variability to endmember discriminability.

  1. Robotic Intelligence Kernel: Visualization

    SciTech Connect

    2009-09-16

    The INL Robotic Intelligence Kernel-Visualization is the software that supports the user interface. It uses the RIK-C software to communicate information to and from the robot. The RIK-V illustrates the data in a 3D display and provides an operating picture wherein the user can task the robot.

  2. Robotic Intelligence Kernel: Architecture

    SciTech Connect

    2009-09-16

    The INL Robotic Intelligence Kernel Architecture (RIK-A) is a multi-level architecture that supports a dynamic autonomy structure. The RIK-A is used to coalesce hardware for sensing and action as well as software components for perception, communication, behavior and world modeling into a framework that can be used to create behaviors for humans to interact with the robot.

  3. The Baryonic Tully-Fisher Relation.

    PubMed

    McGaugh; Schombert; Bothun; de Blok WJ

    2000-04-20

    We explore the Tully-Fisher relation over five decades in stellar mass in galaxies with circular velocities ranging over 30 ≲ Vc ≲ 300 km s⁻¹. We find a clear break in the optical Tully-Fisher relation: field galaxies with Vc ≲ 90 km s⁻¹ fall below the relation defined by brighter galaxies. These faint galaxies, however, are very rich in gas; adding in the gas mass and plotting the baryonic disk mass Md = M* + Mgas in place of luminosity restores the single linear relation. The Tully-Fisher relation thus appears fundamentally to be a relation between rotation velocity and total baryonic mass of the form Md ∝ Vc⁴.

  4. [Fisher Syndrome and Bickerstaff Brainstem Encephalitis].

    PubMed

    Kuwabara, Satoshi

    2015-11-01

    Fisher syndrome has been regarded as a peculiar inflammatory neuropathy with ophthalmoplegia, ataxia, and areflexia, whereas Bickerstaff brainstem encephalitis has been considered a pure central nervous system disease characterized by ophthalmoplegia, ataxia, and consciousness disturbance. Both disorders share common features including preceding infection, albumin-cytological dissociation, and association with Guillain-Barré syndrome. The discovery of anti-GQ1b IgG antibodies further supports the view that the two disorders represent a single disease spectrum. The lesions in Fisher syndrome and Bickerstaff brainstem encephalitis are presumably determined by the expression of ganglioside GQ1b in the human peripheral and central nervous systems. Bickerstaff brainstem encephalitis is likely to represent a variant of Fisher syndrome with central nervous system involvement. PMID:26560952

  5. Fisher information and Rényi dimensions: A thermodynamical formalism

    NASA Astrophysics Data System (ADS)

    Godó, B.; Nagy, Á.

    2016-08-01

    The relation between the Fisher information and Rényi dimensions is established: the Fisher information can be expressed as a linear combination of the first and second derivatives of the Rényi dimensions with respect to the Rényi parameter β. The Rényi parameter β is the parameter of the Fisher information. A thermodynamical description based on the Fisher information with β being the inverse temperature is introduced for chaotic systems. The link between the Fisher information and the heat capacity is emphasized, and the Fisher heat capacity is introduced.

  6. Fisher information and Rényi dimensions: A thermodynamical formalism.

    PubMed

    Godó, B; Nagy, Á

    2016-08-01

    The relation between the Fisher information and Rényi dimensions is established: the Fisher information can be expressed as a linear combination of the first and second derivatives of the Rényi dimensions with respect to the Rényi parameter β. The Rényi parameter β is the parameter of the Fisher information. A thermodynamical description based on the Fisher information with β being the inverse temperature is introduced for chaotic systems. The link between the Fisher information and the heat capacity is emphasized, and the Fisher heat capacity is introduced. PMID:27586598

  7. MC Kernel: Broadband Waveform Sensitivity Kernels for Seismic Tomography

    NASA Astrophysics Data System (ADS)

    Stähler, Simon C.; van Driel, Martin; Auer, Ludwig; Hosseini, Kasra; Sigloch, Karin; Nissen-Meyer, Tarje

    2016-04-01

    We present MC Kernel, a software implementation to calculate seismic sensitivity kernels on arbitrary tetrahedral or hexahedral grids across the whole observable seismic frequency band. Seismic sensitivity kernels are the basis for seismic tomography, since they map measurements to model perturbations. Their calculation over the whole frequency range was so far only possible with approximative methods (Dahlen et al. 2000). Fully numerical methods were restricted to the lower frequency range (usually below 0.05 Hz, Tromp et al. 2005). With our implementation, it is possible to compute accurate sensitivity kernels for global tomography across the observable seismic frequency band. These kernels rely on wavefield databases computed via AxiSEM (www.axisem.info), and thus on spherically symmetric models. The advantage is that frequencies up to 0.2 Hz and higher can be accessed. Since the usage of irregular, adapted grids is an integral part of regularisation in seismic tomography, MC Kernel works in an inversion-grid-centred fashion: a Monte-Carlo integration method is used to project the kernel onto each basis function, which allows the desired precision of the kernel estimation to be controlled. It also means that the code concentrates calculation effort on regions of interest without prior assumptions about the kernel shape. The code makes extensive use of redundancies in calculating kernels for different receivers or frequency pass-bands for one earthquake, to facilitate its usage in large-scale global seismic tomography.

  8. FISHER INFORMATION AND ECOSYSTEM REGIME CHANGES

    EPA Science Inventory

    Following Fisher’s work, we propose two different expressions for the Fisher Information along with Shannon Information as a means of detecting and assessing shifts between alternative ecosystem regimes. Regime shifts are a consequence of bifurcations in the dynamics of an ecosys...

  9. EXERGY AND FISHER INFORMATION AS ECOLOGICAL INDEXES

    EPA Science Inventory

    Ecological indices are used to provide summary information about a particular aspect of ecosystem behavior. Many such indices have been proposed and here we investigate two: exergy and Fisher Information. Exergy, a thermodynamically based index, is a measure of maximum amount o...

  10. Kernel-aligned multi-view canonical correlation analysis for image recognition

    NASA Astrophysics Data System (ADS)

    Su, Shuzhi; Ge, Hongwei; Yuan, Yun-Hao

    2016-09-01

    Existing kernel-based correlation analysis methods mainly adopt a single kernel in each view. However, a single kernel is usually insufficient to characterize the nonlinear distribution information of a view. To solve this problem, we transform each original feature vector into a 2-dimensional feature matrix by means of kernel alignment, and then propose a novel kernel-aligned multi-view canonical correlation analysis (KAMCCA) method on the basis of the feature matrices. Our proposed method can simultaneously employ multiple kernels to better capture the nonlinear distribution information of each view, so that the correlation features learned by KAMCCA have good discriminating power in real-world image recognition. Extensive experiments are designed on five real-world image datasets, including NIR face images, thermal face images, visible face images, handwritten digit images, and object images. Promising experimental results on these datasets have demonstrated the effectiveness of our proposed method.

  11. Differential evolution algorithm-based kernel parameter selection for Fukunaga-Koontz Transform subspaces construction

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin

    2015-10-01

    The performance of kernel-based techniques depends on the selection of kernel parameters, which makes suitable parameter selection an important problem for many kernel-based techniques. This article presents a novel technique to learn the kernel parameters of a kernel Fukunaga-Koontz Transform based (KFKT) classifier. The proposed approach determines the appropriate values of the kernel parameters by optimizing an objective function constructed from the discrimination ability of the KFKT. For this purpose we have utilized the differential evolution algorithm (DEA). The new technique overcomes disadvantages of the traditional cross-validation method, such as its high time consumption, and it can be utilized on any type of data. Experiments on target detection applications with hyperspectral images verify the effectiveness of the proposed method.
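
    The optimization loop can be sketched with SciPy's differential evolution routine; here an RBF-kernel SVM stands in for the KFKT classifier, so the objective, data and parameter bounds are illustrative assumptions rather than the authors' setup.

        # Sketch: differential evolution tuning kernel parameters by maximizing cross-validated accuracy.
        import numpy as np
        from scipy.optimize import differential_evolution
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=300, n_features=20, random_state=0)

        def objective(params):
            log_gamma, log_C = params
            clf = SVC(kernel="rbf", gamma=10**log_gamma, C=10**log_C)
            return -cross_val_score(clf, X, y, cv=5).mean()   # minimize negative accuracy

        result = differential_evolution(objective, bounds=[(-4, 1), (-2, 3)],
                                        maxiter=10, popsize=8, seed=0)
        print("best log10(gamma), log10(C):", result.x, "cv accuracy:", -result.fun)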

  12. Kernel methods for phenotyping complex plant architecture.

    PubMed

    Kawamura, Koji; Hibrand-Saint Oyant, Laurence; Foucher, Fabrice; Thouroude, Tatiana; Loustau, Sébastien

    2014-02-01

    The Quantitative Trait Loci (QTL) mapping of plant architecture is a critical step for understanding the genetic determinism of plant architecture. Previous studies adopted simple measurements, such as plant height, stem diameter and branching intensity, for QTL mapping of plant architecture. Many of these quantitative traits are generally correlated with each other, which gives rise to statistical problems in the detection of QTLs. We aim to test the applicability of kernel methods to phenotyping inflorescence architecture and its QTL mapping. We first test Kernel Principal Component Analysis (KPCA) and Support Vector Machines (SVM) on an artificial dataset of simulated inflorescences with different types of flower distribution, which is coded as a sequence of flower numbers per node along a shoot. The ability of SVM and KPCA to discriminate the different inflorescence types is illustrated. We then apply the KPCA representation to a real dataset of rose inflorescence shoots (n=1460) obtained from a 98 F1 hybrid mapping population. We find kernel principal components with high heritability (>0.7), and the QTL analysis identifies a new QTL which was not detected by a trait-by-trait analysis of simple architectural measurements. The main tools developed in this paper could be used to tackle the general problem of QTL mapping of complex (sequences, 3D structure, graphs) phenotypic traits.

  13. Twin kernel embedding.

    PubMed

    Guo, Yi; Gao, Junbin; Kwan, Paul W

    2008-08-01

    In most existing dimensionality reduction algorithms, the main objective is to preserve relational structure among objects of the input space in a low dimensional embedding space. This is achieved by minimizing the inconsistency between two similarity/dissimilarity measures, one for the input data and the other for the embedded data, via a separate matching objective function. Based on this idea, a new dimensionality reduction method called Twin Kernel Embedding (TKE) is proposed. TKE addresses the problem of visualizing non-vectorial data that is difficult for conventional methods in practice due to the lack of efficient vectorial representation. TKE solves this problem by minimizing the inconsistency between the similarity measures captured respectively by their kernel Gram matrices in the two spaces. In the implementation, by optimizing a nonlinear objective function using the gradient descent algorithm, a local minimum can be reached. The results obtained include both the optimal similarity preserving embedding and the appropriate values for the hyperparameters of the kernel. Experimental evaluation on real non-vectorial datasets confirmed the effectiveness of TKE. TKE can be applied to other types of data beyond those mentioned in this paper whenever suitable measures of similarity/dissimilarity can be defined on the input data. PMID:18566501

  14. An early "Atkins' Diet": RA Fisher analyses a medical "experiment".

    PubMed

    Senn, Stephen

    2006-04-01

    A study on vitamin absorption which RA Fisher analysed for WRG Atkins and co-authored with him is critically examined. The historical background as well as correspondence between Atkins and Fisher is presented.

  15. Iris Image Blur Detection with Multiple Kernel Learning

    NASA Astrophysics Data System (ADS)

    Pan, Lili; Xie, Mei; Mao, Ling

    In this letter, we analyze the influence of motion and out-of-focus blur on both the frequency spectrum and the cepstrum of an iris image. Based on their characteristics, we define two new discriminative blur features represented by the Energy Spectral Density Distribution (ESDD) and the Singular Cepstrum Histogram (SCH). To merge the two features for blur detection, a merging kernel, which is a linear combination of two kernels, is proposed for use with a Support Vector Machine. Extensive experiments demonstrate the validity of our method by showing improved blur detection performance on both synthetic and real datasets.
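
    A merged kernel of this general form can be passed to an SVM as a precomputed Gram matrix; the sketch below mixes an RBF kernel on one feature set with a linear kernel on another using a fixed weight. The features, labels and mixing weight are arbitrary stand-ins, not the ESDD/SCH features of the paper.

        # Sketch: linear combination of two kernels fed to an SVM as a precomputed Gram matrix.
        import numpy as np
        from sklearn.metrics.pairwise import rbf_kernel, linear_kernel
        from sklearn.svm import SVC

        rng = np.random.default_rng(6)
        F1 = rng.normal(size=(200, 10))    # e.g. a spectrum-based blur feature (hypothetical)
        F2 = rng.normal(size=(200, 12))    # e.g. a cepstrum-based blur feature (hypothetical)
        y = rng.integers(0, 2, size=200)   # 0 = clear, 1 = blurred (toy labels)

        alpha = 0.6                        # mixing weight between the two kernels
        K = alpha * rbf_kernel(F1, gamma=0.1) + (1 - alpha) * linear_kernel(F2)

        clf = SVC(kernel="precomputed").fit(K, y)
        print("training accuracy:", clf.score(K, y))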

  16. HMM-Fisher: identifying differential methylation using a hidden Markov model and Fisher's exact test.

    PubMed

    Sun, Shuying; Yu, Xiaoqing

    2016-03-01

    DNA methylation is an epigenetic event that plays an important role in regulating gene expression. It is important to study DNA methylation, especially differential methylation patterns between two groups of samples (e.g. patients vs. normal individuals). With next generation sequencing (NGS) technologies, it is now possible to identify differential methylation patterns by considering methylation at the single CG site level in an entire genome. However, it is challenging to analyze large and complex NGS data. In order to address this difficult question, we have developed a new statistical method using a hidden Markov model and Fisher's exact test (HMM-Fisher) to identify differentially methylated cytosines and regions. We first use a hidden Markov chain to model the methylation signals and infer the methylation state as Not methylated (N), Partly methylated (P), or Fully methylated (F) for each individual sample. We then use Fisher's exact test to identify differentially methylated CG sites. We demonstrate the HMM-Fisher method and compare it with commonly cited methods using both simulated data and real sequencing data. The results show that HMM-Fisher outperforms the currently available methods to which it was compared. HMM-Fisher is efficient and robust in identifying heterogeneous DM regions. PMID:26854292
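
    The Fisher's exact test step can be illustrated on a single CG site (the read counts below are made up):

      # Sketch: test one CG site for differential methylation between two groups
      # by comparing methylated/unmethylated read counts with Fisher's exact test.
      from scipy.stats import fisher_exact

      # rows: group 1 vs. group 2; columns: methylated vs. unmethylated read counts
      table = [[28, 4],    # e.g. tumour samples at this CG site (made-up counts)
               [11, 19]]   # e.g. normal samples at this CG site (made-up counts)

      odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")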

  17. Fisher classifier and its probability of error estimation

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
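
    A minimal two-class sketch with invented data (the simple midpoint threshold stands in for the paper's optimal threshold): Fisher's direction, classification of the projected patterns, and a leave-one-out error estimate.

      # Sketch: two-class Fisher discriminant with a midpoint threshold on the
      # projection, and a leave-one-out estimate of its probability of error.
      import numpy as np

      rng = np.random.default_rng(0)
      X0 = rng.normal([0, 0], 1.0, size=(60, 2))
      X1 = rng.normal([2, 1], 1.0, size=(60, 2))
      X = np.vstack([X0, X1])
      y = np.r_[np.zeros(60), np.ones(60)]

      def fisher_rule(Xtr, ytr):
          m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
          Sw = np.cov(Xtr[ytr == 0].T) + np.cov(Xtr[ytr == 1].T)  # within-class scatter
          w = np.linalg.solve(Sw, m1 - m0)                        # Fisher's direction
          t = 0.5 * (w @ m0 + w @ m1)                             # midpoint threshold
          return lambda x: (x @ w > t).astype(float)

      errors = 0
      for i in range(len(X)):                # leave-one-out error estimation
          keep = np.arange(len(X)) != i
          rule = fisher_rule(X[keep], y[keep])
          errors += rule(X[i:i + 1])[0] != y[i]
      print("LOO error estimate:", errors / len(X))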

  18. Sexual selection unhandicapped by the Fisher process.

    PubMed

    Grafen, A

    1990-06-21

    A population genetic model of sexual selection is constructed in which, at equilibrium, males signal their quality by developing costly ornaments, and females pay costs to use the ornaments in mate choice. It is shown that the form of the equilibrium is uninfluenced by the Fisher process, that is, by self-reinforcement of female preferences. This is a working model of the handicap principle applied to sexual selection, and places Zahavi's handicap principle on the same logical footing as the Fisher process, in that each can support sexual selection without the presence of the other. A way of measuring the relative importance of the two processes is suggested that can be applied to both theories and facts. A style of modelling that allows simple genetics and complicated biology to be combined is recommended.

  19. Olympic Fisher Reintroduction Project- 2009 Progress Report

    USGS Publications Warehouse

    Lewis, Jeffrey C.; Happe, Patti J.; Jenkins, Kurt J.; Manson, David J.

    2009-01-01

    The 2009 progress report is a summary of the reintroduction, monitoring, and research efforts undertaken during the first two years of the Olympic fisher reintroduction project. Jeffrey C. Lewis of Washington Department of Fish and Wildlife, Patti J. Happe of Olympic National Park, and Kurt J. Jenkins of U. S. Geological Survey are the principal investigators of the monitoring and research program associated with the reintroduction. David J. Manson of Olympic National Park is the lead biological technician.

  20. Olympic Fisher Reintroduction Project: 2010 Progress Report

    USGS Publications Warehouse

    Lewis, Jeffrey C.; Happe, Patti J.; Jenkins, Kurt J.; Manson, David J.

    2010-01-01

    The 2010 progress report is a summary of the reintroduction, monitoring, and research efforts undertaken during the third year of the Olympic fisher reintroduction project. Jeffrey C. Lewis of Washington Department of Fish and Wildlife, Patti J. Happe of Olympic National Park, and Kurt J. Jenkins of U. S. Geological Survey are the principal investigators of the monitoring and research program associated with the reintroduction. David J. Manson of Olympic National Park is the lead biological technician.

  1. Fisher information, nonclassicality and quantum revivals

    NASA Astrophysics Data System (ADS)

    Romera, Elvira; de los Santos, Francisco

    2013-11-01

    Wave packet revivals and fractional revivals are studied by means of a measure of nonclassicality based on the Fisher information. In particular, we show that the spreading and the regeneration of initially Gaussian wave packets in a quantum bouncer and in the infinite square-well correspond, respectively, to high and low nonclassicality values. This result is in accordance with the physical expectations that at a quantum revival wave packets almost recover their initial shape and the classical motion revives temporarily afterward.

  2. Fisher equation for a decaying brane

    NASA Astrophysics Data System (ADS)

    Ghoshal, Debashis

    2011-12-01

    We consider the inhomogeneous decay of an unstable D-brane. The dynamical equation that describes this process (in light-cone time) is a variant of the non-linear reaction-diffusion equation that first made its appearance in the pioneering work of (Luther and) Fisher and appears in a variety of natural phenomena. We analyze its travelling front solution using singular perturbation theory.

  3. Miller Fisher syndrome presenting as palate paralysis.

    PubMed

    Noureldine, Mohammad Hassan A; Sweid, Ahmad; Ahdab, Rechdi

    2016-09-15

    We report a 63-year-old patient who presented to our care initially with a hypernasal voice followed by ataxia, ptosis, dysphonia, and paresthesias. The patient's history, physical examination, and additional tests led to a Miller Fisher syndrome (MFS) diagnosis. Palatal paralysis as an inaugurating manifestation of MFS is quite rare and requires special attention from neurologists and otolaryngologists. Although the presentation may be as benign as an acute change in voice, early diagnosis and prompt management may prevent further complications. PMID:27609285

  4. Cell Development obeys Maximum Fisher Information

    PubMed Central

    Frieden, B. Roy; Gatenby, Robert A.

    2014-01-01

    Eukaryotic cell development has been optimized by natural selection to obey maximal intracellular flux of messenger proteins. This, in turn, implies maximum Fisher information on angular position about a target nuclear pore complex (NPC). The cell is simply modeled as spherical, with cell membrane (CM) diameter 10μm and concentric nuclear membrane (NM) diameter 6μm. The NM contains ≈ 3000 nuclear pore complexes (NPCs). Development requires messenger ligands to travel from the CM, through the NPCs, to DNA target binding sites. Ligands acquire negative charge by phosphorylation, passing through the cytoplasm over Newtonian trajectories toward positively charged NPCs (utilizing positive nuclear localization sequences). The CM-NPC channel obeys maximized mean protein flux F and Fisher information I at the NPC, with first-order δI = 0 and approximate second-order δ²I ≈ 0 stability to environmental perturbations. Many of its predictions are confirmed, including the dominance of protein pathways of 1–4 proteins, a 4 nm size for the EGFR protein and the flux value F ≈ 10¹⁶ proteins/m²·s. After entering the nucleus, each protein ultimately delivers its ligand information to a DNA target site with maximum probability, i.e. maximum Kullback-Leibler entropy H_KL. In a smoothness limit H_KL → I_DNA/2, so that the total CM-NPC-DNA channel obeys maximum Fisher I. Thus maximum information → non-equilibrium, one condition for life. PMID:23747917

  5. A Global Estimate of the Number of Coral Reef Fishers.

    PubMed

    Teh, Louise S L; Teh, Lydia C L; Sumaila, U Rashid

    2013-01-01

    Overfishing threatens coral reefs worldwide, yet there is no reliable estimate on the number of reef fishers globally. We address this data gap by quantifying the number of reef fishers on a global scale, using two approaches - the first estimates reef fishers as a proportion of the total number of marine fishers in a country, based on the ratio of reef-related to total marine fish landed values. The second estimates reef fishers as a function of coral reef area, rural coastal population, and fishing pressure. In total, we find that there are 6 million reef fishers in 99 reef countries and territories worldwide, of which at least 25% are reef gleaners. Our estimates are an improvement over most existing fisher population statistics, which tend to omit accounting for gleaners and reef fishers. Our results suggest that slightly over a quarter of the world's small-scale fishers fish on coral reefs, and half of all coral reef fishers are in Southeast Asia. Coral reefs evidently support the socio-economic well-being of numerous coastal communities. By quantifying the number of people who are employed as reef fishers, we provide decision-makers with an important input into planning for sustainable coral reef fisheries at the appropriate scale. PMID:23840327

  6. A Global Estimate of the Number of Coral Reef Fishers

    PubMed Central

    Teh, Louise S. L.; Teh, Lydia C. L.; Sumaila, U. Rashid

    2013-01-01

    Overfishing threatens coral reefs worldwide, yet there is no reliable estimate on the number of reef fishers globally. We address this data gap by quantifying the number of reef fishers on a global scale, using two approaches - the first estimates reef fishers as a proportion of the total number of marine fishers in a country, based on the ratio of reef-related to total marine fish landed values. The second estimates reef fishers as a function of coral reef area, rural coastal population, and fishing pressure. In total, we find that there are 6 million reef fishers in 99 reef countries and territories worldwide, of which at least 25% are reef gleaners. Our estimates are an improvement over most existing fisher population statistics, which tend to omit accounting for gleaners and reef fishers. Our results suggest that slightly over a quarter of the world’s small-scale fishers fish on coral reefs, and half of all coral reef fishers are in Southeast Asia. Coral reefs evidently support the socio-economic well-being of numerous coastal communities. By quantifying the number of people who are employed as reef fishers, we provide decision-makers with an important input into planning for sustainable coral reef fisheries at the appropriate scale. PMID:23840327

  8. Kernel Phase and Kernel Amplitude in Fizeau Imaging

    NASA Astrophysics Data System (ADS)

    Pope, Benjamin J. S.

    2016-09-01

    Kernel phase interferometry is an approach to high angular resolution imaging which enhances the performance of speckle imaging with adaptive optics. Kernel phases are self-calibrating observables that generalize the idea of closure phases from non-redundant arrays to telescopes with arbitrarily shaped pupils, by considering a matrix-based approximation to the diffraction problem. In this paper I discuss the recent history of kernel phase, in particular in the matrix-based study of sparse arrays, and propose an analogous generalization of the closure amplitude to kernel amplitudes. This new approach can self-calibrate throughput and scintillation errors in optical imaging, which extends the power of kernel phase-like methods to symmetric targets where amplitude and not phase calibration can be a significant limitation, and will enable further developments in high angular resolution astronomy.
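
    A toy linear-algebra sketch, using a random transfer matrix rather than a real pupil model: kernel phases lie in the left null space of the pupil-to-Fourier phase transfer matrix, so pupil-plane phase errors cancel while target phase information survives.

      # Sketch: build a kernel-phase operator K from a (placeholder) phase transfer
      # matrix A and check that instrumental pupil-plane errors are projected out.
      import numpy as np
      from scipy.linalg import null_space

      rng = np.random.default_rng(0)
      n_fourier, n_pupil = 24, 10
      A = rng.normal(size=(n_fourier, n_pupil))   # placeholder transfer matrix

      K = null_space(A.T).T                       # rows span the left null space: K @ A = 0

      pupil_errors = rng.normal(size=n_pupil)     # wavefront aberrations
      target_phase = rng.normal(scale=0.1, size=n_fourier)
      measured = A @ pupil_errors + target_phase  # simplified phase measurement model

      print(np.max(np.abs(K @ (A @ pupil_errors))))  # ~1e-15: instrumental errors cancel
      print(np.std(K @ measured))                    # kernel phases retain the target signal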

  9. Fisher-Mendel controversy in genetics: scientific argument, intellectual integrity, a fair society, Western falls and bioethical evaluation.

    PubMed

    Tang, Bing H

    2009-10-01

    This review article aims to discuss and analyze the background and findings of the Fisher-Mendel controversy in genetics and to elucidate the scientific argument and intellectual integrity involved, their importance in a fair society, and the lessons learned from Western falls. At the onset of this review, the kernel of the Mendel-Fisher controversy is dissected and then identified. An organizational restructuring that never reached a happy synchronization in the years following 1933 is described; it was at that time that Fisher succeeded Karl Pearson not only as the Francis Galton Professor of Eugenics but also as the chief of the Galton Laboratory at University College, London. The academic style of eugenics in the late 19th and early 20th centuries in the UK is then introduced. Fisher's ideology at that time, and its effects on the human value system and policy-making at that juncture, are portrayed. A bioethical assessment is provided. Lessons in history, the emergence of the Eastern phenomenon and the decline of Western power are outlined. PMID:19811718

  10. Robotic intelligence kernel

    DOEpatents

    Bruemmer, David J.

    2009-11-17

    A robot platform includes perceptors, locomotors, and a system controller. The system controller executes a robot intelligence kernel (RIK) that includes a multi-level architecture and a dynamic autonomy structure. The multi-level architecture includes a robot behavior level for defining robot behaviors that incorporate robot attributes, and a cognitive level for defining conduct modules that blend an adaptive interaction between predefined decision functions and the robot behaviors. The dynamic autonomy structure is configured for modifying a transaction capacity between an operator intervention and a robot initiative and may include multiple levels, with at least a teleoperation mode configured to maximize the operator intervention and minimize the robot initiative and an autonomous mode configured to minimize the operator intervention and maximize the robot initiative. Within the RIK, at least the cognitive level includes the dynamic autonomy structure.

  11. Flexible kernel memory.

    PubMed

    Nowicki, Dimitri; Siegelmann, Hava

    2010-06-11

    This paper introduces a new model of associative memory, capable of both binary and continuous-valued inputs. Based on kernel theory, the memory model is, on one hand, a generalization of Radial Basis Function networks and, on the other, analogous to a Hopfield network in feature space. Attractors can be added, deleted, and updated on-line simply, without harming existing memories, and the number of attractors is independent of input dimension. Input vectors do not have to adhere to a fixed or bounded dimensionality; they can increase and decrease it without relearning previous memories. A memory consolidation process enables the network to generalize concepts and form clusters of input data, which outperforms many unsupervised clustering techniques; this process is demonstrated on handwritten digits from MNIST. Another process, reminiscent of memory reconsolidation, is introduced, in which existing memories are refreshed and tuned with new inputs; this process is demonstrated on series of morphed faces.

  12. Probability-confidence-kernel-based localized multiple kernel learning with lp norm.

    PubMed

    Han, Yina; Liu, Guizhong

    2012-06-01

    Localized multiple kernel learning (LMKL) is an attractive strategy for combining multiple heterogeneous features in terms of their discriminative power for each individual sample. However, models that excessively fit a specific sample obstruct the extension to unseen data, while a more general form is often insufficient for diverse locality characterization. Hence, both learning sample-specific local models for each training datum and extending the learned models to unseen test data should be equally addressed in designing an LMKL algorithm. In this paper, for an integrative solution, we propose a probability confidence kernel (PCK), which measures per-sample similarity with respect to a probabilistic-prediction-based class attribute: the class attribute similarity complements the spatial-similarity-based base kernels for more reasonable locality characterization, and the predefined form of the involved class probability density function facilitates the extension to the whole input space and ensures its statistical meaning. Incorporating PCK into a support-vector-machine-based LMKL framework, we propose a new PCK-LMKL with an arbitrary lp-norm constraint implied in the definition of PCKs, where both the parameters in PCK and the final classifier can be efficiently optimized in a joint manner. Evaluations of PCK-LMKL on both benchmark machine learning data sets (ten University of California Irvine (UCI) data sets) and challenging computer vision data sets (15-scene data set and Caltech-101 data set) have shown it to achieve state-of-the-art performance.

  13. GeneFisher-P: variations of GeneFisher as processes in Bio-jETI

    PubMed Central

    Lamprecht, Anna-Lena; Margaria, Tiziana; Steffen, Bernhard; Sczyrba, Alexander; Hartmeier, Sven; Giegerich, Robert

    2008-01-01

    Background PCR primer design is an everyday, but not trivial task requiring state-of-the-art software. We describe the popular tool GeneFisher and explain its recent restructuring using workflow techniques. We apply a service-oriented approach to model and implement GeneFisher-P, a process-based version of the GeneFisher web application, as a part of the Bio-jETI platform for service modeling and execution. We show how to introduce a flexible process layer to meet the growing demand for improved user-friendliness and flexibility. Results Within Bio-jETI, we model the process using the jABC framework, a mature model-driven, service-oriented process definition platform. We encapsulate remote legacy tools and integrate web services using jETI, an extension of the jABC for seamless integration of remote resources as basic services, ready to be used in the process. Some of the basic services used by GeneFisher are in fact already provided as individual web services at BiBiServ and can be directly accessed. Others are legacy programs, and are made available to Bio-jETI via the jETI technology. The full power of service-based process orientation is required when more bioinformatics tools, available as web services or via jETI, lead to easy extensions or variations of the basic process. This concerns for instance variations of data retrieval or alignment tools as provided by the European Bioinformatics Institute (EBI). Conclusions The resulting service- and process-oriented GeneFisher-P demonstrates how basic services from heterogeneous sources can be easily orchestrated in the Bio-jETI platform and lead to a flexible family of specialized processes tailored to specific tasks. PMID:18460174

  14. Labeled Graph Kernel for Behavior Analysis.

    PubMed

    Zhao, Ruiqi; Martinez, Aleix M

    2016-08-01

    Automatic behavior analysis from video is a major topic in many areas of research, including computer vision, multimedia, robotics, biology, cognitive science, social psychology, psychiatry, and linguistics. Two major problems are of interest when analyzing behavior. First, we wish to automatically categorize observed behaviors into a discrete set of classes (i.e., classification). For example, to determine word production from video sequences in sign language. Second, we wish to understand the relevance of each behavioral feature in achieving this classification (i.e., decoding). For instance, to know which behavior variables are used to discriminate between the words apple and onion in American Sign Language (ASL). The present paper proposes to model behavior using a labeled graph, where the nodes define behavioral features and the edges are labels specifying their order (e.g., before, overlaps, start). In this approach, classification reduces to a simple labeled graph matching. Unfortunately, the complexity of labeled graph matching grows exponentially with the number of categories we wish to represent. Here, we derive a graph kernel to quickly and accurately compute this graph similarity. This approach is very general and can be plugged into any kernel-based classifier. Specifically, we derive a Labeled Graph Support Vector Machine (LGSVM) and a Labeled Graph Logistic Regressor (LGLR) that can be readily employed to discriminate between many actions (e.g., sign language concepts). The derived approach can be readily used for decoding too, yielding invaluable information for the understanding of a problem (e.g., to know how to teach a sign language). The derived algorithms allow us to achieve higher accuracy results than those of state-of-the-art algorithms in a fraction of the time. We show experimental results on a variety of problems and datasets, including multimodal data.

  15. Prenatal development in fishers (Martes pennanti)

    USGS Publications Warehouse

    Frost, H.C.; Krohn, W.B.; Bezembluk, E.A.; Lott, R.; Wallace, C.R.

    2005-01-01

    We evaluated and quantified prenatal growth of fishers (Martes pennanti) using ultrasonography. Seven females gave birth to 21 kits. The first identifiable embryonic structures were seen 42 d prepartum; these appeared to be unimplanted blastocysts or gestational sacs, which subsequently implanted in the uterine horns. Maternal and fetal heart rates were monitored from first detection to birth. Maternal heart rates did not differ among sampling periods, while fetal heart rates increased from first detection to birth. Head and body differentiation, visible limbs and skeletal ossification were visible by 30, 23 and 21 d prepartum, respectively. Mean diameter of gestational sacs and crown-rump lengths were linearly related to gestational age (P < 0.001). Biparietal and body diameters were also linearly related to gestational age (P < 0.001) and correctly predicted parturition dates within 1-2 d. © 2004 Elsevier Inc. All rights reserved.

  16. Kernel-based whole-genome prediction of complex traits: a review

    PubMed Central

    Morota, Gota; Gianola, Daniel

    2014-01-01

    Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics. PMID:25360145
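
    As a hedged example of one such kernel method (simulated genotypes and arbitrary hyperparameters, not a result from the review): Gaussian-kernel ridge regression on a marker matrix.

      # Sketch: predict a quantitative trait from genome-wide markers with
      # Gaussian-kernel ridge regression, a typical non-parametric kernel approach.
      import numpy as np
      from sklearn.kernel_ridge import KernelRidge
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_ind, n_mark = 300, 1000
      M = rng.integers(0, 3, size=(n_ind, n_mark)).astype(float)   # 0/1/2 genotype codes

      beta = np.zeros(n_mark)
      beta[rng.choice(n_mark, 40, replace=False)] = rng.normal(size=40)   # causal markers
      # additive effects, one interaction term and noise
      y = M @ beta + 0.5 * M[:, 0] * M[:, 1] + rng.normal(scale=2.0, size=n_ind)

      model = KernelRidge(kernel="rbf", gamma=1.0 / n_mark, alpha=1.0)
      print(cross_val_score(model, M, y, cv=5, scoring="r2").mean())   # predictive ability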

  17. Job Satisfaction among Fishers in the Dominican Republic

    ERIC Educational Resources Information Center

    Ruiz, Victor

    2012-01-01

    This paper reflects on the results of a job satisfaction study of small-scale fishers in the Dominican Republic. The survey results suggest that, although fishers are generally satisfied with their occupations, they also have serious concerns. These concerns include anxieties about the level of earnings, the condition of marine resources and the…

  18. "The Streets of Harlem": The Short Stories of Rudolph Fisher.

    ERIC Educational Resources Information Center

    Deutsch, Leonard J.

    1979-01-01

    It is argued in this review of Fisher's stories that no other writer captured the manner and morals of Harlem in the 1920s and that these stories establish Fisher as the principal historian and social critic of the Harlem Renaissance period. (Author/EB)

  19. Fisher information in a quantum-critical environment

    SciTech Connect

    Sun Zhe; Ma Jian; Lu Xiaoming; Wang Xiaoguang

    2010-08-15

    We consider a process of parameter estimation in a spin-j system surrounded by a quantum-critical spin chain. Quantum Fisher information lies at the heart of the estimation task. We employ an Ising spin chain in a transverse field, which exhibits a quantum phase transition, as the environment. Fisher information decays with time almost monotonically when the environment reaches the critical point. By choosing a fixed time or taking the time average, one can see that the quantum Fisher information presents a sudden drop at the critical point. Different initial states of the environment are considered. The phenomenon that the quantum Fisher information, namely, the precision of estimation, changes dramatically can be used to detect the quantum criticality of the environment. We also introduce a general method to obtain the maximal Fisher information for a given state.

  1. R. A. Fisher and his advocacy of randomization.

    PubMed

    Hall, Nancy S

    2007-01-01

    The requirement of randomization in experimental design was first stated by R. A. Fisher, statistician and geneticist, in 1925 in his book Statistical Methods for Research Workers. Earlier designs were systematic and involved the judgment of the experimenter; this led to possible bias and inaccurate interpretation of the data. Fisher's dictum was that randomization eliminates bias and permits a valid test of significance. Randomization in experimenting had been used by Charles Sanders Peirce in 1885 but the practice was not continued. Fisher developed his concepts of randomizing as he considered the mathematics of small samples, in discussions with "Student," William Sealy Gosset. Fisher published extensively. His principles of experimental design were spread worldwide by the many "voluntary workers" who came from other institutions to Rothamsted Agricultural Station in England to learn Fisher's methods. PMID:18175604

  2. High-Speed Tracking with Kernelized Correlation Filters.

    PubMed

    Henriques, João F; Caseiro, Rui; Martins, Pedro; Batista, Jorge

    2015-03-01

    The core component of most modern trackers is a discriminative classifier, tasked with distinguishing between the target and the surrounding environment. To cope with natural image changes, this classifier is typically trained with translated and scaled sample patches. Such sets of samples are riddled with redundancies: any overlapping pixels are constrained to be the same. Based on this simple observation, we propose an analytic model for datasets of thousands of translated patches. By showing that the resulting data matrix is circulant, we can diagonalize it with the discrete Fourier transform, reducing both storage and computation by several orders of magnitude. Interestingly, for linear regression our formulation is equivalent to a correlation filter, used by some of the fastest competitive trackers. For kernel regression, however, we derive a new kernelized correlation filter (KCF), that unlike other kernel algorithms has the exact same complexity as its linear counterpart. Building on it, we also propose a fast multi-channel extension of linear correlation filters, via a linear kernel, which we call dual correlation filter (DCF). Both KCF and DCF outperform top-ranking trackers such as Struck or TLD on a 50 videos benchmark, despite running at hundreds of frames-per-second, and being implemented in a few lines of code (Algorithm 1). To encourage further developments, our tracking framework was made open-source. PMID:26353263
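
    A stripped-down, single-channel sketch of the core KCF-style computations (no windowing, feature extraction or model updates; sigma and lambda are arbitrary): FFT-based Gaussian kernel correlation, Fourier-domain training, and dense detection.

      # Sketch: Gaussian kernel correlation over all cyclic shifts via the FFT,
      # ridge-regression training in the Fourier domain, and dense detection.
      import numpy as np

      def gaussian_correlation(a, b, sigma=0.5):
          c = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
          d = (a ** 2).sum() + (b ** 2).sum() - 2 * c
          return np.exp(-np.maximum(d, 0) / (sigma ** 2 * a.size))

      def train(x, y, lam=1e-4):
          k = gaussian_correlation(x, x)
          return np.fft.fft2(y) / (np.fft.fft2(k) + lam)     # alpha in the Fourier domain

      def detect(alphaf, x, z):
          k = gaussian_correlation(z, x)
          return np.fft.ifft2(np.fft.fft2(k) * alphaf).real  # response over all shifts

      rng = np.random.default_rng(0)
      x = rng.random((64, 64))                        # training patch centred on the target
      yy, xx = np.mgrid[-32:32, -32:32]
      y = np.exp(-(xx ** 2 + yy ** 2) / (2 * 2.0 ** 2))
      y = np.roll(y, (-32, -32), axis=(0, 1))         # Gaussian regression target, peak at (0, 0)

      alphaf = train(x, y)
      resp = detect(alphaf, x, np.roll(x, (5, 3), axis=(0, 1)))
      print(tuple(int(i) for i in np.unravel_index(resp.argmax(), resp.shape)))  # ~ (5, 3)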

  4. Learning With Jensen-Tsallis Kernels.

    PubMed

    Ghoshdastidar, Debarghya; Adsul, Ajay P; Dukkipati, Ambedkar

    2016-10-01

    Jensen-type [Jensen-Shannon (JS) and Jensen-Tsallis] kernels were first proposed by Martins et al. (2009). These kernels are based on JS divergences that originated in information theory. In this paper, we extend the Jensen-type kernels on probability measures to define positive-definite kernels on Euclidean space. We show that the special cases of these kernels include dot-product kernels. Since Jensen-type divergences are multidistribution divergences, we propose their multipoint variants, and study spectral clustering and kernel methods based on these. We also provide experimental studies on a benchmark image database and a gene expression database that show the benefits of the proposed kernels compared with the existing kernels. The experiments on clustering also demonstrate the use of constructing multipoint similarities.
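
    One common construction, assumed here for concreteness rather than taken from the paper: an exponentiated Jensen-Shannon kernel between discrete probability distributions.

      # Sketch: exp(-t * JS(p, q)) as a kernel on normalised histograms; the
      # Jensen-Shannon divergence is computed from Shannon entropies.
      import numpy as np

      def shannon_entropy(p):
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      def js_divergence(p, q):
          m = 0.5 * (p + q)
          return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

      def js_kernel(p, q, t=1.0):
          return np.exp(-t * js_divergence(p, q))

      p = np.array([0.1, 0.4, 0.5])
      q = np.array([0.3, 0.3, 0.4])
      print(js_kernel(p, q))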

  5. RTOS kernel in portable electrocardiograph

    NASA Astrophysics Data System (ADS)

    Centeno, C. A.; Voos, J. A.; Riva, G. G.; Zerbini, C.; Gonzalez, E. A.

    2011-12-01

    This paper presents the use of a Real Time Operating System (RTOS) on a portable electrocardiograph based on a microcontroller platform. All medical device digital functions are performed by the microcontroller. The electrocardiograph CPU is based on the 18F4550 microcontroller, in which a uCOS-II RTOS can be embedded. The decision to use the kernel is based on its benefits, its license for educational use, and its intrinsic time control and peripheral management. The feasibility of its use on the electrocardiograph is evaluated against the minimum memory requirements imposed by the kernel structure. The kernel's own tools were used for time estimation and evaluation of the resources used by each process. After this feasibility analysis, the firmware was migrated from cyclic code to a structure based on separate processes, or tasks, able to synchronize on events, resulting in an electrocardiograph running on a single Central Processing Unit (CPU) under the RTOS.

  6. Fishers, farms, and forests in eastern North America.

    PubMed

    Lancaster, Pamela A; Bowman, Jeff; Pond, Bruce A

    2008-07-01

    The fisher (Martes pennanti) has recently recovered from historic extirpations across much of its geographic range. There are at least five explanations for the recovery of the fisher, including changes in the amount of habitat, the suitability of habitat, trapping pressure, societal attitudes toward predators, and climate. We evaluated a recovering fisher population in Ontario to test two conditions we viewed as necessary to support the hypothesis that fisher populations have increased due to an increase in the amount of forested land. First, we tested whether the amount of forested land has increased. Second, we tested whether contemporary fisher abundance (and therefore habitat quality) was related to the amount of forest. Topographic maps showed that the proportion of forested land in the study area had increased by 1.9% per decade since 1934 and 3.3% per decade since 1959, likely as a result of land conversion from agricultural uses. Overall the proportion of the study area that was forested increased from 29% to 40% during 1934 to 1995. Census data from the region indicated that there had been a decline in the amount of land area being farmed during the last 50 years. Recent livetrapping data showed that fisher abundance was positively related to the proportion of landscapes that were forested. Based on our results, we could not reject the hypothesis that an increase in the amount of forested land has contributed to the recovery of fisher populations.

  7. Infrasound analysis using Fisher detector and Hough transform

    NASA Astrophysics Data System (ADS)

    Averbuch, Gil; Assink, Jelle D.; Smets, Pieter S. M.; Evers, Läslo G.

    2016-04-01

    Automatic detection of infrasound signals from the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty requires low rates of both false alarms and missed events. The Fisher detector is a statistical method used for detecting such infrasonic events. The detector aims to detect coherent signals after beamforming is applied to the recordings. A detection is declared when the Fisher ratio exceeds a threshold value. The Fisher distribution for such a detection is affected by the SNR. While events with a high Fisher ratio and SNR can easily be detected automatically, events with lower Fisher ratios and SNRs might be missed. The Hough transform is a post-processing step. It is based on a slope-intercept transform applied to discretely sampled data, with the goal of finding straight lines (in apparent velocity and back azimuth). Applying it to the results from the Fisher detector is advantageous in the case of noisy data, which corresponds to low Fisher ratios and SNRs. Results of the Hough transform on synthetic data with SNR down to 0.7 reduced the number of missed events. In this work, we will present the results of an automatic detector based on both methods. Synthetic data with different lengths and SNRs are evaluated. Furthermore, continuous data from the IMS infrasound station I18DK will be analyzed. We will compare the performances of both methods and investigate their ability to reduce the number of missed events.
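
    A simplified formulation on already time-aligned traces (the operational IMS windowing and beamforming grid are not reproduced): the Fisher ratio as a beam-variance to residual-variance ratio.

      # Sketch: Fisher ratio for N time-aligned sensor traces; ~1 for pure noise,
      # well above 1 when a coherent signal is present across the array.
      import numpy as np

      def fisher_ratio(x):
          # x: shape (n_sensors, n_samples), aligned for one beam direction
          n, t = x.shape
          beam = x.mean(axis=0)
          between = n * np.sum((beam - beam.mean()) ** 2) / (t - 1)
          within = np.sum((x - beam) ** 2) / (t * (n - 1))
          return between / within

      rng = np.random.default_rng(0)
      n, t = 8, 2000
      noise = rng.normal(size=(n, t))
      signal = np.sin(2 * np.pi * 2.0 * np.arange(t) / 200.0)   # coherent across sensors

      print(fisher_ratio(noise))            # close to 1
      print(fisher_ratio(noise + signal))   # well above 1: a detection above threshold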

  8. Density Estimation with Mercer Kernels

    NASA Technical Reports Server (NTRS)

    Macready, William G.

    2003-01-01

    We present a new method for density estimation based on Mercer kernels. The density estimate can be understood as the density induced on a data manifold by a mixture of Gaussians fit in a feature space. As is usual, the feature space and data manifold are defined with any suitable positive-definite kernel function. We modify the standard EM algorithm for mixtures of Gaussians to infer the parameters of the density. One benefit of the approach is its conceptual simplicity and uniform applicability over many different types of data. Preliminary results are presented for a number of simple problems.

  9. [Charles Miller Fisher: a giant of neurology].

    PubMed

    Tapia, Jorge

    2013-08-01

    C. Miller Fisher MD, one of the great neurologists in the 20th century, died in April 2012. Born in Canada, he studied medicine at the University of Toronto. As a Canadian Navy medical doctor he participated in World War II and was a war prisoner from 1941 to 1944. He did a residency in neurology at the Montreal Neurological Institute between 1946 and 1948, and later on was a Fellow in Neurology and Neuropathology at the Boston City Hospital. In 1954 he entered the Massachusetts General Hospital as a neurologist and neuropathologist, where he remained until his retirement, in 2005. His academic career ended as Professor Emeritus at Harvard University. His area of special interest in neurology was cerebrovascular disease (CVD). In 1954 he created the first Vascular Neurology service in the world and trained many leading neurologists on this field. His scientific contributions are present in more than 250 publications, as journal articles and book chapters. Many of his articles, certainly not restricted to CVD, were seminal in neurology. Several concepts and terms that he coined are currently used in daily clinical practice. The chapters on CVD, in seven consecutive editions of Harrison's Internal Medicine textbook, are among his highlights. His death was deeply felt by the neurological community.

  10. Entropy, Fisher Information and Variance with Frost-Musulin Potential

    NASA Astrophysics Data System (ADS)

    Idiodi, J. O. A.; Onate, C. A.

    2016-09-01

    This study presents the Shannon and Renyi information entropy for both position and momentum space and the Fisher information for the position-dependent mass Schrödinger equation with the Frost-Musulin potential. The analysis of the quantum mechanical probability has been obtained via the Fisher information. The variance information of this potential is equally computed. This controls both the chemical and physical properties of some molecular systems. We have observed the behaviour of the Shannon entropy, Renyi entropy, Fisher information and variance with the quantum number n, respectively.
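
    A small numerical sketch, computed for a Gaussian test density rather than the Frost-Musulin eigenstates: position-space Shannon entropy, Fisher information and variance evaluated on a grid.

      # Sketch: S = -∫ ρ ln ρ dx, I = ∫ ρ'(x)²/ρ dx and the variance of a 1-D density.
      import numpy as np

      x = np.linspace(-10, 10, 4001)
      dx = x[1] - x[0]
      sigma = 1.3
      rho = np.exp(-x ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))  # test density

      S = -np.sum(rho * np.log(rho)) * dx                  # Shannon entropy
      I = np.sum(np.gradient(rho, x) ** 2 / rho) * dx      # Fisher information
      var = np.sum(x ** 2 * rho) * dx - (np.sum(x * rho) * dx) ** 2

      print(S, 0.5 * np.log(2 * np.pi * np.e * sigma ** 2))  # matches the analytic Gaussian entropy
      print(I, 1 / sigma ** 2)                               # matches 1/sigma² for a Gaussian
      print(var, sigma ** 2)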

  11. The NAS kernel benchmark program

    NASA Technical Reports Server (NTRS)

    Bailey, D. H.; Barton, J. T.

    1985-01-01

    A collection of benchmark test kernels that measure supercomputer performance has been developed for the use of the NAS (Numerical Aerodynamic Simulation) program at the NASA Ames Research Center. This benchmark program is described in detail and the specific ground rules are given for running the program as a performance test.

  12. Adaptive wiener image restoration kernel

    DOEpatents

    Yuan, Ding

    2007-06-05

    A method and device for restoration of electro-optical image data using an adaptive Wiener filter begins with constructing the imaging system's Optical Transfer Function and the Fourier transforms of the noise and the image. A spatial representation of the imaged object is restored by spatial convolution of the image using a Wiener restoration kernel.
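
    A frequency-domain sketch with a synthetic OTF and an assumed constant noise-to-signal ratio (not the patented construction): building and applying a Wiener restoration kernel.

      # Sketch: build a Wiener restoration kernel from an Optical Transfer Function
      # (OTF) and a noise-to-signal power ratio, and apply it in the Fourier domain
      # (equivalent to spatial convolution with the restoration kernel).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 128
      img = np.zeros((n, n))
      img[48:80, 48:80] = 1.0                                 # synthetic scene

      fy = np.fft.fftfreq(n)[:, None]
      fx = np.fft.fftfreq(n)[None, :]
      otf = np.exp(-(fx ** 2 + fy ** 2) / (2 * 0.05 ** 2))    # Gaussian-blur OTF (placeholder)

      blurred = np.fft.ifft2(np.fft.fft2(img) * otf).real
      noisy = blurred + 0.01 * rng.normal(size=img.shape)

      nsr = 1e-3                                              # assumed constant noise-to-signal ratio
      wiener = np.conj(otf) / (np.abs(otf) ** 2 + nsr)        # Wiener restoration kernel
      restored = np.fft.ifft2(np.fft.fft2(noisy) * wiener).real

      # compare restoration error against the error of the blurred, noisy input
      print(np.abs(restored - img).mean(), np.abs(noisy - img).mean())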

  13. Local Observed-Score Kernel Equating

    ERIC Educational Resources Information Center

    Wiberg, Marie; van der Linden, Wim J.; von Davier, Alina A.

    2014-01-01

    Three local observed-score kernel equating methods that integrate methods from the local equating and kernel equating frameworks are proposed. The new methods were compared with their earlier counterparts with respect to such measures as bias--as defined by Lord's criterion of equity--and percent relative error. The local kernel item response…

  14. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  15. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  16. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  17. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  18. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... Standards for Shelled Almonds, or which has embedded dirt or other foreign material not easily removed...

  19. Parameter-based Fisher's information of orthogonal polynomials

    NASA Astrophysics Data System (ADS)

    Dehesa, J. S.; Olmos, B.; Yanez, R. J.

    2008-04-01

    The Fisher information of the classical orthogonal polynomials with respect to a parameter is introduced, its interest justified and its explicit expression for the Jacobi, Laguerre, Gegenbauer and Grosjean polynomials found.
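
    For reference, the generic parameter-based Fisher information of a probability density ρ(x; θ) depending on a parameter θ is

      I(\theta) = \int \frac{\left[\partial_\theta \rho(x;\theta)\right]^2}{\rho(x;\theta)}\, dx ;

    the paper specializes this type of functional to densities built from the classical orthogonal polynomial families named above (the exact densities and closed-form expressions are not reproduced here).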

  20. Wigner functions defined with Laplace transform kernels.

    PubMed

    Oh, Se Baek; Petruccelli, Jonathan C; Tian, Lei; Barbastathis, George

    2011-10-24

    We propose a new Wigner-type phase-space function using Laplace transform kernels--Laplace kernel Wigner function. Whereas momentum variables are real in the traditional Wigner function, the Laplace kernel Wigner function may have complex momentum variables. Due to the property of the Laplace transform, a broader range of signals can be represented in complex phase-space. We show that the Laplace kernel Wigner function exhibits similar properties in the marginals as the traditional Wigner function. As an example, we use the Laplace kernel Wigner function to analyze evanescent waves supported by surface plasmon polariton.

  1. Genetic Discrimination

    MedlinePlus

    MedlinePlus consumer health resource on genetic discrimination, including the April 25, 2007 Statement of Administration Policy from the Office of Management and Budget.

  2. 76 FR 78739 - Agency Information Collection (Regulation on Application for Fisher Houses and Other Temporary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-19

    ... AFFAIRS Agency Information Collection (Regulation on Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application) Activity Under OMB Review AGENCY: Veterans Health Administration... Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application, VA Forms...

  3. 38 CFR 60.20 - Duration of Fisher House or other temporary lodging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2014-07-01 2014-07-01 false Duration of Fisher House... VETERANS AFFAIRS (CONTINUED) FISHER HOUSES AND OTHER TEMPORARY LODGING § 60.20 Duration of Fisher House or other temporary lodging. Fisher House or other temporary lodging may be awarded for the...

  4. 38 CFR 60.20 - Duration of Fisher House or other temporary lodging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2013-07-01 2013-07-01 false Duration of Fisher House... VETERANS AFFAIRS (CONTINUED) FISHER HOUSES AND OTHER TEMPORARY LODGING § 60.20 Duration of Fisher House or other temporary lodging. Fisher House or other temporary lodging may be awarded for the...

  5. Local Kernel for Brains Classification in Schizophrenia

    NASA Astrophysics Data System (ADS)

    Castellani, U.; Rossato, E.; Murino, V.; Bellani, M.; Rambaldelli, G.; Tansella, M.; Brambilla, P.

    In this paper a novel framework for brain classification is proposed in the context of mental health research. A learning-by-example method is introduced by combining local measurements with a non-linear Support Vector Machine. Instead of considering a voxel-by-voxel comparison between patients and controls, we focus on landmark points which are characterized by local region descriptors, namely the Scale Invariant Feature Transform (SIFT). Matching is then obtained by introducing a local kernel for which the samples are represented by unordered sets of features. Moreover, a new weighting approach is proposed to take into account the discriminative relevance of the detected groups of features. Experiments have been performed on a set of 54 patients with schizophrenia and 54 normal controls, on which regions of interest (ROIs) have been manually traced by experts. Preliminary results on the Dorso-lateral PreFrontal Cortex (DLPFC) region are promising, since a successful classification rate of up to 75% has been obtained with this technique, and the performance improved up to 85% when the subjects were stratified by sex.

  6. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on the utilization of an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a properly energized sample surface. The study investigated the possibility of applying HSI techniques for classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies have been acquired by a laboratory device equipped with an HSI system working in the near-infrared field (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.

  7. Kernel Near Principal Component Analysis

    SciTech Connect

    MARTIN, SHAWN B.

    2002-07-01

    We propose a novel algorithm based on Principal Component Analysis (PCA). First, we present an interesting approximation of PCA using Gram-Schmidt orthonormalization. Next, we combine our approximation with the kernel functions from Support Vector Machines (SVMs) to provide a nonlinear generalization of PCA. After benchmarking our algorithm in the linear case, we explore its use in both the linear and nonlinear cases. We include applications to face data analysis, handwritten digit recognition, and fluid flow.

  8. On the validity of cosmological Fisher matrix forecasts

    SciTech Connect

    Wolz, Laura; Kilbinger, Martin; Weller, Jochen; Giannantonio, Tommaso E-mail: martin.kilbinger@cea.fr E-mail: tommaso@usm.lmu.de

    2012-09-01

    We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation of state parameters w_0 and w_a. For purely geometrical probes, and especially when marginalising over w_a, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalized errors for purely geometrical probes by 30%-70%. For cases including structure formation such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecasted errors changing only at the 5% level. We then explore non-linear transformations resulting in physically-motivated parameters and investigate whether these parameterisations exhibit a Gaussian behaviour. We conclude that for the purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
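
    A toy forecast with a two-parameter observable and made-up errors (not a DETF survey likelihood): the basic Fisher-matrix recipe, with marginalised errors read off the inverse Fisher matrix.

      # Sketch: F_ij = sum_k (dmu_k/dp_i)(dmu_k/dp_j) / sigma_k^2 for a toy
      # distance-like observable mu(z; p); marginalised errors come from F^-1.
      import numpy as np

      z = np.linspace(0.1, 1.5, 30)
      sigma = 0.05 * np.ones_like(z)          # made-up measurement errors

      def mu(p):                              # toy observable; p plays the role of (w_0, w_a)
          w0, wa = p
          return 5 * np.log10((1 + z) * (1 + 0.5 * w0 * z + 0.1 * wa * z ** 2))

      p0 = np.array([-1.0, 0.0])
      eps = 1e-5
      J = np.column_stack([(mu(p0 + eps * e) - mu(p0 - eps * e)) / (2 * eps)
                           for e in np.eye(2)])     # numerical derivatives dmu/dp

      F = J.T @ (J / sigma[:, None] ** 2)            # Fisher matrix
      cov = np.linalg.inv(F)
      print(np.sqrt(np.diag(cov)))                   # marginalised forecast errors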

  9. R.A. Fisher's contributions to genetical statistics.

    PubMed

    Thompson, E A

    1990-12-01

    R. A. Fisher (1890-1962) was a professor of genetics, and many of his statistical innovations found expression in the development of methodology in statistical genetics. However, whereas his contributions in mathematical statistics are easily identified, in population genetics he shares his preeminence with Sewall Wright (1889-1988) and J. B. S. Haldane (1892-1965). This paper traces some of Fisher's major contributions to the foundations of statistical genetics, and his interactions with Wright and with Haldane which contributed to the development of the subject. With modern technology, both statistical methodology and genetic data are changing. Nonetheless much of Fisher's work remains relevant, and may even serve as a foundation for future research in the statistical analysis of DNA data. For Fisher's work reflects his view of the role of statistics in scientific inference, expressed in 1949: There is no wide or urgent demand for people who will define methods of proof in set theory in the name of improving mathematical statistics. There is a widespread and urgent demand for mathematicians who understand that branch of mathematics known as theoretical statistics, but who are capable also of recognising situations in the real world to which such mathematics is applicable. In recognising features of the real world to which his models and analyses should be applicable, Fisher laid a lasting foundation for statistical inference in genetic analyses. PMID:2085639

  10. Robust Pedestrian Classification Based on Hierarchical Kernel Sparse Representation.

    PubMed

    Sun, Rui; Zhang, Guanghai; Yan, Xiaoxing; Gao, Jun

    2016-08-16

    Vision-based pedestrian detection has become an active topic in computer vision and autonomous vehicles. It aims at detecting pedestrians appearing ahead of the vehicle using a camera so that autonomous vehicles can assess the danger and take action. Due to varied illumination and appearance, complex background and occlusion, pedestrian detection in outdoor environments is a difficult problem. In this paper, we propose a novel hierarchical feature extraction and weighted kernel sparse representation model for pedestrian classification. Initially, hierarchical feature extraction based on a CENTRIST descriptor is used to capture discriminative structures. A max pooling operation is used to enhance the invariance to varying appearance. Then, a kernel sparse representation model is proposed to fully exploit the discrimination information embedded in the hierarchical local features, with a Gaussian weight function as the measure to effectively handle occlusion in pedestrian images. Extensive experiments are conducted on benchmark databases, including INRIA, Daimler, an artificially generated dataset and a real occluded dataset, demonstrating the more robust performance of the proposed method compared to state-of-the-art pedestrian classification methods.

  11. Robust Pedestrian Classification Based on Hierarchical Kernel Sparse Representation

    PubMed Central

    Sun, Rui; Zhang, Guanghai; Yan, Xiaoxing; Gao, Jun

    2016-01-01

    Vision-based pedestrian detection has become an active topic in computer vision and autonomous vehicles. It aims at detecting pedestrians appearing ahead of the vehicle using a camera so that autonomous vehicles can assess the danger and take action. Due to varied illumination and appearance, complex background and occlusion, pedestrian detection in outdoor environments is a difficult problem. In this paper, we propose a novel hierarchical feature extraction and weighted kernel sparse representation model for pedestrian classification. Initially, hierarchical feature extraction based on a CENTRIST descriptor is used to capture discriminative structures. A max pooling operation is used to enhance the invariance to varying appearance. Then, a kernel sparse representation model is proposed to fully exploit the discrimination information embedded in the hierarchical local features, with a Gaussian weight function as the measure to effectively handle occlusion in pedestrian images. Extensive experiments are conducted on benchmark databases, including INRIA, Daimler, an artificially generated dataset and a real occluded dataset, demonstrating the more robust performance of the proposed method compared to state-of-the-art pedestrian classification methods. PMID:27537888

  14. Nonlinear projection trick in kernel methods: an alternative to the kernel trick.

    PubMed

    Kwak, Nojun

    2013-12-01

    In kernel methods such as kernel principal component analysis (PCA) and support vector machines, the so-called kernel trick is used to avoid direct calculations in a high (virtually infinite) dimensional kernel space. In this brief, based on the fact that the effective dimensionality of a kernel space is less than the number of training samples, we propose an alternative to the kernel trick that explicitly maps the input data into a reduced-dimensional kernel space. This is easily obtained by the eigenvalue decomposition of the kernel matrix. The proposed method is named the nonlinear projection trick, in contrast to the kernel trick. With this technique, the applicability of kernel methods is widened to arbitrary algorithms that do not use the dot product. The equivalence between the kernel trick and the nonlinear projection trick is shown for several conventional kernel methods. In addition, we extend PCA-L1, which uses the L1-norm instead of the L2-norm (or dot product), into a kernel version and show the effectiveness of the proposed approach. PMID:24805227
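
    As a rough illustration of the idea described above (not the paper's exact algorithm), the sketch below eigendecomposes a training kernel matrix and uses it to map points explicitly into a finite-dimensional space whose inner products reproduce the kernel values; the RBF kernel, the tolerance, and the toy data are arbitrary choices.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Pairwise RBF kernel between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nonlinear_projection(X_train, X_test, gamma=0.5, tol=1e-10):
    """Embed data explicitly into a finite-dimensional kernel space
    via the eigenvalue decomposition of the training kernel matrix."""
    K = rbf_kernel(X_train, X_train, gamma)
    w, U = np.linalg.eigh(K)                 # K = U diag(w) U^T
    keep = w > tol                           # effective dimensionality
    w, U = w[keep], U[:, keep]
    Z_train = U * np.sqrt(w)                 # rows: training points in the reduced space
    # A new point x maps to diag(w)^{-1/2} U^T k(X_train, x).
    Z_test = rbf_kernel(X_test, X_train, gamma) @ U / np.sqrt(w)
    return Z_train, Z_test

# Inner products in the reduced space reproduce the kernel values.
X = np.random.default_rng(0).normal(size=(20, 3))
Z, _ = nonlinear_projection(X, X)
assert np.allclose(Z @ Z.T, rbf_kernel(X, X))
print(Z.shape)   # dimension is bounded by the number of training samples
```

    Any algorithm that works on explicit feature vectors, not only on dot products, can then be run directly on Z_train and Z_test.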

  15. R. A. Fisher: a faith fit for eugenics.

    PubMed

    Moore, James

    2007-03-01

    In discussions of 'religion-and-science', faith is usually emphasized more than works, scientists' beliefs more than their deeds. By reversing the priority, a lingering puzzle in the life of Ronald Aylmer Fisher (1890-1962), statistician, eugenicist and founder of the neo-Darwinian synthesis, can be solved. Scholars have struggled to find coherence in Fisher's simultaneous commitment to Darwinism, Anglican Christianity and eugenics. The problem is addressed by asking what practical mode of faith or faithful mode of practice lent unity to his life? Families, it is argued, with their myriad practical, emotional and intellectual challenges, rendered a mathematically-based eugenic Darwinian Christianity not just possible for Fisher, but vital.

  16. Fisher information description of the classical-quantal transition

    NASA Astrophysics Data System (ADS)

    Kowalski, A. M.; Martín, M. T.; Plastino, A.; Rosso, O. A.

    2011-06-01

    We investigate the classical limit of the dynamics of a semiclassical system that represents the interaction between matter and a given field. Using the Fisher information measure (F) as a quantifier of the process, we find that it adequately describes the transition, detecting its most salient details. Used in conjunction with other information quantifiers, such as the normalized Shannon entropy (H) and the statistical complexity (C), by recourse to appropriate planar representations such as the Fisher-entropy (F×H) and Fisher-complexity (F×C) planes, one obtains a better visualization of the transition than that provided by any single quantifier by itself. In evaluating these information-theory quantifiers, we used the Bandt and Pompe methodology to obtain the corresponding probability distribution.
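
    For readers unfamiliar with the Bandt and Pompe methodology mentioned above, the sketch below builds the ordinal-pattern probability distribution of a time series and evaluates a discrete Fisher information measure on it. The embedding dimension, the normalization constant, and the synthetic signal are illustrative assumptions; the semiclassical dynamics studied in the paper is not reproduced.

```python
import numpy as np
from itertools import permutations

def bandt_pompe_probs(x, D=4, tau=1):
    """Ordinal-pattern (Bandt-Pompe) probability distribution of a 1-D series."""
    counts = {p: 0 for p in permutations(range(D))}
    n = len(x) - (D - 1) * tau
    for i in range(n):
        window = x[i:i + D * tau:tau]
        counts[tuple(np.argsort(window))] += 1   # rank pattern of the window
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def fisher_information(p):
    """Discrete Fisher information measure of a probability vector
    (one common convention, with normalization constant 1/2)."""
    q = np.sqrt(p)
    return 0.5 * np.sum(np.diff(q) ** 2)

# Example on a synthetic noisy oscillation (placeholder for the model dynamics).
t = np.linspace(0, 20 * np.pi, 5000)
x = np.sin(t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
p = bandt_pompe_probs(x, D=4)
print("Fisher information:", fisher_information(p))
```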

  17. Predicting Protein Function Using Multiple Kernels.

    PubMed

    Yu, Guoxian; Rangwala, Huzefa; Domeniconi, Carlotta; Zhang, Guoji; Zhang, Zili

    2015-01-01

    High-throughput experimental techniques provide a wide variety of heterogeneous proteomic data sources. To exploit the information spread across multiple sources for protein function prediction, these data sources are transformed into kernels and then integrated into a composite kernel. Several methods first optimize the weights on these kernels to produce a composite kernel, and then train a classifier on the composite kernel. As such, these approaches result in an optimal composite kernel, but not necessarily in an optimal classifier. On the other hand, some approaches optimize the loss of binary classifiers and learn weights for the different kernels iteratively. For multi-class or multi-label data, these methods have to solve the problem of optimizing weights on these kernels for each of the labels, which is computationally expensive and ignores the correlation among labels. In this paper, we propose a method called Predicting Protein Function using Multiple Kernels (ProMK). ProMK iteratively alternates between learning optimal kernel weights and reducing the empirical loss of the multi-label classifier for each of the labels simultaneously. ProMK can integrate kernels selectively and downgrade the weights on noisy kernels. We investigate the performance of ProMK on several publicly available protein function prediction benchmarks and synthetic datasets. We show that the proposed approach performs better than previously proposed protein function prediction approaches that integrate multiple data sources and than multi-label multiple kernel learning methods. The code for our proposed method is available at https://sites.google.com/site/guoxian85/promk.
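
    ProMK itself jointly learns the kernel weights and the multi-label classifier; as a much simpler illustration of integrating several data sources through kernels, the sketch below forms a fixed-weight composite kernel and trains a precomputed-kernel SVM on it. The toy data, the three base kernels, and the weights are placeholders, not anything from the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel, polynomial_kernel
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Toy stand-in for heterogeneous data sources (placeholder, not protein data).
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# Base kernels are computed from the same X here; in practice each source
# (sequence, expression, interactions, ...) would contribute its own kernel.
kernels = [rbf_kernel(X, gamma=0.05), linear_kernel(X), polynomial_kernel(X, degree=2)]

def composite_kernel(kernels, weights):
    """Weighted sum of base kernels; weights are assumed non-negative."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * Ki for wi, Ki in zip(w, kernels))

K = composite_kernel(kernels, weights=[0.5, 0.3, 0.2])   # illustrative fixed weights
clf = SVC(kernel="precomputed")
print(cross_val_score(clf, K, y, cv=5).mean())
```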

  18. Kernel earth mover's distance for EEG classification.

    PubMed

    Daliri, Mohammad Reza

    2013-07-01

    Here, we propose a new kernel approach based on the earth mover's distance (EMD) for electroencephalography (EEG) signal classification. The EEG time series are first transformed into histograms in this approach. The distance between these histograms is then computed using the EMD in a pair-wise manner. We bring the distances into a kernel form called kernel EMD. The support vector classifier can then be used for the classification of EEG signals. The experimental results on the real EEG data show that the new kernel method is very effective, and can classify the data with higher accuracy than traditional methods.
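
    A minimal sketch of the pipeline described above, under the assumptions that each signal is summarized by an amplitude histogram and that the one-dimensional earth mover's distance coincides with the first Wasserstein distance. Turning the distance matrix into a kernel by exponentiation is a common heuristic that is not guaranteed to be positive definite, and the synthetic signals stand in for real EEG.

```python
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for two classes of EEG segments (placeholders).
signals = [rng.normal(0, 1, 500) for _ in range(30)] + \
          [rng.normal(0, 2, 500) for _ in range(30)]
labels = np.array([0] * 30 + [1] * 30)

# Summarize each signal by an amplitude histogram on a common grid.
edges = np.linspace(-6, 6, 41)
centers = 0.5 * (edges[:-1] + edges[1:])
hists = [np.histogram(s, bins=edges, density=True)[0] + 1e-12 for s in signals]

def emd(h1, h2):
    # 1-D earth mover's distance between two histograms on the same grid.
    return wasserstein_distance(centers, centers, u_weights=h1, v_weights=h2)

# Pairwise EMD matrix turned into a kernel (heuristic, not necessarily PSD).
n = len(hists)
D = np.array([[emd(hists[i], hists[j]) for j in range(n)] for i in range(n)])
K = np.exp(-D / D.mean())

clf = SVC(kernel="precomputed").fit(K, labels)
print("training accuracy:", clf.score(K, labels))
```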

  19. ANALYSIS AND REJECTION SAMPLING OF WRIGHT-FISHER DIFFUSION BRIDGES

    PubMed Central

    Schraiber, Joshua G.; Griffiths, Robert C.; Evans, Steven N.

    2013-01-01

    We investigate the properties of a Wright-Fisher diffusion process started from frequency x at time 0 and conditioned to be at frequency y at time T. Such a process is called a bridge. Bridges arise naturally in the analysis of selection acting on standing variation and in the inference of selection from allele frequency time series. We establish a number of results about the distribution of neutral Wright-Fisher bridges and develop a novel rejection sampling scheme for bridges under selection that we use to study their behavior. PMID:24001410

  20. Fisher Controls introduces Snug Meter to gas industry

    SciTech Connect

    Share, J.

    1996-04-01

    Spurred by an industry demanding a sleeker look that will appeal to consumers, Fisher Controls International inc., has introduced a compact natural gas meter that not only is considerably smaller than existing models, but also incorporates features that company officials feel may set new standards. Termed the Snug meter, the four-chamber device is particularly designed for multi-dwelling buildings and is also the initial foray of Fisher--a recognized leader in North America for pressure-control and regulation equipment--into the meter industry. This paper reviews the design features of this new meter.

  1. Traveling waves in the nonlocal KPP-Fisher equation: Different roles of the right and the left interactions

    NASA Astrophysics Data System (ADS)

    Hasik, Karel; Kopfová, Jana; Nábělková, Petra; Trofimchuk, Sergei

    2016-04-01

    We consider the nonlocal KPP-Fisher equation u_t(t,x) = u_{xx}(t,x) + u(t,x)(1 - (K*u)(t,x)), which describes the evolution of population density u(t,x) with respect to time t and location x. The non-locality is expressed in terms of the convolution of u(t,·) with kernel K(·) ≥ 0, ∫_R K(s) ds = 1. The restrictions K(s), s ≥ 0, and K(s), s ≤ 0, are responsible for interactions of an individual with his left and right neighbors, respectively. We show that these two parts of K play quite different roles as for the existence and uniqueness of traveling fronts to the KPP-Fisher equation. In particular, if the left interaction is dominant, the uniqueness of fronts can be proved, while the dominance of the right interaction can induce the co-existence of monotone and oscillating fronts. We also present a short proof of the existence of traveling waves without assuming various technical restrictions usually imposed on K.
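
    A numerical sketch, not taken from the paper: an explicit finite-difference integration of the nonlocal equation on a periodic domain, with the convolution term evaluated by FFT. The symmetric Gaussian kernel, domain size, and time step are illustrative choices; the paper's results concern, in particular, asymmetric kernels, which would be modelled by choosing K differently.

```python
import numpy as np

# Explicit finite-difference sketch of u_t = u_xx + u(1 - K*u) on a periodic domain.
L, N = 100.0, 512
dx = L / N
x = np.arange(N) * dx
dt = 0.2 * dx ** 2                       # conservative step for the diffusion term

# Normalized Gaussian interaction kernel sampled on the periodic grid (illustrative).
xc = np.minimum(x, L - x)                # distance to the origin on the circle
K = np.exp(-xc ** 2 / 2.0)
K /= K.sum() * dx                        # so that the kernel integrates to 1
K_hat = np.fft.fft(K)

u = np.where(np.abs(x - L / 2) < 5, 1.0, 0.0)   # initial patch of population

for _ in range(10000):
    conv = np.real(np.fft.ifft(K_hat * np.fft.fft(u))) * dx   # (K*u)(x)
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx ** 2  # u_xx
    u = u + dt * (lap + u * (1.0 - conv))

print("min/max of u after integration:", u.min(), u.max())
```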

  2. Molecular Hydrodynamics from Memory Kernels.

    PubMed

    Lesnicki, Dominika; Vuilleumier, Rodolphe; Carof, Antoine; Rotenberg, Benjamin

    2016-04-01

    The memory kernel for a tagged particle in a fluid, computed from molecular dynamics simulations, decays algebraically as t^{-3/2}. We show how the hydrodynamic Basset-Boussinesq force naturally emerges from this long-time tail and generalize the concept of hydrodynamic added mass. This mass term is negative in the present case of a molecular solute, which is at odds with incompressible hydrodynamics predictions. Lastly, we discuss the various contributions to the friction, the associated time scales, and the crossover between the molecular and hydrodynamic regimes upon increasing the solute radius. PMID:27104730

  3. An Extended Method of SIRMs Connected Fuzzy Inference Method Using Kernel Method

    NASA Astrophysics Data System (ADS)

    Seki, Hirosato; Mizuguchi, Fuhito; Watanabe, Satoshi; Ishii, Hiroaki; Mizumoto, Masaharu

    The single input rule modules connected fuzzy inference method (SIRMs method) by Yubazaki et al. can decrease the number of fuzzy rules drastically in comparison with conventional fuzzy inference methods. Moreover, Seki et al. have proposed a functional-type SIRMs method which generalizes the consequent part of the SIRMs method to a function. However, these SIRMs methods cannot be applied to XOR (exclusive OR). In this paper, we propose a “kernel-type SIRMs method” which applies the kernel trick to the SIRMs method, and show that this method can treat XOR. Further, a learning algorithm for the proposed SIRMs method is derived by using the steepest descent method, and compared with those of the conventional SIRMs method and the kernel perceptron by applying them to the identification of nonlinear functions, a medical diagnostic system, and discriminant analysis of Iris data.
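
    The point that the kernel trick lets a simple learner handle XOR can be checked with a plain kernel perceptron, sketched below; this is not the authors' kernel-type SIRMs method, and the RBF kernel and number of epochs are arbitrary.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    return np.exp(-gamma * np.sum((a - b) ** 2))

# XOR, which no linear perceptron can separate.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])

K = np.array([[rbf(a, b) for b in X] for a in X])
alpha = np.zeros(len(X))            # dual coefficients of the kernel perceptron

for _ in range(100):                # perceptron epochs
    for i in range(len(X)):
        if np.sign(np.sum(alpha * y * K[:, i])) != y[i]:
            alpha[i] += 1.0         # mistake-driven update

pred = np.sign(K.T @ (alpha * y))
print(pred)                         # matches y: XOR is learned in the kernel space
```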

  4. FISHER INFORMATION AS A METRIC FOR SUSTAINABLE REGIMES

    EPA Science Inventory

    The important question in sustainability is not whether the world is sustainable, but whether a humanly acceptable regime of the world is sustainable. We propose Fisher Information as a metric for the sustainability of dynamic regimes in complex systems. The quantity now known ...

  5. Fisher Transformations for Correlations Corrected for Selection and Missing Data.

    ERIC Educational Resources Information Center

    Mendoza, Jorge L.

    1993-01-01

    A Fisher Z transformation is developed for the corrected correlation for conditions in which the criterion data are missing because of selection on the predictor and in which the criterion is missing at random, not because of selection. The two Z transformations were evaluated in a computer simulation and found to be accurate. (SLD)
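
    For reference, the uncorrected Fisher Z transformation that the paper builds on has a simple closed form, z = arctanh(r), with approximate standard error 1/sqrt(n-3); the sketch below uses it for a confidence interval. The selection-corrected transformations developed in the paper are not reproduced here, and the numbers are made up.

```python
import numpy as np
from scipy.stats import norm

def fisher_ci(r, n, conf=0.95):
    """Confidence interval for a correlation via the standard Fisher Z transform."""
    z = np.arctanh(r)                     # z = 0.5 * ln((1 + r) / (1 - r))
    se = 1.0 / np.sqrt(n - 3)
    zcrit = norm.ppf(0.5 + conf / 2)
    lo, hi = z - zcrit * se, z + zcrit * se
    return np.tanh(lo), np.tanh(hi)       # back-transform to the correlation scale

print(fisher_ci(r=0.45, n=80))            # roughly (0.26, 0.61)
```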

  6. Quantum Fisher and skew information for Unruh accelerated Dirac qubit

    NASA Astrophysics Data System (ADS)

    Banerjee, Subhashish; Alok, Ashutosh Kumar; Omkar, S.

    2016-08-01

    We develop a Bloch vector representation of the Unruh channel for a Dirac field mode. This is used to provide a unified, analytical treatment of quantum Fisher and skew information for a qubit subjected to the Unruh channel, both in its pure form as well as in the presence of experimentally relevant external noise channels. The time evolution of Fisher and skew information is studied along with the impact of external environment parameters such as temperature and squeezing. The external noises are modelled by both purely dephasing phase damping and the squeezed generalised amplitude damping channels. An interesting interplay between the external reservoir temperature and squeezing on the Fisher and skew information is observed, in particular, for the action of the squeezed generalised amplitude damping channel. It is seen that for some regimes, squeezing can enhance the quantum information against the deteriorating influence of the ambient environment. Similar features are also observed for the analogous study of skew information, highlighting a similar origin of the Fisher and skew information.

  7. Fisher-Shannon product and quantum revivals in wavepacket dynamics

    NASA Astrophysics Data System (ADS)

    García, T.; de los Santos, F.; Romera, E.

    2014-01-01

    We show the usefulness of the Fisher-Shannon information product in the study of the sequence of collapses and revivals that take place along the time evolution of quantum wavepackets. This fact is illustrated in two models, a quantum bouncer and a graphene quantum ring.

  8. Testing the Difference between Dependent Correlations Using the Fisher Z.

    ERIC Educational Resources Information Center

    Ramseyer, Gary C.

    1979-01-01

    A procedure is discussed for testing the significance of the difference in two correlated correlation coefficients, using Fisher's Z-Transformation. The procedure is applicable to a wide range of problems involving tests between dependent correlations and has documented mathematical support when its power curves are examined. (MH)

  9. Postcolonial Appalachia: Bhabha, Bakhtin, and Diane Gilliam Fisher's "Kettle Bottom"

    ERIC Educational Resources Information Center

    Stevenson, Sheryl

    2006-01-01

    Diane Gilliam Fisher's 2004 award-winning book of poems, "Kettle Bottom," offers students a revealing vantage point for seeing Appalachian regional culture in a postcolonial context. An artful and accessible poetic sequence that was selected as the 2005 summer reading for entering students at Smith College, "Kettle Bottom"…

  10. The Fisher-Yates Exact Test and Unequal Sample Sizes

    ERIC Educational Resources Information Center

    Johnson, Edgar M.

    1972-01-01

    A computational short cut suggested by Feldman and Klinger for the one-sided Fisher-Yates exact test is clarified and is extended to the calculation of probability values for certain two-sided tests when sample sizes are unequal. (Author)
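
    The computational short cut itself is not reproduced in this record; for orientation, the sketch below runs one- and two-sided Fisher exact tests on a 2x2 table with unequal sample sizes using SciPy. The table entries are hypothetical.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table with unequal sample sizes (rows: groups, columns: outcome).
table = [[8, 2],      # group A: 10 observations
         [11, 19]]    # group B: 30 observations

odds_two, p_two = fisher_exact(table, alternative="two-sided")
odds_one, p_one = fisher_exact(table, alternative="greater")
print(f"two-sided p = {p_two:.4f}, one-sided p = {p_one:.4f}")
```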

  11. FISHER INFORMATION AS A METRIC FOR SUSTAINABLE SYSTEM REGIMES

    EPA Science Inventory

    The important question in sustainability is not whether the world is sustainable, but whether a humanly acceptable regime of the world is sustainable. We propose Fisher Information as a metric for the sustainability of dynamic regimes in complex systems. The quantity now known ...

  12. FISHER INFORMATION AND DYNAMIC REGIME CHANGES IN ECOLOGICAL SYSTEMS

    EPA Science Inventory

    Fisher Information and Dynamic Regime Changes in Ecological Systems
    Abstract for the 3rd Conference of the International Society for Ecological Informatics
    Audrey L. Mayer, Christopher W. Pawlowski, and Heriberto Cabezas

    The sustainable nature of particular dynamic...

  13. 77 FR 59087 - Fisher House and Other Temporary Lodging

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-26

    ... examination. (2) Extended outpatient treatment, such as treatment associated with organ transplant... overnight stay. (e) Special authority for organ transplant cases. Notwithstanding any other provision of... Federal Register (77 FR 15650) a proposal to amend its regulations concerning Fisher Houses and...

  14. 77 FR 15650 - Fisher House and Other Temporary Lodging

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-16

    ... treatment associated with organ transplant, chemotherapy, or radiation.'' The use of Fisher House or other... organ transplant, chemotherapy, or radiation. (3) Hospitalization for a critical injury or illness... overnight stay. (e) Special authority for organ transplant cases. Notwithstanding any other provision...

  15. Reflections on a Collaboration: Communicating Educational Research in "Fisher"

    ERIC Educational Resources Information Center

    Garces, Liliana M.

    2013-01-01

    It was critical that the U.S. Supreme Court have the best empirical evidence available to help inform its decisions in "Fisher." The "amicus" brief filed by 444 researchers from 172 institutions in 42 states was the result of a collaborative effort among members of the social science, educational, and legal communities. In her role as counsel of…

  16. Detection and Assessment of Ecosystem Regime Shifts from Fisher Information

    EPA Science Inventory

    Ecosystem regime shifts, which are long-term system reorganizations, have profound implications for sustainability. There is a great need for indicators of regime shifts, particularly methods that are applicable to data from real systems. We have developed a form of Fisher info...

  17. Enhancement of Fusarium head blight detection in free-falling wheat kernels using a bichromatic pulsed LED design

    NASA Astrophysics Data System (ADS)

    Yang, I.-Chang; Delwiche, Stephen R.; Chen, Suming; Lo, Y. Martin

    2009-02-01

    Fusarium head blight is a worldwide fungal disease of small cereal grains such as wheat that affects the yield, quality, and safety of food and feed products. The current study was implemented to develop more efficient methods for optically detecting Fusarium-damaged (scabby) kernels from normal (sound) wheat kernels. Through development of a high-power pulsed LED (green and red) inspection system, it was found that Fusarium-damaged and normal wheat kernels have different reflected energy responses. Two parameters (slope and r2) from a regression analysis of the green and red responses were used as input parameters in linear discriminant analysis models. The examined factors affecting accuracy were the orientation of the optical probe, the color contrast between normal and Fusarium-damaged kernels, and the manner in which one LED's response is time-matched to the other LED. Whereas commercial high-speed optical sorters are, on average, 50% efficient at removing mold-damaged kernels, this efficiency can rise to 95% or better under more carefully controlled, kernel-at-rest conditions in the laboratory. The current research on free-falling kernels has demonstrated accuracies (>90% for wheat samples of high visual contrast) that approach those of controlled conditions, which will lead to improvements in high-speed optical sorters.

  18. Improving the Bandwidth Selection in Kernel Equating

    ERIC Educational Resources Information Center

    Andersson, Björn; von Davier, Alina A.

    2014-01-01

    We investigate the current bandwidth selection methods in kernel equating and propose a method based on Silverman's rule of thumb for selecting the bandwidth parameters. In kernel equating, the bandwidth parameters have previously been obtained by minimizing a penalty function. This minimization process has been criticized by practitioners…
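
    Silverman's rule of thumb referred to above has a simple closed form for a univariate sample, sketched below; this is the generic density-estimation rule, not the kernel-equating-specific bandwidth proposal of the paper, and the synthetic scores are placeholders.

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule-of-thumb bandwidth for a univariate kernel density estimate:
    h = 0.9 * min(sd, IQR / 1.34) * n^(-1/5)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    sd = x.std(ddof=1)
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    return 0.9 * min(sd, iqr / 1.34) * n ** (-0.2)

scores = np.random.default_rng(2).normal(50, 10, size=500)   # synthetic test scores
print(silverman_bandwidth(scores))
```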

  19. Assessing Fishers' Support of Striped Bass Management Strategies.

    PubMed

    Murphy, Robert D; Scyphers, Steven B; Grabowski, Jonathan H

    2015-01-01

    Incorporating the perspectives and insights of stakeholders is an essential component of ecosystem-based fisheries management, such that policy strategies should account for the diverse interests of various groups of anglers to enhance their efficacy. Here we assessed fishing stakeholders' perceptions on the management of Atlantic striped bass (Morone saxatilis) and receptiveness to potential future regulations using an online survey of recreational and commercial fishers in Massachusetts and Connecticut (USA). Our results indicate that most fishers harbored adequate to positive perceptions of current striped bass management policies when asked to grade their state's management regime. Yet, subtle differences in perceptions existed between recreational and commercial fishers, as well as across individuals with differing levels of fishing experience, resource dependency, and tournament participation. Recreational fishers in both states were generally supportive or neutral towards potential management actions including slot limits (71%) and mandated circle hooks to reduce mortality of released fish (74%), but less supportive of reduced recreational bag limits (51%). Although commercial anglers were typically less supportive of management changes than their recreational counterparts, the majority were still supportive of slot limits (54%) and mandated use of circle hooks (56%). Our study suggests that both recreational and commercial fishers are generally supportive of additional management strategies aimed at sustaining healthy striped bass populations and agree on a variety of strategies. However, both stakeholder groups were less supportive of harvest reductions, which is the most direct measure of reducing mortality available to fisheries managers. By revealing factors that influence stakeholders' support or willingness to comply with management strategies, studies such as ours can help managers identify potential stakeholder support for or conflicts that may

  1. Assessing Fishers' Support of Striped Bass Management Strategies

    PubMed Central

    Murphy, Robert D.; Scyphers, Steven B.; Grabowski, Jonathan H.

    2015-01-01

    Incorporating the perspectives and insights of stakeholders is an essential component of ecosystem-based fisheries management, such that policy strategies should account for the diverse interests of various groups of anglers to enhance their efficacy. Here we assessed fishing stakeholders’ perceptions on the management of Atlantic striped bass (Morone saxatilis) and receptiveness to potential future regulations using an online survey of recreational and commercial fishers in Massachusetts and Connecticut (USA). Our results indicate that most fishers harbored adequate to positive perceptions of current striped bass management policies when asked to grade their state’s management regime. Yet, subtle differences in perceptions existed between recreational and commercial fishers, as well as across individuals with differing levels of fishing experience, resource dependency, and tournament participation. Recreational fishers in both states were generally supportive or neutral towards potential management actions including slot limits (71%) and mandated circle hooks to reduce mortality of released fish (74%), but less supportive of reduced recreational bag limits (51%). Although commercial anglers were typically less supportive of management changes than their recreational counterparts, the majority were still supportive of slot limits (54%) and mandated use of circle hooks (56%). Our study suggests that both recreational and commercial fishers are generally supportive of additional management strategies aimed at sustaining healthy striped bass populations and agree on a variety of strategies. However, both stakeholder groups were less supportive of harvest reductions, which is the most direct measure of reducing mortality available to fisheries managers. By revealing factors that influence stakeholders’ support or willingness to comply with management strategies, studies such as ours can help managers identify potential stakeholder support for or conflicts that

  2. The context-tree kernel for strings.

    PubMed

    Cuturi, Marco; Vert, Jean-Philippe

    2005-10-01

    We propose a new kernel for strings which borrows ideas and techniques from information theory and data compression. This kernel can be used in combination with any kernel method, in particular Support Vector Machines for string classification, with notable applications in proteomics. By using a Bayesian averaging framework with conjugate priors on a class of Markovian models known as probabilistic suffix trees or context-trees, we compute the value of this kernel in linear time and space while only using the information contained in the spectrum of the considered strings. This is ensured through an adaptation of a compression method known as the context-tree weighting algorithm. Encouraging classification results are reported on a standard protein homology detection experiment, showing that the context-tree kernel performs well with respect to other state-of-the-art methods while using no biological prior knowledge.
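
    The context-tree kernel relies on context-tree weighting and Bayesian averaging, which are not reproduced here; as a much simpler point of comparison that also uses only substring statistics, the sketch below computes a basic k-spectrum string kernel (an inner product of k-mer counts). The toy strings are placeholders, not a real homology benchmark.

```python
from collections import Counter

def spectrum_kernel(s, t, k=3):
    """k-spectrum string kernel: inner product of k-mer count vectors."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[kmer] * ct[kmer] for kmer in cs.keys() & ct.keys())

# Toy protein-like strings (hypothetical sequences).
a = "MKVLAAGIVALLAAGCSSHH"
b = "MKVLTAGIVALLSAGCSSHH"
c = "GGGPLVVAAAKKKQQQWWNN"
print(spectrum_kernel(a, b), spectrum_kernel(a, c))   # similar pair scores higher
```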

  3. Discrimination of Maize Haploid Seeds from Hybrid Seeds Using Vis Spectroscopy and Support Vector Machine Method.

    PubMed

    Liu, Jin; Guo, Ting-ting; Li, Hao-chuan; Jia, Shi-qiang; Yan, Yan-lu; An, Dong; Zhang, Yao; Chen, Shao-jiang

    2015-11-01

    Doubled haploid (DH) lines are routinely applied in the hybrid maize breeding programs of many institutes and companies for their advantages of complete homozygosity and short breeding cycle length. A key issue in this approach is an efficient screening system to identify haploid kernels from the hybrid kernels crossed with the inducer. At present, haploid kernel selection is carried out manually using the "red-crown" kernel trait (the haploid kernel has a non-pigmented embryo and pigmented endosperm) controlled by the R1-nj gene. Manual selection is time-consuming and unreliable. Furthermore, the color of the kernel embryo is concealed by the pericarp. Here, we establish a novel approach for identifying maize haploid kernels based on visible (Vis) spectroscopy and support vector machine (SVM) pattern recognition technology. The diffuse transmittance spectra of individual kernels (141 haploid kernels and 141 hybrid kernels from 9 genotypes) were collected using a portable UV-Vis spectrometer and integrating sphere. The raw spectral data were preprocessed using smoothing and vector normalization methods. The desired feature wavelengths were selected based on the results of the Kolmogorov-Smirnov test. The wavelengths with p values above 0.05 were eliminated because the distributions of absorbance data at these wavelengths show no significant difference between haploid and hybrid kernels. Principal component analysis was then performed to reduce the number of variables. The SVM model was evaluated by 9-fold cross-validation. In each round, samples of one genotype were used as the testing set, while those of the other genotypes were used as the training set. The mean rate of correct discrimination was 92.06%. This result demonstrates the feasibility of using Vis spectroscopy to identify haploid maize kernels. The method would help develop a rapid and accurate automated screening system for haploid kernels. PMID:26978947
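
    The "leave one genotype out" validation described above corresponds to grouped cross-validation; a sketch with scikit-learn follows, with synthetic spectra, labels, and genotype assignments standing in for the real measurements, and a generic scaling/PCA/SVM pipeline rather than the paper's exact preprocessing.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins: 282 "spectra" with 200 wavelengths, 9 genotypes (placeholders).
X = rng.normal(size=(282, 200))
y = rng.integers(0, 2, size=282)          # 0 = hybrid, 1 = haploid (toy labels)
genotype = rng.integers(0, 9, size=282)   # group label for each kernel

# Preprocess, reduce dimension, classify; each fold holds out one whole genotype.
model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, groups=genotype, cv=LeaveOneGroupOut())
print("per-genotype accuracies:", np.round(scores, 3))
```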

  4. Bayesian Kernel Mixtures for Counts

    PubMed Central

    Canale, Antonio; Dunson, David B.

    2011-01-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online. PMID:22523437

  6. MULTIVARIATE KERNEL PARTITION PROCESS MIXTURES

    PubMed Central

    Dunson, David B.

    2013-01-01

    Mixtures provide a useful approach for relaxing parametric assumptions. Discrete mixture models induce clusters, typically with the same cluster allocation for each parameter in multivariate cases. As a more flexible approach that facilitates sparse nonparametric modeling of multivariate random effects distributions, this article proposes a kernel partition process (KPP) in which the cluster allocation varies for different parameters. The KPP is shown to be the driving measure for a multivariate ordered Chinese restaurant process that induces a highly-flexible dependence structure in local clustering. This structure allows the relative locations of the random effects to inform the clustering process, with spatially-proximal random effects likely to be assigned the same cluster index. An exact block Gibbs sampler is developed for posterior computation, avoiding truncation of the infinite measure. The methods are applied to hormone curve data, and a dependent KPP is proposed for classification from functional predictors. PMID:24478563

  7. Fisheries productivity and its effects on the consumption of animal protein and food sharing of fishers' and non-fishers' families.

    PubMed

    da Costa, Mikaelle Kaline Bezerra; de Melo, Clarissy Dinyz; Lopes, Priscila Fabiana Macedo

    2014-01-01

    This study compared the consumption of animal protein and food sharing among fishers' and non-fishers' families of the northeastern Brazilian coast. The diet of these families was registered through the 24-hour-recall method during 10 consecutive days in January (good fishing season) and June (bad fishing season) 2012. Fish consumption was not different between the fishers' and non-fishers' families, but varied according to fisheries productivity to both groups. Likewise, food sharing was not different between the two groups, but food was shared more often when fisheries were productive. Local availability of fish, more than a direct dependency on fisheries, determines local patterns of animal protein consumption, but a direct dependency on fisheries exposes families to a lower-quality diet in less-productive seasons. As such, fisheries could shape and affect the livelihoods of coastal villages, including fishers' and non-fishers' families.

  8. Putting Priors in Mixture Density Mercer Kernels

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2004-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn the kernel function directly from data, rather than using predefined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing physical information to be encoded in the model. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS). The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. Template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code. The results show that the Mixture Density Mercer Kernel described here outperforms tree-based classification in distinguishing high-redshift galaxies from low-redshift galaxies by approximately 16% on test data, bagged trees by approximately 7%, and bagged trees built on a much larger sample of data by approximately 2%.
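
    A simplified sketch of the core construction, under the assumption that a kernel is defined from posterior component memberships of a fitted density mixture, k(x, y) = sum_c p(c|x) p(c|y). The paper ensembles many mixture models within a Bayesian framework and uses AUTOBAYES-generated code, none of which is reproduced here; a single scikit-learn GaussianMixture and toy blob data stand in.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

# Toy data standing in for survey photometry (placeholder, not SDSS).
X, y = make_blobs(n_samples=300, centers=3, cluster_std=2.0, random_state=0)

# Fit a density mixture to the inputs (unsupervised).
gmm = GaussianMixture(n_components=8, random_state=0).fit(X)

def mixture_density_kernel(A, B, gmm):
    """k(a, b) = sum_c p(c|a) p(c|b): inner product of posterior responsibilities,
    a symmetric positive semi-definite kernel by construction."""
    return gmm.predict_proba(A) @ gmm.predict_proba(B).T

K = mixture_density_kernel(X, X, gmm)
clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```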

  9. Ideal regularization for learning kernels from labels.

    PubMed

    Pan, Binbin; Lai, Jianhuang; Shen, Lixin

    2014-08-01

    In this paper, we propose a new form of regularization that is able to utilize the label information of a data set for learning kernels. The proposed regularization, referred to as ideal regularization, is a linear function of the kernel matrix to be learned. The ideal regularization allows us to develop efficient algorithms to exploit labels. Three applications of the ideal regularization are considered. Firstly, we use the ideal regularization to incorporate the labels into a standard kernel, making the resulting kernel more appropriate for learning tasks. Next, we employ the ideal regularization to learn a data-dependent kernel matrix from an initial kernel matrix (which contains prior similarity information, geometric structures, and labels of the data). Finally, we incorporate the ideal regularization to some state-of-the-art kernel learning problems. With this regularization, these learning problems can be formulated as simpler ones which permit more efficient solvers. Empirical results show that the ideal regularization exploits the labels effectively and efficiently.
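
    One simple way to see how label information can enter through a term that is linear in the kernel matrix, in the spirit of the abstract but not the authors' exact formulation: add a scaled "ideal kernel" built from the training labels to a standard kernel. This only affects the training kernel (test labels are unknown), and the data, base kernel, and weight are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.25, random_state=0)

# Ideal kernel from labels: 1 where a pair shares a label, 0 otherwise.
Y = np.equal.outer(y, y).astype(float)

K = rbf_kernel(X, gamma=1.0)
K_reg = K + 0.5 * Y          # label information added via a term linear in the kernel

clf = SVC(kernel="precomputed").fit(K_reg, y)
print("training accuracy:", clf.score(K_reg, y))
```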

  10. GA-fisher: A new LDA-based face recognition algorithm with selection of principal components.

    PubMed

    Zheng, Wei-Shi; Lai, Jian-Huang; Yuen, Pong C

    2005-10-01

    This paper addresses the dimension reduction problem in Fisherface for face recognition. When the number of training samples is less than the image dimension (total number of pixels), the within-class scatter matrix (Sw) in Linear Discriminant Analysis (LDA) is singular, and Principal Component Analysis (PCA) is employed in Fisherface to reduce the dimension of Sw so that it becomes nonsingular. The popular method is to select the largest nonzero eigenvalues and the corresponding eigenvectors for LDA. To attenuate the illumination effect, some researchers suggested removing the three eigenvectors with the largest eigenvalues, and the performance is improved. However, as far as we know, there is no systematic way to determine which eigenvalues should be used. Along this line, this paper proposes a theorem to interpret why PCA can be used in LDA and an automatic and systematic method to select the eigenvectors to be used in LDA using a Genetic Algorithm (GA). A GA-PCA is then developed. It is found that some small eigenvectors should also be used as part of the basis for dimension reduction. Using GA-PCA to reduce the dimension, a GA-Fisher method is designed and developed. Compared with the traditional Fisherface method, the proposed GA-Fisher offers two additional advantages. First, optimal bases for dimensionality reduction are derived from GA-PCA. Second, the computational efficiency of LDA is improved by adding a whitening procedure after dimension reduction. The Face Recognition Technology (FERET) and Carnegie Mellon University Pose, Illumination, and Expression (CMU PIE) databases are used for evaluation. Experimental results show that almost 5% improvement compared with Fisherface can be obtained, and the results are encouraging.
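
    The PCA-then-LDA pipeline that GA-Fisher refines can be sketched with scikit-learn as below; ordinary largest-eigenvalue PCA with whitening is used here instead of GA-selected components, and the digits dataset stands in for face images.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Digits stand in for face images; PCA first makes the within-class scatter nonsingular.
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(PCA(n_components=40, whiten=True),   # whitening after reduction
                      LinearDiscriminantAnalysis())
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```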

  11. Kernel score statistic for dependent data.

    PubMed

    Malzahn, Dörthe; Friedrichs, Stefanie; Rosenberger, Albert; Bickeböller, Heike

    2014-01-01

    The kernel score statistic is a global covariance component test over a set of genetic markers. It provides a flexible modeling framework and does not collapse marker information. We generalize the kernel score statistic to allow for familial dependencies and to adjust for random confounder effects. With this extension, we adjust our analysis of real and simulated baseline systolic blood pressure for polygenic familial background. We find that the kernel score test gains appreciably in power through the use of sequencing compared to tag-single-nucleotide polymorphisms for very rare single nucleotide polymorphisms with <1% minor allele frequency.
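
    A hedged sketch of a basic variance-component (kernel) score statistic of the form Q = r' K r, with r the null-model residuals and K a linear kernel on the markers, assessed here by permutation. The paper's generalization to familial dependence and random confounders is not reproduced, and the genotypes and phenotype are simulated placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated genotypes (n subjects x m rare variants) and a phenotype (placeholders).
n, m = 300, 25
G = rng.binomial(2, 0.05, size=(n, m)).astype(float)
y = 0.6 * G[:, :3].sum(axis=1) + rng.normal(size=n)   # a few causal variants

def kernel_score_statistic(y, G):
    """Q = r' K r with r the phenotype residuals and K = G G' a linear marker kernel
    (a basic variance-component score statistic; no familial correlation here)."""
    r = y - y.mean()              # residuals from an intercept-only null model
    K = G @ G.T
    return r @ K @ r

Q_obs = kernel_score_statistic(y, G)

# Permutation null: shuffling phenotypes breaks any marker-phenotype association.
Q_null = np.array([kernel_score_statistic(rng.permutation(y), G) for _ in range(999)])
p = (1 + np.sum(Q_null >= Q_obs)) / (1 + len(Q_null))
print(f"Q = {Q_obs:.1f}, permutation p = {p:.3f}")
```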

  12. Traditional botanical knowledge of artisanal fishers in southern Brazil

    PubMed Central

    2013-01-01

    Background: This study characterized the botanical knowledge of artisanal fishers of the Lami community, Porto Alegre, southern Brazil based on answers to the following question: Is the local botanical knowledge of the artisanal fishers of the rural-urban district of Lami still active, even since the district’s insertion into the metropolitan region of Porto Alegre? Methods: This region, which contains a mosaic of urban and rural areas, hosts the Lami Biological Reserve (LBR) and a community of 13 artisanal fisher families. Semi-structured interviews were conducted with 15 fishers, complemented by participatory observation techniques and free-lists; in these interviews, the species of plants used by the community and their indicated uses were identified. Results: A total of 111 species belonging to 50 families were identified. No significant differences between the diversities of native and exotic species were found. Seven use categories were reported: medicinal (49%), human food (23.2%), fishing (12.3%), condiments (8%), firewood (5%), mystical purposes (1.45%), and animal food (0.72%). The medicinal species with the highest level of agreement regarding their main uses (AMUs) were Aloe arborescens Mill., Plectranthus barbatus Andrews, Dodonaea viscosa Jacq., Plectranthus ornatus Codd, Eugenia uniflora L., and Foeniculum vulgare Mill. For illness and diseases, most plants were used for problems with the digestive system (20 species), followed by the respiratory system (16 species). This community possesses a wide botanical knowledge, especially of medicinal plants, comparable to observations made in other studies with fishing communities in coastal areas of the Atlantic Forest of Brazil. Conclusions: Ethnobotanical studies in rural-urban areas contribute to preserving local knowledge and provide information that aids in conserving the remaining ecosystems in the region. PMID:23898973

  13. Fishing, fish consumption and advisory awareness among Louisiana's recreational fishers.

    PubMed

    Katner, Adrienne; Ogunyinka, Ebenezer; Sun, Mei-Hung; Soileau, Shannon; Lavergne, David; Dugas, Dianne; Suffet, Mel

    2011-11-01

    This paper presents results from the first known population-based survey of recreational fishers in Louisiana (n=1774). The ultimate goal of this study was to obtain data in support of the development of regional advisories for a high exposure population with unique seafood consumption patterns. Between July and August of 2008, a survey was mailed to a random sample of licensed recreational fishers to characterize local fishing habits, sportfish consumption, and advisory awareness. Eighty-eight percent of respondents reported eating sportfish. Respondents ate an estimated mean of four fish meals per month, of which, approximately half were sportfish. Over half of all sportfish meals (54%) were caught in the Gulf of Mexico or bordering brackish areas. Sportfish consumption varied by license and gender; and was highest among Sportsman's Paradise license holders (2.8±0.2 meals per month), and males (2.2±0.1 meals per month). The most frequently consumed sportfish species were red drum, speckled trout, catfish, bass, crappie and bream. Advisory awareness rates varied by gender, ethnicity, geographic area, license type, age and education; and were lowest among women (53%), African-Americans (43%), fishers from the southeast of Louisiana (50%), holders of Senior Hunting and Fishing licenses (51%), individuals between 15 and 19 years of age (41%), and individuals with less than a high school education (43%). Results were used to identify ways to optimize monitoring, advisory development and outreach activities. PMID:21851935

  14. Fisher Pierce products for improving distribution system reliability

    SciTech Connect

    1994-12-31

    The challenges facing the electric power utility today in the 1990s have changed significantly from those of even 10 years ago. The proliferation of automation and the personal computer has heightened the requirements and demands put on the electric distribution system. Today's customers, fighting to compete in a world market, demand quality, uninterrupted power service. Privatization and the concept of unregulated competition require utilities to streamline to minimize system support costs and optimize power delivery efficiency. Fisher Pierce, serving the electric utility industry for over 50 years, offers a line of products to assist utilities in meeting these challenges. The Fisher Pierce family of products provides tools for the electric utility to exceed customer service demands. A full line of fault-indicating devices is offered to expedite system power restoration both locally and in conjunction with SCADA systems. Fisher Pierce is the largest supplier of roadway lighting controls, manufacturing on a 6-million-dollar automated line assuring the highest quality in the world. The distribution system capacitor control line offers intelligent local or radio-linked switching control to maintain system voltage and VAR levels for quality and cost-efficient power delivery under varying customer loads. Additional products, designed to authenticate revenue metering calibration and verify on-site metering service wiring, help optimize the profitability of the utility, assuring continuous system service improvements for their customers.

  15. Sugar Profile of Kernels as a Marker of Origin and Ripening Time of Peach (Prunus persicae L.).

    PubMed

    Stanojević, Marija; Trifković, Jelena; Akšić, Milica Fotirić; Rakonjac, Vera; Nikolić, Dragan; Šegan, Sandra; Milojković-Opsenica, Dušanka

    2015-12-01

    Large amounts of fruit seeds, especially peach, are discarded annually by juice- or conserve-producing industries, which is a potential waste of a valuable resource and a serious disposal problem. Since peach seeds can be obtained as a byproduct from processing companies, their exploitation should be greater and, consequently, more information on cultivars' kernels and their composition is required. A total of 25 samples of kernels from various peach germplasm (including commercial cultivars, perspective hybrids and vineyard peach accessions) differing in origin and ripening time were characterized by evaluation of their sugar composition. Twenty characteristic carbohydrates and sugar alcohols were determined and quantified using high-performance anion-exchange chromatography with pulsed amperometric detection (HPAEC/PAD). Sucrose, glucose and fructose are the most important sugars in peach kernels, similar to other representatives of the Rosaceae family. Also, the high amounts of sugars in seeds of promising hybrids imply that peach kernels with high sugar content can be obtained through conventional breeding programs. In addition, by means of several pattern recognition methods, the variables that discriminate peach kernels arising from diverse germplasm and different stages of maturity were identified, and successful models for further prediction were developed. Sugars such as ribose, trehalose, arabinose, galactitol, fructose, maltose, sorbitol, sucrose and iso-maltotriose were marked as the most important for such discrimination.

  16. A Further Evaluation of Picture Prompts during Auditory-Visual Conditional Discrimination Training

    ERIC Educational Resources Information Center

    Carp, Charlotte L.; Peterson, Sean P.; Arkel, Amber J.; Petursdottir, Anna I.; Ingvarsson, Einar T.

    2012-01-01

    This study was a systematic replication and extension of Fisher, Kodak, and Moore (2007), in which a picture prompt embedded into a least-to-most prompting sequence facilitated acquisition of auditory-visual conditional discriminations. Participants were 4 children who had been diagnosed with autism; 2 had limited prior receptive skills, and 2 had…

  17. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... weight of delivery 10,000 10,000 2. Percent of edible kernel weight 53.0 84.0 3. Less weight loss in... 7 Agriculture 8 2014-01-01 2014-01-01 false Adjusted kernel weight. 981.401 Section 981.401... Administrative Rules and Regulations § 981.401 Adjusted kernel weight. (a) Definition. Adjusted kernel...

  18. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... weight of delivery 10,000 10,000 2. Percent of edible kernel weight 53.0 84.0 3. Less weight loss in... 7 Agriculture 8 2012-01-01 2012-01-01 false Adjusted kernel weight. 981.401 Section 981.401... Administrative Rules and Regulations § 981.401 Adjusted kernel weight. (a) Definition. Adjusted kernel...

  19. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... weight of delivery 10,000 10,000 2. Percent of edible kernel weight 53.0 84.0 3. Less weight loss in... 7 Agriculture 8 2013-01-01 2013-01-01 false Adjusted kernel weight. 981.401 Section 981.401... Administrative Rules and Regulations § 981.401 Adjusted kernel weight. (a) Definition. Adjusted kernel...

  20. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... weight of delivery 10,000 10,000 2. Percent of edible kernel weight 53.0 84.0 3. Less weight loss in... 7 Agriculture 8 2011-01-01 2011-01-01 false Adjusted kernel weight. 981.401 Section 981.401... Administrative Rules and Regulations § 981.401 Adjusted kernel weight. (a) Definition. Adjusted kernel...

  1. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... weight of delivery 10,000 10,000 2. Percent of edible kernel weight 53.0 84.0 3. Less weight loss in... 7 Agriculture 8 2010-01-01 2010-01-01 false Adjusted kernel weight. 981.401 Section 981.401... Administrative Rules and Regulations § 981.401 Adjusted kernel weight. (a) Definition. Adjusted kernel...

  2. 7 CFR 51.2125 - Split or broken kernels.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Split or broken kernels. 51.2125 Section 51.2125... STANDARDS) United States Standards for Grades of Shelled Almonds Definitions § 51.2125 Split or broken kernels. Split or broken kernels means seven-eighths or less of complete whole kernels but which will...

  3. 7 CFR 51.2125 - Split or broken kernels.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Split or broken kernels. 51.2125 Section 51.2125... STANDARDS) United States Standards for Grades of Shelled Almonds Definitions § 51.2125 Split or broken kernels. Split or broken kernels means seven-eighths or less of complete whole kernels but which will...

  4. 7 CFR 51.2125 - Split or broken kernels.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Split or broken kernels. 51.2125 Section 51.2125... STANDARDS) United States Standards for Grades of Shelled Almonds Definitions § 51.2125 Split or broken kernels. Split or broken kernels means seven-eighths or less of complete whole kernels but which will...

  5. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Kernel color classification. 51.1403 Section 51.1403... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Kernel Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the...

  6. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Kernel color classification. 51.1403 Section 51.1403... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Kernel Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the...

  7. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Kernel color classification. 51.1403 Section 51.1403... Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the color classifications provided in this section. When the color of kernels in a...

  8. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Kernel color classification. 51.1403 Section 51.1403... Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the color classifications provided in this section. When the color of kernels in a...

  9. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Kernel color classification. 51.1403 Section 51.1403... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Kernel Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the...

  10. KITTEN Lightweight Kernel 0.1 Beta

    2007-12-12

    The Kitten Lightweight Kernel is a simplified OS (operating system) kernel that is intended to manage a compute node's hardware resources. It provides a set of mechanisms to user-level applications for utilizing hardware resources (e.g., allocating memory, creating processes, accessing the network). Kitten is much simpler than general-purpose OS kernels, such as Linux or Windows, but includes all of the essential functionality needed to support HPC (high-performance computing) MPI, PGAS and OpenMP applications. Kitten provides unique capabilities such as physically contiguous application memory, transparent large page support, and noise-free tick-less operation, which enable HPC applications to obtain greater efficiency and scalability than with general-purpose OS kernels.

  11. Quantum kernel applications in medicinal chemistry.

    PubMed

    Huang, Lulu; Massa, Lou

    2012-07-01

    Progress in the quantum mechanics of biological molecules is being driven by computational advances. The notion of quantum kernels can be introduced to simplify the formalism of quantum mechanics, making it especially suitable for parallel computation of very large biological molecules. The essential idea is to mathematically break large biological molecules into smaller kernels that are calculationally tractable, and then to represent the full molecule by a summation over the kernels. The accuracy of the kernel energy method (KEM) is shown by systematic application to a great variety of molecular types found in biology. These include peptides, proteins, DNA and RNA. Examples are given that explore the KEM across a variety of chemical models, and to the outer limits of energy accuracy and molecular size. KEM represents an advance in quantum biology applicable to problems in medicine and drug design. PMID:22857535

  12. Variational Dirichlet Blur Kernel Estimation.

    PubMed

    Zhou, Xu; Mateos, Javier; Zhou, Fugen; Molina, Rafael; Katsaggelos, Aggelos K

    2015-12-01

    Blind image deconvolution involves two key objectives: 1) latent image and 2) blur estimation. For latent image estimation, we propose a fast deconvolution algorithm, which uses an image prior of nondimensional Gaussianity measure to enforce sparsity and an undetermined boundary condition methodology to reduce boundary artifacts. For blur estimation, a linear inverse problem with normalization and nonnegative constraints must be solved. However, the normalization constraint is ignored in many blind image deblurring methods, mainly because it makes the problem less tractable. In this paper, we show that the normalization constraint can be very naturally incorporated into the estimation process by using a Dirichlet distribution to approximate the posterior distribution of the blur. Making use of variational Dirichlet approximation, we provide a blur posterior approximation that considers the uncertainty of the estimate and removes noise in the estimated kernel. Experiments with synthetic and real data demonstrate that the proposed method is very competitive to the state-of-the-art blind image restoration methods. PMID:26390458
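
    The variational Dirichlet machinery itself is beyond a short snippet, but the normalization and nonnegativity constraints on the blur kernel discussed above can be illustrated with a plain Euclidean projection onto the probability simplex. The NumPy sketch below is a generic routine under that reading, not the authors' estimator.

        import numpy as np

        def project_to_simplex(v):
            """Euclidean projection of a vector onto the probability simplex
            (nonnegative entries that sum to one)."""
            u = np.sort(v)[::-1]
            css = np.cumsum(u)
            j = np.arange(1, len(v) + 1)
            rho = np.nonzero(u + (1.0 - css) / j > 0)[0][-1]
            theta = (1.0 - css[rho]) / (rho + 1)
            return np.maximum(v + theta, 0.0)

        # Force a raw blur-kernel estimate to be a valid (normalized, nonnegative) kernel.
        raw = np.random.default_rng(0).normal(size=(5, 5))
        kernel = project_to_simplex(raw.ravel()).reshape(raw.shape)
        assert abs(kernel.sum() - 1.0) < 1e-9 and (kernel >= 0).all()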

  13. Weighted Bergman Kernels and Quantization

    NASA Astrophysics Data System (ADS)

    Engliš, Miroslav

    Let Ω be a bounded pseudoconvex domain in C^N, φ, ψ two positive functions on Ω such that -log ψ, -log φ are plurisubharmonic, and z ∈ Ω a point at which -log φ is smooth and strictly plurisubharmonic. We show that as k → ∞, the Bergman kernels with respect to the weights φ^k ψ have an asymptotic expansion for x, y near z, where φ(x,y) denotes an almost-analytic extension of φ (so that φ(x,x) = φ(x)), and similarly for ψ. If in addition Ω is of finite type, φ, ψ behave reasonably at the boundary, and -log φ, -log ψ are strictly plurisubharmonic on Ω, we obtain also an analogous asymptotic expansion for the Berezin transform and give applications to the Berezin quantization. Finally, for Ω smoothly bounded and strictly pseudoconvex and φ a smooth strictly plurisubharmonic defining function for Ω, we also obtain results on the Berezin-Toeplitz quantization.

  14. TICK: Transparent Incremental Checkpointing at Kernel Level

    SciTech Connect

    Petrini, Fabrizio; Gioiosa, Roberto

    2004-10-25

    TICK is a software package implemented in Linux 2.6 that allows user processes to be saved and restored without any change to the user code or binary. With TICK, a process can be suspended by the Linux kernel upon receiving an interrupt and saved to a file. This file can later be thawed on another computer running Linux (potentially the same computer). TICK is implemented as a kernel module for Linux version 2.6.5.

  15. A kernel autoassociator approach to pattern classification.

    PubMed

    Zhang, Haihong; Huang, Weimin; Huang, Zhiyong; Zhang, Bailing

    2005-06-01

    Autoassociators are a special type of neural network which, by learning to reproduce a given set of patterns, grasp the underlying concept that is useful for pattern classification. In this paper, we present a novel nonlinear model referred to as kernel autoassociators based on kernel methods. While conventional non-linear autoassociation models emphasize searching for the non-linear representations of input patterns, a kernel autoassociator takes a kernel feature space as the nonlinear manifold, and places emphasis on the reconstruction of input patterns from the kernel feature space. Two methods are proposed to address the reconstruction problem, using linear and multivariate polynomial functions, respectively. We apply the proposed model to novelty detection with or without novelty examples and study it on the promoter detection and sonar target recognition problems. We also apply the model to multiclass classification problems including wine recognition, glass recognition, handwritten digit recognition, and face recognition. The experimental results show that, compared with conventional autoassociators and other recognition systems, kernel autoassociators can provide better or comparable performance for concept learning and recognition in various domains. PMID:15971928

  17. PET Image Reconstruction Using Kernel Method

    PubMed Central

    Wang, Guobao; Qi, Jinyi

    2014-01-01

    Image reconstruction from low-count PET projection data is challenging because the inverse problem is ill-posed. Prior information can be used to improve image quality. Inspired by the kernel methods in machine learning, this paper proposes a kernel based method that models PET image intensity in each pixel as a function of a set of features obtained from prior information. The kernel-based image model is incorporated into the forward model of PET projection data and the coefficients can be readily estimated by the maximum likelihood (ML) or penalized likelihood image reconstruction. A kernelized expectation-maximization (EM) algorithm is presented to obtain the ML estimate. Computer simulations show that the proposed approach can achieve better bias versus variance trade-off and higher contrast recovery for dynamic PET image reconstruction than the conventional maximum likelihood method with and without post-reconstruction denoising. Compared with other regularization-based methods, the kernel method is easier to implement and provides better image quality for low-count data. Application of the proposed kernel method to a 4D dynamic PET patient dataset showed promising results. PMID:25095249
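
    A schematic of the kernelized EM update described above, in the form it is commonly written: the image is x = Kα with the kernel matrix K built from prior features, and the data satisfy y ≈ Poisson(AKα). The system matrix, kernel matrix and iteration count below are stand-in assumptions, not a validated reconstruction code.

        import numpy as np

        rng = np.random.default_rng(0)
        n_pix, n_bins = 64, 96
        A = rng.random((n_bins, n_pix))              # stand-in PET system (projection) matrix
        K = np.exp(-rng.random((n_pix, n_pix)))      # stand-in kernel matrix from prior images
        K /= K.sum(axis=1, keepdims=True)
        y = rng.poisson(A @ (K @ rng.random(n_pix))).astype(float)   # simulated sinogram

        alpha = np.ones(n_pix)
        sens = K.T @ (A.T @ np.ones(n_bins))         # sensitivity term K^T A^T 1
        for _ in range(50):                          # kernelized MLEM iterations
            ratio = y / np.maximum(A @ (K @ alpha), 1e-12)
            alpha *= (K.T @ (A.T @ ratio)) / np.maximum(sens, 1e-12)
        x_hat = K @ alpha                            # reconstructed image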

  18. Gabor-based kernel PCA with doubly nonlinear mapping for face recognition with a single face image.

    PubMed

    Xie, Xudong; Lam, Kin-Man

    2006-09-01

    In this paper, a novel Gabor-based kernel principal component analysis (PCA) with doubly nonlinear mapping is proposed for human face recognition. In our approach, the Gabor wavelets are used to extract facial features, then a doubly nonlinear mapping kernel PCA (DKPCA) is proposed to perform feature transformation and face recognition. The conventional kernel PCA nonlinearly maps an input image into a high-dimensional feature space in order to make the mapped features linearly separable. However, this method does not consider the structural characteristics of the face images, and it is difficult to determine which nonlinear mapping is more effective for face recognition. In this paper, a new method of nonlinear mapping, which is performed in the original feature space, is defined. The proposed nonlinear mapping not only considers the statistical property of the input features, but also adopts an eigenmask to emphasize those important facial feature points. Therefore, after this mapping, the transformed features have a higher discriminating power, and the relative importance of the features adapts to the spatial importance of the face images. This new nonlinear mapping is combined with the conventional kernel PCA to be called "doubly" nonlinear mapping kernel PCA. The proposed algorithm is evaluated based on the Yale database, the AR database, the ORL database and the YaleB database by using different face recognition methods such as PCA, Gabor wavelets plus PCA, and Gabor wavelets plus kernel PCA with fractional power polynomial models. Experiments show that consistent and promising results are obtained.
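
    For orientation, a conventional Gabor-magnitude plus kernel PCA pipeline of the kind used as a comparison method above can be sketched as follows. This is not the proposed doubly nonlinear mapping (no eigenmask, no statistical remapping), and the scikit-image/scikit-learn calls and parameter values are illustrative assumptions.

        import numpy as np
        from skimage.filters import gabor
        from sklearn.decomposition import KernelPCA

        def gabor_magnitude_features(image, frequencies=(0.1, 0.2, 0.3), n_orientations=4):
            """Stack Gabor magnitude responses over a small filter bank and flatten."""
            feats = []
            for f in frequencies:
                for k in range(n_orientations):
                    real, imag = gabor(image, frequency=f, theta=k * np.pi / n_orientations)
                    feats.append(np.hypot(real, imag).ravel())
            return np.concatenate(feats)

        # images: equally sized 2-D grayscale face arrays (assumed available)
        # X = np.stack([gabor_magnitude_features(im) for im in images])
        # Z = KernelPCA(n_components=50, kernel="rbf", gamma=1e-4).fit_transform(X)
        # Z is then fed to a nearest-neighbour or other face classifier.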

  19. Retrieval of Brain Tumors by Adaptive Spatial Pooling and Fisher Vector Representation

    PubMed Central

    Huang, Meiyan; Huang, Wei; Jiang, Jun; Zhou, Yujia; Yang, Ru; Zhao, Jie; Feng, Yanqiu; Feng, Qianjin; Chen, Wufan

    2016-01-01

    Content-based image retrieval (CBIR) techniques have currently gained increasing popularity in the medical field because they can use numerous and valuable archived images to support clinical decisions. In this paper, we concentrate on developing a CBIR system for retrieving brain tumors in T1-weighted contrast-enhanced MRI images. Specifically, when the user roughly outlines the tumor region of a query image, brain tumor images in the database of the same pathological type are expected to be returned. We propose a novel feature extraction framework to improve the retrieval performance. The proposed framework consists of three steps. First, we augment the tumor region and use the augmented tumor region as the region of interest to incorporate informative contextual information. Second, the augmented tumor region is split into subregions by an adaptive spatial division method based on intensity orders; within each subregion, we extract raw image patches as local features. Third, we apply the Fisher kernel framework to aggregate the local features of each subregion into a respective single vector representation and concatenate these per-subregion vector representations to obtain an image-level signature. After feature extraction, a closed-form metric learning algorithm is applied to measure the similarity between the query image and database images. Extensive experiments are conducted on a large dataset of 3604 images with three types of brain tumors, namely, meningiomas, gliomas, and pituitary tumors. The mean average precision can reach 94.68%. Experimental results demonstrate the power of the proposed algorithm against some related state-of-the-art methods on the same dataset. PMID:27273091
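
    A simplified Fisher-vector aggregation step (gradients with respect to the GMM means only, with the usual power and L2 normalization) is sketched below. The subregion splitting, metric learning and all parameter choices from the paper are omitted, so treat this only as an indication of how a set of local patches becomes a single vector.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def fisher_vector_means(patches, gmm):
            """Fisher vector of a set of local descriptors w.r.t. GMM means only
            (diagonal covariances assumed)."""
            gamma = gmm.predict_proba(patches)                   # (N, K) posteriors
            n = len(patches)
            diff = patches[:, None, :] - gmm.means_[None, :, :]  # (N, K, D)
            diff /= np.sqrt(gmm.covariances_)[None, :, :]
            fv = (gamma[:, :, None] * diff).sum(axis=0)
            fv /= n * np.sqrt(gmm.weights_)[:, None]
            fv = fv.ravel()
            fv = np.sign(fv) * np.sqrt(np.abs(fv))               # power normalization
            return fv / (np.linalg.norm(fv) + 1e-12)             # L2 normalization

        # gmm = GaussianMixture(n_components=32, covariance_type="diag").fit(training_patches)
        # signature = np.concatenate([fisher_vector_means(p, gmm) for p in subregion_patches])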

  20. Kernel Manifold Alignment for Domain Adaptation.

    PubMed

    Tuia, Devis; Camps-Valls, Gustau

    2016-01-01

    The wealth of sensory data coming from different modalities has opened numerous opportunities for data analysis. The data are of increasing volume, complexity and dimensionality, thus calling for new methodological innovations towards multimodal data processing. However, multimodal architectures must rely on models able to adapt to changes in the data distribution. Differences in the density functions can be due to changes in acquisition conditions (pose, illumination), sensor characteristics (number of channels, resolution) or different views (e.g. street level vs. aerial views of the same building). We call these different acquisition modes domains, and refer to the adaptation problem as domain adaptation. In this paper, instead of adapting the trained models themselves, we alternatively focus on finding mappings of the data sources into a common, semantically meaningful, representation domain. This field of manifold alignment extends traditional techniques in statistics such as canonical correlation analysis (CCA) to deal with nonlinear adaptation and possibly non-corresponding data pairs between the domains. We introduce a kernel method for manifold alignment (KEMA) that can match an arbitrary number of data sources without needing corresponding pairs, just a few labeled examples in all domains. KEMA has interesting properties: 1) it generalizes other manifold alignment methods, 2) it can align manifolds of very different complexities, performing a discriminative alignment preserving each manifold inner structure, 3) it can define a domain-specific metric to cope with multimodal specificities, 4) it can align data spaces of different dimensionality, 5) it is robust to strong nonlinear feature deformations, and 6) it is closed-form invertible, which allows transfer across domains and data synthesis. To the authors' knowledge this is the first method addressing all these important issues at once. We also present a reduced-rank version of KEMA for computational

  1. Kernel Manifold Alignment for Domain Adaptation

    PubMed Central

    Tuia, Devis; Camps-Valls, Gustau

    2016-01-01

    The wealth of sensory data coming from different modalities has opened numerous opportunities for data analysis. The data are of increasing volume, complexity and dimensionality, thus calling for new methodological innovations towards multimodal data processing. However, multimodal architectures must rely on models able to adapt to changes in the data distribution. Differences in the density functions can be due to changes in acquisition conditions (pose, illumination), sensor characteristics (number of channels, resolution) or different views (e.g. street level vs. aerial views of the same building). We call these different acquisition modes domains, and refer to the adaptation problem as domain adaptation. In this paper, instead of adapting the trained models themselves, we alternatively focus on finding mappings of the data sources into a common, semantically meaningful, representation domain. This field of manifold alignment extends traditional techniques in statistics such as canonical correlation analysis (CCA) to deal with nonlinear adaptation and possibly non-corresponding data pairs between the domains. We introduce a kernel method for manifold alignment (KEMA) that can match an arbitrary number of data sources without needing corresponding pairs, just a few labeled examples in all domains. KEMA has interesting properties: 1) it generalizes other manifold alignment methods, 2) it can align manifolds of very different complexities, performing a discriminative alignment preserving each manifold inner structure, 3) it can define a domain-specific metric to cope with multimodal specificities, 4) it can align data spaces of different dimensionality, 5) it is robust to strong nonlinear feature deformations, and 6) it is closed-form invertible, which allows transfer across domains and data synthesis. To the authors' knowledge this is the first method addressing all these important issues at once. We also present a reduced-rank version of KEMA for computational

  2. Fighting discrimination.

    PubMed

    Wientjens, Wim; Cairns, Douglas

    2012-10-01

    In the fight against discrimination, the IDF launched the first ever International Charter of Rights and Responsibilities of People with Diabetes in 2011: a balance between rights and duties to optimize health and quality of life, to enable as normal a life as possible and to reduce/eliminate the barriers which deny realization of full potential as members of society. It is extremely frustrating to suffer blanket bans and many examples exist, including insurance, driving licenses, getting a job, keeping a job and family affairs. In this article, an example is given of how pilots with insulin treated diabetes are allowed to fly by taking the responsibility of using special blood glucose monitoring protocols. At this time the systems in the countries allowing flying for pilots with insulin treated diabetes are applauded, particularly the USA for private flying, and Canada for commercial flying. Encouraging developments may be underway in the UK for commercial flying and, if this materializes, could be used as an example for other aviation authorities to help adopt similar protocols. However, new restrictions implemented by the new European Aviation Authority take existing privileges away for National Private Pilot Licence holders with insulin treated diabetes in the UK. PMID:22784927

  4. Feature expectation heightens visual sensitivity during fine orientation discrimination

    PubMed Central

    Cheadle, Sam; Egner, Tobias; Wyart, Valentin; Wu, Claire; Summerfield, Christopher

    2015-01-01

    Attending to a stimulus enhances the sensitivity of perceptual decisions. However, it remains unclear how perceptual sensitivity varies according to whether a feature is expected or unexpected. Here, observers made fine discrimination judgments about the orientation of visual gratings embedded in low spatial-frequency noise, and psychophysical reverse correlation was used to estimate decision ‘kernels' that revealed how visual features influenced choices. Orthogonal cues alerted subjects to which of two spatial locations was likely to be probed (spatial attention cue) and which of two oriented gratings was likely to occur (feature expectation cue). When an expected (relative to unexpected) feature occurred, decision kernels shifted away from the category boundary, allowing observers to capitalize on more informative, “off-channel” stimulus features. By contrast, the spatial attention cue had a multiplicative influence on decision kernels, consistent with an increase in response gain. Feature expectation thus heightens sensitivity to the most informative visual features, independent of selective attention. PMID:26505967

  5. Feature expectation heightens visual sensitivity during fine orientation discrimination.

    PubMed

    Cheadle, Sam; Egner, Tobias; Wyart, Valentin; Wu, Claire; Summerfield, Christopher

    2015-01-01

    Attending to a stimulus enhances the sensitivity of perceptual decisions. However, it remains unclear how perceptual sensitivity varies according to whether a feature is expected or unexpected. Here, observers made fine discrimination judgments about the orientation of visual gratings embedded in low spatial-frequency noise, and psychophysical reverse correlation was used to estimate decision 'kernels' that revealed how visual features influenced choices. Orthogonal cues alerted subjects to which of two spatial locations was likely to be probed (spatial attention cue) and which of two oriented gratings was likely to occur (feature expectation cue). When an expected (relative to unexpected) feature occurred, decision kernels shifted away from the category boundary, allowing observers to capitalize on more informative, "off-channel" stimulus features. By contrast, the spatial attention cue had a multiplicative influence on decision kernels, consistent with an increase in response gain. Feature expectation thus heightens sensitivity to the most informative visual features, independent of selective attention. PMID:26505967
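
    The decision-kernel estimate used in psychophysical reverse correlation can be reproduced in a toy form: for a linear observer, the kernel is recovered (up to scale) as the difference between the average noise fields preceding each choice. The simulation below is illustrative only and is not the experimental analysis pipeline of the study.

        import numpy as np

        rng = np.random.default_rng(1)
        n_trials, n_features = 5000, 64        # e.g. 64 orientation-energy bins per stimulus
        true_kernel = np.exp(-0.5 * ((np.arange(n_features) - 40) / 4.0) ** 2)

        noise = rng.normal(size=(n_trials, n_features))
        choices = (noise @ true_kernel + rng.normal(scale=2.0, size=n_trials)) > 0

        # Classification-image / decision-kernel estimate by reverse correlation.
        estimated_kernel = noise[choices].mean(axis=0) - noise[~choices].mean(axis=0)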

  6. Direct discriminant locality preserving projection with Hammerstein polynomial expansion.

    PubMed

    Chen, Xi; Zhang, Jiashu; Li, Defang

    2012-12-01

    Discriminant locality preserving projection (DLPP) is a linear approach that encodes discriminant information into the objective of locality preserving projection and improves its classification ability. To enhance the nonlinear description ability of DLPP, we can optimize the objective function of DLPP in reproducing kernel Hilbert space to form a kernel-based discriminant locality preserving projection (KDLPP). However, KDLPP suffers from the following problems: 1) a larger computational burden; 2) no explicit mapping functions, which adds further computational cost when projecting a new sample into the low-dimensional subspace; and 3) an inability to obtain the optimal discriminant vectors that would fully optimize the objective of DLPP. To overcome these weaknesses of KDLPP, in this paper, a direct discriminant locality preserving projection with Hammerstein polynomial expansion (HPDDLPP) is proposed. The proposed HPDDLPP directly implements the objective of DLPP in high-dimensional second-order Hammerstein polynomial space without requiring a matrix inverse, and extracts the optimal discriminant vectors for DLPP without a large computational burden. Compared with some other related classical methods, experimental results for face and palmprint recognition problems indicate the effectiveness of the proposed HPDDLPP.

  7. Analysis of heat kernel highlights the strongly modular and heat-preserving structure of proteins

    NASA Astrophysics Data System (ADS)

    Livi, Lorenzo; Maiorino, Enrico; Pinna, Andrea; Sadeghian, Alireza; Rizzi, Antonello; Giuliani, Alessandro

    2016-01-01

    In this paper, we study the structure and dynamical properties of protein contact networks with respect to other biological networks, together with simulated archetypal models acting as probes. We consider both classical topological descriptors, such as modularity and statistics of the shortest paths, and different interpretations in terms of diffusion provided by the discrete heat kernel, which is elaborated from the normalized graph Laplacians. A principal component analysis shows high discrimination among the network types, by considering both the topological and heat kernel based vector characterizations. Furthermore, a canonical correlation analysis demonstrates the strong agreement among those two characterizations, providing thus an important justification in terms of interpretability for the heat kernel. Finally, and most importantly, the focused analysis of the heat kernel provides a way to yield insights on the fact that proteins have to satisfy specific structural design constraints that the other considered networks do not need to obey. Notably, the heat trace decay of an ensemble of varying-size proteins denotes subdiffusion, a peculiar property of proteins.
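
    The discrete heat kernel referred to above is simply the matrix exponential of the normalized graph Laplacian; a minimal NumPy version, including the heat trace used to probe subdiffusion, is given below (the adjacency matrix of a protein contact network is assumed to be available).

        import numpy as np

        def heat_kernel_and_trace(adjacency, t=1.0):
            """Heat kernel exp(-t L) of the symmetric normalized Laplacian and the
            heat trace sum_i exp(-t * lambda_i)."""
            a = np.asarray(adjacency, dtype=float)
            d = a.sum(axis=1)
            d_inv_sqrt = np.zeros_like(d)
            nz = d > 0
            d_inv_sqrt[nz] = 1.0 / np.sqrt(d[nz])
            lap = np.eye(len(a)) - d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]
            evals, evecs = np.linalg.eigh(lap)
            kernel = (evecs * np.exp(-t * evals)) @ evecs.T
            return kernel, float(np.exp(-t * evals).sum())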

  8. Capacity of very noisy communication channels based on Fisher information

    PubMed Central

    Duan, Fabing; Chapeau-Blondeau, François; Abbott, Derek

    2016-01-01

    We generalize the asymptotic capacity expression for very noisy communication channels to now include coloured noise. For the practical scenario of a non-optimal receiver, we consider the common case of a correlation receiver. Due to the central limit theorem and the cumulative characteristic of a correlation receiver, we model this channel noise as additive Gaussian noise. Then, the channel capacity proves to be directly related to the Fisher information of the noise distribution and the weak signal energy. The conditions for occurrence of a noise-enhanced capacity effect are discussed, and the capacity difference between this noisy communication channel and other nonlinear channels is clarified. PMID:27306041

  9. Capacity of very noisy communication channels based on Fisher information

    NASA Astrophysics Data System (ADS)

    Duan, Fabing; Chapeau-Blondeau, François; Abbott, Derek

    2016-06-01

    We generalize the asymptotic capacity expression for very noisy communication channels to now include coloured noise. For the practical scenario of a non-optimal receiver, we consider the common case of a correlation receiver. Due to the central limit theorem and the cumulative characteristic of a correlation receiver, we model this channel noise as additive Gaussian noise. Then, the channel capacity proves to be directly related to the Fisher information of the noise distribution and the weak signal energy. The conditions for occurrence of a noise-enhanced capacity effect are discussed, and the capacity difference between this noisy communication channel and other nonlinear channels is clarified.
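
    The Fisher information of the noise distribution that enters the capacity expression (for a location parameter) can be evaluated numerically as J = ∫ p′(x)²/p(x) dx; the short check below recovers the Gaussian value 1/σ² and is only a generic illustration, not the paper's derivation.

        import numpy as np

        def fisher_information_location(pdf, x):
            """Numerical Fisher information of a noise density w.r.t. a location shift,
            evaluated on a uniform grid x."""
            p = pdf(x)
            dp = np.gradient(p, x)
            dx = x[1] - x[0]
            return float(np.sum(dp ** 2 / np.maximum(p, 1e-300)) * dx)

        sigma = 1.5
        x = np.linspace(-10, 10, 20001)
        gaussian = lambda t: np.exp(-0.5 * (t / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        print(fisher_information_location(gaussian, x))   # close to 1 / sigma**2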

  10. A Simplex-Like Algorithm for Fisher Markets

    NASA Astrophysics Data System (ADS)

    Adsul, Bharat; Babu, Ch. Sobhan; Garg, Jugal; Mehta, Ruta; Sohoni, Milind

    We propose a new convex optimization formulation for the Fisher market problem with linear utilities. Like the Eisenberg-Gale formulation, the set of feasible points is a polyhedral convex set while the cost function is non-linear; however, unlike that, the optimum is always attained at a vertex of this polytope. The convex cost function depends only on the initial endowments of the buyers. This formulation yields an easy simplex-like pivoting algorithm which is provably strongly polynomial for many special cases.
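
    For reference, the Eisenberg-Gale convex program that the new formulation is compared against can be written as below (buyers i with budgets m_i, goods j in unit supply, linear utilities u_ij; the notation is illustrative, not taken from the paper).

        \[
          \max_{x \ge 0} \;\; \sum_i m_i \log\Bigl(\sum_j u_{ij}\, x_{ij}\Bigr)
          \qquad \text{subject to} \qquad \sum_i x_{ij} \le 1 \quad \forall j .
        \]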

  11. Determining optimally orthogonal discriminant vectors in DCT domain for multiscale-based face recognition

    NASA Astrophysics Data System (ADS)

    Niu, Yanmin; Wang, Xuchu

    2011-02-01

    This paper presents a new face recognition method that extracts multiple discriminant features based on a multiscale image enhancement technique and improvements to kernel-based orthogonal feature extraction, with several interesting characteristics. First, it can extract more discriminative multiscale face features than traditional pixel-based or Gabor-based features. Second, it can effectively deal with the small sample size problem as well as the feature correlation problem by using eigenvalue decomposition on scatter matrices. Finally, the extractor handles nonlinearity efficiently by using the kernel trick. Multiple recognition experiments on open face data sets, with comparison to several related methods, show the effectiveness and superiority of the proposed method.

  12. Score-moment combined linear discrimination analysis (SMC-LDA) as an improved discrimination method.

    PubMed

    Han, Jintae; Chung, Hoeil; Han, Sung-Hwan; Yoon, Moon-Young

    2007-01-01

    A new discrimination method called the score-moment combined linear discrimination analysis (SMC-LDA) has been developed and its performance has been evaluated using three practical spectroscopic datasets. The key concept of SMC-LDA was to use not only the score from principal component analysis (PCA), but also the moment of the spectrum, as inputs for LDA to improve discrimination. Along with conventional score, moment is used in spectroscopic fields as an effective alternative for spectral feature representation. Three different approaches were considered. Initially, the score generated from PCA was projected onto a two-dimensional feature space by maximizing Fisher's criterion function (conventional PCA-LDA). Next, the same procedure was performed using only moment. Finally, both score and moment were utilized simultaneously for LDA. To evaluate discrimination performances, three different spectroscopic datasets were employed: (1) infrared (IR) spectra of normal and malignant stomach tissue, (2) near-infrared (NIR) spectra of diesel and light gas oil (LGO) and (3) Raman spectra of Chinese and Korean ginseng. For each case, the best discrimination results were achieved when both score and moment were used for LDA (SMC-LDA). Since the spectral representation character of moment was different from that of score, inclusion of both score and moment for LDA provided more diversified and descriptive information.
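
    One plausible reading of the score-plus-moment input construction (PCA scores concatenated with spectral moments, then LDA) is sketched below; the exact moment definition and preprocessing used in the paper may differ, so the choices here are assumptions.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def spectral_moments(spectra, orders=(2, 3, 4)):
            """Mean position plus a few central moments of each normalized spectrum."""
            axis = np.arange(spectra.shape[1], dtype=float)
            w = spectra / np.maximum(spectra.sum(axis=1, keepdims=True), 1e-12)
            mean = w @ axis
            feats = [mean]
            for k in orders:
                feats.append((w * (axis[None, :] - mean[:, None]) ** k).sum(axis=1))
            return np.column_stack(feats)

        # X: (n_samples, n_channels) spectra, y: class labels (assumed available)
        # scores = PCA(n_components=10).fit_transform(X)
        # features = np.hstack([scores, spectral_moments(X)])    # "score + moment" inputs
        # model = LinearDiscriminantAnalysis().fit(features, y)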

  13. Adaptive kernels for multi-fiber reconstruction.

    PubMed

    Barmpoutis, Angelos; Jian, Bing; Vemuri, Baba C

    2009-01-01

    In this paper we present a novel method for multi-fiber reconstruction given a diffusion-weighted MRI dataset. There are several existing methods that employ various spherical deconvolution kernels for achieving this task. However the kernels in all of the existing methods rely on certain assumptions regarding the properties of the underlying fibers, which introduce inaccuracies and unnatural limitations in them. Our model is a nontrivial generalization of the spherical deconvolution model, which unlike the existing methods does not make use of a fixed-shape kernel. Instead, the shape of the kernel is estimated simultaneously with the rest of the unknown parameters by employing a general adaptive model that can theoretically approximate any spherical deconvolution kernel. The performance of our model is demonstrated using simulated and real diffusion-weighted MR datasets and compared quantitatively with several existing techniques in the literature. The results obtained indicate that our model has superior performance that is close to the theoretic limit of the best possible achievable result.

  14. Analog forecasting with dynamics-adapted kernels

    NASA Astrophysics Data System (ADS)

    Zhao, Zhizhen; Giannakis, Dimitrios

    2016-09-01

    Analog forecasting is a nonparametric technique introduced by Lorenz in 1969 which predicts the evolution of states of a dynamical system (or observables defined on the states) by following the evolution of the sample in a historical record of observations which most closely resembles the current initial data. Here, we introduce a suite of forecasting methods which improve traditional analog forecasting by combining ideas from kernel methods developed in harmonic analysis and machine learning and state-space reconstruction for dynamical systems. A key ingredient of our approach is to replace single-analog forecasting with weighted ensembles of analogs constructed using local similarity kernels. The kernels used here employ a number of dynamics-dependent features designed to improve forecast skill, including Takens’ delay-coordinate maps (to recover information in the initial data lost through partial observations) and a directional dependence on the dynamical vector field generating the data. Mathematically, our approach is closely related to kernel methods for out-of-sample extension of functions, and we discuss alternative strategies based on the Nyström method and the multiscale Laplacian pyramids technique. We illustrate these techniques in applications to forecasting in a low-order deterministic model for atmospheric dynamics with chaotic metastability, and interannual-scale forecasting in the North Pacific sector of a comprehensive climate model. We find that forecasts based on kernel-weighted ensembles have significantly higher skill than the conventional approach following a single analog.
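
    The core idea of replacing a single analog with a kernel-weighted ensemble of analogs in delay-coordinate space can be sketched in a few lines. The snippet below is a toy scalar version (Gaussian similarity kernel, fixed bandwidth) and does not include the dynamics-adapted kernels or out-of-sample extensions discussed in the paper.

        import numpy as np

        def delay_embed(series, lags):
            """Takens-style delay-coordinate embedding of a scalar time series."""
            return np.column_stack([series[i:len(series) - lags + 1 + i] for i in range(lags)])

        def kernel_analog_forecast(history, query, horizon, lags=5, epsilon=1.0):
            """Forecast 'horizon' steps ahead as a kernel-weighted average of the
            futures of the historical analogs closest to the query state."""
            emb = delay_embed(history, lags)
            usable = emb[: len(emb) - horizon]           # analogs whose future is known
            d2 = ((usable - query) ** 2).sum(axis=1)
            w = np.exp(-d2 / epsilon)
            w /= w.sum()
            futures = history[lags - 1 + horizon : lags - 1 + horizon + len(usable)]
            return float(w @ futures)

        rng = np.random.default_rng(2)
        x = np.sin(0.07 * np.arange(3000)) + 0.05 * rng.normal(size=3000)
        query = delay_embed(x, 5)[-1]                    # current delay-coordinate state
        print(kernel_analog_forecast(x, query, horizon=10))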

  15. Investigating the inner time properties of seismograms by using the Fisher Information Measure

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Lovallo, Michele; Alcaz, Vasile; Ilies, Ion

    2014-09-01

    The time dynamics of seismograms of nine tectonic earthquakes which occurred in Vrancea (Romania) registered at three seismic stations located in Moldova are analyzed by means of the informational approach of the Fisher Information Measure (FIM). Two of the three seismic stations in Moldova (MILM and LEOM) are located within an area of high seismic hazard, while the third (SORM) lies in a less hazardous region. Our findings point to a clear discrimination of the two stations MILM and LEOM from SORM on the basis of the informational properties of the recorded seismograms corresponding to the same earthquakes. In particular, it is found that larger distance and lower azimuth characterize seismograms with lower FIM, which implies lower organization and higher disorder in seismograms recorded by SORM with respect to those recorded by MILM and LEOM. The lower FIM revealed by seismograms recorded by SORM could be related to the lower degree of seismic hazard in the area where the seismic station is installed.

  16. Nonconvexity of the relative entropy for Markov dynamics: A Fisher information approach

    NASA Astrophysics Data System (ADS)

    Polettini, Matteo; Esposito, Massimiliano

    2013-07-01

    We show via counterexamples that relative entropy between the solution of a Markovian master equation and the steady state is not a convex function of time. We thus disprove the hypothesis that a general evolution principle of thermodynamics based on the decrease of the nonadiabatic entropy production could hold. However, we argue that a large separation of typical decay times is necessary for nonconvex solutions to occur, making concave transients extremely short lived with respect to the main relaxation modes. We describe a general method based on the Fisher information matrix to discriminate between generators that admit nonconvex solutions and those that do not. While initial conditions leading to concave transients are shown to be extremely fine-tuned, by our method we are able to select nonconvex initial conditions that are arbitrarily close to the steady state. Convexity does occur when the system is close to satisfying detailed balance or, more generally, when certain normality conditions of the decay modes are satisfied. Our results circumscribe the range of validity of a conjecture by Maes [Phys. Rev. Lett. 107, 010601 (2011)] regarding monotonicity of the large deviation rate functional for the occupation probability, showing that while the conjecture might hold in the long-time limit, the conditions for Lyapunov's second criterion for stability are not met.

  17. Which Fishers are Satisfied in the Caribbean? A Comparative Analysis of Job Satisfaction Among Caribbean Lobster Fishers.

    PubMed

    Monnereau, Iris; Pollnac, Richard

    2012-10-01

    Lobster fishing (targeting the spiny lobster Panulirus argus) is an important economic activity throughout the Wider Caribbean Region both as a source of income and employment for the local population as well as foreign exchange for national governments. Due to the high unit prices of the product, international lobster trade provides a way to improve the livelihoods of fisheries-dependent populations. The species harvested is identical throughout the region and end market prices are roughly similar. In this paper we wish to investigate to what extent lobster fishers' job satisfaction differs in three countries in the Caribbean and how these differences can be explained by looking at the national governance arrangements.

  18. Describing variations of the Fisher-matrix across parameter space

    NASA Astrophysics Data System (ADS)

    Schäfer, Björn Malte; Reischke, Robert

    2016-08-01

    Forecasts in cosmology, both with Monte Carlo Markov-chain methods and with the Fisher-matrix formalism, depend on the choice of the fiducial model because both the signal strength of any observable and the model non-linearities linking observables to cosmological parameters vary in the general case. In this paper we propose a method for extrapolating Fisher-forecasts across the space of cosmological parameters by constructing a suitable basis. We demonstrate the validity of our method with constraints on a standard dark energy model extrapolated from a ΛCDM-model, as can be expected from two-bin weak lensing tomography with an Euclid-like survey, in the parameter pairs (Ωm, σ8), (Ωm, w0) and (w0, wa). Our numerical results include very accurate extrapolations across a wide range of cosmological parameters in terms of shape, size and orientation of the parameter likelihood, and a decomposition of the change of the likelihood contours into modes, which are straightforward to interpret in a geometrical way. We find that in particular the variation of the dark energy figure of merit is well captured by our formalism.

  19. Comparing quantum cloning: A Fisher-information perspective

    NASA Astrophysics Data System (ADS)

    Song, Hongting; Luo, Shunlong; Li, Nan; Chang, Lina

    2013-10-01

    Perfect cloning of an unknown quantum state is impossible. Approximate cloning, which is optimal in various senses, has been found in many cases. Paradigmatic examples are Wootters-Zurek cloning and universal cloning. These cloning machines aim at optimal cloning of the full quantum states. However, in practice, what is important and relevant may only involve partial information in quantum states, rather than quantum states themselves. For example, signals are often encoded as parameters in quantum states, whose information content is well synthesized by quantum Fisher information. This raises the basic issue of evaluating the information transferring capability (e.g., distributing quantum Fisher information) of quantum cloning. We assess and compare Wootters-Zurek cloning and universal cloning from this perspective and show that, on average, Wootters-Zurek cloning performs better than universal cloning for the phase (as well as amplitude) parameter, although they are incomparable individually, and universal cloning has many advantages over Wootters-Zurek cloning in other contexts. Physical insights and related issues are further discussed.

  20. Diffusion on a hypersphere: application to the Wright-Fisher model

    NASA Astrophysics Data System (ADS)

    Maruyama, Kishiko; Itoh, Yoshiaki

    2016-04-01

    The eigenfunction expansion by Gegenbauer polynomials for the diffusion on a hypersphere is transformed into the diffusion for the Wright-Fisher model with a particular mutation rate. We use the Ito calculus considering stochastic differential equations. The expansion gives a simple interpretation of the Griffiths eigenfunction expansion for the Wright-Fisher model. Our representation is useful to simulate the Wright-Fisher model as well as Brownian motion on a hypersphere.

  1. Fast generation of sparse random kernel graphs

    SciTech Connect

    Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo

    2015-09-10

    The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n(log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.

  2. Fast generation of sparse random kernel graphs

    DOE PAGES

    Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo

    2015-09-10

    The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n(log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.
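
    As a point of reference for what the fast algorithm improves upon, a naive quadratic-time sampler for an inhomogeneous random kernel graph looks like the sketch below; the uniform vertex types and the min(1, κ(x_i, x_j)/n) edge probability are common modeling choices assumed here, not details taken from the paper.

        import numpy as np

        def naive_kernel_graph(n, kernel, rng=None):
            """O(n^2) reference sampler: connect each pair {i, j} independently with
            probability min(1, kernel(x_i, x_j) / n)."""
            rng = rng or np.random.default_rng()
            x = rng.uniform(size=n)                       # vertex types
            edges = []
            for i in range(n):
                for j in range(i + 1, n):
                    if rng.uniform() < min(1.0, kernel(x[i], x[j]) / n):
                        edges.append((i, j))
            return edges

        # Example kernel that yields heavy-tailed degrees (illustrative choice).
        edges = naive_kernel_graph(500, lambda a, b: 2.0 / np.sqrt(a * b))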

  3. Kernel bandwidth estimation for nonparametric modeling.

    PubMed

    Bors, Adrian G; Nasios, Nikolaos

    2009-12-01

    Kernel density estimation is a nonparametric procedure for probability density modeling, which has found several applications in various fields. The smoothness and modeling ability of the functional approximation are controlled by the kernel bandwidth. In this paper, we describe a Bayesian estimation method for finding the bandwidth from a given data set. The proposed bandwidth estimation method is applied in three different computational-intelligence methods that rely on kernel density estimation: 1) scale space; 2) mean shift; and 3) quantum clustering. The third method is a novel approach that relies on the principles of quantum mechanics. This method is based on the analogy between data samples and quantum particles and uses the Schrödinger potential as a cost function. The proposed methodology is used for blind-source separation of modulated signals and for terrain segmentation based on topography information.
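
    A common frequentist alternative to the Bayesian bandwidth estimator described above is likelihood cross-validation over a bandwidth grid; the scikit-learn sketch below shows that baseline only (the dataset and the grid are arbitrary assumptions).

        import numpy as np
        from sklearn.model_selection import GridSearchCV
        from sklearn.neighbors import KernelDensity

        rng = np.random.default_rng(3)
        data = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(1, 1.0, 700)])[:, None]

        grid = GridSearchCV(KernelDensity(kernel="gaussian"),
                            {"bandwidth": np.logspace(-2, 0.5, 25)}, cv=5)
        grid.fit(data)                                    # scores by held-out log-likelihood
        kde = grid.best_estimator_
        print(grid.best_params_["bandwidth"], kde.score_samples(np.array([[0.0]])))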

  4. Experimental study of turbulent flame kernel propagation

    SciTech Connect

    Mansour, Mohy; Peters, Norbert; Schrader, Lars-Uve

    2008-07-15

    Flame kernels in spark ignited combustion systems dominate the flame propagation and combustion stability and performance. They are likely controlled by the spark energy, flow field and mixing field. The aim of the present work is to experimentally investigate the structure and propagation of the flame kernel in turbulent premixed methane flow using advanced laser-based techniques. The spark is generated using a pulsed Nd:YAG laser with 20 mJ pulse energy in order to avoid the effect of the electrodes on the flame kernel structure and the shot-to-shot variation of spark energy. Four flames have been investigated at equivalence ratios, φ_j, of 0.8 and 1.0 and jet velocities, U_j, of 6 and 12 m/s. A combined two-dimensional Rayleigh and LIPF-OH technique has been applied. The flame kernel structure has been collected at several time intervals from the laser ignition between 10 μs and 2 ms. The data show that the flame kernel structure starts with a spherical shape and changes gradually to peanut-like, then to mushroom-like, and is finally disturbed by the turbulence. The mushroom-like structure lasts longer in the stoichiometric flame and at the slower jet velocity. The growth rate of the average flame kernel radius is divided into two linear relations; the first one, during the first 100 μs, is almost three times faster than that at the later stage between 100 and 2000 μs. The flame propagation is slightly faster in leaner flames. The trends of the flame propagation, flame radius, flame cross-sectional area and mean flame temperature are related to the jet velocity and equivalence ratio. The relations obtained in the present work allow the prediction of any of these parameters at different conditions.

  5. Quantum Fisher information of the Greenberg-Horne-Zeilinger state in decoherence channels

    SciTech Connect

    Ma Jian; Huang Yixiao; Wang Xiaoguang; Sun, C. P.

    2011-08-15

    Quantum Fisher information of a parameter characterizes the sensitivity of the state with respect to changes of the parameter. In this article, we study the quantum Fisher information of a state with respect to SU(2) rotations under three decoherence channels: the amplitude-damping, phase-damping, and depolarizing channels. The initial state is chosen to be a Greenberg-Horne-Zeilinger state of which the phase sensitivity can achieve the Heisenberg limit. By using the Kraus operator representation, the quantum Fisher information is obtained analytically. We observe the decay and sudden change of the quantum Fisher information in all three channels.

  6. Volatile compound formation during argan kernel roasting.

    PubMed

    El Monfalouti, Hanae; Charrouf, Zoubida; Giordano, Manuela; Guillaume, Dominique; Kartah, Badreddine; Harhar, Hicham; Gharby, Saïd; Denhez, Clément; Zeppa, Giuseppe

    2013-01-01

    Virgin edible argan oil is prepared by cold-pressing argan kernels previously roasted at 110 degrees C for up to 25 minutes. The concentration of 40 volatile compounds in virgin edible argan oil was determined as a function of argan kernel roasting time. Most of the volatile compounds begin to be formed after 15 to 25 minutes of roasting. This suggests that a strictly controlled roasting time should allow the modulation of argan oil taste and thus satisfy different types of consumers. This could be of major importance considering the present booming use of edible argan oil.

  7. Reduced multiple empirical kernel learning machine.

    PubMed

    Wang, Zhe; Lu, MingZhe; Gao, Daqi

    2015-02-01

    Multiple kernel learning (MKL) has been demonstrated to be flexible and effective in depicting heterogeneous data sources, since MKL can introduce multiple kernels rather than a single fixed kernel into applications. However, MKL incurs a high time and space complexity in contrast to single kernel learning, which is undesirable in real-world applications. Meanwhile, the kernel mappings used in MKL generally take two forms, implicit kernel mapping and empirical kernel mapping (EKM), the latter of which has attracted less attention. In this paper, we focus on MKL with the EKM, and propose a reduced multiple empirical kernel learning machine, named RMEKLM for short. To the best of our knowledge, it is the first work to reduce both the time and space complexity of MKL with EKM. Different from existing MKL, the proposed RMEKLM adopts Gaussian elimination to extract a set of feature vectors, and it is validated that doing so does not lose much information of the original feature space. RMEKLM then uses the extracted feature vectors to span a reduced orthonormal subspace of the feature space, which is visualized in terms of its geometric structure. It can be demonstrated that the spanned subspace is isomorphic to the original feature space, which means that the dot product of two vectors in the original feature space is equal to that of the two corresponding vectors in the generated orthonormal subspace. More importantly, the proposed RMEKLM brings simpler computation and needs less storage space, especially during testing. Finally, the experimental results show that RMEKLM achieves efficient and effective performance in terms of both complexity and classification. The contributions of this paper can be given as follows: (1) by mapping the input space into an orthonormal subspace, the geometry of the generated subspace is visualized; (2) this paper first reduces both the time and space complexity of the EKM-based MKL; (3
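
    The empirical kernel mapping (EKM) that the method builds on can be written explicitly: with training Gram matrix K = VΛV^T, a sample x is mapped to Λ^{-1/2} V^T (k(x, x_1), ..., k(x, x_m))^T. The sketch below implements that generic mapping only, not the proposed reduction step.

        import numpy as np

        def empirical_kernel_map(train, kernel):
            """Return a function phi such that phi(X) @ phi(X).T reproduces the Gram matrix."""
            gram = kernel(train, train)
            evals, evecs = np.linalg.eigh(gram)
            keep = evals > 1e-10                      # drop numerically null directions
            proj = evecs[:, keep] / np.sqrt(evals[keep])
            return lambda x: kernel(x, train) @ proj

        rbf = lambda a, b: np.exp(-((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))
        train = np.random.default_rng(4).normal(size=(40, 3))
        phi = empirical_kernel_map(train, rbf)
        Z = phi(train)
        print(np.abs(Z @ Z.T - rbf(train, train)).max())   # ~0: Gram matrix is reproduced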

  8. Utilizing Kernelized Advection Schemes in Ocean Models

    NASA Astrophysics Data System (ADS)

    Zadeh, N.; Balaji, V.

    2008-12-01

    There has been a recent effort in the ocean model community to use a set of generic FORTRAN library routines for advection of scalar tracers in the ocean. In a collaborative project called Hybrid Ocean Model Environment (HOME), vastly different advection schemes (space-differencing schemes for the advection equation) become available to modelers in the form of subroutine calls (kernels). In this talk we explore the possibility of utilizing ESMF data structures in wrapping these kernels so that they can be readily used in ESMF gridded components.

  9. Kernel abortion in maize. II. Distribution of ¹⁴C among kernel carbohydrates

    SciTech Connect

    Hanft, J.M.; Jones, R.J.

    1986-06-01

    This study was designed to compare the uptake and distribution of ¹⁴C among fructose, glucose, sucrose, and starch in the cob, pedicel, and endosperm tissues of maize (Zea mays L.) kernels induced to abort by high temperature with those that develop normally. Kernels cultured in vitro at 30 and 35°C were transferred to [¹⁴C]sucrose media 10 days after pollination. Kernels cultured at 35°C aborted prior to the onset of linear dry matter accumulation. Significant uptake into the cob, pedicel, and endosperm of radioactivity associated with the soluble and starch fractions of the tissues was detected after 24 hours in culture on labeled media. After 8 days in culture on [¹⁴C]sucrose media, 48 and 40% of the radioactivity associated with the cob carbohydrates was found in the reducing sugars at 30 and 35°C, respectively. Of the total carbohydrates, a higher percentage of label was associated with sucrose and a lower percentage with fructose and glucose in pedicel tissue of kernels cultured at 35°C compared to kernels cultured at 30°C. These results indicate that sucrose was not cleaved to fructose and glucose as rapidly during the unloading process in the pedicel of kernels induced to abort by high temperature. Kernels cultured at 35°C had a much lower proportion of label associated with endosperm starch (29%) than did kernels cultured at 30°C (89%). Kernels cultured at 35°C had a correspondingly higher proportion of ¹⁴C in endosperm fructose, glucose, and sucrose.

  10. Classification of Hazelnut Kernels by Using Impact Acoustic Time-Frequency Patterns

    NASA Astrophysics Data System (ADS)

    Kalkan, Habil; Ince, Nuri Firat; Tewfik, Ahmed H.; Yardimci, Yasemin; Pearson, Tom

    2007-12-01

    Hazelnuts with damaged or cracked shells are more prone to infection with aflatoxin producing molds ( Aspergillus flavus). These molds can cause cancer. In this study, we introduce a new approach that separates damaged/cracked hazelnut kernels from good ones by using time-frequency features obtained from impact acoustic signals. The proposed technique requires no prior knowledge of the relevant time and frequency locations. In an offline step, the algorithm adaptively segments impact signals from a training data set in time using local cosine packet analysis and a Kullback-Leibler criterion to assess the discrimination power of different segmentations. In each resulting time segment, the signal is further decomposed into subbands using an undecimated wavelet transform. The most discriminative subbands are selected according to the Euclidean distance between the cumulative probability distributions of the corresponding subband coefficients. The most discriminative subbands are fed into a linear discriminant analysis classifier. In the online classification step, the algorithm simply computes the learned features from the observed signal and feeds them to the linear discriminant analysis (LDA) classifier. The algorithm achieved a throughput rate of 45 nuts/s and a classification accuracy of 96% with the 30 most discriminative features, a higher rate than those provided with prior methods.

  11. Linear discriminant analysis with misallocation in training samples

    NASA Technical Reports Server (NTRS)

    Chhikara, R. (Principal Investigator); Mckeon, J.

    1982-01-01

    Linear discriminant analysis for a two-class case is studied in the presence of misallocation in training samples. A general approach to modeling misallocation is formulated, and the mean vectors and covariance matrices of the mixture distributions are derived. The asymptotic distribution of the discriminant boundary is obtained and the asymptotic first two moments of the two types of error rate are given. Certain numerical results for the error rates are presented by considering the random and two non-random misallocation models. It is shown that when the allocation procedure for training samples is objectively formulated, the effect of misallocation on the error rates of the Bayes linear discriminant rule can almost be eliminated. If, however, this is not possible, the use of the Fisher rule may be preferred over the Bayes rule.
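
    A toy simulation of the random misallocation model is easy to set up and makes the qualitative point visible; the exact models and asymptotics of the paper are not reproduced here, and the data-generating choices below are arbitrary assumptions.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(5)

        def lda_error_with_label_flips(flip_rate, n_train=500, n_test=5000, delta=2.0):
            def sample(n):
                y = rng.integers(0, 2, n)
                x = rng.normal(size=(n, 2)) + delta * np.column_stack([y, np.zeros(n)])
                return x, y
            x_tr, y_tr = sample(n_train)
            noisy = y_tr.copy()
            flip = rng.uniform(size=n_train) < flip_rate      # random misallocation
            noisy[flip] = 1 - noisy[flip]
            x_te, y_te = sample(n_test)
            clf = LinearDiscriminantAnalysis().fit(x_tr, noisy)
            return (clf.predict(x_te) != y_te).mean()

        for rate in (0.0, 0.1, 0.3):
            print(rate, lda_error_with_label_flips(rate))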

  12. Accuracy of Reduced and Extended Thin-Wire Kernels

    SciTech Connect

    Burke, G J

    2008-11-24

    Some results are presented comparing the accuracy of the reduced thin-wire kernel and an extended kernel with exact integration of the 1/R term of the Green's function and results are shown for simple wire structures.

  13. The Cosmological Origin of the Tully-Fisher Relation

    NASA Astrophysics Data System (ADS)

    Steinmetz, Matthias; Navarro, Julio F.

    1999-03-01

    We use high-resolution cosmological simulations that include the effects of gasdynamics and star formation to investigate the origin of the Tully-Fisher relation in the standard cold dark matter cosmogony. Stars are assumed to form in collapsing, Jeans-unstable gas clumps at a rate set by the local gas density and the dynamical/cooling timescale. The energetic feedback from stellar evolution is assumed to heat the gas-surrounding regions of ongoing star formation, where it is radiated away very rapidly. The star formation algorithm thus has little effect on the rate at which gas cools and collapses, and, as a result, most galaxies form their stars very early. Luminosities are computed for each model galaxy using their full star formation histories and the latest spectrophotometric models. We find that the stellar mass of model galaxies is proportional to the total baryonic mass within the virial radius of their surrounding halos. Circular velocity then correlates tightly with the total luminosity of the galaxy, which reflects the equivalence between mass and circular velocity of systems identified in a cosmological context. The slope of the relation steepens slightly from the blue to the red bandpasses and is in fairly good agreement with observations. Its scatter is small, decreasing from ~0.38 mag in the U band to ~0.24 mag in the K band. The particular cosmological model we explore here seems unable to account for the zero point of the correlation. Model galaxies are too faint at z=0 (by about 2 mag) if the circular velocity at the edge of the luminous galaxy is used as an estimator of the rotation speed. The model Tully-Fisher relation is brighter in the past by ~0.7 mag in the B band at z=1, which is at odds with recent observations of z~1 galaxies. We conclude that the slope and tightness of the Tully-Fisher relation can be naturally explained in hierarchical models, but that its normalization and evolution depend strongly on the star formation algorithm

  14. Centered Kernel Alignment Enhancing Neural Network Pretraining for MRI-Based Dementia Diagnosis

    PubMed Central

    Cárdenas-Peña, David; Collazos-Huertas, Diego; Castellanos-Dominguez, German

    2016-01-01

    Dementia is a growing problem that affects elderly people worldwide. More accurate evaluation of dementia diagnosis can help during the medical examination. Several methods for computer-aided dementia diagnosis have been proposed using resonance imaging scans to discriminate between patients with Alzheimer's disease (AD) or mild cognitive impairment (MCI) and healthy controls (NC). Nonetheless, the computer-aided diagnosis is especially challenging because of the heterogeneous and intermediate nature of MCI. We address automated dementia diagnosis by introducing a novel supervised pretraining approach that takes advantage of the artificial neural network (ANN) for complex classification tasks. The proposal initializes an ANN based on linear projections to achieve more discriminating spaces. Such projections are estimated by maximizing the centered kernel alignment criterion that assesses the affinity between the resonance imaging data kernel matrix and the label target matrix. As a result, the performed linear embedding allows accounting for features that contribute the most to the MCI class discrimination. We compare the supervised pretraining approach to two unsupervised initialization methods (autoencoders and Principal Component Analysis) and against the best four performing classification methods of the 2014 CADDementia challenge. As a result, our proposal outperforms all the baselines (7% in classification accuracy and area under the receiver-operating-characteristic curve) while also reducing the class biasing. PMID:27148392
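
    The centered kernel alignment criterion mentioned above has a compact closed form, CKA(K, L) = <K_c, L_c>_F / (||K_c||_F ||L_c||_F), with centered kernel matrices K_c and L_c. A minimal NumPy version using a label-derived target kernel is shown below; the neural-network initialization built on top of it is not reproduced.

        import numpy as np

        def centered_kernel_alignment(k, l):
            n = k.shape[0]
            h = np.eye(n) - np.ones((n, n)) / n           # centering matrix
            kc, lc = h @ k @ h, h @ l @ h
            return float(np.sum(kc * lc) / (np.linalg.norm(kc) * np.linalg.norm(lc)))

        # Target kernel from labels: L[i, j] = 1 when y_i == y_j, else 0.
        y = np.array([0, 0, 1, 1, 2])
        L = (y[:, None] == y[None, :]).astype(float)
        X = np.random.default_rng(6).normal(size=(5, 4))
        print(centered_kernel_alignment(X @ X.T, L))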

  15. Fabrication of Uranium Oxycarbide Kernels for HTR Fuel

    SciTech Connect

    Charles Barnes; Clay Richardson; Scott Nagley; John Hunn; Eric Shaber

    2010-10-01

    Babcock and Wilcox (B&W) has been producing high quality uranium oxycarbide (UCO) kernels for Advanced Gas Reactor (AGR) fuel tests at the Idaho National Laboratory. In 2005, 350-µm, 19.7% ²³⁵U-enriched UCO kernels were produced for the AGR-1 test fuel. Following coating of these kernels and forming the coated particles into compacts, this fuel was irradiated in the Advanced Test Reactor (ATR) from December 2006 until November 2009. B&W produced 425-µm, 14% enriched UCO kernels in 2008, and these kernels were used to produce fuel for the AGR-2 experiment that was inserted in ATR in 2010. B&W also produced 500-µm, 9.6% enriched UO2 kernels for the AGR-2 experiments. Kernels of the same size and enrichment as AGR-1 were also produced for the AGR-3/4 experiment. In addition to fabricating enriched UCO and UO2 kernels, B&W has produced more than 100 kg of natural uranium UCO kernels, which are being used in coating development tests. Successive lots of kernels have demonstrated consistently high quality and have also allowed for fabrication process improvements. Improvements in kernel forming were made subsequent to AGR-1 kernel production. Following fabrication of AGR-2 kernels, incremental increases in sintering furnace charge size have been demonstrated. Recently, small-scale sintering tests using a small development furnace equipped with a residual gas analyzer (RGA) have increased understanding of how kernel sintering parameters affect sintered kernel properties. The steps taken to increase throughput and process knowledge have reduced kernel production costs. Studies have been performed of additional modifications aimed at increasing the capacity of the current fabrication line for production of first core fuel for the Next Generation Nuclear Plant (NGNP) and providing a basis for the design of a full-scale fuel fabrication facility.

  16. GMM-based intermediate matching kernel for classification of varying length patterns of long duration speech using support vector machines.

    PubMed

    Dileep, Aroor Dinesh; Sekhar, Chellu Chandra

    2014-08-01

    Dynamic kernel (DK)-based support vector machines are used for the classification of varying length patterns. This paper explores the use of the intermediate matching kernel (IMK) as a DK for classification of varying length patterns of long duration speech represented as sets of feature vectors. The main issue in the construction of an IMK is the choice of the set of virtual feature vectors used to select the local feature vectors for matching. This paper proposes to use the components of a class-independent Gaussian mixture model (CIGMM) as the set of virtual feature vectors. For every component of the CIGMM, the local feature vector from each of the two sets that has the highest probability of belonging to that component is selected, and a base kernel is computed between the selected local feature vectors. The IMK is computed as the sum of all the base kernels corresponding to the different components of the CIGMM. It is proposed to use responsibility-weighted base kernels in the computation of the IMK to improve its discrimination ability. This paper also proposes posterior-probability-weighted DKs (including the proposed IMKs) to improve their classification performance and reduce the number of support vectors. The performance of support vector machine (SVM)-based classifiers using the proposed IMKs is studied for speech emotion recognition and speaker identification tasks and compared with that of SVM-based classifiers using state-of-the-art DKs. PMID:25050941
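
    A rough sketch of such an intermediate matching kernel, using scikit-learn's GaussianMixture as the class-independent GMM; the feature dimensionality, the Gaussian base kernel, and the exact responsibility weighting are simplifying assumptions rather than the paper's precise recipe.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      def intermediate_matching_kernel(Xa, Xb, gmm, gamma=0.5):
          # For every GMM component, pick from each set the local feature vector
          # with the highest responsibility for that component, evaluate a Gaussian
          # base kernel between the two picks, and sum over components.
          Ra, Rb = gmm.predict_proba(Xa), gmm.predict_proba(Xb)
          k = 0.0
          for q in range(gmm.n_components):
              xa = Xa[np.argmax(Ra[:, q])]
              xb = Xb[np.argmax(Rb[:, q])]
              base = np.exp(-gamma * np.sum((xa - xb) ** 2))
              k += Ra[:, q].max() * Rb[:, q].max() * base   # responsibility-weighted
          return k

      rng = np.random.default_rng(1)
      pool = rng.normal(size=(500, 13))                     # pooled MFCC-like vectors
      cigmm = GaussianMixture(n_components=8, random_state=0).fit(pool)
      Xa, Xb = rng.normal(size=(40, 13)), rng.normal(size=(55, 13))
      print(intermediate_matching_kernel(Xa, Xb, cigmm))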

  17. 7 CFR 51.2295 - Half kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946... the separated half of a kernel with not more than one-eighth broken off....

  18. Kernel Temporal Differences for Neural Decoding

    PubMed Central

    Bae, Jihye; Sanchez Giraldo, Luis G.; Pohlmeyer, Eric A.; Francis, Joseph T.; Sanchez, Justin C.; Príncipe, José C.

    2015-01-01

    We study the feasibility and capability of the kernel temporal difference (KTD)(λ) algorithm for neural decoding. KTD(λ) is an online, kernel-based learning algorithm, which has been introduced to estimate value functions in reinforcement learning. This algorithm combines kernel-based representations with the temporal difference approach to learning. One of our key observations is that by using strictly positive definite kernels, the algorithm's convergence can be guaranteed for policy evaluation. The algorithm's nonlinear functional approximation capabilities are shown in both simulations of policy evaluation and neural decoding problems (policy improvement). KTD can handle high-dimensional neural states containing spatio-temporal information at a reasonable computational complexity, allowing real-time applications. When the algorithm seeks a proper mapping between a monkey's neural states and desired positions of a computer cursor or a robot arm, in both open-loop and closed-loop experiments, it can effectively learn the neural-state-to-action mapping. Finally, a visualization of the coadaptation process between the decoder and the subject shows the algorithm's capabilities in reinforcement learning brain-machine interfaces. PMID:25866504
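
    A bare-bones sketch in the KTD(0) spirit may help fix ideas: the value function is a growing kernel expansion whose coefficients are driven by the temporal-difference error. Eligibility traces (the λ part) and the dictionary sparsification needed in practice are omitted, and the Gaussian kernel, step size, and synthetic state stream are assumptions for illustration only.

      import numpy as np

      class KernelTD:
          # Value function V(x) = sum_i alpha_i k(x, c_i); each observed state
          # becomes a new kernel unit whose coefficient is scaled by the TD error.
          def __init__(self, gamma=0.9, eta=0.1, width=1.0):
              self.gamma, self.eta, self.width = gamma, eta, width
              self.centers, self.alphas = [], []

          def _k(self, x, c):
              return np.exp(-np.sum((x - c) ** 2) / (2 * self.width ** 2))

          def value(self, x):
              return sum(a * self._k(x, c) for a, c in zip(self.alphas, self.centers))

          def update(self, x, reward, x_next):
              delta = reward + self.gamma * self.value(x_next) - self.value(x)
              self.centers.append(np.asarray(x, dtype=float))
              self.alphas.append(self.eta * delta)
              return delta

      rng = np.random.default_rng(2)
      td, s = KernelTD(), rng.normal(size=4)
      for _ in range(200):                      # stream of (state, reward, next state)
          s_next = s + 0.1 * rng.normal(size=4)
          td.update(s, reward=float(s_next[0] > s[0]), x_next=s_next)
          s = s_next
      print("V(s) estimate:", td.value(s))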

  19. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  20. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  1. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  2. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  3. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order... of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or brown spot, as defined in the United States Standards for Shelled Almonds, or which has embedded...

  4. Detailed HI kinematics of Tully-Fisher calibrator galaxies

    NASA Astrophysics Data System (ADS)

    Ponomareva, Anastasia A.; Verheijen, Marc A. W.; Bosma, Albert

    2016-09-01

    We present spatially resolved HI kinematics of 32 spiral galaxies which have Cepheid and/or Tip of the Red Giant Branch distances, and define a calibrator sample for the Tully-Fisher relation. The interferometric HI data for this sample were collected from available archives and supplemented with new GMRT observations. This paper describes a uniform analysis of the HI kinematics of this inhomogeneous data set. Our main result is an atlas for our calibrator sample that presents global HI profiles, integrated HI column-density maps, HI surface density profiles and, most importantly, detailed kinematic information in the form of high-quality rotation curves derived from highly resolved, two-dimensional velocity fields and position-velocity diagrams.

  5. Fisher information and the thermodynamics of scale-invariant systems

    NASA Astrophysics Data System (ADS)

    Hernando, A.; Vesperinas, C.; Plastino, A.

    2010-02-01

    We present a thermodynamic formulation for scale-invariant systems based on the minimization with constraints of the Fisher information measure. In such a way, a clear analogy between these systems’ thermal properties and those of gases and fluids is seen to emerge in a natural fashion. We focus our attention on the non-interacting scenario, speaking thus of scale-free ideal gases (SFIGs), and present some empirical evidence regarding such disparate systems as electoral results, city populations and total citations in Physics journals, which seems to indicate that SFIGs do exist. We also illustrate the way in which Zipf’s law can be understood in a thermodynamical context as the surface of a finite system. Finally, we derive an equivalent microscopic description of our systems which totally agrees with previous numerical simulations found in the literature.

  6. Enhancing teleportation of quantum Fisher information by partial measurements

    NASA Astrophysics Data System (ADS)

    Xiao, Xing; Yao, Yao; Zhong, Wo-Jun; Li, Yan-Ling; Xie, Ying-Mao

    2016-01-01

    The purpose of quantum teleportation is to completely transfer information from one party to another distant partner. However, from the perspective of parameter estimation, it is the information carried by a particular parameter, not the information of the total quantum state, that needs to be teleported. Given the inevitable noise in environments, we propose two schemes to enhance quantum Fisher information (QFI) teleportation under amplitude damping noise with the technique of partial measurements. We find that a post-partial measurement can greatly enhance the teleported QFI, while the combination of a prior partial measurement and a post-partial measurement reversal could completely eliminate the effect of decoherence. We show that, somewhat consequentially, enhancing QFI teleportation is more economical than improving fidelity teleportation. Our work extends the ability of partial measurements as a quantum technique to battle decoherence in quantum information processing.

  7. Fisher symmetry and the geometry of quantum states

    NASA Astrophysics Data System (ADS)

    Gross, Jonathan A.; Barnum, Howard; Caves, Carlton M.

    The quantum Fisher information (QFI) is a valuable tool on account of the achievable lower bound it provides for single-parameter estimation. Due to the existence of incompatible quantum observables, however, the lower bound provided by the QFI cannot be saturated in the general multi-parameter case. A bound demonstrated by Gill and Massar (GM) captures some of the limitations that incompatibility imposes in the multi-parameter case. We further explore the structure of measurements allowed by quantum mechanics, identifying restrictions beyond those given by the QFI and GM bound. These additional restrictions give insight into the geometry of quantum state space and notions of measurement symmetry related to the QFI.

  8. Fisher-Symmetric Informationally Complete Measurements for Pure States.

    PubMed

    Li, Nan; Ferrie, Christopher; Gross, Jonathan A; Kalev, Amir; Caves, Carlton M

    2016-05-01

    We introduce a new kind of quantum measurement that is defined to be symmetric in the sense of uniform Fisher information across a set of parameters that uniquely represent pure quantum states in the neighborhood of a fiducial pure state. The measurement is locally informationally complete (i.e., it uniquely determines these parameters, as opposed to distinguishing two arbitrary quantum states) and it is maximal in the sense of a multiparameter quantum Cramér-Rao bound. For a d-dimensional quantum system, requiring only local informational completeness allows us to reduce the number of outcomes of the measurement from a minimum close to but below 4d-3, for the usual notion of global pure-state informational completeness, to 2d-1. PMID:27203310

  9. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 3 2010-04-01 2009-04-01 true Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in...

  10. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 3 2011-04-01 2011-04-01 false Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in...

  11. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 3 2012-04-01 2012-04-01 false Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in...

  12. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 3 2013-04-01 2013-04-01 false Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in...

  13. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 3 2014-04-01 2014-04-01 false Tamarind seed kernel powder. 176.350 Section 176... Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a..., packaging, transporting, or holding food, subject to the provisions of this section. (a) Tamarind...

  14. 7 CFR 868.254 - Broken kernels determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Broken kernels determination. 868.254 Section 868.254 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Governing Application of Standards § 868.254 Broken kernels determination. Broken kernels shall...

  15. 7 CFR 868.304 - Broken kernels determination.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 7 2013-01-01 2013-01-01 false Broken kernels determination. 868.304 Section 868.304 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Application of Standards § 868.304 Broken kernels determination. Broken kernels shall be determined by the...

  16. 7 CFR 868.304 - Broken kernels determination.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 7 2012-01-01 2012-01-01 false Broken kernels determination. 868.304 Section 868.304 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Application of Standards § 868.304 Broken kernels determination. Broken kernels shall be determined by the...

  17. 7 CFR 868.254 - Broken kernels determination.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 7 2013-01-01 2013-01-01 false Broken kernels determination. 868.254 Section 868.254 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Governing Application of Standards § 868.254 Broken kernels determination. Broken kernels shall...

  18. 7 CFR 868.304 - Broken kernels determination.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 7 2014-01-01 2014-01-01 false Broken kernels determination. 868.304 Section 868.304 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Application of Standards § 868.304 Broken kernels determination. Broken kernels shall be determined by the...

  19. 7 CFR 868.254 - Broken kernels determination.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 7 2012-01-01 2012-01-01 false Broken kernels determination. 868.254 Section 868.254 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Governing Application of Standards § 868.254 Broken kernels determination. Broken kernels shall...

  20. 7 CFR 868.254 - Broken kernels determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 7 2011-01-01 2011-01-01 false Broken kernels determination. 868.254 Section 868.254 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Governing Application of Standards § 868.254 Broken kernels determination. Broken kernels shall...

  1. 7 CFR 51.2125 - Split or broken kernels.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Split or broken kernels. 51.2125 Section 51.2125 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... § 51.2125 Split or broken kernels. Split or broken kernels means seven-eighths or less of...

  2. 7 CFR 868.304 - Broken kernels determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 7 2011-01-01 2011-01-01 false Broken kernels determination. 868.304 Section 868.304 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Application of Standards § 868.304 Broken kernels determination. Broken kernels shall be determined by the...

  3. 7 CFR 51.2125 - Split or broken kernels.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Split or broken kernels. 51.2125 Section 51.2125 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... § 51.2125 Split or broken kernels. Split or broken kernels means seven-eighths or less of...

  4. 7 CFR 868.254 - Broken kernels determination.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 7 2014-01-01 2014-01-01 false Broken kernels determination. 868.254 Section 868.254 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Governing Application of Standards § 868.254 Broken kernels determination. Broken kernels shall...

  5. 7 CFR 868.304 - Broken kernels determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Broken kernels determination. 868.304 Section 868.304 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Application of Standards § 868.304 Broken kernels determination. Broken kernels shall be determined by the...

  6. Fisher Information, Entropy, and the Second and Third Laws of Thermodynamics

    EPA Science Inventory

    We propose Fisher Information as a new calculable thermodynamic property that can be shown to follow the Second and the Third Laws of Thermodynamics. Fisher Information is, however, qualitatively different from entropy and potentially possesses a great deal more structure. Hence...

  7. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line...

  8. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line...

  9. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line...

  10. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line...

  11. 33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line...

  12. Evaluating the sustainability of a regional system using Fisher information in the San Luis Basin, Colorado

    EPA Science Inventory

    This paper describes the theory, data, and methodology necessary for using Fisher information to assess the sustainability of the San Luis Basin (SLB) regional system over time. Fisher information was originally developed as a measure of the information content in data and is an ...

  13. Fisher information and asymptotic normality in system identification for quantum Markov chains

    SciTech Connect

    Guta, Madalin

    2011-06-15

    This paper deals with the problem of estimating the coupling constant θ of a mixing quantum Markov chain. For a repeated measurement on the chain's output we show that the outcomes' time average has an asymptotically normal (Gaussian) distribution, and we give the explicit expressions of its mean and variance. In particular, we obtain a simple estimator of θ whose classical Fisher information can be optimized over different choices of measured observables. We then show that the quantum state of the output together with the system is itself asymptotically Gaussian and compute its quantum Fisher information, which sets an absolute bound to the estimation error. The classical and quantum Fisher information are compared in a simple example. In the vicinity of θ=0 we find that the quantum Fisher information has a quadratic rather than linear scaling in output size, and asymptotically the Fisher information is localized in the system, while the output is independent of the parameter.

  14. Fisher's contributions to genetics and heredity, with special emphasis on the Gregor Mendel controversy.

    PubMed

    Piegorsch, W W

    1990-12-01

    R. A. Fisher is widely respected for his contributions to both statistics and genetics. For instance, his 1930 text on The Genetical Theory of Natural Selection remains a watershed contribution in that area. Fisher's subsequent research led him to study the work of (Johann) Gregor Mendel, the 19th century monk who first developed the basic principles of heredity with experiments on garden peas. In examining Mendel's original 1865 article, Fisher noted that the conformity between Mendel's reported and proposed (theoretical) ratios of segregating individuals was unusually good, "too good" perhaps. The resulting controversy as to whether Mendel "cooked" his data for presentation has continued to the current day. This review highlights Fisher's most salient points as regards Mendel's "too good" fit, within the context of Fisher's extensive contributions to the development of genetical and evolutionary theory.

  15. Post-tsunami relocation of fisher settlements in South Asia: evidence from the Coromandel Coast, India.

    PubMed

    Bavinck, Maarten; de Klerk, Leo; van der Plaat, Felice; Ravesteijn, Jorik; Angel, Dominique; Arendsen, Hendrik; van Dijk, Tom; de Hoog, Iris; van Koolwijk, Ant; Tuijtel, Stijn; Zuurendonk, Benjamin

    2015-07-01

    The tsunami that struck the coasts of India on 26 December 2004 resulted in the large-scale destruction of fisher habitations. The post-tsunami rehabilitation effort in Tamil Nadu was directed towards relocating fisher settlements in the interior. This paper discusses the outcomes of a study on the social effects of relocation in a sample of nine communities along the Coromandel Coast. It concludes that, although the participation of fishing communities in house design and in allocation procedures has been limited, many fisher households are satisfied with the quality of the facilities. The distance of the new settlements to the shore, however, is regarded as an impediment to engaging in the fishing profession, and many fishers are actually moving back to their old locations. This raises questions as to the direction of coastal zone policy in India, as well as to the weight accorded to safety (and other coastal development interests) vis-à-vis the livelihood needs of fishers.

  16. Chare kernel; A runtime support system for parallel computations

    SciTech Connect

    Shu, W.; Kale, L.V.

    1991-03-01

    This paper presents the chare kernel system, which supports parallel computations with irregular structure. The chare kernel is a collection of primitive functions that manage chares, manipulate messages, invoke atomic computations, and coordinate concurrent activities. Programs written in the chare kernel language can be executed on different parallel machines without change. Users writing such programs concern themselves with the creation of parallel actions but not with assigning them to specific processors. The authors describe the design and implementation of the chare kernel. Performance of chare kernel programs on two hypercube machines, the Intel iPSC/2 and the NCUBE, is also given.

  17. Kernel weights optimization for error diffusion halftoning method

    NASA Astrophysics Data System (ADS)

    Fedoseev, Victor

    2015-02-01

    This paper describes a study to find the best error diffusion kernel for digital halftoning under various restrictions on the number of non-zero kernel coefficients and their set of values. As an objective measure of quality, WSNR was used. The problem of multidimensional optimization was solved numerically using several well-known algorithms: Nelder-Mead, BFGS, and others. The study found a kernel that provides a quality gain of about 5% in comparison with the best of the commonly used kernels, the one introduced by Floyd and Steinberg. The other kernels obtained make it possible to significantly reduce the computational complexity of the halftoning process without reducing its quality.
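
    As a rough illustration of this kind of kernel-weight search, the sketch below runs Nelder-Mead over the four Floyd-Steinberg weight positions; a low-pass-filtered squared error stands in for the WSNR objective used in the study, and the tiny synthetic test image is an assumption chosen for speed.

      import numpy as np
      from scipy.ndimage import gaussian_filter
      from scipy.optimize import minimize

      def error_diffuse(img, w):
          # Error diffusion with a Floyd-Steinberg-shaped kernel whose four
          # weights w (right, down-left, down, down-right) are free parameters.
          f, out = img.astype(float).copy(), np.zeros_like(img, dtype=float)
          h, wid = f.shape
          for y in range(h - 1):
              for x in range(1, wid - 1):
                  out[y, x] = 1.0 if f[y, x] >= 0.5 else 0.0
                  e = f[y, x] - out[y, x]
                  f[y, x + 1] += e * w[0]
                  f[y + 1, x - 1] += e * w[1]
                  f[y + 1, x] += e * w[2]
                  f[y + 1, x + 1] += e * w[3]
          return out

      def objective(w, img):
          # Stand-in for WSNR: compare low-pass versions of the halftone and
          # the original, since the eye acts roughly as a low-pass filter.
          half = error_diffuse(img, w)
          return np.mean((gaussian_filter(half, 1.5) - gaussian_filter(img, 1.5)) ** 2)

      rng = np.random.default_rng(3)
      img = gaussian_filter(rng.random((48, 48)), 3)       # small smooth test image
      w0 = np.array([7, 3, 5, 1]) / 16.0                   # Floyd-Steinberg start
      res = minimize(objective, w0, args=(img,), method='Nelder-Mead',
                     options={'maxiter': 60})
      print("optimized weights:", res.x.round(3))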

  18. Online kernel principal component analysis: a reduced-order model.

    PubMed

    Honeine, Paul

    2012-09-01

    Kernel principal component analysis (kernel-PCA) is an elegant nonlinear extension of one of the most used data analysis and dimensionality reduction techniques, the principal component analysis. In this paper, we propose an online algorithm for kernel-PCA. To this end, we examine a kernel-based version of Oja's rule, initially put forward to extract a linear principal axis. As with most kernel-based machines, the model order equals the number of available observations. To provide an online scheme, we propose to control the model order. We discuss theoretical results, such as an upper bound on the error of approximating the principal functions with the reduced-order model. We derive a recursive algorithm to discover the first principal axis, and extend it to multiple axes. Experimental results demonstrate the effectiveness of the proposed approach, both on a synthetic data set and on images of handwritten digits, with comparison to classical kernel-PCA and iterative kernel-PCA.
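
    The kernelized Oja rule that the paper starts from can be sketched as follows; this bare-bones version handles only the first principal function, omits centering and the reduced-order control that is the paper's contribution, and grows one expansion term per sample, which is exactly the model-order growth being addressed.

      import numpy as np

      def rbf(x, c, sigma=1.0):
          return np.exp(-np.sum((x - c) ** 2) / (2 * sigma ** 2))

      class KernelOja:
          # First principal function represented as f(x) = sum_i a_i k(c_i, x).
          def __init__(self, eta=0.05, sigma=1.0):
              self.eta, self.sigma = eta, sigma
              self.centers, self.coeffs = [], []

          def project(self, x):
              return sum(a * rbf(x, c, self.sigma)
                         for a, c in zip(self.coeffs, self.centers))

          def update(self, x):
              y = self.project(x)
              # Oja: w <- w + eta * y * (phi(x) - y * w), written on the coefficients.
              self.coeffs = [a * (1.0 - self.eta * y * y) for a in self.coeffs]
              self.centers.append(np.asarray(x, dtype=float))
              self.coeffs.append(self.eta * y if self.coeffs else 1.0)  # seed first term
              return y

      rng = np.random.default_rng(4)
      koja = KernelOja()
      for _ in range(300):
          koja.update(rng.normal(size=3) * np.array([3.0, 1.0, 0.3]))  # anisotropic cloud
      print("expansion size:", len(koja.centers))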

  19. Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to characterize vegetation recovery after fire disturbance

    NASA Astrophysics Data System (ADS)

    Lanorte, Antonio; Lasaponara, Rosa; Lovallo, Michele; Telesca, Luciano

    2014-02-01

    The time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed by using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites located in Galizia (North Spain) and Peloponnese (South Greece), selected for the vast fires which occurred during the summers of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland in the Galizia test site and evergreen forest in Peloponnese. Time series of MVC-NDVI have been analyzed before and after the occurrence of the fire events. Results obtained for both the investigated areas clearly pointed out that the dynamics of the pixel time series before the occurrence of the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the occurrence of the fire are featured by a higher degree of organization and order. In particular, for the Peloponnese fire such discrimination is more evident than for the Galizia fire. This suggests a clear possibility to discriminate the different post-fire behaviors and dynamics exhibited by the different vegetation covers.
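
    A minimal sketch of how a Fisher-Shannon pair (Shannon entropy power, Fisher information measure) can be estimated for a one-dimensional series from a kernel density estimate; the synthetic series below stand in for NDVI data, and the estimator details are assumptions rather than the authors' exact procedure.

      import numpy as np
      from scipy.integrate import trapezoid
      from scipy.stats import gaussian_kde

      def fisher_shannon(series, grid_size=512):
          # Density estimate, then FIM = int p'(x)^2 / p(x) dx and
          # SEP = exp(2H) / (2*pi*e), with H the differential entropy.
          kde = gaussian_kde(series)
          pad = 3 * series.std()
          x = np.linspace(series.min() - pad, series.max() + pad, grid_size)
          p = np.maximum(kde(x), 1e-12)
          dp = np.gradient(p, x)
          fim = trapezoid(dp ** 2 / p, x)
          entropy = -trapezoid(p * np.log(p), x)
          sep = np.exp(2 * entropy) / (2 * np.pi * np.e)
          return sep, fim

      rng = np.random.default_rng(5)
      noisy = rng.normal(size=360)                          # disordered series
      regular = np.sin(np.linspace(0, 12 * np.pi, 360)) + 0.1 * rng.normal(size=360)
      print("noisy   (SEP, FIM):", fisher_shannon(noisy))
      print("regular (SEP, FIM):", fisher_shannon(regular))

    The more ordered series should land at higher Fisher information and lower entropy power, which is the qualitative signature used to separate pre-fire and post-fire dynamics.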

  20. Discrimination in Employment.

    ERIC Educational Resources Information Center

    Kovarsky, Irving

    Intended as a guide on discrimination problems and issues for students and practitioners in the area of employment relations, this book interrelates historical, religious, economic, medical, and sociological factors surrounding racial, religious, national, sex, age, and physical and mental discrimination to explain discrimination in employment.…

  1. A Novel Framework for Learning Geometry-Aware Kernels.

    PubMed

    Pan, Binbin; Chen, Wen-Sheng; Xu, Chen; Chen, Bo

    2016-05-01

    Data from the real world usually have a nonlinear geometric structure and are often assumed to lie on or close to a low-dimensional manifold in a high-dimensional space. How to detect this nonlinear geometric structure of the data is important for learning algorithms. Recently, there has been a surge of interest in utilizing kernels to exploit the manifold structure of the data. Such kernels are called geometry-aware kernels and are widely used in machine learning algorithms. The performance of these algorithms critically relies on the choice of the geometry-aware kernels. Intuitively, a good geometry-aware kernel should utilize additional information beyond the geometric information. In many applications, it is required to handle out-of-sample data directly. However, most of the geometry-aware kernel methods are restricted to the available data given beforehand, with no straightforward extension to out-of-sample data. In this paper, we propose a framework for more general geometry-aware kernel learning. The proposed framework integrates multiple sources of information and enables us to develop flexible and effective kernel matrices. Then, we theoretically show how the learned kernel matrices are extended to the corresponding kernel functions, in which the out-of-sample data can be computed directly. Under our framework, a novel family of geometry-aware kernels is developed. In particular, some existing geometry-aware kernels can be viewed as instances of our framework. The performance of the kernels is evaluated on dimensionality reduction, classification, and clustering tasks. The empirical results show that our kernels significantly improve the performance.

  2. Automated fine structure image analysis method for discrimination of diabetic retinopathy stage using conjunctival microvasculature images

    PubMed Central

    Khansari, Maziyar M; O’Neill, William; Penn, Richard; Chau, Felix; Blair, Norman P; Shahidi, Mahnaz

    2016-01-01

    The conjunctiva is a densely vascularized mucous membrane covering the sclera of the eye with a unique advantage of accessibility for direct visualization and non-invasive imaging. The purpose of this study is to apply an automated quantitative method for discrimination of different stages of diabetic retinopathy (DR) using conjunctival microvasculature images. Fine structural analysis of conjunctival microvasculature images was performed by ordinary least squares regression and Fisher linear discriminant analysis. Conjunctival images between groups of non-diabetic and diabetic subjects at different stages of DR were discriminated. The automated method's discrimination rates were higher than those determined by human observers. The method allowed sensitive and rapid discrimination by assessment of conjunctival microvasculature images and can be potentially useful for DR screening and monitoring. PMID:27446692
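
    In the same spirit, a two-class Fisher linear discriminant can be written in a few lines; the random feature vectors below are placeholders for the vessel-morphometry descriptors analyzed in the study.

      import numpy as np

      def fisher_direction(X0, X1):
          # w = Sw^{-1} (m1 - m0): maximizes between-class separation
          # relative to the pooled within-class scatter Sw.
          m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
          Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
                + np.cov(X1, rowvar=False) * (len(X1) - 1))
          return np.linalg.solve(Sw, m1 - m0)

      rng = np.random.default_rng(6)
      healthy = rng.normal(loc=0.0, size=(40, 6))       # placeholder feature vectors
      diabetic = rng.normal(loc=0.8, size=(40, 6))
      w = fisher_direction(healthy, diabetic)
      threshold = 0.5 * ((healthy @ w).mean() + (diabetic @ w).mean())
      print("diabetic samples classified as diabetic:",
            float(((diabetic @ w) > threshold).mean()))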

  3. [Discrimination of Red Tide algae by fluorescence spectra and principal component analysis].

    PubMed

    Su, Rong-guo; Hu, Xu-peng; Zhang, Chuan-song; Wang, Xiu-lin

    2007-07-01

    A fluorescence discrimination technology for 11 species of Red Tide algae at the genus level was constructed by principal component analysis and non-negative least squares. Rayleigh and Raman scattering peaks of the 3D fluorescence spectra were eliminated by the Delaunay triangulation method. According to the results of Fisher linear discrimination, the first and second principal component scores of the 3D fluorescence spectra were chosen as the discriminant features and the feature base was established. The 11 algae species were tested, and more than 85% of the samples were accurately determined; for Prorocentrum donghaiense, Skeletonema costatum, and Gymnodinium sp., which have frequently caused Red Tides in the East China Sea, more than 95% of the samples were correctly discriminated. The results showed that the genus-level discriminant features of the 3D fluorescence spectra of Red Tide algae given by principal component analysis work well.
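
    The non-negative least-squares unmixing step of such a scheme can be sketched as follows; the Gaussian-shaped "reference spectra" and the mixture are synthetic stand-ins for measured fluorescence data, and the PCA feature-extraction stage is not reproduced here.

      import numpy as np
      from scipy.optimize import nnls

      wavelengths = np.linspace(400, 700, 150)

      def band(center, width):
          return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

      # Three synthetic reference spectra and one noisy two-component mixture.
      references = np.column_stack([band(460, 25), band(530, 30), band(620, 20)])
      true_abundances = np.array([0.6, 0.0, 0.4])
      rng = np.random.default_rng(7)
      measurement = references @ true_abundances + 0.01 * rng.normal(size=150)

      # NNLS recovers non-negative abundance coefficients for each reference.
      abundances, residual = nnls(references, measurement)
      print("estimated abundances:", abundances.round(3))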

  4. Quark-hadron duality: Pinched kernel approach

    NASA Astrophysics Data System (ADS)

    Dominguez, C. A.; Hernandez, L. A.; Schilcher, K.; Spiesberger, H.

    2016-08-01

    Hadronic spectral functions measured by the ALEPH collaboration in the vector and axial-vector channels are used to study potential quark-hadron duality violations (DV). This is done entirely in the framework of pinched kernel finite energy sum rules (FESR), i.e. in a model independent fashion. The kinematical range of the ALEPH data is effectively extended up to s = 10 GeV² by using an appropriate kernel, and assuming that in this region the spectral functions are given by perturbative QCD. Support for this assumption is obtained by using e+e− annihilation data in the vector channel. Results in both channels show a good saturation of the pinched FESR, without further need of explicit models of DV.

  5. Wilson Dslash Kernel From Lattice QCD Optimization

    SciTech Connect

    Joo, Balint; Smelyanskiy, Mikhail; Kalamkar, Dhiraj D.; Vaidyanathan, Karthikeyan

    2015-07-01

    Lattice Quantum Chromodynamics (LQCD) is a numerical technique used for calculations in Theoretical Nuclear and High Energy Physics. LQCD is traditionally one of the first applications ported to many new high performance computing architectures, and indeed LQCD practitioners have been known to design and build custom LQCD computers. Lattice QCD kernels are frequently used as benchmarks (e.g. 168.wupwise in the SPEC suite) and are generally well understood, and as such are ideal to illustrate several optimization techniques. In this chapter we detail our work in optimizing the Wilson-Dslash kernels for the Intel Xeon Phi; however, as we will show, the technique gives excellent performance on the regular Xeon architecture as well.

  6. Management decision making for fisher populations informed by occupancy modeling

    USGS Publications Warehouse

    Fuller, Angela K.; Linden, Daniel W.; Royle, J. Andrew

    2016-01-01

    Harvest data are often used by wildlife managers when setting harvest regulations for species because the data are regularly collected and do not require implementation of logistically and financially challenging studies to obtain. However, when harvest data are not available because an area had not previously supported a harvest season, alternative approaches are required to help inform management decision making. When distribution or density data are required across large areas, occupancy modeling is a useful approach and, under certain conditions, can be used as a surrogate for density. We collaborated with the New York State Department of Environmental Conservation (NYSDEC) to conduct a camera trapping study across a 70,096-km² region of southern New York in areas that were currently open to fisher (Pekania [Martes] pennanti) harvest and those that had been closed to harvest for approximately 65 years. We used detection–nondetection data at 826 sites to model occupancy as a function of site-level landscape characteristics while accounting for sampling variation. Fisher occupancy was influenced positively by the proportion of conifer and mixed-wood forest within a 15-km² grid cell and negatively associated with road density and the proportion of agriculture. Model-averaged predictions indicated high occupancy probabilities (>0.90) when road densities were low (<1 km/km²) and coniferous and mixed forest proportions were high (>0.50). Predicted occupancy ranged 0.41–0.67 in wildlife management units (WMUs) currently open to trapping, which could be used to guide a minimum occupancy threshold for opening new areas to trapping seasons. There were 5 WMUs that had been closed to trapping but had an average predicted occupancy of 0.52 (0.07 SE), above the threshold of 0.41. These areas are currently under consideration by NYSDEC for opening a conservative harvest season. We demonstrate the use of occupancy modeling as an aid to management decision making.
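
    A minimal single-season occupancy-model likelihood of the general kind used in such analyses is sketched below; the single covariate, constant detection probability, and simulated detection histories are illustrative assumptions, not the survey data or the exact model of the paper.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import expit

      def occupancy_nll(params, X, Y):
          # psi_i = logistic(X_i @ beta) is the occupancy probability; p is a constant
          # detection probability; Y holds detection histories over repeat visits.
          beta, p = params[:-1], expit(params[-1])
          psi = expit(X @ beta)
          detections, J = Y.sum(axis=1), Y.shape[1]
          like = psi * p ** detections * (1 - p) ** (J - detections)
          like = like + (1 - psi) * (detections == 0)   # sites never detected
          return -np.sum(np.log(like))

      rng = np.random.default_rng(8)
      n, J = 300, 4
      X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + covariate
      z = rng.random(n) < expit(X @ np.array([0.2, 1.0]))     # latent occupancy state
      Y = (rng.random((n, J)) < 0.4) & z[:, None]             # detections only if occupied
      fit = minimize(occupancy_nll, x0=np.zeros(3), args=(X, Y), method='BFGS')
      print("estimated beta and logit(p):", fit.x.round(2))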

  7. Searching and Indexing Genomic Databases via Kernelization

    PubMed Central

    Gagie, Travis; Puglisi, Simon J.

    2015-01-01

    The rapid advance of DNA sequencing technologies has yielded databases of thousands of genomes. To search and index these databases effectively, it is important that we take advantage of the similarity between those genomes. Several authors have recently suggested searching or indexing only one reference genome and the parts of the other genomes where they differ. In this paper, we survey the 20-year history of this idea and discuss its relation to kernelization in parameterized complexity. PMID:25710001

  8. Post Tsunami Job Satisfaction among the Fishers of Na Pru Village, on the Andaman Sea Coast of Thailand

    ERIC Educational Resources Information Center

    Pollnac, Richard B.; Kotowicz, Dawn

    2012-01-01

    The paper examines job satisfaction among fishers in a tsunami-impacted area on the Andaman coast of Thailand. Following the tsunami, many predicted that fishers would be reluctant to resume their fishing activities. Observations in the fishing communities, however, indicated that as soon as fishers obtained replacements for equipment damaged by…

  9. A Fast Reduced Kernel Extreme Learning Machine.

    PubMed

    Deng, Wan-Yu; Ong, Yew-Soon; Zheng, Qing-Hua

    2016-04-01

    In this paper, we present a fast and accurate kernel-based supervised algorithm referred to as the Reduced Kernel Extreme Learning Machine (RKELM). In contrast to work on the Support Vector Machine (SVM) or Least Squares SVM (LS-SVM), which identifies the support vectors or weight vectors iteratively, the proposed RKELM randomly selects a subset of the available data samples as support vectors (or mapping samples). By avoiding the iterative steps of the SVM, significant cost savings in the training process can be readily attained, especially on big data sets. RKELM is established on a rigorous proof of universal learning involving reduced kernel-based single-hidden-layer feedforward networks (SLFNs). In particular, we prove that RKELM can approximate any nonlinear function accurately under the condition of sufficient support vectors. Experimental results on a wide variety of real-world small and large instance size applications, in the context of binary classification, multi-class problems and regression, are then reported to show that RKELM can achieve a level of generalization performance competitive with the SVM/LS-SVM at only a fraction of the computational effort incurred.
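
    A compact sketch of the reduced-kernel idea follows: a random subset of the training samples serves as the kernel support points and the output weights are obtained in closed form by ridge regression, with no iterative training. The kernel, regularization value, and toy regression task are assumptions for illustration only.

      import numpy as np

      def rbf_kernel(A, B, gamma=0.5):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      class ReducedKernelELM:
          def __init__(self, n_support=50, reg=1e-2, gamma=0.5, seed=0):
              self.n_support, self.reg, self.gamma = n_support, reg, gamma
              self.rng = np.random.default_rng(seed)

          def fit(self, X, Y):
              idx = self.rng.choice(len(X), size=self.n_support, replace=False)
              self.support = X[idx]                                # random support set
              K = rbf_kernel(X, self.support, self.gamma)          # n x m hidden mapping
              A = K.T @ K + self.reg * np.eye(self.n_support)
              self.beta = np.linalg.solve(A, K.T @ Y)              # closed-form weights
              return self

          def predict(self, X):
              return rbf_kernel(X, self.support, self.gamma) @ self.beta

      rng = np.random.default_rng(9)
      X = rng.uniform(-3, 3, size=(400, 2))
      Y = np.sin(X[:, :1]) + 0.05 * rng.normal(size=(400, 1))
      model = ReducedKernelELM().fit(X, Y)
      print("train MSE:", float(np.mean((model.predict(X) - Y) ** 2)))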

  10. Semi-Supervised Kernel Mean Shift Clustering.

    PubMed

    Anand, Saket; Mittal, Sushil; Tuzel, Oncel; Meer, Peter

    2014-06-01

    Mean shift clustering is a powerful nonparametric technique that does not require prior knowledge of the number of clusters and does not constrain the shape of the clusters. However, being completely unsupervised, its performance suffers when the original distance metric fails to capture the underlying cluster structure. Despite recent advances in semi-supervised clustering methods, there has been little effort towards incorporating supervision into mean shift. We propose a semi-supervised framework for kernel mean shift clustering (SKMS) that uses only pairwise constraints to guide the clustering procedure. The points are first mapped to a high-dimensional kernel space where the constraints are imposed by a linear transformation of the mapped points. This is achieved by modifying the initial kernel matrix by minimizing a log det divergence-based objective function. We show the advantages of SKMS by evaluating its performance on various synthetic and real datasets while comparing with state-of-the-art semi-supervised clustering algorithms. PMID:26353281
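
    For reference, the plain unsupervised mean shift procedure that SKMS builds on can be sketched as follows; the pairwise-constraint handling and the log-det-divergence kernel update that constitute the paper's contribution are not shown, and the bandwidth and toy data are arbitrary.

      import numpy as np

      def mean_shift(X, bandwidth=1.0, iters=50):
          # Every point moves to the Gaussian-kernel-weighted mean of all points,
          # repeatedly, so points drift toward the modes of the density.
          modes = X.copy()
          for _ in range(iters):
              d2 = ((modes[:, None, :] - X[None, :, :]) ** 2).sum(-1)
              w = np.exp(-d2 / (2 * bandwidth ** 2))
              modes = (w @ X) / w.sum(axis=1, keepdims=True)
          return modes

      rng = np.random.default_rng(12)
      X = np.vstack([rng.normal(loc=-3, size=(50, 2)), rng.normal(loc=3, size=(50, 2))])
      modes = mean_shift(X, bandwidth=1.5)
      labels = (modes[:, 0] > 0).astype(int)       # two clusters emerge at the two modes
      print("cluster sizes:", np.bincount(labels))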

  12. The Palomar kernel-phase experiment: testing kernel phase interferometry for ground-based astronomical observations

    NASA Astrophysics Data System (ADS)

    Pope, Benjamin; Tuthill, Peter; Hinkley, Sasha; Ireland, Michael J.; Greenbaum, Alexandra; Latyshev, Alexey; Monnier, John D.; Martinache, Frantz

    2016-01-01

    At present, the principal limitation on the resolution and contrast of astronomical imaging instruments comes from aberrations in the optical path, which may be imposed by the Earth's turbulent atmosphere or by variations in the alignment and shape of the telescope optics. These errors can be corrected physically, with active and adaptive optics, and in post-processing of the resulting image. A recently developed adaptive optics post-processing technique, called kernel-phase interferometry, uses linear combinations of phases that are self-calibrating with respect to small errors, with the goal of constructing observables that are robust against the residual optical aberrations in otherwise well-corrected imaging systems. Here, we present a direct comparison between kernel phase and the more established competing techniques, aperture masking interferometry, point spread function (PSF) fitting and bispectral analysis. We resolve the α Ophiuchi binary system near periastron, using the Palomar 200-Inch Telescope. This is the first case in which kernel phase has been used with a full aperture to resolve a system close to the diffraction limit with ground-based extreme adaptive optics observations. Excellent agreement in astrometric quantities is found between kernel phase and masking, and kernel phase significantly outperforms PSF fitting and bispectral analysis, demonstrating its viability as an alternative to conventional non-redundant masking under appropriate conditions.
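
    The linear-algebra step at the heart of kernel phase, projecting observed phases onto the left null space of a phase-transfer matrix so that pupil-plane errors cancel to first order, can be illustrated with a toy matrix; the sizes and entries below are made up and carry none of the real interferometric bookkeeping.

      import numpy as np
      from scipy.linalg import null_space

      # Model: observed Fourier phases phi_obs ≈ phi_object + A @ pupil_errors.
      # Any operator K with K @ A = 0 gives observables K @ phi_obs that are
      # immune, to first order, to the pupil-plane aberrations.
      rng = np.random.default_rng(10)
      n_phases, n_pupil = 30, 12
      A = rng.normal(size=(n_phases, n_pupil))       # toy phase-transfer matrix
      K = null_space(A.T).T                          # rows span the left null space of A

      phi_object = rng.normal(size=n_phases)         # intrinsic object phases
      errors = rng.normal(size=n_pupil)              # instrumental/atmospheric errors
      phi_obs = phi_object + A @ errors

      kernel_phases = K @ phi_obs
      print("max |K @ A|:", float(np.max(np.abs(K @ A))))            # ~ 0
      print("kernel phases unaffected by errors:",
            bool(np.allclose(kernel_phases, K @ phi_object)))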

  13. A FURTHER EVALUATION OF PICTURE PROMPTS DURING AUDITORY-VISUAL CONDITIONAL DISCRIMINATION TRAINING

    PubMed Central

    Carp, Charlotte L.; Peterson, Sean P.; Arkel, Amber J.; Petursdottir, Anna I.; Ingvarsson, Einar T.

    2012-01-01

    This study was a systematic replication and extension of Fisher, Kodak, and Moore (2007), in which a picture prompt embedded into a least-to-most prompting sequence facilitated acquisition of auditory-visual conditional discriminations. Participants were 4 children who had been diagnosed with autism; 2 had limited prior receptive skills, and 2 had more advanced receptive skills. We used a balanced design to compare the effects of picture prompts, pointing prompts, and either trial-and-error learning or a no-reinforcement condition. In addition, we assessed the emergence of vocal tacts for the 2 participants who had prior tact repertoires. Picture prompts enhanced acquisition for all participants, but there were no differential effects on tact emergence. The results support a generality of the effect reported by Fisher et al. and suggest that a variety of learners may benefit from the incorporation of picture prompts into auditory-visual conditional discrimination training. PMID:23322929

  15. Distribution of quantum Fisher information in asymmetric cloning machines

    PubMed Central

    Xiao, Xing; Yao, Yao; Zhou, Lei-Ming; Wang, Xiaoguang

    2014-01-01

    An unknown quantum state cannot be copied and broadcast freely due to the no-cloning theorem. Approximate cloning schemes have been proposed to achieve the optimal cloning characterized by the maximal fidelity between the original and its copies. Here, from the perspective of quantum Fisher information (QFI), we investigate the distribution of QFI in asymmetric cloning machines which produce two nonidentical copies. As one might expect, improving the QFI of one copy results in decreasing the QFI of the other copy. It is perhaps also unsurprising that asymmetric phase-covariant cloning outperforms universal cloning in distributing QFI since a priori information of the input state has been utilized. However, interesting results appear when we compare the distributabilities of fidelity (which quantifies the full information of quantum states), and QFI (which only captures the information of relevant parameters) in asymmetric cloning machines. Unlike the results of fidelity, where the distributability of symmetric cloning is always optimal for any d-dimensional cloning, we find that any asymmetric cloning outperforms symmetric cloning on the distribution of QFI for d ≤ 18, whereas some but not all asymmetric cloning strategies could be worse than symmetric ones when d > 18. PMID:25484234

  16. The Complete Gabor-Fisher Classifier for Robust Face Recognition

    NASA Astrophysics Data System (ADS)

    Štruc, Vitomir; Pavešić, Nikola

    2010-12-01

    This paper develops a novel face recognition technique called Complete Gabor Fisher Classifier (CGFC). Different from existing techniques that use Gabor filters for deriving the Gabor face representation, the proposed approach does not rely solely on Gabor magnitude information but effectively uses features computed based on Gabor phase information as well. It represents one of the few successful attempts found in the literature of combining Gabor magnitude and phase information for robust face recognition. The novelty of the proposed CGFC technique comes from (1) the introduction of a Gabor phase-based face representation and (2) the combination of the recognition technique using the proposed representation with classical Gabor magnitude-based methods into a unified framework. The proposed face recognition framework is assessed in a series of face verification and identification experiments performed on the XM2VTS, Extended YaleB, FERET, and AR databases. The results of the assessment suggest that the proposed technique clearly outperforms state-of-the-art face recognition techniques from the literature and that its performance is almost unaffected by the presence of partial occlusions of the facial area, changes in facial expression, or severe illumination changes.

  17. A generalized Fisher equation and its utility in chemical kinetics.

    PubMed

    Ross, John; Fernández Villaverde, Alejandro; Banga, Julio R; Vázquez, Sara; Morán, Federico

    2010-07-20

    A generalized Fisher equation (GFE) relates the time derivative of the average of the intrinsic rate of growth to its variance. The GFE is an exact mathematical result that has been widely used in population dynamics and genetics, where it originated. Here we demonstrate that the GFE can also be useful in other fields, specifically in chemistry, with models of two chemical reaction systems for which the mechanisms and rate coefficients correspond reasonably well to experiments. A bad fit of the GFE can be a sign of high levels of measurement noise; for low or moderate levels of noise, fulfillment of the GFE is not degraded. Hence, the GFE presents a noise threshold that may be used to test the validity of experimental measurements without requiring any additional information. In a different approach, information about the system (model) is included in the calculations. In that case, the discrepancy with the GFE can be used as an optimization criterion for the determination of rate coefficients in a given reaction mechanism.
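
    The simplest, constant-rate form of the Fisher relation underlying the GFE (the time derivative of the population-fraction-weighted mean growth rate equals the weighted variance of the rates) can be checked numerically as below; the growth rates and initial abundances are arbitrary, and the full GFE for time-dependent rates is not reproduced.

      import numpy as np

      rng = np.random.default_rng(11)
      r = rng.uniform(0.1, 1.0, size=5)          # intrinsic growth rates
      x0 = rng.uniform(1.0, 2.0, size=5)         # initial abundances

      def mean_and_var(t):
          # Population-fraction-weighted mean and variance of the growth rates.
          x = x0 * np.exp(r * t)
          p = x / x.sum()
          mean = np.sum(p * r)
          return mean, np.sum(p * r ** 2) - mean ** 2

      t, dt = 2.0, 1e-5
      lhs = (mean_and_var(t + dt)[0] - mean_and_var(t - dt)[0]) / (2 * dt)
      rhs = mean_and_var(t)[1]
      print(lhs, rhs)                            # the two numbers agree closely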

  18. Antibiotic resistance and stress in the light of Fisher's model.

    PubMed

    Trindade, Sandra; Sousa, Ana; Gordo, Isabel

    2012-12-01

    The role of mutations in evolution depends upon the distribution of their effects on fitness. This distribution is likely to depend on the environment. Indeed genotype-by-environment interactions are key for the process of local adaptation and ecological specialization. An important trait in bacterial evolution is antibiotic resistance, which presents a clear case of change in the direction of selection between environments with and without antibiotics. Here, we study the distribution of fitness effects of mutations, conferring antibiotic resistance to Escherichia coli, in benign and stressful environments without drugs. We interpret the distributions in the light of a fitness landscape model that assumes a single fitness peak. We find that mutation effects (s) are well described by a shifted gamma distribution, with a shift parameter that reflects the distance to the fitness peak and varies across environments. Consistent with the theoretical predictions of Fisher's geometrical model, with a Gaussian relationship between phenotype and fitness, we find that the main effect of stress is to increase the variance in s. Our findings are in agreement with the results of a recent meta-analysis, which suggest that a simple fitness landscape model may capture the variation of mutation effects across species and environments.

  19. PRECISE TULLY-FISHER RELATIONS WITHOUT GALAXY INCLINATIONS

    SciTech Connect

    Obreschkow, D.; Meyer, M.

    2013-11-10

    Power-law relations between tracers of baryonic mass and rotational velocities of disk galaxies, so-called Tully-Fisher relations (TFRs), offer a wealth of applications in galaxy evolution and cosmology. However, measurements of rotational velocities require galaxy inclinations, which are difficult to measure, thus limiting the range of TFR studies. This work introduces a maximum likelihood estimation (MLE) method for recovering the TFR in galaxy samples with limited or no information on inclinations. The robustness and accuracy of this method is demonstrated using virtual and real galaxy samples. Intriguingly, the MLE reliably recovers the TFR of all test samples, even without using any inclination measurements—that is, assuming a random sin i-distribution for galaxy inclinations. Explicitly, this 'inclination-free MLE' recovers the three TFR parameters (zero-point, slope, scatter) with statistical errors only about 1.5 times larger than the best estimates based on perfectly known galaxy inclinations with zero uncertainty. Thus, given realistic uncertainties, the inclination-free MLE is highly competitive. If inclination measurements have mean errors larger than 10°, it is better not to use any inclinations than to consider the inclination measurements to be exact. The inclination-free MLE opens interesting perspectives for future H I surveys by the Square Kilometer Array and its pathfinders.

  20. Enhancing quantum Fisher information by utilizing uncollapsing measurements

    NASA Astrophysics Data System (ADS)

    He, Juan; Ding, Zhi-Yong; Ye, Liu

    2016-09-01

    As an indicator of estimation precision, quantum Fisher information (QFI) lies at the heart of quantum metrology theory. In this work, an effective scheme for enhancing QFI is proposed by utilizing quantum uncollapsing measurements. Two kinds of strategies for an arbitrary two-qubit pure state with a weight parameter and a phase parameter are implemented under different situations, respectively. We derive the explicit conditions for the optimal measurement strengths, and verify that the QFI can be improved quite well. Meanwhile, due to the relation between quantum correlation and QFI, the maximal value of the QFI associated with the phase parameter for a pure state is always equal to 1. It is worth noting that the optimal measurement strength is only related to the weight parameter, as the uncollapsing measurement operation does not disturb the value of the phase parameter. The scheme can also be extended to improve the parameter estimation precision for an N-qubit pure state. In addition, as an example, the situation of an arbitrary single-qubit state under an amplitude damping channel is investigated. It is shown that our scheme also works well for enhancing QFI under decoherence.
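
    As a quick numerical illustration of the unit QFI quoted above, the sketch below applies the standard pure-state formula F_Q = 4(⟨∂ψ|∂ψ⟩ − |⟨ψ|∂ψ⟩|²) to a balanced single-qubit phase family; the particular state and the finite-difference derivative are illustrative choices, not the paper's construction.

      import numpy as np

      def qfi_pure(psi, dpsi):
          # Quantum Fisher information of a pure-state family |psi(theta)>:
          # F_Q = 4 * ( <dpsi|dpsi> - |<psi|dpsi>|^2 ).
          return 4 * (np.vdot(dpsi, dpsi).real - abs(np.vdot(psi, dpsi)) ** 2)

      theta, eps = 0.7, 1e-6
      state = lambda t: np.array([1.0, np.exp(1j * t)]) / np.sqrt(2)   # balanced qubit state
      dpsi = (state(theta + eps) - state(theta - eps)) / (2 * eps)      # numerical derivative
      print(qfi_pure(state(theta), dpsi))                               # ≈ 1 for this family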

  1. THE SLOPE OF THE BARYONIC TULLY-FISHER RELATION

    SciTech Connect

    Gurovich, Sebastian; Freeman, Kenneth; Jerjen, Helmut; Staveley-Smith, Lister; Puerari, Ivanio

    2010-09-15

    We present the results of a baryonic Tully-Fisher relation (BTFR) study for a local sample of relatively isolated disk galaxies. We derive a BTFR with a slope near 3 measured over about 4 dex in baryon mass for our combined H I and bright spiral disk samples. This BTFR is significantly flatter and has less scatter than the TFR (stellar mass only) with its slope near 4 reported for other samples and studies. A BTFR slope near 3 is in better agreement with the expected slope from simple ΛCDM cosmological simulations that include both stellar and gas baryons. The scatter in the TFR/BTFR appears to depend on W_20: galaxies that rotate slower have more scatter. The atomic gas-to-stars ratio shows a break near W_20 = 250 km s^-1 probably associated with a change in star formation efficiency. In contrast, the absence of such a break in the BTFR suggests that this relation was probably set at the main epoch of baryon dissipation rather than as a product of later galactic evolution.

  2. Hellmann–Feynman connection for the relative Fisher information

    SciTech Connect

    Venkatesan, R.C.; Plastino, A.

    2015-08-15

    The (i) reciprocity relations for the relative Fisher information (RFI, hereafter) and (ii) a generalized RFI–Euler theorem are self-consistently derived from the Hellmann–Feynman theorem. These new reciprocity relations generalize the RFI–Euler theorem and constitute the basis for building up a mathematical Legendre transform structure (LTS, hereafter), akin to that of thermodynamics, that underlies the RFI scenario. This demonstrates the possibility of translating the entire mathematical structure of thermodynamics into a RFI-based theoretical framework. Virial theorems play a prominent role in this endeavor, as a Schrödinger-like equation can be associated to the RFI. Lagrange multipliers are determined invoking the RFI–LTS link and the quantum mechanical virial theorem. An appropriate ansatz allows for the inference of probability density functions (pdf’s, hereafter) and energy-eigenvalues of the above mentioned Schrödinger-like equation. The energy-eigenvalues obtained here via inference are benchmarked against established theoretical and numerical results. A principled theoretical basis to reconstruct the RFI-framework from the FIM framework is established. Numerical examples for exemplary cases are provided. - Highlights: • Legendre transform structure for the RFI is obtained with the Hellmann–Feynman theorem. • Inference of the energy-eigenvalues of the SWE-like equation for the RFI is accomplished. • Basis for reconstruction of the RFI framework from the FIM-case is established. • Substantial qualitative and quantitative distinctions with prior studies are discussed.

  3. A transition mass in the local Tully-Fisher relation

    NASA Astrophysics Data System (ADS)

    Simons, Raymond C.; Kassin, Susan A.; Weiner, Benjamin J.; Heckman, Timothy M.; Lee, Janice C.; Lotz, Jennifer M.; Peth, Michael; Tchernyshyov, Kirill

    2015-09-01

    We study the stellar mass Tully-Fisher relation (TFR; stellar mass versus rotation velocity) for a morphologically blind selection of emission line galaxies in the field at redshifts 0.1 < z < 0.375. Kinematics (σg, Vrot) are measured from emission lines in Keck/DEIMOS spectra and quantitative morphology is measured from V- and I-band Hubble images. We find a transition stellar mass in the TFR, log M*/M⊙ = 9.5. Above this mass, nearly all galaxies are rotation dominated, on average more morphologically disc-like according to quantitative morphology, and lie on a relatively tight TFR. Below this mass, the TFR has significant scatter to low rotation velocity and galaxies can either be rotation-dominated discs on the TFR or asymmetric or compact galaxies which scatter off. We refer to this transition mass as the `mass of disc formation', Mdf because above it all star-forming galaxies form discs (except for a small number of major mergers and highly star-forming systems), whereas below it a galaxy may or may not form a disc.

  4. Localized Fisher vector representation for pathology detection in chest radiographs

    NASA Astrophysics Data System (ADS)

    Geva, Ofer; Lieberman, Sivan; Konen, Eli; Greenspan, Hayit

    2016-03-01

    In this work, we present a novel framework for automatic detection of abnormalities in chest radiographs. The representation model is based on the Fisher Vector encoding method. In the representation process, we encode each chest radiograph using a set of extracted local descriptors. These include localized texture features that address typical local texture abnormalities as well as spatial features. Using a Gaussian Mixture Model, a rich image descriptor is generated for each chest radiograph. An improved representation is obtained by selection of features that correspond to the relevant region of interest for each pathology. Categorization of the X-ray images is conducted using supervised learning and the SVM classifier. The proposed system was tested on a dataset of 636 chest radiographs taken from a real clinical environment. We measured the performance in terms of the area under the receiver operating characteristic (ROC) curve (AUC). Results show an AUC value of 0.878 for abnormal mediastinum detection, and AUC values of 0.827 and 0.817 for detection of right and left lung opacities, respectively. These results improve upon the state of the art as compared with two alternative representation models.
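
    As a rough sketch of the representation step described above, the code below fits a diagonal-covariance GMM to local descriptors and computes a simplified Fisher Vector (gradients with respect to the component means only, with the usual power and L2 normalisation). It is a generic Fisher Vector encoder under my own simplifications, not the authors' full localized pipeline.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm(descriptors, n_components=16, seed=0):
    """Fit the visual vocabulary (GMM) on local descriptors from training images."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag",
                          random_state=seed)
    return gmm.fit(descriptors)

def fisher_vector(descriptors, gmm):
    """Encode an (N x D) set of local descriptors as a K*D Fisher Vector."""
    q = gmm.predict_proba(descriptors)                         # N x K soft assignments
    means, covs, w = gmm.means_, gmm.covariances_, gmm.weights_
    n = descriptors.shape[0]
    diff = (descriptors[:, None, :] - means[None, :, :]) / np.sqrt(covs)[None, :, :]
    grad_mu = (q[:, :, None] * diff).sum(axis=0) / (n * np.sqrt(w)[:, None])
    fv = grad_mu.ravel()
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                     # power normalisation
    return fv / (np.linalg.norm(fv) + 1e-12)                   # L2 normalisation

# toy usage: random "descriptors" standing in for local texture features
rng = np.random.default_rng(0)
gmm = fit_gmm(rng.normal(size=(2000, 8)))
print(fisher_vector(rng.normal(size=(300, 8)), gmm).shape)     # (16 * 8,)
```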

  5. Transcranial magnetic stimulation studies in the Miller Fisher syndrome: evidence of corticospinal tract abnormality

    PubMed Central

    Lo, Y; Ratnagopal, P

    2001-01-01

    OBJECTIVES—To evaluate serial central motor conduction time in the Miller Fisher syndrome.
METHOD—Three patients with classic Miller Fisher syndrome were evaluated clinically. They had serial central motor conduction times measured with transcranial magnetic stimulation and nerve conduction studies. Motor evoked potentials were recorded from the first dorsal interossei and abductor hallucis muscles.
RESULTS—All three patients showed reduction in central motor conduction times in tandem with gradual clinical improvement at each review.
CONCLUSIONS—There is electrophysiological evidence of a central reversible corticospinal tract conduction abnormality in the Miller Fisher syndrome.

 PMID:11459894

  6. On conjugate families and Jeffreys priors for von Mises–Fisher distributions

    PubMed Central

    Hornik, Kurt; Grün, Bettina

    2013-01-01

    This paper discusses characteristics of standard conjugate priors and their induced posteriors in Bayesian inference for von Mises–Fisher distributions, using either the canonical natural exponential family or the more commonly employed polar coordinate parameterizations. We analyze when standard conjugate priors as well as posteriors are proper, and investigate the Jeffreys prior for the von Mises–Fisher family. Finally, we characterize the proper distributions in the standard conjugate family of the (matrix-valued) von Mises–Fisher distributions on Stiefel manifolds. PMID:23805026

  7. Characterising brain network topologies: A dynamic analysis approach using heat kernels.

    PubMed

    Chung, A W; Schirmer, M D; Krishnan, M L; Ball, G; Aljabar, P; Edwards, A D; Montana, G

    2016-11-01

    Network theory provides a principled abstraction of the human brain: reducing a complex system into a simpler representation from which to investigate brain organisation. Recent advancement in the neuroimaging field is towards representing brain connectivity as a dynamic process in order to gain a deeper understanding of how the brain is organised for information transport. In this paper we propose a network modelling approach based on the heat kernel to capture the process of heat diffusion in complex networks. By applying the heat kernel to structural brain networks, we define new features which quantify change in heat propagation. Identifying suitable features which can classify networks between cohorts is useful for understanding the effect of disease on brain architecture. We demonstrate the discriminative power of heat kernel features in both synthetic and clinical preterm data. By generating an extensive range of synthetic networks with varying density and randomisation, we investigate heat diffusion in relation to changes in network topology. We demonstrate that our proposed features provide a metric of network efficiency and may be indicative of organisational principles commonly associated with, for example, small-world architecture. In addition, we show the potential of these features to characterise and classify between network topologies. We further demonstrate our methodology in a clinical setting by applying it to a large cohort of preterm babies scanned at term equivalent age from which diffusion networks were computed. We show that our heat kernel features are able to successfully predict motor function measured at two years of age (sensitivity, specificity, F-score, accuracy = 75.0, 82.5, 78.6, and 82.3%, respectively). PMID:27421183
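
    A minimal sketch of the basic quantity involved is given below: the heat kernel H(t) = exp(-tL) of a normalized graph Laplacian, summarised by a couple of simple diffusion features. The particular features and the random test network are illustrative assumptions, not the exact features or data used in the paper.

```python
import numpy as np
from scipy.linalg import expm

def heat_kernel(A, t):
    """H(t) = exp(-t L) for the symmetric normalized Laplacian of adjacency A."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    L = np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    return expm(-t * L)

def heat_features(A, times=(0.1, 1.0, 10.0)):
    """Simple summaries of how heat spreads through the network over time."""
    feats = []
    for t in times:
        H = heat_kernel(A, t)
        feats.append(np.trace(H) / A.shape[0])                 # heat retained on nodes
        feats.append(H[np.triu_indices_from(H, k=1)].mean())   # mean pairwise diffusion
    return np.array(feats)

# toy usage on a random weighted, symmetric network
rng = np.random.default_rng(1)
A = rng.random((20, 20)); A = (A + A.T) / 2; np.fill_diagonal(A, 0)
print(heat_features(A))
```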

  9. Small convolution kernels for high-fidelity image restoration

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Park, Stephen K.

    1991-01-01

    An algorithm is developed for computing the mean-square-optimal values for small, image-restoration kernels. The algorithm is based on a comprehensive, end-to-end imaging system model that accounts for the important components of the imaging process: the statistics of the scene, the point-spread function of the image-gathering device, sampling effects, noise, and display reconstruction. Subject to constraints on the spatial support of the kernel, the algorithm generates the kernel values that restore the image with maximum fidelity, that is, the kernel minimizes the expected mean-square restoration error. The algorithm is consistent with the derivation of the spatially unconstrained Wiener filter, but leads to a small, spatially constrained kernel that, unlike the unconstrained filter, can be efficiently implemented by convolution. Simulation experiments demonstrate that for a wide range of imaging systems these small kernels can restore images with fidelity comparable to images restored with the unconstrained Wiener filter.
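
    The sketch below illustrates the spirit of a small, convolution-implementable restoration kernel: given a degraded image and a reference, it solves by least squares for the kernel of fixed spatial support that minimises the mean-square error over the image. It omits the paper's end-to-end system model (scene statistics, sampling, display reconstruction) and is only an assumed toy formulation.

```python
import numpy as np

def fit_small_kernel(degraded, reference, size=5):
    """Least-squares fit of a size x size restoration kernel so that filtering
    the degraded image best reproduces the reference (in the MSE sense)."""
    r = size // 2
    h, w = degraded.shape
    rows, targets = [], []
    for y in range(r, h - r):
        for x in range(r, w - r):
            rows.append(degraded[y - r:y + r + 1, x - r:x + r + 1].ravel())
            targets.append(reference[y, x])
    k, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return k.reshape(size, size)

# toy usage: blur a random "scene", add noise, then fit a 5x5 restoration kernel
rng = np.random.default_rng(0)
scene = rng.normal(size=(64, 64))
blurred = np.zeros_like(scene)
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        blurred += np.roll(np.roll(scene, dy, 0), dx, 1) / 9.0
degraded = blurred + 0.05 * rng.normal(size=scene.shape)
print(fit_small_kernel(degraded, scene).round(3))
```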

  10. Multiple kernel learning for sparse representation-based classification.

    PubMed

    Shrivastava, Ashish; Patel, Vishal M; Chellappa, Rama

    2014-07-01

    In this paper, we propose a multiple kernel learning (MKL) algorithm that is based on the sparse representation-based classification (SRC) method. Taking advantage of the nonlinear kernel SRC in efficiently representing the nonlinearities in the high-dimensional feature space, we propose an MKL method based on the kernel alignment criteria. Our method uses a two-step training procedure to learn the kernel weights and sparse codes. At each iteration, the sparse codes are updated first while fixing the kernel mixing coefficients, and then the kernel mixing coefficients are updated while fixing the sparse codes. These two steps are repeated until a stopping criterion is met. The effectiveness of the proposed method is demonstrated using several publicly available image classification databases and it is shown that this method can perform significantly better than many competitive image classification algorithms. PMID:24835226

  11. The Milky Way, the Local Group & the IR Tully-Fisher Diagram

    NASA Technical Reports Server (NTRS)

    Malhotra, S.; Spergel, D.; Rhoads, J.; Li, J.

    1996-01-01

    Using the near infrared fluxes of local group galaxies derived from Cosmic Background Explorer/Diffuse Infrared Background Experiment band maps and published Cepheid distances, we construct Tully-Fisher diagrams for the Local Group.

  12. 77 FR 47818 - Proposed Information Collection; Comment Request; Socioeconomics of Commercial Fishers and for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-10

    ...; Socioeconomics of Commercial Fishers and for Hire Diving and Fishing Operations in the Flower Garden Banks...). In 1996, the Flower Gardens Bank National Marine Sanctuary (FGBNMS) was added to the system of...

  13. Statistics is not enough: revisiting Ronald A. Fisher's critique (1936) of Mendel's experimental results (1866).

    PubMed

    Pilpel, Avital

    2007-09-01

    This paper is concerned with the role of rational belief change theory in the philosophical understanding of experimental error. Today, philosophers seek insight about error in the investigation of specific experiments, rather than in general theories. Nevertheless, rational belief change theory adds to our understanding of just such cases: R. A. Fisher's criticism of Mendel's experiments being a case in point. After an historical introduction, the main part of this paper investigates Fisher's paper from the point of view of rational belief change theory: what changes of belief about Mendel's experiment does Fisher go through and with what justification. It leads to surprising insights about what Fisher had done right and wrong, and, more generally, about the limits of statistical methods in detecting error.

  14. Who's Qualified? Seeing Race in Color-Blind Times: Lessons from Fisher v. University of Texas

    ERIC Educational Resources Information Center

    Donnor, Jamel K.

    2015-01-01

    Using Howard Winant's racial dualism theory, this chapter explains how race was discursively operationalized in the recent U.S. Supreme Court higher education antiracial diversity case Fisher v. University of Texas at Austin.

  15. Fisher information of Markovian decay modes. Nonequilibrium equivalence principle, dynamical phase transitions and coarse graining

    NASA Astrophysics Data System (ADS)

    Polettini, Matteo

    2014-09-01

    We introduce the Fisher information in the basis of decay modes of Markovian dynamics, arguing that it encodes important information about the behavior of nonequilibrium systems. In particular we generalize an orthonormality relation between decay eigenmodes of detailed balanced systems to normal generators that commute with their time-reversal. Viewing such modes as tangent vectors to the manifold of statistical distributions, we relate the result to the choice of a coordinate patch that makes the Fisher-Rao metric Euclidean at the steady distribution, realizing a sort of statistical equivalence principle. We then classify nonequilibrium systems according to their spectrum, showing that a degenerate Fisher matrix is the signature of the insurgence of a class of dynamical phase transitions between nonequilibrium regimes, characterized by level crossing and power-law decay in time of suitable order parameters. An important consequence is that normal systems cannot manifest critical behavior. Finally, we study the Fisher matrix of systems with time-scale separation.

  16. Fisher information for the position-dependent mass Schrödinger system

    NASA Astrophysics Data System (ADS)

    Falaye, B. J.; Serrano, F. A.; Dong, Shi-Hai

    2016-01-01

    This study presents the Fisher information for the position-dependent mass Schrödinger equation with hyperbolic potential V(x) = -V₀ csch²(ax). The analysis of the quantum-mechanical probability for the ground and excited states (n = 0, 1, 2) has been obtained via the Fisher information. This controls both chemical and physical properties of some molecular systems. The Fisher information is considered only for x > 0 due to the singular point at x = 0. We found that the Fisher-information-based uncertainty relation and the Cramer-Rao inequality hold. Some relevant numerical results are presented. The results presented show that the Cramer-Rao and the Heisenberg products in both spaces provide a natural measure for the anharmonicity of -V₀ csch²(ax).

  17. Monte Carlo Code System for Electron (Positron) Dose Kernel Calculations.

    SciTech Connect

    CHIBANI, OMAR

    1999-05-12

    Version 00 KERNEL performs dose kernel calculations for an electron (positron) isotropic point source in an infinite homogeneous medium. First, the auxiliary code PRELIM is used to prepare cross section data for the considered medium. Then the KERNEL code simulates the transport of electrons and bremsstrahlung photons through the medium until all particles reach their cutoff energies. The deposited energy is scored in concentric spherical shells at a radial distance ranging from zero to twice the source particle range.

  18. Extended Tully-Fisher relations using H I stacking

    NASA Astrophysics Data System (ADS)

    Meyer, Scott A.; Meyer, Martin; Obreschkow, Danail; Staveley-Smith, Lister

    2016-01-01

    We present a new technique for the statistical evaluation of the Tully-Fisher relation (TFR) using spectral line stacking. This technique has the potential to extend TFR observations to lower masses and higher redshifts than possible through a galaxy-by-galaxy analysis. It further avoids the need for individual galaxy inclination measurements. To quantify the properties of stacked H I emission lines, we consider a simplistic model of galactic discs with analytically expressible line profiles. Using this model, we compare the widths of stacked profiles with those of individual galaxies. We then follow the same procedure using more realistic mock galaxies drawn from the S3-SAX model (a derivative of the Millennium simulation). Remarkably, when stacking the apparent H I lines of galaxies with similar absolute magnitude and random inclinations, the width of the stack is very similar to the width of the deprojected (= corrected for inclination) and dedispersed (= after removal of velocity dispersion) input lines. Therefore, the ratio between the widths of the stack and the deprojected/dedispersed input lines is approximately constant - about 0.93 - with very little dependence on the gas dispersion, galaxy mass, galaxy morphology and shape of the rotation curve. Finally, we apply our technique to construct a stacked TFR using H I Parkes All-Sky Survey (HIPASS) data which already has a well-defined TFR based on individual detections. We obtain a B-band TFR with a slope of -8.5 ± 0.4 and a K-band relation with a slope of -11.7 ± 0.6 for the HIPASS data set which is consistent with the existing results.

  19. The Tully-Fisher relation of COLD GASS Galaxies

    NASA Astrophysics Data System (ADS)

    Tiley, Alfred L.; Bureau, Martin; Saintonge, Amélie; Topal, Selcuk; Davis, Timothy A.; Torii, Kazufumi

    2016-10-01

    We present the stellar mass (M*) and Wide-Field Infrared Survey Explorer absolute Band 1 magnitude (MW1) Tully-Fisher relations (TFRs) of subsets of galaxies from the CO Legacy Database for the GALEX Arecibo SDSS Survey (COLD GASS). We examine the benefits and drawbacks of several commonly used fitting functions in the context of measuring CO(1-0) linewidths (and thus rotation velocities), favouring the Gaussian Double Peak function. We find the MW1 and M* TFRs, for a carefully selected sub-sample, to be MW1 = (-7.1 ± 0.6)[log((W50/sin i)/km s^-1) - 2.58] - 23.83 ± 0.09 and log(M*/M⊙) = (3.3 ± 0.3)[log((W50/sin i)/km s^-1) - 2.58] + 10.51 ± 0.04, respectively, where W50 is the width of a galaxy's CO(1-0) integrated profile at 50 per cent of its maximum and the inclination i is derived from the galaxy axial ratio measured on the Sloan Digital Sky Survey r-band image. We find no evidence for any significant offset between the TFRs of COLD GASS galaxies and those of comparison samples of similar redshifts and morphologies. The slope of the COLD GASS M* TFR agrees with the relation of Pizagno et al. However, we measure a comparatively shallower slope for the COLD GASS MW1 TFR as compared to the relation of Tully & Pierce. We attribute this to the fact that the COLD GASS sample comprises galaxies of various (late-type) morphologies. Nevertheless, our work provides a robust reference point with which to compare future CO TFR studies.

  20. Transit Light Curves with Finite Integration Time: Fisher Information Analysis

    NASA Astrophysics Data System (ADS)

    Price, Ellen M.; Rogers, Leslie A.

    2014-10-01

    Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal to noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances on the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances on the inferred parameters when comparing scenarios with constant total signal to noise (constant total integration time in the absence of read noise). Uncertainties on the transit ingress/egress time increase by a factor of 34 for Earth-size planets and 3.4 for Jupiter-size planets around Sun-like stars for integration times of 30 minutes compared to instantaneously sampled light curves. Similarly, uncertainties on the mid-transit time for Earth and Jupiter-size planets increase by factors of 3.9 and 1.4. Uncertainties on the transit depth are largely unaffected by finite integration times. While correlations among the transit depth, ingress duration, and transit duration all increase in magnitude with longer integration times, the mid-transit time remains uncorrelated with the other parameters. We provide code in Python and Mathematica for predicting the variances and covariances at www.its.caltech.edu/~eprice.
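
    A generic illustration of the Fisher-matrix calculation referred to above is sketched below for a toy trapezoidal transit model with white noise: F_ij = sum_k (dm_k/dtheta_i)(dm_k/dtheta_j)/sigma^2, with parameter covariances given by the inverse of F. The trapezoid model and all numerical values are stand-ins for illustration, not the authors' binned transit model or code.

```python
import numpy as np

def transit_model(t, depth, t0, dur, ingress):
    """Toy trapezoidal transit: flat bottom of given depth, linear ingress/egress."""
    x = np.abs(t - t0)
    m = np.ones_like(t)
    flat = x <= dur / 2 - ingress
    ramp = (x > dur / 2 - ingress) & (x < dur / 2)
    m[flat] -= depth
    m[ramp] -= depth * (dur / 2 - x[ramp]) / ingress
    return m

def fisher_matrix(t, theta, sigma, eps=1e-6):
    """F_ij = sum_k dm_k/dtheta_i * dm_k/dtheta_j / sigma^2 (numerical gradients)."""
    grads = []
    for i in range(len(theta)):
        up, dn = np.array(theta, float), np.array(theta, float)
        up[i] += eps
        dn[i] -= eps
        grads.append((transit_model(t, *up) - transit_model(t, *dn)) / (2 * eps))
    g = np.array(grads)
    return g @ g.T / sigma**2

t = np.linspace(-0.2, 0.2, 2000)                   # days, finely sampled
theta = (0.01, 0.0, 0.12, 0.01)                    # depth, mid-time, duration, ingress
cov = np.linalg.inv(fisher_matrix(t, theta, sigma=5e-4))
print(np.sqrt(np.diag(cov)))                       # 1-sigma parameter uncertainties
```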

  1. The different baryonic Tully-Fisher relations at low masses

    NASA Astrophysics Data System (ADS)

    Brook, Chris B.; Santos-Santos, Isabel; Stinson, Greg

    2016-06-01

    We compare the Baryonic Tully-Fisher relation (BTFR) of simulations and observations of galaxies ranging from dwarfs to spirals, using various measures of rotational velocity Vrot. We explore the BTFR when measuring Vrot at the flat part of the rotation curve, Vflat, at the extent of H I gas, Vlast, and using 20 per cent (W20) and 50 per cent (W50) of the width of H I line profiles. We also compare with the maximum circular velocity of the parent halo, V_max^DM, within dark matter only simulations. The different BTFRs increasingly diverge as galaxy mass decreases. Using Vlast one obtains a power law over four orders of magnitude in baryonic mass, with slope similar to the observed BTFR. Measuring Vflat gives similar results as Vlast when galaxies with rising rotation curves are excluded. However, higher rotation velocities would be found for low-mass galaxies if the cold gas extended far enough for Vrot to reach a maximum. W20 gives a similar slope as Vlast but with slightly lower values of Vrot for low-mass galaxies, although this may depend on the extent of the gas in your galaxy sample. W50 bends away from these other relations towards low velocities at low masses. By contrast, V_max^DM bends towards high velocities for low-mass galaxies, as cold gas does not extend out to the radius at which haloes reach V_max^DM. Our study highlights the need for careful comparisons between observations and models: one needs to be consistent about the particular method of measuring Vrot, and precise about the radius at which velocities are measured.

  2. Miller fisher syndrome: a hospital-based retrospective study.

    PubMed

    Yuan, C L; Wang, Y J; Tsai, C P

    2000-01-01

    Miller Fisher syndrome (MFS), characterized as ataxia, areflexia and ophthalmoplegia, is generally considered as a variant of Guillain-Barré syndrome (GBS). However, some investigators believed that the syndrome could be explained by a central origin. To obtain more information about MFS for comparison with GBS, we conducted a retrospective study by analyzing the clinical data of MFS patients admitted to our hospital over a period of 11 years. The calibrated male/female ratio was 1.65. A seasonal clustering in winter was noted. The percentage of MFS among GBS was especially high (18%, 11/60) in Taiwan when compared with other series. Involvement of limb muscle strength, autonomic function and cranial nerves, except ocular motor nerves, was rarely found in our patients. When MFS is accompanied by limb weakness, it might represent a transitional form between MFS and GBS. Bulbar palsy and dysautonomia might predict a relatively poor prognosis. To obtain more reliable information, lumbar puncture should be done 1 week after disease onset, and electrophysiological tests should be done serially in every MFS patient. Eighty percent (80%, 4/5) of our patients were positive for IgG anti-GQ(1b) antibody activity. In our study, there is more evidence indicating that MFS is a peripheral nervous system disorder; however, no definite conclusion could be made as to whether MFS is exclusively a peripheral or central nervous system disorder. We think MFS is an immune-mediated clinical entity which mainly involves the peripheral nervous system with rare involvement of other parts of the central nervous system. PMID:10965158

  3. Transit light curves with finite integration time: Fisher information analysis

    SciTech Connect

    Price, Ellen M.; Rogers, Leslie A.

    2014-10-10

    Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal to noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances on the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances on the inferred parameters when comparing scenarios with constant total signal to noise (constant total integration time in the absence of read noise). Uncertainties on the transit ingress/egress time increase by a factor of 34 for Earth-size planets and 3.4 for Jupiter-size planets around Sun-like stars for integration times of 30 minutes compared to instantaneously sampled light curves. Similarly, uncertainties on the mid-transit time for Earth and Jupiter-size planets increase by factors of 3.9 and 1.4. Uncertainties on the transit depth are largely unaffected by finite integration times. While correlations among the transit depth, ingress duration, and transit duration all increase in magnitude with longer integration times, the mid-transit time remains uncorrelated with the other parameters. We provide code in Python and Mathematica for predicting the variances and covariances at www.its.caltech.edu/∼eprice.

  4. Phase space gradient of dissipated work and information: A role of relative Fisher information

    SciTech Connect

    Yamano, Takuya

    2013-11-15

    We show that an information theoretic distance measured by the relative Fisher information between canonical equilibrium phase densities corresponding to forward and backward processes is intimately related to the gradient of the dissipated work in phase space. We present a universal constraint on it via the logarithmic Sobolev inequality. Furthermore, we point out that a possible expression of the lower bound indicates a deep connection in terms of the relative entropy and the Fisher information of the canonical distributions.

  5. Resources and estuarine health: perceptions of elected officials and recreational fishers.

    PubMed

    Burger, J; Sanchez, J; McMahon, M; Leonard, J; Lord, C G; Ramos, R; Gochfeld, M

    1999-10-29

    It is important to understand the perceptions of user groups regarding both the health of our estuaries and environmental problems requiring management. Recreational fishers were interviewed to determine the perceptions of one of the traditional user groups of Barnegat Bay (New Jersey), and elected officials were interviewed to determine if the people charged with making decisions about environmental issues in the bay held similar perceptions. Although relative ratings were similar, there were significant differences in perceptions of the severity of environmental problems, and for the most part, public officials thought the problems were more severe than did the fishers. Personal watercraft (often called Jet Skis) were rated as the most severe problem, followed by chemical pollution, junk, overfishing, street runoff, and boat oil. Small boats, sailboats, wind surfers, and foraging birds were not considered environmental problems by either elected officials or fishermen. The disconnect between the perceptions of the recreational fishers and those of the locally elected public officials suggests that officials may be hearing from some of the more vocal people about problems, rather than from the typical fishers. Both groups felt there were decreases in some of the resources in the bay; over 50% felt the number of fish and crabs had declined, the size of fish and crabs had declined, and the number of turtles had declined. Among recreational fishers, there were almost no differences in perceptions of the severity of environmental problems or in changes in the bay. The problems that were rated the most severe were personal watercraft and overfishing by commercial fishers. Recreational fishers ranked sailboats, wind surfers, and fishing by birds as posing no problem for the bay. Most fishers felt there had been recent major changes in Barnegat Bay, with there now being fewer and smaller fish, fewer and smaller crabs, and fewer turtles. The results suggest that the views

  6. Resources and estuarine health: Perceptions of elected officials and recreational fishers

    SciTech Connect

    Burger, J.; Sanchez, J.; McMahon, M.; Leonard, J.; Lord, C.G.; Ramos, R.; Gochfeld, M.

    1999-10-29

    It is important to understand the perceptions of user groups regarding both the health of their estuaries and environmental problems requiring management. Recreational fishers were interviewed to determine the perceptions of one of the traditional user groups of Barnegat Bay (New Jersey), and elected officials were interviewed to determine if the people charged with making decisions about environmental issues in the bay held similar perceptions. Although relative ratings were similar, there were significant differences in perceptions of the severity of environmental problems, and for the most part, public officials thought the problems were more severe than did the fishers. Personal watercraft (often called Jet Skis) were rated as the most severe problem, followed by chemical pollution, junk, over fishing, street runoff, and boat oil. Small boats, sailboats, wind surfers, and foraging birds were not considered environmental problems by either elected officials or fishermen. The disconnect between the perceptions of the recreational fishers and those of the locally elected public officials suggests that officials may be hearing from some of the more vocal people about problems, rather than from the typical fishers. Both groups felt there were decreases in some of the resources in the bay; over 50% felt the number of fish and crabs had declined, the size of fish and crabs had declined, and the number of turtles had declined. Among recreational fishers, there were almost no differences in perceptions of the severity of environmental problems or in changes in the bay. The problems that were rated the most severe were personal watercraft and over fishing by commercial fishers. Recreational fishers ranked sailboats, wind surfers, and fishing by birds as posing no problem for the bay. Most fishers felt there had been recent major changes in Barnegat Bay, with there now being fewer and smaller fish, fewer and smaller crabs, and fewer turtles. The results suggest that the

  7. A Kernel-based Account of Bibliometric Measures

    NASA Astrophysics Data System (ADS)

    Ito, Takahiko; Shimbo, Masashi; Kudo, Taku; Matsumoto, Yuji

    The application of kernel methods to citation analysis is explored. We show that a family of kernels on graphs provides a unified perspective on the three bibliometric measures that have been discussed independently: relatedness between documents, global importance of individual documents, and importance of documents relative to one or more (root) documents (relative importance). The framework provided by the kernels establishes relative importance as an intermediate between relatedness and global importance, in which the degree of `relativity,' or the bias between relatedness and importance, is naturally controlled by a parameter characterizing individual kernels in the family.
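
    One concrete member of such a family is the von Neumann kernel on a citation graph, sketched below: K(alpha) = B (I - alpha B)^(-1) with B = A^T A, where a small alpha weights direct co-citation relatedness and an alpha approaching 1/lambda_max weights long-range, importance-like structure. This is a generic example of the idea, with parameter names of my own, and not necessarily the exact kernel family discussed in the paper.

```python
import numpy as np

def neumann_kernel(A, alpha):
    """K(alpha) = B (I - alpha*B)^(-1) with B = A^T A (co-citation matrix);
    requires alpha < 1/lambda_max(B) for the geometric series to converge."""
    B = A.T @ A
    lam_max = np.max(np.linalg.eigvalsh(B))
    assert 0 <= alpha < 1.0 / lam_max, "alpha must be below 1/lambda_max"
    n = B.shape[0]
    return B @ np.linalg.inv(np.eye(n) - alpha * B)

# toy citation graph: A[i, j] = 1 if document i cites document j
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)
print(neumann_kernel(A, alpha=0.01))    # small alpha: close to plain co-citation
print(neumann_kernel(A, alpha=0.35))    # larger alpha: long walks dominate
```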

  8. Embedded real-time operating system micro kernel design

    NASA Astrophysics Data System (ADS)

    Cheng, Xiao-hui; Li, Ming-qiang; Wang, Xin-zheng

    2005-12-01

    Embedded systems usually require real-time behaviour. Based on an 8051 microcontroller, an embedded real-time operating system micro kernel is proposed, consisting of six parts: critical section handling, task scheduling, interrupt handling, semaphore and message mailbox communication, clock management and memory management. CPU time and other resources are distributed among tasks rationally according to their importance and urgency. The design proposed here provides the position, definition, function and principle of the micro kernel. The kernel runs on the platform of an ATMEL AT89C51 microcontroller. Simulation results show that the designed micro kernel is stable and reliable and responds quickly while operating in an application system.

  9. Robust visual tracking via speedup multiple kernel ridge regression

    NASA Astrophysics Data System (ADS)

    Qian, Cheng; Breckon, Toby P.; Li, Hui

    2015-09-01

    Most tracking methods attempt to build feature spaces to represent the appearance of a target. However, limited by the complex structure of the feature distribution, feature spaces constructed in a linear manner cannot characterize the nonlinear structure well. We propose an appearance model based on kernel ridge regression for visual tracking. Dense sampling is performed around the target image patch to collect training samples. In order to obtain a kernel space well suited to describing the target appearance, multiple kernel learning is introduced into the selection of kernels. Under this framework, instead of a single kernel, a linear combination of kernels is learned from the training samples to create the kernel space. Exploiting the circulant property of the kernel matrix, a fast interpolated iterative algorithm is developed to seek the coefficients assigned to these kernels so as to give an optimal combination. After the regression function is learned, all gathered candidate image patches are taken as the input of the function, and the candidate with the maximal response is regarded as the object image patch. Extensive experimental results demonstrate that the proposed method outperforms other state-of-the-art tracking methods.
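
    A stripped-down sketch of regression with a linear combination of kernels is given below (dual-form kernel ridge regression). The kernel weights here are fixed by hand, whereas the paper learns them, and the fast circulant solver is omitted; feature dimensions and values are illustrative assumptions.

```python
import numpy as np

def rbf(X, Z, gamma):
    """RBF kernel matrix between rows of X and rows of Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def combined_kernel(X, Z, gammas, weights):
    """Linear combination of RBF kernels with fixed mixing weights."""
    return sum(w * rbf(X, Z, g) for g, w in zip(gammas, weights))

def fit(X, y, gammas, weights, lam=1e-2):
    K = combined_kernel(X, X, gammas, weights)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)        # dual coefficients

def predict(X_train, alpha, X_test, gammas, weights):
    return combined_kernel(X_test, X_train, gammas, weights) @ alpha

# toy usage: dense samples around a target; the regression targets peak at it
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 32))
y = np.exp(-np.arange(50) / 5.0)
alpha = fit(X, y, gammas=(0.01, 0.1), weights=(0.5, 0.5))
scores = predict(X, alpha, X, gammas=(0.01, 0.1), weights=(0.5, 0.5))
print(int(np.argmax(scores)))          # candidate with the maximal response
```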

  10. Robust kernel collaborative representation for face recognition

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Wang, Xiaohui; Ma, Yanbo; Jiang, Yuzheng; Zhu, Yinghui; Jin, Zhong

    2015-05-01

    One of the greatest challenges of representation-based face recognition is that the training samples are usually insufficient. In other words, the training set usually does not include enough samples to show the varieties of high-dimensional face images caused by illumination, facial expressions, and posture. When the test sample is significantly different from the training samples of the same subject, recognition performance is sharply reduced. We propose a robust kernel collaborative representation based on virtual samples for face recognition. We believe that the virtual training set conveys some reasonable and possible variations of the original training samples. Hence, we design a new objective function to more closely match the representation coefficients generated from the original and virtual training sets. In order to further improve the robustness, we implement the corresponding representation-based face recognition in kernel space. It is noteworthy that any kind of virtual training samples can be used in our method. We use noised face images to obtain virtual face samples; the noise can be approximately viewed as a reflection of variations in illumination, facial expression, and posture. Our work offers a simple and feasible way to obtain virtual face samples: Gaussian noise (and other types of noise) is imposed on the original training samples to generate possible variations of them. Experimental results on the FERET, Georgia Tech, and ORL face databases show that the proposed method is more robust than two state-of-the-art face recognition methods, CRC and kernel CRC.

  11. LFK. Livermore FORTRAN Kernel Computer Test

    SciTech Connect

    McMahon, F.H.

    1990-05-01

    LFK, the Livermore FORTRAN Kernels, is a computer performance test that measures a realistic floating-point performance range for FORTRAN applications. Informally known as the Livermore Loops test, the LFK test may be used as a computer performance test, as a test of compiler accuracy (via checksums) and efficiency, or as a hardware endurance test. The LFK test, which focuses on FORTRAN as used in computational physics, measures the joint performance of the computer CPU, the compiler, and the computational structures in units of Megaflops/sec or Mflops. A C language version of subroutine KERNEL is also included which executes 24 samples of C numerical computation. The 24 kernels are a hydrodynamics code fragment, a fragment from an incomplete Cholesky conjugate gradient code, the standard inner product function of linear algebra, a fragment from a banded linear equations routine, a segment of a tridiagonal elimination routine, an example of a general linear recurrence equation, an equation of state fragment, part of an alternating direction implicit integration code, an integrate predictor code, a difference predictor code, a first sum, a first difference, a fragment from a two-dimensional particle-in-cell code, a part of a one-dimensional particle-in-cell code, an example of how casually FORTRAN can be written, a Monte Carlo search loop, an example of an implicit conditional computation, a fragment of a two-dimensional explicit hydrodynamics code, a general linear recurrence equation, part of a discrete ordinates transport program, a simple matrix calculation, a segment of a Planckian distribution procedure, a two-dimensional implicit hydrodynamics fragment, and determination of the location of the first minimum in an array.

  12. Oil point pressure of Indian almond kernels

    NASA Astrophysics Data System (ADS)

    Aregbesola, O.; Olatunde, G.; Esuola, S.; Owolarafe, O.

    2012-07-01

    The effect of preprocessing conditions such as moisture content, heating temperature, heating time and particle size on the oil point pressure of Indian almond kernels was investigated. Results showed that oil point pressure was significantly (P < 0.05) affected by the above-mentioned parameters. It was also observed that oil point pressure decreased with increasing heating temperature and heating time for both coarse and fine particles. Furthermore, an increase in moisture content resulted in increased oil point pressure for coarse particles, while there was a reduction in oil point pressure with increasing moisture content for fine particles.

  13. Verification of Chare-kernel programs

    SciTech Connect

    Bhansali, S.; Kale, L.V. )

    1989-01-01

    Experience with concurrent programming has shown that concurrent programs can conceal bugs even after extensive testing. Thus, there is a need for practical techniques which can establish the correctness of parallel programs. This paper proposes a method for showing how to prove the partial correctness of programs written in the Chare-kernel language, which is a language designed to support the parallel execution of computation with irregular structures. The proof is based on the lattice proof technique and is divided into two parts. The first part is concerned with the program behavior within a single chare instance, whereas the second part captures the inter-chare interaction.

  14. Tests of the Tully-Fisher Relation Using Cepheids and SNIa

    NASA Astrophysics Data System (ADS)

    Shanks, T.

    1998-12-01

    We make a direct test of Tully-Fisher distance estimates to twelve spiral galaxies with HST Cepheid distances and to twelve spiral galaxies with SNIa distances. The HST Cepheid distances come from the work of Freedman et al (1997), Sandage et al (1996) and Tanvir et al (1995). The SNIa distances come from Pierce (1994), calibrated using the Cepheid results of Sandage et al (1996). The Tully-Fisher distances mostly come from the work of Pierce (1994). The results show that the Tully-Fisher distance moduli are too short with respect to the Cepheid distances by 0.46+/-0.14 mag and too short with respect to the SNIa distances by 0.46+/-0.19 mag. Combining the HST Cepheid and SNIa data suggests that, overall, previous Tully-Fisher distances were too short by 0.46+/-0.11 mag, a result which is significant at the 4σ level. These data therefore indicate that previous Tully-Fisher distances should be revised upwards by 24+/-6 per cent, corresponding to a distance of 19.3+/-1.9 Mpc. The value of H_o from Tully-Fisher estimates is correspondingly revised downwards from H_o=84+/-10 to H_o=68+/-8. Further downward revisions of H_o are possible if it proves that Malmquist bias in the TF distance estimates is causing this discrepancy.

  15. Rapidly shifting environmental baselines among fishers of the Gulf of California

    PubMed Central

    Sáenz-Arroyo, Andrea; Roberts, Callum M; Torre, Jorge; Cariño-Olvera, Micheline; Enríquez-Andrade, Roberto R

    2005-01-01

    Shifting environmental baselines are inter-generational changes in perception of the state of the environment. As one generation replaces another, people's perceptions of what is natural change even to the extent that they no longer believe historical anecdotes of past abundance or size of species. Although widely accepted, this phenomenon has yet to be quantitatively tested. Here we survey three generations of fishers from Mexico's Gulf of California (N=108), where fish populations have declined steeply over the last 60 years, to investigate how far and fast their environmental baselines are shifting. Compared to young fishers, old fishers named five times as many species and four times as many fishing sites as once being abundant/productive but now depleted (Kruskal–Wallis tests, both p<0.001) with no evidence of a slowdown in rates of loss experienced by younger compared to older generations (Kruskal–Wallis test, n.s. in both cases). Old fishers caught up to 25 times as many Gulf grouper Mycteroperca jordani as young fishers on their best ever fishing day (regression r2=0.62, p<0.001). Despite times of plentiful large fish still being within living memory, few young fishers appreciated that large species had ever been common or nearshore sites productive. Such rapid shifts in perception of what is natural help explain why society is tolerant of the creeping loss of biodiversity. They imply a large educational hurdle in efforts to reset expectations and targets for conservation. PMID:16191603

  16. Fishers' behaviour in response to the implementation of a Marine Protected Area.

    PubMed

    Horta e Costa, Bárbara; Batista, Marisa I; Gonçalves, Leonel; Erzini, Karim; Caselle, Jennifer E; Cabral, Henrique N; Gonçalves, Emanuel J

    2014-01-01

    Marine Protected Areas (MPAs) have been widely proposed as a fisheries management tool in addition to their conservation purposes. Despite this, few studies have satisfactorily assessed the dynamics of fishers' adaptations to the loss of fishing grounds. Here we used data from before, during and after the implementation of the management plan of a temperate Atlantic multiple-use MPA to examine the factors affecting the spatial and temporal distribution of different gears used by the artisanal fishing fleet. The position of vessels and gear types were obtained by visual surveys and related to spatial features of the marine park. A hotspot analysis was conducted to identify heavily utilized patches for each fishing gear and time period. The contribution of individual vessels to each significant cluster was assessed to better understand fishers' choices. Different fisheries responded differently to the implementation of protection measures, with preferred habitats of target species driving much of the fishers' choices. Within each fishery, individual fishers showed distinct strategies with some operating in a broader area whereas others kept preferred territories. Our findings are based on reliable methods that can easily be applied in coastal multipurpose MPAs to monitor and assess fisheries and fishers responses to different management rules and protection levels. This paper is the first in-depth empirical study where fishers' choices from artisanal fisheries were analysed before, during and after the implementation of a MPA, thereby allowing a clearer understanding of the dynamics of local fisheries and providing significant lessons for marine conservation and management of coastal systems.

  17. Prediction of kernel density of corn using single-kernel near infrared spectroscopy

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Corn hardness is an important property for dry- and wet-millers, food processors and corn breeders developing hybrids for specific markets. Of the several methods used to measure hardness, kernel density measurements are one of the more repeatable methods to quantify hardness. Near infrared spec...

  18. Rapid discrimination of main red meat species based on near-infrared hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Qiao, Lu; Peng, Yankun; Chao, Kuanglin; Qin, Jianwei

    2016-05-01

    Meat is a major source of essential nutrients such as protein and fat. The discrimination of meat species and the determination of meat authenticity have become important issues in the meat industry. The objective of this study is to realize the fast and accurate identification of three main red meats, beef, lamb and pork, using near-infrared hyperspectral imaging (HSI) technology. After acquiring the hyperspectral images of meat samples, calibration of the acquired images and selection of the region of interest (ROI) were carried out. The spectral preprocessing method of standard normal variate correction (SNV) was then used to reduce light scattering and random noise before the spectral analysis. Finally, characteristic wavelengths were extracted by principal component analysis (PCA), and the Fisher linear discriminant method was applied to establish Fisher discriminant functions to identify the meat species. All the samples were collected from different batches in order to improve the coverage of the models. In addition to validation on the training set and cross-validation, three different meat samples were sliced into cubes of approximately 2 cm × 2 cm × 2 cm and spliced together at one interface to be scanned by the HSI system. The acquired hyperspectral data were used to further validate the discriminant model. The results demonstrated that near-infrared hyperspectral imaging technology can be applied as an effective, rapid and non-destructive discrimination method for the main red meats.
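
    A compact sketch of the processing chain described above (SNV correction, PCA feature reduction, Fisher linear discriminant classification) is shown below using scikit-learn. The data shapes, component counts and random spectra are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer

def snv(X):
    """Standard normal variate: per-spectrum centering and scaling."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

pipeline = make_pipeline(FunctionTransformer(snv),
                         PCA(n_components=10),
                         LinearDiscriminantAnalysis())

# X: mean ROI spectra (n_samples x n_wavelengths); y: 0 = beef, 1 = lamb, 2 = pork
rng = np.random.default_rng(0)
X = rng.random((90, 256))
y = np.repeat([0, 1, 2], 30)
pipeline.fit(X, y)
print(pipeline.score(X, y))
```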

  19. Linear and kernel methods for multi- and hypervariate change detection

    NASA Astrophysics Data System (ADS)

    Nielsen, Allan A.; Canty, Morton J.

    2010-10-01

    The iteratively re-weighted multivariate alteration detection (IR-MAD) algorithm may be used both for unsupervised change detection in multi- and hyperspectral remote sensing imagery as well as for automatic radiometric normalization of multi- or hypervariate multitemporal image sequences. Principal component analysis (PCA) as well as maximum autocorrelation factor (MAF) and minimum noise fraction (MNF) analyses of IR-MAD images, both linear and kernel-based (which are nonlinear), may further enhance change signals relative to no-change background. The kernel versions are based on a dual formulation, also termed Q-mode analysis, in which the data enter into the analysis via inner products in the Gram matrix only. In the kernel version the inner products of the original data are replaced by inner products between nonlinear mappings into higher dimensional feature space. Via kernel substitution, also known as the kernel trick, these inner products between the mappings are in turn replaced by a kernel function and all quantities needed in the analysis are expressed in terms of the kernel function. This means that we need not know the nonlinear mappings explicitly. Kernel principal component analysis (PCA), kernel MAF and kernel MNF analyses handle nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In image analysis the Gram matrix is often prohibitively large (its size is the number of pixels in the image squared). In this case we may sub-sample the image and carry out the kernel eigenvalue analysis on a set of training data samples only. To obtain a transformed version of the entire image we then project all pixels, which we call the test data, mapped nonlinearly onto the primal eigenvectors. IDL (Interactive Data Language) implementations of IR-MAD, automatic radiometric normalization and kernel PCA/MAF/MNF transformations have been written
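
    The sub-sampling strategy described above can be sketched with scikit-learn's KernelPCA: fit on a random subset of pixels so the Gram matrix stays small, then project every pixel onto the leading components. Image sizes, the RBF kernel parameter and the random "difference image" below are illustrative assumptions, not the IDL implementation mentioned in the abstract.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
H, W, B = 100, 100, 6                        # image height, width, bands/variates
diff_image = rng.normal(size=(H, W, B))      # stand-in for, e.g., IR-MAD variates
pixels = diff_image.reshape(-1, B)

# fit kernel PCA on a sub-sample only: the Gram matrix is just 1000 x 1000
train_idx = rng.choice(pixels.shape[0], size=1000, replace=False)
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.5)
kpca.fit(pixels[train_idx])

# project all pixels (the "test data") onto the leading kernel components
projected = kpca.transform(pixels).reshape(H, W, 3)
print(projected.shape)
```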

  20. Fructan metabolism in developing wheat (Triticum aestivum L.) kernels.

    PubMed

    Verspreet, Joran; Cimini, Sara; Vergauwen, Rudy; Dornez, Emmie; Locato, Vittoria; Le Roy, Katrien; De Gara, Laura; Van den Ende, Wim; Delcour, Jan A; Courtin, Christophe M

    2013-12-01

    Although fructans play a crucial role in wheat kernel development, their metabolism during kernel maturation is far from being understood. In this study, all major fructan-metabolizing enzymes together with fructan content, fructan degree of polymerization and the presence of fructan oligosaccharides were examined in developing wheat kernels (Triticum aestivum L. var. Homeros) from anthesis until maturity. Fructan accumulation occurred mainly in the first 2 weeks after anthesis, and a maximal fructan concentration of 2.5 ± 0.3 mg fructan per kernel was reached at 16 days after anthesis (DAA). Fructan synthesis was catalyzed by 1-SST (sucrose:sucrose 1-fructosyltransferase) and 6-SFT (sucrose:fructan 6-fructosyltransferase), and to a lesser extent by 1-FFT (fructan:fructan 1-fructosyltransferase). Despite the presence of 6G-kestotriose in wheat kernel extracts, the measured 6G-FFT (fructan:fructan 6G-fructosyltransferase) activity levels were low. During kernel filling, which lasted from 2 to 6 weeks after anthesis, kernel fructan content decreased from 2.5 ± 0.3 to 1.31 ± 0.12 mg fructan per kernel (42 DAA) and the average fructan degree of polymerization decreased from 7.3 ± 0.4 (14 DAA) to 4.4 ± 0.1 (42 DAA). FEH (fructan exohydrolase) reached maximal activity between 20 and 28 DAA. No fructan-metabolizing enzyme activities were registered during the final phase of kernel maturation, and fructan content and structure remained unchanged. This study provides insight into the complex metabolism of fructans during wheat kernel development and relates fructan turnover to the general phases of kernel development.

  1. Justice and Reverse Discrimination.

    ERIC Educational Resources Information Center

    Goldman, Alan H.

    Defining reverse discrimination as hiring or admissions decisions based on normally irrelevant criteria, this book develops principles of rights, compensation, and equal opportunity applicable to the reverse discrimination issue. The introduction defines the issue and discusses deductive and inductive methodology as applied to reverse…

  2. Employment Discrimination: A Survey.

    ERIC Educational Resources Information Center

    Caplan, Gerald A.

    Chapter 4 in a book on school law provides a general overview of the various federal statutes directed toward discrimination in employment and considers some of the recent developments under these statutes. The first section is a survey of the employment discrimination laws and their interrelationships. The second section analyzes more closely…

  3. Flash-Type Discrimination

    NASA Technical Reports Server (NTRS)

    Koshak, William J.

    2010-01-01

    This viewgraph presentation describes the significant progress made in the flash-type discrimination algorithm development. The contents include: 1) Highlights of Progress for GLM-R3 Flash-Type discrimination Algorithm Development; 2) Maximum Group Area (MGA) Data; 3) Retrieval Errors from Simulations; and 4) Preliminary Global-scale Retrieval.

  4. Discrimination and health inequities.

    PubMed

    Krieger, Nancy

    2014-01-01

    In 1999, only 20 studies in the public health literature employed instruments to measure self-reported experiences of discrimination. Fifteen years later, the number of empirical investigations on discrimination and health easily exceeds 500, with these studies increasingly global in scope and focused on major types of discrimination variously involving race/ethnicity, indigenous status, immigrant status, gender, sexuality, disability, and age, separately and in combination. And yet, as I also document, even as the number of investigations has dramatically expanded, the scope remains narrow: studies remain focused primarily on interpersonal discrimination, and scant research investigates the health impacts of structural discrimination, a gap consonant with the limited epidemiologic research on political systems and population health. Accordingly, to help advance the state of the field, this updated review article: (a) briefly reviews definitions of discrimination, illustrated with examples from the United States; (b) discusses theoretical insights useful for conceptualizing how discrimination can become embodied and produce health inequities, including via distortion of scientific knowledge; (c) concisely summarizes extant evidence--both robust and inconsistent--linking discrimination and health; and (d) addresses several key methodological controversies and challenges, including the need for careful attention to domains, pathways, level, and spatiotemporal scale, in historical context. PMID:25626224

  5. Microscale acceleration history discriminators

    DOEpatents

    Polosky, Marc A.; Plummer, David W.

    2002-01-01

    A new class of micromechanical acceleration history discriminators is claimed. These discriminators allow the precise differentiation of a wide range of acceleration-time histories, thereby allowing adaptive events to be triggered in response to the severity (or lack thereof) of an external environment. Such devices have applications in airbag activation, and other safety and surety applications.

  6. Reverse Discrimination: Recent Cases.

    ERIC Educational Resources Information Center

    Steinhilber, August W.

    This paper discusses reverse discrimination cases with particular emphasis on Bakke v. Regents of University of California and those cases which preceded it. A brief history is given of court cases used by opponents and proponents in the discussion of reverse discrimination. Legal theory and a discussion of court cases that preceded Bakke follow.…

  7. Aligning Biomolecular Networks Using Modular Graph Kernels

    NASA Astrophysics Data System (ADS)

    Towfic, Fadi; Greenlee, M. Heather West; Honavar, Vasant

    Comparative analysis of biomolecular networks constructed using measurements from different conditions, tissues, and organisms offers a powerful approach to understanding the structure, function, dynamics, and evolution of complex biological systems. We explore a class of algorithms for aligning large biomolecular networks by breaking down such networks into subgraphs and computing the alignment of the networks based on the alignment of their subgraphs. The resulting subnetworks are compared using graph kernels as scoring functions. We provide implementations of the resulting algorithms as part of BiNA, an open source biomolecular network alignment toolkit. Our experiments using Drosophila melanogaster, Saccharomyces cerevisiae, Mus musculus and Homo sapiens protein-protein interaction networks extracted from the DIP repository of protein-protein interaction data demonstrate that the performance of the proposed algorithms (as measured by % GO term enrichment of subnetworks identified by the alignment) is competitive with some of the state-of-the-art algorithms for pair-wise alignment of large protein-protein interaction networks. Our results also show that the inter-species similarity scores computed based on graph kernels can be used to cluster the species into a species tree that is consistent with the known phylogenetic relationships among the species.
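
    As a toy illustration of the subgraph-comparison idea (and not the BiNA implementation itself), the sketch below scores two small subgraphs with a simple degree-histogram graph kernel; the networkx example graphs and the kernel choice are assumptions made purely for illustration.

      # Sketch: compare two subgraphs with a simple degree-histogram graph kernel.
      # This is a minimal stand-in for the richer graph kernels used by BiNA.
      import numpy as np
      import networkx as nx

      def degree_histogram_kernel(g1, g2, max_degree=10):
          """Inner product of (truncated) node-degree histograms of two graphs."""
          h1 = np.bincount([min(d, max_degree) for _, d in g1.degree()], minlength=max_degree + 1)
          h2 = np.bincount([min(d, max_degree) for _, d in g2.degree()], minlength=max_degree + 1)
          return float(np.dot(h1, h2))

      # Hypothetical subgraphs standing in for pieces of two interaction networks.
      sub_a = nx.path_graph(5)   # chain of five proteins
      sub_b = nx.star_graph(4)   # hub protein with four partners
      print("kernel score:", degree_histogram_kernel(sub_a, sub_b))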

  8. Bergman kernel, balanced metrics and black holes

    NASA Astrophysics Data System (ADS)

    Klevtsov, Semyon

    In this thesis we explore the connections between the Kahler geometry and Landau levels on compact manifolds. We rederive the expansion of the Bergman kernel on Kahler manifolds developed by Tian, Yau, Zelditch, Lu and Catlin, using path integral and perturbation theory. The physics interpretation of this result is as an expansion of the projector of wavefunctions on the lowest Landau level, in the special case that the magnetic field is proportional to the Kahler form. This is a geometric expansion, somewhat similar to the DeWitt-Seeley-Gilkey short time expansion for the heat kernel, but in this case describing the long time limit, without depending on supersymmetry. We also generalize this expansion to supersymmetric quantum mechanics and more general magnetic fields, and explore its applications. These include the quantum Hall effect in curved space, the balanced metrics and Kahler gravity. In particular, we conjecture that for a probe in a BPS black hole in type II strings compactified on Calabi-Yau manifolds, the moduli space metric is the balanced metric.

  9. Delimiting Areas of Endemism through Kernel Interpolation

    PubMed Central

    Oliveira, Ubirajara; Brescovit, Antonio D.; Santos, Adalberto J.

    2015-01-01

    We propose a new approach for identification of areas of endemism, the Geographical Interpolation of Endemism (GIE), based on kernel spatial interpolation. This method differs from others in being independent of grid cells. This new approach is based on estimating the overlap between the distribution of species through a kernel interpolation of centroids of species distribution and areas of influence defined from the distance between the centroid and the farthest point of occurrence of each species. We used this method to delimit areas of endemism of spiders from Brazil. To assess the effectiveness of GIE, we analyzed the same data using Parsimony Analysis of Endemism and NDM and compared the areas identified through each method. The analyses using GIE identified 101 areas of endemism of spiders in Brazil. GIE proved effective in identifying areas of endemism at multiple scales, with fuzzy edges and supported by more synendemic species than the other methods. The areas of endemism identified with GIE were generally congruent with those identified for other taxonomic groups, suggesting that common processes can be responsible for the origin and maintenance of these biogeographic units. PMID:25611971

  10. Pareto-path multitask multiple kernel learning.

    PubMed

    Li, Cong; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2015-01-01

    A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with (partially) shared kernel function, which allows information sharing among the tasks. We point out that the obtained solution corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem, which considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this last observation and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of objective functions as a special case. Using the algorithms we derived, we demonstrate through a series of experimental results that the framework is capable of achieving a better classification performance, when compared with other similar MTL approaches. PMID:25532155

  11. Scientific Computing Kernels on the Cell Processor

    SciTech Connect

    Williams, Samuel W.; Shalf, John; Oliker, Leonid; Kamil, Shoaib; Husbands, Parry; Yelick, Katherine

    2007-04-04

    The slowing pace of commodity microprocessor performance improvements combined with ever-increasing chip power demands has become of utmost concern to computational scientists. As a result, the high performance computing community is examining alternative architectures that address the limitations of modern cache-based designs. In this work, we examine the potential of using the recently-released STI Cell processor as a building block for future high-end computing systems. Our work contains several novel contributions. First, we introduce a performance model for Cell and apply it to several key scientific computing kernels: dense matrix multiply, sparse matrix vector multiply, stencil computations, and 1D/2D FFTs. The difficulty of programming Cell, which requires assembly level intrinsics for the best performance, makes this model useful as an initial step in algorithm design and evaluation. Next, we validate the accuracy of our model by comparing results against published hardware results, as well as our own implementations on a 3.2GHz Cell blade. Additionally, we compare Cell performance to benchmarks run on leading superscalar (AMD Opteron), VLIW (Intel Itanium2), and vector (Cray X1E) architectures. Our work also explores several different mappings of the kernels and demonstrates a simple and effective programming model for Cell's unique architecture. Finally, we propose modest microarchitectural modifications that could significantly increase the efficiency of double-precision calculations. Overall results demonstrate the tremendous potential of the Cell architecture for scientific computations in terms of both raw performance and power efficiency.

  12. Stable Local Volatility Calibration Using Kernel Splines

    NASA Astrophysics Data System (ADS)

    Coleman, Thomas F.; Li, Yuying; Wang, Cheng

    2010-09-01

    We propose an optimization formulation using L1 norm to ensure accuracy and stability in calibrating a local volatility function for option pricing. Using a regularization parameter, the proposed objective function balances the calibration accuracy with the model complexity. Motivated by the support vector machine learning, the unknown local volatility function is represented by a kernel function generating splines and the model complexity is controlled by minimizing the 1-norm of the kernel coefficient vector. In the context of the support vector regression for function estimation based on a finite set of observations, this corresponds to minimizing the number of support vectors for predictability. We illustrate the ability of the proposed approach to reconstruct the local volatility function in a synthetic market. In addition, based on S&P 500 market index option data, we demonstrate that the calibrated local volatility surface is simple and resembles the observed implied volatility surface in shape. Stability is illustrated by calibrating local volatility functions using market option data from different dates.

  13. Transcriptome analysis of Ginkgo biloba kernels

    PubMed Central

    He, Bing; Gu, Yincong; Xu, Meng; Wang, Jianwen; Cao, Fuliang; Xu, Li-an

    2015-01-01

    Ginkgo biloba is a dioecious species native to China with medicinally and phylogenetically important characteristics; however, genomic resources for this species are limited. In this study, we performed the first transcriptome sequencing for Ginkgo kernels at five time points using Illumina paired-end sequencing. Approximately 25.08-Gb clean reads were obtained, and 68,547 unigenes with an average length of 870 bp were generated by de novo assembly. Of these unigenes, 29,987 (43.74%) were annotated in publicly available plant protein database. A total of 3,869 genes were identified as significantly differentially expressed, and enrichment analysis was conducted at different time points. Furthermore, metabolic pathway analysis revealed that 66 unigenes were responsible for terpenoid backbone biosynthesis, with up to 12 up-regulated unigenes involved in the biosynthesis of ginkgolide and bilobalide. Differential gene expression analysis together with real-time PCR experiments indicated that the synthesis of bilobalide may have interfered with the ginkgolide synthesis process in the kernel. These data can remarkably expand the existing transcriptome resources of Ginkgo, and provide a valuable platform to reveal more on developmental and metabolic mechanisms of this species. PMID:26500663

  14. Analysis of maize (Zea mays) kernel density and volume using micro-computed tomography and single-kernel near infrared spectroscopy

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Maize kernel density impacts milling quality of the grain due to kernel hardness. Harder kernels are correlated with higher test weight and are more resistant to breakage during harvest and transport. Softer kernels, in addition to being susceptible to mechanical damage, are also prone to pathogen ...

  15. SVM-based CAD system for early detection of the Alzheimer's disease using kernel PCA and LDA.

    PubMed

    López, M M; Ramírez, J; Górriz, J M; Alvarez, I; Salas-Gonzalez, D; Segovia, F; Chaves, R

    2009-10-30

    Single-photon emission tomography (SPECT) imaging has been widely used to guide clinicians in the early Alzheimer's disease (AD) diagnosis challenge. However, AD detection still relies on subjective steps carried out by clinicians, which introduce a degree of subjectivity into the final diagnosis. In this work, kernel principal component analysis (PCA) and linear discriminant analysis (LDA) are applied to functional images as dimension reduction and feature extraction techniques, which are subsequently used to train a supervised support vector machine (SVM) classifier. The complete methodology provides a kernel-based computer-aided diagnosis (CAD) system capable of distinguishing AD from normal subjects with a 92.31% accuracy rate for a SPECT database consisting of 91 patients. The proposed methodology outperforms the voxels-as-features (VAF) baseline approach, which yields 80.22% for the same SPECT database.
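
    The processing chain described above (kernel PCA for dimension reduction, LDA for feature extraction, then a supervised SVM) can be prototyped with scikit-learn roughly as follows; the synthetic data, kernel choice and parameter values are placeholders rather than the settings used in the study.

      # Sketch of a kernel-PCA -> LDA -> SVM pipeline (all parameters illustrative).
      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.decomposition import KernelPCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(91, 500))    # stand-in for voxel features of 91 SPECT scans
      y = rng.integers(0, 2, size=91)   # 0 = normal, 1 = AD (synthetic labels)

      cad = make_pipeline(
          KernelPCA(n_components=20, kernel="rbf", gamma=1e-3),  # nonlinear dimension reduction
          LinearDiscriminantAnalysis(),                          # discriminative feature extraction
          SVC(kernel="linear"),                                  # supervised classifier
      )
      print("mean CV accuracy:", cross_val_score(cad, X, y, cv=5).mean())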

  16. Dynamics of temporal discrimination.

    PubMed

    Guilhardi, Paulo; Church, Russell M

    2005-11-01

    The purpose of this research was to describe and explain the acquisition of temporal discriminations, transitions from one temporal interval to another, and asymptotic performance of stimulus and temporal discriminations. Rats were trained on a multiple cued interval (MCI) procedure with a head entry response on three signaled fixed-interval schedules of reinforcement (30, 60, and 120 sec). They readily learned the three temporal discriminations, whether they were presented simultaneously or successively, and they rapidly adjusted their performance to new intervals when the intermediate interval was varied daily. Although exponential functions provided good descriptions of many measures of temporal discrimination, different parameter values were required for each measure. The addition of a linear operator to a packet theory of timing with a single set of parameters provided a quantitative process model that fit many measures of the dynamics of temporal discrimination.

  17. Discrimination of raw and processed Dipsacus asperoides by near infrared spectroscopy combined with least squares-support vector machine and random forests

    NASA Astrophysics Data System (ADS)

    Xin, Ni; Gu, Xiao-Feng; Wu, Hao; Hu, Yu-Zhu; Yang, Zhong-Lin

    2012-04-01

    Most herbal medicines could be processed to fulfill the different requirements of therapy. The purpose of this study was to discriminate between raw and processed Dipsacus asperoides, a common traditional Chinese medicine, based on their near infrared (NIR) spectra. Least squares-support vector machine (LS-SVM) and random forests (RF) were employed for full-spectrum classification. Three types of kernels (linear, polynomial and radial basis function (RBF)) were evaluated to optimize the LS-SVM model. For comparison, a linear discriminant analysis (LDA) model was performed for classification, and the successive projections algorithm (SPA) was executed prior to building an LDA model to choose an appropriate subset of wavelengths. The three methods were applied to a dataset containing 40 raw herbs and 40 corresponding processed herbs. We ran 50 runs of 10-fold cross validation to evaluate the model's efficiency. The performance of the LS-SVM with the RBF kernel (RBF LS-SVM) was better than with the other two kernels. The RF, RBF LS-SVM and SPA-LDA successfully classified all test samples. The mean error rates for the 50 runs of 10-fold cross validation were 1.35% for RBF LS-SVM, 2.87% for RF, and 2.50% for SPA-LDA. The best classification results were obtained by using LS-SVM with RBF kernel, while RF was fast in training and making predictions.
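
    scikit-learn has no LS-SVM implementation, so the comparison logic is sketched below with an ordinary RBF-kernel SVM standing in for the RBF LS-SVM, alongside a random forest; the synthetic spectra and all parameter values are placeholders.

      # Sketch: RBF-kernel SVM (stand-in for LS-SVM) vs. random forest on synthetic spectra.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      spectra = rng.normal(size=(80, 1500))   # 80 NIR spectra; wavelength count is arbitrary
      labels = np.repeat([0, 1], 40)          # 0 = raw herb, 1 = processed herb

      models = [("RBF SVM", SVC(kernel="rbf", C=10.0, gamma="scale")),
                ("random forest", RandomForestClassifier(n_estimators=500, random_state=1))]
      for name, model in models:
          acc = cross_val_score(model, spectra, labels, cv=10).mean()
          print(f"{name}: mean 10-fold CV accuracy = {acc:.3f}")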

  18. Discriminative sparse subspace learning and its application to unsupervised feature selection.

    PubMed

    Zhou, Nan; Cheng, Hong; Pedrycz, Witold; Zhang, Yong; Liu, Huaping

    2016-03-01

    In order to efficiently use the intrinsic data information, in this study a Discriminative Sparse Subspace Learning (DSSL) model has been investigated for unsupervised feature selection. First, the feature selection problem is formulated as a subspace learning problem. In order to efficiently learn the discriminative subspace, we investigate the discriminative information in the subspace learning process. Second, a two-step TDSSL algorithm and a joint modeling JDSSL algorithm are developed to incorporate the clusters' assignment as the discriminative information. Then, a convergence analysis of these two algorithms is provided. A kernelized discriminative sparse subspace learning (KDSSL) method is proposed to handle the nonlinear subspace learning problem. Finally, extensive experiments are conducted on real-world datasets to show the superiority of the proposed approaches over several state-of-the-art approaches. PMID:26803552

  19. Comparison of Kernel Equating and Item Response Theory Equating Methods

    ERIC Educational Resources Information Center

    Meng, Yu

    2012-01-01

    The kernel method of test equating is a unified approach to test equating with some advantages over traditional equating methods. Therefore, it is important to evaluate in a comprehensive way the usefulness and appropriateness of the Kernel equating (KE) method, as well as its advantages and disadvantages compared with several popular item…

  1. Evidence-based kernels: fundamental units of behavioral influence.

    PubMed

    Embry, Dennis D; Biglan, Anthony

    2008-09-01

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior-influence procedure shown through experimental analysis to affect a specific behavior and that is indivisible in the sense that removing any of its components would render it inert. Existing evidence shows that a variety of kernels can influence behavior in context, and some evidence suggests that frequent use or sufficient use of some kernels may produce longer lasting behavioral shifts. The analysis of kernels could contribute to an empirically based theory of behavioral influence, augment existing prevention or treatment efforts, facilitate the dissemination of effective prevention and treatment practices, clarify the active ingredients in existing interventions, and contribute to efficiently developing interventions that are more effective. Kernels involve one or more of the following mechanisms of behavior influence: reinforcement, altering antecedents, changing verbal relational responding, or changing physiological states directly. The paper describes 52 of these kernels, and details practical, theoretical, and research implications, including calling for a national database of kernels that influence human behavior.

  2. Evidence-based Kernels: Fundamental Units of Behavioral Influence

    PubMed Central

    Biglan, Anthony

    2008-01-01

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior–influence procedure shown through experimental analysis to affect a specific behavior and that is indivisible in the sense that removing any of its components would render it inert. Existing evidence shows that a variety of kernels can influence behavior in context, and some evidence suggests that frequent use or sufficient use of some kernels may produce longer lasting behavioral shifts. The analysis of kernels could contribute to an empirically based theory of behavioral influence, augment existing prevention or treatment efforts, facilitate the dissemination of effective prevention and treatment practices, clarify the active ingredients in existing interventions, and contribute to efficiently developing interventions that are more effective. Kernels involve one or more of the following mechanisms of behavior influence: reinforcement, altering antecedents, changing verbal relational responding, or changing physiological states directly. The paper describes 52 of these kernels, and details practical, theoretical, and research implications, including calling for a national database of kernels that influence human behavior. PMID:18712600

  3. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946... Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of...

  4. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946... Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of...

  5. Evidence-Based Kernels: Fundamental Units of Behavioral Influence

    ERIC Educational Resources Information Center

    Embry, Dennis D.; Biglan, Anthony

    2008-01-01

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior-influence procedure shown through experimental analysis to affect a specific behavior and that is indivisible in the sense that removing any of…

  6. Optimal Bandwidth Selection in Observed-Score Kernel Equating

    ERIC Educational Resources Information Center

    Häggström, Jenny; Wiberg, Marie

    2014-01-01

    The selection of bandwidth in kernel equating is important because it has a direct impact on the equated test scores. The aim of this article is to examine the use of double smoothing when selecting bandwidths in kernel equating and to compare double smoothing with the commonly used penalty method. This comparison was made using both an equivalent…

  7. Sugar uptake into kernels of tunicate tassel-seed maize

    SciTech Connect

    Thomas, P.A.; Felker, F.C.; Crawford, C.G. )

    1990-05-01

    A maize (Zea mays L.) strain expressing both the tassel-seed (Ts-5) and tunicate (Tu) characters was developed which produces glume-covered kernels on the tassel, often borne on 7-10 mm pedicels. Vigorous plants produce up to 100 such kernels interspersed with additional sessile kernels. This floral unit provides a potentially valuable experimental system for studying sugar uptake into developing maize seeds. When detached kernels (with glumes and pedicel intact) are placed in incubation solution, fluid flows up the pedicel and into the glumes, entering the pedicel apoplast near the kernel base. The unusual anatomical features of this maize strain permit experimental access to the pedicel apoplast with much less possibility of kernel base tissue damage than with kernels excised from the cob. (14C)Fructose incorporation into soluble and insoluble fractions of endosperm increased for 8 days. Endosperm uptake of sucrose, fructose, and D-glucose was significantly greater than that of L-glucose. Fructose uptake was significantly inhibited by CCCP, DNP, and PCMBS. These results suggest the presence of an active, non-diffusion component of sugar transport in maize kernels.

  8. Introduction to Kernel Methods: Classification of Multivariate Data

    NASA Astrophysics Data System (ADS)

    Fauvel, M.

    2016-05-01

    In this chapter, kernel methods are presented for the classification of multivariate data. An introductory example is given to illustrate the main idea of kernel methods. Emphasis is then placed on the Support Vector Machine. Structural risk minimization is presented, and linear and non-linear SVMs are described. Finally, a full example of SVM classification is given on simulated hyperspectral data.
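
    The main idea referred to here, replacing inner products with a kernel so that a linear classifier operates implicitly in a nonlinear feature space, can be made concrete in a few lines; the RBF bandwidth and the toy data below are arbitrary choices for illustration.

      # Sketch: the kernel trick on toy data - an RBF Gram matrix fed to a precomputed-kernel SVM.
      import numpy as np
      from sklearn.svm import SVC

      def rbf_kernel_matrix(A, B, gamma=0.5):
          """K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
          sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * sq)

      rng = np.random.default_rng(2)
      X = rng.normal(size=(60, 4))
      y = (np.linalg.norm(X, axis=1) > 1.8).astype(int)   # a nonlinearly separable toy label

      clf = SVC(kernel="precomputed").fit(rbf_kernel_matrix(X, X), y)
      print("training accuracy:", clf.score(rbf_kernel_matrix(X, X), y))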

  9. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  10. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  11. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AGREEMENTS AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  12. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AGREEMENTS AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  13. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE ALMONDS GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which...

  14. Accumulation of storage products in oat during kernel development.

    PubMed

    Banaś, A; Dahlqvist, A; Debski, H; Gummeson, P O; Stymne, S

    2000-12-01

    Lipids, proteins and starch are the main storage products in oat seeds. As a first step in elucidating the regulatory mechanisms behind the deposition of these compounds, two different oat varieties, 'Freja' and 'Matilda', were analysed during kernel development. In both cultivars, the majority of the lipids accumulated at very early stage of development but Matilda accumulated about twice the amount of lipids compared to Freja. Accumulation of proteins and starch started also in the early stage of kernel development but, in contrast to lipids, continued over a considerably longer period. The high-oil variety Matilda also accumulated higher amounts of proteins than Freja. The starch content in Freja kernels was higher than in Matilda kernels and the difference was most pronounced during the early stage of development when oil synthesis was most active. Oleosin accumulation continued during the whole period of kernel development.

  15. Anatomically-aided PET reconstruction using the kernel method

    NASA Astrophysics Data System (ADS)

    Hutchcroft, Will; Wang, Guobao; Chen, Kevin T.; Catana, Ciprian; Qi, Jinyi

    2016-09-01

    This paper extends the kernel method that was proposed previously for dynamic PET reconstruction, to incorporate anatomical side information into the PET reconstruction model. In contrast to existing methods that incorporate anatomical information using a penalized likelihood framework, the proposed method incorporates this information in the simpler maximum likelihood (ML) formulation and is amenable to ordered subsets. The new method also does not require any segmentation of the anatomical image to obtain edge information. We compare the kernel method with the Bowsher method for anatomically-aided PET image reconstruction through a simulated data set. Computer simulations demonstrate that the kernel method offers advantages over the Bowsher method in region of interest quantification. Additionally the kernel method is applied to a 3D patient data set. The kernel method results in reduced noise at a matched contrast level compared with the conventional ML expectation maximization algorithm.
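
    A heavily simplified sketch of the kernelized reconstruction idea, with the image represented as x = K * alpha (K built from anatomical features) and ML-EM updates applied to the coefficients alpha, is given below; the tiny system matrix, kernel and simulated data are invented for illustration and are not the reconstruction code used in the paper.

      # Sketch: kernelized ML-EM with x = K @ alpha and EM updates on alpha (toy dimensions).
      import numpy as np

      rng = np.random.default_rng(3)
      n_pix, n_bins = 16, 32
      A = rng.uniform(0.0, 1.0, size=(n_bins, n_pix))   # toy projection (system) matrix
      feat = rng.normal(size=(n_pix, 3))                # anatomical feature vector per pixel
      d2 = ((feat[:, None, :] - feat[None, :, :]) ** 2).sum(-1)
      K = np.exp(-d2 / 2.0)                             # Gaussian kernel matrix from features

      x_true = rng.uniform(1.0, 5.0, size=n_pix)
      y = rng.poisson(A @ x_true)                       # simulated noisy projection data

      AK = A @ K
      alpha = np.ones(n_pix)
      sens = AK.T @ np.ones(n_bins)                     # sensitivity term (AK)^T 1
      for _ in range(200):                              # EM iterations on the kernel coefficients
          ybar = AK @ alpha + 1e-12
          alpha *= (AK.T @ (y / ybar)) / sens
      x_hat = K @ alpha                                 # final kernelized image estimate
      print("toy reconstruction RMSE:", np.sqrt(np.mean((x_hat - x_true) ** 2)))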

  16. Direct Measurement of Wave Kernels in Time-Distance Helioseismology

    NASA Technical Reports Server (NTRS)

    Duvall, T. L., Jr.

    2006-01-01

    Solar f-mode waves are surface-gravity waves which propagate horizontally in a thin layer near the photosphere with a dispersion relation approximately that of deep water waves. At the power maximum near 3 mHz, the wavelength of 5 Mm is large enough for various wave scattering properties to be observable. Gizon and Birch (2002, ApJ, 571, 966) have calculated kernels, in the Born approximation, for the sensitivity of wave travel times to local changes in damping rate and source strength. In this work, using isolated small magnetic features as approximate point-source scatterers, such a kernel has been measured. The observed kernel contains similar features to a theoretical damping kernel but not for a source kernel. A full understanding of the effect of small magnetic features on the waves will require more detailed modeling.

  17. OSKI: A Library of Automatically Tuned Sparse Matrix Kernels

    SciTech Connect

    Vuduc, R; Demmel, J W; Yelick, K A

    2005-07-19

    The Optimized Sparse Kernel Interface (OSKI) is a collection of low-level primitives that provide automatically tuned computational kernels on sparse matrices, for use by solver libraries and applications. These kernels include sparse matrix-vector multiply and sparse triangular solve, among others. The primary aim of this interface is to hide the complex decision-making process needed to tune the performance of a kernel implementation for a particular user's sparse matrix and machine, while also exposing the steps and potentially non-trivial costs of tuning at run-time. This paper provides an overview of OSKI, which is based on our research on automatically tuned sparse kernels for modern cache-based superscalar machines.
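
    For readers unfamiliar with the kernels OSKI tunes, the reference (untuned) form of one of them, sparse matrix-vector multiply in compressed sparse row (CSR) format, looks roughly like the sketch below; OSKI's contribution is selecting tuned data structures and code variants for such loops, not this naive version.

      # Sketch: naive CSR sparse matrix-vector multiply, the kind of kernel OSKI tunes.
      import numpy as np

      def csr_spmv(indptr, indices, data, x):
          """y = A @ x for a matrix stored as CSR arrays (indptr, indices, data)."""
          y = np.zeros(len(indptr) - 1)
          for i in range(len(y)):
              for k in range(indptr[i], indptr[i + 1]):
                  y[i] += data[k] * x[indices[k]]
          return y

      # 3x3 example:  [[2, 0, 1], [0, 3, 0], [4, 0, 5]]
      indptr = np.array([0, 2, 3, 5])
      indices = np.array([0, 2, 1, 0, 2])
      data = np.array([2.0, 1.0, 3.0, 4.0, 5.0])
      print(csr_spmv(indptr, indices, data, np.array([1.0, 1.0, 1.0])))   # -> [3. 3. 9.]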

  18. Optimal discrimination index and discrimination efficiency for essay questions.

    PubMed

    Chan, Wing-shing

    2014-01-01

    Recommended guidelines for the discrimination index of multiple-choice questions are often applied indiscriminately to essay-type questions as well. The optimal discrimination index under a normality condition is derived independently for essay questions. The satisfactory region for the discrimination index of essay questions with a passing mark at 50% of the total is between 0.12 and 0.31, rather than 0.40 or more as in the case of multiple-choice questions. The optimal discrimination index for essay questions is shown to increase in proportion to the range of scores. Discrimination efficiency is defined as the ratio of the observed discrimination index to the optimal discrimination index. Recommended guidelines for the discrimination index of essay questions are provided.

  19. Quadratic negative evidence discrimination

    SciTech Connect

    Anderson, D.N.; Redgate, T.; Anderson, K.K.; Rohay, A.C.; Ryan, F.M.

    1997-05-01

    This paper develops regional discrimination methods which use information inherent in phase magnitudes that are unmeasurable due to small amplitudes and/or high noise levels. The methods are enhancements of previously proposed teleseismic techniques, extended here to regional discrimination. Events observed at teleseismic distances are effectively identified with the Ms vs mb discriminant because relative to the pressure wave energy (mb) of an event, an earthquake generates more shear wave energy (Ms) than does an explosion. For some teleseismic events, the Ms magnitude is difficult to measure and is known only to be below a threshold. With Ms unmeasurable, the Ms vs mb discriminant cannot be formed. However, if the Ms is sufficiently small relative to a measured mb, then the event is still likely to be an explosion. The methods presented in this report are developed for a single seismic station, and make use of empirical evidence in the regional Lg vs pg discriminant. The Lg vs pg discriminant is analogous to the teleseismic Ms vs mb discriminant.

  20. Frequency discriminator/phase detector

    NASA Technical Reports Server (NTRS)

    Crow, R. B.

    1974-01-01

    Circuit provides dual function of frequency discriminator/phase detector which reduces frequency acquisition time without adding to circuit complexity. Both frequency discriminators, in evaluated frequency discriminator/phase detector circuits, are effective two decades above and below center frequency.

  1. A visualization tool for the kernel-driven model with improved ability in data analysis and kernel assessment

    NASA Astrophysics Data System (ADS)

    Dong, Yadong; Jiao, Ziti; Zhang, Hu; Bai, Dongni; Zhang, Xiaoning; Li, Yang; He, Dandan

    2016-10-01

    The semi-empirical, kernel-driven Bidirectional Reflectance Distribution Function (BRDF) model has been widely used for many aspects of remote sensing. With the development of the kernel-driven model, there is a need to further assess the performance of newly developed kernels. The use of visualization tools can facilitate the analysis of model results and the assessment of newly developed kernels. However, the current version of the kernel-driven model does not contain a visualization function. In this study, a user-friendly visualization tool, named MaKeMAT, was developed specifically for the kernel-driven model. The POLDER-3 and CAR BRDF datasets were used to demonstrate the applicability of MaKeMAT. The visualization of inputted multi-angle measurements enhances understanding of multi-angle measurements and allows the choice of measurements with good representativeness. The visualization of modeling results facilitates the assessment of newly developed kernels. The study shows that the visualization tool MaKeMAT can promote the widespread application of the kernel-driven model.

  2. Tectonic discrimination diagrams revisited

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter

    2006-06-01

    The decision boundaries of most tectonic discrimination diagrams are drawn by eye. Discriminant analysis is a statistically more rigorous way to determine the tectonic affinity of oceanic basalts based on their bulk-rock chemistry. This method was applied to a database of 756 oceanic basalts of known tectonic affinity (ocean island, mid-ocean ridge, or island arc). For each of these training data, up to 45 major, minor, and trace elements were measured. Discriminant analysis assumes multivariate normality. If the same covariance structure is shared by all the classes (i.e., tectonic affinities), the decision boundaries are linear, hence the term linear discriminant analysis (LDA). In contrast with this, quadratic discriminant analysis (QDA) allows the classes to have different covariance structures. To solve the statistical problems associated with the constant-sum constraint of geochemical data, the training data must be transformed to log-ratio space before performing a discriminant analysis. The results can be mapped back to the compositional data space using the inverse log-ratio transformation. An exhaustive exploration of 14,190 possible ternary discrimination diagrams yields the Ti-Si-Sr system as the best linear discrimination diagram and the Na-Nb-Sr system as the best quadratic discrimination diagram. The best linear and quadratic discrimination diagrams using only immobile elements are Ti-V-Sc and Ti-V-Sm, respectively. As little as 5% of the training data are misclassified by these discrimination diagrams. Testing them on a second database of 182 samples that were not part of the training data yields a more reliable estimate of future performance. Although QDA misclassifies fewer training data than LDA, the opposite is generally true for the test data. Therefore LDA is a cruder but more robust classifier than QDA. Another advantage of LDA is that it provides a powerful way to reduce the dimensionality of the multivariate geochemical data in a similar
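
    The workflow in this abstract, log-ratio transforming the compositions and then fitting linear and quadratic discriminant classifiers, can be sketched as follows; the three-element compositions are fabricated stand-ins, whereas the actual study used up to 45 elements.

      # Sketch: additive log-ratio transform followed by LDA and QDA on toy compositions.
      import numpy as np
      from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                                 QuadraticDiscriminantAnalysis)

      def alr(comp):
          """Additive log-ratio transform: log of each part divided by the last part."""
          comp = np.asarray(comp, dtype=float)
          return np.log(comp[:, :-1] / comp[:, -1:])

      rng = np.random.default_rng(4)
      raw = np.abs(rng.normal(loc=[[1.0, 5.0, 0.2]], scale=0.3, size=(150, 3))) + 0.01
      labels = rng.integers(0, 3, size=150)     # 0 = OIB, 1 = MORB, 2 = IAB (fabricated)
      raw[labels == 1, 0] *= 0.5                # crude class-dependent shifts, illustration only
      raw[labels == 2, 2] *= 2.0

      X = alr(raw / raw.sum(axis=1, keepdims=True))   # close compositions to unit sum, then log-ratio
      for model in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
          print(type(model).__name__, model.fit(X, labels).score(X, labels))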

  3. Landscape-scale habitat selection by fishers translocated to the Olympic Peninsula of Washington

    USGS Publications Warehouse

    Lewis, Jeffrey C.; Jenkins, Kurt J.; Happe, Patricia J.; Manson, David J.; McCalmon, Marc

    2016-01-01

    The fisher was extirpated from much of the Pacific Northwestern United States during the mid- to late-1900s and is now proposed for federal listing as a threatened species in all or part of its west coast range. Following the translocation of 90 fishers from central British Columbia, Canada, to the Olympic Peninsula of Washington State from 2008 to 2010, we investigated the landscape-scale habitat selection of reintroduced fishers across a broad range of forest ages and disturbance histories, providing the first information on habitat relationships of newly reintroduced fishers in coastal coniferous forests in the Pacific Northwest. We developed 17 a priori models to evaluate several habitat-selection hypotheses based on premises of habitat models used to forecast habitat suitability for the reintroduced population. Further, we hypothesized that female fishers, because of their smaller body size than males, greater vulnerability to predation, and specific reproductive requirements, would be more selective than males for mid- to late-seral forest communities, where complex forest structural elements provide secure foraging, resting, and denning sites. We assessed 11 forest structure and landscape characteristics within the home range core-areas used by 19 females and 12 males and within randomly placed pseudo core areas that represented available habitats. We used case-controlled logistic regression to compare the characteristics of used and pseudo core areas and to assess selection by male and female fishers. Females were more selective of core area placement than males. Fifteen of 19 females (79%) and 5 of 12 males (42%) selected core areas within federal lands that encompassed primarily forests with an overstory of mid-sized or large trees. Male fishers exhibited only weak selection for core areas dominated by forests with an overstory of small trees, primarily on land managed for timber production or at high elevations. The amount of natural open area best

  4. Conservation of the Eastern Taiwan Strait Chinese White Dolphin (Sousa chinensis): Fishers' Perspectives and Management Implications.

    PubMed

    Liu, Ta-Kang; Wang, Yu-Cheng; Chuang, Laurence Zsu-Hsin; Chen, Chih-How

    2016-01-01

    The abundance of the eastern Taiwan Strait (ETS) population of the Chinese white dolphin (Sousa chinensis) has been estimated to be less than 100 individuals. It is categorized as critically endangered in the IUCN Red List of Threatened Species. Thus, immediate measures of conservation should be taken to protect it from extinction. Currently, the Taiwanese government plans to designate its habitat as a Major Wildlife Habitat (MWH), a type of marine protected area (MPA) for conservation of wildlife species. Although the designation allows current exploitation to continue, it may cause conflicts among multiple stakeholders with competing interests. This study explores stakeholders' attitudes and opinions in order to better manage the MPA. It employs a semi-structured interview and a questionnaire survey of local fishers. Results from interviews indicated that the subsistence of fishers remains a major problem. It was found that stakeholders have different perceptions of the fishers' attitude towards conservation and also thought that fishery-related law enforcement could be difficult. The quantitative survey showed that fishers are generally positive towards the conservation of the Chinese white dolphin but are less willing to participate in the planning process. Most fishers considered temporary fishing closures feasible for conservation. The results of this study provide recommendations for future efforts towards the goal of better conservation for this endangered species.

  5. Müllerian mimicry: an examination of Fisher's theory of gradual evolutionary change.

    PubMed

    Balogh, Alexandra C V; Leimar, Olof

    2005-11-01

    In 1927, Fisher suggested that Müllerian mimicry evolution could be gradual and driven by predator generalization. A competing possibility is the so-called two-step hypothesis, entailing that Müllerian mimicry evolves through major mutational leaps of a less-protected species towards a better-protected, which sets the stage for coevolutionary fine-tuning of mimicry. At present, this hypothesis seems to be more widely accepted than Fisher's suggestion. We conducted individual-based simulations of communities with predators and two prey types to assess the possibility of Fisher's process leading to a common prey appearance. We found that Fisher's process worked for initially relatively similar appearances. Moreover, by introducing a predator spectrum consisting of several predator types with different ranges of generalization, we found that gradual evolution towards mimicry occurred also for large initial differences in prey appearance. We suggest that Fisher's process together with a predator spectrum is a realistic alternative to the two-step hypothesis and, furthermore, it has fewer problems with purifying selection. We also examined the factors influencing gradual evolution towards mimicry and found that not only the relative benefits from mimicry but also the mutational schemes of the prey types matter.

  6. Privacy preserving RBF kernel support vector machine.

    PubMed

    Li, Haoran; Xiong, Li; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2014-01-01

    Data sharing is challenging but important for healthcare research. Methods for privacy-preserving data dissemination based on the rigorous differential privacy standard have been developed but they did not consider the characteristics of biomedical data and make full use of the available information. This often results in too much noise in the final outputs. We hypothesized that this situation can be alleviated by leveraging a small portion of open-consented data to improve utility without sacrificing privacy. We developed a hybrid privacy-preserving differentially private support vector machine (SVM) model that uses public data and private data together. Our model leverages the RBF kernel and can handle nonlinearly separable cases. Experiments showed that this approach outperforms two baselines: (1) SVMs that only use public data, and (2) differentially private SVMs that are built from private data. Our method demonstrated very close performance metrics compared to nonprivate SVMs trained on the private data. PMID:25013805

  7. Point-Kernel Shielding Code System.

    1982-02-17

    Version 00 QAD-BSA is a three-dimensional, point-kernel shielding code system based upon the CCC-48/QAD series. It is designed to calculate photon dose rates and heating rates using exponential attenuation and infinite medium buildup factors. Calculational provisions include estimates of fast neutron penetration using data computed by the moments method. Included geometry routines can describe complicated source and shield geometries. An internal library contains data for many frequently used structural and shielding materials, enabling the code to solve most problems with only source strengths and problem geometry required as input. This code system adapts especially well to problems requiring multiple sources and sources with asymmetrical geometry. In addition to being edited separately, the total interaction rates from many sources may be edited at each detector point. Calculated photon interaction rates agree closely with those obtained using QAD-P5A.
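
    The core point-kernel calculation described here, exponential attenuation multiplied by an infinite-medium buildup factor and summed over point sources, reduces to a short formula; the sketch below uses a Taylor-form buildup factor with invented coefficients purely to show the structure, and is in no way QAD-BSA or its internal data library.

      # Sketch: point-kernel photon flux = exponential attenuation x buildup, summed over sources.
      import numpy as np

      def taylor_buildup(mu_r, a=11.0, alpha1=-0.10, alpha2=0.03):
          """Taylor-form buildup factor B(mu*r); coefficients here are invented, not library data."""
          return a * np.exp(-alpha1 * mu_r) + (1.0 - a) * np.exp(-alpha2 * mu_r)

      def point_kernel_flux(source_strength, mu, r):
          """Buildup-corrected flux at distance r (cm) from an isotropic point source."""
          mu_r = mu * r
          uncollided = source_strength * np.exp(-mu_r) / (4.0 * np.pi * r ** 2)
          return taylor_buildup(mu_r) * uncollided

      sources = [(1.0e9, 50.0), (5.0e8, 120.0)]   # hypothetical (photons/s, distance in cm) pairs
      mu = 0.06                                   # illustrative attenuation coefficient, 1/cm
      total = sum(point_kernel_flux(s, mu, r) for s, r in sources)
      print(f"total buildup-corrected flux: {total:.3e} photons/cm^2/s")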

  8. Kernel density estimation using graphical processing unit

    NASA Astrophysics Data System (ADS)

    Sunarko, Su'ud, Zaki

    2015-09-01

    Kernel density estimation for particles distributed over a 2-dimensional space is calculated using a single graphical processing unit (GTX 660Ti GPU) and CUDA-C language. Parallel calculations are done for particles having bivariate normal distribution and by assigning calculations for equally-spaced node points to each scalar processor in the GPU. The numbers of particles, blocks and threads are varied to identify a favorable configuration. Comparisons are obtained by performing the same calculation using 1, 2 and 4 processors on a 3.0 GHz CPU using MPICH 2.0 routines. Speedups attained with the GPU are in the range of 88 to 349 times compared to the multiprocessor CPU. Blocks of 128 threads are found to be the optimum configuration for this case.
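
    The quantity being parallelized here is a sum of Gaussian kernels over particles, evaluated at equally spaced node points; a plain CPU/NumPy version of that computation, with an arbitrary bandwidth, might look like the following sketch.

      # Sketch: 2-D Gaussian kernel density estimate evaluated on a grid of node points (CPU/NumPy).
      import numpy as np

      def kde_2d(particles, grid_points, bandwidth=0.2):
          """Evaluate a Gaussian KDE of `particles` (n, 2) at `grid_points` (m, 2)."""
          diff = grid_points[:, None, :] - particles[None, :, :]      # (m, n, 2)
          sq_dist = (diff ** 2).sum(axis=-1)
          norm = 1.0 / (2.0 * np.pi * bandwidth ** 2 * len(particles))
          return norm * np.exp(-sq_dist / (2.0 * bandwidth ** 2)).sum(axis=1)

      rng = np.random.default_rng(5)
      particles = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 0.5]], size=2000)
      xs, ys = np.meshgrid(np.linspace(-3, 3, 32), np.linspace(-3, 3, 32))
      grid = np.column_stack([xs.ravel(), ys.ravel()])
      density = kde_2d(particles, grid)
      print("peak estimated density on the grid:", density.max())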

  9. The flare kernel in the impulsive phase

    NASA Technical Reports Server (NTRS)

    Dejager, C.

    1986-01-01

    The impulsive phase of a flare is characterized by impulsive bursts of X-ray and microwave radiation, related to impulsive footpoint heating up to 50 or 60 MK, by upward gas velocities (150 to 400 km/sec) and by a gradual increase of the flare's thermal energy content. These phenomena, as well as non-thermal effects, are all related to the impulsive energy injection into the flare. The available observations are also quantitatively consistent with a model in which energy is injected into the flare by beams of energetic electrons, causing ablation of chromospheric gas, followed by convective rise of gas. Thus, a hole is burned into the chromosphere; at the end of impulsive phase of an average flare the lower part of that hole is situated about 1800 km above the photosphere. H alpha and other optical and UV line emission is radiated by a thin layer (approx. 20 km) at the bottom of the flare kernel. The upward rising and outward streaming gas cools down by conduction in about 45 s. The non-thermal effects in the initial phase are due to curtailing of the energy distribution function by escape of energetic electrons. The single flux tube model of a flare does not fit with these observations; instead we propose the spaghetti-bundle model. Microwave and gamma-ray observations suggest the occurrence of dense flare knots of approx. 800 km diameter, and of high temperature. Future observations should concentrate on locating the microwave/gamma-ray sources, and on determining the kernel's fine structure and the related multi-loop structure of the flaring area.

  10. Equivalence of kernel machine regression and kernel distance covariance for multidimensional phenotype association studies.

    PubMed

    Hua, Wen-Yu; Ghosh, Debashis

    2015-09-01

    Associating genetic markers with a multidimensional phenotype is an important yet challenging problem. In this work, we establish the equivalence between two popular methods: kernel-machine regression (KMR) and kernel distance covariance (KDC). KMR is a semiparametric regression framework that models covariate effects parametrically and genetic markers non-parametrically, while KDC represents a class of methods that include distance covariance (DC) and Hilbert-Schmidt independence criterion (HSIC), which are nonparametric tests of independence. We show that the equivalence between the score test of KMR and the KDC statistic under certain conditions can lead to a novel generalization of the KDC test that incorporates covariates. Our contributions are 3-fold: (1) establishing the equivalence between KMR and KDC; (2) showing that the principles of KMR can be applied to the interpretation of KDC; (3) the development of a broader class of KDC statistics, where the class members are statistics corresponding to different kernel combinations. Finally, we perform simulation studies and an analysis of real data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) study. The ADNI study suggests that SNPs of FLJ16124 exhibit pairwise interaction effects that are strongly correlated to the changes of brain region volumes. PMID:25939365
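
    As a concrete illustration of the kernel-independence side of this equivalence (not the authors' code), a biased-sample HSIC statistic with Gaussian kernels can be computed in a few lines; the bandwidths, the normalization and the toy genotype/phenotype arrays below are assumptions for illustration.

      # Sketch: biased-sample HSIC with Gaussian kernels (one common normalization).
      import numpy as np

      def gaussian_gram(X, sigma=1.0):
          sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
          return np.exp(-sq / (2.0 * sigma ** 2))

      def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
          """trace(K H L H) / (n - 1)^2 with centring matrix H = I - (1/n) 11^T."""
          n = X.shape[0]
          H = np.eye(n) - np.ones((n, n)) / n
          K, L = gaussian_gram(X, sigma_x), gaussian_gram(Y, sigma_y)
          return np.trace(K @ H @ L @ H) / (n - 1) ** 2

      rng = np.random.default_rng(6)
      genotype = rng.integers(0, 3, size=(200, 5)).astype(float)    # toy SNP dosage matrix
      phenotype = genotype[:, :2] @ np.array([[0.5, 0.1], [0.2, 0.4]]) \
                  + rng.normal(scale=0.5, size=(200, 2))            # dependent 2-D phenotype
      print("HSIC (dependent):", hsic(genotype, phenotype))
      print("HSIC (shuffled): ", hsic(genotype, rng.permutation(phenotype)))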

  11. Mass discrimination during weightlessness

    NASA Technical Reports Server (NTRS)

    Ross, H.

    1981-01-01

    An experiment concerned with the ability of astronauts to discriminate between the mass of objects when both the objects and the astronauts are in weightless states is described. The main object of the experiment is to compare the threshold for weight-discrimination on Earth with that for mass-discrimination in orbit. Tests will be conducted premission and postmission and early and late during the mission while the crew is experiencing weightlessness. A comparison of early and late tests inflight and postflight will reveal the rate of adaptation to zero-gravity and 1-g. The mass discrimination box holds 24 balls which the astronaut will compare to one another in a random routine.

  12. Research on the water-inrush risk of coal floors based on Fisher-evaluation and AHP

    NASA Astrophysics Data System (ADS)

    Xu, D. J.; Wei, W. X.; Xiang, S. Y.

    2016-08-01

    There are many factors that influence floor water-inrush. Based on widely collected data on floor water-inrush in China, the evaluation factors in this paper consist of water pressure, aquifer type, aquiclude thickness, floor failure depth and fault throw. These are used to build a single Fisher evaluation model and a Fisher evaluation model weighted by the analytic hierarchy process (AHP). By comparison, with the AHP weighting values the interclass distribution of the data in the Fisher model is more concentrated than with the single Fisher evaluation method, which yields higher reliability and more extensive application value.
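
    The AHP weighting step mentioned above is commonly carried out by taking the principal eigenvector of a pairwise comparison matrix and checking its consistency ratio; the sketch below does exactly that for the five factors, using a judgement matrix invented for illustration rather than the values from the paper.

      # Sketch: AHP weights from a pairwise comparison matrix (judgements invented for illustration).
      import numpy as np

      factors = ["water pressure", "aquifer type", "aquiclude thickness",
                 "floor failure depth", "fault throw"]

      # Hypothetical reciprocal judgement matrix: A[i, j] = importance of factor i relative to j.
      A = np.array([[1,   3,   2,   4,   5],
                    [1/3, 1,   1/2, 2,   3],
                    [1/2, 2,   1,   3,   3],
                    [1/4, 1/2, 1/3, 1,   2],
                    [1/5, 1/3, 1/3, 1/2, 1]], dtype=float)

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                     # normalized priority (weight) vector

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
      cr = ci / 1.12                               # random index RI = 1.12 for n = 5 (Saaty)
      for f, w in zip(factors, weights):
          print(f"{f:22s} {w:.3f}")
      print("consistency ratio:", round(cr, 3))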

  13. Fishers' knowledge as a source of information about the estuarine dolphin (Sotalia guianensis, van Bénéden, 1864).

    PubMed

    Manzan, Maíra Fontes; Lopes, Priscila F M

    2015-01-01

    Fishers' local ecological knowledge (LEK) is an additional tool to obtain information about cetaceans, regarding their local particularities, fishing interactions, and behavior. However, this knowledge could vary in depth of detail according to the level of interaction that fishers have with a specific species. This study investigated differences in small-scale fishers' LEK regarding the estuarine dolphin (Sotalia guianensis) in three Brazilian northeast coastal communities where fishing is practiced in estuarine lagoons and/or coastal waters and where dolphin-watching tourism varies from incipient to important. The fishers (N = 116) were asked about general characteristics of S. guianensis and their interactions with this dolphin during fishing activities. Compared to lagoon fishers, coastal fishers showed greater knowledge about the species but had more negative interactions with the dolphin during fishing activities. Coastal fishing not only offered the opportunity for fishers to observe a wider variety of the dolphin's behavior, but also implied direct contact with the dolphins, as they are bycaught in coastal gillnets. Besides complementing information that could be used for the management of cetaceans, this study shows that the type of environment most used by fishers also affects the accuracy of the information they provide. When designing studies to gather information on species and/or populations with the support of fishers, special consideration should be given to local particularities such as gear and habitats used within the fishing community.

  14. Angular velocity discrimination

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.

    1990-01-01

    Three experiments designed to investigate the ability of naive observers to discriminate rotational velocities of two simultaneously viewed objects are described. Rotations are constrained to occur about the x and y axes, resulting in linear two-dimensional image trajectories. The results indicate that observers can discriminate angular velocities with a competence near that for linear velocities. However, perceived angular rate is influenced by structural aspects of the stimuli.

  17. Combining features from ERP components in single-trial EEG for discriminating four-category visual objects

    NASA Astrophysics Data System (ADS)

    Wang, Changming; Xiong, Shi; Hu, Xiaoping; Yao, Li; Zhang, Jiacai

    2012-10-01

    The category of an image containing a visual object can be successfully recognized from single-trial electroencephalography (EEG) measured while subjects view the image. Previous studies have shown that task-related information contained in event-related potential (ERP) components could discriminate two or three categories of object images. In this study, we investigated whether four categories of objects (human faces, buildings, cats and cars) could be mutually discriminated using single-trial EEG data. The EEG waveforms acquired while subjects were viewing the four categories of object images were segmented into several ERP components (P1, N1, P2a and P2b), and Fisher linear discriminant analysis (Fisher-LDA) was then used to classify EEG features extracted from the ERP components. First, we compared the classification results using features from single ERP components and identified that the N1 component achieved the highest classification accuracies. Second, we discriminated the four categories of objects using combined features from multiple ERP components, and showed that combining ERP components improved four-category classification accuracy by exploiting the complementarity of the discriminative information in the ERP components. These findings confirm that four categories of object images can be discriminated from single-trial EEG and can guide the selection of effective EEG features for classifying visual objects.
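
    The sketch below is a minimal, hypothetical illustration (not the authors' code) of the core step described above: training a Fisher linear discriminant on ERP-derived features for a four-class problem. The feature matrix X and labels y are placeholders; in practice X would hold amplitudes from the P1, N1, P2a and P2b windows.

      # Minimal sketch with placeholder ERP features; not the study's actual pipeline.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_trials, n_features = 200, 32                 # e.g., channels x ERP time windows
      X = rng.normal(size=(n_trials, n_features))    # stand-in for real ERP features
      y = rng.integers(0, 4, size=n_trials)          # four categories: faces, buildings, cats, cars

      lda = LinearDiscriminantAnalysis()             # Fisher-LDA classifier
      scores = cross_val_score(lda, X, y, cv=5)      # cross-validated four-class accuracy
      print("mean accuracy:", scores.mean())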

  18. Mean-Field Dynamics and Fisher Information in Matter Wave Interferometry.

    PubMed

    Haine, Simon A

    2016-06-10

    There has been considerable recent interest in the mean-field dynamics of various atom-interferometry schemes designed for precision sensing. In the field of quantum metrology, the standard tools for evaluating metrological sensitivity are the classical and quantum Fisher information. In this Letter, we show how these tools can be adapted to evaluate the sensitivity when the behavior is dominated by mean-field dynamics. As an example, we compare the behavior of four recent theoretical proposals for gyroscopes based on matter-wave interference in toroidally trapped geometries. We show that while the quantum Fisher information increases at different rates for the various schemes considered, in all cases it is consistent with the well-known Sagnac phase shift after the matter waves have traversed a closed path. However, we argue that the relevant metric for quantifying interferometric sensitivity is the classical Fisher information, which can vary considerably between the schemes. PMID:27341216

  19. Galaxy luminosity function and Tully-Fisher relation: reconciled through rotation-curve studies

    SciTech Connect

    Cattaneo, Andrea; Salucci, Paolo; Papastergis, Emmanouil E-mail: salucci@sissa.it

    2014-03-10

    The relation between galaxy luminosity L and halo virial velocity v_vir required to fit the galaxy luminosity function differs from the observed Tully-Fisher relation between L and disk speed v_rot. Because of this, the problem of reproducing the galaxy luminosity function and the Tully-Fisher relation simultaneously has plagued semianalytic models since their inception. Here we study the relation between v_rot and v_vir by fitting observational average rotation curves of disk galaxies binned in luminosity. We show that the v_rot-v_vir relation that we obtain in this way can fully account for this seeming inconsistency. Therefore, the reconciliation of the luminosity function with the Tully-Fisher relation rests on the complex dependence of v_rot on v_vir, which arises because the ratio of stellar mass to dark matter mass is a strong function of halo mass.

  20. Certain Fisher & Paykel infant radiant warmers require inspection and possible component replacement.

    PubMed

    2011-10-01

    In certain versions of Fisher & Paykel IW900 series infant radiant warmers, a wiring harness connector located near the heater head assembly may become damaged and overheat during routine use, potentially rendering the warmer inoperable. Although Fisher & Paykel states that the risk to patients is low, facilities with affected warmers should inspect the connectors and, if damage or discoloration is evident, contact the company to arrange for replacement of the wiring harness/connector assemblies. Fisher & Paykel intends to release an Advisory Notice to guide customers in the inspection. This problem affects 120 V and 100 V versions of the IW900 series warmers that were manufactured before October 1, 2010 (identified by serial numbers below 101001). PMID:23444537

  1. Macroscopic response to microscopic intrinsic noise in three-dimensional Fisher fronts.

    PubMed

    Nesic, S; Cuerno, R; Moro, E

    2014-10-31

    We study the dynamics of three-dimensional Fisher fronts in the presence of density fluctuations. To this end we simulate the Fisher equation subject to stochastic internal noise, and study how the front moves and roughens as a function of the number of particles in the system, N. Our results suggest that the macroscopic behavior of the system is driven by the microscopic dynamics at its leading edge where number fluctuations are dominated by rare events. Contrary to naive expectations, the strength of front fluctuations decays extremely slowly as 1/logN, inducing large-scale fluctuations which we find belong to the one-dimensional Kardar-Parisi-Zhang universality class of kinetically rough interfaces. Hence, we find that there is no weak-noise regime for Fisher fronts, even for realistic numbers of particles in macroscopic systems. PMID:25396356

  2. Mean-Field Dynamics and Fisher Information in Matter Wave Interferometry

    NASA Astrophysics Data System (ADS)

    Haine, Simon A.

    2016-06-01

    There has been considerable recent interest in the mean-field dynamics of various atom-interferometry schemes designed for precision sensing. In the field of quantum metrology, the standard tools for evaluating metrological sensitivity are the classical and quantum Fisher information. In this Letter, we show how these tools can be adapted to evaluate the sensitivity when the behavior is dominated by mean-field dynamics. As an example, we compare the behavior of four recent theoretical proposals for gyroscopes based on matter-wave interference in toroidally trapped geometries. We show that while the quantum Fisher information increases at different rates for the various schemes considered, in all cases it is consistent with the well-known Sagnac phase shift after the matter waves have traversed a closed path. However, we argue that the relevant metric for quantifying interferometric sensitivity is the classical Fisher information, which can vary considerably between the schemes.

  3. Low Titers of Canine Distemper Virus Antibody in Wild Fishers (Martes pennanti) in the Eastern USA.

    PubMed

    Peper, Steven T; Peper, Randall L; Mitcheltree, Denise H; Kollias, George V; Brooks, Robert P; Stevens, Sadie S; Serfass, Thomas L

    2016-01-01

    Canine distemper virus (CDV) infects species in the order Carnivora. Members of the family Mustelidae are among the species most susceptible to CDV and have a high mortality rate after infection. Assessing an animal's pathogen or disease load prior to any reintroduction project is important to help protect the animal being reintroduced, as well as the wildlife and livestock in the area of relocation. We screened 58 fishers for CDV antibody prior to their release into Pennsylvania, US, as part of a reintroduction program. Five of the 58 (9%) fishers had a weak-positive reaction for CDV antibody at a dilution of 1:16. None of the fishers exhibited any clinical sign of canine distemper while being held prior to release.

  4. Mean-Field Dynamics and Fisher Information in Matter Wave Interferometry.

    PubMed

    Haine, Simon A

    2016-06-10

    There has been considerable recent interest in the mean-field dynamics of various atom-interferometry schemes designed for precision sensing. In the field of quantum metrology, the standard tools for evaluating metrological sensitivity are the classical and quantum Fisher information. In this Letter, we show how these tools can be adapted to evaluate the sensitivity when the behavior is dominated by mean-field dynamics. As an example, we compare the behavior of four recent theoretical proposals for gyroscopes based on matter-wave interference in toroidally trapped geometries. We show that while the quantum Fisher information increases at different rates for the various schemes considered, in all cases it is consistent with the well-known Sagnac phase shift after the matter waves have traversed a closed path. However, we argue that the relevant metric for quantifying interferometric sensitivity is the classical Fisher information, which can vary considerably between the schemes.

  5. Low Titers of Canine Distemper Virus Antibody in Wild Fishers (Martes pennanti) in the Eastern USA.

    PubMed

    Peper, Steven T; Peper, Randall L; Mitcheltree, Denise H; Kollias, George V; Brooks, Robert P; Stevens, Sadie S; Serfass, Thomas L

    2016-01-01

    Canine distemper virus (CDV) infects species in the order Carnivora. Members of the family Mustelidae are among the species most susceptible to CDV and have a high mortality rate after infection. Assessing an animal's pathogen or disease load prior to any reintroduction project is important to help protect the animal being reintroduced, as well as the wildlife and livestock in the area of relocation. We screened 58 fishers for CDV antibody prior to their release into Pennsylvania, US, as part of a reintroduction program. Five of the 58 (9%) fishers had a weak-positive reaction for CDV antibody at a dilution of 1:16. None of the fishers exhibited any clinical sign of canine distemper while being held prior to release. PMID:26555109

  6. Effects of sample size on kernel home range estimates

    USGS Publications Warehouse

    Seaman, D.E.; Millspaugh, J.J.; Kernohan, Brian J.; Brundige, Gary C.; Raedeke, Kenneth J.; Gitzen, Robert A.

    1999-01-01

    Kernel methods for estimating home range are being used increasingly in wildlife research, but the effect of sample size on their accuracy is not known. We used computer simulations of 10-200 points/home range and compared accuracy of home range estimates produced by fixed and adaptive kernels with the reference (REF) and least-squares cross-validation (LSCV) methods for determining the amount of smoothing. Simulated home ranges varied from simple to complex shapes created by mixing bivariate normal distributions. We used the size of the 95% home range area and the relative mean squared error of the surface fit to assess the accuracy of the kernel home range estimates. For both measures, the bias and variance approached an asymptote at about 50 observations/home range. The fixed kernel with smoothing selected by LSCV provided the least-biased estimates of the 95% home range area. All kernel methods produced similar surface fit for most simulations, but the fixed kernel with LSCV had the lowest frequency and magnitude of very poor estimates. We reviewed 101 papers published in The Journal of Wildlife Management (JWM) between 1980 and 1997 that estimated animal home ranges. A minority of these papers used nonparametric utilization distribution (UD) estimators, and most did not adequately report sample sizes. We recommend that home range studies using kernel estimates use LSCV to determine the amount of smoothing, obtain a minimum of 30 observations per animal (but preferably ≥50), and report sample sizes in published results.
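
    As a rough illustration of the fixed-kernel idea with data-driven smoothing, the hypothetical sketch below fits a Gaussian kernel density (utilization distribution) to simulated animal locations and chooses the bandwidth by cross-validation. scikit-learn's likelihood cross-validation is only loosely analogous to LSCV, so this is a sketch of the concept rather than the authors' method.

      # Minimal sketch, assuming simulated relocation data; not the USGS analysis code.
      import numpy as np
      from sklearn.neighbors import KernelDensity
      from sklearn.model_selection import GridSearchCV

      rng = np.random.default_rng(1)
      locations = rng.normal(loc=[0.0, 0.0], scale=[1.0, 0.5], size=(50, 2))  # >=50 fixes recommended

      grid = GridSearchCV(KernelDensity(kernel="gaussian"),
                          {"bandwidth": np.linspace(0.1, 2.0, 20)}, cv=5)
      grid.fit(locations)
      kde = grid.best_estimator_            # fixed-kernel UD with cross-validated smoothing
      print("selected bandwidth:", kde.bandwidth)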

  7. Gaussian kernel width optimization for sparse Bayesian learning.

    PubMed

    Mohsenzadeh, Yalda; Sheikhzadeh, Hamid

    2015-04-01

    Sparse kernel methods have been widely used in regression and classification applications. The performance and the sparsity of these methods are dependent on the appropriate choice of the corresponding kernel functions and their parameters. Typically, the kernel parameters are selected using a cross-validation approach. In this paper, a learning method that is an extension of the relevance vector machine (RVM) is presented. The proposed method can find the optimal values of the kernel parameters during the training procedure. This algorithm uses an expectation-maximization approach for updating kernel parameters as well as other model parameters; therefore, the speed of convergence and computational complexity of the proposed method are the same as the standard RVM. To control the convergence of this fully parameterized model, the optimization with respect to the kernel parameters is performed using a constraint on these parameters. The proposed method is compared with the typical RVM and other competing methods to analyze the performance. The experimental results on the commonly used synthetic data, as well as benchmark data sets, demonstrate the effectiveness of the proposed method in reducing the performance dependency on the initial choice of the kernel parameters. PMID:25794377

  8. Correlation and classification of single kernel fluorescence hyperspectral data with aflatoxin concentration in corn kernels inoculated with Aspergillus flavus spores.

    PubMed

    Yao, H; Hruska, Z; Kincaid, R; Brown, R; Cleveland, T; Bhatnagar, D

    2010-05-01

    The objective of this study was to examine the relationship between fluorescence emissions of corn kernels inoculated with Aspergillus flavus and aflatoxin contamination levels within the kernels. Aflatoxin contamination in corn has been a long-standing problem plaguing the grain industry with potentially devastating consequences to corn growers. In this study, aflatoxin-contaminated corn kernels were produced through artificial inoculation of corn ears in the field with toxigenic A. flavus spores. The kernel fluorescence emission data were taken with a fluorescence hyperspectral imaging system when corn kernels were excited with ultraviolet light. Raw fluorescence image data were preprocessed and regions of interest in each image were created for all kernels. The regions of interest were used to extract spectral signatures and statistical information. The aflatoxin contamination level of single corn kernels was then chemically measured using affinity column chromatography. A fluorescence peak shift phenomenon was noted among different groups of kernels with different aflatoxin contamination levels. The fluorescence peak shift was found to move more toward the longer wavelength in the blue region for the highly contaminated kernels and toward the shorter wavelengths for the clean kernels. Highly contaminated kernels were also found to have a lower fluorescence peak magnitude compared with the less contaminated kernels. It was also noted that a general negative correlation exists between measured aflatoxin and the fluorescence image bands in the blue and green regions. The coefficient of determination, r(2), was 0.72 for the multiple linear regression model. The multivariate analysis of variance found that the fluorescence means of four aflatoxin groups, <1, 1-20, 20-100, and ≥100 ng g(-1) (parts per billion), were significantly different from each other at the 0.01 level of alpha. Classification accuracy under a two-class schema ranged from 0.84 to

  9. Carnivore Translocations and Conservation: Insights from Population Models and Field Data for Fishers (Martes pennanti)

    PubMed Central

    Lewis, Jeffrey C.; Powell, Roger A.; Zielinski, William J.

    2012-01-01

    Translocations are frequently used to restore extirpated carnivore populations. Understanding the factors that influence translocation success is important because carnivore translocations can be time consuming, expensive, and controversial. Using population viability software, we modeled reintroductions of the fisher, a candidate for endangered or threatened status in the Pacific states of the US. Our model predicts that the most important factor influencing successful re-establishment of a fisher population is the number of adult females reintroduced (provided some males are also released). Data from 38 translocations of fishers in North America, including 30 reintroductions, 5 augmentations and 3 introductions, show that the number of females released was, indeed, a good predictor of success but that the number of males released, geographic region and proximity of the source population to the release site were also important predictors. The contradiction between model and data regarding males may relate to the assumption in the model that all males are equally good breeders. We hypothesize that many males may need to be released to insure a sufficient number of good breeders are included, probably large males. Seventy-seven percent of reintroductions with known outcomes (success or failure) succeeded; all 5 augmentations succeeded; but none of the 3 introductions succeeded. Reintroductions were instrumental in reestablishing fisher populations within their historical range and expanding the range from its most-contracted state (43% of the historical range) to its current state (68% of the historical range). To increase the likelihood of translocation success, we recommend that managers: 1) release as many fishers as possible, 2) release more females than males (55–60% females) when possible, 3) release as many adults as possible, especially large males, 4) release fishers from a nearby source population, 5) conduct a formal feasibility assessment, and 6

  10. Fishers' knowledge identifies environmental changes and fish abundance trends in impounded tropical rivers.

    PubMed

    Hallwass, Gustavo; Lopes, Priscila F; Juras, Anastácio A; Silvano, Renato A M

    2013-03-01

    The long-term impacts of large hydroelectric dams on small-scale fisheries in tropical rivers are poorly known. A promising way to investigate such impacts is to compare and integrate the local ecological knowledge (LEK) of resource users with biological data for the same region. We analyzed the accuracy of fishers' LEK to investigate fisheries dynamics and environmental changes in the Lower Tocantins River (Brazilian Amazon) downstream from a large dam. We estimated fishers' LEK through interviews with 300 fishers in nine villages and collected data on 601 fish landings in five of these villages, 22 years after the dam's establishment (2006-2008). We compared these two databases with each other and with data on fish landings from before the dam's establishment (1981) gathered from the literature. The data obtained based on the fishers' LEK (interviews) and from fisheries agreed regarding the primary fish species caught, the most commonly used type of fishing gear (gill nets) and even the most often used gill net mesh sizes but disagreed regarding seasonal fish abundance. According to the interviewed fishers, the primary environmental changes that occurred after the impoundment were an overall decrease in fish abundance, an increase in the abundance of some fish species and, possibly, the local extinction of a commercial fish species (Semaprochilodus brama). These changes were corroborated by comparing fish landings sampled before and 22 years after the impoundment, which indicated changes in the composition of fish landings and a decrease in the total annual fish production. Our results reinforce the hypothesis that large dams may adversely affect small-scale fisheries downstream and establish a feasible approach for applying fishers' LEK to fisheries management, especially in regions with a low research capacity.

  11. On the Karlin-Kimura approaches to the Wright-Fisher diffusion with fluctuating selection

    NASA Astrophysics Data System (ADS)

    Huillet, Thierry

    2011-02-01

    The goal of this work is a comparative study of two Wright-Fisher-like diffusion processes on the interval, one due to Karlin and the other one due to Kimura. Each model accounts for the evolution of one two-locus colony undergoing random mating, under the additional action of selection in a random environment. In other words, we study the effect of disorder on the usual Wright-Fisher model with fixed (nonrandom) selection. There is a drastic qualitative difference between the two models and between the random and nonrandom selection hypotheses. We first present a series of elementary stochastic models and tools that are needed to conduct this study in the context of diffusion process theory, including Kolmogorov backward and forward equations, scale and speed functions, classification of boundaries, and Doob transformation of sample paths using additive functionals. In this spirit, we briefly revisit the neutral Wright-Fisher diffusion and the Wright-Fisher diffusion with nonrandom selection. With these tools at hand, we first deal with the Karlin approach to the Wright-Fisher diffusion model with randomized selection differentials. The specificity of this model is that in the large population case, the boundaries of the state space are natural and hence inaccessible, and so quasi-absorbing only. We supply some limiting properties pertaining to times of hitting of points close to the boundaries. Next, we study the Kimura approach to the Wright-Fisher model with randomized selection, which may be viewed as a modification of the Karlin model, using an appropriate Doob transform which we describe. This model also has natural boundaries, but they turn out to be much more attracting and sticky than in Karlin's version. This leads to a faster approach to the quasi-absorbing states, to a larger time needed to move from the vicinity of one boundary to the other and to a local critical behavior of the branching diffusion obtained after the relevant Doob transformation.
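
    For orientation (a standard textbook form, not taken from this paper, and with time and selection scalings that vary across authors), the Wright-Fisher diffusion with a constant selection coefficient s for the allele frequency x_t in [0, 1] can be written as the Itô stochastic differential equation

      dx_t = s\, x_t (1 - x_t)\, dt + \sqrt{x_t (1 - x_t)}\, dW_t ,

    and the Karlin and Kimura models discussed above are obtained, roughly, by letting the selection term fluctuate randomly in time (random environment) rather than keeping s fixed.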

  12. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    PubMed

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process

  13. Vitamin E profile as a reliable authenticity discrimination factor between chestnut (Castanea sativa Mill.) cultivars.

    PubMed

    Barreira, João C M; Alves, Rita C; Casal, Susana; Ferreira, Isabel C F R; Oliveira, M Beatriz P P; Pereira, José Alberto

    2009-06-24

    In this study, the profile of tocopherols and tocotrienols in chestnut (Castanea sativa Mill.) kernel oil was evaluated. Four Portuguese chestnut varieties were selected: Aveleira, Boaventura, Judia, and Longal. The vitamin E determination had already been applied to similar matrices, but, to the authors' knowledge, this is the first time that chestnut kernel oil has been evaluated. The prevalent vitamer was gamma-tocopherol, often present only in trace amounts in other natural products. Due to the high commercial value of chestnut, a statistical analysis of the obtained results was also conducted to define the tocopherol and tocotrienol profile as a reliable indicator of a specific chestnut variety. To achieve this objective, an analysis of variance was performed to evaluate the accuracy of the method as well as the uniformity of results for each variety. A discriminant analysis was also carried out, revealing quite satisfactory results: the four varieties were clustered into four distinct groups along two discriminant dimensions.

  14. Bridging the gap between the KERNEL and RT-11

    SciTech Connect

    Hendra, R.G.

    1981-06-01

    A software package is proposed to allow users of the PL-11 language, and the LSI-11 KERNEL in general, to use their PL-11 programs under RT-11. Further, some general-purpose extensions to the KERNEL are proposed that facilitate certain number conversions and string manipulations. A Floating Point Package of procedures to allow full use of the hardware floating-point capability of the LSI-11 computers is proposed. Extensions to the KERNEL that allow a user to read, write and delete disc files in the manner of RT-11 are also proposed. A device directory listing routine is also included.

  15. Spectrophotometric method for determination of phosphine residues in cashew kernels.

    PubMed

    Rangaswamy, J R

    1988-01-01

    A spectrophotometric method reported for determination of phosphine (PH3) residues in wheat has been extended for determination of these residues in cashew kernels. Unlike the spectrum for wheat, the spectrum of PH3 residue-AgNO3 chromophore from cashew kernels does not show an absorption maximum at 400 nm; nevertheless, reading the absorbance at 400 nm afforded good recoveries of 90-98%. No interference occurred from crop materials, and crop controls showed low absorbance; the method can be applied for determinations as low as 0.01 ppm PH3 residue in cashew kernels.

  16. Initial-state splitting kernels in cold nuclear matter

    NASA Astrophysics Data System (ADS)

    Ovanesyan, Grigory; Ringer, Felix; Vitev, Ivan

    2016-09-01

    We derive medium-induced splitting kernels for energetic partons that undergo interactions in dense QCD matter before a hard-scattering event at large momentum transfer Q2. Working in the framework of the effective theory SCETG, we compute the splitting kernels beyond the soft gluon approximation. We present numerical studies that compare our new results with previous findings. We expect the full medium-induced splitting kernels to be most relevant for the extension of initial-state cold nuclear matter energy loss phenomenology in both p+A and A+A collisions.

  17. Kernel simplex growing algorithm for hyperspectral endmember extraction

    NASA Astrophysics Data System (ADS)

    Zhao, Liaoying; Zheng, Junpeng; Li, Xiaorun; Wang, Lijiao

    2014-01-01

    In order to effectively extract endmembers for hyperspectral imagery where linear mixing model may not be appropriate due to multiple scattering effects, this paper extends the simplex growing algorithm (SGA) to its kernel version. A new simplex volume formula without dimension reduction is used in SGA to form a new simplex growing algorithm (NSGA). The original data are nonlinearly mapped into a high-dimensional space where the scatters can be ignored. To avoid determining complex nonlinear mapping, a kernel function is used to extend the NSGA to kernel NSGA (KNSGA). Experimental results of simulated and real data prove that the proposed KNSGA approach outperforms SGA and NSGA.

  18. Multitasking kernel for the C and Fortran programming languages

    SciTech Connect

    Brooks, E.D. III

    1984-09-01

    A multitasking kernel for the C and Fortran programming languages which runs on the Unix operating system is presented. The kernel provides a multitasking environment which serves two purposes. The first is to provide an efficient portable environment for the coding, debugging and execution of production multiprocessor programs. The second is to provide a means of evaluating the performance of a multitasking program on model multiprocessors. The performance evaluation features require no changes in the source code of the application and are implemented as a set of compile and run time options in the kernel.

  19. Monte Carlo Code System for Electron (Positron) Dose Kernel Calculations.

    1999-05-12

    Version 00 KERNEL performs dose kernel calculations for an electron (positron) isotropic point source in an infinite homogeneous medium. First, the auxiliary code PRELIM is used to prepare cross section data for the considered medium. Then the KERNEL code simulates the transport of electrons and bremsstrahlung photons through the medium until all particles reach their cutoff energies. The deposited energy is scored in concentric spherical shells at a radial distance ranging from zero to twice the source particle range.

  20. False-Positive Serum Botulism Bioassay in Miller-Fisher Syndrome.

    PubMed

    Zeylikman, Yuriy; Shah, Vishal; Shah, Umang; Mirsen, Thomas R; Campellone, Joseph V

    2015-09-01

    We describe a patient with acute progressive weakness and areflexia. Both botulism and Miller-Fisher variant of Guillain-Barré syndrome were initial diagnostic considerations, and she was treated with intravenous immunoglobulin and botulinum antitoxin. A mouse bioassay was positive for botulinum toxin A, although her clinical course, electrodiagnostic studies, and cerebrospinal fluid findings supported Miller-Fisher syndrome. This patient's atypical features offer points of discussion regarding the evaluation of patients with acute neuromuscular weakness and emphasize the limitations of the botulism bioassay.

  1. Cosmology with the largest galaxy cluster surveys: going beyond Fisher matrix forecasts

    SciTech Connect

    Khedekar, Satej; Majumdar, Subhabrata E-mail: subha@tifr.res.in

    2013-02-01

    We make the first detailed MCMC likelihood study of cosmological constraints that are expected from some of the largest, ongoing and proposed, cluster surveys in different wave-bands and compare the estimates to the prevalent Fisher matrix forecasts. Mock catalogs of cluster counts expected from the surveys (eROSITA, WFXT, RCS2, DES and Planck), along with a mock dataset of follow-up mass calibrations, are analyzed for this purpose. A fair agreement between MCMC and Fisher results is found only in the case of minimal models. However, for many cases, the marginalized constraints obtained from Fisher and MCMC methods can differ by factors of 30-100%. The discrepancy can be alarmingly large for a time-dependent dark energy equation of state, w(a); the Fisher methods are seen to under-estimate the constraints by as much as a factor of 4-5. Typically, Fisher estimates become more and more inappropriate as we move away from ΛCDM to constant-w and then varying-w dark energy cosmologies. Fisher analysis also predicts incorrect parameter degeneracies. There are noticeable offsets in the likelihood contours obtained from Fisher methods that are caused by an asymmetry in the posterior likelihood distribution, as seen through an MCMC analysis. From the point of view of mass-calibration uncertainties, a high value of unknown scatter about the mean mass-observable relation, and its redshift dependence, is seen to have large degeneracies with the cosmological parameters σ_8 and w(a) and can degrade the cosmological constraints considerably. We find that the addition of mass-calibrated cluster datasets can improve dark energy and σ_8 constraints by factors of 2-3 over what can be obtained from CMB+SNe+BAO only. Finally, we show that a joint analysis of datasets of two (or more) different cluster surveys would significantly tighten cosmological constraints from using clusters only. Since details of future cluster surveys are still being planned, we emphasize
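
    For context (a standard definition, not specific to this paper), the Fisher matrix used in such forecasts is built from the log-likelihood of the data given the cosmological parameters θ,

      F_{ij} = -\left\langle \frac{\partial^2 \ln \mathcal{L}}{\partial \theta_i \, \partial \theta_j} \right\rangle , \qquad \sigma(\theta_i) \ge \sqrt{(F^{-1})_{ii}} ,

    so a Fisher forecast is exact only when the posterior is Gaussian in the parameters; the MCMC comparison above measures how badly this assumption fails for non-minimal dark energy models.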

  2. Asymptotics semiclassically concentrated on curves for the nonlocal Fisher-Kolmogorov-Petrovskii-Piskunov equation

    NASA Astrophysics Data System (ADS)

    Levchenko, E. A.; Shapovalov, A. V.; Trifonov, A. Yu

    2016-07-01

    In this paper we construct asymptotic solutions for the nonlocal multidimensional Fisher-Kolmogorov-Petrovskii-Piskunov equation in the class of functions concentrated on a one-dimensional manifold (curve) using a semiclassical approximation technique. We show that the construction of these solutions can be reduced to solving a similar problem for the nonlocal Fisher-Kolmogorov-Petrovskii-Piskunov equation in the class of functions concentrated at a point (zero-dimensional manifold), together with an additional operator condition. The general approach is exemplified by constructing a two-dimensional two-parameter solution, which describes quasi-steady-state patterns on a circle.
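
    As a reference form (a common way to write the model; the paper's precise notation and scalings may differ), the nonlocal Fisher-Kolmogorov-Petrovskii-Piskunov equation for a population density u(x, t) is

      u_t = D\, \Delta u + a\, u(x, t) - \kappa\, u(x, t) \int b(x, y)\, u(y, t)\, dy ,

    where b(x, y) is the nonlocal competition (influence) kernel; the asymptotics described above concentrate such solutions on a curve in the semiclassical limit.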

  3. False-Positive Serum Botulism Bioassay in Miller-Fisher Syndrome.

    PubMed

    Zeylikman, Yuriy; Shah, Vishal; Shah, Umang; Mirsen, Thomas R; Campellone, Joseph V

    2015-09-01

    We describe a patient with acute progressive weakness and areflexia. Both botulism and Miller-Fisher variant of Guillain-Barré syndrome were initial diagnostic considerations, and she was treated with intravenous immunoglobulin and botulinum antitoxin. A mouse bioassay was positive for botulinum toxin A, although her clinical course, electrodiagnostic studies, and cerebrospinal fluid findings supported Miller-Fisher syndrome. This patient's atypical features offer points of discussion regarding the evaluation of patients with acute neuromuscular weakness and emphasize the limitations of the botulism bioassay. PMID:26301377

  4. Texture analysis in gel electrophoresis images using an integrative kernel-based approach

    PubMed Central

    Fernandez-Lozano, Carlos; Seoane, Jose A.; Gestal, Marcos; Gaunt, Tom R.; Dorado, Julian; Pazos, Alejandro; Campbell, Colin

    2016-01-01

    Texture information could be used in proteomics to improve the quality of the image analysis of proteins separated on a gel. In order to evaluate the best technique to identify relevant textures, we use several different kernel-based machine learning techniques to classify proteins in 2-DE images into spot and noise. We evaluate the classification accuracy of each of these techniques with proteins extracted from ten 2-DE images of different types of tissues and different experimental conditions. We found that the best classification model was FSMKL, a data integration method using multiple kernel learning, which achieved AUROC values above 95% while using a reduced number of features. This technique allows us to increase the interpretability of the complex combinations of textures and to weight the importance of each particular feature in the final model. In particular, the Inverse Difference Moment exhibited the highest discriminating power. A higher value is associated with a homogeneous structure, since this feature describes homogeneity: the larger the value, the more uniform the texture. The final model combines different groups of textural features. Here we demonstrate the feasibility of combining different groups of textures in 2-DE image analysis for spot detection. PMID:26758643
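
    The hypothetical sketch below shows how a homogeneity statistic of this kind can be computed from a gray-level co-occurrence matrix (GLCM) for a small image patch using scikit-image (spelling for version 0.19 and later). scikit-image's "homogeneity" property is closely related to, but not identical with, the Inverse Difference Moment named above; the patch here is a random placeholder.

      # Minimal sketch of a GLCM homogeneity feature; illustrative only.
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      patch = np.random.default_rng(2).integers(0, 8, size=(32, 32), dtype=np.uint8)  # placeholder patch
      glcm = graycomatrix(patch, distances=[1], angles=[0], levels=8,
                          symmetric=True, normed=True)
      homogeneity = graycoprops(glcm, "homogeneity")[0, 0]   # sum p(i, j) / (1 + |i - j|)
      print("homogeneity (IDM-like):", homogeneity)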

  5. On the characterization of vegetation recovery after fire disturbance using Fisher-Shannon analysis and SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series

    NASA Astrophysics Data System (ADS)

    Lasaponara, Rosa; Lanorte, Antonio; Lovallo, Michele; Telesca, Luciano

    2015-04-01

    Time series can fruitfully support fire monitoring and management, from statistical analysis of fire occurrence (Tuia et al. 2008) to danger estimation (Lasaponara 2005), damage evaluation (Lanorte et al. 2014) and post-fire recovery (Lanorte et al. 2014). In this paper, the time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites located in Galizia (North Spain) and Peloponnese (South Greece), selected for the vast fires which occurred during the summers of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland at the Galizia test site and evergreen forest in Peloponnese. Time series of MVC-NDVI have been analyzed before and after the occurrence of the fire events. Results obtained for both investigated areas clearly pointed out that the dynamics of the pixel time series before the occurrence of the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the occurrence of the fire show a higher degree of organization and order. This discrimination is more evident for the Peloponnese fire than for the Galizia fire, which suggests a clear possibility to discriminate the different post-fire behaviors and dynamics exhibited by the different vegetation covers. Reference: Lanorte, A., Lasaponara, R., Lovallo, M., Telesca, L. (2014). Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to
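
    For reference (standard definitions used in Fisher-Shannon analysis, not quoted from this record), a signal with probability density f(x) is characterized by the Fisher information measure and the Shannon entropy power

      FIM = \int \frac{[f'(x)]^2}{f(x)}\, dx , \qquad N_X = \frac{1}{2 \pi e}\, e^{2 H_X} , \qquad H_X = -\int f(x) \ln f(x)\, dx ,

    and each NDVI time series is placed as a point in the (N_X, FIM) plane: larger FIM and smaller N_X indicate a more ordered, organized signal, consistent with the post-fire behaviour described above.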

  6. A new method of NIR face recognition using kernel projection DCV and neural networks

    NASA Astrophysics Data System (ADS)

    Qiao, Ya; Lu, Yuan; Feng, Yun-song; Li, Feng; Ling, Yongshun

    2013-09-01

    A new face recognition system was proposed, which used an active near-infrared imaging system (ANIRIS) for face-image acquisition, kernel discriminative common vector (KDCV) as the feature extraction algorithm and a neural network as the recognition method. The ANIRIS consisted of 40 NIR LEDs serving as the active light source and an HWB800-IR-80 near-infrared filter used together with a CCD camera as the imaging detector; its role in reducing the influence of varying illumination on the recognition rate is discussed. The KDCV feature extraction and neural network recognition parts were implemented in Matlab. Experiments on the HITSZ Lab2 face database and a self-built face database show that the average recognition rate exceeded 95%, demonstrating the effectiveness of the proposed system.

  7. Improved Prediction of Malaria Degradomes by Supervised Learning with SVM and Profile Kernel

    PubMed Central

    Kuang, Rui; Gu, Jianying; Cai, Hong; Wang, Yufeng

    2009-01-01

    The spread of drug resistance through malaria parasite populations calls for the development of new therapeutic strategies. However, the seemingly promising genomics-driven target identification paradigm is hampered by the weak annotation coverage. To identify potentially important yet uncharacterized proteins, we apply support vector machines using profile kernels, a supervised discriminative machine learning technique for remote homology detection, as a complement to the traditional alignment based algorithms. In this study, we focus on the prediction of proteases, which have long been considered attractive drug targets because of their indispensable roles in parasite development and infection. Our analysis demonstrates that an abundant and complex repertoire is conserved in five Plasmodium parasite species. Several putative proteases may be important components in networks that mediate cellular processes, including hemoglobin digestion, invasion, trafficking, cell cycle fate, and signal transduction. This catalog of proteases provides a short list of targets for functional characterization and rational inhibitor design. PMID:19057851
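
    The hypothetical sketch below illustrates the general mechanism by which string or profile kernels are plugged into a support vector machine: the kernel is supplied to the SVM as a precomputed Gram matrix. The Gram matrix here is a simple linear-kernel placeholder rather than a real profile kernel computed from sequence profiles, and the labels are invented for illustration.

      # Minimal sketch of an SVM with a precomputed kernel; not the authors' pipeline.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(3)
      features = rng.normal(size=(60, 10))          # stand-in for per-protein representations
      labels = rng.integers(0, 2, size=60)          # e.g., protease vs. non-protease (illustrative)

      K = features @ features.T                     # placeholder Gram matrix (must be positive semidefinite)
      clf = SVC(kernel="precomputed").fit(K, labels)
      print("training accuracy:", clf.score(K, labels))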

  8. Robust low dimensional kernel correlation feature spaces that generalize to unseen datasets

    NASA Astrophysics Data System (ADS)

    Abiantun, Ramzi; Savvides, Marios; Vijayakumar, B. V. K.

    2007-04-01

    In this paper we demonstrate the subspace generalization power of the kernel correlation feature analysis (KCFA) method for extracting a low-dimensional subspace that has the ability to represent new unseen datasets. Examining the portability of this algorithm across different datasets is an important practical aspect of real-world face recognition applications, where the technology cannot be dataset-dependent. In most face recognition literature, algorithms are demonstrated on datasets by training on one portion of the dataset and testing on the remainder. Generally, the testing subjects' dataset partially or totally overlaps the training subjects' dataset, though with disjoint images captured in different sessions. Thus, some of the expected facial variations and the people's faces are modeled in the training set. In this paper we describe how we efficiently build a compact feature subspace using kernel correlation filter analysis on the generic training set of the FRGC dataset and use that basis for recognition on a different dataset. The KCFA feature subspace has a total dimension that corresponds to the number of training subjects; we chose to vary this number to include up to all of the 222 subjects available in the FRGC generic dataset. We test the subspace produced by KCFA by projecting other well-known face datasets upon it. We show that this feature subspace has good representation and discrimination power for unseen datasets and produces good verification and identification rates compared to other subspace and dimensionality reduction methods such as PCA (when trained on the same FRGC generic dataset). Its efficiency, lower dimensionality and discriminative power make it more practical and powerful than PCA as a robust lower-dimensionality reduction method for modeling faces and facial variations.

  9. An information measure for class discrimination. [in remote sensing of crop observation

    NASA Technical Reports Server (NTRS)

    Shen, S. S.; Badhwar, G. D.

    1986-01-01

    This article describes a separability measure for class discrimination. This measure is based on the Fisher information measure for estimating the mixing proportion of two classes. The Fisher information measure not only provides a means to assess quantitatively the information content in the features for separating classes, but also gives the lower bound for the variance of any unbiased estimate of the mixing proportion based on observations of the features. Unlike most commonly used separability measures, this measure is not dependent on the form of the probability distribution of the features and does not imply a specific estimation procedure. This is important because the probability distribution function that describes the data for a given class does not have simple analytic forms, such as a Gaussian. Results of applying this measure to compare the information content provided by three Landsat-derived feature vectors for the purpose of separating small grains from other crops are presented.
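
    As a concrete illustration (a standard form, not quoted from the article), if the feature density is the two-class mixture f(x; \pi) = \pi f_1(x) + (1 - \pi) f_2(x), the Fisher information about the mixing proportion \pi is

      I(\pi) = \int \frac{[f_1(x) - f_2(x)]^2}{\pi f_1(x) + (1 - \pi) f_2(x)}\, dx ,

    and the Cramér-Rao bound Var(\hat{\pi}) \ge 1 / (n\, I(\pi)) is the lower bound, mentioned above, on the variance of any unbiased estimate of the proportion from n independent observations; note that no specific parametric form (e.g., Gaussian) of f_1 and f_2 is required.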

  10. Kernel-based Linux emulation for Plan 9.

    SciTech Connect

    Minnich, Ronald G.

    2010-09-01

    CNKemu is a kernel-based system for the 9k variant of the Plan 9 kernel. It is designed to provide transparent binary support for programs compiled for IBM's Compute Node Kernel (CNK) on the Blue Gene series of supercomputers. This support allows users to build applications with the standard Blue Gene toolchain, including C++ and Fortran compilers. While the CNK is not Linux, IBM designed the CNK so that the user interface has much in common with the Linux 2.0 system call interface. The Plan 9 CNK emulator hence provides the foundation of kernel-based Linux system call support on Plan 9. In this paper we discuss cnkemu's implementation and some of its more interesting features, such as the ability to easily intermix Plan 9 and Linux system calls.

  11. Inheritance of Kernel Color in Corn: Explanations and Investigations.

    ERIC Educational Resources Information Center

    Ford, Rosemary H.

    2000-01-01

    Offers a new perspective on traditional problems in genetics on kernel color in corn, including information about genetic regulation, metabolic pathways, and evolution of genes. (Contains 15 references.) (ASK)

  12. Isolation and purification of D-mannose from palm kernel.

    PubMed

    Zhang, Tao; Pan, Ziguo; Qian, Chao; Chen, Xinzhi

    2009-09-01

    An economically viable procedure for the isolation and purification of d-mannose from palm kernel was developed in this research. The palm kernel was catalytically hydrolyzed with sulfuric acid at 100 degrees C and then fermented by mannan-degrading enzymes. The solution after fermentation underwent filtration in a silica gel column, desalination by ion-exchange resin, and crystallization in ethanol to produce pure d-mannose in a total yield of 48.4% (based on the weight of the palm kernel). Different enzymes were investigated, and the results indicated that endo-beta-mannanase was the best enzyme to promote the hydrolysis of the oligosaccharides isolated from the palm kernel. The pure d-mannose sample was characterized by FTIR, (1)H NMR, and (13)C NMR spectra.

  13. A kernel adaptive algorithm for quaternion-valued inputs.

    PubMed

    Paul, Thomas K; Ogunfunmi, Tokunbo

    2015-10-01

    The use of quaternion data can provide benefit in applications like robotics and image recognition, and particularly for performing transforms in 3-D space. Here, we describe a kernel adaptive algorithm for quaternions. A least mean square (LMS)-based method was used, resulting in the derivation of the quaternion kernel LMS (Quat-KLMS) algorithm. Deriving this algorithm required describing the idea of a quaternion reproducing kernel Hilbert space (RKHS), as well as kernel functions suitable with quaternions. A modified HR calculus for Hilbert spaces was used to find the gradient of cost functions defined on a quaternion RKHS. In addition, the use of widely linear (or augmented) filtering is proposed to improve performance. The benefit of the Quat-KLMS and widely linear forms in learning nonlinear transformations of quaternion data are illustrated with simulations. PMID:25594982
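
    To make the KLMS idea concrete, the hypothetical sketch below implements a plain real-valued kernel LMS filter with a Gaussian kernel; the quaternion algebra, quaternion RKHS and HR-calculus gradients of Quat-KLMS are not reproduced here, so this is only an analogy to the algorithm's structure.

      # Minimal sketch of real-valued kernel LMS; not the Quat-KLMS algorithm itself.
      import numpy as np

      def gaussian_kernel(a, b, width=1.0):
          return np.exp(-np.sum((a - b) ** 2) / (2.0 * width ** 2))

      def klms_train(X, d, step=0.2, width=1.0):
          """Return the dictionary of centres and expansion coefficients."""
          centres, alphas = [], []
          for x, target in zip(X, d):
              y = sum(a * gaussian_kernel(c, x, width) for c, a in zip(centres, alphas))
              error = target - y              # prediction error for this sample
              centres.append(x)               # naive KLMS adds one centre per sample
              alphas.append(step * error)     # coefficient = step size * error
          return centres, alphas

      rng = np.random.default_rng(4)
      X = rng.normal(size=(100, 3))
      d = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)   # illustrative nonlinear target
      centres, alphas = klms_train(X, d)
      print("dictionary size:", len(centres))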

  14. Impact of Species and Variety on Concentrations of Minor Lipophilic Bioactive Compounds in Oils Recovered from Plum Kernels.

    PubMed

    Górnaś, Paweł; Rudzińska, Magdalena; Raczyk, Marianna; Mišina, Inga; Soliven, Arianne; Lācis, Gunārs; Segliņa, Dalija

    2016-02-01

    The profile of bioactive compounds (carotenoids, tocopherols, tocotrienols, phytosterols, and squalene) in oils recovered from the kernels of 28 plum varieties of the hexaploid species Prunus domestica L. and the diploid plum Prunus cerasifera Ehrh. and their crossbreeds was studied. Oil yields in plum kernels of both P. cerasifera and P. domestica were in the wide ranges of 22.6-53.1 and 24.2-46.9% (w/w) dw, respectively. The contents of total tocochromanols, carotenoids, phytosterols, and squalene were significantly affected by the variety and ranged between 70.7 and 208.7 mg/100 g of oil, between 0.41 and 3.07 mg/100 g of oil, between 297.2 and 1569.6 mg/100 g of oil, and between 25.7 and 80.4 mg/100 g of oil, respectively. Regardless of the cultivar, β-sitosterol and γ-tocopherol were the main minor lipophilic compounds in plum kernel oils and constituted between 208.5 and 1258.7 mg/100 g of oil and between 60.5 and 182.0 mg/100 g of oil, respectively. Between the studied plum species, significant differences were recorded for δ-tocopherol (p = 0.007), 24-methylenecycloartanol (p = 0.038), and citrostadienol (p = 0.003), but they were insufficient for discrimination by PCA.

  15. DIFFERENTIAL PULSE HEIGHT DISCRIMINATOR

    DOEpatents

    Test, L.D.

    1958-11-11

    Pulse-height discriminators are described, specifically a differential pulse-height discriminator which is adapted to respond to pulses of a band of amplitudes, but to reject pulses of amplitudes greater or less than the preselected band. In general, the discriminator includes a vacuum tube having a plurality of grids adapted to cut off plate current in the tube upon the application of sufficient negative voltage. One grid is held below cutoff, while a positive pulse proportional to the amplitude of each pulse is applied to this grid. Another grid has a negative pulse proportional to the amplitude of each pulse simultaneously applied to it. With this arrangement the tube will only pass pulses which are of sufficient amplitude to counter the cutoff bias but not of sufficient amplitude to cut off the tube.

  16. The Dynamic Kernel Scheduler-Part 1

    NASA Astrophysics Data System (ADS)

    Adelmann, Andreas; Locans, Uldis; Suter, Andreas

    2016-10-01

    Emerging processor architectures such as GPUs and Intel MICs provide a huge performance potential for high performance computing. However developing software that uses these hardware accelerators introduces additional challenges for the developer. These challenges may include exposing increased parallelism, handling different hardware designs, and using multiple development frameworks in order to utilise devices from different vendors. The Dynamic Kernel Scheduler (DKS) is being developed in order to provide a software layer between the host application and different hardware accelerators. DKS handles the communication between the host and the device, schedules task execution, and provides a library of built-in algorithms. Algorithms available in the DKS library will be written in CUDA, OpenCL, and OpenMP. Depending on the available hardware, the DKS can select the appropriate implementation of the algorithm. The first DKS version was created using CUDA for the Nvidia GPUs and OpenMP for Intel MIC. DKS was further integrated into OPAL (Object-oriented Parallel Accelerator Library) in order to speed up a parallel FFT based Poisson solver and Monte Carlo simulations for particle-matter interaction used for proton therapy degrader modelling. DKS was also used together with Minuit2 for parameter fitting, where χ2 and max-log-likelihood functions were offloaded to the hardware accelerator. The concepts of the DKS, first results, and plans for the future will be shown in this paper.

  17. Protoribosome by quantum kernel energy method.

    PubMed

    Huang, Lulu; Krupkin, Miri; Bashan, Anat; Yonath, Ada; Massa, Lou

    2013-09-10

    Experimental evidence suggests the existence of an RNA molecular prebiotic entity, called by us the "protoribosome," which may have evolved in the RNA world before the evolution of the genetic code and proteins. This vestige of the RNA world, which possesses all of the capabilities required for peptide bond formation, seems to still be functioning at the heart of all contemporary ribosomes. Within the modern ribosome this remnant includes the peptidyl transferase center. Its highly conserved nucleotide sequence is suggestive of its robustness under diverse environmental conditions, and hence of its prebiotic origin. Its twofold pseudosymmetry suggests that this entity could have been a dimer of self-folding RNA units that formed a pocket within which two activated amino acids might be accommodated, similar to the binding mode of modern tRNA molecules that carry amino acids or peptidyl moieties. Using quantum mechanics and crystal coordinates, this work studies the question of whether the putative protoribosome has the properties necessary to function as an evolutionary precursor to the modern ribosome. The quantum model used in the calculations is density functional theory (B3LYP/3-21G*), implemented using the kernel energy method to make the computations practical and efficient. It turns out that the necessary conditions that would characterize a practicable protoribosome, namely (i) energetic structural stability and (ii) energetically stable attachment to substrates, are both well satisfied.

  18. Kernel MAD Algorithm for Relative Radiometric Normalization

    NASA Astrophysics Data System (ADS)

    Bai, Yang; Tang, Ping; Hu, Changmiao

    2016-06-01

    The multivariate alteration detection (MAD) algorithm is commonly used in relative radiometric normalization. This algorithm is based on linear canonical correlation analysis (CCA), which can analyze only linear relationships among bands. Therefore, we first introduce a new version of MAD in this study based on the established method known as kernel canonical correlation analysis (KCCA). The proposed method effectively extracts the non-linear and complex relationships among variables. We then conduct relative radiometric normalization experiments with both the linear CCA and the KCCA versions of the MAD algorithm using Landsat-8 data of Beijing, China, and Gaofen-1 (GF-1) data acquired over South China. Finally, we analyze the difference between the two methods. Results show that the KCCA-based MAD can be satisfactorily applied to relative radiometric normalization and describes the nonlinear relationship between multi-temporal images well. This work is the first attempt to apply a KCCA-based MAD algorithm to relative radiometric normalization.

  19. Kernel spectral clustering with memory effect

    NASA Astrophysics Data System (ADS)

    Langone, Rocco; Alzate, Carlos; Suykens, Johan A. K.

    2013-05-01

    Evolving graphs describe many natural phenomena changing over time, such as social relationships, trade markets, metabolic networks etc. In this framework, performing community detection and analyzing the cluster evolution represents a critical task. Here we propose a new model for this purpose, where the smoothness of the clustering results over time can be considered as a valid prior knowledge. It is based on a constrained optimization formulation typical of Least Squares Support Vector Machines (LS-SVM), where the objective function is designed to explicitly incorporate temporal smoothness. The latter allows the model to cluster the current data well and to be consistent with the recent history. We also propose new model selection criteria in order to carefully choose the hyper-parameters of our model, which is a crucial issue to achieve good performances. We successfully test the model on four toy problems and on a real world network. We also compare our model with Evolutionary Spectral Clustering, which is a state-of-the-art algorithm for community detection of evolving networks, illustrating that the kernel spectral clustering with memory effect can achieve better or equal performances.

  20. Fisher-Level Decision Making to Participate in Fisheries Improvement Projects (FIPs) for Yellowfin Tuna in the Philippines

    PubMed Central

    Berentsen, Paul; Bush, Simon R.; Digal, Larry; Oude Lansink, Alfons

    2016-01-01

    This study identifies the capabilities needed by small-scale fishers to participate in Fishery Improvement Projects (FIPs) for yellowfin tuna in the Philippines. The current literature provides little empirical evidence on how different models, or types of FIPs, influence the participation of fishers in their programs and the degree to which FIPs are able to foster improvements in fishing practices. To address this literature gap, two different FIPs are empirically analysed, each with different approaches for fostering improvement. The first is the non-governmental organisation-led Partnership Programme Towards Sustainable Tuna, which adopts a bottom-up or development-oriented FIP model. The second is the private-led Artesmar FIP, which adopts a top-down or market-oriented FIP approach. The data were obtained from 350 fishers surveyed and were analysed using two separate models run in succession, taking into consideration full, partial, and non-participation in the two FIPs. The results demonstrate that different types of capabilities are required in order to participate in different FIP models. Individual firm capabilities are more important for fishers' participation in market-oriented FIPs, which use direct economic incentives to encourage improvements in fisher practices. Collective capabilities are more important for fishers to participate in development-oriented FIPs, which drive improvement by supporting fishers, fisher associations, and governments to move towards market requirements. PMID:27732607

  1. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., or wave action. Note: The Corps of Engineers also has regulations dealing with this section in 33 CFR..., Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  2. 33 CFR 207.260 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 207.260 Section 207.260... REGULATIONS § 207.260 Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher... canal at any stage from the mouth of the Yazoo Diversion Canal where it enters into the...

  3. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., or wave action. Note: The Corps of Engineers also has regulations dealing with this section in 33 CFR..., Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  4. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., or wave action. Note: The Corps of Engineers also has regulations dealing with this section in 33 CFR..., Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  5. 33 CFR 162.85 - Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., or wave action. Note: The Corps of Engineers also has regulations dealing with this section in 33 CFR..., Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited....

  6. The not-so-benign Miller Fisher syndrome: a variant of the Guillain-Barré syndrome.

    PubMed

    Blau, I; Casson, I; Lieberman, A; Weiss, E

    1980-06-01

    Two patients with Fisher's syndrome of ophthalmoplegia, ataxia, and areflexia experienced severe weakness and respiratory distress. Both patients required tracheostomy and assisted ventilation, but both made a complete recovery. Fisher's syndrome is generally considered to be a benign variant of acute infectious polyneuropathy (the Guillain-Barré syndrome). Our two patients demonstrate that the condition is not always benign. PMID:7387472

  7. Resummed memory kernels in generalized system-bath master equations

    SciTech Connect

    Mavros, Michael G.; Van Voorhis, Troy

    2014-08-07

    Generalized master equations provide a concise formalism for studying reduced population dynamics. Usually, these master equations require a perturbative expansion of the memory kernels governing the dynamics; in order to prevent divergences, these expansions must be resummed. Resummation techniques of perturbation series are ubiquitous in physics, but they have not been readily studied for the time-dependent memory kernels used in generalized master equations. In this paper, we present a comparison of different resummation techniques for such memory kernels up to fourth order. We study specifically the spin-boson Hamiltonian as a model system-bath Hamiltonian, treating the diabatic coupling between the two states as a perturbation. A novel derivation of the fourth-order memory kernel for the spin-boson problem is presented; then, the second- and fourth-order kernels are evaluated numerically for a variety of spin-boson parameter regimes. We find that resumming the kernels through fourth order using a Padé approximant results in divergent populations in the strong electronic coupling regime due to a singularity introduced by the nature of the resummation, and thus recommend a non-divergent exponential resummation (the “Landau-Zener resummation” of previous work). The inclusion of fourth-order effects in a Landau-Zener-resummed kernel is shown to improve both the dephasing rate and the obedience of detailed balance over simpler prescriptions like the non-interacting blip approximation, showing a relatively quick convergence on the exact answer. The results suggest that including higher-order contributions to the memory kernel of a generalized master equation and performing an appropriate resummation can provide a numerically-exact solution to system-bath dynamics for a general spectral density, opening the way to a new class of methods for treating system-bath dynamics.
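
    The contrast between the two resummations can be seen on a generic truncated series. The sketch below resums a toy fourth-order expansion with a [1/1] Padé-type form and with an exponential form; both reproduce the series through fourth order, but the Padé form has a pole where its denominator vanishes, which is the kind of divergence the abstract describes. The coefficients are arbitrary toy numbers, not the spin-boson kernels of the paper.

```python
# Generic illustration of resumming a truncated perturbation series
#   K(lambda) ~ k2*lambda**2 + k4*lambda**4
# with a [1/1]-type Pade form versus an exponential form. Both agree with the
# series through fourth order, but the Pade form diverges at its pole. The
# coefficients are arbitrary toy numbers, not spin-boson memory kernels.
import numpy as np

def pade_resum(k2, k4, lam):
    # k2*l^2 / (1 - (k4/k2)*l^2) = k2*l^2 + k4*l^4 + O(l^6), pole at l^2 = k2/k4
    return k2 * lam**2 / (1.0 - (k4 / k2) * lam**2)

def exp_resum(k2, k4, lam):
    # k2*l^2 * exp((k4/k2)*l^2) = k2*l^2 + k4*l^4 + O(l^6), no poles
    return k2 * lam**2 * np.exp((k4 / k2) * lam**2)

k2, k4 = 1.0, 0.4          # toy second- and fourth-order coefficients
for lam in (0.5, 1.0, 1.5, float(np.sqrt(k2 / k4))):  # last value sits near the Pade pole
    with np.errstate(divide="ignore"):
        print(f"lambda={lam:5.3f}  pade={pade_resum(k2, k4, lam):12.3f}  "
              f"exp={exp_resum(k2, k4, lam):8.3f}")
```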

  8. The Weighted Super Bergman Kernels Over the Supermatrix Spaces

    NASA Astrophysics Data System (ADS)

    Feng, Zhiming

    2015-12-01

    The purpose of this paper is threefold. Firstly, using Howe duality for , we obtain integral formulas of the super Schur functions with respect to the super standard Gaussian distributions. Secondly, we give explicit expressions of the super Szegö kernels and the weighted super Bergman kernels for the Cartan superdomains of type I. Thirdly, combining these results, we obtain duality relations of integrals over the unitary groups and the Cartan superdomains, and the marginal distributions of the weighted measure.

  9. Kernel approximation for solving few-body integral equations

    NASA Astrophysics Data System (ADS)

    Christie, I.; Eyre, D.

    1986-06-01

    This paper investigates an approximate method for solving integral equations that arise in few-body problems. The method is to replace the kernel by a degenerate kernel defined on a finite dimensional subspace of piecewise Lagrange polynomials. Numerical accuracy of the method is tested by solving the two-body Lippmann-Schwinger equation with non-separable potentials, and the three-body Amado-Lovelace equation with separable two-body potentials.
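
    The degenerate-kernel idea is easy to state for a one-dimensional Fredholm equation of the second kind: replace the kernel by an interpolant built from a finite Lagrange basis, which turns the integral equation into a small linear system. The sketch below does this with piecewise-linear ("hat") functions and simple trapezoid quadrature; it illustrates the general technique only, not the momentum-space few-body equations treated in the paper.

```python
# Minimal sketch of the degenerate-kernel method for a 1-D Fredholm equation of
# the second kind, u(x) = f(x) + \int_0^1 K(x,y) u(y) dy. The kernel is replaced
# by K(x,y) ~ sum_i L_i(x) K(x_i, y) with piecewise-linear Lagrange ("hat")
# functions L_i, reducing the problem to a small linear system for the
# coefficients c_i = \int K(x_i,y) u(y) dy. Illustration only.
import numpy as np

def solve_degenerate(K, f, nodes, n_quad=400):
    yq = np.linspace(0.0, 1.0, n_quad)                      # quadrature grid on [0, 1]
    w = np.full(n_quad, yq[1] - yq[0]); w[[0, -1]] *= 0.5    # trapezoid weights
    m = len(nodes)
    L = np.array([np.interp(yq, nodes, np.eye(m)[j]) for j in range(m)])  # hat basis on yq
    Kxy = K(nodes[:, None], yq[None, :])                     # K(x_i, y_q)
    b = (Kxy * f(yq)) @ w                                    # b_i = int K(x_i,y) f(y) dy
    A = np.einsum('iq,jq,q->ij', Kxy, L, w)                  # A_ij = int K(x_i,y) L_j(y) dy
    c = np.linalg.solve(np.eye(m) - A, b)                    # (I - A) c = b
    basis = lambda x: np.array([np.interp(x, nodes, np.eye(m)[i]) for i in range(m)])
    return lambda x: f(x) + basis(x).T @ c                   # u(x) ~ f(x) + sum_i c_i L_i(x)

if __name__ == "__main__":
    # Test problem with known solution u(x) = x for K(x,y) = x*y and f(x) = 2x/3.
    K = lambda x, y: x * y
    f = lambda x: 2.0 * x / 3.0
    u = solve_degenerate(K, f, nodes=np.linspace(0, 1, 9))
    x = np.linspace(0, 1, 5)
    print("numerical:", np.round(u(x), 4), " exact:", x)
```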

  10. Drugs, discrimination and disability.

    PubMed

    Gibson, Frances

    2009-12-01

    Whether addiction to prohibited drugs should be classified as a disability for the purposes of disability discrimination is a controversial question in Australia. The leading Australian case of Marsden v Human Rights Equal Opportunity Commission & Coffs Harbour & District Ex-Servicemen & Women's Memorial Club Ltd (HREOC, No H98/51, 30 August 1999); [2000] FCA 1619 concerned a disability discrimination complaint brought by Mr Marsden as a result of his treatment by the club. The case was brought as a public interest test case by the New South Wales Legal Aid Commission. Mr Marsden was on a methadone program at the time. The reasoning of the decision at the Federal Court opened the way for a finding that dependence on illegal drugs constituted a disability under disability discrimination legislation. The media reaction to the court's decision led to State and federal governments proposing legislation limiting legal protection from discrimination for people addicted to illegal drugs on the basis of their drug use. While the proposed federal legislation lapsed after objections from a coalition of medical, legal and other advocacy groups, the New South Wales legislation still provides that, in employment matters, it is not unlawful to discriminate against a person on the ground of disability if the disability relates to the person's addiction to a prohibited drug and the person is actually addicted to a prohibited drug at the time of the discrimination. The article details the sequence of events in the Marsden case, reflects on the role of public interest litigation in achieving social justice outcomes and suggests that Australia's recent ratification of the Convention on the Rights of Persons with Disabilities on 17 July 2008 should encourage legislators to review legislation which may have a discriminatory effect on people suffering from addictions. PMID:20169800

  11. The Role of Fisher Information Theory in the Development of Fundamental Laws in Physical Chemistry

    ERIC Educational Resources Information Center

    Honig, J. M.

    2009-01-01

    The unifying principle that involves rendering the Fisher information measure an extremum is reviewed. It is shown that with this principle, in conjunction with appropriate constraints, a large number of fundamental laws can be derived from a common source in a unified manner. The resulting economy of thought pertaining to fundamental principles…

  12. 75 FR 11939 - Fisher & Paykel Appliances, Inc., Huntington Beach, CA; Notice of Termination of Investigation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration Fisher & Paykel Appliances, Inc., Huntington Beach, CA; Notice of Termination of Investigation Pursuant to Section 221 of the Trade Act of 1974, as amended, an...

  13. Conservation of the Eastern Taiwan Strait Chinese White Dolphin (Sousa chinensis): Fishers' Perspectives and Management Implications

    PubMed Central

    Liu, Ta-Kang; Wang, Yu-Cheng; Chuang, Laurence Zsu-Hsin; Chen, Chih-How

    2016-01-01

    The abundance of the eastern Taiwan Strait (ETS) population of the Chinese white dolphin (Sousa chinensis) has been estimated to be less than 100 individuals. It is categorized as critically endangered in the IUCN Red List of Threatened Species. Thus, immediate conservation measures should be taken to protect it from extinction. Currently, the Taiwanese government plans to designate its habitat as a Major Wildlife Habitat (MWH), a type of marine protected area (MPA) for the conservation of wildlife species. Although the designation allows current exploitation to continue, it may cause conflicts among multiple stakeholders with competing interests. This study explores the attitudes and opinions of the stakeholders in order to better manage the MPA. It employs a semi-structured interview and a questionnaire survey of local fishers. Results from the interviews indicated that the subsistence of fishers remains a major problem. It was found that stakeholders have different perceptions of the fishers' attitude towards conservation and also thought that fishery-related law enforcement could be difficult. The quantitative survey showed that fishers are generally positive towards the conservation of the Chinese white dolphin but are less willing to participate in the planning process. Most fishers considered a temporary fishing closure feasible for conservation. The results of this study provide recommendations for future efforts towards the goal of better conservation for this endangered species. PMID:27526102

  14. 38 CFR 60.10 - Eligibility criteria for Fisher House or other temporary lodging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... from the VA health care facility without an overnight stay. (e) Special authority for organ transplant... lodging for individuals who must be present on site for evaluation, donation, and care related to their status as an organ donor for a veteran. VA may also provide Fisher House or other temporary lodging...

  15. 38 CFR 60.10 - Eligibility criteria for Fisher House or other temporary lodging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... from the VA health care facility without an overnight stay. (e) Special authority for organ transplant... lodging for individuals who must be present on site for evaluation, donation, and care related to their status as an organ donor for a veteran. VA may also provide Fisher House or other temporary lodging...

  16. Heisenberg-like and Fisher-information-based uncertainty relations for N-electron d-dimensional systems

    NASA Astrophysics Data System (ADS)

    Toranzo, I. V.; López-Rosa, S.; Esquivel, R. O.; Dehesa, J. S.

    2015-06-01

    Heisenberg-like and Fisher-information-based uncertainty relations which extend and generalize previous similar expressions are obtained for N-fermion d-dimensional systems. The contributions of both spatial and spin degrees of freedom are taken into account. The accuracy of some of these generalized spinned uncertainty-like relations is numerically examined for a large number of atomic and molecular systems.

  17. Quantifying the Short-Term Costs of Conservation Interventions for Fishers at Lake Alaotra, Madagascar

    PubMed Central

    Wallace, Andrea P. C.; Milner-Gulland, E. J.; Jones, Julia P. G.; Bunnefeld, Nils; Young, Richard; Nicholson, Emily

    2015-01-01

    Artisanal fisheries are a key source of food and income for millions of people, but if poorly managed, fishing can have declining returns as well as impacts on biodiversity. Management interventions such as spatial and temporal closures can improve fishery sustainability and reduce environmental degradation, but may carry substantial short-term costs for fishers. The Lake Alaotra wetland in Madagascar supports a commercially important artisanal fishery and provides habitat for a Critically Endangered primate and other endemic wildlife of conservation importance. Drawing on detailed data from more than 1,600 fisher catches, we used linear mixed effects models to explore and quantify relationships between catch weight, effort, and spatial and temporal restrictions, to identify drivers of fisher behaviour and quantify the potential effect of fishing restrictions on catch. We found that restricted-area interventions and fishery closures would generate direct short-term costs through reduced catch and income, and that these costs vary between groups of fishers using different gear. Our results show that conservation interventions can have uneven impacts on local people with different fishing strategies. This information can be used to formulate management strategies that minimise the adverse impacts of interventions, increase local support and compliance, and therefore maximise conservation effectiveness. PMID:26107284
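
    A model of the kind described, catch weight regressed on effort and a restriction indicator with a random effect per fisher, can be written down in a few lines with statsmodels. The column names and the simulated data below are hypothetical placeholders, not the Lake Alaotra dataset.

```python
# Minimal sketch of a linear mixed-effects model of the kind described in the
# abstract: log catch weight as a function of effort and a closure indicator,
# with a random intercept per fisher. Column names and simulated data are
# hypothetical placeholders, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_fishers, n_trips = 40, 30
fisher_id = np.repeat(np.arange(n_fishers), n_trips)
effort_hours = rng.gamma(shape=4.0, scale=1.5, size=fisher_id.size)
closure = rng.integers(0, 2, size=fisher_id.size)            # 1 = trip during a restriction
fisher_effect = rng.normal(0, 0.3, n_fishers)[fisher_id]      # between-fisher variability
log_catch_kg = (0.5 + 0.4 * np.log(effort_hours) - 0.3 * closure
                + fisher_effect + rng.normal(0, 0.4, fisher_id.size))

df = pd.DataFrame(dict(fisher_id=fisher_id, effort_hours=effort_hours,
                       closure=closure, log_catch_kg=log_catch_kg))

# Fixed effects for effort and closure; random intercept for each fisher.
model = smf.mixedlm("log_catch_kg ~ np.log(effort_hours) + closure",
                    data=df, groups=df["fisher_id"])
print(model.fit().summary())
```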

  18. Using a genetic network to parameterize a landscape resistance surface for fishers, Martes pennanti.

    PubMed

    Garroway, Colin J; Bowman, Jeff; Wilson, Paul J

    2011-10-01

    Knowledge of dispersal-related gene flow is important for addressing many basic and applied questions in ecology and evolution. We used landscape genetics to understand the recovery of a recently expanded population of fishers (Martes pennanti) in Ontario, Canada. An important focus of landscape genetics is modelling the effects of landscape features on gene flow. Most often resistance surfaces in landscape genetic studies are built a priori based upon nongenetic field data or expert opinion. The resistance surface that best fits genetic data is then selected and interpreted. Given inherent biases in using expert opinion or movement data to model gene flow, we sought an alternative approach. We used estimates of conditional genetic distance derived from a network of genetic connectivity to parameterize landscape resistance and build a final resistance surface based upon information-theoretic model selection and multi-model averaging. We sampled 657 fishers from 31 landscapes, genotyped them at 16 microsatellite loci, and modelled the effects of snow depth, road density, river density, and coniferous forest on gene flow. Our final model suggested that road density, river density, and snow depth impeded gene flow during the fisher population expansion demonstrating that both human impacts and seasonal habitat variation affect gene flow for fishers. Our approach to building landscape genetic resistance surfaces mitigates many of the problems and caveats associated with using either nongenetic field data or expert opinion to derive resistance surfaces. PMID:21883589

  19. A COMPARISON OF MERCURY IN MINK AND FISHER IN RHODE ISLAND

    EPA Science Inventory

    Comparison of total mercury concentrations and nitrogen and carbon stable isotope values in muscle tissue and stomach contents of mink (Mustela vison) and fisher (Martes pennanti) from Rhode Island in 2000- 2003 showed results which appeared to reflect dietary differences betwee...

  20. Balancing Liberty and Equality: Justice Kennedy's Decisive Vote in "Fisher v. University of Texas," Part II

    ERIC Educational Resources Information Center

    Garces, Liliana M.

    2015-01-01

    For the second time in three years, the Supreme Court is reviewing the constitutionality of a race-conscious admissions policy at the University of Texas, Austin. While the case, "Fisher v. University of Texas," raises questions specific to UT Austin, the Court's second review could change the ways higher education institutions across…

  1. Disentangling Similarity Judgments from Pragmatic Judgments: Response to Sloutsky and Fisher (2012)

    ERIC Educational Resources Information Center

    Noles, Nicholaus S.; Gelman, Susan A.

    2012-01-01

    Sloutsky and Fisher (2012) attempt to reframe the results presented in Noles and Gelman (2012) as a pure replication of their original work validating the similarity, induction, naming, and categorization (SINC) model. However, their critique fails to engage with the central findings reported in Noles and Gelman, and their reanalysis fails to…

  2. 76 FR 18151 - Kootenai National Forest, Lincoln County, MT; Miller West Fisher Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... ] West Fisher Project. The Project includes timber harvest, fuels reduction, precommercial thinning...: Vegetation treatments: Timber harvest and associated fuel treatment on 1,898 acres, including intermediate.... These activities would contribute approximately 8.2 million board feet (MMBF) of timber products to...

  3. Book review: Biology and conservation of martens, sables, and fishers: A new synthesis

    USGS Publications Warehouse

    Jenkins, Kurt J.

    2013-01-01

    Review info: Biology and conservation of martens, sables, and fishers: A new synthesis. Edited by K.B. Aubry, W.J. Zielinski, M.G. Raphael, G. Proulx, and S.W. Buskirk, 2012. ISBN: 978-08014, 580pp.

  4. Existence of travelling wave solutions for a Fisher-Kolmogorov system with biomedical applications

    NASA Astrophysics Data System (ADS)

    Belmonte-Beitia, Juan

    2016-07-01

    We consider a Fisher-Kolmogorov system with applications in oncology (Pérez-García et al., 2015). Of interest is the question of the existence of travelling front solutions of the system. When the speed of the travelling wave is sufficiently large, the existence of such fronts is shown using geometric singular perturbation theory.
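
    For orientation, the classical scalar Fisher-Kolmogorov (Fisher-KPP) prototype and its well-known minimal front speed are recalled below; the paper itself treats a system of coupled equations and establishes existence with geometric singular perturbation theory rather than the scalar phase-plane argument.

```latex
% Scalar Fisher--Kolmogorov (Fisher--KPP) prototype, shown for orientation only;
% the paper studies a system of such equations arising in oncology.
\[
  u_t = D\,u_{xx} + \rho\,u(1-u), \qquad
  u(x,t) = U(\xi), \quad \xi = x - ct
  \;\Longrightarrow\;
  D\,U'' + c\,U' + \rho\,U(1-U) = 0 .
\]
% Monotone travelling fronts connecting u = 1 to u = 0 exist exactly for
\[
  c \;\ge\; c^{*} = 2\sqrt{\rho D}.
\]
```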

  5. Revisited Fisher's equation in a new outlook: A fractional derivative approach

    NASA Astrophysics Data System (ADS)

    Alquran, Marwan; Al-Khaled, Kamel; Sardar, Tridip; Chattopadhyay, Joydev

    2015-11-01

    The well-known Fisher equation with a fractional derivative is considered to provide some characteristics of memory embedded into the system. The modified model is analyzed both analytically and numerically. A comparatively new technique, the residual power series method, is used for finding approximate solutions of the modified Fisher model. A new technique combining Sinc-collocation and the finite difference method is used for the numerical study. The abundance of the bird species Phalacrocorax carbo is considered as a test bed to validate the model outcome using estimated parameters. We conjecture that the non-diffusive and diffusive fractional Fisher equations represent the same dynamics in the interval of the memory index α ∈ (0.8384, 0.9986). We also observe that when the value of the memory index is close to zero, the solutions bifurcate and produce a wave-like pattern. We conclude that the survivability of the species increases for a long-range memory index. These findings are similar to Fisher's observation and act in a similar fashion to advantageous genes.

  6. Enzymatic treatment of peanut kernels to reduce allergen levels.

    PubMed

    Yu, Jianmei; Ahmedna, Mohamed; Goktepe, Ipek; Cheng, Hsiaopo; Maleki, Soheila

    2011-08-01

    This study investigated the use of enzymatic treatment to reduce peanut allergens in peanut kernels as affected by processing conditions. Two major peanut allergens, Ara h 1 and Ara h 2, were used as indicators of process effectiveness. Enzymatic treatment effectively reduced Ara h 1 and Ara h 2 in roasted peanut kernels by up to 100% under optimal conditions. For instance, treatment of roasted peanut kernels with α-chymotrypsin and trypsin for 1-3 h significantly increased the solubility of peanut protein while reducing Ara h 1 and Ara h 2 in peanut kernel extracts by 100% and 98%, respectively, based on ELISA readings. Ara h 1 and Ara h 2 levels in peanut protein extracts were inversely correlated with protein solubility in roasted peanuts. Blanching of kernels enhanced the effectiveness of enzyme treatment in roasted peanuts but not in raw peanuts. The optimal enzyme concentration was determined by response surface analysis to be in the range of 0.1-0.2%. No consistent results were obtained for raw peanut kernels, since Ara h 1 and Ara h 2 increased in peanut protein extracts under some treatment conditions and decreased in others. PMID:25214091

  7. An Ensemble Approach to Building Mercer Kernels with Prior Information

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2005-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite, dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn the kernel function directly from data, rather than using pre-defined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing physical information to be encoded in the model. Specifically, we demonstrate the use of the algorithm in situations with extremely small samples of data. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS) and demonstrate the method's superior performance against standard methods. The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code.
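
    The flavour of the approach can be conveyed with an illustrative ensemble kernel: fit several Gaussian mixture models and define the kernel between two points as the averaged inner product of their posterior (responsibility) vectors, which is a valid Mercer kernel. The sketch below is only in the spirit of the method; the paper's mixture models are generated by AUTOBAYES and encode domain priors that are not reproduced here.

```python
# Illustrative ensemble kernel in the spirit of Mixture Density Mercer Kernels:
# several Gaussian mixture models are fit with different seeds/component counts,
# and the kernel between two points is the averaged inner product of their
# posterior (responsibility) vectors. Each term is a valid Mercer kernel, so the
# average is too. Sketch of the idea only; the paper's Bayesian priors and
# AUTOBAYES-generated models are not reproduced here.
import numpy as np
from sklearn.mixture import GaussianMixture

def mixture_density_kernel(X, n_models=5, components=(2, 3, 4, 5, 6), seed=0):
    rng = np.random.default_rng(seed)
    R = []
    for m in range(n_models):
        gmm = GaussianMixture(n_components=components[m % len(components)],
                              random_state=int(rng.integers(1_000_000)))
        gmm.fit(X)
        R.append(gmm.predict_proba(X))        # responsibilities, shape (n, k_m)
    # K = (1/M) sum_m R_m R_m^T: points assigned to the same components get high similarity.
    return sum(r @ r.T for r in R) / len(R)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    X = np.vstack([rng.normal(-2, 0.4, (40, 2)), rng.normal(2, 0.4, (40, 2))])
    K = mixture_density_kernel(X)
    print("kernel shape:", K.shape,
          " within-cluster mean:", round(K[:40, :40].mean(), 3),
          " between-cluster mean:", round(K[:40, 40:].mean(), 3))
```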

  8. ODVBA: optimally-discriminative voxel-based analysis.

    PubMed

    Zhang, Tianhao; Davatzikos, Christos

    2011-08-01

    Gaussian smoothing of images prior to applying voxel-based statistics is an important step in voxel-based analysis and statistical parametric mapping (VBA-SPM) and is used to account for registration errors, to Gaussianize the data and to integrate imaging signals from a region around each voxel. However, it has also become a limitation of VBA-SPM based methods, since it is often chosen empirically and lacks spatial adaptivity to the shape and spatial extent of the region of interest, such as a region of atrophy or functional activity. In this paper, we propose a new framework, named optimally-discriminative voxel-based analysis (ODVBA), for determining the optimal spatially adaptive smoothing of images, followed by applying voxel-based group analysis. In ODVBA, nonnegative discriminative projection is applied regionally to get the direction that best discriminates between two groups, e.g., patients and controls; this direction is equivalent to local filtering by an optimal kernel whose coefficients define the optimally discriminative direction. By considering all the neighborhoods that contain a given voxel, we then compose this information to produce the statistic for each voxel. Finally, permutation tests are used to obtain a statistical parametric map of group differences. ODVBA has been evaluated using simulated data in which the ground truth is known and with data from an Alzheimer's disease (AD) study. The experimental results have shown that the proposed ODVBA can precisely describe the shape and location of structural abnormality.

  9. Quantum discriminant analysis for dimensionality reduction and classification

    NASA Astrophysics Data System (ADS)

    Cong, Iris; Duan, Luming

    2016-07-01

    We present quantum algorithms to efficiently perform discriminant analysis for dimensionality reduction and classification over an exponentially large input data set. Compared with the best-known classical algorithms, the quantum algorithms show an exponential speedup in both the number of training vectors M and the feature space dimension N. We generalize the previous quantum algorithm for solving systems of linear equations (2009 Phys. Rev. Lett. 103 150502) to efficiently implement a Hermitian chain product of k trace-normalized N × N Hermitian positive-semidefinite matrices with time complexity O(log N). Using this result, we perform linear as well as nonlinear Fisher discriminant analysis for dimensionality reduction over M vectors, each in an N-dimensional feature space, in time O(p polylog(MN)/ε³), where ε denotes the tolerance error, and p is the number of principal projection directions desired. We also present a quantum discriminant analysis algorithm for data classification with time complexity O(log(MN)/ε³).

  10. Direct Kernel Perceptron (DKP): ultra-fast kernel ELM-based classification with non-iterative closed-form weight calculation.

    PubMed

    Fernández-Delgado, Manuel; Cernadas, Eva; Barro, Senén; Ribeiro, Jorge; Neves, José

    2014-02-01

    The Direct Kernel Perceptron (DKP) (Fernández-Delgado et al., 2010) is a very simple and fast kernel-based classifier, related to the Support Vector Machine (SVM) and to the Extreme Learning Machine (ELM) (Huang, Wang, & Lan, 2011), whose α-coefficients are calculated directly, without any iterative training, using an analytical closed-form expression which involves only the training patterns. The DKP, which is inspired by the Direct Parallel Perceptron (Auer et al., 2008), uses a Gaussian kernel and a linear classifier (perceptron). The weight vector of this classifier in the feature space minimizes an error measure which combines the training error and the hyperplane margin, without any tunable regularization parameter. This weight vector can be translated, using a variable change, to the α-coefficients, and both are determined without iterative calculations. We calculate solutions using several error functions, achieving the best trade-off between accuracy and efficiency with the linear function. These solutions for the α-coefficients can be considered alternatives to the ELM with a new physical meaning in terms of error and margin: in fact, the linear and quadratic DKP are special cases of the two-class ELM when the regularization parameter C takes the values C=0 and C=∞. The linear DKP is extremely efficient and much faster (over a vast collection of 42 benchmark and real-life data sets) than 12 very popular and accurate classifiers including SVM, Multi-Layer Perceptron, Adaboost, Random Forest and Bagging of RPART decision trees, Linear Discriminant Analysis, K-Nearest Neighbors, ELM, Probabilistic Neural Networks, Radial Basis Function neural networks and Generalized ART. Besides, despite its simplicity and extreme efficiency, DKP achieves higher accuracies than 7 out of 12 classifiers, exhibiting small differences with respect to the best ones (SVM, ELM, Adaboost and Random Forest), which are much slower. Thus, the DKP provides an easy and fast way

  12. Time-frequency optimization for discrimination between imagination of right and left hand movements based on two bipolar electroencephalography channels

    NASA Astrophysics Data System (ADS)

    Yang, Yuan; Chevallier, Sylvain; Wiart, Joe; Bloch, Isabelle

    2014-12-01

    To enforce a widespread use of efficient and easy to use brain-computer interfaces (BCIs), the inter-subject robustness should be increased and the number of electrodes should be reduced. These two key issues are addressed in this contribution, proposing a novel method to identify subject-specific time-frequency characteristics with a minimal number of electrodes. In this method, two alternative criteria, the time-frequency discrimination factor (TFDF) and the F score, are proposed to evaluate the discriminative power of time-frequency regions. Distinct from classical measures (e.g., Fisher criterion, r² coefficient), the TFDF is based on the neurophysiologic phenomena on which the motor imagery BCI paradigm relies, rather than only on statistics. The F score is based on the popular Fisher's discriminant and is purely data driven; however, it differs from traditional measures since it provides a simple and effective measure for quantifying the discriminative power of a multi-dimensional feature vector. The proposed method is tested on BCI competition IV datasets IIa and IIb for discriminating right and left hand motor imagery. Compared to state-of-the-art methods, our method based on both criteria led to comparable or even better classification results, while using fewer electrodes (i.e., only two bipolar channels, C3 and C4). This work indicates that time-frequency optimization can not only improve the classification performance but also contribute to reducing the number of electrodes required in motor imagery BCIs.
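
    A Fisher-discriminant-style score over a grid of time-frequency cells, in the spirit of the criteria described, can be computed as follows. The band definitions, the short-time FFT power estimate, and the toy two-class data are placeholders; the paper's TFDF and F score are defined differently and operate on subject-specific features.

```python
# Minimal sketch of scoring time-frequency cells by a Fisher-discriminant-style
# ratio for two classes of single-channel trials. The band grid and the simple
# short-time FFT power estimate are placeholders; the paper's TFDF and F score
# criteria are defined differently.
import numpy as np

def tf_power(trials, fs, win=0.5, step=0.25,
             bands=((8, 12), (12, 16), (16, 24), (24, 30))):
    """trials: (n_trials, n_samples) signals -> (n_trials, n_windows, n_bands) band power."""
    n_win, n_step = int(win * fs), int(step * fs)
    freqs = np.fft.rfftfreq(n_win, 1.0 / fs)
    out = []
    for s in range(0, trials.shape[1] - n_win + 1, n_step):
        spec = np.abs(np.fft.rfft(trials[:, s:s + n_win], axis=1)) ** 2
        out.append([spec[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in bands])
    return np.transpose(np.array(out), (2, 0, 1))   # (trials, windows, bands)

def fisher_score_map(P_a, P_b):
    """Fisher-style score per (window, band) cell: (m1 - m2)^2 / (v1 + v2)."""
    m1, m2 = P_a.mean(axis=0), P_b.mean(axis=0)
    v1, v2 = P_a.var(axis=0), P_b.var(axis=0)
    return (m1 - m2) ** 2 / (v1 + v2 + 1e-12)

if __name__ == "__main__":
    fs, rng = 128, np.random.default_rng(7)
    t = np.arange(0, 3, 1 / fs)
    # Toy trials: class A has stronger 10 Hz activity late in the trial.
    a = rng.normal(0, 1, (40, t.size)) + 1.5 * np.sin(2 * np.pi * 10 * t) * (t > 1.5)
    b = rng.normal(0, 1, (40, t.size))
    score = fisher_score_map(tf_power(a, fs), tf_power(b, fs))
    w, k = np.unravel_index(np.argmax(score), score.shape)
    print(f"most discriminative cell: window {w}, band index {k}, score {score[w, k]:.2f}")
```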

  13. Fishing for space: fine-scale multi-sector maritime activities influence fisher location choice.

    PubMed

    Tidd, Alex N; Vermard, Youen; Marchal, Paul; Pinnegar, John; Blanchard, Julia L; Milner-Gulland, E J

    2015-01-01

    The European Union and other states are moving towards Ecosystem Based Fisheries Management to balance food production and security with wider ecosystem concerns. Fishing is only one of several sectors operating within the ocean environment, competing for renewable and non-renewable resources that overlap in a limited space. Other sectors include marine mining, energy generation, recreation, transport and conservation. Trade-offs among these competing sectors are already part of the process, but attempts to detail how the seas are being utilised have been primarily based on compilations of data on human activity at large spatial scales. Advances including satellite and shipping automatic tracking enable investigation of the factors influencing fishers' choice of fishing grounds at spatial scales relevant to decision-making, including the presence or avoidance of activities by other sectors. We analyse the determinants of English and Welsh scallop-dredging fleet behaviour, including competing sectors, operating in the eastern English Channel. Results indicate that aggregate mining activity, maritime traffic, increased fishing costs, and the English inshore 6 and French 12 nautical mile limits negatively impact fishers' likelihood of fishing in otherwise suitable areas. Past success, net benefits and fishing within the 12 NM limit predispose fishers to use areas. Systematic conservation planning has yet to be widely applied in marine systems, and the dynamics of spatial overlap of fishing with other activities have not been studied at scales relevant to fisher decision-making. This study demonstrates that fisher decision-making is indeed affected by the real-time presence of other sectors in an area, and therefore reveals trade-offs that need to be accounted for in marine planning. As marine resource extraction demands intensify, governments will need to take a more proactive approach to resolving these trade-offs, and studies such as this will be required as the evidential foundation for future

  14. Bickerstaff brainstem encephalitis and Fisher syndrome: anti-GQ1b antibody syndrome.

    PubMed

    Shahrizaila, Nortina; Yuki, Nobuhiro

    2013-05-01

    In the 1950s, Bickerstaff and Fisher independently described cases with a unique presentation of ophthalmoplegia and ataxia. The neurological features were typically preceded by an antecedent infection and the majority of patients made a spontaneous recovery. In the cases with Bickerstaff brainstem encephalitis, there was associated altered consciousness and in some, hyperreflexia, in support of a central pathology whereas in Fisher syndrome, patients were areflexic in keeping with a peripheral aetiology. However, both authors recognised certain similarities to Guillain-Barré syndrome such as the presence of peripheral neuropathy and cerebrospinal fluid albuminocytological dissociation. The discovery of immunoglobulin G anti-GQ1b antibodies in patients with Fisher syndrome and later in Bickerstaff brainstem encephalitis was crucial in providing the necessary evidence to conclude that both conditions were in fact part of the same spectrum of disease by virtue of their common clinical and immunological profiles. Following this, other neurological presentations that share anti-GQ1b antibodies emerged in the literature. These include acute ophthalmoparesis and acute ataxic neuropathy, which represent the less extensive spectrum of the disease whereas pharyngeal-cervical-brachial weakness and Fisher syndrome overlap with Guillain-Barré syndrome represent the more extensive end of the spectrum. The conditions can be referred to as the 'anti-GQ1b antibody syndrome'. In this review, we look back at the historical descriptions and describe how our understanding of Fisher syndrome and Bickerstaff brainstem encephalitis has evolved from their initial descriptions more than half a century ago. PMID:22984203

  15. The influence of fisher knowledge on the susceptibility of reef fish aggregations to fishing.

    PubMed

    Robinson, Jan; Cinner, Joshua E; Graham, Nicholas A J

    2014-01-01

    Reef fishes that exhibit predictable aggregating behaviour are often considered vulnerable to overexploitation. However, fisher knowledge of this behaviour is often heterogeneous and, coupled with socioeconomic factors that constrain demand for or access to aggregated fish, will influence susceptibility to fishing. At two case study locations in Papua New Guinea, Ahus and Karkar islands, we conducted interview-based surveys to examine how local context influenced heterogeneity in knowledge of fish aggregations. We then explored the role of fisher knowledge in conferring susceptibility to fishing relative to socioeconomic drivers of fishing effort. Local heterogeneity in knowledge of aggregating behaviour differed between our case studies. At Ahus, variable access rights among fishers and genders to the main habitats were sources of heterogeneity in knowledge. By contrast, knowledge was more homogenous at Karkar and the sole source of variation was gear type. Differences between locations in the susceptibility of aggregations to fishing depended primarily on socioeconomic drivers of fishing effort rather than catchability. While Ahus fishers were knowledgeable of fish aggregations and used more selective gears, Karkar fishers were less constrained by tenure in their access to aggregation habitat. However, fishing effort was greater at Ahus and likely related to high dependency on fishing, greater access to provincial capital markets than Karkar and a weakening of customary management. Moreover, highly efficient fishing techniques have emerged at Ahus to exploit the non-reproductive aggregating behaviour of target species. Understanding how knowledge is structured within fishing communities and its relation to socioeconomic drivers of fishing effort is important if customary practices for conservation, such as tambu areas, are to be supported. The findings of this study call for a holistic approach to assessing the risks posed to reef fish aggregations by fishing

  16. Discrimination Learning in Children

    ERIC Educational Resources Information Center

    Ochocki, Thomas E.; And Others

    1975-01-01

    Examined the learning performance of 192 fourth-, fifth-, and sixth-grade children on either a two or four choice simultaneous color discrimination task. Compared the use of verbal reinforcement and/or punishment, under conditions of either complete or incomplete instructions. (Author/SDH)

  17. Reversing Discrimination: A Perspective

    ERIC Educational Resources Information Center

    Pati, Gopal; Reilly, Charles W.

    1977-01-01

    Examines the debate over affirmative action and reverse discrimination, and discusses how and why the present dilemma has developed. Suggests that organizations can best address the problem through an honest, in-depth analysis of their organizational structure and management practices. (JG)

  18. Airborne particulate discriminator

    DOEpatents

    Creek, Kathryn Louise; Castro, Alonso; Gray, Perry Clayton

    2009-08-11

    A method and apparatus for rapid and accurate detection and discrimination of biological, radiological, and chemical particles in air. A suspect aerosol of the target particulates is treated with a taggant aerosol of ultrafine particulates. Coagulation of the taggant and target particles causes a change in fluorescent properties of the cloud, providing an indication of the presence of the target.

  19. Airborne Fraunhofer Line Discriminator

    NASA Technical Reports Server (NTRS)

    Gabriel, F. C.; Markle, D. A.

    1969-01-01

    Airborne Fraunhofer Line Discriminator enables prospecting for fluorescent materials, hydrography with fluorescent dyes, and plant studies based on fluorescence of chlorophyll. Optical unit design is the coincidence of Fraunhofer lines in the solar spectrum occurring at the characteristic wavelengths of some fluorescent materials.

  20. Sex Discrimination in Coaching.

    ERIC Educational Resources Information Center

    Dessem, Lawrence

    1980-01-01

    Even in situations in which the underpayment of girls' coaches is due to the sex of the students coached rather than to the sex of the coaches, the coaches and the girls coached are victims of unlawful discrimination. Available from Harvard Women's Law Journal, Harvard Law School, Cambridge, MA 02138. (Author/IRT)