Science.gov

Sample records for acid representation based

  1. XML-BASED REPRESENTATION

    SciTech Connect

    R. KELSEY

    2001-02-01

    For focused applications with limited user and use application communities, XML can be the right choice for representation. It is easy to use, maintain, and extend and enjoys wide support in commercial and research sectors. When the knowledge and information to be represented is object-based and use of that knowledge and information is a high priority, then XML-based representation should be considered. This paper discusses some of the issues involved in using XML-based representation and presents an example application that successfully uses an XML-based representation.

  2. DNA binding protein identification by combining pseudo amino acid composition and profile-based protein representation

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Wang, Shanyi; Wang, Xiaolong

    2015-10-01

    DNA-binding proteins play an important role in most cellular processes. Therefore, it is necessary to develop an efficient predictor for identifying DNA-binding proteins based only on protein sequence information. The bottleneck in constructing a useful predictor is finding suitable features that capture the characteristics of DNA-binding proteins. We applied PseAAC to DNA-binding protein identification, and PseAAC was further improved by incorporating evolutionary information through a profile-based protein representation. Finally, combined with Support Vector Machines (SVMs), a predictor called iDNAPro-PseAAC was proposed. Experimental results on an updated benchmark dataset showed that iDNAPro-PseAAC outperformed some state-of-the-art approaches, and it achieved stable performance on an independent dataset. By using an ensemble learning approach to incorporate more negative samples (non-DNA-binding proteins) in the training process, the performance of iDNAPro-PseAAC was further improved. The web server of iDNAPro-PseAAC is available at http://bioinformatics.hitsz.edu.cn/iDNAPro-PseAAC/.
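
    A minimal sketch of a pseudo amino acid composition (PseAAC) feature vector of the kind described above, assuming a single physicochemical property per residue. The property values, restricted alphabet, weight `w`, and tier count `lam` are illustrative assumptions, not the paper's exact configuration:

```python
# PseAAC sketch: amino acid frequencies plus sequence-order correlation
# factors computed from one residue property, jointly normalized.
# Property values below are placeholders, not an official published scale.
PROP = {"A": 1.8, "C": 2.5, "D": -3.5, "E": -3.5, "G": -0.4,
        "K": -3.9, "L": 3.8, "S": -0.8, "T": -0.7, "V": 4.2}
AA = sorted(PROP)  # restricted alphabet for this sketch

def pse_aac(seq, lam=3, w=0.05):
    """Return len(AA) frequency components plus `lam` correlation factors."""
    n = len(seq)
    freqs = [seq.count(a) / n for a in AA]
    thetas = []
    for j in range(1, lam + 1):
        # j-tier correlation: mean squared property difference at lag j
        t = sum((PROP[seq[k]] - PROP[seq[k + j]]) ** 2
                for k in range(n - j)) / (n - j)
        thetas.append(t)
    denom = 1.0 + w * sum(thetas)  # freqs already sum to 1
    return [f / denom for f in freqs] + [w * t / denom for t in thetas]
```

By construction the components sum to one, so the frequency and sequence-order parts stay on a common scale before being fed to a classifier such as an SVM.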

  3. Identification of Protein-Protein Interactions via a Novel Matrix-Based Sequence Representation Model with Amino Acid Contact Information.

    PubMed

    Ding, Yijie; Tang, Jijun; Guo, Fei

    2016-09-24

    Identification of protein-protein interactions (PPIs) is a difficult and important problem in biology. Since experimental methods for predicting PPIs are both expensive and time-consuming, many computational methods have been developed to predict PPIs and interaction networks, which can be used to complement experimental approaches. However, these methods have limitations to overcome: they require a large number of homologous proteins or literature mentions to be applicable. In this paper, we propose a novel matrix-based protein sequence representation approach to predict PPIs, using an ensemble learning method for classification. We construct the matrix of Amino Acid Contact (AAC), based on the statistical analysis of residue-pairing frequencies in a database of 6323 protein-protein complexes. We first represent the protein sequence as a Substitution Matrix Representation (SMR) matrix. Then, the feature vector is extracted by applying the Histogram of Oriented Gradient (HOG) and Singular Value Decomposition (SVD) algorithms to the SMR matrix. Finally, we feed the feature vector into a Random Forest (RF) to distinguish interacting pairs from non-interacting pairs. Our method is applied to several PPI datasets to evaluate its performance. On the S. cerevisiae dataset, our method achieves 94.83% accuracy and 92.40% sensitivity; compared with existing methods, the accuracy of our method is increased by 0.11 percentage points. On the H. pylori dataset, our method achieves 89.06% accuracy and 88.15% sensitivity; the accuracy of our method is increased by 0.76%. On the Human PPI dataset, our method achieves 97.60% accuracy and 96.37% sensitivity, and the accuracy of our method is increased by 1.30%. In addition, we test our method on a very important PPI network, and it achieves 92.71% accuracy. In the Wnt-related network, the accuracy of our method is increased by 16.67%.
The source code and all datasets are available

  4. Identification of Protein–Protein Interactions via a Novel Matrix-Based Sequence Representation Model with Amino Acid Contact Information

    PubMed Central

    Ding, Yijie; Tang, Jijun; Guo, Fei

    2016-01-01

    Identification of protein–protein interactions (PPIs) is a difficult and important problem in biology. Since experimental methods for predicting PPIs are both expensive and time-consuming, many computational methods have been developed to predict PPIs and interaction networks, which can be used to complement experimental approaches. However, these methods have limitations to overcome: they require a large number of homologous proteins or literature mentions to be applicable. In this paper, we propose a novel matrix-based protein sequence representation approach to predict PPIs, using an ensemble learning method for classification. We construct the matrix of Amino Acid Contact (AAC), based on the statistical analysis of residue-pairing frequencies in a database of 6323 protein–protein complexes. We first represent the protein sequence as a Substitution Matrix Representation (SMR) matrix. Then, the feature vector is extracted by applying the Histogram of Oriented Gradient (HOG) and Singular Value Decomposition (SVD) algorithms to the SMR matrix. Finally, we feed the feature vector into a Random Forest (RF) to distinguish interacting pairs from non-interacting pairs. Our method is applied to several PPI datasets to evaluate its performance. On the S. cerevisiae dataset, our method achieves 94.83% accuracy and 92.40% sensitivity; compared with existing methods, the accuracy of our method is increased by 0.11 percentage points. On the H. pylori dataset, our method achieves 89.06% accuracy and 88.15% sensitivity; the accuracy of our method is increased by 0.76%. On the Human PPI dataset, our method achieves 97.60% accuracy and 96.37% sensitivity, and the accuracy of our method is increased by 1.30%. In addition, we test our method on a very important PPI network, and it achieves 92.71% accuracy. In the Wnt-related network, the accuracy of our method is increased by 16.67%. The source code and all datasets are available at https://figshare.com/s/580c11dce13e63cb9a53.
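
    A simplified sketch of the representation pipeline described above. The substitution matrix here is a random symmetric stand-in (an assumption, not the paper's BLOSUM-derived scores), and only the SVD branch of the feature extraction is shown; the HOG branch and the Random Forest classifier are omitted:

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids
rng = np.random.default_rng(0)
# Symmetric toy substitution matrix; a real SMR would use BLOSUM/PSSM scores.
S = rng.integers(-4, 5, size=(20, 20))
S = (S + S.T) // 2

def smr(seq):
    """Substitution Matrix Representation: one row of 20 scores per residue."""
    return np.array([S[AA.index(a)] for a in seq], dtype=float)

def svd_features(seq, k=5):
    """Top-k singular values of the SMR matrix as a fixed-length feature."""
    sv = np.linalg.svd(smr(seq), compute_uv=False)
    out = np.zeros(k)
    out[: min(k, sv.size)] = sv[:k]
    return out
```

Because the feature length is fixed at `k`, sequences of different lengths map to comparable vectors that a classifier can consume directly.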

  5. Object-based representations of spatial images

    NASA Astrophysics Data System (ADS)

    Newsam, Shawn; Bhagavathy, Sitaram; Kenney, Charles; Manjunath, B. S.; Fonseca, Leila

    2001-03-01

    Object-based representations of image data enable new content-related functionalities while facilitating the management of large image databases. Developing such representations for multi-date and multi-spectral images is one of the objectives of the second phase of the Alexandria Digital Library (ADL) project at UCSB. Image segmentation and image registration are two of the main issues that are to be addressed in creating localized image representations. We present in this paper some of the recent and current work by the ADL's image processing group on robust image segmentation, registration, and the use of image texture for content representation. Built upon these technologies are techniques for managing large repositories of data. A texture thesaurus assists in creating a semantic classification of image regions. An object-based representation is proposed to facilitate data storage, retrieval, analysis, and navigation.

  6. On volume-source representations based on the representation theorem

    NASA Astrophysics Data System (ADS)

    Ichihara, Mie; Kusakabe, Tetsuya; Kame, Nobuki; Kumagai, Hiroyuki

    2016-01-01

    We discuss different ways to characterize a moment tensor associated with an actual volume change ΔV_C, which has been represented in terms of either the stress glut or the corresponding stress-free volume change ΔV_T. Eshelby's virtual operation provides a conceptual model relating ΔV_C to ΔV_T and the stress glut, where non-elastic processes such as phase transitions allow ΔV_T to be introduced and subsequent elastic deformation of -ΔV_T is assumed to produce the stress glut. While it is true that ΔV_T correctly represents the moment tensor of an actual volume source with volume change ΔV_C, an explanation as to why such an operation relating ΔV_C to ΔV_T exists has not previously been given. This study presents a comprehensive explanation of the relationship between ΔV_C and ΔV_T based on the representation theorem. The displacement field is represented using Green's function, which consists of two integrals over the source surface: one for displacement and the other for traction. Both integrals are necessary for representing volumetric sources, whereas the representation of seismic faults includes only the first term, as the second integral over the two adjacent fault surfaces, across which the traction balances, always vanishes. Therefore, in a seismological framework, the contribution from the second term should be included as an additional surface displacement. We show that the seismic moment tensor of a volume source is directly obtained from the actual state of the displacement and stress at the source without considering any virtual non-elastic operations. A purely mathematical procedure based on the representation theorem enables us to specify the additional imaginary displacement necessary for representing a volume source only by the displacement term, which links ΔV_C to ΔV_T. It also specifies the additional imaginary stress necessary for representing a moment tensor solely by the traction term, which gives the "stress glut." The

  7. Group representations, error bases and quantum codes

    SciTech Connect

    Knill, E

    1996-01-01

    This report continues the discussion of unitary error bases and quantum codes. Nice error bases are characterized in terms of the existence of certain characters in a group. A general construction for error bases which are non-abelian over the center is given. The method for obtaining codes due to Calderbank et al. is generalized and expressed purely in representation theoretic terms. The significance of the inertia subgroup both for constructing codes and obtaining the set of transversally implementable operations is demonstrated.

  8. Fingerprint Compression Based on Sparse Representation.

    PubMed

    Shao, Guangqi; Wu, Yanping; A, Yong; Liu, Xiao; Guo, Tiande

    2014-02-01

    A new fingerprint compression algorithm based on sparse representation is introduced. Obtaining an overcomplete dictionary from a set of fingerprint patches allows us to represent them as a sparse linear combination of dictionary atoms. In the algorithm, we first construct a dictionary for predefined fingerprint image patches. For a newly given fingerprint image, its patches are represented over the dictionary by computing an l0-minimization, and the representation is then quantized and encoded. In this paper, we consider the effect of various factors on compression results. Three groups of fingerprint images are tested. The experiments demonstrate that our algorithm is efficient compared with several competing compression techniques (JPEG, JPEG 2000, and WSQ), especially at high compression ratios. The experiments also illustrate that the proposed algorithm is robust for minutiae extraction.
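
    Exact l0-minimization is intractable, so sparse coders of this kind typically approximate it greedily. A minimal Orthogonal Matching Pursuit sketch (assuming a dictionary with unit-norm columns; not the paper's exact solver):

```python
import numpy as np

def omp(D, y, k):
    """Greedy sparse coding: pick the atom most correlated with the
    residual, refit coefficients on the support by least squares, repeat.
    D: (n, m) dictionary with unit-norm columns; y: signal; k: sparsity."""
    support = []
    x = np.zeros(D.shape[1])
    r = y.astype(float).copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ r)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef
    x[support] = coef
    return x
```

Quantizing and entropy-coding the few nonzero entries of `x` (plus their indices) is what makes the representation compressible.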

  9. Prostate segmentation by sparse representation based classification

    PubMed Central

    Gao, Yaozong; Liao, Shu; Shen, Dinggang

    2012-01-01

    Purpose: The segmentation of prostate in CT images is of essential importance to external beam radiotherapy, which is one of the major treatments for prostate cancer nowadays. During the radiotherapy, the prostate is radiated by high-energy x rays from different directions. In order to maximize the dose to the cancer and minimize the dose to the surrounding healthy tissues (e.g., bladder and rectum), the prostate in the new treatment image needs to be accurately localized. Therefore, the effectiveness and efficiency of external beam radiotherapy highly depend on the accurate localization of the prostate. However, due to the low contrast of the prostate with its surrounding tissues (e.g., bladder), the unpredicted prostate motion, and the large appearance variations across different treatment days, it is challenging to segment the prostate in CT images. In this paper, the authors present a novel classification based segmentation method to address these problems. Methods: To segment the prostate, the proposed method first uses sparse representation based classification (SRC) to enhance the prostate in CT images by pixel-wise classification, in order to overcome the limitation of poor contrast of the prostate images. Then, based on the classification results, previous segmented prostates of the same patient are used as patient-specific atlases to align onto the current treatment image and the majority voting strategy is finally adopted to segment the prostate. In order to address the limitations of the traditional SRC in pixel-wise classification, especially for the purpose of segmentation, the authors extend SRC from the following four aspects: (1) A discriminant subdictionary learning method is proposed to learn a discriminant and compact representation of training samples for each class so that the discriminant power of SRC can be increased and also SRC can be applied to the large-scale pixel-wise classification. 
(2) The L1 regularized sparse coding is replaced by
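
    The core SRC step the abstract builds on can be sketched compactly: sparse-code a test sample over the columns of a training dictionary (here via ISTA for the l1 relaxation, a common stand-in for the paper's solver), then assign the class whose atoms reconstruct the sample with the smallest residual. The dictionary, labels, and parameters below are toy assumptions:

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam=0.01, iters=500):
    """Iterative soft-thresholding for min ||Ax - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x

def src_predict(A, labels, y, lam=0.01):
    """Sparse Representation based Classification: minimal class residual."""
    x = ista(A, y, lam)
    classes = sorted(set(labels))
    residuals = []
    for c in classes:
        mask = np.array([lab == c for lab in labels])
        xc = np.where(mask, x, 0.0)  # keep only class-c coefficients
        residuals.append(np.linalg.norm(y - A @ xc))
    return classes[int(np.argmin(residuals))]
```

In the segmentation setting above this decision is made per pixel, which is exactly why the authors need the compact subdictionaries they describe.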

  10. Exploring the Query Expansion Methods for Concept Based Representation

    DTIC Science & Technology

    2014-11-01

    Exploring the Query Expansion Methods for Concept Based Representation Yue Wang and Hui Fang Department of Electrical and Computer Engineering...physicians find relevant medical cases for patients they are dealing with. Concept based representation has been shown to be effective in biomedical...in this paper, we explored two external resources to perform query expansion for the basic concept based representation method, and discussed the

  11. Weighted Discriminative Dictionary Learning based on Low-rank Representation

    NASA Astrophysics Data System (ADS)

    Chang, Heyou; Zheng, Hao

    2017-01-01

    Low-rank representation has been widely used in the field of pattern classification, especially when both training and testing images are corrupted with large noise. The dictionary plays an important role in low-rank representation. With respect to a semantic dictionary, the optimal representation matrix should be block-diagonal. However, traditional low-rank-representation-based dictionary learning methods cannot effectively exploit the discriminative information between the data and the dictionary. To address this problem, this paper proposes weighted discriminative dictionary learning based on low-rank representation, in which a weighted representation regularization term is constructed. The regularization term associates the label information of both training samples and dictionary atoms, and encourages a discriminative representation with a class-wise block-diagonal structure, which can further improve classification performance when both training and testing images are corrupted with large noise. Experimental results demonstrate the advantages of the proposed method over state-of-the-art methods.

  12. A shape representation for computer vision based on differential topology.

    PubMed

    Blicher, A P

    1995-01-01

    We describe a shape representation for use in computer vision, after a brief review of shape representation and object recognition in general. Our shape representation is based on graph structures derived from level sets whose characteristics are understood from differential topology, particularly singularity theory. This leads to a representation which is both stable and whose changes under deformation are simple. The latter allows smoothing in the representation domain ('symbolic smoothing'), which in turn can be used for coarse-to-fine strategies, or as a discrete analog of scale space. Essentially the same representation applies to an object embedded in 3-dimensional space as to one in the plane, and likewise for a 3D object and its silhouette. We suggest how this can be used for recognition.

  13. Improved Separability Criteria Based on Bloch Representation of Density Matrices

    PubMed Central

    Shen, Shu-Qian; Yu, Juan; Li, Ming; Fei, Shao-Ming

    2016-01-01

    The correlation matrices or tensors in the Bloch representation of density matrices encode entanglement properties. In this paper, based on the Bloch representation of density matrices, we give some new separability criteria for bipartite and multipartite quantum states. Theoretical analysis and some examples show that the proposed criteria can be more efficient than the previous related criteria. PMID:27350031
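
    To illustrate the family of criteria involved: for two qubits, the Bloch correlation matrix is T_ij = Tr[ρ(σ_i ⊗ σ_j)], and one widely used separability criterion of this type bounds its trace norm (sum of singular values) by 1 for separable states. A sketch checking a Bell state against this bound (the specific improved criteria of the paper are not reproduced here):

```python
import numpy as np

def correlation_trace_norm(rho):
    """Trace norm of the two-qubit Bloch correlation matrix
    T_ij = Tr[rho (sigma_i ⊗ sigma_j)]."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    paulis = [sx, sy, sz]
    T = np.array([[np.trace(rho @ np.kron(a, b)).real for b in paulis]
                  for a in paulis])
    return np.linalg.svd(T, compute_uv=False).sum()

# Bell state |Phi+> = (|00> + |11>)/sqrt(2): maximally entangled
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_bell = np.outer(phi, phi.conj())
```

The Bell state gives a trace norm of 3, well above the separable bound of 1, so the criterion detects its entanglement; a product state such as |00⟩ saturates the bound exactly.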

  14. Conditional Covariance-based Representation of Multidimensional Test Structure.

    ERIC Educational Resources Information Center

    Bolt, Daniel M.

    2001-01-01

    Presents a new nonparametric method for constructing a spatial representation of multidimensional test structure, the Conditional Covariance-based SCALing (CCSCAL) method. Describes an index to measure the accuracy of the representation. Uses simulation and real-life data analyses to show that the method provides a suitable approximation to…

  15. Coronavirus phylogeny based on triplets of nucleic acids bases

    NASA Astrophysics Data System (ADS)

    Liao, Bo; Liu, Yanshu; Li, Renfa; Zhu, Wen

    2006-04-01

    We considered the fully overlapping triplets of nucleotide bases and proposed a 2D graphical representation of protein sequences consisting of 20 amino acids and a stop codon. Based on this 2D graphical representation, we outlined a new approach to analyzing the phylogenetic relationships of coronaviruses by constructing a covariance matrix. The evolutionary distances are obtained by measuring the differences between the two-dimensional curves.
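
    The general recipe behind such 2D graphical representations can be sketched as follows (the paper's specific coordinate assignment is not reproduced; assigning each of the 21 symbols a fixed unit step direction is an illustrative choice):

```python
import math

SYMBOLS = "ACDEFGHIKLMNPQRSTVWY*"  # 20 amino acids plus a stop symbol

def curve_2d(seq):
    """Map a sequence to a 2D curve: each symbol contributes a unit step
    in a direction fixed by its index; the curve is the running sum."""
    pts = [(0.0, 0.0)]
    for ch in seq:
        ang = 2 * math.pi * SYMBOLS.index(ch) / len(SYMBOLS)
        x, y = pts[-1]
        pts.append((x + math.cos(ang), y + math.sin(ang)))
    return pts
```

Two sequences can then be compared through summary statistics of their curves (e.g., the covariance matrices of the point sets), which is the basis of the distance measure the abstract describes.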

  16. An object-based methodology for knowledge representation in SGML

    SciTech Connect

    Kelsey, R.L.; Hartley, R.T.; Webster, R.B.

    1997-11-01

    An object-based methodology for knowledge representation and its Standard Generalized Markup Language (SGML) implementation is presented. The methodology includes class, perspective domain, and event constructs for representing knowledge within an object paradigm. The perspective construct allows for representation of knowledge from multiple and varying viewpoints. The event construct allows actual use of knowledge to be represented. The SGML implementation of the methodology facilitates usability, structured, yet flexible knowledge design, and sharing and reuse of knowledge class libraries.

  17. Supervised Filter Learning for Representation Based Face Recognition

    PubMed Central

    Bi, Chao; Zhang, Lei; Qi, Miao; Zheng, Caixia; Yi, Yugen; Wang, Jianzhong; Zhang, Baoxue

    2016-01-01

    Representation-based classification methods, such as Sparse Representation Classification (SRC) and Linear Regression Classification (LRC), have been successfully developed for the face recognition problem. However, most of these methods use the original face images without any preprocessing for recognition. Thus, their performance may be affected by problematic factors (such as illumination and expression variations) in the face images. In order to overcome this limitation, a novel supervised filter learning algorithm is proposed for representation-based face recognition in this paper. The underlying idea of our algorithm is to learn a filter so that the within-class representation residuals of the faces' Local Binary Pattern (LBP) features are minimized and the between-class representation residuals of the faces' LBP features are maximized. Therefore, the LBP features of filtered face images are more discriminative for representation-based classifiers. Furthermore, we also extend our algorithm to the heterogeneous face recognition problem. Extensive experiments are carried out on five databases and the experimental results verify the efficacy of the proposed algorithm. PMID:27416030

  18. Sparse coding based feature representation method for remote sensing images

    NASA Astrophysics Data System (ADS)

    Oguslu, Ender

    In this dissertation, we study a sparse coding based feature representation method for the classification of multispectral and hyperspectral images (HSI). Existing feature representation systems based on the sparse signal model are computationally expensive, as they require solving a convex optimization problem to learn a dictionary. A sparse coding feature representation framework for the classification of HSI is presented that alleviates the complexity of sparse coding through sub-band construction, dictionary learning, and encoding steps. In the framework, we construct the dictionary based upon the extracted sub-bands from the spectral representation of a pixel. In the encoding step, we utilize a soft threshold function to obtain sparse feature representations for HSI. Experimental results showed that a randomly selected dictionary could be as effective as a dictionary learned from optimization. The new representation usually has a very high dimensionality, requiring substantial computational resources. In addition, the spatial information of the HSI data has not been included in the representation. Thus, we modify the framework by incorporating the spatial information of the HSI pixels and reducing the dimension of the new sparse representations. The enhanced model, called sparse coding based dense feature representation (SC-DFR), is integrated with a linear support vector machine (SVM) and a composite kernels SVM (CKSVM) classifier to discriminate different types of land cover. We evaluated the proposed algorithm on three well-known HSI datasets and compared our method to four recently developed classification methods: SVM, CKSVM, simultaneous orthogonal matching pursuit (SOMP), and image fusion and recursive filtering (IFRF). The results from the experiments showed that the proposed method can achieve better overall and average classification accuracies with a much more compact representation, leading to more efficient sparse models for HSI classification.
To further
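
    The soft-threshold encoding step mentioned above replaces a full sparse optimization with a single shrinkage operation. A minimal sketch (the dictionary `D` and threshold `t` are assumptions):

```python
import numpy as np

def soft_threshold(v, t):
    """Shrink coefficients toward zero by t; small ones become exactly 0."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def encode(D, x, t=0.5):
    """Sparse feature for a pixel spectrum x: thresholded correlations
    with the columns (atoms) of dictionary D."""
    return soft_threshold(D.T @ x, t)
```

This is why a randomly selected dictionary can work at all: the discriminative power comes largely from the nonlinearity of the shrinkage rather than from the optimized atoms.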

  19. A path-oriented matrix-based knowledge representation system

    NASA Technical Reports Server (NTRS)

    Feyock, Stefan; Karamouzis, Stamos T.

    1993-01-01

    Experience has shown that designing a good representation is often the key to turning hard problems into simple ones. Most AI (Artificial Intelligence) search/representation techniques are oriented toward an infinite domain of objects and arbitrary relations among them. In reality much of what needs to be represented in AI can be expressed using a finite domain and unary or binary predicates. Well-known vector- and matrix-based representations can efficiently represent finite domains and unary/binary predicates, and allow effective extraction of path information by generalized transitive closure/path matrix computations. In order to avoid space limitations a set of abstract sparse matrix data types was developed along with a set of operations on them. This representation forms the basis of an intelligent information system for representing and manipulating relational data.
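
    The path-extraction step described above, generalized transitive closure over a binary relation stored as a matrix, can be sketched with Warshall's algorithm (dense lists stand in for the record's sparse matrix types):

```python
def transitive_closure(adj):
    """Warshall's algorithm: adj is an n x n 0/1 adjacency matrix for a
    binary predicate; the result marks every pair connected by some path."""
    n = len(adj)
    r = [row[:] for row in adj]  # copy so the input is untouched
    for k in range(n):
        for i in range(n):
            if r[i][k]:
                for j in range(n):
                    if r[k][j]:
                        r[i][j] = 1
    return r
```

Swapping the boolean and/or for other semiring operations (min/+, max/min, and so on) yields the "generalized" path-matrix computations the record refers to.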

  20. The DTW-based representation space for seismic pattern classification

    NASA Astrophysics Data System (ADS)

    Orozco-Alzate, Mauricio; Castro-Cabrera, Paola Alexandra; Bicego, Manuele; Londoño-Bonilla, John Makario

    2015-12-01

    Distinguishing among the different seismic volcanic patterns is still one of the most important and labor-intensive tasks for volcano monitoring. This task could be lightened and made free from subjective bias by using automatic classification techniques. In this context, a core but often overlooked issue is the choice of an appropriate representation of the data to be classified. Recently, it has been suggested that using a relative representation (i.e. proximities, namely dissimilarities on pairs of objects) instead of an absolute one (i.e. features, namely measurements on single objects) is advantageous to exploit the relational information contained in the dissimilarities to derive highly discriminant vector spaces, where any classifier can be used. According to that motivation, this paper investigates the suitability of a dynamic time warping (DTW) dissimilarity-based vector representation for the classification of seismic patterns. Results show the usefulness of such a representation in the seismic pattern classification scenario, including analyses of potential benefits from recent advances in the dissimilarity-based paradigm such as the proper selection of representation sets and the combination of different dissimilarity representations that might be available for the same data.
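
    The dissimilarity at the heart of this representation is dynamic time warping; a minimal sketch for 1D seismic traces (quadratic-time dynamic program, no windowing constraints):

```python
def dtw(a, b):
    """Dynamic time warping distance between two 1D sequences:
    minimal cumulative |a_i - b_j| cost over monotone alignments."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

A dissimilarity-based vector representation is then built by computing `dtw` from each signal to a fixed representation set of prototypes, turning every signal into an ordinary feature vector.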

  1. Sparse representations for online-learning-based hyperspectral image compression.

    PubMed

    Ülkü, İrem; Töreyin, Behçet Uğur

    2015-10-10

    Sparse models provide data representations in the fewest possible number of nonzero elements. This inherent characteristic enables sparse models to be utilized for data compression purposes. Hyperspectral data is large in size. In this paper, a framework for sparsity-based hyperspectral image compression methods using online learning is proposed. There are various sparse optimization models. A comparative analysis of sparse representations in terms of their hyperspectral image compression performance is presented. For this purpose, online-learning-based hyperspectral image compression methods are proposed using four different sparse representations. Results indicate that, independent of the sparsity models, online-learning-based hyperspectral data compression schemes yield the best compression performances for data rates of 0.1 and 0.3 bits per sample, compared to other state-of-the-art hyperspectral data compression techniques, in terms of image quality measured as average peak signal-to-noise ratio.
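
    The quality metric cited above, average peak signal-to-noise ratio, is computed per band from the mean squared reconstruction error; a minimal sketch (the peak value is an assumption depending on the data's bit depth):

```python
import math

def psnr(orig, recon, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for a perfect match."""
    mse = sum((o - r) ** 2 for o, r in zip(orig, recon)) / len(orig)
    return float("inf") if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)
```

Averaging this value over all spectral bands gives the figure the abstract uses to compare compression schemes at fixed bit rates.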

  2. Medical Image Fusion Based on Feature Extraction and Sparse Representation

    PubMed Central

    Wei, Gao; Zongxi, Song

    2017-01-01

    As a novel multiscale geometric analysis tool, sparse representation has shown many advantages over conventional image representation methods. However, standard sparse representation does not take intrinsic structure and its time complexity into consideration. In this paper, a new fusion mechanism for multimodal medical images based on sparse representation and a decision map is proposed to deal with these problems simultaneously. Three decision maps are designed, including a structure information map (SM) and an energy information map (EM), as well as a structure and energy map (SEM), to make the results preserve more energy and edge information. SM contains the local structure feature captured by the Laplacian of a Gaussian (LOG), and EM contains the energy and energy distribution feature detected by the mean square deviation. The decision map is added to the normal sparse representation based method to improve the speed of the algorithm. The proposed approach also improves the quality of the fused results by enhancing the contrast and preserving more structure and energy information from the source images. The experimental results on 36 groups of CT/MR, MR-T1/MR-T2, and CT/PET images demonstrate that the method based on SR and SEM outperforms five state-of-the-art methods. PMID:28321246
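
    The Laplacian-of-Gaussian filter behind the structure information map can be sketched as a kernel construction; the kernel size and sigma below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def log_kernel(size=9, sigma=1.5):
    """Discrete Laplacian-of-Gaussian kernel: strong response at edges
    and blobs, (near-)zero response on flat regions."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx ** 2 + yy ** 2
    g = np.exp(-r2 / (2 * sigma ** 2))
    k = (r2 - 2 * sigma ** 2) / sigma ** 4 * g
    return k - k.mean()  # enforce exact zero mean
```

A structure map of the kind described would then take the magnitude of each source image convolved with this kernel and let the larger response win per location.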

  3. Frame-based knowledge representation for processing planning

    NASA Astrophysics Data System (ADS)

    Lindsay, K. J.

    An Expert System is being developed to perform generative process planning for individual parts fabricated from extruded and sheet metal materials, and for bonded metal assemblies. The system employs a frame-based knowledge representation structure and production rules to generate detailed fabrication and processing instructions. The system is being developed using the InterLISP-D language, commercially available expert system development software and a dedicated LISP machine. The paper describes the knowledge-based representation and reasoning techniques applied within the system and pertinent development issues.

  4. Incorporating Feature-Based Annotations into Automatically Generated Knowledge Representations

    NASA Astrophysics Data System (ADS)

    Lumb, L. I.; Lederman, J. I.; Aldridge, K. D.

    2006-12-01

    Earth Science Markup Language (ESML) is efficient and effective in representing scientific data in an XML-based formalism. However, features of the data being represented are not accounted for in ESML. Such features might derive from events (e.g., a gap in data collection due to instrument servicing), identifications (e.g., a scientifically interesting area/volume in an image), or some other source. In order to account for features in an ESML context, we consider them from the perspective of annotation, i.e., the addition of information to existing documents without changing the originals. Although it is possible to extend ESML to incorporate feature-based annotations internally (e.g., by extending the XML schema for ESML), there are a number of complicating factors that we identify. Rather than pursuing the ESML-extension approach, we focus on an external representation for feature-based annotations via XML Pointer Language (XPointer). In previous work (Lumb & Aldridge, HPCS 2006, IEEE, doi:10.1109/HPCS.2006.26), we have shown that it is possible to extract relationships from ESML-based representations, and capture the results in the Resource Description Format (RDF). Thus we explore and report on this same requirement for XPointer-based annotations of ESML representations. As in our past efforts, the Global Geodynamics Project (GGP) allows us to illustrate with a real-world example this approach for introducing annotations into automatically generated knowledge representations.

  5. Toward a brain-based componential semantic representation.

    PubMed

    Binder, Jeffrey R; Conant, Lisa L; Humphries, Colin J; Fernandino, Leonardo; Simons, Stephen B; Aguilar, Mario; Desai, Rutvik H

    2016-01-01

    Componential theories of lexical semantics assume that concepts can be represented by sets of features or attributes that are in some sense primitive or basic components of meaning. The binary features used in classical category and prototype theories are problematic in that these features are themselves complex concepts, leaving open the question of what constitutes a primitive feature. The present availability of brain imaging tools has enhanced interest in how concepts are represented in brains, and accumulating evidence supports the claim that these representations are at least partly "embodied" in the perception, action, and other modal neural systems through which concepts are experienced. In this study we explore the possibility of devising a componential model of semantic representation based entirely on such functional divisions in the human brain. We propose a basic set of approximately 65 experiential attributes based on neurobiological considerations, comprising sensory, motor, spatial, temporal, affective, social, and cognitive experiences. We provide normative data on the salience of each attribute for a large set of English nouns, verbs, and adjectives, and show how these attribute vectors distinguish a priori conceptual categories and capture semantic similarity. Robust quantitative differences between concrete object categories were observed across a large number of attribute dimensions. A within- versus between-category similarity metric showed much greater separation between categories than representations derived from distributional (latent semantic) analysis of text. Cluster analyses were used to explore the similarity structure in the data independent of a priori labels, revealing several novel category distinctions. We discuss how such a representation might deal with various longstanding problems in semantic theory, such as feature selection and weighting, representation of abstract concepts, effects of context on semantic retrieval, and
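
    Semantic similarity over attribute-salience vectors of the kind described above reduces to standard vector comparisons; a minimal sketch with made-up attribute dimensions and salience values (the model's actual ~65 attributes and norms are not reproduced):

```python
import math

def cosine(u, v):
    """Cosine similarity between two attribute-salience vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Toy salience vectors over (vision, motor, audition, emotion) -- illustrative
dog = [0.9, 0.6, 0.8, 0.7]
cat = [0.9, 0.5, 0.7, 0.7]
truth = [0.1, 0.0, 0.1, 0.4]
```

Concepts from the same category (dog, cat) land close together while an abstract concept (truth) sits far away, which is the within- versus between-category separation the study measures.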

  6. Argumentation-Based Collaborative Inquiry in Science Through Representational Work: Impact on Primary Students' Representational Fluency

    NASA Astrophysics Data System (ADS)

    Nichols, Kim; Gillies, Robyn; Hedberg, John

    2016-06-01

    This study explored the impact of argumentation-promoting collaborative inquiry and representational work in science on primary students' representational fluency. Two hundred sixty-six year 6 students received instruction on natural disasters with a focus on collaborative inquiry. Students in the Comparison condition received only this instruction. Students in the Explanation condition were also instructed with a focus on explanations using representations. Students in the Argumentation condition received similar instruction to the Comparison and Explanation conditions but were also instructed with a focus on argumentation using representations. Conceptual understanding and representational competencies (interpreting, explaining and constructing representations) were measured prior to and immediately following the instruction. A small-group collaborative representational task was video recorded at the end of the instruction and coded for modes of knowledge-building discourse: knowledge-sharing and knowledge-construction. Higher measures of conceptual understanding, representational competencies and knowledge-construction discourse were taken together as representational fluency. Students in all conditions showed significant improvement in conceptual understanding, interpreting representations and explaining representations. Students in the Comparison and Argumentation conditions also showed significantly improved scores in constructing representations. When compared to the other conditions, the Explanation group had the highest scores in conceptual understanding and also interpreting and explaining representations. While the Argumentation group had the highest scores for constructing representations, their scores for conceptual understanding as well as interpreting and explaining representations were also high. There was no difference between the groups in knowledge-sharing discourse; however, the Argumentation group displayed the highest incidence of knowledge

  7. Visual Tracking Based on Extreme Learning Machine and Sparse Representation

    PubMed Central

    Wang, Baoxian; Tang, Linbo; Yang, Jinglin; Zhao, Baojun; Wang, Shuigen

    2015-01-01

    The existing sparse representation-based visual trackers mostly suffer from being time consuming and lacking robustness. To address these issues, a novel tracking method is presented that combines sparse representation with an emerging learning technique, namely the extreme learning machine (ELM). Specifically, visual tracking is divided into two consecutive processes. First, ELM is utilized to find the optimal separating hyperplane between the target observations and background ones. The trained ELM classification function is thus able to efficiently remove most of the candidate samples related to background content, thereby reducing the total computational cost of the following sparse representation. Second, to further combine ELM and sparse representation, the resulting confidence values (i.e., probabilities of being the target) of samples under the ELM classification function are used to construct a new manifold learning constraint term in the sparse representation framework, which tends to achieve more robust results. Moreover, the accelerated proximal gradient method is used to derive the optimal solution (in matrix form) of the constrained sparse tracking model. Additionally, the matrix-form solution allows the candidate samples to be calculated in parallel, thereby leading to higher efficiency. Experiments demonstrate the effectiveness of the proposed tracker. PMID:26506359
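    The two-stage design described in this abstract can be sketched in a few lines. The ELM below is a generic random-hidden-layer implementation on synthetic data; the paper's actual features, sparse coding step, and manifold constraint are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1 (sketch): an ELM binary classifier separating target vs background.
# Hidden weights are random and fixed; output weights have a closed-form
# least-squares solution, which is what makes ELM training fast.
def elm_train(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)            # random feature map
    beta = np.linalg.pinv(H) @ y      # closed-form output weights
    return W, b, beta

def elm_score(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy data: "target" patches cluster around +1, "background" around -1.
X_target = rng.normal(1.0, 0.3, size=(40, 8))
X_bg = rng.normal(-1.0, 0.3, size=(40, 8))
X = np.vstack([X_target, X_bg])
y = np.r_[np.ones(40), -np.ones(40)]
model = elm_train(X, y)

# Stage 2 (sketch): keep only high-confidence candidates for the (more
# expensive) sparse-representation step, mirroring the paper's pre-filtering.
candidates = rng.normal(0.0, 1.5, size=(200, 8))
scores = elm_score(model, candidates)
kept = candidates[scores > 0.0]
print(f"{len(kept)} of {len(candidates)} candidates passed to sparse coding")
```

    The sparse coding with the manifold constraint would then run only on `kept`, which is where the computational saving comes from.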

  8. Visual tracking based on extreme learning machine and sparse representation.

    PubMed

    Wang, Baoxian; Tang, Linbo; Yang, Jinglin; Zhao, Baojun; Wang, Shuigen

    2015-10-22

    The existing sparse representation-based visual trackers mostly suffer from being time consuming and lacking robustness. To address these issues, a novel tracking method is presented that combines sparse representation with an emerging learning technique, namely the extreme learning machine (ELM). Specifically, visual tracking is divided into two consecutive processes. First, ELM is utilized to find the optimal separating hyperplane between the target observations and background ones. The trained ELM classification function is thus able to efficiently remove most of the candidate samples related to background content, thereby reducing the total computational cost of the following sparse representation. Second, to further combine ELM and sparse representation, the resulting confidence values (i.e., probabilities of being the target) of samples under the ELM classification function are used to construct a new manifold learning constraint term in the sparse representation framework, which tends to achieve more robust results. Moreover, the accelerated proximal gradient method is used to derive the optimal solution (in matrix form) of the constrained sparse tracking model. Additionally, the matrix-form solution allows the candidate samples to be calculated in parallel, thereby leading to higher efficiency. Experiments demonstrate the effectiveness of the proposed tracker.

  9. Optimized Color Filter Arrays for Sparse Representation Based Demosaicking.

    PubMed

    Li, Jia; Bai, Chenyan; Lin, Zhouchen; Yu, Jian

    2017-03-08

    Demosaicking is the problem of reconstructing a color image from the raw image captured by a digital color camera that covers its only imaging sensor with a color filter array (CFA). Sparse representation based demosaicking has been shown to produce superior reconstruction quality. However, almost all existing algorithms in this category use the CFAs which are not specifically optimized for the algorithms. In this paper, we consider optimally designing CFAs for sparse representation based demosaicking, where the dictionary is well-chosen. The fact that CFAs correspond to the projection matrices used in compressed sensing inspires us to optimize CFAs via minimizing the mutual coherence. This is more challenging than that for traditional projection matrices because CFAs have physical realizability constraints. However, most of the existing methods for minimizing the mutual coherence require that the projection matrices should be unconstrained, making them inapplicable for designing CFAs. We consider directly minimizing the mutual coherence with the CFA's physical realizability constraints as a generalized fractional programming problem, which needs to find sufficiently accurate solutions to a sequence of nonconvex nonsmooth minimization problems. We adapt the redistributed proximal bundle method to address this issue. Experiments on benchmark images testify to the superiority of the proposed method. In particular, we show that a simple sparse representation based demosaicking algorithm with our specifically optimized CFA can outperform LSSC [1]. To the best of our knowledge, it is the first sparse representation based demosaicking algorithm that beats LSSC in terms of CPSNR.
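    The mutual coherence objective that the authors minimize has a simple definition: the largest absolute inner product between distinct normalized columns of a matrix. A generic sketch of the quantity (ignoring the CFA physical-realizability constraints that make the paper's optimization hard):

```python
import numpy as np

def mutual_coherence(D):
    """Largest absolute inner product between distinct normalized columns."""
    Dn = D / np.linalg.norm(D, axis=0)
    G = np.abs(Dn.T @ Dn)
    np.fill_diagonal(G, 0.0)
    return G.max()

rng = np.random.default_rng(1)
D_random = rng.normal(size=(16, 64))   # a random projection/dictionary product
print(mutual_coherence(D_random))
print(mutual_coherence(np.eye(16)))    # orthonormal columns: coherence 0
```

    Lower coherence between the effective sensing matrix and the dictionary is what compressed-sensing theory associates with better sparse recovery, hence its use as a CFA design criterion.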

  10. Ensemble polarimetric SAR image classification based on contextual sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Lamei; Wang, Xiao; Zou, Bin; Qiao, Zhijun

    2016-05-01

    Polarimetric SAR image interpretation has become a topic of great interest, in which the construction of reasonable and effective image classification techniques is of key importance. Sparse representation describes the data using the most succinct sparse atoms of an over-complete dictionary, and its advantages have also been confirmed in the field of PolSAR classification. However, like any ordinary classifier, it is imperfect in various respects. Ensemble learning is therefore introduced to address this issue: multiple different learners are trained and their individual results are combined to obtain more accurate and reliable classification. Accordingly, this paper presents a polarimetric SAR image classification method based on ensemble learning of sparse representations to achieve optimal classification.

  11. Signal extrapolation based on wavelet representation

    NASA Astrophysics Data System (ADS)

    Xia, Xiang-Gen; Kuo, C.-C. Jay; Zhang, Zhen

    1993-11-01

    The Papoulis-Gerchberg (PG) algorithm is well known for band-limited signal extrapolation. We consider the generalization of the PG algorithm to signals in the wavelet subspaces in this research. The uniqueness of the extrapolation for continuous-time signals is examined, and sufficient conditions on signals and wavelet bases for the generalized PG (GPG) algorithm to converge are given. We also propose a discrete GPG algorithm for discrete-time signal extrapolation, and investigate its convergence. Numerical examples are given to illustrate the performance of the discrete GPG algorithm.
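    The classical PG iteration that this work generalizes alternates two projections: band-limit the current estimate, then restore the known samples. A minimal discrete sketch on a synthetic band-limited signal (the classical band-limited case, not the wavelet-subspace GPG variant):

```python
import numpy as np

def pg_extrapolate(x_known, known_mask, band, n_iter=500):
    """Papoulis-Gerchberg: alternate band-limiting with reinserting known samples."""
    x = np.where(known_mask, x_known, 0.0)
    for _ in range(n_iter):
        X = np.fft.fft(x)
        X[~band] = 0.0                       # project onto band-limited signals
        x = np.fft.ifft(X).real
        x[known_mask] = x_known[known_mask]  # restore the observed segment
    return x

N = 128
t = np.arange(N)
truth = np.cos(2 * np.pi * 3 * t / N) + 0.5 * np.sin(2 * np.pi * 5 * t / N)
mask = np.zeros(N, bool)
mask[32:96] = True                            # only the middle segment is observed
freqs = np.fft.fftfreq(N, 1 / N)
band = np.abs(freqs) <= 8                     # the signal is band-limited to |f| <= 8
est = pg_extrapolate(truth, mask, band)
err = np.max(np.abs(est[~mask] - truth[~mask]))
print(f"max extrapolation error: {err:.4f}")
```

    Convergence is typically slow, which is one motivation for studying sufficient conditions and discrete variants as the abstract describes.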

  12. Graph-based representation for multiview image geometry.

    PubMed

    Maugey, Thomas; Ortega, Antonio; Frossard, Pascal

    2015-05-01

    In this paper, we propose a new geometry representation method for multiview image sets. Our approach relies on graphs to describe the multiview geometry information in a compact and controllable way. The links of the graph connect pixels in different images and describe the proximity between pixels in 3D space. These connections are dependent on the geometry of the scene and provide the right amount of information that is necessary for coding and reconstructing multiple views. Our multiview image representation is very compact and adapts the transmitted geometry information as a function of the complexity of the prediction performed at the decoder side. To achieve this, our graph-based representation (GBR) carefully selects the amount of geometry information needed before coding. This is in contrast with depth coding, which directly compresses with losses the original geometry signal, thus making it difficult to quantify the impact of coding errors on geometry-based interpolation. We present the principles of this GBR and we build an efficient coding algorithm to represent it. We compare our GBR approach to classical depth compression methods and compare their respective view synthesis qualities as a function of the compactness of the geometry description. We show that GBR can achieve significant gains in geometry coding rate over depth-based schemes operating at similar quality. Experimental results demonstrate the potential of this new representation.

  13. Implementation of a frame-based representation in CLIPS

    NASA Technical Reports Server (NTRS)

    Assal, Hisham; Myers, Leonard

    1990-01-01

    Knowledge representation is one of the major concerns in expert systems. The representation of domain-specific knowledge should agree with the nature of the domain entities and their use in the real world. For example, architectural applications deal with objects and entities such as spaces, walls, and windows. A natural way of representing these architectural entities is provided by frames. This research explores the potential of using the expert system shell CLIPS, developed by NASA, to implement a frame-based representation that can accommodate architectural knowledge. These frames are similar to, but quite different from, the 'template' construct in version 4.3 of CLIPS. Templates support only the grouping of related information and the assignment of default values to template fields. In addition to these features, frames provide other capabilities, including definition of classes, inheritance between classes and subclasses, relation of objects of different classes with 'has-a', association of methods (demons) of different types (standard and user-defined) to fields (slots), and creation of new fields at run-time. This frame-based representation is implemented completely in CLIPS. No change to the source code is necessary.

  14. 3D ear identification based on sparse representation.

    PubMed

    Zhang, Lin; Ding, Zhixuan; Li, Hongyu; Shen, Ying

    2014-01-01

    Biometrics-based personal authentication is an effective way of automatically recognizing, with high confidence, a person's identity. Recently, 3D ear shape has attracted tremendous interest in the research community due to its richness of features and ease of acquisition. However, the existing ICP (Iterative Closest Point)-based 3D ear matching methods prevalent in the literature are not efficient enough to cope with the one-to-many identification case. In this paper, we aim to fill this gap by proposing a novel, effective, fully automatic 3D ear identification system. We first propose an accurate and efficient template-based ear detection method. By utilizing such a method, the extracted ear regions are represented in a common canonical coordinate system determined by the ear contour template, which greatly facilitates the following stages of feature extraction and classification. For each extracted 3D ear, a feature vector is generated as its representation by making use of a PCA-based local feature descriptor. At the stage of classification, we resort to the sparse representation based classification approach, which actually solves an l1-minimization problem. To the best of our knowledge, this is the first work introducing the sparse representation framework into the field of 3D ear identification. Extensive experiments conducted on a benchmark dataset corroborate the effectiveness and efficiency of the proposed approach. The associated Matlab source code and the evaluation results have been made publicly available online at http://sse.tongji.edu.cn/linzhang/ear/srcear/srcear.htm.
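    The sparse representation based classification step can be sketched generically: code the probe over a dictionary of gallery feature vectors and assign it to the class whose atoms give the smallest class-restricted reconstruction residual. The sketch below substitutes greedy orthogonal matching pursuit for the paper's l1-minimization and uses synthetic features, so it illustrates the decision rule rather than the full system.

```python
import numpy as np

def omp(D, y, k):
    """Greedy sparse coding (OMP), a cheap stand-in for l1-minimization."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

def src_classify(D, labels, y, k=3):
    """Assign y to the class whose atoms best reconstruct it."""
    x = omp(D, y, k)
    best, best_err = None, np.inf
    for c in set(labels):
        xc = np.where(np.array(labels) == c, x, 0.0)
        err = np.linalg.norm(y - D @ xc)
        if err < best_err:
            best, best_err = c, err
    return best

rng = np.random.default_rng(2)
# Toy gallery: 5 feature vectors per class, 2 classes, plus a noisy class-0 probe.
proto = rng.normal(size=(2, 12))
D = np.column_stack([proto[c] + 0.05 * rng.normal(size=12)
                     for c in (0, 1) for _ in range(5)])
D /= np.linalg.norm(D, axis=0)
labels = [0] * 5 + [1] * 5
probe = proto[0] + 0.05 * rng.normal(size=12)
print(src_classify(D, labels, probe))
```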

  15. Towards Web-based representation and processing of health information

    PubMed Central

    Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J

    2009-01-01

    Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need the data source description and the method used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, the data sources description, the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study has been carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new

  16. Sparse representation-based image restoration via nonlocal supervised coding

    NASA Astrophysics Data System (ADS)

    Li, Ao; Chen, Deyun; Sun, Guanglu; Lin, Kezheng

    2016-10-01

    Sparse representation (SR) and the nonlocal technique (NLT) have shown great potential in low-level image processing. However, due to the degradation of the observed image, SR and NLT may not be accurate enough to obtain faithful restoration results when used independently. To improve the performance, a nonlocal supervised coding strategy-based NLT for image restoration is proposed in this paper. The novel method has three main contributions. First, to exploit the useful nonlocal patches, a nonnegative sparse representation is introduced, whose coefficients can be utilized as the supervised weights among patches. Second, a novel objective function is proposed, which integrates supervised weight learning and nonlocal sparse coding to guarantee a more promising solution. Finally, to make the minimization tractable and convergent, a numerical scheme based on iterative shrinkage thresholding is developed to solve the above underdetermined inverse problem. Extensive experiments validate the effectiveness of the proposed method.
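    The iterative shrinkage thresholding scheme mentioned above is a standard solver for l1-regularized least squares. A generic sketch on a synthetic sparse-recovery problem (not the paper's supervised-weight objective):

```python
import numpy as np

def ista(A, y, lam, n_iter=1000):
    """Iterative shrinkage-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)              # gradient step on the data term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(3)
A = rng.normal(size=(30, 60))              # underdetermined system
x_true = np.zeros(60)
x_true[[4, 17, 42]] = [1.5, -2.0, 1.0]
y = A @ x_true + 0.01 * rng.normal(size=30)
x_hat = ista(A, y, lam=0.1)
print(np.flatnonzero(np.abs(x_hat) > 0.2))  # recovered support
```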

  17. Phylogenetic tree construction based on 2D graphical representation

    NASA Astrophysics Data System (ADS)

    Liao, Bo; Shan, Xinzhou; Zhu, Wen; Li, Renfa

    2006-04-01

    A new approach based on the two-dimensional (2D) graphical representation of the whole genome sequence [Bo Liao, Chem. Phys. Lett., 401 (2005) 196] is proposed to analyze the phylogenetic relationships of genomes. The evolutionary distances are obtained by measuring the differences among the 2D curves. Fuzzy theory is used to construct the phylogenetic tree. The phylogenetic relationships of the H5N1 avian influenza virus illustrate the utility of our approach.
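    One common form of 2D graphical representation maps each base to a fixed step direction and compares the resulting cumulative curves. The step assignment and distance below are illustrative, not necessarily the exact construction of Liao et al.:

```python
import numpy as np

# Each base moves the walk one unit in its own direction; the cumulative
# 2D curve serves as a signature of the sequence.
STEPS = {"A": (1, 1), "G": (1, -1), "C": (-1, 1), "T": (-1, -1)}

def curve(seq):
    return np.cumsum([STEPS[b] for b in seq], axis=0)

def curve_distance(s1, s2):
    """Mean Euclidean distance between two 2D curves, a crude
    evolutionary-distance proxy."""
    c1, c2 = curve(s1), curve(s2)
    n = min(len(c1), len(c2))
    return float(np.linalg.norm(c1[:n] - c2[:n]) / n)

a = "ATGGCGTACGATCGTA"
b = "ATGGCGTACGATCGTT"   # one substitution relative to a
c = "TTTTCCCCGGGGAAAA"   # unrelated composition
print(curve_distance(a, b), curve_distance(a, c))
```

    A distance matrix built this way over several genomes is the input to the tree-construction step.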

  18. Order-Based Representation in Random Networks of Cortical Neurons

    PubMed Central

    Kermany, Einat; Lyakhov, Vladimir; Zrenner, Christoph; Marom, Shimon

    2008-01-01

    The wide range of time scales involved in neural excitability and synaptic transmission might lead to ongoing change in the temporal structure of responses to recurring stimulus presentations on a trial-to-trial basis. This is probably the most severe biophysical constraint on putative time-based primitives of stimulus representation in neuronal networks. Here we show that in spontaneously developing large-scale random networks of cortical neurons in vitro the order in which neurons are recruited following each stimulus is a naturally emerging representation primitive that is invariant to significant temporal changes in spike times. With a relatively small number of randomly sampled neurons, the information about stimulus position is fully retrievable from the recruitment order. The effective connectivity that makes order-based representation invariant to time warping is characterized by the existence of stations through which activity is required to pass in order to propagate further into the network. This study uncovers a simple invariant in a noisy biological network in vitro; its applicability under in vivo constraints remains to be seen. PMID:19023409

  19. Teaching object concepts for XML-based representations.

    SciTech Connect

    Kelsey, R. L.

    2002-01-01

    Students learned about object-oriented design concepts and knowledge representation through the use of a set of toy blocks. The blocks represented a limited and focused domain of knowledge and one that was physical and tangible. The blocks helped the students to better visualize, communicate, and understand the domain of knowledge as well as how to perform object decomposition. The blocks were further abstracted to an engineering design kit for water park design. This helped the students to work on techniques for abstraction and conceptualization. It also led the project from tangible exercises into software and programming exercises. Students employed XML to create object-based knowledge representations and Java to use the represented knowledge. The students developed and implemented software allowing a lay user to design and create their own water slide and then to take a simulated ride on their slide.
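    An object-based XML representation of the water-park domain might look like the following; the element and attribute names are invented for illustration, and the stdlib parser stands in for whatever tooling the course used.

```python
import xml.etree.ElementTree as ET

# A hypothetical water-slide design expressed as an object-based XML document.
doc = """
<waterpark>
  <slide name="Tornado" height_m="12">
    <segment type="drop" length_m="5"/>
    <segment type="spiral" length_m="14"/>
    <segment type="splashdown" length_m="3"/>
  </slide>
</waterpark>
"""

root = ET.fromstring(doc)
slide = root.find("slide")
total = sum(float(s.get("length_m")) for s in slide.findall("segment"))
print(f"{slide.get('name')}: {total} m of track")
```

    The object decomposition (park has-a slide, slide has-a list of segments) maps directly onto the element nesting, which is the point the abstract makes about XML for object-based knowledge.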

  20. Time-frequency representation measurement based on temporal Fourier transformation

    NASA Astrophysics Data System (ADS)

    Suen, Yifan; Xiao, Shaoqiu; Hao, Sumin; Zhao, Xiaoxiang; Xiong, Yigao; Liu, Shenye

    2016-10-01

    We propose a new scheme to physically realize the short-time Fourier transform (STFT) of a chirped optical pulse using a time-lens array, which enables us to obtain a time-frequency representation without using the FFT algorithm. The time-lens, based upon four-wave mixing, is used to perform the process of temporal Fourier transformation. The pump pulse both provides the quadratic phase and serves as the window function of the STFT. The idea of the STFT is thus physically realized in our scheme. Simulations have been done to investigate the performance of the time-frequency representation scheme (TFRS) in comparison with the STFT using the FFT algorithm. Optimal measurement of resolution in time and frequency is also discussed.
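    The conventional FFT-based STFT that the time-lens scheme emulates can be sketched directly: window the signal, transform each frame, and read the dominant frequency per frame. The chirp parameters below are arbitrary.

```python
import numpy as np

def stft(x, win_len=64, hop=16):
    """Windowed FFT magnitude: the software analogue of the time-lens STFT."""
    window = np.hanning(win_len)
    frames = [x[i:i + win_len] * window
              for i in range(0, len(x) - win_len + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))   # time x frequency magnitude

fs = 1000
t = np.arange(2048) / fs
chirp = np.cos(2 * np.pi * (50 + 100 * t) * t)   # instantaneous frequency rises
S = stft(chirp)
peak_bins = S.argmax(axis=1)                      # dominant frequency bin per frame
print(peak_bins[0], peak_bins[-1])
```

    For an up-chirp the ridge of the time-frequency representation moves upward across frames, which is the behavior the paper's simulations compare against.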

  1. Argumentation-Based Collaborative Inquiry in Science through Representational Work: Impact on Primary Students' Representational Fluency

    ERIC Educational Resources Information Center

    Nichols, Kim; Gillies, Robyn; Hedberg, John

    2016-01-01

    This study explored the impact of argumentation-promoting collaborative inquiry and representational work in science on primary students' representational fluency. Two hundred sixty-six year 6 students received instruction on natural disasters with a focus on collaborative inquiry. Students in the Comparison condition received only this…

  2. 3D Ear Identification Based on Sparse Representation

    PubMed Central

    Zhang, Lin; Ding, Zhixuan; Li, Hongyu; Shen, Ying

    2014-01-01

    Biometrics-based personal authentication is an effective way of automatically recognizing, with high confidence, a person's identity. Recently, 3D ear shape has attracted tremendous interest in the research community due to its richness of features and ease of acquisition. However, the existing ICP (Iterative Closest Point)-based 3D ear matching methods prevalent in the literature are not efficient enough to cope with the one-to-many identification case. In this paper, we aim to fill this gap by proposing a novel, effective, fully automatic 3D ear identification system. We first propose an accurate and efficient template-based ear detection method. By utilizing such a method, the extracted ear regions are represented in a common canonical coordinate system determined by the ear contour template, which greatly facilitates the following stages of feature extraction and classification. For each extracted 3D ear, a feature vector is generated as its representation by making use of a PCA-based local feature descriptor. At the stage of classification, we resort to the sparse representation based classification approach, which actually solves an l1-minimization problem. To the best of our knowledge, this is the first work introducing the sparse representation framework into the field of 3D ear identification. Extensive experiments conducted on a benchmark dataset corroborate the effectiveness and efficiency of the proposed approach. The associated Matlab source code and the evaluation results have been made publicly available online at http://sse.tongji.edu.cn/linzhang/ear/srcear/srcear.htm. PMID:24740247

  3. Vigilance detection based on sparse representation of EEG.

    PubMed

    Yu, Hongbin; Lu, Hongtao; Ouyang, Tian; Liu, Hongjun; Lu, Bao-Liang

    2010-01-01

    Electroencephalogram (EEG) based vigilance detection of people who engage in long, attention-demanding tasks such as monotonous monitoring or driving is a key field in the research of brain-computer interfaces (BCI). However, robust detection of human vigilance from EEG is very difficult due to the low-SNR nature of EEG signals. Recently, compressive sensing and sparse representation have become successful tools in the fields of signal reconstruction and machine learning. In this paper, we propose to apply the sparse representation of EEG to the vigilance detection problem. We first use the continuous wavelet transform to extract the rhythm features of EEG data, and then employ the sparse representation method on the wavelet transform coefficients. We collected five subjects' EEG recordings in a simulated driving environment and applied the proposed method to detect the vigilance of the subjects. The experimental results show that the algorithm framework proposed in this paper can successfully estimate a driver's vigilance with an average accuracy of about 94.22%. We also compare our algorithm framework with other vigilance estimation methods using different feature extraction and classifier selection approaches; the results show that the proposed method has obvious advantages in classification accuracy.

  4. Sparse representation based image interpolation with nonlocal autoregressive modeling.

    PubMed

    Dong, Weisheng; Zhang, Lei; Lukac, Rastislav; Shi, Guangming

    2013-04-01

    Sparse representation is proven to be a promising approach to image super-resolution, where the low-resolution (LR) image is usually modeled as the down-sampled version of its high-resolution (HR) counterpart after blurring. When the blurring kernel is the Dirac delta function, i.e., the LR image is directly down-sampled from its HR counterpart without blurring, the super-resolution problem becomes an image interpolation problem. In such cases, however, the conventional sparse representation models (SRM) become less effective, because the data fidelity term fails to constrain the image local structures. In natural images, fortunately, many nonlocal similar patches to a given patch could provide nonlocal constraint to the local structure. In this paper, we incorporate the image nonlocal self-similarity into SRM for image interpolation. More specifically, a nonlocal autoregressive model (NARM) is proposed and taken as the data fidelity term in SRM. We show that the NARM-induced sampling matrix is less coherent with the representation dictionary, and consequently makes SRM more effective for image interpolation. Our extensive experimental results demonstrate that the proposed NARM-based image interpolation method can effectively reconstruct the edge structures and suppress the jaggy/ringing artifacts, achieving the best image interpolation results so far in terms of PSNR as well as perceptual quality metrics such as SSIM and FSIM.

  5. An empirical study on the matrix-based protein representations and their combination with sequence-based approaches.

    PubMed

    Nanni, Loris; Lumini, Alessandra; Brahnam, Sheryl

    2013-03-01

    Many domains have a stake in the development of reliable systems for automatic protein classification. Of particular interest in recent studies of automatic protein classification is the exploration of new methods for extracting features from a protein that enhance classification for specific problems. These methods have proven very useful in one or two domains, but they have failed to generalize well across several domains (i.e. classification problems). In this paper, we evaluate several feature extraction approaches for representing proteins with the aim of sequence-based protein classification. Several protein representations are evaluated, those starting from: the position specific scoring matrix (PSSM) of the proteins; the amino-acid sequence; a matrix representation of the protein, of dimension (length of the protein) ×20, obtained using the substitution matrices for representing each amino-acid as a vector. A valuable result is that a texture descriptor can be extracted from the PSSM protein representation which improves the performance of standard descriptors based on the PSSM representation. Experimentally, we develop our systems by comparing several protein descriptors on nine different datasets. Each descriptor is used to train a support vector machine (SVM) or an ensemble of SVM. Although different stand-alone descriptors work well on some datasets (but not on others), we have discovered that fusion among classifiers trained using different descriptors obtains a good performance across all the tested datasets. Matlab code/Datasets used in the proposed paper are available at http://www.bias.csr.unibo.it\

  6. Video rate morphological processor based on a redundant number representation

    NASA Astrophysics Data System (ADS)

    Kuczborski, Wojciech; Attikiouzel, Yianni; Crebbin, Gregory A.

    1992-03-01

    This paper presents a video rate morphological processor for automated visual inspection of printed circuit boards, integrated circuit masks, and other complex objects. Inspection algorithms are based on gray-scale mathematical morphology. The hardware complexity of the known methods of real-time implementation of gray-scale morphology--the umbra transform and the threshold decomposition--has prompted us to propose a novel technique which applies an arithmetic system without carry propagation. After considering several arithmetic systems, a redundant number representation has been selected for implementation. Two options are analyzed here. The first is a pure signed digit number representation (SDNR) with the base of 4. The second option is a combination of the base-2 SDNR (to represent gray levels of images) and the conventional twos complement code (to represent gray levels of structuring elements). The operating principle of the morphological processor is based on the concept of the digit level systolic array. Individual processing units and small memory elements create a pipeline. The memory elements store current image windows (kernels). All operation primitives of the processing units apply a unified direction of digit processing: most significant digit first (MSDF). The implementation technology is based on field programmable gate arrays by Xilinx. This paper justifies a new approach to logic design: the decomposition of Boolean functions instead of Boolean minimization.
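    Redundant signed-digit representations give each value several encodings, which is what eliminates carry propagation in hardware. A related, simpler illustration is the base-2 non-adjacent form with digit set {-1, 0, 1} (the processor itself uses a base-4 SDNR, which is not shown here):

```python
def to_naf(n):
    """Non-adjacent form: a base-2 signed-digit representation, digits in {-1,0,1},
    with no two adjacent nonzero digits. Assumes n is a positive integer."""
    digits = []
    while n != 0:
        if n % 2:
            d = 2 - (n % 4)   # choose -1 or +1 so the next digit becomes 0
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits             # least significant digit first

def from_digits(ds):
    return sum(d * (1 << i) for i, d in enumerate(ds))

print(to_naf(7))              # 7 encoded as 8 - 1
```

    Because digits carry their own sign, addition in such systems can be organized so that carries never ripple more than a fixed number of positions, which is the property the digit-level systolic pipeline exploits.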

  7. Supervised classification of protein structures based on convex hull representation.

    PubMed

    Wang, Yong; Wu, Ling-Yun; Chen, Luonan; Zhang, Xiang-Sun

    2007-01-01

    One of the central problems in functional genomics is to establish classification schemes of protein structures. In this paper the relationships among protein structures are uncovered within the framework of supervised learning. Specifically, novel patterns based on a convex hull representation are first extracted from a protein structure, then the classification system is constructed and machine learning methods such as neural networks, Hidden Markov Models (HMM) and Support Vector Machines (SVMs) are applied. The CATH scheme is highlighted in the classification experiments. The results indicate that the proposed supervised classification scheme is effective and efficient.
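    Extracting simple geometric features from a structure's convex hull can be sketched with random 3D point clouds standing in for residue coordinates; the descriptors below (volume, surface area, vertex count) are illustrative choices, since the paper does not specify its patterns here.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(4)

def hull_features(coords):
    """Simple geometric descriptors of a 3D point cloud's convex hull,
    a toy stand-in for features of a protein structure."""
    hull = ConvexHull(coords)
    return np.array([hull.volume, hull.area, len(hull.vertices)])

compact = rng.normal(size=(50, 3))               # roughly globular cloud
elongated = compact * np.array([5.0, 1.0, 1.0])  # stretched along one axis
f1, f2 = hull_features(compact), hull_features(elongated)
print(f1, f2)
```

    Feature vectors of this kind would then be fed to the SVM, HMM, or neural-network classifiers mentioned in the abstract.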

  8. An object-based methodology for knowledge representation

    SciTech Connect

    Kelsey, R.L.; Hartley, R.T.; Webster, R.B.

    1997-11-01

    An object-based methodology for knowledge representation is presented. The constructs and notation of the methodology are described and illustrated with examples. The 'blocks world,' a classic artificial intelligence problem, is used to illustrate some of the features of the methodology, including perspectives and events. Representing knowledge with perspectives can enrich the detail of the knowledge and facilitate potential lines of reasoning. Events allow example uses of the knowledge to be represented along with the contained knowledge. Other features include the extensibility and maintainability of knowledge represented in the methodology.

  9. Block diagonal representations for covariance based anomalous change detectors

    SciTech Connect

    Matsekh, Anna; Theiler, James

    2009-01-01

    Change detection methods are of crucial importance in many remote sensing applications such as monitoring and surveillance, where the goal is to identify and separate changes of interest from pervasive changes inevitably present in images taken at different times and in different environmental and illumination conditions. Anomalous change detection (ACD) methods aim to identify rare, unusual, or anomalous changes among the changes of interest. Covariance-based ACD methods provide a powerful tool for detection of unusual changes in hyper-spectral images. In this paper we study the properties of the eigenvalue spectra of a family of ACD matrices in order to better understand the algebraic and numerical behavior of the covariance-based quadratic ACD methods. We propose to use singular vectors of covariance matrices of two hyper-spectral images in whitened coordinates for obtaining block-diagonal representations of the matrices of quadratic ACD methods. SVD transformation gives an equivalent representation of ACD matrices in compact block-diagonal form. In the paper we show that the eigenvalue spectrum of a block-diagonal ACD matrix can be identified analytically as a function of the singular value spectrum of the corresponding covariance matrix in whitened coordinates.
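
    A minimal numpy sketch of the whitening-plus-SVD step on synthetic data (the specific ACD matrices of the paper are not reproduced; the data, dimensions and noise model are assumptions):

```python
import numpy as np

def inv_sqrt(C):
    """Symmetric inverse square root of a covariance matrix (whitening transform)."""
    w, V = np.linalg.eigh(C)
    return V @ np.diag(w ** -0.5) @ V.T

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 1000))                   # bands x pixels, time 1
Y = 0.8 * X + 0.2 * rng.standard_normal((5, 1000))   # time 2, mostly pervasive change
Xc = X - X.mean(axis=1, keepdims=True)
Yc = Y - Y.mean(axis=1, keepdims=True)

n = X.shape[1]
Cx, Cy = Xc @ Xc.T / (n - 1), Yc @ Yc.T / (n - 1)
Cxy = Xc @ Yc.T / (n - 1)

# cross-covariance in whitened coordinates; its singular values are the
# canonical correlations between the two images, and its singular vectors
# give the coordinates in which the quadratic ACD matrices block-diagonalize
K = inv_sqrt(Cx) @ Cxy @ inv_sqrt(Cy)
U, s, Vt = np.linalg.svd(K)
```

    In these rotated coordinates the ACD matrices fall into small diagonal blocks, so their eigenvalue spectra follow analytically from the singular values `s`, which is the relationship the abstract describes.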

  10. Finger vein verification system based on sparse representation.

    PubMed

    Xin, Yang; Liu, Zhi; Zhang, Haixia; Zhang, Hong

    2012-09-01

    Finger vein verification is a promising biometric pattern for personal identification in terms of security and convenience. The recognition performance of this technology heavily relies on the quality of finger vein images and on the recognition algorithm. To achieve efficient recognition performance, a special finger vein imaging device is developed, and a finger vein recognition method based on sparse representation is proposed. The motivation for the proposed method is that finger vein images exhibit a sparse property. In the proposed system, the regions of interest (ROIs) in the finger vein images are segmented and enhanced. Sparse representation and sparsity preserving projection on ROIs are performed to obtain the features. Finally, the features are measured for recognition. An equal error rate of 0.017% was achieved based on the finger vein image database, which contains images that were captured by using the near-IR imaging device that was developed in this study. The experimental results demonstrate that the proposed method is faster and more robust than previous methods.
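
    The sparse representation principle behind such systems (code a probe over a dictionary of training samples, then assign it to the class whose samples best reconstruct it) can be sketched as follows; the dictionary, the ISTA solver and all parameters here are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

def ista(D, y, lam=0.05, iters=500):
    """Minimize 0.5*||y - D x||^2 + lam*||x||_1 by iterative soft thresholding."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(iters):
        g = x + D.T @ (y - D @ x) / L      # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return x

rng = np.random.default_rng(1)
# toy dictionary: 5 training samples per class, 2 classes, 20-dim features
D = np.column_stack([rng.standard_normal((20, 5)) + c for c in (0.0, 2.0)])
D /= np.linalg.norm(D, axis=0)
labels = np.repeat([0, 1], 5)

probe = D[:, 7] + 0.01 * rng.standard_normal(20)   # noisy copy of a class-1 sample
probe /= np.linalg.norm(probe)
x = ista(D, probe)

# assign the probe to the class whose training atoms best reconstruct it
residuals = [np.linalg.norm(probe - D[:, labels == c] @ x[labels == c])
             for c in (0, 1)]
pred = int(np.argmin(residuals))
```

    The same residual-comparison step would serve as the "features are measured for recognition" stage, with real dictionaries built from enhanced ROI vein patches.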

  11. Robust Pedestrian Classification Based on Hierarchical Kernel Sparse Representation

    PubMed Central

    Sun, Rui; Zhang, Guanghai; Yan, Xiaoxing; Gao, Jun

    2016-01-01

    Vision-based pedestrian detection has become an active topic in computer vision and autonomous vehicles. It aims at detecting pedestrians appearing ahead of the vehicle using a camera so that autonomous vehicles can assess the danger and take action. Due to varied illumination and appearance, complex background and occlusion, pedestrian detection in outdoor environments is a difficult problem. In this paper, we propose a novel hierarchical feature extraction and weighted kernel sparse representation model for pedestrian classification. Initially, hierarchical feature extraction based on a CENTRIST descriptor is used to capture discriminative structures. A max pooling operation is used to enhance the invariance to varying appearance. Then, a kernel sparse representation model is proposed to fully exploit the discriminative information embedded in the hierarchical local features, and a Gaussian weight function is used as the measure to effectively handle occlusion in pedestrian images. Extensive experiments are conducted on benchmark databases, including INRIA, Daimler, an artificially generated dataset and a real occluded dataset, demonstrating the more robust performance of the proposed method compared to state-of-the-art pedestrian classification methods. PMID:27537888

  12. Effects of Computer-Based Visual Representation on Mathematics Learning and Cognitive Load

    ERIC Educational Resources Information Center

    Yung, Hsin I.; Paas, Fred

    2015-01-01

    Visual representation has been recognized as a powerful learning tool in many learning domains. Based on the assumption that visual representations can support deeper understanding, we examined the effects of visual representations on learning performance and cognitive load in the domain of mathematics. An experimental condition with visual…

  13. Supporting Students' Learning with Multiple Representations in a Dynamic Simulation-Based Learning Environment

    ERIC Educational Resources Information Center

    van der Meij, Jan; de Jong, Ton

    2006-01-01

    In this study, the effects of different types of support for learning from multiple representations in a simulation-based learning environment were examined. The study extends known research by examining the use of dynamic representations instead of static representations and it examines the role of the complexity of the domain and the learning…

  14. Multiple Representations-Based Face Sketch-Photo Synthesis.

    PubMed

    Peng, Chunlei; Gao, Xinbo; Wang, Nannan; Tao, Dacheng; Li, Xuelong; Li, Jie

    2016-11-01

    Face sketch-photo synthesis plays an important role in law enforcement and digital entertainment. Most existing methods use only pixel intensities as the feature. Since face images can be described using features from multiple aspects, this paper presents a novel multiple representations-based face sketch-photo synthesis method that adaptively combines multiple representations to represent an image patch. In particular, it combines multiple features from face images processed using multiple filters and deploys Markov networks to exploit the interacting relationships between neighboring image patches. The proposed framework can be solved using an alternating optimization strategy, and it normally converges in only five outer iterations in the experiments. Our experimental results on the Chinese University of Hong Kong (CUHK) face sketch database, celebrity photos, the CUHK Face Sketch FERET Database, the IIIT-D Viewed Sketch Database, and forensic sketches demonstrate the effectiveness of our method for face sketch-photo synthesis. In addition, cross-database and database-dependent style-synthesis evaluations demonstrate the generalizability of this novel method and suggest promising solutions for face identification in forensic science.

  15. Magnetic resonance brain tissue segmentation based on sparse representations

    NASA Astrophysics Data System (ADS)

    Rueda, Andrea

    2015-12-01

    Segmentation or delineation of specific organs and structures in medical images is an important task in clinical diagnosis and treatment, since it allows pathologies to be characterized through imaging measures (biomarkers). In brain imaging, segmentation of main tissues or specific structures is challenging due to anatomic variability and complexity, and the presence of image artifacts (noise, intensity inhomogeneities, partial volume effect). In this paper, an automatic segmentation strategy is proposed, based on sparse representations and coupled dictionaries. Image intensity patterns are related to tissue labels at the level of small patches, and this information is gathered in coupled intensity/segmentation dictionaries. These dictionaries are used within a sparse representation framework to find the projection of a new intensity image onto the intensity dictionary, and the same projection can be used with the segmentation dictionary to estimate the corresponding segmentation. Preliminary results obtained with two publicly available datasets suggest that the proposed method is capable of estimating adequate segmentations for gray matter (GM) and white matter (WM) tissues, with an average overlap of 0.79 for GM and 0.71 for WM (with respect to the original segmentations).
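
    A rough sketch of the coupled-dictionary idea under simplifying assumptions (random rather than learned dictionaries, and orthogonal matching pursuit as the sparse solver):

```python
import numpy as np

def omp(D, y, k=3):
    """Orthogonal matching pursuit: greedy k-sparse coding of y over dictionary D."""
    idx, coef, r = [], None, y.copy()
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(D.T @ r))))   # most correlated atom
        sub = D[:, idx]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        r = y - sub @ coef                            # orthogonalized residual
    x = np.zeros(D.shape[1])
    x[idx] = coef
    return x

rng = np.random.default_rng(2)
patch, n_atoms = 16, 50
D_int = rng.standard_normal((patch, n_atoms))               # intensity patch atoms
D_int /= np.linalg.norm(D_int, axis=0)
D_seg = (rng.random((patch, n_atoms)) > 0.5).astype(float)  # paired label patch atoms

probe = D_int[:, 4] + 0.05 * rng.standard_normal(patch)     # a new intensity patch
x = omp(D_int, probe, k=3)
# the same sparse code, applied to the coupled segmentation dictionary
seg_estimate = (D_seg @ x > 0.5).astype(int)
```

    The key point is that the sparse code is computed once, against the intensity dictionary, and reused against the segmentation dictionary; in the paper both dictionaries are learned jointly from paired patches rather than drawn at random.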

  16. [Stewart's acid-base approach].

    PubMed

    Funk, Georg-Christian

    2007-01-01

    In addition to PaCO2, Stewart's acid-base model takes into account the influence of albumin, inorganic phosphate, electrolytes and lactate on the acid-base equilibrium. It allows a comprehensive and quantitative analysis of acid-base disorders. In particular, simultaneous and mixed metabolic acid-base disorders, which are common in critically ill patients, can be assessed. Stewart's approach is therefore a valuable tool in addition to the customary acid-base approach based on bicarbonate or base excess. However, some chemical aspects of Stewart's approach remain controversial.

  17. Sparse representation-based spectral clustering for SAR image segmentation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangrong; Wei, Zhengli; Feng, Jie; Jiao, Licheng

    2011-12-01

    A new method, sparse representation based spectral clustering (SC) with the Nyström method, is proposed for synthetic aperture radar (SAR) image segmentation. Unlike conventional SC, the proposed technique constructs the affinity matrix from the sparse coefficients obtained by solving an l1-minimization problem, and the Nyström method is applied to speed up the segmentation process. The advantage of our method is that the scaling parameter of the Gaussian kernel function does not need to be selected artificially. We apply the proposed method, k-means, and the classic spectral clustering algorithm with the Nyström method to SAR image segmentation. The results show that, compared with the other two methods, the proposed method obtains much better segmentation results.

  18. Sorting of Single Biomolecules based on Fourier Polar Representation of Surface Enhanced Raman Spectra

    PubMed Central

    Leray, Aymeric; Brulé, Thibault; Buret, Mickael; Colas des Francs, Gérard; Bouhelier, Alexandre; Dereux, Alain; Finot, Eric

    2016-01-01

    Surface enhanced Raman scattering (SERS) spectroscopy is increasingly used in biosensors for its capacity to detect and identify single molecules. In practice, a large number of SERS spectra are acquired, and reliable ranking methods are thus essential for analysing all these data. Supervised classification strategies, which are the most effective methods, are usually applied, but they require pre-determined models or classes. In this work, we propose to sort SERS spectra into unknown groups with an alternative strategy called the Fourier polar representation. This non-fitting method, based on simple Fourier sine and cosine transforms, produces a fast and graphical representation for sorting SERS spectra with quantitative information. The reliability of this method was first investigated theoretically and numerically. Then, its performance was tested on two concrete biological examples: first with a single amino acid (cysteine) and then with a mixture of three distinct odorous molecules. The benefits of this Fourier polar representation were highlighted and compared to the well-established statistical principal component analysis method. PMID:26833130
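
    One plausible reading of the Fourier polar representation is to map each spectrum to a single polar point via its first sine and cosine transform coefficients; the sketch below is an illustrative assumption, not the published algorithm:

```python
import numpy as np

def fourier_polar(spectrum):
    """Map a spectrum to one polar point via its first sine/cosine coefficients."""
    n = len(spectrum)
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    a = float(spectrum @ np.cos(t))          # cosine transform coefficient
    b = float(spectrum @ np.sin(t))          # sine transform coefficient
    return np.hypot(a, b), np.arctan2(b, a)  # (radius, angle)

x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
r1, th1 = fourier_polar(np.exp(-(x - 2.0) ** 2))  # synthetic peak at 2.0
r2, th2 = fourier_polar(np.exp(-(x - 4.0) ** 2))  # same peak shifted to 4.0
```

    Spectra with peaks at different positions land at different polar angles while keeping comparable radii, so similar spectra cluster together in the polar plot without any pre-determined classes.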

  19. Sorting of Single Biomolecules based on Fourier Polar Representation of Surface Enhanced Raman Spectra

    NASA Astrophysics Data System (ADS)

    Leray, Aymeric; Brulé, Thibault; Buret, Mickael; Colas Des Francs, Gérard; Bouhelier, Alexandre; Dereux, Alain; Finot, Eric

    2016-02-01

    Surface enhanced Raman scattering (SERS) spectroscopy is increasingly used in biosensors for its capacity to detect and identify single molecules. In practice, a large number of SERS spectra are acquired, and reliable ranking methods are thus essential for analysing all these data. Supervised classification strategies, which are the most effective methods, are usually applied, but they require pre-determined models or classes. In this work, we propose to sort SERS spectra into unknown groups with an alternative strategy called the Fourier polar representation. This non-fitting method, based on simple Fourier sine and cosine transforms, produces a fast and graphical representation for sorting SERS spectra with quantitative information. The reliability of this method was first investigated theoretically and numerically. Then, its performance was tested on two concrete biological examples: first with a single amino acid (cysteine) and then with a mixture of three distinct odorous molecules. The benefits of this Fourier polar representation were highlighted and compared to the well-established statistical principal component analysis method.

  20. Pavement crack characteristic detection based on sparse representation

    NASA Astrophysics Data System (ADS)

    Sun, Xiaoming; Huang, Jianping; Liu, Wanyu; Xu, Mantao

    2012-12-01

    Pavement crack detection plays an important role in pavement maintenance and management. Laser-based three-dimensional (3D) pavement crack detection is a recent trend due to its ability to discriminate dark areas that are not caused by pavement distress, such as tire marks, oil spills and shadows. In 3D pavement crack detection, the most important task is the accurate extraction of cracks from an individual pavement profile without destroying the profile. After analyzing the characteristics of the pavement profile signal and the variability of pavement crack characteristics, a new method based on sparse representation is developed to decompose the pavement profile signal into a sum of the main pavement profile and the cracks. Based on the characteristics of the pavement profile signal and cracks, a mixed dictionary is constructed from an over-complete exponential function and an over-complete trapezoidal membership function, and the signal is separated over this mixed dictionary with a matching pursuit algorithm. Experiments were conducted and promising results were obtained, showing that the method can detect pavement cracks efficiently and achieve a good separation of cracks from the pavement profile without destroying the profile.
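
    The decomposition described above can be sketched with a toy mixed dictionary and a plain matching pursuit loop; the exponential and narrow-dip atoms below are simplified stand-ins for the paper's over-complete exponential and trapezoidal membership functions:

```python
import numpy as np

def matching_pursuit(D, y, iters=10):
    """Greedy matching pursuit: y ~ D @ x with few nonzero coefficients."""
    x, r = np.zeros(D.shape[1]), y.copy()
    for _ in range(iters):
        i = int(np.argmax(np.abs(D.T @ r)))   # best-matching atom
        c = D[:, i] @ r
        x[i] += c
        r -= c * D[:, i]                      # peel it off the residual
    return x

n = 128
t = np.arange(n)
# mixed dictionary: slow exponentials for the profile, narrow dips for cracks
smooth = np.column_stack([np.exp(-t / tau) for tau in (200, 400, 800)])
cracks = np.column_stack([-(np.abs(t - c) < 3).astype(float)
                          for c in range(3, n - 3)])
D = np.column_stack([smooth, cracks])
D /= np.linalg.norm(D, axis=0)

profile = 2.0 * np.exp(-t / 400)
signal = profile.copy()
signal[60:66] -= 1.0                          # a crack: a narrow dip near index 60
x = matching_pursuit(D, signal, iters=8)
crack_part = D[:, 3:] @ x[3:]                 # reconstruction from crack atoms only
```

    Because the profile correlates with the smooth atoms and the dip with the narrow atoms, the coefficients split the signal into a profile part and a crack part, mirroring the separation the abstract describes.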

  1. NVL - a knowledge representation language based on semantic networks

    SciTech Connect

    Hudli, A.V.

    1989-01-01

    Taxonomic hierarchical networks, or semantic networks, have been widely used to represent knowledge in AI applications. Semantic networks have been the preferred form of representation in AI, rather than predicate logic, because of the need to represent complex structured knowledge. However, the formal semantics of these networks has not been dealt with adequately in the literature. In this thesis, semantic networks are described by means of a formal relational logic called NVL. The characteristic features of NVL are limitor lists and binary predicates. Limitor lists are similar to restricted quantifiers but are more expressive. Several special binary relations are used to express the key ideas of semantic networks. NVL is based on the principles of semantic networks and taxonomic reasoning. The unification and inference mechanisms of NVL have considerable inherent parallelism, which makes the language suitable for parallel implementation. The current opinion in AI is that semantic networks represent a subset of first order logic. Rather than modify predicate logic by adding features of semantic networks, the approach has been to devise a new form of logic by considering the basic principles and epistemological primitives of semantic networks, such as properties, class concepts, relations, and inheritance. The syntax and semantics of NVL are first presented. Rules in the knowledge base are represented by the V relation, which also plays an important role in deriving inferences. The (mathematical) correctness of NVL is proved, and concepts of unification of lists and inference in NVL are introduced. Parallel algorithms for unification and inference are developed.
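
    The epistemological primitives mentioned above (class concepts, properties, is-a inheritance) can be illustrated with a toy semantic network; NVL itself is a relational logic with limitor lists and binary predicates, which this sketch does not attempt to model:

```python
class Concept:
    """A node in a toy semantic network with is-a inheritance of properties."""
    def __init__(self, name, parent=None, **props):
        self.name, self.parent, self.props = name, parent, props

    def lookup(self, prop):
        """Taxonomic reasoning: walk up the is-a hierarchy until the property is found."""
        node = self
        while node is not None:
            if prop in node.props:
                return node.props[prop]
            node = node.parent
        raise KeyError(prop)

block = Concept("block", movable=True)
pyramid = Concept("pyramid", parent=block, stackable=False)
```

    Here `pyramid` inherits `movable` from `block` while adding its own `stackable` property; a relational logic like NVL additionally quantifies over such links rather than hard-coding the traversal.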

  2. Pedestrian detection from thermal images: A sparse representation based approach

    NASA Astrophysics Data System (ADS)

    Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi

    2016-05-01

    Pedestrian detection, a key technology in computer vision, plays a paramount role in advanced driver assistance systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex background, pedestrian detection is a challenging task for visual perception. Unlike visible images, thermal images are captured and presented as intensity maps based on objects' emissivity, and thus have an enhanced spectral range that makes human beings perceptible against the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopt the histogram of sparse codes to represent image features and then detect pedestrians with the extracted features in a unimodal and a multimodal framework, respectively. In the unimodal framework, two types of dictionaries, i.e., a joint dictionary and individual dictionaries, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare with three widely used features: Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC), as well as two classification methods, i.e., AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.

  3. Uncertainty Representation and Interpretation in Model-Based Prognostics Algorithms Based on Kalman Filter Estimation

    NASA Technical Reports Server (NTRS)

    Galvan, Jose Ramon; Saxena, Abhinav; Goebel, Kai Frank

    2012-01-01

    This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies, based on our experience with Kalman filters applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process, and how this relates to uncertainty representation, management, and the role of prognostics in decision-making. A distinction between the interpretations of the estimated remaining useful life probability density function is explained, and a cautionary argument is provided against mixing the two interpretations when using prognostics to make critical decisions.
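
    A minimal sketch of the underlying idea: a Kalman filter tracks a linear degradation state, and the remaining-useful-life (RUL) prediction is treated as a random variable by sampling the posterior. The model, noise levels and measurements below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# linear degradation model: health h drops by rate d each step; failure at h <= 0
x = np.array([1.0, 0.02])                  # state estimate: [health, rate]
P = np.diag([0.01, 1e-4])                  # state covariance
F = np.array([[1.0, -1.0], [0.0, 1.0]])    # state transition
Q = np.diag([1e-5, 1e-6])                  # process noise
H = np.array([[1.0, 0.0]])                 # only health is measured
R = np.array([[1e-3]])                     # measurement noise

for z in (0.97, 0.95, 0.94, 0.90, 0.89):   # noisy health measurements
    x, P = F @ x, F @ P @ F.T + Q                      # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)                # update
    P = (np.eye(2) - K @ H) @ P

# RUL as a random variable: sample the posterior state, run each sample to failure
samples = rng.multivariate_normal(x, P, size=2000)
rul = np.array([h / max(d, 1e-6) for h, d in samples])  # steps until h reaches 0
```

    The spread of `rul` is the remaining-useful-life probability density the abstract refers to; how its percentiles are read off for a decision is exactly the interpretation question the article discusses.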

  4. Perceptual hashing of sheet music based on graphical representation

    NASA Astrophysics Data System (ADS)

    Kremser, Gert; Schmucker, Martin

    2006-02-01

    For the protection of Intellectual Property Rights (IPR), different passive protection methods have been developed. These watermarking and fingerprinting technologies protect content beyond access control, making it possible to trace illegal distributions and to identify the people responsible for an illegal distribution. The public's attention was drawn to the second application in particular by the illegal distribution of the so-called 'Hollywood screeners'. The focus of current research is on audio and video content and images. These are the common content types we are faced with every day, and which mostly have a huge commercial value. Especially the illegal distribution of content that has not been officially published shows the potential commercial impact of illegal distributions. Content types, however, are not limited to audio, video and images. There is a range of other content types which also deserve the development of passive protection technologies. For sheet music, for instance, different watermarking technologies have been developed, which up to this point only function within certain limitations. This is why we wanted to find out how to develop a fingerprinting or perceptual hashing method for sheet music. In this article, we describe the development of our algorithm for sheet music, which is based on simple graphical features. We describe the selection of these features and the subsequent processing steps. The resulting compact representation is analyzed and first performance results are reported.

  5. Feature Selection and Pedestrian Detection Based on Sparse Representation

    PubMed Central

    Yao, Shihong; Wang, Tao; Shen, Weiming; Pan, Shaoming; Chong, Yanwen; Ding, Fei

    2015-01-01

    Current research on pedestrian detection has been devoted to the extraction of effective pedestrian features, which has become one of the main obstacles in pedestrian detection applications, owing to the variety of pedestrian features and their large dimension. Based on a theoretical analysis of six frequently used features (SIFT, SURF, Haar, HOG, LBP and LSS) and their comparison with experimental results, this paper screens out sparse feature subsets via sparse representation to investigate whether the sparse subsets have the same description abilities and contain the most stable features. When any two of the six features are fused, the fusion feature is sparsely represented to obtain its important components. Sparse subsets of the fusion features can be rapidly generated by avoiding calculation of the corresponding index of dimension numbers of these feature descriptors; thus, the calculation speed of the feature dimension reduction is improved and the pedestrian detection time is reduced. Experimental results show that sparse feature subsets are capable of keeping the important components of these six feature descriptors. The sparse features of HOG and LSS possess the same description ability and consume less time compared with their full features. The ratios of the sparse feature subsets of HOG and LSS to their full sets are the highest among the six, and thus these two features best describe the characteristics of the pedestrian; the sparse feature subsets of the HOG-LSS combination show better distinguishing ability and parsimony. PMID:26295480

  6. Navigation based on a sensorimotor representation: a virtual reality study

    NASA Astrophysics Data System (ADS)

    Zetzsche, Christoph; Galbraith, Christopher; Wolter, Johannes; Schill, Kerstin

    2007-02-01

    We investigate the hypothesis that the basic representation of space which underlies human navigation does not resemble an image-like map and is not restricted by the laws of Euclidean geometry. For this we developed a new experimental technique in which we use the properties of a virtual environment (VE) to directly influence the development of the representation. We compared the navigation performance of human observers under two conditions. Either the VE is consistent with the geometrical properties of physical space and could hence be represented in a map-like fashion, or it contains severe violations of Euclidean metric and planar topology, and would thus pose difficulties for the correct development of such a representation. Performance is not influenced by this difference, suggesting that a map-like representation is not the major basis of human navigation. Rather, the results are consistent with a representation which is similar to a non-planar graph augmented with path length information, or with a sensorimotor representation which combines sensory properties and motor actions. The latter may be seen as part of a revised view of perceptual processes due to recent results in psychology and neurobiology, which indicate that the traditional strict separation of sensory and motor systems is no longer tenable.

  7. Arithmetic word problem solving: evidence for a magnitude-based mental representation.

    PubMed

    Orrantia, Josetxu; Múñez, David

    2013-01-01

    Previous findings have suggested that number processing involves a mental representation of numerical magnitude. Other research has shown that sensory experiences are part and parcel of the mental representation (or "simulation") that individuals construct during reading. We aimed at exploring whether arithmetic word-problem solving entails the construction of a mental simulation based on a representation of numerical magnitude. Participants were required to solve word problems and to perform an intermediate figure discrimination task that matched or mismatched, in terms of magnitude comparison, the mental representations that individuals constructed during problem solving. Our results showed that participants were faster in the discrimination task and performed better in the solving task when the figures matched the mental representations. These findings provide evidence that an analog magnitude-based mental representation is routinely activated during word-problem solving, and they add to a growing body of literature that emphasizes the experiential view of language comprehension.

  8. A Knowledge-Based Representation Scheme for Environmental Science Models

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Dungan, Jennifer L.; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    One of the primary methods available for studying environmental phenomena is the construction and analysis of computational models. We have been studying how artificial intelligence techniques can be applied to assist in the development and use of environmental science models within the context of NASA-sponsored activities. We have identified several high-utility areas as potential targets for research and development: model development; data visualization, analysis, and interpretation; model publishing and reuse, training and education; and framing, posing, and answering questions. Central to progress on any of the above areas is a representation for environmental models that contains a great deal more information than is present in a traditional software implementation. In particular, a traditional software implementation is devoid of any semantic information that connects the code with the environmental context that forms the background for the modeling activity. Before we can build AI systems to assist in model development and usage, we must develop a representation for environmental models that adequately describes a model's semantics and explicitly represents the relationship between the code and the modeling task at hand. We have developed one such representation in conjunction with our work on the SIGMA (Scientists' Intelligent Graphical Modeling Assistant) environment. The key feature of the representation is that it provides a semantic grounding for the symbols in a set of modeling equations by linking those symbols to an explicit representation of the underlying environmental scenario.

  9. Protein Sub-Nuclear Localization Based on Effective Fusion Representations and Dimension Reduction Algorithm LDA.

    PubMed

    Wang, Shunfang; Liu, Shuhui

    2015-12-19

    An effective representation of a protein sequence plays a crucial role in protein sub-nuclear localization. Existing representations, such as dipeptide composition (DipC), pseudo-amino acid composition (PseAAC) and the position specific scoring matrix (PSSM), are insufficient to represent a protein sequence due to their single perspectives. Thus, this paper proposes two fusion feature representations, DipPSSM and PseAAPSSM, to integrate PSSM with DipC and PseAAC, respectively. When constructing each fusion representation, we introduce balance factors to weigh the importance of its components. The optimal values of the balance factors are sought by a genetic algorithm. Due to the high dimensionality of the proposed representations, linear discriminant analysis (LDA) is used to find their important low-dimensional structure, which is essential for classification and location prediction. Numerical experiments on two public datasets with a KNN classifier and cross-validation tests showed that, in terms of the common indexes of sensitivity, specificity, accuracy and MCC, the proposed fusion representations outperform the traditional representations in protein sub-nuclear localization, and the representation treated by LDA outperforms the untreated one.
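
    Of the representations named above, dipeptide composition (DipC) is simple enough to sketch directly: the protein sequence is mapped to the normalized frequencies of all 400 ordered amino acid pairs (the toy sequence below is an illustration):

```python
from itertools import product

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def dipeptide_composition(seq):
    """DipC: normalized frequency of each of the 400 ordered amino acid pairs."""
    pairs = ["".join(p) for p in product(AMINO, repeat=2)]
    counts = {p: 0 for p in pairs}
    for i in range(len(seq) - 1):
        counts[seq[i:i + 2]] += 1
    total = max(len(seq) - 1, 1)              # number of overlapping pairs
    return [counts[p] / total for p in pairs]

vec = dipeptide_composition("ACDGACD")        # 400-dimensional feature vector
```

    Fusion representations like DipPSSM concatenate or blend such composition vectors with PSSM-derived evolutionary features, which is what makes the subsequent LDA reduction necessary.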

  10. Representer-based analyses in the coastal upwelling system

    NASA Astrophysics Data System (ADS)

    Kurapov, A. L.; Egbert, G. D.; Allen, J. S.; Miller, R. N.

    2009-10-01

    The impact of surface velocity and SSH data assimilated in a model of wind-driven upwelling over the shelf is studied using representer and observational array mode analyses and twin experiments, utilizing new tangent linear (TL) and adjoint (ADJ) codes. Bathymetry, forcing, and initial conditions are assumed to be alongshore uniform, reducing the problem to a classical two-dimensional one. The model error is attributed to uncertainty in the surface wind stress. The representers, analyzed in cross-shore sections, show how assimilated observations provide corrections to the wind stress and circulation fields, and give information on the structure of the multivariate prior model error covariance. Since these error covariance fields satisfy the dynamics of the TL model, they maintain dominant balances (Ekman transport, geostrophy, thermal wind). Solutions computed over a flat bottom are qualitatively similar to a known analytical solution. Representers obtained with a long cross-shore decorrelation scale for the wind stress errors lx (compared to the distance to the coast) exhibit the classical upwelling structure. Solutions obtained with much smaller lx show structure associated with Ekman pumping, and are nearly singular if lx is smaller than the grid resolution. The zones of maximum influence of observations are sensitive to the background ocean conditions and are not necessarily centered around the observation locations. Array mode analysis is utilized to obtain model structures (combinations of representers) that are most stably observed by a given array. This analysis reveals that for realistic measurement errors and observational configurations, surface velocities will be more effective than SSH in providing corrections to the wind stress on scales of tens of km. In the DA test with synthetic observations, the prior nonlinear solution is obtained with spatially uniform alongshore wind stress and the true solution with the wind stress sharply reduced inshore of the

  11. Seq2Logo: a method for construction and visualization of amino acid binding motifs and sequence profiles including sequence weighting, pseudo counts and two-sided representation of amino acid enrichment and depletion

    PubMed Central

    Thomsen, Martin Christen Frølund; Nielsen, Morten

    2012-01-01

    Seq2Logo is a web-based sequence logo generator. Sequence logos are a graphical representation of the information content stored in a multiple sequence alignment (MSA) and provide a compact and highly intuitive representation of the position-specific amino acid composition of binding motifs, active sites, etc. in biological sequences. Accurate generation of sequence logos is often compromised by sequence redundancy and low number of observations. Moreover, most methods available for sequence logo generation focus on displaying the position-specific enrichment of amino acids, discarding the equally valuable information related to amino acid depletion. Seq2Logo aims at resolving these issues, allowing the user to include sequence weighting to correct for data redundancy, pseudo counts to correct for low number of observations, and different logotype representations each capturing different aspects related to amino acid enrichment and depletion. Besides allowing input in the format of peptides and MSA, Seq2Logo accepts input as Blast sequence profiles, providing easy access for non-expert end-users to characterize and identify functionally conserved/variable amino acids in any given protein of interest. The output from the server is a sequence logo and a PSSM. Seq2Logo is available at http://www.cbs.dtu.dk/biotools/Seq2Logo (14 May 2012, date last accessed). PMID:22638583

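    The per-column arithmetic behind such logos can be sketched in a few lines: with additive pseudocounts, each residue's letter height is its estimated frequency times the column's information content. This is a minimal stdlib-Python illustration of the general idea, not Seq2Logo's actual code (which additionally supports sequence weighting and substitution-matrix pseudocounts):

```python
import math
from collections import Counter

def logo_column_heights(column, alphabet="ACDEFGHIKLMNPQRSTVWY", pseudo=1.0):
    """Shannon-style logo heights for one MSA column.

    Each height is the residue's frequency (with a simple additive
    pseudocount) times the column's information content
    R = log2(|alphabet|) - H. A toy version of what logo generators
    compute; sequence weighting is omitted.
    """
    counts = Counter(column)
    total = len(column) + pseudo * len(alphabet)
    freqs = {a: (counts.get(a, 0) + pseudo) / total for a in alphabet}
    entropy = -sum(p * math.log2(p) for p in freqs.values())
    info = math.log2(len(alphabet)) - entropy  # information content, bits
    return {a: freqs[a] * info for a in alphabet}

# A conserved column (ten leucines) vs. a mixed column of ten residues.
conserved = logo_column_heights("LLLLLLLLLL")
mixed = logo_column_heights("LIVAGSTPFM")
```

    The conserved column yields a much taller L, and a larger total stack height, than the mixed one, which is exactly what a logo makes visible.
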
  13. Redundancy in perceptual and linguistic experience: comparing feature-based and distributional models of semantic representation.

    PubMed

    Riordan, Brian; Jones, Michael N

    2011-04-01

    Since their inception, distributional models of semantics have been criticized as inadequate cognitive theories of human semantic learning and representation. A principal challenge is that the representations derived by distributional models are purely symbolic and are not grounded in perception and action; this challenge has led many to favor feature-based models of semantic representation. We argue that the amount of perceptual and other semantic information that can be learned from purely distributional statistics has been underappreciated. We compare the representations of three feature-based and nine distributional models using a semantic clustering task. Several distributional models demonstrated semantic clustering comparable with clustering based on feature-based representations. Furthermore, when trained on child-directed speech, the same distributional models perform as well as sensorimotor-based feature representations of children's lexical semantic knowledge. These results suggest that, to a large extent, information relevant for extracting semantic categories is redundantly coded in perceptual and linguistic experience. Detailed analyses of the semantic clusters of the feature-based and distributional models also reveal that the models make use of complementary cues to semantic organization from the two data streams. Rather than conceptualizing feature-based and distributional models as competing theories, we argue that future focus should be on understanding the cognitive mechanisms humans use to integrate the two sources.

  14. A Max-Margin Perspective on Sparse Representation-Based Classification

    DTIC Science & Technology

    2013-11-30

    Sparse Representation-based Classification (SRC) is a powerful tool in distinguishing signal … analyzed from a reconstructive perspective, which neither offers any guarantee on its classification performance nor pro…

  15. Spatiotemporal dynamics of similarity-based neural representations of facial identity.

    PubMed

    Vida, Mark D; Nestor, Adrian; Plaut, David C; Behrmann, Marlene

    2017-01-10

    Humans' remarkable ability to quickly and accurately discriminate among thousands of highly similar complex objects demands rapid and precise neural computations. To elucidate the process by which this is achieved, we used magnetoencephalography to measure spatiotemporal patterns of neural activity with high temporal resolution during visual discrimination among a large and carefully controlled set of faces. We also compared these neural data to lower level "image-based" and higher level "identity-based" model-based representations of our stimuli and to behavioral similarity judgments of our stimuli. Between ∼50 and 400 ms after stimulus onset, face-selective sources in right lateral occipital cortex and right fusiform gyrus and sources in a control region (left V1) yielded successful classification of facial identity. In all regions, early responses were more similar to the image-based representation than to the identity-based representation. In the face-selective regions only, responses were more similar to the identity-based representation at several time points after 200 ms. Behavioral responses were more similar to the identity-based representation than to the image-based representation, and their structure was predicted by responses in the face-selective regions. These results provide a temporally precise description of the transformation from low- to high-level representations of facial identity in human face-selective cortex and demonstrate that face-selective cortical regions represent multiple distinct types of information about face identity at different times over the first 500 ms after stimulus onset. These results have important implications for understanding the rapid emergence of fine-grained, high-level representations of object identity, a computation essential to human visual expertise.

  16. An Accurate Projector Calibration Method Based on Polynomial Distortion Representation

    PubMed Central

    Liu, Miao; Sun, Changku; Huang, Shujun; Zhang, Zonghua

    2015-01-01

    In structured light measurement systems or 3D printing systems, the errors caused by optical distortion of a digital projector always affect the precision performance and cannot be ignored. Existing methods to calibrate the projection distortion rely on a calibration plate and photogrammetry, so the calibration performance is largely affected by the quality of the plate and the imaging system. This paper proposes a new projector calibration approach that makes use of photodiodes to directly detect the light emitted from a digital projector. By analyzing the output sequence of the photoelectric module, the pixel coordinates can be accurately obtained by the curve fitting method. A polynomial distortion representation is employed to reduce the residuals of the traditional distortion representation model. Experimental results and performance evaluation show that the proposed calibration method is able to avoid most of the disadvantages of traditional methods and achieves a higher accuracy. The proposed method is also practically applicable to evaluating the geometric optical performance of other optical projection systems. PMID:26492247

  17. Scalar products in GL(3)-based models with trigonometric R-matrix. Determinant representation

    NASA Astrophysics Data System (ADS)

    Slavnov, N. A.

    2015-03-01

    We study quantum integrable GL(3)-based models with a trigonometric R-matrix solvable by the nested algebraic Bethe ansatz. We derive a determinant representation for a special case of scalar products of Bethe vectors. This representation allows one to find a determinant formula for the form factor of one of the monodromy matrix entries. We also point out an essential difference between form factors in the models with the trigonometric R-matrix and their analogs in GL(3)-invariant models.

  18. Sparsity-Based Representation for Classification Algorithms and Comparison Results for Transient Acoustic Signals

    DTIC Science & Technology

    2016-05-01

    … large but correlated noise and signal interference (i.e., low-rank interference). Another contribution is the implementation of deep learning …

  19. Improving the Representational Strategies of Children in a Music-Listening and Playing Task: An Intervention-Based Study

    ERIC Educational Resources Information Center

    Gil, Vicent; Reybrouck, Mark; Tejada, Jesús; Verschaffel, Lieven

    2015-01-01

    This intervention-based study focuses on the relation between music and its graphic representation from a meta-representational point of view. It aims to determine whether middle school students show an increase in meta-representational competence (MRC) after an educational intervention. Three classes of 11 to 14-year-old students participated in…

  20. Refining the Construct of Classroom-Based Writing-from-Readings Assessment: The Role of Task Representation

    ERIC Educational Resources Information Center

    Wolfersberger, Mark

    2013-01-01

    This article argues that task representation should be considered as part of the construct of classroom-based academic writing. Task representation is a process that writers move through when creating a unique mental model of the requirements for each new writing task they encounter. Writers' task representations evolve throughout the composing…

  1. Deep Learning of Part-Based Representation of Data Using Sparse Autoencoders With Nonnegativity Constraints.

    PubMed

    Hosseini-Asl, Ehsan; Zurada, Jacek M; Nasraoui, Olfa

    2016-12-01

    We demonstrate a new deep learning autoencoder network, trained by a nonnegativity constraint algorithm (nonnegativity-constrained autoencoder), that learns features that show part-based representation of data. The learning algorithm is based on constraining negative weights. The performance of the algorithm is assessed based on decomposing data into parts and its prediction performance is tested on three standard image data sets and one text data set. The results indicate that the nonnegativity constraint forces the autoencoder to learn features that amount to a part-based representation of data, while improving sparsity and reconstruction quality in comparison with the traditional sparse autoencoder and nonnegative matrix factorization. It is also shown that this newly acquired representation improves the prediction performance of a deep neural network.

  2. Implicit kernel sparse shape representation: a sparse-neighbors-based object segmentation framework.

    PubMed

    Yao, Jincao; Yu, Huimin; Hu, Roland

    2017-01-01

    This paper introduces a new implicit-kernel-sparse-shape-representation-based object segmentation framework. Given an input object whose shape is similar to some of the elements in the training set, the proposed model can automatically find a cluster of implicit kernel sparse neighbors to approximately represent the input shape and guide the segmentation. A distance-constrained probabilistic definition together with a dualization energy term is developed to connect high-level shape representation and low-level image information. We theoretically prove that our model not only derives from two projected convex sets but is also equivalent to a sparse-reconstruction-error-based representation in the Hilbert space. Finally, a "wake-sleep"-based segmentation framework is applied to drive the evolutionary curve to recover the original shape of the object. We test our model on two public datasets. Numerical experiments on both synthetic images and real applications show the superior capabilities of the proposed framework.

  3. Alignment editing and identification of consensus secondary structures for nucleic acid sequences: interactive use of dot matrix representations.

    PubMed Central

    Davis, J P; Janjić, N; Pribnow, D; Zichi, D A

    1995-01-01

    We present a computer-aided approach for identifying and aligning consensus secondary structure within a set of functionally related oligonucleotide sequences aligned by sequence. The method relies on visualization of secondary structure using a generalization of the dot matrix representation appropriate for consensus sequence data sets. An interactive computer program implementing such a visualization of consensus structure has been developed. The program allows for alignment editing, data and display filtering and various modes of base pair representation, including co-variation. The utility of this approach is demonstrated with four sample data sets derived from in vitro selection experiments and one data set comprising tRNA sequences. PMID:7501472

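    The core of a dot matrix display for nucleic acid secondary structure is easy to sketch: mark cell (i, j) whenever bases i and j could pair, so candidate helices appear as runs of marks perpendicular to the main diagonal. A minimal stdlib-Python sketch of the single-sequence case (the consensus weighting, display filtering, and covariation modes described above are omitted, and G-U wobble pairs are ignored):

```python
def pairing_dot_matrix(seq):
    """Dot matrix for candidate intramolecular base pairs of an RNA.

    Cell (i, j) is True when bases i and j could form a Watson-Crick
    pair (G-C or A-U); complementary stems show up as anti-diagonal
    runs of marks.
    """
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G")}
    n = len(seq)
    return [[(seq[i], seq[j]) in pairs for j in range(n)] for i in range(n)]

# A short hairpin: in GGGAAACCC the three G-C pairs of the stem appear
# as the anti-diagonal run (0,8), (1,7), (2,6) around the AAA loop.
m = pairing_dot_matrix("GGGAAACCC")
```
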
  4. An Empirical Study of Plan-Based Representations of Pascal and Fortran Code.

    DTIC Science & Technology

    1987-06-01

    Researchers have argued recently that programmers utilize a plan-based representation when composing or comprehending program code. In a series of studies we …

  5. Hardware support for shape decoding from 2D-region-based image representations

    NASA Astrophysics Data System (ADS)

    Privat, Gilles; Le Hin, Ivan

    1997-01-01

    Graphics systems have long been using standard libraries or APIs to insulate applications from implementation specifics. The same approach is applicable to natural image representations based on object primitives, such as those proposed for MPEG-4 standardization. The rendering of these image objects can be hidden behind APIs and supported either in hardware or software, depending on the level of representation they address, so that higher-level manipulation of these objects is made independent of the pixel level. We evaluate the trade-offs involved in the choice of these primitives to be used as pivotal intermediate representations. The example addressed is shape coding for image regions obtained from segmentation. Shape coding primitives based on either contour (chain codes), union of elementary patterns, or the alpha plane are evaluated with regard both to the possibility of supporting them on different architecture models and to the level of functionality they make available.

  6. Low-rank and eigenface based sparse representation for face recognition.

    PubMed

    Hou, Yi-Fu; Sun, Zhan-Li; Chong, Yan-Wen; Zheng, Chun-Hou

    2014-01-01

    In this paper, based on low-rank representation and eigenface extraction, we present an improvement to the well-known Sparse Representation based Classification (SRC). Firstly, the low-rank images of the face images of each individual in the training subset are extracted by Robust Principal Component Analysis (Robust PCA) to alleviate the influence of noise (e.g., illumination differences and occlusions). Secondly, Singular Value Decomposition (SVD) is applied to extract the eigenfaces from these low-rank and approximate images. Finally, we utilize these eigenfaces to construct a compact and discriminative dictionary for sparse representation. We evaluate our method on five popular databases. Experimental results demonstrate the effectiveness and robustness of our method.

  7. Low-Rank and Eigenface Based Sparse Representation for Face Recognition

    PubMed Central

    Hou, Yi-Fu; Sun, Zhan-Li; Chong, Yan-Wen; Zheng, Chun-Hou

    2014-01-01

    In this paper, based on low-rank representation and eigenface extraction, we present an improvement to the well-known Sparse Representation based Classification (SRC). Firstly, the low-rank images of the face images of each individual in the training subset are extracted by Robust Principal Component Analysis (Robust PCA) to alleviate the influence of noise (e.g., illumination differences and occlusions). Secondly, Singular Value Decomposition (SVD) is applied to extract the eigenfaces from these low-rank and approximate images. Finally, we utilize these eigenfaces to construct a compact and discriminative dictionary for sparse representation. We evaluate our method on five popular databases. Experimental results demonstrate the effectiveness and robustness of our method. PMID:25334027

  8. Low-dose computed tomography image denoising based on joint wavelet and sparse representation.

    PubMed

    Ghadrdan, Samira; Alirezaie, Javad; Dillenseger, Jean-Louis; Babyn, Paul

    2014-01-01

    Image denoising and signal enhancement are the most challenging issues in low-dose computed tomography (CT) imaging. Sparse representational methods have shown initial promise for these applications. In this work we present a wavelet-based sparse representation denoising technique utilizing dictionary learning and clustering. By using wavelets we extract the most suitable features in the images to obtain accurate dictionary atoms for the denoising algorithm. To achieve improved results we also lower the number of clusters, which reduces computational complexity. In addition, a single-image noise level estimation is developed to update the cluster centers at higher PSNRs. Our results, along with the computational efficiency of the proposed algorithm, clearly demonstrate the improvement of the proposed algorithm over other clustering-based sparse representation (CSR) and K-SVD methods.

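    The shrinkage step at the heart of most sparse-representation denoisers, wavelet-domain ones included, can be illustrated with the classic soft-thresholding operator. This is a generic sketch of the principle, not the authors' dictionary-learning pipeline:

```python
def soft_threshold(x, t):
    """Soft-thresholding (shrinkage) operator used to enforce sparsity.

    Coefficients with magnitude below the threshold t are zeroed, and
    larger ones are shrunk toward zero by t. Applied to wavelet
    coefficients of a noisy image, this suppresses the many small
    noise-dominated coefficients while keeping the large,
    signal-carrying ones.
    """
    return [max(abs(v) - t, 0.0) * (1 if v >= 0 else -1) for v in x]

coeffs = [5.0, -0.2, 0.1, -3.0, 0.05]   # two large terms, three "noise" terms
denoised = soft_threshold(coeffs, 0.5)  # -> [4.5, -0.0, 0.0, -2.5, 0.0]
```
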
  9. Nucleic acid based logical systems.

    PubMed

    Han, Da; Kang, Huaizhi; Zhang, Tao; Wu, Cuichen; Zhou, Cuisong; You, Mingxu; Chen, Zhuo; Zhang, Xiaobing; Tan, Weihong

    2014-05-12

    Researchers increasingly visualize a significant role for artificial biochemical logical systems in biological engineering, much like digital logic circuits in electrical engineering. Those logical systems could be utilized as a type of servomechanism to control nanodevices in vitro, monitor chemical reactions in situ, or regulate gene expression in vivo. Nucleic acids (NA), as carriers of genetic information with well-regulated and predictable structures, are promising materials for the design and engineering of biochemical circuits. A number of logical devices based on nucleic acids have been designed to handle various processes for technological or biotechnological purposes. This article focuses on the most recent and important developments in NA-based logical devices and their evolution from in vitro, through cellular, and even toward in vivo biological applications.

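    A toy version of the AND logic such devices implement can be written down directly. The sequences below are hypothetical, and the boolean check only mimics the truth table of a real toehold-mediated strand-displacement gate (no kinetics, no toeholds):

```python
def complement(strand):
    """Watson-Crick complement of a DNA strand, read 5'->3'."""
    table = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(table[b] for b in reversed(strand))

def and_gate(input1, input2, site1, site2):
    """Toy two-input AND gate: the reporter is 'released' only when both
    inputs exactly complement the gate's two recognition sites. Real
    nucleic-acid gates achieve this via strand displacement; this
    sketch captures only the truth table."""
    return input1 == complement(site1) and input2 == complement(site2)

site_a, site_b = "ATGGC", "TTACG"   # hypothetical recognition sites
in_a, in_b = complement(site_a), complement(site_b)
assert and_gate(in_a, in_b, site_a, site_b)         # 1 AND 1 -> 1
assert not and_gate(in_a, "AAAAA", site_a, site_b)  # 1 AND 0 -> 0
```
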
  10. An efficient numerical method for protein sequences similarity analysis based on a new two-dimensional graphical representation.

    PubMed

    El-Lakkani, A; Mahran, H

    2015-01-01

    A new two-dimensional graphical representation of protein sequences is introduced. Twenty concentric evenly spaced circles divided by n radial lines into equal divisions are selected to represent any protein sequence of length n. Each circle represents one of the different 20 amino acids, and each radial line represents a single amino acid of the protein sequence. An efficient numerical method based on the graph is proposed to measure the similarity between two protein sequences. To prove the accuracy of our approach, the method is applied to NADH dehydrogenase subunit 5 (ND5) proteins of nine different species and 24 transferrin sequences from vertebrates. High values of correlation coefficient between our results and the results of ClustalW are obtained (approximately perfect correlations). These values are higher than the values obtained in many other related works.

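    The coordinate construction in this graph is simple enough to sketch: residue k of an n-residue sequence lies on radial line k (angle 2πk/n) at the radius of the circle assigned to its amino acid. A stdlib-Python sketch of just the coordinates, with an assumed alphabetical circle order; the similarity measure itself is not reproduced here:

```python
import math

# One concentric circle (radius 1..20) per amino acid, in a fixed order.
AA_RADIUS = {aa: r for r, aa in enumerate("ACDEFGHIKLMNPQRSTVWY", start=1)}

def circular_representation(seq):
    """Map a protein sequence to 2-D points on the circles-and-rays graph.

    Residue k lies on radial line k (angle 2*pi*k/n) at the radius of
    its amino acid's circle, so every sequence of length n uses the
    same n radial lines and the same 20 circles.
    """
    n = len(seq)
    points = []
    for k, aa in enumerate(seq):
        r = AA_RADIUS[aa]
        theta = 2 * math.pi * k / n
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

pts = circular_representation("MKV")  # three residues -> three radial lines
```
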
  11. Teaching Representations of Competency-Based Education. A Case Study

    ERIC Educational Resources Information Center

    Covarrubias-Papahiu, Patricia

    2016-01-01

    The aim of this research was to know how the Competency-Based Education (CBE) approach is represented by professors who are part of the professional education of psychologists, and the challenges and implications of, in their opinion, incorporating it in the classroom practice. Therefore, a research was conducted to know the type of…

  12. Estimation of the entropy based on its polynomial representation.

    PubMed

    Vinck, Martin; Battaglia, Francesco P; Balakirsky, Vladimir B; Vinck, A J Han; Pennartz, Cyriel M A

    2012-05-01

    Estimating entropy from empirical samples of finite size is of central importance for information theory as well as the analysis of complex statistical systems. Yet, this delicate task is marred by intrinsic statistical bias. Here we decompose the entropy function into a polynomial approximation function and a remainder function. The approximation function is based on a Taylor expansion of the logarithm. Given n observations, we give an unbiased, linear estimate of the first n power series terms based on counting sets of k coincidences. For the remainder function we use nonlinear Bayesian estimation with a nearly flat prior distribution on the entropy that was developed by Nemenman, Shafee, and Bialek. Our simulations show that the combined entropy estimator has reduced bias in comparison to other available estimators.

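    One building block of the polynomial part can be stated concretely: the fraction of coincident pairs among n observations is an unbiased estimate of the second power sum Σᵢ pᵢ², and analogous k-coincidence counts estimate the higher-order terms. A stdlib-Python sketch of the pairwise case only (the paper's full estimator combines many such terms with a Bayesian remainder):

```python
from itertools import combinations

def coincidence_estimate(sample):
    """Unbiased estimate of sum_i p_i^2 from a sample of n observations:
    the number of coincident (equal-valued) pairs divided by C(n, 2).
    Power sums like this are the building blocks of the polynomial part
    of the entropy estimator described above."""
    n = len(sample)
    coincident = sum(1 for a, b in combinations(sample, 2) if a == b)
    return coincident / (n * (n - 1) / 2)

# For a fair coin the true value is 0.5^2 + 0.5^2 = 0.5; a finite sample
# gives a noisy but unbiased estimate of it.
est = coincidence_estimate("HTHHTTHT")
```
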
  13. Estimation of the entropy based on its polynomial representation

    NASA Astrophysics Data System (ADS)

    Vinck, Martin; Battaglia, Francesco P.; Balakirsky, Vladimir B.; Vinck, A. J. Han; Pennartz, Cyriel M. A.

    2012-05-01

    Estimating entropy from empirical samples of finite size is of central importance for information theory as well as the analysis of complex statistical systems. Yet, this delicate task is marred by intrinsic statistical bias. Here we decompose the entropy function into a polynomial approximation function and a remainder function. The approximation function is based on a Taylor expansion of the logarithm. Given n observations, we give an unbiased, linear estimate of the first n power series terms based on counting sets of k coincidences. For the remainder function we use nonlinear Bayesian estimation with a nearly flat prior distribution on the entropy that was developed by Nemenman, Shafee, and Bialek. Our simulations show that the combined entropy estimator has reduced bias in comparison to other available estimators.

  14. Sparse coding based dense feature representation model for hyperspectral image classification

    NASA Astrophysics Data System (ADS)

    Oguslu, Ender; Zhou, Guoqing; Zheng, Zezhong; Iftekharuddin, Khan; Li, Jiang

    2015-11-01

    We present a sparse coding based dense feature representation model (a preliminary version of the paper was presented at the SPIE Remote Sensing Conference, Dresden, Germany, 2013) for hyperspectral image (HSI) classification. The proposed method learns a new representation for each pixel in HSI through the following four steps: sub-band construction, dictionary learning, encoding, and feature selection. The new representation usually has a very high dimensionality requiring a large amount of computational resources. We applied the l1/lq regularized multiclass logistic regression technique to reduce the size of the new representation. We integrated the method with a linear support vector machine (SVM) and a composite kernels SVM (CKSVM) to discriminate different types of land cover. We evaluated the proposed algorithm on three well-known HSI datasets and compared our method to four recently developed classification methods: SVM, CKSVM, simultaneous orthogonal matching pursuit, and image fusion and recursive filtering. Experimental results show that the proposed method can achieve better overall and average classification accuracies with a much more compact representation leading to more efficient sparse models for HSI classification.

  15. Is Mathematical Representation of Problems an Evidence-Based Strategy for Students with Mathematics Difficulties?

    ERIC Educational Resources Information Center

    Jitendra, Asha K.; Nelson, Gena; Pulles, Sandra M.; Kiss, Allyson J.; Houseworth, James

    2016-01-01

    The purpose of the present review was to evaluate the quality of the research and evidence base for representation of problems as a strategy to enhance the mathematical performance of students with learning disabilities and those at risk for mathematics difficulties. The authors evaluated 25 experimental and quasiexperimental studies according to…

  16. An Individualized e-Reading System Developed Based on Multi-Representations Approach

    ERIC Educational Resources Information Center

    Ko, Chien-Chuan; Chiang, Chun-Han; Lin, Yun-Lung; Chen, Ming-Chung

    2011-01-01

    Students with disabilities encounter many difficulties in learning activities, especially in reading. To help them participate in reading activities more effectively, this study proposed an integrated reading support system based on the principle of multiple representations. An integrated e-reading support system provides physical, sensory, and…

  17. Supporting Representational Competence in High School Biology with Computer-Based Biomolecular Visualizations

    ERIC Educational Resources Information Center

    Wilder, Anna; Brinkerhoff, Jonathan

    2007-01-01

    This study assessed the effectiveness of computer-based biomolecular visualization activities on the development of high school biology students' representational competence as a means of understanding and visualizing protein structure/function relationships. Also assessed were students' attitudes toward these activities. Sixty-nine students…

  18. Growth Points in Linking Representations of Function: A Research-Based Framework

    ERIC Educational Resources Information Center

    Ronda, Erlina

    2015-01-01

    This paper describes five growth points in linking representations of function developed from a study of secondary school learners. Framed within the cognitivist perspective and process-object conception of function, the growth points were identified and described based on linear and quadratic function tasks learners can do and their strategies…

  19. Compressive Fresnel digital holography using Fresnelet based sparse representation

    NASA Astrophysics Data System (ADS)

    Ramachandran, Prakash; Alex, Zachariah C.; Nelleri, Anith

    2015-04-01

    Compressive sensing (CS) in digital holography requires only a very small number of pixel-level detections in the hologram plane for accurate image reconstruction, and this is achieved by exploiting the sparsity of the object wave. When the input object fields are non-sparse in the spatial domain, CS demands a suitable sparsification method such as wavelet decomposition. The Fresnelet, a suitable wavelet basis for processing Fresnel digital holograms, is an efficient sparsifier for the complex Fresnel field obtained by the Fresnel transform of the object field and minimizes the mutual coherence between the sensing and sparsifying matrices involved in CS. The paper demonstrates the merits of Fresnelet-based sparsification in compressive digital Fresnel holography over the conventional method of sparsifying the input object field. Phase-shifting digital Fresnel holography (PSDH) is used to retrieve the complex Fresnel field for the chosen problem. The results are presented from a numerical experiment to show the proof of the concept.

  20. Sparse Representation Based Multiple Frame Video Super-Resolution.

    PubMed

    Dai, Qiqin; Yoo, Seunghwan; Kappeler, Armin; Katsaggelos, Aggelos K

    2016-11-22

    In this paper, we propose two multiple-frame superresolution (SR) algorithms based on dictionary learning and motion estimation. First, we adopt the use of video bilevel dictionary learning which has been used for single-frame SR. It is extended to multiple frames by using motion estimation with subpixel accuracy. We propose a batch and a temporally recursive multi-frame SR algorithm, which improve over single frame SR. Finally, we propose a novel dictionary learning algorithm utilizing consecutive video frames, rather than still images or individual video frames, which further improves the performance of the video SR algorithms. Extensive experimental comparisons with state-of-the-art SR algorithms verify the effectiveness of our proposed multiple-frame video SR approach.

  1. Shape representation for efficient landmark-based segmentation in 3-d.

    PubMed

    Ibragimov, Bulat; Likar, Boštjan; Pernuš, Franjo; Vrtovec, Tomaž

    2014-04-01

    In this paper, we propose a novel approach to landmark-based shape representation that is based on transportation theory, where landmarks are considered as sources and destinations, all possible landmark connections as roads, and established landmark connections as goods transported via these roads. Landmark connections, which are selectively established, are identified through their statistical properties describing the shape of the object of interest, and indicate the least costly roads for transporting goods from sources to destinations. From such a perspective, we introduce three novel shape representations that are combined with an existing landmark detection algorithm based on game theory. To reduce the computational complexity that results from the extension from 2-D to 3-D segmentation, landmark detection is augmented by a concept known in game theory as strategy dominance. The novel shape representations, game-theoretic landmark detection and strategy dominance are combined into a segmentation framework that was evaluated on 3-D computed tomography images of lumbar vertebrae and femoral heads. The best shape representation yielded symmetric surface distances of 0.75 mm and 1.11 mm, and Dice coefficients of 93.6% and 96.2%, for lumbar vertebrae and femoral heads, respectively. By applying strategy dominance, the computational costs were further reduced by up to a factor of three.

  2. Improving the learning of clinical reasoning through computer-based cognitive representation

    PubMed Central

    Wu, Bian; Wang, Minhong; Johnson, Janice M.; Grotzer, Tina A.

    2014-01-01

    Objective Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results A significant improvement was found in students’ learning products from the beginning to the end of the study, consistent with students’ report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction. PMID:25518871

  3. The Acid-Base Titration of a Very Weak Acid: Boric Acid

    ERIC Educational Resources Information Center

    Celeste, M.; Azevedo, C.; Cavaleiro, Ana M. V.

    2012-01-01

    A laboratory experiment based on the titration of boric acid with strong base in the presence of d-mannitol is described. Boric acid is a very weak acid and direct titration with NaOH is not possible. An auxiliary reagent that contributes to the release of protons in a known stoichiometry facilitates the acid-base titration. Students obtain the…
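
    The effect of the auxiliary reagent can be illustrated with a short buffer-region calculation. In the sketch below (a minimal illustration, not part of the described experiment), the pKa of boric acid is the accepted value of about 9.24, while the effective pKa of the boric acid-mannitol complex is an illustrative figure chosen only to show how complexation moves the endpoint into a titratable range:

```python
import math

def buffer_ph(pka, fraction_titrated):
    """Henderson-Hasselbalch pH in the buffer region of a weak-acid titration."""
    f = fraction_titrated
    return pka + math.log10(f / (1.0 - f))

PKA_BORIC = 9.24          # boric acid alone: too weak for a sharp NaOH endpoint
PKA_WITH_MANNITOL = 5.0   # illustrative effective pKa of the boric acid-mannitol complex

# Halfway to the equivalence point, pH equals the (effective) pKa.
print(buffer_ph(PKA_BORIC, 0.5))           # 9.24
print(buffer_ph(PKA_WITH_MANNITOL, 0.5))   # 5.0
```

    Lowering the effective pKa by several units is what makes the direct NaOH titration feasible: the buffer region, and hence the endpoint jump, moves well away from the basic interference region.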

  4. Integration of object-oriented knowledge representation with the CLIPS rule based system

    NASA Technical Reports Server (NTRS)

    Logie, David S.; Kamil, Hasan

    1990-01-01

    The paper describes a portion of the work aimed at developing an integrated, knowledge-based environment for the development of engineering-oriented applications. An Object Representation Language (ORL) was implemented in C++ and is used to build and modify an object-oriented knowledge base. The ORL was designed to be easily integrated with other representation schemes that could effectively reason with the object base. Specifically, the integration of the ORL with the rule-based C Language Integrated Production System (CLIPS), developed at the NASA Johnson Space Center, is discussed. The object-oriented knowledge representation provides a natural means of representing problem data as a collection of related objects. Objects are comprised of descriptive properties and interrelationships. The object-oriented model promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects. Data is inherited through an object network via the relationship links. Together, the two schemes complement each other in that the object-oriented approach efficiently handles problem data while the rule-based knowledge is used to simulate the reasoning process. Alone, the object-based knowledge is little more than an object-oriented data storage scheme; however, the CLIPS inference engine adds the mechanism to directly and automatically reason with that knowledge. In this hybrid scheme, the expert system dynamically queries for data and can modify the object base with complete access to all the functionality of the ORL from rules.

  5. Spectral-spatial hyperspectral image classification using super-pixel-based spatial pyramid representation

    NASA Astrophysics Data System (ADS)

    Fan, Jiayuan; Tan, Hui Li; Toomik, Maria; Lu, Shijian

    2016-10-01

    Spatial pyramid matching has demonstrated its power for image recognition tasks by pooling features from spatially increasingly fine sub-regions. Motivated by the concept of feature pooling at multiple pyramid levels, we propose a novel spectral-spatial hyperspectral image classification approach using superpixel-based spatial pyramid representation. This technique first generates multiple superpixel maps by gradually decreasing the superpixel number, yielding increasingly large spatial regions for labelled samples. Using every superpixel map, sparse representations of pixels within every spatial region are then computed through local max pooling. Finally, features learned from training samples are aggregated and trained by a support vector machine (SVM) classifier. The proposed spectral-spatial hyperspectral image classification technique has been evaluated on two public hyperspectral datasets, including the Indian Pines image containing 16 different agricultural scene categories with a 20m resolution acquired by AVIRIS and the University of Pavia image containing 9 land-use categories with a 1.3m spatial resolution acquired by the ROSIS-03 sensor. Experimental results show significantly improved performance compared with state-of-the-art methods. The major contributions of this proposed technique include (1) a new spectral-spatial classification approach to generate feature representations for hyperspectral images, (2) a complementary yet effective feature pooling approach, i.e., the superpixel-based spatial pyramid representation that is used for the spatial correlation study, and (3) evaluation on two public hyperspectral image datasets with superior image classification performance.
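
    The per-level pooling step can be sketched in a few lines of NumPy. This is a minimal illustration with toy shapes and hand-made label maps, not the authors' code; the full pipeline additionally involves sparse coding of the spectra and an SVM:

```python
import numpy as np

def superpixel_max_pool(features, labels):
    """Max-pool per-pixel feature vectors within each superpixel region.
    features: (H, W, D); labels: (H, W) integer superpixel map.
    Returns {superpixel_id: (D,) pooled vector}."""
    return {sp: features[labels == sp].max(axis=0) for sp in np.unique(labels)}

def pyramid_feature(features, label_maps, row, col):
    """Concatenate the pooled vector of the region containing (row, col)
    at every pyramid level (one, increasingly fine, label map per level)."""
    return np.concatenate([
        superpixel_max_pool(features, labels)[labels[row, col]]
        for labels in label_maps
    ])

# toy example: 4x4 image with 3-dim codes and a two-level pyramid
rng = np.random.default_rng(0)
feats = rng.random((4, 4, 3))
coarse = np.zeros((4, 4), dtype=int)   # level 1: one region (whole image)
fine = np.zeros((4, 4), dtype=int)     # level 2: left/right halves
fine[:, 2:] = 1
vec = pyramid_feature(feats, [coarse, fine], 0, 0)
print(vec.shape)  # (6,): 3 dims pooled per pyramid level
```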

  6. Virtual images inspired consolidate collaborative representation-based classification method for face recognition

    NASA Astrophysics Data System (ADS)

    Liu, Shigang; Zhang, Xinxin; Peng, Yali; Cao, Han

    2016-07-01

    The collaborative representation-based classification method performs well in the classification of high-dimensional images such as face recognition. It utilizes training samples from all classes to represent a test sample and assigns a class label to the test sample using the representation residuals. However, this method still suffers from the problem that a limited number of training samples lowers the classification accuracy when applied to image classification. In this paper, we propose a modified collaborative representation-based classification method (MCRC), which exploits novel virtual images and can obtain high classification accuracy. The procedure for producing virtual images is very simple, but using them can bring a surprising performance improvement. The virtual images can sufficiently denote the features of original face images in some cases. Extensive experimental results demonstrate that the proposed method can effectively improve the classification accuracy. This is mainly attributed to the integration of the collaborative representation and the proposed feature-information dominated virtual images.
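
    The core of collaborative representation can be sketched as a regularized least-squares fit of the test sample over all training samples, followed by a per-class residual comparison. This is a minimal toy illustration of the baseline scheme only; it does not include the proposed virtual images:

```python
import numpy as np

def crc_classify(X, y_labels, test, lam=0.01):
    """Collaborative representation classification (minimal sketch).
    X: (d, n) training samples as columns; y_labels: (n,) class ids;
    test: (d,). The test sample is coded over ALL training samples
    (ridge/l2 solution), then assigned by per-class reconstruction residual."""
    n = X.shape[1]
    alpha = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ test)
    residuals = {}
    for c in np.unique(y_labels):
        idx = y_labels == c
        residuals[int(c)] = float(np.linalg.norm(test - X[:, idx] @ alpha[idx]))
    return min(residuals, key=residuals.get)

# toy data: two classes lying near orthogonal axes in 5-D
e = np.eye(5)
X = np.column_stack([e[0], e[0] + 0.1 * e[2],    # class 0: near the x1 axis
                     e[1], e[1] + 0.1 * e[3]])   # class 1: near the x2 axis
labels = np.array([0, 0, 1, 1])
print(crc_classify(X, labels, e[0]))  # 0
print(crc_classify(X, labels, e[1]))  # 1
```

    The design choice that distinguishes collaborative from sparse representation is the l2 penalty: the code has a closed-form solution, so no iterative sparse solver is needed.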

  7. Multiple Kernel Sparse Representation based Orthogonal Discriminative Projection and Its Cost-Sensitive Extension.

    PubMed

    Zhang, Guoqing; Sun, Huaijiang; Xia, Guiyu; Sun, Quansen

    2016-07-07

    Sparse representation based classification (SRC) has been developed and shown great potential for real-world applications. Based on SRC, Yang et al. [10] devised an SRC-steered discriminative projection (SRC-DP) method. However, as a linear algorithm, SRC-DP cannot handle data with a highly nonlinear distribution. Kernel sparse representation-based classifier (KSRC) is a non-linear extension of SRC and can remedy this drawback. KSRC requires a predetermined kernel function, however, and selecting the kernel function and its parameters is difficult. Recently, multiple kernel learning for SRC (MKL-SRC) [22] has been proposed to learn a kernel from a set of base kernels. However, MKL-SRC considers only the within-class reconstruction residual, ignoring the between-class relationship, when learning the kernel weights. In this paper, we propose a novel multiple kernel sparse representation-based classifier (MKSRC) and then use it as a criterion to design a multiple kernel sparse representation based orthogonal discriminative projection method (MK-SR-ODP). The proposed algorithm aims at learning a projection matrix and a corresponding kernel from the given base kernels such that, in the low-dimensional subspace, the between-class reconstruction residual is maximized and the within-class reconstruction residual is minimized. Furthermore, to achieve a minimum overall loss by performing recognition in the learned low-dimensional subspace, we introduce cost information into the dimensionality reduction method. The solutions for the proposed method can be found efficiently with the trace ratio optimization method [33]. Extensive experimental results demonstrate the superiority of the proposed algorithm when compared with state-of-the-art methods.

  8. A Frame-based Representation for a Bedside Ventilator Weaning Protocol

    PubMed Central

    Sorenson, D; Grissom, CK; Carpenter, L; Austin, A; Sward, K; Napoli, L; Warner, HR; Morris, AH

    2016-01-01

    We describe the use of a frame-based knowledge representation to construct an adequately explicit bedside clinical decision support application for ventilator weaning. The application consists of a data entry form, a knowledge base, an inference engine, and a patient database. The knowledge base contains database queries, a data dictionary, and decision frames. A frame consists of a title, a list of findings necessary to make a decision or carry out an action, and a logic or mathematical statement to determine its output. Frames for knowledge representation are advantageous because they can be created, visualized, and conceptualized as self-contained entities that correspond to accepted medical constructs. They facilitate knowledge engineering and provide understandable explanations of protocol outputs for clinicians. Our frames are elements of a hierarchical decision process. In addition to running diagnostic and therapeutic logic, frames can run database queries, make changes to the user interface, and modify computer variables. PMID:18358789
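
    A decision frame of the kind described (a title, required findings, and decision logic) can be sketched as a small data structure. The frame name, findings, and thresholds below are hypothetical illustrations, not the values of the actual weaning protocol:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Frame:
    """Minimal decision frame: fires only when all required findings are present."""
    title: str
    findings: List[str]                       # findings needed before the frame can fire
    logic: Callable[[Dict[str, float]], str]  # decision logic over the findings

    def run(self, patient: Dict[str, float]) -> str:
        missing = [f for f in self.findings if f not in patient]
        if missing:
            return f"cannot fire: missing {missing}"
        return self.logic(patient)

# hypothetical frame; thresholds are illustrative, not the protocol's values
wean_frame = Frame(
    title="Spontaneous breathing trial eligibility",
    findings=["fio2", "peep"],
    logic=lambda p: "eligible" if p["fio2"] <= 0.5 and p["peep"] <= 8 else "not eligible",
)

print(wean_frame.run({"fio2": 0.4, "peep": 5}))  # eligible
print(wean_frame.run({"fio2": 0.4}))             # cannot fire: missing ['peep']
```

    The self-contained structure is what makes the explanation property easy: the frame can report exactly which findings it used and which were missing.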

  9. Reexamining the language account of cross-national differences in base-10 number representations.

    PubMed

    Vasilyeva, Marina; Laski, Elida V; Ermakova, Anna; Lai, Weng-Feng; Jeong, Yoonkyung; Hachigian, Amy

    2015-01-01

    East Asian students consistently outperform students from other nations in mathematics. One explanation for this advantage is a language account; East Asian languages, unlike most Western languages, provide cues about the base-10 structure of multi-digit numbers, facilitating the development of base-10 number representations. To test this view, the current study examined how kindergartners represented two-digit numbers using single unit-blocks and ten-blocks. The participants (N=272) were from four language groups (Korean, Mandarin, English, and Russian) that vary in the extent of "transparency" of the base-10 structure. In contrast to previous findings with older children, kindergartners showed no cross-language variability in the frequency of producing base-10 representations. Furthermore, they showed a pattern of within-language variability that was not consistent with the language account and was likely attributable to experiential factors. These findings suggest that language might not play as critical a role in the development of base-10 representations as suggested in earlier research.

  10. User-based representation of time-resolved multimodal public transportation networks

    PubMed Central

    Alessandretti, Laura; Gauvin, Laetitia

    2016-01-01

    Multimodal transportation systems, with several coexisting services like bus, tram and metro, can be represented as time-resolved multilayer networks where the different transportation modes connecting the same set of nodes are associated with distinct network layers. Their quantitative description became possible recently due to openly accessible datasets describing the geo-localized transportation dynamics of large urban areas. Such advancements call for novel analytics that combine established methods and exploit the inherent complexity of the data. Here, we provide a novel user-based representation of public transportation systems that accounts for the presence of multiple lines, reduces the effect of spatial embeddedness, and considers the total travel time, its variability across the schedule, and the number of transfers required. After adjusting earlier techniques to this representation framework, we analyse the public transportation systems of several French municipal areas and identify hidden patterns of privileged connections. Furthermore, we study their efficiency as compared to the commuting flow. The proposed representation could help enhance the resilience of local transportation systems and inform better design policies for future developments. PMID:27493773
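
    A minimal multilayer representation with a transfer penalty can be sketched as follows. This is a static toy under assumed edge times and penalty; the paper's representation is additionally time-resolved and accounts for schedule variability:

```python
import heapq
import itertools
from collections import defaultdict

def cheapest_cost(edges, source, target, transfer_penalty=5.0):
    """Cheapest travel time on a multimodal (multilayer) network, adding a
    fixed penalty whenever the mode (layer) changes between consecutive legs.
    edges: list of (u, v, mode, minutes)."""
    graph = defaultdict(list)
    for u, v, mode, minutes in edges:
        graph[u].append((v, mode, minutes))
    counter = itertools.count()  # tie-breaker so heap never compares modes
    heap = [(0.0, next(counter), source, None)]
    seen = {}
    while heap:
        cost, _, node, mode = heapq.heappop(heap)
        if node == target:
            return cost
        if seen.get((node, mode), float("inf")) <= cost:
            continue
        seen[(node, mode)] = cost
        for nxt, m, minutes in graph[node]:
            extra = transfer_penalty if mode is not None and m != mode else 0.0
            heapq.heappush(heap, (cost + minutes + extra, next(counter), nxt, m))
    return float("inf")

# toy network: staying on the bus takes 20 min; metro + transfer + bus takes 19
edges = [
    ("A", "B", "bus", 10), ("B", "C", "bus", 10),
    ("A", "B", "metro", 4),
]
print(cheapest_cost(edges, "A", "C"))  # 19.0
```

    Keeping the arrival mode inside the search state is what lets a single-layer shortest-path routine respect the multilayer transfer cost.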

  11. A Computational Shape-based Model of Anger and Sadness Justifies a Configural Representation of Faces

    PubMed Central

    Neth, Donald; Martinez, Aleix M.

    2010-01-01

    Research suggests that configural cues (second-order relations) play a major role in the representation and classification of face images; making faces a “special” class of objects, since object recognition seems to use different encoding mechanisms. It is less clear, however, how this representation emerges and whether this representation is also used in the recognition of facial expressions of emotion. In this paper, we show how configural cues emerge naturally from a classical analysis of shape in the recognition of anger and sadness. In particular our results suggest that at least two of the dimensions of the computational (cognitive) space of facial expressions of emotion correspond to pure configural changes. The first of these dimensions measures the distance between the eyebrows and the mouth, while the second is concerned with the height-width ratio of the face. Under this proposed model, becoming a face “expert” would mean to move from the generic shape representation to that based on configural cues. These results suggest that the recognition of facial expressions of emotion shares this expertise property with the other processes of face processing. PMID:20510267

  12. Sparse Representation-Based Image Quality Index With Adaptive Sub-Dictionaries.

    PubMed

    Li, Leida; Cai, Hao; Zhang, Yabin; Lin, Weisi; Kot, Alex C; Sun, Xingming

    2016-08-01

    Distortions cause structural changes in digital images, leading to degraded visual quality. Dictionary-based sparse representation has been widely studied recently due to its ability to extract inherent image structures. Meanwhile, it can extract image features with slightly higher-level semantics. Intuitively, sparse representation can be used for image quality assessment, because visible distortions can cause significant changes to the sparse features. In this paper, a new sparse representation-based image quality assessment model is proposed based on the construction of adaptive sub-dictionaries. An overcomplete dictionary trained from natural images is employed to capture the structure changes between the reference and distorted images by sparse feature extraction via adaptive sub-dictionary selection. Based on the observation that image sparse features are invariant to weak degradations and the perceived image quality is generally influenced by diverse issues, three auxiliary quality features are added, including gradient, color, and luminance information. The proposed method is not sensitive to training images, so a universal dictionary can be adopted for quality evaluation. Extensive experiments on five public image quality databases demonstrate that the proposed method produces state-of-the-art results and performs consistently well across different image quality databases.
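
    The sparse feature extraction that such a model relies on can be sketched with a small orthogonal-matching-pursuit coder. This is a toy stand-in under an assumed random dictionary; it omits the paper's adaptive sub-dictionary selection and the gradient/color/luminance features:

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal matching pursuit: greedy sparse code of x over dictionary D
    (columns are unit-norm atoms), with at most k selection steps."""
    residual, support = x.astype(float), []
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        atom = int(np.argmax(np.abs(D.T @ residual)))
        if atom not in support:
            support.append(atom)
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ sol
    coef[support] = sol
    return coef

def feature_similarity(D, ref, dist, k=4):
    """Cosine similarity between the sparse codes of a reference and a
    distorted patch: a toy stand-in for sparse-feature comparison."""
    a, b = omp(D, ref, k), omp(D, dist, k)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(2)
D = rng.normal(size=(16, 32))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
patch = 2.0 * D[:, 3] - 1.5 * D[:, 17]       # patch built from two atoms
noisy = patch + 0.5 * rng.normal(size=16)    # "distorted" version
print(round(feature_similarity(D, patch, patch), 3))   # 1.0: identical codes
print(feature_similarity(D, patch, noisy) < 1.0)       # True: codes diverge
```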

  13. Multisensory Part-based Representations of Objects in Human Lateral Occipital Cortex

    PubMed Central

    Erdogan, Goker; Chen, Quanjing; Garcea, Frank E.; Mahon, Bradford Z.; Jacobs, Robert A.

    2016-01-01

    The format of high-level object representations in temporal-occipital cortex is a fundamental and as yet unresolved issue. Here we use fMRI to show that human lateral occipital cortex (LOC) encodes novel 3-D objects in a multisensory and part-based format. We show that visual and haptic exploration of objects leads to similar patterns of neural activity in human LOC and that the shared variance between visually and haptically induced patterns of BOLD contrast in LOC reflects the part structure of the objects. We also show that linear classifiers trained on neural data from LOC on a subset of the objects successfully predict a novel object based on its component part structure. These data demonstrate a multisensory code for object representations in LOC that specifies the part structure of objects. PMID:26918587

  14. The Brain's Representations May Be Compatible With Convolution-Based Memory Models.

    PubMed

    Kato, Kenichi; Caplan, Jeremy B

    2017-02-13

    Convolution is a mathematical operation used in vector models of memory that have been successful in explaining a broad range of behaviour, including memory for associations between pairs of items, an important primitive of memory upon which a broad range of everyday memory behaviour depends. However, convolution models have trouble with naturalistic item representations, which are highly auto-correlated (as one finds, e.g., with photographs), and this has cast doubt on their neural plausibility. Consequently, modellers working with convolution have used item representations composed of randomly drawn values, but introducing such noise-like representations raises the question of how those random-like values might relate to actual item properties. We propose that a compromise solution to this problem may already exist. It has also long been known that the brain tends to reduce auto-correlations in its inputs. For example, centre-surround cells in the retina approximate a Difference-of-Gaussians (DoG) transform. This enhances edges, but also turns natural images into images that are closer to being statistically like white noise. We show that DoG-transformed images, although not optimal compared to noise-like representations, survive the convolution model better than naturalistic images. This is a proof-of-principle that the pervasive tendency of the brain to reduce auto-correlations may result in representations of information that are already adequately compatible with convolution, supporting the neural plausibility of convolution-based association-memory.
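
    The convolution-based binding and retrieval at issue can be sketched directly. These are the standard circular convolution/correlation operations of holographic reduced representations; the item names are illustrative:

```python
import numpy as np

def bind(a, b):
    """Circular convolution via FFT: stores the association of a and b."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def probe(trace, cue):
    """Circular correlation: approximately retrieves the cue's partner."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(cue)) * np.fft.fft(trace)))

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

n = 1024
rng = np.random.default_rng(3)
# "noise-like" items: i.i.d. entries with variance 1/n, the standard choice
apple, table = rng.normal(0.0, 1.0 / np.sqrt(n), (2, n))
trace = bind(apple, table)
retrieved = probe(trace, apple)
# retrieval is noisy, but the bound partner is clearly the closest match
print(cosine(retrieved, table), cosine(retrieved, apple))
```

    Retrieval is only approximate (a clean-up comparison against stored items is still needed), which is exactly why the statistics of the item vectors matter so much to these models.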

  15. Internal representations for face detection: an application of noise-based image classification to BOLD responses.

    PubMed

    Nestor, Adrian; Vettel, Jean M; Tarr, Michael J

    2013-11-01

    What basic visual structures underlie human face detection and how can we extract such structures directly from the amplitude of neural responses elicited by face processing? Here, we address these issues by investigating an extension of noise-based image classification to BOLD responses recorded in high-level visual areas. First, we assess the applicability of this classification method to such data and, second, we explore its results in connection with the neural processing of faces. To this end, we construct luminance templates from white noise fields based on the response of face-selective areas in the human ventral cortex. Using behaviorally and neurally-derived classification images, our results reveal a family of simple but robust image structures subserving face representation and detection. Thus, we confirm the role played by classical face selective regions in face detection and we help clarify the representational basis of this perceptual function. From a theory standpoint, our findings support the idea of simple but highly diagnostic neurally-coded features for face detection. At the same time, from a methodological perspective, our work demonstrates the ability of noise-based image classification in conjunction with fMRI to help uncover the structure of high-level perceptual representations.
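
    The classification-image computation that the study extends to BOLD responses can be sketched on synthetic behavioural data. The simulated observer and its hidden template below are illustrative assumptions, not the study's data:

```python
import numpy as np

def classification_image(noise_fields, responses):
    """Noise-based classification image: mean noise on 'yes' trials minus
    mean noise on 'no' trials estimates the template driving the responses."""
    noise_fields = np.asarray(noise_fields, dtype=float)
    responses = np.asarray(responses, dtype=bool)
    return noise_fields[responses].mean(0) - noise_fields[~responses].mean(0)

# simulated observer: says "face" when the noise correlates with a hidden template
rng = np.random.default_rng(4)
template = rng.normal(size=64)
noise = rng.normal(size=(4000, 64))
says_yes = noise @ template > 0
ci = classification_image(noise, says_yes)
corr = float(np.corrcoef(ci, template)[0, 1])
print(corr)  # strongly positive: the hidden template is recovered
```

    The extension in the paper replaces the binary behavioural response with the amplitude of the BOLD signal, but the averaging logic is the same.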

  16. Quantum representation and watermark strategy for color images based on the controlled rotation of qubits

    NASA Astrophysics Data System (ADS)

    Li, Panchi; Xiao, Hong; Li, Binxu

    2016-11-01

    In this paper, a novel quantum representation and watermarking scheme based on the controlled rotation of qubits are proposed. Firstly, a flexible representation for quantum color image (FRQCI) is proposed to facilitate the image processing tasks. Some basic image processing operations based on FRQCI representation are introduced. Then, a novel watermarking scheme for quantum images is presented. In our scheme, the carrier image is stored in the phase θ of a qubit; at the same time, the watermark image is embedded into the phase φ of a qubit, which will not affect the carrier image's visual effect. Before being embedded into the carrier image, the watermark image is scrambled to be seemingly meaningless using quantum circuits, which further ensures the security of the watermark image. All the operations mentioned above are implemented by the controlled rotation of qubits. The experimental results on the classical computer show that the proposed watermarking scheme has better visual quality under a higher embedding capacity and outperforms the existing schemes in the literature.

  17. Spacetime texture representation and recognition based on a spatiotemporal orientation analysis.

    PubMed

    Derpanis, Konstantinos G; Wildes, Richard P

    2012-06-01

    This paper is concerned with the representation and recognition of the observed dynamics (i.e., excluding purely spatial appearance cues) of spacetime texture based on a spatiotemporal orientation analysis. The term "spacetime texture" is taken to refer to patterns in visual spacetime, (x,y,t), that primarily are characterized by the aggregate dynamic properties of elements or local measurements accumulated over a region of spatiotemporal support, rather than in terms of the dynamics of individual constituents. Examples include image sequences of natural processes that exhibit stochastic dynamics (e.g., fire, water, and windblown vegetation) as well as images of simpler dynamics when analyzed in terms of aggregate region properties (e.g., uniform motion of elements in imagery, such as pedestrians and vehicular traffic). Spacetime texture representation and recognition is important as it provides an early means of capturing the structure of an ensuing image stream in a meaningful fashion. Toward such ends, a novel approach to spacetime texture representation and an associated recognition method are described based on distributions (histograms) of spacetime orientation structure. Empirical evaluation on both standard and original image data sets shows the promise of the approach, including significant improvement over alternative state-of-the-art approaches in recognizing the same pattern from different viewpoints.

  18. Segmentation of Hyperacute Cerebral Infarcts Based on Sparse Representation of Diffusion Weighted Imaging

    PubMed Central

    Zhang, Xiaodong; Jing, Shasha; Gao, Peiyi; Xue, Jing; Su, Lu; Li, Weiping; Ren, Lijie

    2016-01-01

    Segmentation of infarcts at the hyperacute stage is challenging, as they exhibit substantial variability and may even be hard for experts to delineate manually. In this paper, a sparse representation based classification method is explored. For each patient, four volumetric data items, including three volumes of diffusion weighted imaging and a computed asymmetry map, are employed to extract patch features, which are then fed to dictionary learning and classification based on sparse representation. Elastic net is adopted to replace the traditional L0-norm/L1-norm constraints on sparse representation to stabilize the sparse code. To decrease computation cost and reduce false positives, regions-of-interest are determined to confine candidate infarct voxels. The proposed method has been validated on 98 consecutive patients recruited within 6 hours from onset. It is shown that the proposed method handles infarcts with intensity variability and ill-defined edges well, yielding a significantly higher Dice coefficient (0.755 ± 0.118) than the other two methods and their enhanced versions obtained by confining their segmentations within the regions-of-interest (average Dice coefficient less than 0.610). The proposed method could provide a potential tool to quantify infarcts from diffusion weighted imaging at the hyperacute stage with accuracy and speed, assisting decision making, especially for thrombolytic therapy. PMID:27746825

  19. Sparse representation based multi-threshold segmentation for hyperspectral target detection

    NASA Astrophysics Data System (ADS)

    Feng, Wei-yi; Chen, Qian; Miao, Zhuang; He, Wei-ji; Gu, Guo-hua; Zhuang, Jia-yan

    2013-08-01

    A sparse representation based multi-threshold segmentation (SRMTS) algorithm for target detection in hyperspectral images is proposed. Benefiting from the sparse representation, the high-dimensional spectral data can be characterized by a series of sparse feature vectors with only a few nonzero coefficients. Through setting an appropriate threshold, the noise-removed sparse spectral vectors are divided into two subspaces in the sparse domain, consistent with the sample spectrum, to separate the target from the background. Then a correlation and a vector 1-norm are calculated in the respective subspaces. The sparse characteristic of the target is used to extract the target with a multi-threshold method. Unlike the conventional hyperspectral dimensionality reduction methods used in target detection algorithms, such as Principal Components Analysis (PCA) and Maximum Noise Fraction (MNF), this algorithm maintains the spectral characteristics while removing the noise due to the sparse representation. In the experiments, an orthogonal wavelet basis is used to sparsify the spectral information, and a best contraction threshold is chosen to remove the hyperspectral image noise according to the noise estimation of the test images. Compared with common algorithms, such as Adaptive Cosine Estimator (ACE), Constrained Energy Minimization (CEM) and the noise-removed MNF-CEM algorithm, the proposed algorithm demonstrates higher detection rates and robustness via the ROC curves.

  20. Single-Trial Sparse Representation-Based Approach for VEP Extraction

    PubMed Central

    Yu, Nannan; Hu, Funian; Zou, Dexuan; Ding, Qisheng

    2016-01-01

    Sparse representation is a powerful tool in signal denoising, and visual evoked potentials (VEPs) have been proven to have strong sparsity over an appropriate dictionary. Inspired by this idea, we present in this paper a novel sparse representation-based approach to solving the VEP extraction problem. The extraction process is performed in three stages. First, instead of using the mixed signals containing the electroencephalogram (EEG) and VEPs, we utilise an EEG from a previous trial, which did not contain VEPs, to identify the parameters of the EEG autoregressive (AR) model. Second, instead of the moving average (MA) model, sparse representation is used to model the VEPs in the autoregressive-moving average (ARMA) model. Finally, we calculate the sparse coefficients and derive VEPs by using the AR model. Next, we tested the performance of the proposed algorithm with synthetic and real data, after which we compared the results with those of an AR model with exogenous input modelling and a mixed overcomplete dictionary-based sparse component decomposition method. Utilising the synthetic data, the algorithms are then employed to estimate the latencies of P100 of the VEPs corrupted by added simulated EEG at different signal-to-noise ratio (SNR) values. The validations demonstrate that our method can preserve the details of the VEPs well for latency estimation, even in low SNR environments. PMID:27807541

  1. A geometric modeler based on a dual-geometry representation polyhedra and rational b-splines

    NASA Technical Reports Server (NTRS)

    Klosterman, A. L.

    1984-01-01

    For speed and database reasons, solid geometric modeling of large complex practical systems is usually approximated by a polyhedra representation. Precise parametric surface and implicit algebraic modelers are available, but it is not yet practical to model the same level of system complexity with these precise modelers. In response to this contrast, the GEOMOD geometric modeling system was built so that a polyhedra abstraction of the geometry would be available for interactive modeling without losing the precise definition of the geometry. Part of the reason that polyhedra modelers are effective is that all bounded surfaces can be represented in a single canonical format (i.e., sets of planar polygons). This permits a very simple and compact data structure. Nonuniform rational B-splines are currently the best representation to describe a very large class of geometry precisely with one canonical format. The specific capabilities of the modeler are described.
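
    The rational B-spline half of the dual representation can be illustrated by evaluating a NURBS curve point directly from the Cox-de Boor recursion (a minimal sketch, not GEOMOD code). With the weights below, a quadratic rational segment reproduces a quarter circle exactly, something no polynomial B-spline can do:

```python
import math

def bspline_basis(i, p, t, knots):
    """Cox-de Boor recursion for B-spline basis functions (0/0 treated as 0)."""
    if p == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = (t - knots[i]) / (knots[i + p] - knots[i]) \
            * bspline_basis(i, p - 1, t, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - t) / (knots[i + p + 1] - knots[i + 1]) \
            * bspline_basis(i + 1, p - 1, t, knots)
    return left + right

def nurbs_point(t, p, knots, ctrl, weights):
    """Rational B-spline curve point: weighted basis combination of control points."""
    num_x = num_y = den = 0.0
    for i, ((x, y), w) in enumerate(zip(ctrl, weights)):
        b = bspline_basis(i, p, t, knots) * w
        num_x, num_y, den = num_x + b * x, num_y + b * y, den + b
    return num_x / den, num_y / den

# quarter circle as a degree-2 rational segment: exact with these weights
knots = [0, 0, 0, 1, 1, 1]
ctrl = [(1, 0), (1, 1), (0, 1)]
weights = [1, math.sqrt(2) / 2, 1]
x, y = nurbs_point(0.5, 2, knots, ctrl, weights)
print(round(math.hypot(x, y), 6))  # 1.0: the point lies on the unit circle
```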

  2. A joint sparse representation-based method for double-trial evoked potentials estimation.

    PubMed

    Yu, Nannan; Liu, Haikuan; Wang, Xiaoyan; Lu, Hanbing

    2013-12-01

    In this paper, we present a novel approach to solving an evoked potentials estimating problem. Generally, the evoked potentials in two consecutive trials obtained by repeated identical stimuli of the nerves are extremely similar. In order to trace evoked potentials, we propose a joint sparse representation-based double-trial evoked potentials estimation method, taking full advantage of this similarity. The estimation process is performed in three stages: first, according to the similarity of evoked potentials and the randomness of a spontaneous electroencephalogram, the two consecutive observations of evoked potentials are considered as superpositions of the common component and the unique components; second, making use of their characteristics, the two sparse dictionaries are constructed; and finally, we apply the joint sparse representation method in order to extract the common component of double-trial observations, instead of the evoked potential in each trial. A series of experiments carried out on simulated and human test responses confirmed the superior performance of our method.

  3. The Neural Representation of Unexpected Uncertainty During Value-Based Decision Making

    PubMed Central

    Payzan-LeNestour, Elise; Dunne, Simon; Bossaerts, Peter; O'Doherty, John P.

    2016-01-01

    Uncertainty is an inherent property of the environment and a central feature of models of decision-making and learning. Theoretical propositions suggest that one form, unexpected uncertainty, may be used to rapidly adapt to changes in the environment, while being influenced by two other forms: risk and estimation uncertainty. While previous studies have reported neural representations of estimation uncertainty and risk, relatively little is known about unexpected uncertainty. Here, participants performed a decision-making task while undergoing functional magnetic resonance imaging (fMRI), which, in combination with a Bayesian model-based analysis, enabled each form of uncertainty to be separately measured. We found representations of unexpected uncertainty in multiple cortical areas, as well as the noradrenergic brainstem nucleus locus coeruleus. Other unique cortical regions were found to encode risk, estimation uncertainty and learning rate. Collectively, these findings support theoretical models in which several formally separable uncertainty computations determine the speed of learning. PMID:23849203

  4. Asymptotic Representation of the Temperature Field in Injection Stages During the Cyclic Acidizing of the Formation

    NASA Astrophysics Data System (ADS)

    Filippov, A. I.; Akhmetova, O. V.; Koval'skii, A. A.; Kabirov, I. F.

    2016-11-01

    The authors have obtained an asymptotic solution to the problem of the temperature field arising during the injection of a chemically active solvent into a layered inhomogeneous orthotropic formation, with account taken of temperature disturbances due to the preceding technological processes. This makes it possible to use the obtained solution for calculations under the conditions of multiple cyclic action. The sources of temperature disturbances are the liberation of heat due to the chemical reaction and the change in the temperature of the acid solution injected into the formation. Therefore, the obtained solution also covers the range of thermal-acidizing processes. The implemented method of construction of the solution is an extension of an asymptotic method accurate in the mean, which has been proposed by the authors as applied to problems with nonzero initial conditions. The results of calculations of space-time temperature distributions under the conditions of hydrochloric acid treatment of carbonaceous formations have been given.

  5. A stepwise stoichiometric representation to confirm the dependence of pesticide/humic acid interactions on salt concentration and to test the performance of a silica bonded humic acid column.

    PubMed

    André, C; Thomassin, M; Berthelot, A; Guillaume, Y C

    2006-02-01

    In a previous paper (André et al., in press), a novel chromatographic column was developed in our laboratory for studying the binding of pesticides with humic acid (HA), the main organic component in soil. It was demonstrated that this column tolerated only a low fraction of organic modifier in the aqueous mobile phase (<0.25 (v/v)). To overcome this limitation for practical use, a column in which the stationary phase was based on silica gel with chemically bonded humic acid was created. It was shown that this novel HA column tolerated a higher methanol fraction (<0.55 (v/v)). Furthermore, the dependence of pesticide/humic acid interactions on salt (sodium chloride) concentration has been expressed in terms of a stepwise stoichiometric representation, which leads to a specific equation for the partition of the added salt between the pesticide molecule, the HA, and the pesticide/HA complex. Based on this novel equation, the dependence of the pesticide/humic acid association on the salt concentration can be formulated via a relation similar to that of Tanford. In addition, for the first time, the calculation of the affinity energy distribution for different values of the salt concentration in the mobile phase confirmed the existence of several types of binding sites on the HA macromolecule.

  6. Secure Base Representations in Middle Childhood Across Two Western Cultures: Associations with Parental Attachment Representations and Maternal Reports of Behavior Problems

    PubMed Central

    Waters, Theodore E. A.; Bosmans, Guy; Vandevivere, Eva; Dujardin, Adinda; Waters, Harriet S.

    2015-01-01

    Recent work examining the content and organization of attachment representations suggests that one way in which we represent the attachment relationship is in the form of a cognitive script. That said, this work has largely focused on early childhood or adolescence/adulthood, leaving a large gap in our understanding of script-like attachment representations in the middle childhood period. We present two studies and provide three critical pieces of evidence regarding the presence of a script-like representation of the attachment relationship in middle childhood. We present evidence that a middle childhood attachment script assessment tapped a stable underlying script using samples drawn from two western cultures, the United States (Study 1) and Belgium (Study 2). We also found evidence suggestive of the intergenerational transmission of secure base script knowledge (Study 1) and relations between secure base script knowledge and symptoms of psychopathology in middle childhood (Study 2). The results from this investigation represent an important downward extension of the secure base script construct. PMID:26147774

  7. Secure base representations in middle childhood across two Western cultures: Associations with parental attachment representations and maternal reports of behavior problems.

    PubMed

    Waters, Theodore E A; Bosmans, Guy; Vandevivere, Eva; Dujardin, Adinda; Waters, Harriet S

    2015-08-01

    Recent work examining the content and organization of attachment representations suggests that 1 way in which we represent the attachment relationship is in the form of a cognitive script. This work has largely focused on early childhood or adolescence/adulthood, leaving a large gap in our understanding of script-like attachment representations in the middle childhood period. We present 2 studies and provide 3 critical pieces of evidence regarding the presence of a script-like representation of the attachment relationship in middle childhood. We present evidence that a middle childhood attachment script assessment tapped a stable underlying script using samples drawn from 2 western cultures, the United States (Study 1) and Belgium (Study 2). We also found evidence suggestive of the intergenerational transmission of secure base script knowledge (Study 1) and relations between secure base script knowledge and symptoms of psychopathology in middle childhood (Study 2). The results from this investigation represent an important downward extension of the secure base script construct.

  8. Compact 2-D graphical representation of DNA

    NASA Astrophysics Data System (ADS)

    Randić, Milan; Vračko, Marjan; Zupan, Jure; Novič, Marjana

    2003-05-01

    We present a novel 2-D graphical representation for DNA sequences which has an important advantage over the existing graphical representations of DNA in being very compact. It is based on: (1) use of binary labels for the four nucleic acid bases, and (2) use of the 'worm' curve as template on which binary codes are placed. The approach is illustrated on DNA sequences of the first exon of human β-globin and gorilla β-globin.

  9. [Recognition of water-injected meat based on visible/near-infrared spectrum and sparse representation].

    PubMed

    Hao, Dong-mei; Zhou, Ya-nan; Wang, Yu; Zhang, Song; Yang, Yi-min; Lin, Ling; Li, Gang; Wang, Xiu-li

    2015-01-01

    The present paper proposed a new nondestructive method based on visible/near infrared spectrum (Vis/NIRS) and sparse representation to rapidly and accurately discriminate between raw meat and water-injected meat. A water-injected meat model was built by injecting water into non-destructed meat samples comprising pigskin, fat layer and muscle layer. Vis/NIRS data were collected from raw meat and six scales of water-injected meat with spectrometers. To reduce the redundant information in the spectrum and improve the differences between samples, some preprocessing steps were performed for the spectral data, including light modulation and normalization. Effective spectral bands were extracted from the preprocessed spectral data. The meat samples were classified as raw meat and water-injected meat, and further, water-injected meat with different water injection rates. All the training samples were used to compose an atom dictionary, and test samples were represented by the sparsest linear combinations of these atoms via l1-minimization. Projection errors of test samples with respect to each category were calculated. A test sample was classified to the category with the minimum projection error, and leave-one-out cross-validation was conducted. The recognition performance from sparse representation was compared with that from support vector machine (SVM). Experimental results showed that the overall recognition accuracy of sparse representation for raw meat and water-injected meat was more than 90%, which was higher than that of SVM. For water-injected meat samples with different water injection rates, the recognition accuracy presented a positive correlation with the water injection rate difference. The sparse representation-based classifier eliminates the need for the training and feature extraction steps required by conventional pattern recognition models, and is suitable for processing data of high dimensionality and small sample size. Furthermore, it has a low
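
The l1-minimization classification scheme described in this record (represent a test sample over a dictionary of training atoms, then assign it to the class with the minimum projection error) can be sketched as follows; the data are synthetic stand-ins, not spectra:

```python
import numpy as np
from scipy.optimize import linprog

def l1_minimize(A, y):
    """Solve min ||x||_1 subject to A x = y by splitting x = u - v
    (u, v >= 0) and handing the result to a linear-programming solver."""
    n = A.shape[1]
    res = linprog(c=np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=(0, None), method="highs")
    return res.x[:n] - res.x[n:]

def src_classify(dictionary, labels, y):
    """Assign y to the class whose atoms give the smallest projection
    (reconstruction) error under the sparse representation of y."""
    x = l1_minimize(dictionary, y)
    errors = {}
    for cls in np.unique(labels):
        mask = labels == cls
        errors[cls] = np.linalg.norm(y - dictionary[:, mask] @ x[mask])
    return min(errors, key=errors.get), errors

# Synthetic stand-in for spectral data: two classes, five atoms each.
rng = np.random.default_rng(0)
dim = 8
proto = [rng.normal(size=dim), rng.normal(size=dim)]
atoms = np.column_stack([proto[c] + 0.01 * rng.normal(size=dim)
                         for c in (0, 1) for _ in range(5)])
labels = np.repeat([0, 1], 5)
test_sample = proto[0] + 0.01 * rng.normal(size=dim)
pred, errors = src_classify(atoms, labels, test_sample)
```

Because the test sample is built from the class-0 prototype, its sparse representation concentrates on class-0 atoms, so the class-0 projection error is far smaller than the class-1 error.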

  10. An Efficient Representation-Based Method for Boundary Point and Outlier Detection.

    PubMed

    Li, Xiaojie; Lv, Jiancheng; Yi, Zhang

    2016-10-19

    Detecting boundary points (including outliers) is often more interesting than detecting normal observations, since they represent valid, interesting, and potentially valuable patterns. Since data representation can uncover the intrinsic data structure, we present an efficient representation-based method for detecting such points, which are generally located around the margin of densely distributed data, such as a cluster. For each point, the negative components in its representation generally correspond to the boundary points among its affine combination of points. In the presented method, the reverse unreachability of a point is proposed to evaluate to what degree this observation is a boundary point. The reverse unreachability can be calculated by counting the number of zero and negative components in the representation. The reverse unreachability explicitly takes into account the global data structure and reveals the disconnectivity between a data point and other points. This paper reveals that the reverse unreachability of points with lower density has a higher score. Note that the score of reverse unreachability of an outlier is greater than that of a boundary point. The top-m ranked points can thus be identified as outliers. The greater the value of the reverse unreachability, the more likely the point is a boundary point. Compared with related methods, our method better reflects the characteristics of the data, and simultaneously detects outliers and boundary points regardless of their distribution and the dimensionality of the space. Experimental results obtained for a number of synthetic and real-world data sets demonstrate the effectiveness and efficiency of our method.
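
One plausible reading of the reverse-unreachability score (the abstract does not specify the authors' exact representation model, so the affine least-norm representation below is an assumption for illustration) is to write each point as an affine combination of the remaining points and count the zero and negative coefficients:

```python
import numpy as np

def reverse_unreachability_scores(X, tol=1e-9):
    """Score each point by the number of zero or negative coefficients in
    an affine representation over the remaining points.

    A plausible reading of the abstract, not the authors' exact algorithm:
    x_i is written as others.T @ c with sum(c) == 1, using the minimum-norm
    solution from the pseudo-inverse, and the score counts c_j <= tol.
    Interior points admit mostly-positive (near-convex) representations;
    boundary points and outliers require many negative coefficients.
    """
    n = X.shape[0]
    scores = np.empty(n, dtype=int)
    for i in range(n):
        others = np.delete(X, i, axis=0)
        # Append the affine constraint sum(c) = 1 to the linear system.
        A = np.vstack([others.T, np.ones(others.shape[0])])
        b = np.append(X[i], 1.0)
        c = np.linalg.pinv(A) @ b
        scores[i] = int(np.sum(c <= tol))
    return scores

# A dense Gaussian cluster plus one far-away outlier (the last row).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(30, 2)), [[10.0, 10.0]]])
scores = reverse_unreachability_scores(X)
```

The outlier lies far outside the convex hull of the cluster, so representing it requires strong extrapolation (many negative coefficients) and its score exceeds the typical score of interior points.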

  11. A neural network model of semantic memory linking feature-based object representation and words.

    PubMed

    Cuppini, C; Magosso, E; Ursino, M

    2009-06-01

    Recent theories in cognitive neuroscience suggest that semantic memory is a distributed process, which involves many cortical areas and is based on a multimodal representation of objects. The aim of this work is to extend a previous model of object representation to realize a semantic memory, in which sensory-motor representations of objects are linked with words. The model assumes that each object is described as a collection of features, coded in different cortical areas via a topological organization. Features in different objects are segmented via gamma-band synchronization of neural oscillators. The feature areas are further connected with a lexical area, devoted to the representation of words. Synapses among the feature areas, and among the lexical area and the feature areas are trained via a time-dependent Hebbian rule, during a period in which individual objects are presented together with the corresponding words. Simulation results demonstrate that, during the retrieval phase, the network can deal with the simultaneous presence of objects (from sensory-motor inputs) and words (from acoustic inputs), can correctly associate objects with words and segment objects even in the presence of incomplete information. Moreover, the network can realize some semantic links among words representing objects with shared features. These results support the idea that semantic memory can be described as an integrated process, whose content is retrieved by the co-activation of different multimodal regions. In perspective, extended versions of this model may be used to test conceptual theories, and to provide a quantitative assessment of existing data (for instance concerning patients with neural deficits).

  12. Exploring the potential of the theory of social representations in community-based health research--and vice versa?

    PubMed

    Howarth, Caroline; Foster, Juliet; Dorrer, Nike

    2004-03-01

    This article seeks to demonstrate the importance of developing a dialogue between social representations theory and community approaches to researching issues of health. We show how we have used the theory within our own research to ground our findings at the level of community. The article is divided into three sections: the recognition of competing systems of knowledge; the role of representations in maintaining stigmatizing practices; and the impact of representations on identities. Each section is illustrated with material drawn from Foster's research on mental illness and Dorrer's research on women's representations of healthy eating. We conclude by arguing that, while social representations theory is a valuable tool for community-based health research, the theory would benefit from developing a more participatory methodology.

  13. Rubber airplane: Constraint-based component-modeling for knowledge representation in computer-aided conceptual design

    NASA Technical Reports Server (NTRS)

    Kolb, Mark A.

    1990-01-01

    Viewgraphs on Rubber Airplane: Constraint-based Component-Modeling for Knowledge Representation in Computer Aided Conceptual Design are presented. Topics covered include: computer aided design; object oriented programming; airfoil design; surveillance aircraft; commercial aircraft; aircraft design; and launch vehicles.

  14. A framework for knowledge acquisition, representation and problem-solving in knowledge-based planning

    NASA Astrophysics Data System (ADS)

    Martinez-Bermudez, Iliana

    This research addresses the problem of developing planning knowledge-based applications. In particular, it is concerned with the problems of knowledge acquisition and representation---the issues that remain an impediment to the development of large-scale, knowledge-based planning applications. This work aims to develop a model of planning problem solving that facilitates expert knowledge elicitation and also supports effective problem solving. Achieving this goal requires determining the types of knowledge used by planning experts, the structure of this knowledge, and the problem-solving process that results in the plan. While answering these questions it became clear that the knowledge structure, as well as the process of problem solving, largely depends on the knowledge available to the expert. This dissertation proposes a classification of planning problems based on their use of expert knowledge. Such a classification can help in the selection of the appropriate planning method when dealing with a specific planning problem. The research concentrates on one of the identified classes of planning problems that can be characterized by well-defined and well-structured problem-solving knowledge. To achieve a more complete knowledge representation architecture for such problems, this work employs the task-specific approach to problem solving. The result of this endeavor is a task-specific methodology that allows the representation and use of planning knowledge in a structured, consistent manner specific to the domain of the application. The shell for building a knowledge-based planning application was created as a proof of concept for the methodology described in this dissertation. This shell enabled the development of a system for manufacturing planning---COMPLAN. COMPLAN encompasses knowledge related to four generic techniques used in composite material manufacturing and, given the description of the composite part, creates a family of plans capable of producing it.

  15. Phosphonic acid based exchange resins

    DOEpatents

    Horwitz, E.P.; Alexandratos, S.D.; Gatrone, R.C.; Chiarizia, R.

    1995-09-12

    An ion exchange resin is described for extracting metal ions from a liquid waste stream. An ion exchange resin is prepared by copolymerizing a vinylidene diphosphonic acid with styrene, acrylonitrile and divinylbenzene. 10 figs.

  16. Phosphonic acid based exchange resins

    DOEpatents

    Horwitz, E. Philip; Alexandratos, Spiro D.; Gatrone, Ralph C.; Chiarizia, Ronato

    1995-01-01

    An ion exchange resin for extracting metal ions from a liquid waste stream. An ion exchange resin is prepared by copolymerizing a vinylidene diphosphonic acid with styrene, acrylonitrile and divinylbenzene.

  17. Towards Model Inadequacy Representations for Flamelet-Based RANS Combustion Simulations

    NASA Astrophysics Data System (ADS)

    Oliver, Todd; Lee, M. K.; Sondak, David; Simmons, Chris; Moser, Robert

    2016-11-01

    Flamelet-based RANS simulations are commonly used in combustion engineering. In such simulations, chemical reactions are represented by a "flamelet-library" of laminar diffusion flame solutions generated with some chemical mechanism, and turbulence is represented using typical eddy-viscosity-based RANS closures. Modeling errors are introduced through both of these models as well as their interaction. In this work, we formulate and apply physics-based stochastic model inadequacy representations to capture the effects of possible modeling errors, allowing their impact on quantities of interest to be estimated. Specifically, the uncertainty introduced by inadequacy of the chemical mechanism is represented using a recently developed stochastic operator approach, which is extended to the diffusion flame here, leading to a stochastic diffusion flame library. A Karhunen-Loeve decomposition applied to these random fields enables low-dimensional representation of this uncertainty. A stochastic extension of typical eddy-viscosity-based RANS models is developed to represent inadequacy in the turbulence closures. The full stochastic model is demonstrated on simulations of a planar jet flame.
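
The Karhunen-Loève decomposition used above for the low-dimensional representation of the stochastic library can be illustrated generically. This is a PCA-style sketch on synthetic 1-D random fields, not the authors' stochastic operator:

```python
import numpy as np

def karhunen_loeve(samples, n_modes):
    """Truncated Karhunen-Loeve (PCA) decomposition of field samples.

    samples: array (n_samples, n_points), realisations of a random field.
    Returns the mean field, the leading spatial eigenmodes, and the
    per-sample coefficients, so each field ~ mean + coeffs @ modes.
    """
    mean = samples.mean(axis=0)
    centered = samples - mean
    # Rows of Vt are the spatial eigenmodes of the ensemble covariance.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    modes = Vt[:n_modes]
    coeffs = centered @ modes.T
    return mean, modes, coeffs

# Smooth 1-D random fields: random superpositions of three sine modes.
rng = np.random.default_rng(2)
xgrid = np.linspace(0.0, 1.0, 64)
basis = np.array([np.sin((k + 1) * np.pi * xgrid) for k in range(3)])
samples = rng.normal(size=(200, 3)) @ basis

mean, modes, coeffs = karhunen_loeve(samples, n_modes=3)
reconstruction = mean + coeffs @ modes
```

Because the synthetic fields live in a three-dimensional subspace, three modes reconstruct them essentially exactly; for real stochastic flamelet libraries one would truncate where the singular-value spectrum decays.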

  18. The Role of Familiarity for Representations in Norm-Based Face Space

    PubMed Central

    Faerber, Stella J.; Kaufmann, Jürgen M.; Leder, Helmut; Martin, Eva Maria; Schweinberger, Stefan R.

    2016-01-01

    According to the norm-based version of the multidimensional face space model (nMDFS, Valentine, 1991), any given face and its corresponding anti-face (which deviates from the norm in exactly opposite direction as the original face) should be equidistant to a hypothetical prototype face (norm), such that by definition face and anti-face should bear the same level of perceived typicality. However, it has been argued that familiarity affects perceived typicality and that representations of familiar faces are qualitatively different (e.g., more robust and image-independent) from those for unfamiliar faces. Here we investigated the role of face familiarity for rated typicality, using two frequently used operationalisations of typicality (deviation-based: DEV), and distinctiveness (face in the crowd: FITC) for faces of celebrities and their corresponding anti-faces. We further assessed attractiveness, likeability and trustworthiness ratings of the stimuli, which are potentially related to typicality. For unfamiliar faces and their corresponding anti-faces, in line with the predictions of the nMDFS, our results demonstrate comparable levels of perceived typicality (DEV). In contrast, familiar faces were perceived much less typical than their anti-faces. Furthermore, familiar faces were rated higher than their anti-faces in distinctiveness, attractiveness, likability and trustworthiness. These findings suggest that familiarity strongly affects the distribution of facial representations in norm-based face space. Overall, our study suggests (1) that familiarity needs to be considered in studies of mental representations of faces, and (2) that familiarity, general distance-to-norm and more specific vector directions in face space make different and interactive contributions to different types of facial evaluations. PMID:27168323

  19. The Conjugate Acid-Base Chart.

    ERIC Educational Resources Information Center

    Treptow, Richard S.

    1986-01-01

    Discusses the difficulties that beginning chemistry students have in understanding acid-base chemistry. Describes the use of conjugate acid-base charts in helping students visualize the conjugate relationship. Addresses chart construction, metal ions, buffers and pH titrations, and the organic functional groups and nonaqueous solvents. (TW)

  20. Students' Alternate Conceptions on Acids and Bases

    ERIC Educational Resources Information Center

    Pan, Hanqing; Henriques, Laura

    2015-01-01

    Knowing what students bring to the classroom can and should influence how we teach them. This study is a review of the literature associated with secondary and postsecondary students' ideas about acids and bases. It was found that there are six types of alternate ideas about acids and bases that students hold. These are: macroscopic properties of…

  1. The Kidney and Acid-Base Regulation

    ERIC Educational Resources Information Center

    Koeppen, Bruce M.

    2009-01-01

    Since the topic of the role of the kidneys in the regulation of acid base balance was last reviewed from a teaching perspective (Koeppen BM. Renal regulation of acid-base balance. Adv Physiol Educ 20: 132-141, 1998), our understanding of the specific membrane transporters involved in H+, HCO3-, and NH4+ transport, and especially how these…

  2. Gyrator transform based double random phase encoding with sparse representation for information authentication

    NASA Astrophysics Data System (ADS)

    Chen, Jun-xin; Zhu, Zhi-liang; Fu, Chong; Yu, Hai; Zhang, Li-bo

    2015-07-01

    Optical information security systems have drawn long-term concerns. In this paper, an optical information authentication approach using gyrator transform based double random phase encoding with sparse representation is proposed. Different from traditional optical encryption schemes, only sparse version of the ciphertext is preserved, and hence the decrypted result is completely unrecognizable and shows no similarity to the plaintext. However, we demonstrate that the noise-like decipher result can be effectively authenticated by means of optical correlation approach. Simulations prove that the proposed method is feasible and effective, and can provide additional protection for optical security systems.
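
A minimal numerical sketch of the scheme, with ordinary 2-D Fourier transforms standing in for the gyrator transform (an assumption for illustration): encrypt, keep only a sparse subset of ciphertext coefficients, decrypt to a noise-like field, and authenticate by correlation:

```python
import numpy as np

rng = np.random.default_rng(3)
shape = (32, 32)

# Two random phase masks; with orthonormal FFTs every step is unitary.
phi1 = np.exp(2j * np.pi * rng.random(shape))
phi2 = np.exp(2j * np.pi * rng.random(shape))

def encrypt(img):
    """Double random phase encoding (Fourier in place of the gyrator)."""
    return np.fft.fft2(phi2 * np.fft.fft2(img * phi1, norm="ortho"),
                       norm="ortho")

def decrypt(ciphertext):
    """Exact inverse of encrypt()."""
    return np.conj(phi1) * np.fft.ifft2(
        np.conj(phi2) * np.fft.ifft2(ciphertext, norm="ortho"), norm="ortho")

def keep_sparse(ciphertext, fraction=0.05):
    """Retain only the strongest `fraction` of ciphertext coefficients."""
    magnitudes = np.abs(ciphertext).ravel()
    k = max(1, int(fraction * magnitudes.size))
    threshold = np.sort(magnitudes)[-k]
    return np.where(np.abs(ciphertext) >= threshold, ciphertext, 0.0)

def correlation(a, b):
    """Normalised correlation magnitude, used here for authentication."""
    return abs(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))

plaintext = rng.normal(size=shape)
decoded = decrypt(keep_sparse(encrypt(plaintext)))  # noise-like field

score_true = correlation(decoded, plaintext)        # authenticates
score_false = correlation(decoded, rng.normal(size=shape))
```

Because the cipher is unitary, the correlation of the decoded field with the true plaintext reduces to the energy of the retained coefficients, so the visually unrecognizable result still authenticates against the correct image but not against an impostor.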

  3. Dynamic analysis of the Space Station truss structure based on a continuum representation

    NASA Technical Reports Server (NTRS)

    Thomas, Segun; Stubbs, Norris

    1989-01-01

    A mathematical model is developed for the real-time simulation of a Space Station. First, a continuum equivalent representation of the Space Station truss structure is presented which accounts for extensional, transverse, and shear deformations and coupling between them. The procedure achieves a significant reduction in the degrees of freedom of the system. Dynamic equations are then formulated for the continuum equivalent of the Space Station truss structure based on the matrix version of Kane's dynamical equations. Finally, constraint equations are derived for the dynamic analysis of flexible bodies with closed loop configuration.

  4. Improving Feature Representation Based on a Neural Network for Author Profiling in Social Media Texts

    PubMed Central

    2016-01-01

    We introduce a lexical resource for preprocessing social media data. We show that a neural network-based feature representation is enhanced by using this resource. We conducted experiments on the PAN 2015 and PAN 2016 author profiling corpora and obtained better results when performing the data preprocessing using the developed lexical resource. The resource includes dictionaries of slang words, contractions, abbreviations, and emoticons commonly used in social media. Each of the dictionaries was built for the English, Spanish, Dutch, and Italian languages. The resource is freely available. PMID:27795703

  5. Improving Feature Representation Based on a Neural Network for Author Profiling in Social Media Texts.

    PubMed

    Gómez-Adorno, Helena; Markov, Ilia; Sidorov, Grigori; Posadas-Durán, Juan-Pablo; Sanchez-Perez, Miguel A; Chanona-Hernandez, Liliana

    2016-01-01

    We introduce a lexical resource for preprocessing social media data. We show that a neural network-based feature representation is enhanced by using this resource. We conducted experiments on the PAN 2015 and PAN 2016 author profiling corpora and obtained better results when performing the data preprocessing using the developed lexical resource. The resource includes dictionaries of slang words, contractions, abbreviations, and emoticons commonly used in social media. Each of the dictionaries was built for the English, Spanish, Dutch, and Italian languages. The resource is freely available.

  6. Distinct pathways for rule-based retrieval and spatial mapping of memory representations in hippocampal neurons

    PubMed Central

    Navawongse, Rapeechai; Eichenbaum, Howard

    2013-01-01

    Hippocampal neurons encode events within the context in which they occurred, a fundamental feature of episodic memory. Here we explored the sources of event and context information represented by hippocampal neurons during the retrieval of object associations in rats. Temporary inactivation of the medial prefrontal cortex differentially reduced the selectivity of rule-based object associations represented by hippocampal neuronal firing patterns but did not affect spatial firing patterns. By contrast, inactivation of the medial entorhinal cortex resulted in a pervasive reorganization of hippocampal mappings of spatial context and events. These results suggest distinct and cooperative prefrontal and medial temporal mechanisms in memory representation. PMID:23325238

  7. Can Verbalisers Learn as well as Visualisers in Simulation-Based CAL with Predominantly Visual Representations? Preliminary Evidence from a Pilot Study

    ERIC Educational Resources Information Center

    Liu, Tzu-Chien; Kinshuk; Lin, Yi-Chun; Wang, Ssu-Chin

    2012-01-01

    Simulation-based computer-assisted learning (CAL) is emerging as new technologies are finding a place in mainstream education. Dynamically linked multiple representations (DLMRs) is at the core of simulation-based CAL. DLMRs includes multiple visual representations, and it enables students to manipulate one representation and to immediately…

  8. Voxel Based Representation of Full-Waveform Airborne Laser Scanner Data for Forestry Applications

    NASA Astrophysics Data System (ADS)

    Stelling, N.; Richter, K.

    2016-06-01

    The advantages of using airborne full-waveform laser scanner data in forest applications, e.g. for the description of the vertical vegetation structure or accurate biomass estimation, have been emphasized in many publications. To exploit the full potential offered by airborne full-waveform laser scanning data, the development of voxel based methods for data analysis is essential. In contrast to existing approaches based on the extraction of discrete 3D points by a Gaussian decomposition, it is very promising to derive the voxel attributes from the digitised waveform directly. For this purpose, the waveform data have to be transferred into a 3D voxel representation. This requires a series of radiometric and geometric transformations of the raw full-waveform laser scanner data. Thus, the paper deals with the geometric aspects and describes a processing chain from the raw waveform data to an attenuation-corrected volumetric forest stand reconstruction. The integration of attenuation-corrected waveform data into the voxel space is realised with an efficient parametric voxel traversal method operating on an octree data structure. The voxel attributes are derived from the amplitudes of the attenuation-corrected waveforms. Additionally, a new 3D filtering approach is presented to eliminate non-object voxels. Applying these methods to real full-waveform laser scanning data, a voxel based representation of a spruce was generated combining three flight strips from different viewing directions.
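
A parametric voxel traversal of the kind referred to above can be sketched as follows. This is in the spirit of the classic Amanatides-Woo grid-marching scheme on a plain unit-spaced grid, not the paper's octree variant:

```python
import math

def voxel_traversal(origin, direction, grid_size, max_steps=256):
    """Parametric voxel traversal: walk a ray through a unit-spaced grid,
    always stepping across the nearest axis-aligned boundary next."""
    pos = [int(math.floor(o)) for o in origin]
    step, t_max, t_delta = [], [], []
    for o, d in zip(origin, direction):
        if d > 0:
            step.append(1)
            t_max.append((math.floor(o) + 1 - o) / d)  # next boundary ahead
            t_delta.append(1.0 / d)
        elif d < 0:
            step.append(-1)
            t_max.append((math.floor(o) - o) / d)
            t_delta.append(-1.0 / d)
        else:
            step.append(0)
            t_max.append(math.inf)   # never crosses along this axis
            t_delta.append(math.inf)
    visited = []
    for _ in range(max_steps):
        if any(p < 0 or p >= s for p, s in zip(pos, grid_size)):
            break  # ray has left the grid
        visited.append(tuple(pos))
        axis = min(range(3), key=lambda i: t_max[i])  # nearest crossing
        pos[axis] += step[axis]
        t_max[axis] += t_delta[axis]
    return visited

voxels = voxel_traversal((0.5, 0.5, 0.5), (1.0, 0.3, 0.2), (8, 8, 8))
```

Each iteration advances exactly one voxel along the axis whose boundary the ray reaches first, so waveform samples can be accumulated into every voxel the pulse actually traverses.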

  9. Enhanced reduced representation bisulfite sequencing for assessment of DNA methylation at base pair resolution.

    PubMed

    Garrett-Bakelman, Francine E; Sheridan, Caroline K; Kacmarczyk, Thadeous J; Ishii, Jennifer; Betel, Doron; Alonso, Alicia; Mason, Christopher E; Figueroa, Maria E; Melnick, Ari M

    2015-02-24

    DNA methylation pattern mapping is heavily studied in normal and diseased tissues. A variety of methods have been established to interrogate the cytosine methylation patterns in cells. Reduced representation of whole genome bisulfite sequencing was developed to detect quantitative base pair resolution cytosine methylation patterns at GC-rich genomic loci. This is accomplished by combining the use of a restriction enzyme followed by bisulfite conversion. Enhanced Reduced Representation Bisulfite Sequencing (ERRBS) increases the biologically relevant genomic loci covered and has been used to profile cytosine methylation in DNA from human, mouse and other organisms. ERRBS initiates with restriction enzyme digestion of DNA to generate low molecular weight fragments for use in library preparation. These fragments are subjected to standard library construction for next generation sequencing. Bisulfite conversion of unmethylated cytosines prior to the final amplification step allows for quantitative base resolution of cytosine methylation levels in covered genomic loci. The protocol can be completed within four days. Despite low complexity in the first three bases sequenced, ERRBS libraries yield high quality data when using a designated sequencing control lane. Mapping and bioinformatics analysis is then performed and yields data that can be easily integrated with a variety of genome-wide platforms. ERRBS can utilize small input material quantities making it feasible to process human clinical samples and applicable in a range of research applications. The video produced demonstrates critical steps of the ERRBS protocol.
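
The key chemical step, bisulfite conversion of unmethylated cytosines, and the resulting base-pair-resolution methylation calls can be sketched on a toy sequence (the sequence and methylated position are hypothetical, not ERRBS data):

```python
def bisulfite_convert(sequence, methylated_positions):
    """Convert unmethylated cytosines to thymine, as bisulfite treatment
    does chemically; methylated cytosines are protected and stay 'C'."""
    return "".join(
        "T" if base == "C" and i not in methylated_positions else base
        for i, base in enumerate(sequence)
    )

def methylation_calls(reference, converted):
    """Per-cytosine methylation call at base-pair resolution: a reference
    'C' still read as 'C' after conversion implies methylation."""
    return {i: converted[i] == "C"
            for i, base in enumerate(reference) if base == "C"}

reference = "ACGTCCGGCA"
# Suppose only the cytosine at index 4 is methylated (hypothetical data).
converted = bisulfite_convert(reference, methylated_positions={4})
calls = methylation_calls(reference, converted)
```

In the real protocol the converted reads are sequenced and aligned against an in-silico converted genome, and the C/T ratio at each cytosine gives a quantitative methylation level rather than a binary call.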

  10. Learning to detect objects in images via a sparse, part-based representation.

    PubMed

    Agarwal, Shivani; Awan, Aatif; Roth, Dan

    2004-11-01

    We study the problem of detecting objects in still, gray-scale images. Our primary focus is the development of a learning-based approach to the problem that makes use of a sparse, part-based representation. A vocabulary of distinctive object parts is automatically constructed from a set of sample images of the object class of interest; images are then represented using parts from this vocabulary, together with spatial relations observed among the parts. Based on this representation, a learning algorithm is used to automatically learn to detect instances of the object class in new images. The approach can be applied to any object with distinguishable parts in a relatively fixed spatial configuration; it is evaluated here on difficult sets of real-world images containing side views of cars, and is seen to successfully detect objects in varying conditions amidst background clutter and mild occlusion. In evaluating object detection approaches, several important methodological issues arise that have not been satisfactorily addressed in previous work. A secondary focus of this paper is to highlight these issues and to develop rigorous evaluation standards for the object detection problem. A critical evaluation of our approach under the proposed standards is presented.

  11. [Identification of transmission fluid based on NIR spectroscopy by combining sparse representation method with manifold learning].

    PubMed

    Jiang, Lu-Lu; Luo, Mei-Fu; Zhang, Yu; Yu, Xin-Jie; Kong, Wen-Wen; Liu, Fei

    2014-01-01

    An identification method based on sparse representation (SR) combined with autoencoder network (AN) manifold learning was proposed for discriminating the varieties of transmission fluid by using near infrared (NIR) spectroscopy technology. NIR transmittance spectra from 600 to 1 800 nm were collected from 300 transmission fluid samples of five varieties (each variety consists of 60 samples). For each variety, 30 samples were randomly selected as training set (totally 150 samples), and the rest 30 ones as testing set (totally 150 samples). Autoencoder network manifold learning was applied to obtain the characteristic information in the 600-1800 nm spectra and the number of characteristics was reduced to 10. Principal component analysis (PCA) was applied to extract several relevant variables to represent the useful information of spectral variables. All of the training samples made up a data dictionary of the sparse representation (SR). Then the transmission fluid variety identification problem was reduced to the problem as how to represent the testing samples from the data dictionary (training samples data). The identification result thus could be achieved by solving the L-1 norm-based optimization problem. We compared the effectiveness of the proposed method with that of linear discriminant analysis (LDA), least squares support vector machine (LS-SVM) and sparse representation (SR) using the relevant variables selected by principal component analysis (PCA) and AN. Experimental results demonstrated that the overall identification accuracy of the proposed method for the five transmission fluid varieties was 97.33% by AN-SR, which was significantly higher than that of LDA or LS-SVM. Therefore, the proposed method can provide a new effective method for identification of transmission fluid variety.

  12. Flutter signal extracting technique based on FOG and self-adaptive sparse representation algorithm

    NASA Astrophysics Data System (ADS)

    Lei, Jian; Meng, Xiangtao; Xiang, Zheng

    2016-10-01

Due to the various moving parts inside, when a spacecraft runs in orbit its structure can undergo minor angular vibration, which blurs the images formed by the space camera. Image compensation techniques are therefore required to eliminate or alleviate the effect of this movement on image formation, and precise measurement of the flutter angle is necessary. Owing to advantages such as high sensitivity, broad bandwidth, simple structure, and the absence of internal mechanical moving parts, a FOG (fiber optic gyro) is adopted in this study to measure minor angular vibration; the movement responsible for image degradation is then obtained by calculation. The movement-information extracting algorithm based on self-adaptive sparse representation uses an arctangent function approximating the L0 norm to construct an unconstrained sparse reconstruction model for noisy signals, and solves the model with a method based on the steepest descent and BFGS algorithms to estimate the sparse signal. Because random noise cannot be represented by a linear combination of dictionary elements, the useful signal and random noise are separated effectively. Since the main interference of minor angular vibration with the image formation of a space camera is random noise, the sparse representation algorithm can extract the useful information to a large extent and serves as a fitting preprocessing method for image restoration. The self-adaptive sparse representation algorithm presented in this paper is used to process the minor angular vibration signal measured by the FOG of a certain spacecraft. Component analysis of the processing results shows that the algorithm extracts the micro angular vibration signal of the FOG precisely and effectively, achieving a precision of 0.1".
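The arctangent surrogate for the L0 norm mentioned above can be illustrated directly; this is a minimal sketch of the penalty and its gradient, not the paper's implementation:

```python
import numpy as np

def arctan_l0(x, sigma=1e-3):
    """Arctangent surrogate of the L0 norm: (2/pi)*arctan(|x_i|/sigma) tends
    to 1 for |x_i| >> sigma and to 0 at x_i = 0, so the sum approaches the
    number of nonzero entries as sigma -> 0, while staying differentiable."""
    return np.sum((2.0 / np.pi) * np.arctan(np.abs(x) / sigma))

def arctan_l0_grad(x, sigma=1e-3):
    """Gradient of the surrogate, usable inside a steepest-descent/BFGS loop."""
    return (2.0 / np.pi) * np.sign(x) * sigma / (sigma**2 + x**2)

x = np.array([0.0, 0.5, 0.0, -2.0, 0.0])
print(arctan_l0(x))  # close to 2.0: two nonzero entries
```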

  13. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    SciTech Connect

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
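The belief/plausibility bookkeeping that a sampling-based evidence-theory propagation performs can be sketched on a toy model; the model `f`, the focal intervals, and the basic probability assignments below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    """Hypothetical computational model (stands in for an expensive code)."""
    return x ** 2

# body of evidence for the uncertain input: focal intervals with their
# basic probability assignments (numbers are illustrative only)
focal = [((0.0, 1.0), 0.6), ((0.5, 2.0), 0.4)]
target = (0.0, 1.0)  # output set of interest: f(x) <= 1

bel = pl = 0.0
for (lo, hi), m in focal:
    ys = f(rng.uniform(lo, hi, 1000))        # sample within the focal element
    inside = (ys >= target[0]) & (ys <= target[1])
    if inside.all():
        bel += m    # element certainly maps into the target: adds to belief
    if inside.any():
        pl += m     # element possibly maps into the target: adds to plausibility
print(bel, pl)  # → 0.6 1.0
```

Belief and plausibility bracket the probability that a probabilistic analysis would assign to the same output set.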

  14. Sequential injection redox or acid-base titration for determination of ascorbic acid or acetic acid.

    PubMed

    Lenghor, Narong; Jakmunee, Jaroon; Vilen, Michael; Sara, Rolf; Christian, Gary D; Grudpan, Kate

    2002-12-06

Two sequential injection titration systems with spectrophotometric detection have been developed. The first system, for determination of ascorbic acid, was based on the redox reaction between ascorbic acid and permanganate in an acidic medium, which leads to a decrease in the color intensity of permanganate, monitored at 525 nm. Peak area depended linearly on ascorbic acid concentration up to 1200 mg l(-1). The relative standard deviation for 11 replicate determinations of 400 mg l(-1) ascorbic acid was 2.9%. The second system, for acetic acid determination, was based on acid-base titration of acetic acid with sodium hydroxide using phenolphthalein as an indicator. The decrease in color intensity of the indicator was proportional to the acid content. A linear calibration graph in the range of 2-8% w v(-1) acetic acid, with a relative standard deviation of 4.8% (5.0% w v(-1) acetic acid, n=11), was obtained. Sample throughputs of 60 h(-1) were achieved for both systems. The systems were successfully applied to the assay of ascorbic acid in vitamin C tablets and of acetic acid in vinegars, respectively.
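Quantification in both systems rests on a linear calibration of detector response against concentration; a minimal sketch of fitting and inverting such a calibration (the data points below are hypothetical, not the paper's):

```python
import numpy as np

# hypothetical calibration standards: acetic acid % (w/v) vs. peak area
conc = np.array([2.0, 4.0, 6.0, 8.0])
area = np.array([0.41, 0.79, 1.22, 1.60])

slope, intercept = np.polyfit(conc, area, 1)   # least-squares straight line

# invert the calibration for an unknown sample's measured peak area
unknown_area = 1.0
unknown_conc = (unknown_area - intercept) / slope
print(slope, unknown_conc)
```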

  15. A new graph-based molecular descriptor using the canonical representation of the molecule.

    PubMed

    Hentabli, Hamza; Saeed, Faisal; Abdo, Ammar; Salim, Naomie

    2014-01-01

Molecular similarity is a pervasive concept in drug design. The basic idea underlying molecular similarity is the similar property principle, which states that structurally similar molecules will exhibit similar physicochemical and biological properties. In this paper, a new graph-based molecular descriptor (GBMD) is introduced. The GBMD is a new way of obtaining a rough description of 2D molecular structure in textual form, based on canonical representations of the molecule's outline shape, and it allows rigorous structure specification using small and natural grammars. Simulated virtual screening experiments with the MDDR database clearly show the superiority of the graph-based descriptor compared to many standard descriptors (ALOGP, MACCS, EPFP4, CDKFP, PCFP, and SMILE) using the Tanimoto coefficient (TAN) and the basic local alignment search tool (BLAST) when searches were carried out.

  16. RSS Fingerprint Based Indoor Localization Using Sparse Representation with Spatio-Temporal Constraint.

    PubMed

    Piao, Xinglin; Zhang, Yong; Li, Tingshu; Hu, Yongli; Liu, Hao; Zhang, Ke; Ge, Yun

    2016-11-03

    The Received Signal Strength (RSS) fingerprint-based indoor localization is an important research topic in wireless network communications. Most current RSS fingerprint-based indoor localization methods do not explore and utilize the spatial or temporal correlation existing in fingerprint data and measurement data, which is helpful for improving localization accuracy. In this paper, we propose an RSS fingerprint-based indoor localization method by integrating the spatio-temporal constraints into the sparse representation model. The proposed model utilizes the inherent spatial correlation of fingerprint data in the fingerprint matching and uses the temporal continuity of the RSS measurement data in the localization phase. Experiments on the simulated data and the localization tests in the real scenes show that the proposed method improves the localization accuracy and stability effectively compared with state-of-the-art indoor localization methods.

  18. Model-based object classification using unification grammars and abstract representations

    NASA Astrophysics Data System (ADS)

    Liburdy, Kathleen A.; Schalkoff, Robert J.

    1993-04-01

    The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.

  19. A simple representation of energy matrix elements in terms of symmetry-invariant bases.

    PubMed

    Cui, Peng; Wu, Jian; Zhang, Guiqing; Boyd, Russell J

    2010-02-01

When a system under consideration has some symmetry, its Hamiltonian space can usually be partitioned into a set of subspaces that are invariant under the symmetry operations. The bases that span these invariant subspaces are likewise invariant under the symmetry operations; they are the symmetry-invariant bases. A standard methodology is available to construct a series of generator functions (GFs) and corresponding symmetry-adapted basis (SAB) functions from these symmetry-invariant bases. Elements of the factorized Hamiltonian and overlap matrix can be expressed in terms of these SAB functions, and their simple representations can be deduced in terms of GFs. The application of this method to the Heisenberg spin Hamiltonian is demonstrated.

  20. Rigid Body Attitude Control Based on a Manifold Representation of Direction Cosine Matrices

    NASA Astrophysics Data System (ADS)

    Nakath, David; Clemens, Joachim; Rachuy, Carsten

    2017-01-01

Autonomous systems typically actively observe certain aspects of their surroundings, which makes them dependent on a suitable controller. However, building an attitude controller for three degrees of freedom is a challenging task, mainly due to singularities in the different parametrizations of the three dimensional rotation group SO(3). Thus, we propose an attitude controller based on a manifold representation of direction cosine matrices: In state space, the attitude is globally and uniquely represented as a direction cosine matrix R ∈ SO(3). However, differences in the state space, i.e., the attitude errors, are exposed to the controller in the vector space ℝ³. This is achieved by an operator, which integrates the matrix logarithm mapping from SO(3) to so(3) and the map from so(3) to ℝ³. Based on this representation, we derive a proportional and derivative feedback controller, whose output has an upper bound to prevent actuator saturation. Additionally, the feedback is preprocessed by a particle filter to account for measurement and state transition noise. We evaluate our approach in a simulator in three different spacecraft maneuver scenarios: (i) stabilizing, (ii) rest-to-rest, and (iii) nadir-pointing. The controller exhibits stable behavior from initial attitudes near and far from the setpoint. Furthermore, it is able to stabilize a spacecraft and can be used for nadir-pointing maneuvers.
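The error mapping described above (matrix logarithm from SO(3) to so(3), then to ℝ³, with an upper bound on the feedback) can be sketched with SciPy's rotation utilities; `max_norm` is an assumed saturation bound, not a value from the paper:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def attitude_error(R_set, R_cur, max_norm=0.5):
    """Map the attitude error R_set * R_cur^T through the matrix logarithm
    SO(3) -> so(3) and on to R^3 (a rotation vector), then clamp its norm
    so that proportional feedback cannot saturate the actuators."""
    err = Rotation.from_matrix(R_set) * Rotation.from_matrix(R_cur).inv()
    e = err.as_rotvec()                  # log map, expressed in R^3
    n = np.linalg.norm(e)
    return e if n <= max_norm else e * (max_norm / n)

R_cur = Rotation.from_euler("z", 10, degrees=True).as_matrix()
R_set = np.eye(3)
print(attitude_error(R_set, R_cur))  # ≈ [0, 0, -0.1745] rad
```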

  1. Improving Low-dose Cardiac CT Images based on 3D Sparse Representation

    NASA Astrophysics Data System (ADS)

    Shi, Luyao; Hu, Yining; Chen, Yang; Yin, Xindao; Shu, Huazhong; Luo, Limin; Coatrieux, Jean-Louis

    2016-03-01

Cardiac computed tomography (CCT) is a reliable and accurate tool for the diagnosis of coronary artery diseases and is also frequently used in surgery guidance. Low-dose scans should be considered in order to alleviate the harm to patients caused by X-ray radiation. However, low-dose CT (LDCT) images tend to be degraded by quantum noise and streak artifacts. In order to improve cardiac LDCT image quality, a 3D sparse representation-based processing (3D SR) is proposed by exploiting the sparsity and regularity of 3D anatomical features in CCT. The proposed method was evaluated in a clinical study of 14 patients. Its performance was compared to the 2D sparse representation-based processing (2D SR) and the state-of-the-art noise reduction algorithm BM4D. Visual, quantitative, and qualitative assessment results show that the proposed approach leads to effective noise/artifact suppression and detail preservation. Compared to the other two tested methods, the 3D SR method obtains results with image quality closest to the reference standard-dose CT (SDCT) images.

  4. Dictionary learning method for joint sparse representation-based image fusion

    NASA Astrophysics Data System (ADS)

    Zhang, Qiheng; Fu, Yuli; Li, Haifeng; Zou, Jian

    2013-05-01

Recently, sparse representation (SR) and joint sparse representation (JSR) have attracted a lot of interest in image fusion. SR models signals as sparse linear combinations of prototype signal atoms that make up a dictionary. JSR indicates that different signals from the various sensors of the same scene form an ensemble: these signals share a common sparse component, and each individual signal owns an innovation sparse component. JSR offers lower computational complexity compared with SR. First, for JSR-based image fusion, we give a new fusion rule. Then, motivated by the method of optimal directions (MOD), we propose a novel dictionary learning method for JSR (MODJSR), whose dictionary updating procedure is derived by employing the JSR structure once with singular value decomposition (SVD). MODJSR has lower complexity than the K-SVD algorithm, which is often used in previous JSR-based fusion algorithms. To capture image details more efficiently, we propose the generalized JSR, in which the signal ensemble depends on two dictionaries, and extend MODJSR to MODGJSR for this case. MODJSR/MODGJSR can simultaneously carry out dictionary learning, denoising, and fusion of noisy source images. Experiments are given to demonstrate the validity of MODJSR/MODGJSR for image fusion.
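The MOD-style dictionary update underlying MODJSR is, at its core, the closed-form least-squares solution for the dictionary given fixed sparse codes; a minimal sketch of that single step on synthetic data, not the paper's fusion pipeline:

```python
import numpy as np

def mod_update(X, A, eps=1e-10):
    """Method-of-optimal-directions update: given signals X (as columns) and
    their current sparse codes A, solve min_D ||X - D A||_F^2 in closed form,
    D = X A^T (A A^T)^{-1}, then rescale each atom to unit length."""
    D = X @ A.T @ np.linalg.pinv(A @ A.T)
    return D / (np.linalg.norm(D, axis=0, keepdims=True) + eps)

rng = np.random.default_rng(0)
D_true = rng.standard_normal((20, 5))       # ground-truth dictionary
A = rng.standard_normal((5, 100))           # codes (dense here, for simplicity)
X = D_true @ A                              # signals generated from the model
D = mod_update(X, A)

# the updated dictionary spans the signals: projection residual is ~0
residual = np.linalg.norm(X - D @ np.linalg.pinv(D) @ X)
print(residual < 1e-6)  # → True
```

In a full MOD loop this update alternates with a sparse coding step that refreshes A.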

  5. Decoding abstract and concrete concept representations based on single-trial fMRI data.

    PubMed

    Wang, Jing; Baucom, Laura B; Shinkareva, Svetlana V

    2013-05-01

Previously, multi-voxel pattern analysis has been used to decode words referring to concrete object categories. In this study we investigated whether single-trial brain activity was sufficient to distinguish abstract (e.g., mercy) versus concrete (e.g., barn) concept representations. Multiple neuroimaging studies have identified differences in the processing of abstract versus concrete concepts based on activity averaged across time, using univariate methods. In this study we used multi-voxel pattern analysis to decode functional magnetic resonance imaging (fMRI) data recorded while participants performed a semantic similarity judgment task on triplets of either abstract or concrete words with similar meanings. Classifiers were trained to identify individual trials as concrete or abstract. Cross-validated accuracies for classifying trials as abstract or concrete were significantly above chance (P < 0.05) for all participants. Discriminating information was distributed in multiple brain regions. Moreover, accuracy in identifying single-trial data from any one participant as abstract or concrete was also reliably above chance (P < 0.05) when the classifier was trained solely on data from other participants. These results suggest that the representations of abstract and concrete concepts differ in neural activity patterns across the whole brain over short periods of time.
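The decoding setup (train a linear classifier on single-trial patterns, then test whether cross-validated accuracy exceeds the 50% chance level) can be sketched with scikit-learn on synthetic data; the trial counts and signal strength below are illustrative, not the study's:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# hypothetical single-trial data: 120 trials x 500 voxels, with a weak
# class-specific offset added to 20 voxels on "abstract" trials
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 60)                  # 0 = concrete, 1 = abstract
X = rng.standard_normal((120, 500))
X[y == 1, :20] += 0.6

acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=6).mean()
print(acc > 0.5)  # cross-validated accuracy is reliably above chance here
```

A permutation test on shuffled labels would give the significance threshold the study's P < 0.05 criterion refers to.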

  6. Infrared moving small target detection based on saliency extraction and image sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaomin; Ren, Kan; Gao, Jin; Li, Chaowei; Gu, Guohua; Wan, Minjie

    2016-10-01

Moving small target detection in infrared images is a crucial technique of infrared search and tracking systems. This paper presents a novel small target detection technique based on frequency-domain saliency extraction and image sparse representation. First, we exploit the features of the Fourier spectrum image and the magnitude spectrum of the Fourier transform to make a rough extraction of salient regions, and use threshold segmentation to separate the regions that look salient from the background, which yields a binary image. Second, a new patch-image model and an over-complete dictionary are introduced into the detection system, converting infrared small target detection into an optimization process that reconstructs patch-image information based on sparse representation. More specifically, the test image and the binary image are decomposed into image patches following certain rules. We select the potential target area according to the binary patch-image, which contains the salient region information, and then exploit the over-complete infrared small target dictionary to reconstruct the test image blocks that may contain targets; the coefficients of a target image patch are sparse. Finally, for image sequences, the Euclidean distance is used to reduce the false alarm ratio and increase the detection accuracy of moving small targets in infrared images, owing to the correlation of target positions between frames.
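The frequency-domain saliency extraction in the first stage is in the spirit of spectral-residual saliency: whiten the log-magnitude spectrum, keep the phase, and transform back so that small bright targets pop out. A minimal sketch on a synthetic frame (the filter sizes are assumptions, not the paper's parameters):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def saliency(img):
    """Spectral-residual-style saliency map: subtract a locally averaged
    log-magnitude spectrum from the original, keep the phase, and invert."""
    F = np.fft.fft2(img)
    log_mag = np.log(np.abs(F) + 1e-8)
    residual = log_mag - uniform_filter(log_mag, size=3)
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * np.angle(F)))) ** 2
    return gaussian_filter(sal, sigma=2)

img = np.full((64, 64), 0.2)     # flat background
img[30:33, 40:43] = 1.0          # one small bright "target"
sal = saliency(img)
print(sal[31, 41] > sal.mean())  # → True: the target region stands out
```

Thresholding this map would give the binary patch-image used to preselect candidate target areas.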

  7. Weighted sparse representation for human ear recognition based on local descriptor

    NASA Astrophysics Data System (ADS)

    Mawloud, Guermoui; Djamel, Melaab

    2016-01-01

A two-stage ear recognition framework is presented in which two local descriptors and a sparse representation algorithm are combined. In the first stage, the algorithm deduces a subset of the training neighbors closest to the test ear sample. The selection is based on the K-nearest neighbors classifier in the pattern of oriented edge magnitude feature space. In the second stage, the co-occurrence of adjacent local binary pattern features is extracted from the preselected subset and combined to form a dictionary. Afterward, a sparse representation classifier is employed on the developed dictionary to infer the element closest to the test sample. Thus, by splitting the ear image into a number of segments and applying the described recognition routine to each of them, the algorithm assigns a final class label by majority voting over the individual labels produced by the segments. Experimental results demonstrate the effectiveness and robustness of the proposed scheme over leading state-of-the-art methods. Especially when the ear image is occluded, the proposed algorithm exhibits great robustness and matches the recognition performance outlined in the state of the art.

  8. Force Concept Inventory-Based Multiple-Choice Test for Investigating Students' Representational Consistency

    ERIC Educational Resources Information Center

    Nieminen, Pasi; Savinainen, Antti; Viiri, Jouni

    2010-01-01

    This study investigates students' ability to interpret multiple representations consistently (i.e., representational consistency) in the context of the force concept. For this purpose we developed the Representational Variant of the Force Concept Inventory (R-FCI), which makes use of nine items from the 1995 version of the Force Concept Inventory…

  9. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    SciTech Connect

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang; Chen, Ken Chung; Shen, Steve G. F.; Yan, Jin; Lee, Philip K. M.; Chow, Ben; Liu, Nancy X.; Xia, James J.; Shen, Dinggang

    2014-04-15

Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step in generating three-dimensional (3D) models for the diagnosis and treatment planning of these patients. However, due to poor image quality, including very low signal-to-noise ratio and widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: The authors propose a method for fully automated CBCT segmentation that uses patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject, and a sparse-based label propagation strategy is then employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparison with the traditional registration strategy and a population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy in comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT images.

  10. Addressing the translational dilemma: dynamic knowledge representation of inflammation using agent-based modeling.

    PubMed

    An, Gary; Christley, Scott

    2012-01-01

Given the panoply of system-level diseases that result from disordered inflammation, such as sepsis, atherosclerosis, cancer, and autoimmune disorders, understanding and characterizing the inflammatory response is a key target of biomedical research. Untangling the complex behavioral configurations associated with a process as ubiquitous as inflammation represents a prototype of the translational dilemma: the ability to translate mechanistic knowledge into effective therapeutics. A critical failure point in the current research environment is a throughput bottleneck at the level of evaluating hypotheses of mechanistic causality; these hypotheses represent the key step toward the application of knowledge for therapy development and design. Addressing the translational dilemma will require utilizing the ever-increasing power of computers and computational modeling to increase the efficiency of the scientific method in the identification and evaluation of hypotheses of mechanistic causality. More specifically, development needs to focus on facilitating the ability of non-computer trained biomedical researchers to utilize and instantiate their knowledge in dynamic computational models. This is termed "dynamic knowledge representation." Agent-based modeling is an object-oriented, discrete-event, rule-based simulation method that is well suited for biomedical dynamic knowledge representation. Agent-based modeling has been used in the study of inflammation at multiple scales. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for addressing the translational dilemma. This review describes agent-based modeling, gives examples of its applications in the study of inflammation, and introduces a proposed general expansion of the use of modeling and simulation to augment the generation and evaluation of knowledge.

  11. Acid and base degraded products of ketorolac.

    PubMed

    Salaris, Margherita; Nieddu, Maria; Rubattu, Nicola; Testa, Cecilia; Luongo, Elvira; Rimoli, Maria Grazia; Boatto, Gianpiero

    2010-06-05

    The stability of ketorolac tromethamine was investigated in acid (0.5M HCl) and alkaline conditions (0.5M NaOH), using the same procedure reported by Devarajan et al. [2]. The acid and base degradation products were identified by liquid chromatography-mass spectrometry (LC-MS).

  12. Subject and Citation Indexing. Part I: The Clustering Structure of Composite Representations in the Cystic Fibrosis Document Collection. Part II: The Optimal, Cluster-Based Retrieval Performance of Composite Representations.

    ERIC Educational Resources Information Center

    Shaw, W. M., Jr.

    1991-01-01

    Two articles discuss the clustering of composite representations in the Cystic Fibrosis Document Collection from the National Library of Medicine's MEDLINE file. Clustering is evaluated as a function of the exhaustivity of composite representations based on Medical Subject Headings (MeSH) and citation indexes, and evaluation of retrieval…

  13. Generating Multiple Base-Resolution DNA Methylomes Using Reduced Representation Bisulfite Sequencing.

    PubMed

    Chatterjee, Aniruddha; Rodger, Euan J; Stockwell, Peter A; Le Mée, Gwenn; Morison, Ian M

    2017-01-01

    Reduced representation bisulfite sequencing (RRBS) is an effective technique for profiling genome-wide DNA methylation patterns in eukaryotes. RRBS couples size selection, bisulfite conversion, and second-generation sequencing to enrich for CpG-dense regions of the genome. The progressive improvement of second-generation sequencing technologies and reduction in cost provided an opportunity to examine the DNA methylation patterns of multiple genomes. Here, we describe a protocol for sequencing multiple RRBS libraries in a single sequencing reaction to generate base-resolution methylomes. Furthermore, we provide a brief guideline for base-calling and data analysis of multiplexed RRBS libraries. These strategies will be useful to perform large-scale, genome-wide DNA methylation analysis.

  14. Site of metabolism prediction based on ab initio derived atom representations.

    PubMed

    Finkelmann, Arndt R; Göller, Andreas H; Schneider, Gisbert

    2017-03-21

Machine learning models for site of metabolism (SoM) prediction offer the ability to identify metabolic soft spots in low molecular weight drug molecules at low computational cost and enable data-based reactivity prediction. SoM prediction is an atom classification problem. Successful construction of machine learning models requires atom representations that capture the reactivity-determining features of a potential reaction site. We have developed a descriptor scheme that characterizes an atom's steric and electronic environment and its relative location in the molecular structure. The partial charge distributions were obtained from fast quantum mechanical calculations. We successfully trained machine learning classifiers on curated cytochrome P450 metabolism data. The models based on the new atom descriptors showed sustained accuracy for retrospective analyses of metabolism optimization campaigns and lead optimization projects from Bayer Pharmaceuticals. The results obtained demonstrate the practicality of quantum-chemistry-supported machine learning models for hit-to-lead optimization.

  15. Locality-preserving sparse representation-based classification in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Gao, Lianru; Yu, Haoyang; Zhang, Bing; Li, Qingting

    2016-10-01

    This paper proposes to combine locality-preserving projections (LPP) and sparse representation (SR) for hyperspectral image classification. The LPP is first used to reduce the dimensionality of all the training and testing data by finding the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the manifold, where the high-dimensional data lies. Then, SR codes the projected testing pixels as sparse linear combinations of all the training samples to classify the testing pixels by evaluating which class leads to the minimum approximation error. The integration of LPP and SR represents an innovative contribution to the literature. The proposed approach, called locality-preserving SR-based classification, addresses the imbalance between high dimensionality of hyperspectral data and the limited number of training samples. Experimental results on three real hyperspectral data sets demonstrate that the proposed approach outperforms the original counterpart, i.e., SR-based classification.

  16. Image super-resolution reconstruction via RBM-based joint dictionary learning and sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Zhaohui; Liu, Anran; Lei, Qian

    2015-12-01

    In this paper, we propose a method for single image super-resolution (SR). Given a training set produced from a large number of paired high- and low-resolution image patches, an over-complete joint dictionary is first learned over the coupled high/low-resolution feature spaces using Restricted Boltzmann Machines (RBMs). Then, for each low-resolution patch densely extracted from an up-scaled low-resolution input image, its high-resolution counterpart is reconstructed via sparse representation. Finally, the reconstructed patches are overlapped to form a large image, and a high-resolution image is obtained by means of iterated residual image compensation. Experimental results verify the effectiveness of the proposed method.

  17. Aspect-Aided Dynamic Non-Negative Sparse Representation-Based Microwave Image Classification

    PubMed Central

    Zhang, Xinzheng; Yang, Qiuyue; Liu, Miaomiao; Jia, Yunjian; Liu, Shujun; Li, Guojun

    2016-01-01

    Classification of target microwave images is an important application in many areas, such as security and surveillance. For this task, a recognition algorithm based on aspect-aided dynamic non-negative least squares (ADNNLS) sparse representation is proposed. First, an aspect sector is determined whose center is the estimated aspect angle of the testing sample. The training samples in the aspect sector are divided into active atoms and inactive atoms by smooth self-representative learning. Second, for each testing sample, the corresponding active atoms are selected dynamically, thereby establishing a dynamic dictionary. Third, the testing sample is represented with ℓ1-regularized non-negative sparse representation under the corresponding dynamic dictionary. Finally, the class label of the testing sample is identified by the minimum reconstruction error. The proposed algorithm was verified on the Moving and Stationary Target Acquisition and Recognition (MSTAR) database, which was acquired by synthetic aperture radar. Experimental results validated that the proposed approach captures the local aspect characteristics of microwave images effectively, thereby improving classification performance. PMID:27598172

  18. Stereo vision-based obstacle avoidance for micro air vehicles using an egocylindrical image space representation

    NASA Astrophysics Data System (ADS)

    Brockers, R.; Fragoso, A.; Matthies, L.

    2016-05-01

    Micro air vehicles that operate autonomously at low altitude in cluttered environments require onboard obstacle avoidance for safe operation. Previous methods deploy either purely reactive approaches, mapping low-level visual features directly to actuator inputs to maneuver the vehicle around obstacles, or deliberative methods that use onboard 3-D sensors to create a voxel-based 3-D world model, which is then used to generate collision-free 3-D trajectories. In this paper, we use forward-looking stereo vision with a large horizontal and vertical field of view and project stereo range into a novel robot-centered, cylindrical, inverse-range map we call an egocylinder. This reduces the complexity of the world representation from a 3-D map to a 2.5-D image-space representation, which supports very efficient motion planning and collision checking, and makes it possible to implement configuration-space expansion as an image-processing function directly on the egocylinder. Deploying a fast reactive motion planner directly on the configuration-space-expanded egocylinder image, we demonstrate the effectiveness of this new approach experimentally in an indoor environment.
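A minimal sketch of the egocylinder idea: project a robot-frame 3-D point onto a cylindrical image cell that stores inverse range. The discretization and field-of-view values below are hypothetical, not taken from the paper:

```python
import math

def egocylinder_bin(x, y, z, n_az=360, n_el=90, fov_el=math.pi / 2):
    """Map a robot-frame 3-D point to (row, col, inverse range) on a
    cylindrical image: columns index azimuth around the vertical axis,
    rows index elevation within a fixed vertical field of view."""
    az = math.atan2(y, x)                    # azimuth in (-pi, pi]
    el = math.atan2(z, math.hypot(x, y))     # elevation above horizontal
    col = int((az + math.pi) / (2 * math.pi) * n_az) % n_az
    row = int((el + fov_el / 2) / fov_el * n_el)
    inv_range = 1.0 / math.sqrt(x * x + y * y + z * z)
    return row, col, inv_range
```

Storing inverse range keeps nearby obstacles at high resolution, and a collision check reduces to comparing stored inverse ranges along the planned direction.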

  19. Gender in facial representations: a contrast-based study of adaptation within and between the sexes.

    PubMed

    Oruç, Ipek; Guo, Xiaoyue M; Barton, Jason J S

    2011-01-18

    Face aftereffects are proving to be an effective means of examining the properties of face-specific processes in the human visual system. We examined the role of gender in the neural representation of faces using a contrast-based adaptation method. If faces of different genders share the same representational face space, then adaptation to a face of one gender should affect both same- and different-gender faces. Further, if these aftereffects differ in magnitude, this may indicate distinct gender-related factors in the organization of this face space. To control for a potential confound between physical similarity and gender, we used a Bayesian ideal observer and human discrimination data to construct a stimulus set in which pairs of different-gender faces were as dissimilar as same-gender pairs. We found that the recognition of both same-gender and different-gender faces was suppressed following a brief exposure of 100 ms. Moreover, recognition was more suppressed for test faces of a different gender than for those of the same gender as the adaptor, despite the equivalence in physical and psychophysical similarity. Our results suggest that male and female faces likely occupy the same face space, allowing transfer of aftereffects between the genders, but that special properties emerge along gender-defining dimensions of this space.

  20. Aspect-Aided Dynamic Non-Negative Sparse Representation-Based Microwave Image Classification.

    PubMed

    Zhang, Xinzheng; Yang, Qiuyue; Liu, Miaomiao; Jia, Yunjian; Liu, Shujun; Li, Guojun

    2016-09-02

    Classification of target microwave images is an important application in many areas, such as security and surveillance. For this task, a recognition algorithm based on aspect-aided dynamic non-negative least squares (ADNNLS) sparse representation is proposed. First, an aspect sector is determined whose center is the estimated aspect angle of the testing sample. The training samples in the aspect sector are divided into active atoms and inactive atoms by smooth self-representative learning. Second, for each testing sample, the corresponding active atoms are selected dynamically, thereby establishing a dynamic dictionary. Third, the testing sample is represented with ℓ1-regularized non-negative sparse representation under the corresponding dynamic dictionary. Finally, the class label of the testing sample is identified by the minimum reconstruction error. The proposed algorithm was verified on the Moving and Stationary Target Acquisition and Recognition (MSTAR) database, which was acquired by synthetic aperture radar. Experimental results validated that the proposed approach captures the local aspect characteristics of microwave images effectively, thereby improving classification performance.
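The ℓ1-regularized non-negative coding step can be sketched with a projected ISTA iteration. This is a generic stand-in for the ADNNLS solver described above; the dictionary, parameters, and iteration count are illustrative:

```python
import numpy as np

def nonneg_l1_code(D, y, lam=0.1, n_iter=300):
    """l1-regularized NON-negative sparse coding by projected ISTA:
    gradient step, l1 shrinkage (for x >= 0 this is just subtracting
    lam/L), then projection onto the non-negative orthant."""
    L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ x - y)
        x = np.maximum(x - g / L - lam / L, 0.0)
    return x
```

Class assignment then proceeds as in ordinary sparse-representation classification: reconstruct the testing sample from each class's coefficients and take the minimum-error class.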

  1. An exploratory study of mental representations for rehabilitation based upon the Theory of Planned Behaviour.

    PubMed

    Bains, Baljinder; Powell, Theresa; Lorenc, Louise

    2007-04-01

    This study explores whether mental representations based upon the Theory of Planned Behaviour (TPB) can predict engagement in rehabilitation after acquired brain injury (ABI). A scale was developed to measure: treatment outcome beliefs, perceived barriers, subjective norm, and control cognitions. Other adjustment factors that are often seen as important in predicting engagement, e.g., denial and anger, were also measured using the Motivation for Traumatic Brain Injury Questionnaire (MOT-Q). Clinicians also provided ratings of patients' engagement in rehabilitation, which were used as the measure of actual behaviour. The scales were administered to 40 participants with ABI who were a mean of 14 months post-injury. The new scale showed good internal reliability, and regression analyses demonstrated that "treatment outcome beliefs" were the main aspect of TPB that predicted engagement. Control beliefs and adjustment factors added very little to the variance in engagement already explained by TPB. However, the amount of variance explained by TPB was small, and it is suggested that other volitional factors need to be considered in order to predict engagement more accurately. This study thus constitutes a first exploration of mental representations about rehabilitation after ABI using TPB. It suggests a theoretical model that may help clinicians to formulate lack of engagement. It also suggests that addressing patients' beliefs about the benefits of undertaking rehabilitation may prove fruitful in increasing engagement.

  2. Representation Learning of Temporal Dynamics for Skeleton-Based Action Recognition.

    PubMed

    Du, Yong; Fu, Yun; Wang, Liang

    2016-07-01

    Motion characteristics of human actions can be represented by the position variation of skeleton joints. Traditional approaches generally extract the spatial-temporal representation of the skeleton sequences with well-designed hand-crafted features. In this paper, in order to recognize actions according to the relative motion between the limbs and the trunk, we propose an end-to-end hierarchical RNN for skeleton-based action recognition. We divide human skeleton into five main parts in terms of the human physical structure, and then feed them to five independent subnets for local feature extraction. After the following hierarchical feature fusion and extraction from local to global, dimensions of the final temporal dynamics representations are reduced to the same number of action categories in the corresponding data set through a single-layer perceptron. In addition, the output of the perceptron is temporally accumulated as the input of a softmax layer for classification. Random scale and rotation transformations are employed to improve the robustness during training. We compare with five other deep RNN variants derived from our model in order to verify the effectiveness of the proposed network. In addition, we compare with several other methods on motion capture and Kinect data sets. Furthermore, we evaluate the robustness of our model trained with random scale and rotation transformations for a multiview problem. Experimental results demonstrate that our model achieves the state-of-the-art performance with high computational efficiency.

  3. Sparse representation based on local time-frequency template matching for bearing transient fault feature extraction

    NASA Astrophysics Data System (ADS)

    He, Qingbo; Ding, Xiaoxi

    2016-05-01

    The transients caused by a localized fault are important measurement information for bearing fault diagnosis. Thus it is crucial to extract the transients from bearing vibration or acoustic signals, which are always corrupted by a large amount of background noise. In this paper, an iterative transient feature extraction approach is proposed based on time-frequency (TF) domain sparse representation. The approach is realized by a new method called local TF template matching. In this method, TF atoms are constructed from the TF distribution (TFD) of Morlet wavelet bases, and local TF templates are formulated from the TF atoms for the matching process. The instantaneous frequency (IF) ridge calculated from the TFD of an analyzed signal provides the frequency parameter values for the TF atoms as well as an effective template matching path on the TF plane. In each iteration, local TF templates are correlated with the TFD of the analyzed signal along the IF ridge tube to identify the optimum parameters of the transient wavelet model. With this iterative procedure, transients are extracted in the TF domain from measured signals one by one. The final signal is synthesized by combining the extracted TF atoms with the phase of the raw signal. Local TF template matching builds an effective TF matching-based sparse representation approach with the merit of matching the native pulse waveform structure of transients. The effectiveness of the proposed method is verified on signals from defective bearings. Comparison results also show that the proposed method is superior to traditional methods in transient feature extraction.
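A one-dimensional sketch of the template-matching idea: build Morlet-type atoms at candidate centers and keep the center whose atom correlates best with the signal. The paper's 2-D TF-plane matching along the IF ridge is more elaborate; the atom form and parameters here are illustrative only:

```python
import numpy as np

def morlet_atom(n, t0, f, sigma):
    """Real Morlet-type atom: Gaussian envelope of width sigma centered
    at sample t0, modulating a cosine of frequency f (cycles/sample)."""
    t = np.arange(n)
    u = (t - t0) / sigma
    return np.exp(-0.5 * u ** 2) * np.cos(2 * np.pi * f * (t - t0))

def best_center(signal, f, sigma):
    """Return the atom center maximizing correlation with the signal,
    a 1-D stand-in for TF-plane template matching."""
    n = len(signal)
    scores = [float(signal @ morlet_atom(n, t0, f, sigma)) for t0 in range(n)]
    return int(np.argmax(scores))
```

In the iterative scheme described above, the best-matching atom would be subtracted and the search repeated to extract transients one by one.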

  4. A critical review of the allocentric spatial representation and its neural underpinnings: toward a network-based perspective

    PubMed Central

    Ekstrom, Arne D.; Arnold, Aiden E. G. F.; Iaria, Giuseppe

    2014-01-01

    While the widely studied allocentric spatial representation holds a special status in neuroscience research, its exact nature and neural underpinnings continue to be the topic of debate, particularly in humans. Here, based on a review of human behavioral research, we argue that allocentric representations do not provide the kind of map-like, metric representation one might expect based on past theoretical work. Instead, we suggest that almost all tasks used in past studies involve a combination of egocentric and allocentric representation, complicating both the investigation of the cognitive basis of an allocentric representation and the task of identifying a brain region specifically dedicated to it. Indeed, as we discuss in detail, past studies suggest numerous brain regions important to allocentric spatial memory in addition to the hippocampus, including parahippocampal, retrosplenial, and prefrontal cortices. We thus argue that although allocentric computations will often require the hippocampus, particularly those involving extracting details across temporally specific routes, the hippocampus is not necessary for all allocentric computations. We instead suggest that a non-aggregate network process involving multiple interacting brain areas, including hippocampus and extra-hippocampal areas such as parahippocampal, retrosplenial, prefrontal, and parietal cortices, better characterizes the neural basis of spatial representation during navigation. According to this model, an allocentric representation does not emerge from the computations of a single brain region (i.e., hippocampus) nor is it readily decomposable into additive computations performed by separate brain regions. Instead, an allocentric representation emerges from computations partially shared across numerous interacting brain regions. We discuss our non-aggregate network model in light of existing data and provide several key predictions for future experiments. PMID:25346679

  5. Solid Acid Based Fuel Cells

    DTIC Science & Technology

    2007-11-02

    superprotonic solid acids with elements such as P, As, Si and Ge, which have greater affinities to oxygen, we anticipate that the reduction reaction will be...bulk material consisted of an apatite phase (hexagonal symmetry) of variable composition, LixLa10-x(SiO4)6O3-x, with excess lithium residing in the...in Tables 1 and 2, indicate that this compound is a rather conventional apatite with fixed stoichiometry, LiLa9(SiO4)6O2 (x = 1). Such a result is

  6. Whole body acid-base modeling revisited.

    PubMed

    Ring, Troels; Nielsen, Søren

    2017-04-01

    The textbook account of whole body acid-base balance in terms of endogenous acid production, renal net acid excretion, and gastrointestinal alkali absorption, which is the only comprehensive model available, has never been applied in clinical practice or formally validated. To improve understanding of acid-base modeling, we reformulated this conventional model as an expression solely in terms of urine chemistry. Renal net acid excretion and endogenous acid production were already formulated in terms of urine chemistry, and from the literature gastrointestinal alkali absorption can likewise be expressed through urinary excretions. With a few assumptions, this expression of net acid balance proves arithmetically identical to minus the urine charge, so that under developing acidosis urine is predicted to acquire a net negative charge. Because the literature already mentions unexplained negative urine charges, we scrutinized a series of seminal papers and confirmed empirically the theoretical prediction that observed urine charge becomes negative as acidosis develops. Hence, the conventional model is problematic: it predicts what is physiologically impossible. We therefore need a new model of whole body acid-base balance without impossible implications, and new experimental studies are needed to account for charge imbalance in urine during the development of acidosis.
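The charge-balance quantity at issue can be illustrated with simple arithmetic over the major measured urinary ions. The valences are standard chemistry, but the ion list and the example concentrations below are illustrative; organic anions (typically unmeasured) are deliberately omitted, which is exactly why a measured "urine charge" can come out non-zero:

```python
def urine_charge(na, k, ca, mg, nh4, cl, hpo4, h2po4, so4, hco3):
    """Net urine charge in mEq/L from measured ion concentrations (mmol/L),
    weighting each ion by its valence. A negative result means measured
    anions exceed measured cations."""
    cations = na + k + 2 * ca + 2 * mg + nh4      # Na+, K+, Ca2+, Mg2+, NH4+
    anions = cl + 2 * hpo4 + h2po4 + 2 * so4 + hco3  # Cl-, HPO4(2-), H2PO4-, SO4(2-), HCO3-
    return cations - anions
```

The review's argument is that the conventional net-acid-balance expression reduces arithmetically to minus this quantity, so an observed negative charge signals a problem with the model rather than with electroneutrality.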

  7. Synthesis of new kojic acid based unnatural α-amino acid derivatives.

    PubMed

    Balakrishna, C; Payili, Nagaraju; Yennam, Satyanarayana; Devi, P Uma; Behera, Manoranjan

    2015-11-01

    An efficient method for the preparation of kojic acid-based α-amino acid derivatives by alkylation of a glycinate Schiff base with bromokojic acids is described. Using this method, mono- as well as dialkylated kojic acid-amino acid conjugates have been prepared. This is the first synthesis of a C-linked kojic acid-amino acid conjugate in which kojic acid is directly linked to the amino acid through a C-C bond.

  8. Face recognition via edge-based Gabor feature representation for plastic surgery-altered images

    NASA Astrophysics Data System (ADS)

    Chude-Olisah, Chollette C.; Sulong, Ghazali; Chude-Okonkwo, Uche A. K.; Hashim, Siti Z. M.

    2014-12-01

    Plastic surgery procedures on the face introduce skin texture variations between images of the same person (intra-subject), thereby making the task of face recognition more difficult than in the normal scenario. Usually, in contemporary face recognition systems, the original gray-level face image is used as input to the Gabor descriptor, which translates to encoding some texture properties of the face image. The texture-encoding process significantly degrades the performance of such systems in the case of plastic surgery due to the presence of surgically induced intra-subject variations. Based on the proposition that the shape of significant facial components such as the eyes, nose, eyebrows, and mouth remains unchanged after plastic surgery, this paper employs an edge-based Gabor feature representation approach for the recognition of surgically altered face images. We use the edge information, which is dependent on the shapes of the significant facial components, to address the plastic surgery-induced texture variation problems. To ensure that the significant facial components represent useful edge information with little or no false edges, a simple illumination normalization technique is proposed for preprocessing. A Gabor wavelet is applied to the edge image to accentuate the uniqueness of the significant facial components for discriminating among different subjects. The performance of the proposed method is evaluated on the Georgia Tech (GT) and the Labeled Faces in the Wild (LFW) databases with illumination and expression problems, and on a plastic surgery database with texture changes. Results show that the proposed edge-based Gabor feature representation approach is robust against plastic surgery-induced face variations amidst expression and illumination problems and outperforms the existing plastic surgery face recognition methods reported in the literature.

  9. Are word representations abstract or instance-based? Effects of spelling inconsistency in orthographic learning.

    PubMed

    Burt, Jennifer S; Long, Julia

    2011-09-01

    In Experiment 1, 62 10-year-old children studied printed pseudowords with semantic information. The items were later re-presented in a different format for reading, with half of the items spelled in the same way as before and half displayed in a new, phonologically equivalent spelling. In a dictation test, the exposure to an alternative spelling substantially increased the number of errors that matched the alternative spelling, especially in good spellers. Orthographic learning predicted word identification when accuracy on orthographic choice for words was controlled. In Experiment 2, the effects on dictation responses of exposure to a misspelling versus the correct spelling, and the interactive effect of spelling ability, were confirmed relative to a no-exposure control in adults. The results support a single-lexicon view of reading and spelling and have implications for abstractionist and instance-based theories of orthographic representations.

  10. Feature Selection Based on High Dimensional Model Representation for Hyperspectral Images.

    PubMed

    Taskin Kaya, Gulsen; Kaya, Huseyin; Bruzzone, Lorenzo

    2017-03-24

    In hyperspectral image analysis, the classification task has generally been addressed jointly with dimensionality reduction, due to both the high correlation between spectral features and the noise present in spectral bands, which might significantly degrade classification performance. In supervised classification, limited training instances in proportion to the number of spectral features have a negative impact on classification accuracy, which is known as the Hughes effect or the curse of dimensionality in the literature. In this paper, we focus on the dimensionality reduction problem and propose a novel feature-selection algorithm based on High Dimensional Model Representation. The proposed algorithm is tested on toy examples and hyperspectral datasets against conventional feature-selection algorithms in terms of classification accuracy, stability of the selected features, and computational time. The results show that the proposed approach provides both high classification accuracy and robust features with a satisfactory computational time.

  11. Schematic representation of residue-based protein context-dependent data: an application to transmembrane proteins.

    PubMed

    Campagne, F; Weinstein, H

    1999-01-01

    An algorithmic method for drawing residue-based schematic diagrams of proteins on a 2D page is presented and illustrated. The method allows the creation of rendering engines dedicated to a given family of sequences, or fold. The initial implementation provides an engine that can produce a 2D diagram representing secondary structure for any transmembrane protein sequence. We present the details of the strategy for automating the drawing of these diagrams. The most important part of this strategy is the development of an algorithm for laying out residues of a loop that connects to arbitrary points of a 2D plane. As implemented, this algorithm is suitable for real-time modification of the loop layout. This work is of interest for the representation and analysis of data from (1) protein databases, (2) mutagenesis results, or (3) various kinds of protein context-dependent annotations or data.

  12. A Direct, Biomass-Based Synthesis of Benzoic Acid: Formic Acid-Mediated Deoxygenation of the Glucose-Derived Materials Quinic Acid and Shikimic Acid

    SciTech Connect

    Arceo, Elena; Ellman, Jonathan; Bergman, Robert

    2010-05-03

    An alternative biomass-based route to benzoic acid from the renewable starting materials quinic acid and shikimic acid is described. Benzoic acid is obtained selectively using a highly efficient, one-step formic acid-mediated deoxygenation method.

  13. Spatiotemporal Context Awareness for Urban Traffic Modeling and Prediction: Sparse Representation Based Variable Selection

    PubMed Central

    Yang, Su; Shi, Shixiong; Hu, Xiaobing; Wang, Minjie

    2015-01-01

    Spatial-temporal correlations among the data play an important role in traffic flow prediction. Correspondingly, traffic modeling and prediction based on big data analytics has emerged due to the city-scale interactions among traffic flows. A new methodology based on sparse representation is proposed to reveal the spatial-temporal dependencies among traffic flows so as to simplify the correlations among traffic data for the prediction task at a given sensor. Three important findings are observed in the experiments: (1) Only traffic flows immediately prior to the present time affect the formation of current traffic flows, which implies the possibility of reducing the traditional high-order predictors to a first-order model. (2) The spatial context relevant to a given prediction task is more complex than what is assumed to exist locally and can spread out to the whole city. (3) The spatial context varies with the target sensor undergoing prediction and enlarges with the increment of time lag for prediction. Because the scope of human mobility is subject to travel time, identifying the varying spatial context against time lag is crucial for prediction. Since sparse representation can capture the varying spatial context to adapt to the prediction task, it outperforms traditional methods whose inputs are confined to data from a fixed number of nearby sensors. As the spatial-temporal context for any prediction task is fully detected from the traffic data in an automated manner, with no additional information regarding network topology needed, the method scales well to large-scale networks. PMID:26496370

  14. Spatiotemporal Context Awareness for Urban Traffic Modeling and Prediction: Sparse Representation Based Variable Selection.

    PubMed

    Yang, Su; Shi, Shixiong; Hu, Xiaobing; Wang, Minjie

    2015-01-01

    Spatial-temporal correlations among the data play an important role in traffic flow prediction. Correspondingly, traffic modeling and prediction based on big data analytics has emerged due to the city-scale interactions among traffic flows. A new methodology based on sparse representation is proposed to reveal the spatial-temporal dependencies among traffic flows so as to simplify the correlations among traffic data for the prediction task at a given sensor. Three important findings are observed in the experiments: (1) Only traffic flows immediately prior to the present time affect the formation of current traffic flows, which implies the possibility of reducing the traditional high-order predictors to a first-order model. (2) The spatial context relevant to a given prediction task is more complex than what is assumed to exist locally and can spread out to the whole city. (3) The spatial context varies with the target sensor undergoing prediction and enlarges with the increment of time lag for prediction. Because the scope of human mobility is subject to travel time, identifying the varying spatial context against time lag is crucial for prediction. Since sparse representation can capture the varying spatial context to adapt to the prediction task, it outperforms traditional methods whose inputs are confined to data from a fixed number of nearby sensors. As the spatial-temporal context for any prediction task is fully detected from the traffic data in an automated manner, with no additional information regarding network topology needed, the method scales well to large-scale networks.
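The variable-selection idea (regress the target sensor's flow on candidate inputs with an l1 penalty and keep the nonzero coefficients as the relevant spatiotemporal context) can be sketched as follows. This is a generic lasso/ISTA stand-in, not the authors' exact formulation:

```python
import numpy as np

def lasso_support(X, y, lam=1.0, n_iter=500):
    """Fit an l1-regularized linear model by ISTA and return the indices
    of nonzero coefficients, i.e., the selected predictor variables."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = X.T @ (X @ w - y)
        z = w - g / L
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return {j for j in range(X.shape[1]) if abs(w[j]) > 1e-6}
```

Here the columns of X would be candidate sensor/lag series across the city; the selected set would then vary with the target sensor and the prediction lag, as the abstract's third finding describes.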

  15. Hierarchical QSAR technology based on the Simplex representation of molecular structure

    NASA Astrophysics Data System (ADS)

    Kuz'min, V. E.; Artemenko, A. G.; Muratov, E. N.

    2008-06-01

    This article is about the hierarchical quantitative structure-activity relationship technology (HiT QSAR) based on the Simplex representation of molecular structure (SiRMS) and its application for different QSAR/QSP(property)R tasks. The essence of this technology is a sequential solution (with the use of the information obtained on the previous steps) to the QSAR problem by the series of enhanced models of molecular structure description [from one dimensional (1D) to four dimensional (4D)]. It is a system of permanently improved solutions. In the SiRMS approach, every molecule is represented as a system of different simplexes (tetratomic fragments with fixed composition, structure, chirality and symmetry). The level of simplex descriptors detailing increases consecutively from the 1D to 4D representation of the molecular structure. The advantages of the approach reported here are the absence of "molecular alignment" problems, consideration of different physical-chemical properties of atoms (e.g. charge, lipophilicity, etc.), the high adequacy and good interpretability of obtained models and clear ways for molecular design. The efficiency of the HiT QSAR approach is demonstrated by comparing it with the most popular modern QSAR approaches on two representative examination sets. The examples of successful application of the HiT QSAR for various QSAR/QSPR investigations on the different levels (1D-4D) of the molecular structure description are also highlighted. The reliability of developed QSAR models as predictive virtual screening tools and their ability to serve as the base of directed drug design was validated by subsequent synthetic and biological experiments, among others. The HiT QSAR is realized as a complex of computer programs known as HiT QSAR software that also includes a powerful statistical block and a number of useful utilities.

  16. Epileptic Seizure Detection with Log-Euclidean Gaussian Kernel-Based Sparse Representation.

    PubMed

    Yuan, Shasha; Zhou, Weidong; Wu, Qi; Zhang, Yanli

    2016-05-01

    Epileptic seizure detection plays an important role in the diagnosis of epilepsy and reducing the massive workload of reviewing electroencephalography (EEG) recordings. In this work, a novel algorithm is developed to detect seizures employing log-Euclidean Gaussian kernel-based sparse representation (SR) in long-term EEG recordings. Unlike the traditional SR for vector data in Euclidean space, the log-Euclidean Gaussian kernel-based SR framework is proposed for seizure detection in the space of the symmetric positive definite (SPD) matrices, which form a Riemannian manifold. Since the Riemannian manifold is nonlinear, the log-Euclidean Gaussian kernel function is applied to embed it into a reproducing kernel Hilbert space (RKHS) for performing SR. The EEG signals of all channels are divided into epochs, and the SPD matrices representing EEG epochs are generated by covariance descriptors. Then, the testing samples are sparsely coded over the dictionary composed of training samples utilizing log-Euclidean Gaussian kernel-based SR. The classification of testing samples is achieved by computing the minimal reconstructed residuals. The proposed method is evaluated on the Freiburg EEG dataset of 21 patients and shows notable performance on both epoch-based and event-based assessments. Moreover, this method handles multiple channels of EEG recordings synchronously, which is faster and more efficient than traditional seizure detection methods.
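The kernel itself can be sketched in a few lines: map each SPD matrix through the matrix logarithm, then apply an ordinary Gaussian kernel to the Frobenius distance in the log domain. The eigendecomposition route below is valid because covariance descriptors are symmetric positive definite; gamma is an illustrative parameter:

```python
import numpy as np

def logm_spd(A):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def log_euclidean_kernel(A, B, gamma=1.0):
    """Log-Euclidean Gaussian kernel between two SPD matrices: a Gaussian
    kernel on the Frobenius distance after the matrix-log map."""
    d2 = np.sum((logm_spd(A) - logm_spd(B)) ** 2)   # squared log-Euclidean distance
    return float(np.exp(-gamma * d2))
```

Sparse coding in the induced RKHS then works with the kernel Gram matrix over the training SPD descriptors rather than with raw vectors.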

  17. "Lesson Rainbow": The Use of Multiple Representations in an Internet-Based, Discipline-Integrated Science Lesson

    ERIC Educational Resources Information Center

    Hsu, Ying-Shao

    2006-01-01

    This paper presents the development and evaluation of a web-based lesson--Lesson Rainbow. This lesson features multiple representations (MRs), which purposefully deliver concepts in relation to distinctive disciplinary subject areas through story-based animations that are closely related to learners' life experiences. The researchers selected 58…

  18. Progressive sparse representation-based classification using local discrete cosine transform evaluation for image recognition

    NASA Astrophysics Data System (ADS)

    Song, Xiaoning; Feng, Zhen-Hua; Hu, Guosheng; Yang, Xibei; Yang, Jingyu; Qi, Yunsong

    2015-09-01

    This paper proposes a progressive sparse representation-based classification algorithm using local discrete cosine transform (DCT) evaluation to perform face recognition. Specifically, the sum of the contributions of all training samples of each subject is first taken as the contribution of this subject, then the redundant subject with the smallest contribution to the test sample is iteratively eliminated. Second, the progressive method aims at representing the test sample as a linear combination of all the remaining training samples, by which the representation capability of each training sample is exploited to determine the optimal "nearest neighbors" for the test sample. Third, the transformed DCT evaluation is constructed to measure the similarity between the test sample and each local training sample using cosine distance metrics in the DCT domain. The final goal of the proposed method is to determine an optimal weighted sum of nearest neighbors that are obtained under the local correlative degree evaluation, which is approximately equal to the test sample, and we can use this weighted linear combination to perform robust classification. Experimental results conducted on the ORL database of faces (created by the Olivetti Research Laboratory in Cambridge), the FERET face database (managed by the Defense Advanced Research Projects Agency and the National Institute of Standards and Technology), AR face database (created by Aleix Martinez and Robert Benavente in the Computer Vision Center at U.A.B), and USPS handwritten digit database (gathered at the Center of Excellence in Document Analysis and Recognition at SUNY Buffalo) demonstrate the effectiveness of the proposed method.
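The DCT-domain cosine evaluation can be sketched directly. The orthonormal DCT-II below is written out from the textbook definition to keep the sketch self-contained; how the paper weights local training samples is not reproduced here:

```python
import numpy as np

def dct2_1d(x):
    """Orthonormal DCT-II of a 1-D signal, written out explicitly."""
    n = len(x)
    k = np.arange(n)[:, None]
    t = np.arange(n)[None, :] + 0.5
    C = np.cos(np.pi * k * t / n) * np.sqrt(2.0 / n)
    C[0] /= np.sqrt(2.0)                  # first row rescaled for orthonormality
    return C @ x

def dct_cosine_similarity(a, b):
    """Cosine similarity between two samples in the DCT domain."""
    fa = dct2_1d(np.asarray(a, float))
    fb = dct2_1d(np.asarray(b, float))
    return float(fa @ fb / (np.linalg.norm(fa) * np.linalg.norm(fb)))
```

Because the transform is orthonormal, this global similarity coincides with plain cosine similarity on the raw samples; the paper's evaluation differs in being applied locally, per training sample.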

  19. Research-Based Worksheets on Using Multiple Representations in Science Classrooms

    ERIC Educational Resources Information Center

    Hill, Matthew; Sharma, Manjula

    2015-01-01

    The ability to represent the world like a scientist is difficult to teach; it is more than simply knowing the representations (e.g., graphs, words, equations and diagrams). For meaningful science learning to take place, consideration needs to be given to explicitly integrating representations into instructional methods, linked to the content, and…

  20. Gender Difference in the Use of Thought Representation--A Corpus-Based Study

    ERIC Educational Resources Information Center

    Riissanen, Anne; Watson, Greg

    2014-01-01

    This study (Note 1) investigates potential differences in language use between genders, by applying a modified model of thought representation. Our hypothesis is that women use more direct forms of thought representation than men in modern spoken British English. Women are said to favour "private speech" that creates intimacy and…

  1. Interleaved Practice with Multiple Representations: Analyses with Knowledge Tracing Based Techniques

    ERIC Educational Resources Information Center

    Rau, Martina A.; Pardos, Zachary A.

    2012-01-01

    The goal of this paper is to use Knowledge Tracing to augment the results obtained from an experiment that investigated the effects of practice schedules using an intelligent tutoring system for fractions. Specifically, this experiment compared different practice schedules of multiple representations of fractions: representations were presented to…

  2. Clustering-weighted SIFT-based classification method via sparse representation

    NASA Astrophysics Data System (ADS)

    Sun, Bo; Xu, Feng; He, Jun

    2014-07-01

    In recent years, sparse representation-based classification (SRC) has received significant attention due to its high recognition rate. However, the original SRC method requires rigid alignment, which is crucial for its application. Features such as SIFT descriptors have therefore been introduced into the SRC method, resulting in an alignment-free variant. A feature-based dictionary, however, still contains considerable information useful for recognition. We explore the relationship between the similarity of SIFT descriptors and multitask recognition, and propose a clustering-weighted SIFT-based SRC method (CWS-SRC). The proposed approach is considerably more suitable for multitask recognition with sufficient samples. Using two public face databases (AR and Yale face) and a self-built car-model database, the performance of the proposed method is evaluated and compared with that of the SRC, SIFT matching, and MKD-SRC methods. Experimental results indicate that the proposed method exhibits better performance in the alignment-free scenario with sufficient samples.

  3. Feature-based representations of emotional facial expressions in the human amygdala.

    PubMed

    Ahs, Fredrik; Davis, Caroline F; Gorka, Adam X; Hariri, Ahmad R

    2014-09-01

    The amygdala plays a central role in processing facial affect, responding to diverse expressions and features shared between expressions. Although speculation exists regarding the nature of relationships between expression- and feature-specific amygdala reactivity, this matter has not been fully explored. We used functional magnetic resonance imaging and principal component analysis (PCA) in a sample of 300 young adults, to investigate patterns related to expression- and feature-specific amygdala reactivity to faces displaying neutral, fearful, angry or surprised expressions. The PCA revealed a two-dimensional correlation structure that distinguished emotional categories. The first principal component separated neutral and surprised from fearful and angry expressions, whereas the second principal component separated neutral and angry from fearful and surprised expressions. This two-dimensional correlation structure of amygdala reactivity may represent specific feature-based cues conserved across discrete expressions. To delineate which feature-based cues characterized this pattern, face stimuli were averaged and then subtracted according to their principal component loadings. The first principal component corresponded to displacement of the eyebrows, whereas the second principal component corresponded to increased exposure of eye whites together with movement of the brow. Our results suggest a convergent representation of facial affect in the amygdala reflecting feature-based processing of discrete expressions.

  4. Accelerated reconstruction of electrical impedance tomography images via patch based sparse representation

    NASA Astrophysics Data System (ADS)

    Wang, Qi; Lian, Zhijie; Wang, Jianming; Chen, Qingliang; Sun, Yukuan; Li, Xiuyan; Duan, Xiaojie; Cui, Ziqiang; Wang, Huaxiang

    2016-11-01

    Electrical impedance tomography (EIT) reconstruction is a nonlinear and ill-posed problem. Exact reconstruction of an EIT image inverts a high-dimensional mathematical model to calculate the conductivity field, and this computational burden reduces the achievable frame rate, which is considered a major advantage of EIT imaging. The single-step method, state estimation method, and projection method have commonly been used to accelerate the reconstruction process; the basic principle of these methods is to reduce computational complexity. However, maintaining high spatial resolution at modest computational cost remains challenging, especially for complex conductivity distributions. This study proposes an approach, termed CSEIT, that accelerates EIT image reconstruction based on compressive sensing (CS) theory. CSEIT reduces the sampling rate by minimizing redundancy in the measurements, so that detailed information is not lost in reconstruction. To obtain a sparse solution, which is the prior condition for signal recovery required by CS theory, a novel image reconstruction algorithm based on patch-based sparse representation is proposed. By applying the CSEIT framework, the data acquisition time, or sampling rate, is reduced by more than a factor of two, while the accuracy of reconstruction is significantly improved.
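    The compressive-sensing premise behind CSEIT (fewer measurements than unknowns, recovered exactly when the solution is sparse) can be illustrated independently of EIT. The sketch below uses generic orthogonal matching pursuit on synthetic data; it is not the authors' patch-based algorithm:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k atoms of A
    that best explain y, refitting by least squares each step."""
    resid, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ resid))))
        sub = A[:, idx]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        resid = y - sub @ coef
    x = np.zeros(A.shape[1])
    x[idx] = coef
    return x

rng = np.random.default_rng(1)
n, m, k = 100, 40, 3                 # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                              # only m < n measurements
x_hat = omp(A, y, k)
print(np.linalg.norm(x_hat - x_true))
```

    With 40 random measurements of a 3-sparse length-100 signal, the greedy recovery is exact up to numerical precision, which is the redundancy-reduction effect the abstract appeals to.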

  5. An application to pulmonary emphysema classification based on model of texton learning by sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Zhou, Xiangrong; Goshima, Satoshi; Chen, Huayue; Muramatsu, Chisako; Hara, Takeshi; Yokoyama, Ryojiro; Kanematsu, Masayuki; Fujita, Hiroshi

    2012-03-01

    We aim to use a new texton-based texture classification method for classifying pulmonary emphysema in computed tomography (CT) images of the lungs. Unlike conventional computer-aided diagnosis (CAD) methods for pulmonary emphysema classification, in this paper a dictionary of textons is first learned by applying sparse representation (SR) to image patches in the training dataset. The SR coefficients of the test images over the dictionary are then used to construct histograms as texture representations. Finally, classification is performed using a nearest neighbor classifier with a histogram dissimilarity measure as the distance. The proposed approach is tested on 3840 annotated regions of interest consisting of normal tissue and mild, moderate and severe pulmonary emphysema of three subtypes. The performance of the proposed system, with an accuracy of about 88%, is higher than that of a state-of-the-art method based on rotation-invariant local binary pattern histograms and of a texture classification method based on textons learned by k-means, one of the best-performing approaches in the literature.
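    A stripped-down version of the texton pipeline can be sketched as follows, substituting 1-sparse coding against a fixed random orthonormal dictionary for the learned SR dictionary, and an L1 histogram dissimilarity for the paper's measure; all names and data here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def texton_hist(patches, D):
    """Code each patch by its best-matching atom (1-sparse coding) and
    histogram atom usage over all patches of the region."""
    codes = np.argmax(np.abs(D.T @ patches), axis=0)
    h = np.bincount(codes, minlength=D.shape[1]).astype(float)
    return h / h.sum()

def nn_classify(h, train_hists):
    """Nearest neighbour on texton histograms with an L1 dissimilarity."""
    return min(train_hists, key=lambda lab: np.abs(h - train_hists[lab]).sum())

D = np.linalg.qr(rng.normal(size=(16, 16)))[0]   # stand-in texton dictionary

def sample(atom, n=200):
    """Synthetic 'texture': patches clustered around one dictionary atom."""
    return (D[:, atom:atom + 1] * rng.normal(2.0, 0.2, n)
            + 0.05 * rng.normal(size=(16, n)))

train_hists = {"normal": texton_hist(sample(0), D),
               "emphysema": texton_hist(sample(7), D)}
print(nn_classify(texton_hist(sample(7), D), train_hists))
```

    Patches drawn around atom 7 produce a histogram concentrated in that bin, so the nearest-neighbour step returns the matching class.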

  6. Sensorimotor representation and knowledge-based reasoning for spatial exploration and localisation.

    PubMed

    Zetzsche, C; Wolter, J; Schill, K

    2008-12-01

    We investigate a hybrid system for autonomous exploration and navigation, and implement it in a virtual mobile agent, which operates in virtual spatial environments. The system is based on several distinguishing properties. The representation is not map-like, but based on sensorimotor features, i.e. on combinations of sensory features and motor actions. The system has a hybrid architecture, which integrates a bottom-up processing of sensorimotor features with a top-down, knowledge-based reasoning strategy. This strategy selects the optimal motor action in each step according to the principle of maximum information gain. Two sensorimotor levels with different behavioural granularity are implemented, a macro-level, which controls the movements of the agent in space, and a micro-level, which controls its eye movements. At each level, the same type of hybrid architecture and the same principle of information gain are used for sensorimotor control. The localisation performance of the system is tested with large sets of virtual rooms containing different mixtures of unique and non-unique objects. The results demonstrate that the system efficiently performs those exploratory motor actions that yield a maximum amount of information about the current environment. Localisation is typically achieved within a few steps. Furthermore, the computational complexity of the underlying computations is limited, and the system is robust with respect to minor variations in the spatial environments.

  7. Depth-based representations: Which coding format for 3D video broadcast applications?

    NASA Astrophysics Data System (ADS)

    Kerbiriou, Paul; Boisson, Guillaume; Sidibé, Korian; Huynh-Thu, Quan

    2011-03-01

    3D Video (3DV) delivery standardization is currently ongoing in MPEG, and it is now time to choose the 3DV data representation format. At stake is the final quality for end users, i.e., the visual quality of synthesized views. We focus on two major rival depth-based formats, namely Multiview Video plus Depth (MVD) and Layered Depth Video (LDV). MVD can be considered the basic depth-based 3DV format, generated by disparity estimation from multiview sequences. LDV is more sophisticated, compacting multiview data into color and depth occlusion layers. We compare final view quality using MVD2 and LDV (both containing two color channels plus two depth components) coded with MVC at various compression ratios. Depending on the format, the appropriate synthesis process is performed to generate the final stereoscopic pairs. Comparisons are provided in terms of SSIM and PSNR with respect to the original views and to synthesized references (obtained without compression). Ultimately, LDV significantly outperforms MVD when state-of-the-art reference synthesis algorithms are used: managing occlusions before encoding is advantageous compared with handling redundant signals at the decoder side. Besides, we observe that depth quantization does not noticeably degrade final view quality until a significant degradation level is reached. Improvements in disparity estimation and view synthesis algorithms are therefore still expected during the remaining standardization steps.

  8. Computer-based learning: interleaving whole and sectional representation of neuroanatomy.

    PubMed

    Pani, John R; Chariker, Julia H; Naaz, Farah

    2013-01-01

    The large volume of material to be learned in biomedical disciplines requires optimizing the efficiency of instruction. In prior work with computer-based instruction of neuroanatomy, it was relatively efficient for learners to master whole anatomy and then transfer to learning sectional anatomy. It may, however, be more efficient to continuously integrate learning of whole and sectional anatomy. A study of computer-based learning of neuroanatomy was conducted to compare a basic transfer paradigm for learning whole and sectional neuroanatomy with a method in which the two forms of representation were interleaved (alternated). For all experimental groups, interactive computer programs supported an approach to instruction called adaptive exploration. Each learning trial consisted of time-limited exploration of neuroanatomy, self-timed testing, and graphical feedback. The primary result of this study was that interleaved learning of whole and sectional neuroanatomy was more efficient than the basic transfer method, without cost to long-term retention or generalization of knowledge to recognizing new images (Visible Human and MRI).

  9. Remote Sensing Image Fusion Method Based on Nonsubsampled Shearlet Transform and Sparse Representation

    NASA Astrophysics Data System (ADS)

    Moonon, Altan-Ulzii; Hu, Jianwen; Li, Shutao

    2015-12-01

    Remote sensing image fusion is an important preprocessing technique in remote sensing image processing. In this paper, a remote sensing image fusion method based on the nonsubsampled shearlet transform (NSST) with sparse representation (SR) is proposed. First, the low-resolution multispectral (MS) image is upsampled and its color space is transformed from Red-Green-Blue (RGB) to Intensity-Hue-Saturation (IHS). Then, the high-resolution panchromatic (PAN) image and the intensity component of the MS image are decomposed by NSST into high- and low-frequency coefficients. The low-frequency coefficients of the PAN image and the intensity component are fused by SR with a learned dictionary; the high-frequency coefficients are fused by a local-energy-based fusion rule. Finally, the fused result is obtained by performing the inverse NSST and inverse IHS transforms. Experimental results on IKONOS and QuickBird satellite images demonstrate that the proposed method provides better spectral quality and superior spatial information in the fused image than other remote sensing image fusion methods, in both visual effect and objective evaluation.
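    The IHS substitution idea that this method builds on is straightforward. A minimal sketch, using the mean of the RGB bands as the intensity and skipping the NSST/SR subband fusion the paper actually performs, looks like:

```python
import numpy as np

def ihs_fuse(ms_rgb, pan):
    """Simple IHS-style pan-sharpening: inject the difference between the
    PAN band and the MS intensity into every band.
    ms_rgb: (3, H, W) upsampled multispectral image; pan: (H, W)."""
    intensity = ms_rgb.mean(axis=0)
    return ms_rgb + (pan - intensity)     # add the spatial detail to all bands

rng = np.random.default_rng(0)
ms = rng.random((3, 8, 8))
pan = rng.random((8, 8))
fused = ihs_fuse(ms, pan)
print(np.allclose(fused.mean(axis=0), pan))
```

    After fusion the intensity of the result equals the PAN band exactly, which is why IHS substitution transfers spatial detail but can distort spectral content, the problem the NSST/SR stages are there to mitigate.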

  10. Network-based representation of energy transfer in unsteady separated flow

    NASA Astrophysics Data System (ADS)

    Nair, Aditya; Taira, Kunihiko

    2015-11-01

    We construct a network-based representation of energy pathways in unsteady separated flows using a POD-Galerkin projection model. In this formulation, we regard the POD modes as the network nodes and the energy transfer between the modes as the network edges. Based on the energy transfer analysis performed by Noack et al. (2008), edge weights are characterized on the interaction graph. As an example, we examine the energy transfer within the two-dimensional incompressible flow over a circular cylinder. In particular, we analyze the energy pathways involved in the flow transition from the unstable symmetric steady state to the periodic shedding cycle. The growth of perturbation energy over the network is examined to highlight key features of the flow physics and to determine how the energy transfer can be influenced. Furthermore, we implement closed-loop flow control on the POD-Galerkin model to alter the energy interaction path and modify the global behavior of the wake dynamics. The insights gained will be used to perform further network analysis on fluid flows with added complexity. Work supported by US Army Research Office (W911NF-14-1-0386) and US Air Force Office of Scientific Research (YIP: FA9550-13-1-0183).

  11. Neural Network Based Representation of UH-60A Pilot and Hub Accelerations

    NASA Technical Reports Server (NTRS)

    Kottapalli, Sesi

    2000-01-01

    Neural network relationships between the full-scale, experimental hub accelerations and the corresponding pilot floor vertical vibration are studied. The present physics-based, quantitative effort represents an initial systematic study of the UH-60A Black Hawk hub accelerations. The NASA/Army UH-60A Airloads Program flight test database was used. A 'maneuver-effect-factor (MEF)', derived from the roll angle and the pitch rate, was used. Three neural-network-based representation cases were considered: the pilot floor vertical vibration alone in the first case, the hub accelerations alone in the second case, and both together in the third case. Neither the advance ratio nor the gross weight alone could be used to predict the pilot floor vertical vibration; however, the advance ratio and the gross weight together could predict it over the entire flight envelope. The hub acceleration data were modeled and found to be of very acceptable quality. The hub accelerations alone could not be used to predict the pilot floor vertical vibration; thus, the hub accelerations alone do not drive it. However, the hub accelerations, along with either the advance ratio or the gross weight or both, could be used to satisfactorily predict the pilot floor vertical vibration. The hub accelerations are clearly a factor in determining the pilot floor vertical vibration.

  12. Connectivity strength-weighted sparse group representation-based brain network construction for MCI classification.

    PubMed

    Yu, Renping; Zhang, Han; An, Le; Chen, Xiaobo; Wei, Zhihui; Shen, Dinggang

    2017-02-02

    Brain functional network analysis has shown great potential in understanding brain functions and in identifying biomarkers for brain diseases, such as Alzheimer's disease (AD) and its early stage, mild cognitive impairment (MCI). In these applications, accurate construction of a biologically meaningful brain network is critical. Sparse learning has been widely used for brain network construction; however, its l1-norm penalty penalizes each edge of a brain network equally, without considering the original connectivity strength, one of the most important inherent link-wise characteristics. Moreover, based on the similarity of link-wise connectivity, a brain network shows prominent group structure (i.e., sets of edges sharing similar attributes). In this article, we propose a novel brain functional network modeling framework with a "connectivity strength-weighted sparse group constraint." In particular, the network modeling can be optimized by considering both raw connectivity strength and its group structure, without losing the merit of sparsity. Our proposed method is applied to MCI classification, a challenging task for early AD diagnosis. Experimental results based on resting-state functional MRI from 50 MCI patients and 49 healthy controls show that our proposed method is more effective (i.e., achieving a significantly higher classification accuracy, 84.8%) than competing methods (e.g., sparse representation, accuracy = 65.6%). Post hoc inspection of the informative features further shows more biologically meaningful brain functional connectivities obtained by our proposed method. Hum Brain Mapp, 2017. © 2017 Wiley Periodicals, Inc.
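    The "connectivity strength-weighted sparse group constraint" can be illustrated with the proximal step of a weighted group penalty. The weighting convention below (stronger raw connectivity implies weaker shrinkage) is an illustrative reading of the idea, not the authors' exact formulation:

```python
import numpy as np

def prox_weighted_group(x, groups, strengths, lam):
    """Proximal step for a strength-weighted group penalty: group-wise
    soft-thresholding, with weaker shrinkage on strongly connected groups."""
    out = x.copy()
    for g, s in zip(groups, strengths):
        norm = np.linalg.norm(x[g])
        # larger raw strength s => smaller effective threshold lam/s
        scale = max(0.0, 1.0 - lam / (s * norm)) if norm > 0 else 0.0
        out[g] = scale * x[g]
    return out

x = np.array([3.0, 4.0, 0.1, -0.1])          # candidate edge weights
groups = [np.array([0, 1]), np.array([2, 3])]  # edges sharing attributes
strengths = [2.0, 0.5]                        # raw connectivity per group
print(prox_weighted_group(x, groups, strengths, lam=1.0))
```

    The strongly connected group survives with mild shrinkage, while the weak group is zeroed out entirely, which is the sparsity-with-strength-awareness trade-off the abstract describes.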

  13. Attachment and God Representations among Lay Catholics, Priests, and Religious: A Matched Comparison Study Based on the Adult Attachment Interview

    ERIC Educational Resources Information Center

    Cassibba, Rosalinda; Granqvist, Pehr; Costantini, Alessandro; Gatto, Sergio

    2008-01-01

    Based on the idea that believers' perceived relationships with God develop from their attachment-related experiences with primary caregivers, the authors explored the quality of such experiences and their representations among individuals who differed in likelihood of experiencing a principal attachment to God. Using the Adult Attachment Interview…

  14. Why Representations?

    ERIC Educational Resources Information Center

    Schultz, James E.; Waters, Michael S.

    2000-01-01

    Discusses representations in the context of solving a system of linear equations. Views representations (concrete, tables, graphs, algebraic, matrices) from perspectives of understanding, technology, generalization, exact versus approximate solution, and learning style. (KHR)

  15. Secure Base Representations in Middle Childhood across Two Western Cultures: Associations with Parental Attachment Representations and Maternal Reports of Behavior Problems

    ERIC Educational Resources Information Center

    Waters, Theodore E. A.; Bosmans, Guy; Vandevivere, Eva; Dujardin, Adinda; Waters, Harriet S.

    2015-01-01

    Recent work examining the content and organization of attachment representations suggests that 1 way in which we represent the attachment relationship is in the form of a cognitive script. This work has largely focused on early childhood or adolescence/adulthood, leaving a large gap in our understanding of script-like attachment representations in…

  16. Applying representational state transfer (REST) architecture to archetype-based electronic health record systems

    PubMed Central

    2013-01-01

    Background The openEHR project and the closely related ISO 13606 standard have defined structures supporting the content of Electronic Health Records (EHRs). However, there is not yet any finalized openEHR specification of a service interface to aid application developers in creating, accessing, and storing the EHR content. The aim of this paper is to explore how the Representational State Transfer (REST) architectural style can be used as a basis for a platform-independent, HTTP-based openEHR service interface. Associated benefits and tradeoffs of such a design are also explored. Results The main contribution is the formalization of the openEHR storage, retrieval, and version-handling semantics and related services into an implementable HTTP-based service interface. The modular design makes it possible to prototype, test, replicate, distribute, cache, and load-balance the system using ordinary web technology. Other contributions are approaches to query and retrieval of the EHR content that take caching, logging, and distribution into account. Triggering on EHR change events is also explored. A final contribution is an open source openEHR implementation using the above-mentioned approaches to create LiU EEE, an educational EHR environment intended to help newcomers and developers experiment with and learn about the archetype-based EHR approach and enable rapid prototyping. Conclusions Using REST addressed many architectural concerns in a successful way, but an additional messaging component was needed to address some architectural aspects. Many of our approaches are likely of value to other archetype-based EHR implementations and may contribute to associated service model specifications. PMID:23656624
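    The version-handling semantics that REST makes cacheable can be sketched without any web framework. The paths in the comments are hypothetical illustrations, not the openEHR or LiU EEE API:

```python
# Minimal sketch of REST-style versioned EHR storage semantics.
# Resource paths in comments are illustrative, not a real openEHR interface.

class VersionedStore:
    def __init__(self):
        self.versions = {}   # ehr_id -> list of composition versions

    def post(self, ehr_id, composition):
        """POST /ehr/{id}/compositions -> appends version 1, 2, ..."""
        self.versions.setdefault(ehr_id, []).append(composition)
        return len(self.versions[ehr_id])        # new version number

    def get(self, ehr_id, version=None):
        """GET /ehr/{id}/compositions[?version=n] -> that version
        (latest by default). Old versions are immutable, so responses
        for explicit versions are safely cacheable and replicable."""
        vs = self.versions[ehr_id]
        return vs[(version or len(vs)) - 1]

store = VersionedStore()
store.post("ehr42", {"bp": "120/80"})
store.post("ehr42", {"bp": "118/76"})    # an update creates a new version
print(store.get("ehr42"), store.get("ehr42", version=1))
```

    Because an update never overwrites an old version, GETs for explicit versions are idempotent and cache-friendly, which is the property the paper exploits for distribution and load balancing.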

  17. Jigsaw Cooperative Learning: Acid-Base Theories

    ERIC Educational Resources Information Center

    Tarhan, Leman; Sesen, Burcin Acar

    2012-01-01

    This study focused on investigating the effectiveness of jigsaw cooperative learning instruction on first-year undergraduates' understanding of acid-base theories. Undergraduates' opinions about jigsaw cooperative learning instruction were also investigated. The participants of this study were 38 first-year undergraduates in chemistry education…

  18. Separation of Acids, Bases, and Neutral Compounds

    NASA Astrophysics Data System (ADS)

    Fujita, Megumi; Mah, Helen M.; Sgarbi, Paulo W. M.; Lall, Manjinder S.; Ly, Tai Wei; Browne, Lois M.

    2003-01-01

    Separation of Acids, Bases, and Neutral Compounds requires the following software, which is available for free download from the Internet: Netscape Navigator, version 4.75 or higher, or Microsoft Internet Explorer, version 5.0 or higher; Chime plug-in, version compatible with your OS and browser (available from MDL); and Flash player, version 5 or higher (available from Macromedia).

  19. Understanding hERG inhibition with QSAR models based on a one-dimensional molecular representation

    NASA Astrophysics Data System (ADS)

    Diller, David J.; Hobbs, Doug W.

    2007-07-01

    Blockage of the potassium channel encoded by the human ether-a-go-go related gene (hERG) is well understood to be the root cause of the cardio-toxicity of numerous approved and investigational drugs. As such, a cascade of in vitro and in vivo assays has been developed to filter out compounds with hERG inhibitory activity. Quantitative structure-activity relationship (QSAR) models are used at the very earliest part of this cascade to eliminate compounds likely to have this undesirable activity prior to synthesis. Here, a new QSAR technique based on a one-dimensional molecular representation is described in the context of developing a model to predict hERG inhibition. The model is shown to perform close to the limits imposed by the quality of the data used for model building. To make optimal use of the available data, a general, robust mathematical scheme was developed and is described that simultaneously incorporates quantitative data, such as IC50 = 50 nM, and qualitative data, such as "inactive" or IC50 > 30 μM, into QSAR models without discarding any experimental information.
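    One common way to fold qualitative records such as "IC50 > 30 μM" into a regression alongside exact values is a censored, one-sided loss that only penalizes predictions on the wrong side of the reported bound. The sketch below is a generic illustration of that idea on synthetic data, not the paper's scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

def mixed_loss_grad(w, X, y, censored):
    """Gradient of a mixed loss: squared error on exact measurements,
    one-sided error on 'greater-than' records, which incur no penalty
    when the prediction is already above the reported bound."""
    err = X @ w - y
    err[censored & (err > 0)] = 0.0
    return X.T @ err / len(y)

# toy data: 4 descriptors, a true linear relation, some censored labels
X = rng.normal(size=(80, 4))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ w_true
censored = rng.random(80) < 0.3
y[censored] -= 1.0            # these rows are recorded only as "activity > y"

w = np.zeros(4)
for _ in range(2000):         # plain gradient descent
    w -= 0.5 * mixed_loss_grad(w, X, y, censored)
print(np.round(w, 2))
```

    The exact records pin down the coefficients while the censored records contribute constraints rather than targets, so no experimental information is discarded.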

  20. Flexible natural language parser based on a two-level representation of syntax

    SciTech Connect

    Lesmo, L.; Torasso, P.

    1983-01-01

    In this paper the authors present a parser that makes the interconnections between syntax and semantics explicit, analyzes sentences in a quasi-deterministic fashion and, in many cases, identifies the roles of the various constituents even if the sentence is ill-formed. The main feature of the approach on which the parser is based is a two-level representation of syntactic knowledge: a first set of rules emits hypotheses about the constituents of the sentence and their functional roles, and another set of rules verifies whether a hypothesis satisfies the constraints on the well-formedness of sentences. However, the application of the second set of rules is delayed until the semantic knowledge confirms the acceptability of the hypothesis. If the semantics rejects it, a new hypothesis is obtained by applying a simple and relatively inexpensive natural modification; a set of these modifications is predefined, and only when none of them is applicable is a real backup performed: in most cases this corresponds to a sentence on which people would normally garden-path. 19 references.
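    The two-level idea, one rule set emitting role hypotheses and another verifying well-formedness, with verification deferred, can be caricatured in a few lines; the lexicon and the single constraint here are toy stand-ins, not the paper's grammar:

```python
# Two-level parsing sketch with a hypothetical toy lexicon.

LEXICON = {"the": "Det", "dog": "N", "barks": "V"}

def hypothesize(tokens):
    """Level 1: emit a functional-role hypothesis for each token."""
    return [(tok, LEXICON.get(tok, "?")) for tok in tokens]

def verify(hypothesis):
    """Level 2: check a well-formedness constraint on the whole
    hypothesis -- here, a toy rule that a sentence has exactly one verb.
    In the paper this check is delayed until semantics accepts the
    hypothesis, so cheap modifications can be tried before backtracking."""
    return sum(1 for _, role in hypothesis if role == "V") == 1

h = hypothesize("the dog barks".split())
print(h, verify(h))
```

    Keeping hypothesis emission and constraint checking separate is what lets the real parser recover roles from ill-formed input: a failed constraint triggers a cheap modification of the hypothesis rather than an immediate reparse.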

  1. Space-based radar representation in the advanced warfighting simulation (AWARS)

    NASA Astrophysics Data System (ADS)

    Phend, Andrew E.; Buckley, Kathryn; Elliott, Steven R.; Stanley, Page B.; Shea, Peter M.; Rutland, Jimmie A.

    2004-09-01

    Space and orbiting systems impact multiple battlefield operating systems (BOS). Space support to current operations is a perfect example of how the United States fights. Satellite-aided munitions, communications, navigation and weather systems combine to achieve military objectives in a relatively short amount of time. Through representation of space capabilities within models and simulations, the military will have the ability to train and educate officers and soldiers to fight from the high ground of space, or to conduct analysis and determine the requirements or utility of transformed forces empowered with advanced space-based capabilities. The Army Vice Chief of Staff acknowledged deficiencies in space modeling and simulation during the September 2001 Space Force Management Analysis Review (FORMAL) and directed that a multi-disciplinary team be established to recommend a service-wide roadmap to address the shortcomings. A Focus Area Collaborative Team (FACT), led by the U.S. Army Space & Missile Defense Command with participation from across the Army, confirmed the weaknesses in scope, consistency, correctness, completeness, availability, and usability of space modeling and simulation (M&S) for Army applications. The FACT addressed the need to develop a roadmap to remedy space M&S deficiencies using a highly parallelized process and schedule designed to support a recommendation during the Sep 02 meeting of the Army Model and Simulation Executive Council (AMSEC).

  2. 3D Face Recognition Based on Multiple Keypoint Descriptors and Sparse Representation

    PubMed Central

    Zhang, Lin; Ding, Zhixuan; Li, Hongyu; Shen, Ying; Lu, Jianwei

    2014-01-01

    Recent years have witnessed a growing interest in developing methods for 3D face recognition. However, 3D scans often suffer from the problems of missing parts, large facial expressions, and occlusions. To be useful in real-world applications, a 3D face recognition approach should be able to handle these challenges. In this paper, we propose a novel general approach to deal with the 3D face recognition problem by making use of multiple keypoint descriptors (MKD) and the sparse representation-based classification (SRC). We call the proposed method 3DMKDSRC for short. Specifically, with 3DMKDSRC, each 3D face scan is represented as a set of descriptor vectors extracted from keypoints by meshSIFT. Descriptor vectors of gallery samples form the gallery dictionary. Given a probe 3D face scan, its descriptors are extracted at first and then its identity can be determined by using a multitask SRC. The proposed 3DMKDSRC approach does not require the pre-alignment between two face scans and is quite robust to the problems of missing data, occlusions and expressions. Its superiority over the other leading 3D face recognition schemes has been corroborated by extensive experiments conducted on three benchmark databases, Bosphorus, GavabDB, and FRGC2.0. The Matlab source code for 3DMKDSRC and the related evaluation results are publicly available at http://sse.tongji.edu.cn/linzhang/3dmkdsrcface/3dmkdsrc.htm. PMID:24940876

  3. Dim moving target tracking algorithm based on particle discriminative sparse representation

    NASA Astrophysics Data System (ADS)

    Li, Zhengzhou; Li, Jianing; Ge, Fengzeng; Shao, Wanxing; Liu, Bing; Jin, Gang

    2016-03-01

    A small dim moving target is usually submerged in strong noise, and its motion observability is degraded by numerous false alarms at low signal-to-noise ratio (SNR). A target tracking algorithm based on particle filtering and discriminative sparse representation is proposed in this paper to cope with the uncertainty of dim moving target tracking. The weight of each particle is the crucial factor in ensuring the accuracy of dim target tracking for a particle filter (PF), which can achieve excellent performance even under non-linear and non-Gaussian motion. In a discriminative over-complete dictionary constructed from the image sequence, a target dictionary describes the target signal and a background dictionary embeds the background clutter. The difference between target particles and background particles is thereby greatly enhanced, and the weight of each particle is measured by the residual after reconstruction using a prescribed number of target atoms and their corresponding coefficients. The movement state of the dim moving target is then estimated and tracked from these weighted particles. Meanwhile, the subspace of the over-complete dictionary is updated online by a stochastic estimation algorithm. Experiments were conducted, and the results show that the proposed algorithm improves moving target tracking by enhancing the consistency between the posterior probability distribution and the moving target state.
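    The discriminative-weighting step can be sketched on its own: weight each particle by how much better the target dictionary reconstructs its image patch than the background dictionary does. The orthogonal toy dictionaries below are illustrative, not the paper's learned ones:

```python
import numpy as np

rng = np.random.default_rng(0)

def residual(D, y, k=2):
    """Reconstruction residual of y using the k best-correlated atoms of D."""
    idx = np.argsort(-np.abs(D.T @ y))[:k]
    coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
    return np.linalg.norm(y - D[:, idx] @ coef)

def particle_weights(patches, D_tgt, D_bg):
    """Weight each particle's patch by how much better the target
    dictionary explains it than the background dictionary."""
    w = np.array([max(residual(D_bg, p) - residual(D_tgt, p), 0.0) + 1e-12
                  for p in patches])
    return w / w.sum()

# toy dictionaries: target and background atoms span orthogonal subspaces
basis = np.linalg.qr(rng.normal(size=(16, 16)))[0]
D_tgt, D_bg = basis[:, :4], basis[:, 4:8]
target_patch = D_tgt @ np.array([1.0, 0.5, 0.0, 0.0])
clutter_patch = D_bg @ np.array([0.8, 0.0, 0.3, 0.0])
w = particle_weights([target_patch, clutter_patch], D_tgt, D_bg)
print(w)
```

    The particle sitting on the target receives essentially all of the normalized weight, which is the enhanced target/background separation the abstract describes.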

  4. Pulmonary emphysema classification based on an improved texton learning model by sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Min; Zhou, Xiangrong; Goshima, Satoshi; Chen, Huayue; Muramatsu, Chisako; Hara, Takeshi; Yokoyama, Ryujiro; Kanematsu, Masayuki; Fujita, Hiroshi

    2013-03-01

    In this paper, we present a texture classification method, based on textons learned via sparse representation (SR) with new feature histogram maps, for the classification of emphysema. First, an overcomplete dictionary of textons is learned via K-SVD from image patches of every class in the training dataset. In this stage, a high-pass filter is introduced to exclude patches in smooth areas and speed up the dictionary learning process. Second, 3D joint-SR coefficients and intensity histograms of the test images are used to characterize regions of interest (ROIs), instead of the conventional feature histograms constructed from SR coefficients of the test images over the dictionary. Classification is then performed using a classifier with a histogram dissimilarity measure as its distance. Four hundred seventy annotated ROIs extracted from 14 test subjects, including 6 paraseptal emphysema (PSE) subjects, 5 centrilobular emphysema (CLE) subjects and 3 panlobular emphysema (PLE) subjects, are used to evaluate the effectiveness and robustness of the proposed method. The proposed method is tested on 167 PSE, 240 CLE and 63 PLE ROIs consisting of mild, moderate and severe pulmonary emphysema. The accuracy of the proposed system is around 74%, 88% and 89% for PSE, CLE and PLE, respectively.

  5. Foundations for Reasoning in Cognition-Based Computational Representations of Human Decision Making

    SciTech Connect

    SENGLAUB, MICHAEL E.; HARRIS, DAVID L.; RAYBOURN, ELAINE M.

    2001-11-01

    In exploring the question of how humans reason in ambiguous situations or in the absence of complete information, we stumbled onto a body of knowledge that addresses issues beyond the original scope of our effort. We have begun to understand the importance that philosophy, in particular the work of C. S. Peirce, plays in developing models of human cognition and of information theory in general. We have a foundation that can serve as a basis for further studies in cognition and decision making. Peircean philosophy provides a foundation for understanding human reasoning and capturing behavioral characteristics of decision makers due to cultural, physiological, and psychological effects. The present paper describes this philosophical approach to understanding the underpinnings of human reasoning. We present the work of C. S. Peirce, and define sets of fundamental reasoning behavior that would be captured in the mathematical constructs of these newer technologies and would be able to interact in an agent type framework. Further, we propose the adoption of a hybrid reasoning model based on his work for future computational representations or emulations of human cognition.

  6. Whose place is it anyway? Representational politics in a place-based health initiative.

    PubMed

    Rushton, Carole

    2014-03-01

    The association between place and poor health, such as chronic disease, is well documented and in recent years has given rise to public health strategies such as place-based initiatives (PBIs). This article reports on the emergence of one such initiative in Australia, in regions identified as culturally diverse and socially disadvantaged. The study draws on the intellectual resources provided by governmentality and actor-network theory to provide insights into the reasons why community actors were excluded from a new governance body established to represent their interests. Risk-thinking and representational politics determined who represented whom in the PBI partnership. Paradoxically, actors representing 'community', identified as being 'at risk', were excluded from the partnership during its translation because they were also identified as being 'a risk'. As a consequence, contrary to federal government health and social policy in Australia, it was state government interests rather than the interests of community actors that influenced decisions made in relation to local health planning and the allocation of resources.

  7. 3D face recognition based on multiple keypoint descriptors and sparse representation.

    PubMed

    Zhang, Lin; Ding, Zhixuan; Li, Hongyu; Shen, Ying; Lu, Jianwei

    2014-01-01

    Recent years have witnessed a growing interest in developing methods for 3D face recognition. However, 3D scans often suffer from the problems of missing parts, large facial expressions, and occlusions. To be useful in real-world applications, a 3D face recognition approach should be able to handle these challenges. In this paper, we propose a novel general approach to deal with the 3D face recognition problem by making use of multiple keypoint descriptors (MKD) and the sparse representation-based classification (SRC). We call the proposed method 3DMKDSRC for short. Specifically, with 3DMKDSRC, each 3D face scan is represented as a set of descriptor vectors extracted from keypoints by meshSIFT. Descriptor vectors of gallery samples form the gallery dictionary. Given a probe 3D face scan, its descriptors are extracted at first and then its identity can be determined by using a multitask SRC. The proposed 3DMKDSRC approach does not require the pre-alignment between two face scans and is quite robust to the problems of missing data, occlusions and expressions. Its superiority over the other leading 3D face recognition schemes has been corroborated by extensive experiments conducted on three benchmark databases, Bosphorus, GavabDB, and FRGC2.0. The Matlab source code for 3DMKDSRC and the related evaluation results are publicly available at http://sse.tongji.edu.cn/linzhang/3dmkdsrcface/3dmkdsrc.htm.

  8. Linear Titration Curves of Acids and Bases.

    PubMed

    Joseph, N R

    1959-05-29

    The Henderson-Hasselbalch equation, by a simple transformation, becomes pH - pK = pA - pB, where pA and pB are the negative logarithms of the acid and base concentrations. Sigmoid titration curves then reduce to straight lines; titration curves of polyelectrolytes, to families of straight lines. The method is applied to the titration of the dipeptide glycyl aminotricarballylic acid, with four titratable groups. Results are expressed as Cartesian and d'Ocagne nomograms. The latter is of a general form applicable to polyelectrolytes of any degree of complexity.
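
    The transformation can be checked numerically: points generated from the Henderson-Hasselbalch equation fall exactly on the line pH - pK = pA - pB. A minimal sketch (acetic acid's pK is used purely as an example value):

```python
import numpy as np

pK = 4.76                           # acetic acid, chosen only for illustration

frac = np.linspace(0.05, 0.95, 10)  # fraction titrated along the sigmoid curve
acid = 1.0 - frac                   # relative concentration of the acid form
base = frac                         # relative concentration of the base form

pA = -np.log10(acid)                # negative log of acid concentration
pB = -np.log10(base)                # negative log of base concentration
pH = pK + np.log10(base / acid)     # Henderson-Hasselbalch equation

# The sigmoid titration curve becomes the straight line pH - pK = pA - pB
print(np.allclose(pH - pK, pA - pB))  # True
```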

  9. A no-reference perceptual blurriness metric based fast super-resolution of still pictures using sparse representation

    NASA Astrophysics Data System (ADS)

    Choi, Jae-Seok; Bae, Sung-Ho; Kim, Munchurl

    2015-03-01

    In recent years, perceptually driven super-resolution (SR) methods have been proposed to lower computational complexity. Furthermore, sparse-representation-based super-resolution is known to produce competitive high-resolution images at lower computational cost than other SR methods. Nevertheless, super-resolution remains difficult to implement with substantially low processing power for real-time applications. To speed up SR processing, much effort has gone into efficient methods that selectively apply elaborate computation to perceptually sensitive image regions based on a metric such as just noticeable distortion (JND). Inspired by these works, we propose a novel fast super-resolution method with sparse representation, which incorporates a no-reference just noticeable blur (JNB) metric. That is, the proposed method efficiently generates super-resolution images by selectively applying a sparse representation method to perceptually sensitive image areas detected by the JNB metric. Experimental results show that our JNB-based fast super-resolution method is about 4 times faster than a non-perceptual sparse-representation-based SR method for 256×256 test LR images. Compared to a JND-based SR method, the proposed fast JNB-based SR method is about 3 times faster, with approximately 0.1 dB higher PSNR and a slightly higher SSIM value on average. This indicates that our proposed perceptual JNB-based SR method generates high-quality SR images at much lower computational cost, opening a new possibility for real-time hardware implementations.
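
    The selective strategy can be illustrated independently of the specific JNB metric: score each block with a cheap perceptual-sensitivity proxy and reserve the expensive sparse-coding path for blocks that exceed a threshold. Everything below (the gradient-energy proxy, the block size, the threshold) is an assumed stand-in for the paper's metric:

```python
import numpy as np

def local_sharpness(block):
    """Mean gradient magnitude: a crude proxy for perceptual sensitivity
    (a stand-in for the paper's just-noticeable-blur metric)."""
    gy, gx = np.gradient(block.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def selective_upscale(img, block=8, thresh=5.0):
    """2x upscale: cheap nearest-neighbour everywhere, while blocks whose
    sharpness exceeds `thresh` are flagged for the expensive sparse-SR path."""
    out = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)  # cheap baseline
    flagged = []
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            if local_sharpness(img[i:i + block, j:j + block]) > thresh:
                flagged.append((i, j))  # would be refined by sparse coding
    return out, flagged

img = np.zeros((16, 16))
img[:, 4:] = 100.0                 # one sharp vertical edge
up, flagged = selective_upscale(img)
print(up.shape, flagged)           # only the edge-containing blocks are flagged
```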

  10. A ranklet-based image representation for mass classification in digital mammograms

    SciTech Connect

    Masotti, Matteo

    2006-10-15

    Regions of interest (ROIs) found on breast radiographic images are classified as either tumoral mass or normal tissue by means of a support vector machine classifier. Classification features are the coefficients resulting from the specific image representation used to encode each ROI. Pixel and wavelet image representations have already been discussed in one of our previous works. To investigate the possibility of improving classification performance, a novel nonparametric, orientation-selective, and multiresolution image representation is developed and evaluated, namely a ranklet image representation. A dataset consisting of 1000 ROIs representing biopsy-proven tumoral masses (either benign or malignant) and 5000 ROIs representing normal breast tissue is used. ROIs are extracted from the digital database for screening mammography collected by the University of South Florida. Classification performance is evaluated using the area A{sub z} under the receiver operating characteristic curve. By achieving A{sub z} values of 0.978{+-}0.003 and 90% sensitivity with a false positive fraction of 4.5%, the experiments demonstrate classification results higher than those reached by the previous image representations. In particular, the improvement in the A{sub z} value over that achieved by the wavelet image representation is statistically significant (two-tailed p value <0.0001). Moreover, owing to the ranklet representation's tolerance to variations in the ROIs' gray-level intensity histogram, the approach proves robust even when tested on radiographic images whose gray-level intensity histograms differ markedly from those used for training.
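
    A ranklet's core is a Wilcoxon rank-sum comparison between two halves of a pixel window, which is what makes the representation nonparametric and tolerant of monotonic gray-level changes. Below is a minimal sketch of a horizontal ranklet on a square patch; the normalisation follows the usual ranklet definition, but this is an illustrative reconstruction, not the paper's code:

```python
import numpy as np

def ranklet_h(patch):
    """Horizontal ranklet of a square patch: a normalised Wilcoxon rank-sum
    statistic comparing the right half against the left half (the
    nonparametric analogue of a horizontal Haar wavelet coefficient)."""
    n = patch.size
    half = n // 2
    ranks = np.argsort(np.argsort(patch.ravel())) + 1   # ranks 1..n
    right = ranks.reshape(patch.shape)[:, patch.shape[1] // 2:]
    ws = right.sum() - half * (half + 1) / 2            # Mann-Whitney statistic
    return ws / (half * half / 2) - 1                   # scaled into [-1, +1]

patch = np.array([[1, 2, 10, 11],
                  [0, 3, 12, 13],
                  [2, 1, 14, 15],
                  [3, 0, 16, 17]], dtype=float)
print(ranklet_h(patch))  # +1.0: the right half is uniformly brighter
```

    Because only ranks enter the statistic, any monotonic change of the gray levels (for instance, squaring the intensities) leaves the ranklet value unchanged, which is the tolerance property the abstract refers to.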

  11. The first proton sponge-based amino acids: synthesis, acid-base properties and some reactivity.

    PubMed

    Ozeryanskii, Valery A; Gorbacheva, Anastasia Yu; Pozharskii, Alexander F; Vlasenko, Marina P; Tereznikov, Alexander Yu; Chernov'yants, Margarita S

    2015-08-21

    The first hybrid base constructed from 1,8-bis(dimethylamino)naphthalene (proton sponge or DMAN) and glycine, N-methyl-N-(8-dimethylamino-1-naphthyl)aminoacetic acid, was synthesised in high yield, and its hydrobromide was structurally characterised and used to determine the acid-base properties via potentiometric titration. It was found that the basic strength of the DMAN-glycine base (pKa = 11.57, H2O) is on the level of amidine amino acids like arginine and creatine, and that its structure, zwitterionic vs. neutral, shows a strong preference for the zwitterionic form according to spectroscopic (IR, NMR, mass) and theoretical (DFT) approaches. Unlike glycine, the DMAN-glycine zwitterion is N-chiral and is hydrolytically cleaved with the loss of glycolic acid on heating in DMSO. This reaction, together with the mild decarboxylative conversion of proton sponge-based amino acids into 2,3-dihydroperimidinium salts under air-oxygen, was monitored with the help of the DMAN-alanine amino acid. The newly devised amino acids are unique in that they combine fluorescent, strongly basic and redox-active properties.

  12. Predicting siRNA efficacy based on multiple selective siRNA representations and their combination at score level

    NASA Astrophysics Data System (ADS)

    He, Fei; Han, Ye; Gong, Jianting; Song, Jiazhi; Wang, Han; Li, Yanwen

    2017-03-01

    Small interfering RNAs (siRNAs) can induce targeted gene knockdown, and the gene-silencing effectiveness relies on the efficacy of the siRNA. The aim of this paper is therefore to construct an effective siRNA efficacy prediction method. In our work, we describe siRNA from both quantitative and qualitative aspects. For the quantitative analyses, we form four groups of effective features, including nucleotide frequencies, the thermodynamic stability profile, the thermodynamics of the siRNA-mRNA interaction, and mRNA-related features, as a new mixed representation, in which, to the best of our knowledge, the thermodynamics of the siRNA-mRNA interaction is introduced to siRNA efficacy prediction for the first time. An F-score-based feature selection is then employed to investigate the contribution of each feature and remove weakly relevant features. Meanwhile, we encode the siRNA sequence and existing empirical design rules as a qualitative siRNA representation. These two kinds of siRNA representations are combined at score level to predict siRNA efficacy by Support Vector Regression (SVR). The experimental results indicate that our method selects features with powerful discriminative ability and makes the two kinds of siRNA representations work at full capacity. The prediction results also demonstrate that our method outperforms other popular siRNA efficacy prediction algorithms.
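
    The two quantitative ingredients named above, F-score feature ranking and score-level fusion, can be sketched compactly. The F-score below follows the common Chen-Lin definition for binary-labelled data, and the equal-weight fusion is an assumption; the paper's actual weighting and SVR models are not reproduced here:

```python
import numpy as np

def f_score(X, y):
    """Chen-Lin F-score per feature for binary labels y in {0, 1}:
    between-class separation divided by within-class scatter."""
    pos, neg = X[y == 1], X[y == 0]
    num = (pos.mean(0) - X.mean(0)) ** 2 + (neg.mean(0) - X.mean(0)) ** 2
    den = pos.var(0, ddof=1) + neg.var(0, ddof=1)
    return num / den

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 50)
X = rng.standard_normal((100, 4))
X[y == 1, 0] += 3.0                    # feature 0 is informative; rest are noise

scores = f_score(X, y)
keep = np.argsort(scores)[::-1][:2]    # retain the two strongest features
print(keep[0])                         # the informative feature ranks first

# Score-level combination of two predictors (dummy score vectors here);
# equal weights are an assumption, not the paper's tuned combination.
s_quant = np.array([0.8, 0.2, 0.6])    # e.g. scores from the quantitative model
s_rule = np.array([0.6, 0.4, 0.9])     # e.g. scores from the rule-based model
fused = 0.5 * s_quant + 0.5 * s_rule
print(fused)
```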

  13. Predicting siRNA efficacy based on multiple selective siRNA representations and their combination at score level

    PubMed Central

    He, Fei; Han, Ye; Gong, Jianting; Song, Jiazhi; Wang, Han; Li, Yanwen

    2017-01-01

    Small interfering RNAs (siRNAs) can induce targeted gene knockdown, and the gene-silencing effectiveness relies on the efficacy of the siRNA. The aim of this paper is therefore to construct an effective siRNA efficacy prediction method. In our work, we describe siRNA from both quantitative and qualitative aspects. For the quantitative analyses, we form four groups of effective features, including nucleotide frequencies, the thermodynamic stability profile, the thermodynamics of the siRNA-mRNA interaction, and mRNA-related features, as a new mixed representation, in which, to the best of our knowledge, the thermodynamics of the siRNA-mRNA interaction is introduced to siRNA efficacy prediction for the first time. An F-score-based feature selection is then employed to investigate the contribution of each feature and remove weakly relevant features. Meanwhile, we encode the siRNA sequence and existing empirical design rules as a qualitative siRNA representation. These two kinds of siRNA representations are combined at score level to predict siRNA efficacy by Support Vector Regression (SVR). The experimental results indicate that our method selects features with powerful discriminative ability and makes the two kinds of siRNA representations work at full capacity. The prediction results also demonstrate that our method outperforms other popular siRNA efficacy prediction algorithms. PMID:28317874

  14. Temporal Super Resolution Enhancement of Echocardiographic Images Based on Sparse Representation.

    PubMed

    Gifani, Parisa; Behnam, Hamid; Haddadi, Farzan; Sani, Zahra Alizadeh; Shojaeifard, Maryam

    2016-01-01

    A challenging issue for echocardiographic image interpretation is the accurate analysis of small transient motions of myocardium and valves during real-time visualization. A higher frame rate video may reduce this difficulty, and temporal super resolution (TSR) is useful for illustrating the fast-moving structures. In this paper, we introduce a novel framework that optimizes TSR enhancement of echocardiographic images by utilizing temporal information and sparse representation. The goal of this method is to increase the frame rate of echocardiographic videos, and therefore enable more accurate analyses of moving structures. For the proposed method, we first derived temporal information by extracting intensity variation time curves (IVTCs) assessed for each pixel. We then designed both low-resolution and high-resolution overcomplete dictionaries based on prior knowledge of the temporal signals and a set of prespecified known functions. The IVTCs can then be described as linear combinations of a few prototype atoms in the low-resolution dictionary. We used the Bayesian compressive sensing (BCS) sparse recovery algorithm to find the sparse coefficients of the signals. We extracted the sparse coefficients and the corresponding active atoms in the low-resolution dictionary to construct new sparse coefficients corresponding to the high-resolution dictionary. Using the estimated atoms and the high-resolution dictionary, a new IVTC with more samples was constructed. Finally, by placing the new IVTC signals in the original IVTC positions, we were able to reconstruct the original echocardiography video with more frames. The proposed method does not require training of low-resolution and high-resolution dictionaries, nor does it require motion estimation; it does not blur fast-moving objects, and does not have blocking artifacts.
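
    The coupled-dictionary idea at the heart of the method (sparse coefficients found in a low-resolution dictionary, reused with a high-resolution dictionary built from the same prototype functions) can be shown in a few lines. Plain least squares stands in for the Bayesian compressive sensing recovery, and sinusoidal atoms are an assumed choice of prototype functions:

```python
import numpy as np

# Coupled dictionaries from the same prototype functions: each atom is a
# sinusoid sampled at low (8-point) and high (16-point) temporal resolution.
t_lo = np.linspace(0, 1, 8)
t_hi = np.linspace(0, 1, 16)
freqs = np.arange(1, 4)
D_lo = np.stack([np.sin(2 * np.pi * f * t_lo) for f in freqs], axis=1)
D_hi = np.stack([np.sin(2 * np.pi * f * t_hi) for f in freqs], axis=1)

# A low-rate intensity-variation time curve built from two active atoms
coef_true = np.array([1.0, 0.0, -0.5])
y_lo = D_lo @ coef_true

# Recover the coefficients in the low-resolution dictionary
# (least squares here; the paper uses Bayesian compressive sensing)
coef, *_ = np.linalg.lstsq(D_lo, y_lo, rcond=None)

# Reusing the same coefficients with the HR dictionary doubles the samples
y_hi = D_hi @ coef
print(y_hi.shape)  # (16,) — the reconstructed curve has twice the samples
```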

  15. The Construction of Semantic Memory: Grammar-Based Representations Learned from Relational Episodic Information

    PubMed Central

    Battaglia, Francesco P.; Pennartz, Cyriel M. A.

    2011-01-01

    After acquisition, memories undergo a process of consolidation, making them more resistant to interference and brain injury. Memory consolidation involves systems-level interactions, most importantly between the hippocampus and associated structures, which take part in the initial encoding of memory, and the neocortex, which supports long-term storage. This dichotomy parallels the contrast between episodic memory (tied to the hippocampal formation), collecting an autobiographical stream of experiences, and semantic memory, a repertoire of facts and statistical regularities about the world, involving the neocortex at large. Experimental evidence points to a gradual transformation of memories, following encoding, from an episodic to a semantic character. This may require an exchange of information between different memory modules during inactive periods. We propose a theory for such interactions and for the formation of semantic memory, in which episodic memory is encoded as relational data. Semantic memory is modeled as a modified stochastic grammar, which learns to parse episodic configurations expressed as an association matrix. The grammar produces tree-like representations of episodes, describing the relationships between its main constituents at multiple levels of categorization, based on its current knowledge of world regularities. These regularities are learned by the grammar from episodic memory information, through an expectation-maximization procedure, analogous to the inside–outside algorithm for stochastic context-free grammars. We propose that a Monte-Carlo sampling version of this algorithm can be mapped on the dynamics of “sleep replay” of previously acquired information in the hippocampus and neocortex. We propose that the model can reproduce several properties of semantic memory such as decontextualization, top-down processing, and creation of schemata. PMID:21887143

  16. Students and tutors' social representations of assessment in problem-based learning tutorials supporting change

    PubMed Central

    Bollela, Valdes R; Gabarra, Manoel HC; da Costa, Caetano; Lima, Rita CP

    2009-01-01

    Background Medical programmes that implement problem-based learning (PBL) face several challenges when introducing this innovative learning method. PBL relies on the small group as the foundation of study, and tutors facilitate learning by guiding the process rather than teaching the group. One of the major challenges is the use of strategies to assess students working in small groups. Self-, peer- and tutor-assessment are an integral part of PBL tutorials, and they are not easy to perform, especially for inexperienced students and tutors. The undergraduate PBL medical programme was introduced in 2003, and after two years the curriculum committee decided to evaluate the tutorial assessment in the new programme. Methods A random group of ten students, out of a cohort of sixty, and ten tutors (out of eighteen) were selected for semi-structured interviews. Social representations theory was used to explore how the students and tutors made sense of "assessment in tutorials". The data were content-analyzed using software for qualitative and quantitative processing of text according to lexicological distribution patterns. Results Even though students and tutors are aware of the broader purpose of assessment, they felt insufficiently trained in, and confident about, tutorial assessment. Assigning numbers to complex behaviors on a regular basis, as in tutorials, is counterproductive to cooperative group learning and self-assessment. Tutors believe that students are immature and unable to assess themselves and their tutors. Students believe that good grades are closely related to good oral presentation skills, and they also showed a corporative attitude among themselves (protecting each other from poor grades). Conclusion Faculty training in the PBL tutorials' assessment process and a systematic strategy to evaluate new programmes are absolutely necessary to review and correct directions. It is envisaged that planners can make better-informed decisions about curricular implementation

  17. Micro-Expression Recognition based on 2D Gabor Filter and Sparse Representation

    NASA Astrophysics Data System (ADS)

    Zheng, Hao

    2017-01-01

    Micro-expression recognition remains a challenging problem because the facial movements involved are rapid and subtle. This paper proposes a novel method named 2D Gabor filter and Sparse Representation (2DGSR) for micro-expression recognition. In our method, the 2D Gabor filter is used to enhance robustness to variations by increasing discriminative power, while sparse representation is applied to handle the subtlety of micro-expressions, casting recognition as a sparse approximation problem. We compare our method to other popular methods on three spontaneous micro-expression recognition databases. The results show that our method outperforms the other methods.
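
    A 2D Gabor filter is a Gaussian envelope modulating an oriented sinusoidal carrier; a bank of such filters over several orientations yields the kind of enhanced, discrimination-friendly features the method builds on. A minimal sketch (kernel size and parameters are arbitrary choices for illustration):

```python
import numpy as np

def gabor_kernel(size=15, sigma=3.0, theta=0.0, lam=6.0, psi=0.0):
    """Real part of a 2D Gabor filter: an isotropic Gaussian envelope
    modulating a sinusoidal carrier oriented at angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # coordinate along the carrier
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / lam + psi)

# A small bank over four orientations, of the kind used to build
# orientation-robust texture/expression descriptors
bank = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 4, endpoint=False)]
print(len(bank), bank[0].shape)  # 4 (15, 15)
```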

  18. Efficient generation of sum-of-products representations of high-dimensional potential energy surfaces based on multimode expansions

    NASA Astrophysics Data System (ADS)

    Ziegler, Benjamin; Rauhut, Guntram

    2016-03-01

    The transformation of multi-dimensional potential energy surfaces (PESs) from a grid-based multimode representation to an analytical one is a standard procedure in quantum chemical programs. Within the framework of linear least squares fitting, a simple and highly efficient algorithm is presented, which relies on a direct product representation of the PES and a repeated use of Kronecker products. It shows the same scalings in computational cost and memory requirements as the potfit approach. In comparison to customary linear least squares fitting algorithms, this corresponds to a speed-up and memory saving by several orders of magnitude. Different fitting bases are tested, namely, polynomials, B-splines, and distributed Gaussians. Benchmark calculations are provided for the PESs of a set of small molecules.
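
    For a two-mode grid, the benefit of exploiting the direct-product structure can be seen directly: the least-squares coefficients obtained from the full Kronecker design matrix are also obtainable from two small per-mode solves, without ever forming the Kronecker product. The sketch below demonstrates the equivalence on toy data; the paper's algorithm generalises this idea to many modes and repeated Kronecker products:

```python
import numpy as np

rng = np.random.default_rng(0)

# Grid values on a 2-D direct-product grid (toy stand-in for PES data)
n1, n2, m1, m2 = 10, 12, 4, 5          # grid points and basis sizes per mode
B1 = rng.standard_normal((n1, m1))     # fitting basis evaluated on mode-1 grid
B2 = rng.standard_normal((n2, m2))     # fitting basis evaluated on mode-2 grid
V = rng.standard_normal((n1, n2))      # values on the full product grid

# Naive linear least squares on the full Kronecker design matrix:
A = np.kron(B1, B2)                                  # (n1*n2, m1*m2)
c_full, *_ = np.linalg.lstsq(A, V.ravel(), rcond=None)

# Exploiting the product structure instead: one small solve per mode,
# which never materialises the (n1*n2, m1*m2) matrix.
C = np.linalg.pinv(B1) @ V @ np.linalg.pinv(B2).T    # (m1, m2)

print(np.allclose(c_full.reshape(m1, m2), C))  # True — same coefficients
```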

  19. A Discussion on Uncertainty Representation and Interpretation in Model-Based Prognostics Algorithms based on Kalman Filter Estimation Applied to Prognostics of Electronics Components

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Saxena, Abhinav; Goebel, Kai

    2012-01-01

    This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman Filters when applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process and how it relates to uncertainty representation, management, and the role of prognostics in decision-making. A distinction between the interpretations of estimated remaining useful life probability density function and the true remaining useful life probability density function is explained and a cautionary argument is provided against mixing interpretations for the two while considering prognostics in making critical decisions.

  20. Students' integration of multiple representations in a titration experiment

    NASA Astrophysics Data System (ADS)

    Kunze, Nicole M.

    A complete understanding of a chemical concept is dependent upon a student's ability to understand the microscopic or particulate nature of the phenomenon and integrate the microscopic, symbolic, and macroscopic representations of the phenomenon. Acid-base chemistry is a general chemistry topic requiring students to understand the topics of chemical reactions, solutions, and equilibrium presented earlier in the course. In this study, twenty-five student volunteers from a second semester general chemistry course completed two interviews. The first interview was completed prior to any classroom instruction on acids and bases. The second interview took place after classroom instruction, a prelab activity consisting of a titration calculation worksheet, a titration computer simulation, or a microscopic level animation of a titration, and two microcomputer-based laboratory (MBL) titration experiments. During the interviews, participants were asked to define and describe acid-base concepts and in the second interview they also drew the microscopic representations of four stages in an acid-base titration. An analysis of the data showed that participants had integrated the three representations of an acid-base titration to varying degrees. While some participants showed complete understanding of acids, bases, titrations, and solution chemistry, other participants showed several alternative conceptions concerning strong acid and base dissociation, the formation of titration products, and the dissociation of soluble salts. Before instruction, participants' definitions of acid, base, and pH were brief and consisted of descriptive terms. After instruction, the definitions were more scientific and reflected the definitions presented during classroom instruction.

  1. Embodied Numerosity: Implicit Hand-Based Representations Influence Symbolic Number Processing across Cultures

    ERIC Educational Resources Information Center

    Domahs, Frank; Moeller, Korbinian; Huber, Stefan; Willmes, Klaus; Nuerk, Hans-Christoph

    2010-01-01

    In recent years, a strong functional relationship between finger counting and number processing has been suggested. Developmental studies have shown specific effects of the structure of the individual finger counting system on arithmetic abilities. Moreover, the orientation of the mental quantity representation ("number line") seems to be…

  2. MAP-Motivated Carrier Synchronization of GMSK Based on the Laurent AMP Representation

    NASA Technical Reports Server (NTRS)

    Simon, M. K.

    1998-01-01

    Using the MAP estimation approach to carrier synchronization of digital modulations containing ISI together with a two pulse stream AMP representation of GMSK, it is possible to obtain an optimum closed loop configuration in the same manner as has been previously proposed for other conventional modulations with ISI.

  3. Mis/Representations in School-Based Digital Media Production: An Ethnographic Exploration with Muslim Girls

    ERIC Educational Resources Information Center

    Dahya, Negin; Jenson, Jennifer

    2015-01-01

    In this article, the authors discuss findings from a digital media production club with racialized girls in a low-income school in Toronto, Ontario. Specifically, the authors consider how student-produced media is impacted by ongoing postcolonial structures relating to power and representation in the school and in the media production work of…

  4. Graphical representations of the chemistry of garnets in a three-dimensional MATLAB based provenance plot

    NASA Astrophysics Data System (ADS)

    Knierzinger, Wolfgang; Palzer, Markus; Wagreich, Michael; Meszar, Maria; Gier, Susanne

    2016-04-01

    A newly developed, MATLAB-based garnet provenance plot allows a three-dimensional tetrahedral representation of the chemistry of garnets for the endmembers almandine, pyrope, spessartine and grossular. Based on a freely accessible database of Suggate & Hall (2013) and additional EPMA-data on the internet, the chemistry of more than 2500 garnets was evaluated and used to create various subfields that correspond to different facies conditions of metapelitic, metasomatic and metaigneous rocks as well as granitic rocks. These triangulated subfields act as reference structures within the tetrahedron, facilitating assignments of garnet chemistries to different lithologies. In comparison with conventional ternary garnet discrimination diagrams by Mange & Morton (2007), Wright/Preston et al. (1938/2002) and Aubrecht et al. (2009), this tetrahedral provenance plot enables a better assessment of the conditions of formation of garnets by reducing the overlap of certain subfields. In particular, a clearer distinction between greenschist facies rocks, amphibolite facies rocks and granitic rocks can be achieved. First applications of the tetrahedral garnet plot provided new insights into sedimentary processes during the Lower Miocene in the pre-Alpine Molasse basin. Bibliography Aubrecht, R., Meres, S., Sykora, M., Mikus, T. (2009). Provenance of the detrital garnets and spinels from the Albian sediments of the Czorsztyn Unit (Pieniny Klippen Belt, Western Carpathians, Slovakia). In: Geologica Carpathica, Dec. 2009, 60, 6, pp. 463-483. Mange, M.A., Morton, A.C. (2007). Geochemistry of Heavy Minerals. In: Mange, M.A. & Wright, D.T. (2007). Heavy Minerals in Use, Amsterdam, pp. 345-391. Preston, J., Hartley, A., Mange-Rajetzky, M., Hole, M., May, G., Buck, S., Vaughan, L. (2002). The provenance of Triassic continental sandstones from the Beryl Field, northern North Sea: Mineralogical, geochemical and sedimentological constraints. In: Journal of Sedimentary Research, 72, pp. 18
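
    The core of such a tetrahedral plot is the barycentric mapping from a four-endmember composition to a point in 3-D space. A minimal Python sketch (the original tool is MATLAB-based; the vertex placement and names here are illustrative assumptions):

```python
import numpy as np

# Vertices of a regular tetrahedron, one per garnet endmember
# (almandine, pyrope, spessartine, grossular)
VERTS = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.5, np.sqrt(3) / 2, 0.0],
                  [0.5, np.sqrt(3) / 6, np.sqrt(6) / 3]])

def to_tetra(comp):
    """Map a 4-endmember composition (barycentric weights) to a 3-D
    point inside the tetrahedron."""
    comp = np.asarray(comp, dtype=float)
    comp = comp / comp.sum()          # normalise so the weights sum to 1
    return comp @ VERTS

# A pyrope-rich garnet plots near the pyrope vertex
print(to_tetra([0.1, 0.8, 0.05, 0.05]))

# An equal four-way mixture plots at the centroid of the tetrahedron
print(np.allclose(to_tetra([1, 1, 1, 1]), VERTS.mean(axis=0)))  # True
```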

  5. Investigating Students' Reasoning about Acid-Base Reactions

    ERIC Educational Resources Information Center

    Cooper, Melanie M.; Kouyoumdjian, Hovig; Underwood, Sonia M.

    2016-01-01

    Acid-base chemistry is central to a wide range of reactions. If students are able to understand how and why acid-base reactions occur, it should provide a basis for reasoning about a host of other reactions. Here, we report the development of a method to characterize student reasoning about acid-base reactions based on their description of…

  6. Evaluation of a Computer-Based Training Program for Enhancing Arithmetic Skills and Spatial Number Representation in Primary School Children

    PubMed Central

    Rauscher, Larissa; Kohn, Juliane; Käser, Tanja; Mayer, Verena; Kucian, Karin; McCaskey, Ursina; Esser, Günter; von Aster, Michael

    2016-01-01

    Calcularis is a computer-based training program which focuses on basic numerical skills, spatial representation of numbers and arithmetic operations. The program includes a user model allowing flexible adaptation to the child's individual knowledge and learning profile. The study design to evaluate the training comprises three conditions (Calcularis group, waiting control group, spelling training group). One hundred and thirty-eight children from second to fifth grade participated in the study. Training duration comprised a minimum of 24 training sessions of 20 min within a time period of 6–8 weeks. Compared to the group without training (waiting control group) and the group with an alternative training (spelling training group), the children of the Calcularis group demonstrated a higher benefit in subtraction and number line estimation with medium to large effect sizes. Therefore, Calcularis can be used effectively to support children in arithmetic performance and spatial number representation. PMID:27445889

  7. An Introductory Laboratory Exercise for Acids and Bases.

    ERIC Educational Resources Information Center

    Miller, Richard; Silberman, Robert

    1986-01-01

    Discusses an acid-base neutralization exercise requiring groups of students to determine: (1) combinations of solutions giving neutralization; (2) grouping solutions as acids or bases; and (3) ranking groups in order of concentration. (JM)

  8. The Bronsted-Lowry Acid-Base Concept.

    ERIC Educational Resources Information Center

    Kauffman, George B.

    1988-01-01

    Gives the background history of the simultaneous discovery of acid-base relationships by Johannes Bronsted and Thomas Lowry. Provides a brief biographical sketch of each. Discusses their concept of acids and bases in some detail. (CW)

  9. Consistent sparse representations of EEG ERP and ICA components based on wavelet and chirplet dictionaries.

    PubMed

    Qiu, Jun-Wei; Zao, John K; Wang, Peng-Hua; Chou, Yu-Hsiang

    2010-01-01

    A randomized search algorithm for sparse representations of EEG event-related potentials (ERPs) and their statistically independent components is presented. This algorithm combines the greedy matching pursuit (MP) technique with the covariance matrix adaptation evolution strategy (CMA-ES) to select a small number of signal atoms from over-complete wavelet and chirplet dictionaries that offer the best approximations of quasi-sparse ERP signals. During the search process, adaptive pruning of signal parameters was used to eliminate redundant or degenerate atoms. As a result, the CMA-ES/MP algorithm is capable of producing accurate, efficient, and consistent sparse representations of ERP signals and their ICA components. This paper explains the working principles of the algorithm and presents the preliminary results of its use.

  10. Wigner functions for noncommutative quantum mechanics: A group representation based construction

    NASA Astrophysics Data System (ADS)

    Chowdhury, S. Hasibul Hassan; Ali, S. Twareque

    2015-12-01

    This paper is devoted to the construction and analysis of the Wigner functions for noncommutative quantum mechanics, their marginal distributions, and star-products, following a technique developed earlier, viz., using the unitary irreducible representations of the group GNC, which is the threefold central extension of the Abelian group of ℝ⁴. These representations have been exhaustively studied in earlier papers. The group GNC is identified with the kinematical symmetry group of noncommutative quantum mechanics of a system with two degrees of freedom. The Wigner functions studied here reflect different levels of non-commutativity: the case in which both the operators of position and those of momentum fail to commute, the case in which only the position operators fail to commute, and finally the case of standard quantum mechanics, obeying the canonical commutation relations only.

  12. A 4-Dimensional Representation of Antennal Lobe Output Based on an Ensemble of Characterized Projection Neurons

    PubMed Central

    Staudacher, Erich M.; Huetteroth, Wolf; Schachtner, Joachim; Daly, Kevin C.

    2009-01-01

    A central problem facing studies of neural encoding in sensory systems is how to accurately quantify the extent of spatial and temporal responses. In this study, we take advantage of the relatively simple and stereotypic neural architecture found in invertebrates. We combine standard electrophysiological techniques, recently developed population analysis techniques, and novel anatomical methods to form an innovative 4-dimensional view of odor output representations in the antennal lobe of the moth Manduca sexta. This novel approach allows quantification of olfactory responses of characterized neurons with spike time resolution. Additionally, arbitrary integration windows can be used for comparisons with other methods such as imaging. By assigning statistical significance to changes in neuronal firing, this method can visualize activity across the entire antennal lobe. The resulting 4-dimensional representation of antennal lobe output complements imaging and multi-unit experiments yet provides a more comprehensive and accurate view of glomerular activation patterns in spike time resolution. PMID:19464513

  13. Unitary irreducible representations of SL(2,C) in discrete and continuous SU(1,1) bases

    SciTech Connect

    Conrady, Florian; Hnybida, Jeff

    2011-01-15

    We derive the matrix elements of generators of unitary irreducible representations of SL(2,C) with respect to basis states arising from a decomposition into irreducible representations of SU(1,1). This is done with regard to a discrete basis diagonalized by J³ and a continuous basis diagonalized by K¹, and for both the discrete and continuous series of SU(1,1). For completeness, we also treat the more conventional SU(2) decomposition as a fifth case. The derivation proceeds in a functional/differential framework and exploits the fact that state functions and differential operators have a similar structure in all five cases. The states are defined explicitly and related to SU(1,1) and SU(2) matrix elements.

  14. Mathematical modeling of acid-base physiology

    PubMed Central

    Occhipinti, Rossana; Boron, Walter F.

    2015-01-01

    pH is one of the most important parameters in life, influencing virtually every biological process at the cellular, tissue, and whole-body level. Thus, for cells, it is critical to regulate intracellular pH (pHi) and, for multicellular organisms, to regulate extracellular pH (pHo). pHi regulation depends on the opposing actions of plasma-membrane transporters that tend to increase pHi, and others that tend to decrease pHi. In addition, passive fluxes of uncharged species (e.g., CO2, NH3) and charged species (e.g., HCO3−, NH4+) perturb pHi. These movements not only influence one another, but also perturb the equilibria of a multitude of intracellular and extracellular buffers. Thus, even at the level of a single cell, perturbations in acid-base reactions, diffusion, and transport are so complex that it is impossible to understand them without a quantitative model. Here we summarize some mathematical models developed to shed light on the complex interconnected events triggered by acid-base movements. We then describe a mathematical model of a spherical cell (which, to our knowledge, is the first capable of handling a multitude of buffer reactions) that our team has recently developed to simulate changes in pHi and pHo caused by movements of acid-base equivalents across the plasma membrane of a Xenopus oocyte. Finally, we extend our work to a consideration of the effects of simultaneous CO2 and HCO3− influx into a cell, and envision how future models might extend to other cell types (e.g., erythrocytes) or tissues (e.g., renal proximal-tubule epithelium) important for whole-body pH homeostasis. PMID:25617697
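
    A minimal illustration of the kind of equilibrium relation such models build on is the Henderson-Hasselbalch equation for the CO2/HCO3− buffer. The constants below are standard textbook values, not figures from the paper:

    ```python
    import math

    def blood_ph(hco3_mM, pco2_mmHg, pKa=6.1, s=0.03):
        """Henderson-Hasselbalch for the CO2/HCO3- buffer:
        pH = pKa + log10([HCO3-] / (s * pCO2)),
        where s is the CO2 solubility coefficient (mM per mmHg)."""
        return pKa + math.log10(hco3_mM / (s * pco2_mmHg))

    print(round(blood_ph(24.0, 40.0), 2))  # normal arterial values -> 7.4
    ```

    Full models of the kind the abstract describes couple many such buffer equilibria with transport and diffusion terms; this single equation is only the simplest building block.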

  15. Bipolar Membranes for Acid Base Flow Batteries

    NASA Astrophysics Data System (ADS)

    Anthamatten, Mitchell; Roddecha, Supacharee; Jorne, Jacob; Coughlan, Anna

    2011-03-01

    Rechargeable batteries can provide grid-scale electricity storage to match power generation with consumption and promote renewable energy sources. Flow batteries offer modular and flexible design, low cost per kWh and high efficiencies. A novel flow battery concept will be presented based on acid-base neutralization, where protons (H+) and hydroxyl (OH-) ions react electrochemically to produce water. The large free energy of this highly reversible reaction can be stored chemically and, upon discharge, harvested as usable electricity. The acid-base flow battery concept avoids the use of a sluggish oxygen electrode and utilizes the highly reversible hydrogen electrode, thus eliminating the need for expensive noble metal catalysts. The proposed flow battery is a hybrid of a battery and a fuel cell: hydrogen gas storing chemical energy is produced at one electrode and is immediately consumed at the other electrode. The two electrodes are exposed to low- and high-pH solutions, and these solutions are separated by a hybrid membrane combining a cation and an anion exchange membrane (CEM/AEM). Membrane design will be discussed, along with ion-transport data for synthesized membranes.
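
    As a back-of-envelope sanity check on the concept (the figure is not from the abstract), the open-circuit voltage available between two hydrogen electrodes held at different pH follows from the Nernst equation, roughly 59 mV per pH unit at room temperature:

    ```python
    import math

    def open_circuit_voltage(delta_pH, T=298.15):
        """Nernst voltage between two hydrogen electrodes at different pH:
        E = (R*T/F) * ln(10) * delta_pH  (~0.059 V per pH unit at 25 C)."""
        R, F = 8.314, 96485.0          # gas constant, Faraday constant
        return (R * T / F) * math.log(10) * delta_pH

    print(round(open_circuit_voltage(14.0), 2))  # pH 0 vs pH 14 -> 0.83 V
    ```

    This is why an acid-base neutralization cell spanning the full aqueous pH range can store at most on the order of 0.8 V per cell, a useful design constraint when comparing against other flow-battery chemistries.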

  16. Photocurable bioadhesive based on lactic acid.

    PubMed

    Marques, D S; Santos, J M C; Ferreira, P; Correia, T R; Correia, I J; Gil, M H; Baptista, C M S G

    2016-01-01

    Novel photocurable, low-molecular-weight oligomers based on l-lactic acid, of proven interest for use as bioadhesives, were successfully manufactured. Preparation of lactic acid oligomers with methacrylic end functionalization was carried out in the absence of catalysts or solvents by self-esterification in two reaction steps: telechelic lactic acid oligomerization with OH end groups, followed by functionalization with methacrylic anhydride. The final adhesive composition was achieved by the addition of a reported biocompatible photoinitiator (Irgacure® 2959). Preliminary in vitro biodegradability was investigated by hydrolytic degradation in PBS (pH = 7.4) at 37 °C. The adhesion performance was evaluated by subjecting glued aminated substrates (gelatine pieces) to a pull-to-break test. The surface energy, measured by contact angles, is lower than the values reported for skin and blood. The absence of cytotoxicity was confirmed using human fibroblasts. Notable antimicrobial behaviour was observed against two bacterial models (Staphylococcus aureus and Escherichia coli). The cured material exhibited a strong thrombogenic character when placed in contact with blood, suggesting a haemostatic effect for bleeding control. This novel material was subjected to extensive characterization, showing great potential for bioadhesive and other biomedical applications where biodegradable, biocompatible photocurable materials are required.

  17. Multiple domains of parental secure base support during childhood and adolescence contribute to adolescents' representations of attachment as a secure base script.

    PubMed

    Vaughn, Brian E; Waters, Theodore E A; Steele, Ryan D; Roisman, Glenn I; Bost, Kelly K; Truitt, Warren; Waters, Harriet S; Booth-Laforce, Cathryn

    2016-08-01

    Although attachment theory claims that early attachment representations reflecting the quality of the child's "lived experiences" are maintained across developmental transitions, evidence that has emerged over the last decade suggests that the association between early relationship quality and adolescents' attachment representations is fairly modest in magnitude. We used aspects of parenting beyond sensitivity over childhood and adolescence and early security to predict adolescents' scripted attachment representations. At age 18 years, 673 participants from the NICHD Study of Early Child Care and Youth Development completed the Attachment Script Assessment from which we derived an assessment of secure base script knowledge. Measures of secure base support from childhood through age 15 years (e.g., parental monitoring of child activity, father presence in the home) were selected as predictors and accounted for an additional 8% of the variance in secure base script knowledge scores above and beyond direct observations of sensitivity and early attachment status alone, suggesting that adolescents' scripted attachment representations reflect multiple domains of parenting. Cognitive and demographic variables also significantly increased predicted variance in secure base script knowledge by 2% each.

  18. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context.

  19. Detection of dual-band infrared small target based on joint dynamic sparse representation

    NASA Astrophysics Data System (ADS)

    Zhou, Jinwei; Li, Jicheng; Shi, Zhiguang; Lu, Xiaowei; Ren, Dongwei

    2015-10-01

    Infrared small target detection is a crucial yet difficult issue in aeronautic and astronautic applications. Sparse representation is an important mathematical tool that has been used extensively in image processing in recent years. In this paper, joint sparse representation is applied to dual-band infrared dim target detection. Firstly, according to the characteristics of dim targets in dual-band infrared images, a 2-dimensional Gaussian intensity model is used to construct the target dictionary; the dictionary is then classified into different sub-classes according to the position of the Gaussian function's center point in the image block. Exploiting the fact that dual-band small target detection can use the same dictionary, and that the sparsity lies not at the atom level but at the sub-class level, the detection of targets in dual-band infrared images is converted into a joint dynamic sparse representation problem, with dynamic active sets describing the sparse constraint on the coefficients. Two modified sparsity concentration index (SCI) criteria are proposed to evaluate whether targets exist in the images. Experiments show that the proposed algorithm achieves better detection performance and that dual-band detection is much more robust to noise than single-band detection. Moreover, the proposed method can be extended to multi-spectrum small target detection.

  20. Wavelet-Based Interpolation and Representation of Non-Uniformly Sampled Spacecraft Mission Data

    NASA Technical Reports Server (NTRS)

    Bose, Tamal

    2000-01-01

    A well-documented problem in the analysis of data collected by spacecraft instruments is the need for an accurate, efficient representation of the data set. The data may suffer from several problems, including additive noise, data dropouts, an irregularly-spaced sampling grid, and time-delayed sampling. These data irregularities render most traditional signal processing techniques unusable, and thus the data must be interpolated onto an even grid before scientific analysis techniques can be applied. In addition, the extremely large volume of data collected by scientific instrumentation presents many challenging problems in the area of compression, visualization, and analysis. Therefore, a representation of the data is needed which provides a structure which is conducive to these applications. Wavelet representations of data have already been shown to possess excellent characteristics for compression, data analysis, and imaging. The main goal of this project is to develop a new adaptive filtering algorithm for image restoration and compression. The algorithm should have low computational complexity and a fast convergence rate. This will make the algorithm suitable for real-time applications. The algorithm should be able to remove additive noise and reconstruct lost data samples from images.

  1. Embedded Data Representations.

    PubMed

    Willett, Wesley; Jansen, Yvonne; Dragicevic, Pierre

    2017-01-01

    We introduce embedded data representations, the use of visual and physical representations of data that are deeply integrated with the physical spaces, objects, and entities to which the data refers. Technologies like lightweight wireless displays, mixed reality hardware, and autonomous vehicles are making it increasingly easier to display data in-context. While researchers and artists have already begun to create embedded data representations, the benefits, trade-offs, and even the language necessary to describe and compare these approaches remain unexplored. In this paper, we formalize the notion of physical data referents - the real-world entities and spaces to which data corresponds - and examine the relationship between referents and the visual and physical representations of their data. We differentiate situated representations, which display data in proximity to data referents, and embedded representations, which display data so that it spatially coincides with data referents. Drawing on examples from visualization, ubiquitous computing, and art, we explore the role of spatial indirection, scale, and interaction for embedded representations. We also examine the tradeoffs between non-situated, situated, and embedded data displays, including both visualizations and physicalizations. Based on our observations, we identify a variety of design challenges for embedded data representation, and suggest opportunities for future research and applications.

  2. Teaching Acid/Base Physiology in the Laboratory

    ERIC Educational Resources Information Center

    Friis, Ulla G.; Plovsing, Ronni; Hansen, Klaus; Laursen, Bent G.; Wallstedt, Birgitta

    2010-01-01

    Acid/base homeostasis is one of the most difficult subdisciplines of physiology for medical students to master. A different approach, where theory and practice are linked, might help students develop a deeper understanding of acid/base homeostasis. We therefore set out to develop a laboratory exercise in acid/base physiology that would provide…

  3. A clinical approach to acid-base conundrums.

    PubMed

    Garrubba, Carl; Truscott, Judy

    2016-04-01

    Acid-base disorders can provide essential clues to underlying patient conditions. This article provides a simple, practical approach to identifying simple acid-base disorders and their compensatory mechanisms. Using this stepwise approach, clinicians can quickly identify and appropriately treat acid-base disorders.
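
    The stepwise approach the article describes can be caricatured as a small decision rule. The thresholds below are standard normal ranges, and the function is a hypothetical sketch that deliberately ignores mixed and compensated disorders:

    ```python
    def classify_acid_base(pH, pCO2, HCO3):
        """Hypothetical stepwise classifier for simple, uncompensated
        acid-base disorders (normals: pH 7.35-7.45, pCO2 35-45 mmHg,
        HCO3 22-26 mM). Mixed/compensated disorders are out of scope."""
        if 7.35 <= pH <= 7.45:
            return "pH normal: no simple disorder"
        if pH < 7.35:                        # acidemia
            if pCO2 > 45:
                return "respiratory acidosis"
            if HCO3 < 22:
                return "metabolic acidosis"
        else:                                # alkalemia
            if pCO2 < 35:
                return "respiratory alkalosis"
            if HCO3 > 26:
                return "metabolic alkalosis"
        return "indeterminate with these rules"

    print(classify_acid_base(7.25, 60, 26))  # -> respiratory acidosis
    print(classify_acid_base(7.30, 38, 15))  # -> metabolic acidosis
    ```

    Real clinical workup then checks whether the compensatory response (the parameter not driving the pH change) is appropriate, which this sketch does not attempt.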

  4. Using Willie's Acid-Base Box for Blood Gas Analysis

    ERIC Educational Resources Information Center

    Dietz, John R.

    2011-01-01

    In this article, the author describes a method developed by Dr. William T. Lipscomb for teaching blood gas analysis of acid-base status and provides three examples using Willie's acid-base box. Willie's acid-base box is constructed using three of the parameters of standard arterial blood gas analysis: (1) pH; (2) bicarbonate; and (3) CO[subscript…

  5. A New Data Representation Based on Training Data Characteristics to Extract Drug Name Entity in Medical Text

    PubMed Central

    Basaruddin, T.

    2016-01-01

    One essential task in information extraction from the medical corpus is drug name recognition. Compared with text sources from other domains, medical text mining poses more challenges: for example, more unstructured text, the fast growth of new terms, a wide range of name variations for the same drug, the lack of labeled datasets and external knowledge, and multiple token representations for a single drug name. Although many approaches have been proposed to tackle the task, some problems remain, with poor F-score performance (less than 0.75). This paper presents a new treatment of data representation techniques to overcome some of those challenges. We propose three data representation techniques based on the characteristics of word distribution and word similarities obtained from word embedding training. The first technique is evaluated with a standard NN model, that is, MLP. The second technique involves two deep network classifiers, that is, DBN and SAE. The third technique represents the sentence as a sequence and is evaluated with a recurrent NN model, that is, LSTM. In extracting drug name entities, the third technique gives the best F-score performance compared to the state of the art, with an average F-score of 0.8645. PMID:27843447

  6. Oleic acid-based gemini surfactants with carboxylic acid headgroups.

    PubMed

    Sakai, Kenichi; Umemoto, Naoki; Matsuda, Wataru; Takamatsu, Yuichiro; Matsumoto, Mutsuyoshi; Sakai, Hideki; Abe, Masahiko

    2011-01-01

    Anionic gemini surfactants with carboxylic acid headgroups have been synthesized from oleic acid. The hydrocarbon chain is covalently bound to the terminal carbonyl group of oleic acid via an ester bond, and the carboxylic acid headgroups are introduced at the cis double bond of oleic acid via disuccinyl units. The surfactants exhibit pH-dependent protonation-deprotonation behavior in aqueous solutions. In alkaline solutions (pH 9 in the presence of 10 mmol dm(-3) NaCl as the background electrolyte), the surfactants can lower the surface tension as well as form molecular assemblies, even at low surfactant concentrations. Under acidic (pH 3) or neutral (pH 6-7) conditions, the surfactants are intrinsically insoluble in aqueous media and form a monolayer at the air/water interface. In this study, we have investigated their physicochemical properties as a function of the hydrocarbon chain length by means of static surface tension, pyrene fluorescence, dynamic light scattering, surface pressure-area isotherms, and infrared external reflection measurements.

  7. Scenarios, personas and user stories from design ethnography: Evidence-based design representations of communicable disease investigations

    PubMed Central

    Turner, Anne M; Reeder, Blaine; Ramey, Judith

    2014-01-01

    Purpose Despite years of effort and millions of dollars spent to create a unified electronic communicable disease reporting system, the goal remains elusive. A major barrier has been a lack of understanding by system designers of communicable disease (CD) work and the public health workers who perform this work. This study reports on the application of User-Centered Design representations, traditionally used for improving interface design, to translate the complex CD work identified through ethnographic studies for the designers and developers of CD systems. The purpose of this work is to: (1) better understand public health practitioners and their information workflow with respect to communicable disease (CD) monitoring and control at a local health department, and (2) develop evidence-based design representations that model this CD work to inform the design of future disease surveillance systems. Methods We performed extensive onsite semi-structured interviews, targeted work shadowing and a focus group to characterize local health department communicable disease workflow. Informed by principles of design ethnography and user-centered design (UCD), we created personas, scenarios and user stories to accurately represent the user to system designers. Results We sought to convey to designers the key findings from the ethnographic studies: 1) public health CD work is mobile and episodic, in contrast to current CD reporting systems, which are stationary and fixed; 2) health department efforts are focused on CD investigation and response rather than reporting; and 3) current CD information systems must conform to PH workflow to ensure their usefulness. In an effort to illustrate our findings to designers, we developed three contemporary design-support representations: persona, scenario, and user story. Conclusions Through application of user centered design principles, we were able to create design representations that illustrate complex public health communicable

  8. Quantum cognition based on an ambiguous representation derived from a rough set approximation.

    PubMed

    Gunji, Yukio-Pegio; Sonoda, Kohei; Basios, Vasileios

    2016-03-01

    Over the last years, in a series of papers by Arecchi and others, a model for the cognitive processes involved in decision making has been proposed and investigated. The key element of this model is the expression of apprehension and judgment, basic cognitive processes of decision making, as an inverse Bayes inference classifying the information content of neuron spike trains. It has been shown that for successive plural stimuli this inference, equipped with basic non-algorithmic jumps, exhibits quantum-like characteristics. We show here that such a decision-making process is consistently related to an ambiguous representation by an observer within a universe of discourse. In our work the ambiguous representation of an object or stimulus is defined as a pair of maps from objects of a set to their representations, where the two maps are interrelated in a particular structure. The a priori and a posteriori hypotheses of Bayes inference are replaced by the upper and lower approximations, respectively, of the initial data sets derived with respect to each map. Upper and lower approximations are defined here in the context of "rough set" analysis. The inverse Bayes inference is implemented by the lower approximation with respect to one map and by the upper approximation with respect to the other map for a given data set. We show further that, due to the particular structural relation between the two maps, the logical structure of such combined approximations can only be expressed as an orthomodular lattice and can therefore be represented by a quantum rather than a Boolean logic. To our knowledge, this is the first investigation aiming to reveal the concrete logical structure of inverse Bayes inference in cognitive processes.
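
    The lower and upper approximations of rough-set analysis that replace the Bayesian hypotheses here are plain set operations over an equivalence partition. A minimal sketch with illustrative data (not the paper's):

    ```python
    def approximations(partition, target):
        """Rough-set lower/upper approximations of `target` with respect
        to an equivalence partition (list of disjoint blocks)."""
        target = set(target)
        lower, upper = set(), set()
        for block in map(set, partition):
            if block <= target:
                lower |= block      # blocks entirely inside the target
            if block & target:
                upper |= block      # blocks that overlap the target
        return lower, upper

    blocks = [{1, 2}, {3, 4}, {5}]
    lo, up = approximations(blocks, {1, 2, 3})
    print(sorted(lo), sorted(up))   # -> [1, 2] [1, 2, 3, 4]
    ```

    The target {1, 2, 3} is "rough" here because it cuts across the block {3, 4}: the lower approximation undershoots it and the upper approximation overshoots it, which is exactly the gap the paper exploits in place of prior and posterior hypotheses.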

  9. Identifying a base in a nucleic acid

    DOEpatents

    Fodor, Stephen P. A.; Lipshutz, Robert J.; Huang, Xiaohua

    2005-02-08

    Devices and techniques for hybridization of nucleic acids and for determining the sequence of nucleic acids. Arrays of nucleic acids are formed by techniques, preferably high resolution, light-directed techniques. Positions of hybridization of a target nucleic acid are determined by, e.g., epifluorescence microscopy. Devices and techniques are proposed to determine the sequence of a target nucleic acid more efficiently and more quickly through such synthesis and detection techniques.

  10. Polarity based fractionation of fulvic acids.

    PubMed

    Li, Aimin; Hu, Jundong; Li, Wenhui; Zhang, Wei; Wang, Xuejun

    2009-11-01

    Fulvic acids from the soil of Peking University (PF) and a Nordic river (NF) were separated into well defined sub-fractions using sequential elution techniques based on eluent polarity. The chemical properties of the fractions, including: PF1 and NF1 (eluted by 0.01 M HCl), PF2 and NF2 (eluted by 0.01 M HCl+20% methanol), PF3 and NF3 (eluted by 0.01 M HCl+40% methanol), and PF4 and NF4 (eluted by 100% methanol), were characterized using UV-Visible spectroscopy, elemental analysis and (13)C NMR. The results showed that the UV absorptions of the elution peaks at 280 nm (A280) increased from PF2 to PF4 and NF2 to NF4. No elution peaks were observed for PF1 and NF1. The carbon contents increased from 43.34% to 51.90% and 43.06% to 53.26% while the oxygen contents decreased from 46.39% to 36.76% and 49.76% to 40.03% for PF1-PF4 and NF1-NF4, respectively. As a polarity indicator, the (O+N)/C ratio for PF1-PF4 and NF1-NF4 decreased from 0.88 to 0.62 and 0.89 to 0.58, respectively. The aromatic carbon content increased from PF1 to PF4 and NF1 to NF4, suggesting an increase in the hydrophobicity of these fractions. The polarity was positively related to the ratio of UV absorption at 250 nm and 365 nm (E2/E3), and negatively related to the aromaticity. A high positive relationship between the aromaticity and E2/E3 of the fulvic acid fractions was also obtained. The use of an eluent with decreasing polarity made it possible to obtain simpler fractions of soil and aquatic fulvic acids.

  11. Change detection based on deep feature representation and mapping transformation for multi-spatial-resolution remote sensing images

    NASA Astrophysics Data System (ADS)

    Zhang, Puzhao; Gong, Maoguo; Su, Linzhi; Liu, Jia; Li, Zhizhou

    2016-06-01

    Multi-spatial-resolution change detection is a newly proposed issue of great significance in remote sensing, environmental and land use monitoring, etc. Though the images of a multi-spatial-resolution pair are two representations of the same reality, they are often superficially incommensurable due to their different modalities and properties. In this paper, we present a novel multi-spatial-resolution change detection framework, which incorporates deep-architecture-based unsupervised feature learning and mapping-based feature change analysis. Firstly, we transform the multi-resolution image-pair onto the same pixel-resolution through co-registration, followed by detail recovery, which is designed to remedy the spatial details lost in the registration. Secondly, a denoising autoencoder is stacked to learn local, high-level representations/features from the local neighborhood of a given pixel, in an unsupervised fashion. Thirdly, motivated by the fact that a multi-resolution image-pair shares the same reality in the unchanged regions, we explore the inner relationships between them by building a mapping neural network, which learns a mapping function from the most-unlikely-changed feature-pairs, selected from all feature-pairs via a coarse initial change map generated in advance. The learned mapping function can bridge the different representations and highlight changes. Finally, we build a robust and contractive change map through feature similarity analysis, and the change detection result is obtained by segmenting the final change map. Experiments are carried out on four real datasets, and the results confirm the effectiveness and superiority of the proposed method.

  12. Social representations of adolescents on quality of life: structurally-based study.

    PubMed

    Moreira, Ramon Missias; Boery, Eduardo Nagib; De Oliveira, Denize Cristina; Sales, Zenilda Nogueira; Boery, Rita Narriman Silva de Oliveira; Teixeira, Jules Ramon Brito; Ribeiro, Ícaro José Santos; Mussi, Fernanda Carneiro

    2015-01-01

    This study sought to comparatively analyze and describe the contents and structure of the social representations of adolescents on quality of life. It is descriptive, quantitative research within the framework of the structural approach to social representations. The informants were 316 adolescents from three public schools in Jequié, in the State of Bahia. The Spontaneous Word-Choice Eliciting Technique, using the key expression "Quality of Life", was used for data collection. The responses were processed using the Evoc 2003 software, which generated the Four-House Chart. The results reveal the core nucleus terms: healthy eating; physical activity; money; and sex. In the 1st outer circle, the words absence of disease, condoms, liberty, marijuana, housing, work and living well are featured. In the 2nd outer circle appeared the words difficulty, family, peace and power, and the contrasting elements of well-being and soccer. The overall consensus is that adolescents associate quality of life with sports and other healthy behaviors, influenced by the desires and curiosities of adolescence.

  13. Spectrum recovery method based on sparse representation for segmented multi-Gaussian model

    NASA Astrophysics Data System (ADS)

    Teng, Yidan; Zhang, Ye; Ti, Chunli; Su, Nan

    2016-09-01

    Hyperspectral images offer excellent feature discriminability by supplying diagnostic characteristics with high spectral resolution. However, various degradations can negatively affect the spectral information, including water absorption and band-continuous noise. On the other hand, the huge data volume and strong redundancy among spectra create an intense demand for compressing HSIs in the spectral dimension, which also leads to loss of spectral information. Reconstruction of the spectral diagnostic characteristics is therefore of irreplaceable significance for the subsequent application of HSIs. This paper introduces a spectrum restoration method for HSIs that makes use of a segmented multi-Gaussian model (SMGM) and sparse representation. An SMGM is established to describe the asymmetric spectral absorption and reflection characteristics, and its rationality and sparsity properties are discussed. Applying compressed sensing (CS) theory, we implement a sparse representation of the SMGM. The degraded and compressed HSIs can then be reconstructed using the uninjured or key bands. Finally, we apply the low rank matrix recovery (LRMR) algorithm as post-processing to restore the spatial details. The proposed method was tested on spectral data captured on the ground under artificial water-absorption conditions and on an AVIRIS HSI data set. The experimental results, in terms of qualitative and quantitative assessments, demonstrate its effectiveness in recovering spectral information from both degradation and lossy compression. The spectral diagnostic characteristics and spatial geometric features are well preserved.

  14. Feature-based face representations and image reconstruction from behavioral and neural data

    PubMed Central

    Nestor, Adrian; Plaut, David C.; Behrmann, Marlene

    2016-01-01

    The reconstruction of images from neural data can provide a unique window into the content of human perceptual representations. Although recent efforts have established the viability of this enterprise using functional magnetic resonance imaging (MRI) patterns, these efforts have relied on a variety of prespecified image features. Here, we take on the twofold task of deriving features directly from empirical data and of using these features for facial image reconstruction. First, we use a method akin to reverse correlation to derive visual features from functional MRI patterns elicited by a large set of homogeneous face exemplars. Then, we combine these features to reconstruct novel face images from the corresponding neural patterns. This approach allows us to estimate collections of features associated with different cortical areas as well as to successfully match image reconstructions to corresponding face exemplars. Furthermore, we establish the robustness and the utility of this approach by reconstructing images from patterns of behavioral data. From a theoretical perspective, the current results provide key insights into the nature of high-level visual representations, and from a practical perspective, these findings make possible a broad range of image-reconstruction applications via a straightforward methodological approach. PMID:26711997

  15. Seismic detection method for small-scale discontinuities based on dictionary learning and sparse representation

    NASA Astrophysics Data System (ADS)

    Yu, Caixia; Zhao, Jingtao; Wang, Yanfei

    2017-02-01

    Studying small-scale geologic discontinuities, such as faults, cavities and fractures, plays a vital role in analyzing the inner conditions of reservoirs, as these geologic structures and elements can provide storage spaces and migration pathways for petroleum. However, these geologic discontinuities have weak energy and are easily contaminated by noise, so effectively extracting them from seismic data is a challenging problem. In this paper, a method for detecting small-scale discontinuities using dictionary learning and sparse representation is proposed that can recover high-resolution information through sparse coding. A K-SVD (K-means clustering via singular value decomposition) sparse representation model, which alternates between two iterative stages, sparse coding and dictionary updating, is suggested for mathematically expressing these seismic small-scale discontinuities. Generally, the orthogonal matching pursuit (OMP) algorithm is employed for sparse coding. However, that algorithm selects only one dictionary atom at a time. To improve computational efficiency, a regularized version of the OMP algorithm is presented that selects several atoms per iteration. Two numerical experiments demonstrate the validity of the developed method for clarifying and enhancing small-scale discontinuities. A field example from carbonate reservoirs further demonstrates its effectiveness in revealing masked tiny faults and small-scale cavities.
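The greedy sparse-coding step referred to above can be illustrated with a minimal orthogonal matching pursuit sketch in plain NumPy (a toy dictionary, not the paper's regularized variant, which selects several atoms per iteration):

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedily pick up to k atoms
    (columns of D) that best explain the signal y."""
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares fit on the chosen atoms, then refresh residual
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

# toy dictionary with unit-norm atoms and an exactly 2-sparse signal
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(50)
x_true[[3, 17]] = [2.0, -1.5]
y = D @ x_true
x_hat = omp(D, y, k=2)
print(np.linalg.norm(y - D @ x_hat))  # small residual when the atoms are recovered
```

In K-SVD this coding step alternates with a dictionary-update stage that refits each atom to the signals currently using it.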

  16. Polymer gel dosimeter based on itaconic acid.

    PubMed

    Mattea, Facundo; Chacón, David; Vedelago, José; Valente, Mauro; Strumia, Miriam C

    2015-11-01

    A new polymeric dosimeter based on itaconic acid and N,N'-methylenebisacrylamide was studied. The preparation method, the compositions of monomer and crosslinking agent, and the presence of oxygen in the dosimetric system were analyzed. The resulting materials were irradiated with an X-ray tube at 158 cGy/min, 226 cGy/min and 298 cGy/min with doses up to 1000 Gy. The dosimeters presented a linear response in the dose range 75-1000 Gy, a sensitivity of 0.037 Gy⁻¹ at 298 cGy/min, and an increase in sensitivity at lower dose rates. One of the most relevant outcomes of this study was the different monomer-to-crosslinker incorporation observed in the formed gel for dosimeters purged of oxygen during preparation. This effect has not been reported for other typical dosimeters and could be attributed to the large differences in reactivity among these species.

  17. Perception of oyster-based products by French consumers. The effect of processing and role of social representations.

    PubMed

    Debucquet, Gervaise; Cornet, Josiane; Adam, Isabelle; Cardinal, Mireille

    2012-12-01

    The search for new markets in the seafood sector, together with the question of whether raw oyster consumption will continue over generations, presents an opportunity for processors to extend their ranges with oyster-based products. The twofold aim of this study was to evaluate the impact of processing and social representations on the perception of oyster-based products by French consumers, and to identify the best means of development in order to avoid possible failure in the market. Five products with different degrees of processing (cooked oysters in a half-shell, hot preparation for toast, potted oyster, oyster butter and oyster-based soup) were presented in focus groups and consumer tests, at home and in canteens with the staff of several companies, in order to reach consumers of different ages and professional activities. The results showed that social representations had a strong impact and that behaviours contrasted according to the initial profile of the consumers (traditional raw oyster consumers or non-consumers) and their age (younger and older people). The degree of processing has to be adapted to each segment. Early exposure is suggested as a way to influence the food choices and preferences of the youngest consumers on a long-term basis.

  18. Hamiltonian-based ray-tracing method with triangular-mesh representation for a large-scale cloaking device with an arbitrary shape.

    PubMed

    Tanaka, Tatsuo; Matoba, Osamu

    2016-05-01

    A Hamiltonian-based ray-tracing technique with mesh representation is presented for designing large-scale cloaking devices with three-dimensional arbitrary shapes, which have inhomogeneity and anisotropy in their electric permittivity and magnetic permeability. In order to deal with arbitrary shapes, the surfaces of the cloaking devices are represented by triangular meshes. Results of cloaking simulations with the mesh representation are compared with those obtained with the rigorous function representation. The numerical results show that fine mesh resolution is required for accurate evaluation of cloaking performance.

  19. Dynamic uncertain causality graph for knowledge representation and probabilistic reasoning: statistics base, matrix, and application.

    PubMed

    Zhang, Qin; Dong, Chunling; Cui, Yan; Yang, Zhihui

    2014-04-01

    Graphical models for probabilistic reasoning are now in widespread use, and many approaches have been developed, such as Bayesian networks. A newly developed approach, the dynamic uncertain causality graph (DUCG), was initially presented in a previous paper, which addressed only the inference algorithm in terms of individual events and probabilities. In this paper, we first explain the statistical basis of DUCG. Then, we extend the algorithm to the form of matrices of events and probabilities. It is revealed that the representation of a DUCG can be incomplete while exact probabilistic inference may still be made. A real application of DUCG to fault diagnosis of a generator system of a nuclear power plant is demonstrated, involving > 600 variables. Most inferences take < 1 s on a laptop computer. The causal logic linking the inference result to the observations is graphically displayed to users, so that they know not only the result but also why it was obtained.

  20. Multi-objective analysis of a component-based representation within an interactive evolutionary design system

    NASA Astrophysics Data System (ADS)

    Machwe, A. T.; Parmee, I. C.

    2007-07-01

    This article describes research relating to a user-centered evolutionary design system that evaluates both engineering and aesthetic aspects of design solutions during early-stage conceptual design. The experimental system comprises several components relating to user interaction, problem representation, evolutionary search and exploration and online learning. The main focus of the article is the evolutionary aspect of the system when using a single quantitative objective function plus subjective judgment of the user. Additionally, the manner in which the user-interaction aspect affects system output is assessed by comparing Pareto frontiers generated with and without user interaction via a multi-objective evolutionary algorithm (MOEA). A solution clustering component is also introduced and it is shown how this can improve the level of support to the designer when dealing with a complex design problem involving multiple objectives. Supporting results are from the application of the system to the design of urban furniture which, in this case, largely relates to seating design.

  1. Dynamics of Random Boolean Networks under Fully Asynchronous Stochastic Update Based on Linear Representation

    PubMed Central

    Luo, Chao; Wang, Xingyuan

    2013-01-01

    A novel algebraic approach is proposed to study the dynamics of asynchronous random Boolean networks in which a random number of nodes can be updated at each time step (ARBNs). In this article, the logical equations of ARBNs are converted into a discrete-time linear representation and the dynamical behaviors of the systems are investigated. We provide a general formula for the network transition matrices of ARBNs as well as a necessary and sufficient algebraic criterion to determine whether a group of given states composes an attractor of a given length in ARBNs. Consequently, algorithms are obtained to find all of the attractors and basins in ARBNs. Examples are given to demonstrate the feasibility of the proposed scheme. PMID:23785502
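The attractor-finding idea can be illustrated on a toy example. The sketch below enumerates the attractors of a small synchronous Boolean network by brute force; the update rules are hypothetical, and the paper's algebraic treatment of the asynchronous case is far more general:

```python
from itertools import product

# Toy 3-node Boolean network (synchronous update for simplicity;
# the paper treats random asynchronous updates). Rules are made up.
def step(state):
    a, b, c = state
    return (b and c, not a, a or b)

def attractors():
    """Enumerate attractors by following each state's trajectory
    until a state repeats; the repeating cycle is the attractor."""
    found = set()
    for start in product([False, True], repeat=3):
        seen = []
        s = start
        while s not in seen:
            seen.append(s)
            s = step(s)
        cycle = seen[seen.index(s):]      # the repeating cycle
        # canonical form: rotate so the smallest state comes first
        i = cycle.index(min(cycle))
        found.add(tuple(cycle[i:] + cycle[:i]))
    return found

for cyc in attractors():
    print(len(cyc), cyc)
```

The linear-representation approach replaces this exhaustive simulation with conditions on powers of the network transition matrix, which is what makes the asynchronous case tractable.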

  2. Multiple local feature representations and their fusion based on an SVR model for iris recognition using optimized Gabor filters

    NASA Astrophysics Data System (ADS)

    He, Fei; Liu, Yuanning; Zhu, Xiaodong; Huang, Chun; Han, Ye; Dong, Hongxing

    2014-12-01

    Gabor descriptors have been widely used in iris texture representations. However, fixed basic Gabor functions cannot match the changing nature of diverse iris datasets. Furthermore, a single form of iris feature cannot overcome difficulties in iris recognition, such as illumination variations, environmental conditions, and device variations. This paper provides multiple local feature representations and their fusion scheme based on a support vector regression (SVR) model for iris recognition using optimized Gabor filters. In our iris system, a particle swarm optimization (PSO)- and a Boolean particle swarm optimization (BPSO)-based algorithm is proposed to provide suitable Gabor filters for each involved test dataset without predefinition or manual modulation. Several comparative experiments on JLUBR-IRIS, CASIA-I, and CASIA-V4-Interval iris datasets are conducted, and the results show that our work can generate improved local Gabor features by using optimized Gabor filters for each dataset. In addition, our SVR fusion strategy may make full use of their discriminative ability to improve accuracy and reliability. Other comparative experiments show that our approach may outperform other popular iris systems.
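For illustration, a 2-D Gabor kernel of the kind such a PSO/BPSO search would tune can be generated from the standard real-valued Gabor formula (the parameter values below are arbitrary, not those selected by the paper's optimizer):

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5):
    """Real part of a 2-D Gabor filter: a Gaussian envelope times a
    cosine carrier. wavelength, orientation theta, envelope width
    sigma and aspect ratio gamma are the kind of parameters a
    swarm-optimization search would tune per dataset."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotated coords
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

k = gabor_kernel(size=15, wavelength=6.0, theta=np.pi / 4, sigma=3.0)
print(k.shape)  # (15, 15)
```

Convolving an unwrapped iris image with a bank of such kernels at several orientations and wavelengths yields the local texture features that are then fused.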

  3. When Simple Harmonic Motion is not That Simple: Managing Epistemological Complexity by Using Computer-based Representations

    NASA Astrophysics Data System (ADS)

    Parnafes, Orit

    2010-12-01

    Many real-world phenomena, even "simple" physical phenomena such as natural harmonic motion, are complex in the sense that they require coordinating multiple subtle foci of attention to get the required information when experiencing them. Moreover, for students to develop sound understanding of a concept or a phenomenon, they need to learn to get the same type of information across different contexts and situations (diSessa and Sherin 1998; diSessa and Wagner 2005). Rather than simplifying complex situations, or creating a linear instructional sequence in which students move from one context to another, this paper demonstrates the use of computer-based representations to facilitate developing understanding of complex physical phenomena. The data is collected from 8 studies in which pairs of students are engaged in an exploratory activity, trying to understand the dynamic behavior of a simulation and, at the same time, to attribute meaning to it in terms of the physical phenomenon it represents. The analysis focuses on three episodes. The first two episodes demonstrate the epistemological complexity involved in attempting to make sense of natural harmonic oscillation. A third episode demonstrates the process by which students develop understanding in this complex perceptual and conceptual territory, through the mediation (Vygotsky 1978) of computer-based representations designed to facilitate understanding in this topic.

  4. Acid-base bifunctional catalytic surfaces for nucleophilic addition reactions.

    PubMed

    Motokura, Ken; Tada, Mizuki; Iwasawa, Yasuhiro

    2008-09-01

    This article illustrates the modification of oxide surfaces with organic amine functional groups to create acid-base bifunctional catalysts, summarizing our previous reports and also presenting new data. Immobilization of organic amines as bases on inorganic solid-acid surfaces afforded highly active acid-base bifunctional catalysts, which enabled various organic transformations including C-C coupling reactions, though these reactions did not proceed with either the homogeneous amine precursors or the acidic supports alone. Spectroscopic characterization, such as by solid-state MAS NMR and FTIR, revealed not only the interactions between acidic and basic sites but also bifunctional catalytic reaction mechanisms.

  5. Spatial choices of macaque monkeys based on the visual representation of the response space: rotation of the stimuli.

    PubMed

    Nedvidek, Jan; Nekovarova, Tereza; Bures, Jan

    2008-11-21

    In earlier experiments we have demonstrated that macaque monkeys (Macaca mulatta) are able to use abstract visual stimuli presented on a computer screen to make spatial choices in the real environment. In those experiments a touch board ("response space") was directly connected to the computer screen ("virtual space"). The goal of the present experiment was to find out whether macaque monkeys are able: (1) to make spatial choices in a response space which is completely separated from the screen on which the stimuli (designed as a representation of the response space) are presented; and (2) to make spatial choices based on visual stimuli representing the configuration of the response space which are rotated with respect to this response space. The monkeys were trained to choose one of the nine "touch holes" on a transparent touch panel situated beside a computer monitor on which the visual stimuli were presented. The visual stimuli were designed as an abstract representation of the response space: the rewarded position was shown as a bright circle situated at a certain position in the rectangle representing the contours of the touch panel. At first, the monkeys were trained with non-rotated spatial stimuli. After this initial training, the visual stimuli were gradually rotated by 20 degrees in each step. In the last phase, the stimulus was suddenly rotated in the opposite direction by 60 degrees in one step. The results of the experiment suggest that the monkeys are able to successfully use abstract stimuli from one spatial frame for spatial choices in another frame. Effective use of the stimuli after their rotation suggests that the monkeys perceived the stimuli as a representation of the configuration of the touch holes in real space, not merely as different geometrical patterns without configurational information.

  6. Thermochemical comparisons of homogeneous and heterogeneous acids and bases. 1. Sulfonic acid solutions and resins as prototype Broensted acids

    SciTech Connect

    Arnett, E.M.; Haaksma, R.A.; Chawla, B.; Healy, M.H.

    1986-08-06

    Heats of ionization by thermometric titration for a series of bases (or acids) can be used to compare solid acids (or bases) with liquid analogues bearing the same functionalities in homogeneous solutions. The method is demonstrated for Broensted acids by reacting a series of substituted nitrogen bases with solutions of p-toluenesulfonic acid (PTSA) in acetonitrile and with suspensions of the microporous polymeric arylsulfonic acid resin-Dowex 50W-X8 in the same solvent. Under well-controlled anhydrous conditions there is a good correlation (r = 0.992) between the heats of reaction of the bases with the homogeneous and heterogeneous acid systems, but the homogeneous system gives a more exothermic interaction by 3-4 kcal mol⁻¹ for a series of 29 substituted pyrimidines, anilines, and some other amines. This difference may be attributed to homohydrogen bonding interactions between excess acid and sulfonate anion sites which are more restricted geometrically in the resin than in solution. Other factors which affect the enthalpy change for the acid-base interaction are the acid/base ratio, the water content of the sulfonic acid, the solvent, and the resin structure (e.g., microporous vs. macroporous). Steric hindrance in the base does not differentiate solid from homogeneous acid. In addition to the use of titration calorimetry, heats of immersion are reported for the Dowex-arylsulfonic acid resins and the Nafion-perfluorinated sulfonic acid resin in a series of basic liquids. The results are compared with each other and with those from a previous study of several varieties of coal.

  7. Base pairing and base mis-pairing in nucleic acids

    NASA Technical Reports Server (NTRS)

    Wang, A. H. J.; Rich, A.

    1986-01-01

    In recent years we have learned that DNA is conformationally active. It can exist in a number of different stable conformations including both right-handed and left-handed forms. Using single crystal X-ray diffraction analysis we are able to discover not only additional conformations of the nucleic acids but also different types of hydrogen bonded base-base interactions. Although Watson-Crick base pairings are the predominant type of interaction in double helical DNA, they are not the only types. Recently, we have been able to examine mismatching of guanine-thymine base pairs in left-handed Z-DNA at atomic resolution (1 Å). A minimum amount of distortion of the sugar phosphate backbone is found in the G x T pairing in which the bases are held together by two hydrogen bonds in the wobble pairing interaction. Because of the high resolution of the analysis we can visualize water molecules which fill in to accommodate the other hydrogen bonding positions in the bases which are not used in the base-base interactions. Studies on other DNA oligomers have revealed that other types of non-Watson-Crick hydrogen bonding interactions can occur. In the structure of a DNA octamer with the sequence d(GCGTACGC) complexed to an antibiotic triostin A, it was found that the two central AT base pairs are held together by Hoogsteen rather than Watson-Crick base pairs. Similarly, the G x C base pairs at the ends are also Hoogsteen rather than Watson-Crick pairing. Hoogsteen base pairs make a modified helix which is distinct from the Watson-Crick double helix.

  8. Inherent Structure Based Multi-view Learning with Multi-template Feature Representation for Alzheimer’s Disease Diagnosis

    PubMed Central

    Liu, Mingxia; Zhang, Daoqiang; Adeli-Mosabbeb, Ehsan; Shen, Dinggang

    2016-01-01

    Multi-template based brain morphometric pattern analysis using magnetic resonance imaging (MRI) has been recently proposed for automatic diagnosis of Alzheimer’s disease (AD) and its prodromal stage (i.e., mild cognitive impairment or MCI). In such methods, multi-view morphological patterns generated from multiple templates are used as feature representation for brain images. However, existing multi-template based methods often simply assume that each class is represented by a specific type of data distribution (i.e., a single cluster), while in reality the underlying data distribution is actually not pre-known. In this paper, we propose an inherent structure based multi-view learning (ISML) method using multiple templates for AD/MCI classification. Specifically, we first extract multi-view feature representations for subjects using multiple selected templates, and then cluster subjects within a specific class into several sub-classes (i.e., clusters) in each view space. Then, we encode those sub-classes with unique codes by considering both their original class information and their own distribution information, followed by a multi-task feature selection model. Finally, we learn an ensemble of view-specific support vector machine (SVM) classifiers based on their respectively selected features in each view, and fuse their results to draw the final decision. Experimental results on the Alzheimer’s Disease Neuroimaging Initiative (ADNI) database demonstrate that our method achieves promising results for AD/MCI classification, compared to the state-of-the-art multi-template based methods. PMID:26540666

  9. Grid-based methods for diatomic quantum scattering problems: a finite-element, discrete variable representation in prolate spheroidal coordinates

    SciTech Connect

    Tao, Liang; McCurdy, C.W.; Rescigno, T.N.

    2008-11-25

    We show how to combine finite elements and the discrete variable representation in prolate spheroidal coordinates to develop a grid-based approach for quantum mechanical studies involving diatomic molecular targets. Prolate spheroidal coordinates are a natural choice for diatomic systems and have been used previously in a variety of bound-state applications. The use of exterior complex scaling in the present implementation allows for a transparently simple way of enforcing Coulomb boundary conditions and therefore straightforward application to electronic continuum problems. Illustrative examples involving the bound and continuum states of H2+, as well as the calculation of photoionization cross sections, show that the speed and accuracy of the present approach offer distinct advantages over methods based on single-center expansions.

  10. Research on vibration response of a multi-faulted rotor system using LMD-based time-frequency representation

    NASA Astrophysics Data System (ADS)

    Jiao, Weidong; Qian, Suxiang; Chang, Yongping; Yang, Shixi

    2012-12-01

    Unbalance, fatigue crack and rotor-stator rub are three common and important faults in a rotor-bearing system. They are intrinsically interconnected, and their vibration behavior often shows strongly nonlinear and transient characteristics, especially when more than one of them coexist in the system. This article studies the vibration response of a rotor system in the presence of multiple rotor faults, such as unbalance, crack, and rotor-stator rub, using local mean decomposition (LMD)-based time-frequency representation. Equations of motion of the multi-faulted Jeffcott rotor, including unbalance, crack, and rub, are presented. By solving the equations of motion, the steady-state vibration response is obtained in the presence of multiple rotor faults. As a comparison, the Hilbert-Huang transform, based on empirical mode decomposition, is also applied to analyze the multi-fault data. Some diagnostic recommendations are derived from the study.

  11. Representing Representation

    ERIC Educational Resources Information Center

    Kuntz, Aaron M.

    2010-01-01

    What can be known and how to render what we know are perpetual quandaries met by qualitative research, complicated further by the understanding that the everyday discourses influencing our representations are often tacit, unspoken or heard so often that they seem to warrant little reflection. In this article, I offer analytic memos as a means for…

  12. A Vehicle Active Safety Model: Vehicle Speed Control Based on Driver Vigilance Detection Using Wearable EEG and Sparse Representation.

    PubMed

    Zhang, Zutao; Luo, Dianyuan; Rasim, Yagubov; Li, Yanjun; Meng, Guanjun; Xu, Jian; Wang, Chunbai

    2016-02-19

    In this paper, we present a vehicle active safety model for vehicle speed control based on driver vigilance detection using low-cost, comfortable, wearable electroencephalographic (EEG) sensors and sparse representation. The proposed system consists of three main steps, namely wireless wearable EEG collection, driver vigilance detection, and vehicle speed control strategy. First of all, a homemade low-cost comfortable wearable brain-computer interface (BCI) system with eight channels is designed for collecting the driver's EEG signal. Second, wavelet de-noising and down-sample algorithms are utilized to enhance the quality of EEG data, and Fast Fourier Transformation (FFT) is adopted to extract the EEG power spectrum density (PSD). In this step, sparse representation classification combined with k-singular value decomposition (KSVD) is firstly introduced in PSD to estimate the driver's vigilance level. Finally, a novel safety strategy of vehicle speed control, which controls the electronic throttle opening and automatic braking after driver fatigue detection using the above method, is presented to avoid serious collisions and traffic accidents. The simulation and practical testing results demonstrate the feasibility of the vehicle active safety model.
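The PSD-extraction step described above can be sketched with a simple FFT periodogram on synthetic data; the sampling rate, signal, and alpha band below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

fs = 256                        # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)     # 4 s of synthetic single-channel "EEG"
# a 10 Hz alpha-band tone plus noise stands in for a real recording
rng = np.random.default_rng(1)
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# periodogram: power spectral density from the FFT
freqs = np.fft.rfftfreq(sig.size, d=1 / fs)
psd = np.abs(np.fft.rfft(sig)) ** 2 / (fs * sig.size)

# band power, e.g. alpha (8-13 Hz), a common vigilance-related feature
df = freqs[1] - freqs[0]
alpha_power = psd[(freqs >= 8) & (freqs <= 13)].sum() * df
peak = freqs[np.argmax(psd)]
print(peak)   # strongest component, near 10 Hz here
```

Features of this kind, collected per channel and band, would then feed the sparse-representation classifier to estimate vigilance level.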

  13. A Vehicle Active Safety Model: Vehicle Speed Control Based on Driver Vigilance Detection Using Wearable EEG and Sparse Representation

    PubMed Central

    Zhang, Zutao; Luo, Dianyuan; Rasim, Yagubov; Li, Yanjun; Meng, Guanjun; Xu, Jian; Wang, Chunbai

    2016-01-01

    In this paper, we present a vehicle active safety model for vehicle speed control based on driver vigilance detection using low-cost, comfortable, wearable electroencephalographic (EEG) sensors and sparse representation. The proposed system consists of three main steps, namely wireless wearable EEG collection, driver vigilance detection, and vehicle speed control strategy. First of all, a homemade low-cost comfortable wearable brain-computer interface (BCI) system with eight channels is designed for collecting the driver’s EEG signal. Second, wavelet de-noising and down-sample algorithms are utilized to enhance the quality of EEG data, and Fast Fourier Transformation (FFT) is adopted to extract the EEG power spectrum density (PSD). In this step, sparse representation classification combined with k-singular value decomposition (KSVD) is firstly introduced in PSD to estimate the driver’s vigilance level. Finally, a novel safety strategy of vehicle speed control, which controls the electronic throttle opening and automatic braking after driver fatigue detection using the above method, is presented to avoid serious collisions and traffic accidents. The simulation and practical testing results demonstrate the feasibility of the vehicle active safety model. PMID:26907278

  14. Attachment and God representations among lay Catholics, priests, and religious: a matched comparison study based on the Adult Attachment Interview.

    PubMed

    Cassibba, Rosalinda; Granqvist, Pehr; Costantini, Alessandro; Gatto, Sergio

    2008-11-01

    Based on the idea that believers' perceived relationships with God develop from their attachment-related experiences with primary caregivers, the authors explored the quality of such experiences and their representations among individuals who differed in likelihood of experiencing a principal attachment to God. Using the Adult Attachment Interview (AAI), they compared attachment-related experiences and representations in a group of 30 Catholic priests and religious with a matched group of lay Catholics and with the worldwide normal distribution of AAI classifications. They found an overrepresentation of secure-autonomous states regarding attachment among those more likely to experience a principal attachment to God (i.e., the priests and religious) compared with the other groups and an underrepresentation of unresolved-disorganized states in the two groups of Catholics compared with the worldwide normal distribution. Key findings also included links between secure-autonomous states regarding attachment and estimated experiences with loving or nonrejecting parents on the one hand and loving God imagery on the other. These results extend the literature on religion from an attachment perspective and support the idea that generalized working models derived from attachment experiences with parents are reflected in believers' perceptions of God.

  15. Learning the exception to the rule: model-based FMRI reveals specialized representations for surprising category members.

    PubMed

    Davis, Tyler; Love, Bradley C; Preston, Alison R

    2012-02-01

    Category knowledge can be explicit, yet not conform to a perfect rule. For example, a child may acquire the rule "If it has wings, then it is a bird," but then must account for exceptions to this rule, such as bats. The current study explored the neurobiological basis of rule-plus-exception learning by using quantitative predictions from a category learning model, SUSTAIN, to analyze behavioral and functional magnetic resonance imaging (fMRI) data. SUSTAIN predicts that exceptions require formation of specialized representations to distinguish exceptions from rule-following items in memory. By incorporating quantitative trial-by-trial predictions from SUSTAIN directly into fMRI analyses, we observed medial temporal lobe (MTL) activation consistent with 2 predicted psychological processes that enable exception learning: item recognition and error correction. SUSTAIN explains how these processes vary in the MTL across learning trials as category knowledge is acquired. Importantly, MTL engagement during exception learning was not captured by an alternate exemplar-based model of category learning or by standard contrasts comparing exception and rule-following items. The current findings thus provide a well-specified theory for the role of the MTL in category learning, where the MTL plays an important role in forming specialized category representations appropriate for the learning context.

  16. Representation of Stormflow and a More Responsive Water Table in a TOPMODEL-Based Hydrology Model

    NASA Technical Reports Server (NTRS)

    Shaman, Jeffrey; Stieglitz, Marc; Engel, Victor; Koster, Randal; Stark, Colin; Houser, Paul R. (Technical Monitor)

    2001-01-01

    This study presents two new modeling strategies. First, a methodology for representing the physical process of stormflow within a TOPMODEL framework is developed. In using this approach, discharge at quickflow time scales is simulated and a fuller depiction of hydrologic activity is brought about. Discharge of water from the vadose zone is permitted in a physically realistic manner without a priori assumption of the level within the soil column at which stormflow saturation can take place. Determination of the stormflow contribution to discharge is made using the equation for groundwater flow. No new parameters are needed. Instead, regions of near saturation that develop during storm events, producing vertical recharge, are allowed to contribute to soil column discharge. These stormflow contributions to river runoff, as for groundwater flow contributions, are a function of catchment topography and local hydraulic conductivity at the depth of these regions of near saturation. The second approach improves groundwater flow response through a reduction of porosity and field capacity with depth in the soil column. Large storm events are better captured and a more dynamic water table develops with application of this modified soil column profile (MSCP). The MSCP predominantly reflects soil depth differences in upland and lowland regions of a watershed. Combined, these two approaches - stormflow and the MSCP - provide a more accurate representation of the time scales at which soil column discharge responds and a more complete depiction of hydrologic activity. Storm events large and small are better simulated, and some of the biases previously evident in TOPMODEL simulations are reduced.
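The TOPMODEL framework that these modifications extend distributes a catchment-average saturation deficit across cells using the topographic index; a minimal sketch of that baseline relation follows (parameter values are purely illustrative, and this is the standard formulation, not the stormflow or MSCP extensions themselves):

```python
def local_deficit(mean_deficit, m, ti, mean_ti):
    """Standard TOPMODEL local saturation deficit:
    S_i = S_mean + m * (ti_mean - ti), where ti = ln(a / tan(beta))
    is the topographic index. Cells with a high index (large upslope
    area, gentle slope) saturate first; m is the exponential
    transmissivity-decay parameter."""
    return mean_deficit + m * (mean_ti - ti)

mean_ti = 6.0    # catchment-average topographic index (hypothetical)
m = 0.02         # decay parameter in metres (hypothetical)
for ti in (4.0, 6.0, 9.0):
    d = local_deficit(mean_deficit=0.05, m=m, ti=ti, mean_ti=mean_ti)
    status = "saturated" if d <= 0 else "unsaturated"
    print(ti, round(d, 3), status)
```

In the paper's extension, near-saturated regions that develop in the vadose zone during storms are additionally allowed to discharge, rather than only the fully saturated water table.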

  17. Weak vs Strong Acids and Bases: The Football Analogy

    NASA Astrophysics Data System (ADS)

    Silverstein, Todd P.

    2000-07-01

    An important topic in any introductory chemistry course is that of acids and bases. Students generally have no trouble learning the Brønsted-Lowry definition of an acid as a proton donor and a base as a proton acceptor. Problems often arise, however, when chemistry teachers attempt to explain the difference between weak and strong acids, and between weak and strong bases. For acids in aqueous solution, discussing complete versus partial ionization works well for those with a strong grasp of the equilibrium concept, but for many students it does not seem to do the trick. Partial ionization may not evoke much in the mind of a "visual learner". Accordingly, I have developed a football analogy for acids and bases in which acids are compared to quarterbacks, whose job is to get rid of the ball (H+). A strong acid, like an excellent quarterback, delivers the ball effectively; a weak acid, like a poor quarterback, is often left holding the ball. Furthermore, bases may be likened to wide receivers, whose job is to catch and hold onto the ball (H+). A strong base, like an excellent wide receiver, holds onto the ball; a weak base, like a poor receiver, often drops the ball. The concept of throwing and catching a ball is easy to visualize and the analogy to acids and bases can help even students unfamiliar with the mores of the gridiron to comprehend the mores of aqueous protons.
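
    The complete-versus-partial ionization contrast behind the analogy can also be made quantitative. A minimal sketch for an idealized monoprotic acid HA, solving Ka = x²/(C − x) for the equilibrium [H+] and neglecting activity corrections (the function name and example values are illustrative):

```python
import math

def fraction_ionized(ka, c):
    """Fraction of a monoprotic weak acid HA ionized at analytical
    concentration c (mol/L), from Ka = x^2 / (c - x)."""
    x = (-ka + math.sqrt(ka * ka + 4 * ka * c)) / 2  # equilibrium [H+]
    return x / c

# Acetic acid (Ka ~ 1.8e-5) at 0.1 M: the "poor quarterback" is left
# holding almost all of the ball.
print(f"{fraction_ionized(1.8e-5, 0.1):.3%}")  # ~1.3% ionized
```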

  18. Acid-Base Pairs in Lewis Acidic Zeolites Promote Direct Aldol Reactions by Soft Enolization.

    PubMed

    Lewis, Jennifer D; Van de Vyver, Stijn; Román-Leshkov, Yuriy

    2015-08-17

    Hf-, Sn-, and Zr-Beta zeolites catalyze the cross-aldol condensation of aromatic aldehydes with acetone under mild reaction conditions with near quantitative yields. NMR studies with isotopically labeled molecules confirm that acid-base pairs in the Si-O-M framework ensemble promote soft enolization through α-proton abstraction. The Lewis acidic zeolites maintain activity in the presence of water and, unlike traditional base catalysts, in acidic solutions.

  19. Determination of acidity constants of acid-base indicators by second-derivative spectrophotometry

    NASA Astrophysics Data System (ADS)

    Kara, Derya; Alkan, Mahir

    2000-12-01

    A method is described for calculating the acid-base dissociation constants of monoprotic weak organic acids whose acid and base species have overlapping spectra, using absorptiometric and pH measurements. Second-derivative spectrophotometry is shown to be effective for determining the dissociation constants: the values obtained for methyl orange and bromothymol blue agree with those given in the literature.
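
    The second-derivative procedure itself is not detailed in this record, but the underlying relation for a monoprotic indicator is standard: at a fixed wavelength, pKa = pH + log10((A_base − A)/(A − A_acid)), where the limiting absorbances of the pure acid and base forms are, in this method, taken from second-derivative amplitudes to suppress the spectral overlap. A minimal sketch with illustrative values:

```python
import math

def pka_from_absorbance(ph, a_obs, a_acid, a_base):
    """pKa of a monoprotic indicator from single-wavelength absorbances:
    pKa = pH + log10((A_base - A_obs) / (A_obs - A_acid))."""
    return ph + math.log10((a_base - a_obs) / (a_obs - a_acid))

# Midpoint check: when the observed absorbance is halfway between the
# limiting acid and base values, pKa equals the pH.
print(pka_from_absorbance(7.1, 0.5, 0.2, 0.8))  # -> 7.1
```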

  20. Chip-based sequencing nucleic acids

    DOEpatents

    Beer, Neil Reginald

    2014-08-26

    A system for fast DNA sequencing by amplification of genetic material within microreactors, denaturing, demulsifying, and then sequencing the material, while retaining it in a PCR/sequencing zone by a magnetic field. One embodiment includes sequencing nucleic acids on a microchip that includes a microchannel flow channel in the microchip. The nucleic acids are isolated and hybridized to magnetic nanoparticles or to magnetic polystyrene-coated beads. Microreactor droplets are formed in the microchannel flow channel. The microreactor droplets containing the nucleic acids and the magnetic nanoparticles are retained in a magnetic trap in the microchannel flow channel and sequenced.

  1. Assessment of acid-base balance. Stewart's approach.

    PubMed

    Fores-Novales, B; Diez-Fores, P; Aguilera-Celorrio, L J

    2016-04-01

    The study of acid-base equilibrium, its regulation and its interpretation have been a source of debate since the beginning of the 20th century. The most accepted and commonly used analyses are based on pH, a notion first introduced by Sorensen in 1909, and on the Henderson-Hasselbalch equation (1916). Since then, new concepts have been developed in order to complete and ease the understanding of acid-base disorders. In the early 1980s, Peter Stewart called the traditional interpretation of acid-base disturbances into question and proposed a new method. This innovative approach seems more suitable for studying acid-base abnormalities in critically ill patients. The aim of this paper is to update acid-base concepts, methods, limitations and applications.
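
    Stewart's approach treats the strong ion difference (SID), together with total weak acids and PCO2, as the independent determinants of plasma pH, rather than bicarbonate. A minimal sketch of the apparent SID computed from routinely measured plasma ions (variable names and example values, in mEq/L, are illustrative):

```python
def apparent_sid(na, k, cl, lactate, ca=0, mg=0):
    """Stewart's apparent strong ion difference (mEq/L): the charge gap
    between fully dissociated cations and anions."""
    return (na + k + ca + mg) - (cl + lactate)

# Typical plasma values give an SID near 40 mEq/L; rising chloride or
# lactate lowers the SID and acidifies the plasma.
print(apparent_sid(na=140, k=4, cl=102, lactate=1))  # -> 41
```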

  2. Learning with Interactive Graphical Representations.

    ERIC Educational Resources Information Center

    Saljo, Roger, Ed.

    1999-01-01

    The seven articles of this theme issue deal with the use of computer-based interactive graphical representations. Studying their use will provide answers both for users of static graphics in traditional paper-based media and for those who plan instruction using graphical representations that allow semantically direct manipulation. (SLD)

  3. Sensitivity analysis and probabilistic re-entry modeling for debris using high dimensional model representation based uncertainty treatment

    NASA Astrophysics Data System (ADS)

    Mehta, Piyush M.; Kubicek, Martin; Minisci, Edmondo; Vasile, Massimiliano

    2017-01-01

    Well-known tools developed for satellite and debris re-entry perform break-up and trajectory simulations in a deterministic sense and do not perform any uncertainty treatment. The treatment of uncertainties associated with the re-entry of a space object requires a probabilistic approach. A Monte Carlo campaign is the intuitive approach to performing a probabilistic analysis; however, it is computationally very expensive. In this work, we use a recently developed approach based on a new derivation of the high dimensional model representation method to implement a computationally efficient probabilistic analysis for re-entry. Both aleatoric and epistemic uncertainties that affect the aerodynamic trajectory and ground impact location are considered. The method is applicable to both controlled and uncontrolled re-entry scenarios. The resulting ground impact distributions are far from the typically used Gaussian or ellipsoid distributions.

  4. Phosphonic acid based ion exchange resins

    DOEpatents

    Horwitz, E. Philip; Alexandratos, Spiro D.; Gatrone, Ralph C.; Chiarizia, Ronato

    1996-01-01

    An ion exchange resin for extracting metal ions from a liquid waste stream. An ion exchange resin is prepared by copolymerizing a vinylidene diphosphonic acid with styrene, acrylonitrile and divinylbenzene.

  5. Phosphonic acid based ion exchange resins

    DOEpatents

    Horwitz, E. Philip; Alexandratos, Spiro D.; Gatrone, Ralph C.; Chiarizia, Ronato

    1994-01-01

    An ion exchange resin for extracting metal ions from a liquid waste stream. An ion exchange resin is prepared by copolymerizing a vinylidene diphosphonic acid with styrene, acrylonitrile and divinylbenzene.

  6. Visual Design for Interactive Learning Tools: Representation of Time-Based Information.

    ERIC Educational Resources Information Center

    Boling, Elizabeth; Brown, J. P.; Ray, Sumitra Das; Erwin, Anthony; Kirkley, Sonny

    The development of potentially powerful computer-based tools within rich learning environments is hampered by the constraints of delivery systems (specifically low-resolution, low-real estate displays). In designing a World Wide Web-based tool for manipulating time-based information, the authors have encountered a lack of guidelines or empirical…

  7. Calcium-based Lewis acid catalysts.

    PubMed

    Begouin, Jeanne-Marie; Niggemann, Meike

    2013-06-17

    Recently, Lewis acidic calcium salts bearing weakly coordinating anions such as Ca(NTf₂)₂, Ca(OTf)₂, CaF₂ and Ca[OCH(CF₃)₂]₂ have been discovered as catalysts for the transformation of alcohols, olefins and carbonyl compounds. High stability towards air and moisture, selectivity and high reactivity under mild reaction conditions render these catalysts a sustainable and mild alternative to transition metals, rare-earth metals or strong Brønsted acids.

  8. Advances in nucleic acid-based detection methods.

    PubMed Central

    Wolcott, M J

    1992-01-01

    Laboratory techniques based on nucleic acid methods have increased in popularity over the last decade with clinical microbiologists and other laboratory scientists who are concerned with the diagnosis of infectious agents. This increase in popularity is a result primarily of advances made in nucleic acid amplification and detection techniques. Polymerase chain reaction, the original nucleic acid amplification technique, changed the way many people viewed and used nucleic acid techniques in clinical settings. After the potential of polymerase chain reaction became apparent, other methods of nucleic acid amplification and detection were developed. These alternative nucleic acid amplification methods may become serious contenders for application to routine laboratory analyses. This review presents some background information on nucleic acid analyses that might be used in clinical and anatomical laboratories and describes some recent advances in the amplification and detection of nucleic acids. PMID:1423216

  9. The Roles of Acids and Bases in Enzyme Catalysis

    ERIC Educational Resources Information Center

    Weiss, Hilton M.

    2007-01-01

    Many organic reactions are catalyzed by strong acids or bases that protonate or deprotonate neutral reactants leading to reactive cations or anions that proceed to products. In enzyme reactions, only weak acids and bases are available to hydrogen bond to reactants and to transfer protons in response to developing charges. Understanding this…

  10. What is the Ultimate Goal in Acid-Base Regulation?

    ERIC Educational Resources Information Center

    Balakrishnan, Selvakumar; Gopalakrishnan, Maya; Alagesan, Murali; Prakash, E. Sankaranarayanan

    2007-01-01

    It is common to see chapters on acid-base physiology state that the goal of acid-base regulatory mechanisms is to maintain the pH of arterial plasma and not arterial PCO2 (PaCO2)…

  11. Acid-base properties of titanium-antimony oxides catalysts

    SciTech Connect

    Zenkovets, G.A.; Paukshtis, E.A.; Tarasova, D.V.; Yurchenko, E.N.

    1982-06-01

    The acid-base properties of titanium-antimony oxide catalysts were studied by the methods of back titration and ir spectroscopy. The interrelationship between the acid-base and catalytic properties in the oxidative ammonolysis of propylene was discussed. 3 figures, 1 table.

  12. A Closer Look at Acid-Base Olfactory Titrations

    ERIC Educational Resources Information Center

    Neppel, Kerry; Oliver-Hoyo, Maria T.; Queen, Connie; Reed, Nicole

    2005-01-01

    Olfactory titrations using raw onions and eugenol as acid-base indicators are reported. An in-depth investigation of olfactory titrations is presented, including requirements for potential olfactory indicators, and protocols for using garlic, onions, and vanillin as acid-base olfactory indicators are tested.

  13. Luminol as a fluorescent acid-base indicator.

    PubMed

    Erdey, L; Buzás, I; Vigh, K

    1966-03-01

    The acid and base dissociation constants of luminol are determined at various ionic strengths. The transition interval occurs at pH 7.7-9.0; luminol is therefore a suitable fluorescent indicator for the titration of strong and weak acids with strong bases. Its value as an indicator is established by titrating milk, red wine and cherry juice.

  14. Proton defect solvation and dynamics in aqueous acid and base.

    PubMed

    Kale, Seyit; Herzfeld, Judith

    2012-10-29

    Easy come, easy go: LEWIS, a new model of reactive and polarizable water that enables the simulation of a statistically reliable number of proton hopping events in aqueous acid and base at concentrations of practical interest, is used to evaluate proton transfer intermediates in aqueous acid and base (picture, left and right, respectively).

  15. Connecting Acids and Bases with Encapsulation... and Chemistry with Nanotechnology

    ERIC Educational Resources Information Center

    Criswell, Brett

    2007-01-01

    The features and development of various new acid-base activity sets that combine chemistry with nanotechnology are described. These sets lead to the generation of many nanotechnology-based pharmaceuticals for the treatment of various diseases.

  16. Quaternionic representation of the genetic code.

    PubMed

    Carlevaro, C Manuel; Irastorza, Ramiro M; Vericat, Fernando

    2016-03-01

    A heuristic diagram of the evolution of the standard genetic code is presented. It incorporates, in a way that resembles the energy levels of an atom, the physical notion of broken symmetry, and it is consistent with Crick's original ideas on the origin and evolution of the code as well as with the chronological order of appearance of the amino acids along evolution, as inferred from work that mixes known experimental results with theoretical speculations. Suggested by the diagram, we propose a mathematical representation of the code as it stands nowadays, based on Hamilton quaternions. The central object in the description is a codon function that assigns to each amino acid an integer quaternion in such a way that the observed code degeneracy is preserved. We emphasize the advantages of a quaternionic representation of amino acids, taking as an example the folding of proteins. With this aim we propose an algorithm to go from the quaternion sequence to the protein's three-dimensional structure, which can be compared with the corresponding experimental structure stored in the Protein Data Bank. In our view, the mathematical representation of the genetic code in terms of quaternions merits attention because it not only describes most of the known properties of the genetic code but also opens new perspectives, mainly derived from the close relationship between quaternions and rotations.
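
    The authors' codon-to-quaternion assignment is not reproduced in this record, but the algebra it rests on, the Hamilton product, is easy to state. A minimal sketch (the (w, x, y, z) tuple convention is our own); note the non-commutativity that links quaternions to rotations:

```python
def qmul(p, q):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
print(qmul(i, j))  # -> (0, 0, 0, 1), i.e. i*j = k
print(qmul(j, i))  # -> (0, 0, 0, -1), i.e. j*i = -k
```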

  17. A Computer-Based Simulation of an Acid-Base Titration

    ERIC Educational Resources Information Center

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  18. Representation Method for Spectrally Overlapping Signals in Flow Cytometry Based on Fluorescence Pulse Time-Delay Estimation.

    PubMed

    Zhang, Wenchang; Lou, Xiaoping; Meng, Xiaochen; Zhu, Lianqing

    2016-11-23

    Flow cytometry is being applied more extensively because of the outstanding advantages of multicolor fluorescence analysis. However, the intensity measurement is susceptible to the nonlinearity of the detection method. Moreover, in multicolor analysis, it is impossible to discriminate between fluorophores that spectrally overlap; this influences the accuracy of the fluorescence pulse signal representation. Here, we focus on spectral overlap in two-color analysis, and assume that the fluorescence follows the single exponential decay model. We overcome these problems by analyzing the influence of the spectral overlap quantitatively, which enables us to propose a method of fluorescence pulse signal representation based on time-delay estimation (between fluorescence and scattered pulse signals). First, the time delays are estimated using a modified chirp Z-transform (MCZT) algorithm and a fine interpolation of the correlation peak (FICP) algorithm. Second, the influence of hardware is removed via calibration, in order to acquire the original fluorescence lifetimes. Finally, modulated signals containing phase shifts associated with these lifetimes are created artificially, using a digital signal processing method, and reference signals are introduced in order to eliminate the influence of spectral overlap. Time-delay estimation simulation and fluorescence signal representation experiments are conducted on fluorescently labeled cells. Taking the potential overlap of autofluorescence as part of the observed fluorescence spectrum, rather than distinguishing its individual influence, the results show that the calculated lifetimes with spectral overlap can be rectified from 8.28 and 4.86 ns to 8.51 and 4.63 ns, respectively, using the comprehensive approach presented in this work. These values agree well with the lifetimes (8.48 and 4.67 ns) acquired for cells stained with single-color fluorochrome. Further, these results indicate that the influence of spectral

  19. Representation Method for Spectrally Overlapping Signals in Flow Cytometry Based on Fluorescence Pulse Time-Delay Estimation

    PubMed Central

    Zhang, Wenchang; Lou, Xiaoping; Meng, Xiaochen; Zhu, Lianqing

    2016-01-01

    Flow cytometry is being applied more extensively because of the outstanding advantages of multicolor fluorescence analysis. However, the intensity measurement is susceptible to the nonlinearity of the detection method. Moreover, in multicolor analysis, it is impossible to discriminate between fluorophores that spectrally overlap; this influences the accuracy of the fluorescence pulse signal representation. Here, we focus on spectral overlap in two-color analysis, and assume that the fluorescence follows the single exponential decay model. We overcome these problems by analyzing the influence of the spectral overlap quantitatively, which enables us to propose a method of fluorescence pulse signal representation based on time-delay estimation (between fluorescence and scattered pulse signals). First, the time delays are estimated using a modified chirp Z-transform (MCZT) algorithm and a fine interpolation of the correlation peak (FICP) algorithm. Second, the influence of hardware is removed via calibration, in order to acquire the original fluorescence lifetimes. Finally, modulated signals containing phase shifts associated with these lifetimes are created artificially, using a digital signal processing method, and reference signals are introduced in order to eliminate the influence of spectral overlap. Time-delay estimation simulation and fluorescence signal representation experiments are conducted on fluorescently labeled cells. Taking the potential overlap of autofluorescence as part of the observed fluorescence spectrum, rather than distinguishing its individual influence, the results show that the calculated lifetimes with spectral overlap can be rectified from 8.28 and 4.86 ns to 8.51 and 4.63 ns, respectively, using the comprehensive approach presented in this work. These values agree well with the lifetimes (8.48 and 4.67 ns) acquired for cells stained with single-color fluorochrome. Further, these results indicate that the influence of spectral
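
    The MCZT and FICP algorithms are not reproduced in these records; as a generic stand-in, a sub-sample time delay can be estimated from the cross-correlation peak with a three-point parabolic refinement. Signal shapes and names below are illustrative:

```python
import numpy as np

def subsample_delay(a, b):
    """Estimate the delay of b relative to a via the cross-correlation
    peak, refined to sub-sample precision with a three-point parabolic
    fit (a simple stand-in for the paper's MCZT + FICP pipeline)."""
    corr = np.correlate(b, a, mode="full")
    k = int(np.argmax(corr))
    y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
    frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)  # parabola vertex offset
    return (k - (len(a) - 1)) + frac

t = np.arange(400)
pulse = np.exp(-((t - 100.0) ** 2) / 200.0)    # scattered-light pulse
shifted = np.exp(-((t - 112.0) ** 2) / 200.0)  # fluorescence pulse, delayed 12 samples
print(round(subsample_delay(pulse, shifted), 2))  # -> 12.0
```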

  20. Ammonia Transporters and Their Role in Acid-Base Balance.

    PubMed

    Weiner, I David; Verlander, Jill W

    2017-04-01

    Acid-base homeostasis is critical to maintenance of normal health. Renal ammonia excretion is the quantitatively predominant component of renal net acid excretion, both under basal conditions and in response to acid-base disturbances. Although titratable acid excretion also contributes to renal net acid excretion, the quantitative contribution of titratable acid excretion is less than that of ammonia under basal conditions and is only a minor component of the adaptive response to acid-base disturbances. In contrast to other urinary solutes, ammonia is produced in the kidney and then is selectively transported either into the urine or the renal vein. The proportion of ammonia that the kidney produces that is excreted in the urine varies dramatically in response to physiological stimuli, and only urinary ammonia excretion contributes to acid-base homeostasis. As a result, selective and regulated renal ammonia transport by renal epithelial cells is central to acid-base homeostasis. Both molecular forms of ammonia, NH3 and NH4(+), are transported by specific proteins, and regulation of these transport processes determines the eventual fate of the ammonia produced. In this review, we discuss these issues, and then discuss in detail the specific proteins involved in renal epithelial cell ammonia transport.

  1. Discovery Learning, Representation, and Explanation within a Computer-Based Simulation: Finding the Right Mix

    ERIC Educational Resources Information Center

    Rieber, Lloyd P.; Tzeng, Shyh-Chii; Tribble, Kelly

    2004-01-01

    The purpose of this research was to explore how adult users interact and learn during an interactive computer-based simulation supplemented with brief multimedia explanations of the content. A total of 52 college students interacted with a computer-based simulation of Newton's laws of motion in which they had control over the motion of a simple…

  2. Caregiving Antecedents of Secure Base Script Knowledge: A Comparative Analysis of Young Adult Attachment Representations

    ERIC Educational Resources Information Center

    Steele, Ryan D.; Waters, Theodore E. A.; Bost, Kelly K.; Vaughn, Brian E.; Truitt, Warren; Waters, Harriet S.; Booth-LaForce, Cathryn; Roisman, Glenn I.

    2014-01-01

    Based on a subsample (N = 673) of the NICHD Study of Early Child Care and Youth Development (SECCYD) cohort, this article reports data from a follow-up assessment at age 18 years on the antecedents of "secure base script knowledge", as reflected in the ability to generate narratives in which attachment-related difficulties are…

  3. Nucleic acid duplexes incorporating a dissociable covalent base pair

    NASA Technical Reports Server (NTRS)

    Gao, K.; Orgel, L. E.; Bada, J. L. (Principal Investigator)

    1999-01-01

    We have used molecular modeling techniques to design a dissociable covalently bonded base pair that can replace a Watson-Crick base pair in a nucleic acid with minimal distortion of the structure of the double helix. We introduced this base pair into a potential precursor of a nucleic acid double helix by chemical synthesis and have demonstrated efficient nonenzymatic template-directed ligation of the free hydroxyl groups of the base pair with appropriate short oligonucleotides. The nonenzymatic ligation reactions, which are characteristic of base paired nucleic acid structures, are abolished when the covalent base pair is reduced and becomes noncoplanar. This suggests that the covalent base pair linking the two strands in the duplex is compatible with a minimally distorted nucleic acid double-helical structure.

  4. Nucleic Acid Duplexes Incorporating a Dissociable Covalent Base Pair

    NASA Astrophysics Data System (ADS)

    Gao, Kui; Orgel, Leslie E.

    1999-12-01

    We have used molecular modeling techniques to design a dissociable covalently bonded base pair that can replace a Watson-Crick base pair in a nucleic acid with minimal distortion of the structure of the double helix. We introduced this base pair into a potential precursor of a nucleic acid double helix by chemical synthesis and have demonstrated efficient nonenzymatic template-directed ligation of the free hydroxyl groups of the base pair with appropriate short oligonucleotides. The nonenzymatic ligation reactions, which are characteristic of base paired nucleic acid structures, are abolished when the covalent base pair is reduced and becomes noncoplanar. This suggests that the covalent base pair linking the two strands in the duplex is compatible with a minimally distorted nucleic acid double-helical structure.

  5. Characterizing and differentiating task-based and resting state fMRI signals via two-stage sparse representations.

    PubMed

    Zhang, Shu; Li, Xiang; Lv, Jinglei; Jiang, Xi; Guo, Lei; Liu, Tianming

    2016-03-01

    A relatively underexplored question in fMRI is whether there are intrinsic differences in signal composition patterns that can effectively characterize and differentiate task-based and resting state fMRI (tfMRI or rsfMRI) signals. In this paper, we propose a novel two-stage sparse representation framework to examine the fundamental difference between tfMRI and rsfMRI signals. Specifically, in the first stage, the whole-brain tfMRI or rsfMRI signals of each subject were assembled into a large data matrix, which was then factorized into a subject-specific dictionary matrix and a weight coefficient matrix for sparse representation. In the second stage, all of the dictionary matrices from both tfMRI and rsfMRI data across multiple subjects were assembled into another large data matrix, which was further sparsely represented by a cross-subject common dictionary and a weight matrix. This framework has been applied to the recently publicly released Human Connectome Project (HCP) fMRI data, and experimental results revealed that there are distinctive and descriptive atoms in the cross-subject common dictionary that can effectively characterize and differentiate tfMRI and rsfMRI signals, achieving 100% classification accuracy. Moreover, our methods and results can be meaningfully interpreted; e.g., the well-known default mode network (DMN) activities can be recovered from the very noisy and heterogeneous aggregated data of tfMRI and rsfMRI signals across all subjects in the HCP Q1 release.
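
    The paper's exact optimization is not given in the abstract; the two-stage idea can be sketched with greedy sparse coding and a method-of-optimal-directions dictionary update standing in for the actual learning algorithm (all sizes and names below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_code(X, D, n_nonzero=2):
    """Greedy sparse coding: for each signal (column of X), keep only the
    n_nonzero largest-magnitude correlations with the dictionary atoms."""
    W = D.T @ X                                   # atom/signal correlations
    keep = np.argsort(-np.abs(W), axis=0)[:n_nonzero]
    mask = np.zeros_like(W, dtype=bool)
    np.put_along_axis(mask, keep, True, axis=0)
    return np.where(mask, W, 0.0)

def learn_dictionary(X, n_atoms, n_iter=30):
    """Alternate sparse coding with a least-squares dictionary update
    (method of optimal directions), a simple stand-in for K-SVD-style
    factorization X ~ D @ W."""
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        W = sparse_code(X, D)
        D = X @ np.linalg.pinv(W)                 # least-squares update
        D /= np.linalg.norm(D, axis=0) + 1e-12    # renormalize atoms
    return D

# Stage 1: one dictionary per subject from that subject's signal matrix.
subj_dicts = [learn_dictionary(rng.standard_normal((50, 200)), 10)
              for _ in range(3)]
# Stage 2: stack the per-subject dictionaries and factorize them again
# into a cross-subject common dictionary.
common = learn_dictionary(np.hstack(subj_dicts), 8)
print(common.shape)  # -> (50, 8)
```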

  6. FPGA implementation of a modified FitzHugh-Nagumo neuron based causal neural network for compact internal representation of dynamic environments

    NASA Astrophysics Data System (ADS)

    Salas-Paracuellos, L.; Alba, Luis; Villacorta-Atienza, Jose A.; Makarov, Valeri A.

    2011-05-01

    To survive, animals have developed cognitive abilities that allow them an abstract representation of the environment. This internal representation (IR) may contain a huge amount of information concerning the evolution and interactions of the animal and its surroundings. Temporal information is needed for IRs of dynamic environments and is one of the most subtle points in their implementation, as the information needed to generate the IR may eventually increase dramatically. Some recent studies have proposed compacting the spatiotemporal information into space alone, leading to a stable structure suitable as the basis for complex cognitive processes, in what has been called the Compact Internal Representation (CIR). The Compact Internal Representation is especially suited to implementation in autonomous robots, as it provides global strategies for interaction with real environments. This paper describes an FPGA implementation of a causal neural network based on a modified FitzHugh-Nagumo neuron to generate a Compact Internal Representation of dynamic environments for roving robots, developed within the framework of the SPARK and SPARK II European projects, to avoid dynamic and static obstacles.

  7. Acid-base properties of 2-phenethyldithiocarbamoylacetic acid, an antitumor agent

    NASA Astrophysics Data System (ADS)

    Novozhilova, N. E.; Kutina, N. N.; Petukhova, O. A.; Kharitonov, Yu. Ya.

    2013-07-01

    The acid-base properties of 2-phenethyldithiocarbamoylacetic acid (PET), a substance belonging to the class of isothiocyanates and capable of inhibiting the development of tumors in many experimental models, were studied. The acidity and hydrolysis constants of PET in ethanol, acetone, aqueous ethanol, and aqueous acetone solutions were determined from potentiometric (pH-metric) titration of ethanol and acetone solutions of PET with aqueous sodium hydroxide at room temperature.

  8. Sparse representation of multi parametric DCE-MRI features using K-SVD for classifying gene expression based breast cancer recurrence risk

    NASA Astrophysics Data System (ADS)

    Mahrooghy, Majid; Ashraf, Ahmed B.; Daye, Dania; Mies, Carolyn; Rosen, Mark; Feldman, Michael; Kontos, Despina

    2014-03-01

    We evaluate the prognostic value of sparse representation-based features by applying the K-SVD algorithm to multiparametric kinetic, textural, and morphologic features in breast dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). K-SVD is an iterative dimensionality reduction method that optimally reduces the initial feature space by updating the dictionary columns jointly with the sparse representation coefficients. Therefore, by using K-SVD, we not only provide a sparse representation of the features and condense the information into a few coefficients, but we also reduce the dimensionality. The extracted K-SVD features are evaluated by a machine learning algorithm including a logistic regression classifier for the task of classifying high versus low breast cancer recurrence risk, as determined by a validated gene expression assay. The features are evaluated using ROC curve analysis and leave-one-out cross validation for different sparse representation and dimensionality reduction numbers. Optimal sparse representation is obtained when the number of dictionary elements is 4 (K=4) and the maximum number of non-zero coefficients is 2 (L=2). We compare K-SVD with ANOVA-based feature selection for the same prognostic features. The ROC results show that the AUCs of the K-SVD-based (K=4, L=2), the ANOVA-based, and the original features (i.e., no dimensionality reduction) are 0.78, 0.71, and 0.68, respectively. From the results, it can be inferred that by using a sparse representation of the originally extracted multi-parametric, high-dimensional data, we can condense the information into a few coefficients with the highest predictive value. In addition, the dimensionality reduction introduced by K-SVD can prevent models from over-fitting.

  9. Reactive Distillation for Esterification of Bio-based Organic Acids

    SciTech Connect

    Fields, Nathan; Miller, Dennis J.; Asthana, Navinchandra S.; Kolah, Aspi K.; Vu, Dung; Lira, Carl T.

    2008-09-23

    The following is the final report of the three-year research program to convert organic acids to their ethyl esters using reactive distillation. This report details the complete technical activities of research completed at Michigan State University for the period of October 1, 2003 to September 30, 2006, covering both reactive distillation research and development and the underlying thermodynamic and kinetic data required for successful and rigorous design of reactive distillation esterification processes. Specifically, this project has led to the development of economical, technically viable processes for ethyl lactate, triethyl citrate and diethyl succinate production, and on a larger scale has added to the overall body of knowledge on applying fermentation-based organic acids as platform chemicals in the emerging biorefinery. Organic acid esters constitute an attractive class of biorenewable chemicals that are made from corn or other renewable biomass carbohydrate feedstocks and replace analogous petroleum-based compounds, thus lessening U.S. dependence on foreign petroleum and enhancing overall biorefinery viability through production of value-added chemicals in parallel with biofuels production. Further, many of these ester products are candidates for fuel (particularly biodiesel) components, and thus will serve dual roles as both industrial chemicals and fuel enhancers in the emerging bioeconomy. The technical report from MSU is organized around the ethyl esters of four important biorenewables-based acids: lactic acid, citric acid, succinic acid, and propionic acid. Literature background on esterification and reactive distillation has been provided in Section One. Work on lactic acid is covered in Sections Two through Five, citric acid esterification in Sections Six and Seven, succinic acid in Section Eight, and propionic acid in Section Nine. Section Ten covers modeling of ester and organic acid vapor pressure properties using the SPEAD (Step Potential

  10. Process Flow Features as a Host-Based Event Knowledge Representation

    DTIC Science & Technology

    2012-06-14

    …measuring deviations from the learned baseline. This exploratory research describes a novel host feature generation process that calculates statistics of…

  11. Memory-Based Decision-Making with Heuristics: Evidence for a Controlled Activation of Memory Representations

    ERIC Educational Resources Information Center

    Khader, Patrick H.; Pachur, Thorsten; Meier, Stefanie; Bien, Siegfried; Jost, Kerstin; Rosler, Frank

    2011-01-01

    Many of our daily decisions are memory based, that is, the attribute information about the decision alternatives has to be recalled. Behavioral studies suggest that for such decisions we often use simple strategies (heuristics) that rely on controlled and limited information search. It is assumed that these heuristics simplify decision-making by…

  12. Computer-Based Learning: Interleaving Whole and Sectional Representation of Neuroanatomy

    ERIC Educational Resources Information Center

    Pani, John R.; Chariker, Julia H.; Naaz, Farah

    2013-01-01

    The large volume of material to be learned in biomedical disciplines requires optimizing the efficiency of instruction. In prior work with computer-based instruction of neuroanatomy, it was relatively efficient for learners to master whole anatomy and then transfer to learning sectional anatomy. It may, however, be more efficient to continuously…

  13. Tensor Based Representation and Analysis of Diffusion-Weighted Magnetic Resonance Images

    ERIC Educational Resources Information Center

    Barmpoutis, Angelos

    2009-01-01

    Cartesian tensor bases have been widely used to model spherical functions. In medical imaging, tensors of various orders can approximate the diffusivity function at each voxel of a diffusion-weighted MRI data set. This approximation produces tensor-valued datasets that contain information about the underlying local structure of the scanned tissue.…

  14. Polymerization of amino acids containing nucleotide bases

    NASA Technical Reports Server (NTRS)

    Ben Cheikh, Azzouz; Orgel, Leslie E.

    1990-01-01

    The nucleoamino acids 1-(3'-amino,3'-carboxypropyl)uracil (3) and 9-(3'-amino,3'-carboxypropyl)adenine (4) have been prepared as (L)-enantiomers and as racemic mixtures. When 3 or 4 is suspended in water and treated with N,N'-carbonyldiimidazole, peptides are formed in good yield. The products formed from the (L)-enantiomers are hydrolyzed to the monomeric amino acids by pronase. Attempts to improve the efficiency of these oligomerizations by including a polyuridylate template in the reaction mixture were not successful. Similarly, oligomers derived from the (L)-enantiomer of 3 did not act as templates to facilitate the oligomerization of 4.

  15. Carbon-based strong solid acid for cornstarch hydrolysis

    SciTech Connect

    Nata, Iryanti Fatyasari; Irawan, Chairul; Mardina, Primata; Lee, Cheng-Kang

    2015-10-15

    Highly sulfonated carbonaceous spheres with diameters of 100–500 nm can be generated by hydrothermal carbonization of glucose in the presence of hydroxyethylsulfonic acid and acrylic acid at 180 °C for 4 h. The acidity of the prepared carbonaceous sphere C4-SO₃H can reach 2.10 mmol/g. It was used as a solid acid catalyst for the hydrolysis of cornstarch. A total reducing sugar (TRS) concentration of 19.91 mg/mL could be obtained by hydrolyzing 20 mg/mL cornstarch at 150 °C for 6 h using C4-SO₃H as the solid acid catalyst. The catalyst demonstrated good stability: only a 9% decrease in TRS concentration was observed after five repeated uses. The as-prepared carbon-based solid acid catalyst can be an environmentally benign replacement for homogeneous catalysts. - Highlights: • A carbon solid acid was successfully prepared by one-step hydrothermal carbonization. • Acrylic acid as a monomer effectively reduced the particle diameter. • The solid acid catalyst shows good catalytic performance in starch hydrolysis. • The solid acid catalyst does not significantly deteriorate after repeated use.

  16. Truncated feature representation for automatic target detection using transformed data-based decomposition

    NASA Astrophysics Data System (ADS)

    Riasati, Vahid R.

    2016-05-01

    In this work, the data covariance matrix is diagonalized to provide an orthogonal basis set using the eigenvectors of the data. The eigenvector decomposition of the data is transformed and filtered in the transform domain to truncate the data to robust features related to a specified set of targets. These truncated eigenfeatures are then combined and reconstructed for use in a composite filter, and consequently utilized for the automatic target detection of the same class of targets. The results of testing the current technique are evaluated using the peak-correlation and peak-correlation-energy metrics and are presented in this work. The inverse-transformed eigenbases of the current technique may be thought of as an injected sparsity that minimizes the data representing the skeletal data-structure information associated with the set of targets under consideration.
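
    The diagonalization-and-truncation step described above can be sketched in a few lines (a minimal illustration on synthetic data; the transform-domain filtering and composite-filter construction of the actual technique are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))        # synthetic data: 200 samples, 16 features

# Diagonalize the data covariance matrix to obtain an orthogonal eigenbasis.
C = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)  # ascending eigenvalues, orthonormal columns

# Truncate: keep only the k dominant eigenvectors as robust features.
k = 4
top = eigvecs[:, -k:]

# Project onto the truncated basis and reconstruct the "skeletal" data.
X_trunc = (X @ top) @ top.T

# Sanity checks: the basis is orthonormal, and projection never adds energy.
assert np.allclose(top.T @ top, np.eye(k))
assert np.linalg.norm(X_trunc) <= np.linalg.norm(X) + 1e-9
```

    The reconstruction `X_trunc` retains only the subspace spanned by the strongest eigenvectors, which is the sense in which the truncation injects sparsity.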

  17. Acid-base properties of adhesive dental polymers.

    PubMed

    Morra, M

    1993-11-01

    The surface energetics of three resins (polymethylmethacrylate, polyhydroxyethylmethacrylate, and Bis-GMA/triethyleneglycoldimethacrylate) commonly used in adhesive interactions with tooth hard tissues were evaluated according to the Fowkes acid-base theory of interfacial interactions. From the measurement of the contact angle of test acidic and basic liquids on the sample surfaces, the acid-base contribution to the work of adhesion was evaluated. Results show that polyhydroxyethylmethacrylate is a comparatively strong Lewis base, a finding that can explain the important role played by this material in the formulation of dentin adhesive.

  18. Differential titration of bases in glacial acetic acid.

    PubMed

    Castellano, T; Medwick, T; Shinkai, J H; Bailey, L

    1981-01-01

    A study of bases in acetic acid and their differential titration was carried out. The overall basicity constants for 20 bases were measured in acetic acid, and the differential titration of five binary mixtures of variable delta pKb values in acetic acid was followed using a glass electrode-modified calomel electrode system. Agreement with literature values was good. A leveling diagram was constructed that indicated that bases stronger than aqueous pKb 10 are leveled to an acetous pKb 5.69, whereas weaker bases are not leveled but instead exhibit their own intrinsic basicity, with the acetous pKb to aqueous pKb values being linearly related (slope 1.18, correlation coefficient 0.962). A minimum acetous delta pKb of four units is required for the satisfactory differential titration of two bases in acetic acid.

  19. The acid-base titration of montmorillonite

    NASA Astrophysics Data System (ADS)

    Bourg, I. C.; Sposito, G.; Bourg, A. C.

    2003-12-01

    Proton binding to clay minerals plays an important role in the chemical reactivity of soils (e.g., acidification, retention of nutrients or pollutants). It should also affect the performance of clay barriers for waste disposal. The surface acidity of clay minerals is commonly modelled empirically by assuming generic amphoteric surface sites (>SOH) on a flat surface, with fitted site densities and acidity constants. Current advances in experimental methods (notably spectroscopy) are rapidly improving our understanding of the structure and reactivity of the surface of clay minerals (arrangement of the particles, nature of the reactive surface sites, adsorption mechanisms). These developments are motivated by the difficulty of modelling the surface chemistry of mineral surfaces at the macro-scale (e.g., adsorption or titration) without a detailed (molecular-scale) picture of the mechanisms, and should be progressively incorporated into surface complexation models. With this in view, we have combined recent estimates of montmorillonite surface properties (surface site density and structure, edge surface area, surface electrostatic potential) with surface site acidities obtained from the titration of alpha-Al2O3 and SiO2, and a novel method of accounting for the unknown initial net proton surface charge of the solid. The model predictions were compared to experimental titrations of SWy-1 montmorillonite and purified MX-80 bentonite in 0.1-0.5 mol/L NaClO4 and 0.005-0.5 mol/L NaNO3 background electrolytes, respectively. Most of the experimental data were appropriately described by the model after we adjusted a single parameter (silanol sites on the surface of montmorillonite were made slightly more acidic than those of silica). At low ionic strength and acidic pH the model underestimated the buffering capacity of the montmorillonite, perhaps due to clay swelling or to the interlayer adsorption of dissolved aluminum. The agreement between our model and the experimental

  20. Nucleic acid based fluorescent sensor for mercury detection

    DOEpatents

    Lu, Yi; Liu, Juewen

    2013-02-05

    A nucleic acid enzyme comprises an oligonucleotide containing thymine bases. The nucleic acid enzyme is dependent on both Hg²⁺ and a second ion as cofactors to produce a product from a substrate. The substrate comprises a ribonucleotide, a deoxyribonucleotide, or both.

  1. Placement with Symmetry Constraints for Analog IC Layout Design Based on Tree Representation

    NASA Astrophysics Data System (ADS)

    Hirakawa, Natsumi; Fujiyoshi, Kunihiro

    Symmetry constraints require that the given cells be placed symmetrically in the design of analog ICs. We use an O-tree to represent placements and propose a decoding algorithm that can obtain one of the minimum placements satisfying the constraints. The decoding algorithm uses linear programming, which is too time-consuming. We therefore propose a graph-based method to recognize when no placement satisfies both the given symmetry and O-tree constraints, and apply this method before resorting to linear programming. The effectiveness of the proposed method was shown by computational experiments.

  2. Boronic acid-tethered amphiphilic hyaluronic acid derivative-based nanoassemblies for tumor targeting and penetration.

    PubMed

    Jeong, Jae Young; Hong, Eun-Hye; Lee, Song Yi; Lee, Jae-Young; Song, Jae-Hyoung; Ko, Seung-Hak; Shim, Jae-Seong; Choe, Sunghwa; Kim, Dae-Duk; Ko, Hyun-Jeong; Cho, Hyun-Jong

    2017-02-16

    (3-Aminomethylphenyl)boronic acid (AMPB)-installed hyaluronic acid-ceramide (HACE)-based nanoparticles (NPs), including manassantin B (MB), were fabricated for tumor-targeted delivery. The amine group of AMPB was conjugated to the carboxylic acid group of hyaluronic acid (HA) via amide bond formation, and synthesis was confirmed by spectroscopic methods. HACE-AMPB/MB NPs with a 239-nm mean diameter, narrow size distribution, negative zeta potential, and >90% drug encapsulation efficiency were fabricated. Exposed AMPB on the outer surface of HACE-AMPB NPs (in the aqueous environment) may react with sialic acid of cancer cells. The improved cellular accumulation efficiency, in vitro antitumor efficacy, and tumor penetration efficiency of HACE-AMPB/MB NPs, compared with HACE/MB NPs, in MDA-MB-231 cells (CD44 receptor-positive human breast adenocarcinoma cells) may be based on CD44 receptor-mediated endocytosis and the phenylboronic acid-sialic acid interaction. Enhanced in vivo tumor targetability, infiltration efficiency, and antitumor efficacies of HACE-AMPB NPs, compared with HACE NPs, were observed in an MDA-MB-231 tumor-xenografted mouse model. In addition to passive tumor targeting (based on an enhanced permeability and retention effect) and active tumor targeting (interaction between HA and the CD44 receptor), the phenylboronic acid-sialic acid interaction can play important roles in the augmented tumor targeting and penetration of HACE-AMPB NPs. STATEMENT OF SIGNIFICANCE: (3-Aminomethylphenyl)boronic acid (AMPB)-tethered hyaluronic acid-ceramide (HACE)-based nanoparticles (NPs), including manassantin B (MB), were fabricated and their tumor targeting and penetration efficiencies were assessed in MDA-MB-231 (CD44 receptor-positive human adenocarcinoma) tumor models. MB, which exhibited antitumor efficacies via the inhibition of angiogenesis and hypoxia-inducible factor (HIF)-1, was entrapped in HACE-AMPB NPs in this study. Phenylboronic acid located in the outer surface…

  3. Coupled circuit based representation of piezoelectric structures modeled using the finite volume method.

    PubMed

    Bolborici, V; Dawson, F P

    2016-03-01

    This paper presents a methodology for generating a corresponding electrical circuit for a simple piezoelectric plate modeled with the finite volume method. The corresponding circuit is implemented in circuit simulation software, and the simulation results are compared to the finite volume modeling results for validation. Both the finite volume model and its corresponding circuit generate identical results. The results of a corresponding circuit based on the finite volume model are also compared to the results of a corresponding circuit based on a simplified analytical model for a long piezoelectric plate, and to finite element simulation results for the same plate. For one control volume, the finite volume model's corresponding circuit and the simplified analytical model's corresponding circuit generate close results. The results of the two corresponding circuits differ from the best-approximation results obtained with high-resolution finite element simulations, due to the approximations made in the simplified analytical model and the fact that only one finite volume was used in the finite volume model. The implementation of the circuit can be automated for higher-order systems by a program that takes as input the matrix of the system and the forcing-function vector, and returns a netlist for the circuit.

  4. a Topic Modeling Based Representation to Detect Tweet Locations. Example of the Event "je Suis Charlie"

    NASA Astrophysics Data System (ADS)

    Morchid, M.; Josselin, D.; Portilla, Y.; Dufour, R.; Altman, E.; Linarès, G.

    2015-09-01

    Social networks became a major actor in information propagation. Using the popular Twitter platform, mobile users post or relay messages from different locations. The tweet content, meaning, and location show how an event, such as the bursty "JeSuisCharlie" that happened in France in January 2015, is comprehended in different countries. This research aims at clustering the tweets according to the co-occurrence of their terms, including the country, and forecasting the probable country of a non-located tweet, knowing its content. First, we present the process of collecting a large quantity of data from the Twitter website. We finally have a set of 2,189 located tweets about "Charlie", from the 7th to the 14th of January. We describe an original method adapted from the Author-Topic (AT) model based on the Latent Dirichlet Allocation (LDA) method. We define a homogeneous space containing both lexical content (words) and spatial information (country). During a training process on a part of the sample, we provide a set of clusters (topics) based on statistical relations between lexical and spatial terms. During a clustering task, we evaluate the method's effectiveness on the rest of the sample, reaching up to 95% correct assignment. This shows that our model is pertinent for forecasting a tweet's location after a learning process.
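
    The core idea of a homogeneous lexical+spatial space can be sketched with a standard LDA implementation (the tweets and country labels below are invented, and scikit-learn's LatentDirichletAllocation stands in for the paper's Author-Topic variant):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy tweets with the country appended as an extra token, so lexical and
# spatial terms live in one vocabulary (corpus and labels are invented).
tweets = [
    "je suis charlie solidarite paris COUNTRY_fr",
    "charlie hebdo attack shocking news COUNTRY_uk",
    "nous sommes charlie liberte presse COUNTRY_fr",
    "free speech support charlie hebdo COUNTRY_uk",
]
vec = CountVectorizer(token_pattern=r"[^ ]+")
X = vec.fit_transform(tweets)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

def guess_country(text, candidates=("COUNTRY_fr", "COUNTRY_uk")):
    """Score each candidate country token by appending it to the unlocated
    tweet and comparing the model's approximate log-likelihood."""
    scores = {c: lda.score(vec.transform([text + " " + c])) for c in candidates}
    return max(scores, key=scores.get)

print(guess_country("liberte charlie paris"))
```

    On a realistic corpus the training/evaluation split and the topic count would of course need tuning; this only illustrates how a country becomes just another term in the topic space.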

  5. Local-Based Semantic Navigation on a Networked Representation of Information

    PubMed Central

    Capitán, José A.; Borge-Holthoefer, Javier; Gómez, Sergio; Martinez-Romo, Juan; Araujo, Lourdes; Cuesta, José A.; Arenas, Alex

    2012-01-01

    The size and complexity of actual networked systems hinders access to a global knowledge of their structure. This fact pushes the problem of navigation toward suboptimal solutions, one of them being the extraction of a coherent map of the topology on which navigation takes place. In this paper, we present a Markov chain based algorithm to tag networked terms according only to their topological features. The resulting tagging is used to compute similarity between terms, providing a map of the networked information. This map supports local-based navigation techniques driven by similarity. We compare the efficiency of the resulting paths according to their length relative to that of the shortest path. Additionally we claim that the path steps towards the destination are semantically coherent. To illustrate the algorithm's performance we provide some results from the Simple English Wikipedia, which amounts to several thousand pages. The simplest greedy strategy yields over an 80% average success rate. Furthermore, the resulting content-coherent paths most often have a cost between one and three times the shortest-path length. PMID:22937081
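
    The greedy, similarity-driven navigation can be sketched on a toy graph (the graph and the Jaccard neighborhood similarity below are invented stand-ins for the paper's Wikipedia network and Markov-chain-derived similarity):

```python
# Toy term graph: each node knows only its direct neighbors (local knowledge).
graph = {
    "cat": {"animal", "pet"},
    "animal": {"cat", "dog", "biology"},
    "pet": {"cat", "dog"},
    "dog": {"animal", "pet", "wolf"},
    "wolf": {"dog", "biology"},
    "biology": {"animal", "wolf", "science"},
    "science": {"biology"},
}

def similarity(a, b):
    """Jaccard overlap of closed neighborhoods, a simple stand-in for a
    topology-derived similarity between terms."""
    na, nb = graph[a] | {a}, graph[b] | {b}
    return len(na & nb) / len(na | nb)

def greedy_path(start, target, max_steps=10):
    """Hop to the neighbor most similar to the target until it is reached."""
    path, current = [start], start
    for _ in range(max_steps):
        if current == target:
            return path
        nxt = max(graph[current], key=lambda n: similarity(n, target))
        if nxt in path:          # revisiting a node means we are stuck
            return None
        path.append(nxt)
        current = nxt
    return None

print(greedy_path("cat", "science"))  # → ['cat', 'animal', 'biology', 'science']
```

    Each hop uses only the current node's neighborhood, which is what makes the strategy "local-based"; success then depends on how faithfully the similarity map reflects the global topology.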

  6. Acid-base titration curves for acids with very small ratios of successive dissociation constants.

    PubMed

    Campbell, B H; Meites, L

    1974-02-01

    The shapes of the potentiometric acid-base titration curves obtained in the neutralizations of polyfunctional acids or bases for which each successive dissociation constant is smaller than the following one are examined. In the region 0 < f < 1 (where f is the fraction of the equivalent volume of reagent that has been added) the slope of the titration curve decreases as the number j of acidic or basic sites increases. The difference between the pH-values at f = 0.75 and f = 0.25 has (1/j) log 9 as the lower limit of its maximum value.
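
    Where the (1/j) log 9 limit comes from can be sketched directly (assuming the extreme cooperative, all-or-none case in which all j protons dissociate together; intermediate cases lie between this and the monoprotic value):

```latex
% Monoprotic case (j = 1): Henderson--Hasselbalch form of the curve
\mathrm{pH} = \mathrm{p}K_a + \log\frac{f}{1-f},
\qquad
\mathrm{pH}_{f=0.75} - \mathrm{pH}_{f=0.25}
  = \log\frac{0.75/0.25}{0.25/0.75} = \log 9 .

% All-or-none limit: H_jA \rightleftharpoons A^{j-} + j\,H^+ with one constant K
K = \frac{[\mathrm{A}^{j-}]\,[\mathrm{H}^+]^{\,j}}{[\mathrm{H}_j\mathrm{A}]}
\;\Rightarrow\;
\mathrm{pH} = \frac{1}{j}\Bigl(\mathrm{p}K + \log\frac{f}{1-f}\Bigr),
\qquad
\Delta\mathrm{pH} = \frac{1}{j}\log 9 .
```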

  7. Renal acidification responses to respiratory acid-base disorders.

    PubMed

    Madias, Nicolaos E

    2010-01-01

    Respiratory acid-base disorders are those abnormalities in acid-base equilibrium that are expressed as primary changes in the arterial carbon dioxide tension (PaCO2). An increase in PaCO2 (hypercapnia) acidifies body fluids and initiates the acid-base disturbance known as respiratory acidosis. By contrast, a decrease in PaCO2 (hypocapnia) alkalinizes body fluids and initiates the acid-base disturbance known as respiratory alkalosis. The impact on systemic acidity of these primary changes in PaCO2 is ameliorated by secondary, directional changes in plasma [HCO₃⁻] that occur in two stages. Acutely, hypercapnia or hypocapnia yields relatively small changes in plasma [HCO₃⁻] that originate virtually exclusively from titration of the body's nonbicarbonate buffers. During sustained hypercapnia or hypocapnia, much larger changes in plasma [HCO₃⁻] occur that reflect adjustments in renal acidification mechanisms. Consequently, the deviation of systemic acidity from normal is smaller in the chronic forms of these disorders. Here we provide an overview of the renal acidification responses to respiratory acid-base disorders. We also identify gaps in knowledge that require further research.

  8. Status of the phenomena representation, 3D modeling, and cloud-based software architecture development

    SciTech Connect

    Smith, Curtis L.; Prescott, Steven; Kvarfordt, Kellie; Sampath, Ram; Larson, Katie

    2015-09-01

    Early in 2013, researchers at the Idaho National Laboratory outlined a technical framework to support the implementation of state-of-the-art probabilistic risk assessment to predict the safety performance of advanced small modular reactors. From that vision of the advanced framework for risk analysis, specific tasks have been underway in order to implement the framework. This report discusses the current development of several tasks related to the framework implementation, including a 3D physics engine that represents the motion of objects (including collision and debris modeling), cloud-based analysis tools such as a Bayesian-inference engine, and scenario simulations. These tasks were performed during 2015 as part of the technical work associated with the Advanced Reactor Technologies Program.

  9. Hybrid Feature Extraction-based Approach for Facial Parts Representation and Recognition

    NASA Astrophysics Data System (ADS)

    Rouabhia, C.; Tebbikh, H.

    2008-06-01

    Face recognition is a specialized image-processing task which has attracted considerable attention in computer vision. In this article, we develop a new facial recognition system from video sequence images, dedicated to the identification of persons whose faces are partly occluded. This system is based on a hybrid image feature extraction technique called ACPDL2D (Rouabhia et al. 2007), which combines two-dimensional principal component analysis and two-dimensional linear discriminant analysis with a neural network. We performed the feature extraction task on the eyes and the nose images separately, then a Multi-Layer Perceptron classifier is used. Compared to the whole face, the simulation results are in favor of the facial parts in terms of memory capacity and recognition rate (99.41% for the eyes part, 98.16% for the nose part and 97.25% for the whole face).

  10. Knowledge-based probabilistic representations of branching ratios in chemical networks: The case of dissociative recombinations

    SciTech Connect

    Plessis, Sylvain; Carrasco, Nathalie; Pernot, Pascal

    2010-10-07

    Experimental data about branching ratios for the products of dissociative recombination of polyatomic ions are presently the only information source available to modelers of natural or laboratory chemical plasmas. Yet, because of limitations in the measurement techniques, data for many ions are incomplete. In particular, the repartition of hydrogen atoms among the fragments of hydrocarbon ions is often not available. A consequence is that proper implementation of dissociative recombination processes in chemical models is difficult, and many models ignore invaluable data. We propose a novel probabilistic approach based on Dirichlet-type distributions, enabling modelers to fully account for the available information. As an application, we consider the production rate of radicals through dissociative recombination in an ionospheric chemistry model of Titan, the largest moon of Saturn. We show how the complete scheme of dissociative recombination products derived with our method dramatically affects these rates in comparison with the simplistic H-loss mechanism implemented by default in all recent models.
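
    The essential mechanics of the Dirichlet representation can be sketched as follows (the channels, pseudo-counts, and rate weights are invented; the paper's nested Dirichlet construction for partially known hydrogen repartition is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(1)

# Three product channels whose measured branching ratios are uncertain;
# the Dirichlet pseudo-counts encode the partial knowledge available.
alpha = np.array([4.0, 2.0, 1.0])

samples = rng.dirichlet(alpha, size=10000)

# Every sample is a valid branching-ratio vector: nonnegative, sums to 1,
# so the uncertainty never leaves the physical simplex.
assert np.allclose(samples.sum(axis=1), 1.0)

# Propagate the uncertainty to a derived quantity, e.g. a radical
# production rate that weights channel 0 and channel 2 differently.
rate = 2.0 * samples[:, 0] + 0.5 * samples[:, 2]
print(rate.mean(), rate.std())
```

    Because each draw is a complete branching scheme, downstream rates carry a full distribution rather than a single point estimate, which is the advantage over the default H-loss shortcut.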

  11. Trait-Based Representation of Biological Nitrification: Model Development, Testing, and Predicted Community Composition

    PubMed Central

    Bouskill, Nicholas J.; Tang, Jinyun; Riley, William J.; Brodie, Eoin L.

    2012-01-01

    Trait-based microbial models show clear promise as tools to represent the diversity and activity of microorganisms across ecosystem gradients. These models parameterize specific traits that determine the relative fitness of an “organism” in a given environment, and represent the complexity of biological systems across temporal and spatial scales. In this study we introduce a microbial community trait-based modeling framework (MicroTrait) focused on nitrification (MicroTrait-N) that represents the ammonia-oxidizing bacteria (AOB) and ammonia-oxidizing archaea (AOA) and nitrite-oxidizing bacteria (NOB) using traits related to enzyme kinetics and physiological properties. We used this model to predict nitrifier diversity, ammonia (NH3) oxidation rates, and nitrous oxide (N2O) production across pH, temperature, and substrate gradients. Predicted nitrifier diversity was predominantly determined by temperature and substrate availability, the latter was strongly influenced by pH. The model predicted that transient N2O production rates are maximized by a decoupling of the AOB and NOB communities, resulting in an accumulation and detoxification of nitrite to N2O by AOB. However, cumulative N2O production (over 6 month simulations) is maximized in a system where the relationship between AOB and NOB is maintained. When the reactions uncouple, the AOB become unstable and biomass declines rapidly, resulting in decreased NH3 oxidation and N2O production. We evaluated this model against site level chemical datasets from the interior of Alaska and accurately simulated NH3 oxidation rates and the relative ratio of AOA:AOB biomass. The predicted community structure and activity indicate (a) parameterization of a small number of traits may be sufficient to broadly characterize nitrifying community structure and (b) changing decadal trends in climate and edaphic conditions could impact nitrification rates in ways that are not captured by extant biogeochemical models.

  12. An Olfactory Indicator for Acid-Base Titrations.

    ERIC Educational Resources Information Center

    Flair, Mark N.; Setzer, William N.

    1990-01-01

    The use of an olfactory acid-base indicator in titrations for visually impaired students is discussed. Potential olfactory indicators include eugenol, thymol, vanillin, and thiophenol. Titrations performed with eugenol as the indicator proved to be successful. (KR)

  13. Strength of object representation: its key role in object-based attention for determining the competition result between Gestalt and top-down objects.

    PubMed

    Zhao, Jingjing; Wang, Yonghui; Liu, Donglai; Zhao, Liang; Liu, Peng

    2015-10-01

    It was found in previous studies that two types of objects (rectangles formed according to the Gestalt principle and Chinese words formed in a top-down fashion) can both induce an object-based effect. The aim of the present study was to investigate how the strength of an object representation affects the result of the competition between these two types of objects based on research carried out by Liu, Wang and Zhou [(2011) Acta Psychologica, 138(3), 397-404]. In Experiment 1, the rectangles were filled with two different colors to increase the strength of Gestalt object representation, and we found that the object effect changed significantly for the different stimulus types. Experiment 2 used Chinese words with various familiarities to manipulate the strength of the top-down object representation. As a result, the object-based effect induced by rectangles was observed only when the Chinese word familiarity was low. These results suggest that the strength of object representation determines the result of competition between different types of objects.

  14. Towards lactic acid bacteria-based biorefineries.

    PubMed

    Mazzoli, Roberto; Bosco, Francesca; Mizrahi, Itzhak; Bayer, Edward A; Pessione, Enrica

    2014-11-15

    Lactic acid bacteria (LAB) have long been used in industrial applications, mainly as starters for food fermentation, as biocontrol agents, or as probiotics. However, LAB possess several characteristics that render them among the most promising candidates for use in future biorefineries, converting plant-derived biomass (either from dedicated crops or from municipal/industrial solid wastes) into biofuels and high value-added products. Lactic acid, their main fermentation product, is an attractive building block extensively used by the chemical industry, owing to the potential for production of polylactides as a biodegradable and biocompatible plastic alternative to polymers derived from petrochemicals. Lactic acid is but one of many high-value compounds that can be produced by LAB fermentation; others include biofuels such as ethanol and butanol, biodegradable plastic polymers, exopolysaccharides, antimicrobial agents, health-promoting substances, and nutraceuticals. Furthermore, several LAB strains have well-established probiotic properties, and their biomass can be considered a high-value product. The present contribution aims to provide an extensive overview of the main industrial applications of LAB and future perspectives concerning their utilization in biorefineries. Strategies will be described in detail for developing LAB strains with broader substrate metabolic capacity for fermentation of cheaper biomass.

  15. Synthesis and antimicrobial activities of new higher amino acid Schiff base derivatives of 6-aminopenicillanic acid and 7-aminocephalosporanic acid

    NASA Astrophysics Data System (ADS)

    Özdemir (nee Güngör), Özlem; Gürkan, Perihan; Özçelik, Berrin; Oyardı, Özlem

    2016-02-01

    Novel β-lactam derivatives (1c-3c) and (1d-3d) were produced by using 6-aminopenicillanic acid (6-APA), 7-aminocephalosporanic acid (7-ACA) and the higher amino acid Schiff bases. The synthesized compounds were characterized by elemental analysis, IR, 1H/13C NMR and UV-vis spectra. Antibacterial activities of all the higher amino acid Schiff bases (1a-3a) (1b-3b) and β-lactam derivatives were screened against three gram-negative bacteria (Escherichia coli ATCC 25922, Pseudomonas aeruginosa ATCC 27853, Acinetobacter baumannii RSKK 02026), three gram-positive bacteria (Staphylococcus aureus ATCC 25923, Enterococcus faecalis ATCC 07005, Bacillus subtilis ATCC 6633) and their drug-resistant isolates by using the broth microdilution method. Two fungi (Candida albicans and Candida krusei) were used for antifungal activity testing.

  16. From maternal sensitivity in infancy to adult attachment representations: a longitudinal adoption study with secure base scripts.

    PubMed

    Schoenmaker, Christie; Juffer, Femmie; van IJzendoorn, Marinus H; Linting, Mariëlle; van der Voort, Anja; Bakermans-Kranenburg, Marian J

    2015-01-01

    We examined whether differences in adult attachment representations could be predicted from early and later maternal sensitivity, controlling for early and later assessments of attachment. In this longitudinal study on 190 adoptees, attachment at 23 years was measured with the Attachment Script Assessment. Maternal sensitivity was observed in infancy and at seven and 14 years. Attachment was also measured in infancy and at 14 years. Higher maternal sensitivity in infancy predicted more secure attachment in infancy and more secure attachment representations in young adulthood. Higher maternal sensitivity in middle childhood also predicted more secure attachment representations in young adulthood. There was no continuity of attachment from infancy to young adulthood, but attachment in adolescence and young adulthood were significantly related. Even in genetically unrelated families, maternal sensitivity in early and middle childhood predicts attachment representations in young adults, confirming the importance of sensitive parenting for human development.

  17. Using listener-based perceptual features as intermediate representations in music information retrieval.

    PubMed

    Friberg, Anders; Schoonderwaldt, Erwin; Hedblad, Anton; Fabiani, Marco; Elowsson, Anders

    2014-10-01

    The notion of perceptual features is introduced for describing general music properties based on human perception. This is an attempt at rethinking the concept of features, aiming to approach the underlying human perception mechanisms. Instead of using concepts from music theory such as tones, pitches, and chords, a set of nine features describing overall properties of the music was selected. They were chosen from qualitative measures used in psychology studies and motivated from an ecological approach. The perceptual features were rated in two listening experiments using two different data sets. They were modeled both from symbolic and audio data using different sets of computational features. Ratings of emotional expression were predicted using the perceptual features. The results indicate that (1) at least some of the perceptual features are reliable estimates; (2) emotion ratings could be predicted by a small combination of perceptual features with an explained variance from 75% to 93% for the emotional dimensions activity and valence; (3) the perceptual features could only to a limited extent be modeled using existing audio features. Results clearly indicated that a small number of dedicated features were superior to a "brute force" model using a large number of general audio features.
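
    The kind of result reported in point (2), emotion ratings predicted by a small set of perceptual features with high explained variance, can be sketched with ordinary least squares on synthetic data (the nine features, weights, and noise level below are invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in: 100 music excerpts, 9 listener-rated perceptual features.
n, p = 100, 9
F = rng.normal(size=(n, p))
true_w = np.array([1.5, -0.8, 0.6, 0, 0, 0, 0, 0, 0])   # only 3 features matter
valence = F @ true_w + 0.3 * rng.normal(size=n)          # ratings with noise

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(n), F])
coef, *_ = np.linalg.lstsq(A, valence, rcond=None)

# Explained variance (R^2), the figure of merit quoted in the abstract.
pred = A @ coef
r2 = 1 - np.sum((valence - pred) ** 2) / np.sum((valence - valence.mean()) ** 2)
print(round(r2, 3))
```

    With most of the variance carried by a few features, R² lands in the same high range the abstract reports, which is the point: a small combination of perceptual features can suffice.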

  18. Acid-base homeostasis in the human system

    NASA Technical Reports Server (NTRS)

    White, R. J.

    1974-01-01

Acid-base regulation is a cooperative phenomenon in vivo with body fluids, extracellular and intracellular buffers, lungs, and kidneys all playing important roles. The present account is much too brief to be considered a review of present knowledge of these regulatory systems, and should be viewed, instead, as a guide to the elements necessary to construct a simple model of the mutual interactions of the acid-base regulatory systems of the body.

  19. Structural damage detection in wind turbine blades based on time series representations of dynamic responses

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2015-03-01

The development of large wind turbines that harvest energy more efficiently is a consequence of the increasing worldwide demand for renewables. To optimize the potential energy output, light and flexible wind turbine blades (WTBs) are designed. However, the higher flexibility and lower buckling capacity adversely affect the long-term safety and reliability of WTBs, and the resulting increased operation and maintenance costs reduce the expected revenue. Effective structural health monitoring techniques can help to counteract this by limiting inspection efforts and avoiding unplanned maintenance actions. Vibration-based methods merit particular attention due to their moderate instrumentation effort and their applicability to in-service measurements. The present paper proposes the use of cross-correlations (CCs) of acceleration responses between sensors at different locations for structural damage detection in WTBs. CCs have previously been applied successfully for damage detection in numerical and experimental beam structures, utilizing only single lags between the signals. The present approach uses vectors of CC coefficients for multiple lags between measurements of two selected sensors, taken from multiple possible combinations of sensors. To reduce the dimensionality of the damage sensitive feature (DSF) vectors, principal component analysis is performed. The optimal number of principal components (PCs) is chosen with respect to a statistical threshold. Finally, the detection phase uses the selected PCs of the healthy structure to calculate scores from a current DSF vector, and statistical hypothesis testing is performed to decide on the current structural state. The method is applied to laboratory experiments conducted on a small WTB with non-destructive damage scenarios.
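The multi-lag CC feature plus PCA pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names are hypothetical, the PC count is passed in directly rather than chosen by the paper's statistical threshold, and the novelty score is a plain squared-score norm standing in for the formal hypothesis test.

```python
import numpy as np

def cc_feature_vector(x, y, max_lag=10):
    """Damage-sensitive feature (DSF): cross-correlation coefficients
    between two standardized sensor signals for lags -max_lag..max_lag."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    feats = []
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            feats.append(np.dot(x[:n - k], y[k:]) / (n - k))
        else:
            feats.append(np.dot(x[-k:], y[:n + k]) / (n + k))
    return np.array(feats)

def pca_scores(healthy_dsfs, n_pc):
    """Fit PCA on a (samples x features) matrix of DSF vectors from the
    healthy state; return the mean and the first n_pc principal directions."""
    mu = healthy_dsfs.mean(axis=0)
    _, _, vt = np.linalg.svd(healthy_dsfs - mu, full_matrices=False)
    return mu, vt[:n_pc]

def novelty_score(dsf, mu, pcs):
    """Squared norm of the PC scores of a current DSF vector; a large value
    relative to the healthy-state distribution suggests damage."""
    return float(np.sum((pcs @ (dsf - mu)) ** 2))
```

In use, `pca_scores` would be fit once on healthy-state measurements, and each new DSF vector's `novelty_score` compared against a threshold estimated from those same healthy data.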

  20. A review of selection-based tests of abiotic surrogates for species representation.

    PubMed

    Beier, Paul; Sutcliffe, Patricia; Hjort, Jan; Faith, Daniel P; Pressey, Robert L; Albuquerque, Fabio

    2015-06-01

    Because conservation planners typically lack data on where species occur, environmental surrogates--including geophysical settings and climate types--have been used to prioritize sites within a planning area. We reviewed 622 evaluations of the effectiveness of abiotic surrogates in representing species in 19 study areas. Sites selected using abiotic surrogates represented more species than an equal number of randomly selected sites in 43% of tests (55% for plants) and on average improved on random selection of sites by about 8% (21% for plants). Environmental diversity (ED) (42% median improvement on random selection) and biotically informed clusters showed promising results and merit additional testing. We suggest 4 ways to improve performance of abiotic surrogates. First, analysts should consider a broad spectrum of candidate variables to define surrogates, including rarely used variables related to geographic separation, distance from coast, hydrology, and within-site abiotic diversity. Second, abiotic surrogates should be defined at fine thematic resolution. Third, sites (the landscape units prioritized within a planning area) should be small enough to ensure that surrogates reflect species' environments and to produce prioritizations that match the spatial resolution of conservation decisions. Fourth, if species inventories are available for some planning units, planners should define surrogates based on the abiotic variables that most influence species turnover in the planning area. Although species inventories increase the cost of using abiotic surrogates, a modest number of inventories could provide the data needed to select variables and evaluate surrogates. Additional tests of nonclimate abiotic surrogates are needed to evaluate the utility of conserving nature's stage as a strategy for conservation planning in the face of climate change.

  1. Introduction of an agent-based multi-scale modular architecture for dynamic knowledge representation of acute inflammation

    PubMed Central

    An, Gary

    2008-01-01

    Background One of the greatest challenges facing biomedical research is the integration and sharing of vast amounts of information, not only for individual researchers, but also for the community at large. Agent Based Modeling (ABM) can provide a means of addressing this challenge via a unifying translational architecture for dynamic knowledge representation. This paper presents a series of linked ABMs representing multiple levels of biological organization. They are intended to translate the knowledge derived from in vitro models of acute inflammation to clinically relevant phenomenon such as multiple organ failure. Results and Discussion ABM development followed a sequence starting with relatively direct translation from in-vitro derived rules into a cell-as-agent level ABM, leading on to concatenated ABMs into multi-tissue models, eventually resulting in topologically linked aggregate multi-tissue ABMs modeling organ-organ crosstalk. As an underlying design principle organs were considered to be functionally composed of an epithelial surface, which determined organ integrity, and an endothelial/blood interface, representing the reaction surface for the initiation and propagation of inflammation. The development of the epithelial ABM derived from an in-vitro model of gut epithelial permeability is described. Next, the epithelial ABM was concatenated with the endothelial/inflammatory cell ABM to produce an organ model of the gut. This model was validated against in-vivo models of the inflammatory response of the gut to ischemia. Finally, the gut ABM was linked to a similarly constructed pulmonary ABM to simulate the gut-pulmonary axis in the pathogenesis of multiple organ failure. 
The behavior of this model was validated against in-vivo and clinical observations on the cross-talk between these two organ systems. Conclusion A series of ABMs is presented, extending from the level of intracellular mechanism to clinically observed behavior in the intensive care setting.

  2. Synthesis of polyacrylic-acid-based thermochromic polymers

    NASA Astrophysics Data System (ADS)

    Srivastava, Jyoti; Alam, Sarfaraz; Mathur, G. N.

    2003-10-01

Smart materials respond to environmental stimuli (for example temperature, pressure, or electric field) with particular changes in some variables; for that reason they are often called responsive materials. In the present work, we synthesized a thermochromic polymer based on poly(acrylic acid), cobalt chloride (CoCl2), and phosphoric acid (H3PO4) that visually and reversibly changes color in the temperature range 70-130°C. These thermochromic materials can be used as visual temperature sensors. The thermochromic polymers are based on a poly(acrylic acid)-CoCl2 complex.

  3. Acid Base Titrations in Nonaqueous Solvents and Solvent Mixtures

    NASA Astrophysics Data System (ADS)

    Barcza, Lajos; Buvári-Barcza, Ágnes

    2003-07-01

The acid-base determination of different substances by nonaqueous titrations is highly preferred in pharmaceutical analyses since the method is quantitative, exact, and reproducible. The modern interpretation of the reactions in nonaqueous solvents started in the last century, but several inconsistencies and unsolved problems can be found in the literature. The acid-base theories of Brønsted-Lowry and Lewis as well as the so-called solvent theory are outlined first; then the promoting (and leveling) and the differentiating effects are discussed on the basis of the hydrogen-bond concept. Emphasis is put on the properties of formic acid and acetic anhydride since their importance is increasing.

  4. Novel Approach for the Recognition and Prediction of Multi-Function Radar Behaviours Based on Predictive State Representations

    PubMed Central

    Ou, Jian; Chen, Yongguang; Zhao, Feng; Liu, Jin; Xiao, Shunping

    2017-01-01

The extensive applications of multi-function radars (MFRs) have presented a great challenge to the technologies of radar countermeasures (RCMs) and electronic intelligence (ELINT). The recently proposed cognitive electronic warfare (CEW) provides a good solution, whose crux is to perceive present and future MFR behaviours, including the operating modes, waveform parameters, scheduling schemes, etc. Due to the variety and complexity of MFR waveforms, the existing approaches suffer from inefficiency and weak practicability in prediction. A novel method for MFR behaviour recognition and prediction is proposed based on predictive state representation (PSR). With the proposed approach, operating modes of the MFR are recognized by accumulating the predictive states, instead of using fixed transition probabilities that are unavailable on the battlefield. This helps to reduce the dependence of the recognition on prior information. MFR signals can be quickly predicted by iteratively using the predicted observation, avoiding the very large computation caused by the uncertainty of future observations. Simulations with a hypothetical MFR signal sequence in a typical scenario are presented, showing that the proposed methods perform well and efficiently, which attests to their validity. PMID:28335492

  5. Individual subject classification for Alzheimer's disease based on incremental learning using a spatial frequency representation of cortical thickness data.

    PubMed

    Cho, Youngsang; Seong, Joon-Kyung; Jeong, Yong; Shin, Sung Yong

    2012-02-01

Patterns of brain atrophy measured by magnetic resonance structural imaging have been utilized as significant biomarkers for diagnosis of Alzheimer's disease (AD). However, brain atrophy is variable across patients and is non-specific for AD in general. Thus, automatic methods for AD classification require large amounts of structural data due to the complex and variable patterns of brain atrophy. In this paper, we propose an incremental method for AD classification using cortical thickness data. We represent the cortical thickness data of a subject in terms of their spatial frequency components, employing the manifold harmonic transform. The basis functions for this transform are obtained from the eigenfunctions of the Laplace-Beltrami operator, which depend only on the geometry of a cortical surface and not on the cortical thickness defined on it. This facilitates individual subject classification based on incremental learning. In general, methods based on region-wise features poorly reflect the detailed spatial variation of cortical thickness, and those based on vertex-wise features are sensitive to noise. Adopting a vertex-wise cortical thickness representation, our method can still achieve robustness to noise by filtering out high-frequency components of the cortical thickness data while reflecting their spatial variation. This compromise leads to high accuracy in AD classification. We utilized MR volumes provided by the Alzheimer's Disease Neuroimaging Initiative (ADNI) to validate the performance of the method. Our method discriminated AD patients from Healthy Control (HC) subjects with 82% sensitivity and 93% specificity. It also discriminated Mild Cognitive Impairment (MCI) patients, who converted to AD within 18 months, from non-converted MCI subjects with 63% sensitivity and 76% specificity. Moreover, it showed that the entorhinal cortex was the most discriminative region for classification, which is consistent with previous pathological findings.
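On a discretized surface, Laplace-Beltrami eigenfunctions are approximated by eigenvectors of a mesh Laplacian, and discarding high-frequency modes low-pass filters a per-vertex signal such as cortical thickness. The sketch below illustrates the idea with a plain graph Laplacian as a simplified stand-in for the manifold harmonic transform; the function name and the unweighted Laplacian are illustrative assumptions, not the paper's exact operator.

```python
import numpy as np

def low_pass_on_graph(adj, signal, n_modes):
    """Project a per-vertex signal onto the n_modes lowest-frequency
    eigenvectors of the (unweighted) graph Laplacian and reconstruct,
    discarding high-frequency (noise-dominated) components."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                      # combinatorial graph Laplacian
    w, v = np.linalg.eigh(lap)           # eigenvalues in ascending order
    basis = v[:, :n_modes]               # low-frequency "harmonics"
    coeffs = basis.T @ signal            # spatial-frequency coefficients
    return basis @ coeffs                # filtered reconstruction
```

Because the basis depends only on the graph (the surface geometry), the same eigenvectors can filter any signal defined on it, which is what makes the per-subject, geometry-only decomposition convenient for incremental classification.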

  6. An Acid-Base Chemistry Example: Conversion of Nicotine

    NASA Astrophysics Data System (ADS)

    Summerfield, John H.

    1999-10-01

    The current government interest in nicotine conversion by cigarette companies provides an example of acid-base chemistry that can be explained to students in the second semester of general chemistry. In particular, the conversion by ammonia of the +1 form of nicotine to the easier-to-assimilate free-base form illustrates the effect of pH on acid-base equilibrium. The part played by ammonia in tobacco smoke is analogous to what takes place when cocaine is "free-based".
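The pH dependence of the +1/free-base equilibrium follows from the Henderson-Hasselbalch relation. A small sketch, taking pKa ≈ 8.0 for nicotine's more basic nitrogen as an illustrative assumption:

```python
def free_base_fraction(pH, pKa):
    """Fraction of a monoprotic base present in its neutral (free-base)
    form: [B] / ([B] + [BH+]) = 1 / (1 + 10**(pKa - pH)),
    from the Henderson-Hasselbalch equation."""
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))
```

With this assumed pKa, only about 1% of the nicotine is in the free-base form at pH 6, but about half is at pH 8, which is the sense in which raising the pH with ammonia shifts nicotine toward the easier-to-assimilate form.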

  7. HF acid blends based on formation conditions eliminate precipitation problems

    SciTech Connect

    Gdanski, R.; Shuchart, C.

    1997-03-01

    Formulating HCl-HF acid blends based on the mineralogy and temperature of a formation can increase the success of hydrofluoric acid (HF) treatments. Sodium and potassium in the structures of formation minerals can cause precipitation and matrix plugging problems during acidizing. Slight modifications of the acid blend used in the treatment can help eliminate fluosilicate precipitation. Researchers recently conducted tests to determine how acid blends react in different formations under varying temperatures. The results of the tests indicate that the minimum HCl:HF ratio in an acid blend is 6-to-1, and the optimum ratio is 9-to-1. Regular mud acid (12% HCl-3% HF) has been used successfully for years to enhance production in sandstone formations. By the 1980s, operators began to vary the concentration of HF and HCl acids to solve excessive sanding problems in sandstone. The paper discusses treatment problems, formation characteristics, alumino-silicate scaling, research results, brine compatibility, optimum treatment, and acid volume guidelines.
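The ratio guideline from the tests can be captured in a trivial helper (hypothetical names; an actual treatment design also depends on formation mineralogy and temperature, which this sketch ignores):

```python
def hcl_hf_ratio(hcl_pct, hf_pct):
    """HCl:HF ratio of an acid blend given weight-percent concentrations."""
    return hcl_pct / hf_pct

def blend_ok(hcl_pct, hf_pct, min_ratio=6.0):
    """Check a blend against the 6:1 minimum HCl:HF ratio reported in the
    study (9:1 is cited as optimum)."""
    return hcl_hf_ratio(hcl_pct, hf_pct) >= min_ratio
```

Note that regular mud acid (12% HCl-3% HF) has only a 4:1 ratio and so falls below the 6:1 minimum the tests indicate.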

  8. Translation between representation languages

    NASA Technical Reports Server (NTRS)

    Vanbaalen, Jeffrey

    1994-01-01

    A capability for translating between representation languages is critical for effective knowledge base reuse. A translation technology for knowledge representation languages based on the use of an interlingua for communicating knowledge is described. The interlingua-based translation process consists of three major steps: translation from the source language into a subset of the interlingua, translation between subsets of the interlingua, and translation from a subset of the interlingua into the target language. The first translation step into the interlingua can typically be specified in the form of a grammar that describes how each top-level form in the source language translates into the interlingua. In cases where the source language does not have a declarative semantics, such a grammar is also a specification of a declarative semantics for the language. A methodology for building translators that is currently under development is described. A 'translator shell' based on this methodology is also under development. The shell has been used to build translators for multiple representation languages and those translators have successfully translated nontrivial knowledge bases.

  9. Alternative Approach to Nuclear Data Representation

    SciTech Connect

    Pruet, J; Brown, D; Beck, B; McNabb, D P

    2005-07-27

This paper considers an approach for representing nuclear data that is qualitatively different from the approach currently adopted by the nuclear science community. Specifically, it examines a representation in which complicated data are described through collections of distinct and self-contained simple data structures. This structure-based representation is compared with the ENDF and ENDL formats, which can be roughly characterized as dictionary-based representations. A pilot data representation for replacing the format currently used at LLNL is presented. Examples are given, as is a discussion of the promises and shortcomings associated with moving from traditional dictionary-based formats to a structure-rich or class-like representation.

  10. A generalized wavelet extrema representation

    SciTech Connect

    Lu, Jian; Lades, M.

    1995-10-01

The wavelet extrema representation originated by Stephane Mallat is a unique framework for low-level and intermediate-level (feature) processing. In this paper, we present a new form of wavelet extrema representation generalizing Mallat's original work. The generalized wavelet extrema representation is a feature-based multiscale representation. For a particular choice of wavelet, our scheme can be interpreted as representing a signal or image by its edges, and peaks and valleys at multiple scales. Such a representation is shown to be stable -- the original signal or image can be reconstructed with very good quality. It is further shown that a signal or image can be modeled as piecewise monotonic, with all turning points between monotonic segments given by the wavelet extrema. A new projection operator is introduced to enforce piecewise monotonicity of a signal in its reconstruction. This leads to an enhancement of previously developed algorithms in preventing artifacts in the reconstructed signal.

  11. Revealing children's implicit spelling representations.

    PubMed

    Critten, Sarah; Pine, Karen J; Messer, David J

    2013-06-01

    Conceptualizing the underlying representations and cognitive mechanisms of children's spelling development is a key challenge for literacy researchers. Using the Representational Redescription model (Karmiloff-Smith), Critten, Pine and Steffler (2007) demonstrated that the acquisition of phonological and morphological knowledge may be underpinned by increasingly explicit levels of spelling representation. However, their proposal that implicit representations may underlie early 'visually based' spelling remains unresolved. Children (N = 101, aged 4-6 years) were given a recognition task (Critten et al., 2007) and a novel production task, both involving verbal justifications of why spellings are correct/incorrect, strategy use and word pattern similarity. Results for both tasks supported an implicit level of spelling characterized by the ability to correctly recognize/produce words but the inability to explain operational strategies or generalize knowledge. Explicit levels and multiple representations were also in evidence across the two tasks. Implications for cognitive mechanisms underlying spelling development are discussed.

  12. Parallel Representation of Value-Based and Finite State-Based Strategies in the Ventral and Dorsal Striatum.

    PubMed

    Ito, Makoto; Doya, Kenji

    2015-11-01

Previous theoretical studies of animal and human behavioral learning have focused on the dichotomy of the value-based strategy using action value functions to predict rewards and the model-based strategy using internal models to predict environmental states. However, animals and humans often take simple procedural behaviors, such as the "win-stay, lose-switch" strategy, without explicit prediction of rewards or states. Here we consider another strategy, the finite state-based strategy, in which a subject selects an action depending on its discrete internal state and updates the state depending on the action chosen and the reward outcome. By analyzing the choice behavior of rats in a free-choice task, we found that the finite state-based strategy fitted their behavioral choices more accurately than the value-based and model-based strategies did. When the fitted models were run autonomously on the same task, only the finite state-based strategy could reproduce the key features of the choice sequences. Analyses of neural activity recorded from the dorsolateral striatum (DLS), the dorsomedial striatum (DMS), and the ventral striatum (VS) identified significant fractions of neurons in all three subareas whose activities were correlated with individual states of the finite state-based strategy. Signals for the internal state at the time of choice were found in DMS, and signals for clusters of states were found in VS. In addition, action values and state values of the value-based strategy were encoded in DMS and VS, respectively. These results suggest that both the value-based strategy and the finite state-based strategy are implemented in the striatum.

  13. A novel protein characterization based on Pseudo Amino Acids composition and Star-like graph topological indices.

    PubMed

    Dai, Qi; Tao, Hong; Ma, Tingting; Yao, Yuhua; He, Pingan

    2017-02-17

In this work, a new description of proteins based on five topological indices of a star-like graph representation and the occurrence frequencies of the 20 amino acids was proposed to compare the similarities of proteins. A phylogenetic tree of eight ND6 proteins was constructed to demonstrate the effectiveness and rationality of our approach. Analogously, we applied this method to RNA polymerase proteins of some subtypes of influenza virus to infer their phylogenetic relationships. The results showed that the phylogenetic relationships among RNA polymerases of influenza viruses are closely related to the host species and geographical distributions of the viruses.
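The occurrence-frequency part of such a descriptor is straightforward to compute. A minimal sketch (the five star-like graph topological indices are omitted, and the names are illustrative):

```python
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # 20 standard amino acids

def aa_frequency_vector(seq):
    """20-dimensional occurrence-frequency vector of the standard amino
    acids in a protein sequence."""
    counts = Counter(seq)
    n = len(seq)
    return [counts.get(a, 0) / n for a in AMINO_ACIDS]

def euclidean_distance(u, v):
    """Distance between two descriptor vectors, usable as a similarity
    measure when building a phylogenetic tree."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5
```

Pairwise distances computed this way can feed any standard distance-based tree-building method (e.g., neighbor joining) to infer phylogenetic relationships.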

  14. The semantic representation of event information depends on the cue modality: an instance of meaning-based retrieval.

    PubMed

    Karlsson, Kristina; Sikström, Sverker; Willander, Johan

    2013-01-01

    The semantic content, or the meaning, is the essence of autobiographical memories. In comparison to previous research, which has mainly focused on the phenomenological experience and the age distribution of retrieved events, the present study provides a novel view on the retrieval of event information by quantifying the information as semantic representations. We investigated the semantic representation of sensory cued autobiographical events and studied the modality hierarchy within the multimodal retrieval cues. The experiment comprised a cued recall task, where the participants were presented with visual, auditory, olfactory or multimodal retrieval cues and asked to recall autobiographical events. The results indicated that the three different unimodal retrieval cues generate significantly different semantic representations. Further, the auditory and the visual modalities contributed the most to the semantic representation of the multimodally retrieved events. Finally, the semantic representation of the multimodal condition could be described as a combination of the three unimodal conditions. In conclusion, these results suggest that the meaning of the retrieved event information depends on the modality of the retrieval cues.

  15. Amino acid profile of milk-based infant formulas.

    PubMed

    Viadel, B; Alegriá, A; Farré, R; Abellán, P; Romero, F

    2000-09-01

The protein content and amino acid profile of three milk-based infant formulas, two of which were powdered (adapted and follow-on) and the third liquid, were determined to check their compliance with the EU directive and to evaluate whether or not they fulfil an infant's nutritional needs. To obtain the amino acid profile, proteins were subjected to acid hydrolysis, prior to which the sulfur-containing amino acids were oxidized with performic acid. The amino acids were derivatized with phenylisothiocyanate (PITC) and then determined by ion-pair reverse phase high performance liquid chromatography (HPLC). In the case of tryptophan, basic hydrolysis was applied and no derivatization was needed. The protein contents of the analysed formulas were in the ranges established by the EU directive for these products and the amino acid contents were in the ranges reported by other authors for these types of formulas. In all cases the tryptophan content determined the value of the chemical score, which was always lower than 80% of the reference protein but in the ranges reported by other authors. The analysed adapted infant formula provides amino acids in amounts higher than the established nutritional requirements.

  16. Poly (ricinoleic acid) based novel thermosetting elastomer.

    PubMed

    Ebata, Hiroki; Yasuda, Mayumi; Toshima, Kazunobu; Matsumura, Shuichi

    2008-01-01

    A novel bio-based thermosetting elastomer was prepared by the lipase-catalyzed polymerization of methyl ricinoleate with subsequent vulcanization. Some mechanical properties of the cured carbon black-filled polyricinoleate compounds were evaluated as a thermosetting elastomer. It was found that the carbon black-filled polyricinoleate compounds were readily cured by sulfur curatives to produce a thermosetting elastomer that formed a rubber-like sheet with a smooth and non-sticky surface. The curing behaviors and mechanical properties were dependent on both the molecular weight of the polyricinoleate and the amount of the sulfur curatives. Cured compounds consisting of polyricinoleate with a molecular weight of 100,800 showed good mechanical properties, such as a hardness of 48 A based on the durometer A measurements, a tensile strength at break of 6.91 MPa and an elongation at break of 350%.

  17. Multidirectional and Topography-based Dynamic-scale Varifold Representations with Application to Matching Developing Cortical Surfaces.

    PubMed

    Rekik, Islem; Li, Gang; Lin, Weili; Shen, Dinggang

    2016-07-15

The human cerebral cortex is marked by great complexity as well as substantial dynamic changes during early postnatal development. To obtain a fairly comprehensive picture of its age-induced and/or disorder-related cortical changes, one needs to match cortical surfaces to one another, while maximizing their anatomical alignment. Methods that geodesically shoot surfaces into one another as currents (a distribution of oriented normals) and varifolds (a distribution of non-oriented normals) provide an elegant Riemannian framework for generic surface matching and reliable statistical analysis. However, both conventional current and varifold matching methods have two key limitations. First, they only use the normals of the surface to measure its geometry and guide the warping process, which overlooks the importance of the orientations of the inherently convoluted cortical sulcal and gyral folds. Second, the 'conversion' of a surface into a current or a varifold operates at a fixed scale under which geometric surface details will be neglected, which ignores the dynamic scales of cortical foldings. To overcome these limitations and improve varifold-based cortical surface registration, we propose two different strategies. The first strategy decomposes each cortical surface into its normal and tangent varifold representations, by integrating the principal curvature direction field into the varifold matching framework, thus providing rich information on the orientation of cortical folding and better characterization of the complex cortical geometry. The second strategy explores informative cortical geometric features to perform a dynamic-scale measurement of the cortical surface that depends on the local surface topography (e.g., principal curvature), thereby introducing the concept of a topography-based dynamic-scale varifold. We tested the proposed varifold variants for registering 12 pairs of dynamically developing cortical surfaces from 0 to 6 months of age.

  18. Micellar acid-base potentiometric titrations of weak acidic and/or insoluble drugs.

    PubMed

    Gerakis, A M; Koupparis, M A; Efstathiou, C E

    1993-01-01

    The effect of various surfactants [the cationics cetyl trimethyl ammonium bromide (CTAB) and cetyl pyridinium chloride (CPC), the anionic sodium dodecyl sulphate (SDS), and the nonionic polysorbate 80 (Tween 80)] on the solubility and ionization constant of some sparingly soluble weak acids of pharmaceutical interest was studied. Benzoic acid (and its 3-methyl-, 3-nitro-, and 4-tert-butyl-derivatives), acetylsalicylic acid, naproxen and iopanoic acid were chosen as model examples. Precise and accurate acid-base titrations in micellar systems were made feasible using a microcomputer-controlled titrator. The response curve, response time and potential drift of the glass electrode in the micellar systems were examined. The cationics CTAB and CPC were found to increase considerably the ionization constant of the weak acids (delta pKa ranged from -0.21 to -3.57), while the anionic SDS showed negligible effect and the nonionic Tween 80 generally decreased the ionization constants. The solubility of the acids in aqueous micellar and acidified micellar solutions was studied spectrophotometrically and it was found increased in all cases. Acetylsalicylic acid, naproxen, benzoic acid and iopanoic acid could be easily determined in raw material and some of them in pharmaceutical preparations by direct titration in CTAB-micellar system instead of using the traditional non-aqueous or back titrimetry. Precisions of 0.3-4.3% RSD and good correlation with the official tedious methods were obtained. The interference study of some excipients showed that a preliminary test should be carried out before the assay of formulations.

  19. Soil Studies: Applying Acid-Base Chemistry to Environmental Analysis.

    ERIC Educational Resources Information Center

    West, Donna M.; Sterling, Donna R.

    2001-01-01

    Laboratory activities for chemistry students focus attention on the use of acid-base chemistry to examine environmental conditions. After using standard laboratory procedures to analyze soil and rainwater samples, students use web-based resources to interpret their findings. Uses CBL probes and graphing calculators to gather and analyze data and…

  20. High School Students' Concepts of Acids and Bases.

    ERIC Educational Resources Information Center

    Ross, Bertram H. B.

    An investigation of Ontario high school students' understanding of acids and bases with quantitative and qualitative methods revealed misconceptions. A concept map, based on the objectives of the Chemistry Curriculum Guideline, generated multiple-choice items and interview questions. The multiple-choice test was administered to 34 grade 12…

  1. Hard and soft acids and bases: atoms and atomic ions.

    PubMed

    Reed, James L

    2008-07-07

    The structural origin of hard-soft behavior in atomic acids and bases has been explored using a simple orbital model. The Pearson principle of hard and soft acids and bases has been taken to be the defining statement about hard-soft behavior and as a definition of chemical hardness. There are a number of conditions that are imposed on any candidate structure and associated property by the Pearson principle, which have been exploited. The Pearson principle itself has been used to generate a thermodynamically based scale of relative hardness and softness for acids and bases (operational chemical hardness), and a modified Slater model has been used to discern the electronic origin of hard-soft behavior. Whereas chemical hardness is a chemical property of an acid or base and the operational chemical hardness is an experimental measure of it, the absolute hardness is a physical property of an atom or molecule. A critical examination of chemical hardness, which has been based on a more rigorous application of the Pearson principle and the availability of quantitative measures of chemical hardness, suggests that the origin of hard-soft behavior for both acids and bases resides in the relaxation of the electrons not undergoing transfer during the acid-base interaction. Furthermore, the results suggest that the absolute hardness should not be taken as synonymous with chemical hardness but that the relationship is somewhat more complex. Finally, this work provides additional groundwork for a better understanding of chemical hardness that will inform the understanding of hardness in molecules.

  2. Ionic liquid supported acid/base-catalyzed production of biodiesel.

    PubMed

    Lapis, Alexandre A M; de Oliveira, Luciane F; Neto, Brenno A D; Dupont, Jairton

    2008-01-01

    The transesterification (alcoholysis) reaction was successfully applied to synthesize biodiesel from vegetable oils using imidazolium-based ionic liquids under multiphase acidic and basic conditions. Under basic conditions, the combination of the ionic liquid 1-n-butyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide (BMINTf2), alcohols, and K2CO3 (40 mol %) results in the production of biodiesel from soybean oil in high yields (>98%) and purity. H2SO4 immobilized in BMINTf2 efficiently promotes the transesterification reaction of soybean oil and various primary and secondary alcohols. In this multiphase process the acid is almost completely retained in the ionic liquid phase, while the biodiesel forms a separate phase. The recovered ionic liquid containing the acid could be reused at least six times without any significant loss in the biodiesel yield or selectivity. In both catalytic processes (acid and base), the reactions proceed as typical multiphasic systems in which the formed biodiesel accumulates as the upper phase and the glycerol by-product is selectively captured by the alcohol-ionic liquid-acid/base phase. Classical ionic liquids such as 1-n-butyl-3-methylimidazolium tetrafluoroborate and hexafluorophosphate are not stable under these acidic or basic conditions and decompose.

  3. A computational study of ultrafast acid dissociation and acid-base neutralization reactions. I. The model

    NASA Astrophysics Data System (ADS)

    Maurer, Patrick; Thomas, Vibin; Rivard, Ugo; Iftimie, Radu

    2010-07-01

    Ultrafast, time-resolved investigations of acid-base neutralization reactions have recently been performed using systems containing the photoacid 8-hydroxypyrene-1,3,6-trisulfonic acid trisodium salt (HPTS) and various Brønsted bases. Two conflicting neutralization mechanisms have been formulated by Mohammed et al. [Science 310, 83 (2005)] and Siwick et al. [J. Am. Chem. Soc. 129, 13412 (2007)] for the same acid-base system. Herein an ab initio molecular dynamics based computational model is formulated, which is able to investigate the validity of the proposed mechanisms in the general context of ground-state acid-base neutralization reactions. Our approach consists of using 2,4,6-tricyanophenol (exp. pKa≅1) as a model for excited-state HPTS∗ (pKa≅1.4) and carboxylate ions for the accepting base. We employ our recently proposed dipole-field/quantum mechanics (QM) treatment [P. Maurer and R. Iftimie, J. Chem. Phys. 132, 074112 (2010)] of the proton donor and acceptor molecules. This approach allows one to tune the free energy of neutralization to any desired value as well as model initial nonequilibrium hydration effects caused by a sudden increase in acidity, making it possible to achieve a more realistic comparison with experimental data than could be obtained via a full-QM treatment of the entire system. It is demonstrated that the dipole-field/QM model reproduces correctly key properties of the 2,4,6-tricyanophenol acid molecule including gas-phase proton dissociation energies and dipole moments, and condensed-phase hydration structure and pKa values.

  4. Acid-base properties of humic and fulvic acids formed during composting.

    PubMed

    Plaza, César; Senesi, Nicola; Polo, Alfredo; Brunetti, Gennaro

    2005-09-15

The soil acid-base buffering capacity and the biological availability, mobilization, and transport of macro- and micronutrients, toxic metal ions, and xenobiotic organic cations in soil are strongly influenced by the acid-base properties of humic substances, of which humic and fulvic acids are the major fractions. For these reasons, the proton binding behavior of the humic acid-like (HA) and fulvic acid-like (FA) fractions contained in a compost is believed to be instrumental in its successful performance in soil. In this work, the acid-base properties of the HAs and FAs isolated from a mixture of the sludge residue obtained from olive oil mill wastewater (OMW) evaporated in an open-air pond and tree cuttings (TC) at different stages of composting were investigated by potentiometric titration and the nonideal competitive adsorption (NICA)-Donnan model. The NICA-Donnan model provided an excellent description of the acid-base titration data and revealed substantial differences in site density and proton-binding affinity between the HAs and FAs examined. Compared with FAs, HAs were characterized by a smaller content of carboxylic- and phenolic-type groups and by larger proton-binding affinities. Further, HAs featured a greater heterogeneity in carboxylic-type groups than FAs. The composting process increased the content and decreased the proton affinity of carboxylic- and phenolic-type groups of HAs and FAs, and increased the heterogeneity of phenolic-type groups of HAs. As a whole, these effects indicated that the composting process could produce HA and FA fractions with greater cation binding capacities. These results suggest that composting of organic materials improves their agronomic and environmental value by increasing their potential to retain and exchange macro- and micronutrients, and to reduce the bioavailability of organic and inorganic pollutants.
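
As a sketch of the functional form being fitted here: in the proton-only limit, and with the Donnan electrostatic correction omitted, the bimodal NICA model reduces to two Langmuir-Freundlich terms, one per site type (carboxylic and phenolic). The parameter values below are illustrative, not the paper's fitted values:

```python
def nica_proton_charge(pH, Q1, logK1, m1, Q2, logK2, m2):
    """Proton binding in the single-ion limit of the bimodal NICA
    model: two Langmuir-Freundlich terms for carboxylic- and
    phenolic-type sites. The Donnan electrostatic correction is
    omitted and all parameter values are illustrative."""
    cH = 10.0 ** -pH  # proton concentration

    def lf(Q, logK, m):
        x = (10.0 ** logK * cH) ** m
        return Q * x / (1.0 + x)

    return lf(Q1, logK1, m1) + lf(Q2, logK2, m2)

# Bound protons (mol/kg) at pH 4 for an illustrative parameter set:
print(round(nica_proton_charge(4.0, 3.0, 4.0, 0.8, 2.0, 9.0, 0.6), 3))  # 3.498
```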

  5. Deoxyribonucleic acid base compositions of dermatophytes.

    PubMed

    Davison, F D; Mackenzie, D W; Owen, R J

    1980-06-01

DNA was extracted and purified from 55 dermatophyte isolates representing 34 species of Trichophyton, Microsporum and Epidermophyton. The base compositions of the chromosomal DNA were determined by CsCl density gradient centrifugation and were found to be in the narrow range of 48.7 to 50.3 mol % G + C. A satellite DNA component, assumed to be of mitochondrial origin, was present in most strains, with a G + C content ranging from 14.7 to 30.8 mol %. Heterogeneity in microscopic and colonial characteristics was not reflected in differences in the mean G + C content of the chromosomal DNAs. Strains varied in the G + C content of their satellite DNA, but these variations did not correlate with traditional species concepts.
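
The density-to-composition step in such measurements is a simple linear calibration. A minimal sketch, assuming the classic Schildkraut-Doty relation for CsCl gradients (a standard calibration, not stated in the abstract itself):

```python
def gc_content_from_density(rho):
    """Estimate mol % G + C of DNA from its CsCl buoyant density
    (g/cm^3) via the Schildkraut-Doty calibration:
    rho = 1.660 + 0.00098 * (mol % G + C)."""
    return (rho - 1.660) / 0.00098

# A chromosomal band at rho = 1.709 g/cm^3 corresponds to:
print(round(gc_content_from_density(1.709), 1))  # 50.0
```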

  6. A cartoon-texture decomposition-based image deblurring model by using framelet-based sparse representation

    NASA Astrophysics Data System (ADS)

    Chen, Huasong; Qu, Xiangju; Jin, Ying; Li, Zhenhua; He, Anzhi

    2016-10-01

Image deblurring is a fundamental problem in image processing. Conventional methods often treat the degraded image as a whole, ignoring that an image contains two different components: cartoon and texture. Recently, total variation (TV) based image decomposition methods have been introduced into the image deblurring problem. However, these methods often suffer from the well-known staircasing effects of TV. In this paper, a new cartoon-texture based sparsity regularization method is proposed for non-blind image deblurring. Based on image decomposition, it regularizes the cartoon with a combined term, including a framelet-domain sparse prior and a quadratic regularization, and the texture with a sparsity prior in the discrete cosine transform domain. An adaptive alternating split Bregman iteration is then proposed to solve the new multi-term sparsity regularization model. Experimental results demonstrate that the method recovers both cartoon and texture simultaneously and therefore improves the visual effect, PSNR and SSIM of the deblurred image more effectively than TV-based and undecomposed methods.
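
The sparsity terms in a split Bregman scheme of this kind are handled by the soft-thresholding (shrinkage) operator applied to the transform coefficients; a minimal sketch:

```python
def soft_threshold(x, lam):
    """Shrinkage (soft-thresholding) operator applied coefficient-wise
    in each split Bregman iteration to enforce sparsity of the framelet
    or DCT coefficients."""
    return [(abs(v) - lam) * (1 if v > 0 else -1) if abs(v) > lam else 0.0
            for v in x]

print(soft_threshold([-2.0, -0.3, 0.0, 0.4, 1.5], 0.5))  # [-1.5, 0.0, 0.0, 0.0, 1.0]
```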

  7. Chemistry Problem Solving Instruction: A Comparison of Three Computer-Based Formats for Learning from Hierarchical Network Problem Representations

    ERIC Educational Resources Information Center

    Ngu, Bing Hiong; Mit, Edwin; Shahbodin, Faaizah; Tuovinen, Juhani

    2009-01-01

    Within the cognitive load theory framework, we designed and compared three alternative instructional solution formats that can be derived from a common static hierarchical network representation depicting problem structure. The interactive-solution format permitted students to search in self-controlled manner for solution steps, static-solution…

  8. Evidence-Based Practices: Applications of Concrete Representational Abstract Framework across Math Concepts for Students with Mathematics Disabilities

    ERIC Educational Resources Information Center

    Agrawal, Jugnu; Morin, Lisa L.

    2016-01-01

    Students with mathematics disabilities (MD) experience difficulties with both conceptual and procedural knowledge of different math concepts across grade levels. Research shows that concrete representational abstract framework of instruction helps to bridge this gap for students with MD. In this article, we provide an overview of this strategy…

  9. Teaching Problem Solving to Students Receiving Tiered Interventions Using the Concrete-Representational-Abstract Sequence and Schema-Based Instruction

    ERIC Educational Resources Information Center

    Flores, Margaret M.; Hinton, Vanessa M.; Burton, Megan E.

    2016-01-01

    Mathematical word problems are the most common form of mathematics problem solving implemented in K-12 schools. Identifying key words is a frequent strategy taught in classrooms in which students struggle with problem solving and show low success rates in mathematics. Researchers show that using the concrete-representational-abstract (CRA)…

  10. Modelling the impact of the light regime on single tree transpiration based on 3D representations of plant architecture

    NASA Astrophysics Data System (ADS)

    Bittner, S.; Priesack, E.

    2012-04-01

We apply a functional-structural model of tree water flow to single old-growth trees in a temperate broad-leaved forest stand. Roots, stems and branches are represented by connected porous cylinder elements further divided into the inner heartwood cylinders surrounded by xylem and phloem. Xylem water flow is simulated by applying a non-linear Darcy flow in porous media driven by the water potential gradient according to the cohesion-tension theory. The flow model is based on physiological input parameters such as the hydraulic conductivity, stomatal response to leaf water potential and root water uptake capability and, thus, can reflect the different properties of tree species. The actual root water uptake is also calculated using a non-linear Darcy law based on the gradient between root xylem water potential and rhizosphere soil water potential, and by simulating soil water flow with the Richards equation. A leaf stomatal conductance model is combined with the hydrological tree and soil water flow model and a spatially explicit three-dimensional canopy light model. The structure of the canopy and the tree architectures are derived by applying an automatic tree skeleton extraction algorithm from point clouds obtained by use of a terrestrial laser scanner allowing an explicit representation of the water flow path in the stem and branches. The high spatial resolution of the root and branch geometry and their connectivity makes the detailed modelling of the water use of single trees possible and allows for the analysis of the interaction between single trees and the influence of the canopy light regime (including different fractions of direct sunlight and diffuse skylight) on the simulated sap flow and transpiration. The model can be applied at various sites and to different tree species, enabling the up-scaling of the water usage of single trees to the total transpiration of mixed stands. Examples are given to reveal differences between diffuse- and ring
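
The per-element xylem flux in such a model reduces to a potential-gradient law. A minimal sketch of one porous-cylinder element, with an illustrative vulnerability function standing in for the species-specific conductivity response (names, units and defaults are assumptions, not the authors' code):

```python
def darcy_flux(k_sat, psi_up, psi_down, length, vulnerability=lambda psi: 1.0):
    """Water flux through one porous cylinder element, driven by the
    water-potential gradient (cohesion-tension theory). `vulnerability`
    scales conductivity with the local potential, giving a non-linear
    Darcy law; the default constant 1.0 recovers the linear case."""
    psi_mid = 0.5 * (psi_up + psi_down)
    k = k_sat * vulnerability(psi_mid)
    return -k * (psi_down - psi_up) / length

# Flow runs from the wetter end (-0.5 MPa) toward the drier end (-1.0 MPa):
print(darcy_flux(1.0, -0.5, -1.0, 1.0))  # 0.5
```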

  11. Relativistic effects on acidities and basicities of Brønsted acids and bases containing gold.

    PubMed

    Koppel, Ilmar A; Burk, Peeter; Kasemets, Kalev; Koppel, Ivar

    2013-11-07

    It is usually believed that relativistic effects as described by the Dirac-Schrödinger equation (relative to the classical or time-independent Schrödinger equation) are of little importance in chemistry. A closer look, however, reveals that some important and widely known properties (e.g., gold is yellow, mercury is liquid at room temperature) stem from relativistic effects. So far the influence of relativistic effects on the acid-base properties has been mostly ignored. Here we show that at least for compounds of gold such omission is completely erroneous and would lead to too high basicity and too low acidity values with errors in the range of 25-55 kcal mol(-1) (or 20 to 44 powers of ten in pK(a) units) in the gas-phase. These findings have important implications for the design of new superstrong acids and bases, and for the understanding of gold-catalysed reactions.
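
The conversion between the quoted energy and pKa ranges is ΔpKa = ΔG/(RT·ln 10). Evaluated at 273.15 K this reproduces the stated 20 and 44 exactly (at 298 K one obtains about 18 and 40), which suggests the conversion may have been done at 0 °C:

```python
import math

def delta_pKa(delta_G_kcal, T=298.15):
    """Convert a free-energy difference (kcal/mol) into powers of ten
    in pKa units: delta_pKa = delta_G / (R * T * ln 10)."""
    R = 1.987204e-3  # gas constant, kcal/(mol*K)
    return delta_G_kcal / (R * T * math.log(10))

# The quoted 25-55 kcal/mol errors, evaluated at 273.15 K:
print(round(delta_pKa(25.0, T=273.15)), round(delta_pKa(55.0, T=273.15)))  # 20 44
```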

  12. Acid-Base Titration of (S)-Aspartic Acid: A Circular Dichroism Spectrophotometry Experiment

    NASA Astrophysics Data System (ADS)

    Cavaleiro, Ana M. V.; Pedrosa de Jesus, Júlio D.

    2000-09-01

    The magnitude of the circular dichroism of (S)-aspartic acid in aqueous solutions at a fixed wavelength varies with the addition of strong base. This laboratory experiment consists of the circular dichroism spectrophotometric acid-base titration of (S)-aspartic acid in dilute aqueous solutions, and the use of the resulting data to determine the ionization constant of the protonated amino group. The work familiarizes students with circular dichroism and illustrates the possibility of performing titrations using a less usual instrumental method of following the course of a reaction. It shows the use of a chiroptical property in the determination of the concentration in solution of an optically active molecule, and exemplifies the use of a spectrophotometric titration in the determination of an ionization constant.
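
The underlying titration model is a two-state Henderson-Hasselbalch average of the CD signals of the protonated and deprotonated forms; the pKa and signal values below are illustrative, not the experiment's data:

```python
def cd_signal(pH, pKa, s_acid, s_base):
    """Two-state titration model: the observed CD signal is the
    population-weighted average of the protonated and deprotonated
    forms (Henderson-Hasselbalch)."""
    frac_base = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    return s_acid + (s_base - s_acid) * frac_base

# At pH = pKa the signal sits exactly midway between the two forms,
# which is how the inflection of the titration curve locates pKa:
print(cd_signal(9.8, 9.8, -1.0, 1.0))  # 0.0
```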

  13. Students' Representational Fluency at University: A Cross-Sectional Measure of How Multiple Representations Are Used by Physics Students Using the Representational Fluency Survey

    ERIC Educational Resources Information Center

    Hill, Matthew; Sharma, Manjula Devi

    2015-01-01

    To succeed within scientific disciplines, using representations, including those based on words, graphs, equations, and diagrams, is important. Research indicates that the use of discipline specific representations (sometimes referred to as expert generated representations), as well as multi-representational use, is critical for problem solving…

  14. Vietnamese Document Representation and Classification

    NASA Astrophysics Data System (ADS)

    Nguyen, Giang-Son; Gao, Xiaoying; Andreae, Peter

Vietnamese is very different from English, and little research has been done on Vietnamese document classification or, indeed, on any kind of Vietnamese language processing; only a few small corpora are available for research. We created a large Vietnamese text corpus of about 18000 documents and manually classified them based on different criteria, such as topics and styles, giving several classification tasks of different difficulty levels. This paper introduces a new syllable-based document representation at the morphological level of the language for efficient classification. We tested the representation on our corpus with different classification tasks using six classification algorithms and two feature selection techniques. Our experiments show that the new representation is effective for Vietnamese categorization, and suggest that the best performance can be achieved using a syllable-pair document representation, an SVM with a polynomial kernel as the learning algorithm, and information gain with an external dictionary for feature selection.
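
Because written Vietnamese separates syllables with spaces, the syllable-pair representation can be sketched as plain token bigrams (a simplification of the paper's full pipeline):

```python
def syllable_pairs(text):
    """Written Vietnamese separates syllables with spaces, so a
    syllable-pair feature is simply a bigram of whitespace tokens
    (a sketch of the representation, not the paper's pipeline)."""
    syllables = text.split()
    return [f"{a} {b}" for a, b in zip(syllables, syllables[1:])]

print(syllable_pairs("phân loại văn bản"))
# ['phân loại', 'loại văn', 'văn bản']
```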

  15. Nucleic acid-based nanoengineering: novel structures for biomedical applications

    PubMed Central

    Li, Hanying; LaBean, Thomas H.; Leong, Kam W.

    2011-01-01

    Nanoengineering exploits the interactions of materials at the nanometre scale to create functional nanostructures. It relies on the precise organization of nanomaterials to achieve unique functionality. There are no interactions more elegant than those governing nucleic acids via Watson–Crick base-pairing rules. The infinite combinations of DNA/RNA base pairs and their remarkable molecular recognition capability can give rise to interesting nanostructures that are only limited by our imagination. Over the past years, creative assembly of nucleic acids has fashioned a plethora of two-dimensional and three-dimensional nanostructures with precisely controlled size, shape and spatial functionalization. These nanostructures have been precisely patterned with molecules, proteins and gold nanoparticles for the observation of chemical reactions at the single molecule level, activation of enzymatic cascade and novel modality of photonic detection, respectively. Recently, they have also been engineered to encapsulate and release bioactive agents in a stimulus-responsive manner for therapeutic applications. The future of nucleic acid-based nanoengineering is bright and exciting. In this review, we will discuss the strategies to control the assembly of nucleic acids and highlight the recent efforts to build functional nucleic acid nanodevices for nanomedicine. PMID:23050076

  16. Crystal and molecular structure of eight organic acid-base adducts from 2-methylquinoline and different acids

    NASA Astrophysics Data System (ADS)

    Zhang, Jing; Jin, Shouwen; Tao, Lin; Liu, Bin; Wang, Daqi

    2014-08-01

Eight supramolecular complexes of 2-methylquinoline with acidic components, namely 4-aminobenzoic acid, 2-aminobenzoic acid, salicylic acid, 5-chlorosalicylic acid, 3,5-dinitrosalicylic acid, malic acid, sebacic acid, and 1,5-naphthalenedisulfonic acid, were synthesized and characterized by X-ray crystallography, IR, melting point, and elemental analysis. All of the complexes are organic salts except compound 2. All supramolecular architectures of 1-8 involve extensive classical hydrogen bonds as well as other noncovalent interactions. The results presented herein indicate that the strength and directionality of the classical hydrogen bonds (ionic or neutral) between the acidic components and 2-methylquinoline are sufficient to bring about the formation of binary organic acid-base adducts. The role of weak and strong noncovalent interactions in the crystal packing is ascertained. Through the combination of these weak interactions, complexes 1-8 display 2D-3D framework structures.

  17. Acid-base metabolism: implications for kidney stones formation.

    PubMed

    Hess, Bernhard

    2006-04-01

The physiology and pathophysiology of renal H+ ion excretion and urinary buffer systems are reviewed. The main focus is on the two major conditions related to acid-base metabolism that cause kidney stone formation, i.e., distal renal tubular acidosis (dRTA) and abnormally low urine pH with subsequent uric acid stone formation. Both entities can be seen against the background of disturbances of the major urinary buffer system, NH3 <--> NH4+. On the one hand, reduced distal tubular secretion of H+ ions results in an abnormally high urinary pH and either incomplete or complete dRTA. On the other hand, reduced production/availability of NH4+ is the cause of an abnormally low urinary pH, which predisposes to uric acid stone formation. Most recent research indicates that the latter abnormality may be a renal manifestation of the increasingly prevalent metabolic syndrome. Despite opposite deviations from normal urinary pH values, both dRTA and uric acid stone formation due to low urinary pH require the same treatment, i.e., alkali. In dRTA, alkali is needed to improve the body's buffer capacity, whereas the goal of alkali treatment in uric acid stone formers is to increase the urinary pH to 6.2-6.8 in order to minimize uric acid crystallization.
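
The pH-solubility argument can be made quantitative with Henderson-Hasselbalch, taking pKa1 of uric acid as roughly 5.5 (an approximate literature value, not from this review):

```python
def fraction_undissociated(pH, pKa=5.5):
    """Henderson-Hasselbalch fraction of uric acid in its poorly
    soluble, undissociated form (pKa1 ~ 5.5, an approximate value)."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

# Raising urine pH from 5.0 into the 6.2-6.8 target range sharply
# reduces the crystallizable fraction:
for pH in (5.0, 6.5):
    print(pH, round(fraction_undissociated(pH), 2))
```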

  18. Evolution of Acid-Base Concept (1917-1984).

    ERIC Educational Resources Information Center

    Gamble, James L., Jr.

    1984-01-01

    Evaluates the accuracy and usefulness of a simpler rationale for teaching acid-base physiology as compared to more complex approaches frequently taught in physiology courses. Also reviews problems of terminology, giving emphasis to the significant effects that the choice of words can have on students' concepts. (JN)

  19. Using Spreadsheets to Produce Acid-Base Titration Curves.

    ERIC Educational Resources Information Center

    Cawley, Martin James; Parkinson, John

    1995-01-01

    Describes two spreadsheets for producing acid-base titration curves, one uses relatively simple cell formulae that can be written into the spreadsheet by inexperienced students and the second uses more complex formulae that are best written by the teacher. (JRH)
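
The relation a titration-curve spreadsheet cell encodes can equally be solved numerically; a sketch for a weak acid titrated with strong base, using the exact charge balance and bisection (a generic textbook formulation, not the article's spreadsheet formulae):

```python
def titration_pH(Va, Ca, Vb, Cb, Ka, Kw=1e-14):
    """pH when Vb mL of strong base (conc. Cb) has been added to Va mL
    of a weak acid (conc. Ca, dissociation constant Ka), solved from
    the exact charge balance [H+] + [Na+] = [OH-] + [A-] by bisection."""
    V = Va + Vb
    ca, cb = Ca * Va / V, Cb * Vb / V        # diluted analytical conc.
    def residual(pH):
        h = 10.0 ** -pH
        a_minus = ca * Ka / (Ka + h)         # dissociated acid
        return h + cb - Kw / h - a_minus
    lo, hi = 0.0, 14.0
    for _ in range(60):                      # bisection on the residual
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Half-equivalence point of 25 mL of a 0.1 M acetic-like acid (pKa 4.76)
# titrated with 0.1 M base: the pH sits at ~pKa.
print(round(titration_pH(25.0, 0.1, 12.5, 0.1, 10.0 ** -4.76), 2))  # 4.76
```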

  20. Linear titration plots for polyfunctional weak acids and bases.

    PubMed

    Midgley, D; McCallum, C

    1976-04-01

    Procedures are derived for obtaining the equivalence volumes in the potentiometric titrations of polyfunctional weak acids and weak bases by a linear titration plot method. The effect of errors in the equilibrium constants on the accuracy is considered. A Fortran program is available to do the calculations.
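
A familiar member of this class of linear plots is the Gran plot, where G = (V0 + V)·10^(−pH) decays linearly to zero at the equivalence volume; the sketch below uses least-squares extrapolation (Gran's method is one example of such procedures, not necessarily the authors' exact formulation):

```python
import math

def gran_equivalence_volume(V0, vols, pHs):
    """Gran-type linear titration plot for an acid titrated with base:
    G = (V0 + V) * 10**(-pH) falls linearly before the equivalence
    point; the equivalence volume is where the least-squares line
    extrapolates to G = 0."""
    n = len(vols)
    G = [(V0 + v) * 10.0 ** (-p) for v, p in zip(vols, pHs)]
    mv, mg = sum(vols) / n, sum(G) / n
    slope = (sum((v - mv) * (g - mg) for v, g in zip(vols, G))
             / sum((v - mv) ** 2 for v in vols))
    return -(mg - slope * mv) / slope

# Synthetic check: 50 mL of 0.1 M strong acid titrated with 0.1 M base
# (true equivalence volume 50 mL):
vols = [10.0, 20.0, 30.0, 40.0]
pHs = [-math.log10((5.0 - 0.1 * v) / (50.0 + v)) for v in vols]
print(round(gran_equivalence_volume(50.0, vols, pHs), 3))  # 50.0
```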

  1. Dynamic Buffer Capacity in Acid-Base Systems.

    PubMed

    Michałowska-Kaczmarczyk, Anna M; Michałowski, Tadeusz

    The generalized concept of 'dynamic' buffer capacity βV is related to electrolytic systems of different complexity where acid-base equilibria are involved. The resulting formulas are presented in a uniform and consistent form. The detailed calculations are related to two Britton-Robinson buffers, taken as examples.
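
For a single monoprotic buffer, the static limit of the buffer capacity discussed above has the classical van Slyke closed form, β = ln 10 · ([H+] + [OH−] + C·Ka·[H+]/(Ka + [H+])²); a minimal sketch:

```python
import math

def buffer_capacity(pH, C, Ka, Kw=1e-14):
    """Classical van Slyke buffer capacity beta = dC_base/dpH for a
    single monoprotic acid of total concentration C (mol/L) -- the
    static limit of the 'dynamic' beta_V discussed above."""
    h = 10.0 ** -pH
    return math.log(10) * (h + Kw / h + C * Ka * h / (Ka + h) ** 2)

# beta peaks at pH = pKa, where the buffer term equals C * ln(10) / 4:
print(round(buffer_capacity(5.0, 0.1, 1e-5), 4))  # 0.0576
```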

  2. Acid-Base Disorders--A Computer Simulation.

    ERIC Educational Resources Information Center

    Maude, David L.

    1985-01-01

    Describes and lists a program for Apple Pascal Version 1.1 which investigates the behavior of the bicarbonate-carbon dioxide buffer system in acid-base disorders. Designed specifically for the preclinical medical student, the program has proven easy to use and enables students to use blood gas parameters to arrive at diagnoses. (DH)

  3. Students' Understanding of Acids/Bases in Organic Chemistry Contexts

    ERIC Educational Resources Information Center

    Cartrette, David P.; Mayo, Provi M.

    2011-01-01

    Understanding key foundational principles is vital to learning chemistry across different contexts. One such foundational principle is the acid/base behavior of molecules. In the general chemistry sequence, the Bronsted-Lowry theory is stressed, because it lends itself well to studying equilibrium and kinetics. However, the Lewis theory of…

  4. Turkish Prospective Chemistry Teachers' Alternative Conceptions about Acids and Bases

    ERIC Educational Resources Information Center

    Boz, Yezdan

    2009-01-01

    The purpose of this study was to obtain prospective chemistry teachers' conceptions about acids and bases concepts. Thirty-eight prospective chemistry teachers were the participants. Data were collected by means of an open-ended questionnaire and semi-structured interviews. Analysis of data indicated that most prospective teachers did not have…

  5. Hard and soft acids and bases: small molecules.

    PubMed

    Reed, James L

    2009-08-03

    The operational chemical hardness has been determined for the hydride, chloride, and fluoride derivatives of the anionic atomic bases of the second period. Of interest is the identification of the structure and associated processes that give rise to hard-soft behavior in small molecules. The Pearson Principle of Hard and Soft Acids and Bases has been taken to be the defining statement about hard-soft behavior and as a definition of chemical hardness. Similar to the case for atoms, the molecule's responding electrons have been identified as the structure giving rise to hard-soft behavior, and a relaxation described by a modified Slater model has been identified as the associated process. The responding electrons are the molecule's valence electrons that are not undergoing electron transfer in an acid-base interaction. However, it has been demonstrated that chemical hardness is a local property, and only those responding electrons that are associated with the base's binding atom directly impact chemical hardness.

  6. Developmental Changes in the Profiles of Dyscalculia: An Explanation Based on a Double Exact-and-Approximate Number Representation Model.

    PubMed

    Noël, Marie-Pascale; Rousselle, Laurence

    2011-01-01

Studies on developmental dyscalculia (DD) have tried to identify a basic numerical deficit that could account for this specific learning disability. The first proposition was that the number magnitude representation of these children was impaired. However, Rousselle and Noël (2007) presented data showing that this was not the case; rather, these children were impaired only when processing the magnitude of symbolic numbers. Since then, conflicting results have been published. In this paper, we propose a developmental perspective on this issue. We argue that the first deficit shown in DD concerns the building of an exact representation of numerical value, through the learning of symbolic numbers, and that the reduced acuity of the approximate number magnitude system appears only later and is secondary to the first deficit.

  7. Multiple Sparse Representations Classification

    PubMed Central

    Plenge, Esben; Klein, Stefan S.; Niessen, Wiro J.; Meijering, Erik

    2015-01-01

Sparse representations classification (SRC) is a powerful technique for pixelwise classification of images, and it is increasingly being used for a wide variety of image analysis tasks. The method uses sparse representation and learned redundant dictionaries to classify image pixels. In this empirical study we propose to further leverage the redundancy of the learned dictionaries to achieve a more accurate classifier. In conventional SRC, each image pixel is associated with a small patch surrounding it. Using these patches, a dictionary is trained for each class in a supervised fashion. Commonly, redundant/overcomplete dictionaries are trained and image patches are sparsely represented by a linear combination of only a few of the dictionary elements. Given a set of trained dictionaries, a new patch is sparse coded using each of them, and subsequently assigned to the class whose dictionary yields the minimum residual energy. We propose a generalization of this scheme. The method, which we call multiple sparse representations classification (mSRC), is based on the observation that an overcomplete, class-specific dictionary is capable of generating multiple accurate and independent estimates of a patch belonging to the class. So instead of finding a single sparse representation of a patch for each dictionary, we find multiple, and the corresponding residual energies provide an enhanced statistic which is used to improve classification. We demonstrate the efficacy of mSRC for three example applications: pixelwise classification of texture images, lumen segmentation in carotid artery magnetic resonance imaging (MRI), and bifurcation point detection in carotid artery MRI. We compare our method with conventional SRC, K-nearest neighbor, and support vector machine classifiers. The results show that mSRC outperforms SRC and the other reference methods.
In addition, we present an extensive evaluation of the effect of the main mSRC parameters: patch size, dictionary size, and
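
The SRC decision rule itself is compact. The sketch below replaces the full sparse solver with a single matching-pursuit step (a deliberate simplification, not the paper's coder) but keeps the minimum-residual-energy classification described above:

```python
def one_sparse_residual(patch, atoms):
    """Residual energy after the best 1-sparse approximation of `patch`
    over unit-norm `atoms` -- a single matching-pursuit step standing in
    for the full sparse coding, for illustration only."""
    energy = sum(p * p for p in patch)
    best = max(abs(sum(p * a for p, a in zip(patch, atom))) for atom in atoms)
    return energy - best * best

def src_classify(patch, class_dicts):
    """SRC decision rule: assign the class whose dictionary leaves the
    smallest residual energy."""
    residuals = [one_sparse_residual(patch, atoms) for atoms in class_dicts]
    return residuals.index(min(residuals))

# Two toy class dictionaries of unit-norm atoms:
atoms0 = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
atoms1 = [(0.0, 0.0, 1.0)]
print(src_classify((0.0, 0.0, 2.0), [atoms0, atoms1]))  # 1
```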

  8. Physiologically based pharmacokinetic modeling of dibromoacetic acid in F344 rats

    SciTech Connect

    Matthews, Jessica L.; Schultz, Irvin R.; Easterling, Michael R.; Melnick, Ronald L.

    2010-04-15

A novel physiologically based pharmacokinetic (PBPK) model structure, which includes submodels for the common metabolites (glyoxylate (GXA) and oxalate (OXA)) that may be involved in the toxicity or carcinogenicity of dibromoacetic acid (DBA), has been developed. Particular attention is paid to the representation of hepatic metabolism, which is the primary elimination mechanism. DBA-induced suicide inhibition is modeled by irreversible covalent binding of the intermediate metabolite alpha-halocarboxymethylglutathione (alphaH1) to the glutathione-S-transferase zeta (GSTzeta) enzyme. We also present data illustrating the presence of a secondary non-GSTzeta metabolic pathway for DBA, but not dichloroacetic acid (DCA), that produces GXA. The model is calibrated with plasma and urine concentration data from DBA exposures in female F344 rats through intravenous (IV), oral gavage, and drinking water routes. Sensitivity analysis is performed to confirm identifiability of estimated parameters. Finally, model validation is performed with data sets not used during calibration. Given the structural similarity of dihaloacetates (DHAs), we hypothesize that the PBPK model presented here has the capacity to describe the kinetics of any member or mixture of members of this class in any species with the alteration of chemical- and species-specific parameters.
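
The suicide-inhibition mechanism can be caricatured as Michaelis-Menten metabolism whose flux also consumes active enzyme. All names and rate constants below are illustrative, not the paper's fitted PBPK parameters:

```python
def suicide_inhibition(S0, E0, kcat, Km, k_inact, t_end, dt=0.01):
    """Toy kinetic sketch of GSTzeta suicide inhibition: substrate S is
    cleared by Michaelis-Menten kinetics in the remaining active enzyme
    E, and the same metabolic flux irreversibly inactivates enzyme
    (covalent binding of the intermediate). Explicit Euler integration."""
    S, E = S0, E0
    for _ in range(int(t_end / dt)):
        v = kcat * E * S / (Km + S)          # metabolic flux
        S = max(S - v * dt, 0.0)
        E = max(E - k_inact * v * dt, 0.0)   # enzyme lost per unit flux
    return S, E

S_end, E_end = suicide_inhibition(S0=10.0, E0=1.0, kcat=1.0, Km=1.0,
                                  k_inact=0.5, t_end=10.0)
print(E_end < 1.0, S_end < 10.0)  # True True
```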

  9. [Time perceptions and representations].

    PubMed

    Tordjman, S

    2015-09-01

    fundamentally lacking in their physiological development due to possibly altered circadian rhythms, including arhythmy and asynchrony. Time measurement, based on the repetition of discontinuity at regular intervals, involves also a spatial representation. It is our own trajectory through space-time, and thus our own motion, including the physiological process of aging, that affords us a representation of the passing of time, just as the countryside seems to be moving past us when we travel in a vehicle. Chinese and Indian societies actually have circular representations of time, and linear representations of time and its trajectory through space-time are currently a feature of Western societies. Circular time is collective time, and its metaphysical representations go beyond the life of a single individual, referring to the cyclical, or at least nonlinear, nature of time. Linear time is individual time, in that it refers to the scale of a person's lifetime, and it is physically represented by an arrow flying ineluctably from the past to the future. An intermediate concept can be proposed that acknowledges the existence of linear time involving various arrows of time corresponding to different lifespans (human, animal, plant, planet lifespans, etc.). In fact, the very notion of time would depend on the trajectory of each arrow of time, like shooting stars in the sky with different trajectory lengths which would define different time scales. The time scale of these various lifespans are very different (for example, a few decades for humans and a few days or hours for insects). It would not make sense to try to understand the passage of time experienced by an insect which may live only a few hours based on a human time scale. One hour in an insect's life cannot be compared to one experienced by a human. Yet again, it appears that there is a coexistence of different clocks based here on different lifespans. 
Finally, the evolution of our society focused on the present moment and

  10. Chirality in a quaternionic representation of the genetic code.

    PubMed

    Manuel Carlevaro, C; Irastorza, Ramiro M; Vericat, Fernando

    2016-12-01

    A quaternionic representation of the genetic code, previously reported by the authors (BioSystems 141 (10-19), 2016), is updated in order to incorporate chirality of nucleotide bases and amino acids. The original representation associates with each nucleotide base a prime integer quaternion of norm 7 and involves a function that assigns to each codon, represented by three of these quaternions, another integer quaternion (amino acid type quaternion). The assignation is such that the essentials of the standard genetic code (particularly its degeneration) are preserved. To show the advantages of such a quaternionic representation we have designed an algorithm to go from the primary to the tertiary structure of the protein. The algorithm uses, besides of the type quaternions, a second kind of quaternions with real components that we additionally associate with the amino acids according to their order along the proteins (order quaternions). In this context, we incorporate chirality in our representation by observing that the set of eight integer quaternions of norm 7 can be partitioned into a pair of subsets of cardinality four each with their elements mutually conjugate and by putting them into correspondence one to one with the two sets of enantiomers (D and L) of the four nucleotide bases adenine, cytosine, guanine and uracil, respectively. We then propose two diagrams in order to describe the hypothetical evolution of the genetic codes corresponding to both of the chiral systems of affinities: D-nucleotide bases/L-amino acids and L-nucleotide bases/D-amino acids at reading frames 5'→3' and 3'→5', respectively. Guided by these diagrams we define functions that in each case assign to the triplets of D- (L-) bases a L- (D-) amino acid type integer quaternion. Specifically, the integer quaternion associated with a given D-amino acid is the conjugate of that one corresponding to the enantiomer L. The chiral type quaternions obtained for the amino acids are used
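
The quaternion bookkeeping used here is elementary to reproduce; a sketch of the norm and the conjugation that the authors pair with swapping enantiomers (the specific base-to-quaternion assignment is theirs and is not reproduced):

```python
def qnorm(q):
    """Norm a^2 + b^2 + c^2 + d^2 of an integer quaternion (a, b, c, d)."""
    return sum(x * x for x in q)

def conjugate(q):
    """Quaternion conjugation -- the operation the paper pairs with
    swapping D- and L-enantiomers."""
    a, b, c, d = q
    return (a, -b, -c, -d)

# One integer quaternion of norm 7 and its conjugate; conjugation
# preserves the norm, so both subsets of the partition sit at norm 7:
q = (2, 1, 1, 1)
print(qnorm(q), qnorm(conjugate(q)))  # 7 7
```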

  11. Acid-base thermochemistry of gaseous oxygen and sulfur substituted amino acids (Ser, Thr, Cys, Met).

    PubMed

    Riffet, Vanessa; Frison, Gilles; Bouchoux, Guy

    2011-11-07

    Acid-base thermochemistry of isolated amino acids containing oxygen or sulfur in their side chain (serine, threonine, cysteine and methionine) has been examined by quantum chemical computations. Density functional theory (DFT) was used with the B3LYP, B97-D and M06-2X functionals, using the 6-31+G(d,p) basis set for geometry optimizations and the larger 6-311++G(3df,2p) basis set for energy computations. The composite methods CBS-QB3, G3B3, G4MP2 and G4 were applied to large sets of neutral, protonated and deprotonated conformers. Conformational analysis of these species, based on a chemical approach and AMOEBA force field calculations, has been used to identify the lowest energy conformers and to estimate the population of conformers expected to be present at thermal equilibrium at 298 K. It is observed that the G4, G4MP2, G3B3 and CBS-QB3 composite methods and M06-2X DFT lead to similar conformer energies. Thermochemical parameters have been computed using either the most stable conformers or equilibrium populations of conformers. Comparison of experimental and theoretical proton affinities and Δ(acid)H shows that the G4 method provides the best agreement, with deviations of less than 1.5 kJ mol(-1). From this point of view, a set of evaluated thermochemical quantities for serine, threonine, cysteine and methionine may be proposed: PA = 912, 919, 903, 938; GB = 878, 886, 870, 899; Δ(acid)H = 1393, 1391, 1396, 1411; Δ(acid)G = 1363, 1362, 1367, 1382 kJ mol(-1). This study also confirms that a non-negligible ΔpS° is associated with protonation of methionine and that the most acidic hydrogen of cysteine in the gas phase is that of the SH group. In several instances new conformers were identified, thus suggesting a re-examination of several IRMPD spectra.
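    The spread between the evaluated PA and GB values encodes the protonation entropy that the abstract flags as non-negligible for methionine. A minimal sketch, assuming the standard gas-phase convention GB = PA + T·(ΔpS° − S°(H+,g)) with S°(H+,g) ≈ 108.8 J mol⁻¹ K⁻¹ at 298.15 K, recovers this from the quoted numbers:

```python
T = 298.15            # temperature, K
S_H_PLUS = 108.8      # standard molar entropy of H+(g), J mol^-1 K^-1

def delta_p_s(pa_kj, gb_kj):
    """Protonation entropy DpS(M) = S(MH+) - S(M), derived from the
    gas-phase relation GB = PA + T*(DpS - S(H+,g))."""
    return (gb_kj - pa_kj) * 1000.0 / T + S_H_PLUS

# Evaluated PA/GB pairs from the abstract (kJ/mol)
data = {"Ser": (912, 878), "Thr": (919, 886), "Cys": (903, 870), "Met": (938, 899)}
dps = {aa: round(delta_p_s(pa, gb), 1) for aa, (pa, gb) in data.items()}
# dps["Met"] comes out near -22 J/mol/K, versus roughly -2 to -5 for the
# other three, consistent with the abstract's remark about methionine.
```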

  12. Acid-base transport in pancreas—new challenges

    PubMed Central

    Novak, Ivana; Haanes, Kristian A.; Wang, Jing

    2013-01-01

    Along the gastrointestinal tract a number of epithelia contribute acid or basic secretions in order to aid digestive processes. The stomach and pancreas are the most extreme examples of acid (H+) and base (HCO3−) transporters, respectively. Nevertheless, they share the same challenges of transporting acids and bases across epithelia and effectively regulating their intracellular pH. In this review, we make use of comparative physiology to illuminate the cellular mechanisms of pancreatic HCO3− and fluid secretion, which still challenge physiologists. Some of the novel transporters to consider in the pancreas are the proton pumps (H+-K+-ATPases), as well as the calcium-activated K+ and Cl− channels, such as KCa3.1 and TMEM16A/ANO1. Local regulators, such as purinergic signaling, fine-tune and coordinate pancreatic secretion. Lastly, we speculate whether dysregulation of acid-base transport contributes to pancreatic diseases including cystic fibrosis, pancreatitis, and cancer. PMID:24391597

  13. Coulometric titration of bases in acetic acid and acetonitrile media.

    PubMed

    Vajgand, V J; Mihajlović, R

    1969-09-01

    The working conditions and results for the coulometric titration of milligram amounts of some bases in 0.1M sodium perchlorate in a mixture of acetic acid and acetic anhydride (1:6) are given. Determinations were made either by coulometric back-titration or by direct titration at the platinum anode. Back-titration was done in the catholyte, by coulometric titration of the excess of added perchloric acid. The titration end-point was detected photometrically with Crystal Violet as indicator. The direct titration of bases was done at the platinum anode, in the same electrolyte, to which hydroquinone was added as anode depolarizer and as the source of hydrogen ions, Malachite Green being used as indicator. Similarly, bases can be determined in acetonitrile if sodium perchlorate, hydroquinone and Malachite Green are added to the solvent. Errors are below 1%, and the precision is satisfactory.
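    The amount of base titrated in such a coulometric experiment follows from Faraday's law, n = I·t/(zF). A minimal sketch; the current, time and molar mass below are illustrative values, not data from the paper:

```python
F = 96485.33  # Faraday constant, C/mol

def coulometric_moles(current_a, time_s, z=1):
    """Moles of titrant generated at the electrode (Faraday's law: n = I*t/(z*F))."""
    return current_a * time_s / (z * F)

# Hypothetical run: 10 mA of generating current applied for 120 s.
mol = coulometric_moles(0.010, 120.0)
# For a monoacidic base with an assumed molar mass of 100 g/mol this
# corresponds to roughly 1.24 mg -- milligram-scale, as in the paper.
mg = mol * 100.0 * 1000.0
```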

  14. Food composition and acid-base balance: alimentary alkali depletion and acid load in herbivores.

    PubMed

    Kiwull-Schöne, Heidrun; Kiwull, Peter; Manz, Friedrich; Kalhoff, Hermann

    2008-02-01

    Alkali-enriched diets are recommended for humans to diminish the net acid load of their usual diet. In contrast, herbivores have to deal with a high dietary alkali impact on acid-base balance. Here we explore the role of nutritional alkali in experimentally induced chronic metabolic acidosis. Data were collected from healthy male adult rabbits kept in metabolism cages to obtain 24-h urine and arterial blood samples. Randomized groups consumed rabbit diets ad libitum, providing sufficient energy but variable alkali load. One subgroup (n = 10) received high-alkali food and approximately 15 mEq/kg ammonium chloride (NH4Cl) with its drinking water for 5 d. Another group (n = 14) was fed low-alkali food for 5 d and given approximately 4 mEq/kg NH4Cl daily for the last 2 d. The wide range of alimentary acid-base load was significantly reflected by renal base excretion, but normal acid-base conditions were maintained in the arterial blood. In rabbits fed a high-alkali diet, the excreted alkaline urine (pH(u) > 8.0) typically contained a large amount of precipitated carbonate, whereas in rabbits fed a low-alkali diet, both pH(u) and precipitate decreased considerably. During high-alkali feeding, application of NH4Cl likewise decreased pH(u), but arterial pH was still maintained with no indication of metabolic acidosis. During low-alkali feeding, a comparably small amount of added NH4Cl further lowered pH(u) and was accompanied by a significant systemic metabolic acidosis. We conclude that exhausted renal base-saving function by dietary alkali depletion is a prerequisite for growing susceptibility to NH4Cl-induced chronic metabolic acidosis in the herbivore rabbit.

  15. Acid-base titrations by stepwise addition of equal volumes of titrant with special reference to automatic titrations-II Theory of titration of mixtures of acids, polyprotic acids, acids in mixture with weak bases, and ampholytes.

    PubMed

    Pehrsson, L; Ingman, F; Johansson, S

    A general method for evaluating titration data for mixtures of acids and for acids in mixture with weak bases is presented. Procedures are given that do not require absolute [H]-data, i.e., relative [H]-data may be used. In most cases a very rough calibration of the electrode system is enough. Further, for simple systems, very approximate values of the stability constants are sufficient. As examples, the titrations of the following are treated in some detail: a mixture of two acids, a diprotic acid, an acid in the presence of its conjugate base, and an ampholyte.

  16. Hard and soft acids and bases: structure and process.

    PubMed

    Reed, James L

    2012-07-05

    Under investigation are the structure and process that give rise to hard-soft behavior in simple anionic atomic bases. That, for simple atomic bases, the chemical hardness is expected to be the only extrinsic component of acid-base strength has been substantiated in the current study. A thermochemically based operational scale of chemical hardness was used to identify the structure within anionic atomic bases that is responsible for chemical hardness. The base's responding electrons have been identified as the structure, and the relaxation that occurs during charge transfer has been identified as the process giving rise to hard-soft behavior. This is in contrast to the commonly accepted explanations that attribute hard-soft behavior to varying degrees of electrostatic and covalent contributions to the acid-base interaction. The ability of the atomic ion's responding electrons to cause hard-soft behavior has been assessed by examining the correlation of the estimated relaxation energies of the responding electrons with the operational chemical hardness. It has been demonstrated that the responding electrons are able to give rise to hard-soft behavior in simple anionic bases.

  17. Bio-based production of organic acids with Corynebacterium glutamicum.

    PubMed

    Wieschalka, Stefan; Blombach, Bastian; Bott, Michael; Eikmanns, Bernhard J

    2013-03-01

    The shortage of oil resources, steadily rising oil prices and the environmental impact of oil use evoke increasing political, industrial and technical interest in the development of safe and efficient processes for the production of chemicals from renewable biomass. Thus, microbial fermentation of renewable feedstocks has found its way into white biotechnology, complementing more and more the traditional crude oil-based chemical processes. Rational strain design of appropriate microorganisms has become possible due to steadily increasing knowledge of the metabolism and pathway regulation of industrially relevant organisms and, aside from process engineering and optimization, has an outstanding impact on improving the performance of such hosts. Corynebacterium glutamicum is well known as a workhorse for the industrial production of numerous amino acids. However, recent studies have also explored the usefulness of this organism for the production of several organic acids, and great efforts have been made to improve its performance. This review summarizes the current knowledge and recent achievements in metabolic engineering approaches to tailor C. glutamicum for the bio-based production of organic acids. We focus here on the fermentative production of pyruvate, L- and D-lactate, 2-ketoisovalerate, 2-ketoglutarate, and succinate. These organic acids represent a class of compounds with a broad range of applications, e.g. in the pharmaceutical and cosmetics industries and as food additives, and, most interestingly from an economic perspective, as precursors for a variety of bulk chemicals and commercially important polymers.

  18. Acid-Base Balance in Uremic Rats with Vascular Calcification

    PubMed Central

    Peralta-Ramírez, Alan; Raya, Ana Isabel; Pineda, Carmen; Rodríguez, Mariano; Aguilera-Tejero, Escolástico; López, Ignacio

    2014-01-01

    Background/Aims Vascular calcification (VC), a major complication in humans and animals with chronic kidney disease (CKD), is influenced by changes in acid-base balance. The purpose of this study was to describe the acid-base balance in uremic rats with VC and to correlate the parameters that define acid-base equilibrium with VC. Methods Twenty-two rats with CKD induced by 5/6 nephrectomy (5/6 Nx) and 10 nonuremic control rats were studied. Results The 5/6 Nx rats showed extensive VC as evidenced by a high aortic calcium (9.2 ± 1.7 mg/g of tissue) and phosphorus (20.6 ± 4.9 mg/g of tissue) content. Uremic rats had an increased pH level (7.57 ± 0.03) as a consequence of both respiratory (PaCO2 = 28.4 ± 2.1 mm Hg) and, to a lesser degree, metabolic (base excess = 4.1 ± 1 mmol/l) derangements. A high positive correlation between both anion gap (AG) and strong ion difference (SID) with aortic calcium (AG: r = 0.604, p = 0.02; SID: r = 0.647, p = 0.01) and with aortic phosphorus (AG: r = 0.684, p = 0.007; SID: r = 0.785, p = 0.01) was detected. Conclusions In an experimental model of uremic rats, VC showed high positive correlation with AG and SID. PMID:25177336
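    The two acid-base parameters correlated with aortic mineral content here, the anion gap (AG) and the strong ion difference (SID), can be computed from routine electrolytes. A minimal sketch using the standard clinical definitions; the study may include additional strong ions (e.g. Ca²⁺, Mg²⁺) in its SID, and the plasma values below are illustrative, not data from the paper:

```python
def anion_gap(na, k, cl, hco3):
    """Anion gap in mEq/l, potassium included: AG = (Na+ + K+) - (Cl- + HCO3-)."""
    return (na + k) - (cl + hco3)

def strong_ion_difference(na, k, cl, lactate=0.0):
    """Simplified apparent strong ion difference in mEq/l, restricted to the
    major strong ions: SID = (Na+ + K+) - (Cl- + lactate-)."""
    return (na + k) - (cl + lactate)

# Illustrative plasma values (mEq/l):
ag = anion_gap(na=140.0, k=4.0, cl=104.0, hco3=24.0)                 # 16.0
sid = strong_ion_difference(na=140.0, k=4.0, cl=104.0, lactate=1.0)  # 39.0
```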

  19. Developing nucleic acid-based electrical detection systems

    PubMed Central

    Gabig-Ciminska, Magdalena

    2006-01-01

    Development of nucleic acid-based detection systems is the main focus of many research groups and high technology companies. The enormous work done in this field is particularly due to the broad versatility and variety of these sensing devices. From optical to electrical systems, from label-dependent to label-free approaches, from single to multi-analyte and array formats, this wide range of possibilities makes the research field very diversified and competitive. New challenges and requirements for an ideal detector suitable for nucleic acid analysis include a protocol with high sensitivity and high specificity that can be completed in a relatively short time while offering a low detection limit. Moreover, systems that can be miniaturized and automated present a significant advantage over conventional technology, especially if detection is needed in the field. Electrical system technology for nucleic acid-based detection is an enabling mode for making miniaturized, micro- and nanometer-scale bio-monitoring devices via the fusion of modern micro- and nanofabrication technology and molecular biotechnology. The electrical biosensors that rely on the conversion of the Watson-Crick base-pair recognition event into a useful electrical signal are advancing rapidly, and recently have received much attention as a valuable tool for microbial pathogen detection. Pathogens may pose a serious threat to humans, animals and plants, thus their detection and analysis is a significant element of public health. Although different conventional methods for detection of pathogenic microorganisms and their toxins exist and are currently being applied, improvements of molecular-based detection methodologies have changed these traditional detection techniques and introduced a new era of rapid, miniaturized and automated electrical chip detection technologies into the pathogen identification sector. In this review some developments and current directions in nucleic acid-based electrical

  20. The physiological assessment of acid-base balance.

    PubMed

    Howorth, P J

    1975-04-01

    Acid-base terminology, including the use of SI units, is reviewed. The historical reasons why nomograms have been particularly used in acid-base work are discussed. The theoretical basis of the Henderson-Hasselbalch equation is considered. It is emphasized that the solubility of CO2 in plasma and the apparent first dissociation constant of carbonic acid are not chemical constants when applied to media of uncertain and varying composition such as blood plasma. The use of the Henderson-Hasselbalch equation in making hypothermia corrections for PCO2 is discussed. The Astrup system for the in vitro determination of blood gases and derived parameters is described and the theoretical weakness of the base excess concept stressed. A more clinically-oriented approach to the assessment of acid-base problems is presented. Measurements of blood [H+] and PCO2 are considered to be primary data which should be recorded on a chart with in vivo CO2-titration lines (see below). Clinical information and results of other laboratory investigations such as plasma bicarbonate, PO2 and P50 are then to be considered together with the primary data. In order to interpret this combined information it is essential to take into account the known ventilatory response to metabolic acidosis and alkalosis, and the renal response to respiratory acidosis and alkalosis. The use of a chart showing the whole-body CO2-titration points obtained when patients with different initial levels of non-respiratory [H+] are ventilated is recommended. A number of examples are given of the use of this [H+] and PCO2 in vivo chart in the interpretation of acid-base data. The aetiology, prognosis and treatment of metabolic alkalosis is briefly reviewed. Treatment with intravenous acid is recommended for established cases. Attention is drawn to the possibility of iatrogenic production of metabolic alkalosis. Caution is expressed over the use of intravenous alkali in all but the severest cases of metabolic acidosis.
The role of
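    The conventional plasma form of the Henderson-Hasselbalch equation discussed above is pH = pK′ + log10([HCO3−]/(S·PCO2)). A minimal sketch with the textbook constants (pK′ = 6.1, S = 0.0307 mmol l⁻¹ mmHg⁻¹), which, as the review stresses, are only approximations in media of varying composition:

```python
import math

def plasma_ph(pco2_mmhg, hco3_mm, pk=6.1, s=0.0307):
    """Henderson-Hasselbalch for plasma: pH = pK' + log10([HCO3-]/(S*PCO2)).
    pK' and the CO2 solubility S are the conventional textbook values; they
    are not true constants in media such as blood plasma."""
    return pk + math.log10(hco3_mm / (s * pco2_mmhg))

# Normal arterial values, PCO2 = 40 mmHg and [HCO3-] = 24 mM,
# give a pH close to 7.39.
ph = plasma_ph(40.0, 24.0)
```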

  1. Acid and base stress and transcriptomic responses in Bacillus subtilis.

    PubMed

    Wilks, Jessica C; Kitko, Ryan D; Cleeton, Sarah H; Lee, Grace E; Ugwu, Chinagozi S; Jones, Brian D; BonDurant, Sandra S; Slonczewski, Joan L

    2009-02-01

    Acid and base environmental stress responses were investigated in Bacillus subtilis. B. subtilis AG174 cultures in buffered potassium-modified Luria broth were switched from pH 8.5 to pH 6.0 and recovered growth rapidly, whereas cultures switched from pH 6.0 to pH 8.5 showed a long lag time. Log-phase cultures at pH 6.0 survived 60 to 100% at pH 4.5, whereas cells grown at pH 7.0 survived <15%. Cells grown at pH 9.0 survived 40 to 100% at pH 10, whereas cells grown at pH 7.0 survived <5%. Thus, growth in a moderate acid or base induced adaptation to a more extreme acid or base, respectively. Expression indices from Affymetrix chip hybridization were obtained for 4,095 protein-encoding open reading frames of B. subtilis grown at external pH 6, pH 7, and pH 9. Growth at pH 6 upregulated acetoin production (alsDS), dehydrogenases (adhA, ald, fdhD, and gabD), and decarboxylases (psd and speA). Acid upregulated malate metabolism (maeN), metal export (czcDO and cadA), oxidative stress (catalase katA; OYE family namA), and the SigX extracytoplasmic stress regulon. Growth at pH 9 upregulated arginine catabolism (roc), which generates organic acids, glutamate synthase (gltAB), polyamine acetylation and transport (blt), the K(+)/H(+) antiporter (yhaTU), and cytochrome oxidoreductases (cyd, ctaACE, and qcrC). The SigH, SigL, and SigW regulons were upregulated at high pH. Overall, greater genetic adaptation was seen at pH 9 than at pH 6, which may explain the lag time required for growth shift to high pH. Low external pH favored dehydrogenases and decarboxylases that may consume acids and generate basic amines, whereas high external pH favored catabolism-generating acids.

  2. Acid-base properties of bentonite rocks with different origins.

    PubMed

    Nagy, Noémi M; Kónya, József

    2006-03-01

    Five bentonite samples (35-47% montmorillonite) from a Sarmatian sediment series with bentonite sites around Sajóbábony (Hungary) are studied. Some of these samples were tuffogenic (sedimentary) bentonite, the others bentonitized tuff of volcano-sedimentary origin. The acid-base properties of the edge sites were studied by potentiometric titrations and surface complexation modeling. It was found that the number and ratio of silanol and aluminol sites, as well as the intrinsic stability constants, differ between the sedimentary bentonite and the bentonitized tuff. The characteristic properties of the edge sites depend on the origin. The acid-base properties are compared to those of other commercial and standard bentonites.

  3. Synthesis and characterization of copolyanhydrides of carbohydrate-based galactaric acid and adipic acid.

    PubMed

    Mehtiö, Tuomas; Nurmi, Leena; Rämö, Virpi; Mikkonen, Hannu; Harlin, Ali

    2015-01-30

    A series of copolyanhydrides, consisting of 2,3,4,5-tetra-O-acetylgalactaric acid (AGA) and adipic acid (AA) as monomer units, was polymerized. Synthesis of AGA monomer consisted of two steps. First, O-acetylation of galactaric acid secondary hydroxyl groups was performed using acetic anhydride as a reagent. Acetic anhydride was then further used as a reagent in the synthesis of diacetyl mixed anhydride of AGA. Polymerizations were conducted as bulk condensation polymerization at 150 °C. Thermal properties of the copolymers varied depending on monomer composition. Increase in the AGA content had a clear increasing effect on the Tg. A similar increasing effect was observed in Tm. The degree of crystallinity decreased as AGA content increased. There was a slightly lowering tendency in the molecular weights of the obtained polymers when the AGA content in the polymerization mixtures increased. The described synthesis route shows that bio-based aldaric acid monomers are potential candidates for the adjustment of thermal properties of polyanhydrides.

  4. Acid-base thermochemistry of gaseous aliphatic α-aminoacids.

    PubMed

    Bouchoux, Guy; Huang, Sihua; Inda, Bhawani Singh

    2011-01-14

    Acid-base thermochemistry of isolated aliphatic amino acids (denoted AAA): glycine, alanine, valine, leucine, isoleucine and proline has been examined theoretically by quantum chemical computations at the G3MP2B3 level. Conformational analysis on neutral, protonated and deprotonated species has been used to identify the lowest energy conformers and to estimate the population of conformers expected to be present at thermal equilibrium at 298 K. Comparison of the G3MP2B3 theoretical proton affinities, PA, and ΔH(acid) with experimental results is shown to be correct if experimental thermochemistry is re-evaluated and adapted to the most recent acidity-basicity scales. From this point of view, a set of evaluated proton affinities of 887, 902, 915, 916, 919 and 941 kJ mol(-1), and a set of evaluated ΔH(acid) of 1433, 1430, 1423, 1423, 1422 and 1426 kJ mol(-1), is proposed for glycine, alanine, valine, leucine, isoleucine and proline, respectively. Correlations with structural parameters (Taft's σ(α) polarizability parameter and molecular size) suggest that polarizability of the side chain is the major origin of the increase in PA and decrease in ΔH(acid) along the homologous series glycine, alanine, valine and leucine/isoleucine. Heats of formation of gaseous species AAA, AAAH(+) and [AAA-H](-) were computed at the G3MP2B3 level. The present study provides previously unavailable Δ(f)H°(298) for the ionized species AAAH(+) and [AAA-H](-). Comparison with Benson's estimate, and correlation with molecular size, show that several experimental Δ(f)H°(298) values of neutral or gaseous AAA might be erroneous.

  5. [Injuries caused by acids and bases - emergency treatment].

    PubMed

    Reifferscheid, Florian; Stuhr, Markus; Kaiser, Guido; Freudenberg, Matthias; Kerner, Thoralf

    2014-06-01

    Emergency medical care for injuries caused by acids and bases is challenging for rescue services, which have to deal with operational safety, detection of the toxic agent, emergency medical care of the patient and handling of the rescue mission. Because such situations are rare, experience and routine are largely missing. This article highlights some basic points of therapy and provides support for such rescue missions.

  6. Synthesis of bio-based methacrylic acid by decarboxylation of itaconic acid and citric acid catalyzed by solid transition-metal catalysts.

    PubMed

    Le Nôtre, Jérôme; Witte-van Dijk, Susan C M; van Haveren, Jacco; Scott, Elinor L; Sanders, Johan P M

    2014-09-01

    Methacrylic acid, an important monomer for the plastics industry, was obtained in high selectivity (up to 84%) by the decarboxylation of itaconic acid using heterogeneous catalysts based on Pd, Pt and Ru. The reaction takes place in water at 200-250 °C without any external added pressure, conditions significantly milder than those described previously for the same conversion with better yield and selectivity. A comprehensive study of the reaction parameters has been performed, and the isolation of methacrylic acid was achieved in 50% yield. The decarboxylation procedure is also applicable to citric acid, a more widely available bio-based feedstock, and leads to the production of methacrylic acid in one pot in 41% selectivity. Aconitic acid, the intermediate compound in the pathway from citric acid to itaconic acid was also used successfully as a substrate.

  7. Cp-curve, a Novel 3-D Graphical Representation of Proteins

    NASA Astrophysics Data System (ADS)

    Bai, Haihua; Li, Chun; Agula, Hasi; Jirimutu, Jirimutu; Wang, Jun; Xing, Lili

    2007-12-01

    Based on a five-letter model of the 20 amino acids, we propose a novel 3-D graphical representation of proteins. The method is illustrated on the mutant exon 1 of EDA gene of a Mongolian family with X-linked congenital anodontia/wavy hair.

  8. The normal acid-base status of mice.

    PubMed

    Iversen, Nina K; Malte, Hans; Baatrup, Erik; Wang, Tobias

    2012-03-15

    Rodent models are commonly used for various physiological studies, including studies of acid-base regulation. Despite the widespread use of especially genetically modified mice, little attention has been paid to characterising the normal acid-base status of these animals in order to establish proper control values. Furthermore, several studies report blood gas values obtained in anaesthetised animals. We, therefore, decided to characterise the CO(2) binding characteristics of mouse blood in vitro and the normal acid-base status of conscious BALBc mice. In vitro CO(2) dissociation curves, performed on whole blood equilibrated to various PCO₂ levels in rotating tonometers, revealed a typical mammalian pK' (pK'=7.816-0.234 × pH (r=0.34)) and a non-bicarbonate buffer capacity of 16.1 ± 2.6 slykes. To measure arterial acid-base status, small blood samples were taken from undisturbed mice with indwelling catheters in the carotid artery. In these animals, pH was 7.391 ± 0.026, plasma [HCO(3)(-)] 18.4 ± 0.83 mM, PCO₂ 30.3 ± 2.1 mm Hg and lactate concentration 4.6 ± 0.7 mM. Our study, therefore, shows that mice have an arterial pH that resembles other mammals, although arterial PCO₂ tends to be lower than in larger mammals. However, pH of arterial blood sampled from mice anaesthetised with isoflurane was significantly lower (pH 7.239 ± 0.021), while plasma [HCO(3)(-)] was 18.5 ± 1.4 mM, PCO₂ 41.9 ± 2.9 mm Hg and lactate concentration 4.48 ± 0.67 mM. Furthermore, we measured metabolism and ventilation (VE) in order to determine the ventilatory requirement (VE/VO₂) and answer whether small mammals tend to hyperventilate. We recommend, therefore, that studies on acid-base regulation in mice be based on samples taken from indwelling catheters rather than by cardiac puncture of terminally anaesthetised mice.
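    As a consistency check, the study's pH-dependent pK' regression can be combined with the rearranged Henderson-Hasselbalch equation to recover plasma [HCO3−] from the reported conscious-mouse pH and PCO₂. The CO2 solubility coefficient below is the conventional plasma value, an assumption not stated in the abstract:

```python
def pk_prime(ph):
    """pH-dependent apparent pK' of mouse blood reported in the study."""
    return 7.816 - 0.234 * ph

def hco3_mm(ph, pco2_mmhg, s=0.0307):
    """Rearranged Henderson-Hasselbalch: [HCO3-] = S * PCO2 * 10**(pH - pK').
    S is the conventional plasma CO2 solubility (mmol/l/mmHg) -- an assumed
    value, not one given in the abstract."""
    return s * pco2_mmhg * 10.0 ** (ph - pk_prime(ph))

# Conscious mice: pH 7.391 and PCO2 30.3 mmHg give roughly 18-19 mM,
# consistent with the reported plasma [HCO3-] of 18.4 +/- 0.83 mM.
hco3 = hco3_mm(7.391, 30.3)
```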

  9. Lewis base activation of Lewis acids: development of a Lewis base catalyzed selenolactonization.

    PubMed

    Denmark, Scott E; Collins, William R

    2007-09-13

    The concept of Lewis base activation of Lewis acids has been applied to the selenolactonization reaction. Through the use of substoichiometric amounts of Lewis bases with "soft" donor atoms (S, Se, P) significant rate enhancements over the background reaction are seen. Preliminary mechanistic investigations have revealed the resting state of the catalyst as well as the significance of a weak Brønsted acid promoter.

  10. Identifying Bilingual Semantic Neural Representations across Languages

    ERIC Educational Resources Information Center

    Buchweitz, Augusto; Shinkareva, Svetlana V.; Mason, Robert A.; Mitchell, Tom M.; Just, Marcel Adam

    2012-01-01

    The goal of the study was to identify the neural representation of a noun's meaning in one language based on the neural representation of that same noun in another language. Machine learning methods were used to train classifiers to identify which individual noun bilingual participants were thinking about in one language based solely on their…

  11. [Nutrition, acid-base metabolism, cation-anion difference and total base balance in humans].

    PubMed

    Mioni, R; Sala, P; Mioni, G

    2008-01-01

    The relationship between dietary intake and acid-base metabolism has been investigated in the past by means of the inorganic cation-anion difference (C(+)(nm)-A(-)(nm)) method, based on dietary ash-acidity titration after the oxidative combustion of food samples. Besides the inorganic components of TA (A(-)(nm)-C(+)(nm)), which are under renal control, there are also metabolizable components (A(-)(m)-C(+)(m)) of TA, which are under the control of the intermediate metabolism. The whole body base balance, NBb(W), is obtained only by the application of C(+)(nm)-A(-)(nm) to food, feces and urine, while the metabolizable component (A(-)(m)-C(+)(m)) is disregarded. A novel method has subsequently been suggested to calculate the net balance of fixed acid, given by the difference between the input of net endogenous acid production, NEAP = SO(4)(2-) + A(-)(m) - (C(+)(nm)-A(-)(nm)), and the output of net acid excretion, NAE = TA + NH(4)(+) - HCO(3)(-). This approach has been criticized because 1) it includes metabolizable acids, whose production cannot be measured independently; 2) the specific control of metabolizable acids and bases has been incorrectly attributed to the kidney; 3) the inclusion of A(-)(m) in the balance input generates an acid overload; 4) the object of measurement in making up a balance has to be the same, a condition not fulfilled as NEAP is different from NAE. Lastly, by rearranging the net acid balance equation, the balance of nonmetabolizable acid equation is obtained. Therefore, any discrepancy between these two equations is due to inaccuracy in the urine measurement of metabolizable cations and/or anions.

  12. Acid Base Equilibrium in a Lipid/Water Gel

    NASA Astrophysics Data System (ADS)

    Streb, Kristina K.; Ilich, Predrag-Peter

    2003-12-01

    A new and original experiment in which partition of bromophenol blue dye between water and lipid/water gel causes a shift in the acid-base equilibrium of the dye is described. The dye-absorbing material is a monoglyceride food additive of plant origin that mixes freely with water to form a stable cubic phase gel; the nascent gel absorbs the dye from aqueous solution and converts it to the acidic form. There are three concurrent processes taking place in the experiment: (a) formation of the lipid/water gel, (b) absorption of the dye by the gel, and (c) protonation of the dye in the lipid/water gel environment. As the aqueous solution of the dye is a deep purple-blue color at neutral pH and yellow at acidic pH, the result of these processes is visually striking: the strongly green-yellow particles of lipid/water gel are suspended in purple-blue aqueous solution. The local acidity of the lipid/water gel is estimated by UV-vis spectrophotometry. This experiment is an example of host-guest (lipid/water gel-dye) interaction and is suitable for project-type biophysics, physical chemistry, or biochemistry labs. The experiment requires three, 3-hour lab sessions, two of which must not be separated by more than two days.

  13. Acid-base chemistry of frustrated water at protein interfaces.

    PubMed

    Fernández, Ariel

    2016-01-01

    Water molecules at a protein interface are often frustrated in hydrogen-bonding opportunities due to subnanoscale confinement. As shown, this condition makes them behave as a general base that may titrate side-chain ammonium and guanidinium cations. Frustration-based chemistry is captured by a quantum mechanical treatment of proton transference and shown to remove same-charge uncompensated anticontacts at the interface found in the crystallographic record and in other spectroscopic information on the aqueous interface. Such observations are untenable within classical arguments, as hydronium is a stronger acid than ammonium or guanidinium. Frustration enables a directed Grotthuss mechanism for proton transference stabilizing same-charge anticontacts.

  14. Understanding Representation in Design.

    ERIC Educational Resources Information Center

    Bodker, Susanne

    1998-01-01

    Discusses the design of computer applications, focusing on understanding design representations--what makes design representations work, and how, in different contexts. Examines the place of various types of representation (e.g., formal notations, models, prototypes, scenarios, and mock-ups) in design and the role of formalisms and representations…

  15. Nutrition, acid-base status and growth in early childhood.

    PubMed

    Kalhoff, H; Manz, F

    2001-10-01

    Optimal growth is only possible in a well-balanced "inner milieu". Premature infants are especially vulnerable to disturbances of acid-base metabolism, with a predisposition to metabolic acidosis due to a transient disproportion between age-related low renal capacity for net acid excretion (NAE) and an unphysiologically high actual renal NAE on nutrition with standard formulas. During a 50 month period, 452 low birth-weight infants were screened for spontaneous development of incipient late metabolic acidosis (ILMA), an early stage during the development of retention acidosis, characterized by maximum renal acid stimulation (MRAS, urine-pH < 5.4) on two consecutive days but a still compensated systemic acid-base status. Compared with controls, patients with ILMA showed higher serum creatinine values, an increased urinary excretion of sodium, aldosterone and nitrogen, but only slightly lower blood pH (7.38 vs 7.41) and base excess (-2.8 vs. 0.2 mmol/l) with respiratory compensation (PCO2 35 vs 37 mm Hg). Patients with altogether 149 episodes of ILMA were subsequently randomly allocated to either treatment with NaHCO3 2 mmol/kg/d for 7 days or no special therapy in protocol I, or NaHCO3 vs NaCl each 2 mmol/kg/d for 7 days in protocol II. Patients of protocol I with persistent MRAS for 7 days showed the lowest weight gain and a tendency for a further increase in urinary aldosterone and nitrogen excretion. NaCl supplementation (protocol II) seemed to promote weight gain without affecting either impaired mineralization or suboptimal nitrogen retention. Patients with alkali therapy under both protocols showed normal weight gain and normalization of hormonal stimulation, mineralization (protocol II) and nitrogen assimilation. Modification of the mineral content of a standard preterm formula decreased renal NAE to the low level seen on alimentation with human milk and reduced the incidence of ILMA in preterm and small-for-gestational-age infants to 1%. 
The data show that ILMA is

  16. Acid-base catalysis of N-[(morpholine)methylene]daunorubicin.

    PubMed

    Krause, Anna; Jelińska, Anna; Cielecka-Piontek, Judyta; Klawitter, Maria; Zalewski, Przemysław; Oszczapowicz, Irena; Wąsowska, Małgorzata

    2012-08-01

    The stability of N-[(morpholine)methylene]daunorubicin hydrochloride (MMD) was investigated in the pH range 0.44-13.54, at 313, 308, 303 and 298 K. The degradation of MMD as a result of hydrolysis is a pseudo-first-order reaction described by the following equation: ln c = ln c(0) - k(obs) • t. In solutions of hydrochloric acid, sodium hydroxide, and borate, acetate and phosphate buffers, k(obs) = k(pH), because general acid-base catalysis was not observed. Specific acid-base catalysis of MMD comprises the following reactions: hydrolysis of the protonated molecules of MMD catalyzed by hydrogen ions (k(1)), and spontaneous hydrolysis of MMD molecules other than the protonated ones (k(2)) under the influence of water. The total rate of the reaction is equal to the sum of the partial reactions: k(pH) = k(1) • a(H)+ • f(1) + k(2) • f(2), where k(1) is the second-order rate constant (mol(-1) l s(-1)) of the specific hydrogen ion-catalyzed degradation of the protonated molecules of MMD, k(2) is the pseudo-first-order rate constant (s(-1)) of the water-catalyzed degradation of MMD molecules other than the protonated ones, and f(1) and f(2) are the fractions of the respective forms of the compound. MMD is most stable at approximately pH 2.5.
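
The two rate expressions quoted in this abstract translate directly into code. The sketch below is illustrative only: the rate constants and pKa are hypothetical placeholders, not the values fitted for MMD; it shows only the form of the specific acid-base catalysis model.

```python
import math

def k_ph(pH, k1, k2, pKa):
    """Observed pseudo-first-order rate constant from specific acid-base
    catalysis: k(pH) = k1*a_H*f1 + k2*f2, where f1 and f2 are the fractions
    of the protonated and non-protonated forms of the compound."""
    a_h = 10.0 ** (-pH)
    ka = 10.0 ** (-pKa)
    f1 = a_h / (a_h + ka)   # fraction present as the protonated form
    f2 = ka / (a_h + ka)    # fraction present as other (non-protonated) forms
    return k1 * a_h * f1 + k2 * f2

def concentration(c0, k_obs, t):
    """Pseudo-first-order decay: ln c = ln c0 - k_obs * t."""
    return c0 * math.exp(-k_obs * t)

# Hypothetical parameters, for illustration only: scan a pH grid for the
# pH of maximum stability (minimum observed rate constant).
most_stable_ph = min((k_ph(p / 10.0, 1e-2, 1e-6, 8.0), p / 10.0)
                     for p in range(0, 131))[1]
```

With placeholder constants of this shape, k(pH) passes through a minimum at an intermediate pH where neither the acid-catalyzed nor the water-catalyzed pathway dominates, mirroring the kind of pH-of-maximum-stability reported here (approximately pH 2.5 for the actual fitted constants).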

  17. A microarray-based method to perform nucleic acid selections.

    PubMed

    Aminova, Olga; Disney, Matthew D

    2010-01-01

    This method describes a microarray-based platform to perform nucleic acid selections. Chemical ligands to which a nucleic acid binder is desired are immobilized onto an agarose microarray surface; the array is then incubated with an RNA library. Bound RNA library members are harvested directly from the array surface via gel excision at the position on the array where a ligand was immobilized. The RNA is then amplified via RT-PCR, cloned, and sequenced. This method has the following advantages over traditional resin-based Systematic Evolution of Ligands by Exponential Enrichment (SELEX): (1) multiple selections can be completed in parallel on a single microarray surface; (2) kinetic biases in the selections are mitigated since all RNA binders are harvested from an array via gel excision; (3) the amount of chemical ligand needed to perform a selection is minimized; (4) selections do not require expensive resins or equipment; and (5) the matrix used for selections is inexpensive and easy to prepare. Although this protocol was demonstrated for RNA selections, it should be applicable for any nucleic acid selection.

  18. Age estimation based on aspartic acid racemization in human sclera.

    PubMed

    Klumb, Karolin; Matzenauer, Christian; Reckert, Alexandra; Lehmann, Klaus; Ritz-Timme, Stefanie

    2016-01-01

    Age estimation based on racemization of aspartic acid residues (AAR) in permanent proteins has been established in forensic medicine for years. While dentine is the tissue of choice for this molecular method of age estimation, teeth are not always available, which leads to the need to identify other suitable tissues. We examined the suitability of total tissue samples of human sclera for the estimation of age at death. Sixty-five samples of scleral tissue were analyzed. The samples were hydrolyzed and, after derivatization, the extent of aspartic acid racemization was determined by gas chromatography. The degree of AAR increased with age. In samples from younger individuals, the correlation of age and D-aspartic acid content was closer than in samples from older individuals. The age-dependent racemization in total tissue samples proves that permanent, or at least long-living, proteins are present in scleral tissue. The correlation of AAR in human sclera and age at death is close enough to serve as a basis for age estimation. However, the precision of age estimation by this method is lower than that of age estimation based on the analysis of dentine, which is due to molecular inhomogeneities of total tissue samples of sclera. Nevertheless, the approach may serve as a valuable alternative or addition in exceptional cases.
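
The kinetic model underlying AAR dating is reversible first-order racemization, for which ln[(1 + D/L)/(1 − D/L)] increases linearly with age. A minimal sketch; the calibration slope and intercept below are hypothetical and would in practice be fitted to tissue-specific reference samples of known age:

```python
import math

def racemization_index(d_over_l):
    """For reversible first-order racemization, ln[(1 + D/L)/(1 - D/L)]
    increases linearly with time."""
    return math.log((1.0 + d_over_l) / (1.0 - d_over_l))

def estimate_age(d_over_l, slope, intercept):
    """Invert the linear calibration: index = slope * age + intercept.
    slope and intercept are tissue-specific constants fitted to reference
    samples; any numbers used here are hypothetical."""
    return (racemization_index(d_over_l) - intercept) / slope
```

The tighter the calibration scatter, the better the age estimate; the abstract's point that scleral tissue is molecularly less homogeneous than dentine translates into a larger residual spread around this regression line.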

  19. How Do Undergraduate Students Conceptualize Acid-Base Chemistry? Measurement of a Concept Progression

    ERIC Educational Resources Information Center

    Romine, William L.; Todd, Amber N.; Clark, Travis B.

    2016-01-01

    We developed and validated a new instrument, called "Measuring Concept progressions in Acid-Base chemistry" (MCAB) and used it to better understand the progression of undergraduate students' understandings about acid-base chemistry. Items were developed based on an existing learning progression for acid-base chemistry. We used the Rasch…

  20. Urea biosensors based on PVC membrane containing palmitic acid.

    PubMed

    Karakuş, Emine; Pekyardimci, Sule; Esma, Kiliç

    2005-01-01

    A new urea biosensor was prepared by immobilizing urease, via four different procedures, on a poly(vinyl chloride) (PVC) ammonium membrane electrode containing palmitic acid, using nonactin as the ammonium ionophore. The analytical characteristics were investigated and compared with those of a biosensor prepared using carboxylated PVC. The effects of pH, buffer concentration, temperature, urease concentration, stirring rate and enzyme immobilization procedure on the urea response of the enzyme electrode were investigated. The linear working range and sensitivity of the biosensor were also determined. The urea biosensor prepared with PVC membranes containing palmitic acid performed better than the carboxylated PVC-based biosensors. Additionally, urea assay in serum was successfully carried out using the standard addition method.

  1. Reaction mechanisms of riboflavin triplet state with nucleic acid bases.

    PubMed

    Lin, Weizhen; Lu, Changyuan; Du, Fuqiang; Shao, Zhiyong; Han, Zhenhui; Tu, Tiecheng; Yao, Side; Lin, Nianyun

    2006-04-01

    ESR and laser flash photolysis studies have determined a reasonable order of reactivity of nucleotides with triplet riboflavin (3Rb*) for the first time. ESR detection of the triplet-state reactivity of Rb with nucleosides, polynucleotides and DNA was obtained simultaneously. In addition, ESR spin-elimination measurements of the reactivity of 3Rb* with nucleotides, in good accord with laser flash photolysis determinations of the corresponding rate constants, offer a simple and reliable method to probe the reactivities of nucleic acids and their components with photoexcited flavins. Kinetic, ESR and thermodynamic studies have demonstrated that Rb should be a strong endogenous photosensitizer capable of oxidizing all nucleic acid bases, preferentially the two purine nucleotides, with high rate constants.

  2. Temporal channels and disparity representations in stereoscopic depth perception.

    PubMed

    Doi, Takahiro; Takano, Maki; Fujita, Ichiro

    2013-11-26

    Stereoscopic depth perception is supported by a combination of correlation-based and match-based representations of binocular disparity. It also relies on both transient and sustained temporal channels of the visual system. Previous studies suggest that the relative contribution of the correlation-based representation (over the match-based representation) and the transient channel (over the sustained channel) to depth perception increases with the disparity magnitude. The mechanisms of the correlation-based and match-based representations may receive preferential inputs from the transient and sustained channels, respectively. We examined near/far discrimination by observers using random-dot stereograms refreshed at various rates. The relative contribution of the two representations was inferred by changing the fraction of dots that were contrast reversed between the two eyes. Both representations contributed to depth discrimination over the tested range of refresh rates. As the rate increased, the correlation-based representation increased its contribution to near/far discrimination. Another experiment revealed that the match-based representation was constructed by exploiting the variability in correlation-based disparity signals. Thus, the relative weight of the transient over sustained channel differs between the two representations. The correlation-based representation dominates depth perception with dynamic inputs. The match-based representation, which may be a nonlinear refinement of the correlation-based representation, exerts more influence on depth perception with slower inputs.

  3. Bio-based production of organic acids with Corynebacterium glutamicum

    PubMed Central

    Wieschalka, Stefan; Blombach, Bastian; Bott, Michael; Eikmanns, Bernhard J

    2013-01-01

    The shortage of oil resources, steadily rising oil prices and the environmental impact of oil use evoke an increasing political, industrial and technical interest in the development of safe and efficient processes for the production of chemicals from renewable biomass. Thus, microbial fermentation of renewable feedstocks has found its way into white biotechnology, complementing more and more the traditional crude oil-based chemical processes. Rational strain design of appropriate microorganisms has become possible due to steadily increasing knowledge of the metabolism and pathway regulation of industrially relevant organisms and, aside from process engineering and optimization, has an outstanding impact on improving the performance of such hosts. Corynebacterium glutamicum is well known as a workhorse for the industrial production of numerous amino acids. However, recent studies have also explored the usefulness of this organism for the production of several organic acids, and great efforts have been made to improve its performance. This review summarizes the current knowledge and recent achievements on metabolic engineering approaches to tailor C. glutamicum for the bio-based production of organic acids. We focus here on the fermentative production of pyruvate, L- and D-lactate, 2-ketoisovalerate, 2-ketoglutarate, and succinate. These organic acids represent a class of compounds with manifold application ranges, e.g. in the pharmaceutical and cosmetics industries, as food additives and, of particular economic interest, as precursors for a variety of bulk chemicals and commercially important polymers. Funding information: Work in the laboratories of the authors was supported by the Fachagentur Nachwachsende Rohstoffe (FNR) of the Bundesministerium für Ernährung, Landwirtschaft und Verbraucherschutz (BMELV; FNR Grants 220-095-08A and 220-095-08D; Bio-ProChemBB project, ERA-IB programme), by the Deutsche Bundesstiftung Umwelt (DBU Grant AZ13040/05) and the Evonik Degussa AG.

  4. Building Hierarchical Representations for Oracle Character and Sketch Recognition.

    PubMed

    Jun Guo; Changhu Wang; Roman-Rangel, Edgar; Hongyang Chao; Yong Rui

    2016-01-01

    In this paper, we study oracle character recognition and general sketch recognition. First, a data set of oracle characters, which are the oldest hieroglyphs in China yet remain a part of modern Chinese characters, is collected for analysis. Second, typical visual representations in shape- and sketch-related works are evaluated. We analyze the problems encountered with these representations and derive several representation design criteria. Based on this analysis, we propose a novel hierarchical representation that combines a Gabor-related low-level representation and a sparse-encoder-related mid-level representation. Extensive experiments show the effectiveness of the proposed representation in both oracle character recognition and general sketch recognition. The proposed representation is also complementary to convolutional neural network (CNN)-based models. We introduce a solution to combine the proposed representation with CNN-based models, and achieve better performance than either approach alone. This solution has beaten humans at recognizing general sketches.

  5. Superabsorbent biphasic system based on poly(lactic acid) and poly(acrylic acid)

    NASA Astrophysics Data System (ADS)

    Sartore, Luciana; Pandini, Stefano; Baldi, Francesco; Bignotti, Fabio

    2016-05-01

    In this research work, biocomposites based on crosslinked particles of poly(acrylic acid), commonly used as a superabsorbent polymer (SAP), and poly-L-lactic acid (PLLA) were developed to elucidate the role of the filler (i.e., polymeric crosslinked particles) in the overall physico-mechanical behavior and to obtain superabsorbent thermoplastic products. Samples prepared by melt-blending of the components in different ratios showed a biphasic system with a regular distribution of particles, with diameters ranging from 5 to 10 μm, within the PLLA polymeric matrix. The polymeric biphasic system, coded PLASA, i.e. superabsorbent poly(lactic acid), showed excellent swelling properties, demonstrating that the crosslinked particles retain their superabsorbent ability, as in their free counterparts, even when distributed in a thermoplastic polymeric matrix. The thermal characteristics of the biocomposites show enhanced thermal stability in comparison with neat PLLA, and the mechanical properties are markedly modified by the addition of crosslinked particles, which induce a regular stiffening effect. Furthermore, in aqueous environments the particles swell and are leached from the PLLA matrix, generating very high porosity. These new open-pore PLLA foams, produced in the absence of organic solvents and chemical foaming agents and possessing good physico-mechanical properties, appear very promising for several applications, for instance in tissue engineering for scaffold production.

  6. General analytical procedure for determination of acidity parameters of weak acids and bases.

    PubMed

    Pilarski, Bogusław; Kaliszan, Roman; Wyrzykowski, Dariusz; Młodzianowski, Janusz; Balińska, Agata

    2015-01-01

    The paper presents a new convenient, inexpensive, and reagent-saving general methodology for the determination of pKa values for the components of mixtures of weak organic acids and bases of diverse chemical classes in aqueous solution, without the need to separate the individual analytes. The data obtained from simple pH-metric microtitrations are numerically processed into reliable pKa values for each component of the mixture. Excellent agreement was obtained between the determined pKa values and the reference literature data for the compounds studied.

  7. Determination of the Acid-Base Dissociation Constant of Acid-Degradable Hexamethylenetetramine by Capillary Zone Electrophoresis.

    PubMed

    Takayanagi, Toshio; Shimakami, Natsumi; Kurashina, Masashi; Mizuguchi, Hitoshi; Yabutani, Tomoki

    2016-01-01

    The acid-base equilibrium of hexamethylenetetramine (hexamine) was analyzed through its effective electrophoretic mobility by capillary zone electrophoresis. Although hexamine degrades in weakly acidic aqueous solution, forming ammonia and formaldehyde, the effective electrophoretic mobility of hexamine could be measured in the pH range between 2.8 and 6.9. The acid-base dissociation equilibrium of protonated hexamine was analyzed from the mobility change, and an acid dissociation constant of pKa = 4.93 ± 0.01 (mean ± standard error, ionic strength: 0.020 mol dm(-3)) was determined. The monoprotic acid-base equilibrium of hexamine was confirmed through comparisons of its electrophoretic mobility with the N-ethylquinolinium ion and with the monocationic N-ethyl derivative of hexamine, as well as through a slope analysis of the dissociation equilibrium.
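
The mobility-based analysis can be sketched numerically. For a monoprotic base only the protonated form BH+ migrates, so the effective mobility traces its mole fraction as a function of pH; the mobility value and data below are synthetic placeholders, not the paper's measurements:

```python
import math

def mu_eff(pH, mu_bh, pKa):
    """Effective mobility of a monoprotic base (BH+ <-> B + H+): only the
    protonated form BH+ migrates, so mu_eff is mu_BH+ times its mole fraction."""
    a_h = 10.0 ** (-pH)
    ka = 10.0 ** (-pKa)
    return mu_bh * a_h / (a_h + ka)

def fit_pka(points, mu_bh):
    """Grid-search the pKa (2.000-8.000, step 0.001) minimizing the squared
    error between the model and measured (pH, mobility) pairs."""
    return min(
        (sum((mu_eff(ph, mu_bh, pka) - m) ** 2 for ph, m in points), pka)
        for pka in (i / 1000.0 for i in range(2000, 8001))
    )[1]
```

At pH = pKa the effective mobility is exactly half the limiting mobility of BH+, which is the graphical reading of this kind of mobility-versus-pH curve.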

  8. Functional nucleic-acid-based sensors for environmental monitoring.

    PubMed

    Sett, Arghya; Das, Suradip; Bora, Utpal

    2014-10-01

    Efforts to replace conventional chromatographic methods for environmental monitoring with cheaper and easier-to-use biosensors for the precise detection and estimation of hazardous environmental toxicants, water- or air-borne pathogens, as well as various other chemicals and biologics, are gaining momentum. Of the various types of biosensors classified according to their bio-recognition principle, nucleic-acid-based sensors have shown high potential in terms of cost, sensitivity, and specificity. The discovery of the catalytic activities of RNA (ribozymes) and DNA (DNAzymes), which can be triggered by divalent metal ions, paved the way for their extensive use in the detection of heavy metal contaminants in the environment. This was followed by the invention of small oligonucleotide sequences called aptamers, which can fold into a specific 3D conformation under suitable conditions after binding to target molecules. Due to their high affinity, specificity, reusability, stability, and non-immunogenicity toward a vast array of targets, including small molecules and macromolecules of organic, inorganic, and biological origin, they can often be exploited as sensors in industrial waste management, pollution control, and environmental toxicology. Further, the rational combination of the catalytic activity of DNAzymes and ribozymes with the sequence-specific binding ability of aptamers has given rise to the most advanced form of functional nucleic-acid-based sensors, called aptazymes. Functional nucleic-acid-based sensors (FNASs) can be conjugated with fluorescent molecules, metallic nanoparticles, or quantum dots to aid in rapid detection of a variety of target molecules in a target-induced structure-switch (TISS) mode. Although intensive research is being carried out for further improvement of FNASs as sensors, challenges remain in integrating such bio-recognition elements with advanced transduction platforms to enable their use as networked analytical systems for tailor-made analysis of environmental

  9. Liquid crystal based biosensors for bile acid detection

    NASA Astrophysics Data System (ADS)

    He, Sihui; Liang, Wenlang; Tanner, Colleen; Fang, Jiyu; Wu, Shin-Tson

    2013-03-01

    The concentration level of bile acids is a useful indicator for early diagnosis of liver diseases. The prevalent method for detecting bile acids is chromatography coupled with mass spectrometry, which is precise yet expensive. Here we present a biosensor platform based on liquid crystal (LC) films for the detection of cholic acid (CA). This platform has the advantages of low cost, label-free and solution-phase detection, and simple analysis. In this platform, an LC film of 4-cyano-4'-pentylbiphenyl (5CB) was hosted by a copper grid supported on a polyimide-coated glass substrate. By immersion into sodium dodecyl sulfate (SDS) solution, the LC film was coated with SDS, which induced homeotropic anchoring of 5CB. Addition of CA introduced competitive adsorption between CA and SDS at the interface, triggering a transition from homeotropic to homogeneous anchoring. The detection limit can be tuned from 12 μM to 170 μM by changing the pH value of the solution.

  10. A fully automatic system for acid-base coulometric titrations

    PubMed Central

    Cladera, A.; Caro, A.; Estela, J. M.; Cerdà, V.

    1990-01-01

    An automatic system for acid-base titrations by electrogeneration of H+ and OH- ions, with potentiometric end-point detection, was developed. The system includes a PC-compatible computer for instrumental control, data acquisition and processing, which allows up to 13 samples to be analysed sequentially with no human intervention. The system performance was tested on the titration of standard solutions, which it carried out with low errors and RSD. It was subsequently applied to the analysis of various samples of environmental and nutritional interest, specifically waters, soft drinks and wines. PMID:18925283
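
The chemistry behind coulometric acid-base titration reduces to Faraday's law: the amount of OH- (or H+) electrogenerated is proportional to the charge passed. A minimal sketch, assuming 100% current efficiency:

```python
FARADAY = 96485.332  # Faraday constant, C / mol

def electrogenerated_moles(current_a, time_s, efficiency=1.0):
    """Moles of titrant (OH- or H+) produced at constant current: n = I*t/F."""
    return efficiency * current_a * time_s / FARADAY

def titration_time(moles_analyte, current_a):
    """Time (s) to reach the equivalence point for a 1:1 acid-base
    neutralization at constant current."""
    return moles_analyte * FARADAY / current_a
```

In an automated system like the one described, the computer only has to log the elapsed time at the potentiometric end point; the current and Faraday's constant then convert that time directly into the amount of analyte, with no standardized titrant solution required.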

  11. Review of Sparse Representation-Based Classification Methods on EEG Signal Processing for Epilepsy Detection, Brain-Computer Interface and Cognitive Impairment

    PubMed Central

    Wen, Dong; Jia, Peilei; Lian, Qiusheng; Zhou, Yanhong; Lu, Chengbiao

    2016-01-01

    At present, sparse representation-based classification (SRC) has become an important approach in electroencephalograph (EEG) signal analysis, in which the data are sparsely represented on the basis of a fixed or learned dictionary and classified according to reconstruction criteria. SRC methods have been used to analyze the EEG signals of epilepsy, cognitive impairment and brain-computer interface (BCI) applications, with rapid progress in computational accuracy, efficiency and robustness. However, these methods still have deficiencies in real-time performance, generalization ability and their dependence on labeled samples. This mini review describes the advantages and disadvantages of SRC methods in EEG signal analysis, with the expectation that they can provide better tools for analyzing EEG signals. PMID:27458376

  12. Grid-based methods for diatomic quantum scattering problems: A finite-element discrete-variable representation in prolate spheroidal coordinates

    NASA Astrophysics Data System (ADS)

    Tao, Liang; McCurdy, C. W.; Rescigno, T. N.

    2009-01-01

    We show how to combine finite elements and the discrete-variable representation in prolate spheroidal coordinates to develop a grid-based approach for quantum mechanical studies involving diatomic molecular targets. Prolate spheroidal coordinates are a natural choice for diatomic systems and have been used previously in a variety of bound-state applications. The use of exterior complex scaling in the present implementation allows for a transparently simple way of enforcing Coulomb boundary conditions and therefore straightforward application to electronic continuum problems. Illustrative examples involving the bound and continuum states of H2+, as well as the calculation of photoionization cross sections, show that the speed and accuracy of the present approach offer distinct advantages over methods based on single-center expansions.
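
The core DVR idea, a grid basis in which the potential is diagonal while the kinetic energy has a known analytic matrix, can be illustrated with a toy 1-D example. This uses the uniform-grid sinc-DVR of Colbert and Miller rather than the paper's finite-element prolate-spheroidal construction:

```python
import numpy as np

# Uniform grid centred at 0; atomic units (hbar = m = 1)
n, dx = 101, 0.14
x = dx * (np.arange(n) - n // 2)

# Sinc-DVR kinetic-energy matrix (Colbert-Miller, infinite-range form):
# T_ii = pi^2/3 and T_ij = 2(-1)^(i-j)/(i-j)^2, all times 1/(2*dx^2)
idx = np.arange(n)
d = idx[:, None] - idx[None, :]
off = 2.0 * (-1.0) ** d / np.where(d == 0, 1, d) ** 2
T = np.where(d == 0, np.pi ** 2 / 3.0, off) / (2.0 * dx ** 2)

# The potential is diagonal on the DVR grid; harmonic oscillator V = x^2/2
H = T + np.diag(0.5 * x ** 2)
energies = np.linalg.eigvalsh(H)   # lowest levels approach 0.5, 1.5, 2.5, ...
```

The same structure carries over to the paper's setting: the coordinate system and element partitioning change, but the Hamiltonian is still assembled from an analytic kinetic block plus a diagonal potential on the grid.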

  13. Predicting the Viscosity of Low VOC Vinyl Ester and Fatty Acid-Based Resins

    DTIC Science & Technology

    2005-12-01

    Predicting the Viscosity of Low VOC Vinyl Ester and Fatty Acid-Based Resins, by John J. La Scala, Amutha Jeyarajasingam, Cherise Winston, ...; ARL-TR-3681, Aberdeen Proving Ground, MD 21005-5069, December 2005. Excerpt: "The sample was titrated with the perchloric acid / peracetic acid solution (Aldrich) until the indicator, 0.1% crystal violet in acetic acid (Aldrich)..."

  14. Ridge extraction from the time-frequency representation (TFR) of signals based on an image processing approach: application to the analysis of uterine electromyogram AR TFR.

    PubMed

    Terrien, Jérémy; Marque, Catherine; Germain, Guy

    2008-05-01

    Time-frequency representations (TFRs) of signals are increasingly being used in biomedical research. Analysis of such representations is sometimes difficult, however, and is often reduced to the extraction of ridges, or local energy maxima. In this paper, we describe a new ridge extraction method based on the image processing technique of active contours, or snakes. We have tested our method on several synthetic signals and on uterine electromyograms, or electrohysterograms (EHG), recorded during gestation in monkeys. We have also evaluated a postprocessing algorithm especially suited to EHG analysis. Parameters are evaluated on real EHG signals from different gestational periods. The presented method gives good results when applied to synthetic as well as EHG signals, with smaller ridge-extraction errors than two other methods specially developed for EHG. The gradient vector flow (GVF) snake method thus appears to be a good ridge-extraction tool that can be applied to TFRs of mono- or multicomponent signals with good results.
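
A much simpler baseline than the GVF snake, useful for intuition: treat ridge extraction as a best-path problem over the TFR matrix, maximizing accumulated ridge energy while penalizing frequency jumps between adjacent time bins. This is a generic dynamic-programming sketch, not the method of the paper:

```python
import numpy as np

def extract_ridge(tfr, jump_penalty=0.5):
    """Extract one energy ridge from a TFR (freq x time matrix) by dynamic
    programming: maximize summed bin energy minus a quadratic penalty on
    frequency jumps. A simple baseline, not the GVF-snake method."""
    n_f, n_t = tfr.shape
    cost = np.empty((n_f, n_t))
    back = np.zeros((n_f, n_t), dtype=int)
    cost[:, 0] = tfr[:, 0]
    f = np.arange(n_f)
    for t in range(1, n_t):
        # trans[i, j]: score of arriving at bin i from previous bin j
        trans = cost[:, t - 1][None, :] - jump_penalty * (f[:, None] - f[None, :]) ** 2
        back[:, t] = np.argmax(trans, axis=1)
        cost[:, t] = tfr[:, t] + np.max(trans, axis=1)
    ridge = np.empty(n_t, dtype=int)
    ridge[-1] = int(np.argmax(cost[:, -1]))
    for t in range(n_t - 1, 0, -1):      # backtrack the optimal path
        ridge[t - 1] = back[ridge[t], t]
    return ridge
```

The jump penalty plays the same smoothing role as the internal energy of a snake: raising it favors smoother ridges, lowering it lets the path track sharp frequency modulations.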

  15. Characterizing Interaction with Visual Mathematical Representations

    ERIC Educational Resources Information Center

    Sedig, Kamran; Sumner, Mark

    2006-01-01

    This paper presents a characterization of computer-based interactions by which learners can explore and investigate visual mathematical representations (VMRs). VMRs (e.g., geometric structures, graphs, and diagrams) refer to graphical representations that visually encode properties and relationships of mathematical structures and concepts.…

  16. Soluble adenylyl cyclase is an acid-base sensor in epithelial base-secreting cells.

    PubMed

    Roa, Jinae N; Tresguerres, Martin

    2016-08-01

    Blood acid-base regulation by specialized epithelia, such as gills and kidney, requires the ability to sense blood acid-base status. Here, we developed primary cultures of ray (Urolophus halleri) gill cells to study mechanisms for acid-base sensing without the interference of whole animal hormonal regulation. Ray gills have abundant base-secreting cells, identified by their noticeable expression of vacuolar-type H(+)-ATPase (VHA), and also express the evolutionarily conserved acid-base sensor soluble adenylyl cyclase (sAC). Exposure of cultured cells to extracellular alkalosis (pH 8.0, 40 mM HCO3 (-)) triggered VHA translocation to the cell membrane, similar to previous reports in live animals experiencing blood alkalosis. VHA translocation was dependent on sAC, as it was blocked by the sAC-specific inhibitor KH7. Ray gill base-secreting cells also express transmembrane adenylyl cyclases (tmACs); however, tmAC inhibition by 2',5'-dideoxyadenosine did not prevent alkalosis-dependent VHA translocation, and tmAC activation by forskolin reduced the abundance of VHA at the cell membrane. This study demonstrates that sAC is a necessary and sufficient sensor of extracellular alkalosis in ray gill base-secreting cells. In addition, this study indicates that different sources of cAMP differentially modulate cell biology.

  17. Applications of synchrotron-based spectroscopic techniques in studying nucleic acids and nucleic acid-functionalized nanomaterials

    PubMed Central

    Wu, Peiwen; Yu, Yang; McGhee, Claire E.; Tan, Li Huey

    2014-01-01

    In this review, we summarize recent progress in the application of synchrotron-based spectroscopic techniques for nucleic acid research that takes advantage of high-flux and high-brilliance electromagnetic radiation from synchrotron sources. The first section of the review focuses on the characterization of the structure and folding processes of nucleic acids using different types of synchrotron-based spectroscopies, such as X-ray absorption spectroscopy, X-ray emission spectroscopy, X-ray photoelectron spectroscopy, synchrotron radiation circular dichroism, X-ray footprinting and small-angle X-ray scattering. In the second section, the characterization of nucleic acid-based nanostructures, nucleic acid-functionalized nanomaterials and nucleic acid-lipid interactions using these spectroscopic techniques is summarized. Insights gained from these studies are described and future directions of this field are also discussed. PMID:25205057

  18. Applications of synchrotron-based spectroscopic techniques in studying nucleic acids and nucleic acid-functionalized nanomaterials.

    PubMed

    Wu, Peiwen; Yu, Yang; McGhee, Claire E; Tan, Li Huey; Lu, Yi

    2014-12-10

    In this review, we summarize recent progress in the application of synchrotron-based spectroscopic techniques for nucleic acid research that takes advantage of high-flux and high-brilliance electromagnetic radiation from synchrotron sources. The first section of the review focuses on the characterization of the structure and folding processes of nucleic acids using different types of synchrotron-based spectroscopies, such as X-ray absorption spectroscopy, X-ray emission spectroscopy, X-ray photoelectron spectroscopy, synchrotron radiation circular dichroism, X-ray footprinting and small-angle X-ray scattering. In the second section, the characterization of nucleic acid-based nanostructures, nucleic acid-functionalized nanomaterials and nucleic acid-lipid interactions using these spectroscopic techniques is summarized. Insights gained from these studies are described and future directions of this field are also discussed.

  19. Applications of synchrotron-based spectroscopic techniques in studying nucleic acids and nucleic acid-functionalized nanomaterials

    SciTech Connect

    Wu, Peiwen; Yu, Yang; McGhee, Claire E.; Tan, Li Huey; Lu, Yi

    2014-09-10

    In this paper, we summarize recent progress in the application of synchrotron-based spectroscopic techniques for nucleic acid research that takes advantage of high-flux and high-brilliance electromagnetic radiation from synchrotron sources. The first section of the review focuses on the characterization of the structure and folding processes of nucleic acids using different types of synchrotron-based spectroscopies, such as X-ray absorption spectroscopy, X-ray emission spectroscopy, X-ray photoelectron spectroscopy, synchrotron radiation circular dichroism, X-ray footprinting and small-angle X-ray scattering. In the second section, the characterization of nucleic acid-based nanostructures, nucleic acid-functionalized nanomaterials and nucleic acid-lipid interactions using these spectroscopic techniques is summarized. Insights gained from these studies are described and future directions of this field are also discussed.

  20. Applications of synchrotron-based spectroscopic techniques in studying nucleic acids and nucleic acid-functionalized nanomaterials

    DOE PAGES

    Wu, Peiwen; Yu, Yang; McGhee, Claire E.; ...

    2014-09-10

    In this paper, we summarize recent progress in the application of synchrotron-based spectroscopic techniques for nucleic acid research that takes advantage of high-flux and high-brilliance electromagnetic radiation from synchrotron sources. The first section of the review focuses on the characterization of the structure and folding processes of nucleic acids using different types of synchrotron-based spectroscopies, such as X-ray absorption spectroscopy, X-ray emission spectroscopy, X-ray photoelectron spectroscopy, synchrotron radiation circular dichroism, X-ray footprinting and small-angle X-ray scattering. In the second section, the characterization of nucleic acid-based nanostructures, nucleic acid-functionalized nanomaterials and nucleic acid-lipid interactions using these spectroscopic techniques is summarized. Insights gained from these studies are described and future directions of this field are also discussed.

  1. The effects of borate minerals on the synthesis of nucleic acid bases, amino acids and biogenic carboxylic acids from formamide.

    PubMed

    Saladino, Raffaele; Barontini, Maurizio; Cossetti, Cristina; Di Mauro, Ernesto; Crestini, Claudia

    2011-08-01

    The thermal condensation of formamide in the presence of mineral borates is reported. The products afforded are precursors of nucleic acids, amino acid derivatives and carboxylic acids. The efficiency and the selectivity of the reaction were studied in relation to the elemental composition of the 18 minerals analyzed. The possibility of simultaneously synthesizing building blocks of both the genetic and metabolic apparatuses, along with the production of amino acids, highlights the interest of the formamide/borate system in prebiotic chemistry.

  2. Intentionality, Representation, and Anticipation

    NASA Astrophysics Data System (ADS)

    De Preester, Helena

    2002-09-01

    Both Brentano and Merleau-Ponty developed accounts of intentionality, which nevertheless differ profoundly in the following respect. According to Brentano, intentionality mainly is a matter of mental presentations. This marks the beginning of phenomenology's difficult relation with the nature of the intentional reference. Merleau-Ponty, on the other hand, situated intentionality on the level of the body, a turn which has important implications for the nature of intentionality. Intentionality is no longer primarily based on having (re)presentations, but is rooted in the dynamics of the living body. Contrasting those approaches enables us to make clear in what way intentionality is studied nowadays. On the one hand, intentionality is conceived of as a matter of formal-syntactical causality in cognitive science, and in particular in classical-computational theory. On the other hand, an interactivist approach offers a more Merleau-Ponty-like point of view, in which autonomy, embodiment and interaction are stressed.

  3. Effect of temperature on the acid-base properties of the alumina surface: microcalorimetry and acid-base titration experiments.

    PubMed

    Morel, Jean-Pierre; Marmier, Nicolas; Hurel, Charlotte; Morel-Desrosiers, Nicole

    2006-06-15

    Sorption reactions on natural or synthetic materials that can attenuate the migration of pollutants in the geosphere could be affected by temperature variations. Nevertheless, most of the theoretical models describing sorption reactions are formulated at 25 °C. To check these models at different temperatures, experimental data such as the enthalpies of sorption are required. Highly sensitive microcalorimeters can now be used to determine the heat effects accompanying the sorption of radionuclides at oxide-water interfaces, but enthalpies of sorption cannot be extracted from microcalorimetric data without a clear knowledge of the thermodynamics of protonation and deprotonation of the oxide surface. However, the values reported in the literature show large discrepancies, and one must conclude that, surprisingly, this fundamental problem of proton binding is not yet resolved. We have therefore measured by titration microcalorimetry the heat effects accompanying proton exchange at the alumina-water interface at 25 °C. Based on (i) the surface site speciation provided by a surface complexation model (built from acid-base titrations at 25 °C) and (ii) the results of the microcalorimetric experiments, calculations were made to extract the enthalpy changes associated with the first and second deprotonations of the alumina surface. The values obtained are ΔH1 = 80 ± 10 kJ mol⁻¹ and ΔH2 = 5 ± 3 kJ mol⁻¹. In a second step, these enthalpy values were used to calculate the alumina surface acidity constants at 50 °C via the van't Hoff equation. A theoretical titration curve at 50 °C was then calculated and compared to the experimental alumina surface titration curve. Good agreement between the predicted acid-base titration curve and the experimental one was observed.
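The extrapolation step described above, carrying a deprotonation constant from 25 °C to 50 °C via the integrated van't Hoff equation, can be sketched as follows. Only the enthalpies come from the study; the 25 °C pK values below are hypothetical placeholders.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def vant_hoff_pK(pK_T1, dH_J_mol, T2=323.15, T1=298.15):
    """Extrapolate a pK from T1 to T2 using the integrated van't Hoff
    equation: ln(K2/K1) = -(dH/R) * (1/T2 - 1/T1)."""
    lnK1 = -pK_T1 * math.log(10)
    lnK2 = lnK1 - (dH_J_mol / R) * (1.0 / T2 - 1.0 / T1)
    return -lnK2 / math.log(10)

# Deprotonation enthalpies reported in the abstract (J mol^-1)
dH1, dH2 = 80e3, 5e3
# Hypothetical 25 degC surface acidity constants, for illustration only
pK1_25, pK2_25 = 6.0, 10.0

print(round(vant_hoff_pK(pK1_25, dH1), 2))  # first deprotonation at 50 degC
print(round(vant_hoff_pK(pK2_25, dH2), 2))  # second deprotonation at 50 degC
```

With a strongly endothermic first deprotonation, the first pK drops by about one unit between 25 °C and 50 °C, while the nearly athermal second step barely moves, which is consistent with the predicted shift of the 50 °C titration curve.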

  4. Sphingoid bases inhibit acid-induced demineralization of hydroxyapatite.

    PubMed

    Valentijn-Benz, Marianne; van 't Hof, Wim; Bikker, Floris J; Nazmi, Kamran; Brand, Henk S; Sotres, Javier; Lindh, Liselott; Arnebrant, Thomas; Veerman, Enno C I

    2015-01-01

    Calcium hydroxyapatite (HAp), the main constituent of dental enamel, is inherently susceptible to the etching and dissolving action of acids, resulting in tooth decay such as dental caries and dental erosion. Since the prevalence of erosive wear is gradually increasing, there is an urgent need for agents that protect the enamel against erosive attacks. In the present study, we investigated in vitro the anti-erosive effects of a number of sphingolipids and sphingoid bases, which form the backbone of sphingolipids. Pretreatment of HAp discs with sphingosine, phytosphingosine (PHS), PHS phosphate and sphinganine significantly protected these against acid-induced demineralization, by 80 ± 17%, 78 ± 17%, 78 ± 7% and 81 ± 8%, respectively (p < 0.001). On the other hand, sphingomyelin, acetyl PHS, octanoyl PHS and stearoyl PHS had no anti-erosive effects. Atomic force measurements revealed that HAp discs treated with PHS were almost completely and homogeneously covered by patches of PHS. This suggests that PHS and other sphingoid bases form layers on the surface of HAp, which act as diffusion barriers against H⁺ ions. In principle, these anti-erosive properties make PHS and related sphingosines promising and attractive candidates as ingredients in oral care products.

  5. Method of Identifying a Base in a Nucleic Acid

    DOEpatents

    Fodor, Stephen P. A.; Lipshutz, Robert J.; Huang, Xiaohua

    1999-01-01

    Devices and techniques for hybridization of nucleic acids and for determining the sequence of nucleic acids. Arrays of nucleic acids are formed, preferably by high-resolution, light-directed techniques. Positions of hybridization of a target nucleic acid are determined by, e.g., epifluorescence microscopy. Devices and techniques are proposed to determine the sequence of a target nucleic acid more efficiently and more quickly through such synthesis and detection techniques.

  6. Probe kit for identifying a base in a nucleic acid

    DOEpatents

    Fodor, Stephen P. A.; Lipshutz, Robert J.; Huang, Xiaohua

    2001-01-01

    Devices and techniques for hybridization of nucleic acids and for determining the sequence of nucleic acids. Arrays of nucleic acids are formed, preferably by high-resolution, light-directed techniques. Positions of hybridization of a target nucleic acid are determined by, e.g., epifluorescence microscopy. Devices and techniques are proposed to determine the sequence of a target nucleic acid more efficiently and more quickly through such synthesis and detection techniques.

  7. Hybridization and sequencing of nucleic acids using base pair mismatches

    DOEpatents

    Fodor, Stephen P. A.; Lipshutz, Robert J.; Huang, Xiaohua

    2001-01-01

    Devices and techniques for hybridization of nucleic acids and for determining the sequence of nucleic acids. Arrays of nucleic acids are formed, preferably by high-resolution, light-directed techniques. Positions of hybridization of a target nucleic acid are determined by, e.g., epifluorescence microscopy. Devices and techniques are proposed to determine the sequence of a target nucleic acid more efficiently and more quickly through such synthesis and detection techniques.

  8. Microarray-based transcriptome of Listeria monocytogenes adapted to sublethal concentrations of acetic acid, lactic acid, and hydrochloric acid.

    PubMed

    Tessema, Girum Tadesse; Møretrø, Trond; Snipen, Lars; Heir, Even; Holck, Askild; Naterstad, Kristine; Axelsson, Lars

    2012-09-01

    Listeria monocytogenes, an important foodborne pathogen, commonly encounters organic acids in food-related environments. The transcriptome of L. monocytogenes L502 was analyzed after adaptation to pH 5 in the presence of acetic acid, lactic acid, or hydrochloric acid (HCl) at 25 °C, representing a condition encountered in mildly acidic ready-to-eat food kept at room temperature. The acid-treated cells were compared with a reference culture at pH 6.7 at the time of RNA harvesting. The number of regulated genes and the magnitude of the transcriptional responses were higher for the organic acids than for HCl. Protein-coding genes associated with low pH stress, energy transport and metabolism, virulence determinants, and the acid tolerance response were commonly regulated in the 3 acid-stressed cultures. Interestingly, the transcriptional levels of the histidine and cell wall biosynthetic operons were upregulated, indicating a possible universal response to low pH stress in L. monocytogenes. The opuCABCD operon, encoding proteins for compatible solute transport, and the transcriptional regulator sigL were significantly induced by the organic acids, strongly suggesting key roles during organic acid stress. The present study revealed the complex transcriptional responses of L. monocytogenes towards food-related acidulants and paves the way for more specific and in-depth future studies.

  9. Multi-stage classification method oriented to aerial image based on low-rank recovery and multi-feature fusion sparse representation.

    PubMed

    Ma, Xu; Cheng, Yongmei; Hao, Shuai

    2016-12-10

    Automatic classification of terrain surfaces from an aerial image is essential for an autonomous unmanned aerial vehicle (UAV) landing at an unprepared site by using vision. Diverse terrain surfaces may show similar spectral properties due to illumination and noise, which easily causes poor classification performance. To address this issue, a multi-stage classification algorithm based on low-rank recovery and multi-feature fusion sparse representation is proposed. First, color moments and Gabor texture features are extracted from training data and stacked as column vectors of a dictionary. Then we perform low-rank matrix recovery on the dictionary by using augmented Lagrange multipliers and construct a multi-stage terrain classifier. Experimental results on an aerial map database that we prepared verify the classification accuracy and robustness of the proposed method.

  10. Nutrient based estimation of acid-base balance in vegetarians and non-vegetarians.

    PubMed

    Deriemaeker, Peter; Aerenhouts, Dirk; Hebbelinck, Marcel; Clarys, Peter

    2010-03-01

    A first objective of the present study was to estimate the acid-base balance of the food intake of vegetarians and non-vegetarians. A second objective was to evaluate whether the addition of specific food items to the existing potential renal acid load (PRAL) list was necessary for the comparison of the two dietary patterns. Thirty vegetarians between the ages of 18 and 30 years were matched for sex, age and BMI with 30 non-vegetarians. Based on 3-day food diaries, the acid-base status of the food intake was estimated using the PRAL method. Mean PRAL values as estimated with the standard table yielded an alkaline load of −5.4 ± 14.4 mEq/d in the vegetarians compared to an acid load of 10.3 ± 14.4 mEq/d in the non-vegetarians (p < 0.001). Mean PRAL values as estimated with the extended table yielded an alkaline load of −10.9 ± 19.7 mEq/d in the vegetarians compared to an acid load of 13.8 ± 17.1 mEq/d in the non-vegetarians (p < 0.001). The findings of this study indicate that vegetarian food intake produces more alkaline outcomes compared to non-vegetarian diets. The use of the standard PRAL table was sufficient for discrimination between the two diets.
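The PRAL estimate used in the study is conventionally computed from five nutrients. A minimal sketch, with the coefficients assumed from the standard Remer-Manz PRAL formula (not quoted from this paper) and invented daily intakes for illustration:

```python
def pral_mEq_per_day(protein_g, phosphorus_mg, potassium_mg,
                     magnesium_mg, calcium_mg):
    """Potential renal acid load (Remer-Manz coefficients); positive
    values indicate an acid load, negative values an alkaline load."""
    return (0.49 * protein_g + 0.037 * phosphorus_mg
            - 0.021 * potassium_mg - 0.026 * magnesium_mg
            - 0.013 * calcium_mg)

# Hypothetical daily intakes for two dietary patterns
omnivore = pral_mEq_per_day(90, 1400, 3000, 300, 900)     # acid load
vegetarian = pral_mEq_per_day(60, 1100, 4000, 400, 1000)  # alkaline load
print(round(omnivore, 1), round(vegetarian, 1))
```

The sign pattern mirrors the study's finding: higher protein and phosphorus push PRAL positive (acid), while higher potassium, magnesium and calcium pull it negative (alkaline).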

  11. [Microspeciation of amphoteric molecules of unusual acid-base properties].

    PubMed

    Kóczián, Kristóf

    2007-01-01

    The physico-chemical properties of bio- and drug molecules greatly influence their interactions in the body and strongly affect the mechanism of drug action. Among these properties, macroscopic and site-specific protonation constants are of crucial importance. The latter are the tool to calculate the relative concentrations of the various microspecies in the compartments of the body at different pH values, and also a versatile parameter for improving the pharmacokinetic properties of a new molecule in a particular family of drugs. In the present thesis work, the microspeciation of three molecules of great pharmaceutical importance and unusual acid-base properties was carried out. The microconstants of tenoxicam, a non-steroidal anti-inflammatory drug, were determined, introducing a novel deductive method using Hammett constants. For this purpose, a total of 8 tenoxicam and piroxicam derivatives were synthesised. To the best of our knowledge, the log k(N)O microconstant of tenoxicam obtained in this way is the lowest enolate basicity value reported, which, however, can be well explained by the effects of the intramolecular environment. The evaluation procedure developed is suitable for microconstant determination of compounds in other molecule families. In addition, prodrug-type compounds and analogues similar in structure to selective COX-2 isoenzyme inhibitors were synthesised. The other two molecules studied, 6-aminopenicillanic acid and 7-cephalosporanic acid, the core molecules of the two most important types of beta-lactam antibiotics, were derivatised and investigated by 1D and 2D NMR techniques. NMR-pH titration of the parent compounds and their ester derivatives, combined with in situ pH measurements, allowed the microspeciation of these easily decomposing molecules. One of the protonation constants of 7-ACA (log kN(O) = 4.12) corresponds, to the best of our knowledge, to the least basic non-aromatic amino site among natural compounds.

  12. Acid-base titrations using microfluidic paper-based analytical devices.

    PubMed

    Karita, Shingo; Kaneta, Takashi

    2014-12-16

    Rapid and simple acid-base titration was accomplished using a novel microfluidic paper-based analytical device (μPAD). The μPAD was fabricated by wax printing and consisted of ten reservoirs for reaction and detection. The reaction reservoirs contained various amounts of a primary standard substance, potassium hydrogen phthalate (KHPth), whereas a constant amount of phenolphthalein was added to all the detection reservoirs. A sample solution containing NaOH was dropped onto the center of the μPAD and allowed to spread to the reaction reservoirs, where the KHPth neutralized it. When the amount of NaOH exceeded that of the KHPth in a reaction reservoir, unneutralized hydroxide ion penetrated the detection reservoir, resulting in a color reaction of the phenolphthalein. Therefore, the number of detection reservoirs with no color change determined the concentration of the NaOH in the sample solution. The titration was completed within 1 min by visually determining the end point, which required neither instrumentation nor software. The volumes of the KHPth and phenolphthalein solutions added to the corresponding reservoirs were optimized to obtain reproducible and accurate results for the concentration of NaOH. The μPADs determined the concentration of NaOH over two orders of magnitude, from 0.01 to 1 M. An acidic sample, HCl, was also determined using Na2CO3 as the primary standard substance instead of KHPth. Furthermore, the μPAD was applicable to the titration of nitric acid, sulfuric acid, acetic acid, and ammonia solutions. The μPADs were stable for more than 1 month when stored in darkness at room temperature, although this was reduced to only 5 days under daylight conditions. The analysis of acidic hot spring water was also demonstrated in the field using the μPAD, and the results agreed well with those obtained by classic acid-base titration.
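The counting read-out described above can be modelled in a few lines: a detection reservoir turns pink only if the NaOH reaching it exceeds the KHPth standard in its reaction reservoir, so the colored/uncolored boundary brackets the sample concentration. The reservoir loadings, the aliquot volume, and the assumption that each reservoir receives an equal aliquot are invented for illustration.

```python
def pink_reservoirs(c_naoh_M, aliquot_uL, khpth_nmol):
    """Flags for each detection reservoir: True (pink) when the NaOH in
    one aliquot exceeds the KHPth in the matching reaction reservoir."""
    n_naoh = c_naoh_M * aliquot_uL * 1000.0  # nmol of OH- per aliquot
    return [n_naoh > k for k in khpth_nmol]

def bracket_naoh_M(flags, khpth_nmol, aliquot_uL):
    """Bracket the sample concentration between the largest neutralized
    standard and the smallest standard left un-neutralized."""
    neutralized = [k for k, f in zip(khpth_nmol, flags) if f]
    surviving = [k for k, f in zip(khpth_nmol, flags) if not f]
    lo = max(neutralized) / (aliquot_uL * 1000.0) if neutralized else 0.0
    hi = min(surviving) / (aliquot_uL * 1000.0) if surviving else float("inf")
    return lo, hi

# Hypothetical layout: five reaction reservoirs with increasing KHPth (nmol)
khpth = [10.0, 20.0, 40.0, 80.0, 160.0]
flags = pink_reservoirs(0.05, 1.0, khpth)  # 0.05 M sample, 1 uL per reservoir
print(flags)                               # -> [True, True, True, False, False]
print(bracket_naoh_M(flags, khpth, 1.0))   # -> (0.04, 0.08)
```

A geometric series of standard amounts, as in this sketch, is what lets a fixed number of reservoirs span the two orders of magnitude reported in the abstract.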

  13. Structure of six organic acid-base adducts from 6-bromobenzo[d]thiazol-2-amine and acidic compounds

    NASA Astrophysics Data System (ADS)

    Jin, Shouwen; Zhang, Jing; Wang, Daqi; Tao, Lin; Zhou, Mengjian; Shen, Yinyan; Chen, Quan; Lin, Zhanghui; Gao, Xingjun

    2014-05-01

    Six anhydrous organic acid-base adducts of 6-bromobenzo[d]thiazol-2-amine were prepared with organic acids, namely 2,4,6-trinitrophenol, salicylic acid, 3,5-dinitrobenzoic acid, 3,5-dinitrosalicylic acid, malonic acid and sebacic acid. Compounds 1-6 were characterized by X-ray diffraction analysis, IR, and elemental analysis. The melting points of all the adducts are given. Of the six adducts, 1, 3, 4, and 5 are organic salts, while 2 and 6 are cocrystals. The supramolecular arrangement in crystals 2-6 is based on the R22(8) synthon. Analysis of the crystal packing of 1-6 suggests that there are strong NH⋯O, OH⋯N, and OH⋯O hydrogen bonds (charge-assisted or neutral) between the acid and base components in the supramolecular assemblies. When a hydroxyl group is present ortho to the carboxyl group, the intramolecular S6 synthon is present, as expected. Besides the classical hydrogen bonding interactions, other noncovalent interactions also play important roles in structure extension. Owing to the synergetic effect of these weak interactions, compounds 1-6 display 1D-3D framework structures.

  14. Commonality of neural representations of sentences across languages: Predicting brain activation during Portuguese sentence comprehension using an English-based model of brain function.

    PubMed

    Yang, Ying; Wang, Jing; Bailer, Cyntia; Cherkassky, Vladimir; Just, Marcel Adam

    2017-02-01

    The aim of the study was to test the cross-language generative capability of a model that predicts neural activation patterns evoked by sentence reading, based on a semantic characterization of the sentence. In a previous study on English monolingual speakers (Wang et al., submitted), a computational model performed a mapping from a set of 42 concept-level semantic features (Neurally Plausible Semantic Features, NPSFs) as well as 6 thematic role markers to neural activation patterns (assessed with fMRI), to predict activation levels in a network of brain locations. The model used two types of information gained from the English-based fMRI data to predict the activation for individual sentences in Portuguese. First, it used the mapping weights from NPSFs to voxel activation levels derived from the model for English reading. Second, the brain locations for which the activation levels were predicted were derived from a factor analysis of the brain activation patterns during English reading. These meta-language locations were defined by the clusters of voxels with high loadings on each of the four main dimensions (factors), namely people, places, actions and feelings, underlying the neural representations of the stimulus sentences. This cross-language model succeeded in predicting the brain activation patterns associated with the reading of 60 individual Portuguese sentences that were entirely new to the model, attaining accuracies reliably above chance level. The prediction accuracy was not affected by whether the Portuguese speaker was monolingual or Portuguese-English bilingual. The model's confusion errors indicated an accurate capture of the events or states described in the sentence at a conceptual level. Overall, the cross-language predictive capability of the model demonstrates the neural commonality between speakers of different languages in the representations of everyday events and states, and provides an initial characterization of the common meta

  15. DNA Methylation Profiling at Single-Base Resolution Reveals Gestational Folic Acid Supplementation Influences the Epigenome of Mouse Offspring Cerebellum

    PubMed Central

    Barua, Subit; Kuizon, Salomon; Brown, W. Ted; Junaid, Mohammed A.

    2016-01-01

    It is becoming increasingly more evident that lifestyle, environmental factors, and maternal nutrition during gestation can influence the epigenome of the developing fetus and thus modulate the physiological outcome. Variations in the intake of maternal nutrients affecting one-carbon metabolism may influence brain development and exert long-term effects on the health of the progeny. In this study, we investigated whether supplementation with high maternal folic acid during gestation alters DNA methylation and gene expression in the cerebellum of mouse offspring. We used reduced representation bisulfite sequencing to analyze the DNA methylation profile at the single-base resolution level. The genome-wide DNA methylation analysis revealed that supplementation with higher maternal folic acid resulted in distinct methylation patterns (P < 0.05) of CpG and non-CpG sites in the cerebellum of offspring. Such variations of methylation and gene expression in the cerebellum of offspring were highly sex-specific, including several genes of the neuronal pathways. These findings demonstrate that alterations in the level of maternal folic acid during gestation can influence methylation and gene expression in the cerebellum of offspring. Such changes in the offspring epigenome may alter neurodevelopment and influence the functional outcome of neurologic and psychiatric diseases. PMID:27199632

  16. Fast high-throughput method for the determination of acidity constants by capillary electrophoresis: I. Monoprotic weak acids and bases.

    PubMed

    Fuguet, Elisabet; Ràfols, Clara; Bosch, Elisabeth; Rosés, Martí

    2009-04-24

    A new and fast method to determine acidity constants of monoprotic weak acids and bases by capillary zone electrophoresis, based on the use of an internal standard (a compound of similar nature and acidity constant to the analyte), has been developed. This method requires only two electrophoretic runs for the determination of an acidity constant: a first one at a pH where both the analyte and the internal standard are totally ionized, and a second one at a pH where both are partially ionized. Furthermore, the method is not pH dependent, so an accurate measurement of the pH of the buffer solutions is not needed. The acidity constants of several phenols and amines have been measured using internal standards of known pK(a), obtaining a mean deviation of 0.05 pH units from the literature values.
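The internal-standard arithmetic can be sketched as follows. For a monoprotic weak acid, the effective mobility in the partially ionized run is μ_eff = μ_ion / (1 + 10^(pKa − pH)), with μ_ion taken from the fully ionized run; referencing the analyte to an internal standard measured at the same pH cancels the pH term, which is why no accurate pH measurement is needed. The mobilities below are synthetic values chosen for illustration, not data from the paper.

```python
import math

def pKa_from_internal_standard(mu_eff_x, mu_ion_x, mu_eff_is, mu_ion_is, pKa_is):
    """pKa of a monoprotic weak acid from two CZE runs, referenced to an
    internal standard (IS) run at the same, unmeasured pH.
    From mu_eff = mu_ion / (1 + 10**(pKa - pH)):
        pKa - pH = log10(mu_ion / mu_eff - 1),
    and subtracting the analogous IS expression eliminates pH."""
    r_x = mu_ion_x / mu_eff_x - 1.0
    r_is = mu_ion_is / mu_eff_is - 1.0
    return pKa_is + math.log10(r_x) - math.log10(r_is)

# Synthetic mobilities consistent with an analyte pKa of 7.5 at buffer pH 7.0,
# referenced to an internal standard of pKa 7.0 (half-ionized at pH 7)
alpha_x = 1.0 / (1.0 + 10.0 ** 0.5)
print(round(pKa_from_internal_standard(30.0 * alpha_x, 30.0, 12.5, 25.0, 7.0), 3))  # -> 7.5
```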

  17. Towards a Representation of Flexible Canopy N Stoichiometry for Land-surface Models Based on Optimality Concepts

    NASA Astrophysics Data System (ADS)

    Zaehle, S.; Caldararu, S.

    2015-12-01

    Foliar nitrogen (N) is known to acclimate to environmental conditions. One particularly pertinent response is the general decline in foliar N following exposure to elevated levels of atmospheric CO2 (eCO2). Associated with reduced foliar N is an increased plant nitrogen-use efficiency, which contributes to the plants' sustained growth response to eCO2 in the absence of any counteracting litter N feedbacks. Flexible leaf N thus has important consequences for the mid- to long-term response of terrestrial ecosystems to eCO2. The current generation of land-surface models that include a prognostic N cycle generally employ heuristic, simply mass-balancing parameterisations to estimate changes in stoichiometry given altered N and carbon (C) availability. These models generally and substantially overestimate the decline of foliar N (and thus the increase in plant nitrogen-use efficiency) observed in Free-Air CO2 Enrichment experiments (FACE; Zaehle et al. 2014). In this presentation, I develop a simple, prognostic and dynamic representation of flexible foliar N for use in land-surface models by maximising the marginal gain of net assimilation with respect to the energy investment needed to generate foliar area and foliar N. I elucidate the underlying assumptions required to simulate the commonly observed decline in foliar N with eCO2 under different scenarios of N availability (Feng et al. 2015). References: Zaehle, Sönke, Belinda E. Medlyn, Martin G. De Kauwe, Anthony P. Walker, Michael C. Dietze, Thomas Hickler, Yiqi Luo, et al. 2014. "Evaluation of 11 Terrestrial Carbon-Nitrogen Cycle Models Against Observations From Two Temperate Free-Air CO2 Enrichment Studies." New Phytologist 202 (3): 803-22. doi:10.1111/nph.12697. Feng, Zhaozhong, Tobias Rütting, Håkan Pleijel, Göran Wallin, Peter B. Reich, Claudia I. Kammann, Paul C. D. Newton, Kazuhiko Kobayashi, Yunjian Luo, and Johan Uddling. 2015. "Constraints to Nitrogen Acquisition of Terrestrial Plants Under Elevated CO2." Global

  18. Comparison of Support-Vector Machine and Sparse Representation Using a Modified Rule-Based Method for Automated Myocardial Ischemia Detection

    PubMed Central

    Tseng, Yi-Li; Lin, Keng-Sheng; Jaw, Fu-Shan

    2016-01-01

    An automatic method is presented for detecting myocardial ischemia, which can be considered an early symptom of acute coronary events. Myocardial ischemia commonly manifests as ST- and T-wave changes in ECG signals. The methods in this study are proposed to detect abnormal ECG beats using knowledge-based features and classification methods. A novel classification method, sparse representation-based classification (SRC), is involved to improve the performance of the existing algorithms. A comparison was made between two classification methods, SRC and support-vector machine (SVM), using rule-based vectors as the input feature space. The two methods are evaluated quantitatively to validate their performance. The results of the SRC method combined with rule-based features demonstrate higher sensitivity than that of SVM, although specificity and precision are traded off. Moreover, the SRC method is less dependent on the selection of rule-based features and can achieve high performance using fewer features. The overall performance of the two methods proposed in this study is better than that of previous methods.

  19. Kinetics of acid base catalyzed transesterification of Jatropha curcas oil.

    PubMed

    Jain, Siddharth; Sharma, M P

    2010-10-01

    Out of various non-edible oil resources, Jatropha curcas oil (JCO) is considered a future feedstock for biodiesel production in India. Limited work has been reported on the kinetics of transesterification of oils containing high levels of free fatty acids. The present study reports the results of a kinetic study of the two-step acid-base catalyzed transesterification process carried out at optimum temperatures of 65 °C and 50 °C for esterification and transesterification, respectively, under the optimum methanol-to-oil ratio of 3:7 (v/v) and catalyst concentrations of 1% (w/w) for H₂SO₄ and NaOH. The yield of methyl ester (ME) has been used to study the effect of the different parameters. The results indicate that both the esterification and transesterification reactions are first order, with reaction rate constants of 0.0031 min⁻¹ and 0.008 min⁻¹, respectively. A maximum yield of 21.2% ME was obtained during esterification and 90.1% from transesterification of the pretreated JCO.
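As a back-of-the-envelope use of the reported rate constants, treating each step as a simple irreversible first-order reaction (an assumption; the paper fits yield data) gives the familiar exponential approach to completion and the time needed for a given conversion:

```python
import math

def conversion(k_per_min, t_min):
    """Fractional conversion of an irreversible first-order reaction,
    X(t) = 1 - exp(-k t)."""
    return 1.0 - math.exp(-k_per_min * t_min)

def time_to_conversion(k_per_min, X):
    """Time to reach fractional conversion X: t = -ln(1 - X) / k."""
    return -math.log(1.0 - X) / k_per_min

k_ester, k_trans = 0.0031, 0.008  # min^-1, rate constants from the study
# Minutes of transesterification needed to approach the ~90% yield reported
print(round(time_to_conversion(k_trans, 0.90), 1))  # -> 287.8
```

Under this simplified model, roughly five hours of transesterification would be needed to reach 90% conversion at the reported rate constant.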

  20. Data and Knowledge Base on the Basis of the Expanded Matrix Model of Their Representation for the Intelligent System of Road-Climatic Zoning of Territories

    NASA Astrophysics Data System (ADS)

    Yankovskaya, A.; Cherepanov, D.; Selivanikova, O.

    2016-08-01

    An extended matrix model of data and knowledge representation for the area under investigation, together with a matrix model of data representation for the territory under investigation, is proposed for the intelligent system of road-climatic zoning of territories (RCZT), the main information technology of RCZT. A part of the West Siberian region has been selected as the territory under investigation. The extended matrix model of knowledge representation is filled in by knowledge engineers with the participation of highly qualified experts in the field of RCZT. The matrix model of data representation for the territory under investigation is filled in by stakeholders in RCZT from the motor-road management system.

  1. Dynamical Approach to Multiequilibria Problems for Mixtures of Acids and Their Conjugated Bases

    ERIC Educational Resources Information Center

    Glaser, Rainer E.; Delarosa, Marco A.; Salau, Ahmed Olasunkanmi; Chicone, Carmen

    2014-01-01

    Mathematical methods are described for the determination of steady-state concentrations of all species in multiequilibria systems consisting of several acids and their conjugate bases in aqueous solution. The main example consists of a mixture of a diprotic acid H[subscript 2]A, a monoprotic acid HB, and their conjugate bases. The reaction…
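A common numerical route to such steady-state concentrations is to write the charge balance as a function of [H+] and find its root. The sketch below, with hypothetical concentrations and acidity constants for the H2A/HB mixture (not values from the article), uses bisection in log space:

```python
import math

KW = 1e-14  # water autoprotolysis constant at 25 degC

def charge_balance(h, C_A, Ka1, Ka2, C_B, KaB):
    """Positive minus negative charge; the root in h = [H+] is the
    equilibrium point of the H2A/HB mixture."""
    denom = h * h + Ka1 * h + Ka1 * Ka2
    ha = C_A * Ka1 * h / denom    # [HA-]
    a2 = C_A * Ka1 * Ka2 / denom  # [A2-]
    b = C_B * KaB / (h + KaB)     # [B-]
    return h - (KW / h + ha + 2.0 * a2 + b)

def equilibrium_pH(C_A, Ka1, Ka2, C_B, KaB, lo=1e-14, hi=1.0, iters=100):
    """Bisection on [H+] in log space: the charge balance is negative at
    the basic end and positive at the acidic end, so the sign at the
    midpoint tells us which half contains the root."""
    for _ in range(iters):
        mid = math.sqrt(lo * hi)
        if charge_balance(mid, C_A, Ka1, Ka2, C_B, KaB) > 0.0:
            hi = mid
        else:
            lo = mid
    return -math.log10(math.sqrt(lo * hi))

# Hypothetical mixture: 0.01 M H2A (pKa1 = 4, pKa2 = 9) + 0.01 M HB (pKa = 5)
print(round(equilibrium_pH(0.01, 1e-4, 1e-9, 0.01, 1e-5), 2))
```

A useful sanity check is the limiting case: with both acid concentrations set to zero, the root must come back as pH 7.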

  2. Spherical Nucleic Acids as Intracellular Agents for Nucleic Acid Based Therapeutics

    NASA Astrophysics Data System (ADS)

    Hao, Liangliang

    Recent functional discoveries on the noncoding sequences of the human genome and transcriptome could lead to revolutionary treatment modalities, because noncoding RNAs (ncRNAs) can be applied as therapeutic agents to manipulate disease-causing genes. To date, few nucleic acid-based therapeutics have been translated into the clinic, due to challenges in delivering the oligonucleotide agents in an effective, cell-specific, and non-toxic fashion. Unmodified oligonucleotide agents are destroyed rapidly in biological fluids by enzymatic degradation and have difficulty crossing the plasma membrane without the aid of transfection reagents, which often cause inflammatory, cytotoxic, or immunogenic side effects. Spherical nucleic acids (SNAs), nanoparticles consisting of densely organized and highly oriented oligonucleotides, pose one possible solution to circumventing these problems in both the antisense and RNA interference (RNAi) pathways. The unique three-dimensional architecture of SNAs protects the bioactive oligonucleotides from unspecific degradation during delivery and supports their targeting of class A scavenger receptors and endocytosis via a lipid-raft-dependent, caveolae-mediated pathway. Owing to their unique structure, SNAs are able to cross cell membranes and regulate target gene expression as a single entity, without triggering the cellular innate immune response. Herein, my thesis has focused on understanding the interactions between SNAs and cellular components and on developing SNA-based nanostructures with improved therapeutic capabilities. Specifically, I developed a novel SNA-based, nanoscale agent for the delivery of therapeutic oligonucleotides to manipulate microRNAs (miRNAs), the endogenous post-transcriptional gene regulators. I investigated the role of SNAs involving miRNAs in anti-cancer and anti-inflammation responses in cells and in in vivo murine disease models via systemic injection. Furthermore, I explored using different strategies to construct

  3. Inscriptions Becoming Representations in Representational Practices

    ERIC Educational Resources Information Center

    Medina, Richard; Suthers, Daniel

    2013-01-01

    We analyze the interaction of 3 students working on mathematics problems over several days in a virtual math team. Our analysis traces out how successful collaboration in a later session is contingent upon the work of prior sessions and shows how the development of representational practices is an important aspect of these participants' problem…

  4. Drug delivery systems based on nucleic acid nanostructures.

    PubMed

    de Vries, Jan Willem; Zhang, Feng; Herrmann, Andreas

    2013-12-10

    The field of DNA nanotechnology has progressed rapidly in recent years, and hence a large variety of 1D, 2D and 3D DNA nanostructures with various sizes, geometries and shapes is readily accessible. DNA-based nanoobjects are fabricated by straightforward design and self-assembly processes, allowing the exact positioning of functional moieties and the integration of other materials. At the same time, some of these nanosystems are characterized by a low toxicity profile. As a consequence, the use of these architectures in a biomedical context has been explored. In this review, the progress and possibilities of pristine nucleic acid nanostructures and DNA hybrid materials for drug delivery are discussed. For the latter class of structures, a distinction is made between carriers with an inorganic core composed of gold or silica and amphiphilic DNA block copolymers that exhibit a soft hydrophobic interior.

  5. The Significance of Acid/Base Properties in Drug Discovery

    PubMed Central

    Manallack, David T.; Prankerd, Richard J.; Yuriev, Elizabeth; Oprea, Tudor I.; Chalmers, David K.

    2013-01-01

    While drug discovery scientists take heed of various guidelines concerning drug-like character, the influence of acid/base properties often remains under-scrutinised. Ionisation constants (pKa values) are fundamental to the variability of the biopharmaceutical characteristics of drugs and to underlying parameters such as logD and solubility. pKa values affect physicochemical properties such as aqueous solubility, which in turn influences drug formulation approaches. More importantly, absorption, distribution, metabolism, excretion and toxicity (ADMET) are profoundly affected by the charge state of compounds under varying pH conditions. Consideration of pKa values in conjunction with other molecular properties is of great significance and has the potential to be used to further improve the efficiency of drug discovery. Given the recent low annual output of new drugs from pharmaceutical companies, this review will provide a timely reminder of an important molecular property that influences clinical success. PMID:23099561

  6. Nucleic Acid-Based Therapy Approaches for Huntington's Disease

    PubMed Central

    Vagner, Tatyana; Young, Deborah; Mouravlev, Alexandre

    2012-01-01

    Huntington's disease (HD) is caused by a dominant mutation that results in an unstable expansion of a CAG repeat in the huntingtin gene leading to a toxic gain of function in huntingtin protein which causes massive neurodegeneration mainly in the striatum and clinical symptoms associated with the disease. Since the mutation has multiple effects in the cell and the precise mechanism of the disease remains to be elucidated, gene therapy approaches have been developed that intervene in different aspects of the condition. These approaches include increasing expression of growth factors, decreasing levels of mutant huntingtin, and restoring cell metabolism and transcriptional balance. The aim of this paper is to outline the nucleic acid-based therapeutic strategies that have been tested to date. PMID:22288011

  7. Ultrasonic and densimetric titration applied for acid-base reactions.

    PubMed

    Burakowski, Andrzej; Gliński, Jacek

    2014-01-01

    A classical acid-base titration was monitored acoustically using sound speed and density measurements. Plots of these parameters, as well as of the adiabatic compressibility coefficient calculated from them, exhibit changes with the volume of added titrant. The compressibility changes can be explained and quantitatively predicted in terms of Pasynski's theory of non-compressible hydrates, combined with the additivity of hydration numbers over the amounts and types of ions and molecules present in solution. This approach could also be applied in chemical engineering for monitoring the course of chemical processes, since the experimental methods can be carried out almost independently of the medium under test (harmful, aggressive, etc.).
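The adiabatic compressibility coefficient mentioned in this abstract follows from the two measured quantities via the Newton-Laplace relation. A minimal sketch, where the values for water are assumed typical handbook figures rather than data from the paper:

```python
def adiabatic_compressibility(density: float, sound_speed: float) -> float:
    """Newton-Laplace relation: kappa_s = 1 / (rho * u**2), in Pa^-1,
    given density rho (kg/m^3) and sound speed u (m/s)."""
    return 1.0 / (density * sound_speed ** 2)

# Pure water near 25 degC: rho ~ 997 kg/m^3, u ~ 1497 m/s (assumed values)
kappa = adiabatic_compressibility(997.0, 1497.0)
print(f"{kappa:.3e}")  # on the order of 4.5e-10 Pa^-1
```

Tracking kappa_s against titrant volume is what makes the titration end point visible in this method: both rho and u change with composition, and their combination is sensitive to hydration changes during neutralisation.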

  8. Are carboxyl groups the most acidic sites in amino acids? Gas-phase acidities, photoelectron spectra, and computations on tyrosine, p-hydroxybenzoic acid, and their conjugate bases.

    PubMed

    Tian, Zhixin; Wang, Xue-Bin; Wang, Lai-Sheng; Kass, Steven R

    2009-01-28

    Deprotonation of tyrosine in the gas phase was found to occur preferentially at the phenolic site, and the conjugate base consists of a 70:30 mixture of phenoxide and carboxylate anions at equilibrium. This result was established by developing a chemical probe for differentiating these two isomers, and the presence of both ions was confirmed by photoelectron spectroscopy. Equilibrium acidity measurements on tyrosine indicated that ΔG°acid = 332.5 ± 1.5 kcal mol⁻¹ and ΔH°acid = 340.7 ± 1.5 kcal mol⁻¹. Photoelectron spectra yielded adiabatic electron detachment energies of 2.70 ± 0.05 and 3.55 ± 0.10 eV for the phenoxide and carboxylate anions, respectively. The H/D exchange behavior of deprotonated tyrosine was examined using three different alcohols (CF₃CH₂OD, C₆H₅CH₂OD, and CH₃CH₂OD), and incorporation of up to three deuterium atoms was observed. Two pathways are proposed to account for these results, and all of the experimental findings are supplemented with B3LYP/aug-cc-pVDZ and G3B3 calculations. In addition, it was found that electrospray ionization of tyrosine from a 3:1 (v/v) CH₃OH/H₂O solution using a commercial source produces a deprotonated [M−H]⁻ anion with the gas-phase equilibrium composition rather than the structure of the ion that exists in aqueous media. Electrospray ionization from acetonitrile, however, leads largely to the liquid-phase (carboxylate) structure. A control molecule, p-hydroxybenzoic acid, was found to behave in a similar manner. Thus, the electrospray conditions that are employed for the analysis of a compound can alter the isomeric composition of the resulting anion.
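The 70:30 phenoxide:carboxylate equilibrium mixture reported here implies only a small free-energy gap between the two isomers, via ΔG = −RT ln K. A quick check (the temperature is an assumed value, not stated in this abstract):

```python
import math

R_KCAL = 1.987e-3  # gas constant, kcal mol^-1 K^-1
T = 298.15         # assumed temperature, K

# K = 70/30 for phenoxide over carboxylate; dG = -RT ln K
delta_g = -R_KCAL * T * math.log(70 / 30)
print(round(delta_g, 2))  # about -0.50 kcal/mol favouring the phenoxide
```

A gap of roughly half a kcal/mol is far smaller than the measurement uncertainty on ΔG°acid itself, which is why both isomers coexist in comparable amounts.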

  9. Microgel Tethering For Microarray-Based Nucleic Acid Diagnostics

    NASA Astrophysics Data System (ADS)

    Dai, Xiaoguang

    Molecular diagnostics (MDx) have radically changed the process of clinical microbial identification: based on genetic information, MDx approaches are both specific and fast. They can identify microbes to the species and strain level over a time scale that can be as short as one hour. With such information, clinicians can administer the most effective and appropriate antimicrobial treatment at an early time point, with substantial implications both for patient well-being and for easing the burden on the health-care system. Among the different MDx approaches, such as fluorescence in-situ hybridization, microarrays, next-generation sequencing, and mass spectrometry, point-of-care MDx platforms are drawing particular interest due to their low cost, robustness, and wide applicability. This dissertation develops a novel MDx technology platform capable of high target amplification and detection performance. For nucleic acid target detection, we fabricate an array of electron-beam-patterned microgels on a standard glass microscope slide. The microgels can be as small as a few hundred nanometers. The unique way energy is deposited during electron-beam lithography gives the microgels a very diffuse water-gel interface that enables them not only to serve as substrates for immobilizing DNA probes but to do so while preserving the probes in a highly hydrated environment that optimizes their performance. Benefiting from the high spatial resolution provided by techniques such as position-sensitive microspotting and dip-pen nanolithography, multiple oligonucleotide probes known as molecular beacons (MBs) can be patterned on the microgels. Furthermore, nucleic acid target amplification can be conducted in direct contact with the microgel-tethered detection array. Specifically, we use an isothermal RNA amplification reaction, nucleic acid sequence-based amplification (NASBA). ssRNA amplicons from the NASBA reaction can directly hybridize with microgel-tethered MBs, and the

  10. Nanoconstructions Based on Spatially Ordered Nucleic Acid Molecules

    NASA Astrophysics Data System (ADS)

    Yevdokimov, Yu. M.

    Different strategies for the design of nanoconstructions whose building blocks are both linear molecules of double-stranded nucleic acids and nucleic acid molecules fixed in the spatial structure of particles of liquid-crystalline dispersions are described.

  11. Acid-base transport by the renal proximal tubule

    PubMed Central

    Skelton, Lara A.; Boron, Walter F.; Zhou, Yuehan

    2015-01-01

    Each day, the kidneys filter 180 L of blood plasma, equating to some 4,300 mmol of the major blood buffer, bicarbonate (HCO3−). The glomerular filtrate enters the lumen of the proximal tubule (PT), and the majority of filtered HCO3− is reclaimed along the early (S1) and convoluted (S2) portions of the PT in a manner coupled to the secretion of H+ into the lumen. The PT also uses the secreted H+ to titrate non-HCO3− buffers in the lumen, in the process creating “new HCO3−” for transport into the blood. Thus, the PT – along with more distal renal segments – is largely responsible for regulating plasma [HCO3−]. In this review we first focus on the milestone discoveries over the past 50+ years that define the mechanism and regulation of acid-base transport by the proximal tubule. Further on in the review, we will summarize research still in progress from our laboratory, work that addresses the problem of how the PT is able to finely adapt to acid–base disturbances by rapidly sensing changes in basolateral levels of HCO3− and CO2 (but not pH), and thereby to exert tight control over the acid–base composition of the blood plasma. PMID:21170887
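The filtered bicarbonate load quoted above is a straightforward product of filtrate volume and plasma concentration. A back-of-the-envelope check, assuming a typical plasma [HCO3−] of about 24 mmol/L (a standard textbook value, not stated in this abstract):

```python
# Daily glomerular filtrate volume and plasma bicarbonate concentration
daily_filtrate_l = 180            # L/day, as cited in the review
plasma_hco3_mmol_per_l = 24       # mmol/L, assumed typical value

filtered_load = daily_filtrate_l * plasma_hco3_mmol_per_l
print(filtered_load)  # prints 4320, i.e. roughly the ~4,300 mmol cited
```

Since essentially all of this load must be reclaimed to avoid rapid acidosis, the magnitude of the number itself explains why proximal-tubule HCO3− reabsorption dominates renal acid-base handling.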

  12. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It took rule-based analytical and cognitive task analysis methods specifically to break down operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…

  13. The Effects of Scaffolded Simulation-Based Inquiry Learning on Fifth-Graders' Representations of the Greenhouse Effect

    ERIC Educational Resources Information Center

    Kukkonen, Jari Ensio; Kärkkäinen, Sirpa; Dillon, Patrick; Keinonen, Tuula

    2014-01-01

    Research has demonstrated that simulation-based inquiry learning has significant advantages for learning outcomes when properly scaffolded. For successful learning in science with simulation-based inquiry, one needs to ascertain levels of background knowledge so as to support learners in making, evaluating and modifying hypotheses, conducting…

  14. Methylene-bis[(aminomethyl)phosphinic acids]: synthesis, acid-base and coordination properties.

    PubMed

    David, Tomáš; Procházková, Soňa; Havlíčková, Jana; Kotek, Jan; Kubíček, Vojtěch; Hermann, Petr; Lukeš, Ivan

    2013-02-21

    Three symmetrical methylene-bis[(aminomethyl)phosphinic acids] bearing different substituents on the central carbon atom, (NH₂CH₂)PO₂H–C(R¹)(R²)–PO₂H(CH₂NH₂) where R¹ = OH, R² = Me (H₂L¹); R¹ = OH, R² = Ph (H₂L²); and R¹,R² = H (H₂L³), were synthesized. Acid-base and complexing properties of the ligands were studied in solution as well as in the solid state. The ligands show unusually high basicity of the nitrogen atoms (log K₁ = 9.5-10, log K₂ = 8.5-9) compared with simple (aminomethyl)phosphinic acids and, consequently, high stability constants of the complexes with the studied divalent metal ions. The study showed the important role of the hydroxo group attached to the central carbon atom of the geminal bis(phosphinate) moiety. Deprotonation of the hydroxo group yields the alcoholate anion, which tends to act as a bridging ligand and induces the formation of polynuclear complexes. Solid-state structures of the complexes [H₂N=C(NH₂)₂][Cu₂(H₋₁L²)₂]CO₃·10H₂O and Li₂[Co₄(H₋₁L¹)₃(OH)]·17.5H₂O were determined by X-ray diffraction. The complexes show unexpected geometries, forming dinuclear and cubane-like structures, respectively. The dinuclear copper(II) complex contains a bridging μ₂-alcoholate group, with the ⁻O–P(=O)–CH₂–NH₂ fragments of each ligand molecule chelated to different central ions. In the cubane cobalt(II) complex, one μ₃-hydroxide and three μ₃-alcoholate anions are located at the cube vertices; both phosphinate groups of one ligand molecule chelate the same cobalt(II) ion, while each of its amino groups is bound to a different neighbouring metal ion. All three such metal ions are bridged by the alcoholate group of the given ligand.

  15. The Process-Interaction-Model: a common representation of rule-based and logical models allows studying signal transduction on different levels of detail

    PubMed Central

    2012-01-01

    Background Signaling systems typically involve large, structured molecules each consisting of a large number of subunits called molecule domains. In modeling such systems these domains can be considered as the main players. In order to handle the resulting combinatorial complexity, rule-based modeling has been established as the tool of choice. In contrast to detailed quantitative rule-based modeling, qualitative modeling approaches like logical modeling rely solely on the network structure and are particularly useful for analyzing structural and functional properties of signaling systems. Results We introduce the Process-Interaction-Model (PIM) concept. It defines a common representation (or basis) of rule-based models and site-specific logical models, and, furthermore, includes methods to derive models of both types from a given PIM. A PIM is based on directed graphs with nodes representing processes like post-translational modifications or binding processes and edges representing the interactions among processes. The applicability of the concept has been demonstrated by applying it to a model describing EGF-insulin crosstalk. A prototypic implementation of the PIM concept has been integrated in the modeling software ProMoT. Conclusions The PIM concept provides a common basis for two modeling formalisms tailored to the study of signaling systems: a quantitative (rule-based) and a qualitative (logical) modeling formalism. Every PIM is a compact specification of a rule-based model and facilitates the systematic set-up of a rule-based model, while at the same time facilitating the automatic generation of a site-specific logical model. Consequently, modifications can be made on the underlying basis and then be propagated into the different model specifications – ensuring consistency of all models, regardless of the modeling formalism. This facilitates the analysis of a system on different levels of detail as it guarantees the application of established

  16. Chance-constrained overland flow modeling for improving conceptual distributed hydrologic simulations based on scaling representation of sub-daily rainfall variability.

    PubMed

    Han, Jing-Cheng; Huang, Guohe; Huang, Yuefei; Zhang, Hua; Li, Zhong; Chen, Qiuwen

    2015-08-15

    Lack of hydrologic process representation at the short time-scale would lead to inadequate simulations in distributed hydrological modeling. Especially for complex mountainous watersheds, surface runoff simulations are significantly affected by overland flow generation, which is closely related to the rainfall characteristics at a sub-daily time step. In this paper, the sub-daily variability of rainfall intensity was considered using a probability distribution, and a chance-constrained overland flow modeling approach was proposed to capture the generation of overland flow within conceptual distributed hydrologic simulations. The integrated modeling procedures were further demonstrated through a watershed in the China Three Gorges Reservoir area, leading to an improved SLURP-TGR hydrologic model based on SLURP. Combined with rainfall thresholds determined to distinguish various magnitudes of daily rainfall totals, three levels of significance were simultaneously employed to examine the hydrologic-response simulation. Results showed that SLURP-TGR could enhance the model performance, and the deviation of runoff simulations was effectively controlled. However, rainfall thresholds were crucial for reflecting the scaling effect of rainfall intensity; the optimal level of significance and rainfall threshold were 0.05 and 10 mm, respectively. As for the Xiangxi River watershed, the main runoff contribution came from interflow of the fast store. Although only slight differences in overland flow simulations between SLURP and SLURP-TGR were found, SLURP-TGR helped improve the simulation of peak flows, and would improve the overall modeling efficiency through adjusting runoff component simulations. Consequently, the developed modeling approach favors efficient representation of hydrological processes and would be expected to have a potential for wide applications.

  17. Science review: quantitative acid-base physiology using the Stewart model.

    PubMed

    Wooten, E Wrenn

    2004-12-01

    There has been renewed interest in quantifying acid-base disorders in the intensive care unit. One of the methods that has become increasingly used to calculate acid-base balance is the Stewart model. This model is briefly discussed in terms of its origin, its relationship to other methods such as the base excess approach, and the information it provides for the assessment and treatment of acid-base disorders in critically ill patients.
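One of the Stewart model's independent variables, the apparent strong ion difference, reduces to simple ion arithmetic over the fully dissociated ("strong") ions. A hedged sketch, where the electrolyte values are assumed typical normal values rather than figures from this review:

```python
def apparent_sid(na: float, k: float, cl: float, lactate: float = 0.0) -> float:
    """Apparent strong ion difference (mEq/L): sum of strong cations minus
    strong anions. In the Stewart model, SID is one of three independent
    variables (with pCO2 and total weak acid, Atot) that set plasma pH."""
    return na + k - cl - lactate

# Illustrative normal values (assumed): Na 140, K 4, Cl 104, lactate 1 mEq/L
print(apparent_sid(140, 4, 104, 1))  # prints 39, near the ~40 mEq/L reference
```

A fall in SID (e.g. hyperchloremia or lactate accumulation) acidifies the plasma in this framework, which is one way the Stewart approach reframes disorders that the base-excess method describes differently.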

  18. The Effects of Self-Explanation and Metacognitive Instruction on Undergraduate Students' Learning of Statistics Materials Containing Multiple External Representations in a Web-Based Environment

    ERIC Educational Resources Information Center

    Hsu, Yu-Chang

    2009-01-01

    Students in the Science, Technology, Engineering, and Mathematics (STEM) fields are confronted with multiple external representations (MERs) in their learning materials. The ability to learn from and communicate with these MERs requires not only that students comprehend each representation individually but also that students recognize how the…

  19. Squeezing, Striking, and Vocalizing: Is Number Representation Fundamentally Spatial?

    ERIC Educational Resources Information Center

    Nunez, Rafael; Doan, D.; Nikoulina, Anastasia

    2011-01-01

    Numbers are fundamental entities in mathematics, but their cognitive bases are unclear. Abundant research points to linear space as a natural grounding for number representation. But, is number representation fundamentally spatial? We disentangle number representation from standard number-to-line reporting methods, and compare numerical…

  20. Guanine base stacking in G-quadruplex nucleic acids

    PubMed Central

    Lech, Christopher Jacques; Heddi, Brahim; Phan, Anh Tuân

    2013-01-01

    G-quadruplexes constitute a class of nucleic acid structures defined by stacked guanine tetrads (or G-tetrads) with guanine bases from neighboring tetrads stacking with one another within the G-tetrad core. Individual G-quadruplexes can also stack with one another at their G-tetrad interface leading to higher-order structures as observed in telomeric repeat-containing DNA and RNA. In this study, we investigate how guanine base stacking influences the stability of G-quadruplexes and their stacked higher-order structures. A structural survey of the Protein Data Bank is conducted to characterize experimentally observed guanine base stacking geometries within the core of G-quadruplexes and at the interface between stacked G-quadruplex structures. We couple this survey with a systematic computational examination of stacked G-tetrad energy landscapes using quantum mechanical computations. Energy calculations of stacked G-tetrads reveal large energy differences of up to 12 kcal/mol between experimentally observed geometries at the interface of stacked G-quadruplexes. Energy landscapes are also computed using an AMBER molecular mechanics description of stacking energy and are shown to agree quite well with quantum mechanical calculated landscapes. Molecular dynamics simulations provide a structural explanation for the experimentally observed preference of parallel G-quadruplexes to stack in a 5′–5′ manner based on different accessible tetrad stacking modes at the stacking interfaces of 5′–5′ and 3′–3′ stacked G-quadruplexes. PMID:23268444